Ackermann, Hans D.; Pankratz, Leroy W.; Dansereau, Danny A.
1983-01-01
The computer programs published in Open-File Report 82-1065, A comprehensive system for interpreting seismic-refraction arrival-time data using interactive computer methods (Ackermann, Pankratz, and Dansereau, 1982), have been modified to run on a mini-computer. The new version uses approximately 1/10 of the memory of the initial version, is more efficient and gives the same results.
Automatic design of optical systems by digital computer
NASA Technical Reports Server (NTRS)
Casad, T. A.; Schmidt, L. F.
1967-01-01
Computer program uses geometrical optical techniques and a least squares optimization method employing computing equipment for the automatic design of optical systems. It evaluates changes in various optical parameters, provides comprehensive ray-tracing, and generally determines the acceptability of the optical system characteristics.
On the Computation of Comprehensive Boolean Gröbner Bases
NASA Astrophysics Data System (ADS)
Inoue, Shutaro
We show that a comprehensive Boolean Gröbner basis of an ideal I in a Boolean polynomial ring B(Ā, X̄), with main variables X̄ and parameters Ā, can be obtained by simply computing a usual Boolean Gröbner basis of I regarding both X̄ and Ā as variables under a certain block term order with X̄ ≫ Ā. This result, together with the fact that a finite Boolean ring is isomorphic to a direct product of the Galois field GF(2), enables us to compute a comprehensive Boolean Gröbner basis by computing only the corresponding Gröbner bases in a polynomial ring over GF(2). Our implementation in the computer algebra system Risa/Asir shows that our method is extremely efficient compared with existing algorithms for computing comprehensive Boolean Gröbner bases.
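The reduction to GF(2) can be illustrated concretely: because a Boolean polynomial ring over GF(2) has only finitely many points, the effect of a parameter on the solutions of a specialized system can be checked by exhaustive evaluation. A minimal Python sketch with toy polynomials (not the paper's algorithm; the ideal and names are hypothetical):

```python
from itertools import product  # handy if more variables are added

# Toy Boolean polynomial over GF(2), written as a Python function.
# x is the "main variable", a the "parameter" (hypothetical example).
def f(x, a):
    # generator of the toy ideal: x*(a + 1) over GF(2)
    return (x * (a ^ 1)) & 1

# For each parameter value a in GF(2), find the variety of the
# specialized system -- the "comprehensive" picture across parameters.
solutions = {}
for a in (0, 1):
    solutions[a] = [x for x in (0, 1) if f(x, a) == 0]

print(solutions)  # a=0 forces x=0; a=1 leaves x free
```

The brute-force view is only feasible because every finite Boolean ring decomposes into copies of GF(2); the paper's Gröbner-basis computation achieves the same parametric analysis symbolically.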
Simulation Framework for Intelligent Transportation Systems
DOT National Transportation Integrated Search
1996-10-01
A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System. The simulator is designed for running on parallel computers and distributed (networked) computer systems, but ca...
Learner Assessment Methods Using a Computer Based Interactive Videodisc System.
ERIC Educational Resources Information Center
Ehrlich, Lisa R.
This paper focuses on item design considerations faced by instructional designers and evaluators when using computer videodisc delivery systems as a means of assessing learner comprehension and competencies. Media characteristics of various interactive computer/videodisc training systems are briefly discussed as well as reasons for using such…
ERIC Educational Resources Information Center
Debuse, Justin C. W.; Lawley, Meredith
2016-01-01
Providing students with high quality feedback is important and can be achieved using computer-based systems. While student and educator perspectives of such systems have been investigated, a comprehensive multidisciplinary study has not yet been undertaken. This study examines student and educator perspectives of a computer-based assessment and…
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.
Toward a unified account of comprehension and production in language development.
McCauley, Stewart M; Christiansen, Morten H
2013-08-01
Although Pickering & Garrod (P&G) argue convincingly for a unified system for language comprehension and production, they fail to explain how such a system might develop. Using a recent computational model of language acquisition as an example, we sketch a developmental perspective on the integration of comprehension and production. We conclude that only through development can we fully understand the intertwined nature of comprehension and production in adult processing.
Comprehensive Materials and Morphologies Study of Ion Traps (COMMIT) for Scalable Quantum Computation - Final Report
2012-04-21
Trapped ion systems are extremely promising for large-scale quantum computation, but face a vexing problem with motional quantum ... the photoelectric effect. The typical shortest wavelengths needed for ion traps range from 194 nm for Hg+ to 493 nm for Ba+, corresponding to 6.4-2.5 ...
ERIC Educational Resources Information Center
Hathaway, Walter E.
Efficient and convenient comprehensive information systems, long kept from realization by a variety of obstacles, are now made possible by the concept of distributed processing and the technology of micro- and mini-computer networks. Such systems can individualize instruction, group students efficiently, cut administrative costs, streamline…
ERIC Educational Resources Information Center
Reyes-Palomares, Armando; Sanchez-Jimenez, Francisca; Medina, Miguel Angel
2009-01-01
A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever…
Heterogeneity in Health Care Computing Environments
Sengupta, Soumitra
1989-01-01
This paper discusses issues of heterogeneity in computer systems, networks, databases, and presentation techniques, and the problems heterogeneity creates in developing integrated medical information systems. The need for institutional, comprehensive goals is emphasized. Using the Columbia-Presbyterian Medical Center's computing environment as a case study, various steps toward solving the heterogeneity problem are presented.
The Bilingual Language Interaction Network for Comprehension of Speech
ERIC Educational Resources Information Center
Shook, Anthony; Marian, Viorica
2013-01-01
During speech comprehension, bilinguals co-activate both of their languages, resulting in cross-linguistic interaction at various levels of processing. This interaction has important consequences for both the structure of the language system and the mechanisms by which the system processes spoken language. Using computational modeling, we can…
Computational 3-D Model of the Human Respiratory System
We are developing a comprehensive, morphologically-realistic computational model of the human respiratory system that can be used to study the inhalation, deposition, and clearance of contaminants, while being adaptable for age, race, gender, and health/disease status. The model ...
ERIC Educational Resources Information Center
Stevenson, Kimberly
This master's thesis describes the development of an expert system and an interactive videodisc computer-based instructional job aid used to assist in the integration of electron beam lithography devices. As with all comprehensive training, expert system and job aid development require a criterion-referenced systems approach treatment to…
NASA Technical Reports Server (NTRS)
Snyder, W. V.; Hanson, R. J.
1986-01-01
Text Exchange System (TES) exchanges and maintains organized textual information including source code, documentation, data, and listings. System consists of two computer programs and definition of format for information storage. Comprehensive program used to create, read, and maintain TES files. TES developed to meet three goals: First, easy and efficient exchange of programs and other textual data between similar and dissimilar computer systems via magnetic tape. Second, provide transportable management system for textual information. Third, provide common user interface, over wide variety of computing systems, for all activities associated with text exchange.
National electronic medical records integration on cloud computing system.
Mirza, Hebah; El-Masri, Samir
2013-01-01
Few healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption; others have a low level, and most have no EMR at all. Cloud computing is a new, emerging technology that has been used in other industries with great success. Despite its great features, cloud computing has not yet been fully utilized in the healthcare industry. This study presents an innovative healthcare cloud computing system for integrating Electronic Health Records (EHR). The proposed cloud system applies cloud computing technology to the EHR system to present a comprehensive, integrated EHR environment.
Design Principles for a Comprehensive Library System.
ERIC Educational Resources Information Center
Uluakar, Tamer; And Others
1981-01-01
Describes an online design featuring circulation control, catalog access, and serial holdings that uses an incremental approach to system development. Utilizing a dedicated computer, this second of three releases pays particular attention to present and predicted computing capabilities as well as trends in library automation. (Author/RAA)
Cost-effective use of minicomputers to solve structural problems
NASA Technical Reports Server (NTRS)
Storaasli, O. O.; Foster, E. P.
1978-01-01
Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, and their exposure to structural calculations has been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper assesses the potential for using minicomputers for structural analysis by (1) selecting a comprehensive finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputer with that of a large mainframe computer for the solution of a wide range of finite-element structural analysis problems.
Solar Heating and Cooling for a Controls Manufacturing Plant Lumberton, New Jersey
NASA Technical Reports Server (NTRS)
1982-01-01
Comprehensive report documents a solar heating and cooling system, with separate solar-collector and cooling-tower areas located away from the building, that is completely computer controlled. A system description, test data, major problems and resolutions, performance, operation and maintenance, manufacturer's literature, and drawings comprise the 257-page report.
Administrative Uses of Computers in the Schools.
ERIC Educational Resources Information Center
Bluhm, Harry P.
This book, intended for school administrators, provides a comprehensive account of how computer information systems can enable administrators at both middle and top management levels to manage the educational enterprise. It can be used as a textbook in an educational administration course emphasizing computer technology in education, an…
Computational principles of working memory in sentence comprehension.
Lewis, Richard L; Vasishth, Shravan; Van Dyke, Julie A
2006-10-01
Understanding a sentence requires a working memory of the partial products of comprehension, so that linguistic relations between temporally distal parts of the sentence can be rapidly computed. We describe an emerging theoretical framework for this working memory system that incorporates several independently motivated principles of memory: a sharply limited attentional focus, rapid retrieval of item (but not order) information subject to interference from similar items, and activation decay (forgetting over time). A computational model embodying these principles provides an explanation of the functional capacities and severe limitations of human processing, as well as accounts of reading times. The broad implication is that the detailed nature of cross-linguistic sentence processing emerges from the interaction of general principles of human memory with the specialized task of language comprehension.
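The memory principles described above can be illustrated with the standard base-level activation equation from the ACT-R tradition, on which such models build: activation rises with recent, frequent retrievals and decays with time. A hedged sketch with illustrative values (the function and parameters are not taken from the paper):

```python
import math

# ACT-R-style base-level activation with decay, one of the independently
# motivated memory principles the framework incorporates.
def base_level_activation(retrieval_times, now, d=0.5):
    """B = ln(sum_j (now - t_j)^(-d)); d is the decay rate."""
    return math.log(sum((now - t) ** (-d) for t in retrieval_times))

recent = base_level_activation([9.0], now=10.0)   # retrieved 1 s ago
distant = base_level_activation([1.0], now=10.0)  # retrieved 9 s ago
print(recent > distant)  # decay: an older retrieval yields lower activation
```

Lower activation translates into slower, more error-prone retrieval of distal sentence material, which is how the model accounts for reading-time effects.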
Barrès, Victor; Lee, Jinyong
2014-01-01
How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuo-motor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community.
The Comprehensive Competencies Program Reference Manual. Volume I. Introduction.
ERIC Educational Resources Information Center
Taggart, Robert
Chapter 1 of this reference manual is a summary of the comprehensive competencies program (CCP). It describes this system for organizing, implementing, managing, and efficiently delivering individualized self-paced instruction, combined with group and experience-based learning activities, using computer-assisted instruction. (The CCP covers not…
Computers, the Human Mind, and My In-Laws' House.
ERIC Educational Resources Information Center
Esque, Timm J.
1996-01-01
Discussion of human memory, computer memory, and the storage of information focuses on a metaphor that can account for memory without storage and can set the stage for systemic research around a more comprehensive, understandable theory. (Author/LRW)
Feasibility Study on the Use of Computer Managed Learning in Secondary Schools in the U.S.A.
ERIC Educational Resources Information Center
Charp, Sylvia
A brief description of computer managed instruction (CMI), including its applications and capabilities, introduces case studies of schools in the United States that are using three different CMI systems. The first system discussed is the Comprehensive Achievement Monitoring (CAM) Program, which was developed by a small school district (Hopkins,…
ERIC Educational Resources Information Center
Exley, I. Sheck
The high percentage of high school pre-algebra students having difficulty learning the abstract concept of graphing ordered pairs on the Cartesian rectangular coordinate system was addressed by the creation and implementation of a computer-managed instructional program. Modules consisted of a pretest, instruction, two practice sessions, and a…
ERIC Educational Resources Information Center
Schneider, E. W.
The Interface System is a comprehensive method for developing and managing computer-assisted instructional courses or computer-managed instructional courses composed of sets of instructional modules. Each module is defined by one or more behavioral objectives and by a list of prerequisite modules that must be completed successfully before the…
R.B. Ferguson; V. Clark Baldwin
1987-01-01
Complete instructions for user operation of COMPUTE_P-LOB, a growth and yield prediction system providing volume and weight yields in stand and stock table format, including detailed examples of computer input and output. A complete program listing is provided.
ERIC Educational Resources Information Center
Fischer, Robert
The report details development, at Southwest Texas State University and later at Pennsylvania State University, of a computer authoring system ("Libra") enabling foreign language faculty to develop multimedia lessons focusing on listening comprehension. Staff at Southwest Texas State University first developed a Macintosh version of the…
Comprehensive Digital Imaging Network Project At Georgetown University Hospital
NASA Astrophysics Data System (ADS)
Mun, Seong K.; Stauffer, Douglas; Zeman, Robert; Benson, Harold; Wang, Paul; Allman, Robert
1987-10-01
The radiology practice is going through rapid changes due to the introduction of state-of-the-art computer-based technologies. For the last twenty years we have witnessed the introduction of many new medical diagnostic imaging systems, such as x-ray computed tomography, digital subtraction angiography (DSA), computerized nuclear medicine, single photon emission computed tomography (SPECT), positron emission tomography (PET) and, more recently, computerized digital radiography and nuclear magnetic resonance imaging (MRI). Beyond the imaging systems themselves, there has been a steady introduction of computer-based information systems for radiology departments and hospitals.
A practice course to cultivate students' comprehensive ability of photoelectricity
NASA Astrophysics Data System (ADS)
Lv, Yong; Liu, Yang; Niu, Chunhui; Liu, Lishuang
2017-08-01
After studying many theoretical courses, it is important and urgent for students in the specialty of optoelectronic information science and engineering to cultivate their comprehensive ability in photoelectricity. We set up a comprehensive practice course named "Integrated Design of Optoelectronic Information System" (IDOIS) so that students can integrate their knowledge of optics, electronics, and computer programming to design, install, and debug an optoelectronic system with independent functions. Eight years of practice show that this course trains students' abilities in analysis, design/development, and debugging of photoelectric systems, and improves their abilities in document retrieval, design proposal and summary report writing, teamwork, innovation consciousness, and skill.
Hettinger, Lawrence J.; Kirlik, Alex; Goh, Yang Miang; Buckle, Peter
2015-01-01
Accurate comprehension and analysis of complex sociotechnical systems is a daunting task. Empirically examining, or simply envisioning the structure and behaviour of such systems challenges traditional analytic and experimental approaches as well as our everyday cognitive capabilities. Computer-based models and simulations afford potentially useful means of accomplishing sociotechnical system design and analysis objectives. From a design perspective, they can provide a basis for a common mental model among stakeholders, thereby facilitating accurate comprehension of factors impacting system performance and potential effects of system modifications. From a research perspective, models and simulations afford the means to study aspects of sociotechnical system design and operation, including the potential impact of modifications to structural and dynamic system properties, in ways not feasible with traditional experimental approaches. This paper describes issues involved in the design and use of such models and simulations and describes a proposed path forward to their development and implementation. Practitioner Summary: The size and complexity of real-world sociotechnical systems can present significant barriers to their design, comprehension and empirical analysis. This article describes the potential advantages of computer-based models and simulations for understanding factors that impact sociotechnical system design and operation, particularly with respect to process and occupational safety. PMID:25761227
Kellogg, Glen E; Fornabaio, Micaela; Chen, Deliang L; Abraham, Donald J; Spyrakis, Francesca; Cozzini, Pietro; Mozzarelli, Andrea
2006-05-01
Computational tools utilizing a unique empirical modeling system based on the hydrophobic effect and the measurement of logP(o/w) (the partition coefficient for solvent transfer between 1-octanol and water) are described. The associated force field, Hydropathic INTeractions (HINT), contains much rich information about non-covalent interactions in the biological environment because of its basis in an experiment that measures interactions in solution. HINT is shown to be the core of an evolving virtual screening system that is capable of taking into account a number of factors often ignored such as entropy, effects of solvent molecules at the active site, and the ionization states of acidic and basic residues and ligand functional groups. The outline of a comprehensive modeling system for virtual screening that incorporates these features is described. In addition, a detailed description of the Computational Titration algorithm is provided. As an example, three complexes of dihydrofolate reductase (DHFR) are analyzed with our system and these results are compared with the experimental free energies of binding.
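The partition coefficient underlying the measurement of logP(o/w) is the base-10 logarithm of the ratio of equilibrium solute concentrations in 1-octanol and water. A minimal sketch with hypothetical concentration values:

```python
import math

# logP(o/w), the experimental quantity at the core of the HINT force
# field: log10 of the octanol/water concentration ratio at equilibrium.
def log_p(conc_octanol, conc_water):
    return math.log10(conc_octanol / conc_water)

# Hypothetical equilibrium concentrations (mol/L): a hydrophobic solute
# partitions preferentially into the octanol phase.
lp = log_p(conc_octanol=0.05, conc_water=0.0005)
print(lp)  # positive logP indicates hydrophobicity
```

Because logP is measured in solution, scores derived from it implicitly fold in solvation and entropic contributions, which is the basis of the claim that HINT captures factors other scoring functions ignore.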
The Bilingual Language Interaction Network for Comprehension of Speech
Marian, Viorica
2013-01-01
During speech comprehension, bilinguals co-activate both of their languages, resulting in cross-linguistic interaction at various levels of processing. This interaction has important consequences for both the structure of the language system and the mechanisms by which the system processes spoken language. Using computational modeling, we can examine how cross-linguistic interaction affects language processing in a controlled, simulated environment. Here we present a connectionist model of bilingual language processing, the Bilingual Language Interaction Network for Comprehension of Speech (BLINCS), wherein interconnected levels of processing are created using dynamic, self-organizing maps. BLINCS can account for a variety of psycholinguistic phenomena, including cross-linguistic interaction at and across multiple levels of processing, cognate facilitation effects, and audio-visual integration during speech comprehension. The model also provides a way to separate two languages without requiring a global language-identification system. We conclude that BLINCS serves as a promising new model of bilingual spoken language comprehension. PMID:24363602
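The self-organizing maps from which BLINCS's processing levels are built rest on a simple update rule: find the best-matching unit, then pull nearby units toward the input under a Gaussian neighborhood. A toy one-dimensional sketch (illustrative parameters, not the published model):

```python
import math

# One update step of a self-organizing map (SOM) on a tiny 1-D map.
def som_update(weights, x, lr=0.5, sigma=1.0):
    # Find the best-matching unit (BMU) by squared Euclidean distance.
    bmu = min(range(len(weights)),
              key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))
    # Pull each unit toward the input, scaled by a Gaussian neighborhood
    # centered on the BMU; nearby units move most.
    for i, w in enumerate(weights):
        h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
        weights[i] = [wj + lr * h * (xj - wj) for wj, xj in zip(w, x)]
    return bmu

weights = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]]
bmu = som_update(weights, [1.1, 0.9])
print(bmu)  # unit 1 is closest and moves toward the input
```

Repeated over many inputs, this rule yields topographically organized maps, which is how the model's interconnected levels self-organize without a global language-identification mechanism.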
Comprehensive rotorcraft analysis methods
NASA Technical Reports Server (NTRS)
Stephens, Wendell B.; Austin, Edward E.
1988-01-01
The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large-scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustics, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).
A prototype expert/information system for examining environmental risks of KSC activities
NASA Technical Reports Server (NTRS)
Engel, Bernard A.
1993-01-01
Protection of the environment and natural resources at the Kennedy Space Center (KSC) is of great concern. An expert/information system was developed to replace the paper-based KSC Environmental Checklist. The computer-based system requests information only as required and supplies assistance as needed. The most comprehensive portion of the system provides information about endangered species habitat at KSC. This module uses geographic information system (GIS) data and tools, expert rules, color graphics, computer-based video, and hypertext to provide information.
Program for creating an operating system generation cross reference index (SGINDEX)
NASA Technical Reports Server (NTRS)
Barth, C. W.
1972-01-01
Computer program to collect key data from Stage Two input of OS/360 system and to prepare formatted listing of index entries collected is discussed. Program eliminates manual paging through system output by providing comprehensive cross reference.
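The cross-reference indexing such a program performs can be sketched in miniature: scan a listing and record, for each key symbol, the lines on which it appears. A toy Python illustration (the input lines and keys are hypothetical, not OS/360 Stage Two data):

```python
# Toy cross-reference index in the spirit of SGINDEX: map each key
# symbol to the line numbers on which it appears in a listing.
def cross_reference(lines, keys):
    index = {k: [] for k in keys}
    for n, line in enumerate(lines, start=1):
        for k in keys:
            if k in line:
                index[k].append(n)
    return index

listing = ["GENERATE SYSA", "COPY SYSB", "LINK SYSA SYSB"]
idx = cross_reference(listing, ["SYSA", "SYSB"])
print(idx)  # -> {'SYSA': [1, 3], 'SYSB': [2, 3]}
```

The formatted index replaces manual paging through the system output, exactly the saving the abstract describes.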
Argonne Simulation Framework for Intelligent Transportation Systems
DOT National Transportation Integrated Search
1996-01-01
A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distribu...
Distributed Computer Networks in Support of Complex Group Practices
Wess, Bernard P.
1978-01-01
The economics of medical computer networks are presented in context with the patient care and administrative goals of medical networks. Design alternatives and network topologies are discussed with an emphasis on medical network design requirements in distributed data base design, telecommunications, satellite systems, and software engineering. The success of the medical computer networking technology is predicated on the ability of medical and data processing professionals to design comprehensive, efficient, and virtually impenetrable security systems to protect data bases, network access and services, and patient confidentiality.
A Procedure for Measuring Latencies in Brain-Computer Interfaces
Wilson, J. Adam; Mellinger, Jürgen; Schalk, Gerwin; Williams, Justin
2011-01-01
Brain-computer interface (BCI) systems must process neural signals with consistent timing in order to support adequate system performance. Thus, it is important to have the capability to determine whether a particular BCI configuration (i.e., hardware, software) provides adequate timing performance for a particular experiment. This report presents a method of measuring and quantifying different aspects of system timing in several typical BCI experiments across a range of settings, and presents comprehensive measures of expected overall system latency for each experimental configuration. PMID:20403781
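The basic operation behind such timing characterization is to timestamp an event before and after processing and report the difference. A minimal sketch using a monotonic clock (a toy stand-in for real BCI hardware and software stages, not the paper's measurement apparatus):

```python
import time

# Measure the latency of one processing stage with a monotonic,
# high-resolution clock (immune to wall-clock adjustments).
def measure_latency(process):
    t0 = time.perf_counter()
    process()
    return time.perf_counter() - t0

# Toy stand-in for a signal-processing step.
latency = measure_latency(lambda: sum(range(1000)))
print(latency >= 0.0)
```

Repeating such measurements across hardware/software configurations and summarizing the distribution (not just the mean) is what allows a configuration to be judged adequate for a given experiment.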
Comprehensive silicon solar cell computer modeling
NASA Technical Reports Server (NTRS)
Lamorte, M. F.
1984-01-01
The development of an efficient, comprehensive Si solar cell modeling program capable of simulation accuracy within 5 percent is examined. A general investigation of computerized simulation is provided. Computer simulation programs are subdivided into a number of major tasks: (1) the analytical method used to represent the physical system; (2) the phenomena submodels that comprise the simulation of the system; (3) coding of the analysis and the phenomena submodels; (4) a coding scheme that results in efficient use of the CPU so that CPU costs are low; and (5) a simulation program modularized with respect to the structures that may be analyzed, the addition and/or modification of phenomena submodels as new experimental data become available, and the addition of other photovoltaic materials.
NASA Astrophysics Data System (ADS)
Yim, Keun Soo
This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads in hybrid computer systems (e.g., using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor that has a set of new instructions in order to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation gains more importance because heterogeneous processors have become an essential component of state-of-the-art supercomputers. GPUs were used in three of the five fastest supercomputers that were operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. 
In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of program states that included dynamically allocated memory (to be spatially comprehensive). In GPUs, we used fault injection studies to demonstrate the importance of detecting silent data corruption (SDC) errors that are mainly due to the lack of fine-grained protections and the massive use of fault-insensitive data. This dissertation also presents transparent fault tolerance frameworks and techniques that are directly applicable to hybrid computers built using only commercial off-the-shelf hardware components. This dissertation shows that by developing understanding of the failure characteristics and error propagation paths of target programs, we were able to create fault tolerance frameworks and techniques that can quickly detect and recover from hardware faults with low performance and hardware overheads.
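The basic operation behind such fault-injection studies is a single-bit flip in a program value; whether the flipped value silently propagates to the output is what distinguishes silent data corruption (SDC) from detected failures. A toy sketch (not the dissertation's injection tooling):

```python
# Single-bit fault injection into an integer value, the elementary
# operation of a characterization campaign like the one described.
def flip_bit(value, bit):
    """Inject a transient fault by flipping one bit."""
    return value ^ (1 << bit)

golden = 42
faulty = flip_bit(golden, 3)
# Silent data corruption: the run completes but the result differs
# from the golden (fault-free) run.
sdc = faulty != golden
print(faulty, sdc)
```

A real campaign automates this over many fault sites and times, compares each run against the golden output, and classifies outcomes (masked, SDC, crash, hang) to build the failure characterization the dissertation reports.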
Assessing Computer Literacy: A Validated Instrument and Empirical Results.
ERIC Educational Resources Information Center
Gabriel, Roy M.
1985-01-01
Describes development of a comprehensive computer literacy assessment battery for K-12 curriculum based on objectives of a curriculum implemented in the Worldwide Department of Defense Dependents Schools system. Test development and field test data are discussed and a correlational analysis which assists in interpretation of test results is…
Problem-Solving in the Pre-Clinical Curriculum: The Uses of Computer Simulations.
ERIC Educational Resources Information Center
Michael, Joel A.; Rovick, Allen A.
1986-01-01
Promotes the use of computer-based simulations in the pre-clinical medical curriculum as a means of providing students with opportunities for problem solving. Describes simple simulations of skeletal muscle loads, complex simulations of major organ systems and comprehensive simulation models of the entire human body. (TW)
MTA Computer Based Evaluation System.
ERIC Educational Resources Information Center
Brenner, Lisa P.; And Others
The MTA PLATO-based evaluation system, which has been implemented by a consortium of schools of medical technology, is designed to be general-purpose, modular, data-driven, and interactive, and to accommodate other national and local item banks. The system provides a comprehensive interactive item-banking system in conjunction with online student…
NASA Technical Reports Server (NTRS)
Ferrare, R. A.; Whiteman, D. N.; Melfi, S. H.; Goldsmith, J. E. M.; Bisson, S. E.; Lapp, M.
1991-01-01
We describe preliminary results from a comprehensive computer model developed to guide optimization of a Raman lidar system for measuring daytime profiles of atmospheric water vapor, emphasizing an ultraviolet, solar-blind approach.
NASA Technical Reports Server (NTRS)
1998-01-01
SYMED, Inc., developed a unique electronic medical records and information management system. The S2000 Medical Interactive Care System (MICS) incorporates both a comprehensive and interactive medical care support capability and an extensive array of digital medical reference materials in either text or high resolution graphic form. The system was designed, in cooperation with NASA, to improve the effectiveness and efficiency of physician practices. The S2000 is a Microsoft Windows-based software product that combines electronic forms, medical documents, and records management, and features a comprehensive medical information system for medical diagnostic support and treatment. SYMED, Inc. offers access to its medical systems to all companies seeking competitive advantages.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reckinger, Scott James; Livescu, Daniel; Vasilyev, Oleg V.
A comprehensive numerical methodology has been developed that handles the challenges introduced by considering the compressive nature of Rayleigh-Taylor instability (RTI) systems, which include sharp interfacial density gradients on strongly stratified background states, acoustic wave generation and removal at computational boundaries, and stratification-dependent vorticity production. The computational framework is used to simulate two-dimensional single-mode RTI to extreme late-times for a wide range of flow compressibility and variable density effects. The results show that flow compressibility acts to reduce the growth of RTI for low Atwood numbers, as predicted from linear stability analysis.
A Comprehensive Theory of Algorithms for Wireless Networks and Mobile Systems
2016-06-08
Erez Kantor, Zvi Lotker, Merav Parter, and David Peleg. Nonuniform SINR+Voronoi Diagrams are Effectively Uniform. In Yoram Moses, editor, Distributed Computing: 29th International Symposium…in Computer Science, page 559. Springer, 2014.
Conceptualizing and Advancing Research Networking Systems
SCHLEYER, TITUS; BUTLER, BRIAN S.; SONG, MEI; SPALLEK, HEIKO
2013-01-01
Science in general, and biomedical research in particular, is becoming more collaborative. As a result, collaboration with the right individuals, teams, and institutions is increasingly crucial for scientific progress. We propose Research Networking Systems (RNS) as a new type of system designed to help scientists identify and choose collaborators, and suggest a corresponding research agenda. The research agenda covers four areas: foundations, presentation, architecture, and evaluation. Foundations includes project-, institution- and discipline-specific motivational factors; the role of social networks; and impression formation based on information beyond expertise and interests. Presentation addresses representing expertise in a comprehensive and up-to-date manner; the role of controlled vocabularies and folksonomies; the tension between seekers’ need for comprehensive information and potential collaborators’ desire to control how they are seen by others; and the need to support serendipitous discovery of collaborative opportunities. Architecture considers aggregation and synthesis of information from multiple sources, social system interoperability, and integration with the user’s primary work context. Lastly, evaluation focuses on assessment of collaboration decisions, measurement of user-specific costs and benefits, and how the large-scale impact of RNS could be evaluated with longitudinal and naturalistic methods. We hope that this article stimulates the human-computer interaction, computer-supported cooperative work, and related communities to pursue a broad and comprehensive agenda for developing research networking systems. PMID:24376309
The development of an engineering computer graphics laboratory
NASA Technical Reports Server (NTRS)
Anderson, D. C.; Garrett, R. E.
1975-01-01
Hardware and software systems developed to further research and education in interactive computer graphics were described, as well as several of the ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN 4 subroutine package, in conjunction with a PDP 11/40 minicomputer as the primary computation processor and the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.
The Further Development of CSIEC Project Driven by Application and Evaluation in English Education
ERIC Educational Resources Information Center
Jia, Jiyou; Chen, Weichao
2009-01-01
In this paper, we present the comprehensive version of CSIEC (Computer Simulation in Educational Communication), an interactive web-based human-computer dialogue system with natural language for English instruction, and its tentative application and evaluation in English education. First, we briefly introduce the motivation for this project,…
A comprehensive pathway map of epidermal growth factor receptor signaling
Oda, Kanae; Matsuoka, Yukiko; Funahashi, Akira; Kitano, Hiroaki
2005-01-01
The epidermal growth factor receptor (EGFR) signaling pathway is one of the most important pathways that regulate growth, survival, proliferation, and differentiation in mammalian cells. Reflecting this importance, it is one of the best-investigated signaling systems, both experimentally and computationally, and several computational models have been developed for dynamic analysis. A map of molecular interactions of the EGFR signaling system is a valuable resource for research in this area. In this paper, we present a comprehensive pathway map of EGFR signaling and other related pathways. The map reveals that the overall architecture of the pathway is a bow-tie (or hourglass) structure with several feedback loops. The map is created using CellDesigner software that enables us to graphically represent interactions using a well-defined and consistent graphical notation, and to store it in Systems Biology Markup Language (SBML). PMID:16729045
CODAP: Source Program Listings for the Univac 1108.
ERIC Educational Resources Information Center
Weissmuller, Johnny J.; And Others
Documentation of the Univac 1108 Comprehensive Occupational Data Analysis Programs (CODAP) system is being published in a series of three technical reports covering the control card and programing aspects of the system, which is a highly interactive and efficient system of computer routines for analyzing, organizing, and reporting occupational…
Tracking by Identification Using Computer Vision and Radio
Mandeljc, Rok; Kovačič, Stanislav; Kristan, Matej; Perš, Janez
2013-01-01
We present a novel system for detection, localization and tracking of multiple people, which fuses a multi-view computer vision approach with a radio-based localization system. The proposed fusion combines the best of both worlds: excellent computer-vision-based localization and strong identity information provided by the radio system. It is therefore able to perform tracking by identification, which makes it impervious to propagated identity switches. We present a comprehensive methodology for evaluating systems that perform person localization in a world coordinate system and use it to evaluate the proposed system as well as its components. Experimental results on a challenging indoor dataset, which involves multiple people walking around a realistically cluttered room, confirm that the proposed fusion significantly outperforms its individual components. Compared to the radio-based system, it achieves better localization results, while at the same time successfully preventing the propagation of identity switches that occur in pure computer-vision-based tracking. PMID:23262485
ERIC Educational Resources Information Center
Aksoy, Gokhan
2013-01-01
The purpose of this study is to determine the effect of computer animation technique on academic achievement of students in the "Solar System and Beyond" unit lecture as part of the Science and Technology course of the seventh grade in primary education. The sample of the study consists of 60 students attending to the 7th grade of primary school…
Intelligent redundant actuation system requirements and preliminary system design
NASA Technical Reports Server (NTRS)
Defeo, P.; Geiger, L. J.; Harris, J.
1985-01-01
Several redundant actuation system configurations were designed and demonstrated to satisfy the stringent operational requirements of advanced flight control systems. However, this has been accomplished largely through brute force hardware redundancy, resulting in significantly increased computational requirements on the flight control computers which perform the failure analysis and reconfiguration management. Modern technology now provides powerful, low-cost microprocessors which are effective in performing failure isolation and configuration management at the local actuator level. One such concept, called an Intelligent Redundant Actuation System (IRAS), significantly reduces the flight control computer requirements and performs the local tasks more comprehensively than previously feasible. The requirements and preliminary design of an experimental laboratory system capable of demonstrating the concept and sufficiently flexible to explore a variety of configurations are discussed.
Deepen the Teaching Reform of Operating System, Cultivate the Comprehensive Quality of Students
ERIC Educational Resources Information Center
Liu, Jianjun
2010-01-01
Operating system is the core course of the specialty of computer science and technology. To understand and master the operating system will directly affect students' further study on other courses. The course of operating system focuses more on theories. Its contents are more abstract and the knowledge system is more complicated. Therefore,…
Computer imaging and workflow systems in the business office.
Adams, W T; Veale, F H; Helmick, P M
1999-05-01
Computer imaging and workflow technology automates many business processes that currently are performed using paper processes. Documents are scanned into the imaging system and placed in electronic patient account folders. Authorized users throughout the organization, including preadmission, verification, admission, billing, cash posting, customer service, and financial counseling staff, have online access to the information they need when they need it. Such streamlining of business functions can increase collections and customer satisfaction while reducing labor, supply, and storage costs. Because the costs of a comprehensive computer imaging and workflow system can be considerable, healthcare organizations should consider implementing parts of such systems that can be cost-justified or include implementation as part of a larger strategic technology initiative.
Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems
NASA Technical Reports Server (NTRS)
Bujorianu, Marius C.; Bujorianu, Manuela L.
2009-01-01
In this paper, we sketch a framework for interdisciplinary modeling of space systems, proposing a holistic view. We consider different system dimensions and their interaction. Specifically, we study the interactions between computation, physics, communication, uncertainty and autonomy. The most comprehensive computational paradigm that supports a holistic perspective on autonomous space systems is that of cyber-physical systems. For these, the state of the art consists of collaborative multi-engineering efforts that call for an adequate formal foundation. To achieve this, we propose leveraging the traditional content of formal modeling through a co-engineering process.
Computers and Writing. Learning Package No. 33.
ERIC Educational Resources Information Center
Simic, Marge, Comp.; Smith, Carl, Ed.
Originally developed as part of a project for the Department of Defense Schools (DoDDS) system, this learning package on computers and writing is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes an overview of the project; a comprehensive search of the ERIC database; a lecture giving an…
ERIC Educational Resources Information Center
Aleman-Centeno, Josefina R.
1983-01-01
Discusses the development and evaluation of CAVIS, which consists of an Apple microcomputer used with audiovisual dialogs. Includes research on the effects of three conditions: (1) computer with audio and visual, (2) computer with audio alone and (3) audio alone in short-term and long-term recall. (EKN)
Neural bases of event knowledge and syntax integration in comprehension of complex sentences.
Malaia, Evie; Newman, Sharlene
2015-01-01
Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.
Comprehensive evaluation of garment assembly line with simulation
NASA Astrophysics Data System (ADS)
Xu, Y.; Thomassey, S.; Chen, Y.; Zeng, X.
2017-10-01
In this paper, a comprehensive evaluation system is established to assess garment production performance. It is based on performance indicators and supported by corresponding results obtained through manual calculation or computer simulation. The assembly lines of a typical men's shirt are taken as the study objects. With the comprehensive evaluation results, garment production scenarios can be better analysed and the most appropriate one selected for actual production. This provides guidance to companies on quick decision-making and multi-objective optimization of garment production.
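A comprehensive score of the kind described above is often a weighted aggregation of normalized indicators. The sketch below assumes illustrative indicator names, values, and weights, not the paper's own metrics.

```python
def comprehensive_score(indicators, weights):
    """Weighted sum of normalized performance indicators (higher is better)."""
    assert set(indicators) == set(weights), "indicators and weights must match"
    total_w = sum(weights.values())
    return sum(indicators[k] * weights[k] for k in indicators) / total_w

# Two hypothetical production-arrangement scenarios, scored on the same
# indicators (all values and weights are made-up examples).
scenario_a = {"line_efficiency": 0.82, "balance_rate": 0.75, "on_time_rate": 0.90}
scenario_b = {"line_efficiency": 0.78, "balance_rate": 0.88, "on_time_rate": 0.85}
weights = {"line_efficiency": 0.5, "balance_rate": 0.3, "on_time_rate": 0.2}

# Pick the scenario with the best comprehensive score.
best = max([("A", scenario_a), ("B", scenario_b)],
           key=lambda s: comprehensive_score(s[1], weights))
```

In practice the indicator values on the right-hand side would come from the simulation runs, and the weights would encode management priorities.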
Microcomputer Interfaces with U.S. Marine Corps Tactical Communications Systems,
1991-04-01
Schneider and Mrs. Kirkpatrick at the Communications Officer School, Quantico, Virginia. This document has been approved for public release and sale; its…communications systems; however, there is not a comprehensive USMC Command, Control, Communications, and Computers (C4) operational policy that documents these…
On Roles of Models in Information Systems
NASA Astrophysics Data System (ADS)
Sølvberg, Arne
The increasing penetration of computers into all aspects of human activity makes it desirable that the interplay among software, data and the domains where computers are applied is made more transparent. An approach to this end is to explicitly relate the modeling concepts of the domains, e.g., natural science, technology and business, to the modeling concepts of software and data. This may make it simpler to build comprehensible integrated models of the interactions between computers and non-computers, e.g., interaction among computers, people, physical processes, biological processes, and administrative processes. This chapter contains an analysis of various facets of the modeling environment for information systems engineering. The lack of satisfactory conceptual modeling tools seems to be central to the unsatisfactory state-of-the-art in establishing information systems. The chapter contains a proposal for defining a concept of information that is relevant to information systems engineering.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-12
... (CASAS) Life Skills Math Assessments--Application of Mathematics (Secondary Level). We are clarifying... Proficiency Test (MAPT) for Math. We are clarifying that the computer-adaptive test (CAT) is an approved...): (1) Comprehensive Adult Student Assessment Systems (CASAS) Employability Competency System (ECS) Math...
NASA Technical Reports Server (NTRS)
1990-01-01
A brief but comprehensive review is given of the technical accomplishments of the NASA Lewis Research Center during the past year. Topics covered include instrumentation and controls technology; internal fluid dynamics; aerospace materials, structures, propulsion, and electronics; space flight systems; cryogenic fluids; Space Station Freedom systems engineering, photovoltaic power module, electrical systems, and operations; and engineering and computational support.
Yeari, Menahem; van den Broek, Paul
2016-09-01
It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.
Planning Educational Small Computer Systems.
ERIC Educational Resources Information Center
Kiser, Chester
This paper defines problems of school officials in dealing with the new information society, and suggests a planning framework applicable to the design of school and district microcomputer systems. Educators' problems in addressing the demands and opportunities of the emergent information society include inadequate comprehension of a major shift…
The database management system: A topic and a tool
NASA Technical Reports Server (NTRS)
Plummer, O. R.
1984-01-01
Data structures and data base management systems are common tools employed to deal with the administrative information of a university. An understanding of these topics is needed by a much wider audience, ranging from those interested in computer aided design and manufacturing to those using microcomputers. These tools are becoming increasingly valuable to academic programs as they develop comprehensive computer support systems. The wide use of these tools relies upon the relational data model as a foundation. Experience with the use of the IPAD RIM5.0 program is described.
Comprehensive analysis of a Radiology Operations Management computer system.
Arenson, R L; London, J W
1979-11-01
The Radiology Operations Management computer system at the Hospital of the University of Pennsylvania is discussed. The scheduling and file room modules are based on the system at Massachusetts General Hospital. Patient delays are indicated by the patient tracking module. A reporting module allows CRT/keyboard entry by transcriptionists, entry of standard reports by radiologists using bar code labels, and entry by radiologists using a specially designed diagnostic reporting terminal. Time-flow analyses demonstrate a significant improvement in scheduling, patient waiting, retrieval of radiographs, and report delivery. Recovery of previously lost billing contributes to the proven cost-effectiveness of this system.
ERIC Educational Resources Information Center
Herron, Sherry; Gandy, Rex; Ye, Ningjun; Syed, Nasser
2012-01-01
A unique aspect of the implementation of a computer algebra system (CAS) at a comprehensive university in the U.S. allowed us to compare the student success and failure rates to the traditional method of teaching college algebra. Due to space limitations, the university offered sections of both CAS and traditional simultaneously and, upon…
The Computer as an Aid to Reading Instruction. Learning Package No. 27.
ERIC Educational Resources Information Center
Simic, Marge, Comp.; Smith, Carl, Ed.
Originally developed for the Department of Defense Schools (DoDDS) system, this learning package on computer use in reading is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes an overview of the project; a comprehensive search of the ERIC database; a lecture giving an overview on the…
Xing, Fuyong; Yang, Lin
2016-01-01
Digital pathology and microscopy image analysis is widely used for comprehensive studies of cell morphology or tissue structure. Manual assessment is labor intensive and prone to interobserver variations. Computer-aided methods, which can significantly improve the objectivity and reproducibility, have attracted a great deal of interest in recent literature. Among the pipeline of building a computer-aided diagnosis system, nucleus or cell detection and segmentation play a very important role to describe the molecular morphological information. In the past few decades, many efforts have been devoted to automated nucleus/cell detection and segmentation. In this review, we provide a comprehensive summary of the recent state-of-the-art nucleus/cell segmentation approaches on different types of microscopy images including bright-field, phase-contrast, differential interference contrast, fluorescence, and electron microscopies. In addition, we discuss the challenges for the current methods and the potential future work of nucleus/cell detection and segmentation.
Mapping the Future Today: The Community College of Baltimore County Geospatial Applications Program
ERIC Educational Resources Information Center
Jeffrey, Scott; Alvarez, Jaime
2010-01-01
The Geospatial Applications Program at the Community College of Baltimore County (CCBC), located five miles west of downtown Baltimore, Maryland, provides comprehensive instruction in geographic information systems (GIS), remote sensing and global positioning systems (GPS). Geospatial techniques, which include computer-based mapping and remote…
MATREX: A Unifying Modeling and Simulation Architecture for Live-Virtual-Constructive Applications
2007-05-23
Deployment, Systems Acquisition, Operations & Support, Sustainment, FRP Decision Review, FOC, LRIP/IOT&E, Critical Design Review, Pre-Systems…CMS2 – Comprehensive Munitions & Sensor Server • CSAT – C4ISR Static Analysis Tool • C4ISR – Command & Control, Communications, Computers
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-22
... further extensions without further notice. Under Contract Number EP10H000097, Computer Science Corporation... EPA procedures, including comprehensive system security plans (SSPs) that are consistent with those...
Office Computer Software: A Comprehensive Review of Software Programs.
ERIC Educational Resources Information Center
Secretary, 1992
1992-01-01
Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)
The Applications of Cone-Beam Computed Tomography in Endodontics: A Review of Literature
Kiarudi, Amir Hosein; Eghbal, Mohammad Jafar; Safi, Yaser; Aghdasi, Mohammad Mehdi; Fazlyab, Mahta
2015-01-01
By producing undistorted three-dimensional images of the area under examination, cone-beam computed tomography (CBCT) systems have overcome many of the limitations of conventional radiography. These systems produce images with a small field of view at low radiation doses with adequate spatial resolution that are suitable for many applications in endodontics, from diagnosis to treatment and follow-up. This review article comprehensively assembles all the data from literature regarding the potential applications of CBCT in endodontics. PMID:25598804
2010-12-01
computers in 1953. HIL motion simulators were also built for the dynamic testing of vehicle components (e.g. suspensions, bodies) with hydraulic or…complex, comprehensive mechanical systems can be simulated in real time by parallel computers; examples include multibody systems, brake systems…hard constraints in a multivariable control framework. And the third aspect is the ability to perform online optimization. These aspects result in
Getting It Going--Keeping It Running: A Handbook for Career Information Systems.
ERIC Educational Resources Information Center
Clyde, John S.
This handbook was designed as a tool to assist school and agency staff in management and in-service training needed in implementing a comprehensive Career Information System (CIS). The background, goals, and organization of CIS are given briefly. An explanation of the System which is computer-based yet also employs a manual needle-sort system is…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sowell, E.
1979-06-01
The Building Loads Analysis and System Thermodynamics (BLAST) program is a comprehensive set of subprograms for predicting energy consumption in buildings. There are three major subprograms: (1) the space load predicting subprogram, which computes hourly space loads in a building or zone based on user input and hourly weather data; (2) the air distribution system simulation subprogram, which uses the computed space load and user inputs describing the building air-handling system to calculate hot water or steam, chilled water, and electric energy demands; and (3) the central plant simulation program, which simulates boilers, chillers, onsite power generating equipment and solar energy systems and computes monthly and annual fuel and electrical power consumption and plant life cycle cost.
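The three-subprogram data flow described above can be illustrated with a deliberately simplified model. The physics here (a single UA coefficient, constant boiler efficiency and chiller COP) and all function names are placeholders for illustration, not BLAST's actual routines or algorithms.

```python
def space_load(outdoor_temp_c, setpoint_c=21.0, ua_kw_per_c=1.2):
    """Stage 1: hourly space load (kW) from a toy UA * deltaT model.

    Positive values mean heating is needed, negative mean cooling.
    """
    return ua_kw_per_c * (setpoint_c - outdoor_temp_c)

def air_system_demand(load_kw, fan_kw=0.5):
    """Stage 2: turn a space load into thermal and electric demands."""
    return {"heating_kw": max(load_kw, 0.0),
            "cooling_kw": max(-load_kw, 0.0),
            "electric_kw": fan_kw}

def central_plant(demands, boiler_eff=0.8, chiller_cop=3.0):
    """Stage 3: fuel and electricity the plant uses to meet the demands."""
    fuel = sum(d["heating_kw"] for d in demands) / boiler_eff
    elec = sum(d["electric_kw"] + d["cooling_kw"] / chiller_cop
               for d in demands)
    return {"fuel_kwh": fuel, "electric_kwh": elec}

weather = [-5.0, 0.0, 10.0, 25.0, 30.0]   # hourly outdoor temperatures (C)
demands = [air_system_demand(space_load(t)) for t in weather]
totals = central_plant(demands)
```

The point of the sketch is the chaining: weather drives loads, loads drive system demands, and demands drive plant fuel and electricity totals, mirroring the subprogram order in the abstract.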
ERIC Educational Resources Information Center
Liu, Pei-Lin; Chen, Chiu-Jung; Chang, Yu-Ju
2010-01-01
The purpose of this research was to investigate the effects of a computer-assisted concept mapping learning strategy on EFL college learners' English reading comprehension. The research questions were: (1) what was the influence of the computer-assisted concept mapping learning strategy on different learners' English reading comprehension? (2) did…
Histological Image Feature Mining Reveals Emergent Diagnostic Properties for Renal Cancer
Kothari, Sonal; Phan, John H.; Young, Andrew N.; Wang, May D.
2016-01-01
Computer-aided histological image classification systems are important for making objective and timely cancer diagnostic decisions. These systems use combinations of image features that quantify a variety of image properties. Because researchers tend to validate their diagnostic systems on specific cancer endpoints, it is difficult to predict which image features will perform well given a new cancer endpoint. In this paper, we define a comprehensive set of common image features (consisting of 12 distinct feature subsets) that quantify a variety of image properties. We use a data-mining approach to determine which feature subsets and image properties emerge as part of an “optimal” diagnostic model when applied to specific cancer endpoints. Our goal is to assess the performance of such comprehensive image feature sets for application to a wide variety of diagnostic problems. We perform this study on 12 endpoints including 6 renal tumor subtype endpoints and 6 renal cancer grade endpoints. Keywords: histology, image mining, computer-aided diagnosis. PMID:28163980
Status Report: Mathematics Curriculum-Development Projects Today
ERIC Educational Resources Information Center
Arithmetic Teacher, 1972
1972-01-01
Brief reports on the Cambridge Conference on School Mathematics, Comprehensive School Mathematics Program, Computer-Assisted Instruction Projects at Stanford, Individually Prescribed Instruction Project, The Madison Project, Mathematics/Science Learning System, MINNEMAST, and School Mathematics Study Group. (MM)
Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I
1995-01-01
Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and have effectively used it for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by real-time processing of the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patients' safety during surgery. The possibility to assess brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.
Computation, Mathematics and Logistics Department Report for Fiscal Year 1978.
1980-03-01
storage technology. A reference library on these and related areas now comprises two thousand documents. The most comprehensive tool available... at DTNSRDC on the CDC 6000 Computer System for a variety of applications including Navy Logistics, Library Science, Ocean Science, Contract Manage... (Library Science) Track technical documents on advanced ship design; Univ. of Virginia at Charlottesville - (Ocean Science) Monitor research projects for...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David; Agarwal, Deborah A.; Sun, Xin
2011-09-01
The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.
Creating an Electronic Reference and Information Database for Computer-aided ECM Design
NASA Astrophysics Data System (ADS)
Nekhoroshev, M. V.; Pronichev, N. D.; Smirnov, G. V.
2018-01-01
The paper presents a review on electrochemical shaping. An algorithm has been developed to implement a computer shaping model applicable to pulse electrochemical machining. For that purpose, the characteristics of pulse current occurring in electrochemical machining of aviation materials have been studied. Based on integrating the experimental results and comprehensive electrochemical machining process data modeling, a subsystem for computer-aided design of electrochemical machining for gas turbine engine blades has been developed; the subsystem was implemented in the Teamcenter PLM system.
DBSecSys: a database of Burkholderia mallei secretion systems.
Memišević, Vesna; Kumar, Kamal; Cheng, Li; Zavaljevski, Nela; DeShazer, David; Wallqvist, Anders; Reifman, Jaques
2014-07-16
Bacterial pathogenicity represents a major public health concern worldwide. Secretion systems are a key component of bacterial pathogenicity, as they provide the means for bacterial proteins to penetrate host-cell membranes and insert themselves directly into the host cells' cytosol. Burkholderia mallei is a Gram-negative bacterium that uses multiple secretion systems during its host infection life cycle. To date, the identities of secretion system proteins for B. mallei are not well known, and their pathogenic mechanisms of action and host factors are largely uncharacterized. We present the Database of Burkholderia mallei Secretion Systems (DBSecSys), a compilation of manually curated and computationally predicted bacterial secretion system proteins and their host factors. Currently, DBSecSys contains comprehensive experimentally and computationally derived information about B. mallei strain ATCC 23344. The database includes 143 B. mallei proteins associated with five secretion systems, their 1,635 human and murine interacting targets, and the corresponding 2,400 host-B. mallei interactions. The database also includes information about 10 pathogenic mechanisms of action for B. mallei secretion system proteins inferred from the available literature. Additionally, DBSecSys provides details about 42 virulence attenuation experiments for 27 B. mallei secretion system proteins. Users interact with DBSecSys through a Web interface that allows for data browsing, querying, visualizing, and downloading. DBSecSys provides a comprehensive, systematically organized resource of experimental and computational data associated with B. mallei secretion systems. It provides the unique ability to study secretion systems not only through characterization of their corresponding pathogen proteins, but also through characterization of their host-interacting partners. The database is available at https://applications.bhsai.org/dbsecsys.
Huang, Shuo; Liu, Jing
2010-05-01
Application of clinical digital medical imaging has raised many tough issues to tackle, such as data storage, management, and information sharing. Here we investigated a mobile phone-based medical image management system capable of personal medical imaging information storage, management, and comprehensive health information analysis. The technologies related to the management system, spanning wireless transmission technology, the phone's technical capabilities in mobile health care, and the management of mobile medical databases, were discussed. Taking medical infrared image transmission between phone and computer as an example, the working principle of the present system was demonstrated.
Information Systems; Modern Health Care and Medical Information.
ERIC Educational Resources Information Center
Brandejs, J. F., And Others
1975-01-01
To effectively handle changes in health policy and health information, new designs and applications of automation are explored. Increased use of computer-based information systems in health care could serve as a means of control over the costs of developing more comprehensive health service, with applications increasing not only the automation of…
Stability of Linear Equations--Algebraic Approach
ERIC Educational Resources Information Center
Cherif, Chokri; Goldstein, Avraham; Prado, Lucio M. G.
2012-01-01
This article could be of interest to teachers of applied mathematics as well as to people who are interested in applications of linear algebra. We give a comprehensive study of linear systems from an application point of view. Specifically, we give an overview of linear systems and problems that can occur with the computed solution when the…
Implementation of Project Based Learning in Mechatronic Lab Course at Bandung State Polytechnic
ERIC Educational Resources Information Center
Basjaruddin, Noor Cholis; Rakhman, Edi
2016-01-01
Mechatronics is a multidisciplinary field that combines mechanics, electronics, control systems, and computer science. The main objective of mechatronics learning is to establish a comprehensive mindset in the development of mechatronic systems. Project Based Learning (PBL) is an appropriate method for use in the learning process of…
ERIC Educational Resources Information Center
Davison, Mark L.; Biancarosa, Gina; Carlson, Sarah E.; Seipel, Ben; Liu, Bowen
2018-01-01
The computer-administered Multiple-Choice Online Causal Comprehension Assessment (MOCCA) for Grades 3 to 5 has an innovative, 40-item multiple-choice structure in which each distractor corresponds to a comprehension process upon which poor comprehenders have been shown to rely. This structure requires revised thinking about measurement issues…
Subtlenoise: sonification of distributed computing operations
NASA Astrophysics Data System (ADS)
Love, P. A.
2015-12-01
The operation of distributed computing systems requires comprehensive monitoring to ensure reliability and robustness. Two components are found in most monitoring systems: visually rich time-series graphs, and notification systems that alert operators when certain pre-defined conditions are met. In this paper the sonification of monitoring messages is explored using an architecture that fits easily within existing infrastructures, based on mature open-source technologies such as ZeroMQ, Logstash, and SuperCollider (a synth engine). Message attributes are mapped onto audio attributes based on a broad classification of the message (continuous or discrete metrics), while keeping the audio stream subtle in nature. The benefits of audio rendering are described in the context of distributed computing operations, where it may provide a less intrusive way to understand the operational health of these systems.
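As a toy illustration of the attribute-mapping idea described above, a continuous metric can be scaled linearly onto a pitch range. The function name, value ranges, and frequency bounds here are assumptions chosen for illustration, not Subtlenoise's actual mapping scheme:

```python
# Hypothetical sketch: map a continuous monitoring metric onto a pitch.
# All names and ranges are illustrative assumptions, not the paper's design.

def metric_to_pitch(value, lo, hi, low_hz=220.0, high_hz=880.0):
    """Linearly map a continuous metric in [lo, hi] to a frequency in Hz,
    clamping out-of-range values to the ends of the pitch range."""
    frac = min(max((value - lo) / (hi - lo), 0.0), 1.0)
    return low_hz + frac * (high_hz - low_hz)

# A queue-depth metric at half its expected range maps to mid-pitch.
print(metric_to_pitch(50, 0, 100))  # -> 550.0
```

A discrete event (e.g. a job failure) would instead trigger a fixed, short tone, matching the abstract's distinction between continuous and discrete metrics.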
Hsu, John; Huang, Jie; Fung, Vicki; Robertson, Nan; Jimison, Holly; Frankel, Richard
2005-01-01
The aim of this study was to evaluate the impact of introducing health information technology (HIT) on physician-patient interactions during outpatient visits. This was a longitudinal pre-post study: two months before and one and seven months after introduction of examination room computers. Patient questionnaires (n = 313) after primary care visits with physicians (n = 8) within an integrated delivery system. There were three patient satisfaction domains: (1) satisfaction with visit components, (2) comprehension of the visit, and (3) perceptions of the physician's use of the computer. Patients reported that physicians used computers in 82.3% of visits. Compared with baseline, overall patient satisfaction with visits increased seven months after the introduction of computers (odds ratio [OR] = 1.50; 95% confidence interval [CI]: 1.01-2.22), as did satisfaction with physicians' familiarity with patients (OR = 1.60, 95% CI: 1.01-2.52), communication about medical issues (OR = 1.61; 95% CI: 1.05-2.47), and comprehension of decisions made during the visit (OR = 1.63; 95% CI: 1.06-2.50). In contrast, there were no significant changes in patient satisfaction with comprehension of self-care responsibilities, communication about psychosocial issues, or available visit time. Seven months post-introduction, patients were more likely to report that the computer helped the visit run in a more timely manner (OR = 1.76; 95% CI: 1.28-2.42) compared with the first month after introduction. There were no other significant changes in patient perceptions of the computer use over time. The examination room computers appeared to have positive effects on physician-patient interactions related to medical communication without significant negative effects on other areas such as time available for patient concerns. Further study is needed to better understand HIT use during outpatient visits.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul
Efficient resource allocation is a fundamental requirement in high performance computing (HPC) systems. Many projects dedicated to large-scale distributed computing systems have designed and developed resource allocation mechanisms with a variety of architectures and services. In our study, a comprehensive survey describing resource allocation in various HPC systems is reported. The aim of the work is to aggregate, under a joint framework, the existing solutions for HPC and to provide a thorough analysis and characterization of the resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in the performance improvement of all HPC classifications. Therefore, a comprehensive discussion of widely used resource allocation strategies deployed in HPC environments is required, which is one of the motivations of this survey. Moreover, we have classified HPC systems into three broad categories, namely: (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are cataloged into pure software and hybrid/hardware solutions. The system classification is used to identify the approaches followed by the implementation of existing resource allocation strategies that are widely presented in the literature.
Developments in REDES: The rocket engine design expert system
NASA Technical Reports Server (NTRS)
Davidian, Kenneth O.
1990-01-01
The Rocket Engine Design Expert System (REDES) is being developed at the NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes were included in REDES: a gas properties program named GASP, a nozzle design program named RAO, a regenerative cooling channel performance evaluation code named RTE, and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES is built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.
A simulation model for wind energy storage systems. Volume 1: Technical report
NASA Technical Reports Server (NTRS)
Warren, A. W.; Edsinger, R. W.; Chan, Y. K.
1977-01-01
A comprehensive computer program for the modeling of wind energy and storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel, and pneumatic) was developed. The level of detail of the Simulation Model for Wind Energy Storage (SIMWEST) is consistent with its role of evaluating the economic feasibility as well as the general performance of wind energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. The first program is a precompiler which generates computer models (in FORTRAN) of complex wind source storage application systems from user specifications, using the respective library components. The second program provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for conveyance of variables. The SIMWEST program runs on the UNIVAC 1100 series computers.
Appendix Y. The Integrated Communications Experiment (ICE) Summary.
ERIC Educational Resources Information Center
Coffin, Robert
This appendix describes the Integrated Communications Experiment (ICE), a comprehensive computer software capability developed for the ComField Project. Each major characteristic of the data processing system is treated separately: natural language processing, flexibility, noninterference with the educational process, multipurposeness,…
A CS1 pedagogical approach to parallel thinking
NASA Astrophysics Data System (ADS)
Rague, Brian William
Almost all collegiate programs in Computer Science offer an introductory course in programming primarily devoted to communicating the foundational principles of software design and development. The ACM designates this introduction to computer programming course for first-year students as CS1, during which methodologies for solving problems within a discrete computational context are presented. Logical thinking is highlighted, guided primarily by a sequential approach to algorithm development and made manifest by typically using the latest, commercially successful programming language. In response to the most recent developments in accessible multicore computers, instructors of these introductory classes may wish to include training on how to design workable parallel code. Novel issues arise when programming concurrent applications which can make teaching these concepts to beginning programmers a seemingly formidable task. Student comprehension of design strategies related to parallel systems should be monitored to ensure an effective classroom experience. This research investigated the feasibility of integrating parallel computing concepts into the first-year CS classroom. To quantitatively assess student comprehension of parallel computing, an experimental educational study using a two-factor mixed group design was conducted to evaluate two instructional interventions in addition to a control group: (1) topic lecture only, and (2) topic lecture with laboratory work using a software visualization Parallel Analysis Tool (PAT) specifically designed for this project. A new evaluation instrument developed for this study, the Perceptions of Parallelism Survey (PoPS), was used to measure student learning regarding parallel systems. 
The results from this educational study show a statistically significant main effect among the repeated measures, implying that student comprehension levels of parallel concepts as measured by the PoPS improve immediately after the delivery of any initial three-week CS1 level module when compared with student comprehension levels just prior to starting the course. Survey results measured during the ninth week of the course reveal that performance levels remained high compared to pre-course performance scores. A second result produced by this study reveals no statistically significant interaction effect between the intervention method and student performance as measured by the evaluation instrument over three separate testing periods. However, visual inspection of survey score trends and the low p-value generated by the interaction analysis (0.062) indicate that further studies may verify improved concept retention levels for the lecture w/PAT group.
ERIC Educational Resources Information Center
Swan, Karen; And Others
The Computer Pilot Program of the Division of Computer Information Services of the New York City Board of Education was designed to investigate the claim that comprehensive computer-based instruction (CBI) might best be used to improve the basic skills of educationally disadvantaged students. This ongoing project is designed to identify…
Numerical propulsion system simulation
NASA Technical Reports Server (NTRS)
Lytle, John K.; Remaklus, David A.; Nichols, Lester D.
1990-01-01
The cost of implementing new technology in aerospace propulsion systems is becoming prohibitively expensive. One of the major contributors to the high cost is the need to perform many large-scale system tests. Extensive testing is used to capture the complex interactions among the multiple disciplines and the multiple components inherent in complex systems. The objective of the Numerical Propulsion System Simulation (NPSS) is to provide insight into these complex interactions through computational simulations. This will allow for comprehensive evaluation of new concepts early in the design phase, before a commitment to hardware is made. It will also allow for rapid assessment of field-related problems, particularly in cases where operational problems were encountered during conditions that would be difficult to simulate experimentally. The tremendous progress taking place in computational engineering and the rapid increase in computing power expected through parallel processing make this concept feasible within the near future. However, it is critical that the framework for such simulations be put in place now to serve as a focal point for the continued developments in computational engineering and computing hardware and software. The NPSS concept described here will provide that framework.
Xing, Fuyong; Yang, Lin
2016-01-01
Digital pathology and microscopy image analysis is widely used for comprehensive studies of cell morphology or tissue structure. Manual assessment is labor intensive and prone to inter-observer variations. Computer-aided methods, which can significantly improve objectivity and reproducibility, have attracted a great deal of interest in the recent literature. Within the pipeline of building a computer-aided diagnosis system, nucleus or cell detection and segmentation play a very important role in describing the molecular morphological information. In the past few decades, many efforts have been devoted to automated nucleus/cell detection and segmentation. In this review, we provide a comprehensive summary of the recent state-of-the-art nucleus/cell segmentation approaches on different types of microscopy images including bright-field, phase-contrast, differential interference contrast (DIC), fluorescence, and electron microscopies. In addition, we discuss the challenges for the current methods and the potential future work of nucleus/cell detection and segmentation.
PMID: 26742143
Technical Advances and Fifth Grade Reading Comprehension: Do Students Benefit?
ERIC Educational Resources Information Center
Fountaine, Drew
This paper takes a look at some recent studies on utilization of technical tools, primarily personal computers and software, for improving fifth-grade students' reading comprehension. Specifically, the paper asks what benefits an educator can expect students to derive from closed-captioning and computer-assisted reading comprehension products. It…
ERIC Educational Resources Information Center
Education Commission of the States, Denver, CO.
This paper provides an overview of Accelerated Reader, a system of computerized testing and record-keeping that supplements the regular classroom reading program. Accelerated Reader's primary goal is to increase literature-based reading practice. The program offers a computer-aided reading comprehension and management program intended to motivate…
DOT National Transportation Integrated Search
2008-09-01
FSUTMS training is a major activity of the Systems Planning Office of the Florida Department of : Transportation (FDOT). The training aims to establish and maintain quality assurance for consistent : statewide modeling standards and provide up-to-dat...
A demonstrative model of a lunar base simulation on a personal computer
NASA Technical Reports Server (NTRS)
1985-01-01
The initial demonstration model of a lunar base simulation is described. This initial model was developed at the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. Lotus Symphony Version 1.1 software was used to base the demonstration model on a personal computer with an MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned in the near future.
Four PPPPerspectives on computational creativity in theory and in practice
NASA Astrophysics Data System (ADS)
Jordanous, Anna
2016-04-01
Computational creativity is the modelling, simulating or replicating of creativity computationally. In examining and learning from these "creative systems", from what perspective should the creativity of a system be considered? Are we interested in the creativity of the system's output? Or of its creative processes? Features of the system? Or how it operates within its environment? Traditionally computational creativity has focused more on creative systems' products or processes, though this focus has widened recently. Creativity research offers the Four Ps of creativity: Person/Producer, Product, Process and Press/Environment. This paper presents the Four Ps, explaining each in the context of creativity research and how it relates to computational creativity. To illustrate the usefulness of the Four Ps in taking broader perspectives on creativity in its computational treatment, the concepts of novelty and value are explored using the Four Ps, highlighting aspects of novelty and value that may otherwise be overlooked. Analysis of recent research in computational creativity finds that although each of the Four Ps appears in the body of computational creativity work, individual pieces of work often do not acknowledge all Four Ps, missing opportunities to widen their work's relevance. We can see, though, that high-status computational creativity papers do typically address all Four Ps. This paper argues that the broader views of creativity afforded by the Four Ps are vital in guiding us towards more comprehensively useful computational investigations of creativity.
SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects
NASA Technical Reports Server (NTRS)
Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M
1998-01-01
SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of the existing soft-computing software by supporting comprehensive multidisciplinary functionalities from management tools to engineering systems. Furthermore, the built-in features help the user process/analyze information more efficiently by a friendly yet powerful interface, and will allow the user to specify user-specific processing modules, hence adding to the standard configuration of the software environment.
Decision support methods for the detection of adverse events in post-marketing data.
Hauben, M; Bate, A
2009-04-01
Spontaneous reporting is a crucial component of post-marketing drug safety surveillance despite its significant limitations. The size and complexity of some spontaneous reporting system databases represent a challenge for drug safety professionals, who traditionally have relied heavily on the scientific and clinical acumen of the prepared mind. Computer algorithms that calculate statistical measures of reporting frequency for huge numbers of drug-event combinations are increasingly used to support pharmacovigilance analysts in screening large spontaneous reporting system databases. After an overview of pharmacovigilance and spontaneous reporting systems, we discuss the theory and application of contemporary computer algorithms in regular use, those under development, and the practical considerations involved in the implementation of computer algorithms within a comprehensive and holistic drug safety signal detection program.
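The abstract does not name a specific algorithm, but one widely used statistical measure of reporting frequency in spontaneous reporting system screening is the proportional reporting ratio (PRR). A minimal sketch with made-up counts, not data from the paper:

```python
# Proportional reporting ratio (PRR) for one drug-event combination,
# computed from a 2x2 contingency table of spontaneous reports.
# The counts below are invented for illustration.

def prr(a, b, c, d):
    """a: reports with the drug and the event
       b: reports with the drug, without the event
       c: reports without the drug, with the event
       d: reports without the drug and without the event"""
    rate_drug = a / (a + b)       # event rate among reports for the drug
    rate_other = c / (c + d)      # event rate among all other reports
    return rate_drug / rate_other

# Toy example: 20 of 1,000 reports for the drug mention the event,
# versus 200 of 100,000 reports for all other drugs.
print(round(prr(20, 980, 200, 99800), 2))  # -> 10.0
```

A PRR well above 1 flags a drug-event pair as disproportionately reported; in practice such screens also apply count thresholds and statistical shrinkage before a signal is pursued.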
Addressing Failures in Exascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snir, Marc; Wisniewski, Robert; Abraham, Jacob
2014-01-01
We present here a report produced by a workshop on 'Addressing failures in exascale computing' held in Park City, Utah, 4-11 August 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system; discuss existing knowledge on resilience across the various hardware and software layers of an exascale system; and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia, and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.
Examining the Effect of Computer-Based Passage Presentation on Reading Test Performance
ERIC Educational Resources Information Center
Higgins, Jennifer; Russell, Michael; Hoffmann, Thomas
2005-01-01
To examine the impact of transitioning 4th grade reading comprehension assessments to the computer, 219 fourth graders were randomly assigned to take a one-hour reading comprehension assessment on paper, on a computer using scrolling text to navigate through passages, or on a computer using paging text to navigate through passages. This study…
Computing a Comprehensible Model for Spam Filtering
NASA Astrophysics Data System (ADS)
Ruiz-Sepúlveda, Amparo; Triviño-Rodriguez, José L.; Morales-Bueno, Rafael
In this paper, we describe the application of the Decision Tree Boosting (DTB) learning model to spam email filtering. This classification task implies learning in a high-dimensional feature space, so it is an example of how the DTB algorithm performs in such feature space problems. In [1], it has been shown that hypotheses computed by the DTB model are more comprehensible than the ones computed by other ensemble methods. Hence, this paper tries to show that the DTB algorithm maintains the same comprehensibility of hypotheses in high-dimensional feature space problems while achieving the performance of other ensemble methods. Four traditional evaluation measures (precision, recall, F1 and accuracy) have been considered for performance comparison between DTB and other models usually applied to spam email filtering. The hypothesis computed by DTB is smaller and more comprehensible than the hypotheses computed by AdaBoost and Naïve Bayes.
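The four evaluation measures named above follow directly from confusion-matrix counts for the spam class. A minimal sketch with invented counts (not results from the paper):

```python
# Precision, recall, F1 and accuracy from confusion-matrix counts,
# treating "spam" as the positive class. Counts are invented.

def evaluate(tp, fp, fn, tn):
    precision = tp / (tp + fp)                    # caught spam / all flagged
    recall = tp / (tp + fn)                       # caught spam / all spam
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + fp + fn + tn)    # all correct / all messages
    return precision, recall, f1, accuracy

# Toy counts: 90 spam caught, 10 ham misflagged, 30 spam missed, 870 ham passed.
p, r, f1, acc = evaluate(90, 10, 30, 870)
print(round(p, 2), round(r, 2), round(f1, 2), round(acc, 2))  # -> 0.9 0.75 0.82 0.96
```

Note how accuracy alone can look strong on an imbalanced corpus while recall reveals that a quarter of the spam slipped through, which is why all four measures are reported together.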
NASA Astrophysics Data System (ADS)
Homem-de-Mello, Luiz S.
1992-04-01
While NASA's earlier space missions such as Voyager carried sensors numbering in the hundreds, future platforms such as Space Station Freedom will have tens of thousands of sensors. For these planned missions it will be impossible to use the comprehensive monitoring strategy of the past, in which human operators monitored all sensors all the time. A selective monitoring strategy must be substituted for the current comprehensive strategy. This selective monitoring strategy uses computer tools to preprocess the incoming data and direct the operators' attention to the most critical parts of the physical system at any given time. There are several techniques that can be used to preprocess the incoming information. This paper presents an approach that uses diagnostic reasoning techniques to preprocess the sensor data and detect which parts of the physical system require more attention because components have failed or are most likely to have failed. Given the sensor readings and a model of the physical system, a number of assertions are generated and expressed as Boolean equations. The resulting system of Boolean equations is solved symbolically. Using a priori probabilities of component failure and Bayes' rule, revised probabilities of failure can be computed; these indicate which components have failed or are the most likely to have failed. This approach is suitable for systems that are well understood and for which the correctness of the assertions can be guaranteed. The system must also be such that assertions can be made from instantaneous measurements, and such that changes are slow enough to allow the computation.
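The Bayesian revision step described above can be sketched for a single component. The function and all numbers here are illustrative assumptions, not values or code from the paper:

```python
# Illustrative sketch of revising a component's a priori failure
# probability with Bayes' rule after a sensor assertion is violated:
#   P(failed | evidence) = P(evidence | failed) * P(failed) / P(evidence)
# All probabilities below are made up for the example.

def posterior_failure(prior, p_ev_given_failed, p_ev_given_ok):
    """Posterior failure probability given evidence likelihoods."""
    p_ev = p_ev_given_failed * prior + p_ev_given_ok * (1 - prior)
    return p_ev_given_failed * prior / p_ev

# A component with a 1% prior failure rate; the violated assertion is seen
# in 90% of failures but only 5% of the time otherwise.
print(round(posterior_failure(0.01, 0.9, 0.05), 3))  # -> 0.154
```

Even with a weakly diagnostic sensor reading, the posterior jumps an order of magnitude above the prior, which is exactly the signal used to rank components for operator attention.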
Comprehensive Solar-Terrestrial Environment Model (COSTEM) for Space Weather Predictions
2007-07-01
research in data assimilation methodologies applicable to the space environment, as well as "threat adaptive" grid computing technologies. The Space Weather Modeling Framework (SWMF) [29, 43] was designed in 2001 and has been developed to integrate and couple several model components. The SWMF is tested by system tests of its components, run nightly on several computer/compiler platforms, and the results are documented. The main design goals of the SWMF were to minimize...
Changes in travel in the Shirley Highway corridor, 1983-1986.
DOT National Transportation Integrated Search
1987-01-01
On June 5, 1985, a comprehensive, computer-controlled traffic management system (TMS) was implemented on a section of I-95 and I-395 in Northern Virginia. The roadway is a major commuter route into the District of Columbia. A before-and-after evaluat...
ERIC Educational Resources Information Center
Yohe, Paula
1997-01-01
"The Electronic Bookshelf," a computer-based reading motivation/management system, is described. The program verifies reading comprehension skills, keeps student records, provides a master list of titles for students to choose from, provides feedback and the opportunity to re-quiz, allows teachers and/or students to create quizzes, and includes a…
Development of Listening Proficiency in Russian.
ERIC Educational Resources Information Center
Robin, Richard M.; Leaver, Betty Lou
1989-01-01
Describes the Listening Comprehension Exercise Network, a system that allows for the sharing of listening exercises in Russian via computer networks. The network, which could be emulated in other languages, alleviates the problem of time spent on developing essentially "throw-away" exercises. (21 references) (Author/CB)
System Proposal for Mass Transit Service Quality Control Based on GPS Data
Padrón, Gabino; Cristóbal, Teresa; Alayón, Francisco; Quesada-Arencibia, Alexis; García, Carmelo R.
2017-01-01
Quality is an essential aspect of public transport. In the case of regular public passenger transport by road, punctuality and regularity are criteria used to assess quality of service. Calculating metrics related to these criteria continuously over time and comprehensively across the entire transport network requires the handling of large amounts of data. This article describes a system for continuously and comprehensively monitoring punctuality and regularity. The system uses location data acquired continuously in the vehicles and automatically transferred for analysis. These data are processed intelligently by elements that are commonly used by transport operators: GPS-based tracking system, onboard computer and wireless networks for mobile data communications. The system was tested on a transport company, for which we measured the punctuality of one of the routes that it operates; the results are presented in this article. PMID:28621745
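A common punctuality metric of the kind the system computes is the share of stop arrivals that fall within a tolerance of the timetable. A minimal sketch under that assumption (the five-minute threshold and the timestamps are illustrative, not the article's definition or data):

```python
from datetime import datetime

def punctuality_rate(scheduled, actual, tolerance_s=300):
    """Fraction of arrivals within +/- tolerance_s of the scheduled time."""
    on_time = 0
    for s, a in zip(scheduled, actual):
        if abs((a - s).total_seconds()) <= tolerance_s:
            on_time += 1
    return on_time / len(scheduled)

# Two stops: one arrival 2 minutes late (on time), one 12 minutes late.
scheduled = [datetime(2017, 6, 1, 8, 0), datetime(2017, 6, 1, 8, 15)]
actual    = [datetime(2017, 6, 1, 8, 2), datetime(2017, 6, 1, 8, 27)]
print(punctuality_rate(scheduled, actual))  # → 0.5
```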
ERIC Educational Resources Information Center
Marcovitz, Alan B., Ed.
The method of phase-plane presentation as an educational tool in the study of the dynamic behavior of systems is discussed. In the treatment of nonlinear or piecewise-linear systems, the phase-plane portrait is used to exhibit the nature of singular points, regions of stability, and switching lines to aid comprehension. A technique is described by…
Stereoscopic medical imaging collaboration system
NASA Astrophysics Data System (ADS)
Okuyama, Fumio; Hirano, Takenori; Nakabayasi, Yuusuke; Minoura, Hirohito; Tsuruoka, Shinji
2007-02-01
The computerization of clinical records and the adoption of multimedia have improved medical services in medical facilities. It is very important for patients to obtain comprehensible informed consent; therefore, the doctor should plainly explain the purpose and content of diagnoses and treatments to the patient. We propose and design a Telemedicine Imaging Collaboration System which presents three-dimensional medical images such as X-ray CT and MRI as stereoscopic images, using a virtual common information space and allowing the images to be operated from a remote location. This system is composed of two personal computers, two 15-inch stereoscopic parallax-barrier-type LCD displays (LL-151D, Sharp), one 1 Gbps router and 1000BASE LAN cables. The software is composed of a DICOM-format data transfer program, an image operation program, a communication program between the two personal computers and a real-time rendering program. Two identical images of 512×768 pixels are displayed on the two stereoscopic LCD displays, and both images can be expanded and reduced by mouse operation. This system can offer a comprehensible three-dimensional image of the diseased part; therefore, the doctor and the patient can easily understand it, depending on their needs.
High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations
NASA Technical Reports Server (NTRS)
Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.
2003-01-01
Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.
Gomes, Carlos Augusto; Sartelli, Massimo; Di Saverio, Salomone; Ansaloni, Luca; Catena, Fausto; Coccolini, Federico; Inaba, Kenji; Demetriades, Demetrios; Gomes, Felipe Couto; Gomes, Camila Couto
2015-01-01
Advances in technology and improved access to imaging modalities such as computed tomography and laparoscopy have changed the contemporary diagnosis and management of acute appendicitis. Complicated appendicitis (phlegmon, abscess and/or diffuse peritonitis) is now reliably distinguished from uncomplicated cases; therefore, a new comprehensive grading system for acute appendicitis is necessary. The goal is to review and update the laparoscopic grading system of acute appendicitis and to provide a new standardized classification system to allow more uniform patient stratification. During the last World Society of Emergency Surgery Congress in Israel (July 2015), a panel of acute appendicitis experts and the authors discussed many current aspects of acute appendicitis, and a new comprehensive disease grading system was proposed. It was conceived based on three aspects of the disease (clinical presentation, imaging presentation and laparoscopic findings). The new grading system may provide a standardized way to allow more uniform patient stratification for appendicitis research. In addition, it may aid in determining optimal management according to grade. Lastly, we aim to conduct a multicenter observational study within the World Society of Emergency Surgery (WSES) based on this design.
Stern, Pnina; Shalev, Lilach
2013-01-01
Difficulties in reading comprehension are common in children and adolescents with Attention Deficit/Hyperactivity Disorder (ADHD). The current study aimed at investigating the relation between sustained attention and reading comprehension among adolescents with and without ADHD. Another goal was to examine the impact of two manipulations of the text on the efficiency of reading comprehension: Spacing (standard- vs. double-spacing) and Type of presentation (computer screen vs. hard copy). Reading comprehension of two groups of adolescents (participants with ADHD and normal controls) was assessed and compared in four different conditions (standard printed, spaced printed, standard on computer screen, spaced on computer screen). In addition, participants completed a visual sustained attention task. Significant differences in reading comprehension and in sustained attention were obtained between the two groups. Also, a significant correlation was obtained between sustained attention and reading comprehension. Moreover, a significant interaction was revealed between presentation-type, spacing and level of sustained attention on reading comprehension. Implications for reading intervention and the importance of early assessment of attention functioning are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
[The computer assisted pacemaker clinic at the regional hospital of Udine (author's transl)].
Feruglio, G A; Lestuzzi, L; Carminati, D
1978-01-01
For a close follow-up of large groups of pacemaker patients and for evaluation of long-term pacing on a reliable statistical basis, many pacemaker centers in the world are now using computer systems. A patient data system with structured display records, designed to give complete, comprehensive and surveyable information that is immediately retrievable 24 hours a day, on display or in printed form, seems to offer an ideal solution. The pacemaker clinic at the Regional Hospital of Udine has adopted this type of system. The clinic is linked to a live, on-line patient data system (G/3, Informatica Friuli-Venezia Giulia). The input and retrieval of information are made through a conventional keyboard. The input formats have fixed headings with coded alternatives and a limited space for comments in free text. The computer edits the coded information into surveyable reviews. Searches can be made on coded information and data of interest.
Synthesized Speech Output and Children: A Scoping Review
ERIC Educational Resources Information Center
Drager, Kathryn D. R.; Reichle, Joe; Pinkoski, Carrie
2010-01-01
Purpose: Many computer-based augmentative and alternative communication systems in use by children have speech output. This article (a) provides a scoping review of the literature addressing the intelligibility and listener comprehension of synthesized speech output with children and (b) discusses future research directions. Method: Studies…
China's Chemical Information Online Service: ChI2Net.
ERIC Educational Resources Information Center
Naiyan, Yu; And Others
1997-01-01
Describes the Chemical Integrated Information Service Network (ChI2Net), a comprehensive online information service system which includes chemical, technical, economic, market, news, and management information based on computer and modern communication technology that was built by the China National Chemical Information Centre. (Author/LRW)
MATREX Leads the Way in Implementing New DOD VV&A Documentation Standards
2007-05-24
Acquisition, Operations & Support, Sustainment ... FRP Decision Review ... FOC ... LRIP/IOT&E ... Critical Design Review ... Pre-Systems Acquisition Concept ... Communications Human Performance Model • C3GRID – Command & Control, Computer GRID • CES – Communications Effects Server • CMS2 – Comprehensive
Interactive Video-Based Industrial Training in Basic Electronics.
ERIC Educational Resources Information Center
Mirkin, Barry
The Wisconsin Foundation for Vocational, Technical, and Adult Education is currently involved in the development, implementation, and distribution of a sophisticated interactive computer and video learning system. Designed to offer trainees an open entry and open exit opportunity to pace themselves through a comprehensive competency-based,…
Li, Xiaoqing; Zhao, Haiyan; Lu, Yong
2014-01-01
Sentence comprehension involves timely computing different types of relations between its verbs and noun arguments, such as morphosyntactic, semantic, and thematic relations. Here, we used EEG technique to investigate the potential differences in thematic role computing and lexical-semantic relatedness processing during on-line sentence comprehension, and the interaction between these two types of processes. Mandarin Chinese sentences were used as materials. The basic structure of those sentences is “Noun+Verb+‘le’+a two-character word”, with the Noun being the initial argument. The verb disambiguates the initial argument as an agent or a patient. Meanwhile, the initial argument and the verb are highly or lowly semantically related. The ERPs at the verbs revealed that: relative to the agent condition, the patient condition evoked a larger N400 only when the argument and verb were lowly semantically related; however, relative to the high-relatedness condition, the low-relatedness condition elicited a larger N400 regardless of the thematic relation; although both thematic role variation and semantic relatedness variation elicited N400 effects, the N400 effect elicited by the former was broadly distributed and reached maximum over the frontal electrodes, and the N400 effect elicited by the latter had a posterior distribution. In addition, the brain oscillations results showed that, although thematic role variation (patient vs. agent) induced power decreases around the beta frequency band (15–30 Hz), semantic relatedness variation (low-relatedness vs. high-relatedness) induced power increases in the theta frequency band (4–7 Hz). These results suggested that, in the sentence context, thematic role computing is modulated by the semantic relatedness between the verb and its argument; semantic relatedness processing, however, is in some degree independent from the thematic relations. 
Moreover, our results indicated that, during on-line sentence comprehension, thematic role computing and semantic relatedness processing are mediated by distinct neural systems. PMID:24755643
Documentation for the machine-readable version of photometric data for nearby stars
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1982-01-01
A computer list of all photometric systems (of those considered) in which each star was measured is provided. The file is a subset of a much larger and more comprehensive compilation, which lists all measured photoelectric photometric systems for any star that has been measured in at least one photoelectric system. In addition to the photometric system identifications, cross identifications to the Henry Draper and Durchmusterung catalogs and apparent visual magnitudes are included.
1983-12-01
while at the same time improving its operational efficiency. Through their integration and use, System Program Managers have a comprehensive analytical... systems. The NRLA program is hosted on the CREATE Operating System and contains approximately 5500 lines of computer code. It consists of a main...associated with alternative maintenance plans. As the technological complexity of weapons systems has increased, new and innovative logistical support
Hsu, John; Huang, Jie; Fung, Vicki; Robertson, Nan; Jimison, Holly; Frankel, Richard
2005-01-01
Objective: The aim of this study was to evaluate the impact of introducing health information technology (HIT) on physician-patient interactions during outpatient visits. Design: This was a longitudinal pre-post study: two months before and one and seven months after introduction of examination room computers. Patient questionnaires (n = 313) after primary care visits with physicians (n = 8) within an integrated delivery system. There were three patient satisfaction domains: (1) satisfaction with visit components, (2) comprehension of the visit, and (3) perceptions of the physician's use of the computer. Results: Patients reported that physicians used computers in 82.3% of visits. Compared with baseline, overall patient satisfaction with visits increased seven months after the introduction of computers (odds ratio [OR] = 1.50; 95% confidence interval [CI]: 1.01–2.22), as did satisfaction with physicians' familiarity with patients (OR = 1.60, 95% CI: 1.01–2.52), communication about medical issues (OR = 1.61; 95% CI: 1.05–2.47), and comprehension of decisions made during the visit (OR = 1.63; 95% CI: 1.06–2.50). In contrast, there were no significant changes in patient satisfaction with comprehension of self-care responsibilities, communication about psychosocial issues, or available visit time. Seven months post-introduction, patients were more likely to report that the computer helped the visit run in a more timely manner (OR = 1.76; 95% CI: 1.28–2.42) compared with the first month after introduction. There were no other significant changes in patient perceptions of the computer use over time. Conclusion: The examination room computers appeared to have positive effects on physician-patient interactions related to medical communication without significant negative effects on other areas such as time available for patient concerns. Further study is needed to better understand HIT use during outpatient visits. PMID:15802484
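Odds ratios of the kind reported above, with their 95% confidence intervals, follow from a 2×2 table via the standard log-odds normal approximation. A minimal sketch (the cell counts are invented, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
    a, b = outcome yes/no in the exposed group,
    c, d = outcome yes/no in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 120/80 satisfied/unsatisfied after, 90/100 before.
print(odds_ratio_ci(120, 80, 90, 100))
```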
Liu, Xunying; Zhang, Chao; Woodland, Phil; Fonteneau, Elisabeth
2017-01-01
There is widespread interest in the relationship between the neurobiological systems supporting human cognition and emerging computational systems capable of emulating these capacities. Human speech comprehension, poorly understood as a neurobiological process, is an important case in point. Automatic Speech Recognition (ASR) systems with near-human levels of performance are now available, which provide a computationally explicit solution for the recognition of words in continuous speech. This research aims to bridge the gap between speech recognition processes in humans and machines, using novel multivariate techniques to compare incremental ‘machine states’, generated as the ASR analysis progresses over time, to the incremental ‘brain states’, measured using combined electro- and magneto-encephalography (EMEG), generated as the same inputs are heard by human listeners. This direct comparison of dynamic human and machine internal states, as they respond to the same incrementally delivered sensory input, revealed a significant correspondence between neural response patterns in human superior temporal cortex and the structural properties of ASR-derived phonetic models. Spatially coherent patches in human temporal cortex responded selectively to individual phonetic features defined on the basis of machine-extracted regularities in the speech to lexicon mapping process. These results demonstrate the feasibility of relating human and ASR solutions to the problem of speech recognition, and suggest the potential for further studies relating complex neural computations in human speech comprehension to the rapidly evolving ASR systems that address the same problem domain. PMID:28945744
Lee, Shu-Ping; Su, Hui-Kai; Lee, Shin-Da
2012-06-01
This study investigated the effects of immediate feedback on computer-based foreign language listening comprehension tests and on intrapersonal test-associated anxiety in 72 English major college students at a Taiwanese University. Foreign language listening comprehension of computer-based tests designed by MOODLE, a dynamic e-learning environment, with or without immediate feedback together with the state-trait anxiety inventory (STAI) were tested and repeated after one week. The analysis indicated that immediate feedback during testing caused significantly higher anxiety and resulted in significantly higher listening scores than in the control group, which had no feedback. However, repeated feedback did not affect the test anxiety and listening scores. Computer-based immediate feedback did not lower debilitating effects of anxiety but enhanced students' intrapersonal eustress-like anxiety and probably improved their attention during listening tests. Computer-based tests with immediate feedback might help foreign language learners to increase attention in foreign language listening comprehension.
ERIC Educational Resources Information Center
Srivastava, Pradyumn; Gray, Shelley
2012-01-01
Purpose: With the global expansion of technology, our reading platform has shifted from traditional text to hypertext, yet little consideration has been given to how this shift might help or hinder students' reading comprehension. The purpose of this study was to compare reading comprehension of computer-based and paper-based texts in adolescents…
The Effect of Computer-Assisted Language Learning on Reading Comprehension in an Iranian EFL Context
ERIC Educational Resources Information Center
Saeidi, Mahnaz; Yusefi, Mahsa
2012-01-01
This study is an attempt to examine the effect of computer-assisted language learning (CALL) on reading comprehension in an Iranian English as a foreign language (EFL) context. It was hypothesized that CALL has an effect on reading comprehension. Forty female learners of English at intermediate level after administering a proficiency test were…
The NJOY Nuclear Data Processing System, Version 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macfarlane, Robert; Muir, Douglas W.; Boicourt, R. M.
The NJOY Nuclear Data Processing System, version 2016, is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from evaluated nuclear data in the ENDF-4 through ENDF-6 legacy card-image formats. NJOY works with evaluated files for incident neutrons, photons, and charged particles, producing libraries for a wide variety of particle transport and reactor analysis codes.
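Multigroup cross sections of the kind NJOY produces are flux-weighted averages of the pointwise data over energy groups. A toy numerical sketch of that collapse on a uniform energy grid (the cross-section shape, flux and group boundaries are invented for illustration):

```python
import numpy as np

def collapse_to_groups(energy, sigma, flux, group_edges):
    """Flux-weighted group cross sections on a uniform grid:
    sigma_g = sum(sigma * flux) / sum(flux) over each group
    (the constant bin width dE cancels)."""
    out = []
    for lo, hi in zip(group_edges[:-1], group_edges[1:]):
        mask = (energy >= lo) & (energy < hi)
        out.append(np.sum(sigma[mask] * flux[mask]) / np.sum(flux[mask]))
    return np.array(out)

energy = np.linspace(1.0, 3.0, 200)   # eV, toy uniform grid
sigma = 10.0 / np.sqrt(energy)        # 1/v-like cross section
flux = np.ones_like(energy)           # flat weighting flux
print(collapse_to_groups(energy, sigma, flux, [1.0, 2.0, 3.0]))
```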
Three-Dimensional Visualization of Ozone Process Data.
1997-06-18
Scattered Multivariate Data. IEEE Computer Graphics & Applications, 11 (May), 47-55. Odman, M.T. and Ingram, C.L. (1996) Multiscale Air Quality Simulation...the Multiscale Air Quality Simulation Platform (MAQSIP) modeling system. MAQSIP is a modular comprehensive air quality modeling system which MCNC...photolyzed back again to nitric oxide. Finally, oxides of nitrogen are terminated through loss or combination into nitric acid, organic nitrates
Five-Year Technology Plan. Educational Planning Center.
ERIC Educational Resources Information Center
Virginia Beach City Public Schools, VA.
This 5-year plan is based on "A Comprehensive Plan for Instructional and Administrative Use of Computers in the Virginia Beach Public Schools" ("The Pogrow Report," completed in December 1984) and the "Task Force Report on Information Systems" (completed in July 1988). These reports document the importance of making…
The United States Air Force Occupational Research Project.
ERIC Educational Resources Information Center
Christal, Raymond E.
This informal presentation describes how and why the Air Force uses the job inventory approach for collecting, analyzing, and reporting information describing the work performed by its personnel. This is followed by a brief description of the Comprehensive Occupational Data Computer Program (CODAP) system, the applications of job survey…
Using Computer Technology To Monitor Student Progress and Remediate Reading Problems.
ERIC Educational Resources Information Center
McCullough, C. Sue
1995-01-01
Focuses on research about application of text-to-speech systems in diagnosing and remediating word recognition, vocabulary knowledge, and comprehension disabilities. As school psychologists move toward a consultative model of service delivery, they need to know about technology such as speech synthesizers, digitizers, optical-character-recognition…
Computerized Diagnostic Testing: Problems and Possibilities.
ERIC Educational Resources Information Center
McArthur, David L.
The use of computers to build diagnostic inferences is explored in two contexts. In computerized monitoring of liquid oxygen systems for the space shuttle, diagnoses are exact because they can be derived within a world which is closed. In computerized classroom testing of reading comprehension, programs deliver a constrained form of adaptive…
CODAP: Control Card Specifications for the Univac 1108.
ERIC Educational Resources Information Center
Stacey, William D.; And Others
The document is one of three in a series of technical reports covering the control card and programing aspects of the Comprehensive Occupational Data Analysis Programs (CODAP), a highly interactive and efficient system of computer routines for analyzing, organizing, and reporting occupational information. The document contains control card…
Quest: The Interactive Test Analysis System.
ERIC Educational Resources Information Center
Adams, Raymond J.; Khoo, Siek-Toon
The Quest program offers a comprehensive test and questionnaire analysis environment by providing a data analyst (a computer program) with access to the most recent developments in Rasch measurement theory, as well as a range of traditional analysis procedures. This manual helps the user use Quest to construct and validate variables based on…
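The Rasch measurement model that Quest draws on gives the probability of a correct response as a logistic function of person ability θ and item difficulty b. A minimal sketch of the item response function (the parameter values are illustrative):

```python
import math

def rasch_p(theta, b):
    """Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals item difficulty, the success probability is 0.5.
print(round(rasch_p(0.0, 0.0), 2))  # → 0.5
```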
NASA Technical Reports Server (NTRS)
Carroll, Chester C.; Youngblood, John N.; Saha, Aindam
1987-01-01
Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost parallel systems to increase system performance. Research conducted in development of specialized computer architecture for the algorithmic execution of an avionics system, guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
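The critical-path analysis on which the allocation is based amounts to a longest-path computation over the task graph. A minimal sketch with a hypothetical task DAG and durations (the task names and numbers are invented, not from the paper):

```python
from functools import lru_cache

# Toy task DAG: task -> (duration, successors). Names are hypothetical.
tasks = {
    "sense":    (2, ["estimate"]),
    "estimate": (4, ["guidance"]),
    "filter":   (3, ["guidance"]),
    "guidance": (5, ["command"]),
    "command":  (1, []),
}

@lru_cache(maxsize=None)
def critical_length(task):
    """Longest total duration of any path starting at `task`."""
    dur, succs = tasks[task]
    return dur + max((critical_length(s) for s in succs), default=0)

# The critical path length bounds the schedule on any number of processors.
print(max(critical_length(t) for t in tasks))  # → 12
```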
Model-Driven Configuration of SELinux Policies
NASA Astrophysics Data System (ADS)
Agreiter, Berthold; Breu, Ruth
The need for access control in computer systems is inherent. However, the complexity of configuring such systems is constantly increasing, which affects the overall security of a system negatively. We think that it is important to define security requirements on a non-technical level, taking the application domain into account, in order to have a clear and separated view of security configuration (i.e. unblurred by technical details). On the other hand, security functionality has to be tightly integrated with the system and its development process in order to provide comprehensive means of enforcement. In this paper, we propose a systematic approach based on model-driven security configuration to leverage existing operating system security mechanisms (SELinux) for realising access control. We use UML models and develop a UML profile to satisfy these needs. Our goal is to exploit a comprehensive protection mechanism while rendering its security policy manageable by a domain specialist.
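The generation step from model to policy can be sketched as a simple template transformation. The domain model and type names below are invented, and a real SELinux policy would additionally need matching type and class declarations; this only illustrates rendering model entries as allow rules:

```python
# Toy "model": which domain may perform which operations on which
# resource type (all names are hypothetical).
model = [
    ("webapp_t", "app_log_t", "file", ["append", "getattr"]),
    ("webapp_t", "app_db_t", "file", ["read", "write"]),
]

def to_allow_rules(model):
    """Render model entries as SELinux allow rules:
    allow <source> <target>:<class> { <permissions> };"""
    rules = []
    for src, tgt, cls, perms in model:
        rules.append(f"allow {src} {tgt}:{cls} {{ {' '.join(perms)} }};")
    return rules

for rule in to_allow_rules(model):
    print(rule)
```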
Bashashati, Ali; Fatourechi, Mehrdad; Ward, Rabab K; Birch, Gary E
2007-06-01
Brain-computer interfaces (BCIs) aim at providing a non-muscular channel for sending commands to the external world using the electroencephalographic activity or other electrophysiological measures of the brain function. An essential factor in the successful operation of BCI systems is the methods used to process the brain signals. In the BCI literature, however, there is no comprehensive review of the signal processing techniques used. This work presents the first such comprehensive survey of all BCI designs using electrical signal recordings published prior to January 2006. Detailed results from this survey are presented and discussed. The following key research questions are addressed: (1) what are the key signal processing components of a BCI, (2) what signal processing algorithms have been used in BCIs and (3) which signal processing techniques have received more attention?
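A typical feature-extraction component of the kind such surveys cover is spectral band power. A minimal sketch on a synthetic EEG signal (the band edges, sampling rate and signal are illustrative assumptions):

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean power of `signal` in the [f_lo, f_hi] Hz band, via the FFT."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return spectrum[mask].mean()

fs = 250                              # Hz, a common EEG sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t)      # synthetic 10 Hz "alpha" rhythm
alpha = band_power(eeg, fs, 8, 12)
beta = band_power(eeg, fs, 15, 30)
print(alpha > beta)  # → True
```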
NASA Astrophysics Data System (ADS)
Gen, Mitsuo; Kawakami, Hiroshi; Tsujimura, Yasuhiro; Handa, Hisashi; Lin, Lin; Okamoto, Azuma
As efficient utilization of computational resources increases, evolutionary technology based on the Genetic Algorithm (GA), Genetic Programming (GP), Evolution Strategy (ES), and other Evolutionary Computations (ECs) is making rapid progress, and both its social recognition and its demand as an applied technology are growing. This is because EC offers greater robustness for knowledge information processing systems, intelligent production and logistics systems, advanced production scheduling, and various other real-world problems than approaches based on conventional theories, and because EC remains flexibly applicable and useful even for unknown system environments where accurate mathematical modeling fails. In this paper, we provide a comprehensive survey of the current state of the art in the fundamentals and applications of evolutionary technologies.
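The core algorithm behind this family of methods can be sketched in a few lines. Below is a minimal generational GA, with tournament selection, one-point crossover, and bit-flip mutation, applied to the OneMax toy problem; operator choices and parameter values are illustrative textbook defaults, not taken from the survey.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=100,
                      crossover_rate=0.9, mutation_rate=0.02, seed=0):
    """Minimal generational GA: tournament selection, one-point
    crossover, bit-flip mutation. A textbook sketch, not a production
    EC library."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < crossover_rate:
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for c in (c1, c2):
                for i in range(n_bits):
                    if rng.random() < mutation_rate:
                        c[i] ^= 1  # flip the bit
                children.append(c)
        pop = children[:pop_size]
    return max(pop, key=fitness)

# OneMax: maximize the number of 1-bits -- the "hello world" of GAs.
best = genetic_algorithm(sum)
```

GP, ES, and the real-world applications the survey covers replace the bitstring encoding and operators, but keep this select-vary-replace loop.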
1989-09-01
ON EXPERT SYSTEMS IN DESIGN, CONSTRUCTION AND MAINTENANCE OF STRUCTURES. Arockiasamy, Sunghoon Lee, Department...based expert system applications in the areas of structural design, design standards, and construction planning. This study will aid in the development...of a comprehensive expert system for typical hydraulic structures. Funding for this report was provided by the US Army Engineer Waterways Experiment
ERIC Educational Resources Information Center
Petscher, Yaacov; Foorman, Barbara R.; Truckenmiller, Adrea J.
2017-01-01
The objective of the present study was to evaluate the extent to which students who took a computer adaptive test of reading comprehension accounting for testlet effects were administered fewer passages and had a more precise estimate of their reading comprehension ability compared to students in the control condition. A randomized controlled…
Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.
Liao, Wen-Hwa; Qiu, Wan-Li
2016-01-01
Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
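The AHP computation the study relies on, deriving priority weights from a pairwise-comparison matrix via its principal eigenvector and checking consistency, can be sketched as follows. The comparison values below are invented for illustration and do not reproduce the study's questionnaire data.

```python
import numpy as np

# Illustrative pairwise-comparison matrix for three hypothetical criteria
# (e.g. cost effectiveness, software design, system architecture).
# Entry A[i][j] says how much more important criterion i is than j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Saaty consistency check: CI = (lambda_max - n)/(n - 1); the random
# index RI is 0.58 for n = 3, and CR = CI/RI should stay below 0.1.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
```

With these made-up judgments, the first criterion dominates (w[0] is largest) and the matrix is acceptably consistent (CR well under 0.1).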
Driver comprehension of managed lane signing.
DOT National Transportation Integrated Search
2009-09-01
A statewide survey of driver comprehension of managed lane signing is reported. Computer-based surveys were conducted using video clips of computer animations as well as still images of signs. The surveys were conducted in four Texas cities with a to...
Comprehension and retrieval of failure cases in airborne observatories
NASA Technical Reports Server (NTRS)
Alvarado, Sergio J.; Mock, Kenrick J.
1995-01-01
This paper describes research dealing with the computational problem of analyzing and repairing failures of electronic and mechanical systems of telescopes in NASA's airborne observatories, such as KAO (Kuiper Airborne Observatory) and SOFIA (Stratospheric Observatory for Infrared Astronomy). The research has resulted in the development of an experimental system that acquires knowledge of failure analysis from input text, and answers questions regarding failure detection and correction. The system's design builds upon previous work on text comprehension and question answering, including: knowledge representation for conceptual analysis of failure descriptions, strategies for mapping natural language into conceptual representations, case-based reasoning strategies for memory organization and indexing, and strategies for memory search and retrieval. These techniques have been combined into a model that accounts for: (a) how to build a knowledge base of system failures and repair procedures from descriptions that appear in telescope-operators' logbooks and FMEA (failure modes and effects analysis) manuals; and (b) how to use that knowledge base to search and retrieve answers to questions about causes and effects of failures, as well as diagnosis and repair procedures. This model has been implemented in FANSYS (Failure ANalysis SYStem), a prototype text comprehension and question answering program for failure analysis.
Integrating three-dimensional digital technologies for comprehensive implant dentistry.
Patel, Neal
2010-06-01
The increase in the popularity of and the demand for the use of dental implants to replace teeth has encouraged advancement in clinical technology and materials to improve patients' acceptance and clinical outcomes. Recent advances such as three-dimensional dental radiography with cone-beam computed tomography (CBCT), precision dental implant planning software and clinical execution with guided surgery all play a role in the success of implant dentistry. The author illustrates the technique of comprehensive implant dentistry planning through integration of computer-aided design/computer-aided manufacturing (CAD/CAM) and CBCT data. The technique includes clinical treatment with guided surgery, including the creation of a final restoration with a high-strength ceramic (IPS e.max CAD, Ivoclar Vivadent, Amherst, N.Y.). The author also introduces a technique involving CAD/CAM for fabricating custom implant abutments. The release of software integrating CEREC Acquisition Center with Bluecam (Sirona Dental Systems, Charlotte, N.C.) chairside CAD/CAM and Galileos CBCT imaging (Sirona Dental Systems) allows dentists to plan implant placement, perform implant dentistry with increased precision and provide predictable restorative results by using chairside IPS e.max CAD. The precision of clinical treatment provided by the integration of CAD/CAM and CBCT allows dentists to plan for ideal surgical placement and the appropriate thickness of restorative modalities before placing implants.
A computer-aided approach to nonlinear control synthesis
NASA Technical Reports Server (NTRS)
Wie, Bong; Anthony, Tobin
1988-01-01
The major objective of this project is to develop a computer-aided approach to nonlinear stability analysis and nonlinear control system design. This goal is to be obtained by refining the describing function method as a synthesis tool for nonlinear control design. The interim report outlines the approach by this study to meet these goals including an introduction to the INteractive Controls Analysis (INCA) program which was instrumental in meeting these study objectives. A single-input describing function (SIDF) design methodology was developed in this study; coupled with the software constructed in this study, the results of this project provide a comprehensive tool for design and integration of nonlinear control systems.
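The describing function method the study refines can be illustrated numerically. The sketch below computes the single-input describing function of a unit-slope saturation, a standard textbook nonlinearity; it is a self-contained illustration, not the INCA implementation.

```python
import math

def sidf_saturation(amplitude, limit=1.0):
    """Single-input describing function N(A) of a unit-slope saturation,
    computed numerically as the fundamental-harmonic gain:

        N(A) = (1 / (pi * A)) * integral_0^{2pi} f(A sin t) sin t dt

    For A <= limit the element is linear and N(A) = 1; for larger
    amplitudes the effective gain drops below 1.
    """
    def f(x):
        return max(-limit, min(limit, x))  # the saturation nonlinearity
    n = 2000  # midpoint-rule quadrature points over one period
    acc = 0.0
    for i in range(n):
        t = 2 * math.pi * (i + 0.5) / n
        acc += f(amplitude * math.sin(t)) * math.sin(t)
    acc *= 2 * math.pi / n
    return acc / (math.pi * amplitude)
```

The numerical result matches the closed-form expression N(A) = (2/pi)(asin(d/A) + (d/A)sqrt(1-(d/A)^2)) for A greater than the limit d, which is how limit cycles are predicted when this gain is intersected with the linear plant's frequency response.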
The Design and Implementation of an Operating System for the IBM Personal Computer.
1984-12-01
comprehensive study of an actual operating system in an effort to show students how theory has been put into action (Lions, 1978; McCharen, 1980). Another...Freedman, 1977). However, since it is easier to develop and maintain a program written in a high-order language (HOL), Pascal was chosen to be the primary...monolithic monitor approach and the kernel approach are strategies which can be used to structure operating systems (Deitel, 1983; Holt, 1983
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geng, Guangchao; Abhyankar, Shrirang; Wang, Xiaoyu
Transient stability-constrained optimal power flow is an important emerging problem, with power systems pushed to their limits for economic benefits, denser and larger interconnected systems, and reduced inertia due to the expected proliferation of renewable energy resources. In this study, two further approaches, single-machine equivalent and computational intelligence, are presented. Various application areas and future directions in this research area are also discussed. In conclusion, a comprehensive resource covering the available literature, publicly available test systems, and relevant numerical libraries is provided.
NASA Astrophysics Data System (ADS)
Han, Suyue; Chang, Gary Han; Schirmer, Clemens; Modarres-Sadeghi, Yahya
2016-11-01
We construct a reduced-order model (ROM) to study the Wall Shear Stress (WSS) distributions in image-based patient-specific aneurysms models. The magnitude of WSS has been shown to be a critical factor in growth and rupture of human aneurysms. We start the process by running a training case using Computational Fluid Dynamics (CFD) simulation with time-varying flow parameters, such that these parameters cover the range of parameters of interest. The method of snapshot Proper Orthogonal Decomposition (POD) is utilized to construct the reduced-order bases using the training CFD simulation. The resulting ROM enables us to study the flow patterns and the WSS distributions over a range of system parameters computationally very efficiently with a relatively small number of modes. This enables comprehensive analysis of the model system across a range of physiological conditions without the need to re-compute the simulation for small changes in the system parameters.
Vectorial Representations of Meaning for a Computational Model of Language Comprehension
ERIC Educational Resources Information Center
Wu, Stephen Tze-Inn
2010-01-01
This thesis aims to define and extend a line of computational models for text comprehension that are humanly plausible. Since natural language is human by nature, computational models of human language will always be just that--models. To the degree that they miss out on information that humans would tap into, they may be improved by considering…
Kuttner, Samuel; Bujila, Robert; Kortesniemi, Mika; Andersson, Henrik; Kull, Love; Østerås, Bjørn Helge; Thygesen, Jesper; Tarp, Ivanka Sojat
2013-03-01
Quality assurance (QA) of computed tomography (CT) systems is one of the routine tasks for medical physicists in the Nordic countries. However, standardized QA protocols do not yet exist, and the QA methods, as well as the applied tolerance levels, vary in scope and extent at different hospitals. The aim of this work was to propose a standardized protocol for acceptance and constancy testing of CT scanners in the Nordic region. Following a Nordic Association for Clinical Physics (NACP) initiative, a group of medical physicists, with representatives from four Nordic countries, was formed. Based on international literature and practical experience within the group, a comprehensive standardized test protocol was developed. The proposed protocol includes tests related to the mechanical functionality, X-ray tube, detector, and image quality of CT scanners. For each test, recommendations regarding the purpose, the equipment needed, an outline of the test method, the measured parameter, tolerance levels, and the testing frequency are stated. In addition, a number of optional tests that may provide further information about the CT system are briefly discussed. Based on international references and medical physicists' practical experience, a comprehensive QA protocol for CT systems is proposed, including both acceptance and constancy tests. The protocol may serve as a reference for medical physicists in the Nordic countries.
Tebani, Abdellah; Afonso, Carlos; Marret, Stéphane; Bekri, Soumeya
2016-01-01
The rise of technologies that simultaneously measure thousands of data points represents the heart of systems biology. These technologies have had a huge impact on the discovery of next-generation diagnostics, biomarkers, and drugs in the precision medicine era. Systems biology aims to achieve systemic exploration of complex interactions in biological systems. Driven by high-throughput omics technologies and the computational surge, it enables multi-scale and insightful overviews of cells, organisms, and populations. Precision medicine capitalizes on these conceptual and technological advancements and stands on two main pillars: data generation and data modeling. High-throughput omics technologies allow the retrieval of comprehensive and holistic biological information, whereas computational capabilities enable high-dimensional data modeling and, therefore, accessible and user-friendly visualization. Furthermore, bioinformatics has enabled comprehensive multi-omics and clinical data integration for insightful interpretation. Despite their promise, the translation of these technologies into clinically actionable tools has been slow. In this review, we present state-of-the-art multi-omics data analysis strategies in a clinical context. The challenges of omics-based biomarker translation are discussed. Perspectives regarding the use of multi-omics approaches for inborn errors of metabolism (IEM) are presented by introducing a new paradigm shift in addressing IEM investigations in the post-genomic era. PMID:27649151
NASA Astrophysics Data System (ADS)
Zhang, Yunju; Chen, Zhongyi; Guo, Ming; Lin, Shunsheng; Yan, Yinyang
2018-01-01
With the growing capacity of power systems and the trend toward larger units and higher voltages, dispatching operations are becoming more frequent and complicated, and the probability of operation errors increases. To address the lack of anti-error functionality, the limited scheduling functions, and the low working efficiency of the technical support systems used in regional regulation and integration, this paper proposes an integrated, cloud-computing-based architecture for a power-network dispatching anti-error system. An integrated anti-error system combining the Energy Management System (EMS) and the Operation Management System (OMS) has also been constructed. The system architecture has good scalability and adaptability; it can improve computational efficiency, reduce the cost of system operation and maintenance, and enhance the capability for regional regulation and anti-error checking, with broad development prospects.
Power source evaluation capabilities at Sandia National Laboratories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doughty, D.H.; Butler, P.C.
1996-04-01
Sandia National Laboratories maintains one of the most comprehensive power source characterization facilities in the U.S. National Laboratory system. This paper describes the capabilities for evaluation of fuel cell technologies. The facility has a rechargeable battery test laboratory and a test area for performing nondestructive and functional computer-controlled testing of cells and batteries.
CODAP: Programmer Notes for the Subroutine Library on the Univac 1108.
ERIC Educational Resources Information Center
Weissmuller, Johnny J.; And Others
The Comprehensive Occupational Data Analysis Programs (CODAP) package is a highly interactive and efficient system of computer routines for analyzing, organizing, and reporting occupational information. Since its inception in 1960, CODAP has grown in tandem with advances in job analysis methodology and is now capable of answering most of the wide…
A Contracting Out Success Story
NASA Technical Reports Server (NTRS)
Davis, Esker K.
1993-01-01
This paper is based on the highly successful Mission and Computing Support (MACS) contract between the Jet Propulsion Laboratory (JPL) and OAO Corporation, which encompasses a comprehensive set of work for information systems and services. JPL awarded the MACS contract in 1988 and saw very successful operation during the first five years, which were recently completed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Sheldon, Frederick T; Grimaila, Michael R
2010-01-01
In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we discuss how this infrastructure can be used in the subject domain of mission assurance, defined as the full life-cycle engineering process to identify and mitigate design, production, test, and field support deficiencies affecting mission success. We address the opportunity to apply the Cyberspace Security Econometrics System (CSES) to Carnegie Mellon University and the Software Engineering Institute's Mission Assurance Analysis Protocol (MAAP) in this context.
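The kind of per-stakeholder loss estimate described here can be sketched as a chain of matrix products, in the mean-failure-cost style this line of work builds on in related publications. All matrix entries below are invented for illustration; only the shape of the computation is the point.

```python
import numpy as np

# Stakes matrix ST: cost (say, $/h) each stakeholder loses if a
# security requirement fails.  2 stakeholders x 2 requirements.
ST = np.array([[100.0, 40.0],
               [ 20.0, 80.0]])
# Dependency matrix DP: probability a requirement fails given that a
# component fails.  2 requirements x 2 components.
DP = np.array([[0.8, 0.1],
               [0.2, 0.9]])
# Impact matrix IM: probability a component fails given that a threat
# materializes.  2 components x 3 threats.
IM = np.array([[0.5, 0.1, 0.0],
               [0.0, 0.3, 0.6]])
# Threat vector PT: probability each threat materializes per unit time.
PT = np.array([0.01, 0.05, 0.02])

# Mean failure cost per stakeholder: chain the conditional layers.
MFC = ST @ DP @ IM @ PT
```

Each stakeholder thus gets a single expected-loss figure that rolls threat likelihoods up through components and requirements to that stakeholder's stakes.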
Prlj, Antonio; Curchod, Basile F E; Corminboeuf, Clémence
2015-06-14
The computational elucidation and proper description of the ultrafast deactivation mechanisms of simple organic electronic units, such as thiophene and its oligomers, is as challenging as it is contentious. A comprehensive excited-state dynamics analysis of these systems utilizing reliable electronic structure approaches is currently lacking, with earlier pictures of the photochemistry of these systems being conceived based upon high-level static computations or lower-level dynamic trajectories. Here a detailed surface hopping molecular dynamics study of thiophene and bithiophene using the algebraic diagrammatic construction to second order (ADC(2)) method is presented. Our findings illustrate that ring puckering plays an important role in thiophene photochemistry and that photostability increases upon dimerization into bithiophene.
Chedjou, Jean Chamberlain; Kyamakya, Kyandoghere
2015-04-01
This paper develops and validates a comprehensive and universally applicable computational concept for solving nonlinear differential equations (NDEs) through a neurocomputing concept based on cellular neural networks (CNNs). High precision, stability, convergence, and the lowest possible memory requirements are ensured by the CNN processor architecture. A significant challenge solved in this paper is that all these cited computing features are ensured in all system states (regular or chaotic ones) and in all bifurcation conditions that may be experienced by NDEs. One particular quintessence of this paper is to develop and demonstrate a solver concept that shows and ensures that CNN processors (realized either in hardware or in software) are universal solvers of NDE models. The solving logic or algorithm of given NDEs (possible examples are: Duffing, Mathieu, Van der Pol, Jerk, Chua, Rössler, Lorenz, Burgers, and the transport equations) through a CNN processor system is provided by a set of templates that are computed by our comprehensive template calculation technique, which we call nonlinear adaptive optimization. This paper is therefore a significant contribution and represents a cutting-edge real-time computational engineering approach, especially considering the various scientific and engineering applications of this ultrafast, energy- and memory-efficient, and high-precision NDE solver concept. For illustration purposes, three NDE models are demonstratively solved, and related CNN templates are derived and used: the periodically excited Duffing equation, the Mathieu equation, and the transport equation.
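The CNN-template solver itself is beyond a short sketch, but one of its demonstration models, the periodically excited Duffing equation, can be stated and integrated with a standard Runge-Kutta method for reference. The parameter values below are arbitrary illustrative choices, not the ones used in the paper.

```python
import math

def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Periodically excited Duffing oscillator:
#   x'' + delta*x' + alpha*x + beta*x^3 = gamma*cos(omega*t)
delta, alpha, beta, gamma, omega = 0.2, 1.0, 1.0, 0.3, 1.2

def duffing(t, y):
    x, v = y
    return [v, gamma * math.cos(omega * t) - delta * v - alpha * x - beta * x**3]

y, t, h = [0.0, 0.0], 0.0, 0.01
for _ in range(10000):  # integrate from rest out to t = 100
    y = rk4_step(duffing, t, y, h)
    t += h
```

With this damping and forcing, the trajectory settles onto a bounded periodic response; the CNN approach described in the paper instead encodes the same dynamics in processor templates.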
Process computerization of No. 13 blast furnace at Gary works
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sherman, G.J.; Zmierski, M.L.; Hyle, F.W.
1993-10-01
No. 13 blast furnace underwent extensive modifications to the process control system during the second reline. This represents a major shift in control philosophy from hardwired relays and analog controllers to a totally integrated computer control system. The new system created the opportunity for comprehensive diagnostic analysis and increased flexibility for control strategy modifications. The goal of achieving maximum production with minimal delay was accomplished through integrated testing and comprehensive operator and maintenance training. Normal production was reached within four days of blow-in, and the design production capacity of 7800 NTHM/day was met in the third month of operation. Record furnace productivity of 8931 NTHM/day (8.56 NTHM/100 cfwv) was achieved in less than five months of operation, and again in April 1992 with 9062 NTHM/day (8.68 NTHM/100 cfwv).
A comprehensive analytical model of rotorcraft aerodynamics and dynamics. Part 3: Program manual
NASA Technical Reports Server (NTRS)
Johnson, W.
1980-01-01
The computer program for a comprehensive analytical model of rotorcraft aerodynamics and dynamics is described. This analysis is designed to calculate rotor performance, loads, and noise; the helicopter vibration and gust response; the flight dynamics and handling qualities; and the system aeroelastic stability. The analysis is a combination of structural, inertial, and aerodynamic models that is applicable to a wide range of problems and a wide class of vehicles. The analysis is intended for use in the design, testing, and evaluation of rotors and rotorcraft and to be a basis for further development of rotary wing theories.
Comprehensive silicon solar-cell computer modeling
NASA Technical Reports Server (NTRS)
Lamorte, M. F.
1984-01-01
A comprehensive silicon solar cell computer modeling scheme was developed to perform the following tasks: (1) model and analysis of the net charge distribution in quasineutral regions; (2) experimentally determined temperature behavior of Spire Corp. n+pp+ solar cells where n+-emitter is formed by ion implantation of 75As or 31P; and (3) initial validation results of computer simulation program using Spire Corp. n+pp+ cells.
Computational analysis of aircraft pressure relief doors
NASA Astrophysics Data System (ADS)
Schott, Tyler
Modern trends in commercial aircraft design have sought to improve fuel efficiency while reducing emissions by operating at higher pressures and temperatures than ever before. Consequently, greater demands are placed on the auxiliary bleed air systems used for a multitude of aircraft operations. The increased role of bleed air systems poses significant challenges for the pressure relief system to ensure the safe and reliable operation of the aircraft. The core compartment pressure relief door (PRD) is an essential component of the pressure relief system which functions to relieve internal pressure in the core casing of a high-bypass turbofan engine during a burst duct over-pressurization event. The successful modeling and analysis of a burst duct event are imperative to the design and development of PRD's to ensure that they will meet the increased demands placed on the pressure relief system. Leveraging high-performance computing coupled with advances in computational analysis, this thesis focuses on a comprehensive computational fluid dynamics (CFD) study to characterize turbulent flow dynamics and quantify the performance of a core compartment PRD across a range of operating conditions and geometric configurations. The CFD analysis was based on a compressible, steady-state, three-dimensional, Reynolds-averaged Navier-Stokes approach. Simulations were analyzed, and results show that variations in freestream conditions, plenum environment, and geometric configurations have a non-linear impact on the discharge, moment, thrust, and surface temperature characteristics. The CFD study revealed that the underlying physics for this behavior is explained by the interaction of vortices, jets, and shockwaves. This thesis research is innovative and provides a comprehensive and detailed analysis of existing and novel PRD geometries over a range of realistic operating conditions representative of a burst duct over-pressurization event. 
Further, the study provides aircraft manufacturers with valuable insight into the impact that operating conditions and geometric configurations have on PRD performance and how the information can be used to assist future research and development of PRD design.
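The limiting physics of a burst-duct event can be sketched with the standard one-dimensional isentropic choked-flow relation. This is textbook gas dynamics, not the thesis's CFD model; the sea-level conditions below are illustrative, and a real PRD analysis would multiply this ideal flux by a discharge coefficient obtained from CFD or test data.

```python
import math

def choked_mass_flux(p0, T0, gamma=1.4, R=287.0):
    """Ideal choked (sonic) mass flow per unit throat area for a perfect
    gas, from 1-D isentropic relations:

        mdot/A = p0 * sqrt(gamma/(R*T0))
                    * (2/(gamma+1)) ** ((gamma+1) / (2*(gamma-1)))

    p0, T0 are stagnation pressure [Pa] and temperature [K]; gamma and R
    default to air. This is the upper bound a relief-door sizing
    calculation starts from.
    """
    return (p0 * math.sqrt(gamma / (R * T0))
            * (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))))

flux = choked_mass_flux(p0=101325.0, T0=288.0)  # sea-level air, kg/(s*m^2)
```

For sea-level air this gives roughly 240 kg/s per square metre of throat area, which bounds how much relief area a given duct-burst flow rate demands.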
PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban
Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
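A toy version of the performance-modeling question such workflow systems face, namely how long a task DAG will take, can be sketched as a critical-path computation. The task names and runtimes below are invented, and this simple model (unlimited resources, no data-transfer costs) is in no way the PANORAMA framework itself.

```python
def makespan(tasks, deps):
    """Critical-path length of a task DAG: the earliest finish time of
    the whole workflow, given per-task runtimes and assuming unlimited
    parallel resources.

    tasks: {name: runtime}; deps: {name: [prerequisite names]}.
    """
    finish = {}

    def ft(name):
        if name not in finish:
            start = max((ft(p) for p in deps.get(name, [])), default=0.0)
            finish[name] = start + tasks[name]
        return finish[name]

    return max(ft(name) for name in tasks)

# A small diamond-shaped workflow: stage-in, two parallel analyses, merge.
tasks = {"stage_in": 5.0, "analysis_a": 20.0, "analysis_b": 12.0, "merge": 3.0}
deps = {"analysis_a": ["stage_in"], "analysis_b": ["stage_in"],
        "merge": ["analysis_a", "analysis_b"]}
```

Here the longer analysis branch dominates, so the makespan is 5 + 20 + 3 time units; diagnosing why real runs deviate from such analytical estimates is exactly the kind of question the article addresses.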
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban; ...
2015-07-14
The Temporal Dimension of Linguistic Prediction
ERIC Educational Resources Information Center
Chow, Wing Yee
2013-01-01
This thesis explores how predictions about upcoming language inputs are computed during real-time language comprehension. Previous research has demonstrated humans' ability to use rich contextual information to compute linguistic prediction during real-time language comprehension, and it has been widely assumed that contextual information can…
Comprehensive optimization process of paranasal sinus radiography.
Saarakkala, S; Nironen, K; Hermunen, H; Aarnio, J; Heikkinen, J O
2009-04-01
The optimization of radiological examinations is important in order to reduce unnecessary patient radiation exposure. The aim of this work was to perform a comprehensive optimization process for paranasal sinus radiography at Mikkeli Central Hospital, Finland. Patients with suspicion of acute sinusitis were imaged with a Kodak computed radiography (CR) system (n=20) and with a Philips digital radiography (DR) system (n=30) using focus-detector distances (FDDs) of 110 cm, 150 cm, or 200 cm. Patients' radiation exposure was determined in terms of entrance surface dose and dose-area product. Furthermore, an anatomical phantom was used for the estimation of point doses inside the head. Clinical image quality was evaluated by an experienced radiologist, and physical image quality was evaluated from the digital radiography phantom. Patient doses were significantly lower and image quality better with the DR system than with the CR system. The differences in patient dose and physical image quality were small with varying FDD. Clinical image quality of the DR system was lowest with an FDD of 200 cm. Furthermore, imaging with an FDD of 150 cm was technically easier for the technologist to perform than with an FDD of 110 cm. After optimization, it was recommended that the DR system with an FDD of 150 cm always be used at Mikkeli Central Hospital. We recommend this kind of comprehensive approach in all optimization processes of radiological examinations.
Shaw, M J; Beebe, T J; Tomshine, P A; Adlis, S A; Cass, O W
2001-02-01
The purpose of our study was to assess the effectiveness of computer-assisted instruction (CAI) in patients having colonoscopies. We conducted a randomized, controlled trial in a large, multispecialty clinic. Eighty-six patients were referred for colonoscopies. The interventions were standard education versus standard education plus CAI, and the outcome measures were anxiety, comprehension, and satisfaction. Computer-assisted instruction had no effect on patients' anxiety. The group receiving CAI demonstrated better overall comprehension (p < 0.001). However, comprehension of certain aspects of serious complications and appropriate postsedation behavior was unaffected by educational method. Patients in the CAI group were more likely to indicate satisfaction with the amount of information provided when compared with their standard education counterparts (p = 0.001). Overall satisfaction was unaffected by educational method. Computer-assisted instruction for colonoscopy provided better comprehension and greater satisfaction with the adequacy of education than standard education. Computer-assisted instruction helps physicians meet their educational responsibilities with no decrement to the interpersonal aspects of the patient-physician relationship.
Effect of attention therapy on reading comprehension.
Solan, Harold A; Shelley-Tremblay, John; Ficarra, Anthony; Silverman, Michael; Larson, Steven
2003-01-01
This study quantified the influence of visual attention therapy on the reading comprehension of Grade 6 children with moderate reading disabilities (RD) in the absence of specific reading remediation. Thirty students with below-average reading scores were identified using standardized reading comprehension tests. Fifteen children were placed randomly in the experimental group and 15 in the control group. The Attention Battery of the Cognitive Assessment System was administered to all participants. The experimental group received 12 one-hour sessions of individually monitored, computer-based attention therapy programs; the control group received no therapy during their 12-week period. Each group was retested on attention and reading comprehension measures. In order to stimulate selective and sustained visual attention, the vision therapy stressed various aspects of arousal, activation, and vigilance. At the completion of attention therapy, the mean standard attention and reading comprehension scores of the experimental group had improved significantly. The control group, however, showed no significant improvement in reading comprehension scores after 12 weeks. Although uncertainties still exist, this investigation supports the notion that visual attention is malleable and that attention therapy has a significant effect on reading comprehension in this often neglected population.
Data Base Management Systems Panel Workshop: Executive summary
NASA Technical Reports Server (NTRS)
1979-01-01
Data base management systems (DBMS) for space-acquired and associated data are discussed. The full range of DBMS needs is covered, including acquiring, managing, storing, archiving, accessing, and disseminating data for an application. Existing bottlenecks in DBMS operations and expected developments in the fields of remote sensing, communications, and computer science are discussed, and an overview of existing conditions and expected problems is presented. The requirements for a proposed spatial information system and the characteristics of a comprehensive browse facility for earth observations applications are included.
Requirements for Next Generation Comprehensive Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne; Datta, Anubhav
2008-01-01
The unique demands of rotorcraft aeromechanics analysis have led to the development of software tools that are described as comprehensive analyses. The next generation of rotorcraft comprehensive analyses will be driven and enabled by the tremendous capabilities of high performance computing, particularly modular and scalable software executed on multiple cores. Development of a comprehensive analysis based on high performance computing both demands and permits a new analysis architecture. This paper describes a vision of the requirements for this next generation of comprehensive analyses of rotorcraft. The requirements for what must be included are described and substantiated, and justification is provided for what should be excluded. With this guide, a path to the next generation code can be found.
NASA Technical Reports Server (NTRS)
Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio
1992-01-01
This paper covers the verification and protocol validation of distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model, inclusive of the formulation of new qualitative and quantitative measures and time-dependent behavior; (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.
Study of the TRAC Airfoil Table Computational System
NASA Technical Reports Server (NTRS)
Hu, Hong
1999-01-01
This report documents the application of the TRAC airfoil table computational package (TRACFOIL) to the prediction of 2D airfoil force and moment data over a wide range of angle of attack and Mach number. TRACFOIL generates the standard C-81 airfoil table for input into rotorcraft comprehensive codes such as CAMRAD. The existing TRACFOIL computer package was successfully modified to run on Digital Alpha workstations and on Cray C90 supercomputers. Step-by-step instructions for using the package on both computer platforms are provided. The new version of TRACFOIL is applied to two airfoil sections. The C-81 data obtained using the TRACFOIL method are compared with wind-tunnel data, and results are presented.
A simulation model for wind energy storage systems. Volume 2: Operation manual
NASA Technical Reports Server (NTRS)
Warren, A. W.; Edsinger, R. W.; Burroughs, J. D.
1977-01-01
A comprehensive computer program (SIMWEST) developed for the modeling of wind energy/storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel, and pneumatic) is described. Features of the program include: a precompiler which generates computer models (in FORTRAN) of complex wind source/storage/application systems from user specifications, using the respective library components; a program which provides the techno-economic system analysis, with the respective I/O, the integration of system dynamics, and the iteration for conveyance of variables; and the capability to evaluate the economic feasibility as well as the general performance of wind energy systems. The SIMWEST operation manual is presented, and the usage of the SIMWEST program and the design of the library components are described. A number of example simulations intended to familiarize the user with the program's operation are given, along with a listing of each SIMWEST library subroutine.
1983-07-01
Analysis of trace contaminants project at ORNL. Medium applied to movement of heavy metals through a forested watershed. OAQPS has not reviewed... computer cartography and site design aids; management information systems for facility planning, construction, and operation; and a computer... [Remainder is a garbled table-of-contents fragment; the recoverable model categories are: comprehensive, spills/heavy gas, regional, reactive pollutants, special purpose, and rocket firing.]
Expert systems identify fossils and manage large paleontological databases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beightol, D.S.; Conrad, M.A.
EXPAL is a computer program permitting creation and maintenance of comprehensive databases in marine paleontology. It is designed to assist specialists and non-specialists. EXPAL includes a powerful expert system based on the morphological descriptors specific to a given group of fossils. The expert system may be used, for example, to describe and automatically identify an unknown specimen. EXPAL was first applied to Dasycladales (Calcareous green algae). Projects are under way for corresponding expert systems and databases on planktonic foraminifers and calpionellids. EXPAL runs on an IBM XT or compatible microcomputer.
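The descriptor-driven identification the abstract describes can be given a minimal flavor in code. The sketch below is a hypothetical illustration: the taxa, descriptor names, and matching rule are invented for this example and are not EXPAL's actual knowledge base or inference engine.

```python
# Toy descriptor-based identification: match an unknown specimen's
# morphological descriptors against a small knowledge base of taxa.
# Taxa and descriptors are invented examples, not EXPAL data.
knowledge_base = {
    "Taxon A": {"branching": "whorled", "calcification": "heavy"},
    "Taxon B": {"branching": "irregular", "calcification": "light"},
}

def identify(specimen):
    # Return every taxon whose recorded descriptors all match the specimen.
    return [name for name, traits in knowledge_base.items()
            if all(specimen.get(k) == v for k, v in traits.items())]

unknown = {"branching": "whorled", "calcification": "heavy"}
print(identify(unknown))  # ['Taxon A']
```

A real expert system of this kind would add weighted or partial matches and an explanation facility, but the core lookup is the same descriptor-by-descriptor comparison.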
The computer-generated treatment plan... Create the nucleus to successful systems.
Bernhardt, Christene
2004-12-01
A well-managed, highly efficient practice relies on the comprehensive information provided through effective treatment planning. Computer-generated treatment plans are successful only if the 14 key points of information are included within the plan. Major systems such as scheduling, financial agreements, and insurance processing fail if adequate information is not provided through the treatment plan. Successful interactions with patients at the time of the consultation rely heavily on having adequate information at your fingertips. The treatment plan is truly the foundation of all communications that must occur during the patient's experience, and it ensures that every team member has clear and easy access to the status of each patient as treatment unfolds.
Interfacing comprehensive rotorcraft analysis with advanced aeromechanics and vortex wake models
NASA Astrophysics Data System (ADS)
Liu, Haiying
This dissertation describes three aspects of comprehensive rotorcraft analysis. First, a physics-based methodology for the modeling of hydraulic devices within multibody-based comprehensive models of rotorcraft systems is developed. This newly proposed approach can predict the fully nonlinear behavior of hydraulic devices, and pressure levels in the hydraulic chambers are coupled with the dynamic response of the system. The proposed hydraulic device models are implemented in a multibody code and calibrated by comparing their predictions with test bench measurements for the UH-60 helicopter lead-lag damper. Predicted peak damping forces were found to be in good agreement with measurements, while the model did not predict the entire time history of damper force to the same level of accuracy. The proposed model evaluates relevant hydraulic quantities such as chamber pressures, orifice flow rates, and pressure relief valve displacements. This model could be used to design lead-lag dampers with desirable force and damping characteristics. The second part of this research is in the area of computational aeroelasticity, in which an interface between computational fluid dynamics (CFD) and computational structural dynamics (CSD) is established. This interface enables data exchange between CFD and CSD with the goal of achieving accurate airloads predictions. In this work, a loose coupling approach based on the delta-airloads method is developed in a finite-element-based multibody dynamics formulation, DYMORE. To validate this aerodynamic interface, a CFD code, OVERFLOW-2, is loosely coupled with a CSD program, DYMORE, to compute the airloads at different flight conditions for the Sikorsky UH-60 aircraft. This loose coupling approach has good convergence characteristics. The predicted airloads are found to be in good agreement with the experimental data, although not for all flight conditions.
In addition, the tight coupling interface between the CFD program, OVERFLOW-2, and the CSD program, DYMORE, is also established. The ability to accurately capture the wake structure around a helicopter rotor is crucial for rotorcraft performance analysis. In the third part of this thesis, a new representation of the wake vortex structure based on Non-Uniform Rational B-Spline (NURBS) curves and surfaces is proposed to develop an efficient model for prescribed and free wakes. NURBS curves and surfaces are able to represent complex shapes with remarkably little data. The proposed formulation has the potential to reduce the computational cost associated with the use of Helmholtz's law and the Biot-Savart law when calculating the induced flow field around the rotor. An efficient free-wake analysis will considerably decrease the computational cost of comprehensive rotorcraft analysis, making the approach more attractive to routine use in industrial settings.
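The loose-coupling iteration based on the delta-airloads method can be sketched schematically. In the toy below, scalar "solvers" stand in for DYMORE (CSD) and OVERFLOW-2 (CFD); the function names, coefficients, and convergence tolerance are all invented for illustration and are not taken from the dissertation.

```python
# Schematic delta-airloads loose-coupling loop with toy scalar "solvers".
# Real coupling exchanges full airload and blade-motion distributions;
# here each is a single number so the fixed-point behavior is visible.

def csd_airloads(delta):
    # comprehensive-analysis (lifting-line) airload plus current correction
    return 1.0 + delta

def cfd_airloads(motion):
    # toy high-fidelity airload responding to the trimmed blade state
    return 0.5 + 0.8 * motion

delta, prev = 0.0, None
for it in range(200):
    loads = csd_airloads(delta)     # CSD trims with the current correction
    cfd = cfd_airloads(loads)       # CFD re-evaluates airloads at that state
    delta = cfd - (loads - delta)   # new delta = CFD minus lifting-line part
    if prev is not None and abs(loads - prev) < 1e-8:
        break                       # converged: successive CSD loads agree
    prev = loads
```

The loop converges geometrically here because the toy CFD response has gain below one; in practice convergence is assessed on the trim solution and the exchanged airload distributions.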
Towards pervasive computing in health care - a literature review.
Orwat, Carsten; Graefe, Andreas; Faulwasser, Timm
2008-06-19
The evolving concepts of pervasive computing, ubiquitous computing and ambient intelligence are increasingly influencing health care and medicine. Summarizing published research, this literature review provides an overview of recent developments and implementations of pervasive computing systems in health care. It also highlights some of the experiences reported in deployment processes. There is no clear definition of pervasive computing in the current literature. Thus, specific inclusion criteria for selecting articles about relevant systems were developed. Searches were conducted in four scientific databases alongside manual journal searches for the period of 2002 to 2006. Articles included present prototypes, case studies and pilot studies, clinical trials, and systems that are already in routine use. The searches identified 69 articles describing 67 different systems. In a quantitative analysis, these systems were categorized into project status, health care settings, user groups, improvement aims, and system features (i.e., component types, data gathering, data transmission, system functions). The focus is on the types of systems implemented, their frequency of occurrence and their characteristics. Qualitative analyses were performed of deployment issues, such as organizational and personnel issues, privacy and security issues, and financial issues. This paper provides comprehensive access to the literature of this emerging field by addressing specific topics of application settings, system features, and deployment experiences. Both an overview and an analysis of the literature on a broad and heterogeneous range of systems are provided. Most systems are described in their prototype stages. Deployment issues, such as implications for organization or personnel, privacy concerns, or financial issues, are mentioned rarely, though their solution is regarded as decisive in transferring promising systems to a stage of regular operation. 
There is a need for further research on the deployment of pervasive computing systems, including clinical studies, economic and social analyses, user studies, etc.
An Investigation of Scaffolded Reading on EFL Hypertext Comprehension
ERIC Educational Resources Information Center
Shang, Hui-Fang
2015-01-01
With the rapid growth of computer technology, some printed texts are designed as hypertexts to help EFL (English as a foreign language) learners search for and process multiple resources in a timely manner for autonomous learning. The purpose of this study was to design a hypertext system and examine if a 14-week teacher-guided print-based and…
ERIC Educational Resources Information Center
Ozkan, Aysen; Yildirim, Kemal
2016-01-01
Problem Statement: Drafting course is essential for students in the design disciplines for becoming more organized and for complying with standards in the educational system. Drafting knowledge is crucial, both for comprehension of the issues and for the implementation phase. In any design project, drafting performance and success are as important…
ERIC Educational Resources Information Center
Osguthorpe, Russell T.; Li Chang, Linda
1988-01-01
A computerized symbol processor system using an Apple IIe computer and a Power Pad graphics tablet was tested with 22 nonspeaking, multiply disabled students. The students were taught to express themselves independently in writing, and they did significantly better than control students on measures of language comprehension and symbol recognition.…
A Computer System to Rate the Variety of Color in Drawings
ERIC Educational Resources Information Center
Kim, Seong-in; Hameed, Ibrahim A.
2009-01-01
For mental health professionals, art assessment is a useful tool for patient evaluation and diagnosis. Consideration of various color-related elements is important in art assessment. This correlational study introduces the concept of variety of color as a new color-related element of an artwork. This term represents a comprehensive use of color,…
The Effect of Electronic Storybooks on Struggling Fourth-Graders' Reading Comprehension
ERIC Educational Resources Information Center
Ertem, Ihsan Seyit
2010-01-01
This quantitative research examined the differences in struggling readers' comprehension of storybooks according to the medium of presentation. Each student was randomly assigned with one of three conditions: (1) computer presentation of storybooks with animation; (2) computer presentation of storybooks without animation; and (3) traditional print…
Rigby, Michael
2004-03-18
The effectiveness and quality of health informatics systems' support to healthcare delivery are largely determined by two factors: the suitability of the system installed, and the competence of the users. However, the profile of users of large-scale clinical health systems is significantly different from the profile of end-users in other enterprises such as the finance sector, insurance, travel or retail sales. Work with a mental health provider in Ireland, who was introducing a customized electronic patient record (EPR) system, identified the strong legal and ethical importance of adequate skills for the health professionals and others who would be the system users. The experience identified the need for a clear and comprehensive generic user qualification at a basic but robust level. The European Computer Driving Licence (ECDL) has gained wide recognition as a basic generic qualification for users of computer systems. However, health systems and data have a series of characteristics that differentiate them from other data systems. The logical conclusion was the recognition of a need for an additional domain-specific qualification: an "ECDL Health Supplement". Development of this is now being progressed.
A Web GIS Enabled Comprehensive Hydrologic Information System for Indian Water Resources Systems
NASA Astrophysics Data System (ADS)
Goyal, A.; Tyagi, H.; Gosain, A. K.; Khosa, R.
2017-12-01
Hydrological systems across the globe are becoming increasingly water stressed with each passing season due to climate variability and snowballing water demand. Hence, to safeguard food, livelihood, and economic security, it becomes imperative to employ scientific studies for holistic management of an indispensable resource like water. However, hydrological studies of any scale and purpose are heavily reliant on various spatio-temporal datasets which are not only difficult to discover and access but are also tough to use and manage. Besides, owing to the diversity of water sector agencies and the dearth of standard operating procedures, seamless information exchange is challenging for collaborators. Extensive research is being done worldwide to address these issues, but regrettably not much has been done in developing countries like India. Therefore, the current study endeavours to develop a Hydrological Information System framework in a Web-GIS environment for empowering Indian water resources systems. The study attempts to harmonize the standards for metadata, terminology, symbology, versioning, and archiving for effective generation, processing, dissemination, and mining of the data required for hydrological studies. Furthermore, modelers with modest computing resources at their disposal can consume this standardized data in high-performance simulation modelling using cloud computing within the developed Web-GIS framework. They can also combine the inputs and outputs of the different numerical models available on the platform and integrate their results for comprehensive analysis of the chosen hydrological system. Thus, the developed portal is an all-in-one framework that can facilitate decision makers, industry professionals, and researchers in efficient water management.
NASA Astrophysics Data System (ADS)
Gerjuoy, Edward
2005-06-01
The security of messages encoded via the widely used RSA public key encryption system rests on the enormous computational effort required to find the prime factors of a large number N using classical (conventional) computers. In 1994 Peter Shor showed that for sufficiently large N, a quantum computer could perform the factoring with much less computational effort. This paper endeavors to explain, in a fashion comprehensible to the nonexpert, the RSA encryption protocol; the various quantum computer manipulations constituting the Shor algorithm; how the Shor algorithm performs the factoring; and the precise sense in which a quantum computer employing Shor's algorithm can be said to accomplish the factoring of very large numbers with less computational effort than a classical computer. It is made apparent that factoring N generally requires many successive runs of the algorithm. Our analysis reveals that the probability of achieving a successful factorization on a single run is about twice as large as commonly quoted in the literature.
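The RSA protocol the paper explains can be illustrated with a toy round trip. The tiny primes below are purely illustrative; real RSA moduli are thousands of bits long, and the infeasibility of factoring them classically is precisely the security assumption Shor's algorithm undermines.

```python
# Toy RSA round trip with tiny primes (illustrative only).
p, q = 61, 53                 # secret primes
n = p * q                     # public modulus N = 3233
phi = (p - 1) * (q - 1)       # Euler's totient of N
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # private exponent: inverse of e modulo phi

m = 65                        # plaintext, an integer < N
c = pow(m, e, n)              # encrypt: c = m^e mod N
assert pow(c, d, n) == m      # decrypt: c^d mod N recovers m

# An eavesdropper who factors N obtains phi and hence d -- the step that
# Shor's algorithm accelerates on a quantum computer.
```

The three-argument `pow` performs modular exponentiation; with a negative exponent (Python 3.8+) it returns the modular inverse, which is how `d` is derived from `e` once the factorization of N is known.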
Addressing failures in exascale computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snir, Marc; Wisniewski, Robert W.; Abraham, Jacob A.
2014-05-01
We present here a report produced by a workshop on “Addressing Failures in Exascale Computing” held in Park City, Utah, August 4–11, 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system; discuss existing knowledge on resilience across the various hardware and software layers of an exascale system; and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia; and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.
[A new concept for integration of image databanks into a comprehensive patient documentation].
Schöll, E; Holm, J; Eggli, S
2001-05-01
Image processing and archiving are of increasing importance in the practice of modern medicine. Particularly due to the introduction of computer-based investigation methods, physicians are dealing with a wide variety of analogue and digital picture archives. On the other hand, clinical information is stored in various text-based information systems without integration of image components. The link between such traditional medical databases and picture archives is a prerequisite for efficient data management as well as for continuous quality control and medical education. At the Department of Orthopedic Surgery, University of Berne, a software program was developed to create a complete multimedia electronic patient record. The client-server system contains all patients' data, questionnaire-based quality control, and a digital picture archive. Different interfaces guarantee the integration into the hospital's data network. This article describes our experiences in the development and introduction of a comprehensive image archiving system at a large orthopedic center.
A joint precoding scheme for indoor downlink multi-user MIMO VLC systems
NASA Astrophysics Data System (ADS)
Zhao, Qiong; Fan, Yangyu; Kang, Bochao
2017-11-01
In this study, we aim to improve the system performance and reduce the implementation complexity of precoding schemes for visible light communication (VLC) systems. By incorporating the power-method algorithm and the block diagonalization (BD) algorithm, we propose a joint precoding scheme for indoor downlink multi-user multiple-input multiple-output (MU-MIMO) VLC systems. In this scheme, we first apply the BD algorithm to eliminate the co-channel interference (CCI) among users. Second, the power-method algorithm is used to search for the precoding weight of each user based on the optimality criterion of signal-to-interference-plus-noise ratio (SINR) maximization. Finally, the optical power restrictions of VLC systems are taken into account to constrain the precoding weight matrix. Comprehensive computer simulations in two scenarios indicate that the proposed scheme always has better bit error rate (BER) performance and lower computational complexity than the traditional scheme.
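The power-method step of such a scheme amounts to iterating toward the dominant eigenvector of a Hermitian matrix. A minimal sketch of that building block, assuming a plain real symmetric matrix in place of the actual MU-MIMO channel matrices (the function name and iteration count are choices for this illustration, not the authors' implementation):

```python
import numpy as np

def power_method(A, iters=200):
    # Iterate v <- A v / ||A v|| to converge on the dominant eigenvector
    # of a symmetric (Hermitian) matrix A.
    v = np.ones(A.shape[0]) / np.sqrt(A.shape[0])  # initial guess
    for _ in range(iters):
        v = A @ v
        v = v / np.linalg.norm(v)   # renormalize each step
    eigval = v @ A @ v              # Rayleigh quotient estimate
    return eigval, v

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
lam, v = power_method(A)
# lam approximates the largest eigenvalue of A, (7 + sqrt(5)) / 2
```

In the precoding context, A would be built from the BD-projected channel so that the dominant eigenvector maximizes each user's SINR.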
NASA Technical Reports Server (NTRS)
Biernacki, John; Juhasz, John; Sadler, Gerald
1991-01-01
A team of Space Station Freedom (SSF) system engineers is conducting an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model using an enhanced modern structured analysis (EMSA) methodology. Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.
The computational future for climate and Earth system models: on the path to petaflop and beyond.
Washington, Warren M; Buja, Lawrence; Craig, Anthony
2009-03-13
The development of the climate and Earth system models has had a long history, starting with the building of individual atmospheric, ocean, sea ice, land vegetation, biogeochemical, glacial and ecological model components. The early researchers were much aware of the long-term goal of building the Earth system models that would go beyond what is usually included in the climate models by adding interactive biogeochemical interactions. In the early days, the progress was limited by computer capability, as well as by our knowledge of the physical and chemical processes. Over the last few decades, there has been much improved knowledge, better observations for validation and more powerful supercomputer systems that are increasingly meeting the new challenges of comprehensive models. Some of the climate model history will be presented, along with some of the successes and difficulties encountered with present-day supercomputer systems.
Challenges in reducing the computational time of QSTS simulations for distribution system analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.
The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
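The quoted figures imply a concrete per-step time budget. A back-of-envelope check, using only the numbers given in the abstract (a yearlong run at 1-second resolution, 10 to 120 hours of total wall time):

```python
# One sequential power-flow solve per 1-second step over a full year.
steps = 365 * 24 * 3600
print(f"{steps:,} sequential power flows")  # 31,536,000 sequential power flows

# Per-step budget implied by the reported 10-120 hour total runtimes.
for hours in (10, 120):
    ms_per_step = hours * 3600 * 1000 / steps
    print(f"{hours} h total -> {ms_per_step:.2f} ms per power flow")
```

So even the slow end of the reported range corresponds to under 14 ms per power flow, which shows why the bottleneck is the sheer number of sequential solves rather than the cost of any single one.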
Hashemi, Sepehr; Armand, Mehran; Gordon, Chad R
2016-10-01
To describe the development and refinement of the computer-assisted planning and execution (CAPE) system for use in face-jaw-teeth transplants (FJTTs). Although successful, some maxillofacial transplants result in suboptimal hybrid occlusion and may require subsequent surgical orthognathic revisions. Unfortunately, the use of traditional dental casts and splints poses several compromising shortcomings in the context of FJTT and hybrid occlusion. Computer-assisted surgery may overcome these challenges. Therefore, the use of computer-assisted orthognathic techniques and functional planning may prevent the need for such revisions and improve facial-skeletal outcomes. A comprehensive CAPE system for use in FJTT was developed through a multicenter collaboration and refined using plastic models, live miniature swine surgery, and human cadaver models. The system marries preoperative surgical planning and intraoperative execution by allowing on-table navigation of the donor fragment relative to the recipient cranium, and real-time reporting of the patient's cephalometric measurements relative to a desired dental-skeletal outcome. FJTTs using live-animal and cadaveric models demonstrate the CAPE system to be accurate in navigation and beneficial in improving hybrid occlusion and other craniofacial outcomes. Future refinement of the CAPE system includes integration of more commonly performed orthognathic/maxillofacial procedures.
Mast, Fred D.; Ratushny, Alexander V.
2014-01-01
Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. PMID:25225336
Demonstration Advanced Avionics System (DAAS) functional description. [Cessna 402B aircraft
NASA Technical Reports Server (NTRS)
1980-01-01
A comprehensive set of general aviation avionics were defined for integration into an advanced hardware mechanization for demonstration in a Cessna 402B aircraft. Block diagrams are shown and system and computer architecture as well as significant hardware elements are described. The multifunction integrated data control center and electronic horizontal situation indicator are discussed. The functions that the DAAS will perform are examined. This function definition is the basis for the DAAS hardware and software design.
ERIC Educational Resources Information Center
McNinch, George H., Ed.; And Others
Conference presentations of research on reading comprehension, reading instruction, computer applications in reading instruction, and reading theory are compiled in this yearbook. Titles and authors of some of the articles are as follows: "A Rationale for Teaching Children with Limited English Proficiency" (M. Zintz); "Preliminary Development of a…
Development of a Computer-Based Measure of Listening Comprehension of Science Talk
ERIC Educational Resources Information Center
Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien
2015-01-01
The purpose of this study was to develop a computer-based assessment for elementary school students' listening comprehension of science talk within an inquiry-oriented environment. The development procedure had 3 steps: a literature review to define the framework of the test, collecting and identifying key constructs of science talk, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orrell, S.; Ralstin, S.
1992-04-01
Many computer security plans specify that only a small percentage of the data processed will be classified. Thus, the bulk of the data on secure systems must be unclassified. Secure limited access sites operating approved classified computing systems sometimes also have a system ostensibly containing only unclassified files but operating within the secure environment. That system could be networked or otherwise connected to a classified system(s) in order that both be able to use common resources for file storage or computing power. Such a system must operate under the same rules as the secure classified systems. It is in the nature of unclassified files that they either came from, or will eventually migrate to, a non-secure system. Today, unclassified files are exported from systems within the secure environment typically by loading transport media and carrying them to an open system. Import of unclassified files is handled similarly. This media transport process, sometimes referred to as sneaker net, often is manually logged and controlled only by administrative procedures. A comprehensive system for secure bi-directional transfer of unclassified files between secure and open environments has yet to be developed. Any such secure file transport system should be required to meet several stringent criteria. It is the purpose of this document to begin a definition of these criteria.
Student Use of Physics to Make Sense of Incomplete but Functional VPython Programs in a Lab Setting
NASA Astrophysics Data System (ADS)
Weatherford, Shawn A.
2011-12-01
Computational activities in Matter & Interactions, an introductory calculus-based physics course, have the instructional goal of providing students with the experience of applying the same small set of fundamental principles to model a wide range of physical systems. However, there are significant instructional challenges for students to build computer programs under limited time constraints, especially for students who are unfamiliar with programming languages and concepts. Prior attempts at designing effective computational activities were successful at having students ultimately build working VPython programs under the tutelage of experienced teaching assistants in a studio lab setting. A pilot study revealed that students who completed these computational activities had significant difficulty repeating the exact same tasks and, further, had difficulty predicting the animation that would be produced by the example program after interpreting the program code. This study explores the interpretation and prediction tasks as part of an instructional sequence where students are asked to read and comprehend a functional but incomplete program. Rather than asking students to begin their computational tasks with modifying program code, we explicitly ask students to interpret an existing program that is missing key lines of code. The missing lines of code correspond to the algebraic form of fundamental physics principles or the calculation of forces which would exist between analogous physical objects in the natural world. Students are then asked to draw a prediction of what they would see in the simulation produced by the VPython program and ultimately run the program to evaluate the students' prediction. This study specifically looks at how the participants use physics while interpreting the program code and creating a whiteboard prediction.
This study also examines how students evaluate their understanding of the program and modification goals at the beginning of the modification task. While working in groups over the course of a semester, study participants were recorded while they completed three activities using these incomplete programs. Analysis of the video data showed that study participants had little difficulty interpreting physics quantities, generating a prediction, or determining how to modify the incomplete program. Participants did not base their prediction solely on the information from the incomplete program. When participants tried to predict the motion of the objects in the simulation, many turned to their knowledge of how the system would evolve if it represented an analogous real-world physical system. For example, participants attributed the real-world behavior of springs to helix objects even though the program did not include calculations for the spring to exert a force when stretched. Participants rarely interpreted lines of code in the computational loop during the first computational activity, but this changed during later computational activities, with most participants using their physics knowledge to interpret the computational loop. Computational activities in the Matter & Interactions curriculum were revised in light of these findings to include an instructional sequence of tasks to build a comprehension of the example program. The modified activities also ask students to create an additional whiteboard prediction for the time-evolution of the real-world phenomena which the example program will eventually model.
This thesis shows how comprehension tasks identified by Palincsar and Brown (1984) as effective in improving reading comprehension are also effective in helping students apply their physics knowledge to interpret a computer program which attempts to model a real-world phenomenon, and to identify errors in their understanding of the use, or omission, of fundamental physics principles in a computational model.
Simulation framework for intelligent transportation systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewing, T.; Doss, E.; Hanebutte, U.
1996-10-01
A simulation framework has been developed for a large-scale, comprehensive, scaleable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed is based on human factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scaleable to take advantage of emerging massively parallel processor (MPP) systems.
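The abstract's novel feature, vehicles as autonomous processes exchanging messages with a Traffic Management Center, can be sketched in miniature. The Python below is illustrative only: the class names, message format, and rerouting rule are our assumptions, not the simulator's actual design, and the exchange runs sequentially rather than as true parallel processes.

```python
from collections import deque

class TMC:
    """Toy Traffic Management Center: tracks probe vehicles and answers
    each probe report with an advisory carrying the current link time."""
    def __init__(self, link_times):
        self.link_times = link_times   # link id -> travel time (minutes)
        self.tracked = {}              # probe-vehicle positions "on display"

    def handle(self, msg):
        kind, vid, link = msg
        if kind == "probe":
            self.tracked[vid] = link
            return ("advisory", vid, self.link_times.get(link, 1.0))

class Vehicle:
    """Autonomous vehicle: reports its position, receives the advisory,
    and reacts to congestion much like a real driver would."""
    def __init__(self, vid, route):
        self.vid = vid
        self.route = deque(route)
        self.inbox = deque()

    def step(self, tmc):
        link = self.route[0]
        reply = tmc.handle(("probe", self.vid, link))  # two-way message exchange
        self.inbox.append(reply)
        _, _, link_time = reply
        if link_time > 5.0 and len(self.route) > 1:
            self.route.popleft()                       # reroute off the slow link
        return link_time
```

In the real simulator each vehicle would run as its own process on a parallel machine; serializing the exchange here just makes the behavior model easy to follow.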
Psychology of computer use: XXXII. Computer screen-savers as distractors.
Volk, F A; Halcomb, C G
1994-12-01
The differences in performance of 16 male and 16 female undergraduates on three cognitive tasks were investigated in the presence of visual distractors (computer-generated dynamic graphic images). These tasks included skilled and unskilled proofreading and listening comprehension. The visually demanding task of proofreading (skilled and unskilled) showed no significant decreases in performance in the distractor conditions. Results showed significant decrements, however, in performance on listening comprehension in at least one of the distractor conditions.
Error Monitoring in Speech Production: A Computational Test of the Perceptual Loop Theory.
ERIC Educational Resources Information Center
Hartsuiker, Robert J.; Kolk, Herman H. J.
2001-01-01
Tested whether an elaborated version of the perceptual loop theory (W. Levelt, 1983) and the main interruption rule was consistent with existing time course data (E. Blackmer and E. Mitton, 1991; C. Oomen and A. Postma, in press). The study suggests that including an inner loop through the speech comprehension system generates predictions that fit…
Meyer, L C
1997-06-01
This article provides an overview of the issues and effects of principle-centered health care within organized systems of care; portrays a comprehensive disease management framework for home health care; and offers virtual health management, telecommunications, and mobile computing strategies to enable health management enterprises to achieve health and outcomes maximization accountability demands in managed care.
Blade Displacement Predictions for the Full-Scale UH-60A Airloads Rotor
NASA Technical Reports Server (NTRS)
Biedron, Robert T.; Lee-Rausch, Elizabeth M.
2014-01-01
An unsteady Reynolds-Averaged Navier-Stokes solver for unstructured grids is loosely coupled to a rotorcraft comprehensive code and used to simulate two different test conditions from a wind-tunnel test of a full-scale UH-60A rotor. Performance data and sectional airloads from the simulation are compared with corresponding tunnel data to assess the level of fidelity of the aerodynamic aspects of the simulation. The focus then turns to a comparison of the blade displacements, both rigid (blade root) and elastic. Comparisons of computed root motions are made with data from three independent measurement systems. Finally, comparisons are made between computed elastic bending and elastic twist, and the corresponding measurements obtained from a photogrammetry system. Overall the correlation between computed and measured displacements was good, especially for the root pitch and lag motions and the elastic bending deformation. The correlation of root lead-lag motion and elastic twist deformation was less favorable.
Solution techniques for transient stability-constrained optimal power flow – Part II
Geng, Guangchao; Abhyankar, Shrirang; Wang, Xiaoyu; ...
2017-06-28
Transient stability-constrained optimal power flow is an important emerging problem, with power systems pushed to their limits for economic benefits, denser and larger interconnected systems, and reduced inertia due to the expected proliferation of renewable energy resources. In this study, two further approaches, single machine equivalent and computational intelligence, are presented. Also discussed are various application areas and future directions in this research area. In conclusion, a comprehensive resource for the available literature, publicly available test systems, and relevant numerical libraries is also provided.
Joshi, Vinayak; Agurto, Carla; VanNess, Richard; Nemeth, Sheila; Soliz, Peter; Barriga, Simon
2014-01-01
One of the most important signs of systemic disease that presents on the retina is vascular abnormality, such as in hypertensive retinopathy. Manual analysis of fundus images by human readers is qualitative and lacks accuracy, consistency, and repeatability. Present semi-automatic methods for vascular evaluation are reported to increase accuracy and reduce reader variability, but require extensive reader interaction, thus limiting software-aided efficiency. Automation thus holds a twofold promise: first, to decrease variability while increasing accuracy, and second, to increase efficiency. In this paper we propose fully automated software as a second-reader system for comprehensive assessment of retinal vasculature, which aids readers in the quantitative characterization of vessel abnormalities in fundus images. This system provides the reader with objective measures of vascular morphology such as tortuosity and branching angles, as well as highlights of areas with abnormalities such as artery-venous nicking, copper and silver wiring, and retinal emboli, in order for the reader to make a final screening decision. To test the efficacy of our system, we evaluated the change in performance of a newly certified retinal reader when grading a set of 40 color fundus images with and without the assistance of the software. The results demonstrated an improvement in the reader's performance with the software assistance, in terms of accuracy of detection of vessel abnormalities, determination of retinopathy, and reading time. This system enables the reader to make computer-assisted vasculature assessment with high accuracy and consistency, at a reduced reading time.
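One of the morphology measures named in the abstract, tortuosity, has a standard simple definition: the ratio of a vessel centerline's arc length to its chord length. A minimal sketch follows; the function name and the choice of this particular arc-to-chord metric are ours, and the paper's software may compute tortuosity differently.

```python
import math

def tortuosity(points):
    """Arc-length over chord-length of a vessel centerline given as a
    list of (x, y) points: 1.0 for a straight vessel, larger when tortuous."""
    arc = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    chord = math.dist(points[0], points[-1])
    return arc / chord
```

For example, a centerline that detours through a bend accumulates extra arc length while its endpoints stay fixed, so its ratio rises above 1.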
National meeting to review IPAD status and goals. [Integrated Programs for Aerospace-vehicle Design]
NASA Technical Reports Server (NTRS)
Fulton, R. E.
1980-01-01
A joint NASA/industry project called Integrated Programs for Aerospace-vehicle Design (IPAD) is described, which has the goal of raising aerospace-industry productivity through the application of computers to integrate company-wide management of engineering data. Basically a general-purpose interactive computing system developed to support engineering design processes, the IPAD design is composed of three major software components: the executive, data management, and geometry and graphics software. Results of IPAD activities include a comprehensive description of a future representative aerospace vehicle design process and its interface to manufacturing, and requirements and preliminary design of a future IPAD software system to integrate engineering activities of an aerospace company having several products under simultaneous development.
ERIC Educational Resources Information Center
Rodriguez, Cathi Draper; Filler, John; Higgins, Kyle
2012-01-01
Through this exploratory study the authors investigated the effects of primary language support delivered via computer on the English reading comprehension skills of English language learners. Participants were 28 First-grade students identified as Limited English Proficient. The primary language of all participants was Spanish. Students were…
The Effect of Gloss Type and Mode on Iranian EFL Learners' Reading Comprehension
ERIC Educational Resources Information Center
Sadeghi, Karim; Ahmadi, Negar
2012-01-01
This study investigated the effects of three kinds of gloss conditions that is traditional non-CALL marginal gloss, computer-based audio gloss, and computer-based extended audio gloss, on reading comprehension of Iranian EFL learners. To this end, three experimental and one control groups, each comprising 15 participants, took part in this study.…
Computer-Assisted Training in the Comprehension of Authentic French Speech: A Closer View
ERIC Educational Resources Information Center
Hoeflaak, Arie
2004-01-01
In this article, the development of a computer-assisted listening comprehension project is described. First, we comment briefly on the points of departure, the need for autonomous learning against the background of recent changes in Dutch education, and the role of learning strategies. Then, an error analysis, the programs used for this project,…
ERIC Educational Resources Information Center
Ponce, Hector R.; Lopez, Mario J.; Mayer, Richard E.
2012-01-01
This article examines the effectiveness of a computer-based instructional program (e-PELS) aimed at direct instruction in a collection of reading comprehension strategies. In e-PELS, students learn to highlight and outline expository passages based on various types of text structures (such as comparison or cause-and-effect) as well as to…
Dynamic file-access characteristics of a production parallel scientific workload
NASA Technical Reports Server (NTRS)
Kotz, David; Nieuwejaar, Nils
1994-01-01
Multiprocessors have permitted astounding increases in computational performance, but many cannot meet the intense I/O requirements of some scientific applications. An important component of any solution to this I/O bottleneck is a parallel file system that can provide high-bandwidth access to tremendous amounts of data in parallel to hundreds or thousands of processors. Most successful systems are based on a solid understanding of the expected workload, but thus far there have been no comprehensive workload characterizations of multiprocessor file systems. This paper presents the results of a three-week tracing study in which all file-related activity on a massively parallel computer was recorded. Our instrumentation differs from previous efforts in that it collects information about every I/O request and about the mix of jobs running in a production environment. We also present the results of a trace-driven caching simulation and recommendations for designers of multiprocessor file systems.
An Optimization Framework for Dynamic Hybrid Energy Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenbo Du; Humberto E Garcia; Christiaan J.J. Paredis
A computational framework for the efficient analysis and optimization of dynamic hybrid energy systems (HES) is developed. A microgrid system with multiple inputs and multiple outputs (MIMO) is modeled using the Modelica language in the Dymola environment. The optimization loop is implemented in MATLAB, with the FMI Toolbox serving as the interface between the computational platforms. Two characteristic optimization problems are selected to demonstrate the methodology and gain insight into the system performance. The first is an unconstrained optimization problem that optimizes the dynamic properties of the battery, reactor and generator to minimize variability in the HES. The second problem takes operating and capital costs into consideration by imposing linear and nonlinear constraints on the design variables. The preliminary optimization results obtained in this study provide an essential step towards the development of a comprehensive framework for designing HES.
Representing spatial information in a computational model for network management
NASA Technical Reports Server (NTRS)
Blaisdell, James H.; Brownfield, Thomas F.
1994-01-01
While currently available relational database management systems (RDBMS) allow inclusion of spatial information in a data model, they lack tools for presenting this information in an easily comprehensible form. Computer-aided design (CAD) software packages provide adequate functions to produce drawings, but still require manual placement of symbols and features. This project has demonstrated a bridge between the data model of an RDBMS and the graphic display of a CAD system. It is shown that the CAD system can be used to control the selection of data with spatial components from the database and then quickly plot that data on a map display. It is shown that the CAD system can be used to extract data from a drawing and then control the insertion of that data into the database. These demonstrations were successful in a test environment that incorporated many features of known working environments, suggesting that the techniques developed could be adapted for practical use.
V/STOLAND digital avionics system for XV-15 tilt rotor
NASA Technical Reports Server (NTRS)
Liden, S.
1980-01-01
A digital flight control system for the tilt rotor research aircraft provides sophisticated navigation, guidance, control, display and data acquisition capabilities for performing terminal area navigation, guidance and control research. All functions of the XV-15 V/STOLAND system were demonstrated on the NASA-ARC S-19 simulation facility under a comprehensive dynamic acceptance test. The most noteworthy accomplishments of the system are: (1) automatic configuration control of a tilt-rotor aircraft over the total operating range; (2) total hands-off landing to touchdown on various selectable straight-in glide slopes and on a flight path that includes a two-revolution helix; (3) automatic guidance along a programmed three-dimensional reference flight path; (4) navigation data for the automatic guidance computed on board, based on VOR/DME, TACAN, or MLS navaid data; and (5) integration of a large set of functions in a single computer, utilizing 16k words of storage for programs and data.
A computational approach to climate science education with CLIMLAB
NASA Astrophysics Data System (ADS)
Rose, B. E. J.
2017-12-01
CLIMLAB is a Python-based software toolkit for interactive, process-oriented climate modeling for use in education and research. It is motivated by the need for simpler tools and more reproducible workflows with which to "fill in the gaps" between blackboard-level theory and the results of comprehensive climate models. With CLIMLAB you can interactively mix and match physical model components, or combine simpler process models together into a more comprehensive model. I use CLIMLAB in the classroom to put models in the hands of students (undergraduate and graduate), and emphasize a hierarchical, process-oriented approach to understanding the key emergent properties of the climate system. CLIMLAB is equally a tool for climate research, where the same needs exist for more robust, process-based understanding and reproducible computational results. I will give an overview of CLIMLAB and an update on recent developments, including: a full-featured, well-documented, interactive implementation of a widely-used radiation model (RRTM); packaging with conda-forge for compiler-free (and hassle-free!) installation on Mac, Windows and Linux; interfacing with xarray for I/O and graphics with gridded model data; and a rich and growing collection of examples and self-computing lecture notes in Jupyter notebook format.
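The "hierarchy of models" idea can be illustrated by its simplest rung, a zero-dimensional energy-balance model, written here in plain Python rather than with CLIMLAB's own API; the parameter values are common textbook defaults, not taken from CLIMLAB.

```python
# Zero-dimensional energy balance: C dT/dt = S/4 * (1 - alpha) - eps * sigma * T^4
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def step_ebm(T, dt, S=1365.0, alpha=0.3, eps=0.61, C=4.0e8):
    """Advance global-mean surface temperature T (K) one forward-Euler step of dt seconds."""
    asr = (S / 4.0) * (1.0 - alpha)  # absorbed shortwave radiation, W m^-2
    olr = eps * SIGMA * T ** 4       # outgoing longwave radiation, W m^-2
    return T + dt * (asr - olr) / C  # C: effective heat capacity, J m^-2 K^-1

def integrate(T0, years, dt=86400.0):
    """Step the model forward with daily time steps for the given number of years."""
    T = T0
    for _ in range(int(years * 365)):
        T = step_ebm(T, dt)
    return T
```

With these defaults the model relaxes toward an equilibrium near 288 K regardless of the initial temperature, which is the kind of emergent, process-level behavior the hierarchical approach is meant to expose before students move on to comprehensive models.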
Executive control systems in the engineering design environment
NASA Technical Reports Server (NTRS)
Hurst, P. W.; Pratt, T. W.
1985-01-01
Executive Control Systems (ECSs) are software structures for the unification of various engineering design application programs into comprehensive systems with a central user interface (uniform access) method and a data management facility. Attention is presently given to the most significant findings of a research program covering 24 ECSs used in government and industry engineering design environments to integrate CAD/CAE application programs. Characterizations are given for the systems' major architectural components and the alternative design approaches considered in their development. Attention is given to ECS development prospects in the areas of interdisciplinary usage, standardization, knowledge utilization, and computer science technology transfer.
A comprehensive inpatient discharge system.
O'Connell, E. M.; Teich, J. M.; Pedraza, L. A.; Thomas, D.
1996-01-01
Our group has developed a computer system that supports all phases of the inpatient discharge process. The system fills in most of the physician's discharge order form and the nurse's discharge abstract, using information available from sign-out, order entry, scheduling, and other databases. It supplies information for referrals to outside institutions, and provides a variety of instruction materials for patients. Discharge forms can be completed in advance, so that the patient is not waiting for final paperwork. Physicians and nurses can work on their components independently, rather than in series. Response to the system has been very favorable. PMID:8947755
Martins, Goncalo; Moondra, Arul; Dubey, Abhishek; Bhattacharjee, Anirban; Koutsoukos, Xenofon D.
2016-01-01
In modern networked control applications, confidentiality and integrity are important features to address in order to protect against attacks. Moreover, network control systems are a fundamental part of the communication components of current cyber-physical systems (e.g., automotive communications). Many networked control systems employ Time-Triggered (TT) architectures that provide mechanisms enabling the exchange of precise and synchronous messages. TT systems have computation and communication constraints, and with the aim of enabling secure communication in the network, it is important to evaluate the computational and communication overhead of implementing secure communication mechanisms. This paper presents a comprehensive analysis and evaluation of the effects of adding a Hash-based Message Authentication Code (HMAC) to TT networked control systems. The contributions of the paper include (1) the analysis and experimental validation of the communication overhead, as well as a scalability analysis that utilizes the experimental result for both wired and wireless platforms and (2) an experimental evaluation of the computational overhead of HMAC based on a kernel-level Linux implementation. An automotive application is used as an example, and the results show that it is feasible to implement a secure communication mechanism without interfering with the existing automotive controller execution times. The methods and results of the paper can be used for evaluating the performance impact of security mechanisms and, thus, for the design of secure wired and wireless TT networked control systems. PMID:27463718
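The mechanism whose overhead the paper measures, an HMAC tag attached to each network message, can be sketched with Python's standard hmac module. The tag-truncation parameter below is our addition, included to show the bandwidth/security trade-off such an overhead analysis weighs; it is not a detail taken from the paper.

```python
import hashlib
import hmac

def authenticate(key, payload, truncate=None):
    """Tag a time-triggered message with HMAC-SHA256; truncating the tag
    reduces per-message communication overhead at some cost in security."""
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return tag if truncate is None else tag[:truncate]

def verify(key, payload, tag):
    """Recompute the tag and compare in constant time to resist timing attacks."""
    expected = authenticate(key, payload, truncate=len(tag))
    return hmac.compare_digest(expected, tag)
```

Per message, the communication overhead is simply the tag length (32 bytes untruncated), while the computational overhead is one keyed hash on send and one on receive.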
Visualising elastic anisotropy: theoretical background and computational implementation
NASA Astrophysics Data System (ADS)
Nordmann, J.; Aßmus, M.; Altenbach, H.
2018-02-01
In this article, we present the technical realisation for visualisations of characteristic parameters of the fourth-order elasticity tensor, which is classified by three-dimensional symmetry groups. Hereby, expressions for spatial representations of Young's modulus and bulk modulus as well as plane representations of shear modulus and Poisson's ratio are derived and transferred into a form comprehensible to computer algebra systems. Additionally, we present approaches for spatial representations of the latter two parameters. These three- and two-dimensional representations are implemented in the software MATLAB (MATrix LABoratory). Exemplary representations of characteristic materials complete the present treatise.
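For the special case of cubic symmetry, the spatial representation of Young's modulus reduces to a closed-form expression in the Voigt compliances. The sketch below uses that standard textbook formula from anisotropic elasticity; it is not transcribed from the article, whose treatment covers general symmetry groups.

```python
import math

def youngs_modulus_cubic(n, S11, S12, S44):
    """Directional Young's modulus E(n) of a cubic material:
    1/E(n) = S11 - 2*(S11 - S12 - S44/2) * (n1^2 n2^2 + n2^2 n3^2 + n3^2 n1^2),
    with S11, S12, S44 the Voigt compliances (1/Pa) and n any direction vector."""
    norm = math.sqrt(sum(c * c for c in n))
    n1, n2, n3 = (c / norm for c in n)
    J = (n1 * n2) ** 2 + (n2 * n3) ** 2 + (n3 * n1) ** 2
    return 1.0 / (S11 - 2.0 * (S11 - S12 - 0.5 * S44) * J)
```

When S44 = 2(S11 - S12) the anisotropy factor vanishes and E(n) is the same in every direction, the isotropic limit; sweeping n over the unit sphere otherwise produces exactly the kind of spatial surface the article visualises.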
CFD Techniques for Propulsion Applications
NASA Technical Reports Server (NTRS)
1992-01-01
The symposium was composed of the following sessions: turbomachinery computations and validations; flow in ducts, intakes, and nozzles; and reacting flows. Forty papers were presented, and they covered full 3-D code validation and numerical techniques; multidimensional reacting flow; and unsteady viscous flow for the entire spectrum of propulsion system components. The capabilities of the various numerical techniques were assessed and significant new developments were identified. The technical evaluation spells out where progress has been made and concludes that the present state of the art has almost reached the level necessary to tackle the comprehensive topic of computational fluid dynamics (CFD) validation for propulsion.
Computational and Experimental Analysis of the Secretome of Methylococcus capsulatus (Bath)
Indrelid, Stine; Mathiesen, Geir; Jacobsen, Morten; Lea, Tor; Kleiveland, Charlotte R.
2014-01-01
The Gram-negative methanotroph Methylococcus capsulatus (Bath) was recently demonstrated to abrogate inflammation in a murine model of inflammatory bowel disease, suggesting interactions with cells involved in maintaining mucosal homeostasis and emphasizing the importance of understanding the many properties of M. capsulatus. Secreted proteins determine how bacteria may interact with their environment, and a comprehensive knowledge of such proteins is therefore vital to understand bacterial physiology and behavior. The aim of this study was to systematically analyze protein secretion in M. capsulatus (Bath) by identifying the secretion systems present and the respective secreted substrates. Computational analysis revealed that in addition to previously recognized type II secretion systems and a type VII secretion system, a type Vb (two-partner) secretion system and putative type I secretion systems are present in M. capsulatus (Bath). In silico analysis suggests that the diverse secretion systems in M. capsulatus transport proteins likely to be involved in adhesion, colonization, nutrient acquisition and homeostasis maintenance. Results of the computational analysis were verified and extended by an experimental approach showing that, in addition, an uncharacterized protein and putative moonlighting proteins are released into the medium during exponential growth of M. capsulatus (Bath). PMID:25479164
Numerosity as a topological invariant.
Kluth, Tobias; Zetzsche, Christoph
2016-01-01
The ability to quickly recognize the number of objects in our environment is a fundamental cognitive function. However, it is far from clear which computations and which actual neural processing mechanisms are used to provide us with such a skill. Here we try to provide a detailed and comprehensive analysis of this issue, which comprises both the basic mathematical foundations and the peculiarities imposed by the structure of the visual system and by the neural computations provided by the visual cortex. We suggest that numerosity should be considered as a mathematical invariant. Making use of concepts from mathematical topology--like connectedness, Betti numbers, and the Gauss-Bonnet theorem--we derive the basic computations suited for the computation of this invariant. We show that the computation of numerosity is possible in a neurophysiologically plausible fashion using only computational elements which are known to exist in the visual cortex. We further show that a fundamental feature of numerosity perception, its Weber property, arises naturally, assuming noise in the basic neural operations. The model is tested on an extended data set (made publicly available). It is hoped that our results can provide a general framework for future research on the invariance properties of the numerosity system.
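The connectedness-based computation the authors build on can be illustrated with the zeroth Betti number of a binary image, i.e. the count of connected foreground components; a minimal flood-fill sketch (the 4-connectivity choice and the toy scene are assumptions, not the paper's neural implementation):

```python
def count_objects(grid):
    """Zeroth Betti number: number of 4-connected foreground components."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and (r, c) not in seen:
                count += 1                     # found a new component
                stack = [(r, c)]
                while stack:                   # flood-fill the component
                    y, x = stack.pop()
                    if ((y, x) in seen or not (0 <= y < rows)
                            or not (0 <= x < cols) or not grid[y][x]):
                        continue
                    seen.add((y, x))
                    stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

scene = [[1, 1, 0, 0, 1],
         [1, 0, 0, 0, 1],
         [0, 0, 1, 0, 0]]   # three separate "objects"
```

The paper's contribution is showing that an invariant of this kind can be computed by neurophysiologically plausible elements, with the Weber property emerging from noise in those operations.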
Using a voice to put a name to a face: the psycholinguistics of proper name comprehension.
Barr, Dale J; Jackson, Laura; Phillips, Isobel
2014-02-01
We propose that hearing a proper name (e.g., Kevin) in a particular voice serves as a compound memory cue that directly activates representations of a mutually known target person, often permitting reference resolution without any complex computation of shared knowledge. In a referential communication study, pairs of friends played a communication game, in which we monitored the eyes of one friend (the addressee) while he or she sought to identify the target person, in a set of four photos, on the basis of a name spoken aloud. When the name was spoken by a friend, addressees rapidly identified the target person, and this facilitation was independent of whether the friend was articulating a message he or she had designed versus one from a third party with whom the target person was not shared. Our findings suggest that the comprehension system takes advantage of regularities in the environment to minimize effortful computation about who knows what.
Comparison of Computed and Measured Vortex Evolution for a UH-60A Rotor in Forward Flight
NASA Technical Reports Server (NTRS)
Ahmad, Jasim Uddin; Yamauchi, Gloria K.; Kao, David L.
2013-01-01
A Computational Fluid Dynamics (CFD) simulation using the Navier-Stokes equations was performed to determine the evolutionary and dynamical characteristics of the vortex flowfield for a highly flexible aeroelastic UH-60A rotor in forward flight. The experimental wake data were acquired using Particle Image Velocimetry (PIV) during a test of the fullscale UH-60A rotor in the National Full-Scale Aerodynamics Complex 40- by 80-Foot Wind Tunnel. The PIV measurements were made in a stationary cross-flow plane at 90 deg rotor azimuth. The CFD simulation was performed using the OVERFLOW CFD solver loosely coupled with the rotorcraft comprehensive code CAMRAD II. Characteristics of vortices captured in the PIV plane from different blades are compared with CFD calculations. The blade airloads were calculated using two different turbulence models. A limited spatial, temporal, and CFD/comprehensive-code coupling sensitivity analysis was performed in order to verify the unsteady helicopter simulations with a moving rotor grid system.
Development of a computer-assisted learning software package on dental traumatology.
Tolidis, K; Crawford, P; Stephens, C; Papadogiannis, Y; Plakias, C
1998-10-01
The development of computer-assisted learning software packages is a relatively new field of computer application. The progress made in personal computer technology toward more user-friendly operating systems has stimulated the academic community to develop computer-assisted learning for pre- and postgraduate students. The ability of computers to combine audio and visual data in an interactive form provides a powerful educational tool. The purpose of this study was to develop and evaluate a computer-assisted learning package on dental traumatology. This program contains background information on the diagnosis, classification, and management of dental injuries in both the permanent and the deciduous dentitions. It is structured into chapters according to the nature of the injury and whether injury has occurred in the primary or permanent dentition. At the end of each chapter there is a self-assessment questionnaire as well as references to relevant literature. Extensive use of pictures and video provides a comprehensive overview of the subject.
Wurdack, C M
1997-01-01
Computers are changing the way we do everything from paying our bills to programming our home entertainment systems. If you thought that dental education was not likely to benefit from computers, consider this: Computer technology is revolutionizing dental instruction in ways that promise to improve the quality and efficiency of dental education. It is providing a challenging learning opportunity for dental educators as well. Since much of dental education involves the visual transfer of both concepts and procedures from the instructor to the student, it makes sense to use computer technology to enhance conventional teaching techniques--with materials that include clear, informative images and real-time demonstrations melding sound and animation--to deliver to the student in the classroom material that complements textbooks, 35mm slides, and the lecture format. Use of computers at UOP is about teaching students to be competent dentists by making instruction more direct, better visualized, and more comprehensible.
A comprehensive map of the mTOR signaling network
Caron, Etienne; Ghosh, Samik; Matsuoka, Yukiko; Ashton-Beaucage, Dariel; Therrien, Marc; Lemieux, Sébastien; Perreault, Claude; Roux, Philippe P; Kitano, Hiroaki
2010-01-01
The mammalian target of rapamycin (mTOR) is a central regulator of cell growth and proliferation. mTOR signaling is frequently dysregulated in oncogenic cells, and thus an attractive target for anticancer therapy. Using CellDesigner, a modeling support software for graphical notation, we present herein a comprehensive map of the mTOR signaling network, which includes 964 species connected by 777 reactions. The map complies with both the systems biology markup language (SBML) and graphical notation (SBGN) for computational analysis and graphical representation, respectively. As captured in the mTOR map, we review and discuss our current understanding of the mTOR signaling network and highlight the impact of mTOR feedback and crosstalk regulations on drug-based cancer therapy. This map is available on the Payao platform, a Web 2.0 based community-wide interactive process for creating more accurate and information-rich databases. Thus, this comprehensive map of the mTOR network will serve as a tool to facilitate systems-level study of up-to-date mTOR network components and signaling events toward the discovery of novel regulatory processes and therapeutic strategies for cancer. PMID:21179025
Jacobs, Jeffrey P
2002-01-01
The field of congenital heart surgery has the opportunity to create the first comprehensive international database for a medical subspecialty. An understanding of the demographics of congenital heart disease and the rapid growth of computer technology leads to the realization that creating a comprehensive international database for pediatric cardiac surgery represents an important and achievable goal. The evolution of computer-based data analysis creates an opportunity to develop software to manage an international congenital heart surgery database and eventually become an electronic medical record. The same database data set for congenital heart surgery is now being used in Europe and North America. Additional work is under way to involve Africa, Asia, Australia, and South America. The almost simultaneous publication of the European Association for Cardio-thoracic Surgery/Society of Thoracic Surgeons coding system and the Association for European Paediatric Cardiology coding system resulted in the potential for multiple coding. Representatives of the Association for European Paediatric Cardiology, Society of Thoracic Surgeons, European Association for Cardio-thoracic Surgery, and European Congenital Heart Surgeons Foundation agree that these hierarchical systems are complementary and not competitive. An international committee will map the two systems. The ideal coding system will permit a diagnosis or procedure to be coded only one time with mapping allowing this code to be used for patient care, billing, practice management, teaching, research, and reporting to governmental agencies. The benefits of international data gathering and sharing are global, with the long-term goal of the continued upgrade in the quality of congenital heart surgery worldwide. Copyright 2002 by W.B. Saunders Company
DBSecSys 2.0: a database of Burkholderia mallei and Burkholderia pseudomallei secretion systems.
Memišević, Vesna; Kumar, Kamal; Zavaljevski, Nela; DeShazer, David; Wallqvist, Anders; Reifman, Jaques
2016-09-20
Burkholderia mallei and B. pseudomallei are the causative agents of glanders and melioidosis, respectively, diseases with high morbidity and mortality rates. B. mallei and B. pseudomallei are closely related genetically; B. mallei evolved from an ancestral strain of B. pseudomallei by genome reduction and adaptation to an obligate intracellular lifestyle. Although these two bacteria cause different diseases, they share multiple virulence factors, including bacterial secretion systems, which represent key components of bacterial pathogenicity. Despite recent progress, the secretion system proteins for B. mallei and B. pseudomallei, their pathogenic mechanisms of action, and host factors are not well characterized. We previously developed a manually curated database, DBSecSys, of bacterial secretion system proteins for B. mallei. Here, we report an expansion of the database with corresponding information about B. pseudomallei. DBSecSys 2.0 contains comprehensive literature-based and computationally derived information about B. mallei ATCC 23344 and literature-based and computationally derived information about B. pseudomallei K96243. The database contains updated information for 163 B. mallei proteins from the previous database and 61 additional B. mallei proteins, and new information for 281 B. pseudomallei proteins associated with 5 secretion systems, their 1,633 human- and murine-interacting targets, and 2,400 host-B. mallei interactions and 2,286 host-B. pseudomallei interactions. The database also includes information about 13 pathogenic mechanisms of action for B. mallei and B. pseudomallei secretion system proteins inferred from the available literature or computationally. Additionally, DBSecSys 2.0 provides details about 82 virulence attenuation experiments for 52 B. mallei secretion system proteins and 98 virulence attenuation experiments for 61 B. pseudomallei secretion system proteins. 
We updated the Web interface and data access layer to speed up users' searches for detailed information on orthologous proteins related to the secretion systems of the two pathogens. The updates of DBSecSys 2.0 provide unique capabilities to access comprehensive information about secretion systems of B. mallei and B. pseudomallei. They enable studies and comparisons of corresponding proteins of these two closely related pathogens and their host-interacting partners. The database is available at http://dbsecsys.bhsai.org.
Equivalent ZF precoding scheme for downlink indoor MU-MIMO VLC systems
NASA Astrophysics Data System (ADS)
Fan, YangYu; Zhao, Qiong; Kang, BoChao; Deng, LiJun
2018-01-01
In indoor visible light communication (VLC) systems, the channels of photo detectors (PDs) at one user are highly correlated, which determines the choice of spatial diversity model for individual users. In a spatial diversity model, the signals received by PDs belonging to one user carry the same information, and can be combined directly. Based on the above, we propose an equivalent zero-forcing (ZF) precoding scheme for multiple-user multiple-input multiple-output (MU-MIMO) VLC systems by transforming an indoor MU-MIMO VLC system into an indoor multiple-user multiple-input single-output (MU-MISO) VLC system through simple processing. The power constraints of light emitting diodes (LEDs) are also taken into account. Comprehensive computer simulations in three scenarios indicate that our scheme can not only reduce the computational complexity, but also guarantee the system performance. Furthermore, the proposed scheme does not require noise information in calculating the precoding weights, and has no restrictions on the numbers of APs and PDs.
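Once each user's PD signals are combined, the equivalent channel has one row per user and the ZF precoder is simply its right pseudo-inverse; a numpy sketch (the channel gains below are random stand-ins, not measured VLC gains, and the LED power constraints are omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)

# Equivalent channel after combining each user's PD signals: one row per
# user (MU-MISO form). Indoor VLC gains are real-valued and nonnegative,
# so random positive values stand in for them here.
n_users, n_leds = 3, 4
H = rng.uniform(0.1, 1.0, size=(n_users, n_leds))

# Zero-forcing precoder: right pseudo-inverse of the equivalent channel.
# Note that no noise statistics enter the computation.
W = H.T @ np.linalg.inv(H @ H.T)

# Multi-user interference is removed: each user sees only its own stream
interference_free = np.allclose(H @ W, np.eye(n_users))
```

Because H @ W reduces to the identity, each user's received signal depends only on its own data stream, which is the interference-free property the simulations verify.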
Operating manual for coaxial injection combustion model. [for the space shuttle main engine
NASA Technical Reports Server (NTRS)
Sutton, R. D.; Schuman, M. D.; Chadwick, W. D.
1974-01-01
An operating manual for the coaxial injection combustion model (CICM) is presented as the final report for an eleven-month effort designed to improve, verify, and document the comprehensive computer program for analyzing the performance of thrust chamber operation with gas/liquid coaxial jet injection. The effort culminated in delivery of an operational FORTRAN IV computer program and associated documentation pertaining to the combustion conditions in the space shuttle main engine. The computer program is structured for compatibility with the standardized Joint Army-Navy-NASA-Air Force (JANNAF) performance evaluation procedure. Use of the CICM in conjunction with the JANNAF procedure allows the analysis of engine systems using coaxial gas/liquid injection.
NASA Astrophysics Data System (ADS)
Speidel, Steven
1992-08-01
Our ultimate goal is to develop neural-like cognitive sensory processing within non-neuronal systems. Toward this end, computational models are being developed for selectively attending to the task-relevant parts of composite sensory excitations in an example sound processing application. Significant stimulus partials are selectively attended through the use of generalized neural adaptive beamformers. Computational components are being tested by experiment in the laboratory and also by use of recordings from sensor deployments in the ocean. Results will be presented. These computational components are being integrated into a comprehensive processing architecture that simultaneously attends memory according to stimuli, attends stimuli according to memory, and attends stimuli and memory according to an ongoing thought process. The proposed neural architecture is potentially very fast when implemented in special hardware.
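The idea of selectively attending one partial while cancelling another can be illustrated with a classic LMS adaptive canceller (a generic sketch, not the authors' generalized neural beamformer; the signal frequencies, tap count, and step size are assumed values):

```python
import numpy as np

n = 4000
t = np.arange(n)

target = np.sin(2 * np.pi * 0.05 * t)               # partial to attend to
interference = np.sin(2 * np.pi * 0.013 * t + 0.7)  # competing partial
primary = target + interference                     # mixture at sensor 1
reference = np.sin(2 * np.pi * 0.013 * t)           # sensor 2: interferer only

# LMS adaptive filter: learn to predict the interference component of the
# primary channel from the reference, then subtract the prediction.
taps, mu = 16, 0.01
w = np.zeros(taps)
out = np.zeros(n)
for i in range(taps, n):
    x = reference[i - taps:i]
    err = primary[i] - w @ x      # residual = estimate of the attended target
    w += 2 * mu * err * x         # stochastic-gradient weight update
    out[i] = err

# After adaptation, the residual tracks the target, not the mixture
tail = slice(n - 1000, n)
mse_before = np.mean((primary[tail] - target[tail]) ** 2)
mse_after = np.mean((out[tail] - target[tail]) ** 2)
```

The same gradient-descent structure underlies adaptive beamformers, where the weights span sensors rather than delay taps.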
Solving subsurface structural problems using a computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witte, D.M.
1987-02-01
Until recently, the solution of subsurface structural problems has required a combination of graphical construction, trigonometry, time, and patience. Recent advances in software available for both mainframe and microcomputers now reduce the time and potential error of these calculations by an order of magnitude. Software for analysis of deviated wells, three-point problems, apparent dip, apparent thickness, and the intersection of two planes, as well as the plotting and interpretation of these data, can be used to allow timely and accurate exploration or operational decisions. The available computer software provides a set of utilities, or tools, rather than a comprehensive, intelligent system. The burden for selection of appropriate techniques, computation methods, and interpretations still lies with the explorationist user.
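One of the listed utilities, the apparent-dip calculation, follows the standard relation tan(apparent dip) = tan(true dip) x sin(angle between the section line and strike); a small sketch (the function name and interface are hypothetical, the relation itself is standard structural geology):

```python
import math

def apparent_dip(true_dip_deg, angle_from_strike_deg):
    """Apparent dip (degrees) seen in a vertical section whose trend makes
    the given angle with the strike of the bed."""
    t = math.tan(math.radians(true_dip_deg))
    s = math.sin(math.radians(angle_from_strike_deg))
    return math.degrees(math.atan(t * s))

# A section perpendicular to strike (90 deg) shows the full true dip,
# while a section parallel to strike (0 deg) shows no dip at all.
```

Routines like this are trivial individually; the point of the article is that collecting them in software removes the slow graphical constructions.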
A New Mathematical Framework for Design Under Uncertainty
2016-05-05
blending multiple information sources via auto-regressive stochastic modeling. A computationally efficient machine learning framework is developed based on regression and machine learning approaches; see Fig. 1. This will lead to a comprehensive description of system performance with less uncertainty than in the... Bayesian optimization of super-cavitating hydrofoils: The goal of this study is to demonstrate the capabilities of statistical learning and
NASA Technical Reports Server (NTRS)
Tolivar, A. F.; Key, R. W.
1980-01-01
The attitude control performance of the solar electric propulsion system (SEPS) was evaluated. A thrust vector control system for powered flight control was examined along with a gas jet reaction control system and a reaction wheel system, both of which have been proposed for nonpowered flight control. Comprehensive computer simulations of each control system were made and evaluated using a 30-mode spacecraft model. Results obtained indicate that the thrust vector control and reaction wheel systems offer acceptable, smooth proportional control. The gas jet control system is shown to be risky for a flexible structure such as SEPS and is therefore not recommended as a primary control method.
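The "smooth proportional control" a reaction wheel offers can be seen in a toy single-axis PD simulation (the inertia, gains, timestep, and initial error are assumed values, not SEPS parameters, and the 30-mode flexible model is reduced here to a rigid body):

```python
# Toy single-axis slew of a rigid spacecraft under reaction-wheel PD control.
I = 50.0                  # inertia about the control axis, kg m^2 (assumed)
kp, kd = 2.0, 20.0        # PD gains chosen for critical damping
dt = 0.01                 # integration step, s
theta, omega = 0.2, 0.0   # initial attitude error (rad) and rate (rad/s)

for _ in range(20000):    # 200 s of simulated time, semi-implicit Euler
    torque = -kp * theta - kd * omega   # wheel reaction torque on the bus
    omega += (torque / I) * dt
    theta += omega * dt
# theta and omega decay smoothly to zero: continuous, non-impulsive control
```

A gas jet system, by contrast, applies discrete on/off torque pulses, which is what excites the flexible modes of a structure like SEPS.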
Roadmap for cardiovascular circulation model
Bradley, Christopher P.; Suresh, Vinod; Mithraratne, Kumar; Muller, Alexandre; Ho, Harvey; Ladd, David; Hellevik, Leif R.; Omholt, Stig W.; Chase, J. Geoffrey; Müller, Lucas O.; Watanabe, Sansuke M.; Blanco, Pablo J.; de Bono, Bernard; Hunter, Peter J.
2016-01-01
Computational models of many aspects of the mammalian cardiovascular circulation have been developed. Indeed, along with orthopaedics, this area of physiology is one that has attracted much interest from engineers, presumably because the equations governing blood flow in the vascular system are well understood and can be solved with well‐established numerical techniques. Unfortunately, there have been only a few attempts to create a comprehensive public domain resource for cardiovascular researchers. In this paper we propose a roadmap for developing an open source cardiovascular circulation model. The model should be registered to the musculo‐skeletal system. The computational infrastructure for the cardiovascular model should provide for near real‐time computation of blood flow and pressure in all parts of the body. The model should deal with vascular beds in all tissues, and the computational infrastructure for the model should provide links into CellML models of cell function and tissue function. In this work we review the literature associated with 1D blood flow modelling in the cardiovascular system, discuss model encoding standards, software and a model repository. We then describe the coordinate systems used to define the vascular geometry, derive the equations and discuss the implementation of these coupled equations in the open source computational software OpenCMISS. Finally, some preliminary results are presented and plans outlined for the next steps in the development of the model, the computational software and the graphical user interface for accessing the model. PMID:27506597
Roadmap for cardiovascular circulation model.
Safaei, Soroush; Bradley, Christopher P; Suresh, Vinod; Mithraratne, Kumar; Muller, Alexandre; Ho, Harvey; Ladd, David; Hellevik, Leif R; Omholt, Stig W; Chase, J Geoffrey; Müller, Lucas O; Watanabe, Sansuke M; Blanco, Pablo J; de Bono, Bernard; Hunter, Peter J
2016-12-01
Computational models of many aspects of the mammalian cardiovascular circulation have been developed. Indeed, along with orthopaedics, this area of physiology is one that has attracted much interest from engineers, presumably because the equations governing blood flow in the vascular system are well understood and can be solved with well-established numerical techniques. Unfortunately, there have been only a few attempts to create a comprehensive public domain resource for cardiovascular researchers. In this paper we propose a roadmap for developing an open source cardiovascular circulation model. The model should be registered to the musculo-skeletal system. The computational infrastructure for the cardiovascular model should provide for near real-time computation of blood flow and pressure in all parts of the body. The model should deal with vascular beds in all tissues, and the computational infrastructure for the model should provide links into CellML models of cell function and tissue function. In this work we review the literature associated with 1D blood flow modelling in the cardiovascular system, discuss model encoding standards, software and a model repository. We then describe the coordinate systems used to define the vascular geometry, derive the equations and discuss the implementation of these coupled equations in the open source computational software OpenCMISS. Finally, some preliminary results are presented and plans outlined for the next steps in the development of the model, the computational software and the graphical user interface for accessing the model. © 2016 The Authors. The Journal of Physiology © 2016 The Physiological Society.
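The 1D blood flow formulation the roadmap reviews is commonly written as a pair of conservation laws closed by a tube law; the form below is the standard textbook one (with velocity-profile correction factor α and friction coefficient K_R), not necessarily the paper's exact notation:

```latex
% Mass and momentum balance for 1-D flow in a compliant vessel:
%   A(x,t) = lumen cross-sectional area, Q(x,t) = volumetric flow rate,
%   p(x,t) = pressure, \rho = blood density.
\begin{aligned}
\frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} &= 0,\\[4pt]
\frac{\partial Q}{\partial t}
  + \frac{\partial}{\partial x}\!\left(\alpha\,\frac{Q^{2}}{A}\right)
  + \frac{A}{\rho}\,\frac{\partial p}{\partial x}
  &= -K_{R}\,\frac{Q}{A},
\end{aligned}
\qquad
p = p_{\mathrm{ext}} + \beta\left(\sqrt{A} - \sqrt{A_{0}}\right).
```

The tube law (rightmost relation) supplies the pressure-area coupling that makes the hyperbolic system well posed and cheap enough for the near real-time computation the roadmap calls for.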
Exploration on the matching between Optical Comprehensive Design Experiment and Washington Accord
NASA Astrophysics Data System (ADS)
Cao, Yiping; Chen, Wenjing; Zhang, Qican; Liu, Yuankun; Li, Dahai; Zhou, Xinzhi; Wei, Jun
2017-08-01
Common problems faced in the Optical Comprehensive Design Experiment course, ones that run counter to the Washington Accord, are pointed out. To resolve these problems, an instructional and innovative teaching scheme for the Optical Comprehensive Design Experiment is proposed. By tracking and studying students who had attended the Optical Comprehensive Design Experiment class, we sought to understand how the course could improve their hands-on practical ability, grasp of theoretical knowledge, complex problem solving ability, engineering application ability, and cooperative ability. We found several problems with the course: the experiment content is vague, students get too little hands-on time, theory is separated from engineering application, and the experiment content lacks selectivity. We therefore made improvements to the teaching plan for the Optical Comprehensive Design Experiment with reference to the Washington Accord. The class must connect with the engineering basic courses, the professional foundation courses, and the major courses, and even with students' future study and work, so that it provides inheritance and continuity for the students. The Optical Comprehensive Design Experiment teaching program requires students taking the course to have completed basic courses such as analog electronics, digital electronics, applied optics, and computing, which they are required to apply comprehensively. This teaching scheme contains six practical complex engineering problems: optical system design, light energy meter design, illuminometer design, material refractive index measuring system design, light intensity measuring system design, and an open design.
Establishing optional and open experiments provides students with greater choice and enhances their creativity; vivid experimental teaching and enriched experiment content make the experiments more interesting; giving students more opportunities and longer learning time to conduct experiments improves their practical ability; and working on actual engineering problems puts emphasis on students' understanding of complex engineering problems and of the process of solving them. Applying this scheme to other courses, with corresponding improvements, can help ensure the quality of engineering education. We look forward to offering a useful reference for curriculum system construction in colleges and universities.
Knowledge-based public health situation awareness
NASA Astrophysics Data System (ADS)
Mirhaji, Parsa; Zhang, Jiajie; Srinivasan, Arunkumar; Richesson, Rachel L.; Smith, Jack W.
2004-09-01
There have been numerous efforts to create comprehensive databases from multiple sources to monitor the dynamics of public health and, most specifically, to detect potential threats of bioterrorism before widespread dissemination. But there is not much evidence for the assertion that these systems are timely and dependable, or that they can reliably distinguish man-made from natural incidents. One must evaluate the value of so-called 'syndromic surveillance systems' along with the costs involved in the design, development, implementation and maintenance of such systems and the costs involved in investigation of the inevitable false alarms. In this article we will introduce a new perspective to the problem domain with a shift in paradigm from 'surveillance' toward 'awareness'. As we conceptualize a rather different approach to tackle the problem, we will introduce a different methodology in the application of information science, computer science, cognitive science and human-computer interaction concepts to the design and development of so-called 'public health situation awareness systems'. We will share some of our design and implementation concepts for the prototype system that is under development in the Center for Biosecurity and Public Health Informatics Research at the University of Texas Health Science Center at Houston. The system is based on a knowledgebase containing ontologies, with different layers of abstraction, from multiple domains, that provide the context for information integration, knowledge discovery, interactive data mining, information visualization, information sharing and communications. The modular design of the knowledgebase and its knowledge representation formalism enables incremental evolution of the system from a partial system to a comprehensive knowledgebase of 'public health situation awareness' as it acquires new knowledge through interactions with domain experts or automatic discovery of new knowledge.
Towards pervasive computing in health care – A literature review
Orwat, Carsten; Graefe, Andreas; Faulwasser, Timm
2008-01-01
Background The evolving concepts of pervasive computing, ubiquitous computing and ambient intelligence are increasingly influencing health care and medicine. Summarizing published research, this literature review provides an overview of recent developments and implementations of pervasive computing systems in health care. It also highlights some of the experiences reported in deployment processes. Methods There is no clear definition of pervasive computing in the current literature. Thus specific inclusion criteria for selecting articles about relevant systems were developed. Searches were conducted in four scientific databases alongside manual journal searches for the period of 2002 to 2006. Articles included present prototypes, case studies and pilot studies, clinical trials and systems that are already in routine use. Results The searches identified 69 articles describing 67 different systems. In a quantitative analysis, these systems were categorized into project status, health care settings, user groups, improvement aims, and systems features (i.e., component types, data gathering, data transmission, systems functions). The focus is on the types of systems implemented, their frequency of occurrence and their characteristics. Qualitative analyses were performed of deployment issues, such as organizational and personnel issues, privacy and security issues, and financial issues. This paper provides a comprehensive access to the literature of the emerging field by addressing specific topics of application settings, systems features, and deployment experiences. Conclusion Both an overview and an analysis of the literature on a broad and heterogeneous range of systems are provided. Most systems are described in their prototype stages. 
Deployment issues, such as implications on organization or personnel, privacy concerns, or financial issues are mentioned rarely, though their solution is regarded as decisive in transferring promising systems to a stage of regular operation. There is a need for further research on the deployment of pervasive computing systems, including clinical studies, economic and social analyses, user studies, etc. PMID:18565221
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toma, Milan; Jensen, Morten Ø.; Einstein, Daniel R.
2015-07-17
Numerical models of native heart valves are being used to study valve biomechanics to aid design and development of repair procedures and replacement devices. These models have evolved from simple two-dimensional approximations to complex three-dimensional, fully coupled fluid-structure interaction (FSI) systems. Such simulations are useful for predicting the mechanical and hemodynamic loading on implanted valve devices. A current challenge for improving the accuracy of these predictions is choosing and implementing modeling boundary conditions. In order to address this challenge, we are utilizing an advanced in-vitro system to validate FSI conditions for the mitral valve system. Explanted ovine mitral valves were mounted in an in vitro setup, and structural data for the mitral valve was acquired with μCT. Experimental data from the in-vitro ovine mitral valve system were used to validate the computational model. As the valve closes, the hemodynamic data, high-speed leaflet dynamics, and force vectors from the in-vitro system were compared to the results of the FSI simulation computational model. The total force of 2.6 N per papillary muscle is matched by the computational model. In vitro and in vivo force measurements are important in validating and adjusting material parameters in computational models. The simulations can then be used to answer questions that are otherwise not possible to investigate experimentally. This work is important to maximize the validity of computational models of not just the mitral valve, but any biomechanical aspect using computational simulation in designing medical devices.
Toma, Milan; Jensen, Morten Ø; Einstein, Daniel R; Yoganathan, Ajit P; Cochran, Richard P; Kunzelman, Karyn S
2016-04-01
Numerical models of native heart valves are being used to study valve biomechanics to aid design and development of repair procedures and replacement devices. These models have evolved from simple two-dimensional approximations to complex three-dimensional, fully coupled fluid-structure interaction (FSI) systems. Such simulations are useful for predicting the mechanical and hemodynamic loading on implanted valve devices. A current challenge for improving the accuracy of these predictions is choosing and implementing modeling boundary conditions. In order to address this challenge, we are utilizing an advanced in vitro system to validate FSI conditions for the mitral valve system. Explanted ovine mitral valves were mounted in an in vitro setup, and structural data for the mitral valve was acquired with μCT. Experimental data from the in vitro ovine mitral valve system were used to validate the computational model. As the valve closes, the hemodynamic data, high speed leaflet dynamics, and force vectors from the in vitro system were compared to the results of the FSI simulation computational model. The total force of 2.6 N per papillary muscle is matched by the computational model. In vitro and in vivo force measurements enable validating and adjusting material parameters to improve the accuracy of computational models. The simulations can then be used to answer questions that are otherwise not possible to investigate experimentally. This work is important to maximize the validity of computational models of not just the mitral valve, but any biomechanical aspect using computational simulation in designing medical devices.
Mobile cloud-computing-based healthcare service by noncontact ECG monitoring.
Fong, Ee-May; Chung, Wan-Young
2013-12-02
The noncontact electrocardiogram (ECG) measurement technique has gained popularity owing to its noninvasive features and convenience in daily use. This paper presents mobile cloud computing for a healthcare system in which a noncontact ECG measurement method is employed to capture biomedical signals from users. Healthcare service is provided to continuously collect biomedical signals from multiple locations. To observe and analyze the ECG signals in real time, a mobile device is used as a mobile monitoring terminal. In addition, a personalized healthcare assistant is installed on the mobile device; several healthcare features such as health status summaries, medication QR code scanning, and reminders are integrated into the mobile application. Health data are synchronized to the healthcare cloud computing service (Web server system and Web server dataset) to ensure seamless healthcare monitoring anytime and anywhere a network connection is available. Together with a Web page application, medical data are easily accessed by medical professionals or family members. Web page performance evaluation was conducted to ensure minimal Web server latency. The system demonstrates better availability of off-site and up-to-the-minute patient data, which can help detect health problems early and keep elderly patients out of the emergency room, thus providing a better and more comprehensive healthcare cloud computing service.
Mobile Cloud-Computing-Based Healthcare Service by Noncontact ECG Monitoring
Fong, Ee-May; Chung, Wan-Young
2013-01-01
The noncontact electrocardiogram (ECG) measurement technique has gained popularity owing to its noninvasive features and convenience in daily use. This paper presents mobile cloud computing for a healthcare system in which a noncontact ECG measurement method is employed to capture biomedical signals from users. Healthcare service is provided to continuously collect biomedical signals from multiple locations. To observe and analyze the ECG signals in real time, a mobile device is used as a mobile monitoring terminal. In addition, a personalized healthcare assistant is installed on the mobile device; several healthcare features such as health status summaries, medication QR code scanning, and reminders are integrated into the mobile application. Health data are synchronized to the healthcare cloud computing service (Web server system and Web server dataset) to ensure seamless healthcare monitoring anytime and anywhere a network connection is available. Together with a Web page application, medical data are easily accessed by medical professionals or family members. Web page performance evaluation was conducted to ensure minimal Web server latency. The system demonstrates better availability of off-site and up-to-the-minute patient data, which can help detect health problems early and keep elderly patients out of the emergency room, thus providing a better and more comprehensive healthcare cloud computing service. PMID:24316562
Visibiome: an efficient microbiome search engine based on a scalable, distributed architecture.
Azman, Syafiq Kamarul; Anwar, Muhammad Zohaib; Henschel, Andreas
2017-07-24
Given the current influx of 16S rRNA profiles of microbiota samples, it is conceivable that large amounts of them will eventually be available for search, comparison and contextualization with respect to novel samples. This process facilitates the identification of similar compositional features in microbiota elsewhere and therefore can help to understand driving factors for microbial community assembly. We present Visibiome, a microbiome search engine that can perform exhaustive, phylogeny-based similarity search and contextualization of user-provided samples against a comprehensive dataset of 16S rRNA profiles from diverse environments, while tackling several computational challenges. In order to scale to high demands, we developed a distributed system that combines web framework technology, task queueing and scheduling, cloud computing and a dedicated database server. To further ensure speed and efficiency, we have deployed Nearest Neighbor search algorithms, capable of sublinear searches in high-dimensional metric spaces, in combination with an optimized Earth Mover Distance based implementation of weighted UniFrac. The search also incorporates pairwise (adaptive) rarefaction and, optionally, 16S rRNA copy number correction. The result of a query microbiome sample is the contextualization against a comprehensive database of microbiome samples from a diverse range of environments, visualized through a rich set of interactive figures and diagrams, including barchart-based compositional comparisons and ranking of the closest matches in the database. Visibiome is a convenient, scalable and efficient framework to search microbiomes against a comprehensive database of environmental samples. The search engine leverages a popular but computationally expensive, phylogeny-based distance metric, while providing numerous advantages over the current state of the art tool.
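The weighted UniFrac distance that Visibiome accelerates has a compact formulation: each branch of the phylogeny contributes its length times the absolute difference in the fraction of each sample's reads found below it (this is what makes it expressible as an Earth Mover Distance on the tree). A minimal sketch, assuming a hypothetical four-leaf tree and made-up abundance counts; the function name and data are illustrative, not Visibiome's API:

```python
# Weighted (non-normalized) UniFrac: sum over branches of
# branch_length * |fraction_of_A_below - fraction_of_B_below|.
# The tiny 4-leaf tree and counts below are hypothetical.

def weighted_unifrac(branches, sample_a, sample_b):
    """branches: list of (branch_length, set_of_leaves_below).
    sample_a/b: dicts mapping leaf name -> raw read count."""
    total_a = sum(sample_a.values())
    total_b = sum(sample_b.values())
    dist = 0.0
    for length, leaves in branches:
        frac_a = sum(sample_a.get(l, 0) for l in leaves) / total_a
        frac_b = sum(sample_b.get(l, 0) for l in leaves) / total_b
        dist += length * abs(frac_a - frac_b)
    return dist

# Hypothetical tree ((A,B),(C,D)): tip branches plus two clade branches
branches = [
    (1.0, {"A"}), (1.0, {"B"}), (1.0, {"C"}), (1.0, {"D"}),
    (0.5, {"A", "B"}), (0.5, {"C", "D"}),
]
s1 = {"A": 10, "B": 10}   # community confined to the A/B clade
s2 = {"C": 10, "D": 10}   # community confined to the C/D clade
print(weighted_unifrac(branches, s1, s1))  # identical samples -> 0.0
print(weighted_unifrac(branches, s1, s2))
```

Because this is a metric, it can back the sublinear Nearest Neighbor indexing the abstract mentions.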
A Position on a Computer Literacy Course.
ERIC Educational Resources Information Center
Self, Charles C.
A position is put forth on the appropriate content of a computer literacy course and the role of computer literacy in the community college. First, various definitions of computer literacy are examined, including the programming, computer awareness, and comprehensive approaches. Next, five essential components of a computer literacy course are…
Scientific Visualization, Seeing the Unseeable
LBNL
2017-12-09
June 24, 2008 Berkeley Lab lecture: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.
ERIC Educational Resources Information Center
Kerr, Matthew A.; Symons, Sonya E.
2006-01-01
This study examined whether children's reading rate, comprehension, and recall are affected by computer presentation of text. Participants were 60 grade five students, who each read two expository texts, one in a traditional print format and the other from a computer monitor, which used a common scrolling text interface. After reading each text,…
Evaluative methodology for comprehensive water quality management planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, H. L.
Computer-based evaluative methodologies have been developed to provide for the analysis of coupled phenomena associated with natural resource comprehensive planning requirements. Provisions for planner/computer interaction have been included. Each of the simulation models developed is described in terms of its coded procedures. An application of the models for water quality management planning is presented; and the data requirements for each of the models are noted.
ERIC Educational Resources Information Center
Swerdloff, Matthew
2013-01-01
The purpose of this study was to investigate the specific effects of targeted English Language Arts (ELA) instruction using multimedia applications. Student reading comprehension, student attitude toward computers, and student attitude toward school were measured in this study. The study also examined the perceptions, of selected students, of the…
ERIC Educational Resources Information Center
Twyman, Todd; Tindal, Gerald
2006-01-01
The purpose of this study was to improve the comprehension and problem-solving skills of students with disabilities in social studies using a conceptually framed, computer-adapted history text. Participants were 11th and 12th grade students identified with learning disabilities in reading and writing from two intact, self-contained social studies…
QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.
Jung, Sang-Kyu; Aleman-Meza, Boanerges; Riepe, Celeste; Zhong, Weiwei
2014-01-01
Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
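The WormLifespan approach described above, counting moving worms from two time-lapse images, can be sketched as frame differencing followed by connected-component counting. This is a hedged illustration with synthetic 5×5 frames and an invented threshold, not the QuantWorm implementation itself:

```python
# Count "moving worms" as connected blobs in the pixel-wise difference
# between two time-lapse frames. Frames are plain lists of grayscale
# values; the threshold value is made up for illustration.

def moving_blob_count(frame1, frame2, threshold=10):
    h, w = len(frame1), len(frame1[0])
    mask = [[abs(frame1[y][x] - frame2[y][x]) > threshold for x in range(w)]
            for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                count += 1                      # found a new blob: flood-fill it
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and mask[cy][cx] and not seen[cy][cx]:
                        seen[cy][cx] = True
                        stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return count

# Two 5x5 synthetic frames: two separated patches change between frames
f1 = [[0] * 5 for _ in range(5)]
f2 = [[0] * 5 for _ in range(5)]
f2[0][0] = f2[0][1] = 100   # blob 1
f2[3][3] = f2[4][3] = 100   # blob 2
print(moving_blob_count(f1, f2))  # -> 2
```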
A Haptic-Enhanced System for Molecular Sensing
NASA Astrophysics Data System (ADS)
Comai, Sara; Mazza, Davide
The science of haptics has received enormous attention in the last decade. One of the major application trends of haptics technology is data visualization and training. In this paper, we present a haptically-enhanced system for manipulation and tactile exploration of molecules. The geometrical models of molecules are extracted from either theoretical or empirical data using file formats widely adopted in the chemical and biological fields. The addition of information computed with computational chemistry tools allows users to feel the interaction forces between an explored molecule and a charge associated with the haptic device, and to visualize a huge amount of numerical data in a more comprehensible way. The developed tool can be used for either teaching or research purposes due to its reliance on both theoretical and experimental data.
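The rendered interaction force between the explored molecule and the probe charge can be illustrated with a plain Coulomb sum over the molecule's partial charges. Positions, charges, and the function name below are hypothetical illustration values, not the paper's force model, which may include additional terms:

```python
# Net Coulomb force on a probe charge (the haptic cursor) from a set of
# atomic partial charges. All positions/charges below are invented.

K = 8.9875e9  # Coulomb constant, N*m^2/C^2

def coulomb_force(probe_pos, probe_q, atoms):
    """atoms: list of ((x, y, z), partial_charge). Returns force on probe."""
    fx = fy = fz = 0.0
    for (x, y, z), q in atoms:
        dx, dy, dz = probe_pos[0] - x, probe_pos[1] - y, probe_pos[2] - z
        r2 = dx * dx + dy * dy + dz * dz
        r = r2 ** 0.5
        f = K * probe_q * q / r2        # magnitude; sign encodes attract/repel
        fx += f * dx / r
        fy += f * dy / r
        fz += f * dz / r
    return (fx, fy, fz)

# Probe with charge +e near a +/- pair of partial charges 2 angstroms apart
e = 1.602e-19
atoms = [((0.0, 0.0, 0.0), +0.4 * e), ((2e-10, 0.0, 0.0), -0.4 * e)]
fx, fy, fz = coulomb_force((1e-10, 1e-10, 0.0), e, atoms)
print(fx, fy, fz)
```

In a haptic loop this vector would be scaled into the device's force range and updated at the servo rate.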
Mass spectrometry-based proteomics: basic principles and emerging technologies and directions.
Van Riper, Susan K; de Jong, Ebbing P; Carlis, John V; Griffin, Timothy J
2013-01-01
As the main catalytic and structural molecules within living systems, proteins are the most likely biomolecules to be affected by radiation exposure. Proteomics, the comprehensive characterization of proteins within complex biological samples, is therefore a research approach ideally suited to assess the effects of radiation exposure on cells and tissues. For comprehensive characterization of proteomes, an analytical platform capable of quantifying protein abundance, identifying post-translational modifications and revealing members of protein complexes on a system-wide level is necessary. Mass spectrometry (MS), coupled with technologies for sample fractionation and automated data analysis, provides such a versatile and powerful platform. In this chapter we offer a view on the current state of MS-proteomics, and focus on emerging technologies within three areas: (1) new instrumental methods; (2) new computational methods for peptide identification; and (3) label-free quantification. These emerging technologies should be valuable for researchers seeking to better understand biological effects of radiation on living systems.
A systems concept of the vestibular organs
NASA Technical Reports Server (NTRS)
Mayne, R.
1974-01-01
A comprehensive model of vestibular organ function is presented. The model is based on an analogy with the inertial guidance systems used in navigation. Three distinct operations are investigated: angular motion sensing, linear motion sensing, and computation. These operations correspond to the semicircular canals, the otoliths, and central processing respectively. It is especially important for both an inertial guidance system and the vestibular organs to distinguish between attitude with respect to the vertical on the one hand, and linear velocity and displacement on the other. The model is applied to various experimental situations and found to be corroborated by them.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, J; Yan, Y; Hager, F
Purpose: Radiation therapy has evolved to become not only more precise and potent, but also more complicated to monitor and deliver. More rigorous and comprehensive quality assurance is needed to safeguard ever advancing radiation therapy. ICRU standards dictate that an ever growing set of treatment parameters are manually checked weekly by medical physicists. This “weekly chart check” procedure is laborious and subject to human errors or other factors. A computer-assisted chart checking process will enable more complete and accurate human review of critical parameters, reduce the risk of medical errors, and improve the efficiency. Methods: We developed a web-based software system that enables thorough weekly quality assurance checks. In the backend, the software retrieves all machine parameters from a Treatment Management System (TMS) and compares them against the corresponding ones from the treatment planning system. They are also checked for validity against preset rules. The results are displayed as a web page in the front-end for physicists to review. Then a summary report is generated and uploaded automatically to the TMS as a record for weekly chart checking. Results: The software system has been deployed on a web server in our department’s intranet, and has been tested thoroughly by our clinical physicists. A plan parameter is highlighted when it is off the preset limit. The developed system has changed the way of checking charts, with significantly improved accuracy, efficiency, and completeness. It has been shown to be robust, fast, and easy to use. Conclusion: A computer-assisted system has been developed for efficient, accurate, and comprehensive weekly chart checking. The system has been extensively validated and is being implemented for routine clinical use.
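The backend comparison the abstract describes, checking delivered machine parameters against planned values within preset limits and flagging outliers for physicist review, can be sketched as follows. Parameter names and tolerance values are invented for illustration; the real system pulls these from the TMS and planning system:

```python
# Compare delivered vs. planned parameters against preset tolerances and
# flag any out-of-limit entry for review. Names/tolerances are made up.

TOLERANCES = {"mu": 1.0, "gantry_deg": 0.5, "collimator_deg": 0.5}

def check_chart(planned, delivered):
    flags = []
    for name, tol in TOLERANCES.items():
        diff = abs(planned[name] - delivered[name])
        if diff > tol:
            flags.append((name, planned[name], delivered[name], diff))
    return flags  # an empty list means the weekly check passes

planned   = {"mu": 200.0, "gantry_deg": 180.0, "collimator_deg": 10.0}
delivered = {"mu": 200.4, "gantry_deg": 182.0, "collimator_deg": 10.1}
print(check_chart(planned, delivered))  # gantry off by 2.0 deg -> flagged
```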
System analysis in rotorcraft design: The past decade
NASA Technical Reports Server (NTRS)
Galloway, Thomas L.
1988-01-01
Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so that rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness to draw conclusions. In rotorcraft design this means combining design requirements, technology assessment, sensitivity analysis, and review techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analysis on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.
Bassett, Danielle S; Sporns, Olaf
2017-01-01
Despite substantial recent progress, our understanding of the principles and mechanisms underlying complex brain function and cognition remains incomplete. Network neuroscience proposes to tackle these enduring challenges. Approaching brain structure and function from an explicitly integrative perspective, network neuroscience pursues new ways to map, record, analyze and model the elements and interactions of neurobiological systems. Two parallel trends drive the approach: the availability of new empirical tools to create comprehensive maps and record dynamic patterns among molecules, neurons, brain areas and social systems; and the theoretical framework and computational tools of modern network science. The convergence of empirical and computational advances opens new frontiers of scientific inquiry, including network dynamics, manipulation and control of brain networks, and integration of network processes across spatiotemporal domains. We review emerging trends in network neuroscience and attempt to chart a path toward a better understanding of the brain as a multiscale networked system. PMID:28230844
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, E.G.; Mioduszewski, R.J.
The Chemical Computer Man: Chemical Agent Response Simulation (CARS) is a computer model and simulation program for estimating the dynamic changes in human physiological dysfunction resulting from exposures to chemical-threat nerve agents. The newly developed CARS methodology simulates agent exposure effects on the following five indices of human physiological function: mental, vision, cardio-respiratory, visceral, and limbs. Mathematical models and the application of basic pharmacokinetic principles were incorporated into the simulation so that for each chemical exposure, the relationship between exposure dosage, absorbed dosage (agent blood plasma concentration), and level of physiological response is computed as a function of time. CARS, as a simulation tool, is designed for users with little or no computer-related experience. The model combines maximum flexibility with a comprehensive, user-friendly, interactive, menu-driven system. Users define an exposure problem and obtain immediate results displayed in tabular, graphical, and image formats. CARS has broad scientific and engineering applications, not only in technology for the soldier in the area of Chemical Defense, but also in minimizing animal testing in biomedical and toxicological research and the development of a modeling system for human exposure to hazardous-waste chemicals.
NASA Technical Reports Server (NTRS)
Potter, R. C.; Vandam, C. P.
1995-01-01
High-lift system aerodynamics has been gaining attention in recent years. In an effort to improve aircraft performance, comprehensive studies of multi-element airfoil systems are being undertaken in wind-tunnel and flight experiments. Recent developments in Computational Fluid Dynamics (CFD) offer a relatively inexpensive alternative for studying complex viscous flows by numerically solving the Navier-Stokes (N-S) equations. Current limitations in computer resources restrict practical high-lift N-S computations to two dimensions, but CFD predictions can yield tremendous insight into flow structure, interactions between airfoil elements, and effects of changes in airfoil geometry or free-stream conditions. These codes are very accurate when compared to strictly 2D data provided by wind-tunnel testing, as will be shown here. Yet, additional challenges must be faced in the analysis of a production aircraft wing section, such as that of the NASA Langley Transport Systems Research Vehicle (TSRV). A primary issue is the sweep theory used to correlate 2D predictions with 3D flight results, accounting for sweep, taper, and finite wing effects. Other computational issues addressed here include the effects of surface roughness of the geometry, cove shape modeling, grid topology, and transition specification. The sensitivity of the flow to changing free-stream conditions is investigated. In addition, the effects of Gurney flaps on the aerodynamic characteristics of the airfoil system are predicted.
Assessing and Enhancing Environmental Sustainability: A Conceptual Review.
Little, John C; Hester, Erich T; Carey, Cayelan C
2016-07-05
While sustainability is an essential concept to ensure the future of humanity and the integrity of the resources and ecosystems on which we depend, identifying a comprehensive yet realistic way to assess and enhance sustainability may be one of the most difficult challenges of our time. We review the primary environmental sustainability assessment approaches, categorizing them as either being design-based or those that employ computational frameworks and/or indicators. We also briefly review approaches used for assessing economic and social sustainability because sustainability necessitates integrating environmental, economic, and social elements. We identify the collective limitations of the existing assessment approaches, showing that there is not a consistent definition of sustainability, that the approaches are generally not comprehensive and are subject to unintended consequences, that there is little to no connection between bottom-up and top-down approaches, and that the field of sustainability is largely fragmented, with a range of academic disciplines and professional organizations pursuing similar goals, but without much formal coordination. We conclude by emphasizing the need for a comprehensive definition of sustainability (that integrates environmental, economic, and social aspects) with a unified system-of-systems approach that is causal, modular, tiered, and scalable, as well as new educational and organizational structures to improve systems-level interdisciplinary integration.
Toward a comprehensive areal model of earthquake-induced landslides
Miles, S.B.; Keefer, D.K.
2009-01-01
This paper provides a review of regional-scale modeling of earthquake-induced landslide hazard with respect to the needs for disaster risk reduction and sustainable development. Based on this review, it sets out important research themes and suggests computing with words (CW), a methodology that includes fuzzy logic systems, as a fruitful modeling methodology for addressing many of these research themes. A range of research, reviewed here, has been conducted applying CW to various aspects of earthquake-induced landslide hazard zonation, but none facilitate comprehensive modeling of all types of earthquake-induced landslides. A new comprehensive areal model of earthquake-induced landslides (CAMEL) is introduced here that was developed using fuzzy logic systems. CAMEL provides an integrated framework for modeling all types of earthquake-induced landslides using geographic information systems. CAMEL is designed to facilitate quantitative and qualitative representation of terrain conditions and knowledge about these conditions on the likely areal concentration of each landslide type. CAMEL is highly modifiable and adaptable; new knowledge can be easily added, while existing knowledge can be changed to better match local knowledge and conditions. As such, CAMEL should not be viewed as a complete alternative to other earthquake-induced landslide models. CAMEL provides an open framework for incorporating other models, such as Newmark's displacement method, together with previously incompatible empirical and local knowledge. © 2009 ASCE.
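The computing-with-words approach behind CAMEL can be illustrated with a toy Mamdani-style fuzzy rule evaluation: terrain attributes are fuzzified with triangular membership functions, rule antecedents are combined with min (AND), and rule outputs are aggregated with max. The memberships and rules below are invented for illustration, not CAMEL's actual knowledge base:

```python
# Toy fuzzy-logic estimate of areal landslide concentration from two
# terrain attributes. Membership shapes and rule weights are made up.

def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def landslide_concentration(slope_deg, shaking_g):
    steep  = tri(slope_deg, 20, 45, 70)
    gentle = tri(slope_deg, -10, 0, 25)
    strong = tri(shaking_g, 0.2, 0.6, 1.0)
    # Rule 1: steep slope AND strong shaking -> high concentration (weight 1.0)
    # Rule 2: gentle slope                   -> low concentration  (weight 0.1)
    high = min(steep, strong) * 1.0
    low  = gentle * 0.1
    return max(high, low)   # aggregate rule outputs with max

print(landslide_concentration(45, 0.6))  # both memberships at peak -> 1.0
print(landslide_concentration(5, 0.6))   # gentle terrain -> small value
```

In a CW-style model, linguistic knowledge ("steep", "strong") like this can be swapped or re-tuned to local conditions without restructuring the framework.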
A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.
Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui
2017-01-08
Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered to be a comprehensive data-intensive and computing-intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy improves performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing intensive and data intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration.
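The accumulation pattern the MapReduce formulation targets reduces, in essence, to mapping each scene target to a partial echo signal and reducing by element-wise summation. A minimal single-machine sketch with a stand-in echo model (the real simulation evaluates the SAR signal model per pulse and distributes the map and reduce phases over the cluster):

```python
# Map: each target -> its partial raw-data contribution.
# Reduce: element-wise accumulation of all contributions.
# The "echo" function is a placeholder, not a real SAR signal model.
from functools import reduce

N_SAMPLES = 8

def echo(target):
    amp, delay = target
    return [amp if i >= delay else 0.0 for i in range(N_SAMPLES)]

def accumulate(sig_a, sig_b):
    return [a + b for a, b in zip(sig_a, sig_b)]

targets = [(1.0, 2), (0.5, 4), (0.25, 6)]   # made-up (amplitude, delay) pairs
raw = reduce(accumulate, map(echo, targets))
print(raw)
```

Because the accumulation is associative and commutative, the reduce step can run in any order across nodes, which is what makes the pattern a natural MapReduce fit.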
Nonuniform Deployment of Autonomous Agents in Harbor-Like Environments
2014-11-12
Reconstruction of Sea State One
1988-02-01
In this section only a general overview of the wave computer system will be offered; a more comprehensive treatment of this subject is available in the Appendix. The system comprises: (1) a Sync Strip and Threshold Processing Card, (2) a Pulse Generation Logic Card, (3) an X Vector Logic Card, (4) a Y Vector Logic Card, and (5) a Blanking Interval card. A pulse is output by the comparator when the threshold is crossed, which shall be referred to as a threshold crossing (THC).
Callahan, Damien M.; Umberger, Brian R.; Kent-Braun, Jane A.
2013-01-01
The pathway of voluntary joint torque production includes motor neuron recruitment and rate-coding, sarcolemmal depolarization and calcium release by the sarcoplasmic reticulum, force generation by motor proteins within skeletal muscle, and force transmission by tendon across the joint. The direct source of energetic support for this process is ATP hydrolysis. It is possible to examine portions of this physiologic pathway using various in vivo and in vitro techniques, but an integrated view of the multiple processes that ultimately impact joint torque remains elusive. To address this gap, we present a comprehensive computational model of the combined neuromuscular and musculoskeletal systems that includes novel components related to intracellular bioenergetics function. Components representing excitatory drive, muscle activation, force generation, metabolic perturbations, and torque production during voluntary human ankle dorsiflexion were constructed, using a combination of experimentally-derived data and literature values. Simulation results were validated by comparison with torque and metabolic data obtained in vivo. The model successfully predicted peak and submaximal voluntary and electrically-elicited torque output, and accurately simulated the metabolic perturbations associated with voluntary contractions. This novel, comprehensive model could be used to better understand impact of global effectors such as age and disease on various components of the neuromuscular system, and ultimately, voluntary torque output. PMID:23405245
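One link in the modeled pathway, neural excitation driving muscle activation, is commonly represented by first-order dynamics with a faster rise than decay time constant. A hedged sketch with typical literature time constants, not the parameters fitted in this study:

```python
# First-order muscle activation dynamics, integrated with forward Euler.
# tau_act/tau_deact are typical literature values, not this paper's fits.

def simulate_activation(excitation, dt=0.001, tau_act=0.015, tau_deact=0.05):
    a = 0.0
    trace = []
    for u in excitation:
        tau = tau_act if u > a else tau_deact   # activation rises faster than it decays
        a += dt * (u - a) / tau
        trace.append(a)
    return trace

# 200 ms of full excitation followed by 200 ms of rest
u = [1.0] * 200 + [0.0] * 200
trace = simulate_activation(u)
print(round(trace[199], 3))   # near full activation after 200 ms
print(round(trace[-1], 3))    # partially decayed 200 ms later
```

In a full model like the one described, this activation signal would then drive the force-generation and torque-production components downstream.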
Technology for the product and process data base
NASA Technical Reports Server (NTRS)
Barnes, R. D.
1984-01-01
The computerized product and process data base is increasingly recognized to be the cornerstone component of an overall system aimed at the integrated automation of the industrial processes of a given company or enterprise. The technology needed to support these more effective computer integrated design and manufacturing methods, especially the concept of 3-D computer-sensible product definitions rather than engineering drawings, is not fully available and rationalized. Progress is being made, however, in bridging this technology gap with concentration on the modeling of sophisticated information and data structures, high-performance interactive user interfaces and comprehensive tools for managing the resulting computerized product definition and process data base.
A Comprehensive Computer Package for Ambulatory Surgical Facilities
Kessler, Robert R.
1980-01-01
Ambulatory surgical centers are a cost effective alternative to hospital surgery. Their increasing popularity has contributed to heavy case loads, an accumulation of vast amounts of medical and financial data and economic pressures to maintain a tight control over “cash flow”. Computerization is now a necessity to aid ambulatory surgical centers to maintain their competitive edge. An on-line system is especially necessary as it allows interactive scheduling of surgical cases, immediate access to financial data and rapid gathering of medical and statistical information. This paper describes the significant features of the computer package in use at the Salt Lake Surgical Center, which processes 500 cases per month.
Research on a Frame-Based Model of Reading Comprehension. Final Report.
ERIC Educational Resources Information Center
Goldstein, Ira
This report summarizes computational investigations of language comprehension based on Marvin Minsky's theory of frames, a recent advance in artificial intelligence theories about the representation of knowledge. The investigations discussed explored frame theory as a basis for text comprehension by implementing models of the theory and developing…
NASA Technical Reports Server (NTRS)
Reardon, John E.; Violett, Duane L., Jr.
1991-01-01
The AFAS Database System was developed to provide the basic structure of a comprehensive database system for the Marshall Space Flight Center (MSFC) Structures and Dynamics Laboratory Aerophysics Division. The system is intended to handle all of the Aerophysics Division Test Facilities as well as data from other sources. The system was written for the DEC VAX family of computers in FORTRAN-77 and utilizes the VMS indexed file system and screen management routines. Various aspects of the system are covered, including a description of the user interface, lists of all code structure elements, descriptions of the file structures, a description of the security system operation, a detailed description of the data retrieval tasks, a description of the session log, and a description of the archival system.
Polyanskiy, Mikhail N.
2015-01-01
We describe a computer code for simulating the amplification of ultrashort mid-infrared laser pulses in CO2 amplifiers and their propagation through arbitrary optical systems. This code is based on a comprehensive model that includes an accurate consideration of the CO2 active medium and a physical optics propagation algorithm, and takes into account the interaction of the laser pulse with the material of the optical elements. Finally, the application of the code for optimizing an isotopic regenerative amplifier is described.
Validation of heart and lung teleauscultation on an Internet-based system.
Fragasso, Gabriele; De Benedictis, Marialuisa; Palloshi, Altin; Moltrasio, Marco; Cappelletti, Alberto; Carlino, Mauro; Marchisi, Angelo; Pala, Mariagrazia; Alfieri, Ottavio; Margonato, Alberto
2003-11-01
The feasibility and accuracy of an Internet-based system for teleauscultation were evaluated in 103 cardiac patients, who were auscultated by the same cardiologist with a conventional stethoscope and with an Internet-based method, using an electronic stethoscope and transmitting heart and lung sounds between computer workstations. In 92% of patients, the results of electronic and acoustic auscultation coincided, indicating that teleauscultation may be considered a reliable method for assessing cardiac patients and could, therefore, be adopted in the context of comprehensive telecare programs.
A comprehensive overview of the applications of artificial life.
Kim, Kyung-Joong; Cho, Sung-Bae
2006-01-01
We review the applications of artificial life (ALife), the creation of synthetic life on computers to study, simulate, and understand living systems. The definition and features of ALife are shown by application studies. ALife application fields treated include robot control, robot manufacturing, practical robots, computer graphics, natural phenomenon modeling, entertainment, games, music, economics, Internet, information processing, industrial design, simulation software, electronics, security, data mining, and telecommunications. In order to show the status of ALife application research, this review primarily features a survey of about 180 ALife application articles rather than a selected representation of a few articles. Evolutionary computation is the most popular method for designing such applications, but recently swarm intelligence, artificial immune networks, and agent-based modeling have also produced results. Applications were initially restricted to robotics and computer graphics, but presently, many different applications in engineering areas are of interest.
Direct coal liquefaction baseline design and system analysis. Quarterly report, January–March 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-04-01
The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.
NASA Technical Reports Server (NTRS)
Orzechowski, J. A.
1982-01-01
The CMC fluid mechanics program system was developed to translate the theoretical evolution of finite element numerical solution methodology, applied to nonlinear field problems, into a versatile computer code for comprehensive flow field analysis. A detailed view of the code from the standpoint of a computer programmer's use is presented. A system macroflow chart and detailed flow charts of several routines necessary to interact with a theoretician/user to modify the operation of this program are presented. All subroutines and details of usage, primarily for input and output routines, are described. Integer and real scalars and a cross reference list denoting subroutine usage for these scalars are outlined. Entry points in the dynamic storage vector IZ and the lengths of each vector accompanying the scalar definitions are described. A listing of the routines peculiar to the standard test case and a listing of the input deck and printout for this case are included.
Amsel, Ben D
2011-04-01
Empirically derived semantic feature norms categorized into different types of knowledge (e.g., visual, functional, auditory) can be summed to create number-of-feature counts per knowledge type. Initial evidence suggests several such knowledge types may be recruited during language comprehension. The present study provides a more detailed understanding of the time course and intensity of influence of several such knowledge types on real-time neural activity. A linear mixed-effects model was applied to single trial event-related potentials for 207 visually presented concrete words measured on total number of features (semantic richness), imageability, and number of visual motion, color, visual form, smell, taste, sound, and function features. Significant influences of multiple feature types occurred before 200 ms, suggesting parallel neural computation of word form and conceptual knowledge during language comprehension. Function and visual motion features most prominently influenced neural activity, underscoring the importance of action-related knowledge in computing word meaning. The dynamic time courses and topographies of these effects are most consistent with a flexible conceptual system wherein temporally dynamic recruitment of representations in modal and supramodal cortex are a crucial element of the constellation of processes constituting word meaning computation in the brain. Copyright © 2011 Elsevier Ltd. All rights reserved.
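The per-feature regression referenced in the abstract above can be illustrated in simplified form. The sketch below fits ordinary least squares to synthetic single-trial amplitudes; the study itself used a linear mixed-effects model with by-subject random effects, and all predictor names and numbers here are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 500
# Synthetic per-trial predictors standing in for feature-norm counts.
richness = rng.normal(10.0, 3.0, n_trials)           # total number of features
motion = rng.poisson(2.0, n_trials).astype(float)    # visual-motion features
function = rng.poisson(3.0, n_trials).astype(float)  # function features
# Simulated single-trial ERP amplitude: motion and function carry the effects.
amplitude = (0.3 * motion + 0.5 * function + 0.05 * richness
             + rng.normal(0.0, 1.0, n_trials))

# Design matrix with an intercept column; ordinary least squares fit.
X = np.column_stack([np.ones(n_trials), richness, motion, function])
coef, *_ = np.linalg.lstsq(X, amplitude, rcond=None)
print(dict(zip(["intercept", "richness", "motion", "function"], coef.round(2))))
```

With 500 trials the recovered coefficients land close to the simulated effects, which is the basic logic of estimating each knowledge type's influence on amplitude.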
Tanimoto, Steven; Thompson, Rob; Berninger, Virginia W.; Nagy, William; Abbott, Robert D.
2015-01-01
Computer scientists and educational researchers evaluated the effectiveness of computerized instruction tailored to evidence-based impairments in specific learning disabilities (SLDs) in students in grades 4 to 9 with persisting SLDs despite prior extra help. Following comprehensive, evidence-based differential diagnosis for dysgraphia (impaired handwriting), dyslexia (impaired word reading and spelling), and oral and written language learning disability (OWL LD), students completed 18 sessions of computerized instruction over about 3 months. The 11 students taught letter formation with sequential, numbered, colored arrow cues with full contours who wrote letters on lines added to the iPad screen showed more and stronger treatment effects than the 21 students taught using only visual motion cues for letter formation who wrote on an unlined computer monitor. Teaching to all levels of language in multiple functional language systems (by ear, eye, mouth, and hand) close in time resulted in significant gains in reading and writing skills for the group and in diagnosed SLD hallmark impairments for individuals; also, performance on computerized learning activities correlated with treatment gains. Results are discussed in reference to the need for both accommodations and explicit instruction for persisting SLDs and the potential for computers to teach handwriting, morphophonemic orthographies, comprehension, and composition. PMID:26858470
A Model-based Health Monitoring and Diagnostic System for the UH-60 Helicopter. Appendix D
NASA Technical Reports Server (NTRS)
Patterson-Hine, Ann; Hindson, William; Sanderfer, Dwight; Deb, Somnath; Domagala, Chuck
2001-01-01
Model-based reasoning techniques hold much promise in providing comprehensive monitoring and diagnostics capabilities for complex systems. We are exploring the use of one of these techniques, which utilizes multi-signal modeling and the TEAMS-RT real-time diagnostic engine, on the UH-60 Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) flight research aircraft. We focus on the engine and transmission systems, and acquire sensor data across the 1553 bus as well as by direct analog-to-digital conversion from sensors to the QHuMS (Qualtech health and usage monitoring system) computer. The QHuMS computer uses commercially available components and is rack-mounted in the RASCAL facility. A multi-signal model of the transmission and engine subsystems enables studies of system testability and analysis of the degree of fault isolation available with various instrumentation suites. The model and examples of these analyses will be described and the data architectures enumerated. Flight tests of this system will validate the data architecture and provide real-time flight profiles to be further analyzed in the laboratory.
Pallante-Kichura, Andrea L.; Bae, Won C.; Du, Jiang; Statum, Sheronda; Wolfson, Tanya; Gamst, Anthony C.; Cory, Esther; Amiel, David; Bugbee, William D.; Sah, Robert L.; Chung, Christine B.
2014-01-01
Objective: To describe and apply a semiquantitative MRI scoring system for multifeature analysis of cartilage defect repair in the knee by osteochondral allografts and to correlate this scoring system with histopathologic, micro–computed tomography (µCT), and biomechanical reference standards using a goat repair model. Design: Fourteen adult goats had 2 osteochondral allografts implanted into each knee: one in the medial femoral condyle and one in the lateral trochlea. At 12 months, goats were euthanized and MRI was performed. Two blinded radiologists independently rated 9 primary features for each graft, including cartilage signal, fill, edge integration, surface congruity, calcified cartilage integrity, subchondral bone plate congruity, subchondral bone marrow signal, osseous integration, and presence of cystic changes. Four ancillary features of the joint were also evaluated, including opposing cartilage, meniscal tears, synovitis, and fat-pad scarring. Comparison was made with histologic and µCT reference standards as well as biomechanical measures. Interobserver agreement and agreement with reference standards were assessed. Cohen’s κ, Spearman’s correlation, and Kruskal-Wallis tests were used as appropriate. Results: There was substantial agreement (κ > 0.6, P < 0.001) for each MRI feature and with comparison against reference standards, except for cartilage edge integration (κ = 0.6). There was a strong positive correlation between MRI and reference standard scores (ρ = 0.86, P < 0.01). Osteochondral allograft MRI scoring system was sensitive to differences in outcomes between the types of allografts. Conclusions: We have described a comprehensive MRI scoring system for osteochondral allografts and have validated this scoring system with histopathologic and µCT reference standards as well as biomechanical indentation testing. PMID:24489999
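The interobserver statistic used in the study above, Cohen's κ, is straightforward to compute from two raters' scores. A minimal sketch with hypothetical ordinal ratings (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of items rated identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if raters were independent with the same marginals.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 0-2 ordinal scores for 10 grafts from two radiologists.
a = [0, 1, 2, 2, 1, 0, 1, 2, 0, 1]
b = [0, 1, 2, 1, 1, 0, 1, 2, 0, 2]
print(round(cohens_kappa(a, b), 3))  # 0.697
```

Here 8 of 10 items agree (p_o = 0.8) against an expected 0.34 by chance, giving κ ≈ 0.70, in the "substantial agreement" band the abstract refers to.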
Software Development Technologies for Reactive, Real-Time, and Hybrid Systems
NASA Technical Reports Server (NTRS)
Manna, Zohar
1996-01-01
The research is directed towards the design and implementation of a comprehensive deductive environment for the development of high-assurance systems, especially reactive (concurrent, real-time, and hybrid) systems. Reactive systems maintain an ongoing interaction with their environment, and are among the most difficult to design and verify. The project aims to provide engineers with a wide variety of tools within a single, general, formal framework in which the tools will be most effective. The entire development process is considered, including the construction, transformation, validation, verification, debugging, and maintenance of computer systems. The goal is to automate the process as much as possible and reduce the errors that pervade hardware and software development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zaborszky, J.; Venkatasubramanian, V.
1995-10-01
Taxonomy Theory is the first precise comprehensive theory for large power system dynamics modeled in any detail. The motivation for this project is to show that it can be used, practically, for analyzing a disturbance that actually occurred on a large system, which affected a sizable portion of the Midwest with supercritical Hopf type oscillations. This event is well documented and studied. The report first summarizes Taxonomy Theory with an engineering flavor. Then various computational approaches are cited and analyzed for their suitability for use with Taxonomy Theory. Then working equations are developed for computing a segment of the feasibility boundary that bounds the region of (operating) parameters throughout which the operating point can be moved without losing stability. Then experimental software incorporating the large EPRI software package PSAPAC is developed. After a summary of the events during the subject disturbance, numerous large scale computations, up to 7600 buses, are reported. These results are reduced into graphical and tabular forms, which then are analyzed and discussed. The report is divided into two volumes. This volume illustrates the use of the Taxonomy Theory for computing the feasibility boundary and presents evidence that the event indeed led to a Hopf type oscillation on the system. Furthermore it proves that the Feasibility Theory can indeed be used for practical computation work with very large systems. Volume 2, a separate volume, will show that the disturbance has led to a supercritical (that is, stable oscillation) Hopf bifurcation.
The Effects of Beacons, Comments, and Tasks on Program Comprehension Process in Software Maintenance
ERIC Educational Resources Information Center
Fan, Quyin
2010-01-01
Program comprehension is the most important and frequent process in software maintenance. Extensive research has found that individual characteristics of programmers, differences of computer programs, and differences of task-driven motivations are the major factors that affect the program comprehension results. There is no study specifically…
Ataer-Cansizoglu, Esra; Bolon-Canedo, Veronica; Campbell, J Peter; Bozkurt, Alican; Erdogmus, Deniz; Kalpathy-Cramer, Jayashree; Patel, Samir; Jonas, Karyn; Chan, R V Paul; Ostmo, Susan; Chiang, Michael F
2015-11-01
We developed and evaluated the performance of a novel computer-based image analysis system for grading plus disease in retinopathy of prematurity (ROP), and identified the image features, shapes, and sizes that best correlate with expert diagnosis. A dataset of 77 wide-angle retinal images from infants screened for ROP was collected. A reference standard diagnosis was determined for each image by combining image grading from 3 experts with the clinical diagnosis from ophthalmoscopic examination. Manually segmented images were cropped into a range of shapes and sizes, and a computer algorithm was developed to extract tortuosity and dilation features from arteries and veins. Each feature was fed into our system to identify the set of characteristics that yielded the highest-performing system compared to the reference standard, which we refer to as the "i-ROP" system. Among the tested crop shapes, sizes, and measured features, point-based measurements of arterial and venous tortuosity (combined), and a large circular cropped image (with radius 6 times the disc diameter), provided the highest diagnostic accuracy. The i-ROP system achieved 95% accuracy for classifying preplus and plus disease compared to the reference standard. This was comparable to the performance of the 3 individual experts (96%, 94%, 92%), and significantly higher than the mean performance of 31 nonexperts (81%). This comprehensive analysis of computer-based plus disease suggests that it may be feasible to develop a fully-automated system based on wide-angle retinal images that performs comparably to expert graders at three-level plus disease discrimination. Computer-based image analysis, using objective and quantitative retinal vascular features, has potential to complement clinical ROP diagnosis by ophthalmologists.
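A common vessel tortuosity measure of the kind extracted by systems like the one above is the ratio of a centerline's arc length to its chord length. A sketch on synthetic centerlines (the i-ROP system's exact feature definitions are not reproduced here):

```python
import numpy as np

def tortuosity(points):
    """Arc length of a sampled vessel centerline divided by its chord length.
    Returns 1.0 for a straight segment, larger for more tortuous vessels."""
    pts = np.asarray(points, dtype=float)
    seg = np.diff(pts, axis=0)                     # consecutive segment vectors
    arc = np.sqrt((seg ** 2).sum(axis=1)).sum()    # total path length
    chord = np.sqrt(((pts[-1] - pts[0]) ** 2).sum())  # endpoint-to-endpoint distance
    return arc / chord

# A straight vessel versus a sinusoidally wiggling one (synthetic centerlines).
t = np.linspace(0.0, 1.0, 200)
straight = np.column_stack([t, np.zeros_like(t)])
wiggly = np.column_stack([t, 0.05 * np.sin(8 * np.pi * t)])
print(tortuosity(straight), tortuosity(wiggly))
```

The straight segment scores exactly 1.0 and the wiggly one scores noticeably higher, which is the property a plus-disease classifier exploits when it thresholds or combines such features.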
Metal oxide resistive random access memory based synaptic devices for brain-inspired computing
NASA Astrophysics Data System (ADS)
Gao, Bin; Kang, Jinfeng; Zhou, Zheng; Chen, Zhe; Huang, Peng; Liu, Lifeng; Liu, Xiaoyan
2016-04-01
The traditional Boolean computing paradigm based on the von Neumann architecture is facing great challenges for future information technology applications such as big data, the Internet of Things (IoT), and wearable devices, due to the limited processing capability issues such as binary data storage and computing, non-parallel data processing, and the buses requirement between memory units and logic units. The brain-inspired neuromorphic computing paradigm is believed to be one of the promising solutions for realizing more complex functions with a lower cost. To perform such brain-inspired computing with a low cost and low power consumption, novel devices for use as electronic synapses are needed. Metal oxide resistive random access memory (ReRAM) devices have emerged as the leading candidate for electronic synapses. This paper comprehensively addresses the recent work on the design and optimization of metal oxide ReRAM-based synaptic devices. A performance enhancement methodology and optimized operation scheme to achieve analog resistive switching and low-energy training behavior are provided. A three-dimensional vertical synapse network architecture is proposed for high-density integration and low-cost fabrication. The impacts of the ReRAM synaptic device features on the performances of neuromorphic systems are also discussed on the basis of a constructed neuromorphic visual system with a pattern recognition function. Possible solutions to achieve the high recognition accuracy and efficiency of neuromorphic systems are presented.
Singh, Anushikha; Dutta, Malay Kishore
2017-12-01
The authentication and integrity verification of medical images is a critical and growing issue for patients in e-health services. Accurate identification of medical images and patient verification is an essential requirement to prevent error in medical diagnosis. The proposed work presents an imperceptible watermarking system to address the security issue of medical fundus images for tele-ophthalmology applications and computer aided automated diagnosis of retinal diseases. In the proposed work, patient identity is embedded in the fundus image in the singular value decomposition domain with an adaptive quantization parameter to maintain perceptual transparency for a variety of fundus images, whether healthy or disease-affected. In the proposed method, insertion of the watermark in the fundus image does not affect the automatic image processing diagnosis of retinal objects and pathologies, which ensures uncompromised computer-based diagnosis associated with the fundus image. The patient ID is correctly recovered from the watermarked fundus image for integrity verification at the diagnosis centre. The proposed watermarking system is tested on a comprehensive database of fundus images and the results are convincing. Results indicate that the proposed watermarking method is imperceptible and does not affect computer vision based automated diagnosis of retinal diseases. Correct recovery of the patient ID from the watermarked fundus image makes the proposed watermarking system applicable for authentication of fundus images for computer aided diagnosis and tele-ophthalmology applications. Copyright © 2017 Elsevier B.V. All rights reserved.
A computational model for simulating text comprehension.
Lemaire, Benoît; Denhière, Guy; Bellissens, Cédrick; Jhean-Larose, Sandra
2006-11-01
In the present article, we outline the architecture of a computer program for simulating the process by which humans comprehend texts. The program is based on psycholinguistic theories about human memory and text comprehension processes, such as the construction-integration model (Kintsch, 1998), the latent semantic analysis theory of knowledge representation (Landauer & Dumais, 1997), and the predication algorithms (Kintsch, 2001; Lemaire & Bianco, 2003), and it is intended to help psycholinguists investigate the way humans comprehend texts.
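The latent semantic analysis component cited above can be sketched with a truncated SVD of a tiny, made-up term-document matrix; real LSA spaces are built from large corpora, so the words and counts below are purely illustrative:

```python
import numpy as np

# Tiny term-document co-occurrence matrix (rows: words, columns: passages).
words = ["doctor", "nurse", "hospital", "guitar", "music"]
counts = np.array([
    [2, 3, 0, 0],   # doctor
    [1, 2, 0, 0],   # nurse
    [3, 1, 0, 1],   # hospital
    [0, 0, 2, 3],   # guitar
    [0, 0, 3, 2],   # music
], dtype=float)

# LSA: a truncated SVD projects words into a low-dimensional semantic space.
U, s, Vt = np.linalg.svd(counts, full_matrices=False)
k = 2
word_vecs = U[:, :k] * s[:k]   # word coordinates in the k-dim semantic space

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

i = dict(zip(words, range(len(words))))
print(cosine(word_vecs[i["doctor"]], word_vecs[i["nurse"]]))   # high
print(cosine(word_vecs[i["doctor"]], word_vecs[i["guitar"]]))  # low
```

Words that co-occur in the same passages end up close in the reduced space, which is the representation such comprehension models use to estimate semantic relatedness between a text and a reader's knowledge.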
ERIC Educational Resources Information Center
Alodwan, Talal; Almosa, Mosaab
2018-01-01
The study aimed to assess the effectiveness of a computer program based on Analysis, Design, Development, Implementation and Evaluation (ADDIE) Model on the achievement of Ninth Graders' listening and Reading Comprehension Skills in English. The study sample comprised 70 ninth graders during the second semester of the academic year 2016/2017. The…
Linking medical records to an expert system
NASA Technical Reports Server (NTRS)
Naeymi-Rad, Frank; Trace, David; Desouzaalmeida, Fabio
1991-01-01
This presentation will be done using the IMR-Entry (Intelligent Medical Record Entry) system. IMR-Entry is a software program developed as a front-end to our diagnostic consultant software MEDAS (Medical Emergency Decision Assistance System). MEDAS is a diagnostic consultant system using a multimembership Bayesian design for its inference engine and relational database technology for its knowledge base maintenance. Research on MEDAS began at the University of Southern California and the Institute of Critical Care in the mid 1970's with support from NASA and NSF. The MEDAS project moved to Chicago in 1982; its current progress is due to collaboration between Illinois Institute of Technology, The Chicago Medical School, Lake Forest College and NASA at KSC. Since the purpose of an expert system is to derive a hypothesis, its communication vocabulary is limited to features used by its knowledge base. The development of a comprehensive problem-based medical record entry system that could handshake with an expert system while simultaneously creating an electronic medical record was studied. IMR-E is a computer based patient record that serves as a front end to the expert system MEDAS. IMR-E is a graphically oriented comprehensive medical record. The program's major components are demonstrated.
Applicability of computational systems biology in toxicology.
Kongsbak, Kristine; Hadrup, Niels; Audouze, Karine; Vinggaard, Anne Marie
2014-07-01
Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. In toxicology, computational systems biology enables identification of important pathways and molecules from large data sets; tasks that can be extremely laborious when performed by a classical literature search. However, computational systems biology offers more advantages than providing a high-throughput literature search; it may form the basis for establishment of hypotheses on potential links between environmental chemicals and human diseases, which would be very difficult to establish experimentally. This is possible due to the existence of comprehensive databases containing information on networks of human protein-protein interactions and protein-disease associations. Experimentally determined targets of the specific chemical of interest can be fed into these networks to obtain additional information that can be used to establish hypotheses on links between the chemical and human diseases. Such information can also be applied for designing more intelligent animal/cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method in the hypothesis-generating phase of toxicological research. © 2014 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).
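The hypothesis-generating step described above, feeding a chemical's experimentally determined protein targets into protein-disease association networks, can be sketched with toy lookup tables. All proteins, diseases, and the chemical name below are placeholders, not curated data:

```python
# Toy protein-disease association table and chemical-target table.
protein_disease = {
    "P1": {"diabetes"},
    "P2": {"diabetes", "obesity"},
    "P3": {"asthma"},
}
chemical_targets = {"chemX": {"P1", "P2"}}

def candidate_diseases(chemical):
    """Rank diseases by how many of the chemical's targets implicate them."""
    hits = {}
    for protein in chemical_targets[chemical]:
        for disease in protein_disease.get(protein, set()):
            hits[disease] = hits.get(disease, 0) + 1  # count supporting targets
    return sorted(hits.items(), key=lambda kv: -kv[1])

print(candidate_diseases("chemX"))  # [('diabetes', 2), ('obesity', 1)]
```

Diseases supported by more of the chemical's targets rank higher, yielding the kind of testable chemical-disease hypotheses the abstract describes; real applications would query comprehensive interaction databases rather than hand-written dictionaries.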
The GPRIME approach to finite element modeling
NASA Technical Reports Server (NTRS)
Wallace, D. R.; Mckee, J. H.; Hurwitz, M. M.
1983-01-01
GPRIME, an interactive modeling system, runs on the CDC 6000 computers and the DEC VAX 11/780 minicomputer. This system includes three components: (1) GPRIME, a user friendly geometric language and a processor to translate that language into geometric entities; (2) GGEN, an interactive data generator for 2-D models; and (3) SOLIDGEN, a 3-D solid modeling program. Each component has a user interface with an extensive command set. All of these programs make use of a comprehensive B-spline mathematics subroutine library, which can be used for a wide variety of interpolation problems and other geometric calculations. Many other user aids, such as automatic saving of the geometric and finite element data bases and hidden line removal, are available. This interactive finite element modeling capability can produce a complete finite element model and an output file of grid and element data.
Roth, Christian J; Ismail, Mahmoud; Yoshihara, Lena; Wall, Wolfgang A
2017-01-01
In this article, we propose a comprehensive computational model of the entire respiratory system, which allows simulating patient-specific lungs under different ventilation scenarios and provides a deeper insight into local straining and stressing of pulmonary acini. We include novel 0D inter-acinar linker elements to respect the interplay between neighboring alveoli, an essential feature especially in heterogeneously distended lungs. The model is applicable to healthy and diseased patient-specific lung geometries. Presented computations in this work are based on a patient-specific lung geometry obtained from computed tomography data and composed of 60,143 conducting airways, 30,072 acini, and 140,135 inter-acinar linkers. The conducting airways start at the trachea and end before the respiratory bronchioles. The acini are connected to the conducting airways via terminal airways and to each other via inter-acinar linkers forming a fully coupled anatomically based respiratory model. Presented numerical examples include simulation of breathing during a spirometry-like test, measurement of a quasi-static pressure-volume curve using a supersyringe maneuver, and volume-controlled mechanical ventilation. The simulations show that our model incorporating inter-acinar dependencies successfully reproduces physiological results in healthy and diseased states. Moreover, within these scenarios, a deeper insight into local pressure, volume, and flow rate distribution in the human lung is investigated and discussed. Copyright © 2016 John Wiley & Sons, Ltd.
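The acinar mechanics in the model above can be illustrated at their simplest with a single-compartment (0D) analogue: an airway resistance in series with an elastic compartment, driven by an airway-opening pressure. The parameter values below are illustrative only, not taken from the paper:

```python
import numpy as np

# Single-compartment (0D) lung: resistance R in series with compliance C,
# driven by pressure P(t) at the airway opening: R*dV/dt + V/C = P(t).
R = 3.0      # airway resistance, cmH2O*s/L (illustrative)
C = 0.05     # compartment compliance, L/cmH2O (illustrative)
dt = 0.001   # time step, s
t = np.arange(0.0, 4.0, dt)
P = np.where(t < 2.0, 10.0, 0.0)   # inflate for 2 s, then release

# Explicit Euler integration of the volume above resting volume.
V = np.zeros_like(t)
for k in range(1, len(t)):
    dVdt = (P[k - 1] - V[k - 1] / C) / R
    V[k] = V[k - 1] + dt * dVdt

# After ~2 s of inflation (many time constants RC = 0.15 s),
# the volume approaches its static value P*C = 0.5 L.
print(round(V[int(2.0 / dt) - 1], 3))
```

The full model couples tens of thousands of such compartments through the airway tree and inter-acinar linkers; this sketch only shows the pressure-flow-volume relation that each 0D element obeys.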
Computer-Aided Modeling and Analysis of Power Processing Systems (CAMAPPS), phase 1
NASA Technical Reports Server (NTRS)
Kim, S.; Lee, J.; Cho, B. H.; Lee, F. C.
1986-01-01
The large-signal behaviors of a regulator depend largely on the type of power circuit topology and control. Thus, for maximum flexibility, it is best to develop models for each functional block as independent modules. A regulator can then be configured by collecting appropriate pre-defined modules for each functional block. In order to complete the component model generation for a comprehensive spacecraft power system, the following modules were developed: solar array switching unit and control; shunt regulators; and battery discharger. The capability of each module is demonstrated using a simplified Direct Energy Transfer (DET) system. Large-signal behaviors of solar array power systems were analyzed. Stability of the solar array system operating points with a nonlinear load is analyzed. The state-plane analysis illustrates trajectories of the system operating point under various conditions. Stability and transient responses of the system operating near the solar array's maximum power point are also analyzed. The solar array system mode of operation is described using the DET spacecraft power system. The DET system is simulated for various operating conditions. Transfer of the software program CAMAPPS (Computer Aided Modeling and Analysis of Power Processing Systems) to NASA/GSFC (Goddard Space Flight Center) was accomplished.
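The operating-point analysis described above can be sketched for a single-diode array model: the operating point is where the array's I-V curve meets the load line. All parameter values below are illustrative assumptions, not taken from the report:

```python
import math

# Single-diode solar array model (illustrative parameters):
# I(V) = Isc - I0*(exp(V/Vt) - 1). The operating point is the intersection
# of this I-V curve with a resistive load line I = V/Rload.
Isc, I0, Vt = 5.0, 1e-9, 1.5   # short-circuit current, saturation current, scaled thermal voltage
Rload = 6.0                    # load resistance, ohms

def f(V):
    """Array current minus load current; zero at the operating point."""
    return Isc - I0 * (math.exp(V / Vt) - 1.0) - V / Rload

# Bisection between V = 0 (array supplies surplus current, f > 0)
# and the open-circuit voltage (array supplies none, f < 0).
lo, hi = 0.0, Vt * math.log(Isc / I0 + 1.0)
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if f(mid) > 0.0:
        lo = mid
    else:
        hi = mid
V_op = 0.5 * (lo + hi)
print(round(V_op, 3), round(V_op / Rload, 3))  # operating voltage and current
```

Repeating this intersection for different loads and irradiance levels traces the operating-point trajectories that a state-plane analysis examines for stability near the maximum power point.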
Prabhakar, P.; Sames, William J.; Dehoff, Ryan R.; ...
2015-03-28
Here, a computational modeling approach to simulate residual stress formation during the electron beam melting (EBM) process within the additive manufacturing (AM) technologies for Inconel 718 is presented. The EBM process has demonstrated a high potential to fabricate components with complex geometries, but the resulting components are influenced by the thermal cycles observed during the manufacturing process. When processing nickel based superalloys, very high temperatures (approx. 1000 °C) are observed in the powder bed, base plate, and build. These high temperatures, when combined with substrate adherence, can result in warping of the base plate and affect the final component by causing defects. It is important to have an understanding of the thermo-mechanical response of the entire system, that is, its mechanical behavior towards thermal loading occurring during the EBM process prior to manufacturing a component. Therefore, computational models to predict the response of the system during the EBM process will aid in eliminating the undesired process conditions, a priori, in order to fabricate the optimum component. Such a comprehensive computational modeling approach is demonstrated to analyze warping of the base plate, stress and plastic strain accumulation within the material, and thermal cycles in the system during different stages of the EBM process.
TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling
NASA Astrophysics Data System (ADS)
Nelson, J.; Jones, N.; Ames, D. P.
2015-12-01
Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet there still remains a significant technical barrier to leveraging these resources. This barrier inhibits many decision makers and even trained engineers from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, which have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch the jobs to the computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage HTCondor, an open-source computing-resource and job-management system, to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.
DGCA: A comprehensive R package for Differential Gene Correlation Analysis.
McKenzie, Andrew T; Katsyv, Igor; Song, Won-Min; Wang, Minghui; Zhang, Bin
2016-11-15
Dissecting the regulatory relationships between genes is a critical step towards building accurate predictive models of biological systems. A powerful approach towards this end is to systematically study the differences in correlation between gene pairs in more than one distinct condition. In this study we develop an R package, DGCA (for Differential Gene Correlation Analysis), which offers a suite of tools for computing and analyzing differential correlations between gene pairs across multiple conditions. To minimize parametric assumptions, DGCA computes empirical p-values via permutation testing. To understand differential correlations at a systems level, DGCA performs higher-order analyses such as measuring the average difference in correlation and multiscale clustering analysis of differential correlation networks. Through a simulation study, we show that the straightforward z-score based method that DGCA employs significantly outperforms the existing alternative methods for calculating differential correlation. Application of DGCA to the TCGA RNA-seq data in breast cancer not only identifies key changes in the regulatory relationships between TP53 and PTEN and their target genes in the presence of inactivating mutations, but also reveals an immune-related differential correlation module that is specific to triple negative breast cancer (TNBC). DGCA is an R package for systematically assessing the difference in gene-gene regulatory relationships under different conditions. This user-friendly, effective, and comprehensive software tool will greatly facilitate the application of differential correlation analysis in many biological studies and thus help identify novel signaling pathways, biomarkers, and targets in complex biological systems and diseases.
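The z-score for a difference in correlation between two conditions is classically computed via Fisher's z-transform. The sketch below shows that generic textbook calculation, not DGCA's own code (DGCA is an R package and additionally derives empirical p-values by permutation); the data are simulated for illustration.

```python
import numpy as np

def diff_corr_z(x1, y1, x2, y2):
    """Z-score for the difference in Pearson correlation of (x, y)
    between condition 1 and condition 2, via Fisher's z-transform."""
    r1 = np.corrcoef(x1, y1)[0, 1]
    r2 = np.corrcoef(x2, y2)[0, 1]
    z1, z2 = np.arctanh(r1), np.arctanh(r2)   # Fisher transform
    n1, n2 = len(x1), len(x2)
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    return (z1 - z2) / se

rng = np.random.default_rng(0)
x1 = rng.normal(size=100); y1 = x1 + rng.normal(size=100)  # correlated pair
x2 = rng.normal(size=100); y2 = rng.normal(size=100)       # uncorrelated pair
print(diff_corr_z(x1, y1, x2, y2))  # large positive z: correlation was lost
```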
RNA Structural Dynamics As Captured by Molecular Simulations: A Comprehensive Overview.
Šponer, Jiří; Bussi, Giovanni; Krepl, Miroslav; Banáš, Pavel; Bottaro, Sandro; Cunha, Richard A; Gil-Ley, Alejandro; Pinamonti, Giovanni; Poblete, Simón; Jurečka, Petr; Walter, Nils G; Otyepka, Michal
2018-04-25
With both catalytic and genetic functions, ribonucleic acid (RNA) is perhaps the most pluripotent chemical species in molecular biology, and its functions are intimately linked to its structure and dynamics. Computer simulations, and in particular atomistic molecular dynamics (MD), allow structural dynamics of biomolecular systems to be investigated with unprecedented temporal and spatial resolution. We here provide a comprehensive overview of the fast-developing field of MD simulations of RNA molecules. We begin with an in-depth, evaluatory coverage of the most fundamental methodological challenges that set the basis for the future development of the field, in particular, the current developments and inherent physical limitations of the atomistic force fields and the recent advances in a broad spectrum of enhanced sampling methods. We also survey the closely related field of coarse-grained modeling of RNA systems. After dealing with the methodological aspects, we provide an exhaustive overview of the available RNA simulation literature, ranging from studies of the smallest RNA oligonucleotides to investigations of the entire ribosome. Our review encompasses tetranucleotides, tetraloops, a number of small RNA motifs, A-helix RNA, kissing-loop complexes, the TAR RNA element, the decoding center and other important regions of the ribosome, as well as assorted other systems. Extended sections are devoted to RNA-ion interactions, ribozymes, riboswitches, and protein/RNA complexes. Our overview is written for as broad an audience as possible, aiming to provide a much-needed interdisciplinary bridge between computation and experiment, together with a perspective on the future of the field.
Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base
NASA Technical Reports Server (NTRS)
Mcruer, Duane T.; Myers, Thomas T.
1988-01-01
The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.
Cross-language differences in the brain network subserving intelligible speech.
Ge, Jianqiao; Peng, Gang; Lyu, Bingjiang; Wang, Yi; Zhuo, Yan; Niu, Zhendong; Tan, Li Hai; Leff, Alexander P; Gao, Jia-Hong
2015-03-10
How is language processed in the brain by native speakers of different languages? Is there one brain system for all languages or are different languages subserved by different brain systems? The first view emphasizes commonality, whereas the second emphasizes specificity. We investigated the cortical dynamics involved in processing two very diverse languages: a tonal language (Chinese) and a nontonal language (English). We used functional MRI and dynamic causal modeling analysis to compute and compare brain network models exhaustively with all possible connections among nodes of language regions in temporal and frontal cortex and found that the information flow from the posterior to anterior portions of the temporal cortex was commonly shared by Chinese and English speakers during speech comprehension, whereas the inferior frontal gyrus received neural signals from the left posterior portion of the temporal cortex in English speakers and from the bilateral anterior portion of the temporal cortex in Chinese speakers. Our results revealed that, although speech processing is largely carried out in the common left hemisphere classical language areas (Broca's and Wernicke's areas) and anterior temporal cortex, speech comprehension across different language groups depends on how these brain regions interact with each other. Moreover, the right anterior temporal cortex, which is crucial for tone processing, is equally important as its left homolog, the left anterior temporal cortex, in modulating the cortical dynamics in tone language comprehension. The current study pinpoints the importance of the bilateral anterior temporal cortex in language comprehension that is downplayed or even ignored by popular contemporary models of speech comprehension.
Software Development Technologies for Reactive, Real-Time, and Hybrid Systems: Summary of Research
NASA Technical Reports Server (NTRS)
Manna, Zohar
1998-01-01
This research is directed towards the implementation of a comprehensive deductive-algorithmic environment (toolkit) for the development and verification of high assurance reactive systems, especially concurrent, real-time, and hybrid systems. For this, we have designed and implemented the STeP (Stanford Temporal Prover) verification system. Reactive systems have an ongoing interaction with their environment, and their computations are infinite sequences of states. A large number of systems can be seen as reactive systems, including hardware, concurrent programs, network protocols, and embedded systems. Temporal logic provides a convenient language for expressing properties of reactive systems. A temporal verification methodology provides procedures for proving that a given system satisfies a given temporal property. The research covered necessary theoretical foundations as well as implementation and application issues.
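A flavor of temporal properties over state sequences can be given in a few lines. The sketch below checks the property "always P" (invariance) over a finite prefix of a computation; since reactive computations are infinite, this is only a bounded illustration, and the mutual-exclusion example is invented, unrelated to the actual prover.

```python
def always(prop, trace):
    """Check the temporal property "always prop" over a finite
    prefix of a computation (a sequence of states)."""
    return all(prop(state) for state in trace)

# Toy mutual-exclusion trace: each state records which processes
# are in their critical sections.
trace = [
    {"p1": False, "p2": False},
    {"p1": True,  "p2": False},
    {"p1": False, "p2": True},
]
mutex = lambda s: not (s["p1"] and s["p2"])  # never both critical
print(always(mutex, trace))  # True: invariant holds on this prefix
```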
A Portable Computer System for Auditing Quality of Ambulatory Care
McCoy, J. Michael; Dunn, Earl V.; Borgiel, Alexander E.
1987-01-01
Prior efforts to effectively and efficiently audit quality of ambulatory care based on comprehensive process criteria have been limited largely by the complexity and cost of data abstraction and management. Over the years, several demonstration projects have generated large sets of process criteria and mapping systems for evaluating quality of care, but these paper-based approaches have been impractical to implement on a routine basis. Recognizing that portable microcomputers could solve many of the technical problems in abstracting data from medical records, we built upon previously described criteria and developed a microcomputer-based abstracting system that facilitates reliable and cost-effective data abstraction.
Theory and experiments in model-based space system anomaly management
NASA Astrophysics Data System (ADS)
Kitts, Christopher Adam
This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms was developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.
The Nature of Computer Assisted Learning.
ERIC Educational Resources Information Center
Whiting, John
Computer assisted learning (CAL) is an old technology which has generated much new interest. Computers can: reduce data to a directly comprehensible form; reduce administration; communicate worldwide and exchange, store, and retrieve data; and teach. The computer's limitation is in its dependence on the user's ability and perceptive nature.…
NASA Astrophysics Data System (ADS)
Cenek, Martin; Dahl, Spencer K.
2016-11-01
Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
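The recorded probability of agents switching from one behavior pattern to another can be estimated directly from a labeled behavior sequence. The sketch below shows that generic estimate; the pattern names and sequence are invented for illustration, and the paper's framework adds statistical pattern identification and network analysis on top of such counts.

```python
from collections import Counter

def transition_probs(labels):
    """Estimate P(next pattern | current pattern) from a per-step
    sequence of behavior-pattern labels."""
    pair_counts = Counter(zip(labels, labels[1:]))   # consecutive pairs
    from_counts = Counter(labels[:-1])               # outgoing totals
    return {(a, b): c / from_counts[a] for (a, b), c in pair_counts.items()}

# Toy sequence of system-wide behavior patterns over time.
seq = ["flock", "flock", "scatter", "flock", "scatter", "scatter"]
p = transition_probs(seq)
print(p)  # e.g. P(scatter | flock) = 2/3
```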
ERIC Educational Resources Information Center
Douglas, Karen H.; Ayres, Kevin M.; Langone, John; Bramlett, Virginia Bell
2011-01-01
This study evaluated the effects of a computer-based instructional program to assist three students with mild to moderate intellectual disabilities in using pictorial graphic organizers as aids for increasing comprehension of electronic text-based recipes. Student comprehension of recipes was measured by their ability to verbally retell recipe…
Construction of In-house Databases in a Corporation
NASA Astrophysics Data System (ADS)
Dezaki, Kyoko; Saeki, Makoto
Rapid progress in advanced information technology has increased the need to strengthen documentation activities in industry. In response, Tokin Corporation has been engaged in constructing databases for patent information, technical reports, and other documents accumulated inside the company. Two results have been obtained: one is TOPICS, an in-house patent information management system; the other is TOMATIS, a management and technical information system built with personal computers and general-purpose relational database software. These systems aim at compiling databases of patent and technological management information generated internally and externally, at low labor effort as well as low cost, and at providing comprehensive information company-wide. This paper introduces the outline of these systems and how they are actually used.
Argonne simulation framework for intelligent transportation systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewing, T.; Doss, E.; Hanebutte, U.
1996-04-01
A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a stand-alone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented "smart" vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimum route planning, where the data is updated periodically to reflect any changes in road or weather conditions. The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that incorporate human-factors studies to support safety and operational research. Realistic modeling of variations of the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
Evolution of the ATLAS distributed computing system during the LHC long shutdown
NASA Astrophysics Data System (ADS)
Campana, S.; Atlas Collaboration
2014-06-01
The ATLAS Distributed Computing project (ADC) was established in 2007 to develop and operate a framework, following the ATLAS computing model, to enable data storage, processing and bookkeeping on top of the Worldwide LHC Computing Grid (WLCG) distributed infrastructure. ADC development has always been driven by operations and this contributed to its success. The system has fulfilled the demanding requirements of ATLAS, daily consolidating worldwide up to 1 PB of data and running more than 1.5 million payloads distributed globally, supporting almost one thousand concurrent distributed analysis users. Comprehensive automation and monitoring minimized the operational manpower required. The flexibility of the system to adjust to operational needs has been important to the success of the ATLAS physics program. The LHC shutdown in 2013-2015 affords an opportunity to improve the system in light of operational experience and scale it to cope with the demanding requirements of 2015 and beyond, most notably a much higher trigger rate and event pileup. We will describe the evolution of the ADC software foreseen during this period. This includes consolidating the existing Production and Distributed Analysis framework (PanDA) and ATLAS Grid Information System (AGIS), together with the development and commissioning of next generation systems for distributed data management (DDM/Rucio) and production (Prodsys-2). We will explain how new technologies such as Cloud Computing and NoSQL databases, which ATLAS investigated as R&D projects in past years, will be integrated in production. Finally, we will describe more fundamental developments such as breaking job-to-data locality by exploiting storage federations and caches, and event level (rather than file or dataset level) workload engines.
A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing
Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui
2017-01-01
Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which makes simulation both a data-intensive and a computing-intensive problem. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation in raw data simulation, an access pattern that greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy improves performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing intensive and data intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration. PMID:28075343
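The map/reduce decomposition of echo accumulation can be caricatured in a few lines. In the sketch below, map tasks emit (range-bin, contribution) pairs per point target and reduce tasks sum them; the range-bin rule and target data are toy stand-ins for the actual SAR geometry, and plain Python stands in for the Hadoop runtime.

```python
from collections import defaultdict

def map_phase(targets):
    """Map: each point target emits (range_bin, contribution) pairs.
    Toy range-bin assignment; a real simulator computes slant-range
    geometry and the phase of the echo."""
    for pos, amp in targets:
        yield pos % 8, amp

def reduce_phase(pairs):
    """Reduce: accumulate contributions per range bin, mirroring the
    irregular accumulation described in the abstract."""
    bins = defaultdict(float)
    for key, value in pairs:
        bins[key] += value
    return dict(bins)

targets = [(3, 1.0), (11, 0.5), (5, 2.0)]  # (position, amplitude)
echo = reduce_phase(map_phase(targets))
print(echo)  # {3: 1.5, 5: 2.0} - targets 3 and 11 share a bin
```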
ERIC Educational Resources Information Center
Falleur, David M.
This presentation describes SuperPILOT, an extended version of Apple PILOT, a programming language for developing computer-assisted instruction (CAI) with the Apple II computer that includes the features of its early PILOT (Programmed Inquiry, Learning or Teaching) ancestors together with new features that make use of the Apple computer's advanced…
Manifesto of computational social science
NASA Astrophysics Data System (ADS)
Conte, R.; Gilbert, N.; Bonelli, G.; Cioffi-Revilla, C.; Deffuant, G.; Kertesz, J.; Loreto, V.; Moat, S.; Nadal, J.-P.; Sanchez, A.; Nowak, A.; Flache, A.; San Miguel, M.; Helbing, D.
2012-11-01
The increasing integration of technology into our lives has created unprecedented volumes of data on society's everyday behaviour. Such data opens up exciting new opportunities to work towards a quantitative understanding of our complex social systems, within the realms of a new discipline known as Computational Social Science. Against a background of financial crises, riots and international epidemics, the urgent need for a greater comprehension of the complexity of our interconnected global society and an ability to apply such insights in policy decisions is clear. This manifesto outlines the objectives of this new scientific direction, the challenges involved, and the extensive impact on science, technology and society that the success of this endeavour is likely to bring about.
NASA Technical Reports Server (NTRS)
Manhardt, P. D.
1982-01-01
The CMC fluid mechanics program system was developed to translate finite element numerical solution methodology, applied to nonlinear field problems, into a versatile computer code for comprehensive flow field analysis. Data procedures for the CMC 3-dimensional Parabolic Navier-Stokes (PNS) algorithm are presented. Following the general data procedures, a juncture corner flow standard test case data deck is described. A listing of the data deck and an explanation of grid generation methodology are presented. Tabulations of all commands and variables available to the user are described. These are in alphabetical order with cross-reference numbers which refer to storage addresses.
A distributed program composition system
NASA Technical Reports Server (NTRS)
Brown, Robert L.
1989-01-01
A graphical technique for creating distributed computer programs is investigated and a prototype implementation is described which serves as a testbed for the concepts. The type of programs under examination is restricted to those comprising relatively heavyweight parts that intercommunicate by passing messages of typed objects. Such programs are often presented visually as a directed graph with computer program parts as the nodes and communication channels as the edges. This class of programs, called parts-based programs, is not well supported by existing computer systems; much manual work is required to describe the program to the system, establish the communication paths, accommodate the heterogeneity of data types, and to locate the parts of the program on the various systems involved. The work described solves most of these problems by providing an interface for describing parts-based programs in this class in a way that closely models the way programmers think about them: using sketches of digraphs. Program parts, the computational nodes of the larger program system, are categorized in libraries and are accessed with browsers. The process of programming has the programmer draw the program graph interactively. Heterogeneity is automatically accommodated by the insertion of type translators where necessary between the parts. Many decisions are necessary in the creation of a comprehensive tool for interactive creation of programs in this class. Possibilities are explored and the issues behind such decisions are presented. An approach to program composition is described, not a carefully implemented programming environment. However, a prototype implementation is described that can demonstrate the ideas presented.
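The parts-based style described above, heavyweight parts exchanging typed messages along the edges of a directed graph, can be mimicked in miniature with threads and queues. The part names and two-node channel layout below are invented for illustration; the original system also inserts type translators between heterogeneous parts, which is omitted here.

```python
import queue
import threading

# Two "parts" (nodes) connected by message channels (edges).
def producer(out_q):
    """Source part: emits a stream of integers, then end-of-stream."""
    for i in range(3):
        out_q.put(i)
    out_q.put(None)

def doubler(in_q, out_q):
    """Filter part: doubles each message and forwards it."""
    while (msg := in_q.get()) is not None:
        out_q.put(msg * 2)
    out_q.put(None)

a, b = queue.Queue(), queue.Queue()  # the graph's two edges
threading.Thread(target=producer, args=(a,)).start()
threading.Thread(target=doubler, args=(a, b)).start()

results = []
while (m := b.get()) is not None:
    results.append(m)
print(results)  # [0, 2, 4]
```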
How language production shapes language form and comprehension
MacDonald, Maryellen C.
2012-01-01
Language production processes can provide insight into how language comprehension works and language typology—why languages tend to have certain characteristics more often than others. Drawing on work in memory retrieval, motor planning, and serial order in action planning, the Production-Distribution-Comprehension (PDC) account links work in the fields of language production, typology, and comprehension: (1) faced with substantial computational burdens of planning and producing utterances, language producers implicitly follow three biases in utterance planning that promote word order choices that reduce these burdens, thereby improving production fluency. (2) These choices, repeated over many utterances and individuals, shape the distributions of utterance forms in language. The claim that language form stems in large degree from producers' attempts to mitigate utterance planning difficulty is contrasted with alternative accounts in which form is driven by language use more broadly, language acquisition processes, or producers' attempts to create language forms that are easily understood by comprehenders. (3) Language perceivers implicitly learn the statistical regularities in their linguistic input, and they use this prior experience to guide comprehension of subsequent language. In particular, they learn to predict the sequential structure of linguistic signals, based on the statistics of previously-encountered input. Thus, key aspects of comprehension behavior are tied to lexico-syntactic statistics in the language, which in turn derive from utterance planning biases promoting production of comparatively easy utterance forms over more difficult ones. This approach contrasts with classic theories in which comprehension behaviors are attributed to innate design features of the language comprehension system and associated working memory. The PDC instead links basic features of comprehension to a different source: production processes that shape language form. 
PMID:23637689
Terascale Computing in Accelerator Science and Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ko, Kwok
2002-08-21
We have entered the age of "terascale" scientific computing. Processors and system architecture both continue to evolve; hundred-teraFLOP computers are expected in the next few years, and petaFLOP computers toward the end of this decade are conceivable. This ever-increasing power to solve previously intractable numerical problems benefits almost every field of science and engineering and is revolutionizing some of them, notably including accelerator physics and technology. At existing accelerators, it will help us optimize performance, expand operational parameter envelopes, and increase reliability. Design decisions for next-generation machines will be informed by unprecedentedly comprehensive and accurate modeling, as well as computer-aided engineering; all this will increase the likelihood that even their most advanced subsystems can be commissioned on time, within budget, and up to specifications. Advanced computing is also vital to developing new means of acceleration and exploring the behavior of beams under extreme conditions. With continued progress it will someday become reasonable to speak of a complete numerical model of all phenomena important to a particular accelerator.
NASA Astrophysics Data System (ADS)
Amsallem, David; Tezaur, Radek; Farhat, Charbel
2016-12-01
A comprehensive approach for real-time computations using a database of parametric, linear, projection-based reduced-order models (ROMs) based on arbitrary underlying meshes is proposed. In the offline phase of this approach, the parameter space is sampled, and linear ROMs defined by linear reduced operators are pre-computed at the sampled parameter points and stored. Then, these operators and associated ROMs are transformed into counterparts that satisfy a certain notion of consistency. In the online phase, a linear ROM is constructed in real-time at a queried but unsampled parameter point by interpolating the pre-computed linear reduced operators on matrix manifolds, thereby obtaining an interpolated linear ROM. The proposed model reduction framework is illustrated with two applications: a parametric inverse acoustic scattering problem associated with a mockup submarine, and a parametric flutter prediction problem associated with a wing-tank system. The second application is implemented on a mobile device, illustrating the capability of the proposed computational framework to operate in real-time.
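The online interpolation step can be illustrated with a minimal sketch (not the authors' implementation): assuming the pre-computed reduced operators are symmetric positive definite, one common matrix-manifold strategy maps them to a tangent space with the matrix logarithm, interpolates there, and maps back with the matrix exponential. The function name and the two-point example are illustrative only.

```python
import numpy as np
from scipy.linalg import logm, expm

def interpolate_spd(operators, params, mu_query):
    """Interpolate SPD reduced operators at an unsampled parameter mu_query.

    operators : list of (r x r) SPD matrices pre-computed offline
    params    : 1-D array of the sampled parameter values
    """
    logs = [logm(A) for A in operators]          # map to the tangent space
    # Lagrange interpolation weights at the queried parameter point
    w = np.array([
        np.prod([(mu_query - params[j]) / (params[i] - params[j])
                 for j in range(len(params)) if j != i])
        for i in range(len(params))
    ])
    L = sum(wi * Li for wi, Li in zip(w, logs))  # interpolate in tangent space
    return expm(L)                               # map back to the manifold

# Operators sampled offline at mu = 0 and mu = 1 (toy 2x2 example)
A0 = np.diag([1.0, 2.0])
A1 = np.diag([4.0, 2.0])
A_mid = interpolate_spd([A0, A1], np.array([0.0, 1.0]), 0.5)
# For these diagonal operators the result is the geodesic midpoint diag(2, 2)
```

Interpolating logarithms rather than the operators themselves keeps the result on the SPD manifold, which is one way to realize the "consistency" the offline transformation is meant to preserve.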
NASA Technical Reports Server (NTRS)
Hopkins, Dale A.
1992-01-01
The presentation gives a partial overview of research and development underway in the Structures Division of LeRC, which collectively is referred to as the Computational Structures Technology Program. The activities in the program are diverse and encompass four major categories: (1) composite materials and structures; (2) probabilistic analysis and reliability; (3) design optimization and expert systems; and (4) computational methods and simulation. The approach of the program is comprehensive and entails: exploration of fundamental theories of structural mechanics to accurately represent the complex physics governing engine structural performance; formulation and implementation of computational techniques and integrated simulation strategies to provide accurate and efficient solutions of the governing theoretical models by exploiting the emerging advances in computer technology; and validation and verification through numerical and experimental tests to establish confidence and define the qualities and limitations of the resulting theoretical models and computational solutions. The program comprises both in-house and sponsored research activities. The remainder of the presentation provides a sample of activities to illustrate the breadth and depth of the program and to demonstrate the accomplishments and benefits that have resulted.
NASA Technical Reports Server (NTRS)
Welp, D. W.; Brown, R. A.; Ullman, D. G.; Kuhner, M. B.
1974-01-01
A computer simulation program which models a commercial short-haul aircraft operating in the civil air system was developed. The purpose of the program is to evaluate the effect of a given aircraft avionics capability on the ability of the aircraft to perform on-time carrier operations. The program outputs consist primarily of those quantities which can be used to determine direct operating costs. These include: (1) schedule reliability or delays, (2) repairs/replacements, (3) fuel consumption, and (4) cancellations. More comprehensive models of the terminal area environment were added and a simulation of an existing airline operation was conducted to obtain a form of model verification. The capability of the program to provide comparative results (sensitivity analysis) was then demonstrated by modifying the aircraft avionics capability for additional computer simulations.
Choi, Jong Soo; Lee, Jean Hyoung; Park, Jong Hwan; Nam, Han Seung; Kwon, Hyuknam; Kim, Dongsoo; Park, Seung Woo
2011-04-01
Implementing an efficient Electronic Medical Record (EMR) system is regarded as one of the key strategies for improving the quality of healthcare services. However, the lack of interoperability between medical devices and the EMR is a major barrier to deploying the EMR system in an outpatient clinical setting. The purpose of this study is to design a framework for a seamless and comprehensively integrated medical device interface system, and to develop and implement a system for accelerating the deployment of the EMR system. We designed and developed a framework that could transform data from medical devices into the relevant standards and then store them in the EMR. The framework is composed of 5 interfacing methods according to the types of medical devices utilized at an outpatient clinical setting, registered in the Samsung Medical Center (SMC) database. The medical devices used for this study were devices that have embedded microchips or that came packaged with personal computers. The devices are completely integrated with the EMR based on SMC's long-term IT strategies. The first deployment, integrating 352 medical devices into the EMR, began in April 2006 and took about 48 months. By March 2010, every medical device was interfaced with the EMR. About 66,000 medical examinations per month were performed, taking up an average of 50 GB of storage space. We surveyed users, mainly the technicians. Of the 73 who responded, 76% were strongly satisfied or satisfied, 20% were neutral, and only 4% complained about the speed of the system, which was attributed to the slow speed of the old-fashioned medical devices and computers. The current implementation of the medical device interface system based on the SMC framework significantly streamlines the clinical workflow in a satisfactory manner. © 2010 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lou, Yang; Zhou, Weimin; Matthews, Thomas P.; Appleton, Catherine M.; Anastasio, Mark A.
2017-04-01
Photoacoustic computed tomography (PACT) and ultrasound computed tomography (USCT) are emerging modalities for breast imaging. As in all emerging imaging technologies, computer-simulation studies play a critically important role in developing and optimizing the designs of hardware and image reconstruction methods for PACT and USCT. Using computer-simulations, the parameters of an imaging system can be systematically and comprehensively explored in a way that is generally not possible through experimentation. When conducting such studies, numerical phantoms are employed to represent the physical properties of the patient or object to-be-imaged that influence the measured image data. It is highly desirable to utilize numerical phantoms that are realistic, especially when task-based measures of image quality are to be utilized to guide system design. However, most reported computer-simulation studies of PACT and USCT breast imaging employ simple numerical phantoms that oversimplify the complex anatomical structures in the human female breast. We develop and implement a methodology for generating anatomically realistic numerical breast phantoms from clinical contrast-enhanced magnetic resonance imaging data. The phantoms will depict vascular structures and the volumetric distribution of different tissue types in the breast. By assigning optical and acoustic parameters to different tissue structures, both optical and acoustic breast phantoms will be established for use in PACT and USCT studies.
Systems Biology Approaches for Host–Fungal Interactions: An Expanding Multi-Omics Frontier
Culibrk, Luka; Croft, Carys A.
2016-01-01
Abstract Opportunistic fungal infections are an increasing threat for global health, and for immunocompromised patients in particular. These infections are characterized by interaction between fungal pathogen and host cells. The exact mechanisms and the attendant variability in host and fungal pathogen interaction remain to be fully elucidated. The field of systems biology aims to characterize a biological system, and utilize this knowledge to predict the system's response to stimuli such as fungal exposures. A multi-omics approach, for example, combining data from genomics, proteomics, metabolomics, would allow a more comprehensive and pan-optic “two systems” biology of both the host and the fungal pathogen. In this review and literature analysis, we present highly specialized and nascent methods for analysis of multiple -omes of biological systems, in addition to emerging single-molecule visualization techniques that may assist in determining biological relevance of multi-omics data. We provide an overview of computational methods for modeling of gene regulatory networks, including some that have been applied towards the study of an interacting host and pathogen. In sum, comprehensive characterizations of host–fungal pathogen systems are now possible, and utilization of these cutting-edge multi-omics strategies may yield advances in better understanding of both host biology and fungal pathogens at a systems scale. PMID:26885725
Dynamic computer model for the metallogenesis and tectonics of the Circum-North Pacific
Scotese, Christopher R.; Nokleberg, Warren J.; Monger, James W.H.; Norton, Ian O.; Parfenov, Leonid M.; Khanchuk, Alexander I.; Bundtzen, Thomas K.; Dawson, Kenneth M.; Eremin, Roman A.; Frolov, Yuri F.; Fujita, Kazuya; Goryachev, Nikolai A.; Pozdeev, Anany I.; Ratkin, Vladimir V.; Rodinov, Sergey M.; Rozenblum, Ilya S.; Scholl, David W.; Shpikerman, Vladimir I.; Sidorov, Anatoly A.; Stone, David B.
2001-01-01
The digital files in this report consist of a dynamic computer model of the metallogenesis and tectonics of the Circum-North Pacific, and background articles, figures, and maps. The tectonic part of the dynamic computer model is derived from a major analysis of the tectonic evolution of the Circum-North Pacific, which is also contained in directory tectevol. The dynamic computer model and associated materials on this CD-ROM are part of a project on the major mineral deposits, metallogenesis, and tectonics of the Russian Far East, Alaska, and the Canadian Cordillera. The project provides critical information on bedrock geology and geophysics, tectonics, major metalliferous mineral resources, metallogenic patterns, and crustal origin and evolution of mineralizing systems for this region. The major scientific goals and benefits of the project are to: (1) provide a comprehensive international database on the mineral resources of the region that constitutes the first extensive body of knowledge available in English; (2) provide major new interpretations of the origin and crustal evolution of mineralizing systems and their host rocks, thereby enabling enhanced, broad-scale tectonic reconstructions and interpretations; and (3) promote trade and scientific and technical exchanges between North America and Eastern Asia.
TopoMS: Comprehensive topological exploration for molecular and condensed-matter systems.
Bhatia, Harsh; Gyulassy, Attila G; Lordi, Vincenzo; Pask, John E; Pascucci, Valerio; Bremer, Peer-Timo
2018-06-15
We introduce TopoMS, a computational tool enabling detailed topological analysis of molecular and condensed-matter systems, including the computation of atomic volumes and charges through the quantum theory of atoms in molecules, as well as the complete molecular graph. With roots in techniques from computational topology, and using a shared-memory parallel approach, TopoMS provides scalable, numerically robust, and topologically consistent analysis. TopoMS can be used as a command-line tool or with a GUI (graphical user interface), where the latter also enables an interactive exploration of the molecular graph. This paper presents algorithmic details of TopoMS and compares it with state-of-the-art tools: Bader charge analysis v1.0 (Arnaldsson et al., 01/11/17) and molecular graph extraction using Critic2 (Otero-de-la-Roza et al., Comput. Phys. Commun. 2014, 185, 1007). TopoMS not only combines the functionality of these individual codes but also demonstrates up to 4× performance gain on a standard laptop, faster convergence to fine-grid solution, robustness against lattice bias, and topological consistency. TopoMS is released publicly under BSD License. © 2018 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ali, Syed Firasat; Hajek, Brian K.; Usman, Shoaib
The report emphasizes a smooth transition from paper-based procedure systems (PBPSs) to computer-based procedure systems (CBPSs) for the existing commercial nuclear power plants in the U.S. The expected advantages of the transition are discussed, including continued safe and efficient operation of the plants under their recently acquired or desired extended licenses. The report proposes a three-stage survey to aid in developing a national strategic plan for the transition from PBPSs to CBPSs. It also includes a comprehensive questionnaire that can be readily used for the first stage of the suggested survey.
NASA Technical Reports Server (NTRS)
Kerr, Andrew W.
1989-01-01
Programs related to rotorcraft aeromechanics and man-machine integration are discussed which will support advanced army rotorcraft design. In aeromechanics, recent advances in computational fluid dynamics will be used to characterize the complex unsteady flowfields of rotorcraft, and a second-generation comprehensive helicopter analysis system will be used along with models of aerodynamics, engines, and control systems to study the structural dynamics of rotor/body configurations. The man-machine integration program includes the development of advanced cockpit design technology and the evaluation of cockpit and mission equipment concepts in a real-time full-combat environment.
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
Reaction-mediated entropic effect on phase separation in a binary polymer system
NASA Astrophysics Data System (ADS)
Sun, Shujun; Guo, Miaocai; Yi, Xiaosu; Zhang, Zuoguang
2017-10-01
We present a computer simulation study of the phase separation behavior induced by polymerization in a binary system comprising polymer chains and reactive monomers. We examined the influence of the interaction parameter between components and of the monomer concentration on the reaction-induced phase separation. The simulation results demonstrate that increasing the interaction parameter (the enthalpic effect) accelerates phase separation, while the entropic effect plays a key role in the process of phase separation. Furthermore, scanning electron microscopy observations show morphologies identical to those found in the theoretical simulations. This study may enrich our comprehension of phase separation in polymer mixtures.
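The "interaction parameter" here is conventionally the Flory-Huggins χ. As an illustration of the enthalpic/entropic balance (a standard mean-field textbook model, not the paper's simulation), the following sketch evaluates the free energy of mixing for a polymer/solvent binary and the spinodal condition separating stable from unstable compositions:

```python
import numpy as np

def f_mix(phi, chi, N=1):
    """Flory-Huggins free energy of mixing per lattice site (units of kT)
    for a polymer of length N at volume fraction phi in solvent."""
    return phi / N * np.log(phi) + (1 - phi) * np.log(1 - phi) + chi * phi * (1 - phi)

def chi_spinodal(phi, N=1):
    """Interaction parameter on the spinodal, where d2f/dphi2 = 0:
    mixtures with chi above this value are locally unstable and demix."""
    return 0.5 * (1.0 / (N * phi) + 1.0 / (1 - phi))

# For a symmetric small-molecule mixture (N = 1) the critical point is
# phi = 0.5 with chi_c = 2; longer chains (larger N) lower the entropy of
# mixing, so a smaller chi already drives phase separation.
print(chi_spinodal(0.5, N=1))    # chi_c for N = 1
print(chi_spinodal(0.5, N=100))  # smaller threshold for long chains
```

The shrinking spinodal threshold with growing N mirrors the reaction-induced mechanism in the abstract: as polymerization lengthens the chains, the entropic contribution weakens and the mixture is pushed into the unstable region.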
A Comprehensive Toolset for General-Purpose Private Computing and Outsourcing
2016-12-08
project and scientific advances made towards each of the research thrusts throughout the project duration. 1 Project Objectives Cloud computing enables...possibilities that the cloud enables is computation outsourcing, when the client can utilize any necessary computing resources for its computational task...Security considerations, however, stand in the way of harnessing the benefits of cloud computing to the fullest extent and prevent clients from
Computerizing an integrated clinical and financial record system in a CMHC: a pilot project.
Newkham, J; Bawcom, L
1981-01-01
The authors describe the three-year experience of a mid-sized community mental health center in designing and installing an automated Staff/Management Information System (S/MIS). The purpose of the project, piloted at the Heart of Texas Region Mental Health Mental Retardation Center (HOTRMHMR) in Waco, Texas, was to examine the feasibility of a comprehensive data system operating at a local level which would create an effective audit trail for services and reimbursement and serve as a viable mechanism for the transmission of center data to a state system via computer tapes. Included in the discussion are agency philosophy, costs, management attitudes, the design and implementation process, and special features which evolved from the fully integrated system.
Fei, Ding-Yu; Zhao, Xiaoming; Boanca, Cosmin; Hughes, Esther; Bai, Ou; Merrell, Ronald; Rafiq, Azhar
2010-07-01
To design and test an embedded biomedical sensor system that can monitor astronauts' comprehensive physiological parameters and provide real-time data display during extravehicular activities (EVA) in space exploration. An embedded system was developed with an array of biomedical sensors that can be integrated into the spacesuit. Wired communications were tested for physiological data acquisition and data transmission to a computer mounted on the spacesuit during task performances simulating EVA sessions. The sensor integration, data collection and communication, and the real-time data monitoring were successfully validated in the NASA field tests. The developed system may serve as an embedded system for monitoring health status during long-term space missions. Copyright 2010 Elsevier Ltd. All rights reserved.
An overview of the NASA electronic components information management system
NASA Technical Reports Server (NTRS)
Kramer, G.; Waterbury, S.
1991-01-01
The NASA Parts Project Office (NPPO) comprehensive data system to support all NASA Electric, Electronic, and Electromechanical (EEE) parts management and technical data requirements is described. A phased delivery approach is adopted, comprising four principal phases. Phases 1 and 2 support Space Station Freedom (SSF) and use a centralized architecture with all data and processing kept on a mainframe computer. Phases 3 and 4 support all NASA centers and projects and implement a distributed system architecture, in which data and processing are shared among networked database servers. The Phase 1 system, which became operational in February 1990, implements a core set of functions. Phase 2, scheduled for release in 1991, adds functions to the Phase 1 system. Phase 3, to be prototyped beginning in 1991 and delivered in 1992, introduces a distributed system, separate from the Phase 1 and 2 system, with a refined semantic data model. Phase 4 extends the data model and functionality of the Phase 3 system to provide support for the NASA design community, including integration with Computer Aided Design (CAD) environments. Phase 4 is scheduled for prototyping in 1992-93 and delivery in 1994.
MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models
Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines
2016-08-03
Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
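The constraint-based modeling the toolbox builds on can be illustrated independently of the Matlab code with a toy flux balance analysis (a hypothetical three-reaction network, not part of MetaboTools): maximize a biomass flux subject to the steady-state mass balance S v = 0 and flux bounds.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network around a single metabolite A:
#   R1: (uptake) -> A     flux capped at 10 units
#   R2: A -> biomass      the objective flux
#   R3: A -> (export)
S = np.array([[1.0, -1.0, -1.0]])         # stoichiometry of A per reaction
bounds = [(0, 10), (0, None), (0, None)]  # lower/upper flux bounds
c = np.array([0.0, -1.0, 0.0])            # maximize v2; linprog minimizes

res = linprog(c, A_eq=S, b_eq=[0.0], bounds=bounds)
# The optimum routes all uptake through biomass: v = [10, 10, 0]
print(res.x)
```

Measured extracellular metabolite exchanges enter such models by tightening the bounds on uptake and secretion reactions, which is the integration step the protocol formalizes.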
Inferring cortical function in the mouse visual system through large-scale systems neuroscience.
Hawrylycz, Michael; Anastassiou, Costas; Arkhipov, Anton; Berg, Jim; Buice, Michael; Cain, Nicholas; Gouwens, Nathan W; Gratiy, Sergey; Iyer, Ramakrishnan; Lee, Jung Hoon; Mihalas, Stefan; Mitelut, Catalin; Olsen, Shawn; Reid, R Clay; Teeter, Corinne; de Vries, Saskia; Waters, Jack; Zeng, Hongkui; Koch, Christof
2016-07-05
The scientific mission of the Project MindScope is to understand neocortex, the part of the mammalian brain that gives rise to perception, memory, intelligence, and consciousness. We seek to quantitatively evaluate the hypothesis that neocortex is a relatively homogeneous tissue, with smaller functional modules that perform a common computational function replicated across regions. We here focus on the mouse as a mammalian model organism with genetics, physiology, and behavior that can be readily studied and manipulated in the laboratory. We seek to describe the operation of cortical circuitry at the computational level by comprehensively cataloging and characterizing its cellular building blocks along with their dynamics and their cell type-specific connectivities. The project is also building large-scale experimental platforms (i.e., brain observatories) to record the activity of large populations of cortical neurons in behaving mice subject to visual stimuli. A primary goal is to understand the series of operations from visual input in the retina to behavior by observing and modeling the physical transformations of signals in the corticothalamic system. We here focus on the contribution that computer modeling and theory make to this long-term effort.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boris, J.P.; Picone, J.M.; Lambrakos, S.G.
The Surveillance, Correlation, and Tracking (SCAT) problem is the computation-limited kernel of future battle-management systems currently being developed, for example, under the Strategic Defense Initiative (SDI). This report shows how high-performance SCAT can be performed in this decade. Estimates suggest that an increase by a factor of at least one thousand in computational capacity will be necessary to track 10^5 SDI objects in real time. This large improvement is needed because standard algorithms for data organization in important segments of the SCAT problem scale as N^2 and N^3, where N is the number of perceived objects. It is shown that the required speed-up factor can now be achieved because of two new developments: 1) a heterogeneous element supercomputer system based on available parallel-processing technology can account for over one order of magnitude of performance improvement today over existing supercomputers; and 2) algorithmic innovations developed recently by the NRL Laboratory for Computational Physics will account for another two orders of magnitude of improvement. Based on these advances, a comprehensive, high-performance kernel for a simulator/system to perform the SCAT portion of SDI battle management is described.
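The N^2 scaling of all-pairs correlation can be seen in a generic spatial-hashing sketch (a textbook technique, not the NRL algorithm): binning perceived objects into grid cells restricts distance checks to neighboring cells, so the work grows roughly linearly with N at fixed object density.

```python
from collections import defaultdict

def build_grid(points, cell):
    """Hash 2-D points into square grid cells of side `cell`."""
    grid = defaultdict(list)
    for i, (x, y) in enumerate(points):
        grid[(int(x // cell), int(y // cell))].append(i)
    return grid

def neighbors_within(points, r):
    """All index pairs closer than r, checking only the 3x3 block of
    cells around each occupied cell instead of all N*(N-1)/2 pairs."""
    grid = build_grid(points, r)
    pairs = set()
    for (cx, cy), idxs in grid.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in grid.get((cx + dx, cy + dy), []):
                    for i in idxs:
                        if i < j:
                            (x1, y1), (x2, y2) = points[i], points[j]
                            if (x1 - x2) ** 2 + (y1 - y2) ** 2 < r * r:
                                pairs.add((i, j))
    return pairs

pts = [(0.0, 0.0), (0.5, 0.0), (5.0, 5.0), (5.2, 5.1)]
print(neighbors_within(pts, 1.0))  # finds the pairs (0, 1) and (2, 3)
```

Only nearby objects are ever compared, which is the kind of data-organization change that turns an N^2 correlation pass into one that is tractable for 10^5 tracked objects.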
Computers and the Primary Curriculum 3-13.
ERIC Educational Resources Information Center
Crompton, Rob, Ed.
This book is a comprehensive and practical guide to the use of computers across a wide age range. Extensive use is made of photographs, illustrations, cartoons, and samples of children's work to demonstrate the versatility of computer use in schools. An introduction by Rob Crompton placing computer use within the educational context of the United…
Computer Assisted Language Learning. Routledge Studies in Computer Assisted Language Learning
ERIC Educational Resources Information Center
Pennington, Martha
2011-01-01
Computer-assisted language learning (CALL) is an approach to language teaching and learning in which computer technology is used as an aid to the presentation, reinforcement and assessment of material to be learned, usually including a substantial interactive element. This book provides an up-to-date and comprehensive overview of…
ERIC Educational Resources Information Center
San Marcos Unified School District, CA.
After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…
Effects of Strength of Accent on an L2 Interactive Lecture Listening Comprehension Test
ERIC Educational Resources Information Center
Ockey, Gary J.; Papageorgiou, Spiros; French, Robert
2016-01-01
This article reports on a study which aimed to determine the effect of strength of accent on listening comprehension of interactive lectures. Test takers (N = 21,726) listened to an interactive lecture given by one of nine speakers and responded to six comprehension items. The test taker responses were analyzed with the Rasch computer program…
SIMWEST - A simulation model for wind energy storage systems
NASA Technical Reports Server (NTRS)
Edsinger, R. W.; Warren, A. W.; Gordon, L. H.; Chang, G. C.
1978-01-01
This paper describes a comprehensive and efficient computer program for the modeling of wind energy systems with storage. The level of detail of SIMWEST (SImulation Model for Wind Energy STorage) is consistent with evaluating the economic feasibility as well as the general performance of wind energy systems with energy storage options. The software package consists of two basic programs and a library of system, environmental, and control components. The first program is a precompiler which allows the library components to be put together in building block form. The second program performs the technoeconomic system analysis with the required input/output, and the integration of system dynamics. An example of the application of the SIMWEST program to a current 100 kW wind energy storage system is given.
Coupled CFD/CSD Analysis of an Active-Twist Rotor in a Wind Tunnel with Experimental Validation
NASA Technical Reports Server (NTRS)
Massey, Steven J.; Kreshock, Andrew R.; Sekula, Martin K.
2015-01-01
An unsteady Reynolds averaged Navier-Stokes analysis loosely coupled with a comprehensive rotorcraft code is presented for a second-generation active-twist rotor. High fidelity Navier-Stokes results for three configurations: an isolated rotor, a rotor with fuselage, and a rotor with fuselage mounted in a wind tunnel, are compared to lifting-line theory based comprehensive rotorcraft code calculations and wind tunnel data. Results indicate that CFD/CSD predictions of flapwise bending moments are in good agreement with wind tunnel measurements for configurations with a fuselage, and that modeling the wind tunnel environment does not significantly enhance computed results. Actuated rotor results for the rotor with fuselage configuration are also validated for predictions of vibratory blade loads and fixed-system vibratory loads. Varying levels of agreement with wind tunnel measurements are observed for blade vibratory loads, depending on the load component (flap, lag, or torsion) and the harmonic being examined. Predicted trends in fixed-system vibratory loads are in good agreement with wind tunnel measurements.
NASA Technical Reports Server (NTRS)
Chen, Wei; Tsui, Kwok-Leung; Allen, Janet K.; Mistree, Farrokh
1994-01-01
In this paper we introduce a comprehensive and rigorous robust design procedure to overcome some limitations of the current approaches. A comprehensive approach is general enough to model the two major types of robust design applications, namely, robust design associated with the minimization of the deviation of performance caused by the deviation of noise factors (uncontrollable parameters), and robust design due to the minimization of the deviation of performance caused by the deviation of control factors (design variables). We achieve mathematical rigor by using, as a foundation, principles from the design of experiments and optimization. Specifically, we integrate the Response Surface Method (RSM) with the compromise Decision Support Problem (DSP). Our approach is especially useful for design problems where there are no closed-form solutions and system performance is computationally expensive to evaluate. The design of a solar powered irrigation system is used as an example. Our focus in this paper is on illustrating our approach rather than on the results per se.
Anima: Modular Workflow System for Comprehensive Image Data Analysis
Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa
2014-01-01
Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and pre-processing, through segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541
NASA Astrophysics Data System (ADS)
Torres Irribarra, D.; Freund, R.; Fisher, W.; Wilson, M.
2015-02-01
Computer-based, online assessments modelled, designed, and evaluated for adaptively administered invariant measurement are uniquely suited to defining and maintaining traceability to standardized units in education. An assessment of this kind is embedded in the Assessing Data Modeling and Statistical Reasoning (ADM) middle school mathematics curriculum. Diagnostic information about middle school students' learning of statistics and modeling is provided via computer-based formative assessments for seven constructs that comprise a learning progression for statistics and modeling from late elementary through the middle school grades. The seven constructs are: Data Display, Meta-Representational Competence, Conceptions of Statistics, Chance, Modeling Variability, Theory of Measurement, and Informal Inference. The end product is a web-delivered system built with Ruby on Rails for use by curriculum development teams working with classroom teachers in designing, developing, and delivering formative assessments. The online accessible system allows teachers to accurately diagnose students' unique comprehension and learning needs in a common language of real-time assessment, logging, analysis, feedback, and reporting.
DataView: a computational visualisation system for multidisciplinary design and analysis
NASA Astrophysics Data System (ADS)
Wang, Chengen
2016-01-01
Rapidly processing raw data and effectively extracting the underlying information from huge volumes of multivariate data have become essential to decision-making processes in sectors such as finance, government, medical care, climate analysis, industry and science. Remarkably, visualisation is recognised as a fundamental technology that supports human comprehension, cognition and utilisation of burgeoning amounts of heterogeneous data. This paper presents a computational visualisation system, named DataView, which has been developed for graphically displaying and capturing the outcomes of the multiphysics problem-solvers widely used in engineering fields. DataView is functionally composed of techniques for table/diagram representation and for the graphical illustration of scalar, vector and tensor fields. The field visualisation techniques are implemented on the basis of a range of linear and non-linear meshes, which flexibly adapt to the disparate data representation schemas adopted by a variety of disciplinary problem-solvers. The visualisation system has been successfully applied to a number of engineering problems, some of which are presented here to demonstrate the effectiveness of the visualisation techniques.
FY02 CBNP Annual Report Input: Bioinformatics Support for CBNP Research and Deployments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slezak, T; Wolinsky, M
2002-10-31
The events of FY01 dynamically reprogrammed the objectives of the CBNP bioinformatics support team to meet rapidly changing Homeland Defense needs and requests from other agencies for assistance: use computational techniques to determine potential unique DNA signature candidates for microbial and viral pathogens of interest to CBNP researchers and to our collaborating partner agencies, such as the Centers for Disease Control and Prevention (CDC), U.S. Department of Agriculture (USDA), Department of Defense (DOD), and Food and Drug Administration (FDA); develop effective electronic screening measures for DNA signatures to reduce the cost and time of wet-bench screening; build a comprehensive system for tracking the development and testing of DNA signatures; build a chain-of-custody sample tracking system for field deployment of the DNA signatures as part of the BASIS project; and provide computational tools for use by CBNP Biological Foundations researchers.
Monolithic silicon-photonic platforms in state-of-the-art CMOS SOI processes [Invited].
Stojanović, Vladimir; Ram, Rajeev J; Popović, Milos; Lin, Sen; Moazeni, Sajjad; Wade, Mark; Sun, Chen; Alloatti, Luca; Atabaki, Amir; Pavanello, Fabio; Mehta, Nandish; Bhargava, Pavan
2018-05-14
Integrating photonics with advanced electronics leverages transistor performance, process fidelity and package integration, to enable a new class of systems-on-a-chip for a variety of applications ranging from computing and communications to sensing and imaging. Monolithic silicon photonics is a promising solution to meet the energy efficiency, sensitivity, and cost requirements of these applications. In this review paper, we take a comprehensive view of the performance of the silicon-photonic technologies developed to date for photonic interconnect applications. We also present the latest performance and results of our "zero-change" silicon photonics platforms in 45 nm and 32 nm SOI CMOS. The results indicate that the 45 nm and 32 nm processes provide a "sweet-spot" for adding photonic capability and enhancing integrated system applications beyond the Moore-scaling, while being able to offload major communication tasks from more deeply-scaled compute and memory chips without complicated 3D integration approaches.
Building a virtual ligand screening pipeline using free software: a survey.
Glaab, Enrico
2016-03-01
Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. © The Author 2015. Published by Oxford University Press.
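Resources like these are typically chained into stages, and a common early stage in such a pipeline is a cheap property-based filter applied before expensive docking. Below is a minimal sketch of one standard filter, Lipinski's rule of five, assuming the molecular descriptors have already been computed by an upstream tool such as Open Babel or RDKit (the compound names and values are invented for illustration):

```python
def passes_rule_of_five(mw, logp, h_donors, h_acceptors):
    """Lipinski's rule of five: flag compounds likely to be orally bioavailable."""
    violations = sum([
        mw > 500,          # molecular weight in Da
        logp > 5,          # octanol-water partition coefficient
        h_donors > 5,      # hydrogen-bond donors
        h_acceptors > 10,  # hydrogen-bond acceptors
    ])
    return violations <= 1  # commonly, at most one violation is tolerated

# Hypothetical pre-computed descriptors for two candidate compounds.
library = {
    "cmpd_A": dict(mw=320.4, logp=2.1, h_donors=2, h_acceptors=5),
    "cmpd_B": dict(mw=612.7, logp=6.3, h_donors=4, h_acceptors=11),
}
hits = [name for name, props in library.items() if passes_rule_of_five(**props)]
print(hits)  # ['cmpd_A'] -- cmpd_B violates three of the four criteria
```

Compounds surviving such a filter would then be passed to docking and scoring tools further down the pipeline.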
Computational knowledge integration in biopharmaceutical research.
Ficenec, David; Osborne, Mark; Pradines, Joel; Richards, Dan; Felciano, Ramon; Cho, Raymond J; Chen, Richard O; Liefeld, Ted; Owen, James; Ruttenberg, Alan; Reich, Christian; Horvath, Joseph; Clark, Tim
2003-09-01
An initiative to increase biopharmaceutical research productivity by capturing, sharing and computationally integrating proprietary scientific discoveries with public knowledge is described. This initiative involves both organisational process change and multiple interoperating software systems. The software components rely on mutually supporting integration techniques. These include a richly structured ontology, statistical analysis of experimental data against stored conclusions, natural language processing of public literature, secure document repositories with lightweight metadata, web services integration, enterprise web portals and relational databases. This approach has already begun to increase scientific productivity in our enterprise by creating an organisational memory (OM) of internal research findings, accessible on the web. Through bringing together these components it has also been possible to construct a very large and expanding repository of biological pathway information linked to this repository of findings which is extremely useful in analysis of DNA microarray data. This repository, in turn, enables our research paradigm to be shifted towards more comprehensive systems-based understandings of drug action.
NASA Astrophysics Data System (ADS)
Tadokoro, Satoshi; Kitano, Hiroaki; Takahashi, Tomoichi; Noda, Itsuki; Matsubara, Hitoshi; Shinjoh, Atsushi; Koto, Tetsuo; Takeuchi, Ikuo; Takahashi, Hironao; Matsuno, Fumitoshi; Hatayama, Mitsunori; Nobe, Jun; Shimada, Susumu
2000-07-01
This paper introduces the RoboCup-Rescue Simulation Project, a contribution to the disaster mitigation, search, and rescue problem. A comprehensive urban disaster simulator is constructed on distributed computers. Heterogeneous intelligent agents such as fire fighters, victims, and volunteers conduct search and rescue activities in this virtual disaster world. A real-world interface integrates the various sensor systems and infrastructure controllers of real cities with this virtual world. Real-time simulation is synchronized with actual disasters, computing the complex relationships between various damage factors and agent behaviors. A mission-critical man-machine interface provides portability and robustness for disaster mitigation centers, and augmented-reality interfaces for rescue in real disasters. It also provides a virtual-reality training function for the public. This diverse spectrum of RoboCup-Rescue contributes to the creation of a safer social system.
Improved Force Fields for Peptide Nucleic Acids with Optimized Backbone Torsion Parameters.
Jasiński, Maciej; Feig, Michael; Trylska, Joanna
2018-06-06
Peptide nucleic acids are promising nucleic acid analogs for antisense therapies as they can form stable duplex and triplex structures with DNA and RNA. Computational studies of PNA-containing duplexes and triplexes are an important component for guiding their design, yet existing force fields have not been well validated and parametrized with modern computational capabilities. We present updated CHARMM and Amber force fields for PNA that greatly improve the stability of simulated PNA-containing duplexes and triplexes in comparison with experimental structures and allow such systems to be studied on microsecond time scales. The force field modifications focus on reparametrized PNA backbone torsion angles to match high-level quantum mechanics reference energies for a model compound. The microsecond simulations of PNA-PNA, PNA-DNA, PNA-RNA, and PNA-DNA-PNA complexes also allowed a comprehensive analysis of hydration and ion interactions with such systems.
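The reparametrization strategy described here, fitting backbone torsion terms against quantum-mechanical reference energies, can be illustrated in miniature. The sketch below fits the force constants of a CHARMM/Amber-style cosine series to a synthetic torsion energy profile by linear least squares; the profile and periodicities are invented stand-ins, not the paper's actual QM data for the PNA model compound:

```python
import numpy as np

# Illustrative stand-in for QM reference energies (kcal/mol) scanned over a
# dihedral angle; real parametrization would use high-level QM scans.
phi = np.deg2rad(np.arange(0, 360, 15))
target = 1.5 * (1 + np.cos(phi)) + 0.4 * (1 + np.cos(3 * phi))

# Dihedral term V(phi) = sum_n k_n * (1 + cos(n*phi)), phases fixed at 0 for
# simplicity. The model is linear in k_n, so a linear least-squares fit works.
periodicities = [1, 2, 3]
A = np.column_stack([1 + np.cos(n * phi) for n in periodicities])
k, *_ = np.linalg.lstsq(A, target, rcond=None)
print(dict(zip(periodicities, np.round(k, 3))))  # recovers k1 = 1.5, k3 = 0.4
```

In practice the fit is iterated against condensed-phase simulation results as well, since gas-phase QM energies alone do not guarantee stable duplex and triplex behavior.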
User's manual for the BNW-II optimization code for dry/wet-cooled power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braun, D.J.; Bamberger, J.A.; Braun, D.J.
1978-05-01
The User's Manual describes how to operate BNW-II, a computer code developed by the Pacific Northwest Laboratory (PNL) as part of its activities under the Department of Energy (DOE) Dry Cooling Enhancement Program. The computer program offers a comprehensive method of evaluating the cost savings potential of dry/wet-cooled heat rejection systems. Going beyond simple "figure-of-merit" cooling tower optimization, this method includes such items as the cost of annual replacement capacity and the optimum split between plant scale-up and replacement capacity, as well as the purchase and operating costs of all major heat rejection components. Hence the BNW-II code is a useful tool for determining potential cost savings of new dry/wet surfaces, new piping, or other components as part of an optimized system for a dry/wet-cooled plant.
Medem, Anna V; Seidling, Hanna M; Eichler, Hans-Georg; Kaltschmidt, Jens; Metzner, Michael; Hubert, Carina M; Czock, David; Haefeli, Walter E
2017-05-01
Electronic clinical decision support systems (CDSS) require drug information that can be processed by computers. The goal of this project was to determine and evaluate a compilation of variables that comprehensively capture the information contained in the summary of product characteristic (SmPC) and unequivocally describe the drug, its dosage options, and clinical pharmacokinetics. An expert panel defined and structured a set of variables and drafted a guideline to extract and enter information on dosage and clinical pharmacokinetics from textual SmPCs as published by the European Medicines Agency (EMA). The set of variables was iteratively revised and evaluated by data extraction and variable allocation of roughly 7% of all centrally approved drugs. The information contained in the SmPC was allocated to three information clusters consisting of 260 variables. The cluster "drug characterization" specifies the nature of the drug. The cluster "dosage" provides information on approved drug dosages and defines corresponding specific conditions. The cluster "clinical pharmacokinetics" includes pharmacokinetic parameters of relevance for dosing in clinical practice. A first evaluation demonstrated that, despite the complexity of the current free text SmPCs, dosage and pharmacokinetic information can be reliably extracted from the SmPCs and comprehensively described by a limited set of variables. By proposing a compilation of variables well describing drug dosage and clinical pharmacokinetics, the project represents a step forward towards the development of a comprehensive database system serving as information source for sophisticated CDSS.
NASA Astrophysics Data System (ADS)
Hoepfer, Matthias
Over the last two decades, computer modeling and simulation have evolved into the tools of choice for the design and engineering of dynamic systems. With increasing system complexity, modeling and simulation become essential enablers for the design of new systems. Among the advantages that modeling- and simulation-based system design offers are the replacement of physical tests to ensure product performance, reliability, and quality; the shortening of design cycles due to the reduced need for physical prototyping; design for mission scenarios; the incorporation of technologies that do not yet exist; and the reduction of technological and financial risks. Traditionally, dynamic systems are modeled in a monolithic way. Such monolithic models include all the data, relations, and equations necessary to represent the underlying system. With the increasing complexity of these models, the monolithic approach reaches certain limits regarding, for example, model handling and maintenance. Furthermore, while available computer power has been steadily increasing according to Moore's law (a doubling in computational power roughly every two years), the ever-increasing complexity of new models has negated the increased resources available. Lastly, modern systems and design processes are interdisciplinary, making it necessary for models to be flexible enough to incorporate different modeling and design approaches. One solution that bypasses the shortcomings of monolithic models is co-simulation. In a very general sense, co-simulation addresses the issue of linking different dynamic sub-models into a model that represents the overall, integrated dynamic system. It is therefore an important enabler for the design of interdisciplinary, interconnected, highly complex dynamic systems. While a basic co-simulation setup can be very easy, complications can arise when sub-models display behaviors such as algebraic loops, singularities, or constraints.
This work frames the co-simulation approach to modeling and simulation. It lays out the general approach to dynamic system co-simulation, and gives a comprehensive overview of what co-simulation is and what it is not. It creates a taxonomy of the requirements and limits of co-simulation, and the issues arising with co-simulating sub-models. Possible solutions towards resolving the stated problems are investigated to a certain depth. A particular focus is given to the issue of time stepping. It will be shown that for dynamic models, the selection of the simulation time step is a crucial issue with respect to computational expense, simulation accuracy, and error control. The reasons for this are discussed in depth, and a time stepping algorithm for co-simulation with unknown dynamic sub-models is proposed. Motivations and suggestions for the further treatment of selected issues are presented.
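The coupling and time-stepping issues discussed above can be made concrete with a toy example. The sketch below is an assumed Jacobi-type co-simulation of two sub-models that exchange outputs only at communication points and hold the partner's last known output constant in between; it is not the thesis's algorithm, but it shows why the communication step size governs the coupling error:

```python
import math

def simulate(macro_dt, n_macro, micro_steps=10):
    # Coupled test system: sub-model A owns x (x' = -y), sub-model B owns y
    # (y' = x); the exact solution is x(t) = cos(t), y(t) = sin(t).
    x, y = 1.0, 0.0
    for _ in range(n_macro):
        x_in, y_in = x, y              # values exchanged at the communication point
        h = macro_dt / micro_steps
        for _ in range(micro_steps):   # sub-model A integrates with frozen y_in
            x += h * (-y_in)
        for _ in range(micro_steps):   # sub-model B integrates with frozen x_in
            y += h * x_in
        # (an adaptive scheme would pick the next macro_dt from an error estimate)
    return x, y

# Shrinking the communication step reduces the coupling error at t = 2.0:
err = lambda dt, n: abs(simulate(dt, n)[0] - math.cos(2.0))
print(err(0.5, 4) > err(0.005, 400))  # True
```

Because each sub-model extrapolates stale inputs across the whole macro step, accuracy is limited by the communication step regardless of how finely each sub-model integrates internally, which is exactly why time-step selection and error control are central to co-simulation.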
Process and representation in graphical displays
NASA Technical Reports Server (NTRS)
Gillan, Douglas J.; Lewis, Robert; Rudisill, Marianne
1993-01-01
Our initial model of graph comprehension has focused on statistical graphs. Like other models of human-computer interaction, models of graph comprehension can be used by human-computer interface designers and developers to create interfaces that present information in an efficient and usable manner. Our investigation of graph comprehension addresses two primary questions: how do people represent the information contained in a data graph, and how do they process information from the graph? The topics of focus for graphic representation concern the features into which people decompose a graph and the representations of the graph in memory. The issue of processing can be further analyzed as two questions: what overall processing strategies do people use, and what specific processing skills are required?
Comprehensive analysis of a medication dosing error related to CPOE.
Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L
2005-01-01
This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and interface usability inspection of the ordering system. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in interaction among human and system agents that methods limited in scope to their distinct analytical domains would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.
A Plan for Community College Instructional Computing.
ERIC Educational Resources Information Center
Howard, Alan; And Others
This document presents a comprehensive plan for future growth in instructional computing in the Washington community colleges. Two chapters define the curriculum objectives and content recommended for instructional courses in the community colleges which require access to computing facilities. The courses described include data processing…
Human Expertise Helps Computer Classify Images
NASA Technical Reports Server (NTRS)
Rorvig, Mark E.
1991-01-01
Two-domain method of computational classification of images requires less computation than other methods for computational recognition, matching, or classification of images or patterns. Does not require explicit computational matching of features, and incorporates human expertise without requiring translation of mental processes of classification into language comprehensible to computer. Conceived to "train" computer to analyze photomicrographs of microscope-slide specimens of leucocytes from human peripheral blood to distinguish between specimens from healthy and specimens from traumatized patients.
[Computerized system validation of clinical researches].
Yan, Charles; Chen, Feng; Xia, Jia-lai; Zheng, Qing-shan; Liu, Daniel
2015-11-01
Validation is a documented process that provides a high degree of assurance that a computer system does exactly and consistently what it is designed to do, in a controlled manner, throughout its life cycle. The validation process begins with the system proposal/requirements definition, and continues through application and maintenance until system retirement and retention of the e-records according to regulatory rules. The objective is to clearly demonstrate that each application of information technology fulfills its purpose. Computer system validation (CSV) is essential in clinical studies under the GCP standard, ensuring that the system meets its pre-determined attributes of specifications, quality, safety, and traceability. This paper describes how to perform the validation process and identify the relevant stakeholders within an organization in light of validation SOPs. Although specific accountabilities in the implementation of the validation process may be outsourced, ultimate responsibility for the CSV remains with the business process owner, the sponsor. To show that compliance of the system validation has been properly attained, it is essential to set up comprehensive validation procedures and maintain adequate documentation as well as training records. The quality of system validation should be controlled using both QC and QA means.
Computer-based diagnostic expert systems in rheumatology: where do we stand in 2014?
Alder, Hannes; Michel, Beat A; Marx, Christian; Tamborrini, Giorgio; Langenegger, Thomas; Bruehlmann, Pius; Steurer, Johann; Wildi, Lukas M
2014-01-01
Background. The early detection of rheumatic diseases and the treatment to target have become of utmost importance to control the disease and improve its prognosis. However, establishing a diagnosis in early stages is challenging as many diseases initially present with similar symptoms and signs. Expert systems are computer programs designed to support the human decision making and have been developed in almost every field of medicine. Methods. This review focuses on the developments in the field of rheumatology to give a comprehensive insight. Medline, Embase, and Cochrane Library were searched. Results. Reports of 25 expert systems with different design and field of application were found. The performance of 19 of the identified expert systems was evaluated. The proportion of correctly diagnosed cases was between 43.1 and 99.9%. Sensitivity and specificity ranged from 62 to 100 and 88 to 98%, respectively. Conclusions. Promising diagnostic expert systems with moderate to excellent performance were identified. The validation process was in general underappreciated. None of the systems, however, seemed to have succeeded in daily practice. This review identifies optimal characteristics to increase the survival rate of expert systems and may serve as valuable information for future developments in the field.
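For reference, the performance figures reported in the review derive from standard confusion-matrix arithmetic. A minimal sketch, with invented counts rather than data from any of the reviewed systems:

```python
# Sensitivity and specificity computed from a confusion matrix
# (tp/fn/tn/fp counts below are invented for illustration).
def sensitivity_specificity(tp, fn, tn, fp):
    sens = tp / (tp + fn)  # fraction of diseased cases correctly identified
    spec = tn / (tn + fp)  # fraction of healthy cases correctly identified
    return sens, spec

sens, spec = sensitivity_specificity(tp=62, fn=38, tn=88, fp=12)
print(sens, spec)  # 0.62 0.88
```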
Kellmeyer, Philipp; Cochrane, Thomas; Müller, Oliver; Mitchell, Christine; Ball, Tonio; Fins, Joseph J; Biller-Andorno, Nikola
2016-10-01
Closed-loop medical devices such as brain-computer interfaces are an emerging and rapidly advancing neurotechnology. The target patients for brain-computer interfaces (BCIs) are often severely paralyzed, and thus particularly vulnerable in terms of personal autonomy, decisionmaking capacity, and agency. Here we analyze the effects of closed-loop medical devices on the autonomy and accountability of both persons (as patients or research participants) and neurotechnological closed-loop medical systems. We show that although BCIs can strengthen patient autonomy by preserving or restoring communicative abilities and/or motor control, closed-loop devices may also create challenges for moral and legal accountability. We advocate the development of a comprehensive ethical and legal framework to address the challenges of emerging closed-loop neurotechnologies like BCIs and stress the centrality of informed consent and refusal as a means to foster accountability. We propose the creation of an international neuroethics task force with members from medical neuroscience, neuroengineering, computer science, medical law, and medical ethics, as well as representatives of patient advocacy groups and the public.
Galaxy morphology - An unsupervised machine learning approach
NASA Astrophysics Data System (ADS)
Schutter, A.; Shamir, L.
2015-09-01
Structural properties carry valuable information about the formation and evolution of galaxies, and are important for understanding the past, present, and future universe. Here we use an unsupervised machine learning methodology to analyze a network of similarities between galaxy morphological types and automatically deduce a morphological sequence of galaxies. Application of the method to the EFIGI catalog shows that the morphological scheme produced by the algorithm is largely in agreement with the De Vaucouleurs system, demonstrating the ability of computer vision and machine learning methods to automatically profile galaxy morphological sequences. The unsupervised analysis method is based on comprehensive computer vision techniques that compute the visual similarities between the different morphological types. Rather than relying on human cognition, the proposed system deduces the similarities between sets of galaxy images in an automatic manner, and is therefore not limited by the number of galaxies being analyzed. The source code of the method is publicly available, and the protocol of the experiment is included in the paper so that the experiment can be replicated, and the method can be used to analyze user-defined datasets of galaxy images.
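The paper's actual pipeline (large-scale image feature extraction followed by network analysis) is not reproduced here, but the core idea of turning pairwise visual similarities into a linear morphological sequence can be sketched with simple greedy chaining over invented feature vectors:

```python
import numpy as np

# Toy stand-in: each morphological type is summarized by a feature vector
# (the paper computes thousands of image features; these values are invented).
features = {
    "E": np.array([0.9, 0.1]), "S0": np.array([0.7, 0.3]),
    "Sa": np.array([0.5, 0.5]), "Sb": np.array([0.3, 0.7]),
    "Sc": np.array([0.1, 0.9]),
}

def morphological_sequence(feats, start):
    """Greedy nearest-neighbour chaining over pairwise distances -- one simple
    way to turn a similarity network into a linear morphological sequence."""
    remaining = dict(feats)
    order = [start]
    current = remaining.pop(start)
    while remaining:
        nxt = min(remaining, key=lambda t: np.linalg.norm(remaining[t] - current))
        current = remaining.pop(nxt)
        order.append(nxt)
    return order

print(morphological_sequence(features, "E"))  # ['E', 'S0', 'Sa', 'Sb', 'Sc']
```

With well-chosen image features, an ordering recovered this way can be compared against a reference scheme such as the De Vaucouleurs sequence.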
NASA Astrophysics Data System (ADS)
Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher
2017-11-01
Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
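The contrast drawn above between deterministic integration rules and Monte Carlo sampling can be illustrated on a toy stochastic problem (invented here, not taken from the paper): propagating a Gaussian uncertain decay rate through dy/dt = -ky, y(0) = 1, and estimating the mean of y(1) = exp(-k):

```python
import numpy as np

# Uncertain decay rate k ~ N(1, 0.1^2); the exact mean of exp(-k) is known
# in closed form for a Gaussian input, which lets us compare the estimators.
mu, sigma = 1.0, 0.1
exact_mean = np.exp(-mu + sigma**2 / 2)  # E[exp(-k)] for Gaussian k

# Deterministic rule: 5-node Gauss-Hermite quadrature (probabilists' weight).
nodes, weights = np.polynomial.hermite_e.hermegauss(5)
quad_mean = np.sum(weights * np.exp(-(mu + sigma * nodes))) / np.sum(weights)

# Monte Carlo: needs many samples for comparable accuracy on smooth problems.
rng = np.random.default_rng(0)
mc_mean = np.exp(-rng.normal(mu, sigma, 10_000)).mean()

print(abs(quad_mean - exact_mean) < abs(mc_mean - exact_mean))  # True
```

For a single smooth stochastic input the quadrature rule converges far faster than sampling; Monte Carlo and multi-level variants regain the advantage as the number of stochastic dimensions grows, which is the trade-off the paper's comparison explores.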
MIPS: analysis and annotation of proteins from whole genomes in 2005
Mewes, H. W.; Frishman, D.; Mayer, K. F. X.; Münsterkötter, M.; Noubibou, O.; Pagel, P.; Rattei, T.; Oesterheld, M.; Ruepp, A.; Stümpflen, V.
2006-01-01
The Munich Information Center for Protein Sequences (MIPS at the GSF), Neuherberg, Germany, provides resources related to genome information. Manually curated databases for several reference organisms are maintained. Several of these databases are described elsewhere in this and other recent NAR database issues. In a complementary effort, a comprehensive set of >400 genomes automatically annotated with the PEDANT system is maintained. The main goal of our current work on creating and maintaining genome databases is to extend gene-centered information to information on interactions within a generic comprehensive framework. We have concentrated our efforts along three lines: (i) the development of suitable comprehensive data structures and database technology, communication and query tools to include a wide range of different types of information, enabling the representation of complex information such as functional modules or networks (the Genome Research Environment System); (ii) the development of databases covering computable information, such as the basic evolutionary relations among all genes, namely SIMAP, the sequence similarity matrix, and the CABiNet network analysis framework; and (iii) the compilation and manual annotation of information related to interactions, such as protein–protein interactions or other types of relations (e.g. MPCDB, MPPI, CYGD). All databases described and the detailed descriptions of our projects can be accessed through the MIPS WWW server (http://mips.gsf.de). PMID:16381839
NASA Astrophysics Data System (ADS)
Kuroki, Hayato; Ino, Shuichi; Nakano, Satoko; Hori, Kotaro; Ifukube, Tohru
The authors of this paper have been studying a real-time speech-to-caption system using speech recognition technology with a “repeat-speaking” method. In this system, a “repeat-speaker” listens to a lecturer's voice and then speaks the lecturer's utterances back into a speech recognition computer. The resulting system achieved a caption accuracy of about 97% for Japanese-Japanese conversion, with a voice-to-caption conversion time of about 4 seconds for English-English conversion at some international conferences. Achieving this performance, however, required considerable cost. In human communication, speech understanding depends not only on verbal information but also on non-verbal information such as a speaker's gestures and face and mouth movements. This led the authors to the idea of briefly storing the information in a computer and then displaying the captions and images of the speaker's face movements in a suitable way to achieve higher comprehension. In this paper, we investigate the relationship between the display sequence and display timing of captions that contain speech recognition errors and the speaker's face movement images. The results show that the sequence “to display the caption before the speaker's face image” improves comprehension of the captions. The sequence “to display both simultaneously” shows an improvement of only a few percent on the question sentences, and the sequence “to display the speaker's face image before the caption” shows almost no change. In addition, the sequence “to display the caption 1 second before the speaker's face image” shows the most significant improvement of all the conditions.
Towards a Comprehensive Computational Simulation System for Turbomachinery
NASA Technical Reports Server (NTRS)
Shih, Ming-Hsin
1994-01-01
The objective of this work is to develop algorithms associated with a comprehensive computational simulation system for turbomachinery flow fields. This development is accomplished in a modular fashion; the modules include grid generation, visualization, network, simulation, toolbox, and flow modules. An interactive grid generation module is customized to facilitate the grid generation process associated with complicated turbomachinery configurations. With its user-friendly graphical user interface, the user may interactively manipulate the default settings to obtain a quality grid in a fraction of the time usually required to build a grid about the same geometry with a general-purpose grid generation code. Non-Uniform Rational B-Spline formulations are utilized in the algorithm to maintain geometry fidelity while redistributing grid points on the solid surfaces. The Bezier curve formulation is used to allow interactive construction of inner boundaries, and is also utilized to allow interactive point distribution. Cascade surfaces are transformed from three-dimensional surfaces of revolution into two-dimensional parametric planes for easy manipulation. Such a transformation allows these manipulated plane grids to be mapped back to surfaces of revolution by any generatrix definition. A sophisticated visualization module is developed to allow visualization of both the grid and the flow solution, steady or unsteady. A network module is built to allow data transfer in a heterogeneous environment. A flow module is integrated into this system, using an existing turbomachinery flow code. A simulation module is developed to combine the network, flow, and visualization modules to achieve near real-time flow simulation about turbomachinery geometries. A toolbox module is developed to support the overall task.
A batch version of the grid generation module is developed to allow portability and has been extended to allow dynamic grid generation for pitch changing turbomachinery configurations. Various applications with different characteristics are presented to demonstrate the success of this system.
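The record above leans on Bezier-curve formulations for interactive boundary construction and point distribution. As a hedged illustration of that underlying machinery (a generic sketch, not the authors' turbomachinery code), de Casteljau's algorithm evaluates a Bezier curve by repeated linear interpolation of its control points:

```python
def bezier_point(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] via de Casteljau's
    algorithm: repeatedly lerp adjacent control points until one remains."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts[:-1], pts[1:])]
    return pts[0]

# A quadratic curve from (0, 0) to (2, 0) pulled toward control point (1, 2);
# the curve midpoint (t = 0.5) lands at (1.0, 1.0).
mid = bezier_point([(0, 0), (1, 2), (2, 0)], 0.5)
```

The same evaluation scheme works for any number of control points, which is what makes it convenient for interactive redistribution of boundary points.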
A multi-scale approach to designing therapeutics for tuberculosis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linderman, Jennifer J.; Cilfone, Nicholas A.; Pienaar, Elsje
Approximately one third of the world’s population is infected with Mycobacterium tuberculosis. Limited information about how the immune system fights M. tuberculosis, and about what constitutes protection from the bacteria, impacts our ability to develop effective therapies for tuberculosis. We present an in vivo systems biology approach that integrates data from multiple model systems and over multiple length and time scales into a comprehensive multi-scale and multi-compartment view of the in vivo immune response to M. tuberculosis. Lastly, we describe computational models that can be used to study (a) immunomodulation with the cytokines tumor necrosis factor and interleukin 10, (b) oral and inhaled antibiotics, and (c) the effect of vaccination.
A comprehensive payment model for short- and long-stay psychiatric patients.
Fries, B E; Durance, P W; Nerenz, D R; Ashcraft, M L
1993-01-01
In this article, a payment model is developed for a hospital system with both acute- and chronic-stay psychiatric patients. "Transition pricing" provides a balance between the incentives of an episode-based system and the necessity of per diem long-term payments. Payment depends on two new psychiatric resident classification systems for short- and long-term stays. Data on the per diem cost of inpatient care, by day of stay, were computed from a sample of 2,968 patients from 100 psychiatric units in 51 Department of Veterans Affairs (VA) Medical Centers. Using a 9-month cohort of all VA psychiatric discharges nationwide (79,337 with non-chronic stays), profits and losses were simulated.
Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation
Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin
2013-01-01
With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activities, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition. PMID:23353144
A multi-scale approach to designing therapeutics for tuberculosis
Linderman, Jennifer J.; Cilfone, Nicholas A.; Pienaar, Elsje; ...
2015-04-20
Approximately one third of the world’s population is infected with Mycobacterium tuberculosis. Limited information about how the immune system fights M. tuberculosis, and about what constitutes protection from the bacteria, impacts our ability to develop effective therapies for tuberculosis. We present an in vivo systems biology approach that integrates data from multiple model systems and over multiple length and time scales into a comprehensive multi-scale and multi-compartment view of the in vivo immune response to M. tuberculosis. Lastly, we describe computational models that can be used to study (a) immunomodulation with the cytokines tumor necrosis factor and interleukin 10, (b) oral and inhaled antibiotics, and (c) the effect of vaccination.
Reference Architecture Model Enabling Standards Interoperability.
Blobel, Bernd
2017-01-01
Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns to include the real-world business domains and their processes, as well as the individual context of all actors involved. The system must therefore properly reflect the environment in front of and around the computer as an essential, even defining, part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.
ERIC Educational Resources Information Center
Shaw, David C.; Johnson, Dorothy M.
The complete comprehension of this paper requires a firm grasp of both mathematical demography and FORTRAN programming. The paper aims at the establishment of a language with which complex demographic manipulations can be briefly expressed in a form intelligible both to demographic analysts and to computers. The Demographic Computer Library (DCL)…
Care and Handling of Computer Magnetic Storage Media.
ERIC Educational Resources Information Center
Geller, Sidney B.
Intended for use by data processing installation managers, operating personnel, and technical staff, this publication provides a comprehensive set of care and handling guidelines for the physical/chemical preservation of computer magnetic storage media--principally computer magnetic tapes--and their stored data. Emphasis is placed on media…
Effect of Computer Support on Younger Women with Breast Cancer
Gustafson, David H; Hawkins, Robert; Pingree, Suzanne; McTavish, Fiona; Arora, Neeraj K; Mendenhall, John; Cella, David F; Serlin, Ronald C; Apantaku, Funmi M; Stewart, James; Salner, Andrew
2001-01-01
OBJECTIVE Assess impact of a computer-based patient support system on quality of life in younger women with breast cancer, with particular emphasis on assisting the underserved. DESIGN Randomized controlled trial conducted between 1995 and 1998. SETTING Five sites: two teaching hospitals (Madison, Wis, and Chicago, Ill), two nonteaching hospitals (Chicago), and a cancer resource center (Indianapolis, Ill). The latter three sites treat many underserved patients. PARTICIPANTS Newly diagnosed breast cancer patients (N = 246) under age 60. INTERVENTIONS Experimental group received Comprehensive Health Enhancement Support System (CHESS), a home-based computer system providing information, decision-making, and emotional support. MEASUREMENTS AND MAIN RESULTS Pretest and two post-test surveys (at two- and five-month follow-up) measured aspects of participation in care, social/information support, and quality of life. At two-month follow-up, the CHESS group was significantly more competent at seeking information, more comfortable participating in care, and had greater confidence in doctor(s). At five-month follow-up, the CHESS group had significantly better social support and also greater information competence. In addition, experimental assignment interacted with several indicators of medical underservice (race, education, and lack of insurance), such that CHESS benefits were greater for the disadvantaged than the advantaged group. CONCLUSIONS Computer-based patient support systems such as CHESS may benefit patients by providing information and social support, and increasing their participation in health care. These benefits may be largest for currently underserved populations. PMID:11520380
ERIC Educational Resources Information Center
Emery, James C., Ed.
A comprehensive review of the current status, prospects, and problems of computer networking in higher education is presented from the perspectives of both computer users and network suppliers. Several areas of computer use are considered including applications for instruction, research, and administration in colleges and universities. In the…
RNA Structural Dynamics As Captured by Molecular Simulations: A Comprehensive Overview
2018-01-01
With both catalytic and genetic functions, ribonucleic acid (RNA) is perhaps the most pluripotent chemical species in molecular biology, and its functions are intimately linked to its structure and dynamics. Computer simulations, and in particular atomistic molecular dynamics (MD), allow structural dynamics of biomolecular systems to be investigated with unprecedented temporal and spatial resolution. We here provide a comprehensive overview of the fast-developing field of MD simulations of RNA molecules. We begin with an in-depth, evaluatory coverage of the most fundamental methodological challenges that set the basis for the future development of the field, in particular, the current developments and inherent physical limitations of the atomistic force fields and the recent advances in a broad spectrum of enhanced sampling methods. We also survey the closely related field of coarse-grained modeling of RNA systems. After dealing with the methodological aspects, we provide an exhaustive overview of the available RNA simulation literature, ranging from studies of the smallest RNA oligonucleotides to investigations of the entire ribosome. Our review encompasses tetranucleotides, tetraloops, a number of small RNA motifs, A-helix RNA, kissing-loop complexes, the TAR RNA element, the decoding center and other important regions of the ribosome, as well as assorted other systems. Extended sections are devoted to RNA–ion interactions, ribozymes, riboswitches, and protein/RNA complexes. Our overview is written for as broad an audience as possible, aiming to provide a much-needed interdisciplinary bridge between computation and experiment, together with a perspective on the future of the field. PMID:29297679
Stormwater quality processes for three land-use areas in Broward County, Florida
Mattraw, H.C.; Miller, Robert A.
1981-01-01
Systematic collection and chemical analysis of stormwater runoff samples from three small urban areas in Broward County, Florida, were obtained between 1974 and 1977. Thirty or more runoff-constituent loads were computed for each of the homogeneous land-use areas. The areas sampled were single family residential, highway, and a commercial shopping center. Rainfall, runoff, and nutrient and metal analyses were stored in a data-management system. The data-management system permitted computation of loads, publication of basic-data reports and the interface of environmental and load information with a comprehensive statistical analysis system. Seven regression models relating water quality loads to characteristics of peak discharge, antecedent conditions, season, storm duration and rainfall intensity were constructed for each of the three sites. Total water-quality loads were computed for the collection period by summing loads for individual storms. Loads for unsampled storms were estimated by using regression models and records of storm precipitation. Loadings, pounds per day per acre of hydraulically effective impervious area, were computed for the three land-use types. Total nitrogen, total phosphorus, and total residue loadings were highest in the residential area. Chemical oxygen demand and total lead loadings were highest in the commercial area. Loadings of atmospheric fallout on each watershed were estimated by bulk precipitation samples collected at the highway and commercial site. (USGS)
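The report above regresses constituent loads on storm characteristics and then uses the fitted models to estimate loads for unsampled storms. As a minimal illustration of that approach (a one-predictor sketch with invented numbers, not the report's data or coefficients), an ordinary least-squares fit in closed form:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x using the closed-form
    estimates b = Sxy/Sxx, a = mean(y) - b*mean(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical storms: peak discharge vs. constituent load (values invented).
a, b = fit_line([1, 2, 3, 4, 5], [1.1, 2.1, 2.9, 4.2, 5.1])
# Estimate a load for an unsampled storm from its recorded peak discharge.
load_unsampled = a + b * 3.5
```

Summing such per-storm estimates over a collection period mirrors how the total loads were assembled from sampled and unsampled storms.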
Techniques for the rapid display and manipulation of 3-D biomedical data.
Goldwasser, S M; Reynolds, R A; Talton, D A; Walsh, E S
1988-01-01
The use of fully interactive 3-D workstations with true real-time performance will become increasingly common as technology matures and economical commercial systems become available. This paper provides a comprehensive introduction to high speed approaches to the display and manipulation of 3-D medical objects obtained from tomographic data acquisition systems such as CT, MR, and PET. A variety of techniques are outlined including the use of software on conventional minicomputers, hardware assist devices such as array processors and programmable frame buffers, and special purpose computer architecture for dedicated high performance systems. While both algorithms and architectures are addressed, the major theme centers around the utilization of hardware-based approaches including parallel processors for the implementation of true real-time systems.
Diversity and evolution of class 2 CRISPR–Cas systems
Shmakov, Sergey; Smargon, Aaron; Scott, David; Cox, David; Pyzocha, Neena; Yan, Winston; Abudayyeh, Omar O.; Gootenberg, Jonathan S.; Makarova, Kira S.; Wolf, Yuri I.; Severinov, Konstantin; Zhang, Feng; Koonin, Eugene V.
2018-01-01
Class 2 CRISPR–Cas systems are characterized by effector modules that consist of a single multidomain protein, such as Cas9 or Cpf1. We designed a computational pipeline for the discovery of novel class 2 variants and used it to identify six new CRISPR–Cas subtypes. The diverse properties of these new systems provide potential for the development of versatile tools for genome editing and regulation. In this Analysis article, we present a comprehensive census of class 2 types and class 2 subtypes in complete and draft bacterial and archaeal genomes, outline evolutionary scenarios for the independent origin of different class 2 CRISPR–Cas systems from mobile genetic elements, and propose an amended classification and nomenclature of CRISPR–Cas. PMID:28111461
Applied Graph-Mining Algorithms to Study Biomolecular Interaction Networks
2014-01-01
Protein-protein interaction (PPI) networks carry vital information on the organization of molecular interactions in cellular systems. The identification of functionally relevant modules in PPI networks is one of the most important applications of biological network analysis. Computational analysis is becoming an indispensable tool to understand large-scale biomolecular interaction networks. Several types of computational methods have been developed and employed for the analysis of PPI networks. Of these computational methods, graph comparison and module detection are the two most commonly used strategies. This review summarizes current literature on graph kernel and graph alignment methods for graph comparison strategies, as well as module detection approaches including seed-and-extend, hierarchical clustering, optimization-based, probabilistic, and frequent subgraph methods. Herein, we provide a comprehensive review of the major algorithms employed under each theme, including our recently published frequent subgraph method, for detecting functional modules commonly shared across multiple cancer PPI networks. PMID:24800226
Four-body trajectory optimization
NASA Technical Reports Server (NTRS)
Pu, C. L.; Edelbaum, T. N.
1974-01-01
A comprehensive optimization program has been developed for computing fuel-optimal trajectories between the earth and a point in the sun-earth-moon system. It presents methods for generating fuel-optimal two-impulse trajectories, which may originate at the earth or a point in space, and fuel-optimal three-impulse trajectories between two points in space. The extrapolation of the state vector and the computation of the state transition matrix are accomplished by the Stumpff-Weiss method. The cost and constraint gradients are computed analytically in terms of the terminal state and the state transition matrix. The 4-body Lambert problem is solved by using the Newton-Raphson method. An accelerated gradient projection method is used to optimize a 2-impulse trajectory with terminal constraint. Davidon's variance method is used both in the accelerated gradient projection method and in the outer loop of the 3-impulse trajectory optimization problem.
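The abstract cites the Newton-Raphson method for the 4-body Lambert problem without giving its formulation. As a generic sketch of the iteration (applied here to Kepler's equation, a standard two-body root-find, not the authors' 4-body system), each step updates the estimate by the ratio of the residual to its derivative:

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton-Raphson iteration: x_{k+1} = x_k - f(x_k)/f'(x_k),
    stopping when the step size falls below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# Illustrative use: Kepler's equation E - e*sin(E) = M for the eccentric
# anomaly E, a classic root-find arising in two-body orbit propagation.
e, M = 0.1, 1.0
E = newton(lambda E: E - e * math.sin(E) - M,
           lambda E: 1 - e * math.cos(E), M)
```

In a Lambert-type solver the same scheme is applied to a vector residual (miss in the target state) with the state transition matrix supplying the derivatives.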
Computer support for physiological cell modelling using an ontology on cell physiology.
Takao, Shimayoshi; Kazuhiro, Komurasaki; Akira, Amano; Takeshi, Iwashita; Masanori, Kanazawa; Tetsuya, Matsuda
2006-01-01
The development of electrophysiological whole cell models to support the understanding of biological mechanisms is increasing rapidly. Due to the complexity of biological systems, comprehensive cell models, which are composed of many imported sub-models of functional elements, can become quite complicated as well, making modification by computer difficult. Here, we propose computer support to facilitate structural changes to cell models, employing the markup languages CellML and our original PMSML (physiological model structure markup language), in addition to a new ontology for cell physiological modelling. In particular, a method to make references from CellML files to the ontology and a method to assist manipulation of model structures using markup languages together with the ontology are reported. Using these methods three software utilities, including a graphical model editor, are implemented. Experimental results proved that these methods are effective for the modification of electrophysiological models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Unat, Didem; Dubey, Anshu; Hoefler, Torsten
The cost of data movement has always been an important concern in high performance computing (HPC) systems. It has now become the dominant factor in terms of both energy consumption and performance. Support for expression of data locality has been explored in the past, but those efforts have had only modest success in being adopted in HPC applications for various reasons. However, with the increasing complexity of the memory hierarchy and higher parallelism in emerging HPC systems, locality management has acquired a new urgency. Developers can no longer limit themselves to low-level solutions and ignore the potential for productivity and performance portability obtained by using locality abstractions. Fortunately, the trend emerging in recent literature on the topic alleviates many of the concerns that got in the way of their adoption by application developers. Data locality abstractions are available in the forms of libraries, data structures, languages and runtime systems; a common theme is increasing productivity without sacrificing performance. Furthermore, this paper examines these trends and identifies commonalities that can combine various locality concepts to develop a comprehensive approach to expressing and managing data locality on future large-scale high-performance computing systems.
Line-of-sight pointing accuracy/stability analysis and computer simulation for small spacecraft
NASA Astrophysics Data System (ADS)
Algrain, Marcelo C.; Powers, Richard M.
1996-06-01
This paper presents a case study where a comprehensive computer simulation is developed to determine the driving factors contributing to spacecraft pointing accuracy and stability. The simulation is implemented using XMATH/SystemBuild software from Integrated Systems, Inc. The paper is written in a tutorial manner and models for major system components are described. Among them are the spacecraft bus, attitude controller, reaction wheel assembly, star-tracker unit, inertial reference unit, and gyro drift estimators (Kalman filter). The predicted spacecraft performance is analyzed for a variety of input commands and system disturbances. The primary deterministic inputs are desired attitude angles and rate setpoints. The stochastic inputs include random torque disturbances acting on the spacecraft, random gyro bias noise, gyro random walk, and star-tracker noise. These inputs are varied over a wide range to determine their effects on pointing accuracy and stability. The results are presented in the form of trade-off curves designed to facilitate the proper selection of subsystems so that overall spacecraft pointing accuracy and stability requirements are met.
Neurofeedback Training for BCI Control
NASA Astrophysics Data System (ADS)
Neuper, Christa; Pfurtscheller, Gert
Brain-computer interface (BCI) systems detect changes in brain signals that reflect human intention, then translate these signals to control monitors or external devices (for a comprehensive review, see [1]). BCIs typically measure electrical signals resulting from neural firing (i.e., neuronal action potentials, the electrocorticogram (ECoG), or the electroencephalogram (EEG)). Sophisticated pattern recognition and classification algorithms convert neural activity into the required control signals. BCI research has focused heavily on developing powerful signal processing and machine learning techniques to accurately classify neural activity [2-4].
1990-04-01
across the coastal plain to the surrounding mountains. Historically, the lowlands were frequently inundated by tidal flows through a direct natural...approximately in the center of the Los Angeles coastal plain. This low plain is bordered on the north by the eastern Santa Monica Mountains and the Repetto...Hills, on the east by the Puente Hills and the Santa Ana Mountains, on the southeast by the San Joaquin Hills, and on the south and west by the
Drug target inference through pathway analysis of genomics data
Ma, Haisu; Zhao, Hongyu
2013-01-01
Statistical modeling coupled with bioinformatics is commonly used for drug discovery. Although many approaches exist for single-target-based drug design and target inference, recent years have seen a paradigm shift to system-level pharmacological research. Pathway analysis of genomics data represents one promising direction for computational inference of drug targets. This article aims to provide a comprehensive review of the evolving issues in this field, covering methodological developments, their pros and cons, as well as future research directions. PMID:23369829
Modelling of Surfaces. Part 1: Monatomic Metallic Surfaces Using Equivalent Crystal Theory
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Ferrante, John; Rodriguez, Agustin M.
1994-01-01
We present a detailed description of equivalent crystal theory, focusing on its application to the study of surface structure. While the emphasis is on the structure of the algorithm and its computational aspects, we also present a comprehensive discussion of the calculation of surface energies of metallic systems with equivalent crystal theory and other approaches. Our results are compared to experiment and to other semiempirical as well as first-principles calculations for a variety of fcc and bcc metals.
Polyarteritis nodosa: MDCT as a 'One-Stop Shop' Modality for Whole-Body Arterial Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsai, W.-L.; Tsai, I-C.; Lee Tain, E-mail: s841082@ym.edu.t
Polyarteritis nodosa is a rare disease, which is characterized by aneurysm formation and occlusion in the arteries of multiple systems. Due to its extensive involvement, whole-body evaluation is necessary for diagnosis and treatment monitoring. We report a case of polyarteritis nodosa using multidetector-row computed tomography (MDCT) as a 'one-stop shop' modality for whole-body arterial evaluation. With precise protocol design, MDCT can be used as a reliable noninvasive modality providing comprehensive whole-body arterial evaluation.
Genomic cloud computing: legal and ethical points to consider
Dove, Edward S; Joly, Yann; Tassé, Anne-Marie; Burton, Paul; Chisholm, Rex; Fortier, Isabel; Goodwin, Pat; Harris, Jennifer; Hveem, Kristian; Kaye, Jane; Kent, Alistair; Knoppers, Bartha Maria; Lindpaintner, Klaus; Little, Julian; Riegman, Peter; Ripatti, Samuli; Stolk, Ronald; Bobrow, Martin; Cambon-Thomsen, Anne; Dressler, Lynn; Joly, Yann; Kato, Kazuto; Knoppers, Bartha Maria; Rodriguez, Laura Lyman; McPherson, Treasa; Nicolás, Pilar; Ouellette, Francis; Romeo-Casabona, Carlos; Sarin, Rajiv; Wallace, Susan; Wiesner, Georgia; Wilson, Julia; Zeps, Nikolajs; Simkevitz, Howard; De Rienzo, Assunta; Knoppers, Bartha M
2015-01-01
The biggest challenge in twenty-first century data-intensive genomic science is developing vast computer infrastructure and advanced software tools to perform comprehensive analyses of genomic data sets for biomedical research and clinical practice. Researchers are increasingly turning to cloud computing both as a solution to integrate data from genomics, systems biology and biomedical data mining and as an approach to analyze data to solve biomedical problems. Although cloud computing provides several benefits such as lower costs and greater efficiency, it also raises legal and ethical issues. In this article, we discuss three key ‘points to consider' (data control; data security, confidentiality and transfer; and accountability) based on a preliminary review of several publicly available cloud service providers' Terms of Service. These ‘points to consider' should be borne in mind by genomic research organizations when negotiating legal arrangements to store genomic data on a large commercial cloud service provider's servers. Diligent genomic cloud computing means leveraging security standards and evaluation processes as a means to protect data and entails many of the same good practices that researchers should always consider in securing their local infrastructure. PMID:25248396
Genomic cloud computing: legal and ethical points to consider.
Dove, Edward S; Joly, Yann; Tassé, Anne-Marie; Knoppers, Bartha M
2015-10-01
The biggest challenge in twenty-first century data-intensive genomic science is developing vast computer infrastructure and advanced software tools to perform comprehensive analyses of genomic data sets for biomedical research and clinical practice. Researchers are increasingly turning to cloud computing both as a solution to integrate data from genomics, systems biology and biomedical data mining and as an approach to analyze data to solve biomedical problems. Although cloud computing provides several benefits such as lower costs and greater efficiency, it also raises legal and ethical issues. In this article, we discuss three key 'points to consider' (data control; data security, confidentiality and transfer; and accountability) based on a preliminary review of several publicly available cloud service providers' Terms of Service. These 'points to consider' should be borne in mind by genomic research organizations when negotiating legal arrangements to store genomic data on a large commercial cloud service provider's servers. Diligent genomic cloud computing means leveraging security standards and evaluation processes as a means to protect data and entails many of the same good practices that researchers should always consider in securing their local infrastructure.
NASA Astrophysics Data System (ADS)
Whitford, Dennis J.
2002-05-01
This paper, the second of a two-part series, introduces undergraduate students to ocean wave forecasting using interactive computer-generated visualization and animation. Verbal descriptions and two-dimensional illustrations are often insufficient for student comprehension. Fortunately, the introduction of computers in the geosciences provides a tool for addressing this problem. Computer-generated visualization and animation, accompanied by oral explanation, have been shown to be a pedagogical improvement over more traditional methods of instruction. Cartographic science and other disciplines using geographical information systems have been especially aggressive in pioneering the use of visualization and animation, whereas oceanography has not. This paper will focus on the teaching of ocean swell wave forecasting, often considered a difficult oceanographic topic due to the mathematics and physics required, as well as its interdependence on time and space. Several MATLAB® software programs are described and offered to visualize and animate group speed, frequency dispersion, angular dispersion, propagation, and wave height forecasting of deep water ocean swell waves. Teachers may use these interactive visualizations and animations without requiring an extensive background in computer programming.
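The quantities this paper animates, group speed and frequency dispersion of deep-water swell, follow from the deep-water dispersion relation. As a minimal sketch (standard textbook relations, not the paper's MATLAB code):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def deep_water_speeds(period_s):
    """Phase and group speed of a deep-water gravity wave of period T.
    Deep-water dispersion gives omega^2 = g*k, hence phase speed
    c = g*T/(2*pi) and group (energy) speed c_g = c/2. Because c grows
    with T, long-period swell outruns shorter waves: frequency dispersion."""
    c = G * period_s / (2 * math.pi)  # phase speed, m/s
    return c, c / 2                   # group speed is half the phase speed

# A 10 s swell travels at roughly 15.6 m/s; its energy at about 7.8 m/s.
c, cg = deep_water_speeds(10.0)
```

Dividing the distance from a storm to a coast by the group speed for each period band is the essence of swell arrival-time forecasting.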
Computation of Asteroid Proper Elements on the Grid
NASA Astrophysics Data System (ADS)
Novakovic, B.; Balaz, A.; Knezevic, Z.; Potocnik, M.
2009-12-01
A procedure for gridification of the computation of asteroid proper orbital elements is described. The need to speed up the time-consuming computations and make them more efficient is justified by the large increase in observational data expected from the next generation of all-sky surveys. We give the basic notion of proper elements and of the contemporary theories and methods used to compute them for different populations of objects. Proper elements for nearly 70,000 asteroids have been derived since the Grid infrastructure was first used for this purpose. The average time for catalog updates is significantly shortened with respect to the time needed with stand-alone workstations. We also present the basics of Grid computing, the concepts of Grid middleware and its Workload Management System. The practical steps we undertook to efficiently gridify our application are described in full detail. We present the results of comprehensive testing of the performance of different Grid sites, and offer some practical conclusions based on the benchmark results and on our experience. Finally, we propose some possibilities for future work.
Almario, Christopher V; Chey, William; Kaung, Aung; Whitman, Cynthia; Fuller, Garth; Reid, Mark; Nguyen, Ken; Bolus, Roger; Dennis, Buddy; Encarnacion, Rey; Martinez, Bibiana; Talley, Jennifer; Modi, Rushaba; Agarwal, Nikhil; Lee, Aaron; Kubomoto, Scott; Sharma, Gobind; Bolus, Sally; Chang, Lin; Spiegel, Brennan M R
2015-01-01
Healthcare delivery now mandates shorter visits with higher documentation requirements, undermining the patient-provider interaction. To improve clinic visit efficiency, we developed a patient-provider portal that systematically collects patient symptoms using a computer algorithm called Automated Evaluation of Gastrointestinal Symptoms (AEGIS). AEGIS also automatically "translates" the patient report into a full narrative history of present illness (HPI). We aimed to compare the quality of computer-generated vs. physician-documented HPIs. We performed a cross-sectional study with a paired sample design among individuals visiting outpatient adult gastrointestinal (GI) clinics for evaluation of active GI symptoms. Participants first underwent usual care and then subsequently completed AEGIS. Each individual thereby had both a physician-documented and a computer-generated HPI. Forty-eight blinded physicians assessed HPI quality across six domains using 5-point scales: (i) overall impression, (ii) thoroughness, (iii) usefulness, (iv) organization, (v) succinctness, and (vi) comprehensibility. We compared HPI scores within patient using a repeated measures model. Seventy-five patients had both computer-generated and physician-documented HPIs. The mean overall impression score for computer-generated HPIs was higher than physician HPIs (3.68 vs. 2.80; P<0.001), even after adjusting for physician and visit type, location, mode of transcription, and demographics. Computer-generated HPIs were also judged more complete (3.70 vs. 2.73; P<0.001), more useful (3.82 vs. 3.04; P<0.001), better organized (3.66 vs. 2.80; P<0.001), more succinct (3.55 vs. 3.17; P<0.001), and more comprehensible (3.66 vs. 2.97; P<0.001). Computer-generated HPIs were of higher overall quality, better organized, and more succinct, comprehensible, complete, and useful compared with HPIs written by physicians during usual care in GI clinics.
NASA Astrophysics Data System (ADS)
Witham, Shawn; Boylen, Brett; Owesen, Barr; Rocchia, Walter; Alexov, Emil
2011-03-01
Electrostatic forces and energies are two of the major components that contribute to the stability, function and interaction of biological macromolecules. The calculation of the electrostatic potential distribution in such systems, which comprise irregularly shaped objects immersed in a water phase, is not a trivial task. In addition, an accurate model requires any missing hydrogen atoms of the corresponding structural files (Protein Data Bank, or PDB, files) to be generated in silico and, if necessary, missing atoms or residues to be predicted as well. Here we report a comprehensive suite, an academic DelPhi webserver, which allows users to upload their structural file, calculate the components of the electrostatic energy, generate the corresponding potential (and/or concentration/dielectric constant) distribution map, and choose the appropriate force field. The webserver utilizes modern technology to take user input and construct an algorithm that suits the user's specific needs. The webserver uses Clemson University's Palmetto Supercomputer Cluster to handle the DelPhi calculations, which can range from small jobs with short computation times to extensive, computationally demanding runs. The work was supported by a grant from NIGMS, NIH, grant number 1R01GM093937-01.
Software to model AXAF-I image quality
NASA Technical Reports Server (NTRS)
Ahmad, Anees; Feng, Chen
1995-01-01
A modular user-friendly computer program for the modeling of grazing-incidence type x-ray optical systems has been developed. This comprehensive computer software GRAZTRACE covers the manipulation of input data, ray tracing with reflectivity and surface deformation effects, convolution with x-ray source shape, and x-ray scattering. The program also includes the capabilities for image analysis, detector scan modeling, and graphical presentation of the results. A number of utilities have been developed to interface the predicted Advanced X-ray Astrophysics Facility-Imaging (AXAF-I) mirror structural and thermal distortions with the ray-trace. This software is written in FORTRAN 77 and runs on a SUN/SPARC station. An interactive command mode version and a batch mode version of the software have been developed.
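As an illustration of the core operation a ray tracer like the one described performs at each surface, the sketch below reflects a ray direction off a surface normal using the standard specular formula r = d - 2(d·n)n. The vectors are invented for the example; GRAZTRACE's actual reflectivity and deformation handling is not shown.

```python
# Minimal specular ray reflection: r = d - 2 (d.n) n, with d the incoming
# direction and n the unit surface normal. Illustrative values only.

def reflect(d, n):
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

# A ray travelling down onto a flat mirror with upward normal bounces up.
print(reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # (1.0, 1.0, 0.0)
```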
Jung, HaRim; Song, MoonBae; Youn, Hee Yong; Kim, Ung Mo
2015-09-18
A content-matched (CM) range monitoring query over moving objects continually retrieves the moving objects (i) whose non-spatial attribute values match given non-spatial query values; and (ii) that are currently located within a given spatial query range. In this paper, we propose a new query indexing structure, called the group-aware query region tree (GQR-tree), for efficient evaluation of CM range monitoring queries. The primary role of the GQR-tree is to help the server leverage the computational capabilities of moving objects in order to improve the system performance in terms of the wireless communication cost and server workload. Through a series of comprehensive simulations, we verify the superiority of the GQR-tree method over the existing methods.
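The matching condition of a CM range monitoring query can be sketched as a simple two-part filter; the object layout and attribute names below are invented for illustration and are not the GQR-tree index itself, which exists to avoid exactly this kind of brute-force scan.

```python
# A CM range query keeps objects that (i) match the non-spatial query value
# and (ii) lie inside the rectangular spatial range. Brute-force sketch.

def cm_range_query(objects, attr_value, rect):
    """objects: dicts with 'id', 'attr', 'x', 'y';
    rect: (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = rect
    return [o['id'] for o in objects
            if o['attr'] == attr_value
            and x_min <= o['x'] <= x_max
            and y_min <= o['y'] <= y_max]

objects = [
    {'id': 1, 'attr': 'taxi', 'x': 2.0, 'y': 3.0},   # matches both conditions
    {'id': 2, 'attr': 'bus',  'x': 2.5, 'y': 3.5},   # wrong attribute
    {'id': 3, 'attr': 'taxi', 'x': 9.0, 'y': 9.0},   # outside the range
]
print(cm_range_query(objects, 'taxi', (0, 0, 5, 5)))  # [1]
```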
Model of the Reticular Formation of the Brainstem Based on Glial-Neuronal Interactions.
Mitterauer, Bernhard J
A new model of the reticular formation of the brainstem is proposed. It refers to the neuronal and glial cell systems. Thus, it is biomimetically founded. The reticular formation generates modes of behavior (sleeping, eating, etc.) and commands all behavior according to the most appropriate environmental information. The reticular formation works on an abductive logic and is dominated by a redundancy of potential command. Formally, a special mode of behavior is represented by a comprehensive cycle (Hamilton loop) located in the glial network (syncytium) and embodied in gap junctional plaques. Whereas for the neuronal network of the reticular formation, a computer simulation has already been presented; here, the necessary devices for computation in the whole network are outlined.
NASA Technical Reports Server (NTRS)
Groves, Curtis Edward
2014-01-01
Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has the potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions. The method accounts for all uncertainty terms from both numerical and input variables. Objective three is to compile a table of uncertainty parameters that could be used to estimate the error in a Computational Fluid Dynamics model of the Environmental Control System/spacecraft system. Previous studies have looked at the uncertainty in a Computational Fluid Dynamics model for a single output variable at a single point, for example the re-attachment length of a backward-facing step. For the flow regime being analyzed (turbulent, three-dimensional, incompressible), the error at a single point can propagate into the solution both via flow physics and numerical methods. Calculating the uncertainty in using Computational Fluid Dynamics to accurately predict airflow speeds around encapsulated spacecraft is imperative to the success of future missions.
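A three-grid error-bar procedure of the kind described is commonly implemented as Richardson extrapolation with a grid convergence index (GCI); the sketch below assumes that standard form with a safety factor of 1.25, and the cited methodology may differ in such details. The grid solutions are invented for the example.

```python
import math

# Richardson-extrapolation sketch: from solutions on three systematically
# refined grids, estimate the observed order of accuracy, an extrapolated
# "grid-free" value, and a GCI-style relative error band on the fine grid.

def grid_uncertainty(f1, f2, f3, r, Fs=1.25):
    """f1, f2, f3: fine, medium, coarse grid solutions (monotone convergence
    assumed); r: constant refinement ratio; Fs: safety factor."""
    p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)   # observed order
    f_exact = f1 + (f1 - f2) / (r**p - 1)               # extrapolated value
    gci = Fs * abs((f1 - f2) / f1) / (r**p - 1)         # fine-grid error band
    return p, f_exact, gci

# Second-order convergence toward 1.0 with refinement ratio 2.
p, f_exact, gci = grid_uncertainty(1.01, 1.04, 1.16, 2.0)
print(round(p, 2), round(f_exact, 3))  # 2.0 1.0
```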
NASA Technical Reports Server (NTRS)
Groves, Curtis E.
2013-01-01
Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This proposal describes an approach to validate the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft. The research described here is cutting edge. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has the potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. The proposed research project includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions. The method accounts for all uncertainty terms from both numerical and input variables. Objective three is to compile a table of uncertainty parameters that could be used to estimate the error in a Computational Fluid Dynamics model of the Environmental Control System/spacecraft system. Previous studies have looked at the uncertainty in a Computational Fluid Dynamics model for a single output variable at a single point, for example the re-attachment length of a backward-facing step. To date, the author is the only person to look at the uncertainty in the entire computational domain. For the flow regime being analyzed (turbulent, three-dimensional, incompressible), the error at a single point can propagate into the solution both via flow physics and numerical methods. Calculating the uncertainty in using Computational Fluid Dynamics to accurately predict airflow speeds around encapsulated spacecraft is imperative to the success of future missions.
I Use the Computer to ADVANCE Advances in Comprehension-Strategy Research.
ERIC Educational Resources Information Center
Blohm, Paul J.
Merging the instructional implications drawn from theory and research in the interactive reading model, schemata, and metacognition with computer-based instruction seems a natural approach for actively involving students' participation in reading and learning from text. Computer-based graphic organizers guide students' preview or review of lengthy…
A Framework for Understanding Physics Students' Computational Modeling Practices
ERIC Educational Resources Information Center
Lunk, Brandon Robert
2012-01-01
With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content…
Chapter 11. Quality evaluation of apple by computer vision
USDA-ARS?s Scientific Manuscript database
Apple is one of the most consumed fruits in the world, and there is a critical need for enhanced computer vision technology for quality assessment of apples. This chapter gives a comprehensive review on recent advances in various computer vision techniques for detecting surface and internal defects ...
Mahfuz, Mohammad U; Makrakis, Dimitrios; Mouftah, Hussein T
2014-09-01
In this paper, a comprehensive analysis of sampling-based optimum signal detection in an ideal (i.e., free) diffusion-based concentration-encoded molecular communication (CEMC) system is presented. A generalized amplitude-shift keying (ASK)-based CEMC system is considered under diffusion-based noise and intersymbol interference (ISI) conditions. Information is encoded by modulating the amplitude of the transmission rate of information molecules at the transmitter nanomachine (TN). The critical issues involved in the sampling-based receiver thus developed are addressed in detail, and its performance in terms of the number of samples per symbol, communication range, and transmission data rate is evaluated. ISI produced by residual molecules deteriorates the performance of the CEMC system significantly; the performance deteriorates further when the communication range and/or the transmission data rate increase(s). In addition, the performance of the optimum receiver depends on the receiver's ability to compute the ISI accurately, thus providing a trade-off between receiver complexity and achievable bit error rate (BER). Exact and approximate detection performances are derived. Finally, it is found that the sampling-based signal detection scheme thus developed can be applied to both binary and multilevel (M-ary) ASK-based CEMC systems, although M-ary systems suffer more from higher BER.
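A drastically simplified, ISI-free version of sampling-based binary ASK detection can be sketched as averaging the concentration samples taken within a symbol interval and comparing against a decision threshold. The sample values and threshold below are invented; the paper's optimum detector additionally models noise statistics and ISI explicitly.

```python
# Binary ASK detection sketch for a concentration-encoded receiver:
# average M samples over the symbol interval, then threshold.

def detect_symbol(samples, threshold):
    """samples: concentration samples in one symbol interval."""
    return 1 if sum(samples) / len(samples) >= threshold else 0

# High average concentration -> bit 1; near-zero concentration -> bit 0.
print(detect_symbol([9.5, 10.2, 10.8, 9.9], threshold=5.0))  # 1
print(detect_symbol([0.8, 1.3, 0.5, 1.1], threshold=5.0))    # 0
```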
Space shuttle post-entry and landing analysis. Volume 2: Appendices
NASA Technical Reports Server (NTRS)
Crawford, B. S.; Duiven, E. M.
1973-01-01
Four candidate navigation systems for the space shuttle orbiter approach and landing phase are evaluated in detail. These include three conventional navaid systems and a single-station one-way Doppler system. In each case, a Kalman filter is assumed to be mechanized in the onboard computer, blending the navaid data with IMU and altimeter data. Filter state dimensions ranging from 6 to 24 are involved in the candidate systems. Comprehensive truth models with state dimensions ranging from 63 to 82 are formulated and used to generate detailed error budgets and sensitivity curves illustrating the effect of variations in the size of individual error sources on touchdown accuracy. The projected overall performance of each system is shown in the form of time histories of position and velocity error components.
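The navaid/IMU blending described is the standard Kalman measurement update; a minimal scalar sketch follows. A one-dimensional state stands in for the paper's 6- to 24-state filters, and the numbers are invented for illustration.

```python
# Scalar Kalman measurement update: blend a predicted (IMU-propagated)
# state with a navaid measurement, weighting by their variances.

def kalman_update(x_pred, P_pred, z, R):
    """x_pred, P_pred: predicted state and variance; z, R: measurement
    and its variance. Returns the blended estimate and its variance."""
    K = P_pred / (P_pred + R)        # Kalman gain
    x = x_pred + K * (z - x_pred)    # blended state estimate
    P = (1 - K) * P_pred             # variance shrinks after the update
    return x, P

# Equal confidence in prediction and measurement -> split the difference.
x, P = kalman_update(x_pred=100.0, P_pred=4.0, z=102.0, R=4.0)
print(x, P)  # 101.0 2.0
```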
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, K.L.
This document has been developed to provide guidance in the interchange of electronic CAD data with Martin Marietta Energy Systems, Inc., Oak Ridge, Tennessee. It is not meant to be as comprehensive as the existing standards and specifications, but to provide a minimum set of practices that will enhance the success of the CAD data exchange. It is now a Department of Energy (DOE) Oak Ridge Field Office requirement that Architect-Engineering (A-E) firms prepare all new drawings using a Computer Aided Design (CAD) system that is compatible with the Facility Manager's (FM) CAD system. For Oak Ridge facilities, the CAD system used for facility design by the FM, Martin Marietta Energy Systems, Inc., is Intergraph. The format for interchange of CAD data for Oak Ridge facilities will be the Intergraph MicroStation/IGDS format.
dos-Santos, M; Fujino, A
2012-01-01
Radiology teaching usually employs a systematic and comprehensive set of medical images and related information. Databases with representative radiological images and documents are highly desirable and widely used in Radiology teaching programs. Currently, computer-based teaching file systems are widely used in Medicine and Radiology teaching as an educational resource. This work addresses a user-centered radiology electronic teaching file system as an instance of a MIRC-compliant medical image database. As in a digital library, the clinical cases can be accessed using a web browser. The system has offered Radiology residents great opportunities to interact with experts. This has been done by applying user-centered techniques and creating usage context-based tools in order to make the system interactive.
Load flow and state estimation algorithms for three-phase unbalanced power distribution systems
NASA Astrophysics Data System (ADS)
Madvesh, Chiranjeevi
Distribution load flow and state estimation are two important functions in distribution energy management systems (DEMS) and advanced distribution automation (ADA) systems. Distribution load flow analysis is a tool which helps to analyze the status of a power distribution system under steady-state operating conditions. In this research, an effective and comprehensive load flow algorithm is developed to extensively incorporate the distribution system components. Distribution system state estimation is a mathematical procedure which aims to estimate the operating states of a power distribution system by utilizing the information collected from available measurement devices in real-time. An efficient and computationally effective state estimation algorithm adapting the weighted-least-squares (WLS) method has been developed in this research. Both the developed algorithms are tested on different IEEE test-feeders and the results obtained are justified.
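The WLS estimator mentioned solves, in its linear form, the normal equations of the model z = Hx + noise with weights equal to inverse measurement variances; the full distribution-system version is nonlinear and iterated. The sketch below assumes a one-parameter linear model with invented measurements.

```python
# Weighted-least-squares sketch for z = H x + noise with a scalar state:
# solve (H^T W H) x = H^T W z, weighting each measurement by 1/variance.

def wls(H, z, w):
    """H: measurement coefficients; z: measurements; w: weights."""
    hwh = sum(wi * hi * hi for hi, wi in zip(H, w))
    hwz = sum(wi * hi * zi for hi, zi, wi in zip(H, z, w))
    return hwz / hwh

# Two direct measurements of the same quantity: a precise one (weight 4)
# and a noisy one (weight 1). The estimate leans toward the precise one.
x_hat = wls(H=[1.0, 1.0], z=[10.0, 14.0], w=[4.0, 1.0])
print(x_hat)  # 10.8
```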
Micro-separation toward systems biology.
Liu, Bi-Feng; Xu, Bo; Zhang, Guisen; Du, Wei; Luo, Qingming
2006-02-17
Current biology is experiencing a transformation in logic or philosophy that forces us to reevaluate the concept of a cell, tissue, or entire organism as a collection of individual components. Systems biology, which aims at understanding biological systems at the systems level, is an emerging research area involving interdisciplinary collaborations among the life sciences, computational and mathematical sciences, systems engineering, and analytical technology. For analytical chemistry, developing innovative methods to meet the requirements of systems biology represents new challenges as well as opportunities and responsibilities. In this review, systems biology-oriented micro-separation technologies are introduced for comprehensive profiling of the genome, proteome and metabolome, characterization of biomolecular interactions, and single-cell analysis, such as capillary electrophoresis, ultra-thin-layer gel electrophoresis, micro-column liquid chromatography, and their multidimensional combinations, parallel integrations, microfabricated formats, and nanotechnology involvement. Future challenges and directions are also suggested.
ERIC Educational Resources Information Center
Smith, Glenn Gordon
2012-01-01
This study compared books with embedded computer games (via pentop computers with microdot paper and audio feedback) with regular books with maps, in terms of fifth graders' comprehension and retention of spatial details from stories. One group read a story in hard copy with embedded computer games, the other group read it in regular book format…
ERIC Educational Resources Information Center
Venkatesh, Vijay P.
2013-01-01
The current computing landscape owes its roots to the birth of hardware and software technologies from the 1940s and 1950s. Since then, the advent of mainframes, miniaturized computing, and internetworking has given rise to the now prevalent cloud computing era. In the past few months just after 2010, cloud computing adoption has picked up pace…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Zuwei; Zhao, Haibo, E-mail: klinsmannzhb@163.com; Zheng, Chuguang
2015-01-15
This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining a Markov jump model, a weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of the particle size distribution with low statistical noise over the full size range and as far as possible to reduce the number of time loopings. Here three coagulation rules are highlighted, and it is found that constructing an appropriate coagulation rule provides a route to attain the compromise between accuracy and cost of PBMC methods. Further, in order to avoid double looping over all simulation particles when considering two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates used for acceptance-rejection processes by single-looping over all particles; meanwhile, the mean time-step of a coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is reduced greatly, becoming proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are processed in parallel by the cores of a GPU, which can execute massively threaded data-parallel tasks to obtain a remarkable speedup ratio (compared with CPU computation, the speedup ratio of GPU parallel computing is as high as 200 in a case of 100 cells with 10 000 simulation particles per cell). These accelerating approaches to PBMC are demonstrated in a physically realistic Brownian coagulation case. The computational accuracy is validated against the benchmark solution of the discrete-sectional method. The simulation results show that the comprehensive approach can attain very favorable improvement in cost without sacrificing computational accuracy.
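The majorant-kernel acceptance-rejection step can be sketched as follows: candidate pairs are drawn uniformly and accepted with probability K(i, j) / K_max, where K_max bounds the true kernel from above. The sum kernel, particle sizes, and bound below are invented for illustration; the paper's weighted kernels and GPU parallelism are not reproduced.

```python
import random

# Acceptance-rejection selection of a coagulation pair under a majorant
# bound: cheap uniform proposals, thinned by the true kernel ratio.

def pick_coagulation_pair(sizes, kernel, k_max, rng):
    """sizes: particle volumes; kernel(vi, vj): true coagulation kernel;
    k_max: upper bound on the kernel over all pairs."""
    while True:
        i, j = rng.sample(range(len(sizes)), 2)           # candidate pair
        if rng.random() < kernel(sizes[i], sizes[j]) / k_max:
            return i, j                                   # event accepted

sizes = [1.0, 2.0, 4.0, 8.0]
kernel = lambda vi, vj: vi + vj           # illustrative "sum" kernel
k_max = 2 * max(sizes)                    # majorant: vi + vj <= 2*max size
rng = random.Random(0)
i, j = pick_coagulation_pair(sizes, kernel, k_max, rng)
print(sorted((sizes[i], sizes[j])))
```

Larger pairs are accepted more often, reproducing the kernel's bias without ever evaluating all O(N^2) pairs per event.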
Beck, Susan L; Eaton, Linda H; Echeverria, Christina; Mooney, Kathi H
2017-10-01
SymptomCare@Home, an integrated symptom monitoring and management system, was designed as part of randomized clinical trials to help patients with cancer who receive chemotherapy in ambulatory clinics and often experience significant symptoms at home. An iterative design process was informed by chronic disease management theory and features of assessment and clinical decision support systems used in other diseases. Key stakeholders participated in the design process: nurse scientists, clinical experts, bioinformatics experts, and computer programmers. Especially important was input from end users, patients, and nurse practitioners participating in a series of studies testing the system. The system includes both a patient and clinician interface and fully integrates two electronic subsystems: a telephone computer-linked interactive voice response system and a Web-based Decision Support-Symptom Management System. Key features include (1) daily symptom monitoring, (2) self-management coaching, (3) alerting, and (4) nurse practitioner follow-up. The nurse practitioner is distinctively positioned to provide assessment, education, support, and pharmacologic and nonpharmacologic interventions to intensify management of poorly controlled symptoms at home. SymptomCare@Home is a model for providing telehealth. The system facilitates using evidence-based guidelines as part of a comprehensive symptom management approach. The design process and system features can be applied to other diseases and conditions.
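The alerting feature described can be sketched as a per-symptom threshold rule that flags poorly controlled symptoms for nurse practitioner follow-up. The symptom names, 0-10 scale, and thresholds below are invented for illustration, not the system's actual clinical decision rules.

```python
# Threshold-based symptom alerting sketch: flag any reported symptom whose
# severity meets or exceeds its (hypothetical) per-symptom threshold.

THRESHOLDS = {'nausea': 4, 'pain': 5, 'fatigue': 6}   # illustrative, 0-10 scale

def alerts(report):
    """report: mapping of symptom name -> reported severity."""
    return sorted(s for s, sev in report.items()
                  if sev >= THRESHOLDS.get(s, 11))      # unknown: never alert

print(alerts({'nausea': 6, 'pain': 3, 'fatigue': 7}))  # ['fatigue', 'nausea']
```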
Artificial intelligence and robotics in high throughput post-genomics.
Laghaee, Aroosha; Malcolm, Chris; Hallam, John; Ghazal, Peter
2005-09-15
The shift of post-genomics towards a systems approach has offered an ever-increasing role for artificial intelligence (AI) and robotics. Many disciplines (e.g. engineering, robotics, computer science) bear on the problem of automating the different stages involved in post-genomic research with a view to developing quality assured high-dimensional data. We review some of the latest contributions of AI and robotics to this end and note the limitations arising from the current independent, exploratory way in which specific solutions are being presented for specific problems without regard to how these could be eventually integrated into one comprehensible integrated intelligent system.
NASA's Aviation Safety and Modeling Project
NASA Technical Reports Server (NTRS)
Chidester, Thomas R.; Statler, Irving C.
2006-01-01
The Aviation Safety Monitoring and Modeling (ASMM) Project of NASA's Aviation Safety program is cultivating sources of data and developing automated computer hardware and software to facilitate efficient, comprehensive, and accurate analyses of the data collected from large, heterogeneous databases throughout the national aviation system. The ASMM addresses the need to provide means for increasing safety by enabling the identification and correcting of predisposing conditions that could lead to accidents or to incidents that pose aviation risks. A major component of the ASMM Project is the Aviation Performance Measuring System (APMS), which is developing the next generation of software tools for analyzing and interpreting flight data.
Scalable Parameter Estimation for Genome-Scale Biochemical Reaction Networks
Kaltenbacher, Barbara; Hasenauer, Jan
2017-01-01
Mechanistic mathematical modeling of biochemical reaction networks using ordinary differential equation (ODE) models has improved our understanding of small- and medium-scale biological processes. While the same should in principle hold for large- and genome-scale processes, the computational methods for the analysis of ODE models which describe hundreds or thousands of biochemical species and reactions are missing so far. While individual simulations are feasible, the inference of the model parameters from experimental data is computationally too intensive. In this manuscript, we evaluate adjoint sensitivity analysis for parameter estimation in large scale biochemical reaction networks. We present the approach for time-discrete measurement and compare it to state-of-the-art methods used in systems and computational biology. Our comparison reveals a significantly improved computational efficiency and a superior scalability of adjoint sensitivity analysis. The computational complexity is effectively independent of the number of parameters, enabling the analysis of large- and genome-scale models. Our study of a comprehensive kinetic model of ErbB signaling shows that parameter estimation using adjoint sensitivity analysis requires a fraction of the computation time of established methods. The proposed method will facilitate mechanistic modeling of genome-scale cellular processes, as required in the age of omics. PMID:28114351
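The key property of adjoint sensitivity analysis noted above (gradient cost effectively independent of the number of parameters) can be illustrated on a toy discrete system: one forward simulation plus one backward adjoint pass yields the exact gradient. The dynamics x_{k+1} = θ·x_k with loss J = x_N² are invented for the sketch and have nothing to do with the ErbB model in the study.

```python
# Adjoint (reverse-mode) gradient of J = x_N^2 for x_{k+1} = theta * x_k.
# Forward pass stores the trajectory; backward pass propagates the adjoint
# lam_k = theta * lam_{k+1} from lam_N = dJ/dx_N = 2*x_N and accumulates
# dJ/dtheta = sum_k lam_{k+1} * x_k.

def adjoint_gradient(theta, x0, N):
    xs = [x0]
    for _ in range(N):                 # forward pass: simulate dynamics
        xs.append(theta * xs[-1])
    lam = 2.0 * xs[-1]                 # adjoint seed at final time
    grad = 0.0
    for k in range(N - 1, -1, -1):     # backward pass
        grad += lam * xs[k]            # contribution of step k to dJ/dtheta
        lam *= theta                   # propagate adjoint one step back
    return grad

# Analytic check: J = theta**(2N) * x0**2, so dJ/dtheta = 2N*theta**(2N-1)*x0**2.
print(adjoint_gradient(theta=2.0, x0=1.0, N=2))  # 32.0
```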
Program For Analysis Of Metal-Matrix Composites
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Mital, S. K.
1994-01-01
METCAN (METal matrix Composite ANalyzer) is a computer program used to computationally simulate the nonlinear behavior of high-temperature metal-matrix composite structural components in specific applications, providing comprehensive analyses of thermal and mechanical performance. Written in FORTRAN 77.
Data management system for USGS/USEPA urban hydrology studies program
Doyle, W.H.; Lorens, J.A.
1982-01-01
A data management system was developed to store, update, and retrieve data collected in urban stormwater studies jointly conducted by the U.S. Geological Survey and U.S. Environmental Protection Agency in 11 cities in the United States. The data management system is used to retrieve and combine data from USGS data files for use in rainfall, runoff, and water-quality models and for data computations such as storm loads. The system is based on the data management aspect of the Statistical Analysis System (SAS) and was used to create all the data files in the data base. SAS is used for storage and retrieval of basin physiography, land-use, and environmental practices inventory data. Also, storm-event water-quality characteristics are stored in the data base. SAS has many advantages for creating and managing a data base: it is simple and easy to use, contains a comprehensive statistical package, and makes it easy to modify files. Data base system development has progressed rapidly during the last two decades, and the data management system concepts used in this study reflect the advances made in computer technology during this era. Urban stormwater data is, however, just one application for which the system can be used. (USGS)
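A storm-load computation of the kind mentioned is, at its core, the time-integral of concentration times discharge over the event. The sketch below assumes evenly spaced samples and invented values; it is not the system's SAS procedure.

```python
# Storm-event load sketch: load = sum over samples of c * q * dt,
# with c in mg/L, q in L/s, and dt in seconds -> load in milligrams.

def storm_load(conc_mg_per_L, flow_L_per_s, dt_s):
    """Rectangular-rule integral of concentration x discharge."""
    return sum(c * q * dt_s for c, q in zip(conc_mg_per_L, flow_L_per_s))

# Three samples taken at 600-second intervals during one storm event.
load_mg = storm_load([5.0, 8.0, 3.0], [100.0, 250.0, 120.0], 600.0)
print(load_mg)  # 1716000.0
```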
ERIC Educational Resources Information Center
Abraham, Lee B.
2008-01-01
Language learners have unprecedented opportunities for developing second language literacy skills and intercultural understanding by reading authentic texts on the Internet and in multimedia computer-assisted language learning environments. This article presents findings from a meta-analysis of 11 studies of computer-mediated glosses in second…
ERIC Educational Resources Information Center
Ke, Fengfeng
2008-01-01
This article reports findings on a study of educational computer games used within various classroom situations. Employing an across-stage, mixed method model, the study examined whether educational computer games, in comparison to traditional paper-and-pencil drills, would be more effective in facilitating comprehensive math learning outcomes,…
Cognitive Consequences of Participation in a "Fifth Dimension" After-School Computer Club.
ERIC Educational Resources Information Center
Mayer, Richard E.; Quilici, Jill; Moreno, Roxana; Duran, Richard; Woodbridge, Scott; Simon, Rebecca; Sanchez, David; Lavezzo, Amy
1997-01-01
Children who attended the Fifth Dimension after-school computer club at least 10 times during the 1994-95 school year performed better on word problem comprehension tests than did non-participating children. Results support the hypothesis that experience in using computer software in the Fifth Dimension club produces measurable, resilient, and…
ERIC Educational Resources Information Center
1971
Computers have effected a comprehensive transformation of chemistry. Computers have greatly enhanced the chemist's ability to do model building, simulations, data refinement and reduction, analysis of data in terms of models, on-line data logging, automated control of experiments, quantum chemistry and statistical and mechanical calculations, and…
ERIC Educational Resources Information Center
Worrell, Jamie; Duffy, Mary Lou; Brady, Michael P.; Dukes, Charles; Gonzalez-DeHass, Alyssa
2016-01-01
Many schools use computer-based testing to measure students' progress for end-of-the-year and statewide assessments. There is little research to support whether computer-based testing accurately reflects student progress, particularly among students with learning, performance, and generalization difficulties. This article summarizes an…
Trends in data locality abstractions for HPC systems
Unat, Didem; Dubey, Anshu; Hoefler, Torsten; ...
2017-05-10
The cost of data movement has always been an important concern in high performance computing (HPC) systems. It has now become the dominant factor in terms of both energy consumption and performance. Support for expression of data locality has been explored in the past, but those efforts have had only modest success in being adopted in HPC applications for various reasons. However, with the increasing complexity of the memory hierarchy and higher parallelism in emerging HPC systems, locality management has acquired a new urgency. Developers can no longer limit themselves to low-level solutions and ignore the potential for productivity and performance portability obtained by using locality abstractions. Fortunately, the trend emerging in recent literature on the topic alleviates many of the concerns that got in the way of their adoption by application developers. Data locality abstractions are available in the forms of libraries, data structures, languages and runtime systems; a common theme is increasing productivity without sacrificing performance. Furthermore, this paper examines these trends and identifies commonalities that can combine various locality concepts to develop a comprehensive approach to expressing and managing data locality on future large-scale high-performance computing systems.
34 CFR 303.115 - Comprehensive child find system.
Code of Federal Regulations, 2013 CFR
2013-07-01
Components of a Statewide System, § 303.115 Comprehensive child find system. Each system must include a comprehensive child find system that meets the requirements in §§ 303.302 and 303.303. (Approved by Office of...
34 CFR 303.115 - Comprehensive child find system.
Code of Federal Regulations, 2012 CFR
2012-07-01
Components of a Statewide System, § 303.115 Comprehensive child find system. Each system must include a comprehensive child find system that meets the requirements in §§ 303.302 and 303.303. (Approved by Office of...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gigley, H.M.
1982-01-01
An artificial intelligence approach to the simulation of neurolinguistically constrained processes in sentence comprehension is developed using control strategies for simulation of cooperative computation in associative networks. The desirability of this control strategy in contrast to ATN and production system strategies is explained. A first pass implementation of HOPE, an artificial intelligence simulation model of sentence comprehension, constrained by studies of aphasic performance, psycholinguistics, neurolinguistics, and linguistic theory is described. Claims that the model could serve as a basis for sentence production simulation and for a model of language acquisition as associative learning are discussed. HOPE is a model that performs in a normal state and includes a lesion simulation facility. HOPE is also a research tool. Its modifiability and use as a tool to investigate hypothesized causes of degradation in comprehension performance by aphasic patients are described. Issues of using behavioral constraints in modelling and obtaining appropriate data for simulated process modelling are discussed. Finally, problems of validation of the simulation results are raised, and issues of how to interpret clinical results to define the evolution of the model are discussed. Conclusions with respect to the feasibility of artificial intelligence simulation process modelling are discussed based on the current state of research.
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.
Revalidation of the Selection Instrument for Flight Training
2017-07-01
AACog) composite, as measured by the following SIFT subscales: Mechanical Comprehension Test (MCT), Math Skills Test (MST), and Reading Comprehension Test. The MCT assesses knowledge of applied mechanical science (varies/15 minutes); the MST assesses the examinee's computational skill and mathematical aptitude. [Subscale statistics table (Mechanical Comprehension, Math Skills, Reading Comprehension; n = 463 each) not recoverable from the extracted text.]
A History of Rotorcraft Comprehensive Analyses
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2013-01-01
A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.
Tracking interface and common curve dynamics for two-fluid flow in porous media
Mcclure, James E.; Miller, Cass T.; Gray, W. G.; ...
2016-04-29
Pore-scale studies of multiphase flow in porous medium systems can be used to understand transport mechanisms and quantitatively determine closure relations that better incorporate microscale physics into macroscale models. Multiphase flow simulators constructed using the lattice Boltzmann method provide a means to conduct such studies, including both the equilibrium and dynamic aspects. Moving, storing, and analyzing the large state space presents a computational challenge when highly resolved models are applied. We present an approach to simulate multiphase flow processes in which in-situ analysis is applied to track multiphase flow dynamics at high temporal resolution. We compute a comprehensive set of measures of the phase distributions and the system dynamics, which can be used to aid fundamental understanding and inform closure relations for macroscale models. The measures computed include microscale point representations and macroscale averages of fluid saturations, the pressure and velocity of the fluid phases, interfacial areas, interfacial curvatures, interface and common curve velocities, interfacial orientation tensors, phase velocities and the contact angle between the fluid-fluid interface and the solid surface. Test cases are studied to validate the approach and illustrate how measures of system state can be obtained and used to inform macroscopic theory.
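In the simplest case, the macroscale saturation averages described above reduce to voxel counts over a segmented phase field. A minimal sketch in Python/NumPy, using a toy labeled voxel array rather than the authors' lattice Boltzmann state (the labels and domain here are hypothetical):

```python
import numpy as np

# Toy 3-D voxel domain: 0 = solid, 1 = wetting fluid, 2 = non-wetting fluid.
# (Hypothetical labels; a real LBM simulator's state layout will differ.)
rng = np.random.default_rng(0)
domain = rng.choice([0, 1, 2], size=(20, 20, 20), p=[0.4, 0.36, 0.24])

pore = domain != 0                      # pore-space mask
s_w = (domain == 1).sum() / pore.sum()  # wetting-phase saturation
s_n = (domain == 2).sum() / pore.sum()  # non-wetting-phase saturation

# The two saturations partition the pore space by construction.
assert abs(s_w + s_n - 1.0) < 1e-12
```

The same counting pattern extends to the other averaged measures (per-phase mean pressure or velocity) by masking the corresponding field arrays with `domain == label`.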
Dines, David M; Gulotta, Lawrence; Craig, Edward V; Dines, Joshua S
Complex glenoid deformities present the most difficult challenges in shoulder arthroplasty. These deformities may be caused by severe degenerative or congenital deformity, posttraumatic anatomy, tumor, or, in most cases, bone loss after glenoid failure in anatomical total shoulder arthroplasty. Methods of treating severe bone loss have included inlay and onlay bone-graft options with augmented fixation and, in severe cases, salvage hemiarthroplasty with limited goal expectations. Unfortunately, none of these methods has provided consistently satisfactory reproducible results. In fact, each has had high rates of failure and complications. Borrowing from hip and knee arthroplasty surgeons' experience in using CAD/CAM (computer-aided design/computer-aided manufacturing) patient-specific implants to fill significant bony defects, Dr. D. M. Dines developed a glenoid vault reconstruction system (VRS) in conjunction with the Comprehensive Shoulder Arthroplasty System (Zimmer Biomet) to treat severe glenoid bone deficiency. The VRS is patient-specific; each case involves making a preoperative plan based on 3-dimensional reconstruction of a 2-dimensional computed tomography scan. The patient-specific implant and instrumentation that are created are used to reconstruct the deformed glenoid in an anatomical position without the need for additional bone graft. In this article, we describe the implant and its indications, technical aspects of production, and surgical technique.
Geytenbeek, Joke J M; Vermeulen, R Jeroen; Becher, Jules G; Oostrom, Kim J
2015-03-01
To assess spoken language comprehension in non-speaking children with severe cerebral palsy (CP) and to explore possible associations with motor type and disability. Eighty-seven non-speaking children (44 males, 43 females, mean age 6y 8mo, SD 2y 1mo) with spastic (54%) or dyskinetic (46%) CP (Gross Motor Function Classification System [GMFCS] levels IV [39%] and V [61%]) underwent spoken language comprehension assessment with the computer-based instrument for low motor language testing (C-BiLLT), a new and validated diagnostic instrument. A multiple linear regression model was used to investigate which variables explained the variation in C-BiLLT scores. Associations between spoken language comprehension abilities (expressed in z-score or age-equivalent score) and motor type of CP, GMFCS and Manual Ability Classification System (MACS) levels, gestational age, and epilepsy were analysed with Fisher's exact test. A p-value <0.05 was considered statistically significant. Chronological age, motor type, and GMFCS classification explained 33% (R = 0.577, R² = 0.33) of the variance in spoken language comprehension. Of the children aged younger than 6 years 6 months, 52.4% of the children with dyskinetic CP attained comprehension scores within the average range (z-score ≥ -1.6) as opposed to none of the children with spastic CP. Of the children aged older than 6 years 6 months, 32% of the children with dyskinetic CP reached the highest achievable age-equivalent score compared to 4% of the children with spastic CP. No significant difference in disability was found between CP-related variables (MACS levels, gestational age, epilepsy), with the exception of GMFCS, which showed a significant difference in children aged younger than 6 years 6 months (p=0.043). Despite communication disabilities in children with severe CP, particularly in dyskinetic CP, spoken language comprehension may show no or only moderate delay.
These findings emphasize the importance of introducing alternative and/or augmentative communication devices from early childhood. © 2014 Mac Keith Press.
MannDB: A microbial annotation database for protein characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, C; Lam, M; Smith, J
2006-05-19
MannDB was created to meet a need for rapid, comprehensive automated protein sequence analyses to support selection of proteins suitable as targets for driving the development of reagents for pathogen or protein toxin detection. Because a large number of open-source tools were needed, it was necessary to produce a software system to scale the computations for whole-proteome analysis. Thus, we built a fully automated system for executing software tools and for storage, integration, and display of automated protein sequence analysis and annotation data. MannDB is a relational database that organizes data resulting from fully automated, high-throughput protein-sequence analyses using open-source tools. Types of analyses provided include predictions of cleavage, chemical properties, classification, features, functional assignment, post-translational modifications, motifs, antigenicity, and secondary structure. Proteomes (lists of hypothetical and known proteins) are downloaded and parsed from Genbank and then inserted into MannDB, and annotations from SwissProt are downloaded when identifiers are found in the Genbank entry or when identical sequences are identified. Currently 36 open-source tools are run against MannDB protein sequences either on local systems or by means of batch submission to external servers. In addition, BLAST against protein entries in MvirDB, our database of microbial virulence factors, is performed. A web client browser enables viewing of computational results and downloaded annotations, and a query tool enables structured and free-text search capabilities. When available, links to external databases, including MvirDB, are provided. MannDB contains whole-proteome analyses for at least one representative organism from each category of biological threat organism listed by APHIS, CDC, HHS, NIAID, USDA, USFDA, and WHO.
MannDB comprises a large number of genomes and comprehensive protein sequence analyses representing organisms listed as high-priority agents on the websites of several governmental organizations concerned with bio-terrorism. MannDB provides the user with a BLAST interface for comparison of native and non-native sequences and a query tool for conveniently selecting proteins of interest. In addition, the user has access to a web-based browser that compiles comprehensive and extensive reports.
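The fan-out pattern the abstract describes (many analysis tools applied to every protein sequence, with results stored per protein and tool) can be sketched in a few lines of Python; the tool functions and sequences below are hypothetical stand-ins, not MannDB's actual wrappers:

```python
# Hypothetical stand-ins for the open-source analysis tools MannDB runs.
TOOLS = {
    "length": len,
    "cysteine_count": lambda s: s.count("C"),
    "hydrophobic_frac": lambda s: sum(c in "AVLIMFWY" for c in s) / len(s),
}

# Toy proteome: protein id -> amino-acid sequence (invented examples).
proteome = {"P001": "MKTAYIAKQR", "P002": "MCCGVLLW"}

# Run every tool on every sequence, keyed by (protein id, tool name),
# mirroring how a relational store would index annotation rows.
annotations = {
    (pid, tool): fn(seq)
    for pid, seq in proteome.items()
    for tool, fn in TOOLS.items()
}

print(annotations[("P002", "cysteine_count")])  # 2
```

In a production pipeline each "tool" would be a subprocess or batch submission to an external server, with results parsed into rows of the same (protein, tool, value) shape.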
Massive Cloud-Based Big Data Processing for Ocean Sensor Networks and Remote Sensing
NASA Astrophysics Data System (ADS)
Schwehr, K. D.
2017-12-01
Until recently, the work required to integrate and analyze data for global-scale environmental issues was prohibitive both in cost and availability. Traditional desktop processing systems are not able to effectively store and process all the data, and supercomputer solutions are financially out of the reach of most people. The availability of large-scale cloud computing has created tools that are usable by small groups and individuals regardless of financial resources or locally available computational resources. These systems give scientists and policymakers the ability to see how critical resources are being used across the globe with little or no barrier to entry. Google Earth Engine has the Moderate Resolution Imaging Spectroradiometer (MODIS) Terra, MODIS Aqua, and Global Land Data Assimilation Systems (GLDAS) data catalogs available live online. Here we demonstrate these data to calculate the correlation between lagged chlorophyll and rainfall to identify areas of eutrophication, matching these events to ocean currents from datasets like the HYbrid Coordinate Ocean Model (HYCOM) to check whether there are constraints from oceanographic configurations. The system can provide additional ground truth with observations from sensor networks like the International Comprehensive Ocean-Atmosphere Data Set / Voluntary Observing Ship (ICOADS/VOS) and Argo floats. This presentation is intended to introduce users to the datasets, programming idioms, and functionality of Earth Engine for large-scale, data-driven oceanography.
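The lagged chlorophyll-rainfall correlation mentioned above can be sketched outside Earth Engine on synthetic monthly series; the series, the two-month response, and all parameters below are invented for illustration:

```python
import numpy as np

def lagged_corr(rain, chl, lag):
    """Pearson correlation of rain[t] with chl[t + lag]."""
    if lag > 0:
        rain, chl = rain[:-lag], chl[lag:]
    return np.corrcoef(rain, chl)[0, 1]

# Synthetic monthly rainfall and a chlorophyll signal that responds
# two months later (plus noise) -- a stand-in for the MODIS/GLDAS series.
rng = np.random.default_rng(42)
rain = rng.gamma(2.0, 50.0, size=120)
chl = np.roll(0.01 * rain, 2) + rng.normal(0.0, 0.2, size=120)

# Scan candidate lags; the injected 2-month lag should score highest.
best_lag = max(range(6), key=lambda k: lagged_corr(rain, chl, k))
print(best_lag)
```

In Earth Engine the same idea would be expressed with per-pixel reducers over shifted image collections rather than NumPy arrays.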
Thompson, Robert; Tanimoto, Steve; Lyman, Ruby Dawn; Geselowitz, Kira; Begay, Kristin Kawena; Nielsen, Kathleen; Nagy, William; Abbott, Robert; Raskind, Marshall; Berninger, Virginia
2018-05-01
Children in grades 4 to 6 ( N =14) who despite early intervention had persisting dyslexia (impaired word reading and spelling) were assessed before and after computerized reading and writing instruction aimed at subword, word, and syntax skills shown in four prior studies to be effective for treating dyslexia. During the 12 two-hour sessions once a week after school they first completed HAWK Letters in Motion© for manuscript and cursive handwriting, HAWK Words in Motion© for phonological, orthographic, and morphological coding for word reading and spelling, and HAWK Minds in Motion© for sentence reading comprehension and written sentence composing. A reading comprehension activity in which sentences were presented one word at a time or one added word at a time was introduced. Next, to instill hope they could overcome their struggles with reading and spelling, they read and discussed stories about the struggles of Buckminster Fuller, who overcame early disabilities to make important contributions to society. Finally, they engaged in the new Kokopelli's World (KW)©, blocks-based online lessons, to learn computer coding in introductory programming by creating stories in sentence blocks (Tanimoto and Thompson 2016). Participants improved significantly in hallmark word decoding and spelling deficits of dyslexia, three syntax skills (oral construction, listening comprehension, and written composing), reading comprehension (with decoding as covariate), handwriting, orthographic and morphological coding, orthographic loop, and inhibition (focused attention). They answered more reading comprehension questions correctly when they had read sentences presented one word at a time (eliminating both regressions out and regressions in during saccades) than when presented one added word at a time (eliminating only regressions out during saccades). Indicators of improved self-efficacy that they could learn to read and write were observed.
Reminders to pay attention and stay on task needed before adding computer coding were not needed after computer coding was added.
The Comprehension of Rapid Speech by the Blind, Part III.
ERIC Educational Resources Information Center
Foulke, Emerson
A review of the research on the comprehension of rapid speech by the blind identifies five methods of speech compression: speech changing, electromechanical sampling, computer sampling, speech synthesis, and frequency dividing with the harmonic compressor. The speech changing and electromechanical sampling methods and the necessary apparatus have…
Prediction, Error, and Adaptation during Online Sentence Comprehension
ERIC Educational Resources Information Center
Fine, Alex Brabham
2013-01-01
A fundamental challenge for human cognition is perceiving and acting in a world in which the statistics that characterize available sensory data are non-stationary. This thesis focuses on this problem specifically in the domain of sentence comprehension, where linguistic variability poses computational challenges to the processes underlying…
Keeping It Simple: The Case for E-Mail.
ERIC Educational Resources Information Center
Haimovic, Gila
The Open University of Israel (OUI) is a distance education institution that offers over 250 computer-mediated courses through the Internet. All OUI students must pass an English reading comprehension exemption exam or take the University's English reading comprehension courses. Because reading instruction differs from content instruction,…
Computer Assisted Instruction to Promote Comprehension in Students with Learning Disabilities
ERIC Educational Resources Information Center
Stetter, Maria Earman; Hughes, Marie Tejero
2011-01-01
Reading comprehension is a crucial skill for academic success of all students. Very often, students with learning disabilities struggle with reading skills and since students learn new information in school by reading; these difficulties often increase the academic struggles students with learning disabilities face. The current study examined…
ERIC Educational Resources Information Center
Poitras, Eric; Trevors, Gregory
2012-01-01
Planning, conducting, and reporting leading-edge research requires professionals who are capable of highly skilled reading. This study reports the development of an empirically informed computer-based learning environment designed to foster the acquisition of reading comprehension strategies that mediate expertise in the social sciences. Empirical…
The Role of Working Memory in Metaphor Production and Comprehension
ERIC Educational Resources Information Center
Chiappe, Dan L.; Chiappe, Penny
2007-01-01
The following tested Kintsch's [Kintsch, W. (2000). "Metaphor comprehension: a computational theory." "Psychonomic Bulletin & Review," 7, 257-266 and Kintsch, W. (2001). "Predication." "Cognitive Science," 25, 173-202] Predication Model, which predicts that working memory capacity is an important factor in metaphor processing. In support of his…
Effect of Hypertextual Reading on Academic Success and Comprehension Skills
ERIC Educational Resources Information Center
Durukan, Erhan
2014-01-01
As computer technology developed, hypertexts emerged as an influential environment for developing language skills. This study aims to evaluate a text prepared in a hypertextual environment and its effects on academic success and comprehension skills. In this study, "preliminary test final test control group experimental pattern" was used…
Event-Based Plausibility Immediately Influences On-Line Language Comprehension
ERIC Educational Resources Information Center
Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L.; Scheepers, Christoph; McRae, Ken
2011-01-01
In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional…
Twelfth NASTRAN (R) Users' Colloquium
NASA Technical Reports Server (NTRS)
1984-01-01
NASTRAN is a large, comprehensive, nonproprietary, general-purpose finite element computer code for structural analysis. The Twelfth Users' Colloquium provides some comprehensive papers on the application of finite element methods in engineering, comparisons with other approaches, unique applications, pre- and post-processing or auxiliary programs, and new methods of analysis with NASTRAN.
34 CFR 303.115 - Comprehensive child find system.
Code of Federal Regulations, 2014 CFR
2014-07-01
Components of a Statewide System, § 303.115 Comprehensive child find system. Each system must include a comprehensive child find system that meets the requirements in §§ 303.302 and 303.303. (Approved by Office of...
NASA Astrophysics Data System (ADS)
Kuzucu, H.
1992-11-01
Modern defense systems depend on comprehensive surveillance capability. The ability to detect and locate radio signals is a major element of a surveillance system. With the increasing need for more mobile surveillance systems in conjunction with the rapid deployment of forces and the advent of technology allowing more enhanced use of small-aperture systems, tactical direction finding (DF) and radiolocation systems will have to be operated in diverse operational conditions. A quick assessment of the expected error levels and an evaluation of the reliability of fixes on the targeted areas are of crucial importance to the effectiveness of missions relying on DF data. This paper presents a sophisticated, graphics-workstation-based computer tool developed for the system-level analysis of radio communication systems and describes its use in radiolocation applications for realizing such accurate and realistic assessments with substantial savings in money and time.
XXV IUPAP Conference on Computational Physics (CCP2013): Preface
NASA Astrophysics Data System (ADS)
2014-05-01
XXV IUPAP Conference on Computational Physics (CCP2013) was held from 20-24 August 2013 at the Russian Academy of Sciences in Moscow, Russia. The annual Conferences on Computational Physics (CCP) present an overview of the most recent developments and opportunities in computational physics across a broad range of topical areas. The CCP series aims to draw computational scientists from around the world and to stimulate interdisciplinary discussion and collaboration by bringing together researchers interested in various fields of computational science. It is organized under the auspices of the International Union of Pure and Applied Physics and has been in existence since 1989. The CCP series alternates between Europe, America and Asia-Pacific. The conferences are traditionally supported by the European Physical Society and the American Physical Society. This year the Conference host was the Landau Institute for Theoretical Physics. The Conference contained 142 presentations, including 11 plenary talks with comprehensive reviews of topics ranging from airbursts to many-electron systems. We would like to take this opportunity to thank our sponsors: International Union of Pure and Applied Physics (IUPAP), European Physical Society (EPS), Division of Computational Physics of American Physical Society (DCOMP/APS), Russian Foundation for Basic Research, Department of Physical Sciences of Russian Academy of Sciences, RSC Group company. Further conference information and images from the conference are available in the PDF.
Are Technology Interruptions Impacting Your Bottom Line? An Innovative Proposal for Change.
Ledbetter, Tamera; Shultz, Sarah; Beckham, Roxanne
2017-10-01
Nursing interruptions are a costly and dangerous variable in acute care hospitals. Malfunctioning technology equipment interrupts nursing care and prevents full utilization of computer safety systems to prevent patient care errors. This paper identifies an innovative approach to nursing interruptions related to computer and computer cart malfunctions. The impact on human resources is defined, and outcome measures are proposed. A multifaceted proposal, based on a literature review, aimed at reducing nursing interruptions is presented. This proposal is expected to increase patient safety, as well as patient and nurse satisfaction. Setting: acute care hospitals utilizing electronic medical records and bar-coded medication administration technology. Participants: nurses, information technology staff, nursing informatics staff, and all leadership teams affected by technology problems and their proposed solutions. Literature from multiple fields was reviewed to evaluate research related to computer/computer cart failures and the approaches used to resolve these issues. Outcomes measured included strategic goals related to patient safety and to nurse and patient satisfaction. Specific help desk metrics will demonstrate the effect of interventions. This paper addresses a gap in the literature and proposes practical and innovative solutions. A comprehensive computer and computer cart repair program is essential for patient safety, financial stewardship, and utilization of resources. © 2015 Wiley Periodicals, Inc.
Report of the theory panel. [space physics
NASA Technical Reports Server (NTRS)
Ashourabdalla, Maha; Rosner, Robert; Antiochos, Spiro; Curtis, Steven; Fejer, B.; Goertz, Christoph K.; Goldstein, Melvyn L.; Holzer, Thomas E.; Jokipii, J. R.; Lee, Lou-Chuang
1991-01-01
The ultimate goal of this research is to develop an understanding which is sufficiently comprehensive to allow realistic predictions of the behavior of the physical systems. Theory has a central role to play in the quest for this understanding. The level of theoretical description is dependent on three constraints: (1) the available computer hardware may limit both the number and the size of physical processes the model system can describe; (2) the fact that some natural systems may only be described in a statistical manner; and (3) the fact that some natural systems may be observable only through remote sensing, which is intrinsically limited by spatial resolution and line-of-sight integration. On this basis, the report discusses present accomplishments and future goals of theoretical space physics. Finally, the development and use of new supercomputers are examined.
The Mission Operations Planning Assistant
NASA Technical Reports Server (NTRS)
Schuetzle, James G.
1987-01-01
The Mission Operations Planning Assistant (MOPA) is a knowledge-based system developed to support the planning and scheduling of instrument activities on the Upper Atmospheric Research Satellite (UARS). The MOPA system represents and maintains instrument plans at two levels of abstraction in order to keep plans comprehensible to both UARS Principal Investigators and Command Management personnel. The hierarchical representation of plans also allows MOPA to automatically create detailed instrument activity plans from which spacecraft command loads may be generated. The MOPA system was developed on a Symbolics 3640 computer using the ZetaLisp and ART languages. MOPA's features include a textual and graphical interface for plan inspection and modification, recognition of instrument operational constraint violations during the planning process, and consistency maintenance between the different planning levels. This paper describes the current MOPA system.
A Comprehensive Payment Model for Short- and Long-Stay Psychiatric Patients
Fries, Brant E.; Durance, Paul W.; Nerenz, David R.; Ashcraft, Marie L.F.
1993-01-01
In this article, a payment model is developed for a hospital system with both acute- and chronic-stay psychiatric patients. “Transition pricing” provides a balance between the incentives of an episode-based system and the necessity of per diem long-term payments. Payment is dependent on two new psychiatric resident classification systems for short- and long-term stays. Data on per diem cost of inpatient care, by day of stay, was computed from a sample of 2,968 patients from 100 psychiatric units in 51 Department of Veterans Affairs (VA) Medical Centers. Using a 9-month cohort of all VA psychiatric discharges nationwide (79,337 with non-chronic stays), profits and losses were simulated. PMID:10135343
Analysis, calculation and utilization of the k-balance attribute in interdependent networks
NASA Astrophysics Data System (ADS)
Liu, Zheng; Li, Qing; Wang, Dan; Xu, Mingwei
2018-05-01
Interdependent networks, in which two networks depend on each other, are becoming increasingly significant in modern systems. Previous work has shown that interdependent networks are more vulnerable than single networks, so their robustness deserves special attention. In this paper, we propose a robustness metric from a new perspective: balance. First, we define the balance-coefficient of an interdependent system. Through precise analysis and derivation, we prove several key theorems and provide an efficient algorithm to compute the balance-coefficient. Finally, we propose an optimal solution that reduces the balance-coefficient in order to enhance the robustness of a given system. Comprehensive experiments confirm the efficiency of our algorithms.
Information transfer satellite concept study. Volume 4: computer manual
NASA Technical Reports Server (NTRS)
Bergin, P.; Kincade, C.; Kurpiewski, D.; Leinhaupel, F.; Millican, F.; Onstad, R.
1971-01-01
The Satellite Telecommunications Analysis and Modeling Program (STAMP) provides the user with a flexible and comprehensive tool for the analysis of ITS system requirements. While obtaining minimum cost design points, the program enables the user to perform studies over a wide range of user requirements and parametric demands. The program utilizes a total system approach wherein the ground uplink and downlink, the spacecraft, and the launch vehicle are simultaneously synthesized. A steepest descent algorithm is employed to determine the minimum total system cost design subject to the fixed user requirements and imposed constraints. In the process of converging to the solution, the pertinent subsystem tradeoffs are resolved. This report documents STAMP through a technical analysis and a description of the principal techniques employed in the program.
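The steepest-descent search described above can be sketched in miniature. The quadratic "total cost" surrogate, the box constraints, and the fixed step size below are illustrative stand-ins, not STAMP's actual system-cost model or constraint handling:

```python
import numpy as np

# Minimal projected steepest-descent sketch. A simple box projection
# stands in for the imposed constraints; the cost surrogate is a toy
# convex quadratic, not STAMP's ITS system-cost model.

def steepest_descent(grad, x0, lower, upper, step=0.1, iters=500):
    """Iterate x <- x - step * grad(x), projecting onto [lower, upper]."""
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    for _ in range(iters):
        x = x - step * grad(x)        # move against the gradient
        x = np.clip(x, lower, upper)  # project back into the feasible box
    return x

# Toy convex "system cost" with its unconstrained minimum at (1, 2)
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])
x_opt = steepest_descent(grad, x0=[5.0, 5.0], lower=0.0, upper=10.0)
```

In STAMP's setting the design vector would bundle ground-link, spacecraft, and launch-vehicle parameters so that, as the abstract notes, the subsystem tradeoffs are resolved jointly during convergence.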
The Integrated Waste Tracking System - A Flexible Waste Management Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Robert Stephen
2001-02-01
The US Department of Energy (DOE) Idaho National Engineering and Environmental Laboratory (INEEL) has fully embraced a flexible, computer-based tool to help increase waste management efficiency and integrate multiple operational functions, from waste generation through waste disposition, while reducing cost. The Integrated Waste Tracking System (IWTS) provides comprehensive information management for containerized waste during generation, storage, treatment, transport, and disposal. The IWTS provides all information necessary for facilities to properly manage and demonstrate regulatory compliance. As a platform-independent, client-server, and Web-based inventory and compliance system, the IWTS has proven to be a successful tracking, characterization, compliance, and reporting tool that meets the needs of both operations and management while providing a high level of management flexibility.
Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J
2017-11-01
Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view of the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offers new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions.
Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
34 CFR 303.321 - Comprehensive child find system.
Code of Federal Regulations, 2011 CFR
2011-07-01
§ 303.321 Comprehensive child find system. (a) General. (1) Each system must include a comprehensive child find system that is consistent with part B of the Act (see 34...
Clinical applications of cone beam computed tomography in endodontics: A comprehensive review.
Cohenca, Nestor; Shemesh, Hagay
2015-06-01
Cone beam computed tomography (CBCT) is a new technology that produces three-dimensional (3D) digital imaging at reduced cost and with less radiation for the patient than traditional CT scans. It also delivers faster and easier image acquisition. Because CBCT provides a 3D representation of the maxillofacial tissues in a cost- and dose-efficient manner, a better preoperative assessment can be obtained for diagnosis and treatment. This comprehensive review presents current applications of CBCT in endodontics. Specific case examples illustrate the difference in treatment planning with traditional periapical radiography versus CBCT technology.
Comprehensive Thematic T-Matrix Reference Database: A 2014-2015 Update
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Zakharova, Nadezhda; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas
2015-01-01
The T-matrix method is one of the most versatile and efficient direct computer solvers of the macroscopic Maxwell equations and is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper is the seventh update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated by us in 2004 and includes relevant publications that have appeared since 2013. It also lists a number of earlier publications overlooked previously.
Development of a COTS-Based Computing Environment Blueprint Application at KSC
NASA Technical Reports Server (NTRS)
Ghansah, Isaac; Boatright, Bryan
1996-01-01
This paper describes a blueprint that can be used for developing a distributed computing environment (DCE) for NASA in general, and the Kennedy Space Center (KSC) in particular. A comprehensive, open, secure, integrated, and multi-vendor DCE such as OSF DCE has been suggested. Design issues, as well as recommendations for each component have been given. Where necessary, modifications were suggested to fit the needs of KSC. This was done in the areas of security and directory services. Readers requiring a more comprehensive coverage are encouraged to refer to the eight-chapter document prepared for this work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules, including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.
Kamatuka, Kenta; Hattori, Masahiro; Sugiyama, Tomoyasu
2016-12-01
RNA interference (RNAi) screening is extensively used in the field of reverse genetics. RNAi libraries constructed using random oligonucleotides have made this technology affordable. However, the new methodology requires exploration of the RNAi target gene information after screening, because the RNAi library includes non-natural sequences that are not found in genes. Here, we developed a web-based tool to support RNAi screening. The system performs short hairpin RNA (shRNA) target prediction that is informed by comprehensive enquiry (SPICE). SPICE automates several tasks that are laborious but indispensable for evaluating the shRNAs obtained by RNAi screening. SPICE has four main functions: (i) identification of the shRNA sequence in the input sequence (the sequence might be obtained by sequencing clones in the RNAi library), (ii) searching for the target genes in the database, (iii) presenting biological information obtained from the database, and (iv) preparation of search result files that can be used on a local personal computer (PC). Using this system, we demonstrated that genes targeted by random oligonucleotide-derived shRNAs did not differ from those targeted by organism-specific shRNAs. The system facilitates RNAi screening, which requires sequence analysis after screening. The SPICE web application is available at http://www.spice.sugysun.org/.
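One SPICE subtask, identifying an shRNA stem-loop in an input sequence, can be sketched as a naive scan for a stem whose reverse complement recurs after a short loop. The stem length, loop-length range, and the example sequences below are illustrative assumptions, not SPICE's actual algorithm or data:

```python
# Hedged sketch of shRNA (stem-loop) detection in an input sequence.
# Parameters and sequences are illustrative, not SPICE's real pipeline.

COMP = str.maketrans("ACGT", "TGCA")

def revcomp(s):
    """Reverse complement of a DNA string."""
    return s.translate(COMP)[::-1]

def find_hairpin(seq, stem=21, loop_range=(4, 10)):
    """Return (start, stem_seq) of the first stem-loop found, else None.

    A hit is a stem of `stem` nt whose reverse complement re-occurs
    after a loop of loop_range[0]..loop_range[1] nt.
    """
    seq = seq.upper()
    for i in range(len(seq) - 2 * stem - loop_range[0] + 1):
        stem_seq = seq[i:i + stem]
        anti = revcomp(stem_seq)
        for loop in range(loop_range[0], loop_range[1] + 1):
            j = i + stem + loop
            if seq[j:j + stem] == anti:
                return i, stem_seq
    return None

sense = "GCTGACCATCGAGAACGTCAA"               # made-up 21-nt sense strand
shrna = sense + "TTCAAGAGA" + revcomp(sense)  # 9-nt loop between the stems
hit = find_hairpin(shrna)
```

In a SPICE-like workflow, the recovered stem sequence would then be queried against a gene database to attach biological annotations, the part this sketch omits.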
NASA Astrophysics Data System (ADS)
Gómez-Rodríguez, Carlos
2017-07-01
Liu et al. [1] provide a comprehensive account of research on dependency distance in human languages. While the article is a very rich and useful report on this complex subject, here I will expand on a few specific issues where research in computational linguistics (specifically natural language processing) can inform DDM research, and vice versa. These aspects have not been explored much in [1] or elsewhere, probably due to the little overlap between both research communities, but they may provide interesting insights for improving our understanding of the evolution of human languages, the mechanisms by which the brain processes and understands language, and the construction of effective computer systems to achieve this goal.
Computer-Assisted Diagnosis of the Sleep Apnea-Hypopnea Syndrome: A Review
Alvarez-Estevez, Diego; Moret-Bonillo, Vicente
2015-01-01
Automatic diagnosis of the Sleep Apnea-Hypopnea Syndrome (SAHS) has become an important area of research due to the growing interest in the field of sleep medicine and the costs associated with its manual diagnosis. The increment and heterogeneity of the different techniques, however, make it somewhat difficult to adequately follow the recent developments. A literature review within the area of computer-assisted diagnosis of SAHS has been performed comprising the last 15 years of research in the field. Screening approaches, methods for the detection and classification of respiratory events, comprehensive diagnostic systems, and an outline of current commercial approaches are reviewed. An overview of the different methods is presented together with validation analysis and critical discussion of the current state of the art. PMID:26266052
Precision Modeling Of Targets Using The VALUE Computer Program
NASA Astrophysics Data System (ADS)
Hoffman, George A.; Patton, Ronald; Akerman, Alexander
1989-08-01
The 1976-vintage LASERX computer code has been augmented to produce realistic electro-optical images of targets. Capabilities lacking in LASERX but recently incorporated into its VALUE successor include: shadows cast onto the ground; shadows cast onto parts of the target; see-through transparencies (e.g., canopies); apparent images due both to atmospheric scattering and turbulence; and surfaces characterized by multiple bi-directional reflectance functions. VALUE not only provides realistic target modeling through its precise and comprehensive representation of all target attributes, but is also very user friendly. Specifically, runs are set up through screen-prompting menus in a sequence of queries that is logical to the user. VALUE also incorporates the Optical Encounter (OPEC) software developed by Tricor Systems, Inc., Elgin, IL.
Jung, HaRim; Song, MoonBae; Youn, Hee Yong; Kim, Ung Mo
2015-01-01
A content-matched (CM) range monitoring query over moving objects continually retrieves the moving objects (i) whose non-spatial attribute values are matched to given non-spatial query values; and (ii) that are currently located within a given spatial query range. In this paper, we propose a new query indexing structure, called the group-aware query region tree (GQR-tree) for efficient evaluation of CM range monitoring queries. The primary role of the GQR-tree is to help the server leverage the computational capabilities of moving objects in order to improve the system performance in terms of the wireless communication cost and server workload. Through a series of comprehensive simulations, we verify the superiority of the GQR-tree method over the existing methods. PMID:26393613
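The content-matched range query itself is easy to state as a naive server-side filter; the paper's contribution, the GQR-tree index and the offloading of work to the moving objects, is precisely what avoids this kind of full scan, and the sketch below deliberately omits it:

```python
# Naive evaluation of a content-matched (CM) range query: filter by
# non-spatial attribute equality, then by a rectangular spatial range.
# The GQR-tree index from the paper is intentionally not modeled here.

from dataclasses import dataclass, field

@dataclass
class MovingObject:
    oid: int
    x: float
    y: float
    attrs: dict = field(default_factory=dict)  # non-spatial attributes

def cm_range_query(objects, query_attrs, rect):
    """Ids of objects matching query_attrs and lying inside
    rect = (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = rect
    return [
        o.oid for o in objects
        if all(o.attrs.get(k) == v for k, v in query_attrs.items())
        and xmin <= o.x <= xmax and ymin <= o.y <= ymax
    ]

objs = [
    MovingObject(1, 2.0, 3.0, {"type": "taxi"}),  # matches attrs and range
    MovingObject(2, 2.5, 3.5, {"type": "bus"}),   # in range, wrong attrs
    MovingObject(3, 9.0, 9.0, {"type": "taxi"}),  # right attrs, out of range
]
hits = cm_range_query(objs, {"type": "taxi"}, (0.0, 0.0, 5.0, 5.0))
```

A monitoring server re-evaluating this filter on every location update for every query is what drives the communication cost and workload the GQR-tree is designed to reduce.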
Statistical behavior of ten million experimental detection limits
NASA Astrophysics Data System (ADS)
Voigtman, Edward; Abraham, Kevin T.
2011-02-01
Using a lab-constructed laser-excited fluorimeter, together with bootstrapping methodology, the authors have generated many millions of experimental linear calibration curves for the detection of rhodamine 6G tetrafluoroborate in ethanol solutions. The detection limits computed from them are in excellent agreement with both previously published theory and with comprehensive Monte Carlo computer simulations. Currie decision levels and Currie detection limits, each in the theoretical, chemical content domain, were found to be simply scaled reciprocals of the non-centrality parameter of the non-central t distribution that characterizes univariate linear calibration curves that have homoscedastic, additive Gaussian white noise. Accurate and precise estimates of the theoretical, content domain Currie detection limit for the experimental system, with 5% (each) probabilities of false positives and false negatives, are presented.
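For the simpler known-sigma, homoscedastic case, the content-domain Currie decision level and detection limit reduce to closed forms that a quick Monte Carlo run can verify. The slope and noise values below are made up, and the paper's non-central-t treatment (sigma estimated from the calibration data) is more involved than this sketch:

```python
import numpy as np

# Sketch of the content-domain Currie decision level L_C and detection
# limit L_D for a linear calibration with KNOWN homoscedastic Gaussian
# noise, alpha = beta = 0.05. Slope m and noise sigma are made-up values.
#   L_C = z_0.95 * sigma / m
#   L_D = 2 * z_0.95 * sigma / m

rng = np.random.default_rng(0)
m, sigma = 2.0, 0.1      # assumed calibration slope and noise s.d.
z95 = 1.6449             # 95th percentile of the standard normal

L_C = z95 * sigma / m
L_D = 2.0 * z95 * sigma / m

# Monte Carlo check: blanks exceed L_C ~5% of the time (false positives),
# and samples at true content L_D fall below L_C ~5% of the time (misses).
n = 200_000
blanks = rng.normal(0.0, sigma, n) / m   # blank readings in content units
false_pos = np.mean(blanks > L_C)
misses = np.mean(rng.normal(m * L_D, sigma, n) / m < L_C)
```

When sigma must instead be estimated from the calibration curve itself, the normal quantiles are replaced by the non-central-t machinery the abstract refers to, which is where the scaled-reciprocal relationship to the non-centrality parameter arises.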