Organizational Strategies for End-User Computing Support.
ERIC Educational Resources Information Center
Blackmun, Robert R.; And Others
1988-01-01
Effective support for end users of computers has been an important issue in higher education from the first applications of general purpose mainframe computers through minicomputers, microcomputers, and supercomputers. The development of end user support is reviewed and organizational models are examined. (Author/MLW)
Electronic decision support in general practice. What's the hold up?
Liaw, S T; Schattner, P
2003-11-01
The uptake of computers in Australian general practice has been mainly for administrative use and prescribing, but the development of electronic decision support (EDS) has been particularly slow. Therefore, computers are not being used to their full potential in assisting general practitioners to care for their patients. This article examines current barriers to EDS in general practice and possible strategies to increase its uptake. Barriers to the uptake of EDS include the lack of a business case, the shifting of costs for data collection and management to the clinician, uncertainty about the optimal level of decision support, a lack of technical and semantic standards, and resistance to EDS use by the time-conscious GP. There is a need for a more strategic and attractive incentives program, greater national coordination, and more effective collaboration between government, the computer industry and the medical profession if the current inertia is to be overcome.
Generalizations of polylogarithms for Feynman integrals
NASA Astrophysics Data System (ADS)
Bogner, Christian
2016-10-01
In this talk, we discuss recent progress in the application of generalizations of polylogarithms to the symbolic computation of multi-loop integrals. We briefly review the Maple program MPL, which supports a certain approach for the computation of Feynman integrals in terms of multiple polylogarithms. Furthermore, we discuss elliptic generalizations of polylogarithms, which have been shown to be useful in the computation of the massive two-loop sunrise integral.
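As background (a standard definition from the literature, stated here for orientation and not specific to this talk): the multiple polylogarithms in question can be written as nested sums,

\[ \mathrm{Li}_{n_1,\ldots,n_k}(x_1,\ldots,x_k) \;=\; \sum_{i_1 > i_2 > \cdots > i_k \geq 1} \frac{x_1^{i_1}}{i_1^{n_1}} \cdots \frac{x_k^{i_k}}{i_k^{n_k}}, \]

which reduce to the classical polylogarithm \(\mathrm{Li}_n(x) = \sum_{i \geq 1} x^i / i^n\) for k = 1.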
ERIC Educational Resources Information Center
Worrell, Jamie; Duffy, Mary Lou; Brady, Michael P.; Dukes, Charles; Gonzalez-DeHass, Alyssa
2016-01-01
Many schools use computer-based testing to measure students' progress for end-of-the-year and statewide assessments. There is little research to support whether computer-based testing accurately reflects student progress, particularly among students with learning, performance, and generalization difficulties. This article summarizes an…
NASA Technical Reports Server (NTRS)
Blakely, R. L.
1973-01-01
A G189A simulation of the shuttle orbiter EC/LSS was prepared and used to study payload support capabilities. Two master program libraries of the G189A computer program were prepared for the NASA/JSC computer system. Several new component subroutines were added to the G189A program library, and many existing subroutines were revised to improve their capabilities. A number of special analyses were performed in support of a NASA/JSC shuttle orbiter EC/LSS payload support capability study.
Code of Federal Regulations, 2014 CFR
2014-10-01
... (Class A Telephone Companies). 36.311 Section 36.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION..., office equipment, and general purpose computers. (b) The expenses in these accounts are apportioned among...
Code of Federal Regulations, 2013 CFR
2013-10-01
... (Class A Telephone Companies). 36.311 Section 36.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION..., office equipment, and general purpose computers. (b) The expenses in these accounts are apportioned among...
Code of Federal Regulations, 2012 CFR
2012-10-01
... (Class A Telephone Companies). 36.311 Section 36.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION..., office equipment, and general purpose computers. (b) The expenses in these accounts are apportioned among...
Astronaut James Newman works with computers and GPS
1993-09-20
STS051-16-028 (12-22 Sept 1993) --- On Discovery's middeck, astronaut James H. Newman, mission specialist, works with an array of computers, including one devoted to Global Positioning System (GPS) operations, a general portable onboard computer displaying a tracking map, a portable audio data modem and another payload and general support computer. Newman was joined by four other NASA astronauts for almost ten full days in space.
Specifying Computer-Supported Collaboration Scripts
ERIC Educational Resources Information Center
Kobbe, Lars; Weinberger, Armin; Dillenbourg, Pierre; Harrer, Andreas; Hamalainen, Raija; Hakkinen, Paivi; Fischer, Frank
2007-01-01
Collaboration scripts facilitate social and cognitive processes of collaborative learning by shaping the way learners interact with each other. Computer-supported collaboration scripts generally suffer from the problem of being restrained to a specific learning platform. A standardization of collaboration scripts first requires a specification of…
47 CFR 69.307 - General support facilities.
Code of Federal Regulations, 2014 CFR
2014-10-01
....307 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED... computer investment used in the provision of the Line Information Database sub-element at § 69.120(b) shall be assigned to that sub-element. (b) General purpose computer investment used in the provision of the...
47 CFR 69.307 - General support facilities.
Code of Federal Regulations, 2010 CFR
2010-10-01
....307 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED... computer investment used in the provision of the Line Information Database sub-element at § 69.120(b) shall be assigned to that sub-element. (b) General purpose computer investment used in the provision of the...
47 CFR 69.307 - General support facilities.
Code of Federal Regulations, 2011 CFR
2011-10-01
....307 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED... computer investment used in the provision of the Line Information Database sub-element at § 69.120(b) shall be assigned to that sub-element. (b) General purpose computer investment used in the provision of the...
47 CFR 69.307 - General support facilities.
Code of Federal Regulations, 2012 CFR
2012-10-01
....307 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED... computer investment used in the provision of the Line Information Database sub-element at § 69.120(b) shall be assigned to that sub-element. (b) General purpose computer investment used in the provision of the...
47 CFR 69.307 - General support facilities.
Code of Federal Regulations, 2013 CFR
2013-10-01
....307 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED... computer investment used in the provision of the Line Information Database sub-element at § 69.120(b) shall be assigned to that sub-element. (b) General purpose computer investment used in the provision of the...
New Generation General Purpose Computer (GPC) compact IBM unit
NASA Technical Reports Server (NTRS)
1991-01-01
New Generation General Purpose Computer (GPC) compact IBM unit replaces a two-unit earlier generation computer. The new IBM unit is documented in table top views alone (S91-26867, S91-26868), with the onboard equipment it supports including the flight deck CRT screen and keypad (S91-26866), and next to the two earlier versions it replaces (S91-26869).
ERIC Educational Resources Information Center
Demetriadis, Stavros; Egerter, Tina; Hanisch, Frank; Fischer, Frank
2011-01-01
This study investigates the effectiveness of using peer review in the context of scripted collaboration to foster both domain-specific and domain-general knowledge acquisition in the computer science domain. Using a one-factor design with a script and a control condition, students worked in small groups on a series of computer science problems…
Ways of achieving continuous service from computers
NASA Technical Reports Server (NTRS)
Quinn, M. J., Jr.
1974-01-01
This paper outlines the methods used in the real-time computer complex to keep computers operating. Methods include selectover, high-speed restart, and low-speed restart. The hardware and software needed to implement these methods are discussed, as well as the system recovery facility, alternate device support, and timeout. In general, methods developed while supporting the Gemini, Apollo, and Skylab space missions are presented.
Program Supports Scientific Visualization
NASA Technical Reports Server (NTRS)
Keith, Stephan
1994-01-01
Primary purpose of General Visualization System (GVS) computer program is to support scientific visualization of data generated by panel-method computer program PMARC_12 (inventory number ARC-13362) on Silicon Graphics Iris workstation. Enables user to view PMARC geometries and wakes as wire frames or as light shaded objects. GVS is written in C language.
Clinical Computing in General Dentistry
Schleyer, Titus K.L.; Thyvalikakath, Thankam P.; Spallek, Heiko; Torres-Urquidy, Miguel H.; Hernandez, Pedro; Yuhaniak, Jeannie
2006-01-01
Objective: Measure the adoption and utilization of, opinions about, and attitudes toward clinical computing among general dentists in the United States. Design: Telephone survey of a random sample of 256 general dentists in active practice in the United States. Measurements: A 39-item telephone interview measuring practice characteristics and information technology infrastructure; clinical information storage; data entry and access; attitudes toward and opinions about clinical computing (features of practice management systems, barriers, advantages, disadvantages, and potential improvements); clinical Internet use; and attitudes toward the National Health Information Infrastructure. Results: The authors successfully screened 1,039 of 1,159 randomly sampled U.S. general dentists in active practice (89.6% response rate). Two hundred fifty-six (24.6%) respondents had computers at chairside and thus were eligible for this study. The authors successfully interviewed 102 respondents (39.8%). Clinical information associated with administration and billing, such as appointments and treatment plans, was stored predominantly on the computer; other information, such as the medical history and progress notes, primarily resided on paper. Nineteen respondents, or 1.8% of all general dentists, were completely paperless. Auxiliary personnel, such as dental assistants and hygienists, entered most data. Respondents adopted clinical computing to improve office efficiency and operations, support diagnosis and treatment, and enhance patient communication and perception. Barriers included insufficient operational reliability, program limitations, a steep learning curve, cost, and infection control issues. Conclusion: Clinical computing is being increasingly adopted in general dentistry. However, future research must address usefulness and ease of use, workflow support, infection control, integration, and implementation issues. PMID:16501177
19. VIEW OF THE GENERAL CHEMISTRY LABORATORY IN BUILDING 881. ...
19. VIEW OF THE GENERAL CHEMISTRY LABORATORY IN BUILDING 881. (4/12/62) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
2013-09-29
ISS037-E-004299 (29 Sept. 2013) --- NASA astronaut Karen Nyberg, Expedition 37 flight engineer, uses a payload and general support computer (PGSC) in the Harmony node of the International Space Station.
ERIC Educational Resources Information Center
Zaidel, Mark; Luo, XiaoHui
2010-01-01
This study investigates the efficiency of multimedia instruction at the college level by comparing the effectiveness of multimedia elements used in the computer supported learning with the cost of their preparation. Among the various technologies that advance learning, instructors and students generally identify interactive multimedia elements as…
Online Hand Holding in Fixing Computer Glitches
ERIC Educational Resources Information Center
Goldsborough, Reid
2005-01-01
According to most surveys, computer manufacturers such as HP put out reliable products, and computers in general are less troublesome than in the past. But personal computers are still prone to bugs, conflicts, viruses, spyware infestations, hacker and phishing attacks, and--most of all--user error. Unfortunately, technical support from computer…
ERIC Educational Resources Information Center
Lee, Chun-Yi; Chen, Ming-Jang; Chang, Wen-Long
2014-01-01
The aim of this study is to investigate the effects of solution methods and question prompts on generalization and justification of non-routine problem solving for Grade 9 students. The learning activities are based on the context of the frog jumping game. In addition, related computer tools were used to support generalization and justification of…
Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong
2010-10-01
Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel Core (TM) 2 Quad Q6600 CPU and a Geforce 8800GT GPU, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setup (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setup (a), 16.8 in setup (b), and 20.0 in setup (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
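The load-prediction idea described in the abstract can be illustrated with a short C/OpenMP sketch. This is an illustrative sketch under stated assumptions, not the authors' code: the throughput constants cpu_rate and gpu_rate and the run_on_gpu stub (standing in for a CUDA kernel launch) are hypothetical placeholders.

    #include <stdio.h>
    #include <omp.h>

    #define N 1000000

    /* Hypothetical measured throughputs (elements/s); a load-prediction
       scheduler would update these from earlier time steps. */
    static const double cpu_rate = 1.0e6;
    static const double gpu_rate = 5.0e6;

    /* Stub standing in for a CUDA kernel launch over elements [lo, hi). */
    static void run_on_gpu(double *v, int lo, int hi) {
        for (int i = lo; i < hi; i++) v[i] *= 2.0;   /* placeholder work */
    }

    static double v[N];

    int main(void) {
        /* Predict the split: the CPU keeps a share proportional to its rate. */
        int split = (int)(N * cpu_rate / (cpu_rate + gpu_rate));

        #pragma omp parallel
        {
            int tid = omp_get_thread_num();
            int nth = omp_get_num_threads();   /* sketch assumes nth >= 2 */
            if (tid == 0) {
                run_on_gpu(v, split, N);       /* one thread drives the GPU */
            } else {
                /* remaining threads share the CPU chunk evenly */
                int chunk = (split + nth - 2) / (nth - 1);
                int lo = (tid - 1) * chunk;
                int hi = lo + chunk < split ? lo + chunk : split;
                for (int i = lo; i < hi; i++) v[i] *= 2.0;
            }
        }
        printf("split at %d of %d elements\n", split, N);
        return 0;
    }

Dedicating one thread to the accelerator while the others process the CPU share is a common pattern for this kind of hybrid scheduling; the split ratio is the "prediction" that would be refined dynamically between time steps.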
de Wit, Niek J
2012-01-01
Computer support is considered by many to be a promising strategy for improving healthcare interventions, especially in the management of chronic diseases. So far, however, evidence of the effectiveness of ICT support in healthcare is limited. Recently, computer-supported cardiovascular disease management was compared with usual care in an RCT comprising 1100 primary care patients. This trial demonstrated that neither the clinical outcome nor the cardiovascular morbidity rate improved, even though management of the risk factors improved over 1 year of follow-up. The pragmatic design of the RCT in daily general practice may have limited the implementation of the computer support, and may also have hampered the evaluation of the cardiovascular effects. The results demonstrate that although computer support may help improve the performance of disease management, its impact on disease outcomes is questionable. ICT innovations in healthcare require rigorous investigative evaluation before their implementation in daily practice can be justified.
Computing and Office Automation: Changing Variables.
ERIC Educational Resources Information Center
Staman, E. Michael
1981-01-01
Trends in computing and office automation and their applications, including planning, institutional research, and general administrative support in higher education, are discussed. Changing aspects of information processing and an increasingly larger user community are considered. The computing literacy cycle may involve programming, analysis, use…
Sigma 2 Graphic Display Software Program Description
NASA Technical Reports Server (NTRS)
Johnson, B. T.
1973-01-01
A general purpose, user oriented graphic support package was implemented. A comprehensive description of the two software components comprising this package is given: Display Librarian and Display Controller. These programs have been implemented in FORTRAN on the XDS Sigma 2 Computer Facility. This facility consists of an XDS Sigma 2 general purpose computer coupled to a Computek Display Terminal.
Quality user support: Supporting quality users
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woolley, T.C.
1994-12-31
During the past decade, fundamental changes have occurred in technical computing in the oil industry. Technical computing systems have moved from local, fragmented quantity, to global, integrated, quality. The compute power available to the average geoscientist at his desktop has grown exponentially. Technical computing applications have increased in integration and complexity. At the same time, there has been a significant change in the work force due to the pressures of restructuring, and the increased focus on international opportunities. The profile of the user of technical computing resources has changed. Users are generally more mature, knowledgeable, and team oriented than their predecessors. In the 1990s, computer literacy is a requirement. This paper describes the steps taken by Oryx Energy Company to address the problems and opportunities created by the explosive growth in computing power and needs, coupled with the contraction of the business. A successful user support strategy will be described. Characteristics of the program include: (1) Client driven support; (2) Empowerment of highly skilled professionals to fill the support role; (3) Routine and ongoing modification to the support plan; (4) Utilization of the support assignment to create highly trained advocates on the line; (5) Integration of the support role to the reservoir management team. Results of the plan include a highly trained work force, stakeholder teams that include support personnel, and global support from a centralized support organization.
Medical Signal-Conditioning and Data-Interface System
NASA Technical Reports Server (NTRS)
Braun, Jeffrey; Jacobus, Charles; Booth, Scott; Suarez, Michael; Smith, Derek; Hartnagle, Jeffrey; LePrell, Glenn
2006-01-01
A general-purpose, portable, wearable electronic signal-conditioning and data-interface system is being developed for medical applications. The system can acquire multiple physiological signals (e.g., electrocardiographic, electroencephalographic, and electromyographic signals) from sensors on the wearer's body, digitize those signals that are received in analog form, preprocess the resulting data, and transmit the data to one or more remote location(s) via a radiocommunication link and/or the Internet. The system includes a computer running data-object-oriented software that can be programmed to configure the system to accept almost any analog or digital input signals from medical devices. The computing hardware and software implement a general-purpose data-routing-and-encapsulation architecture that supports tagging of input data and routing the data in a standardized way through the Internet and other modern packet-switching networks to one or more computer(s) for review by physicians. The architecture supports multiple-site buffering of data for redundancy and reliability, and supports both real-time and slower-than-real-time collection, routing, and viewing of signal data. Routing and viewing stations support insertion of automated analysis routines to aid in encoding, analysis, viewing, and diagnosis.
Palmer, Rebecca; Enderby, Pam
2016-10-01
The speech-language pathology profession has explored a number of approaches to support efficient delivery of interventions for people with stroke-induced aphasia. This study aimed to explore the role of volunteers in supporting self-managed practice of computerised language exercises. A qualitative interview study of the volunteer support role was carried out alongside a pilot randomised controlled trial of computer aphasia therapy. Patients with aphasia practised computer exercises tailored for them by a speech-language pathologist at home regularly for 5 months. Eight of the volunteers who supported the intervention took part in semi-structured interviews. Interviews were audio recorded, transcribed verbatim and analysed thematically. Emergent themes included: training and support requirements; perception of the volunteer role; challenges facing the volunteer, in general and specifically related to supporting computer therapy exercises. The authors concluded that volunteers helped to motivate patients to practise their computer therapy exercises and also provided support to the carers. Training and ongoing structured support of therapy activity and conduct is required from a trained speech-language pathologist to ensure the successful involvement of volunteers supporting impairment-based computer exercises in patients' own homes.
Context as Support for Learning Computer Organization
ERIC Educational Resources Information Center
Tew, Allison Elliott; Dorn, Brian; Leahy, William D., Jr.; Guzdial, Mark
2008-01-01
The ubiquity of personal computational devices in the lives of today's students presents a meaningful context for courses in computer organization beyond the general-purpose or imaginary processors routinely used. This article presents results of a comparative study examining student performance in a conventional organization course and in one…
Quantitative Investigation of the Technologies That Support Cloud Computing
ERIC Educational Resources Information Center
Hu, Wenjin
2014-01-01
Cloud computing is dramatically shaping modern IT infrastructure. It virtualizes computing resources, provides elastic scalability, serves as a pay-as-you-use utility, simplifies the IT administrators' daily tasks, enhances the mobility and collaboration of data, and increases user productivity. We focus on providing generalized black-box…
Computer-aided decision support systems for endoscopy in the gastrointestinal tract: a review.
Liedlgruber, Michael; Uhl, Andreas
2011-01-01
Today, medical endoscopy is a widely used procedure to inspect the inner cavities of the human body. The advent of endoscopic imaging techniques, allowing the acquisition of images or videos, created the possibility for the development of a whole new branch of computer-aided decision support systems. Such systems aim at helping physicians to identify possibly malignant abnormalities more accurately. At the beginning of this paper, we give a brief introduction to the history of endoscopy, followed by an introduction to the main types of endoscopes which have emerged so far (flexible endoscope, wireless capsule endoscope, and confocal laser endomicroscope). We then give a brief introduction to computer-aided decision support systems specifically targeted at endoscopy in the gastrointestinal tract. Then we present general facts and figures concerning computer-aided decision support systems and summarize work specifically targeted at computer-aided decision support in the gastrointestinal tract. This summary is followed by a discussion of some common issues concerning the approaches reviewed and suggestions of possible ways to resolve them.
COEFUV: A Computer Implementation of a Generalized Unmanned Vehicle Cost Model.
1978-10-01
DAS-TR-78-4 (UNCLASSIFIED). COEFUV: A Computer Implementation of a Generalized ... and the time to generate them are important. Many DAS participants supported this effort. The authors wish to acknowledge Richard H. Anderson for ... conflict and the on-going COMBAT ANGEL program at Davis-Monthan Air Force Base, there is not a generally accepted costing methodology for unmanned vehicles
ERIC Educational Resources Information Center
de Laat, Maarten; Lally, Vic; Lipponen, Lasse; Simons, Robert-Jan
2007-01-01
The focus of this study is to explore the advances that Social Network Analysis (SNA) can bring, in combination with other methods, when studying Networked Learning/Computer-Supported Collaborative Learning (NL/CSCL). We present a general overview of how SNA is applied in NL/CSCL research; we then go on to illustrate how this research method can…
ERIC Educational Resources Information Center
Stanford Univ., CA.
Recognizing the need to balance generality and economy in system costs, the Project INFO team at Stanford University, in developing OASIS, has sought to provide generalized and powerful computer support within the normal range of operating and analytical requirements associated with university administration. The specific design objectives of the OASIS…
Generalization Through the Recurrent Interaction of Episodic Memories
Kumaran, Dharshan; McClelland, James L.
2012-01-01
In this article, we present a perspective on the role of the hippocampal system in generalization, instantiated in a computational model called REMERGE (recurrency and episodic memory results in generalization). We expose a fundamental, but neglected, tension between prevailing computational theories that emphasize the function of the hippocampus in pattern separation (Marr, 1971; McClelland, McNaughton, & O'Reilly, 1995), and empirical support for its role in generalization and flexible relational memory (Cohen & Eichenbaum, 1993; Eichenbaum, 1999). Our account provides a means by which to resolve this conflict, by demonstrating that the basic representational scheme envisioned by complementary learning systems theory (McClelland et al., 1995), which relies upon orthogonalized codes in the hippocampus, is compatible with efficient generalization—as long as there is recurrence rather than unidirectional flow within the hippocampal circuit or, more widely, between the hippocampus and neocortex. We propose that recurrent similarity computation, a process that facilitates the discovery of higher-order relationships between a set of related experiences, expands the scope of classical exemplar-based models of memory (e.g., Nosofsky, 1984) and allows the hippocampus to support generalization through interactions that unfold within a dynamically created memory space. PMID:22775499
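To make the idea concrete, one informal reading of "recurrent similarity computation" in the spirit of exemplar models (e.g., Nosofsky, 1984) is an iterated, similarity-weighted spread of activation among stored episodes; this is an illustrative sketch, not the published model's equations:

\[ a_j^{(t+1)} \;=\; \sum_i \operatorname{sim}(e_j, e_i)\, a_i^{(t)}, \qquad \operatorname{sim}(e_j, e_i) = e^{-c\, d(e_j, e_i)}, \]

where a_j is the activation of stored episode e_j, d is a distance between episode representations, and c is a scaling parameter. Iterating the update lets activation flow between episodes that share elements but were never experienced together, which is one way such recurrence can yield generalization beyond single-exemplar recall.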
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-17
SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0021] Privacy Act of 1974, as Amended... AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer-matching... INFORMATION: A. General. The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503...
NASA Astrophysics Data System (ADS)
Gonczi, Amanda L.; Chiu, Jennifer L.; Maeng, Jennifer L.; Bell, Randy L.
2016-07-01
This investigation sought to identify patterns in elementary science teachers' computer simulation use, particularly implementation structures and instructional supports commonly employed by teachers. Data included video-recorded science lessons of 96 elementary teachers who used computer simulations in one or more science lessons. Results indicated teachers used a one-to-one student-to-computer ratio most often either during class-wide individual computer use or during a rotating station structure. Worksheets, general support, and peer collaboration were the most common forms of instructional support. The least common instructional support forms included lesson pacing, initial play, and a closure discussion. Students' simulation use was supported in the fewest ways during a rotating station structure. Results suggest that simulation professional development with elementary teachers needs to explicitly focus on implementation structures and instructional support to enhance participants' pedagogical knowledge and improve instructional simulation use. In addition, research is needed to provide theoretical explanations for the observed patterns that should subsequently be addressed in supporting teachers' instructional simulation use during professional development or in teacher preparation programs.
Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.
2010-01-01
Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198
18. VIEW OF THE GENERAL CHEMISTRY LAB. THE LABORATORY PROVIDED ...
18. VIEW OF THE GENERAL CHEMISTRY LAB. THE LABORATORY PROVIDED GENERAL ANALYTICAL AND STANDARDS CALIBRATION, AS WELL AS DEVELOPMENT OPERATIONS INCLUDING WASTE TECHNOLOGY DEVELOPMENT AND DEVELOPMENT AND TESTING OF MECHANICAL SYSTEMS FOR WEAPONS SYSTEMS. (4/4/66) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
44 CFR 13.42 - Retention and access requirements for records.
Code of Federal Regulations, 2014 CFR
2014-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND... to all financial and programmatic records, supporting documents, statistical records, and other... computations of the rate at which a particular group of costs is chargeable (such as computer usage chargeback...
44 CFR 13.42 - Retention and access requirements for records.
Code of Federal Regulations, 2011 CFR
2011-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND... to all financial and programmatic records, supporting documents, statistical records, and other... computations of the rate at which a particular group of costs is chargeable (such as computer usage chargeback...
44 CFR 13.42 - Retention and access requirements for records.
Code of Federal Regulations, 2013 CFR
2013-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND... to all financial and programmatic records, supporting documents, statistical records, and other... computations of the rate at which a particular group of costs is chargeable (such as computer usage chargeback...
44 CFR 13.42 - Retention and access requirements for records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND... to all financial and programmatic records, supporting documents, statistical records, and other... computations of the rate at which a particular group of costs is chargeable (such as computer usage chargeback...
44 CFR 13.42 - Retention and access requirements for records.
Code of Federal Regulations, 2012 CFR
2012-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND... to all financial and programmatic records, supporting documents, statistical records, and other... computations of the rate at which a particular group of costs is chargeable (such as computer usage chargeback...
KITTEN Lightweight Kernel 0.1 Beta
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pedretti, Kevin; Levenhagen, Michael; Kelly, Suzanne
2007-12-12
The Kitten Lightweight Kernel is a simplified OS (operating system) kernel that is intended to manage a compute node's hardware resources. It provides a set of mechanisms to user-level applications for utilizing hardware resources (e.g., allocating memory, creating processes, accessing the network). Kitten is much simpler than general-purpose OS kernels, such as Linux or Windows, but includes all of the essential functionality needed to support HPC (high-performance computing) MPI, PGAS and OpenMP applications. Kitten provides unique capabilities such as physically contiguous application memory, transparent large page support, and noise-free tick-less operation, which enable HPC applications to obtain greater efficiency and scalability than with general purpose OS kernels.
Uncover the Cloud for Geospatial Sciences and Applications to Adopt Cloud Computing
NASA Astrophysics Data System (ADS)
Yang, C.; Huang, Q.; Xia, J.; Liu, K.; Li, J.; Xu, C.; Sun, M.; Bambacus, M.; Xu, Y.; Fay, D.
2012-12-01
Cloud computing is emerging as the future infrastructure for providing computing resources to support and enable scientific research, engineering development, and application construction, as well as work force education. On the other hand, there is a lot of doubt about the readiness of cloud computing to support a variety of scientific research, development and education. This research is a project funded by NASA SMD to investigate through holistic studies how ready cloud computing is to support geosciences. Four applications with different computing characteristics, including data, computing, concurrent, and spatiotemporal intensities, are taken to test the readiness of cloud computing to support geosciences. Three popular and representative cloud platforms, including Amazon EC2, Microsoft Azure, and NASA Nebula, as well as a traditional cluster, are utilized in the study. Results illustrate that the cloud is ready to some degree, but more research needs to be done to fully realize the cloud benefits as advertised by many vendors and defined by NIST. Specifically, 1) most cloud platforms could help stand up a new computing instance, a new computer, in a few minutes as envisioned, and are therefore ready to support most computing needs in an on-demand fashion; 2) load balance and elasticity, a defining characteristic, is ready in some cloud platforms, such as Amazon EC2, to support bigger jobs, e.g., those needing responses in minutes, while some are not ready to support elasticity and load balance well; all cloud platforms need further research and development to support real-time applications at the subminute level; 3) the user interface and functionality of cloud platforms vary a lot; some of them are very professional and well supported/documented, such as Amazon EC2, while some of them need significant improvement before the general public can adopt cloud computing without professional training or knowledge about computing infrastructure; 4) security is a big concern in cloud computing platforms; with the sharing spirit of cloud computing, it is very hard to ensure higher-level security unless a private cloud is built for a specific organization without public access; public cloud platforms do not support the FISMA medium level yet and may never be able to support the FISMA high level; 5) the HPC needs of cloud computing jobs are not well supported; only Amazon EC2 supports this well. The research results are being taken up by NASA and other agencies in considering cloud computing adoption. We hope the publication of this research will also benefit the public in adopting cloud computing.
A School Finance Computer Simulation Model
ERIC Educational Resources Information Center
Boardman, Gerald R.
1974-01-01
Presents a description of the computer simulation model developed by the National Educational Finance Project for use by States in planning and evaluating alternative approaches for State support programs. Provides a general introduction to the model, a program operation overview, a sample run, and some conclusions. (Author/WM)
STS-99 MS Kavandi poses by two laptop computers on OV-105's flight deck
2000-03-30
STS099-315-008 (11-22 February 2000) ---Astronaut Janet L. Kavandi, mission specialist, is photographed near the Payload General Support Computers (PGSC) dealing with the Shuttle Radar Topography Mission (SRTM) on the middeck of the Space Shuttle Endeavour.
An efficient temporal logic for robotic task planning
NASA Technical Reports Server (NTRS)
Becker, Jeffrey M.
1989-01-01
Computations required for temporal reasoning can be prohibitively expensive if fully general representations are used. Overly simple representations, such as a totally ordered sequence of time points, are inadequate for use in a nonlinear task planning system. A middle ground is identified which is general enough to support a capable nonlinear task planner, but specialized enough that the system can support online task planning in real time. A Temporal Logic System (TLS) was developed during the Intelligent Task Automation (ITA) project to support robotic task planning. TLS is also used within the ITA system to support plan execution, monitoring, and exception handling.
A rigorous computational approach to linear response
NASA Astrophysics Data System (ADS)
Bahsoun, Wael; Galatolo, Stefano; Nisoli, Isaia; Niu, Xiaolong
2018-03-01
We present a general setting in which the formula describing the linear response of the physical measure of a perturbed system can be obtained. In this general setting we obtain an algorithm to rigorously compute the linear response. We apply our results to expanding circle maps. In particular, we present examples where we compute, up to a pre-specified error in the L∞-norm, the response of expanding circle maps under stochastic and deterministic perturbations. Moreover, we present an example where we compute, up to a pre-specified error in the L1-norm, the response of the intermittent family at the boundary; i.e. when the unperturbed system is the doubling map. This work was mainly conducted during a visit of SG to Loughborough University. WB and SG would like to thank The Leverhulme Trust for supporting mutual research visits through the Network Grant IN-2014-021. SG thanks the Department of Mathematical Sciences at Loughborough University for hospitality. WB thanks Dipartimento di Matematica, Università di Pisa. The research of SG and IN is partially supported by EU Marie-Curie IRSES ‘Brazilian-European partnership in Dynamical Systems’ (FP7-PEOPLE-2012-IRSES 318999 BREUDS). IN was partially supported by CNPq and FAPERJ. IN would like to thank the Department of Mathematics at Uppsala University and the support of the KAW grant 2013.0315.
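For orientation (an informal statement of the generic formula; the paper's setting makes the assumptions precise): if L_δ are the transfer operators of the perturbed systems, with invariant densities h_δ satisfying L_δ h_δ = h_δ, then formally differentiating at δ = 0 gives the linear response as

\[ \dot{h} \;=\; (I - L_0)^{-1} \dot{L}\, h_0 , \]

with the resolvent (I - L_0)^{-1} applied on the space of zero-average densities; a rigorous computation then amounts to approximating this quantity with a certified error bound.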
System support software for the Space Ultrareliable Modular Computer (SUMC)
NASA Technical Reports Server (NTRS)
Hill, T. E.; Hintze, G. C.; Hodges, B. C.; Austin, F. A.; Buckles, B. P.; Curran, R. T.; Lackey, J. D.; Payne, R. E.
1974-01-01
The highly transportable programming system designed and implemented to support the development of software for the Space Ultrareliable Modular Computer (SUMC) is described. The SUMC system support software consists of program modules called processors. The initial set of processors consists of the supervisor, the general purpose assembler for SUMC instruction and microcode input, linkage editors, an instruction level simulator, a microcode grid print processor, and user oriented utility programs. A FORTRAN 4 compiler is undergoing development. The design facilitates the addition of new processors with a minimum effort and provides the user quasi host independence on the ground based operational software development computer. Additional capability is provided to accommodate variations in the SUMC architecture without consequent major modifications in the initial processors.
Discrete Mathematics Course Supported by CAS MATHEMATICA
ERIC Educational Resources Information Center
Ivanov, O. A.; Ivanova, V. V.; Saltan, A. A.
2017-01-01
In this paper, we discuss examples of assignments for a course in discrete mathematics for undergraduate students majoring in business informatics. We consider several problems with computer-based solutions and discuss general strategies for using computers in teaching mathematics and its applications. In order to evaluate the effectiveness of our…
Design and Development of a Smart Storytelling Toy
ERIC Educational Resources Information Center
Kara, Nuri; Aydin, Cansu Cigdem; Cagiltay, Kursat
2014-01-01
Because computers generally make children passive listeners, new technological devices need to support children's storytelling activities. This article introduces the StoryTech, a smart toy that includes a virtual space comprised of computer-based graphics and characters as well as a real space that involves stuffed animals, background cards…
SuML: A Survey Markup Language for Generalized Survey Encoding
Barclay, MW; Lober, WB; Karras, BT
2002-01-01
There is a need in clinical and research settings for a sophisticated, generalized, web-based survey tool that supports complex logic, separation of content and presentation, and computable guidelines. There are many commercial and open source survey packages available that provide simple logic; few provide sophistication beyond “goto” statements; none support the use of guidelines. These tools are driven by databases, static web pages, and structured documents using markup languages such as eXtensible Markup Language (XML). We propose a generalized, guideline-aware language and an implementation architecture using open source standards.
ERIC Educational Resources Information Center
Michell, Dee; Szorenyi, Anna; Falkner, Katrina; Szabo, Claudia
2017-01-01
Computer science, like technology in general, is seen as a masculine field and the under-representation of women an intransigent problem. In this paper, we argue that the cultural belief in Australia that computer science is a domain for men results in many girls and women being chased away from that field as part of a border protection campaign…
Computer-based, Jeopardy™-like game in general chemistry for engineering majors
NASA Astrophysics Data System (ADS)
Ling, S. S.; Saffre, F.; Kadadha, M.; Gater, D. L.; Isakovic, A. F.
2013-03-01
We report on the design of a Jeopardy™-like computer game for enhancing the learning of general chemistry by engineering majors. While we examine several parameters of student achievement and attitude, our primary concern is addressing the motivation of students, which tends to be low in traditionally run chemistry lectures. The effect of the game-playing is tested by comparing a paper-based game quiz, which constitutes the control group, and a computer-based game quiz, constituting the treatment group. The computer-based game quizzes are Java™-based applications that students run once a week in the second part of the last lecture of the week. Overall effectiveness of the semester-long program is measured through pretest-posttest conceptual testing of general chemistry. The objective of this research is to determine to what extent this ``gamification'' of the course delivery and course evaluation processes may be beneficial to undergraduates' learning of science in general, and chemistry in particular. We present data addressing gender-specific differences in performance, as well as the background (pre-college) level of general science and chemistry preparation. We outline a plan for extending this approach to general physics courses and to modern-science-driven electives, and we offer live, in-lecture examples of our computer gaming experience. We acknowledge support from Khalifa University, Abu Dhabi.
Derivation of Einstein-Cartan theory from general relativity
NASA Astrophysics Data System (ADS)
Petti, Richard
2015-04-01
General relativity cannot describe exchange of classical intrinsic angular momentum and orbital angular momentum. Einstein-Cartan theory fixes this problem in the least invasive way. In the late 20th century, the consensus view was that Einstein-Cartan theory requires inclusion of torsion without adequate justification, it has no empirical support (though it doesn't conflict with any known evidence), it solves no important problem, and it complicates gravitational theory with no compensating benefit. In 1986 the author published a derivation of Einstein-Cartan theory from general relativity, with no additional assumptions or parameters. Starting without torsion, Poincaré symmetry, classical or quantum spin, or spinors, it derives torsion and its relation to spin from a continuum limit of general relativistic solutions. The present work makes the case that this computation, combined with supporting arguments, constitutes a derivation of Einstein-Cartan theory from general relativity, not just a plausibility argument. This paper adds more and simpler explanations, more computational details, correction of a factor of 2, discussion of limitations of the derivation, and discussion of some areas of gravitational research where Einstein-Cartan theory is relevant.
Content Analysis: What Are They Talking About?
ERIC Educational Resources Information Center
Strijbos, Jan-Willem; Martens, Rob L.; Prins, Frans J.; Jochems, Wim M. G.
2006-01-01
Quantitative content analysis is increasingly used to surpass surface level analyses in computer-supported collaborative learning (e.g., counting messages), but critical reflection on accepted practice has generally not been reported. A review of CSCL conference proceedings revealed a general vagueness in definitions of units of analysis. In…
Hagen, Monika E; Wagner, Oliver J; Inan, Ihsan; Morel, Philippe
2009-09-01
Due to improved ergonomics and dexterity, robotic surgery is promoted as being easily performed by surgeons with no special skills necessary. We tested this hypothesis by measuring IQ elements, computer gaming skills, general dexterity with chopsticks, and evaluating laparoscopic experience in correlation to performance ability with the da Vinci robot. Thirty-four individuals were tested for robotic dexterity, IQ elements, computer-gaming skills and general dexterity. Eighteen surgically inexperienced and 16 laparoscopically trained surgeons were included. Each individual performed three different tasks with the da Vinci surgical system and their times were recorded. An IQ test (elements: logical thinking, 3D imagination and technical understanding) was completed by each participant. Computer skills were tested with a simple computer game (hand-eye coordination) and general dexterity was evaluated by the ability to use chopsticks. We found no correlation between logical thinking, 3D imagination and robotic skills. Both computer gaming and general dexterity showed a slight but non-significant improvement in performance with the da Vinci robot (p > 0.05). A significant correlation between robotic skills, technical understanding and laparoscopic experience was observed (p < 0.05). The data support the conclusion that there are no significant correlations between robotic performance and logical thinking, 3D understanding, computer gaming skills and general dexterity. A correlation between robotic skills and technical understanding may exist. Laparoscopic experience seems to be the strongest predictor of performance with the da Vinci surgical system. Generally, it appears difficult to determine non-surgical predictors for robotic surgery.
Multimodal and ubiquitous computing systems: supporting independent-living older users.
Perry, Mark; Dowdall, Alan; Lines, Lorna; Hone, Kate
2004-09-01
We document the rationale and design of a multimodal interface to a pervasive/ubiquitous computing system that supports independent living by older people in their own homes. The Millennium Home system involves fitting a resident's home with sensors--these sensors can be used to trigger sequences of interaction with the resident to warn them about dangerous events, or to check if they need external help. We draw lessons from the design process and conclude the paper with implications for the design of multimodal interfaces to ubiquitous systems developed for the elderly and in healthcare, as well as for more general ubiquitous computing applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barney, B; Shuler, J
2006-08-21
Purple is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Lawrence Livermore National Laboratory (LLNL). The Purple Computational Environment documents the capabilities and the environment provided for the FY06 LLNL Level 1 General Availability Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories, but also documents the needs of the LLNL and Alliance users working in the unclassified environment. Additionally, the Purple Computational Environment maps the provided capabilities to the Tri-lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the General Availability user environment capabilities of the ASC community. Appendix A lists these requirements and includes a description of ACE requirements met and those requirements that are not met for each section of this document. The Purple Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the Tri-lab community.
ANL statement of site strategy for computing workstations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.
1991-11-01
This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: Supercomputers, Parallel computers, Centralized general purpose computers, Distributed multipurpose minicomputers, and Computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.
1984-09-27
... more effectively structured and transportable simulation program modules and powerful support software, are already in place for current use. The early ... incorporates the various limits and conditions described for the major acceleration categories. (14) Speed Loop: this module is executed when the shaft speed ... available, high-confidence models and modules. A great leverage is gained by using generally available general purpose computers and associated support
A comparative analysis of support vector machines and extreme learning machines.
Liu, Xueyi; Gao, Chuanhou; Li, Ping
2012-09-01
The theory of extreme learning machines (ELMs) has recently become increasingly popular. As a new learning algorithm for single-hidden-layer feed-forward neural networks, an ELM offers the advantages of low computational cost, good generalization ability, and ease of implementation. Hence the comparison and model selection between ELMs and other kinds of state-of-the-art machine learning approaches has become significant and has attracted many research efforts. This paper performs a comparative analysis of basic ELMs and support vector machines (SVMs) from two viewpoints that are different from previous works: one is the Vapnik-Chervonenkis (VC) dimension, and the other is their performance under different training sample sizes. It is shown that the VC dimension of an ELM is equal to the number of hidden nodes of the ELM with probability one. Additionally, their generalization ability and computational complexity are exhibited with changing training sample size. ELMs have weaker generalization ability than SVMs for small samples but can generalize as well as SVMs for large samples. Remarkably, great superiority in computational speed, especially for large-scale sample problems, is found in ELMs. The results obtained can provide insight into the essential relationship between them, and can also serve as complementary knowledge for their past experimental and theoretical comparisons. Copyright © 2012 Elsevier Ltd. All rights reserved.
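For context, the basic ELM training step referred to here is standard in the ELM literature (summarized as background, not taken from this paper): with the hidden-layer parameters drawn at random and left fixed, one computes the hidden-layer output matrix H on the training set and obtains the output weights in a single least-squares solve,

\[ \beta \;=\; H^{\dagger} T , \]

where H† is the Moore-Penrose pseudoinverse of H and T is the target matrix. This one-shot solve, in contrast to the quadratic programming behind SVM training, is the source of the computational-speed advantage the comparison reports.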
Device 2E6 (ACMS) Air Combat Maneuvering Simulator Instructor Console Review.
1983-12-01
While the device provides some new features which support training, such as a debrief facility and a computer-based instructor training module, the ... Equipment Center, Orlando, FL (in printing). [Figure 1. General arrangement (2E6).] ... d. instructor stations, e. computer systems, f. target model subsystem, g. debrief subsystem, h. ...
Climate@Home: Crowdsourcing Climate Change Research
NASA Astrophysics Data System (ADS)
Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.
2011-12-01
Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication mediums for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connecting the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model, which is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large processing load. By orchestrating massive amounts of personal computers to perform atomized data processing tasks, investments in new super-computers, energy consumed by super-computers, and carbon release from super-computers are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include the citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Computer climate models are defined at the server side. Climate scientists configure computer model parameters through the portal user interface. After model configuration, scientists then launch the computing task. Next, data is atomized and distributed to computing engines that are running on citizen participants' computers. Scientists will receive notifications on the completion of computing tasks, and examine modeling results via visualization modules of the portal. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built for proof of concept. Three forums have been set up for different groups of users to share information on the science aspect, technology aspect, and educational outreach aspect. A Facebook account has been set up to distribute messages via the most popular social networking platform. New threads are synchronized from the forums to Facebook. A mapping tool displays the geographic locations of the participants and the status of tasks on each client node. A group of users have been invited to test functions such as forums, blogs, and computing resource monitoring.
Computational materials science and engineering education: A survey of trends and needs
NASA Astrophysics Data System (ADS)
Thornton, K.; Nola, Samanthule; Edwin Garcia, R.; Asta, Mark; Olson, G. B.
2009-10-01
Results from a recent reassessment of the state of computational materials science and engineering (CMSE) education are reported. Surveys were distributed to the chairs and heads of materials programs, faculty members engaged in computational research, and employers of materials scientists and engineers, mainly in the United States. The data were compiled to assess current course offerings related to CMSE, the general climate for introducing computational methods in MSE curricula, and the requirements from the employers’ viewpoint. Furthermore, the available educational resources and their utilization by the community are examined. The surveys show general support for integrating computational content into MSE education. However, they also reflect remaining issues with implementation, as well as a gap between the tools being taught in courses and those used by employers. Overall, the results suggest the necessity of a comprehensively developed vision and plans to further the integration of computational methods into MSE curricula.
ERIC Educational Resources Information Center
Campbell, Todd; Longhurst, Max L.; Wang, Shiang-Kwei; Hsu, Hui-Yin; Coster, Dan C.
2015-01-01
While access to computers, other technologies, and cyber-enabled resources that could be leveraged for enhancing student learning in science is increasing, generally it has been found that teachers use technology more for administrative purposes or to support traditional instruction. This use of technology, especially to support traditional…
A Simple and Resource-efficient Setup for the Computer-aided Drug Design Laboratory.
Moretti, Loris; Sartori, Luca
2016-10-01
Undertaking modelling investigations for Computer-Aided Drug Design (CADD) requires a proper environment. In principle, this could be done on a single computer, but the reality of a drug discovery program requires robustness and high-throughput computing (HTC) to efficiently support the research. Therefore, a more capable alternative is needed, but its implementation has no widespread solution. Here, the realization of such a computing facility is discussed; from the general layout to technical details, all aspects are covered. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
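The article's own implementation is not reproduced in this abstract. A minimal sketch of the underlying HTC pattern, embarrassingly parallel scoring of independent ligands, could look like the following, where `score_ligand` is a hypothetical stand-in for a real docking or scoring invocation:

```python
# Hedged sketch of the embarrassingly parallel HTC pattern behind a
# CADD lab: score independent ligands concurrently. A production
# facility would dispatch these jobs to a cluster scheduler rather
# than a single machine's process pool.
from multiprocessing import Pool

def score_ligand(smiles):
    # Placeholder: a real pipeline would call a docking engine here.
    return smiles, float(len(smiles))  # dummy "score"

if __name__ == "__main__":
    ligands = ["CCO", "c1ccccc1", "CC(=O)O", "CCN(CC)CC"]
    with Pool(processes=4) as pool:
        results = pool.map(score_ligand, ligands)
    for smiles, score in sorted(results, key=lambda r: r[1]):
        print(smiles, score)
```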
NASA Astrophysics Data System (ADS)
Hreniuc, V.; Hreniuc, A.; Pescaru, A.
2017-08-01
Solving a general strength problem of a ship hull may be done using analytical approaches, which are useful to deduce the distribution of buoyancy forces and weight forces along the hull and the geometrical characteristics of the sections. These data are used to draw the free-body diagrams and to compute the stresses. General strength problems require a large amount of calculation, so it is worth considering how a computer may be used to solve them. Using computer programming, an engineer may conceive software instruments based on analytical approaches. However, before developing the computer code the research topic must be thoroughly analysed; in this way a meta-level of understanding of the problem is reached. The following stage is to conceive an appropriate development strategy for the original software instruments, useful for the rapid development of computer-aided analytical models. The geometrical characteristics of the sections may be computed using a Boolean algebra that operates on 'simple' geometrical shapes. By 'simple' we mean that direct calculation formulas exist for the corresponding shapes. The set of 'simple' shapes also includes geometrical entities bounded by curves approximated as spline functions or as polygons. To conclude, computer programming offers the necessary support to solve general strength ship hull problems using analytical methods.
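As a hedged illustration of the 'simple shapes' idea (restricted here to rectangles with direct formulas, a cut-out entering with negative area; the paper's actual shape algebra, with splines and polygons, is richer, and all names below are illustrative):

```python
# Section characteristics composed from 'simple' shapes with direct
# calculus relations; a hole is a rectangle with sign = -1.
from dataclasses import dataclass

@dataclass
class Rect:
    b: float        # width
    h: float        # height
    yc: float       # centroid height above the keel
    sign: int = 1   # -1 for a cut-out

    @property
    def area(self):
        return self.sign * self.b * self.h

    @property
    def i_own(self):
        return self.sign * self.b * self.h ** 3 / 12.0

def section_properties(shapes):
    area = sum(s.area for s in shapes)
    y_na = sum(s.area * s.yc for s in shapes) / area   # neutral axis height
    # Parallel-axis theorem for the second moment about the neutral axis.
    inertia = sum(s.i_own + s.area * (s.yc - y_na) ** 2 for s in shapes)
    return area, y_na, inertia

# A hollow box section: outer rectangle minus inner rectangle.
outer = Rect(b=10.0, h=6.0, yc=3.0)
inner = Rect(b=9.6, h=5.6, yc=3.0, sign=-1)
print(section_properties([outer, inner]))
```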
Doctors' experience with handheld computers in clinical practice: qualitative study.
McAlearney, Ann Scheck; Schweikhart, Sharon B; Medow, Mitchell A
2004-05-15
Objective: To examine doctors' perspectives about their experiences with handheld computers in clinical practice. Design: Qualitative study of eight focus groups consisting of doctors with diverse training and practice patterns. Setting: Six practice settings across the United States and two additional focus group sessions held at a national meeting of general internists. Participants: 54 doctors who did or did not use handheld computers. Results: Doctors who used handheld computers in clinical practice seemed generally satisfied with them and reported diverse patterns of use. Users perceived that the devices helped them increase productivity and improve patient care. Barriers to use concerned the device itself and personal and perceptual constraints, with perceptual factors such as comfort with technology, preference for paper, and the impression that the devices are not easy to use somewhat difficult to overcome. Participants suggested that organisations can help promote handheld computers by providing advice on purchase, usage, training, and user support. Participants expressed concern about reliability and security of the device but were particularly concerned about dependency on the device and over-reliance as a substitute for clinical thinking. Conclusions: Doctors expect handheld computers to become more useful, and most seem interested in leveraging (getting the most value from) their use. Key opportunities with handheld computers included their use as a stepping stone to build doctors' comfort with other information technology and ehealth initiatives and providing point of care support that helps improve patient care.
QuEST for malware type-classification
NASA Astrophysics Data System (ADS)
Vaughan, Sandra L.; Mills, Robert F.; Grimaila, Michael R.; Peterson, Gilbert L.; Oxley, Mark E.; Dube, Thomas E.; Rogers, Steven K.
2015-05-01
Current cyber-related security and safety risks are unprecedented, due in no small part to information overload and skilled cyber-analyst shortages. Advances in decision support and Situation Awareness (SA) tools are required to support analysts in risk mitigation. Inspired by human intelligence, research in Artificial Intelligence (AI) and Computational Intelligence (CI) has provided successful engineering solutions in complex domains, including cyber. Current AI approaches aggregate large volumes of data to infer the general from the particular, i.e., inductive reasoning (pattern-matching), and generally cannot infer answers not previously programmed. Humans, by contrast, though rarely able to reason over large volumes of data, have successfully reached the top of the food chain by inferring situations from partial or even partially incorrect information, i.e., abductive reasoning (pattern-completion): generating a hypothetical explanation of observations. In order to achieve an engineering advantage in computational decision support and SA, we leverage recent research on human consciousness, the role consciousness plays in decision making, and the modeling of the units of subjective experience which generate consciousness: qualia. This paper introduces a novel computational implementation of a Cognitive Modeling Architecture (CMA) which incorporates concepts of consciousness. We apply our model to the malware type-classification task. The underlying methodology and theories are generalizable to many domains.
Using Technology To Support Comprehensive Guidance Program Operations: A Variety of Strategies.
ERIC Educational Resources Information Center
Bowers, Judy
The Tucson Unified School District set a goal for 2000-2001 that all counselors have their own computer at school. This article looks at how these computers are used to enhance the counselors' jobs. At the district level, the staff communicates with counselors through e-mail. Meeting reminders, general information, and upcoming events are…
ERIC Educational Resources Information Center
Pennington, Robert C.
2010-01-01
Although legislation mandates that students with autism receive instruction linked to the general education core content, there is limited research supporting the effectiveness of interventions for teaching core content to these students. In this study, the author reviewed research conducted between the years 1997 and 2008 using computer-assisted…
Evolution of Embedded Processing for Wide Area Surveillance
2014-01-01
future vision. Subject terms: embedded processing; high-performance computing; general-purpose graphical processing units (GPGPUs). ...reconnaissance (ISR) mission capabilities. The capabilities these advancements are achieving include the ability to provide persistent all...fighters to support and positively affect their mission. Significant improvements in high-performance computing (HPC) technology make it possible to
Real-time data reduction capabilities at the Langley 7 by 10 foot high speed tunnel
NASA Technical Reports Server (NTRS)
Fox, C. H., Jr.
1980-01-01
The 7 by 10 foot high speed tunnel performs a wide range of tests employing a variety of model installation methods. To support the reduction of static data from this facility, a generalized wind tunnel data reduction program had been developed for use on the Langley central computer complex. The capabilities of a version of this generalized program adapted for real time use on a dedicated on-site computer are discussed. The input specifications, instructions for the console operator, and full descriptions of the algorithms are included.
Grids, virtualization, and clouds at Fermilab
Timm, S.; Chadwick, K.; Garzoglio, G.; ...
2014-06-11
Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud & GPCF) and core computing (Virtual Services). Lastly, this work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.
Grids, virtualization, and clouds at Fermilab
NASA Astrophysics Data System (ADS)
Timm, S.; Chadwick, K.; Garzoglio, G.; Noh, S.
2014-06-01
Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud & GPCF) and core computing (Virtual Services). This work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.
Model implementation for dynamic computation of system cost
NASA Astrophysics Data System (ADS)
Levri, J.; Vaccari, D.
The Advanced Life Support (ALS) Program metric is the ratio of the equivalent system mass (ESM) of a mission based on International Space Station (ISS) technology to the ESM of that same mission based on ALS technology. ESM is a mission cost analog that converts the volume, power, cooling and crewtime requirements of a mission into mass units to compute an estimate of the life support system emplacement cost. Traditionally, ESM has been computed statically, using nominal values for system sizing. However, computation of ESM with static, nominal sizing estimates cannot capture the peak sizing requirements driven by system dynamics. In this paper, a dynamic model for a near-term Mars mission is described. The model is implemented in Matlab/Simulink® for the purpose of dynamically computing ESM. This paper provides a general overview of the crew, food, biomass, waste, water and air blocks in the Simulink® model. Dynamic simulations of the life support system track mass flow, volume and crewtime needs, as well as power and cooling requirement profiles. The mission's ESM is computed, based upon simulation responses. Ultimately, computed ESM values for various system architectures will feed into an optimization search (non-derivative) algorithm to predict parameter combinations that result in reduced objective function values.
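For concreteness, a minimal sketch of an ESM computation with static versus dynamic (peak) sizing follows; the equivalency factors and all numbers are illustrative placeholders, not the ALS Program's official values:

```python
# Hedged sketch: ESM converts volume, power, cooling, and crewtime
# requirements into mass units via equivalency factors and sums them.
import numpy as np

EQ = {"volume": 66.7,   # kg per m^3        (illustrative)
      "power": 237.0,   # kg per kW         (illustrative)
      "cooling": 60.0,  # kg per kW         (illustrative)
      "crewtime": 0.5}  # kg per crew-hour  (illustrative)

def esm(mass_kg, volume_m3, power_kw, cooling_kw, crewtime_hr):
    """Convert non-mass requirements into mass units and sum."""
    return (mass_kg
            + volume_m3 * EQ["volume"]
            + power_kw * EQ["power"]
            + cooling_kw * EQ["cooling"]
            + crewtime_hr * EQ["crewtime"])

# Static sizing uses a nominal power figure; dynamic sizing takes the
# peak of a simulated requirement profile, as the abstract argues.
power_profile_kw = np.array([3.1, 3.0, 5.8, 3.2, 3.1])  # from a simulation
static_esm = esm(1000.0, 10.0, power_profile_kw.mean(), 4.0, 200.0)
dynamic_esm = esm(1000.0, 10.0, power_profile_kw.max(), 4.0, 200.0)
print(static_esm, dynamic_esm)  # dynamic sizing yields the larger ESM
```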
NASA Technical Reports Server (NTRS)
Southall, J. W.
1979-01-01
The engineering-specified requirements for integrated information processing by means of the Integrated Programs for Aerospace-Vehicle Design (IPAD) system are presented. A data model is described and is based on the design process of a typical aerospace vehicle. General data management requirements are specified for data storage, retrieval, generation, communication, and maintenance. Information management requirements are specified for a two-component data model. In the general portion, data sets are managed as entities, and in the specific portion, data elements and the relationships between elements are managed by the system, allowing user access to individual elements for the purpose of query. Computer program management requirements are specified for support of a computer program library, control of computer programs, and installation of computer programs into IPAD.
Blanson Henkemans, O. A.; Rogers, W. A.; Fisk, A. D.; Neerincx, M. A.; Lindenberg, J.; van der Mast, C. A. P. G.
2014-01-01
Objectives: We developed an adaptive computer assistant for the supervision of diabetics’ self-care, to support limiting illness and need for acute treatment, and improve health literacy. This assistant monitors self-care activities logged in the patient’s electronic diary. Accordingly, it provides context-aware feedback. The objective was to evaluate whether older adults in general can make use of the computer assistant and to compare an adaptive computer assistant with a fixed one, concerning its usability and contribution to health literacy. Methods: We conducted a laboratory experiment in the Georgia Tech Aware Home wherein 28 older adults participated in a usability evaluation of the computer assistant, while engaged in scenarios reflecting normal and health-critical situations. We evaluated the assistant on effectiveness, efficiency, satisfaction, and educational value. Finally, we studied the moderating effects of the subjects’ personal characteristics. Results: Logging self-care tasks and receiving feedback from the computer assistant enhanced the subjects’ knowledge of diabetes. The adaptive assistant was more effective in dealing with normal and health-critical situations, and, generally, it led to more time efficiency. Subjects’ personal characteristics had substantial effects on the effectiveness and efficiency of the two computer assistants. Conclusions: Older adults were able to use the adaptive computer assistant. In addition, it had a positive effect on the development of health literacy. The assistant has the potential to support older diabetics’ self care while maintaining quality of life. PMID:18213433
List processing software for the LeCroy 1821 Segment Manager Interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dorries, T.; Moore, C.; Pordes, R.
1987-05-01
Many experiments at Fermilab now include some FASTBUS electronics in their data readout. The software reported in this paper provides general support for the LeCroy 1821 interface. The list processing device drivers allow FASTBUS data to be read out efficiently into the Fermilab Computing Department supported data acquisition systems.
Specialized computer architectures for computational aerodynamics
NASA Technical Reports Server (NTRS)
Stevenson, D. K.
1978-01-01
In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, a cost that is high in terms of dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.
NASA Astrophysics Data System (ADS)
Mancho, Ana M.; Wiggins, Stephen; Curbelo, Jezabel; Mendoza, Carolina
2013-11-01
Lagrangian descriptors are a recent technique that reveals geometrical structures in phase space and is valid for aperiodically time-dependent dynamical systems. We discuss a general methodology for constructing them and a ``heuristic argument'' that explains why the method is successful. We support this argument by explicit calculations on a benchmark problem. Several other benchmark examples are considered that allow us to assess the performance of Lagrangian descriptors against both finite-time Lyapunov exponents (FTLEs) and finite-time averages of certain components of the vector field (``time averages''). In all cases Lagrangian descriptors are shown to be both more accurate and computationally efficient than these methods. We thank CESGA for computing facilities. This research was supported by MINECO grants: MTM2011-26696, I-Math C3-0104, ICMAT Severo Ochoa project SEV-2011-0087, and CSIC grant OCEANTECH. SW acknowledges the support of the ONR (Grant No. N00014-01-1-0769).
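As a hedged sketch of the technique (assuming the common arc-length form of the Lagrangian descriptor; the forced Duffing field below is a stand-in benchmark, not necessarily the one used in the talk):

```python
# Arc-length Lagrangian descriptor M: integrate the speed along
# trajectories forward and backward over a time window tau.
import numpy as np
from scipy.integrate import solve_ivp

def vfield(t, xy):
    x, y = xy
    return [y, x - x**3 + 0.1 * np.sin(t)]

def M(x0, y0, t0=0.0, tau=10.0):
    def augmented(t, s):
        dx, dy = vfield(t, s[:2])
        return [dx, dy, np.hypot(dx, dy)]   # third slot accumulates arc length
    total = 0.0
    for t_end in (t0 + tau, t0 - tau):      # forward, then backward in time
        sol = solve_ivp(augmented, (t0, t_end), [x0, y0, 0.0], rtol=1e-8)
        total += abs(sol.y[2, -1])
    return total

print(M(0.5, 0.0))  # sharp gradients of M trace out phase-space structures
```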
Reinventing patient-centered computing for the twenty-first century.
Goldberg, H S; Morales, A; Gottlieb, L; Meador, L; Safran, C
2001-01-01
Despite evidence over the past decade that patients like and will use patient-centered computing systems in managing their health, patients have remained forgotten stakeholders in advances in clinical computing systems. We present a framework for patient empowerment and the technical realization of that framework in an architecture called CareLink. In an evaluation of the initial deployment of CareLink in the support of neonatal intensive care, we have demonstrated a reduction in the length of stay for very-low birthweight infants, and an improvement in family satisfaction with care delivery. With the ubiquitous adoption of the Internet into the general culture, patient-centered computing provides the opportunity to mend broken health care relationships and reconnect patients to the care delivery process. CareLink itself provides functionality to support both clinical care and research, and provides a living laboratory for the further study of patient-centered computing.
ERIC Educational Resources Information Center
Batey, Anne
Computers are integrated into science education when they are used as the most appropriate tool or delivery system to support the goals of science education. The goals of science education can be condensed into two general areas. One area concerns the preparation of a science-literate citizenry; the second area concerns understanding the…
15 CFR Supplement No. 1 to Part 736 - General Order
Code of Federal Regulations, 2014 CFR
2014-01-01
... authority to conduct the nation's foreign affairs); medicine (on the CCL) and medical devices (both as...; telecommunications equipment and associated computers, software and technology; items in support of United Nations...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goebel, J
2004-02-27
Without stable hardware any program will fail. The frustration and expense of supporting bad hardware can drain an organization, delay progress, and frustrate everyone involved. At Stanford Linear Accelerator Center (SLAC), we have created a testing method that helps our group, SLAC Computer Services (SCS), weed out potentially bad hardware and purchase the best hardware at the best possible cost. Commodity hardware changes often, so new evaluations happen periodically each time we purchase systems, and minor re-evaluations happen for revised systems for our clusters, about twice a year. This general framework helps SCS perform correct, efficient evaluations. This article outlines SCS's computer testing methods and our system acceptance criteria. We expanded the basic ideas to other evaluations such as storage, and we think the methods outlined in this article have helped us choose hardware that is much more stable and supportable than our previous purchases. We have found that commodity hardware ranges in quality, so a systematic method and tools for hardware evaluation were necessary. This article is based on one instance of a hardware purchase, but the guidelines apply to the general problem of purchasing commodity computer systems for production computational work.
Self-evaluation of decision-making: A general Bayesian framework for metacognitive computation.
Fleming, Stephen M; Daw, Nathaniel D
2017-01-01
People are often aware of their mistakes, and report levels of confidence in their choices that correlate with objective performance. These metacognitive assessments of decision quality are important for the guidance of behavior, particularly when external feedback is absent or sporadic. However, a computational framework that accounts for both confidence and error detection is lacking. In addition, accounts of dissociations between performance and metacognition have often relied on ad hoc assumptions, precluding a unified account of intact and impaired self-evaluation. Here we present a general Bayesian framework in which self-evaluation is cast as a "second-order" inference on a coupled but distinct decision system, computationally equivalent to inferring the performance of another actor. Second-order computation may ensue whenever there is a separation between internal states supporting decisions and confidence estimates over space and/or time. We contrast second-order computation against simpler first-order models in which the same internal state supports both decisions and confidence estimates. Through simulations we show that second-order computation provides a unified account of different types of self-evaluation often considered in separate literatures, such as confidence and error detection, and generates novel predictions about the contribution of one's own actions to metacognitive judgments. In addition, the model provides insight into why subjects' metacognition may sometimes be better or worse than task performance. We suggest that second-order computation may underpin self-evaluative judgments across a range of domains. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
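A minimal simulation sketch of the contrast drawn above, assuming Gaussian internal states and treating the second-order state as an additional noisy observation of the stimulus (a simplification of the paper's coupled model; all parameter values are illustrative):

```python
# First-order confidence reads the same state as the decision;
# second-order confidence reads a coupled-but-distinct state, which
# lets it signal errors the decision state cannot see.
import numpy as np

rng = np.random.default_rng(0)
n, d = 100_000, 1.0                      # trials, signal strength
stim = rng.choice([-1.0, 1.0], size=n)   # true stimulus category

x_dec = d * stim + rng.normal(size=n)    # state supporting the decision
x_conf = d * stim + rng.normal(size=n)   # distinct state for confidence

choice = np.sign(x_dec)
correct = choice == stim

def confidence(x, choice, d):
    """Posterior probability that `choice` is correct given read-out x."""
    p_pos = 1.0 / (1.0 + np.exp(-2.0 * d * x))   # P(stim = +1 | x)
    return np.where(choice > 0, p_pos, 1.0 - p_pos)

first_order = confidence(x_dec, choice, d)    # same state as the decision
second_order = confidence(x_conf, choice, d)  # second-order read-out

# Second-order confidence drops well below 0.5 on error trials even
# without feedback, i.e., it supports error detection.
print("first-order :", first_order[correct].mean(), first_order[~correct].mean())
print("second-order:", second_order[correct].mean(), second_order[~correct].mean())
```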
Self-Evaluation of Decision-Making: A General Bayesian Framework for Metacognitive Computation
2017-01-01
People are often aware of their mistakes, and report levels of confidence in their choices that correlate with objective performance. These metacognitive assessments of decision quality are important for the guidance of behavior, particularly when external feedback is absent or sporadic. However, a computational framework that accounts for both confidence and error detection is lacking. In addition, accounts of dissociations between performance and metacognition have often relied on ad hoc assumptions, precluding a unified account of intact and impaired self-evaluation. Here we present a general Bayesian framework in which self-evaluation is cast as a “second-order” inference on a coupled but distinct decision system, computationally equivalent to inferring the performance of another actor. Second-order computation may ensue whenever there is a separation between internal states supporting decisions and confidence estimates over space and/or time. We contrast second-order computation against simpler first-order models in which the same internal state supports both decisions and confidence estimates. Through simulations we show that second-order computation provides a unified account of different types of self-evaluation often considered in separate literatures, such as confidence and error detection, and generates novel predictions about the contribution of one’s own actions to metacognitive judgments. In addition, the model provides insight into why subjects’ metacognition may sometimes be better or worse than task performance. We suggest that second-order computation may underpin self-evaluative judgments across a range of domains. PMID:28004960
Doctors' experience with handheld computers in clinical practice: qualitative study
McAlearney, Ann Scheck; Schweikhart, Sharon B; Medow, Mitchell A
2004-01-01
Objective To examine doctors' perspectives about their experiences with handheld computers in clinical practice. Design Qualitative study of eight focus groups consisting of doctors with diverse training and practice patterns. Setting Six practice settings across the United States and two additional focus group sessions held at a national meeting of general internists. Participants 54 doctors who did or did not use handheld computers. Results Doctors who used handheld computers in clinical practice seemed generally satisfied with them and reported diverse patterns of use. Users perceived that the devices helped them increase productivity and improve patient care. Barriers to use concerned the device itself and personal and perceptual constraints, with perceptual factors such as comfort with technology, preference for paper, and the impression that the devices are not easy to use somewhat difficult to overcome. Participants suggested that organisations can help promote handheld computers by providing advice on purchase, usage, training, and user support. Participants expressed concern about reliability and security of the device but were particularly concerned about dependency on the device and over-reliance as a substitute for clinical thinking. Conclusions Doctors expect handheld computers to become more useful, and most seem interested in leveraging (getting the most value from) their use. Key opportunities with handheld computers included their use as a stepping stone to build doctors' comfort with other information technology and ehealth initiatives and providing point of care support that helps improve patient care. PMID:15142920
Operation plan for the data 100/LARS terminal system
NASA Technical Reports Server (NTRS)
Bowen, A. J., Jr.
1980-01-01
The Data 100/LARS terminal system provides an interface for processing on the IBM 3031 computer system at Purdue University's Laboratory for Applications of Remote Sensing. The environment in which the system is operated and supported is discussed. The general support responsibilities, procedural mechanisms, and training established for the benefit of the system users are defined.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-13
... science-based human health assessments to support the Agency's regulatory activities. The IRIS database... May 21, 2009, to develop human health assessments for inclusion in the IRIS database. The purpose of... if you will need audio-visual aid (e.g., laptop computer and slide projector). In general, each...
Attitude ground support system for the solar maximum mission spacecraft
NASA Technical Reports Server (NTRS)
Nair, G.
1980-01-01
The SMM attitude ground support system (AGSS) supports the acquisition of spacecraft roll attitude reference, performs the in-flight calibration of the attitude sensor complement, supports onboard control autonomy via onboard computer data base updates, and monitors onboard computer (OBC) performance. Initial roll attitude acquisition is accomplished by obtaining a coarse 3 axis attitude estimate from magnetometer and Sun sensor data and subsequently refining it by processing data from the fixed head star trackers. In-flight calibration of the attitude sensor complement is achieved by processing data from a series of slew maneuvers designed to maximize the observability and accuracy of the appropriate alignments and biases. To ensure autonomy of spacecraft operation, the AGSS selects guide stars and computes sensor occultation information for uplink to the OBC. The onboard attitude control performance is monitored on the ground through periodic attitude determination and processing of OBC data in downlink telemetry. In general, the control performance has met mission requirements. However, software and hardware problems have resulted in sporadic attitude reference losses.
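The abstract does not name the algorithm used to fuse magnetometer and Sun-sensor data into a coarse three-axis estimate; a standard choice for such a two-vector attitude fix is the TRIAD method, sketched here purely for illustration:

```python
# Hedged sketch of a coarse two-vector attitude fix via the classic
# TRIAD method; the abstract does not state that the AGSS used TRIAD.
import numpy as np

def triad(b1, b2, r1, r2):
    """Body observations b1, b2 and reference vectors r1, r2 -> DCM A with b = A r."""
    def frame(v1, v2):
        t1 = v1 / np.linalg.norm(v1)
        t2 = np.cross(v1, v2)
        t2 /= np.linalg.norm(t2)
        return np.column_stack([t1, t2, np.cross(t1, t2)])
    return frame(b1, b2) @ frame(r1, r2).T

# Demo: recover a known 90-degree rotation about the z axis.
A_true = np.array([[0., 1., 0.],
                   [-1., 0., 0.],
                   [0., 0., 1.]])
r_sun, r_mag = np.array([1., 0., 0.]), np.array([0., 1., 0.2])
b_sun, b_mag = A_true @ r_sun, A_true @ r_mag
print(np.allclose(triad(b_sun, b_mag, r_sun, r_mag), A_true))  # True
```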
Integration of High-Performance Computing into Cloud Computing Services
NASA Astrophysics Data System (ADS)
Vouk, Mladen A.; Sills, Eric; Dreher, Patrick
High-Performance Computing (HPC) projects span a spectrum of computer hardware implementations ranging from peta-flop supercomputers, high-end tera-flop facilities running a variety of operating systems and applications, to mid-range and smaller computational clusters used for HPC application development, pilot runs and prototype staging clusters. What they all have in common is that they operate as a stand-alone system rather than a scalable and shared user re-configurable resource. The advent of cloud computing has changed the traditional HPC implementation. In this article, we will discuss a very successful production-level architecture and policy framework for supporting HPC services within a more general cloud computing infrastructure. This integrated environment, called Virtual Computing Lab (VCL), has been operating at NC State since fall 2004. Nearly 8,500,000 HPC CPU-Hrs were delivered by this environment to NC State faculty and students during 2009. In addition, we present and discuss operational data that show that integration of HPC and non-HPC (or general VCL) services in a cloud can substantially reduce the cost of delivering cloud services (down to cents per CPU hour).
Technical Support. Focus on Careers.
ERIC Educational Resources Information Center
Thiers, Naomi
1996-01-01
Describes work conditions, education and training needs, and salaries of the following technician careers: auto/diesel technicians, auto body repairers, general maintenance mechanics, heating/air conditioning/refrigeration, paralegals, engineering technicians, science technicians, computer repairers, and drafters. (SK)
20. VIEW OF THE RECORDS STORAGE AREA LOCATED ON THE ...
20. VIEW OF THE RECORDS STORAGE AREA LOCATED ON THE FIRST FLOOR MEZZANINE. (1/83) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
1997-12-01
that I’ll turn my attention to that computer game we’ve talked so much about... Dave Van Veldhuizen and Scott Brown (soon-to-be Drs. Van Veldhuizen and...Industry Training Systems Conference. 1988. 37. Van Veldhuizen, D. A. and L. J. Hutson. "A Design Methodology for Domain Independent Computer...proposed by Van Veldhuizen and Hutson (37), extends the general architecture to support both a domain-independent approach to implementing CGFs and
Digital processing of Mariner 9 television data.
NASA Technical Reports Server (NTRS)
Green, W. B.; Seidman, J. B.
1973-01-01
The digital image processing performed by the Image Processing Laboratory (IPL) at JPL in support of the Mariner 9 mission is summarized. The support is divided into the general categories of image decalibration (the removal of photometric and geometric distortions from returned imagery), computer cartographic projections in support of mapping activities, and adaptive experimenter support (flexible support to provide qualitative digital enhancements and quantitative data reduction of returned imagery). Among the tasks performed were the production of maximum discriminability versions of several hundred frames to support generation of a geodetic control net for Mars, and special enhancements supporting analysis of Phobos and Deimos images.
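The abstract does not detail the IPL pipeline; as a hedged sketch, the photometric half of decalibration is conventionally a dark-subtraction and flat-field correction:

```python
# Hedged sketch of photometric decalibration: remove the sensor offset
# (dark frame) and pixel-to-pixel sensitivity variation (flat field).
# This is the conventional form, not the published IPL procedure.
import numpy as np

def photometric_decalibrate(raw, dark, flat):
    """Offset-subtract, then divide by the normalized pixel response."""
    gain = (flat - dark) / np.mean(flat - dark)   # normalized response
    return (raw - dark) / np.clip(gain, 1e-6, None)

rng = np.random.default_rng(1)
raw = rng.uniform(80.0, 120.0, (4, 4))
dark = np.full((4, 4), 10.0)
flat = np.full((4, 4), 100.0)
print(photometric_decalibrate(raw, dark, flat))
```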
12. VIEW OF THE NONDESTRUCTIVE TESTING EQUIPMENT BEING USED TO ...
12. VIEW OF THE NON-DESTRUCTIVE TESTING EQUIPMENT BEING USED TO DETECT FLAWS IN FABRICATED COMPONENTS. (6/76) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
NASA Technical Reports Server (NTRS)
Jain, Abhinandan
2011-01-01
Ndarts software provides algorithms for computing quantities associated with the dynamics of articulated, rigid-link, multibody systems. It is designed as a general-purpose dynamics library that can be used for the modeling of robotic platforms, space vehicles, molecular dynamics, and other such applications. The architecture and algorithms in Ndarts are based on the Spatial Operator Algebra (SOA) theory for computational multibody and robot dynamics developed at JPL. It uses minimal, internal coordinate models. The algorithms are low-order, recursive scatter/ gather algorithms. In comparison with the earlier Darts++ software, this version has a more general and cleaner design needed to support a larger class of computational dynamics needs. It includes a frames infrastructure, allows algorithms to operate on subgraphs of the system, and implements lazy and deferred computation for better efficiency. Dynamics modeling modules such as Ndarts are core building blocks of control and simulation software for space, robotic, mechanism, bio-molecular, and material systems modeling.
Parallelization strategies for continuum-generalized method of moments on the multi-thread systems
NASA Astrophysics Data System (ADS)
Bustamam, A.; Handhika, T.; Ernastuti; Kerami, D.
2017-07-01
Continuum-Generalized Method of Moments (C-GMM) remedies the shortfall of the Generalized Method of Moments (GMM), which is not as efficient as the Maximum Likelihood estimator, by using a continuum set of moment conditions in a GMM framework. However, this computation takes a very long time because the regularization parameter must be optimized. Unfortunately, these calculations are processed sequentially, whereas all modern computers are now supported by hierarchical memory systems and hyperthreading technology, which allow for parallel computing. This paper aims to speed up the calculation of C-GMM by designing a parallel algorithm for C-GMM on multi-thread systems. First, parallel regions are detected in the original C-GMM algorithm. Two parallel regions contribute significantly to the reduction of computational time: the outer loop and the inner loop. Furthermore, this parallel algorithm is implemented with the standard shared-memory application programming interface, Open Multi-Processing (OpenMP). The experiment shows that outer-loop parallelization is the best strategy for any number of observations.
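The paper's implementation targets OpenMP on shared-memory systems; the sketch below only illustrates the winning strategy in Python, since the outer loop over candidate regularization parameters is embarrassingly parallel. `cgmm_objective` is a hypothetical stand-in for the C-GMM criterion, not the paper's code:

```python
# Outer-loop strategy: evaluate the objective at each candidate
# regularization parameter independently and keep the best one.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def cgmm_objective(alpha):
    # Placeholder objective with a minimum in the interior of the grid.
    return (np.log(alpha) + 7.0) ** 2 + alpha

if __name__ == "__main__":
    grid = np.logspace(-6, 0, 64)
    with ProcessPoolExecutor() as ex:            # one task per grid point
        scores = list(ex.map(cgmm_objective, grid))
    best = grid[int(np.argmin(scores))]
    print("best regularization parameter:", best)
```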
Sutiono, Agung Budi; Suwa, Hirohiko; Ohta, Toshizumi; Arifin, Muh Zafrullah; Kitamura, Yohei; Yoshida, Kazunari; Merdika, Daduk; Qiantori, Andri; Iskandar
2012-12-01
Disasters bring negative impacts on the environment and human life. One common cause of critical condition is traumatic brain injury (TBI), namely epidural (EDH) and subdural hematoma (SDH), caused by falling hard objects during an earthquake. We proposed and analyzed the responses of users, namely neurosurgeons, general doctors/surgeons, and nurses, when they interacted with a TBI computer interface. The communication system was supported by TBI web-based applications using an emergency broadband access network with a tethered balloon, and was simulated in a field trial to evaluate the coverage area. The interface consisted of demography data and multiple tabs for anamnesis, treatment, follow-up, and teleconference. The interface allows neurosurgeons, surgeons/general doctors, and nurses to enter EDH and SDH patients' data while referring them during the emergency simulation, and was evaluated based on the time needed and the users' understanding. The average time needed with a Lenovo T500 notebook using a mouse was 8-10 min for neurosurgeons, 12-15 min for surgeons/general doctors, and 15-19 min for nurses. With a ThinkPad X201 Tablet, the time needed for data entry was 5-7 min for neurosurgeons, 7-10 min for surgeons/general doctors, and 12-16 min for nurses. We observed that the time difference depended on the computer type and the user's computer literacy, as well as their understanding of traumatic brain injury, particularly for the nurses. In conclusion, there are five data classifications for a simple TBI GUI: 1) demography, 2) specific anamnesis for EDH and SDH, 3) treatment and medication of TBI, 4) follow-up data display, and 5) teleneurosurgery for streaming video consultation. A tablet PC was more convenient and faster for data entry than a computer with a mouse/touchpad. An emergency broadband access network using a tethered balloon can cover the communication systems in a disaster area.
1998-01-01
Engineering mockup shows the general arrangement of the planned Biotechnology Facility inside an EXPRESS rack aboard the International Space Station. This layout includes a gas supply module (bottom left), control computer and laptop interface (bottom right), two rotating wall vessels (top right), and support systems.
Advances in Grid Computing for the FabrIc for Frontier Experiments Project at Fermilab
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herner, K.; Alba Hernandex, A. F.; Bhat, S.
The FabrIc for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back-end resources, including public clouds and high-performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, and a production operations effort to help experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and to support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed Computing Access with Federated Identities (DCAFI) has been put in place that has eliminated our dependence on a Fermilab-specific third-party Certificate Authority service and better accommodates FIFE collaborators without a Fermilab Kerberos account. DCAFI integrates the existing InCommon federated identity infrastructure, CILogon Basic CA, and a MyProxy service using a new general-purpose open source tool. We will discuss the general FIFE onboarding strategy, progress in expanding FIFE experiments' presence on the Open Science Grid, new tools for job monitoring, the POMS service, and the DCAFI project.
Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Herner, K.; Alba Hernandez, A. F.; Bhat, S.; Box, D.; Boyd, J.; Di Benedetto, V.; Ding, P.; Dykstra, D.; Fattoruso, M.; Garzoglio, G.; Kirby, M.; Kreymer, A.; Levshina, T.; Mazzacane, A.; Mengel, M.; Mhashilkar, P.; Podstavkov, V.; Retzke, K.; Sharma, N.; Teheran, J.
2017-10-01
The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back-end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, and a production operations effort to help experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed Computing Access with Federated Identities (DCAFI) has been put in place that has eliminated our dependence on a Fermilab-specific third-party Certificate Authority service and better accommodates FIFE collaborators without a Fermilab Kerberos account. DCAFI integrates the existing InCommon federated identity infrastructure, CILogon Basic CA, and a MyProxy service using a new general purpose open source tool. We will discuss the general FIFE onboarding strategy, progress in expanding FIFE experiments' presence on the Open Science Grid, new tools for job monitoring, the POMS service, and the DCAFI project.
23. VIEW OF THE FIRST FLOOR PLAN. THE FIRST FLOOR ...
23. VIEW OF THE FIRST FLOOR PLAN. THE FIRST FLOOR HOUSED ADMINISTRATIVE OFFICES, THE CENTRAL COMPUTING, UTILITY SYSTEMS, ANALYTICAL LABORATORIES, AND MAINTENANCE SHOPS. THE ORIGINAL DRAWING HAS BEEN ARCHIVED ON MICROFILM. THE DRAWING WAS REPRODUCED AT THE BEST QUALITY POSSIBLE. LETTERS AND NUMBERS IN THE CIRCLES INDICATE FOOTER AND/OR COLUMN LOCATIONS. - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
Knowledge management: Role of the Radiation Safety Information Computational Center (RSICC)
NASA Astrophysics Data System (ADS)
Valentine, Timothy
2017-09-01
The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.
The Recovery of a Clinical Database Management System after Destruction by Fire *
Covvey, H.D.; McAlister, N.H.; Greene, J.; Wigle, E.D.
1981-01-01
In August 1980 a fire in the Cardiovascular Unit at Toronto General Hospital severely damaged the physical plant and rendered all on-site equipment unrecoverable. Among the hardware items in the fire was the computer which supports our cardiovascular database system. Within hours after the fire it was determined that the computer was no longer serviceable. Beyond off-site back-up tapes, there was the possibility that recent records on the computer had suffered a similar fate. Immediate procedures were instituted to obtain a replacement computer system and to clean media to permit data recovery. Within 2 months a partial system was supporting all users, and all data was recovered and being used. The destructive potential of a fire is rarely seriously considered relative to computer equipment in our clinical environments. Full-replacement value insurance; an excellent equipment supplier with the capacity to respond to an emergency; backup and recovery procedures with off-site storage; and dedicated staff are key hedges against disaster.
CCOMP: An efficient algorithm for complex roots computation of determinantal equations
NASA Astrophysics Data System (ADS)
Zouros, Grigorios P.
2018-01-01
In this paper a free Python algorithm, entitled CCOMP (Complex roots COMPutation), is developed for the efficient computation of complex roots of determinantal equations inside a prescribed complex domain. The key to the method presented is the efficient determination of candidate points inside the domain in whose close neighborhood a complex root may lie. Once these points are detected, the algorithm proceeds to a two-dimensional minimization problem with respect to the minimum modulus eigenvalue of the system matrix. At the core of CCOMP are three sub-algorithms whose tasks are the efficient estimation of the minimum modulus eigenvalues of the system matrix inside the prescribed domain, the efficient computation of candidate points which guarantee the existence of minima, and finally, the computation of minima via bound-constrained minimization algorithms. Theoretical results and heuristics support the development and the performance of the algorithm, which is discussed in detail. CCOMP supports general complex matrices, and its efficiency, applicability and validity are demonstrated on a variety of microwave applications.
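A hedged sketch of the overall strategy follows (coarse scan for candidate points where the minimum-modulus eigenvalue is small, then local two-dimensional minimization; the 2x2 matrix below is a toy determinantal system, not one from the paper):

```python
# Scan a complex rectangle for candidate points, then polish each one
# by minimizing the minimum-modulus eigenvalue of the system matrix.
import numpy as np
from scipy.optimize import minimize

def system_matrix(z):
    return np.array([[z**2 - 1.0, 0.3],
                     [0.3, z + 2.0]])

def min_mod_eig(xy):
    z = complex(xy[0], xy[1])
    return np.min(np.abs(np.linalg.eigvals(system_matrix(z))))

# Coarse scan for candidate starting points.
xs, ys = np.linspace(-3, 3, 61), np.linspace(-2, 2, 41)
candidates = [(x, y) for x in xs for y in ys if min_mod_eig([x, y]) < 0.2]

# Polish each candidate and keep the distinct converged roots.
roots = set()
for c in candidates:
    res = minimize(min_mod_eig, c, method="Nelder-Mead",
                   options={"xatol": 1e-12, "fatol": 1e-12})
    if res.fun < 1e-6:
        roots.add(complex(round(res.x[0], 4), round(res.x[1], 4)))
print(sorted(roots, key=abs))
```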
Embedded systems for supporting computer accessibility.
Mulfari, Davide; Celesti, Antonio; Fazio, Maria; Villari, Massimo; Puliafito, Antonio
2015-01-01
Nowadays, customized AT software solutions allow their users to interact with various kinds of computer systems. Such tools are generally available on the personal devices (e.g., smartphones, laptops, and so on) commonly used by a person with a disability. In this paper, we investigate a way of using the aforementioned AT equipment to access many different devices without assistive preferences. The solution takes advantage of open-source hardware, and its core component consists of an affordable Linux embedded system: it grabs data coming from the assistive software, which runs on the user's personal device, and then, after processing, it generates native keyboard and mouse HID commands for the target computing device controlled by the end user. This process supports any operating system available on the target machine and requires no specialized software installation; therefore the user with a disability can rely on a single assistive tool to control a wide range of computing platforms, including conventional computers and many kinds of mobile devices, which receive input commands through the USB HID protocol.
The super-Turing computational power of plastic recurrent neural networks.
Cabessa, Jérémie; Siegelmann, Hava T
2014-12-01
We study the computational capabilities of a biologically inspired neural model where the synaptic weights, the connectivity pattern, and the number of neurons can evolve over time rather than stay static. Our study focuses on the mere concept of plasticity of the model, so the nature of the updates is assumed to be unconstrained. In this context, we show that the so-called plastic recurrent neural networks (RNNs) are capable of precisely the super-Turing computational power, like the static analog neural networks, irrespective of whether their synaptic weights are modeled by rational or real numbers, and irrespective of whether their patterns of plasticity are restricted to bi-valued updates or expressed by any other more general form of updating. Consequently, the incorporation of only bi-valued plastic capabilities in a basic model of RNNs suffices to break the Turing barrier and achieve the super-Turing level of computation. The consideration of more general mechanisms of architectural plasticity or of real synaptic weights does not further increase the capabilities of the networks. These results support the claim that the general mechanism of plasticity is crucially involved in the computational and dynamical capabilities of biological neural networks. They further show that the super-Turing level of computation reflects in a suitable way the capabilities of brain-like models of computation.
An Organization Design for the PERSPAY Consolidated Data Center.
1983-06-01
GENERAL; B. PROGRAM DEVELOPMENT; C. COMPUTER OPERATIONS; D. TECHNICAL SUPPORT BRANCH; E. MANAGEMENT OF THE D.P. ORGANIZATION...support, and 4) management, including planning, administration, and control (Brandon, Norton, Gaydasch, Frank). The following list shows the... Management policy on just what the ADP installation is expected to do is an information requirement that must be satisfied in order for the operation to
Practical somewhat-secure quantum somewhat-homomorphic encryption with coherent states
NASA Astrophysics Data System (ADS)
Tan, Si-Hui; Ouyang, Yingkai; Rohde, Peter P.
2018-04-01
We present a scheme for implementing homomorphic encryption on coherent states encoded using phase-shift keys. The encryption operations require only rotations in phase space, which commute with computations in the code space performed via passive linear optics, and with generalized nonlinear phase operations that are polynomials of the photon-number operator in the code space. This encoding scheme can thus be applied to any computation with coherent-state inputs, and the computation proceeds via a combination of passive linear optics and generalized nonlinear phase operations. An example of such a computation is matrix multiplication, whereby a vector representing coherent-state amplitudes is multiplied by a matrix representing a linear optics network, yielding a new vector of coherent-state amplitudes. By finding an orthogonal partitioning of the support of our encoded states, we quantify the security of our scheme via the indistinguishability of the encrypted code words. While we focus on coherent-state encodings, we expect that this phase-key encoding technique could apply to any continuous-variable computation scheme where the phase-shift operator commutes with the computation.
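The commuting structure at the heart of the scheme can be stated compactly. The following is a sketch for a single mode, assuming (as described above) that encryption is a photon-number phase rotation and the permitted nonlinear gates are exponentials of polynomials in the number operator:

```latex
% Both operators are functions of the number operator \hat{n} alone,
% so they commute; computing on the ciphertext therefore equals
% encrypting the computed state.
\[
  \hat{R}(\theta) = e^{i\theta\hat{n}}, \qquad
  \hat{V} = e^{i\,p(\hat{n})} \quad (p \text{ a polynomial}),
\]
\[
  [\hat{V},\hat{R}(\theta)] = 0
  \;\Longrightarrow\;
  \hat{V}\,\hat{R}(\theta)\,\lvert\alpha\rangle
  = \hat{R}(\theta)\,\hat{V}\,\lvert\alpha\rangle .
\]
```

For multimode computations the same argument applies when every mode is rotated by the same key phase, since passive linear optics preserves the total photon number and hence commutes with the global rotation.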
Evaluating open-source cloud computing solutions for geosciences
NASA Astrophysics Data System (ADS)
Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong
2013-09-01
Many organizations are starting to adopt cloud computing to better utilize computing resources, taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting Geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating the cloud resources as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in the central processing unit (CPU), memory, and I/O of virtual machines created and managed by the different solutions; (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies; (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes, and networking, followed by OpenNebula; and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.
Review of Collaborative Tools for Planning and Engineering
2007-10-01
including PDAs) and operating systems. In general, should support laptops, desktops, Windows OS, Mac OS, Palm OS, Windows CE, BlackBerry, Sun...better), voting (to establish operating parameters), reactor design, wind tunnel simulation. Display same material on every computer, synchronisation
Educational Systems Design Implications of Electronic Publishing.
ERIC Educational Resources Information Center
Romiszowski, Alexander J.
1994-01-01
Discussion of electronic publishing focuses on the four main purposes of media in general: communication, entertainment, motivation, and education. Highlights include electronic journals and books; hypertext; user control; computer graphics and animation; electronic games; virtual reality; multimedia; electronic performance support;…
STS-109 MS Currie on aft flight deck
2002-03-04
STS109-E-5291 (1-12 March 2002) --- Astronaut Nancy J. Currie, STS-109 mission specialist, works with Payload and General Support Computers (PGSC) on the mid deck of the Space Shuttle Columbia. The image was taken with a digital still camera.
National Test Bed Security and Communications Architecture Working Group Report
1992-04-01
computer systems via a physical medium. Most of those physical media are tappable or interceptable. This means that all the data that flows across the...provides the capability for NTBN nodes to support users operating in differing COIs to share the computing resources and communication media and for...representation. Again generally speaking, the NTBN must act as the high-speed, wide-bandwidth communications media that would provide the "near real-time
Software for Brain Network Simulations: A Comparative Study
Tikidji-Hamburyan, Ruben A.; Narayana, Vikram; Bozkus, Zeki; El-Ghazawi, Tarek A.
2017-01-01
Numerical simulations of brain networks are a critical part of our efforts in understanding brain functions under pathological and normal conditions. For several decades, the community has developed many software packages and simulators to accelerate research in computational neuroscience. In this article, we select the three most popular simulators, as determined by the number of models in the ModelDB database (NEURON, GENESIS, and BRIAN), and perform an independent evaluation of these simulators. In addition, we study NEST, one of the lead simulators of the Human Brain Project. First, we study them based on one of the most important characteristics, the range of supported models. Our investigation reveals that brain network simulators may be biased toward supporting a specific set of models. However, all simulators tend to expand the supported range of models by providing a universal environment for the computational study of individual neurons and brain networks. Next, our investigations on the characteristics of computational architecture and efficiency indicate that all simulators compile the most computationally intensive procedures into binary code, with the aim of maximizing their computational performance. However, not all simulators provide the simplest method for module development and/or guarantee efficient binary code. Third, a study of their amenability for high-performance computing reveals that NEST can almost transparently map an existing model on a cluster or multicore computer, while NEURON requires code modification if a model developed for a single computer has to be mapped on a computational cluster. Interestingly, parallelization is the weakest characteristic of BRIAN, which provides no support for cluster computations and limited support for multicore computers. Fourth, we identify the level of user support and frequency of usage for all simulators. Finally, we carry out an evaluation using two case studies: a large network with simplified neural and synaptic models and a small network with detailed models. These two case studies allow us to avoid any bias toward a particular software package. The results indicate that BRIAN provides the most concise language for both cases considered. Furthermore, as expected, NEST mostly favors large network models, while NEURON is better suited for detailed models. Overall, the case studies reinforce our general observation that simulators have a bias in computational performance toward specific types of brain network models. PMID:28775687
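The claim that BRIAN offers the most concise model description is easy to illustrate. Below is a minimal randomly connected leaky integrate-and-fire network written against the Brian 2 API; the parameters are arbitrary placeholders, not values from the study's case studies.

```python
from brian2 import NeuronGroup, Synapses, SpikeMonitor, run, ms

# Leaky integrate-and-fire neurons driven toward threshold; dimensionless v.
eqs = 'dv/dt = (1.1 - v) / (10*ms) : 1'

G = NeuronGroup(100, eqs, threshold='v > 1', reset='v = 0', method='exact')
G.v = 'rand()'                          # random initial membrane potentials

S = Synapses(G, G, on_pre='v += 0.2')   # instantaneous excitatory synapses
S.connect(p=0.1)                        # 10% random connectivity

M = SpikeMonitor(G)
run(100 * ms)                           # simulate 100 ms of activity
print(M.num_spikes)
```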
NERVA dynamic analysis methodology, SPRVIB
NASA Technical Reports Server (NTRS)
Vronay, D. F.
1972-01-01
The general dynamic computer code called SPRVIB (Spring Vib) developed in support of the NERVA (nuclear engine for rocket vehicle application) program is described. Using normal mode techniques, the program computes kinematical responses of a structure caused by various combinations of harmonic and elliptic forcing functions or base excitations. Provision is made for a graphical type of force or base excitation input to the structure. A description of the required input format and a listing of the program are presented, along with several examples illustrating the use of the program. SPRVIB is written in FORTRAN 4 computer language for use on the CDC 6600 or the IBM 360/75 computers.
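As a sketch of the normal-mode computation such a code performs, the following Python fragment evaluates the steady-state response of a damped structure to a harmonic force by modal superposition. The modal data (frequencies, damping ratios, mode shapes) are invented placeholders, not SPRVIB data, and SPRVIB's elliptic forcing and base-excitation options are not reproduced.

```python
import numpy as np

# Hypothetical modal data for a 3-DOF structure: natural frequencies
# (rad/s), damping ratios, and mass-normalized mode shapes.
omega = np.array([10.0, 25.0, 60.0])
zeta = np.array([0.02, 0.02, 0.05])
Phi, _ = np.linalg.qr(np.random.default_rng(1).normal(size=(3, 3)))

F = np.array([1.0, 0.0, 0.0])   # harmonic force amplitude applied at DOF 0
W = 12.0                        # forcing frequency (rad/s)

# Steady-state modal coordinates for an excitation F * exp(i*W*t):
#   q_k = (phi_k . F) / (omega_k^2 - W^2 + 2i*zeta_k*omega_k*W)
q = (Phi.T @ F) / (omega**2 - W**2 + 2j * zeta * omega * W)

# Physical response amplitudes; complex values carry magnitude and phase.
x = Phi @ q
print(np.abs(x), np.angle(x))
```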
7. VIEW OF MACHINE SHOP IN BUILDING 881. WORKERS IN ...
7. VIEW OF MACHINE SHOP IN BUILDING 881. WORKERS IN THE MACHINE SHOP FORMED ENRICHED URANIUM COMPONENTS INTO THEIR FINAL SHAPES. (12/12/56) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
Biotechnology Facility (BTF) for ISS
NASA Technical Reports Server (NTRS)
1998-01-01
Engineering mockup shows the general arrangement of the planned Biotechnology Facility inside an EXPRESS rack aboard the International Space Station. This layout includes a gas supply module (bottom left), control computer and laptop interface (bottom right), two rotating wall vessels (top right), and support systems.
17. VIEW OF HYDRIDING SYSTEM IN BUILDING 881. THE HYDRIDING ...
17. VIEW OF HYDRIDING SYSTEM IN BUILDING 881. THE HYDRIDING SYSTEM WAS PART OF THE FAST ENRICHED URANIUM RECOVERY PROCESS. (11/11/59) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
MSIX - A general and user-friendly platform for RAM analysis
NASA Astrophysics Data System (ADS)
Pan, Z. J.; Blemel, Peter
The authors present a CAD (computer-aided design) platform supporting RAM (reliability, availability, and maintainability) analysis with efficient system description and alternative evaluation. The design concepts, implementation techniques, and application results are described. This platform is user-friendly because of its graphic environment, drawing facilities, object orientation, self-tutoring, and access to the operating system. The programs' independence and portability make them generally applicable to various analysis tasks.
Bartha, Michael C; Allie, Paul; Kokot, Douglas; Roe, Cynthia Purvis
2015-01-01
Computer users continue to report eye and upper body discomfort even as workstation flexibility has improved. Research shows a relationship between character size, viewing distance, and reading performance. Few reports exist regarding text height viewed under normal office work conditions and eye discomfort. This paper reports self-selected computer display placement, text characteristics, and subjective comfort for older and younger computer workers under real-world conditions. Computer workers were provided with monitors and adjustable display support(s). In Study 1, older workers wearing progressive-addition lenses (PALs) were observed. In Study 2, older workers wearing multifocal lenses and younger workers were observed. Workers wearing PALs experienced less eye and body discomfort with adjustable displays, and less eye and neck discomfort for text visual angles near or greater than ergonomic recommendations. Older workers wearing multifocal correction positioned displays much lower than younger workers. In general, computer users did not adjust character size to ensure that foveal images of text fell within the recommended range. Ergonomic display placement recommendations should be different for computer users wearing multifocal correction for presbyopia. Ergonomic training should emphasize adjusting text size for user comfort.
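The character-size/viewing-distance relationship underlying these recommendations is usually expressed as a visual angle, which the helper below computes. The 20-22 arcmin comfort range in the comment is a commonly cited ergonomic guideline assumed for illustration, not a value taken from this paper.

```python
import math

def text_visual_angle_arcmin(char_height_mm: float, distance_mm: float) -> float:
    """Visual angle subtended by a character, in minutes of arc."""
    return 2 * math.atan(char_height_mm / (2 * distance_mm)) * 180 / math.pi * 60

# Example: a 3 mm capital height viewed at 650 mm.
angle = text_visual_angle_arcmin(3.0, 650.0)
# A roughly 20-22 arcmin comfort range is assumed here for illustration.
print(f"{angle:.1f} arcmin; within assumed comfort range: {20 <= angle <= 22}")
```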
Jimenez-Molina, Angel; Gaete-Villegas, Jorge; Fuentes, Javier
2018-06-01
New advances in telemedicine, ubiquitous computing, and artificial intelligence have supported the emergence of more advanced applications and support systems for chronic patients. This trend addresses the important problem of chronic illnesses, highlighted by multiple international organizations as a core issue in future healthcare. Despite the myriad of exciting new developments, each application and system is designed and implemented for specific purposes and lacks the flexibility to support different healthcare concerns. Some of the known problems of such developments are the integration issues between applications and existing healthcare systems, the reusability of technical knowledge in the creation of new and more sophisticated systems, and the usage of data gathered from multiple sources in the generation of new knowledge. This paper proposes a framework for the development of chronic disease support systems and applications as an answer to these shortcomings. Through this framework, we pursue a common-ground methodology upon which new developments can be created and easily integrated to provide better support to chronic patients, medical staff and other relevant participants. General requirements for any support system are inferred from the primary attention process of chronic patients, modeled in Business Process Model and Notation. Numerous technical approaches are proposed to design a general architecture that considers the medical organizational requirements in the treatment of a patient. A framework is presented for any application in support of chronic patients and evaluated by a case study to test the applicability and pertinence of the solution. Copyright © 2018 Elsevier Inc. All rights reserved.
Tobitt, Simon; Percival, Robert
2017-07-04
UK society is undergoing a technological revolution, including meeting health needs through technology. Government policy is shifting towards a "digital by default" position. Studies have trialled health technology interventions for those experiencing psychosis and shown them to be useful. This study aimed to gauge levels of engagement with mobile phones (Internet-enabled or basic cell phones), computers and the Internet in the specific population of community mental health rehabilitation. Two surveys were conducted: one with service-users on use/non-use of technologies and interest in technology interventions and support, and one with placements on the facilities and support available to service-users. Levels of engagement in this population were substantially lower than those recorded in the general UK and other clinical populations: 40.2% regularly use mobiles, 17.5% computers, and 14.4% the Internet. Users of all three technologies were significantly younger than non-users. Users of mobiles and computers were significantly more likely to live in lower-support/higher-independence placements. Of surveyed placements, 35.5% provide a communal computer and 38.7% IT skills sessions. Community mental health rehabilitation service-users risk finding themselves excluded by a "digital divide". Action is needed to ensure equal access to online opportunities, including healthcare innovations. Clinical and policy implications are discussed.
Nearly Supersymmetric Dark Atoms
Behbahani, Siavosh R.; Jankowiak, Martin; Rube, Tomas; ...
2011-01-01
Theories of dark matter that support bound states are an intriguing possibility for the identity of the missing mass of the Universe. This article proposes a class of models of supersymmetric composite dark matter where the interactions with the Standard Model communicate supersymmetry breaking to the dark sector. In these models, supersymmetry breaking can be treated as a perturbation on the spectrum of bound states. Using a general formalism, the spectrum with leading supersymmetry effects is computed without specifying the details of the binding dynamics. The interactions of the composite states with the Standard Model are computed, and several benchmark models are described. General features of nonrelativistic supersymmetric bound states are emphasized.
Environmental aspects of the transuranics: a selected, annotated bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, F. M.; Sanders, C. T.; Talmage, S. S.
This fourth published bibliography of 528 references is from the computer information file built to provide support to the Nevada Applied Ecology Group (NAEG) of the AEC Nevada Operations Office. The general scope is environmental aspects of uranium and the transuranic elements, with a preponderance of material on plutonium. In addition, there are supporting materials involving basic ecology or general reviews on other nuclides that are entered at the request of the NAEG. References provide findings-oriented abstracts. Numerical data are noted in the comment field. Indexes are given for author, subject category, keywords, geographic location, permuted title, taxons, and publication description.
The Impact of New Technology on Accounting Education.
ERIC Educational Resources Information Center
Shaoul, Jean
The introduction of computers in the Department of Accounting and Finance at Manchester University is described. General background outlining the increasing need for microcomputers in the accounting curriculum (including financial modelling tools and decision support systems such as linear programming, statistical packages, and simulation) is…
Networking Foundations for Collaborative Computing at Internet Scope
2006-01-01
network-supported synchronous multimedia groupwork at Internet scope and for large user groups. Contributions entail a novel classification for...multimedia resources in interactive groupwork, generalized to the domain of CSCW from the "right to speak" [26]. A floor control protocol mediates access to
5. VIEW OF THE FOUNDRY. IN THE FOUNDRY, ENRICHED URANIUM ...
5. VIEW OF THE FOUNDRY. IN THE FOUNDRY, ENRICHED URANIUM WAS CAST INTO SLABS OR INGOTS FROM WHICH WEAPONS COMPONENTS WERE FABRICATED. (4/4/66) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
4. VIEW OF THE FOUNDRY. IN THE FOUNDRY, ENRICHED URANIUM ...
4. VIEW OF THE FOUNDRY. IN THE FOUNDRY, ENRICHED URANIUM WAS CAST INTO SLABS OR INGOTS FROM WHICH WEAPONS COMPONENTS WERE FABRICATED. (5/17/62). - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
VOA - MS Dunbar supervises experiments on Spacehab
1998-03-04
S89-E-5660 (22-31 Jan 1998) --- Using a Payload General Support Computer (PGSC) onboard the Space Shuttle Atlantis, astronaut Bonnie J. Dunbar, mission specialist, enters data associated with supply transfer to Mir Space Station. The photograph was taken with an Electronic Still Camera (ESC).
Generalized environmental control and life support system computer program (G1894), phase 3
NASA Technical Reports Server (NTRS)
Mcenulty, R. E.
1978-01-01
The work performed during Phase 3 of the Generalized Environmental Control Life Support System (ECLSS) Computer Program is reported. Phase 3 of this program covered the period from December 1977 to September 1978. The computerized simulation of the Shuttle Orbiter ECLSS was upgraded in the following areas: (1) the payload loop of the Shuttle simulation was completely recoded and checked out; (2) the Shuttle simulation water and freon loop initialization logic was simplified to permit easier program input for the user; (3) the computerized simulation was modified to accept the WASP subroutine, which is a subroutine to evaluate thermal properties of water and freon; (4) the 1108 operating system was upgraded by LEC; (5) the Shuttle simulation was modified to permit failure cases which simulate zero component flow values; and (6) the Shuttle SEPS version was modified and secure files were set up on the 1108 and 1110 systems to permit simulation runs to be made from remote terminals.
A Computational Model of Public Support for Insurgency
RAND
2013-01-01
...prototype model illustrating concretely a new approach. The prototype model itself should be seen not as a definitive end point, but rather as a...
Client-server program analysis in the EPOCA environment
NASA Astrophysics Data System (ADS)
Donatelli, Susanna; Mazzocca, Nicola; Russo, Stefano
1996-09-01
Client-server processing is a popular paradigm for distributed computing. In the development of client-server programs, the designer first has to ensure that the implementation behaves correctly, in particular that it is deadlock free. Second, he has to guarantee that the program meets predefined performance requirements. This paper addresses the issues in the analysis of client-server programs in EPOCA. EPOCA is a computer-aided software engineering (CASE) support system that allows the automated construction and analysis of generalized stochastic Petri net (GSPN) models of concurrent applications. The paper describes, on the basis of a realistic case study, how client-server systems are modelled in EPOCA, and the kind of qualitative and quantitative analysis supported by its tools.
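The quantitative side of GSPN analysis ultimately reduces the net to a continuous-time Markov chain whose steady-state distribution yields throughput and utilization measures. The following generic sketch shows that final step; the generator matrix is an arbitrary example, not an EPOCA model.

```python
import numpy as np

# Generator matrix Q of a small continuous-time Markov chain: off-diagonal
# entries are transition rates, and each row sums to zero.
Q = np.array([
    [-3.0,  2.0,  1.0],
    [ 1.0, -1.5,  0.5],
    [ 2.0,  0.0, -2.0],
])

# The steady-state distribution pi solves pi @ Q = 0 with sum(pi) = 1;
# one balance equation is replaced by the normalization constraint.
A = np.vstack([Q.T[:-1], np.ones(3)])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print(pi)    # performance measures (throughput, utilization) follow from pi
```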
Computer Analysis Of High-Speed Roller Bearings
NASA Technical Reports Server (NTRS)
Coe, H.
1988-01-01
High-speed cylindrical roller-bearing analysis program (CYBEAN) developed to compute behavior of cylindrical rolling-element bearings at high speeds and with misaligned shafts. With program, accurate assessment of geometry-induced roller preload possible for variety of outer-ring and housing configurations and loading conditions. Enables detailed examination of bearing performance and permits exploration of causes and consequences of bearing skew. Provides general capability for assessment of designs of bearings supporting main shafts of engines. Written in FORTRAN IV.
Discrete mathematics course supported by CAS MATHEMATICA
NASA Astrophysics Data System (ADS)
Ivanov, O. A.; Ivanova, V. V.; Saltan, A. A.
2017-08-01
In this paper, we discuss examples of assignments for a course in discrete mathematics for undergraduate students majoring in business informatics. We consider several problems with computer-based solutions and discuss general strategies for using computers in teaching mathematics and its applications. In order to evaluate the effectiveness of our approach, we conducted an anonymous survey. The results of the survey provide evidence that our approach contributes to high outcomes and aligns with the course aims and objectives.
NASA Technical Reports Server (NTRS)
Mcenulty, R. E.
1977-01-01
The G189A simulation of the Shuttle Orbiter ECLSS was upgraded. All simulation library versions and simulation models were converted from the EXEC2 to the EXEC8 computer system, and a new program, G189PL, was added to the combination master program library. The program permits the post-plotting of up to 100 frames of plot data over any time interval of a G189 simulation run. The overlay structure of the G189A simulations was restructured to conserve computer core and minimize run time.
Integrating Grid Services into the Cray XT4 Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
NERSC; Cholia, Shreyas; Lin, Hwa-Chun Wendy
2009-05-01
The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped-down compute-node operating system without dynamic library support, a shared-root environment, and idiosyncratic application launching. In our work, we describe how we resolved these challenges on a running, general-purpose production system to provide on-demand compute, storage, accounting and monitoring services through generic grid interfaces that mask the underlying system-specific details for the end user.
Applications of intelligent computer-aided training
NASA Technical Reports Server (NTRS)
Loftin, R. B.; Savely, Robert T.
1991-01-01
Intelligent computer-aided training (ICAT) systems simulate the behavior of an experienced instructor observing a trainee, responding to help requests, diagnosing and remedying trainee errors, and proposing challenging new training scenarios. This paper presents a generic ICAT architecture that supports the efficient development of ICAT systems for varied tasks. In addition, details of ICAT projects, built with this architecture, that deliver specific training for Space Shuttle crew members, ground support personnel, and flight controllers are presented. Concurrently with the creation of specific ICAT applications, a general-purpose software development environment for ICAT systems is being built. The widespread use of such systems for both ground-based and on-orbit training will serve to preserve task and training expertise, support the training of large numbers of personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences.
Donato, David I.
2012-01-01
This report presents the mathematical expressions and the computational techniques required to compute maximum-likelihood estimates for the parameters of the National Descriptive Model of Mercury in Fish (NDMMF), a statistical model used to predict the concentration of methylmercury in fish tissue. The expressions and techniques reported here were prepared to support the development of custom software capable of computing NDMMF parameter estimates more quickly and using less computer memory than is currently possible with available general-purpose statistical software. Computation of maximum-likelihood estimates for the NDMMF by numerical solution of a system of simultaneous equations through repeated Newton-Raphson iterations is described. This report explains the derivation of the mathematical expressions required for computational parameter estimation in sufficient detail to facilitate future derivations for any revised versions of the NDMMF that may be developed.
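The report's estimation strategy, repeated Newton-Raphson iterations on the likelihood equations, can be sketched generically. The snippet below applies it to a simple normal-likelihood stand-in, since the NDMMF equations themselves are not reproduced in the abstract; the data and starting values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.5, size=500)       # stand-in data set

def score_and_hessian(mu, sigma):
    """Gradient and Hessian of the normal log-likelihood in (mu, sigma)."""
    n, r = len(y), y - mu
    g = np.array([r.sum() / sigma**2,
                  -n / sigma + (r**2).sum() / sigma**3])
    H = np.array([[-n / sigma**2,           -2 * r.sum() / sigma**3],
                  [-2 * r.sum() / sigma**3,  n / sigma**2 - 3 * (r**2).sum() / sigma**4]])
    return g, H

theta = np.array([0.0, 1.0])                       # initial guess (mu, sigma)
for _ in range(50):                                # repeated Newton-Raphson steps
    g, H = score_and_hessian(*theta)
    step = np.linalg.solve(H, g)
    theta = theta - step                           # theta <- theta - H^{-1} g
    if np.max(np.abs(step)) < 1e-10:               # converged
        break
print(theta)    # approaches the MLE: the sample mean and the ML sigma
```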
A Computational Model of Reasoning from the Clinical Literature
Rennels, Glenn D.
1986-01-01
This paper explores the premise that a formalized representation of empirical studies can play a central role in computer-based decision support. The specific motivations underlying this research include the following propositions: 1. Reasoning from experimental evidence contained in the clinical literature is central to the decisions physicians make in patient care. 2. A computational model, based upon a declarative representation for published reports of clinical studies, can drive a computer program that selectively tailors knowledge of the clinical literature as it is applied to a particular case. 3. The development of such a computational model is an important first step toward filling a void in computer-based decision support systems. Furthermore, the model may help us better understand the general principles of reasoning from experimental evidence both in medicine and other domains. Roundsman is a developmental computer system which draws upon structured representations of the clinical literature in order to critique plans for the management of primary breast cancer. Roundsman is able to produce patient-specific analyses of breast cancer management options based on the 24 clinical studies currently encoded in its knowledge base. The Roundsman system is a first step in exploring how the computer can help to bring a critical analysis of the relevant literature to the physician, structured around a particular patient and treatment decision.
2008-03-28
in-plane bending stiffness. Figure 4. Non-Symmetric General Buckling. In accordance with equations (4) through (11), the...the DAPS3 version of the code documented in reference 1, the DAPS4 code computes the stresses and deflections, interbay buckling pressure, general ... plane and out-of-plane bending, eliminating the simple support assumption at the bay ends. b. Stresses and deflections at all points between the
Airborne Electro-Optical Sensor Simulation System. Final Report.
ERIC Educational Resources Information Center
Hayworth, Don
The total system capability, including all the special purpose and general purpose hardware comprising the Airborne Electro-Optical Sensor Simulation (AEOSS) System, is described. The functional relationship between hardware portions is described together with interface to the software portion of the computer image generation. Supporting rationale…
General purpose computer program for interacting supersonic configurations: Programmer's manual
NASA Technical Reports Server (NTRS)
Crill, W.; Dale, B.
1977-01-01
The program ISCON (Interacting Supersonic Configuration) is described. The program supports the development of a numerical procedure for determining the unsteady dynamic forces on interacting wings and tails in supersonic flow. Subroutines are presented along with the complete FORTRAN source listing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tukey, J.W.; Bloomfield, P.
In its most general terms, the work carried out under the contract consists of the development of new data analytic methods and the improvement of existing methods, their implementation on computers, especially minicomputers, and the development of non-statistical, systems-level software to support these activities. The work reported or completed is reviewed. (GHT)
Role of Prefrontal Cortex in Learning and Generalizing Hierarchical Rules in 8-Month-Old Infants.
Werchan, Denise M; Collins, Anne G E; Frank, Michael J; Amso, Dima
2016-10-05
Recent research indicates that adults and infants spontaneously create and generalize hierarchical rule sets during incidental learning. Computational models and empirical data suggest that, in adults, this process is supported by circuits linking prefrontal cortex (PFC) with striatum and their modulation by dopamine, but the neural circuits supporting this form of learning in infants are largely unknown. We used near-infrared spectroscopy to record PFC activity in 8-month-old human infants during a simple audiovisual hierarchical-rule-learning task. Behavioral results confirmed that infants adopted hierarchical rule sets to learn and generalize spoken object-label mappings across different speaker contexts. Infants had increased activity over right dorsal lateral PFC when rule sets switched from one trial to the next, a neural marker related to updating rule sets into working memory in the adult literature. Infants' eye blink rate, a possible physiological correlate of striatal dopamine activity, also increased when rule sets switched from one trial to the next. Moreover, the increase in right dorsolateral PFC activity in conjunction with eye blink rate also predicted infants' generalization ability, providing exploratory evidence for frontostriatal involvement during learning. These findings provide evidence that PFC is involved in rudimentary hierarchical rule learning in 8-month-old infants, an ability that was previously thought to emerge later in life in concert with PFC maturation. Hierarchical rule learning is a powerful learning mechanism that allows rules to be selected in a context-appropriate fashion and transferred or reused in novel contexts. Data from computational models and adults suggests that this learning mechanism is supported by dopamine-innervated interactions between prefrontal cortex (PFC) and striatum. Here, we provide evidence that PFC also supports hierarchical rule learning during infancy, challenging the current dogma that PFC is an underdeveloped brain system until adolescence. These results add new insights into the neurobiological mechanisms available to support learning and generalization in very early postnatal life, providing evidence that PFC and the frontostriatal circuitry are involved in organizing learning and behavior earlier in life than previously known. Copyright © 2016 the authors 0270-6474/16/3610314-09$15.00/0.
Local Area Network Strategies and Guidelines for a Peruvian Air Force Computer Center
1991-03-01
service elements to support application processes such as job management, and financial data exchange. The layer also supports the virtual terminal and... virtual file concept. [Ref. 3: p. 285] Essentially, the lowest three layers are concerned with the communication protocols associated with the data...
Analysis of the vibratory excitation arising from spiral bevel gears
NASA Technical Reports Server (NTRS)
Mark, William D.
1987-01-01
Tools are developed to understand and predict, in terms of its underlying causes, the vibratory excitation arising from meshing spiral bevel gears. A generalized three-component transmission error of meshing spiral bevel gears is defined. Equations are derived that yield the three components of the generalized transmission error in terms of deviations of tooth running surfaces from equispaced perfect spherical involute surfaces and tooth/gearbody elastic deformations arising from the three components of the generalized force transmitted by the meshing gears. A method for incorporating these equations into the equations of motion of a gear system is described. Equations are derived for the three components of the generalized force transmitted by the gears which are valid whenever inertial effects of the meshing gears and their supports are negligible. Bearing offsets from the positions occupied by the shaft centerlines of perfect spherical involute bevel gears and bearing/bearing support flexibilities enter into the computation of these forces.
De Rosis, Sabina; Seghieri, Chiara
2015-08-22
There is general consensus that appropriate development and use of information and communication technologies (ICT) are crucial in the delivery of effective primary care (PC). Several countries are defining policies to support and promote a structural change of the health care system through the introduction of ICT. This study analyses the state of development of basic ICT in the PC systems of 31 European countries, with the aim of describing the extent of, and main purposes for, computer use by General Practitioners (GPs) across Europe. Additionally, trends over time have been analysed. Descriptive statistical analysis was performed on data from the QUALICOPC (Quality and Costs of Primary Care in Europe) survey to describe the geographic differences in the general use of computers, and in specific computerized clinical functions for different health-related purposes such as prescribing, medication checking, generating health records and searching for medical information on the Internet. While all the countries have achieved near-universal adoption of computers in their primary care practices, with only a few countries near or under the boundary of 90%, the computerisation of primary care clinical functions presents a wide variability of adoption within and among countries and, in several cases (such as in southern and central-eastern Europe), considerable room for improvement. At the European level, more could be done to support southern and central-eastern Europe in closing the gap in the adoption and use of ICT in PC. In particular, more attention seems to be needed on the current usages of the computer in PC, by focusing policies and actions on improving the appropriate usages that can have an impact on the quality and costs of PC and can facilitate an interconnected health care system. However, policies and investments seem necessary but not sufficient to achieve these goals. Organizational, behavioural and also networking aspects should be taken into consideration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooperstock, F.I., E-mail: cooperst@uvic.ca; Dupre, M.J., E-mail: mdupre@tulane.edu
We introduce a naturally-defined totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduces to the Tolman integral in the case of stationary bounded distributions leads to the matter-localized Ricci integral for energy–momentum in support of the energy localization hypothesis. The role of the observer is addressed and, as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum. Highlights: • We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. • Demand for the general expression to reduce to the Tolman integral for stationary systems supports the Ricci integral as energy–momentum. • Localized energy via the Ricci integral is consistent with the energy localization hypothesis. • New localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. • Suggest the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum in the strong gravity extreme.
Support System Effects on the DLR-F6 Transport Configuration in the National Transonic Facility
NASA Technical Reports Server (NTRS)
Rivers, Melissa B.; Hunter, Craig A.; Gatlin, Gregory M.
2009-01-01
An experimental investigation of the DLR-F6 generic transport configuration was conducted in the NASA NTF for use in the Drag Prediction Workshop. As data from this experimental investigation was collected, a large difference in drag values was seen between the NTF test and an ONERA test that was conducted several years ago. After much investigation, it was determined that this difference was likely due to a sting effect correction applied to the ONERA data which NTF does not use. This insight led to the present work. In this study, a computational assessment has been undertaken to investigate model support system interference effects on the DLR-F6 transport configuration. The configurations computed during this investigation were the isolated wing-body, the wing-body with the full support system (blade and sting), the wing-body with just the blade, and the wing-body with just the sting. The results from this investigation show the same trends that ONERA saw when they conducted a similar experimental investigation in the S2MA tunnel. Computational results suggest that the blade contributed an interference type of effect, the sting contributed a general blockage effect, and the full support system combined these effects.
Games at work: the recreational use of computer games during working hours.
Reinecke, Leonard
2009-08-01
The present study investigated the recreational use of video and computer games in the workplace. In an online survey, 833 employed users of online casual games reported on their use of computer games during working hours. The data indicate that playing computer games in the workplace elicits substantial levels of recovery experience. Recovery experience associated with gameplay was the strongest predictor for the use of games in the workplace. Furthermore, individuals with higher levels of work-related fatigue reported stronger recovery experience during gameplay and showed a higher tendency to play games during working hours than did persons with lower levels of work strain. Additionally, the social situation at work was found to have a significant influence on the use of games. Persons receiving less social support from colleagues and supervisors played games at work more frequently than did individuals with higher levels of social support. Furthermore, job control was positively related to the use of games at work. In sum, the results of the present study illustrate that computer games have a significant recovery potential. Implications of these findings for research on personal computer use during work and for games research in general are discussed.
The StratusLab cloud distribution: Use-cases and support for scientific applications
NASA Astrophysics Data System (ADS)
Floros, E.
2012-04-01
The StratusLab project is integrating an open cloud software distribution that enables organizations to set up and provide their own private or public IaaS (Infrastructure as a Service) computing clouds. The StratusLab distribution capitalizes on popular infrastructure virtualization solutions like KVM, the OpenNebula virtual machine manager, the Claudia service manager and the SlipStream deployment platform, which are further enhanced and expanded with additional components developed within the project. The StratusLab distribution covers the core aspects of a cloud IaaS architecture, namely Computing (life-cycle management of virtual machines), Storage, Appliance management and Networking. The resulting software stack provides a packaged turn-key solution for deploying cloud computing services. The cloud computing infrastructures deployed using StratusLab can support a wide range of scientific and business use cases. Grid computing has been the primary use case pursued by the project, and for this reason the initial priority has been support for the deployment and operation of fully virtualized production-level grid sites; a goal that has already been achieved by operating such a site as part of EGI's (European Grid Initiative) pan-European grid infrastructure. In this area the project is currently working to provide non-trivial capabilities like elastic and autonomic management of grid site resources. Although grid computing has been the motivating paradigm, StratusLab's cloud distribution can support a wider range of use cases. Towards this direction, we have developed and currently provide support for setting up general-purpose computing solutions like Hadoop, MPI and Torque clusters. Regarding scientific applications, the project is collaborating closely with the bioinformatics community in order to prepare VM appliances and deploy optimized services for bioinformatics applications. In a similar manner, additional scientific disciplines like Earth Science can take advantage of StratusLab cloud solutions. Interested users are welcome to join StratusLab's user community by getting access to the reference cloud services deployed by the project and offered to the public.
NASA Technical Reports Server (NTRS)
Proctor, B. W.; Reysa, R. P.; Russell, D. J.
1975-01-01
Viable crew appliance concepts were identified by means of a thorough literature search. Studies were made of the food management, personal hygiene, housekeeping, and off-duty habitability functions to determine which concepts best satisfy the Space Shuttle Orbiter and Modular Space Station mission requirements. Models of selected appliance concepts not currently included in the generalized environmental-thermal control and life support systems computer program were developed and validated. Development plans of selected concepts were generated for future reference. A shuttle freezer conceptual design was developed and a test support activity was provided for regenerative environmental control life support subsystems.
SAMSAN- MODERN NUMERICAL METHODS FOR CLASSICAL SAMPLED SYSTEM ANALYSIS
NASA Technical Reports Server (NTRS)
Frisch, H. P.
1994-01-01
SAMSAN was developed to aid the control system analyst by providing a self-consistent set of computer algorithms that support large order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large order systems with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPACK for eigen-analysis and from LINPACK for the solution of linear systems and related problems, SAMSAN contains the following not so generally available capabilities: 1) Reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) Solution of the generalized eigenvalue problem with balancing and grading, 3) Computation of all zeros of the determinant of a matrix of polynomials, 4) Matrix exponentiation and the evaluation of integrals involving the matrix exponential, with option to first block diagonalize, 5) Root locus and frequency response for single variable transfer functions in the S, Z, and W domains, 6) Several methods of computing zeros for linear systems, and 7) The ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double precision elements. There is no fixed size limit on any matrix in any SAMSAN algorithm; however, it is generally agreed by experienced users, and in the numerical error analysis literature, that computation with non-symmetric matrices of order greater than about 200 should be avoided or treated with extreme care. SAMSAN attempts to support the needs of application oriented analysis by providing: 1) a methodology with unlimited growth potential, 2) a methodology to insure that associated documentation is current and available "on demand", 3) a foundation of basic computational algorithms that most controls analysis procedures are based upon, 4) a set of check out and evaluation programs which demonstrate usage of the algorithms on a series of problems which are structured to expose the limits of each algorithm's applicability, and 5) capabilities which support both a priori and a posteriori error analysis for the computational algorithms provided. The SAMSAN algorithms are coded in FORTRAN 77 for batch or interactive execution and have been implemented on a DEC VAX computer under VMS 4.7. An effort was made to assure that the FORTRAN source code was portable and thus SAMSAN may be adaptable to other machine environments. The documentation is included on the distribution tape or can be purchased separately at the price below. SAMSAN version 2.0 was developed in 1982 and updated to version 3.0 in 1988.
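Capability (4), evaluating integrals involving the matrix exponential, has a standard augmented-matrix construction (Van Loan's method) that is easy to sketch. The snippet below assumes that construction and an arbitrary test system; it is an illustration, not SAMSAN code.

```python
import numpy as np
from scipy.linalg import expm

# A stable test system x' = A x.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
n, t = A.shape[0], 0.7

# Van Loan's construction: exponentiate the augmented matrix [[A, I], [0, 0]].
# Its top-right block is G(t) = integral_0^t exp(A s) ds.
M = np.zeros((2 * n, 2 * n))
M[:n, :n] = A
M[:n, n:] = np.eye(n)
E = expm(M * t)
Phi, G = E[:n, :n], E[:n, n:]          # Phi = exp(A t), G = the integral

# Cross-check against the closed form A^{-1} (exp(A t) - I), valid here
# because A is invertible.
print(np.allclose(G, np.linalg.solve(A, Phi - np.eye(n))))   # True
```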
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2000-02-01
DOE support for a broad research program in the sciences of complexity permitted the Santa Fe Institute to initiate new collaborative research within its integrative core activities as well as to host visitors to participate in research on specific topics that serve as motivation and testing ground for the study of the general principles of complex systems. Results are presented on computational biology, biodiversity and ecosystem research, and advanced computing and simulation.
NASA Astrophysics Data System (ADS)
Honing, Henkjan; Zuidema, Willem
2014-09-01
The future of cognitive science will be about bridging neuroscience and behavioral studies, with essential roles played by comparative biology, formal modeling, and the theory of computation. Nowhere will this integration be more strongly needed than in understanding the biological basis of language and music. We thus strongly sympathize with the general framework that Fitch [1] proposes, and welcome the remarkably broad and readable review he presents to support it.
TMS communications software. Volume 2: Bus interface unit
NASA Technical Reports Server (NTRS)
Gregor, P. J.
1979-01-01
A data bus communication system to support the space shuttle's Trend Monitoring System (TMS) and to provide a basis for evaluation of the bus concept is described. Installation of the system included developing both hardware and software interfaces between the bus and the specific TMS computers and terminals. The software written for the microprocessor-based bus interface units is described. The software implements both the general bus communications protocol and also the specific interface protocols for the TMS computers and terminals.
Computer-aided-engineering system for modeling and analysis of ECLSS integration testing
NASA Technical Reports Server (NTRS)
Sepahban, Sonbol
1987-01-01
The accurate modeling and analysis of two-phase fluid networks found in environmental control and life support systems is undertaken with a computer-aided engineering (CAE) system whose generalized fluid dynamics package can solve arbitrary flow networks. The CAE system for integrated test bed modeling and analysis will also furnish interfaces and subsystem/test-article mathematical models. Three-dimensional diagrams of the test bed are generated by the system after performing the requisite simulation and analysis.
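In the linearized, single-phase case, solving an "arbitrary flow network" of this kind reduces to a nodal conductance system analogous to Kirchhoff's laws. The sketch below illustrates that reduction; the topology, conductances, and boundary conditions are invented, and the two-phase physics handled by the CAE package is not modeled.

```python
import numpy as np

# Branch-node incidence matrix of a 4-node, 5-branch network
# (+1 marks a branch's upstream node, -1 its downstream node).
Ainc = np.array([
    [ 1, -1,  0,  0],
    [ 0,  1, -1,  0],
    [ 0,  0,  1, -1],
    [ 1,  0, -1,  0],
    [ 0,  1,  0, -1],
], dtype=float)

g = np.array([2.0, 1.0, 3.0, 0.5, 1.5])   # linearized branch conductances
s = np.array([1.0, 0.0, 0.0, -1.0])       # external flow injected at each node

# Nodal conductance matrix; node 3 is taken as the pressure datum.
G = Ainc.T @ np.diag(g) @ Ainc
p = np.zeros(4)
p[:3] = np.linalg.solve(G[:3, :3], s[:3])

q = g * (Ainc @ p)                        # resulting branch flows
print(p, q)
```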
A compendium of computational fluid dynamics at the Langley Research Center
NASA Technical Reports Server (NTRS)
1980-01-01
Through numerous summary examples, the scope and general nature of the computational fluid dynamics (CFD) effort at Langley are identified. These summaries will help inform researchers in CFD and line management at Langley of the overall effort. In addition to the in-house efforts, out-of-house CFD work supported by Langley through industrial contracts and university grants is included. Researchers were encouraged to include summaries of work in preliminary and tentative stages of development as well as current research approaching definitive results.
Removing the center from computing: biology's new mode of digital knowledge production.
November, Joseph
2011-06-01
This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's quality as a device that was centralized posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond, the personal computer.
Miller, R A
2010-01-01
The INTERNIST-1/Quick Medical Reference (QMR) diagnostic decision support project spans four decades, from 1971-onward. This paper describes the history of the project and details insights gained of relevance to the general clinical and informatics communities.
Investigating Student Use of Electronic Support Tools and Mathematical Reasoning
ERIC Educational Resources Information Center
Higgins, Kristina N.; Crawford, Lindy; Huscroft-D'Angelo, Jacqueline; Horney, Mark
2016-01-01
Mathematical reasoning involves comprehending mathematical information and concepts in a logical way and forming conclusions and generalizations based on this comprehension. Computer-based learning has been incorporated into classrooms across the country, and specific aspects of technology need to be studied to determine how programs are…
21. VIEW OF THE ENTRANCE TO THE TUNNEL CONNECTING BUILDINGS ...
21. VIEW OF THE ENTRANCE TO THE TUNNEL CONNECTING BUILDINGS 881 AND 883. THE TUNNEL WAS CONSTRUCTED IN 1957 TO TRANSPORT ENRICHED URANIUM COMPONENTS BETWEEN THE BUILDINGS. (1/98) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
15. DETAILED VIEW OF ENRICHED URANIUM STORAGE TANK. THE ADDITION ...
15. DETAILED VIEW OF ENRICHED URANIUM STORAGE TANK. THE ADDITION OF THE GLASS RINGS SHOWN AT THE TOP OF THE TANK HELPS PREVENT THE URANIUM FROM REACHING CRITICALITY LIMITS. (4/12/62) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
A Scenario Approach to Assessment of New Communications Media.
ERIC Educational Resources Information Center
Spangler, Kathleen; And Others
In a study supported by the Charles F. Kettering Foundation, a research team developed a methodology for illustrating the effective and ineffective uses of audio, video, and computer teleconferencing by developing scenarios for each medium. The group first invented a general situation--a conference involving participants with global, regional, and…
10 CFR Appendix A to Subpart D of... - Categorical Exclusions Applicable to General Agency Actions
Code of Federal Regulations, 2010 CFR
2010-01-01
... research and development. A1 Routine actions necessary to support the normal conduct of agency business, such... (including, but not limited to, literature surveys, inventories, audits), data analysis (including computer modelling), document preparation (such as conceptual design or feasibility studies, analytical energy supply...
10 CFR Appendix A to Subpart D of... - Categorical Exclusions Applicable to General Agency Actions
Code of Federal Regulations, 2011 CFR
2011-01-01
... research and development. A1 Routine actions necessary to support the normal conduct of agency business, such... (including, but not limited to, literature surveys, inventories, audits), data analysis (including computer modelling), document preparation (such as conceptual design or feasibility studies, analytical energy supply...
NASA Technical Reports Server (NTRS)
Bushnell, D.
1974-01-01
Code is easy to use yet is general with respect to: (a) type of analysis to be performed; (b) geometry of shell meridian; (c) type of wall construction; (d) type of boundary conditions, ring supports, and branching configuration; and (e) type of loading.
Creativity-Supporting Learning Environment--CSLE
ERIC Educational Resources Information Center
Apiola, Mikko; Lattu, Matti; Pasanen, Tomi A.
2012-01-01
Despite much public discussion about the importance of creativity and innovation-friendly teaching in Finnish higher education, the impact of the general opinion on actual teaching practices has been limited. In the Finnish computer science education the teaching mostly follows a pattern of lectures, fixed exercise sets, and exams. With this…
PATRAN-STAGS translator (PATSTAGS)
NASA Technical Reports Server (NTRS)
Otte, Neil
1990-01-01
A computer program used to translate PATRAN finite element model data into Structural Analysis of General Shells (STAGS) input data is presented. The program supports translation of nodal, nodal constraint, element, force, and pressure data. The subroutine UPRESS, required for reading live pressure data into STAGS, is also presented.
Improving Cognitive Abilities and e-Inclusion in Children with Cerebral Palsy
NASA Astrophysics Data System (ADS)
Martinengo, Chiara; Curatelli, Francesco
Besides overcoming the motor barriers to accessing computers and the Internet, ICT tools can provide very useful, and often necessary, support for the cognitive development of motor-impaired children with cerebral palsy. In fact, software tools for computation and communication allow teachers to put into effect, in a more complete and efficient way, the learning methods and educational plans designed for the child. In the present article, after a brief analysis of the general objectives to be pursued in favouring learning in children with cerebral palsy, we examine some specific difficulties in the logical-linguistic and logical-mathematical fields, and we show how they can be overcome using general ICT tools and specifically implemented software programs.
Feature selection using probabilistic prediction of support vector regression.
Yang, Jian-Bo; Ong, Chong-Jin
2011-06-01
This paper presents a new wrapper-based feature selection method for support vector regression (SVR) using its probabilistic predictions. The method computes the importance of a feature by aggregating the difference, over the feature space, of the conditional density functions of the SVR prediction with and without the feature. As the exact computation of this importance measure is expensive, two approximations are proposed. The effectiveness of the measure using these approximations, in comparison to several other existing feature selection methods for SVR, is evaluated on both artificial and real-world problems. The results of the experiments show that the proposed method generally performs better than, or at least as well as, the existing methods, with a notable advantage when the dataset is sparse.
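Scikit-learn's SVR does not expose the probabilistic predictions this method builds on, so the sketch below substitutes a crude permutation analogue: it aggregates, over the data, how much the SVR prediction shifts when one feature's information is destroyed. It illustrates the wrapper idea only; the paper's conditional-density difference measure is not implemented, and the data and hyperparameters are invented.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
# Only features 0 and 1 carry signal; features 2-4 are irrelevant.
y = 2 * X[:, 0] + np.sin(X[:, 1]) + 0.1 * rng.normal(size=300)

model = SVR(kernel='rbf', C=10.0).fit(X, y)
base = model.predict(X)

importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])   # destroy the feature's information
    # The mean prediction shift over the data stands in, crudely, for the
    # paper's aggregated density difference.
    importance.append(np.mean(np.abs(model.predict(Xp) - base)))

print(np.round(importance, 3))   # features 0 and 1 should dominate
```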
The Adam language: Ada extended with support for multiway activities
NASA Technical Reports Server (NTRS)
Charlesworth, Arthur
1993-01-01
The Adam language is an extension of Ada that supports multiway activities, which are cooperative activities involving two or more processes. This support is provided by three new constructs: diva procedures, meet statements, and multiway accept statements. Diva procedures are recursive generic procedures having a particular restrictive syntax that facilitates translation for parallel computers. Meet statements and multiway accept statements provide two ways to express a multiway rendezvous, which is an n-way rendezvous generalizing Ada's 2-way rendezvous. While meet statements tend to have simpler rules than multiway accept statements, the latter approach is a more straightforward extension of Ada. The only nonnull statements permitted within meet statements and multiway accept statements are calls on instantiated diva procedures. A call on an instantiated diva procedure is also permitted outside a multiway rendezvous; thus sequential Adam programs using diva procedures can be written. Adam programs are translated into Ada programs appropriate for use on parallel computers.
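As a concrete illustration of the n-way rendezvous that Adam's meet and multiway accept statements express, Python's threading.Barrier provides the same synchronization shape: all n parties block until the last one arrives. This is a sketch of the concept only, not of Adam or Ada syntax:

```python
import threading

n = 3
barrier = threading.Barrier(n)   # all n parties must arrive before any proceeds

def worker(i):
    print(f"process {i} ready")
    barrier.wait()               # the n-way rendezvous point
    print(f"process {i} proceeds")

threads = [threading.Thread(target=worker, args=(i,)) for i in range(n)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```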
Crew appliance computer program manual, volume 1
NASA Technical Reports Server (NTRS)
Russell, D. J.
1975-01-01
Trade studies of numerous appliance concepts for advanced spacecraft galley, personal hygiene, housekeeping, and other areas were made to determine which best satisfy the space shuttle orbiter and modular space station mission requirements. Analytical models of selected appliance concepts not currently included in the G-189A Generalized Environmental/Thermal Control and Life Support Systems (ETCLSS) Computer Program subroutine library were developed. The new appliance subroutines are given along with complete analytical model descriptions, solution methods, user's input instructions, and validation run results. The appliance components modeled were integrated with G-189A ETCLSS models for shuttle orbiter and modular space station, and results from computer runs of these systems are presented.
Quantum optimization for training support vector machines.
Anguita, Davide; Ridella, Sandro; Rivieccio, Fabio; Zunino, Rodolfo
2003-01-01
Refined concepts, such as Rademacher estimates of model complexity and nonlinear criteria for weighting empirical classification errors, represent recent and promising approaches to characterizing the generalization ability of Support Vector Machines (SVMs). The advantages of those techniques lie in both improving the SVM representation ability and yielding tighter generalization bounds. On the other hand, they often make Quadratic-Programming algorithms no longer applicable, and SVM training cannot benefit from efficient, specialized optimization techniques. The paper considers the application of Quantum Computing to solve the problem of effective SVM training, especially in the case of digital implementations. The presented research compares the behavioral aspects of conventional and enhanced SVMs; experiments on both synthetic and real-world problems support the theoretical analysis. At the same time, the related differences between Quadratic-Programming and Quantum-based optimization techniques are considered.
A Two-Layer Least Squares Support Vector Machine Approach to Credit Risk Assessment
NASA Astrophysics Data System (ADS)
Liu, Jingli; Li, Jianping; Xu, Weixuan; Shi, Yong
Least squares support vector machine (LS-SVM) is a revised version of the support vector machine (SVM) and has been proved to be a useful tool for pattern recognition. LS-SVM has excellent generalization performance and low computational cost. In this paper, we propose a new method called the two-layer least squares support vector machine, which combines kernel principal component analysis (KPCA) with the linear programming form of the least squares support vector machine. With this method, sparseness and robustness are obtained while handling high-dimensional, large-scale databases. A U.S. commercial credit card database is used to test the efficiency of our method, and the results proved satisfactory.
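A minimal sketch of the two-layer idea, assuming scikit-learn: KPCA as the first layer, then a least-squares linear classifier as the second. The paper uses the linear programming form of LS-SVM; RidgeClassifier stands in here because it minimizes a related squared loss, so treat this as a structural proxy rather than the authors' algorithm:

```python
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import RidgeClassifier
from sklearn.pipeline import make_pipeline

# Layer 1: KPCA maps inputs into a kernel feature subspace.
# Layer 2: a least-squares linear classifier on the KPCA scores
# (standing in for the linear LS-SVM of the paper).
two_layer = make_pipeline(
    KernelPCA(n_components=20, kernel="rbf", gamma=0.1),
    RidgeClassifier(alpha=1.0),
)
# two_layer.fit(X_train, y_train); two_layer.score(X_test, y_test)
```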
Parallel processing for scientific computations
NASA Technical Reports Server (NTRS)
Alkhatib, Hasan S.
1995-01-01
The scope of this project was the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm, has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations generally do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
Koivunen, Marita; Välimäki, Maritta; Patel, Anita; Knapp, Martin; Hätönen, Heli; Kuosmanen, Lauri; Pitkänen, Anneli; Anttila, Minna; Katajisto, Jouko
2010-09-01
Utilisation of information technology (IT) in the treatment of people with severe mental health problems is an unknown area in Europe. Use of IT and guiding patients to relevant sources of health information require that nursing staff have positive attitudes toward computers and accept IT use as part of daily practice. The aim of the study was to assess the effects of the implementation of a web-based patient support system on staff's attitudes towards computers and IT use on psychiatric wards. One hundred and forty-nine nurses in two psychiatric hospitals in Finland were randomised to two groups to deliver patient education for patients with schizophrenia and psychosis, with either a web-based system (n = 76) or leaflets (n = 73). After baseline, nurses were followed up for 18 months following the introduction of the system. The primary outcome was nurses' motivation to utilise computers, and the secondary outcomes were nurses' beliefs in and satisfaction with computers, and use of computers and the internet. There were no statistically significant differences between study groups in attitudes towards computers (motivation p = 0.936, beliefs p = 0.270, satisfaction p = 0.462) or internet use (p = 0.276). However, nurses' general computer use (p = 0.029) increased more in the leaflet group than in the IT intervention group. We can conclude that IT holds promise as an alternative method in patient education, as the implementation of the web-based patient support system on a daily basis did not have a negative effect on nurses' attitudes towards IT. © 2010 The Authors. Journal compilation © 2010 Nordic College of Caring Science.
Dollé, Laurent; Chavarriaga, Ricardo
2018-01-01
We present a computational model of spatial navigation comprising different learning mechanisms in mammals, i.e., associative, cognitive mapping, and parallel systems. This model is able to reproduce a large number of experimental results in different variants of the Morris water maze task, including standard associative phenomena (spatial generalization gradient and blocking), as well as navigation based on cognitive mapping. Furthermore, we show that competitive and cooperative patterns between different navigation strategies in the model help explain previous, apparently contradictory results supporting either associative or cognitive mechanisms for spatial learning. The key computational mechanism to reconcile experimental results showing different influences of distal and proximal cues on behavior, different learning times, and different abilities of individuals to alternately perform spatial and response strategies relies on the dynamic coordination of navigation strategies, whose performance is evaluated online with a common currency through a modular approach. We provide a set of concrete experimental predictions to further test the computational model. Overall, this computational work sheds new light on inter-individual differences in navigation learning, and provides a formal and mechanistic approach to test various theories of spatial cognition in mammals. PMID:29630600
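A minimal sketch of the "common currency" coordination described above: each strategy module carries a running performance estimate, and a gating step selects among modules by comparing those estimates. Everything here (class names, softmax gating, learning rate) is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

class Module:
    """One navigation strategy (e.g., associative or map-based), with a
    running performance estimate serving as the 'common currency'."""
    def __init__(self, name):
        self.name = name
        self.value = 0.0
    def propose(self, state):
        return rng.integers(4)          # stand-in policy: pick one of 4 moves
    def update(self, reward, lr=0.1):
        self.value += lr * (reward - self.value)   # exponential moving average

def select_action(modules, state, temperature=0.2):
    """Softmax gating over module values; returns the winning module and its action."""
    values = np.array([m.value for m in modules])
    p = np.exp(values / temperature)
    p /= p.sum()
    chosen = modules[rng.choice(len(modules), p=p)]
    return chosen, chosen.propose(state)

# modules = [Module("associative"), Module("cognitive-map")]
# module, action = select_action(modules, state=None); module.update(reward=1.0)
```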
The Design of a High Performance Earth Imagery and Raster Data Management and Processing Platform
NASA Astrophysics Data System (ADS)
Xie, Qingyun
2016-06-01
This paper summarizes the general requirements and specific characteristics of both geospatial raster database management system and raster data processing platform from a domain-specific perspective as well as from a computing point of view. It also discusses the need of tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global scale and high performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. Basically, as a database management system, GeoRaster defines an integrated raster data model, supports image compression, data manipulation, general and spatial indices, content and context based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high performance computing (HPC). Specifically, HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.
A users' guide to the trace contaminant control simulation computer program
NASA Technical Reports Server (NTRS)
Perry, J. L.
1994-01-01
The Trace Contaminant Control Simulation computer program is a tool for assessing the performance of various trace contaminant control technologies for removing trace chemical contamination from a spacecraft cabin atmosphere. The results obtained from the program can be useful in assessing different technology combinations, system sizing, system location with respect to other life support systems, and the overall life cycle economics of a trace contaminant control system. The user's manual is extracted in its entirety from NASA TM-108409 to provide a stand-alone reference for using any version of the program. The first publication of the manual as part of TM-108409 also included a detailed listing of version 8.0 of the program. As changes to the code were necessary, it became apparent that the user's manual should be separate from the computer code documentation and be general enough to provide guidance in using any version of the program. Provided in the guide are tips for input file preparation, general program execution, and output file manipulation. Information concerning source code listings of the latest version of the computer program may be obtained by contacting the author.
16. VIEW OF THE ENRICHED URANIUM RECOVERY SYSTEM. ENRICHED URANIUM RECOVERY PROCESSED RELATIVELY PURE MATERIALS AND SOLUTIONS AND SOLID RESIDUES WITH RELATIVELY LOW URANIUM CONTENT. URANIUM RECOVERY INVOLVED BOTH SLOW AND FAST PROCESSES. (4/4/66) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
ERIC Educational Resources Information Center
Gil, Pablo
2017-01-01
University courses concerning Computer Vision and Image Processing are generally taught using a traditional methodology that is focused on the teacher rather than on the students. This approach is consequently not effective when teachers seek to attain cognitive objectives involving their students' critical thinking. This manuscript covers the…
Enhancing Image Findability through a Dual-Perspective Navigation Framework
ERIC Educational Resources Information Center
Lin, Yi-Ling
2013-01-01
This dissertation focuses on investigating whether users will locate desired images more efficiently and effectively when they are provided with information descriptors from both experts and the general public. This study develops a way to support image finding through a human-computer interface by providing subject headings and social tags about…
Slot, Tegan; Charpentier, Karine; Dumas, Geneviève; Delisle, Alain; Leger, Andy; Plamondon, André
2009-01-01
The aim of the study was to evaluate the effect of forearm support provided by the Workplace Board on perceived tension, comfort and productivity among pregnant and non-pregnant female computer workers. Ten pregnant and 18 non-pregnant women participated in the study. Participants completed three sets of tension/discomfort questionnaires at two week intervals. The first set was completed prior to any workstation intervention; the second set was completed after two weeks working with an ergonomically adjusted workstation; the third set was completed after two weeks working with the Workplace Board integrated into the office workstation. With the Workplace Board, decreased perceived tension was reported in the left shoulder, wrist and low back in non-pregnant women only. The Board was generally liked by all participants, and increased comfort and productivity in all areas, with the exception of a negative effect on productivity of general office tasks. The board is suitable for integration in most office workstations and for most users, but has no special benefits for pregnant women.
Large Eddy Simulation of a Supercritical Turbulent Mixing Layer
NASA Astrophysics Data System (ADS)
Sheikhi, Reza; Hadi, Fatemeh; Safari, Mehdi
2017-11-01
Supercritical turbulent flows are relevant to a wide range of applications such as supercritical power cycles, gas turbine combustors, rocket propulsion and internal combustion engines. Large eddy simulation (LES) analysis of such flows involves solving mass, momentum, energy and scalar transport equations with inclusion of generalized diffusion fluxes. These equations are combined with a real gas equation of state and the corresponding thermodynamic mixture variables. Subgrid scale models are needed for not only the conventional convective terms but also the additional high pressure effects arising due to the nonlinearity associated with generalized diffusion fluxes and real gas equation of state. In this study, LES is carried out to study the high pressure turbulent mixing of methane with carbon dioxide in a temporally developing mixing layer under supercritical condition. LES results are assessed by comparing with data obtained from direct numerical simulation (DNS) of the same layer. LES predictions agree favorably with DNS data and represent several key supercritical turbulent flow features such as high density gradient regions. Supported by DOE Grant SC0017097; computational support is provided by DOE National Energy Research Scientific Computing Center.
The effect of training methodology on knowledge representation in categorization.
Hélie, Sébastien; Shamloo, Farzin; Ell, Shawn W
2017-01-01
Category representations can be broadly classified as containing within-category information or between-category information. Although such representational differences can have a profound impact on decision-making, relatively little is known about the factors contributing to the development and generalizability of different types of category representations. These issues are addressed by investigating the impact of training methodology and category structures using a traditional empirical approach as well as the novel adaptation of computational modeling techniques from the machine learning literature. Experiment 1 focused on rule-based (RB) category structures thought to promote between-category representations. Participants learned two sets of two categories during training and were subsequently tested on a novel categorization problem using the training categories. Classification training resulted in a bias toward between-category representations whereas concept training resulted in a bias toward within-category representations. Experiment 2 focused on information-integration (II) category structures thought to promote within-category representations. With II structures, there was a bias toward within-category representations regardless of training methodology. Furthermore, in both experiments, computational modeling suggests that only within-category representations could support generalization during the test phase. These data suggest that within-category representations may be dominant and more robust in supporting the reconfiguration of current knowledge for generalization.
Computing with scale-invariant neural representations
NASA Astrophysics Data System (ADS)
Howard, Marc; Shankar, Karthik
The Weber-Fechner law is perhaps the oldest quantitative relationship in psychology. Consider the problem of the brain representing a function f(x). Different neurons have receptive fields that support different parts of the range, such that the ith neuron has a receptive field centered at xi. Weber-Fechner scaling refers to the finding that the width of the receptive field scales with xi, as does the difference between the centers of adjacent receptive fields. Weber-Fechner scaling is exponentially resource-conserving. Neurophysiological evidence suggests that neural representations obey Weber-Fechner scaling in the visual system and perhaps other systems as well. We describe an optimality constraint that is solved by Weber-Fechner scaling, providing an information-theoretic rationale for this principle of neural coding. Weber-Fechner scaling can be generated within a mathematical framework using the Laplace transform. Within this framework, simple computations such as translation, correlation, and cross-correlation can be accomplished. This framework can in principle be extended to provide a general computational language for brain-inspired cognitive computation on scale-invariant representations. Supported by NSF PHY 1444389 and the BU Initiative for the Physics and Mathematics of Neural Systems.
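A short numerical sketch of what Weber-Fechner-scaled receptive fields look like, assuming Gaussian tuning curves: centers are spaced geometrically, and each width grows in proportion to its center, so a fixed neuron count covers an exponentially large range (parameter values are illustrative):

```python
import numpy as np

def weber_fechner_centers(x_min, x_max, n):
    """Geometrically spaced centers: the ratio of neighboring centers is constant."""
    return np.geomspace(x_min, x_max, n)

def tuning_curve(x, center, k=0.3):
    """Gaussian receptive field whose width grows in proportion to its center."""
    width = k * center
    return np.exp(-0.5 * ((x - center) / width) ** 2)

centers = weber_fechner_centers(0.1, 100.0, 24)   # 24 neurons span three decades
x = np.linspace(0.1, 100.0, 1000)
population = np.stack([tuning_curve(x, c) for c in centers])  # (24, 1000) activity
```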
High-Throughput Computing on High-Performance Platforms: A Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oleynik, D; Panitkin, S; Matteo, Turilli
The computing systems used by LHC experiments have historically consisted of the federation of hundreds to thousands of distributed resources, ranging from small to mid-size. In spite of the impressive scale of the existing distributed computing solutions, the federation of small to mid-size resources will be insufficient to meet projected future demands. This paper is a case study of how the ATLAS experiment has embraced Titan -- a DOE leadership facility -- in conjunction with traditional distributed high-throughput computing to reach sustained production scales of approximately 52M core-hours a year. The three main contributions of this paper are: (i) a critical evaluation of design and operational considerations to support the sustained, scalable and production usage of Titan; (ii) a preliminary characterization of a next generation executor for PanDA to support new workloads and advanced execution modes; and (iii) early lessons for how current and future experimental and observational systems can be integrated with production supercomputers and other platforms in a general and extensible manner.
Strategy generalization across orientation tasks: testing a computational cognitive model.
Gunzelmann, Glenn
2008-07-08
Humans use their spatial information processing abilities flexibly to facilitate problem solving and decision making in a variety of tasks. This article explores the question of whether a general strategy can be adapted for performing two different spatial orientation tasks by testing the predictions of a computational cognitive model. Human performance was measured on an orientation task requiring participants to identify the location of a target either on a map (find-on-map) or within an egocentric view of a space (find-in-scene). A general strategy instantiated in a computational cognitive model of the find-on-map task, based on the results from Gunzelmann and Anderson (2006), was adapted to perform both tasks and used to generate performance predictions for a new study. The qualitative fit of the model to the human data supports the view that participants were able to tailor a general strategy to the requirements of particular spatial tasks. The quantitative differences between the predictions of the model and the performance of human participants in the new experiment expose individual differences in sample populations. The model provides a means of accounting for those differences and a framework for understanding how human spatial abilities are applied to naturalistic spatial tasks that involve reasoning with maps. 2008 Cognitive Science Society, Inc.
2008-06-01
Application areas include: capacity planning; electrical generation capacity planning; machine scheduling; freight scheduling; dairy farm expansion planning... Among decision support systems and multi-criteria decision analysis products, ELECTRE IS is a generalization of ELECTRE I. It is a... criteria, ELECTRE IS supports the user in the process of selecting one alternative or a subset of alternatives. The method consists of two parts
1988-06-01
and for that reason has received considerable attention recently. Of particular interest in this research is the work of Toulmin et al. [1979]. In... whenever we make a claim there must be some grounds on which to base our conclusion; Toulmin states that our thoughts are generally directed from the... The WARRANT will be the absolute reason to believe the CLAIM on the basis of the GROUNDS. For that, Toulmin allows for further BACKING which, in his
'Cloud computing' and clinical trials: report from an ECRIN workshop.
Ohmann, Christian; Canham, Steve; Danielyan, Edgar; Robertshaw, Steve; Legré, Yannick; Clivio, Luca; Demotes, Jacques
2015-07-29
Growing use of cloud computing in clinical trials prompted the European Clinical Research Infrastructures Network, a European non-profit organisation established to support multinational clinical research, to organise a one-day workshop on the topic to clarify potential benefits and risks. The issues that arose in that workshop are summarised and include the following: the nature of cloud computing and the cloud computing industry; the risks in using cloud computing services now; the lack of explicit guidance on this subject, both generally and with reference to clinical trials; and some possible ways of reducing risks. There was particular interest in developing and using a European 'community cloud' specifically for academic clinical trial data. It was recognised that the day-long workshop was only the start of an ongoing process. Future discussion needs to include clarification of trial-specific regulatory requirements for cloud computing and involve representatives from the relevant regulatory bodies.
Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard
2011-06-01
Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.
van den Berg, Yvonne H M; Gommans, Rob
2017-09-01
New technologies have led to several major advances in psychological research over the past few decades. Peer nomination research is no exception. Thanks to these technological innovations, computerized data collection is becoming more common in peer nomination research. However, computer-based assessment is more than simply programming the questionnaire and asking respondents to fill it in on computers. In this chapter the advantages and challenges of computer-based assessments are discussed. In addition, a list of practical recommendations and considerations is provided to inform researchers on how computer-based methods can be applied to their own research. Although the focus is on the collection of peer nomination data in particular, many of the requirements, considerations, and implications are also relevant for those who consider the use of other sociometric assessment methods (e.g., paired comparisons, peer ratings, peer rankings) or computer-based assessments in general. © 2017 Wiley Periodicals, Inc.
Continuing educational needs in computers and informatics. McGill survey of family physicians.
McClaran, J.; Snell, L.; Duarte-Franco, E.
2000-01-01
OBJECTIVE: To describe family physicians' perceived educational needs in computers and informatics. DESIGN: Mailed survey. SETTING: General or family practices in Canada. PARTICIPANTS: Physicians (489 responded to a mailing sent to 2,500 physicians) who might attend sessions at the McGill Centre for CME. Two duplicate questionnaires were excluded from the analysis. METHOD: Four domains were addressed: practice profile, clinical CME needs, professional CME needs, and preferred learning formats. Data were entered on dBASE IV; analyses were performed on SPSS. MAIN FINDINGS: In the 487 questionnaires retained for analysis, "informatics and computers" was mentioned more than any other clinical diagnostic area, any other professional area, and all but three patient groups and service areas as a topic where improvement in knowledge and skills was needed in the coming year. Most physicians had no access to computer support for practice (62.6%); physicians caring for neonates, toddlers, or hospital inpatients were more likely to report some type of computer support. CONCLUSIONS: Family physicians selected knowledge and skills for computers and informatics as an area for improvement in the coming year more frequently than they selected most traditional clinical CME topics. This educational need is particularly great in small towns and in settings where some computerized hospital data are already available. PMID:10790816
Biomechanical effects of mobile computer location in a vehicle cab.
Saginus, Kyle A; Marklin, Richard W; Seeley, Patricia; Simoneau, Guy G; Freier, Stephen
2011-10-01
The objective of this research is to determine the best location to place a conventional mobile computer supported by a commercially available mount in a light truck cab. U.S. and Canadian electric utility companies are in the process of integrating mobile computers into their fleet vehicle cabs. There are no publications on the effect of mobile computer location in a vehicle cab on biomechanical loading, performance, and subjective assessment. The authors tested four locations of mobile computers in a light truck cab in a laboratory study to determine how location affected muscle activity of the lower back and shoulders; joint angles of the shoulders, elbows, and wrist; user performance; and subjective assessment. A total of 22 participants were tested in this study. Placing the mobile computer closer to the steering wheel reduced low back and shoulder muscle activity. Joint angles of the shoulders, elbows, and wrists were also closer to neutral angle. Biomechanical modeling revealed substantially less spinal compression and trunk muscle force. In general, there were no practical differences in performance between the locations. Subjective assessment indicated that users preferred the mobile computer to be as close as possible to the steering wheel. Locating the mobile computer close to the steering wheel reduces risk of injuries, such as low back pain and shoulder tendonitis. Results from the study can guide electric utility companies in the installation of mobile computers into vehicle cabs. Results may also be generalized to other industries that use trucklike vehicles, such as construction.
Updated Panel-Method Computer Program
NASA Technical Reports Server (NTRS)
Ashby, Dale L.
1995-01-01
Panel code PMARC_12 (Panel Method Ames Research Center, version 12) computes potential-flow fields around complex three-dimensional bodies such as complete aircraft models. It contains several advanced features, including internal mathematical modeling of flow; a time-stepping wake model for simulating either steady or unsteady motions; Trefftz-plane computation of induced drag; computation of off-body and on-body streamlines; and computation of boundary-layer parameters by use of a two-dimensional integral boundary-layer method along surface streamlines. Investigators interested in visual representations of phenomena may want to consider obtaining program GVS (ARC-13361), General Visualization System. GVS is a Silicon Graphics IRIS program created to support the scientific-visualization needs of PMARC_12. GVS is available separately from COSMIC. PMARC_12 is written in standard FORTRAN 77, with the exception of the NAMELIST extension used for input.
Comparing student performance on paper- and computer-based math curriculum-based measures.
Hensley, Kiersten; Rankin, Angelica; Hosp, John
2017-01-01
As the number of computerized curriculum-based measurement (CBM) tools increases, it is necessary to examine whether or not student performance can generalize across a variety of test administration modes (i.e., paper or computer). The purpose of this study is to compare math fact fluency on paper versus computer for 197 upper elementary students. Students completed identical sets of probes on paper and on the computer, which were then scored for digits correct, problems correct, and accuracy. Results showed a significant difference in performance between the two sets of probes, with higher fluency rates on the paper probes. Because decisions about levels of student support and interventions often rely on measures such as these, more research in this area is needed to examine the potential differences in student performance between paper-based and computer-based CBMs.
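For concreteness, a minimal scorer for the three measures the study reports (digits correct, problems correct, accuracy). Scoring conventions for digits correct vary (e.g., place-value alignment from the right), so this left-aligned version is only an illustration:

```python
def score_probe(responses, answers, minutes=2.0):
    """Score one probe; responses/answers are sequences of numeric answers."""
    digits = sum(
        sum(r == a for r, a in zip(str(resp), str(ans)))   # left-aligned digit match
        for resp, ans in zip(responses, answers)
    )
    problems = sum(resp == ans for resp, ans in zip(responses, answers))
    return {
        "digits_correct_per_min": digits / minutes,
        "problems_correct": problems,
        "accuracy": problems / len(answers) if answers else 0.0,
    }

# score_probe([12, 30, 47], answers=[12, 36, 47])  -> 2 problems correct
```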
The neural and computational bases of semantic cognition.
Ralph, Matthew A Lambon; Jefferies, Elizabeth; Patterson, Karalyn; Rogers, Timothy T
2017-01-01
Semantic cognition refers to our ability to use, manipulate and generalize knowledge that is acquired over the lifespan to support innumerable verbal and non-verbal behaviours. This Review summarizes key findings and issues arising from a decade of research into the neurocognitive and neurocomputational underpinnings of this ability, leading to a new framework that we term controlled semantic cognition (CSC). CSC offers solutions to long-standing queries in philosophy and cognitive science, and yields a convergent framework for understanding the neural and computational bases of healthy semantic cognition and its dysfunction in brain disorders.
A multitasking, multisinked, multiprocessor data acquisition front end
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, R.; Au, R.; Molen, A.V.
1989-10-01
The authors have developed a generalized data acquisition front end system which is based on MC68020 processors running a commercial real time kernel (rhoSOS), and implemented primarily in a high level language (C). This system has been attached to the back end on-line computing system at NSCL via our high performance ETHERNET protocol. Data may be simultaneously sent to any number of back end systems. Fixed fraction sampling along links to back end computing is also supported. A nonprocedural program generator simplifies the development of experiment specific code.
Counts, Sarah J; Kim, Anthony W
2017-08-01
Modalities to detect and characterize lung cancer are generally divided into those that are invasive [endobronchial ultrasound (EBUS), esophageal ultrasound (EUS), and electromagnetic navigational bronchoscopy (ENMB)] versus noninvasive [chest radiography (CXR), computed tomography (CT), positron emission tomography (PET), and magnetic resonance imaging (MRI)]. This chapter describes these modalities, the literature supporting their use, and delineates what tests to use to best evaluate the patient with lung cancer. Copyright © 2017 Elsevier Inc. All rights reserved.
A Scatter-Based Prototype Framework and Multi-Class Extension of Support Vector Machines
Jenssen, Robert; Kloft, Marius; Zien, Alexander; Sonnenburg, Sören; Müller, Klaus-Robert
2012-01-01
We provide a novel interpretation of the dual of support vector machines (SVMs) in terms of scatter with respect to class prototypes and their mean. As a key contribution, we extend this framework to multiple classes, providing a new joint Scatter SVM algorithm, at the level of its binary counterpart in the number of optimization variables. This enables us to implement computationally efficient solvers based on sequential minimal and chunking optimization. As a further contribution, the primal problem formulation is developed in terms of regularized risk minimization and the hinge loss, revealing the score function to be used in the actual classification of test patterns. We investigate Scatter SVM properties related to generalization ability, computational efficiency, sparsity and sensitivity maps, and report promising results. PMID:23118845
Bringing computational science to the public.
McDonagh, James L; Barker, Daniel; Alderson, Rosanna G
2016-01-01
The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.
Psychiatrists' Comfort Using Computers and Other Electronic Devices in Clinical Practice.
Duffy, Farifteh F; Fochtmann, Laura J; Clarke, Diana E; Barber, Keila; Hong, Seung-Hee; Yager, Joel; Mościcki, Eve K; Plovnick, Robert M
2016-09-01
This report highlights findings from the Study of Psychiatrists' Use of Informational Resources in Clinical Practice, a cross-sectional Web- and paper-based survey that examined psychiatrists' comfort using computers and other electronic devices in clinical practice. One-thousand psychiatrists were randomly selected from the American Medical Association Physician Masterfile and asked to complete the survey between May and August, 2012. A total of 152 eligible psychiatrists completed the questionnaire (response rate 22.2 %). The majority of psychiatrists reported comfort using computers for educational and personal purposes. However, 26 % of psychiatrists reported not using or not being comfortable using computers for clinical functions. Psychiatrists under age 50 were more likely to report comfort using computers for all purposes than their older counterparts. Clinical tasks for which computers were reportedly used comfortably, specifically by psychiatrists younger than 50, included documenting clinical encounters, prescribing, ordering laboratory tests, accessing read-only patient information (e.g., test results), conducting internet searches for general clinical information, accessing online patient educational materials, and communicating with patients or other clinicians. Psychiatrists generally reported comfort using computers for personal and educational purposes. However, use of computers in clinical care was less common, particularly among psychiatrists 50 and older. Information and educational resources need to be available in a variety of accessible, user-friendly, computer and non-computer-based formats, to support use across all ages. Moreover, ongoing training and technical assistance with use of electronic and mobile device technologies in clinical practice is needed. Research on barriers to clinical use of computers is warranted.
A general-purpose development environment for intelligent computer-aided training systems
NASA Technical Reports Server (NTRS)
Savely, Robert T.
1990-01-01
Space station training will be a major task, requiring the creation of large numbers of simulation-based training systems for crew, flight controllers, and ground-based support personnel. Given the long duration of space station missions and the large number of activities supported by the space station, the extension of space shuttle training methods to space station training may prove to be impractical. The application of artificial intelligence technology to simulation training can provide the ability to deliver individualized training to large numbers of personnel in a distributed workstation environment. The principal objective of this project is the creation of a software development environment which can be used to build intelligent training systems for procedural tasks associated with the operation of the space station. Current NASA Johnson Space Center projects and joint projects with other NASA operational centers will result in specific training systems for existing space shuttle crew, ground support personnel, and flight controller tasks. Concurrently with the creation of these systems, a general-purpose development environment for intelligent computer-aided training systems will be built. Such an environment would permit the rapid production, delivery, and evolution of training systems for space station crew, flight controllers, and other support personnel. The widespread use of such systems will serve to preserve task and training expertise, support the training of many personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences. As a result, significant reductions in training costs can be realized while safety and the probability of mission success can be enhanced.
Cloud Infrastructures for In Silico Drug Discovery: Economic and Practical Aspects
Clematis, Andrea; Quarati, Alfonso; Cesini, Daniele; Milanesi, Luciano; Merelli, Ivan
2013-01-01
Cloud computing opens new perspectives for small-medium biotechnology laboratories that need to perform bioinformatics analysis in a flexible and effective way. This seems particularly true for hybrid clouds that couple the scalability offered by general-purpose public clouds with the greater control and ad hoc customizations supplied by the private ones. A hybrid cloud broker, acting as an intermediary between users and public providers, can support customers in the selection of the most suitable offers, optionally adding the provisioning of dedicated services with higher levels of quality. This paper analyses some economic and practical aspects of exploiting cloud computing in a real research scenario for the in silico drug discovery in terms of requirements, costs, and computational load based on the number of expected users. In particular, our work is aimed at supporting both the researchers and the cloud broker delivering an IaaS cloud infrastructure for biotechnology laboratories exposing different levels of nonfunctional requirements. PMID:24106693
The dependence of educational infrastructure on clinical infrastructure.
Cimino, C.
1998-01-01
The Albert Einstein College of Medicine needed to assess the growth of its infrastructure for educational computing as a first step to determining if student needs were being met. Included in computing infrastructure are space, equipment, software, and computing services. The infrastructure was assessed by reviewing purchasing and support logs for a six year period from 1992 to 1998. This included equipment, software, and e-mail accounts provided to students and to faculty for educational purposes. Student space has grown at a constant rate (averaging 14% increase each year respectively). Student equipment on campus has grown by a constant amount each year (average 8.3 computers each year). Student infrastructure off campus and educational support of faculty has not kept pace. It has either declined or remained level over the six year period. The availability of electronic mail clearly demonstrates this with accounts being used by 99% of students, 78% of Basic Science Course Leaders, 38% of Clerkship Directors, 18% of Clerkship Site Directors, and 8% of Clinical Elective Directors. The collection of the initial descriptive infrastructure data has revealed problems that may generalize to other medical schools. The discrepancy between infrastructure available to students and faculty on campus and students and faculty off campus creates a setting where students perceive a paradoxical declining support for computer use as they progress through medical school. While clinical infrastructure may be growing, it is at the expense of educational infrastructure at affiliate hospitals. PMID:9929262
NASA Astrophysics Data System (ADS)
Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.
2010-04-01
An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance for generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work is contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system model was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding, with minimal computational complexity.
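Walsh-Hadamard code families like those evaluated above can be generated by the Sylvester construction; a small sketch follows (scipy.linalg.hadamard produces the same matrices for power-of-two N):

```python
import numpy as np

def walsh_hadamard(n):
    """Rows of the returned n x n +/-1 matrix are Walsh-Hadamard codes
    (Sylvester construction)."""
    assert n > 0 and (n & (n - 1)) == 0, "n must be a power of two"
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])   # Sylvester doubling step
    return H

codes = walsh_hadamard(8)
# Rows are mutually orthogonal: correlating distinct codes gives zero.
assert np.array_equal(codes @ codes.T, 8 * np.eye(8, dtype=int))
```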
Image analysis library software development
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Bryant, J.
1977-01-01
The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.
Project CAD as of July 1978: CAD support project, situation in July 1978
NASA Technical Reports Server (NTRS)
Boesch, L.; Lang-Lendorff, G.; Rothenberg, R.; Stelzer, V.
1979-01-01
The structure of Computer Aided Design (CAD) and the requirements for past and future program development are described. The current standard and future aims of CAD programs are presented. The developed programs in (1) civil engineering, (2) mechanical engineering, (3) chemical engineering/shipbuilding, (4) electrical engineering, and (5) general programs are discussed.
6. VIEW OF THE BRIQUETTING PRESS AND CHIP CLEANING HOOD. SCRAPS OF ENRICHED URANIUM FROM MACHINING OPERATIONS WERE CLEANED IN A SOLVENT BATH, THEN PRESSED INTO BRIQUETTS. THE BRIQUETTS WERE USED AS FEED MATERIAL FOR THE FOUNDRY. (4/4/66) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-31
... found in the environment. Through the IRIS Program, EPA provides the highest quality science-based human... for the external review draft human health assessment titled, "Toxicological Review of n-Butanol: In... will need audio-visual equipment (e.g., laptop computer and slide projector). In general, each...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-18
.... Indicate if you will need audio-visual equipment (e.g., laptop computer and slide projector). In general... diagnoses of certain tumors reported in a Ramazzini Institute methanol research study, which was cited and... (PWG) review of select studies conducted at the Institute. EPA is today releasing the draft IRIS...
A Social Approach to High-Level Context Generation for Supporting Context-Aware M-Learning
ERIC Educational Resources Information Center
Pan, Xu-Wei; Ding, Ling; Zhu, Xi-Yong; Yang, Zhao-Xiang
2017-01-01
In m-learning environments, context-awareness is for wide use where learners' situations are varied, dynamic and unpredictable. We are facing the challenge of requirements of both generality and depth in generating and processing high-level context. In this paper, we present a social approach which exploits social dynamics and social computing for…
Stevens, Jean-Luc R.; Elver, Marco; Bednar, James A.
2013-01-01
Lancet is a new, simulator-independent Python utility for succinctly specifying, launching, and collating results from large batches of interrelated computationally demanding program runs. This paper demonstrates how to combine Lancet with IPython Notebook to provide a flexible, lightweight, and agile workflow for fully reproducible scientific research. This informal and pragmatic approach uses IPython Notebook to capture the steps in a scientific computation as it is gradually automated and made ready for publication, without mandating the use of any separate application that can constrain scientific exploration and innovation. The resulting notebook concisely records each step involved in even very complex computational processes that led to a particular figure or numerical result, allowing the complete chain of events to be replicated automatically. Lancet was originally designed to help solve problems in computational neuroscience, such as analyzing the sensitivity of a complex simulation to various parameters, or collecting the results from multiple runs with different random starting points. However, because it is never possible to know in advance what tools might be required in future tasks, Lancet has been designed to be completely general, supporting any type of program as long as it can be launched as a process and can return output in the form of files. For instance, Lancet is also heavily used by one of the authors in a separate research group for launching batches of microprocessor simulations. This general design will allow Lancet to continue supporting a given research project even as the underlying approaches and tools change. PMID:24416014
Achieving reliability - The evolution of redundancy in American manned spacecraft computers
NASA Technical Reports Server (NTRS)
Tomayko, J. E.
1985-01-01
The Shuttle is the first launch system deployed by NASA with full redundancy in the on-board computer systems. Fault tolerance, i.e., reverting to a backup with fewer capabilities, was the method selected for Apollo. The Gemini capsule was the first to carry a computer, which also served as backup for Titan launch vehicle guidance. Failure of the Gemini computer resulted in manual control of the spacecraft. The Apollo system served vehicle flight control and navigation functions. The redundant computer on Skylab provided attitude control only, in support of solar telescope pointing. The STS digital fly-by-wire avionics system requires 100 percent reliability. The Orbiter carries five general purpose computers, four being fully redundant and the fifth serving solely as an ascent-descent tool. The computers are synchronized at input and output points at a rate of about six times a second. The system is projected to cause the loss of an Orbiter only four times in a billion flights.
Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing
NASA Astrophysics Data System (ADS)
Shi, X.
2017-10-01
Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may behave differently on different computing infrastructures and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful for certain kinds of spatiotemporal computation. The same situation arises in utilizing a cluster of Intel's many-integrated-core (MIC) processors, or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy-efficiency requirement in general computation, a Field Programmable Gate Array (FPGA) may be a better solution for energy efficiency when its computational performance is similar to or better than that of GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates GPUs, MICs, and FPGAs can be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.
Towards a model of pion generalized parton distributions from Dyson-Schwinger equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moutarde, H.
2015-04-10
We compute the pion quark Generalized Parton Distribution H^q and Double Distributions F^q and G^q in a coupled Bethe-Salpeter and Dyson-Schwinger approach. We use simple algebraic expressions inspired by the numerical resolution of Dyson-Schwinger and Bethe-Salpeter equations. We explicitly check the support and polynomiality properties, and the behavior of our model under charge conjugation or time invariance. We derive analytic expressions for the pion Double Distributions and Generalized Parton Distribution at vanishing pion momentum transfer at a low scale. Our model compares very well to experimental pion form factor or parton distribution function data.
Cady, R.E.; Peckenpaugh, J.M.
1985-01-01
RAQSIM, a generalized flow model of a groundwater system using finite-element methods, is documented to explain how it works and to demonstrate that it gives valid results. Three support programs that are used to compute recharge and discharge data required as input to RAQSIM are described. RAQSIM was developed to solve transient, two-dimensional, regional groundwater flow problems with isotropic or anisotropic conductance. The model can also simulate radially symmetric flow to a well and steady-state flow. The mathematical basis, program structure, data input and output procedures, organization of data sets, and program features and options of RAQSIM are discussed. An example, containing listings of data and results and illustrating RAQSIM's capabilities, is discussed in detail. Two test problems comparing RAQSIM's results with analytical procedures also are discussed. The first support program described, the PET Program, uses solar radiation and other climatic data in the Jensen-Haise method to compute potential evapotranspiration. The second support program, the Soil-Water Program, uses output from the PET Program, soil characteristics, and the ratio of potential to actual evapotranspiration for each crop to compute infiltration, storage, and removal of water from the soil zone. The third program, the Recharge-Discharge Program, uses output from the Soil-Water Program together with other data to compute recharge and discharge from the groundwater flow system. For each support program, a program listing and examples of the data and results for the Twin Platte-Middle Republican study are provided. In addition, a brief discussion of how each program operates and of procedures for running and modifying these programs is presented. (Author's abstract)
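The Jensen-Haise method used by the PET Program reduces to a simple product of temperature and radiation terms. A minimal sketch, assuming the original Jensen-Haise (1963) coefficients (Ct = 0.025, Tx = -3 deg C) and solar radiation already expressed as an equivalent depth of evaporation; the actual PET Program may use calibrated, site-specific coefficients:

    def jensen_haise_pet(t_mean_c, rs_mm_per_day, ct=0.025, tx=-3.0):
        """Potential evapotranspiration (mm/day) by the Jensen-Haise method.

        t_mean_c:       mean daily air temperature, deg C
        rs_mm_per_day:  incoming solar radiation expressed as an equivalent
                        depth of evaporated water, mm/day
        ct, tx:         empirical coefficients; 0.025 and -3 deg C are the
                        original Jensen-Haise values for temperate sites
        """
        # Clamp at zero so cold days do not yield negative evapotranspiration.
        return max(0.0, ct * (t_mean_c - tx) * rs_mm_per_day)

    print(jensen_haise_pet(20.0, 10.0))   # ~5.75 mm/day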
Polytopol computing for multi-core and distributed systems
NASA Astrophysics Data System (ADS)
Spaanenburg, Henk; Spaanenburg, Lambert; Ranefors, Johan
2009-05-01
Multi-core computing provides new challenges to software engineering. The paper addresses such issues in the general setting of polytopol computing, which takes into account multi-core problems in such widely differing areas as ambient intelligence sensor networks and cloud computing. It argues that the essence lies in a suitable allocation of free-moving tasks. Where hardware is ubiquitous and pervasive, the network is virtualized into a connection of software snippets judiciously injected into the hardware so that a system function again appears as a single whole. The concept of polytopol computing provides a further formalization in terms of the partitioning of labor between collector and sensor nodes. Collectors provide functions such as knowledge integration, awareness collection, situation display/reporting, communication of clues, and provision of an inquiry interface. Sensors provide functions such as anomaly detection (communicating only singularities, not continuous observation); they are generally powered or self-powered, amorphous (not on a grid) with generation-and-attrition, field-reprogrammable, and sensor plug-and-play-able. Together the collector and the sensor are part of the skeleton-injector mechanism, added to every node, which gives the network the ability to organize itself into one of many topologies. Finally we discuss a number of applications and indicate how a multi-core architecture supports the security aspects of the skeleton injector.
Human factors with nonhumans - Factors that affect computer-task performance
NASA Technical Reports Server (NTRS)
Washburn, David A.
1992-01-01
There are two general strategies that may be employed for 'doing human factors research with nonhuman animals'. First, one may use the methods of traditional human factors investigations to examine the nonhuman animal-to-machine interface. Alternatively, one might use performance by nonhuman animals as a surrogate for or model of performance by a human operator. Each of these approaches is illustrated with data in the present review. Chronic ambient noise was found to have a significant but inconsequential effect on computer-task performance by rhesus monkeys (Macaca mulatta). Additional data supported the generality of findings such as these to humans, showing that rhesus monkeys are appropriate models of human psychomotor performance. It is argued that ultimately the interface between comparative psychology and technology will depend on the coordinated use of both strategies of investigation.
Gradient-based adaptation of general Gaussian kernels.
Glasmachers, Tobias; Igel, Christian
2005-10-01
Gradient-based optimization of Gaussian kernel functions is considered. The gradient for the adaptation of scaling and rotation of the input space is computed so as to achieve invariance against linear transformations. This is done by using the exponential map as a parameterization of the kernel parameter manifold. By restricting the optimization to a constant-trace subspace, the kernel size can be controlled. This is useful, for example, to prevent overfitting when minimizing radius-margin generalization performance measures. The concepts are demonstrated by training hard-margin support vector machines on toy data.
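To make the kernel family concrete: a general Gaussian kernel can be written k(x, y) = exp(-(x - y)^T M (x - y)) with a symmetric positive semidefinite metric M that encodes scaling and rotation. A minimal sketch, parameterizing M = A^T A through an unconstrained matrix A rather than through the exponential map the paper uses:

    import numpy as np

    def general_gauss_kernel(x, y, A):
        """k(x, y) = exp(-(x - y)^T A^T A (x - y)).

        Writing the metric as M = A^T A keeps M positive semidefinite while
        allowing scaling and rotation of the input space.
        """
        d = x - y
        return np.exp(-d @ (A.T @ A) @ d)

    def kernel_grad_A(x, y, A):
        """Gradient of k(x, y) with respect to the parameter matrix A."""
        d = x - y
        k = general_gauss_kernel(x, y, A)
        return -2.0 * k * (A @ np.outer(d, d))

    # Finite-difference check of one gradient entry:
    rng = np.random.default_rng(0)
    x, y, A = rng.normal(size=3), rng.normal(size=3), rng.normal(size=(3, 3))
    eps = 1e-6
    Ap = A.copy(); Ap[0, 1] += eps
    num = (general_gauss_kernel(x, y, Ap) - general_gauss_kernel(x, y, A)) / eps
    print(np.isclose(num, kernel_grad_A(x, y, A)[0, 1], atol=1e-5))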
NASA Astrophysics Data System (ADS)
Fasel, Markus
2016-10-01
High-Performance Computing Systems are powerful tools tailored to support large-scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management, in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems, Carver, Hopper and Edison, and is being configured to provide access to the next generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion will include our initial benchmarks of Cori compared to other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.
Classification of Normal and Pathological Gait in Young Children Based on Foot Pressure Data.
Guo, Guodong; Guffey, Keegan; Chen, Wenbin; Pergami, Paola
2017-01-01
Human gait recognition, an active research topic in computer vision, is generally based on data obtained from images/videos. We applied computer vision technology to classify pathology-related changes in gait in young children using a foot-pressure database collected using the GAITRite walkway system. As foot positioning changes with children's development, we also investigated the possibility of age estimation based on this data. Our results demonstrate that the data collected by the GAITRite system can be used for normal/pathological gait classification. Combining age information and normal/pathological gait classification increases the accuracy of the classifier. This novel approach could support the development of an accurate, real-time, and economic measure of gait abnormalities in children, able to provide important feedback to clinicians regarding the effect of rehabilitation interventions, and to support targeted treatment modifications.
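A minimal sketch of the classification stage, assuming gait features have already been extracted; the data below are synthetic stand-ins for GAITRite-derived measurements, and appending age to the feature vector mirrors the paper's finding that age information improves accuracy:

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    # Hypothetical stand-in for GAITRite-derived features (step length,
    # cadence, pressure peaks, ...): 100 walks x 12 features.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 12))
    y = rng.integers(0, 2, size=100)          # 0 = normal, 1 = pathological

    # Augment the feature vector with the child's age.
    age = rng.uniform(2, 10, size=(100, 1))
    X_aug = np.hstack([X, age])

    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    print(cross_val_score(clf, X_aug, y, cv=5).mean())

With real foot-pressure features in place of the random arrays, the same cross-validation loop gives the normal/pathological accuracy the paper reports on.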
Trajectory Software With Upper Atmosphere Model
NASA Technical Reports Server (NTRS)
Barrett, Charles
2012-01-01
The Trajectory Software Applications 6.0 for the DEC Alpha platform has an implementation of the Jacchia-Lineberry Upper Atmosphere Density Model used in the Mission Control Center for International Space Station support. Previous trajectory software required an upper atmosphere model to support atmospheric drag calculations in the Mission Control Center. Functional operation will differ depending on the end use of the module. In general, the calling routine will use function-calling arguments to specify input to the processor. The atmosphere model will then compute and return atmospheric density at the time of interest.
Leibon, Gregory; Rockmore, Daniel N.; Park, Wooram; Taintor, Robert; Chirikjian, Gregory S.
2008-01-01
We present algorithms for fast and stable approximation of the Hermite transform of a compactly supported function on the real line, attainable via an application of a fast algebraic algorithm for computing sums associated with a three-term relation. Trade-offs between approximation in bandlimit (in the Hermite sense) and size of the support region are addressed. Numerical experiments are presented that show the feasibility and utility of our approach. Generalizations to any family of orthogonal polynomials are outlined. Applications to various problems in tomographic reconstruction, including the determination of protein structure, are discussed. PMID:20027202
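The three-term relation referred to is the Hermite recurrence H_{k+1}(x) = 2x H_k(x) - 2k H_{k-1}(x). A naive sketch of evaluation through this recurrence, i.e., the direct baseline that fast algorithms of this kind accelerate:

    import numpy as np

    def hermite_eval(n, x):
        """Evaluate physicists' Hermite polynomials H_0..H_n at points x
        via the three-term recurrence H_{k+1} = 2x H_k - 2k H_{k-1}.

        Returns an array of shape (n + 1, len(x)); cost is O(n * len(x)),
        whereas the fast transform above is asymptotically cheaper.
        """
        x = np.asarray(x, dtype=float)
        H = np.empty((n + 1, x.size))
        H[0] = 1.0
        if n >= 1:
            H[1] = 2.0 * x
        for k in range(1, n):
            H[k + 1] = 2.0 * x * H[k] - 2.0 * k * H[k - 1]
        return H

    print(hermite_eval(3, np.array([0.5]))[:, 0])   # H_0..H_3 at x = 0.5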
[Online therapies - what is known about their functionality].
Stenberg, Jan-Henry; Joutsenniemi, Kaisla; Holi, Matti
2015-01-01
Online therapies are partly automated therapies in which psychotherapeutic content has been complemented with computer-aided presentational and educational content, with a therapist giving support to the progress of the patient. These programs incorporate therapeutic methods that have proven effective, such as remodeling of thoughts, activation of behavior and exposure, empathy, strengthening of the cooperative relationship and motivation, and general support for self-reflection. Online therapies already constitute part of the Finnish treatment guidelines on depression, for instance. Online therapies are available throughout Finland for the major psychiatric illnesses.
Moore, Helen J; Nixon, Catherine; Tariq, Anisah; Emery, Jon; Hamilton, Willie; Hoare, Zoë; Kershenbaum, Anne; Neal, Richard D; Ukoumunne, Obioha C; Usher-Smith, Juliet; Walter, Fiona M; Whyte, Sophie; Rubin, Greg
2016-04-04
For most cancers, only a minority of patients have symptoms meeting the National Institute for Health and Clinical Excellence guidance for urgent referral. For gastro-oesophageal cancers, the 'alarm' symptoms of dysphagia and weight loss are reported by only 32% and 8% of patients, respectively, and their presence correlates with advanced-stage disease. Electronic clinical decision-support tools that integrate with clinical computer systems have been developed for general practice, although uncertainty remains concerning their effectiveness. The objectives of this trial are to optimise the intervention and establish the acceptability of both the intervention and randomisation, confirm the suitability and selection of outcome measures, finalise the design for the phase III definitive trial, and obtain preliminary estimates of the intervention effect. This is a two-arm, multi-centre, cluster-randomised, controlled phase II trial, which will extend over a 16-month period across 60 general practices within the North East and North Cumbria and the Eastern Local Clinical Research Network areas. Practices will be randomised either to receive the intervention (the electronic clinical decision-support tool) or to act as controls (usual care). From these practices, we will recruit 3000 adults who meet the trial eligibility criteria and present to their GP with symptoms suggestive of gastro-oesophageal cancer. The main measures are the process data, which include the practitioner outcomes, service outcomes, diagnostic intervals, health economic outcomes, and patient outcomes. One-on-one interviews with a sub-sample of 30 patient-GP dyads will be undertaken to understand the impact of the use or non-use of the electronic clinical decision-support tool in the consultation. A further 10-15 GPs will be interviewed to identify and gain an understanding of the facilitators and constraints influencing implementation of the electronic clinical decision-support tool in practice. We aim to generate new knowledge on the process measures regarding the use of electronic clinical decision-support tools in primary care in general and to inform a subsequent definitive phase III trial. Preliminary data on the impact of the support tool on resource utilisation and health care costs will also be collected. ISRCTN Registry, ISRCTN12595588.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, S.A.
In a computing landscape with a plethora of different hardware architectures and supporting software systems, ranging from compilers to operating systems, there is an obvious and strong need for a philosophy of software development that lends itself to the design and construction of portable code systems. The current efforts to standardize software bear witness to this need. SABrE is an effort to implement a software development environment which is itself portable and promotes the design and construction of portable applications. SABrE does not include such important tools as editors and compilers; well-built tools of that kind are readily available across virtually all computer platforms. The areas that SABrE addresses are at a higher level, involving issues such as data portability, portable inter-process communication, and graphics. These blocks of functionality have particular significance to the kind of code development done at LLNL, which is partly why the general computing community has not supplied us with these tools already. This is another key feature of the software development environments which we must recognize: the general computing community cannot and should not be expected to produce all of the tools which we require.
Performance characteristics of three-phase induction motors
NASA Technical Reports Server (NTRS)
Wood, M. E.
1977-01-01
An investigation into the characteristics of three-phase, 400-Hz induction motors of the general type used on aircraft and spacecraft is summarized. Results of laboratory tests are presented and compared with results from a computer program. Representative motors were both tested and simulated under nominal conditions as well as off-nominal conditions of temperature, frequency, voltage magnitude, and voltage balance. Good correlation was achieved between simulated and laboratory results. The primary purpose of the program was to verify the simulation accuracy of the computer program, which in turn will be used as an analytical tool to support the shuttle orbiter.
Program Helps Simulate Neural Networks
NASA Technical Reports Server (NTRS)
Villarreal, James; Mcintire, Gary
1993-01-01
Neural Network Environment on Transputer System (NNETS) computer program provides users high degree of flexibility in creating and manipulating wide variety of neural-network topologies at processing speeds not found in conventional computing environments. Supports back-propagation and back-propagation-related algorithms. Back-propagation algorithm used is implementation of Rumelhart's generalized delta rule. NNETS developed on INMOS Transputer(R). Predefines back-propagation network, Jordan network, and reinforcement network to assist users in learning and defining own networks. Also enables users to configure other neural-network paradigms from NNETS basic architecture. Small portion of software written in OCCAM(R) language.
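A minimal sketch of the generalized delta rule for a two-layer sigmoid network, the weight update underlying the back-propagation NNETS implements; biases and the Transputer-specific parallelism are omitted:

    import numpy as np

    def backprop_step(x, t, W1, W2, lr=0.1):
        """One update of Rumelhart's generalized delta rule.

        x: input vector, t: target vector, W1/W2: weight matrices
        (modified in place).
        """
        sig = lambda z: 1.0 / (1.0 + np.exp(-z))
        h = sig(W1 @ x)                            # hidden activations
        o = sig(W2 @ h)                            # output activations
        delta_o = (t - o) * o * (1 - o)            # output error term
        delta_h = (W2.T @ delta_o) * h * (1 - h)   # back-propagated error term
        W2 += lr * np.outer(delta_o, h)            # weight updates
        W1 += lr * np.outer(delta_h, x)
        return o

    # Train on a single pattern until the output approaches the target.
    rng = np.random.default_rng(0)
    W1, W2 = rng.normal(size=(3, 2)), rng.normal(size=(1, 3))
    for _ in range(1000):
        out = backprop_step(np.array([0.0, 1.0]), np.array([1.0]), W1, W2)
    print(out)   # close to 1.0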
Aeronautical engineering: A continuing bibliography with indexes (supplement 316)
NASA Technical Reports Server (NTRS)
1995-01-01
This bibliography lists 413 reports, articles, and other documents introduced into the NASA scientific and technical information system in April 1995. Subject coverage includes: aeronautics; mathematical and computer sciences; chemistry and material sciences; geosciences; design, construction and testing of aircraft and aircraft engines; aircraft components, equipment, and systems; ground support systems; and theoretical and applied aspects of aerodynamics and general fluid dynamics.
9. VIEW OF MILLING AND LATHE MACHINES, MILLING AND LATHE ...
9. VIEW OF MILLING AND LATHE MACHINES, MILLING AND LATHE MACHINES WERE USED TO FORM COMPONENTS INTO THEIR FINAL SHAPE. IN THE FOUNDRY, ENRICHED URANIUM WAS CAST INTO SPHERICAL SHAPES OR INGOT FROM WHICH WEAPONS COMPONENTS WERE FABRICATED. (4/4/66) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
Simulation Methods for Design of Networked Power Electronics and Information Systems
2014-07-01
Insertion of latency in every branch and at every node permits the system model to be efficiently distributed across many separate computing cores. An... the system. We demonstrated extensibility and generality of the Virtual Test Bed (VTB) framework to support multiple solvers and their associated... Information Systems. Objectives: The overarching objective of this program is to develop methods for fast
Shuttle mission simulator baseline definition report, volume 2
NASA Technical Reports Server (NTRS)
Dahlberg, A. W.; Small, D. E.
1973-01-01
The baseline definition report for the space shuttle mission simulator is presented. The subjects discussed are: (1) the general configurations, (2) motion base crew station, (3) instructor operator station complex, (4) display devices, (5) electromagnetic compatibility, (6) external interface equipment, (7) data conversion equipment, (8) fixed base crew station equipment, and (9) computer complex. Block diagrams of the supporting subsystems are provided.
Efficient universal quantum channel simulation in IBM's cloud quantum computer
NASA Astrophysics Data System (ADS)
Wei, Shi-Jie; Xin, Tao; Long, Gui-Lu
2018-07-01
The study of quantum channels is an important field and promises a wide range of applications, because any physical process can be represented as a quantum channel that transforms an initial state into a final state. Inspired by the method of performing non-unitary operators by linear combination of unitary operations, we proposed a quantum algorithm for the simulation of the universal single-qubit channel, described by a convex combination of "quasi-extreme" channels corresponding to four Kraus operators, an approach that is scalable to arbitrary higher dimensions. We demonstrated the whole algorithm experimentally using the universal IBM cloud-based quantum computer and studied the properties of different qubit quantum channels. We illustrated the quantum capacity of general qubit quantum channels, which quantifies the amount of quantum information that can be protected. The behavior of quantum capacity in different channels revealed which types of noise processes can support information transmission, and which types are too destructive to protect information. There was general agreement between the theoretical predictions and the experiments, which strongly supports our method. By realizing the arbitrary qubit channel, this work provides a universally accepted way to explore various properties of quantum channels and a novel prospect for quantum communication.
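Any qubit channel acts on a density matrix through its Kraus operators as rho' = sum_k K_k rho K_k^dagger with sum_k K_k^dagger K_k = I. A minimal numpy sketch of that action for one concrete case, the depolarizing channel; the paper's algorithm simulates the general channel on hardware, whereas this only reproduces the mathematical map:

    import numpy as np

    def apply_channel(rho, kraus_ops):
        """rho' = sum_k K rho K^dagger; valid when sum_k K^dagger K = I."""
        return sum(K @ rho @ K.conj().T for K in kraus_ops)

    # Depolarizing channel with error probability p.
    p = 0.2
    I = np.eye(2)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]])
    Z = np.diag([1.0, -1.0]).astype(complex)
    kraus = [np.sqrt(1 - 3 * p / 4) * I,
             np.sqrt(p / 4) * X, np.sqrt(p / 4) * Y, np.sqrt(p / 4) * Z]

    rho = np.array([[1, 0], [0, 0]], dtype=complex)   # |0><0|
    print(apply_channel(rho, kraus).round(3))          # diag(0.9, 0.1)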
The finite element method in low speed aerodynamics
NASA Technical Reports Server (NTRS)
Baker, A. J.; Manhardt, P. D.
1975-01-01
The finite element procedure is shown to be of significant impact in design of the 'computational wind tunnel' for low speed aerodynamics. The uniformity of the mathematical differential equation description, for viscous and/or inviscid, multi-dimensional subsonic flows about practical aerodynamic system configurations, is utilized to establish the general form of the finite element algorithm. Numerical results for inviscid flow analysis, as well as viscous boundary layer, parabolic, and full Navier Stokes flow descriptions verify the capabilities and overall versatility of the fundamental algorithm for aerodynamics. The proven mathematical basis, coupled with the distinct user-orientation features of the computer program embodiment, indicate near-term evolution of a highly useful analytical design tool to support computational configuration studies in low speed aerodynamics.
NASA Astrophysics Data System (ADS)
Zhao, Ben; Garbacki, Paweł; Gkantsidis, Christos; Iamnitchi, Adriana; Voulgaris, Spyros
After a decade of intensive investigation, peer-to-peer computing has established itself as an accepted research field in the general area of distributed systems. Peer-to-peer computing can be seen as the democratization of computing, overthrowing traditional hierarchical designs favored in client-server systems, largely brought about by last-mile network improvements which have made individual PCs first-class citizens in the network community. Much of the early focus in peer-to-peer systems was on best-effort file sharing applications. In recent years, however, research has focused on peer-to-peer systems that provide operational properties and functionality similar to those shown by more traditional distributed systems. These properties include stronger consistency, reliability, and security guarantees suitable to supporting traditional applications such as databases.
Berner, Eta S.; Detmer, Don E.; Simborg, Donald
2005-01-01
For over thirty years, there have been predictions that the widespread clinical use of computers was imminent. Yet the “wave” has never broken. In this article, two broad time periods are examined: the 1960's to the 1980's and the 1980's to the present. Technology immaturity, health administrator focus on financial systems, application “unfriendliness,” and physician resistance were all barriers to acceptance during the early time period. Although these factors persist, changes in clinicians' economics, more computer literacy in the general population, and, most importantly, changes in government policies and increased support for clinical computing suggest that the wave may break in the next decade. PMID:15492029
Space-Bounded Church-Turing Thesis and Computational Tractability of Closed Systems.
Braverman, Mark; Schneider, Jonathan; Rojas, Cristóbal
2015-08-28
We report a new limitation on the ability of physical systems to perform computation, one that is based on generalizing the notion of memory, or storage space, available to the system to perform the computation. Roughly, we define memory as the maximal amount of information that the evolving system can carry from one instant to the next. We show that memory is a limiting factor in computation even in the absence of any time limitations on the evolving system, such as when considering its equilibrium regime. We call this limitation the space-bounded Church-Turing thesis (SBCT). The SBCT is supported by a simulation assertion (SA), which states that predicting the long-term behavior of bounded-memory systems is computationally tractable. In particular, one corollary of SA is an explicit bound on the computational hardness of the long-term behavior of a discrete-time finite-dimensional dynamical system that is affected by noise. We prove such a bound explicitly.
Imprecise results: Utilizing partial computations in real-time systems
NASA Technical Reports Server (NTRS)
Lin, Kwei-Jay; Natarajan, Swaminathan; Liu, Jane W.-S.
1987-01-01
In real-time systems, a computation may not have time to complete its execution because of deadline requirements. In such cases, no result except the approximate results produced by the computation up to that point will be available. It is desirable to utilize these imprecise results if possible. Two approaches are proposed to enable computations to return imprecise results when executions cannot be completed normally. The milestone approach records results periodically and, if a deadline is reached, returns the last recorded result. The sieve approach demarcates sections of code which can be skipped if the time available is insufficient. By using these approaches, the system is able to produce imprecise results when deadlines are reached. The design of the Concord project, which supports imprecise computations using these techniques, is described. Also presented is a general model of imprecise computations, as well as one which takes into account the influence of the environment, showing where the latter approach fits into this model.
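A minimal sketch of the milestone approach: an iterative computation records a partial result after every refinement step, so the last recorded value can be returned when the deadline arrives. The names are illustrative, not Concord's API:

    import time

    def iterative_sqrt(a, deadline, record):
        """Newton iteration that records a milestone after every step.

        If the deadline arrives before convergence, the caller falls back
        on the last recorded (imprecise) result.
        """
        x = a
        while time.monotonic() < deadline:
            x = 0.5 * (x + a / x)   # one refinement step
            record(x)               # milestone: save the partial result
        return x

    milestones = []
    deadline = time.monotonic() + 0.001   # a very tight deadline
    iterative_sqrt(2.0, deadline, milestones.append)
    print("best imprecise result:", milestones[-1])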
Towards a general neural controller for quadrupedal locomotion.
Maufroy, Christophe; Kimura, Hiroshi; Takase, Kunikatsu
2008-05-01
Our study aims at the design and implementation of a general controller for quadruped locomotion, allowing the robot to use the whole range of quadrupedal gaits (i.e. from low speed walking to fast running). A general legged locomotion controller must integrate both posture control and rhythmic motion control and have the ability to shift continuously from one control method to the other according to locomotion speed. We are developing such a general quadrupedal locomotion controller by using a neural model involving a CPG (Central Pattern Generator) utilizing ground reaction force sensory feedback. We used a biologically faithful musculoskeletal model with a spine and hind legs, and computationally simulated stable stepping motion at various speeds using the neuro-mechanical system combining the neural controller and the musculoskeletal model. We compared the changes of the most important locomotion characteristics (stepping period, duty ratio and support length) according to speed in our simulations with the data on real cat walking. We found similar tendencies for all of them. In particular, the swing period was approximately constant while the stance period decreased with speed, resulting in a decreasing stepping period and duty ratio. Moreover, the support length increased with speed due to the posterior extreme position that shifted progressively caudally, while the anterior extreme position was approximately constant. This indicates that we succeeded in reproducing to some extent the motion of a cat from the kinematical point of view, even though we used a 2D bipedal model. We expect that such computational models will become essential tools for legged locomotion neuroscience in the future.
Receiving social support online: implications for health education.
White, M; Dorman, S M
2001-12-01
Online support groups are expanding as the general public becomes more comfortable using computer-mediated communication technology. These support groups have certain benefits for users who may not be able to or do not have the desire to attend face-to-face sessions. Online support groups also present challenges when compared to traditional face-to-face group communication. Communication difficulties may arise resulting from lack of visual and aural cues found in traditional face-to-face communication. Online support groups have emerged within health care as a result of the need individuals have to know more about health conditions they are confronting. The proliferation of these online communities may provide an opportunity for health educators to reach target populations with specific messages. This paper reviews the development of health-related online support groups, examines research conducted within these communities, compares their utility with traditional support groups and discusses the implications of these groups for health education.
Generic Divide and Conquer Internet-Based Computing
NASA Technical Reports Server (NTRS)
Follen, Gregory J. (Technical Monitor); Radenski, Atanas
2003-01-01
The growth of Internet-based applications and the proliferation of networking technologies have been transforming traditional commercial application areas as well as computer and computational sciences and engineering. This growth stimulates the exploration of peer-to-peer (P2P) software technologies that can open new research and application opportunities not only for the commercial world, but also for the scientific and high-performance computing applications community. The general goal of this project is to achieve better understanding of the transition to Internet-based high-performance computing and to develop solutions for some of the technical challenges of this transition. In particular, we are interested in creating long-term motivation for end users to provide their idle processor time to support computationally intensive tasks. We believe that a practical P2P architecture should provide useful service to both clients with high-performance computing needs and contributors of lower-end computing resources. To achieve this, we are designing a dual-service architecture for P2P high-performance divide-and-conquer computing; we are also experimenting with a prototype implementation. Our proposed architecture incorporates a master server, utilizes dual satellite servers, and operates on the Internet in a dynamically changing large configuration of lower-end nodes provided by volunteer contributors. A dual satellite server comprises a high-performance computing engine and a lower-end contributor service engine. The computing engine provides generic support for divide-and-conquer computations. The service engine is intended to provide free, useful HTTP-based services to contributors of lower-end computing resources. Our proposed architecture is complementary to and accessible from computational grids, such as Globus, Legion, and Condor. Grids provide remote access to existing higher-end computing resources; in contrast, our goal is to utilize the idle processor time of lower-end Internet nodes. Our project is focused on a generic divide-and-conquer paradigm and on mobile applications of this paradigm that can operate on a loose and ever-changing pool of lower-end Internet nodes.
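A minimal sketch of the generic divide-and-conquer template such an architecture distributes, using sorting as the example problem; local worker processes stand in for volunteered lower-end nodes, purely for illustration:

    from concurrent.futures import ProcessPoolExecutor

    def merge(left, right):
        """Combine step: merge two sorted lists."""
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i]); i += 1
            else:
                out.append(right[j]); j += 1
        return out + left[i:] + right[j:]

    def dac_sort(data, pool, depth=2):
        """Divide-and-conquer sort; subproblems below 'depth' are shipped
        to worker processes, standing in for contributed nodes."""
        if len(data) <= 1:
            return data
        mid = len(data) // 2
        halves = data[:mid], data[mid:]
        if depth > 0:
            parts = [dac_sort(h, pool, depth - 1) for h in halves]
        else:
            parts = list(pool.map(sorted, halves))   # leaf work on a 'node'
        return merge(*parts)

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:
            print(dac_sort([5, 3, 8, 1, 9, 2, 7], pool))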
Xyce parallel electronic simulator users guide, version 6.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R; Mei, Ting; Russo, Thomas V.
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
Xyce parallel electronic simulator users' guide, Version 6.0.1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R; Mei, Ting; Russo, Thomas V.
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
Xyce parallel electronic simulator users guide, version 6.0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R; Mei, Ting; Russo, Thomas V.
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
Wauchope, R Don; Ahuja, Lajpat R; Arnold, Jeffrey G; Bingner, Ron; Lowrance, Richard; van Genuchten, Martinus T; Adams, Larry D
2003-01-01
We present an overview of USDA Agricultural Research Service (ARS) computer models and databases related to pest-management science, emphasizing current developments in environmental risk assessment and management simulation models. The ARS has a unique national interdisciplinary team of researchers in surface and sub-surface hydrology, soil and plant science, systems analysis and pesticide science, who have networked to develop empirical and mechanistic computer models describing the behavior of pests, pest responses to controls, and the environmental impact of pest-control methods. Historically, much of this work has been in support of production agriculture and of the conservation programs of our 'action agency' sister, the Natural Resources Conservation Service (formerly the Soil Conservation Service). Because we are a public agency, our software and database products are generally offered without cost, unless they are developed in cooperation with a private-sector cooperator. Because ARS is a basic and applied research organization, with development of new science as our highest priority, these products tend to be offered on an 'as-is' basis, with limited user support except within cooperative R&D relationships with other scientists. However, rapid changes in the technology for information analysis and communication continually challenge our way of doing business.
Foundational Tools for Petascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-05-19
The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) the analysis, instrumentation and control of binary programs; work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale; work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, "High-Performance Energy Applications and Systems", SC0004061/FG02-10ER25972, UW PRJ36WV.
Solving large mixed linear models using preconditioned conjugate gradient iteration.
Strandén, I; Lidauer, M
1999-12-01
Continuous evaluation of dairy cattle with a random regression test-day model requires a fast solving method and algorithm. A new computing technique, feasible in Jacobi- and conjugate-gradient-based iterative methods using iteration on data, is presented. In the new computing technique, the calculations in the multiplication of a vector by a matrix are reorganized into three steps instead of the commonly used two steps. The three-step method was implemented in a general mixed linear model program that used preconditioned conjugate gradient iteration. Performance of this program in comparison to other general solving programs was assessed via estimation of breeding values using univariate, multivariate, and random regression test-day models. Central processing unit time per iteration with the new three-step technique was, at best, one-third of that needed with the old technique. Performance was best with the test-day model, which was the largest and most complex model used. The new program did well in comparison to other general software. Programs keeping the mixed model equations in random access memory required at least 20 and 435% more time to solve the univariate and multivariate animal models, respectively. Computations with the second-best iteration-on-data program took approximately three and five times longer for the animal and test-day models, respectively, than with the new program. Good performance was due to fast computing time per iteration and quick convergence to the final solutions. Use of preconditioned conjugate gradient based methods in solving large breeding value problems is supported by our findings.
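For reference, the preconditioned conjugate gradient iteration itself, in a minimal numpy sketch with a Jacobi (diagonal) preconditioner. The paper's contribution lies in forming the matrix-vector product by iterating on data rather than storing the mixed model equations; the comment marks where that product occurs:

    import numpy as np

    def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=1000):
        """Preconditioned conjugate gradients; M_inv_diag is the inverse
        of the diagonal preconditioner, applied elementwise."""
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv_diag * r
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p     # in an iteration-on-data scheme this product is
                           # accumulated by reading the data, matrix-free
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = M_inv_diag * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(pcg(A, b, 1.0 / np.diag(A)))   # ~ [0.0909, 0.6364]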
Nuclear Engine System Simulation (NESS) version 2.0
NASA Technical Reports Server (NTRS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.
1993-01-01
The topics are presented in viewgraph form and include the following; nuclear thermal propulsion (NTP) engine system analysis program development; nuclear thermal propulsion engine analysis capability requirements; team resources used to support NESS development; expanded liquid engine simulations (ELES) computer model; ELES verification examples; NESS program development evolution; past NTP ELES analysis code modifications and verifications; general NTP engine system features modeled by NESS; representative NTP expander, gas generator, and bleed engine system cycles modeled by NESS; NESS program overview; NESS program flow logic; enabler (NERVA type) nuclear thermal rocket engine; prismatic fuel elements and supports; reactor fuel and support element parameters; reactor parameters as a function of thrust level; internal shield sizing; and reactor thermal model.
DOC II 32-bit digital optical computer: optoelectronic hardware and software
NASA Astrophysics Data System (ADS)
Stone, Richard V.; Zeise, Frederick F.; Guilfoyle, Peter S.
1991-12-01
This paper describes current electronic hardware subsystems and software code which support OptiComp's 32-bit general purpose digital optical computer (DOC II). The reader is referred to earlier papers presented in this section for a thorough discussion of theory and application regarding DOC II. The primary optoelectronic subsystems include the drive electronics for the multichannel acousto-optic modulators, the avalanche photodiode amplifier, as well as threshold circuitry, and the memory subsystems. This device utilizes a single optical Boolean vector matrix multiplier and its VME based host controller interface in performing various higher level primitives. OptiComp Corporation wishes to acknowledge the financial support of the Office of Naval Research, the National Aeronautics and Space Administration, the Rome Air Development Center, and the Strategic Defense Initiative Office for the funding of this program under contracts N00014-87-C-0077, N00014-89-C-0266 and N00014-89-C- 0225.
Computer Applications and Virtual Environments (CAVE)
NASA Technical Reports Server (NTRS)
1993-01-01
Virtual Reality (VR) can provide cost-effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) facility used Head Mounted Displays (HMD) (pictured), spatial trackers, and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment, and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).
Computer Applications and Virtual Environments (CAVE)
NASA Technical Reports Server (NTRS)
1993-01-01
Virtual Reality (VR) can provide cost-effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) facility used Head Mounted Displays (HMD) (pictured), spatial trackers, and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup were to support operations development and design analysis for engine removal, the engine compartment, and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.
EON: a component-based approach to automation of protocol-directed therapy.
Musen, M A; Tu, S W; Das, A K; Shahar, Y
1996-01-01
Provision of automated support for planning protocol-directed therapy requires a computer program to take as input clinical data stored in an electronic patient-record system and to generate as output recommendations for therapeutic interventions and laboratory testing that are defined by applicable protocols. This paper presents a synthesis of research carried out at Stanford University to model the therapy-planning task and to demonstrate a component-based architecture for building protocol-based decision-support systems. We have constructed general-purpose software components that (1) interpret abstract protocol specifications to construct appropriate patient-specific treatment plans; (2) infer from time-stamped patient data higher-level, interval-based, abstract concepts; (3) perform time-oriented queries on a time-oriented patient database; and (4) allow acquisition and maintenance of protocol knowledge in a manner that facilitates efficient processing both by humans and by computers. We have implemented these components in a computer system known as EON. Each of the components has been developed, evaluated, and reported independently. We have evaluated the integration of the components as a composite architecture by implementing T-HELPER, a computer-based patient-record system that uses EON to offer advice regarding the management of patients who are following clinical trial protocols for AIDS or HIV infection. A test of the reuse of the software components in a different clinical domain demonstrated rapid development of a prototype application to support protocol-based care of patients who have breast cancer. PMID:8930854
High Resolution Wind Direction and Speed Information for Support of Fire Operations
B.W. Butler; J.M. Forthofer; M.A. Finney; L.S. Bradshaw; R. Stratton
2006-01-01
Computational Fluid Dynamics (CFD) technology has been used to model wind speed and direction in mountainous terrain at a relatively high resolution compared to other readily available technologies. The process, termed "gridded wind", is not a forecast, but rather represents a method for calculating the influence of terrain on general wind flows. Gridded wind simulations...
NASA Astrophysics Data System (ADS)
Zhao, Liang; Xu, Shun; Tu, Yu-Song; Zhou, Xin
2017-06-01
Project supported by the National Natural Science Foundation for Outstanding Young Scholars, China (Grant No. 11422542), the National Natural Science Foundation of China (Grant Nos. 11605151 and 11675138), and the Shanghai Supercomputer Center of China and Special Program for Applied Research on Super Computation of the NSFC-Guangdong Joint Fund (the second phase).
Relative motion of orbiting particles under the influence of perturbing forces. Volume 1: Summary
NASA Technical Reports Server (NTRS)
Eades, J. B., Jr.
1974-01-01
The relative motion of orbiting vehicles under the influence of various perturbing forces has been studied to determine what influence these inputs, and others, can have. The analytical tasks are described in general terms; the force types considered are outlined, modelled, and simulated; and the capabilities of the computer programs which have evolved in support of this work are noted.
Design of a Multi-Touch Tabletop for Simulation-Based Training
2014-06-01
receive, for example using point-and-click mouse-based computer interactions to specify the routes that vehicles take as part of a convoy... learning, coordination and support for planning. We first provide background in tabletop interaction in general and survey earlier efforts to use... tremendous progress over the past five years. Touch detection technologies now enable multiple users to interact simultaneously on large areas with
22. VIEW OF THE BASEMENT FLOOR PLAN. THE BASEMENT TUNNELS ...
22. VIEW OF THE BASEMENT FLOOR PLAN. THE BASEMENT TUNNELS WERE DESIGNED AS FALLOUT SHELTERS AND USED FOR STORAGE. THE ORIGINAL DRAWING HAS BEEN ARCHIVED ON MICROFILM. THE DRAWING WAS REPRODUCED AT THE BEST QUALITY POSSIBLE. LETTERS AND NUMBERS IN THE CIRCLES INDICATE FOOTER AND/OR COLUMN LOCATIONS. - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
Aeronautical engineering: A continuing bibliography with indexes (supplement 267)
NASA Technical Reports Server (NTRS)
1991-01-01
This bibliography lists 661 reports, articles, and other documents introduced into the NASA scientific and technical information system in June, 1991. Subject coverage includes design, construction and testing of aircraft and aircraft engines; aircraft components, equipment and systems; ground support systems; theoretical and applied aspects of aerodynamics and general fluid dynamics; electrical engineering; aircraft control; remote sensing; computer sciences; nuclear physics; and social sciences.
ERIC Educational Resources Information Center
Vermont Inst. for Self-Reliance, Rutland.
This guide provides a description of Responsive Text (RT), a method for presenting job-relevant information within a computer-based support system. A summary of what RT is and why it is important is provided first. The first section of the guide provides a brief overview of what research tells about the reading process and how the general design…
An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center
NASA Astrophysics Data System (ADS)
Gleason, J. L.; Little, M. M.
2013-12-01
NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and it potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS aims to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate remains to be evaluated by integrating it with real-world operational needs across NASA, along with the associated maturity that would come with that. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications have been demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.
Discovering chemistry with an ab initio nanoreactor
Wang, Lee-Ping; Titov, Alexey; McGibbon, Robert; ...
2014-11-02
Chemical understanding is driven by the experimental discovery of new compounds and reactivity, and is supported by theory and computation that provide detailed physical insight. While theoretical and computational studies have generally focused on specific processes or mechanistic hypotheses, recent methodological and computational advances herald their coming principal role in discovery. Here we report the development and application of the ab initio nanoreactor, a highly accelerated, first-principles molecular dynamics simulation of chemical reactions that discovers new molecules and mechanisms without preordained reaction coordinates or elementary steps. Using the nanoreactor we show new pathways for glycine synthesis from primitive compounds proposed to exist on the early Earth, providing new insight into the classic Urey-Miller experiment. Ultimately, these results highlight the emergence of theoretical and computational chemistry as a tool for discovery in addition to its traditional role of interpreting experimental findings.
Partitioning in Avionics Architectures: Requirements, Mechanisms, and Assurance
NASA Technical Reports Server (NTRS)
Rushby, John
1999-01-01
Automated aircraft control has traditionally been divided into distinct "functions" that are implemented separately (e.g., autopilot, autothrottle, flight management); each function has its own fault-tolerant computer system, and dependencies among different functions are generally limited to the exchange of sensor and control data. A by-product of this "federated" architecture is that faults are strongly contained within the computer system of the function where they occur and cannot readily propagate to affect the operation of other functions. More modern avionics architectures contemplate supporting multiple functions on a single, shared, fault-tolerant computer system where natural fault containment boundaries are less sharply defined. Partitioning uses appropriate hardware and software mechanisms to restore strong fault containment to such integrated architectures. This report examines the requirements for partitioning, mechanisms for their realization, and issues in providing assurance for partitioning. Because partitioning shares some concerns with computer security, security models are reviewed and compared with the concerns of partitioning.
Bures, Vladimír; Otcenásková, Tereza; Cech, Pavel; Antos, Karel
2012-11-01
Biological incidents jeopardising public health require decision-making dominated by one feature: complexity. Public health decision-makers therefore require appropriate support. Based on an analogy with business intelligence (BI) principles, a contextual analysis of the environment and available data resources, and conceptual modelling within systems and knowledge engineering, this paper proposes a general framework for computer-based decision support in the case of a biological incident. At the outset, an analysis of potential inputs to the framework is conducted, considering resources such as demographic information, strategic documents, environmental characteristics, agent descriptors and surveillance systems. Three prototypes, selected on the basis of the overall framework scheme, were then developed, tested and evaluated by a group of experts: an ontology prototype linked with an inference engine, a multi-agent model focusing on the simulation of an environment, and an expert-system prototype. All prototypes proved to be usable support tools for decision-making in the field of public health. Nevertheless, the research revealed further issues and challenges that might be investigated by both public health researchers and practitioners.
NASA Technical Reports Server (NTRS)
Muraca, R. J.; Stephens, M. V.; Dagenhart, J. R.
1975-01-01
A general analysis capable of predicting performance characteristics of cross-wind axis turbines was developed, including the effects of airfoil geometry, support struts, blade aspect ratio, windmill solidity, blade interference and curved flow. The results were compared with available wind tunnel results for a catenary blade shape. A theoretical performance curve for an aerodynamically efficient straight blade configuration was also presented. In addition, a linearized analytical solution applicable for straight configurations was developed. A listing of the computer program developed for numerical solutions of the general performance equations is included in the appendix.
Computer assisted audit techniques for UNIX (UNIX-CAATS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polk, W.T.
1991-12-31
Federal and DOE regulations impose specific requirements for internal controls of computer systems. These controls include adequate separation of duties and sufficient controls for access of system and data. The DOE Inspector General's Office has the responsibility to examine internal controls, as well as efficient use of computer system resources. As a result, DOE supported NIST development of computer assisted audit techniques to examine BSD UNIX computers (UNIX-CAATS). These systems were selected due to the increasing number of UNIX workstations in use within DOE. This paper describes the design and development of these techniques, as well as the results of testing at NIST and the first audit at a DOE site. UNIX-CAATS consists of tools which examine security of passwords, file systems, and network access. In addition, a tool was developed to examine efficiency of disk utilization. Test results at NIST indicated inadequate password management, as well as weak network resource controls. File system security was considered adequate. Audit results at a DOE site indicated weak password management and inefficient disk utilization. During the audit, we also found improvements to UNIX-CAATS were needed when applied to large systems. NIST plans to enhance the techniques developed for DOE/IG in future work. This future work would leverage currently available tools, along with needed enhancements. These enhancements would enable DOE/IG to audit large systems, such as supercomputers.
Developing eThread pipeline using SAGA-pilot abstraction for large-scale structural bioinformatics.
Ragothaman, Anjani; Boddu, Sairam Chowdary; Kim, Nayong; Feinstein, Wei; Brylinski, Michal; Jha, Shantenu; Kim, Joohyun
2014-01-01
While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because the predicted structural information can uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes is large, typically many thousands, prohibiting their application as a genome-wide structural systems biology tool. To leverage their utility, we have developed a pipeline for eThread--a meta-threading protein structure modeling tool--that can use computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly for small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure.
Multidisciplinary analysis of actively controlled large flexible spacecraft
NASA Technical Reports Server (NTRS)
Cooper, Paul A.; Young, John W.; Sutter, Thomas R.
1986-01-01
The Control of Flexible Structures (COFS) program has supported the development of an analysis capability at the Langley Research Center called the Integrated Multidisciplinary Analysis Tool (IMAT) which provides an efficient data storage and transfer capability among commercial computer codes to aid in the dynamic analysis of actively controlled structures. IMAT is a system of computer programs which transfers Computer-Aided-Design (CAD) configurations, structural finite element models, material property and stress information, structural and rigid-body dynamic model information, and linear system matrices for control law formulation among various commercial applications programs through a common database. Although general in its formulation, IMAT was developed specifically to aid in the evaluation of such structures. A description of the IMAT system and results of an application of the system are given.
NASA Technical Reports Server (NTRS)
Flanders, J. H.; Helmers, C. T.; Stanten, S. F.
1973-01-01
The relationship is examined between the space shuttle onboard avionics and the ground test computer language GOAL when used in the onboard computers. The study is aimed at providing system analysis support to the feasibility analysis of a GOAL to HAL translator, where HAL is the language used to program the onboard computers for flight. The subject is dealt with in three aspects. First, the system configuration at checkout, the general checkout and launch sequences, and the inventory of subsystems are described. Secondly, the hierarchic organization of onboard software and different ways of introducing GOAL-derived software onboard are described. Also the flow of commands and test data during checkout is diagrammed. Finally, possible impact of error detection and redundancy management on the GOAL language is discussed.
NASA Technical Reports Server (NTRS)
1994-01-01
In the mid-1980s, Kinetic Systems and Langley Research Center determined that high-speed CAMAC (Computer Automated Measurement and Control) data acquisition systems could significantly improve Langley's ARTS (Advanced Real Time Simulation) system. The ARTS system supports flight simulation R&D, and the CAMAC equipment allowed 32 high performance simulators to be controlled by centrally located host computers. This technology broadened Kinetic Systems' capabilities and led to several commercial applications. One of them is General Atomics' fusion research program. Kinetic Systems equipment allows tokamak data to be acquired four to 15 times more rapidly. Ford Motor Company uses the same technology to control and monitor transmission testing facilities.
Interface Provides Standard-Bus Communication
NASA Technical Reports Server (NTRS)
Culliton, William G.
1995-01-01
Microprocessor-controlled interface (IEEE-488/LVABI) incorporates service-request and direct-memory-access features. Is circuit card enabling digital communication between system called "laser auto-covariance buffer interface" (LVABI) and compatible personal computer via general-purpose interface bus (GPIB) conforming to Institute of Electrical and Electronics Engineers (IEEE) Standard 488. Interface serves as second interface enabling first interface to exploit advantages of GPIB, via utility software written specifically for GPIB. Advantages include compatibility with multitasking and support of communication among multiple computers. Basic concept also applied in designing interfaces for circuits other than LVABI for unidirectional or bidirectional handling of parallel data up to 16 bits wide.
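For readers unfamiliar with GPIB programming, a minimal host-side sketch using the PyVISA library is shown below. The resource address and instrument commands are hypothetical placeholders, not the LVABI protocol described in the brief.

```python
# Minimal sketch of host-side IEEE-488 (GPIB) communication using PyVISA.
# The address string and commands are hypothetical examples only.
import pyvisa

rm = pyvisa.ResourceManager()                # discover available VISA resources
inst = rm.open_resource("GPIB0::4::INSTR")   # board 0, primary address 4 (assumed)

print(inst.query("*IDN?"))                   # standard identification query
inst.write("INIT")                           # start an acquisition (hypothetical command)
data = inst.query_ascii_values("FETC?")      # fetch results as a list of floats
inst.close()
```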
ERIC Educational Resources Information Center
Celik, Vehbi; Yesilyurt, Etem
2013-01-01
There is a large body of research regarding computer supported education, perceptions of computer self-efficacy, computer anxiety and the technological attitudes of teachers and teacher candidates. However, no study has been conducted on the correlation between and effect of computer supported education, perceived computer self-efficacy, computer…
Development of hybrid computer plasma models for different pressure regimes
NASA Astrophysics Data System (ADS)
Hromadka, Jakub; Ibehej, Tomas; Hrach, Rudolf
2016-09-01
With the increased performance of contemporary computers over the last decades, numerical simulation has become a very powerful tool, applicable also in plasma physics research. Plasma is generally an ensemble of mutually interacting particles out of thermodynamic equilibrium, and for this reason fluid computer models of plasma give results of only limited accuracy. On the other hand, much more precise particle models are often limited to 2D problems because of their huge demands on computer resources. Our contribution is devoted to hybrid modelling techniques, particularly their so-called iterative version, which combine the advantages of both approaches mentioned above. The study focuses on the mutual relations between fluid and particle models, demonstrated on calculations of the sheath structure of low-temperature argon plasma near a cylindrical Langmuir probe at medium and higher pressures. Results of a simple iterative hybrid plasma computer model are also given. The authors acknowledge the support of the Grant Agency of Charles University in Prague (project 220215).
Space-Bounded Church-Turing Thesis and Computational Tractability of Closed Systems
NASA Astrophysics Data System (ADS)
Braverman, Mark; Schneider, Jonathan; Rojas, Cristóbal
2015-08-01
We report a new limitation on the ability of physical systems to perform computation—one that is based on generalizing the notion of memory, or storage space, available to the system to perform the computation. Roughly, we define memory as the maximal amount of information that the evolving system can carry from one instant to the next. We show that memory is a limiting factor in computation even in the absence of any time limitations on the evolving system—such as when considering its equilibrium regime. We call this limitation the space-bounded Church-Turing thesis (SBCT). The SBCT is supported by a simulation assertion (SA), which states that predicting the long-term behavior of bounded-memory systems is computationally tractable. In particular, one corollary of SA is an explicit bound on the computational hardness of the long-term behavior of a discrete-time finite-dimensional dynamical system that is affected by noise. We prove such a bound explicitly.
Center of Excellence for Geospatial Information Science research plan 2013-18
Usery, E. Lynn
2013-01-01
The U.S. Geological Survey Center of Excellence for Geospatial Information Science (CEGIS) was created in 2006 and since that time has provided research primarily in support of The National Map. The presentations and publications of the CEGIS researchers document the research accomplishments that include advances in electronic topographic map design, generalization, data integration, map projections, sea level rise modeling, geospatial semantics, ontology, user-centered design, volunteer geographic information, and parallel and grid computing for geospatial data from The National Map. A research plan spanning 2013–18 has been developed extending the accomplishments of the CEGIS researchers and documenting new research areas that are anticipated to support The National Map of the future. In addition to extending the 2006–12 research areas, the CEGIS research plan for 2013–18 includes new research areas in data models, geospatial semantics, high-performance computing, volunteered geographic information, crowdsourcing, social media, data integration, and multiscale representations to support the Three-Dimensional Elevation Program (3DEP) and The National Map of the future of the U.S. Geological Survey.
Clifford support vector machines for classification, regression, and recurrence.
Bayro-Corrochano, Eduardo Jose; Arana-Daniel, Nancy
2010-11-01
This paper introduces the Clifford support vector machines (CSVM) as a generalization of the real and complex-valued support vector machines using the Clifford geometric algebra. In this framework, we handle the design of kernels involving the Clifford or geometric product. In this approach, one redefines the optimization variables as multivectors. This allows us to have a multivector as output. Therefore, we can represent multiple classes according to the dimension of the geometric algebra in which we work. We show that one can apply CSVM for classification and regression and also to build a recurrent CSVM. The CSVM is an attractive approach for the multiple input multiple output processing of high-dimensional geometric entities. We carried out comparisons between CSVM and the current approaches to solve multiclass classification and regression. We also study the performance of the recurrent CSVM with experiments involving time series. The authors believe that this paper can be of great use for researchers and practitioners interested in multiclass hypercomplex computing, particularly for applications in complex and quaternion signal and image processing, satellite control, neurocomputation, pattern recognition, computer vision, augmented virtual reality, robotics, and humanoids.
Fuzzy logic, neural networks, and soft computing
NASA Technical Reports Server (NTRS)
Zadeh, Lotfi A.
1994-01-01
The past few years have witnessed a rapid growth of interest in a cluster of modes of modeling and computation which may be described collectively as soft computing. The distinguishing characteristic of soft computing is that its primary aims are to achieve tractability, robustness, low cost, and high MIQ (machine intelligence quotient) through an exploitation of the tolerance for imprecision and uncertainty. Thus, in soft computing what is usually sought is an approximate solution to a precisely formulated problem or, more typically, an approximate solution to an imprecisely formulated problem. A simple case in point is the problem of parking a car. Generally, humans can park a car rather easily because the final position of the car is not specified exactly. If it were specified to within, say, a few millimeters and a fraction of a degree, it would take hours or days of maneuvering and precise measurements of distance and angular position to solve the problem. What this simple example points to is the fact that, in general, high precision carries a high cost. The challenge, then, is to exploit the tolerance for imprecision by devising methods of computation which lead to an acceptable solution at low cost. By its nature, soft computing is much closer to human reasoning than the traditional modes of computation. At this juncture, the major components of soft computing are fuzzy logic (FL), neural network theory (NN), and probabilistic reasoning techniques (PR), including genetic algorithms, chaos theory, and part of learning theory. Increasingly, these techniques are used in combination to achieve significant improvement in performance and adaptability. Among the important application areas for soft computing are control systems, expert systems, data compression techniques, image processing, and decision support systems. It may be argued that it is soft computing, rather than the traditional hard computing, that should be viewed as the foundation for artificial intelligence. In the years ahead, this may well become a widely held position.
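The parking example can be made concrete with a toy fuzzy rule. The sketch below, with invented membership functions and a single rule, shows the two characteristic ingredients: graded membership instead of crisp thresholds, and rules that map imprecise inputs to an imprecise output.

```python
# Toy fuzzy-logic sketch: one rule for steering while parking.
# The membership function and the rule are illustrative inventions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def steering_correction(offset_m):
    """Fuzzy rule: IF offset is 'slightly left' THEN steer 'slightly right'."""
    slightly_left = tri(offset_m, -1.0, -0.4, 0.0)   # membership degree in [0, 1]
    # Rule strength scales a singleton output of +10 degrees (zero-order
    # Sugeno style); a real controller aggregates several rules.
    return slightly_left * 10.0

for offset in (-0.8, -0.4, -0.1, 0.2):
    print(f"offset {offset:+.1f} m -> steer {steering_correction(offset):+.1f} deg")
```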
Pareto Joint Inversion of Love and Quasi Rayleigh's waves - synthetic study
NASA Astrophysics Data System (ADS)
Bogacz, Adrian; Dalton, David; Danek, Tomasz; Miernik, Katarzyna; Slawinski, Michael A.
2017-04-01
In this contribution, a specific application of Pareto joint inversion to a geophysical problem is presented. The Pareto criterion, combined with particle swarm optimization, is used to solve geophysical inverse problems for Love and quasi-Rayleigh waves. The basic theory of the forward-problem calculation for the chosen surface waves is described. To avoid computational problems, some simplifications were made; this allowed faster and more straightforward calculation without loss of generality of the solution. Under the restrictions of the solving scheme, the considered model must have exactly two layers: an elastic isotropic surface layer and an elastic isotropic half-space of infinite thickness. The aim of the inversion is to obtain the elastic parameters and model geometry from dispersion data. Different cases were considered in the calculations, such as different numbers of modes for different wave types and different frequencies. The solutions use the OpenMP standard for parallel computing, which helps reduce computation times. The results of experimental computations are presented and commented on. This research was performed in the context of The Geomechanics Project supported by Husky Energy. Also, this research was partially supported by the Natural Sciences and Engineering Research Council of Canada, grant 238416-2013, and by the Polish National Science Center under contract No. DEC-2013/11/B/ST10/0472.
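The abstract gives no implementation details; a minimal inertia-weight particle swarm optimizer of the kind typically used for such inversions is sketched below on a generic misfit function. All constants are textbook defaults, not the study's settings.

```python
# Minimal inertia-weight PSO sketch (not the authors' code); the objective,
# bounds, and hyperparameters are generic placeholders.
import numpy as np

def pso(objective, bounds, n_particles=30, n_iters=200, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))   # positions
    v = np.zeros_like(x)                                   # velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()                     # global best
    for _ in range(n_iters):
        r1, r2 = rng.random((2, *x.shape))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Example: recover two "elastic" parameters from a synthetic misfit.
target = np.array([3.2, 1.8])
misfit = lambda m: np.sum((m - target) ** 2)
best, best_f = pso(misfit, (np.zeros(2), np.full(2, 5.0)))
print(best, best_f)
```

A Pareto variant would track the non-dominated set for the Love and quasi-Rayleigh misfits jointly rather than minimizing a single scalar objective.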
Exemplary Social Studies Teachers Use of Computer-Supported Instruction in the Classroom
ERIC Educational Resources Information Center
Acikalin, Mehmet
2010-01-01
Educators increasingly support the use of computer-supported instruction in social studies education. However few studies have been conducted to study teacher use of computer-supported instruction in social studies education. This study was therefore designed to examine the use of exemplary social studies teachers' computer-supported instruction…
NASA Astrophysics Data System (ADS)
Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek
2009-09-01
High Performance Computing (HPC) hardware solutions such as grid computing and General Processing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming common place and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: 1. critical information can be provided faster and 2. more elaborate automated processing can be performed prior to providing the critical information. In our particular case, we test the use of the PANTEX index which is based on analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs and (2) a CUDA enabled GPU workstation. The reference platform is a dual CPU-quad core workstation and the PANTEX workflow total computing time is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring various hardware solutions and the related software coding effort is presented.
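The GLCM statistics at the core of the PANTEX index are easy to reproduce at small scale. The sketch below uses scikit-image's co-occurrence functions; the window size, quantization, and min-over-angles aggregation are illustrative of the idea, not the operational parameters of the workflow described above.

```python
# Sketch of a GLCM-based built-up texture measure on one image window.
# Uses scikit-image; parameters are illustrative of the PANTEX idea only.
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # "greycomatrix" in older releases

def pantex_like(window, levels=32):
    """Min over orientations of GLCM contrast -> rotation-invariant texture."""
    q = (window / window.max() * (levels - 1)).astype(np.uint8)  # quantize
    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
    glcm = graycomatrix(q, distances=[1], angles=angles,
                        levels=levels, symmetric=True, normed=True)
    return graycoprops(glcm, "contrast").min()

rng = np.random.default_rng(1)
img = rng.integers(0, 255, size=(64, 64)).astype(float)
w = 9  # a single window is shown; the real workflow slides it over the scene
print(pantex_like(img[:w, :w]))
```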
1988-03-31
… radar operation and data-collection activities; a large data-analysis effort has been under way in support of automatic wind-shear detection algorithm development. [Table-of-contents fragments from the report: Data Reduction and Algorithm Development: General-Purpose Software; Concurrent Computer Systems; Sun Workstations; Radar Data Analysis (Algorithm Verification; Other Studies; Translations; Outside Distributions); Mesonet/LLWAS Data Analysis.]
24. VIEW OF THE SECOND FLOOR PLAN. ENRICHED URANIUM AND ...
24. VIEW OF THE SECOND FLOOR PLAN. ENRICHED URANIUM AND STAINLESS STEEL WEAPONS COMPONENT PRODUCTION-RELATED ACTIVITIES OCCURRED PRIMARILY ON THE SECOND FLOOR. THE ORIGINAL DRAWING HAS BEEN ARCHIVED ON MICROFILM. THE DRAWING WAS REPRODUCED AT THE BEST QUALITY POSSIBLE. LETTERS AND NUMBERS IN THE CIRCLES INDICATE FOOTER AND/OR COLUMN LOCATIONS. - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
25. VIEW OF THE MACHINE TOOL LAYOUT IN ROOMS 244 ...
25. VIEW OF THE MACHINE TOOL LAYOUT IN ROOMS 244 AND 296. MACHINES WERE USED FOR STAINLESS STEEL FABRICATION (THE J-LINE). THE ORIGINAL DRAWING HAS BEEN ARCHIVED ON MICROFILM. THE DRAWING WAS REPRODUCED AT THE BEST QUALITY POSSIBLE. LETTERS AND NUMBERS IN THE CIRCLES INDICATE FOOTER AND/OR COLUMN LOCATIONS. - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
Snowflake: A Lightweight Portable Stencil DSL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Nathan; Driscoll, Michael; Markley, Charles
Stencil computations are not well optimized by general-purpose production compilers and the increased use of multicore, manycore, and accelerator-based systems makes the optimization problem even more challenging. In this paper we present Snowflake, a Domain Specific Language (DSL) for stencils that uses a 'micro-compiler' approach, i.e., small, focused, domain-specific code generators. The approach is similar to that used in image processing stencils, but Snowflake handles the much more complex stencils that arise in scientific computing, including complex boundary conditions, higher-order operators (larger stencils), higher dimensions, variable coefficients, non-unit-stride iteration spaces, and multiple input or output meshes. Snowflake is embedded in the Python language, allowing it to interoperate with popular scientific tools like SciPy and iPython; it also takes advantage of built-in Python libraries for powerful dependence analysis as part of a just-in-time compiler. We demonstrate the power of the Snowflake language and the micro-compiler approach with a complex scientific benchmark, HPGMG, that exercises the generality of stencil support in Snowflake. By generating OpenMP comparable to, and OpenCL within a factor of 2x of hand-optimized HPGMG, Snowflake demonstrates that a micro-compiler can support diverse processor architectures and is performance-competitive whilst preserving a high-level Python implementation.
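To make "stencil computation" concrete: the simplest kernel a DSL like Snowflake targets is a weighted nearest-neighbor update such as the 2-D 5-point Jacobi sweep below (plain NumPy; Snowflake's actual syntax is not reproduced here).

```python
# A 5-point Jacobi relaxation sweep: the archetypal stencil such a DSL
# targets. Real stencils add boundary conditions, variable coefficients,
# and non-unit strides, which is where code generation pays off.
import numpy as np

def jacobi_step(u, f, h):
    """One Jacobi sweep for -Laplace(u) = f on interior grid points."""
    out = u.copy()
    out[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                              u[1:-1, :-2] + u[1:-1, 2:] +
                              h * h * f[1:-1, 1:-1])
    return out

n, h = 64, 1.0 / 63
u = np.zeros((n, n))
f = np.ones((n, n))
for _ in range(100):
    u = jacobi_step(u, f, h)
print(u[n // 2, n // 2])
```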
Density-Dependent Quantized Least Squares Support Vector Machine for Large Data Sets.
Nan, Shengyu; Sun, Lei; Chen, Badong; Lin, Zhiping; Toh, Kar-Ann
2017-01-01
Based on the knowledge that input data distribution is important for learning, a data density-dependent quantization scheme (DQS) is proposed for sparse input data representation. The usefulness of the representation scheme is demonstrated by using it as a data preprocessing unit attached to the well-known least squares support vector machine (LS-SVM) for application on big data sets. Essentially, the proposed DQS adopts a single shrinkage threshold to obtain a simple quantization scheme, which adapts its outputs to input data density. With this quantization scheme, a large data set is quantized to a small subset where considerable sample size reduction is generally obtained. In particular, the sample size reduction can save significant computational cost when using the quantized subset for feature approximation via the Nyström method. Based on the quantized subset, the approximated features are incorporated into LS-SVM to develop a data density-dependent quantized LS-SVM (DQLS-SVM), where an analytic solution is obtained in the primal solution space. The developed DQLS-SVM is evaluated on synthetic and benchmark data with particular emphasis on large data sets. Extensive experimental results show that the learning machine incorporating DQS attains not only high computational efficiency but also good generalization performance.
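The computational saving comes from the Nyström step: with an m-point subset (m much smaller than N), kernel features can be approximated explicitly and the model solved in the primal. A generic sketch follows, with random subsampling standing in for the paper's density-dependent quantization rule.

```python
# Nystrom feature approximation + primal least-squares fit, sketching the
# computational core of a quantized LS-SVM. Random subsampling stands in
# for the paper's density-dependent quantization scheme (DQS).
import numpy as np

def rbf(A, B, gamma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
N, m, gamma, lam = 2000, 50, 0.5, 1e-3
X = rng.normal(size=(N, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=N)

Z = X[rng.choice(N, m, replace=False)]        # landmark subset (DQS in the paper)
Kmm = rbf(Z, Z, gamma)
vals, vecs = np.linalg.eigh(Kmm + 1e-8 * np.eye(m))
M = vecs / np.sqrt(vals)                      # gives K_mm^{-1/2} action
Phi = rbf(X, Z, gamma) @ M                    # Nystrom features, N x m

w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ y)  # primal solve
print("train MSE:", np.mean((Phi @ w - y) ** 2))
```

The primal solve costs O(N m^2) instead of the O(N^3) of a full kernel solution, which is the efficiency gain the abstract refers to.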
Computational Science in Armenia (Invited Talk)
NASA Astrophysics Data System (ADS)
Marandjian, H.; Shoukourian, Yu.
This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general-form recursive equations, methods of coding theory, pattern recognition and image processing) constitute the theoretical basis for developing problem-solving-oriented environments. Examples include a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, and a grid-aware web interface with advanced service trading for linear algebra calculations. In the direction of solving scientific problems that require high-performance computing resources, examples of completed projects include physics (parallel computing of complex quantum systems), astrophysics (Armenian virtual laboratory), biology (molecular dynamics study of the human red blood cell membrane), and meteorology (implementing and evaluating the Weather Research and Forecasting model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure uniting the computing clusters of scientific and educational institutions of the country and providing the scientific community with access to local and international computational resources, which is strong support for computational science in Armenia.
Spanish students' use of the Internet for mental health information and support seeking.
Montagni, Ilaria; Parizot, Isabelle; Horgan, Aine; Gonzalez-Caballero, Juan-Luis; Almenara-Barrios, José; Lagares-Franco, Carolina; Peralta-Sáez, Juan-Luis; Chauvin, Pierre; Amaddeo, Francesco
2016-06-01
The Internet is a growing source of information for health in general, with university students being online almost daily. Evaluating their use of the Internet for mental health information and support can help in understanding whether online tools and websites should be used for mental health promotion and, to some extent, care. A survey was conducted with more than 600 students of Law, Nursing and Computer Science at the University of Cadiz in Spain to determine their general use of the Internet and their perceptions of, and trust in, using this medium for mental health information. Data were collected using a 25-item questionnaire, and findings indicated that students had a strong distrust of online mental health information, notwithstanding their daily use of the Internet. The frequency and methods of their searches on the Internet correlated with their health status, their medical consultations and certain socio-demographic characteristics. © The Author(s) 2014.
A support vector machine based control application to the experimental three-tank system.
Iplikci, Serdar
2010-07-01
This paper presents a support vector machine (SVM) approach to generalized predictive control (GPC) of multiple-input multiple-output (MIMO) nonlinear systems. The higher generalization potential of SVM algorithms, together with their avoidance of getting stuck in local minima, motivated us to employ them for modeling MIMO systems. Based on the SVM model, detailed and compact formulations for calculating predictions and gradient information, which are used in the computation of the optimal control action, are given in the paper. The proposed MIMO SVM-based GPC method has been verified on an experimental three-tank liquid-level control system. Experimental results have shown that the proposed method can handle the control task successfully for different reference trajectories. Moreover, a detailed discussion on data gathering, model selection and the effects of the control parameters is given in this paper. 2010 ISA. Published by Elsevier Ltd. All rights reserved.
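The central quantities in such a formulation are the support-vector model's prediction and its gradient with respect to the control input. A one-dimensional sketch with an RBF model and invented coefficients (not the paper's exact formulation) illustrates the mechanics:

```python
# Gradient-based control-move search on an RBF support-vector model:
# the basic mechanics behind SVM-based predictive control. Coefficients
# are fabricated for illustration; a real model comes from SVR training.
import numpy as np

sv = np.array([0.0, 0.5, 1.0])        # support vectors (past I/O features)
alpha = np.array([0.8, -0.3, 0.5])    # dual coefficients
b, gamma = 0.1, 2.0

def predict(u):
    k = np.exp(-gamma * (sv - u) ** 2)
    return alpha @ k + b

def dpredict_du(u):                   # analytic gradient of the RBF model
    k = np.exp(-gamma * (sv - u) ** 2)
    return alpha @ (2.0 * gamma * (sv - u) * k)

# One-step horizon: minimize (reference - predicted output)^2 in u.
ref, u, lr = 0.9, 0.0, 0.1
for _ in range(50):
    err = predict(u) - ref
    u -= lr * 2.0 * err * dpredict_du(u)
print(f"control u = {u:.3f}, predicted y = {predict(u):.3f}")
```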
Parallelization and checkpointing of GPU applications through program transformation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solano-Quinde, Lizandro Damian
2012-01-01
GPUs have emerged as a powerful tool for accelerating general-purpose applications. The availability of programming languages that make writing general-purpose applications for GPUs tractable has consolidated GPUs as an alternative for accelerating general-purpose applications. Among the areas that have benefited from GPU acceleration are signal and image processing, computational fluid dynamics, quantum chemistry, and, in general, the high-performance computing (HPC) industry. In order to continue to exploit higher levels of parallelism with GPUs, multi-GPU systems are gaining popularity. In this context, single-GPU applications are parallelized to run on multi-GPU systems. Furthermore, multi-GPU systems help to overcome the GPU memory limitation for applications with large memory footprints. Parallelizing single-GPU applications has been approached with libraries that distribute the workload at runtime; however, they impose execution overhead and are not portable. On the other hand, on traditional CPU systems, parallelization has been approached through application transformation at pre-compile time, which enhances the application to distribute the workload at the application level and does not have the issues of library-based approaches. Hence, a parallelization scheme for GPU systems based on application transformation is needed. Like any computing engine of today, reliability is also a concern in GPUs. GPUs are vulnerable to transient and permanent failures, and current checkpoint/restart techniques are not suitable for systems with GPUs. Checkpointing for GPU systems presents new and interesting challenges, primarily due to the natural differences imposed by the hardware design, the memory subsystem architecture, the massive number of threads, and the limited amount of synchronization among threads. Therefore, a checkpoint/restart technique suitable for GPU systems is needed. The goal of this work is to exploit higher levels of parallelism and to develop support for application-level fault tolerance in applications using multiple GPUs. Our techniques reduce the burden of enhancing single-GPU applications to support these features. To achieve our goal, this work designs and implements a framework for enhancing a single-GPU OpenCL application through application transformation.
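As a drastically simplified picture of the workload distribution such a transformation framework automates: a single-device OpenCL computation can be spread over multiple devices by splitting the index space and buffers, as in the PyOpenCL sketch below (assumes at least one available OpenCL device; error handling and the dissertation's actual transformations are omitted).

```python
# Splitting one OpenCL workload across available devices: the essence of
# single-GPU -> multi-GPU parallelization. Heavily simplified sketch.
import numpy as np
import pyopencl as cl

SRC = """
__kernel void scale(__global const float *x, __global float *y) {
    int i = get_global_id(0);
    y[i] = 2.0f * x[i];
}
"""

devices = cl.get_platforms()[0].get_devices()
x = np.arange(1_000_000, dtype=np.float32)
y = np.empty_like(x)
bounds = np.linspace(0, x.size, len(devices) + 1, dtype=int)

# One context/queue per device; each device handles one contiguous slice.
# Shown sequentially for clarity; a real runtime would enqueue work on all
# devices first and synchronize afterwards.
for dev, lo, hi in zip(devices, bounds[:-1], bounds[1:]):
    ctx = cl.Context([dev])
    queue = cl.CommandQueue(ctx)
    prg = cl.Program(ctx, SRC).build()
    mf = cl.mem_flags
    xg = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x[lo:hi])
    yg = cl.Buffer(ctx, mf.WRITE_ONLY, x[lo:hi].nbytes)
    prg.scale(queue, (int(hi - lo),), None, xg, yg)
    cl.enqueue_copy(queue, y[lo:hi], yg)   # copy back into the host slice

print(y[:4])
```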
An assessment of General Aviation utilization of advanced avionics technology
NASA Technical Reports Server (NTRS)
Quinby, G. F.
1980-01-01
Needs of the general aviation industry for services and facilities which might be supplied by NASA were examined. In the data collection phase, twenty-one individuals from nine manufacturing companies in general aviation were interviewed against a carefully prepared meeting format. General aviation avionics manufacturers were credited with a high degree of technology transfer from the forcing industries such as television, automotive, and computers and a demonstrated ability to apply advanced technology such as large scale integration and microprocessors to avionics functions in an innovative and cost effective manner. The industry's traditional resistance to any unnecessary regimentation or standardization was confirmed. Industry's self sufficiency in applying advanced technology to avionics product development was amply demonstrated. NASA research capability could be supportive in areas of basic mechanics of turbulence in weather and alternative means for its sensing.
Opportunities and Needs for Mobile-Computing Technology to Support U.S. Geological Survey Fieldwork
Wood, Nathan J.; Halsing, David L.
2006-01-01
To assess the opportunities and needs for mobile-computing technology at the U.S. Geological Survey (USGS), we conducted an internal, Internet-based survey of bureau scientists whose research includes fieldwork. In summer 2005, 144 survey participants answered 65 questions about fieldwork activities and conditions, technology to support field research, and postfieldwork data processing and analysis. Results suggest that some types of mobile-computing technology are already commonplace, such as digital cameras and Global Positioning System (GPS) receivers, whereas others are not, such as personal digital assistants (PDAs) and tablet-based personal computers (tablet PCs). The potential for PDA use in the USGS is high: 97 percent of respondents record field observations (primarily environmental conditions and water-quality data), and 87 percent take field samples (primarily water-quality data, water samples, and sediment/soil samples). The potential for tablet PC use in the USGS is also high: 59 percent of respondents map environmental features in the field, primarily by sketching in field notebooks, on aerial photographs, or on topographic-map sheets. Results also suggest that efficient mobile-computing-technology solutions could benefit many USGS scientists because most respondents spend at least 1 week per year in the field, conduct field sessions that are least 1 week in duration, have field crews of one to three people, and typically travel on foot about 1 mi from their field vehicles. By allowing researchers to enter data directly into digital databases while in the field, mobile-computing technology could also minimize postfieldwork data processing: 93 percent of respondents enter collected field data into their office computers, and more than 50 percent spend at least 1 week per year on postfieldwork data processing. Reducing postfieldwork data processing could free up additional time for researchers and result in cost savings for the bureau. Generally, respondents support greater use of mobile-computing technology at the USGS and are interested in training opportunities and further discussions related to data archiving, access to additional digital data types, and technology development.
Bottrighi, Alessio; Terenziani, Paolo
2016-09-01
Several different computer-assisted management systems for computer-interpretable guidelines (CIGs) have been developed by the Artificial Intelligence in Medicine community. Each CIG system is characterized by a specific formalism to represent CIGs, and usually provides a manager to acquire, consult and execute them. Though there are several commonalities between most formalisms in the literature, each formalism has its own peculiarities. The goal of our work is to provide flexible support for the extension or definition of CIG formalisms, and of their acquisition and execution engines. Instead of defining "yet another CIG formalism and its manager", we propose META-GLARE (META Guideline Acquisition, Representation, and Execution), a "meta"-system for defining new CIG systems. We try to capture the commonalities among current CIG approaches by providing (i) a general manager for the acquisition, consultation and execution of hierarchical graphs (representing the control flow of actions in CIGs), parameterized over the types of nodes and arcs constituting them, and (ii) a library of different elementary components of guideline nodes (actions) and arcs, in which each type definition specifies how objects of that type can be acquired, consulted and executed. We provide generality and flexibility by allowing free aggregations of such elementary components to define new primitive node and arc types. We have carried out several experiments in which we used META-GLARE to build a CIG system (Experiment 1 in Section 8) or to extend one (Experiments 2 and 3). These experiments show that META-GLARE provides useful and easy-to-use support for such tasks. For instance, re-building the Guideline Acquisition, Representation, and Execution (GLARE) system using META-GLARE required less than one day (Experiment 1). META-GLARE is a meta-system for CIGs supporting fast prototyping. Since META-GLARE provides acquisition and execution engines that are parametric over the specific CIG formalism, it supports easy update and construction of CIG systems. Copyright © 2016 Elsevier B.V. All rights reserved.
New Computational Approach to Electron Transport in Irregular Graphene Nanostructures
NASA Astrophysics Data System (ADS)
Mason, Douglas; Heller, Eric; Prendergast, David; Neaton, Jeffrey
2009-03-01
For novel graphene devices of nanoscale-to-macroscopic scale, many aspects of their transport properties are not easily understood due to difficulties in fabricating devices with regular edges. Here we develop a framework to efficiently calculate and potentially screen electronic transport properties of arbitrary nanoscale graphene device structures. A generalization of the established recursive Green's function method is presented, providing access to arbitrary device and lead geometries with substantial computer-time savings. Using single-orbital nearest-neighbor tight-binding models and the Green's function-Landauer scattering formalism, we will explore the transmission function of irregular two-dimensional graphene-based nanostructures with arbitrary lead orientation. Prepared by LBNL under contract DE-AC02-05CH11231 and supported by the U.S. Dept. of Energy Computer Science Graduate Fellowship under grant DE-FG02-97ER25308.
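For reference, in a standard formulation (the abstract does not spell out its exact conventions) the recursion builds the device Green's function one principal layer at a time, and transmission follows from the Landauer formula:

```latex
\begin{align}
  G^{(1)} &= \big[(E + i\eta)I - H_1 - \Sigma_L\big]^{-1}, \\
  G^{(n)} &= \big[(E + i\eta)I - H_n - V_{n,n-1}\, G^{(n-1)}\, V_{n-1,n}\big]^{-1},
  \quad n = 2,\dots,N, \\
  T(E) &= \mathrm{Tr}\big[\Gamma_L\, G^{r}\, \Gamma_R\, (G^{r})^{\dagger}\big],
  \qquad \Gamma_{L/R} = i\big(\Sigma_{L/R} - \Sigma_{L/R}^{\dagger}\big),
\end{align}
```

where the \(H_n\) are principal-layer Hamiltonian blocks, the \(V\) blocks couple adjacent layers, and \(\Sigma_{L/R}\) are the lead self-energies.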
47 CFR 32.2124 - General purpose computers.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 2 2010-10-01 2010-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...
47 CFR 32.2124 - General purpose computers.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 2 2011-10-01 2011-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...
47 CFR 32.2124 - General purpose computers.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 47 Telecommunication 2 2014-10-01 2014-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...
47 CFR 32.2124 - General purpose computers.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 2 2013-10-01 2013-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...
47 CFR 32.2124 - General purpose computers.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 2 2012-10-01 2012-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...
NASA Astrophysics Data System (ADS)
Li, Tao
2018-06-01
The complexity of the aluminum electrolysis process makes the temperature of aluminum reduction cells hard to measure directly. However, temperature is central to the control of aluminum production. To solve this problem, and drawing on practice data from an aluminum plant, this paper presents a soft-sensing model of temperature for the aluminum electrolysis process based on Improved Twin Support Vector Regression (ITSVR). ITSVR avoids the slow learning speed of Support Vector Regression (SVR) and the over-fitting risk of Twin Support Vector Regression (TSVR) by introducing a regularization term into the objective function of TSVR, which ensures the structural risk minimization principle at lower computational complexity. Finally, the model, with some other process parameters as auxiliary variables, predicts the temperature by ITSVR. The simulation results show that the ITSVR-based soft-sensing model is fast and generalizes well.
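The regularization idea can be stated schematically. For one of the two non-parallel regressors \(f_1(x) = w_1^{\top}x + b_1\) of a twin SVR, adding a Tikhonov term to the empirical objective yields (our notation; the paper's exact formulation is not given in the abstract):

```latex
\min_{w_1,\, b_1,\, \xi}\;
  \frac{\lambda}{2}\big(\lVert w_1\rVert^2 + b_1^2\big)
  + \frac{1}{2}\,\big\lVert y - (A w_1 + e\, b_1)\big\rVert^2
  + C_1\, e^{\top}\xi
\quad \text{s.t.}\quad
  y - (A w_1 + e\, b_1) \geq -\epsilon_1 e - \xi,\;\; \xi \geq 0,
```

where \(A\) stacks the training inputs and \(e\) is the all-ones vector. The \(\lambda\)-term is the structural-risk addition; without it the plain TSVR objective contains only the empirical terms, which is the over-fitting risk the paper addresses.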
Extreme Scale Computing to Secure the Nation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D L; McGraw, J R; Johnson, J R
2009-11-10
Since the dawn of modern electronic computing in the mid-1940s, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S.; in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program under the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today.
In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise of a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence in its safety and reliability, without reliance upon calibration with past or future test data, is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding, together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article discusses not only the need for a future computing capability at the exascale for the SBSS program, but also considers high-performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies, as well as a rapid forensic capability for determining a nuclear weapon's design from post-detonation evidence (nuclear counterterrorism).
Likelihood ratio decisions in memory: three implied regularities.
Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T
2009-06-01
We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.
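The first regularity is easy to verify numerically: in the equal-variance Gaussian model, the likelihood-ratio criterion lambda = 1 sits at d'/2, so strengthening memory simultaneously raises hits and lowers false alarms. A minimal sketch assuming that standard model:

```python
# Mirror effect from a likelihood-ratio criterion (equal-variance Gaussian
# model): as strength d' grows, hits rise AND false alarms fall, because
# the lambda = 1 criterion sits at d'/2, where the two densities cross.
from statistics import NormalDist

new_dist = NormalDist()                # new items ~ N(0, 1)

for d in (0.5, 1.0, 2.0):              # weak -> strong memory conditions
    c = d / 2                          # point where N(d,1) and N(0,1) densities are equal
    hit = 1 - NormalDist(d, 1).cdf(c)  # P(respond "old" | old item)
    fa = 1 - new_dist.cdf(c)           # P(respond "old" | new item)
    print(f"d'={d:.1f}: hit={hit:.3f}, false alarm={fa:.3f}")
```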
The Aging Navigational System.
Lester, Adam W; Moffat, Scott D; Wiener, Jan M; Barnes, Carol A; Wolbers, Thomas
2017-08-30
The discovery of neuronal systems dedicated to computing spatial information, composed of functionally distinct cell types such as place and grid cells, combined with an extensive body of human-based behavioral and neuroimaging research has provided us with a detailed understanding of the brain's navigation circuit. In this review, we discuss emerging evidence from rodents, non-human primates, and humans that demonstrates how cognitive aging affects the navigational computations supported by these systems. Critically, we show 1) that navigational deficits cannot solely be explained by general deficits in learning and memory, 2) that there is no uniform decline across different navigational computations, and 3) that navigational deficits might be sensitive markers for impending pathological decline. Following an introduction to the mechanisms underlying spatial navigation and how they relate to general processes of learning and memory, the review discusses how aging affects the perception and integration of spatial information, the creation and storage of memory traces for spatial information, and the use of spatial information during navigational behavior. The closing section highlights the clinical potential of behavioral and neural markers of spatial navigation, with a particular emphasis on neurodegenerative disorders. Copyright © 2017 Elsevier Inc. All rights reserved.
Support for e-Health Services Among Elderly Primary Care Patients
Pirogowicz, Iwona
2014-01-01
Background: E-health has a substantial potential to improve access to as well as support efficient and effective care for the elderly. Questions remain and must be addressed, however, regarding the challenges faced by the elderly in the use of this technology. The aim of the study was to assess the attitude (needs) and preferences of the elderly in a selected region of Poland regarding selected e-health services and the factors associated with them. Materials and Methods: The study was conducted among 286 patients over 60 years of age being served by general practitioners in southwest Poland's Lower Silesia Province. The assessment pertaining to e-health was based on a specially designed questionnaire. Results: Nearly one-third of the study respondents had a computer at home, and 61% of these (19% of all surveyed elderly) used it. Twenty-two percent of respondents used the Internet, at least occasionally, whereas 62% used mobile phones. Approximately 41% (n=116) of the elderly had a favorable attitude toward e-health services (labeled here as supporters) and were willing to use it if/when offered an opportunity to do so. A substantial majority (84%) of supporters expressed a desire to receive simple medical recommendations via mobile phone or a computer, although significant majorities (61% and 60%, respectively) would like to receive the results of tests by e-mail or short message service reminders for scheduled visits or prescribed medications. Slightly less than half (47%) of e-health supporters would request appointments online. Among the more important factors associated with support of e-health services were urban residence, higher education, and normal cognitive function, as well as having a computer, Internet access, or a mobile phone. Conclusions: The majority of elderly patients in this Polish community are not overly enthusiastic about using information and communications technology tools in their healthcare. Nevertheless, a substantial percentage (41%) among this group support selected e-health services. PMID:24359252
Continuing challenges for computer-based neuropsychological tests.
Letz, Richard
2003-08-01
A number of issues critical to the development of computer-based neuropsychological testing systems that remain continuing challenges to their widespread use in occupational and environmental health are reviewed. Several computer-based neuropsychological testing systems have been developed over the last 20 years, and they have contributed substantially to the study of neurologic effects of a number of environmental exposures. However, many are no longer supported and do not run on contemporary personal computer operating systems. Issues that are continuing challenges for development of computer-based neuropsychological tests in environmental and occupational health are discussed: (1) some current technological trends that generally make test development more difficult; (2) lack of availability of usable speech recognition of the type required for computer-based testing systems; (3) implementing computer-based procedures and tasks that are improvements over, not just adaptations of, their manually-administered predecessors; (4) implementing tests of a wider range of memory functions than the limited range now available; (5) paying more attention to motivational influences that affect the reliability and validity of computer-based measurements; and (6) increasing the usability of and audience for computer-based systems. Partial solutions to some of these challenges are offered. The challenges posed by current technological trends are substantial and generally beyond the control of testing system developers. Widespread acceptance of the "tablet PC" and implementation of accurate small vocabulary, discrete, speaker-independent speech recognition would enable revolutionary improvements to computer-based testing systems, particularly for testing memory functions not covered in existing systems. Dynamic, adaptive procedures, particularly ones based on item-response theory (IRT) and computerized-adaptive testing (CAT) methods, will be implemented in new tests that will be more efficient, reliable, and valid than existing test procedures. These additional developments, along with implementation of innovative reporting formats, are necessary for more widespread acceptance of the testing systems.
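The IRT/CAT machinery referred to is compact enough to sketch: under a two-parameter logistic (2PL) model, an adaptive test repeatedly administers the item with maximal Fisher information at the current ability estimate. The item bank and the ability update below are invented for illustration and do not describe any particular testing system.

```python
# Computerized-adaptive testing sketch: 2PL item response model with
# maximum-information item selection. Item bank values are invented.
import math

items = [  # (discrimination a, difficulty b)
    (1.8, -1.0), (1.2, -0.3), (2.0, 0.0), (1.5, 0.7), (1.1, 1.4),
]

def p_correct(theta, a, b):
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def information(theta, a, b):
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)      # Fisher information of a 2PL item

theta = 0.0                           # current ability estimate
used = set()
for step in range(3):
    k = max((i for i in range(len(items)) if i not in used),
            key=lambda i: information(theta, *items[i]))
    used.add(k)
    correct = True                    # stand-in for the examinee's response
    # Crude ability update; a real CAT would use ML or EAP estimation.
    theta += 0.5 if correct else -0.5
    print(f"step {step}: item {k}, theta -> {theta:+.2f}")
```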
Spacecraft attitude determination accuracy from mission experience
NASA Technical Reports Server (NTRS)
Brasoveanu, D.; Hashmall, J.; Baker, D.
1994-01-01
This document presents a compilation of the attitude accuracy attained by a number of satellites that have been supported by the Flight Dynamics Facility (FDF) at Goddard Space Flight Center (GSFC). It starts with a general description of the factors that influence spacecraft attitude accuracy. After brief descriptions of the missions supported, it presents the attitude accuracy results for currently active and older missions, including both three-axis stabilized and spin-stabilized spacecraft. The attitude accuracy results are grouped by the sensor pair used to determine the attitudes. A supplementary section is also included, containing the results of theoretical computations of the effects of variation of sensor accuracy on overall attitude accuracy.
NASA Astrophysics Data System (ADS)
Paladini, D.; Mello, A. B.
2016-07-01
Inmetro's data on the conformity of certified products, processes, and services are usually held in fragmented databases that are difficult to access, in part because of the lack of computational solutions offering such access to users. A discussion of technological solutions to support supervisory activities by the appropriate regulatory bodies, and to provide information access to society in general, is presented here, along with a theoretical review of the pros and cons of these technologies, leading to the conclusion that a mobile platform appears to be the best tool for Inmetro's requirements.
XaNSoNS: GPU-accelerated simulator of diffraction patterns of nanoparticles
NASA Astrophysics Data System (ADS)
Neverov, V. S.
XaNSoNS is open-source software with GPU support, which simulates X-ray and neutron 1D (or 2D) diffraction patterns and pair-distribution functions (PDF) for amorphous or crystalline nanoparticles (up to ∼10^7 atoms) of heterogeneous structural content. Among the multiple parameters of the structure, the user may specify atomic displacements, site occupancies, molecular displacements, and molecular rotations. The software uses general equations nonspecific to crystalline structures to calculate the scattering intensity. It supports four major standards of parallel computing: MPI, OpenMP, Nvidia CUDA, and OpenCL, enabling it to run on various architectures, from CPU-based HPCs to consumer-level GPUs.
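For X-ray powder patterns, "general equations nonspecific to crystalline structures" typically means a Debye-equation form. A minimal numpy sketch of that equation follows; this is not XaNSoNS code, and the atom positions and scattering factors below are toy assumptions:

```python
import numpy as np

def debye_intensity(positions, f, q):
    """Orientation-averaged scattering intensity via the Debye equation:
    I(q) = sum_ij f_i f_j sin(q r_ij) / (q r_ij), with i == j terms -> f_i^2."""
    rij = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    I = np.empty_like(q)
    for k, qk in enumerate(q):
        x = qk * rij
        # Safe sinc: sin(x)/x elementwise, with the x -> 0 limit equal to 1.
        sinc = np.where(x > 0, np.sin(x) / np.where(x > 0, x, 1.0), 1.0)
        I[k] = np.sum(np.outer(f, f) * sinc)
    return I

# Toy "nanoparticle": 4 atoms on a tetrahedron, unit scattering factors.
pos = np.array([[0, 0, 0], [1, 1, 0], [1, 0, 1], [0, 1, 1]], dtype=float) * 1.5
q = np.linspace(0.1, 10.0, 200)   # scattering-vector magnitudes
I = debye_intensity(pos, np.ones(len(pos)), q)
```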
Campion, Thomas R.; Waitman, Lemuel R.; May, Addison K.; Ozdas, Asli; Lorenzi, Nancy M.; Gadd, Cynthia S.
2009-01-01
Introduction: Evaluations of computerized clinical decision support systems (CDSS) typically focus on clinical performance changes and do not include social, organizational, and contextual characteristics explaining use and effectiveness. Studies of CDSS for intensive insulin therapy (IIT) are no exception, and the literature lacks an understanding of effective computer-based IIT implementation and operation. Results: This paper presents (1) a literature review of computer-based IIT evaluations through the lens of institutional theory, a discipline from sociology and organization studies, to demonstrate the inconsistent reporting of workflow and care process execution and (2) a single-site case study to illustrate how computer-based IIT requires substantial organizational change and creates additional complexity with unintended consequences including error. Discussion: Computer-based IIT requires organizational commitment and attention to site-specific technology, workflow, and care processes to achieve intensive insulin therapy goals. The complex interaction between clinicians, blood glucose testing devices, and CDSS may contribute to workflow inefficiency and error. Evaluations rarely focus on the perspective of nurses, the primary users of computer-based IIT whose knowledge can potentially lead to process and care improvements. Conclusion: This paper addresses a gap in the literature concerning the social, organizational, and contextual characteristics of CDSS in general and for intensive insulin therapy specifically. Additionally, this paper identifies areas for future research to define optimal computer-based IIT process execution: the frequency and effect of manual data entry error of blood glucose values, the frequency and effect of nurse overrides of CDSS insulin dosing recommendations, and comprehensive ethnographic study of CDSS for IIT. PMID:19815452
NASA Technical Reports Server (NTRS)
Nakai, Junko; VanDerWijngaart, Rob F.
2003-01-01
Markets are often considered superior to other global scheduling mechanisms for distributed computing systems. This claim is supported by the casual observation from our everyday life that markets successfully equilibrate supply and demand, and by features of markets that originate in general equilibrium theory, e.g., efficiency and the lack of need for a central controller. This paper describes why such beliefs in markets are not warranted. It does so by examining the general equilibrium theory in terms of scope, abstraction, and interpretation. Not only does the general equilibrium theory fail to provide a satisfactory explanation of actual economies, including a computing-resource economy, it also falls short of supplying theoretical foundations for commonly held views of market desirability. This paper also points out that the argument for the desirability of markets involves circular reasoning and that the desirability can be established only vis-a-vis a scheduling goal. Finally, recasting the conclusion of Arrow's Impossibility Theorem as one for global scheduling, we conclude that there exists no market-based scheduler that is rational (in the sense defined in microeconomic theory), takes into account the utility of more than one user, and yet yields a Pareto-optimal outcome for arbitrary user utility functions.
uPy: a ubiquitous computer graphics Python API with Biological Modeling Applications
Autin, L.; Johnson, G.; Hake, J.; Olson, A.; Sanner, M.
2015-01-01
In this paper we describe uPy, an extension module for the Python programming language that provides a uniform abstraction of the APIs of several 3D computer graphics programs called hosts, including Blender, Maya, Cinema4D, and DejaVu. A plugin written with uPy is a single piece of code that will run in all uPy-supported hosts. We demonstrate the creation of complex plugins for molecular/cellular modeling and visualization and discuss how uPy can more generally simplify programming for many types of projects (not solely science applications) intended for multi-host distribution. uPy is available at http://upy.scripps.edu PMID:24806987
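The host-abstraction pattern described here can be sketched generically. The following is not the actual uPy API; the class and method names are hypothetical stand-ins illustrating how one plugin body can run under per-host adapters:

```python
# Hypothetical adapter-pattern sketch (not the real upy module): plugin code
# talks to one interface, and a small adapter per host translates calls to
# that host's native scene API.
class HostAdapter:
    def add_sphere(self, name, radius, center):
        raise NotImplementedError

class DejaVuAdapter(HostAdapter):
    def add_sphere(self, name, radius, center):
        print(f"[DejaVu] sphere {name} r={radius} at {center}")

class BlenderAdapter(HostAdapter):
    def add_sphere(self, name, radius, center):
        print(f"[Blender] sphere {name} r={radius} at {center}")

def plugin(host: HostAdapter):
    """A single piece of code: the same body runs under every adapter."""
    for i in range(3):
        host.add_sphere(f"atom{i}", 1.0, (i * 2.0, 0.0, 0.0))

plugin(DejaVuAdapter())
plugin(BlenderAdapter())
```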
Improving Unstructured Mesh Partitions for Multiple Criteria Using Mesh Adjacencies
Smith, Cameron W.; Rasquin, Michel; Ibanez, Dan; ...
2018-02-13
The scalability of unstructured mesh based applications depends on partitioning methods that quickly balance the computational work while reducing communication costs. Zhou et al. [SIAM J. Sci. Comput., 32 (2010), pp. 3201-3227; J. Supercomput., 59 (2012), pp. 1218-1228] demonstrated the combination of (hyper)graph methods with vertex and element partition improvement for PHASTA CFD scaling to hundreds of thousands of processes. Our work generalizes partition improvement to support balancing combinations of all the mesh entity dimensions (vertices, edges, faces, regions) in partitions with imbalances exceeding 70%. Improvement results are then presented for multiple entity dimensions on up to one million processes on meshes with over 12 billion tetrahedral elements.
Off-diagonal Jacobian support for Nodal BCs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, John W.; Andrs, David; Gaston, Derek R.
In this brief note, we describe the implementation of off-diagonal Jacobian computations for nodal boundary conditions in the Multiphysics Object Oriented Simulation Environment (MOOSE) [1] framework. There are presently a number of applications [2-5] based on the MOOSE framework that solve complicated physical systems of partial differential equations whose boundary conditions are often highly nonlinear. Accurately computing the on- and off-diagonal Jacobian and preconditioner entries associated to these constraints is crucial for enabling efficient numerical solvers in these applications. Two key ingredients are required for properly specifying the Jacobian contributions of nonlinear nodal boundary conditions in MOOSE and finite element codes in general: (1) the ability to zero out entire Jacobian matrix rows after ...
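A minimal dense-matrix illustration of the two ingredients, for a Dirichlet-type nodal constraint u_0 = g(u_1): zero the constrained row, then fill its on- and off-diagonal entries from the residual r_0 = u_0 - g(u_1). This is purely schematic, not MOOSE code:

```python
import numpy as np

def apply_nodal_bc(J, u, node, g, dg_du, coupled):
    """Overwrite row `node` of Jacobian J for the nodal constraint
    r_node = u[node] - g(u[coupled]):
      on-diagonal  dr/du_node    = 1
      off-diagonal dr/du_coupled = -g'(u[coupled])."""
    J[node, :] = 0.0                         # ingredient 1: zero the whole row
    J[node, node] = 1.0                      # ingredient 2a: on-diagonal entry
    J[node, coupled] = -dg_du(u[coupled])    # ingredient 2b: off-diagonal entry
    return J

# Toy 3-dof system with a nonlinear BC u_0 = u_1**2 on node 0.
u = np.array([0.2, 0.5, 1.0])
J = np.random.rand(3, 3)   # stand-in for the interior PDE Jacobian
J = apply_nodal_bc(J, u, node=0, g=lambda v: v**2, dg_du=lambda v: 2 * v, coupled=1)
print(J[0])                # -> [ 1.  -1.   0.]
```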
Data Understanding Applied to Optimization
NASA Technical Reports Server (NTRS)
Buntine, Wray; Shilman, Michael
1998-01-01
The goal of this research is to explore and develop software for supporting visualization and data analysis of search and optimization. Optimization is an ever-present problem in science. The theory of NP-completeness implies that the problems can only be resolved by increasingly smarter problem specific knowledge, possibly for use in some general purpose algorithms. Visualization and data analysis offers an opportunity to accelerate our understanding of key computational bottlenecks in optimization and to automatically tune aspects of the computation for specific problems. We will prototype systems to demonstrate how data understanding can be successfully applied to problems characteristic of NASA's key science optimization tasks, such as central tasks for parallel processing, spacecraft scheduling, and data transmission from a remote satellite.
Machine learning-based diagnosis of melanoma using macro images.
Gautam, Diwakar; Ahmed, Mushtaq; Meena, Yogesh Kumar; Ul Haq, Ahtesham
2018-05-01
Cancer poses a grave threat to human society. Melanoma, the skin cancer, originates in the skin layers and penetrates deep into the subcutaneous layers. There is extensive research on melanoma diagnosis using dermatoscopic images captured with a dermatoscope. As designing diagnostic models for general handheld imaging systems is an emerging trend, this article proposes a computer-aided decision support system for macro images captured by a general-purpose camera. General imaging conditions are adversely affected by nonuniform illumination, which in turn affects the extraction of relevant information. To mitigate this, we process an image to define a smooth illumination surface using a multistage illumination compensation approach, and the infected region is extracted using the proposed multimode segmentation method. The lesion information is quantified as a feature set comprising geometry, photometry, border series, and texture measures. Redundancy in the feature set is reduced using information-theoretic methods, and a classification boundary distinguishing benign from malignant samples is modeled using support vector machine, random forest, neural network, and fast discriminative mixed-membership-based naive Bayesian classifiers. The experimental outcome is further supported by hypothesis testing and boxplot representation of the classification losses. The simulation results demonstrate the significance of the proposed model, which shows improved performance compared with competing methods. Copyright © 2017 John Wiley & Sons, Ltd.
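A compressed sketch of the pipeline's stages — illumination flattening, segmentation, feature extraction, classification — using generic stand-ins (Gaussian-blur background division, mean thresholding, an SVM). The paper's multistage compensation and multimode segmentation are not reproduced, and the data below are synthetic:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.svm import SVC

def flatten_illumination(img, sigma=50):
    """Approximate the smooth illumination surface with a heavy Gaussian blur
    and divide it out (a stand-in for multistage illumination compensation)."""
    background = gaussian_filter(img, sigma) + 1e-6
    return img / background

def segment(img):
    """Crude lesion mask: darker-than-mean pixels (stand-in for multimode segmentation)."""
    return img < img.mean()

def features(img, mask):
    """Tiny geometry/photometry feature vector for one image."""
    area = mask.sum()
    mean_in = img[mask].mean() if area else 0.0
    return [area, mean_in, img.std()]

# Hypothetical training set: random images with random benign/malignant labels.
rng = np.random.default_rng(0)
imgs = [flatten_illumination(rng.random((64, 64))) for _ in range(20)]
X = [features(im, segment(im)) for im in imgs]
y = rng.integers(0, 2, 20)
clf = SVC(kernel="rbf").fit(X, y)
```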
Exploiting Temporal Constraints of Clinical Guidelines by Applying OpenEHR Archetypes.
Cintho, Lilian Mie Mukai; Garcia, Diego; da Silva Santos, Bruno Henrique; Sacchi, Lucia; Quaglini, Silvana; Moro, Claudia Maria Cabral
2017-01-01
Studies describing Computer-Interpretable Clinical Guidelines (CIGs) with temporal constraints (TCs) generally have not addressed issues related to their integration into Electronic Health Record (EHR) systems. This study aimed to represent TCs contained in clinical guidelines by applying archetypes and the Guideline Definition Language (GDL) to incorporate decision support into EHRs. An example of each TC class in the clinical guideline for management of Atrial Fibrillation was represented using archetypes and GDL.
NASA Technical Reports Server (NTRS)
Kemp, William B., Jr.
1990-01-01
Guidelines are presented for use of the computer program PANCOR to assess the interference due to tunnel walls and model support in a slotted wind tunnel test section at subsonic speeds. Input data requirements are described in detail and program output and general program usage are described. The program is written for effective automatic vectorization on a CDC CYBER 200 class vector processing system.
1989-05-01
from the computer and tells the operator whether the hole meets the established criteria. All the components have been incorporated into a mobile cart... DeHavilland: Dash-8 applications; General Dynamics: F-16 tail light panel, ATF Supportable Hybrid Structures Program, ATA applications; USAF AFWAL... "Compressive Fatigue of Flawed Graphite/Epoxy Composites," Doctoral Thesis, Massachusetts Institute of Technology, Department of Aeronautics and
A Generalized DBMS to Support Diversified Data.
1987-07-21
interest on bonds). Hence, they require a definition of subtraction which yields 30 days as the answer to the above computation. Only a user-defined... STON85]. Alternately, one can follow the standard scheduler model [BERNS1] in which a module is callable by code in the access methods when a... direction for evolution. These could include when to cease investigating alternate plans, and the ability to specify one's own optimizer parameters
Software engineering methodologies and tools
NASA Technical Reports Server (NTRS)
Wilcox, Lawrence M.
1993-01-01
Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.
NASA Astrophysics Data System (ADS)
Petković, Dalibor; Shamshirband, Shahaboddin; Saboohi, Hadi; Ang, Tan Fong; Anuar, Nor Badrul; Rahman, Zulkanain Abdul; Pavlović, Nenad T.
2014-07-01
The quantitative assessment of image quality is an important consideration in any type of imaging system. The modulation transfer function (MTF) is a graphical description of the sharpness and contrast of an imaging system or of its individual components; it is also known as the spatial frequency response. The MTF curve has different meanings according to the corresponding frequency. The MTF of an optical system specifies the contrast transmitted by the system as a function of image size, and is determined by the inherent optical properties of the system. In this study, the polynomial and radial basis function (RBF) kernels are applied in Support Vector Regression (SVR) to estimate and predict the MTF value of the actual optical system from experimental tests. Instead of minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound so as to achieve generalized performance. The experimental results show that improved predictive accuracy and capability of generalization can be achieved by the SVR_rbf approach compared to the SVR_poly soft-computing methodology.
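The kernel comparison described here is straightforward to sketch with scikit-learn. The synthetic MTF-like data and hyperparameters below are assumptions, not the study's measurements:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

# Toy stand-in for measured MTF data: spatial frequency -> contrast transfer.
rng = np.random.default_rng(1)
freq = rng.uniform(0, 1, (80, 1))
mtf = np.exp(-3 * freq[:, 0]) + rng.normal(0, 0.02, 80)   # synthetic MTF curve

# Compare the polynomial and RBF kernels by cross-validated R^2.
for kernel in ("poly", "rbf"):
    model = SVR(kernel=kernel, C=10.0, epsilon=0.01)
    score = cross_val_score(model, freq, mtf, cv=5, scoring="r2").mean()
    print(f"SVR_{kernel}: mean CV R^2 = {score:.3f}")
```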
Accuracy and Calibration of Computational Approaches for Inpatient Mortality Predictive Modeling.
Nakas, Christos T; Schütz, Narayan; Werners, Marcus; Leichtle, Alexander B
2016-01-01
Electronic Health Record (EHR) data can be a key resource for decision-making support in clinical practice in the "big data" era. The complete database from early 2012 to late 2015 involving hospital admissions to Inselspital Bern, the largest Swiss University Hospital, was used in this study, involving over 100,000 admissions. Age, sex, and initial laboratory test results were the features/variables of interest for each admission, the outcome being inpatient mortality. Computational decision support systems were utilized for the calculation of the risk of inpatient mortality. We assessed the recently proposed Acute Laboratory Risk of Mortality Score (ALaRMS) model, and further built generalized linear models, generalized estimating equations, artificial neural networks, and decision tree systems for the predictive modeling of the risk of inpatient mortality. The Area Under the ROC Curve (AUC) for ALaRMS marginally corresponded to the anticipated accuracy (AUC = 0.858). Penalized logistic regression methodology provided a better result (AUC = 0.872). Decision tree and neural network-based methodology provided even higher predictive performance (up to AUC = 0.912 and 0.906, respectively). Additionally, decision tree-based methods can efficiently handle Electronic Health Record (EHR) data that have a significant amount of missing records (in up to >50% of the studied features) eliminating the need for imputation in order to have complete data. In conclusion, we show that statistical learning methodology can provide superior predictive performance in comparison to existing methods and can also be production ready. Statistical modeling procedures provided unbiased, well-calibrated models that can be efficient decision support tools for predicting inpatient mortality and assigning preventive measures.
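As one concrete instance of the modeling compared above, penalized (L2) logistic regression evaluated by AUC can be sketched with scikit-learn. The synthetic admission features below are assumptions standing in for the non-public EHR data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical admission records: age, sex, initial lab results -> inpatient mortality.
rng = np.random.default_rng(2)
X = rng.normal(size=(5000, 10))                  # age, sex, and 8 lab features
logit = X @ rng.normal(size=10) - 2.0
y = rng.random(5000) < 1 / (1 + np.exp(-logit))  # synthetic mortality outcome

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```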
Generalized SMO algorithm for SVM-based multitask learning.
Cai, Feng; Cherkassky, Vladimir
2012-06-01
Exploiting additional information to improve traditional inductive learning is an active research area in machine learning. In many supervised-learning applications, training data can be naturally separated into several groups, and incorporating this group information into learning may improve generalization. Recently, Vapnik proposed a general approach to formalizing such problems, known as "learning with structured data" and its support vector machine (SVM) based optimization formulation called SVM+. Liang and Cherkassky showed the connection between SVM+ and multitask learning (MTL) approaches in machine learning, and proposed an SVM-based formulation for MTL called SVM+MTL for classification. Training the SVM+MTL classifier requires the solution of a large quadratic programming optimization problem which scales as O(n^3) with sample size n. So there is a need to develop computationally efficient algorithms for implementing SVM+MTL. This brief generalizes Platt's sequential minimal optimization (SMO) algorithm to the SVM+MTL setting. Empirical results show that, for typical SVM+MTL problems, the proposed generalized SMO achieves over 100 times speed-up, in comparison with general-purpose optimization routines.
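For background, Platt's SMO — the algorithm the brief generalizes — optimizes the SVM dual two Lagrange multipliers at a time. Below is a minimal "simplified SMO" sketch for a standard linear-kernel SVM (labels in {-1, +1}); the SVM+MTL extension itself is not reproduced:

```python
import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-3, passes=20):
    """Platt-style SMO for a standard SVM: repeatedly pick a pair (i, j) of
    multipliers, solve the 2-variable subproblem analytically, clip to the
    box constraints, and update the threshold b."""
    n = len(y)
    K = X @ X.T                              # linear kernel matrix
    alpha, b = np.zeros(n), 0.0
    f = lambda: (alpha * y) @ K + b          # decision values on training set
    for _ in range(passes):
        for i in range(n):
            Ei = f()[i] - y[i]
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = np.random.choice([k for k in range(n) if k != i])
                Ej = f()[j] - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                if y[i] != y[j]:
                    L, H = max(0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if L == H or eta >= 0:
                    continue
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                alpha[i] += y[i] * y[j] * (aj_old - alpha[j])
                # Standard threshold update keeps the KKT conditions consistent.
                b1 = b - Ei - y[i]*(alpha[i]-ai_old)*K[i, i] - y[j]*(alpha[j]-aj_old)*K[i, j]
                b2 = b - Ej - y[i]*(alpha[i]-ai_old)*K[i, j] - y[j]*(alpha[j]-aj_old)*K[j, j]
                b = b1 if 0 < alpha[i] < C else b2 if 0 < alpha[j] < C else (b1 + b2) / 2
    return alpha, b
```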
Sordo, Margarita; Boxwala, Aziz A; Ogunyemi, Omolola; Greenes, Robert A
2004-01-01
A major obstacle to sharing computable clinical knowledge is the lack of a common language for specifying expressions and criteria. Such a language could be used to specify decision criteria, formulae, and constraints on data and action. Although the Arden Syntax addresses this problem for clinical rules, its generalization to HL7's object-oriented data model is limited. The GELLO Expression Language is an object-oriented language used for expressing logical conditions and computations in the GLIF3 (GuideLine Interchange Format, v. 3) guideline modeling language. It has been further developed under the auspices of the HL7 Clinical Decision Support Technical Committee as a proposed HL7 standard. GELLO is based on the Object Constraint Language (OCL), because it is vendor-independent, object-oriented, and side-effect-free. GELLO expects an object-oriented data model. Although the choice of model is arbitrary, standardization is facilitated by ensuring that the data model is compatible with the HL7 Reference Information Model (RIM).
47 CFR 32.6124 - General purpose computers expense.
Code of Federal Regulations, 2013 CFR
2013-10-01
... is the physical operation of general purpose computers and the maintenance of operating systems. This... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6124... application systems and databases for general purpose computers. (See also § 32.6720, General and...
47 CFR 32.6124 - General purpose computers expense.
Code of Federal Regulations, 2014 CFR
2014-10-01
... is the physical operation of general purpose computers and the maintenance of operating systems. This... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6124... application systems and databases for general purpose computers. (See also § 32.6720, General and...
47 CFR 32.6124 - General purpose computers expense.
Code of Federal Regulations, 2011 CFR
2011-10-01
... is the physical operation of general purpose computers and the maintenance of operating systems. This... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6124... application systems and databases for general purpose computers. (See also § 32.6720, General and...
47 CFR 32.6124 - General purpose computers expense.
Code of Federal Regulations, 2012 CFR
2012-10-01
... is the physical operation of general purpose computers and the maintenance of operating systems. This... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6124... application systems and databases for general purpose computers. (See also § 32.6720, General and...
47 CFR 32.6124 - General purpose computers expense.
Code of Federal Regulations, 2010 CFR
2010-10-01
... is the physical operation of general purpose computers and the maintenance of operating systems. This... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6124... application systems and databases for general purpose computers. (See also § 32.6720, General and...
Computer-based learning: interleaving whole and sectional representation of neuroanatomy.
Pani, John R; Chariker, Julia H; Naaz, Farah
2013-01-01
The large volume of material to be learned in biomedical disciplines requires optimizing the efficiency of instruction. In prior work with computer-based instruction of neuroanatomy, it was relatively efficient for learners to master whole anatomy and then transfer to learning sectional anatomy. It may, however, be more efficient to continuously integrate learning of whole and sectional anatomy. A study of computer-based learning of neuroanatomy was conducted to compare a basic transfer paradigm for learning whole and sectional neuroanatomy with a method in which the two forms of representation were interleaved (alternated). For all experimental groups, interactive computer programs supported an approach to instruction called adaptive exploration. Each learning trial consisted of time-limited exploration of neuroanatomy, self-timed testing, and graphical feedback. The primary result of this study was that interleaved learning of whole and sectional neuroanatomy was more efficient than the basic transfer method, without cost to long-term retention or generalization of knowledge to recognizing new images (Visible Human and MRI). Copyright © 2012 American Association of Anatomists.
Use of agents to implement an integrated computing environment
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. Agents are used to implement the overall infrastructure on the computer. Successful agent utilization requires that they be made of three components: the resource, the model, and the wrap. Current work is focused on the development of generalized agent schemes and associated demonstration projects. When in place, the technology-independent computing infrastructure will aid the designer in systematically generating knowledge used to facilitate decision-making.
NASA Astrophysics Data System (ADS)
Neves, Rui Gomes; Teodoro, Vítor Duarte
2012-09-01
A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.
General aviation crash safety program at Langley Research Center
NASA Technical Reports Server (NTRS)
Thomson, R. G.
1976-01-01
The purpose of the crash safety program is to support development of the technology to define and demonstrate new structural concepts for improved crash safety and occupant survivability in general aviation aircraft. The program involves three basic areas of research: full-scale crash simulation testing, nonlinear structural analyses necessary to predict failure modes and collapse mechanisms of the vehicle, and evaluation of energy absorption concepts for specific component design. Both analytical and experimental methods are being used to develop expertise in these areas. Analyses include both simplified procedures for estimating energy absorption capabilities and more complex computer programs for analysis of general airframe response. Full-scale tests of typical structures as well as tests on structural components are being used to verify the analyses and to demonstrate improved design concepts.
Welch, M C; Kwan, P W; Sajeev, A S M
2014-10-01
Agent-based modelling has proven to be a promising approach for developing rich simulations of complex phenomena that provide decision support functions across a broad range of areas, including the biological, social, and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national-scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and the dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries. Copyright © 2014 International Atomic Energy Agency 2014. Published by Elsevier B.V. All rights reserved.
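The individual-based stochastic lifecycle/dispersal step can be shown in miniature on the CPU with numpy (the paper runs this on CUDA with GIS visualization; none of that is reproduced here, and all rates below are made-up placeholders rather than the paper's calibrated biology):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical per-step parameters (NOT the paper's calibrated rates).
SURVIVAL, EGGS_PER_FEMALE, DISPERSAL_KM = 0.8, 2.5, 5.0

# State: (x, y) position in km of each gravid female.
flies = rng.uniform(0, 100, size=(1000, 2))

for week in range(10):
    # 1. Stochastic survival through the current lifecycle stage.
    flies = flies[rng.random(len(flies)) < SURVIVAL]
    # 2. Stochastic reproduction: Poisson offspring per surviving female.
    counts = rng.poisson(EGGS_PER_FEMALE * 0.5, len(flies))  # ~half are female
    flies = np.repeat(flies, counts, axis=0)
    # 3. Dispersal: isotropic Gaussian flight of gravid females.
    flies = flies + rng.normal(0, DISPERSAL_KM, flies.shape)
    print(f"week {week}: {len(flies)} flies")
```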
The challenge of big data in public health: an opportunity for visual analytics.
Ola, Oluwakemi; Sedig, Kamran
2014-01-01
Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.
Modular thermal analyzer routine, volume 1
NASA Technical Reports Server (NTRS)
Oren, J. A.; Phillips, M. A.; Williams, D. R.
1972-01-01
The Modular Thermal Analyzer Routine (MOTAR) is a general thermal analysis routine with strong capabilities for performing thermal analysis of systems containing flowing fluids, fluid system controls (valves, heat exchangers, etc.), life support systems, and thermal radiation situations. Its modular organization permits the analysis of a very wide range of thermal problems, from simple ones containing a few conduction nodes to those requiring complicated flow and radiation analysis, with each problem type being analyzed with peak computational efficiency and maximum ease of use. The organization and programming methods applied to MOTAR achieve a high degree of computer utilization efficiency in terms of execution time and storage space required for a given problem. The computer time required to run a given problem on MOTAR is approximately 40 to 50 percent of that required by the currently existing, widely used routines. The computer storage requirement for MOTAR is approximately 25 percent more than the most commonly used routines for the simplest problems, but the data storage techniques for the more complicated options should save a considerable amount of space.
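The conduction-node core of such a routine can be shown in miniature: lumped nodes exchanging heat through conductances, marched explicitly in time. This is a generic sketch of a nodal thermal network, not MOTAR's actual formulation; all values are illustrative:

```python
import numpy as np

# Lumped thermal network: dT_i/dt = (1/C_i) * sum_j G_ij * (T_j - T_i)
G = np.array([[0.0, 2.0, 0.0],       # conductances between nodes, W/K
              [2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
C = np.array([10.0, 5.0, 20.0])      # nodal heat capacities, J/K
T = np.array([400.0, 300.0, 250.0])  # initial temperatures, K

dt = 0.1   # explicit step; stability needs dt < min_i C_i / sum_j G_ij
for step in range(1000):
    dT = (G @ T - G.sum(axis=1) * T) / C   # net heat flow into each node / C_i
    T = T + dt * dT
print(T)   # nodes relax toward the capacity-weighted mean temperature
```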
Hybrid quantum-classical hierarchy for mitigation of decoherence and determination of excited states
DOE Office of Scientific and Technical Information (OSTI.GOV)
McClean, Jarrod R.; Kimchi-Schwartz, Mollie E.; Carter, Jonathan
Using quantum devices supported by classical computational resources is a promising approach to quantum-enabled computation. One powerful example of such a hybrid quantum-classical approach optimized for classically intractable eigenvalue problems is the variational quantum eigensolver, built to utilize quantum resources for the solution of eigenvalue problems and optimizations with minimal coherence time requirements by leveraging classical computational resources. These algorithms have been placed as leaders among the candidates for the first to achieve supremacy over classical computation. Here, we provide evidence for the conjecture that variational approaches can automatically suppress even nonsystematic decoherence errors by introducing an exactly solvable channel model of variational state preparation. Moreover, we develop a more general hierarchy of measurement and classical computation that allows one to obtain increasingly accurate solutions by leveraging additional measurements and classical resources. In conclusion, we demonstrate numerically on a sample electronic system that this method both allows for the accurate determination of excited electronic states as well as reduces the impact of decoherence, without using any additional quantum coherence time or formal error-correction codes.
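For readers unfamiliar with the variational quantum eigensolver, a minimal classical emulation of the hybrid loop may help: a parametrized state is prepared, its energy is "measured", and a classical optimizer updates the parameter. The one-qubit Hamiltonian and ansatz below are toy assumptions; the paper's channel model and measurement hierarchy are not reproduced:

```python
import numpy as np
from scipy.optimize import minimize

# Toy one-qubit Hamiltonian H = Z + 0.5 X.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X

def ansatz(theta):
    """Parametrized trial state |psi(theta)> = Ry(theta)|0>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(params):
    """'Quantum' step: evaluate <psi|H|psi> (computed exactly here)."""
    psi = ansatz(params[0])
    return psi @ H @ psi

# Classical outer loop minimizes the measured energy over the circuit parameter.
res = minimize(energy, x0=[0.1], method="COBYLA")
print(res.fun, np.linalg.eigvalsh(H)[0])   # variational vs. exact ground energy
```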
Measuring the impact of computer resource quality on the software development process and product
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Valett, Jon; Hall, Dana
1985-01-01
The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.
Controlling Infrastructure Costs: Right-Sizing the Mission Control Facility
NASA Technical Reports Server (NTRS)
Martin, Keith; Sen-Roy, Michael; Heiman, Jennifer
2009-01-01
Johnson Space Center's Mission Control Center is a space vehicle, space program agnostic facility. The current operational design is essentially identical to the original facility architecture that was developed and deployed in the mid-90's. In an effort to streamline the support costs of the mission critical facility, the Mission Operations Division (MOD) of Johnson Space Center (JSC) has sponsored an exploratory project to evaluate and inject current state-of-the-practice Information Technology (IT) tools, processes and technology into legacy operations. The general push in the IT industry has been trending towards a data-centric computer infrastructure for the past several years. Organizations facing challenges with facility operations costs are turning to creative solutions combining hardware consolidation, virtualization and remote access to meet and exceed performance, security, and availability requirements. The Operations Technology Facility (OTF) organization at the Johnson Space Center has been chartered to build and evaluate a parallel Mission Control infrastructure, replacing the existing, thick-client distributed computing model and network architecture with a data center model utilizing virtualization to provide the MCC Infrastructure as a Service. The OTF will design a replacement architecture for the Mission Control Facility, leveraging hardware consolidation through the use of blade servers, increasing utilization rates for compute platforms through virtualization while expanding connectivity options through the deployment of secure remote access. The architecture demonstrates the maturity of the technologies generally available in industry today and the ability to successfully abstract the tightly coupled relationship between thick-client software and legacy hardware into a hardware agnostic "Infrastructure as a Service" capability that can scale to meet future requirements of new space programs and spacecraft. This paper discusses the benefits and difficulties that a migration to cloud-based computing philosophies has uncovered when compared to the legacy Mission Control Center architecture. The team consists of system and software engineers with extensive experience with the MCC infrastructure and software currently used to support the International Space Station (ISS) and Space Shuttle program (SSP).
The neural circuits for arithmetic principles.
Liu, Jie; Zhang, Han; Chen, Chuansheng; Chen, Hui; Cui, Jiaxin; Zhou, Xinlin
2017-02-15
Arithmetic principles are the regularities underlying arithmetic computation. Little is known about how the brain supports the processing of arithmetic principles. The current fMRI study examined neural activation and functional connectivity during the processing of verbalized arithmetic principles, as compared to numerical computation and general language processing. As expected, arithmetic principles elicited stronger activation in bilateral horizontal intraparietal sulcus and right supramarginal gyrus than did language processing, and stronger activation in left middle temporal lobe and left orbital part of inferior frontal gyrus than did computation. In contrast, computation elicited greater activation in bilateral horizontal intraparietal sulcus (extending to posterior superior parietal lobule) than did either arithmetic principles or language processing. Functional connectivity analysis with the psychophysiological interaction approach (PPI) showed that left temporal-parietal (MTG-HIPS) connectivity was stronger during the processing of arithmetic principle and language than during computation, whereas parietal-occipital connectivities were stronger during computation than during the processing of arithmetic principles and language. Additionally, the left fronto-parietal (orbital IFG-HIPS) connectivity was stronger during the processing of arithmetic principles than during computation. The results suggest that verbalized arithmetic principles engage a neural network that overlaps but is distinct from the networks for computation and language processing. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buelt, J.L.; Stottlemyre, J.A.; White, M.K.
1991-09-01
Because of the great complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, the DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process required for DOE operable units. DOE operable units are generally more complex in nature because of the existence of multiple waste sites within many of the operable units and the presence of mixed radioactive and hazardous chemical wastes. Consequently, Pacific Northwest Laboratory (PNL) is developing the Remedial Action Assessment System (RAAS), which is aimed at screening, linking, and evaluating established technology process options in support of conducting feasibility studies under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). It is also intended to do the same in support of corrective measures studies required by the Resource Conservation and Recovery Act (RCRA). This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses information on technologies in a graphical and tabular manner, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buelt, J.L.; Stottlemyre, J.A.; White, M.K.
1991-02-01
Because of the great complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, the DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process required for DOE operable units. DOE operable units are generally more complex in nature because of the existence of multiple waste sites within many of the operable units and the presence of mixed radioactive and hazardous chemical wastes. Consequently, Pacific Northwest Laboratory (PNL) is developing the Remedial Action Assessment System (RAAS), which is aimed at screening, linking, and evaluating established technology process options in support of conducting feasibility studies under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). It is also intended to do the same in support of corrective measures studies required by the Resource Conservation and Recovery Act (RCRA). This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses information on technologies in a graphical and tabular manner, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab.
[Intranet applications in radiology].
Knopp, M V; von Hippel, G M; Koch, T; Knopp, M A
2000-01-01
The aim of this paper is to present the conceptual basis and capabilities of intranet applications in radiology. The intranet, the local counterpart of the internet, can readily be realized using existing computer components and a network. All current computer operating systems support intranet applications, which allow hardware- and software-independent communication of text, images, video, and sound using browser software, without dedicated programs on the individual personal computers. Radiological applications include text communication (e.g., department-specific bulletin boards and access to examination protocols); image communication for viewing, limited processing, and documentation of radiological images on decentralized PCs; and speech communication for dictation, distribution of dictation, and speech recognition. The intranet helps to optimize organizational efficiency and cost effectiveness in the daily work of radiological departments in outpatient and hospital settings. The general interest in internet and intranet technology will guarantee its continuous development.
Laboratory and software applications for clinical trials: the global laboratory environment.
Briscoe, Chad
2011-11-01
The Applied Pharmaceutical Software Meeting is held annually. It is sponsored by The Boston Society, a not-for-profit organization that coordinates a series of meetings within the global pharmaceutical industry. The meeting generally focuses on laboratory applications, but in recent years has expanded to include some software applications for clinical trials. The 2011 meeting emphasized the global laboratory environment. Global clinical trials generate massive amounts of data in many locations that must be centralized and processed for efficient analysis. Thus, the meeting had a strong focus on establishing networks and systems for dealing with the computer infrastructure to support such environments. In addition to the globally installed laboratory information management system, electronic laboratory notebook and other traditional laboratory applications, cloud computing is quickly becoming the answer to provide efficient, inexpensive options for managing the large volumes of data and computing power, and thus it served as a central theme for the meeting.
Construction of Orthonormal Wavelets Using Symbolic Algebraic Methods
NASA Astrophysics Data System (ADS)
Černá, Dana; Finěk, Václav
2009-09-01
Our contribution is concerned with the solution of the systems of nonlinear algebraic equations arising from the computation of scaling coefficients of orthonormal wavelets with compact support, specifically Daubechies wavelets, symmlets, coiflets, and generalized coiflets. These wavelets are defined as solutions of equation systems which are partly linear and partly nonlinear. The idea of the presented methods consists in replacing the equations for scaling coefficients by equations for scaling moments. This enables us to eliminate some quadratic conditions in the original system and then simplify it. The simplified system is solved with the aid of the Gröbner basis method. The advantage of our approach is that in some cases it provides all possible solutions, and these solutions can be computed to arbitrary precision. For small systems, we are even able to find explicit solutions. The computation was carried out with the symbolic algebra software Maple.
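As a numerical check of the kind of system being solved — one linear condition plus quadratic orthonormality and moment conditions — the explicitly known Daubechies D4 scaling coefficients can be verified directly. This is a verification sketch, not the Gröbner-basis computation itself:

```python
import numpy as np

# Daubechies D4 scaling coefficients (explicit solution of the small system).
s3 = np.sqrt(3.0)
h = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))
k = np.arange(4)

# Linear condition: sum_k h_k = sqrt(2).
print(np.isclose(h.sum(), np.sqrt(2.0)))
# Quadratic orthonormality: sum_k h_k h_{k+2m} = delta_{m,0}.
print(np.isclose(np.dot(h, h), 1.0))           # m = 0
print(np.isclose(np.dot(h[:-2], h[2:]), 0.0))  # m = 1
# Two vanishing wavelet moments (the D4 conditions).
print(np.isclose(((-1.0)**k * h).sum(), 0.0))      # zeroth moment
print(np.isclose(((-1.0)**k * k * h).sum(), 0.0))  # first moment
```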
Huggins, Jane E.; Guger, Christoph; Ziat, Mounia; Zander, Thorsten O.; Taylor, Denise; Tangermann, Michael; Soria-Frisch, Aureli; Simeral, John; Scherer, Reinhold; Rupp, Rüdiger; Ruffini, Giulio; Robinson, Douglas K. R.; Ramsey, Nick F.; Nijholt, Anton; Müller-Putz, Gernot; McFarland, Dennis J.; Mattia, Donatella; Lance, Brent J.; Kindermans, Pieter-Jan; Iturrate, Iñaki; Herff, Christian; Gupta, Disha; Do, An H.; Collinger, Jennifer L.; Chavarriaga, Ricardo; Chase, Steven M.; Bleichner, Martin G.; Batista, Aaron; Anderson, Charles W.; Aarnoutse, Erik J.
2017-01-01
The Sixth International Brain–Computer Interface (BCI) Meeting was held 30 May–3 June 2016 at the Asilomar Conference Grounds, Pacific Grove, California, USA. The conference included 28 workshops covering topics in BCI and brain–machine interface research. Topics included BCI for specific populations or applications, advancing BCI research through use of specific signals or technological advances, and translational and commercial issues to bring both implanted and non-invasive BCIs to market. BCI research is growing and expanding in the breadth of its applications, the depth of knowledge it can produce, and the practical benefit it can provide both for those with physical impairments and the general public. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues and calls for action to support future research and development. PMID:29152523
Machine learning methods in chemoinformatics
Mitchell, John B O
2014-01-01
Machine learning algorithms are generally developed in computer science or adjacent disciplines and find their way into chemical modeling by a process of diffusion. Though particular machine learning methods are popular in chemoinformatics and quantitative structure–activity relationships (QSAR), many others exist in the technical literature. This discussion is methods-based and focused on some algorithms that chemoinformatics researchers frequently use. It makes no claim to be exhaustive. We concentrate on methods for supervised learning, predicting the unknown property values of a test set of instances, usually molecules, based on the known values for a training set. Particularly relevant approaches include Artificial Neural Networks, Random Forest, Support Vector Machine, k-Nearest Neighbors and naïve Bayes classifiers. WIREs Comput Mol Sci 2014, 4:468–481. doi:10.1002/wcms.1183 PMID:25285160
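The supervised-learning setup described here — known property values for a training set, predictions for a test set of molecules — reduces to a few lines with scikit-learn. The binary fingerprint matrix below is a synthetic assumption standing in for real molecular descriptors:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical training set: each molecule as a fixed-length descriptor vector
# (e.g., a binary fingerprint) with a known activity label.
rng = np.random.default_rng(4)
X_train = rng.integers(0, 2, size=(300, 128))   # 128-bit fingerprints (synthetic)
y_train = rng.integers(0, 2, size=300)          # active / inactive labels

model = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(model, X_train, y_train, cv=5).mean())
```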
NASA Astrophysics Data System (ADS)
Puligheddu, Marcello; Gygi, Francois; Galli, Giulia
The prediction of the thermal properties of solids and liquids is central to numerous problems in condensed matter physics and materials science, including the study of thermal management of opto-electronic and energy conversion devices. We present a method to compute the thermal conductivity of solids by performing ab initio molecular dynamics at non-equilibrium conditions. Our formulation is based on a generalization of the approach-to-equilibrium technique, using sinusoidal temperature gradients, and it only requires calculations of first-principles trajectories and atomic forces. We discuss results and computational requirements for a representative, simple oxide, MgO, and compare with experiments and data obtained with classical potentials. This work was supported by MICCoM as part of the Computational Materials Science Program funded by the U.S. Department of Energy (DOE), Office of Science, Basic Energy Sciences (BES), Materials Sciences and Engineering Division under Grant DOE/BES 5J-30.
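The idea behind a sinusoidal approach-to-equilibrium measurement can be summarized with standard heat-equation mode analysis (our notation, not necessarily the authors' derivation): impose a sinusoidal temperature profile, fit the exponential decay of its amplitude in the MD trajectory, and convert the decay time to a diffusivity and hence a conductivity:

```latex
T(x,0) = \bar{T} + \Delta T_0 \sin\!\left(\frac{2\pi x}{L}\right),
\qquad
\partial_t T = \alpha\,\partial_x^2 T
\;\Longrightarrow\;
\Delta T(t) = \Delta T_0\, e^{-t/\tau},
\qquad
\tau = \frac{1}{\alpha}\left(\frac{L}{2\pi}\right)^{2},
\qquad
\kappa = \alpha\,\rho\,c .
```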
Experience using radio frequency laptops to access the electronic medical record in exam rooms.
Dworkin, L. A.; Krall, M.; Chin, H.; Robertson, N.; Harris, J.; Hughes, J.
1999-01-01
Kaiser Permanente, Northwest, evaluated the use of laptop computers to access our existing comprehensive Electronic Medical Record in exam rooms via a wireless radiofrequency (RF) network. Eleven of 22 clinicians who were offered the laptops successfully adopted their use in the exam room. These clinicians were able to increase their exam room time with the patient by almost 4 minutes (25%), apparently without lengthening their overall work day. Patient response to exam room computing was overwhelmingly positive. The RF network response time was similar to the hardwired network. Problems cited by some laptop users and many of the eleven non-adopters included battery issues, different equipment layout and function, and inadequate training. IT support needs for the RF laptops were two to four times greater than for hardwired desktops. Addressing the reliability and training issues should increase clinician acceptance, making a successful general roll-out for exam room computing more likely. PMID:10566458
Käser, Tanja; Baschera, Gian-Marco; Kohn, Juliane; Kucian, Karin; Richtmann, Verena; Grond, Ursina; Gross, Markus; von Aster, Michael
2013-01-01
This article presents the design and a first pilot evaluation of the computer-based training program Calcularis for children with developmental dyscalculia (DD) or difficulties in learning mathematics. The program has been designed according to insights on the typical and atypical development of mathematical abilities. The learning process is supported through multimodal cues, which encode different properties of numbers. To offer optimal learning conditions, a user model completes the program and allows flexible adaptation to a child's individual learning and knowledge profile. Thirty-two children with difficulties in learning mathematics completed the 6- to 12-week computer training. The children played the game for 20 min per day, 5 days a week. The training effects were evaluated using neuropsychological tests. Generally, children benefited significantly from the training regarding number representation and arithmetic operations. Furthermore, children liked to play with the program and reported that the training improved their mathematical abilities. PMID:23935586
Modeling the fusion of cylindrical bioink particles in post bioprinting structure formation
NASA Astrophysics Data System (ADS)
McCune, Matt; Shafiee, Ashkan; Forgacs, Gabor; Kosztin, Ioan
2015-03-01
Cellular Particle Dynamics (CPD) is an effective computational method to describe the shape evolution and biomechanical relaxation processes in multicellular systems. Thus, CPD is a useful tool to predict the outcome of post-printing structure formation in bioprinting. The predictive power of CPD has been demonstrated for multicellular systems composed of spherical bioink units. Experiments and computer simulations were related through an independently developed theoretical formalism based on continuum mechanics. Here we generalize the CPD formalism to (i) include cylindrical bioink particles often used in specific bioprinting applications, (ii) describe the more realistic experimental situation in which both the length and the volume of the cylindrical bioink units decrease during post-printing structure formation, and (iii) directly connect CPD simulations to the corresponding experiments without the need of the intermediate continuum theory inherently based on simplifying assumptions. Work supported by NSF [PHY-0957914]. Computer time provided by the University of Missouri Bioinformatics Consortium.
SkyNet: Modular nuclear reaction network library
NASA Astrophysics Data System (ADS)
Lippuner, Jonas; Roberts, Luke F.
2017-10-01
The general-purpose nuclear reaction network SkyNet evolves the abundances of nuclear species under the influence of nuclear reactions. SkyNet can be used to compute the nucleosynthesis evolution in all astrophysical scenarios where nucleosynthesis occurs. Any list of isotopes can be evolved, and SkyNet supports various types of nuclear reactions. SkyNet is modular, permitting new or existing physics, such as nuclear reactions or equations of state, to be easily added or modified.
Computer Models of Underwater Acoustic Propagation.
1980-01-02
deterministic propagation loss result. Development of a model for the more general problem is required, as evidenced by the trends in future sonar designs... air. The water column itself is treated as an ideal fluid incapable of supporting shear stresses and having a uniform or, at most, piecewise constant... evaluated at any depth (z_s ≤ z ≤ z_N). The layer in which the source is located will be designated by LS and the receiver layer by LR. The depth dependent
An Efficient Means of Determining the Newtonian Potential for Highly Flattened Mass Distributions
NASA Astrophysics Data System (ADS)
Cohl, H.
1999-05-01
In this dissertation talk we present a mathematical result that, to the best of our knowledge, has been previously undiscovered. That is, the Green's function in a variety of orthogonal coordinate systems may be expressed in terms of a single sum over the azimuthal quantum number, m, of terms involving Toroidal Harmonics. We show how this new addition theorem can be effectively applied to a variety of potential problems in gravitation, electrostatics and magnetostatics and, in particular, demonstrate how it may be used to analyze the properties of general nonaxisymmetric disk systems with and without vertical extent. Finally, we describe our numerical implementation of the addition theorem in order to determine the Newtonian potential extremely close to highly flattened mass distributions. This yields an extremely efficient technique for computing the boundary values in a general algorithm that is designed to solve the 3D Poisson equation on a cylindrical coordinate lattice. We acknowledge support from the U.S. National Science Foundation through grant AST-9528424 and DGE-9355007, the latter of which has been issued through the NSF's Graduate Traineeships Program. This work also has been supported, in part, by grants of high-performance-computing time on NPACI facilities at SDSC and UT, Austin, and through the PET program of NAVOCEANO DoD Major Shared Resource Center in Stennis, MS.
EMCORE - Emotional Cooperative Groupware
NASA Astrophysics Data System (ADS)
Fasoli, N.; Messina, A.
In recent years considerable effort has been spent developing groupware applications. Despite this, groupware applications have not met with general consensus in the computing field. An interdisciplinary approach could prove very useful in overcoming these difficulties. A workgroup is not simply a set of people gathered together, working toward a common goal; it can also be thought of as a strong, hard mental reality. Indeed, the sociological and psychological definitions of a group differ considerably. At the sociological level a group is generally described in view of the activities and events occurring inside the group itself. On the other hand, the psychological approach to groups considers not only the actions occurring inside the group, but also all the mental activities originating from belonging to the group, be they of an emotional or rational nature. Since the early '60s the simple work group (i.e., the discussion group) has been analyzed in terms of its psychological behavior. EMCORE is a prototype which aims to support computer science methods with a psychological approach. The tool has been developed for a discussion group supported by heterogeneous distributed systems and has been implemented according to the CORBA abstraction, augmented by the machine-independent JAVA language. The tool allows all the common activities of a discussion group: discussion by voice, or by chatting board if multimedia devices are not present; and discussion and elaboration of a shared document by text and/or graphic editor. At the same time, tools are provided for the psychoanalytic approach, according to a specific methodology.
Sainath, Kamalesh; Teixeira, Fernando L; Donderici, Burkay
2014-01-01
We develop a general-purpose formulation, based on two-dimensional spectral integrals, for computing electromagnetic fields produced by arbitrarily oriented dipoles in planar-stratified environments, where each layer may exhibit arbitrary and independent anisotropy in both its (complex) permittivity and permeability tensors. Among the salient features of our formulation are (i) computation of eigenmodes (characteristic plane waves) supported in arbitrarily anisotropic media in a numerically robust fashion, (ii) implementation of an hp-adaptive refinement for the numerical integration to evaluate the radiation and weakly evanescent spectra contributions, and (iii) development of an adaptive extension of an integral convergence acceleration technique to compute the strongly evanescent spectrum contribution. While other semianalytic techniques exist to solve this problem, none have full applicability to media exhibiting arbitrary double anisotropies in each layer, where one must account for the whole range of possible phenomena (e.g., mode coupling at interfaces and nonreciprocal mode propagation). Brute-force numerical methods can tackle this problem but only at a much higher computational cost. The present formulation provides an efficient and robust technique for field computation in arbitrary planar-stratified environments. We demonstrate the formulation for a number of problems related to geophysical exploration.
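As a sketch of what the two-dimensional spectral-integral representation entails (generic notation, not the authors'), each field is assembled from plane waves over the transverse wavenumbers, with the z-dependence carried by each layer's eigenmodes:

```latex
% Generic 2D spectral (Fourier) representation of a field in a
% planar-stratified medium; the notation is illustrative.
\mathbf{E}(x,y,z) = \frac{1}{4\pi^2} \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty}
\tilde{\mathbf{E}}(k_x,k_y;z)\, e^{\,i(k_x x + k_y y)}\, dk_x\, dk_y
```

The radiation, weakly evanescent, and strongly evanescent spectra mentioned above correspond to different regions of this (k_x, k_y) plane, which is why the integration strategy splits into an hp-adaptive part and an accelerated tail.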
Multilevel Iterative Methods in Nonlinear Computational Plasma Physics
NASA Astrophysics Data System (ADS)
Knoll, D. A.; Finn, J. M.
1997-11-01
Many applications in computational plasma physics involve the implicit numerical solution of coupled systems of nonlinear partial differential equations or integro-differential equations. Such problems arise in MHD, in systems of Vlasov-Fokker-Planck equations, and in edge plasma fluid equations. We have been developing matrix-free Newton-Krylov algorithms for such problems and have applied these algorithms to the edge plasma fluid equations [1,2] and to the Vlasov-Fokker-Planck equation [3]. Recently we have found that with increasing grid refinement, the number of Krylov iterations required per Newton iteration has grown unmanageable [4]. This has led us to the study of multigrid methods as a means of preconditioning matrix-free Newton-Krylov methods. In this poster we will give details of the general multigrid preconditioned Newton-Krylov algorithm, as well as algorithm performance details on problems of interest in the areas of magnetohydrodynamics and edge plasma physics. Work supported by US DoE 1. Knoll and McHugh, J. Comput. Phys., 116, pg. 281 (1995) 2. Knoll and McHugh, Comput. Phys. Comm., 88, pg. 141 (1995) 3. Mousseau and Knoll, J. Comput. Phys. (1997) (to appear) 4. Knoll and McHugh, SIAM J. Sci. Comput. 19, (1998) (to appear)
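For readers unfamiliar with the matrix-free idea: the Krylov solver only ever needs Jacobian-vector products, which can be approximated by finite differences of the nonlinear residual, so the Jacobian is never formed or stored. A minimal sketch under these assumptions (illustrative, not the authors' code; SciPy's GMRES stands in for their Krylov solver):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def jfnk(F, u0, tol=1e-8, max_newton=50):
    """Jacobian-free Newton-Krylov: solve F(u) = 0 without ever forming J."""
    u = u0.astype(float).copy()
    for _ in range(max_newton):
        r = F(u)
        if np.linalg.norm(r) < tol:
            break
        def jv(v):
            # Directional finite difference: J(u) @ v ~ (F(u + eps*v) - F(u)) / eps
            eps = 1e-7 * (1.0 + np.linalg.norm(u)) / max(np.linalg.norm(v), 1e-30)
            return (F(u + eps * v) - F(u)) / eps
        J = LinearOperator((u.size, u.size), matvec=jv, dtype=float)
        du, _ = gmres(J, -r)   # a multigrid preconditioner would be passed via M=
        u = u + du
    return u

# Toy example: 1D discrete diffusion with a cubic nonlinearity, F(u) = A u + u^3 - b.
n = 100
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
u = jfnk(lambda v: A @ v + v**3 - b, np.zeros(n))
print(np.linalg.norm(A @ u + u**3 - b))   # residual should be small
```

The multigrid preconditioning studied in the poster would enter through the `M` argument of `gmres`, which is what keeps the Krylov iteration count bounded as the grid is refined.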
NASA Technical Reports Server (NTRS)
Rediess, Herman A.; Hewett, M. D.
1991-01-01
The requirements are assessed for the use of remote computation to support HRV flight testing. First, remote computational requirements were developed to support functions that will eventually be performed onboard operational vehicles of this type. These are functions that either cannot be performed onboard in the time frame of initial HRV flight test programs, because the technology of airborne computers will not be sufficiently advanced to support the required computational loads, or that it is not desirable to perform onboard in the flight test program for other reasons. Second, remote computational support either required or highly desirable for conducting the flight testing itself was addressed. The use of an Automated Flight Management System, described in conceptual detail, is proposed. Third, autonomous operations are discussed, and finally, unmanned operations.
ERIC Educational Resources Information Center
Smith, Peter, Ed.
2014-01-01
The Association Supporting Computer Users in Education (ASCUE) is a group of people interested in small college computing issues. It is a blend of people from all over the country who use computers in their teaching, academic support, and administrative support functions. ASCUE has a strong tradition of bringing its members together to pool their…
ERIC Educational Resources Information Center
Association Supporting Computer Users in Education, 2017
2017-01-01
The Association Supporting Computer Users in Education (ASCUE) is a group of people interested in small college computing issues. It is a blend of people from all over the country who use computers in their teaching, academic support, and administrative support functions. Begun in 1968 as the College and University Eleven-Thirty Users' Group…
ERIC Educational Resources Information Center
Smith, Peter, Ed.
2015-01-01
The Association Supporting Computer Users in Education (ASCUE) is a group of people interested in small college computing issues. It is a blend of people from all over the country who use computers in their teaching, academic support, and administrative support functions. ASCUE has a strong tradition of bringing its members together to pool their…
Ultrasound phase rotation beamforming on multi-core DSP.
Ma, Jieming; Karadayi, Kerem; Ali, Murtaza; Kim, Yongmin
2014-01-01
Phase rotation beamforming (PRBF) is a commonly-used digital receive beamforming technique. However, due to its high computational requirement, it has traditionally been supported by hardwired architectures, e.g., application-specific integrated circuits (ASICs) or more recently field-programmable gate arrays (FPGAs). In this study, we investigated the feasibility of supporting software-based PRBF on a multi-core DSP. To alleviate the high computing requirement, the analog front-end (AFE) chips integrating quadrature demodulation in addition to analog-to-digital conversion were defined and used. With these new AFE chips, only delay alignment and phase rotation need to be performed by DSP, substantially reducing the computational load. We implemented the delay alignment and phase rotation modules on a Texas Instruments C6678 DSP with 8 cores. We found it takes 200 μs to beamform 2048 samples from 64 channels using 2 cores. With 4 cores, 20 million samples can be beamformed in one second. Therefore, ADC frequencies up to 40 MHz with 2:1 decimation in AFE chips or up to 20 MHz with no decimation can be supported as long as the ADC-to-DSP I/O requirement can be met. The remaining 4 cores can work on back-end processing tasks and applications, e.g., color Doppler or ultrasound elastography. One DSP being able to handle both beamforming and back-end processing could lead to low-power and low-cost ultrasound machines, benefiting ultrasound imaging in general, particularly portable ultrasound machines.
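The division of labor described above, where the AFE chips deliver demodulated I/Q samples and the DSP applies per-channel delay alignment and phase rotation before summing, can be sketched in a few lines of NumPy (an illustrative model, not the C6678 implementation; the per-channel delays and phases are assumed inputs derived from array geometry and focusing):

```python
import numpy as np

def prbf(iq, delay_samps, phi):
    """Phase-rotation beamforming of demodulated I/Q data.

    iq          : complex array, shape (n_channels, n_samples), from the AFE
    delay_samps : int array, per-channel coarse delay in samples
    phi         : float array, per-channel fine phase rotation (radians)
    """
    n_ch, n_s = iq.shape
    out = np.zeros(n_s, dtype=complex)
    rot = np.exp(1j * phi)                           # fine delay as a phase rotation
    for ch in range(n_ch):
        aligned = np.roll(iq[ch], -delay_samps[ch])  # coarse delay alignment
        out += rot[ch] * aligned
    return out

# 64 channels x 2048 samples, matching the timing figures quoted above.
rng = np.random.default_rng(0)
iq = rng.standard_normal((64, 2048)) + 1j * rng.standard_normal((64, 2048))
y = prbf(iq, rng.integers(0, 16, 64), rng.uniform(-np.pi, np.pi, 64))
```

A real scanner recomputes the delays and phases per focal depth and streams samples rather than using `np.roll`, but the arithmetic per output sample is the same.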
Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo
2016-08-01
Intraoperative evaluation of the efficacy of Deep Brain Stimulation (DBS) includes evaluation of its effect on rigidity. A subjective semi-quantitative scale is used, dependent on the examiner's perception and experience. A system was proposed previously, aiming to tackle this subjectivity, using quantitative data and providing real-time feedback on the computed rigidity reduction, hence supporting the physician's decision. This system comprised a gyroscope-based motion sensor in a textile band, placed on the patient's hand, which communicated its measurements to a laptop. The latter computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on using a general rigidity-reduction model, regardless of the initial severity of the symptom. Thus, to enhance the performance of the previously presented system, we aimed to develop models for high and low baseline rigidity, according to the examiner's assessment before any stimulation. This allows a more patient-oriented approach. Additionally, usability was improved by performing in situ processing on a smartphone instead of a computer. The system has proven reliable, presenting an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system, suitable for intraoperative conditions during DBS, that supports the physician in decision-making when setting stimulation parameters.
Xyce™ Parallel Electronic Simulator Users' Guide, Version 6.5.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R.; Aadithya, Karthik V.; Mei, Ting
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). This includes support for most popular parallel and serial computers. A differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms. This allows one to develop new types of analysis without requiring the implementation of analysis-specific device models. Device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only). Object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase -- a message passing parallel implementation -- which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
2012-05-01
NASA Nebula Platform
• Cloud computing pilot program at NASA Ames
• Integrates open-source components into seamless, self...
• Mission support
• Education and public outreach (NASA Nebula, 2010)
NSF Supported Cloud Research
• Support for Cloud Computing in...
References: Mell, P. & Grance, T. (2011). The NIST Definition of Cloud Computing. NIST Special Publication 800-145; NASA Nebula (2010). Retrieved from
Angus, Robert M; Thompson, Elizabeth B; Davies, Lisa; Trusdale, Ann; Hodgson, Chris; McKnight, Eddie; Davies, Andrew; Pearson, Mike G
2012-12-01
Applying guidelines is a universal challenge that is often not met. Intelligent software systems that facilitate real-time management during a clinical interaction may offer a solution. To determine if the use of a computer-guided consultation that facilitates the National Institute for Health and Clinical Excellence-based chronic obstructive pulmonary disease (COPD) guidance and prompts clinical decision-making is feasible in primary care and to assess its impact on diagnosis and management in reviews of COPD patients. Practice nurses, one-third of whom had no specific respiratory training, undertook a computer-guided review in the usual consulting room setting using a laptop computer with the screen visible to them and to the patient. A total of 293 patients (mean (SD) age 69.7 (10.1) years, 163 (55.6%) male) with a diagnosis of COPD were randomly selected from GP databases in 16 practices and assessed. Of 236 patients who had spirometry, 45 (19%) did not have airflow obstruction and the guided clinical history changed the primary diagnosis from COPD in a further 24 patients. In the 191 patients with confirmed COPD, the consultations prompted management changes including 169 recommendations for altered prescribing of inhalers (addition or discontinuation, inhaler dose or device). In addition, 47% of the 55 current smokers were referred for smoking cessation support, 12 (6%) for oxygen assessment, and 47 (24%) for pulmonary rehabilitation. Computer-guided consultations are practicable in general practice. Primary care COPD databases were confirmed to contain a significant proportion of incorrectly assigned patients. They resulted in interventions and the rationalisation of prescribing in line with recommendations. Only in 22 (12%) of those fully assessed was no management change suggested. The introduction of a computer-guided consultation offers the prospect of comprehensive guideline quality management.
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…
Data Parallel Line Relaxation (DPLR) Code User Manual: Acadia - Version 4.01.1
NASA Technical Reports Server (NTRS)
Wright, Michael J.; White, Todd; Mangini, Nancy
2009-01-01
Data-Parallel Line Relaxation (DPLR) code is a computational fluid dynamic (CFD) solver that was developed at NASA Ames Research Center to help mission support teams generate high-value predictive solutions for hypersonic flow field problems. The DPLR Code Package is an MPI-based, parallel, full three-dimensional Navier-Stokes CFD solver with generalized models for finite-rate reaction kinetics, thermal and chemical non-equilibrium, accurate high-temperature transport coefficients, and ionized flow physics incorporated into the code. DPLR also includes a large selection of generalized realistic surface boundary conditions and links to enable loose coupling with external thermal protection system (TPS) material response and shock layer radiation codes.
Schulz, S.; Romacker, M.; Hahn, U.
1998-01-01
The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics. PMID:9929335
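The SEP encoding can be illustrated with a small set of description-logic axioms. This is a hedged reconstruction of the general published pattern, not a quotation from the paper: each concept X is expanded into a structure node X_S, an entity node X_E, and a part node X_P, so that part-whole relations between concepts can be emulated by ordinary subsumption:

```latex
% Sketch of an SEP triplet (illustrative reconstruction):
X_E \sqsubseteq X_S, \qquad X_P \sqsubseteq X_S, \qquad
X_P \sqsubseteq \exists\,\text{part-of}.X_E
% "Y is a part of X" is then encoded taxonomically:
Y_S \sqsubseteq X_P
```

Because Y_S is subsumed by X_P, a standard classifier infers that every Y (and, recursively, every part of Y) stands in the part-of relation to X, emulating transitive part-whole reasoning without a dedicated inference mechanism.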
NASA Astrophysics Data System (ADS)
Fukushima, Toshio
2017-06-01
Reviewed are recently developed methods for the numerical integration of the gravitational field of general two- or three-dimensional bodies with arbitrary shape and mass density distribution: (i) an axisymmetric infinitely-thin disc (Fukushima 2016a, MNRAS, 456, 3702), (ii) a general infinitely-thin plate (Fukushima 2016b, MNRAS, 459, 3825), (iii) a plane-symmetric and axisymmetric ring-like object (Fukushima 2016c, AJ, 152, 35), (iv) an axisymmetric thick disc (Fukushima 2016d, MNRAS, 462, 2138), and (v) a general three-dimensional body (Fukushima 2016e, MNRAS, 463, 1500). The key techniques employed are (a) the split quadrature method using the double exponential rule (Takahashi and Mori, 1973, Numer. Math., 21, 206), (b) the precise and fast computation of complete elliptic integrals (Fukushima 2015, J. Comp. Appl. Math., 282, 71), (c) Ridder's algorithm of numerical differentiation (Ridder 1982, Adv. Eng. Softw., 4, 75), (d) the recursive computation of the zonal toroidal harmonics, and (e) the integration variable transformation to the local spherical polar coordinates. These devices successfully regularize the Newton kernel in the integrands so as to provide accurate integral values. For example, the general 3D potential is regularly integrated as
$$\Phi(\vec{x}) = -G \int_0^{\infty} \left( \int_{-1}^{1} \left( \int_0^{2\pi} \rho(\vec{x}+\vec{q})\, d\psi \right) d\gamma \right) q\, dq,$$
where $\vec{q} = q\,(\sqrt{1-\gamma^2}\cos\psi,\ \sqrt{1-\gamma^2}\sin\psi,\ \gamma)$ is the relative position vector referred to $\vec{x}$, the position vector at which the potential is evaluated. As a result, the new methods can compute the potential and acceleration vector very accurately. In fact, the axisymmetric integration reproduces the Miyamoto-Nagai potential with 14 correct digits. The developed methods are applied to the gravitational field study of galaxies and protoplanetary discs. Among them, the investigation of the rotation curve of M33 supports a disc-like structure of the dark matter with a double-power-law surface mass density distribution. Fortran 90 subroutines to execute these methods, their test programs, and sample outputs are available from the author's web site: https://www.researchgate.net/profile/Toshio_Fukushima/
Alfredsson, Jayne; Plichart, Patrick; Zary, Nabil
2012-01-01
Research on computer supported scoring of assessments in health care education has mainly focused on automated scoring. Little attention has been given to how informatics can support the currently predominant human-based grading approach. This paper reports the steps taken to develop a model for a computer supported scoring process, focusing on optimizing a task that was previously undertaken without computer support. The model was also implemented in the open source assessment platform TAO in order to study its benefits. The ability to score test takers anonymously, analytics on grader reliability, and a more time-efficient process are examples of the observed benefits. Computer supported scoring will increase the quality of the assessment results.
Computer Decision Support as a Source of Interpretation Error: The Case of Electrocardiograms
Tsai, Theodore L.; Fridsma, Douglas B.; Gatti, Guido
2003-01-01
Objective: The aim of this study was to determine the effect that the computer interpretation (CI) of electrocardiograms (EKGs) has on the accuracy of resident (noncardiologist) physicians reading EKGs. Design: A randomized, controlled trial was conducted in a laboratory setting from February through June 2001, using a two-period crossover design with matched pairs of subjects randomly assigned to sequencing groups. Measurements: Subjects' interpretive accuracy of discrete, cardiologist-determined EKG findings was measured as judged by a board-certified internist. Results: Without the CI, subjects interpreted 48.9% (95% confidence interval, 45.0% to 52.8%) of the findings correctly. With the CI, subjects interpreted 55.4% (51.9% to 58.9%) correctly (p < 0.0001). When the CIs that agreed with the gold standard (Correct CIs) were not included, 53.1% (47.7% to 58.5%) of the findings were interpreted correctly. When the correct CI was included, accuracy increased to 68.1% (63.2% to 72.7%; p < 0.0001). When computer advice that did not agree with the gold standard (Incorrect CI) was not provided to the subjects, 56.7% (48.5% to 64.5%) of findings were interpreted correctly. Accuracy dropped to 48.3% (40.4% to 56.4%) when the incorrect computer advice was provided (p = 0.131). Subjects erroneously agreed with the incorrect CI more often when it was presented with the EKG 67.7% (57.2% to 76.7%) than when it was not 34.6% (23.8% to 47.3%; p < 0.0001). Conclusions: Computer decision support systems can generally improve the interpretive accuracy of internal medicine residents in reading EKGs. However, subjects were influenced significantly by incorrect advice, which tempers the overall usefulness of computer-generated advice in this and perhaps other areas. PMID:12807810
Campion, Thomas R; Waitman, Lemuel R; May, Addison K; Ozdas, Asli; Lorenzi, Nancy M; Gadd, Cynthia S
2010-01-01
Evaluations of computerized clinical decision support systems (CDSS) typically focus on clinical performance changes and do not include social, organizational, and contextual characteristics explaining use and effectiveness. Studies of CDSS for intensive insulin therapy (IIT) are no exception, and the literature lacks an understanding of effective computer-based IIT implementation and operation. This paper presents (1) a literature review of computer-based IIT evaluations through the lens of institutional theory, a discipline from sociology and organization studies, to demonstrate the inconsistent reporting of workflow and care process execution and (2) a single-site case study to illustrate how computer-based IIT requires substantial organizational change and creates additional complexity with unintended consequences including error. Computer-based IIT requires organizational commitment and attention to site-specific technology, workflow, and care processes to achieve intensive insulin therapy goals. The complex interaction between clinicians, blood glucose testing devices, and CDSS may contribute to workflow inefficiency and error. Evaluations rarely focus on the perspective of nurses, the primary users of computer-based IIT whose knowledge can potentially lead to process and care improvements. This paper addresses a gap in the literature concerning the social, organizational, and contextual characteristics of CDSS in general and for intensive insulin therapy specifically. Additionally, this paper identifies areas for future research to define optimal computer-based IIT process execution: the frequency and effect of manual data entry error of blood glucose values, the frequency and effect of nurse overrides of CDSS insulin dosing recommendations, and comprehensive ethnographic study of CDSS for IIT.
2013-01-01
Background Chronic Obstructive Pulmonary Disease (COPD) is a major cause of morbidity and mortality. Effective self-management support interventions are needed to improve the health and functional status of people with COPD or at risk for COPD. Computer-tailored technology could be an effective way to provide this support. Methods/Design This paper presents the protocol of a randomised controlled trial testing the effectiveness of a web-based, computer-tailored self-management intervention to change health behaviours of people with or at risk for COPD. An intervention group will be compared to a usual care control group, in which the intervention group will receive a web-based, computer-tailored self-management intervention. Participants will be recruited from an online panel and through general practices. Outcomes will be measured at baseline and at 6 months. The primary outcomes will be smoking behaviour, measuring the 7-day point prevalence abstinence and physical activity, measured in minutes. Secondary outcomes will include dyspnoea score, quality of life, stages of change, intention to change behaviour and alternative smoking behaviour measures, including current smoking behaviour, 24-hour point prevalence abstinence, prolonged abstinence, continued abstinence and number of quit attempts. Discussion To the best of our knowledge, this will be the first randomised controlled trial to test the effectiveness of a web-based, computer-tailored self-management intervention for people with or at risk for COPD. The results will be important to explore the possible benefits of computer-tailored interventions for the self-management of people with or at risk for COPD and potentially other chronic health conditions. Dutch trial register NTR3421 PMID:23742208
An automatic frequency control loop using overlapping DFTs (Discrete Fourier Transforms)
NASA Technical Reports Server (NTRS)
Aguirre, S.
1988-01-01
An automatic frequency control (AFC) loop is introduced and analyzed in detail. The new scheme is a generalization of the well known Cross Product AFC loop that uses running overlapping discrete Fourier transforms (DFTs) to create a discriminator curve. Linear analysis is included and supported with computer simulations. The algorithm is tested in a low carrier to noise ratio (CNR) dynamic environment, and the probability of loss of lock is estimated via computer simulations. The algorithm discussed is a suboptimum tracking scheme with a larger frequency error variance compared to an optimum strategy, but offers simplicity of implementation and a very low operating threshold CNR. This technique can be applied during the carrier acquisition and re-acquisition process in the Advanced Receiver.
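To make the discriminator concrete, here is a minimal sketch of a cross-product AFC loop driven by overlapping single-bin DFTs (an illustration of the general idea, not the Advanced Receiver algorithm; the block length, hop, and loop gain are arbitrary choices):

```python
import numpy as np

def crossproduct_afc(x, fs, f0, N=256, hop=64, gain=0.1):
    """Cross-product AFC using running, overlapping single-bin DFTs (sketch)."""
    f_hat = f0
    phase = 0.0          # continuous NCO phase (radians)
    nco_pos = 0          # sample index the NCO phase refers to
    X_prev, est = None, []
    for s in range(0, len(x) - N, hop):
        # Advance the NCO phase continuously to the start of this block.
        phase += 2 * np.pi * f_hat * (s - nco_pos) / fs
        nco_pos = s
        n = np.arange(N)
        ref = np.exp(-1j * (phase + 2 * np.pi * f_hat * n / fs))
        X = np.sum(x[s:s + N] * ref)      # single-bin DFT at the NCO frequency
        if X_prev is not None:
            # Cross product: Im(X_k conj(X_{k-1})) ~ sin(2*pi*(f - f_hat)*hop/fs)
            e = np.imag(X * np.conj(X_prev)) / (abs(X) * abs(X_prev) + 1e-30)
            f_hat += gain * np.arcsin(np.clip(e, -1.0, 1.0)) * fs / (2 * np.pi * hop)
        X_prev = X
        est.append(f_hat)
    return np.array(est)

# Example: track a noisy 1020 Hz tone starting from a 1000 Hz guess.
fs, f_true = 8000.0, 1020.0
t = np.arange(40000) / fs
x = np.exp(2j * np.pi * f_true * t) + 0.3 * (np.random.randn(t.size)
                                             + 1j * np.random.randn(t.size))
print(crossproduct_afc(x, fs, f0=1000.0)[-1])   # converges near 1020 Hz
```

The overlap (hop < N) lets the loop update more often than once per full window, and the imaginary part of X_k·conj(X_{k−1}) traces out the sinusoidal discriminator curve whose small-error slope a linear analysis would use.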
National meeting to review IPAD status and goals. [Integrated Programs for Aerospace-vehicle Design
NASA Technical Reports Server (NTRS)
Fulton, R. E.
1980-01-01
A joint NASA/industry project called Integrated Programs for Aerospace-vehicle Design (IPAD) is described, which has the goal of raising aerospace-industry productivity through the application of computers to integrate company-wide management of engineering data. Basically a general-purpose interactive computing system developed to support engineering design processes, the IPAD design is composed of three major software components: the executive, data management, and geometry and graphics software. Results of IPAD activities include a comprehensive description of a future representative aerospace vehicle design process and its interface to manufacturing, and requirements and preliminary design of a future IPAD software system to integrate engineering activities of an aerospace company having several products under simultaneous development.
Lindstrand, Peg
2002-01-01
This study focuses on differences in the ways in which we look at girls' and boys' computer activities. It is evident that gender differences per se generate different conditions for boys and girls. Generally, children with disabilities have great difficulty defining their needs and wishes. Pedagogues, habilitation staff and parents are needed as support for both boys' and girls' development. If technology is to be part of this development, we must pay attention to it. Research within this area highlights the differences and patterns that occur. The study stresses the expectations and experiences that parents of children with disabilities have of computer-based activities for their children, with a focus on gender-related issues.
Thalamic and cortical pathways supporting auditory processing
Lee, Charles C.
2012-01-01
The neural processing of auditory information engages pathways that begin initially at the cochlea and that eventually reach forebrain structures. At these higher levels, the computations necessary for extracting auditory source and identity information rely on the neuroanatomical connections between the thalamus and cortex. Here, the general organization of these connections in the medial geniculate body (thalamus) and the auditory cortex is reviewed. In addition, we consider two models organizing the thalamocortical pathways of the non-tonotopic and multimodal auditory nuclei. Overall, the transfer of information to the cortex via the thalamocortical pathways is complemented by the numerous intracortical and corticocortical pathways. Although interrelated, the convergent interactions among thalamocortical, corticocortical, and commissural pathways enable the computations necessary for the emergence of higher auditory perception. PMID:22728130
Eppig, Janan T; Smith, Cynthia L; Blake, Judith A; Ringwald, Martin; Kadin, James A; Richardson, Joel E; Bult, Carol J
2017-01-01
The Mouse Genome Informatics (MGI) resource (www.informatics.jax.org) has existed for over 25 years, and over this time its data content, informatics infrastructure, and user interfaces and tools have undergone dramatic changes (Eppig et al., Mamm Genome 26:272-284, 2015). Change has been driven by scientific methodological advances, rapid improvements in computational software, growth in computer hardware capacity, and the ongoing collaborative nature of the mouse genomics community in building resources and sharing data. Here we present an overview of the current data content of MGI, describe its general organization, and provide examples using simple and complex searches, and tools for mining and retrieving sets of data.
The Open System Interconnection as a building block in a health sciences information network.
Boss, R W
1985-01-01
The interconnection of integrated health sciences library systems with other health sciences computer systems to achieve information networks will require either custom linkages among specific devices or the adoption of standards that all systems support. The most appropriate standards appear to be those being developed under the Open System Interconnection (OSI) reference model, which specifies a set of rules and functions that computers must follow to exchange information. The protocols have been modularized into seven different layers. The lowest three layers are generally available as off-the-shelf interfacing products. The higher layers require special development for particular applications. This paper describes the OSI, its application in health sciences networks, and specific tasks that remain to be undertaken. PMID:4052672
Online help: cancer patients participate in a computer-mediated support group.
Weinberg, N; Schmale, J; Uken, J; Wessel, K
1996-02-01
This study investigated a computer mediated support group of six breast cancer patients. For a three-month period, patients used home computers to connect to a computer bulletin board on which they read messages from and posted messages to each other. The patients had no difficulty learning to use the computer and used it an average of one hour a week. The patients discussed their medical conditions, shared personal concerns, and offered support. This online approach provided many features of traditional face-to-face support groups.
Practical Application of Model Checking in Software Verification
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Skakkebaek, Jens Ulrik
1999-01-01
This paper presents our experiences in applying JAVA PATHFINDER (JPF), a recently developed JAVA-to-SPIN translator, to the finding of synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of JPF and the subset of JAVA that it supports, and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary for abstracting sufficiently smaller models for verification purposes; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.
Robotics supporting autonomy. 5th French Japanese Conference on Bio-ethics.
Gelin, Rodolphe
2013-12-01
The aim of this paper is to propose a new vision of robots. Although robots are generally seen as a threat to humanity, or at least to employment, we will demonstrate that this new kind of machine can be a support not only for people experiencing a loss of autonomy but for everyone. Robots will not replace people; they will assist them. The mass production of these companion robots will create a new industry that could take over from the automotive and computer industries in this century. This access to the mass market will require solving technological and acceptability problems through the common work of researchers, engineers, users and the major stakeholders of our society.
77 FR 74020 - Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-12
... 29, 2012, sent a report of a Computer Matching Program to the Committee on Homeland Security and... AGENCY: Office of Child Support Enforcement (OCSE), ACF, HHS. ACTION: Notice of a Computer Matching Program. SUMMARY: In accordance with the...
Dai, Huanping; Micheyl, Christophe
2015-05-01
Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
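As an illustration of what such a computation looks like for continuous distributions (a sketch, not the authors' published MATLAB code): when the likelihood ratio is monotonic in the observation, the optimal m-interval forced-choice observer picks the interval with the largest observation, and the maximum Pc reduces to a single integral of the target pdf against the (m−1)th power of the non-target cdf:

```python
import numpy as np
from scipy import integrate, stats

def max_pc_mafc(target, nontarget, m=2):
    """Maximum proportion correct for an m-interval forced-choice task.

    Assumes a monotonic likelihood ratio, so the optimal observer simply
    picks the interval with the largest observation:
        Pc = integral of f_target(x) * F_nontarget(x)**(m-1) dx.
    target, nontarget: frozen scipy.stats distributions of sensory activity.
    """
    lo = min(target.ppf(1e-12), nontarget.ppf(1e-12))
    hi = max(target.ppf(1 - 1e-12), nontarget.ppf(1 - 1e-12))
    pc, _ = integrate.quad(
        lambda x: target.pdf(x) * nontarget.cdf(x) ** (m - 1), lo, hi)
    return pc

# Gaussian internal noise, d' = 1: reproduces the textbook 2AFC value
# Pc = Phi(d'/sqrt(2)) ~ 0.760.
print(max_pc_mafc(stats.norm(1.0, 1.0), stats.norm(0.0, 1.0), m=2))
# A non-Gaussian (uniform-noise) example with four intervals.
print(max_pc_mafc(stats.uniform(0.3, 1.0), stats.uniform(0.0, 1.0), m=4))
```

Swapping in any other frozen distribution covers the non-Gaussian cases the article targets; direct quadrature of the pdf-cdf product here plays the role of the discrete mass-function sum discussed in the abstract.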
General aviation design synthesis utilizing interactive computer graphics
NASA Technical Reports Server (NTRS)
Galloway, T. L.; Smith, M. R.
1976-01-01
Interactive computer graphics is a fast growing area of computer application, due to such factors as substantial cost reductions in hardware, general availability of software, and expanded data communication networks. In addition to allowing faster and more meaningful input/output, computer graphics permits the use of data in graphic form to carry out parametric studies for configuration selection and for assessing the impact of advanced technologies on general aviation designs. The incorporation of interactive computer graphics into a NASA developed general aviation synthesis program is described, and the potential uses of the synthesis program in preliminary design are demonstrated.
DUKSUP: A Computer Program for High Thrust Launch Vehicle Trajectory Design and Optimization
NASA Technical Reports Server (NTRS)
Williams, C. H.; Spurlock, O. F.
2014-01-01
From the late 1960's through 1997, the leadership of NASA's Intermediate and Large class unmanned expendable launch vehicle projects resided at the NASA Lewis (now Glenn) Research Center (LeRC). One of LeRC's primary responsibilities --- trajectory design and performance analysis --- was accomplished by an internally-developed analytic three dimensional computer program called DUKSUP. Because of its Calculus of Variations-based optimization routine, this code was generally more capable of finding optimal solutions than its contemporaries. A derivation of optimal control using the Calculus of Variations is summarized including transversality, intermediate, and final conditions. The two point boundary value problem is explained. A brief summary of the code's operation is provided, including iteration via the Newton-Raphson scheme and integration of variational and motion equations via a 4th order Runge-Kutta scheme. Main subroutines are discussed. The history of the LeRC trajectory design efforts in the early 1960's is explained within the context of supporting the Centaur upper stage program. How the code was constructed based on the operation of the Atlas/Centaur launch vehicle, the limits of the computers of that era, the limits of the computer programming languages, and the missions it supported are discussed. The vehicles DUKSUP supported (Atlas/Centaur, Titan/Centaur, and Shuttle/Centaur) are briefly described. The types of missions, including Earth orbital and interplanetary, are described. The roles of flight constraints and their impact on launch operations are detailed (such as jettisoning hardware on heating, Range Safety, ground station tracking, and elliptical parking orbits). The computer main frames on which the code was hosted are described. The applications of the code are detailed, including independent check of contractor analysis, benchmarking, leading edge analysis, and vehicle performance improvement assessments. Several of DUKSUP's many major impacts on launches are discussed including Intelsat, Voyager, Pioneer Venus, HEAO, Galileo, and Cassini.
GPU-based Parallel Application Design for Emerging Mobile Devices
NASA Astrophysics Data System (ADS)
Gupta, Kshitij
A revolution is underway in the computing world that is causing a fundamental paradigm shift in device capabilities and form-factor, with a move from well-established legacy desktop/laptop computers to mobile devices in varying sizes and shapes. Amongst all the tasks these devices must support, graphics has emerged as the 'killer app' for providing a fluid user interface and high-fidelity game rendering, effectively making the graphics processor (GPU) one of the key components in (present and future) mobile systems. By utilizing the GPU as a general-purpose parallel processor, this dissertation explores the GPU computing design space from an applications standpoint, in the mobile context, by focusing on key challenges presented by these devices---limited compute, memory bandwidth, and stringent power consumption requirements---while improving the overall application efficiency of the increasingly important speech recognition workload for mobile user interaction. We broadly partition trends in GPU computing into four major categories. We analyze hardware and programming model limitations in current-generation GPUs and detail an alternate programming style called Persistent Threads, identify four use case patterns, and propose minimal modifications that would be required for extending native support. We show how by manually extracting data locality and altering the speech recognition pipeline, we are able to achieve significant savings in memory bandwidth while simultaneously reducing the compute burden on GPU-like parallel processors. As we foresee GPU computing to evolve from its current 'co-processor' model into an independent 'applications processor' that is capable of executing complex work independently, we create an alternate application framework that enables the GPU to handle all control-flow dependencies autonomously at run-time while minimizing host involvement to just issuing commands, that facilitates an efficient application implementation. Finally, as compute and communication capabilities of mobile devices improve, we analyze energy implications of processing speech recognition locally (on-chip) and offloading it to servers (in-cloud).
Jali - Unstructured Mesh Infrastructure for Multi-Physics Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garimella, Rao V; Berndt, Markus; Coon, Ethan
2017-04-13
Jali is a parallel unstructured mesh infrastructure library designed for use by multi-physics simulations. It supports 2D and 3D arbitrary polyhedral meshes distributed over hundreds to thousands of nodes. Jali can read and write Exodus II meshes, along with fields and sets on the mesh; support for other formats is partially implemented or planned. Jali is built on MSTK (https://github.com/MeshToolkit/MSTK), an open source general purpose unstructured mesh infrastructure library from Los Alamos National Laboratory. While it has been made to work with other mesh frameworks such as MOAB and STKmesh in the past, support for maintaining the interface to these frameworks has been suspended for now. Jali supports distributed as well as on-node parallelism. Support of on-node parallelism is through direct use of the mesh in multi-threaded constructs or through the use of "tiles", which are submeshes or sub-partitions of a partition destined for a compute node.
Baiardi, A.; Paoloni, L.; Barone, V.; Zakrzewski, V.G.; Ortiz, J.V.
2017-01-01
The analysis of photoelectron spectra is usually facilitated by quantum mechanical simulations. Due to the recent improvement of experimental techniques, the resolution of experimental spectra is rapidly increasing, and the inclusion of vibrational effects is usually mandatory to obtain a reliable reproduction of the spectra. With the aim of defining a robust computational protocol, a general time-independent formulation to compute different kinds of vibrationally-resolved electronic spectra has been generalized to support also photoelectron spectroscopy. The electronic structure data underlying the simulation are computed using different electron propagator approaches. In addition to the more standard approaches, a new and robust implementation of the second-order self-energy approximation of the electron propagator based on a transition operator reference (TOEP2) is presented. To validate our implementation, a series of molecules has been used as test cases. The result of the simulations shows that, for ultraviolet photoionization spectra, the more accurate non-diagonal approaches are needed to obtain a reliable reproduction of vertical ionization energies, but diagonal approaches are sufficient for energy gradients and pole strengths. For X-ray photoelectron spectroscopy, the TOEP2 approach, besides being more efficient, is also the most accurate in the reproduction of both vertical ionization energies and vibrationally-resolved bandshapes. PMID:28521087
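For orientation, the electron-propagator quantities entering such simulations obey standard relations (background material, not specific to this paper): vertical ionization energies are obtained from the Dyson quasiparticle equation, and in diagonal approximations each pole carries a pole strength determined by the energy dependence of the self-energy:

```latex
% Standard electron-propagator relations (background, illustrative):
\left[\hat{F} + \Sigma(\epsilon_p)\right]\phi_p = \epsilon_p\,\phi_p,
\qquad
P_p = \left(1 - \left.\frac{\partial \Sigma_{pp}(E)}{\partial E}\right|_{E=\epsilon_p}\right)^{-1}
```

The pole strengths P_p are among the quantities the abstract reports diagonal approaches already reproduce well, while the pole positions ε_p benefit from non-diagonal treatments.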
Probabilistic Structural Analysis Program
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
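The central quantity such a program estimates can be written compactly (standard structural-reliability notation, not taken from the program documentation): with a limit-state function g(X) ≤ 0 denoting failure of the engineered system,

```latex
% Probability of failure as an integral of the joint input density
% over the failure domain (standard reliability notation).
P_f = \Pr\!\left[g(\mathbf{X}) \le 0\right]
    = \int_{g(\mathbf{x}) \le 0} f_{\mathbf{X}}(\mathbf{x})\, d\mathbf{x}
```

The advanced mean value and adaptive importance sampling methods named above are alternative strategies for approximating this integral when each evaluation of g requires an expensive structural analysis.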
Computer technology forecast study for general aviation
NASA Technical Reports Server (NTRS)
Seacord, C. L.; Vaughn, D.
1976-01-01
A multi-year, multi-faceted program is underway to investigate and develop potential improvements in airframes, engines, and avionics for general aviation aircraft. The objective of this study was to assemble information that will allow the government to assess the trends in computer and computer/operator interface technology that may have application to general aviation in the 1980's and beyond. The current state of the art of computer hardware is assessed, technical developments in computer hardware are predicted, and nonaviation large volume users of computer hardware are identified.
Support Vector Machine algorithm for regression and classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Chenggang; Zavaljevski, Nela
2001-08-01
The software is an implementation of the Support Vector Machine (SVM) algorithm that was invented and developed by Vladimir Vapnik and his co-workers at AT&T Bell Laboratories. The specific implementation reported here is an Active Set method for solving a quadratic optimization problem that forms the major part of any SVM program. The implementation is tuned to specific constraints generated in the SVM learning. Thus, it is more efficient than general-purpose quadratic optimization programs. A decomposition method has been implemented in the software that enables processing large data sets. The size of the learning data is virtually unlimited by the capacity of the computer's physical memory. The software is flexible and extensible. Two upper bounds are implemented to regulate the SVM learning for classification, which allow users to adjust the false positive and false negative rates. The software can be used either as a standalone, general-purpose SVM regression or classification program, or be embedded into a larger software system.
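The quadratic optimization problem referred to above is, in standard soft-margin form (standard SVM notation; the two class-specific upper bounds C₊ and C₋ correspond to the two bounds that let users trade off false positives against false negatives):

```latex
% Soft-margin SVM dual with class-dependent upper bounds (standard form)
\max_{\boldsymbol{\alpha}} \; \sum_{i=1}^{n} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j\, y_i y_j\, K(\mathbf{x}_i, \mathbf{x}_j)
\quad \text{s.t.} \quad
\sum_{i=1}^{n} \alpha_i y_i = 0, \qquad
0 \le \alpha_i \le \begin{cases} C_{+} & y_i = +1 \\ C_{-} & y_i = -1 \end{cases}
```

An active-set solver exploits the box constraints, treating only the α_i strictly between their bounds as free at each iteration, and a decomposition scheme optimizes over a small working set at a time so the full kernel matrix never needs to reside in memory.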
Support-vector-based emergent self-organising approach for emotional understanding
NASA Astrophysics Data System (ADS)
Nguwi, Yok-Yen; Cho, Siu-Yeung
2010-12-01
This study discusses the computational analysis of general emotion understanding based on the questionnaire methodology. The questionnaire method approaches the subject by investigating the real experiences that accompanied the emotions, whereas laboratory approaches are generally associated with exaggerated elements. We adopted a connectionist model called the support-vector-based emergent self-organising map (SVESOM) to analyse emotion profiling from the questionnaire method. The SVESOM first identifies the important variables by giving discriminative features a high ranking. The classifier then performs the classification based on the selected features. Experimental results show that the top-ranked features are in line with the work of Scherer and Wallbott [(1994), 'Evidence for Universality and Cultural Variation of Differential Emotion Response Patterning', Journal of Personality and Social Psychology, 66, 310-328], which approached the emotions physiologically. The performance measures show that using the full feature set can degrade classification performance, whereas the selected features provide superior results in terms of accuracy and generalisation.
NASA Technical Reports Server (NTRS)
Chen, Wei; Tsui, Kwok-Leung; Allen, Janet K.; Mistree, Farrokh
1994-01-01
In this paper we introduce a comprehensive and rigorous robust design procedure to overcome some limitations of the current approaches. A comprehensive approach is general enough to model the two major types of robust design applications, namely, robust design associated with the minimization of the deviation of performance caused by the deviation of noise factors (uncontrollable parameters), and robust design due to the minimization of the deviation of performance caused by the deviation of control factors (design variables). We achieve mathematical rigor by using, as a foundation, principles from the design of experiments and optimization. Specifically, we integrate the Response Surface Method (RSM) with the compromise Decision Support Problem (DSP). Our approach is especially useful for design problems where there are no closed-form solutions and system performance is computationally expensive to evaluate. The design of a solar powered irrigation system is used as an example. Our focus in this paper is on illustrating our approach rather than on the results per se.
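As a minimal illustration of the response-surface step, the sketch below fits a quadratic surrogate to a handful of expensive evaluations and then estimates the mean and spread of performance under a noise factor; the model, numbers, and distribution are invented for illustration, and the compromise DSP trade-off itself is not shown.

```python
import numpy as np

def fit_quadratic(x, y):
    """Least-squares fit of y ~ b0 + b1*x + b2*x^2 (the response surface)."""
    A = np.vstack([np.ones_like(x), x, x**2]).T
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def surrogate(coef, x):
    return coef[0] + coef[1] * x + coef[2] * x**2

x_design = np.linspace(0.0, 2.0, 7)            # design points
y_design = np.sin(x_design) + 0.5 * x_design   # stand-in for a costly simulation
coef = fit_quadratic(x_design, y_design)

noise = np.random.default_rng(1).normal(1.0, 0.1, 10000)  # noise factor samples
perf = surrogate(coef, noise)                  # cheap surrogate evaluations
print(f"mean performance {perf.mean():.3f}, deviation {perf.std():.3f}")
```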
Programming methodology for a general purpose automation controller
NASA Technical Reports Server (NTRS)
Sturzenbecker, M. C.; Korein, J. U.; Taylor, R. H.
1987-01-01
The General Purpose Automation Controller is a multi-processor architecture for automation programming. A methodology has been developed whose aim is to simplify the task of programming distributed real-time systems for users in research or manufacturing. Programs are built by configuring function blocks (low-level computations) into processes using data flow principles. These processes are activated through the verb mechanism. Verbs are divided into two classes: those which support devices, such as robot joint servos, and those which perform actions on devices, such as motion control. This programming methodology was developed in order to achieve the following goals: (1) specifications for real-time programs which are to a high degree independent of hardware considerations such as processor, bus, and interconnect technology; (2) a component approach to software, so that software required to support new devices and technologies can be integrated by reconfiguring existing building blocks; (3) resistance to error and ease of debugging; and (4) a powerful command language interface.
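A toy rendering of the configuration idea, with Python classes standing in for the controller's building blocks (the real system is a distributed multi-processor architecture; all names here are invented):

```python
class FunctionBlock:
    """A low-level computation with data-flow inputs and outputs."""
    def __init__(self, fn):
        self.fn = fn
    def __call__(self, *inputs):
        return self.fn(*inputs)

class Process:
    """Function blocks chained by data flow; each block feeds the next."""
    def __init__(self, *blocks):
        self.blocks = blocks
    def run(self, value):
        for block in self.blocks:
            value = block(value)
        return value

VERBS = {}
def define_verb(name, process):
    """Make a configured process available through the verb mechanism."""
    VERBS[name] = process

scale = FunctionBlock(lambda v: 0.5 * v)                 # e.g. gear reduction
clamp = FunctionBlock(lambda v: max(min(v, 1.0), -1.0))  # safety limit
define_verb("servo_command", Process(scale, clamp))

print(VERBS["servo_command"].run(3.0))  # -> 1.0
```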
Sorich, Michael J; McKinnon, Ross A; Miners, John O; Winkler, David A; Smith, Paul A
2004-10-07
This study aimed to evaluate in silico models based on quantum chemical (QC) descriptors derived using the electronegativity equalization method (EEM) and to assess the use of QC properties to predict chemical metabolism by human UDP-glucuronosyltransferase (UGT) isoforms. Various EEM-derived QC molecular descriptors were calculated for known UGT substrates and nonsubstrates. Classification models were developed using support vector machine and partial least squares discriminant analysis. In general, the most predictive models were generated with the support vector machine. Combining QC and 2D descriptors (from previous work) using a consensus approach resulted in a statistically significant improvement in predictivity (to 84%) over both the QC and 2D models and the other methods of combining the descriptors. EEM-derived QC descriptors were shown to be both highly predictive and computationally efficient. It is likely that EEM-derived QC properties will be generally useful for predicting ADMET and physicochemical properties during drug discovery.
77 FR 2299 - Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-17
Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement. AGENCY: Office of Child Support Enforcement (OCSE), ACF, HHS. ACTION: Notice of a Computer Matching Program. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended, OCSE is publishing notice of a computer matching program...
77 FR 74019 - Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-12
Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement. AGENCY: Office of Child Support Enforcement (OCSE), ACF, HHS. ACTION: Notice of a Computer Matching Program. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended, OCSE is publishing notice of a computer matching program...
75 FR 29774 - Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-27
Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement. AGENCY: Office of Child Support Enforcement (OCSE), ACF, HHS. ACTION: Notice of a Computer Matching Program. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended, OCSE is publishing notice of a computer matching program...
75 FR 31457 - Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-03
Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement. AGENCY: Office of Child Support Enforcement (OCSE), ACF, HHS. ACTION: Notice of a Computer Matching Program. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended, OCSE is publishing notice of a computer matching program...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vigil, Benny Manuel; Ballance, Robert; Haskell, Karen
Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, is included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.
Research on computer systems benchmarking
NASA Technical Reports Server (NTRS)
Smith, Alan Jay (Principal Investigator)
1996-01-01
This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance: the performance impact of optimization was evaluated in the context of our methodology for CPU performance characterization, based on the abstract machine model. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the aforementioned accomplishments are summarized more specifically in this report, as well as those smaller in magnitude supported by this grant.
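The merging step lends itself to a one-line estimate: characterize the machine by the cost of each abstract-machine operation and the program by its operation counts, then take the dot product. A toy version, with invented operation names and numbers:

```python
import numpy as np

# machine characterization: nanoseconds per abstract-machine operation
machine_ns_per_op = np.array([2.0, 3.0, 5.0, 5.0, 1.5])  # fadd, fmul, load, store, branch
# program characterization: dynamic operation counts for one benchmark
program_op_counts = np.array([4e8, 3e8, 6e8, 2e8, 1e8])

# estimated execution time = sum over operations of count * cost
estimated_seconds = machine_ns_per_op @ program_op_counts * 1e-9
print(f"estimated run time: {estimated_seconds:.2f} s")
```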
Torus as phase space: Weyl quantization, dequantization, and Wigner formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ligabò, Marilena, E-mail: marilena.ligabo@uniba.it
2016-08-15
The Weyl quantization of classical observables on the torus (as phase space) without regularity assumptions is explicitly computed. The equivalence class of symbols yielding the same Weyl operator is characterized. The Heisenberg equation for the dynamics of general quantum observables is written through the Moyal brackets on the torus and the support of the Wigner transform is characterized. Finally, a dequantization procedure is introduced that applies, for instance, to the Pauli matrices. As a result we obtain the corresponding classical symbols.
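For orientation, one standard finite-dimensional realization of Weyl quantization on the torus (a sketch of the general setup, not necessarily the exact construction of the paper): with unitaries $U,V$ on $\mathbb{C}^N$ satisfying $UV = e^{2\pi i/N}\,VU$, a symbol $f$ with Fourier coefficients $\hat f(n,m)$ is assigned the symmetrically ordered operator

\[
\operatorname{Op}_W(f) \;=\; \sum_{n,m} \hat f(n,m)\, e^{\pi i\, nm/N}\, U^{n} V^{m},
\]

and the Moyal bracket arises from the commutator of such operators in the semiclassical limit.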
NASA Astrophysics Data System (ADS)
Proux, Denys; Segond, Frédérique; Gerbier, Solweig; Metzger, Marie Hélène
Hospital Acquired Infections (HAI) are a real burden for doctors and risk surveillance experts. The impact on patients' health and the related healthcare cost is very significant and a major concern even for rich countries. Furthermore, the data required to evaluate the threat are generally not available to experts, which prevents fast reaction. However, recent advances in computational intelligence techniques, such as information extraction, risk pattern detection in documents, and decision support systems, now make it possible to address this problem.
2017-02-23
percentages for increased poor health habits, healthcare utilization, and medication usage were computed using the group n as the denominator instead of the n... survey to assess for general areas of health-related behaviors (i.e., sleep and exercise; alcohol, tobacco, and caffeine use; common reasons for seeking... medical care and mental health support services; and reasons for increased prescription and over-the-counter medication usage) relevant to
NASA Technical Reports Server (NTRS)
Dayman, B., Jr.; Fiore, A. W.
1974-01-01
The present work discusses in general terms the various kinds of ground facilities, in particular wind tunnels, which support aerodynamic testing. Since not all flight parameters can be simulated simultaneously, an important problem is the matching of parameters. It is pointed out that there is a lack of wind tunnels for complete Reynolds-number simulation. Using a computer to simulate flow fields can considerably reduce the wind-tunnel hours required to develop a given flight vehicle.
The programming language HAL: A specification
NASA Technical Reports Server (NTRS)
1971-01-01
HAL accomplishes three significant objectives: (1) increased readability, through the use of a natural two-dimensional mathematical format; (2) increased reliability, by providing for selective recognition of common data and subroutines, and by incorporating specific data-protect features; (3) real-time control facility, by including a comprehensive set of real-time control commands and signal conditions. Although HAL is designed primarily for programming on-board computers, it is general enough to meet nearly all the needs in the production, verification and support of aerospace, and other real-time applications.
STS-46 Italian Payload Specialist Malerba uses laptop PGSC on OV-104 middeck
NASA Technical Reports Server (NTRS)
1992-01-01
STS-46 Italian Payload Specialist Franco Malerba, wearing communications kit assembly headset (HDST), uses laptop payload and general support computer (PGSC) on the middeck of Atlantis, Orbiter Vehicle (OV) 104. Malerba is positioned in front of the airlock and surrounded by the interdeck access ladder (foreground), a cycle ergometer (directly behind him), the forward lockers (background), and the sleep station (at his left). Food, candy, hygiene kits, beverage containers, and film reels are attached to the forward lockers.
1980-05-22
cross-referenced with the number of the data transaction listed in the data module quality control list NVB Integer variable used to... Organization of the Joint Chiefs of Staff. Technical support was provided by System Sciences, Incorporated under Contract Number DCA100-75-C-0019. Change set... Contract Number DCA 100-75-C-0019. Change set two was prepared under Contract Number DCA 100-78-C-0035. Computer Sciences Corporation prepared change
13. VIEW OF A B-BOX, WHICH WAS USED IN THE ...
13. VIEW OF A B-BOX, WHICH WAS USED IN THE FAST RECOVERY PROCESS. URANIUM OXIDE WAS TRANSFERRED FOR DISSOLUTION IN A ROOM WHICH HOUSED 3 ROWS OF B-BOXES. B-BOXES ARE CONTROLLED HOODS, SIMILAR TO LAB HOODS THAT OPERATED WITH HIGH AIR VELOCITIES AT THEIR OPENINGS TO ENSURE THAT THE VAPORS WERE CONTAINED WITHIN THE HOOD. (2/14/79) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
NASP guidance design for vehicle autonomy
NASA Astrophysics Data System (ADS)
Wagner, E. A.; Li, I.; Nguyen, D. D.; Nguyen, P. L.
1990-10-01
Vehicle guidance for General Dynamics' NASP vehicle is planned to be self-contained onboard the vehicle, and independent of any ground support during the mission. It will include real-time onboard abort and ascent trajectory optimization capability. Although these features should be considered a natural outgrowth of research in guidance and trajectory optimization and advances in computation, facilitating full vehicle autonomy for NASP represents a significant advance relative to any flight-demonstrated guidance. Algorithms and processing requirements for autonomous NASP vehicle guidance are considered.
Evolutionary vaccination dynamics with internal support mechanisms
NASA Astrophysics Data System (ADS)
Tang, Guo-Mei; Cai, Chao-Ran; Wu, Zhi-Xi
2017-05-01
This paper investigates internal support mechanisms (i.e., mechanisms without external intervention) for enhancing vaccine coverage in evolutionary vaccination dynamics. We present two internal support mechanisms: a global support mechanism, in which each individual pays a support cost to build up a public fund that is then divided among all vaccinated individuals, and a local support mechanism, in which each individual pays a support cost that is divided among its immediate vaccinated neighbors. By means of extensive computer simulations, we show that, for the same support cost, the heterogeneous (local) support mechanism encourages more people to take vaccination than the homogeneous (global) support mechanism. We then study the most general case, in which the population contains both supporters and troublemakers, i.e., individuals who do, or do not, join the internal support mechanism. Surprisingly, we find that in scale-free networks the voluntary vaccination dynamics with the local support mechanism does not degrade into the original voluntary vaccination dynamics, and the vaccination level can still be effectively improved. Since most social networks have scale-free degree distributions, we further study empirical networks and find that the vaccination level can still be improved in the absence of external intervention.
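A toy payoff calculation for the global mechanism described above (population size, cost, and vaccination pattern are invented; the paper's simulations are far richer):

```python
import numpy as np

rng = np.random.default_rng(5)
N = 10
vaccinated = rng.random(N) < 0.4   # who vaccinates this round
support_cost = 0.1

# global support: everyone pays into a public fund, which is then
# divided evenly among all vaccinated individuals
fund = support_cost * N
payoff = -support_cost * np.ones(N)
if vaccinated.any():
    payoff[vaccinated] += fund / vaccinated.sum()

print("vaccinated:", vaccinated.astype(int))
print("payoffs   :", np.round(payoff, 2))
```

The local variant divides each individual's contribution among its immediate vaccinated neighbors instead of pooling it globally.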
Oelsner, Elizabeth C.; Hoffman, Eric A.; Folsom, Aaron R.; Carr, J. Jeffrey; Enright, Paul L.; Kawut, Steven M.; Kronmal, Richard; Lederer, David; Lima, Joao A. C.; Lovasi, Gina S.; Shea, Steven; Barr, R. Graham
2015-01-01
Background Whereas low lung function is known to predict mortality in the general population, the prognostic significance of emphysema on computed tomography (CT) in persons without chronic obstructive pulmonary disease (COPD) remains uncertain. Objective To determine whether greater emphysema-like lung on CT is associated with all-cause mortality among persons without airflow obstruction or COPD in the general population. Design Prospective cohort study. Setting Population-based, multiethnic sample from 6 US communities. Participants 2965 participants ages 45-84 years without airflow obstruction on spirometry. Measurements Emphysema-like lung was defined on cardiac CT as the number of lung voxels less than -950 Hounsfield Units, and was adjusted for the number of total imaged lung voxels. Results Among 2965 participants, 50.9% of whom never smoked, there were 186 deaths over a median of 6.2 years. Greater emphysema-like lung was independently associated with increased mortality (adjusted hazard ratio [HR] 1.14 per one-half of the interquartile range, 95% CI 1.04-1.24, P=0.004), adjusting for potential confounders including cardiovascular risk factors and the forced expiratory volume in one second. Generalized additive models supported a linear association between emphysema-like lung and mortality without evidence for a threshold. The association was of greatest magnitude among smokers, although multiplicative interaction terms did not support effect modification by smoking status. Limitations Cardiac CT scans did not include lung apices. The number of deaths was limited among subgroup analyses. Conclusions Emphysema-like lung on CT was associated with all-cause mortality among persons without airflow obstruction or COPD in a general population sample, particularly among smokers. Recognition of the independent prognostic significance of emphysema on CT among patients without COPD on spirometry is warranted. Primary Funding Source NIH/NHLBI. PMID:25506855
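The exposure measure reduces to a simple voxel count; a minimal sketch (threshold from the abstract, data invented):

```python
import numpy as np

def percent_emphysema(lung_hu):
    """Fraction of imaged lung voxels below -950 Hounsfield Units."""
    return 100.0 * np.mean(np.asarray(lung_hu) < -950)

# toy volume: mostly normal lung density plus a small emphysematous tail
rng = np.random.default_rng(2)
lung = np.concatenate([rng.normal(-850, 40, 9000), rng.normal(-970, 10, 1000)])
print(f"{percent_emphysema(lung):.1f}% of voxels below -950 HU")
```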
Randomised trial of personalised computer based information for cancer patients
Jones, Ray; Pearson, Janne; McGregor, Sandra; Cawsey, Alison J; Barrett, Ann; Craig, Neil; Atkinson, Jacqueline M; Gilmour, W Harper; McEwen, Jim
1999-01-01
Objective To compare the use and effect of a computer based information system for cancer patients that is personalised using each patient's medical record with a system providing only general information and with information provided in booklets. Design Randomised trial with three groups. Data collected at start of radiotherapy, one week later (when information provided), three weeks later, and three months later. Participants 525 patients started radical radiotherapy; 438 completed follow up. Interventions Two groups were offered information via computer (personalised or general information, or both) with open access to computer thereafter; the third group was offered a selection of information booklets. Outcomes Patients' views and preferences, use of computer and information, and psychological status; doctors' perceptions; cost of interventions. Results More patients offered the personalised information said that they had learnt something new, thought the information was relevant, used the computer again, and showed their computer printouts to others. There were no major differences in doctors' perceptions of patients. More of the general computer group were anxious at three months. With an electronic patient record system, in the long run the personalised information system would cost no more than the general system. Full access to booklets cost twice as much as the general system. Conclusions Patients preferred computer systems that provided information from their medical records to systems that just provided general information. This has implications for the design and implementation of electronic patient record systems and reliance on general sources of patient information. PMID:10550090
A Conceptual Framework for Electronic Socio-Emotional Support for People with Special Needs
ERIC Educational Resources Information Center
Shpigelman, Carmit-Noa; Reiter, Shunit; Weiss, Patrice L.
2009-01-01
In recent years an increasing number of people under psychological distress turn to computer-mediated communication for support. A related development is the increasing number of computer-mediated support groups in which people meet, share interests, and exchange socio-emotional support through text-based messages on computer networks. To date, a…
NASA Technical Reports Server (NTRS)
Chesler, L.; Pierce, S.
1971-01-01
Generalized, cyclic, and modified multistep numerical integration methods are developed and evaluated for application to problems of satellite orbit computation. Generalized methods are compared with the presently utilized Cowell methods; new cyclic methods are developed for special second-order differential equations; and several modified methods are developed and applied to orbit computation problems. Special computer programs were written to generate coefficients for these methods, and subroutines were written which allow use of these methods with NASA's GEOSTAR computer program.
Stochastic subset selection for learning with kernel machines.
Rhinelander, Jason; Liu, Xiaoping P
2012-06-01
Kernel machines have gained much popularity in applications of machine learning. Support vector machines (SVMs) are a subset of kernel machines and generalize well for classification, regression, and anomaly detection tasks. The training procedure for traditional SVMs involves solving a quadratic programming (QP) problem. The QP problem scales superlinearly in computational effort with the number of training samples and is often used for the offline batch processing of data. Kernel machines operate by retaining a subset of observed data during training. The data vectors contained within this subset are referred to as support vectors (SVs). The work presented in this paper introduces a subset selection method for the use of kernel machines in online, changing environments. Our algorithm uses a stochastic indexing technique to select the subset of SVs used when computing the kernel expansion. The work described here is novel because it separates the selection of kernel basis functions from the training algorithm used. The subset selection algorithm presented here can be used in conjunction with any online training technique. It is important for online kernel machines to be computationally efficient due to the real-time requirements of online environments. Our algorithm is an important contribution because it scales linearly with the number of training samples and is compatible with current training techniques. Our algorithm outperforms standard techniques in terms of computational efficiency and provides increased recognition accuracy in our experiments. We provide results from experiments using both simulated and real-world data sets to verify our algorithm.
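A hedged sketch of the core idea, budgeted online learning where the retained SV subset is chosen stochastically (this illustrates the general approach, not the paper's exact algorithm; the update rule and numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

def rbf(x, z, gamma=1.0):
    return np.exp(-gamma * np.sum((x - z) ** 2))

def predict(x, svs, coefs):
    """Kernel expansion over the retained support vectors."""
    return sum(c * rbf(x, z) for z, c in zip(svs, coefs))

svs, coefs, budget, lr = [], [], 20, 0.5
for _ in range(200):                        # simulated online stream
    x = rng.normal(size=2)
    target = np.sin(x[0]) + x[1]
    err = target - predict(x, svs, coefs)   # kernel LMS-style update
    svs.append(x)
    coefs.append(lr * err)
    if len(svs) > budget:                   # stochastic subset selection:
        p = 1.0 / (np.abs(coefs) + 1e-9)    # drop low-weight SVs more often
        drop = int(rng.choice(len(svs), p=p / p.sum()))
        svs.pop(drop)
        coefs.pop(drop)

print("retained SVs:", len(svs))            # stays at the budget
```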
Computers in general practice: the patient's voice
Potter, A. R.
1981-01-01
Analysis of answers to a questionnaire on the use of computers in general practice showed that 19 per cent of patients in two practices in Staffordshire would be worried if their general practitioner used a computer to store medical records. Twenty-seven per cent of patients would be unwilling to speak frankly about personal matters to their general practitioner if he or she used a computer and 7 per cent said that they would change to another doctor. Fifteen per cent stated that their general practitioner already had information about them that they would not want to be included in a computerized record of their medical history. PMID:7328555
COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS
A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...
Manifold parametrization of the left ventricle for a statistical modelling of its complete anatomy
NASA Astrophysics Data System (ADS)
Gil, D.; Garcia-Barnes, J.; Hernández-Sabate, A.; Marti, E.
2010-03-01
Distortion of Left Ventricle (LV) external anatomy is related to some dysfunctions, such as hypertrophy. The architecture of myocardial fibers determines LV electromechanical activation patterns as well as mechanics. Thus, their joint modelling would allow the design of specific interventions (such as pacemaker implantation and LV remodelling) and therapies (such as resynchronization). On one hand, accurate modelling of external anatomy requires either a dense sampling or a continuous infinite-dimensional approach, which requires non-Euclidean statistics. On the other hand, computation of fiber models requires statistics on Riemannian spaces. Most approaches compute separate statistical models for external anatomy and fiber architecture. In this work we propose a general mathematical framework based on differential geometry concepts for computing a statistical model including both external and fiber anatomy. Our framework provides a continuous approach to external anatomy supporting standard statistics. We also provide a straightforward formula for the computation of the Riemannian fiber statistics. We have applied our methodology to the computation of a complete anatomical atlas of canine hearts from diffusion tensor studies. The orientation of fibers over the average external geometry agrees with the segmental description of orientations reported in the literature.
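As a hedged pointer to the kind of Riemannian statistics involved: the intrinsic (Fréchet) mean of samples $x_1,\dots,x_n$ on a manifold $M$ with geodesic distance $d$ is

\[
\bar{x} \;=\; \operatorname*{arg\,min}_{x \in M} \; \sum_{i=1}^{n} d^{2}(x, x_i),
\]

which reduces to the arithmetic mean in the Euclidean case; fiber orientation statistics are computed with respect to such manifold-valued means rather than coordinate-wise averages.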
PDS: A Performance Database Server
Berry, Michael W.; Dongarra, Jack J.; Larose, Brian H.; ...
1994-01-01
The process of gathering, archiving, and distributing computer benchmark data is a cumbersome task usually performed by computer users and vendors with little coordination. Most important, there is no publicly available central depository of performance data for all ranges of machines from personal computers to supercomputers. We present an Internet-accessible performance database server (PDS) that can be used to extract current benchmark data and literature. As an extension to the X-Windows-based user interface (Xnetlib) to the Netlib archival system, PDS provides an on-line catalog of public domain computer benchmarks such as the LINPACK benchmark, Perfect benchmarks, and the NAS parallel benchmarks. PDS does not reformat or present the benchmark data in any way that conflicts with the original methodology of any particular benchmark; it is thereby devoid of any subjective interpretations of machine performance. We believe that all branches (research laboratories, academia, and industry) of the general computing community can use this facility to archive performance metrics and make them readily available to the public. PDS can provide a more manageable approach to the development and support of a large dynamic database of published performance metrics.
Replication of Space-Shuttle Computers in FPGAs and ASICs
NASA Technical Reports Server (NTRS)
Ferguson, Roscoe C.
2008-01-01
A document discusses the replication of the functionality of the onboard space-shuttle general-purpose computers (GPCs) in field-programmable gate arrays (FPGAs) and application-specific integrated circuits (ASICs). The purpose of the replication effort is to enable utilization of proven space-shuttle flight software and software-development facilities to the extent possible during development of software for flight computers for a new generation of launch vehicles derived from the space shuttles. The replication involves specifying the instruction set of the central processing unit and the input/output processor (IOP) of the space-shuttle GPC in a hardware description language (HDL). The HDL is synthesized to form a "core" processor in an FPGA or, less preferably, in an ASIC. The core processor can be used to create a flight-control card to be inserted into a new avionics computer. The IOP of the GPC as implemented in the core processor could be designed to support data-bus protocols other than that of a multiplexer interface adapter (MIA) used in the space shuttle. Hence, a computer containing the core processor could be tailored to communicate via the space-shuttle GPC bus and/or one or more other buses.
On A Nonlinear Generalization of Sparse Coding and Dictionary Learning.
Xie, Yuchen; Ho, Jeffrey; Vemuri, Baba
2013-01-01
Existing dictionary learning algorithms are based on the assumption that the data are vectors in a Euclidean vector space ℝd, and the dictionary is learned from the training data using the vector space structure of ℝd and its Euclidean L2-metric. However, in many applications, features and data often originate from a Riemannian manifold that does not support a global linear (vector space) structure. Furthermore, the extrinsic viewpoint of existing dictionary learning algorithms becomes inappropriate for modeling and incorporating the intrinsic geometry of the manifold that is potentially important and critical to the application. This paper proposes a novel framework for sparse coding and dictionary learning for data on a Riemannian manifold, and it shows that the existing sparse coding and dictionary learning methods can be considered as special (Euclidean) cases of the more general framework proposed here. We show that both the dictionary and sparse coding can be effectively computed for several important classes of Riemannian manifolds, and we validate the proposed method using two well-known classification problems in computer vision and medical imaging analysis.
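For contrast, the Euclidean special case the paper generalizes is the familiar sparse coding objective

\[
\min_{D,\;w_i}\; \sum_{i} \Big\| x_i - \sum_{j} w_{ij}\, d_j \Big\|_2^2 \;+\; \lambda \sum_i \|w_i\|_1,
\]

with dictionary atoms $d_j$ and sparse codes $w_i$; on a manifold, roughly speaking, the residual norm is replaced by the geodesic distance from $x_i$ to a suitably constrained combination of atoms (a sketch of the generalization, not its exact formulation).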
On A Nonlinear Generalization of Sparse Coding and Dictionary Learning
Xie, Yuchen; Ho, Jeffrey; Vemuri, Baba
2013-01-01
Existing dictionary learning algorithms are based on the assumption that the data are vectors in a Euclidean vector space ℝd, and the dictionary is learned from the training data using the vector space structure of ℝd and its Euclidean L2-metric. However, in many applications, features and data often originate from a Riemannian manifold that does not support a global linear (vector space) structure. Furthermore, the extrinsic viewpoint of existing dictionary learning algorithms becomes inappropriate for modeling and incorporating the intrinsic geometry of the manifold that is potentially important and critical to the application. This paper proposes a novel framework for sparse coding and dictionary learning for data on a Riemannian manifold, and it shows that the existing sparse coding and dictionary learning methods can be considered as special (Euclidean) cases of the more general framework proposed here. We show that both the dictionary and sparse coding can be effectively computed for several important classes of Riemannian manifolds, and we validate the proposed method using two well-known classification problems in computer vision and medical imaging analysis. PMID:24129583
A generalized computer code for developing dynamic gas turbine engine models (DIGTEM)
NASA Technical Reports Server (NTRS)
Daniele, C. J.
1984-01-01
This paper describes DIGTEM (digital turbofan engine model), a computer program that simulates two spool, two stream (turbofan) engines. DIGTEM was developed to support the development of a real time multiprocessor based engine simulator being designed at the Lewis Research Center. The turbofan engine model in DIGTEM contains steady state performance maps for all the components and has control volumes where continuity and energy balances are maintained. Rotor dynamics and duct momentum dynamics are also included. DIGTEM features an implicit integration scheme for integrating stiff systems and trims the model equations to match a prescribed design point by calculating correction coefficients that balance out the dynamic equations. It uses the same coefficients at off design points and iterates to a balanced engine condition. Transients are generated by defining the engine inputs as functions of time in a user written subroutine (TMRSP). Closed loop controls can also be simulated. DIGTEM is generalized in the aerothermodynamic treatment of components. This feature, along with DIGTEM's trimming at a design point, make it a very useful tool for developing a model of a specific turbofan engine.
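The trimming idea can be shown in miniature: pick correction coefficients that zero the dynamic residuals at the design point, then hold them fixed off-design. The two-equation model below is invented purely to illustrate the mechanics:

```python
import numpy as np

def residuals(state, inputs, corrections):
    """Stand-in for DIGTEM's continuity/energy balance residuals."""
    r1 = corrections[0] * state[0] * inputs[0] - state[1]
    r2 = corrections[1] * state[1] - inputs[1]
    return np.array([r1, r2])

design_state = np.array([2.0, 3.0])
design_inputs = np.array([1.5, 4.5])

# solve residuals == 0 for the corrections at the design point
corrections = np.array([
    design_state[1] / (design_state[0] * design_inputs[0]),
    design_inputs[1] / design_state[1],
])
print(residuals(design_state, design_inputs, corrections))  # -> [0. 0.]
```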
A generalized computer code for developing dynamic gas turbine engine models (DIGTEM)
NASA Technical Reports Server (NTRS)
Daniele, C. J.
1983-01-01
This paper describes DIGTEM (digital turbofan engine model), a computer program that simulates two spool, two stream (turbofan) engines. DIGTEM was developed to support the development of a real time multiprocessor based engine simulator being designed at the Lewis Research Center. The turbofan engine model in DIGTEM contains steady state performance maps for all the components and has control volumes where continuity and energy balances are maintained. Rotor dynamics and duct momentum dynamics are also included. DIGTEM features an implicit integration scheme for integrating stiff systems and trims the model equations to match a prescribed design point by calculating correction coefficients that balance out the dynamic equations. It uses the same coefficients at off design points and iterates to a balanced engine condition. Transients are generated by defining the engine inputs as functions of time in a user written subroutine (TMRSP). Closed loop controls can also be simulated. DIGTEM is generalized in the aerothermodynamic treatment of components. This feature, along with DIGTEM's trimming at a design point, make it a very useful tool for developing a model of a specific turbofan engine.
2010-01-01
Background Protein-protein interaction (PPI) plays essential roles in cellular functions. The cost, time and other limitations associated with the current experimental methods have motivated the development of computational methods for predicting PPIs. As protein interactions generally occur via domains instead of the whole molecules, predicting domain-domain interaction (DDI) is an important step toward PPI prediction. Computational methods developed so far have utilized information from various sources at different levels, from primary sequences, to molecular structures, to evolutionary profiles. Results In this paper, we propose a computational method to predict DDI using support vector machines (SVMs), based on domains represented as interaction profile hidden Markov models (ipHMM) where interacting residues in domains are explicitly modeled according to the three dimensional structural information available at the Protein Data Bank (PDB). Features about the domains are extracted first as the Fisher scores derived from the ipHMM and then selected using singular value decomposition (SVD). Domain pairs are represented by concatenating their selected feature vectors, and classified by a support vector machine trained on these feature vectors. The method is tested by leave-one-out cross validation experiments with a set of interacting protein pairs adopted from the 3DID database. The prediction accuracy has shown significant improvement as compared to InterPreTS (Interaction Prediction through Tertiary Structure), an existing method for PPI prediction that also uses the sequences and complexes of known 3D structure. Conclusions We show that domain-domain interaction prediction can be significantly enhanced by exploiting information inherent in the domain profiles via feature selection based on Fisher scores, singular value decomposition and supervised learning based on support vector machines. Datasets and source code are freely available on the web at http://liao.cis.udel.edu/pub/svdsvm. Implemented in Matlab and supported on Linux and MS Windows. PMID:21034480
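A compressed sketch of the pipeline described above, with random stand-ins for the ipHMM Fisher scores and interaction labels (scikit-learn assumed available; dimensions invented):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)
fisher_scores = rng.normal(size=(100, 500))   # 100 domains x 500 raw features

# feature selection via SVD: keep the 20 leading right singular directions
U, s, Vt = np.linalg.svd(fisher_scores, full_matrices=False)
reduced = fisher_scores @ Vt[:20].T

# domain pairs represented by concatenating their reduced feature vectors
pairs = rng.integers(0, 100, size=(200, 2))
X = np.hstack([reduced[pairs[:, 0]], reduced[pairs[:, 1]]])
y = rng.integers(0, 2, size=200)              # stand-in interaction labels

clf = SVC(kernel="rbf").fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```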
Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments
NASA Astrophysics Data System (ADS)
Lane, Peter C. R.; Gobet, Fernand
2013-03-01
Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the 'speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
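The core test behind any non-dominated sorting step is small enough to show directly; here each row of errors holds one candidate parameter set's fit error on each dataset (numbers invented):

```python
import numpy as np

def non_dominated(errors):
    """Indices of parameter sets not dominated on every dataset.

    Set j dominates set i if j is at least as good everywhere and
    strictly better somewhere."""
    keep = []
    for i in range(len(errors)):
        dominated = any(
            np.all(errors[j] <= errors[i]) and np.any(errors[j] < errors[i])
            for j in range(len(errors)) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

errors = np.array([[0.2, 0.9], [0.5, 0.5], [0.9, 0.2], [0.6, 0.6]])
print(non_dominated(errors))  # -> [0, 1, 2]; [0.6, 0.6] is dominated
```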
Survey of computer programs for heat transfer analysis
NASA Astrophysics Data System (ADS)
Noor, A. K.
An overview is presented of the current capabilities of thirty-eight computer programs that can be used for solution of heat transfer problems. These programs range from the large, general-purpose codes with a broad spectrum of capabilities, large user community and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to the small, special-purpose codes with a limited user community, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) It is useful only in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) Since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.
Any Ontological Model of the Single Qubit Stabilizer Formalism must be Contextual
NASA Astrophysics Data System (ADS)
Lillystone, Piers; Wallman, Joel J.
Quantum computers allow us to easily solve some problems classical computers find hard. Non-classical improvements in computational power should be due to some non-classical property of quantum theory. Contextuality, a more general notion of non-locality, is a necessary, but not sufficient, resource for quantum speed-up. Proofs of contextuality can be constructed for the classically simulable stabilizer formalism. Previous proofs of stabilizer contextuality are known for 2 or more qubits, for example the Mermin-Peres magic square. In the work presented we extend these results and prove that any ontological model of the single qubit stabilizer theory must be contextual, as defined by R. Spekkens, and give a relation between our result and the Mermin-Peres square. By demonstrating that contextuality is present in the qubit stabilizer formalism we provide further insight into the contextuality present in quantum theory. Understanding the contextuality of classical sub-theories will allow us to better identify the physical properties of quantum theory required for computational speed up. This research was supported by CIFAR, the Government of Ontario, and the Government of Canada through NSERC and Industry Canada.
Survey of computer programs for heat transfer analysis
NASA Technical Reports Server (NTRS)
Noor, A. K.
1982-01-01
An overview is presented of the current capabilities of thirty-eight computer programs that can be used for solution of heat transfer problems. These programs range from the large, general-purpose codes with a broad spectrum of capabilities, large user community and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to the small, special-purpose codes with a limited user community, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) It is useful only in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) Since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.
From guideline modeling to guideline execution: defining guideline-based decision-support services.
Tu, S. W.; Musen, M. A.
2000-01-01
We describe our task-based approach to defining the guideline-based decision-support services that the EON system provides. We categorize uses of guidelines in patient-specific decision support into a set of generic tasks--making of decisions, specification of work to be performed, interpretation of data, setting of goals, and issuance of alerts and reminders--that can be solved using various techniques. Our model includes constructs required for representing the knowledge used by these techniques. These constructs form a toolkit from which developers can select modeling solutions for guideline tasks. Based on the tasks and the guideline model, we define a guideline-execution architecture and a model of interactions between a decision-support server and clients that invoke services provided by the server. These services use generic interfaces derived from guideline tasks and their associated modeling constructs. We describe two implementations of these decision-support services and discuss how this work can be generalized. We argue that a well-defined specification of guideline-based decision-support services will facilitate sharing of tools that implement computable clinical guidelines. PMID:11080007
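A hedged miniature of the server/client split (guideline content, rule format, and service names are all invented; EON's actual interfaces are richer):

```python
class GuidelineServer:
    """Decision-support server exposing task-derived generic services."""
    def __init__(self, rules):
        self.rules = rules                    # stand-in computable guideline

    def recommend(self, patient):             # "making of decisions" task
        return [r["advice"] for r in self.rules if r["when"](patient)]

    def alerts(self, patient):                # "issuance of alerts" task
        return [r["advice"] for r in self.rules
                if r.get("alert") and r["when"](patient)]

rules = [
    {"when": lambda p: p["sbp"] > 160, "advice": "intensify therapy", "alert": True},
    {"when": lambda p: p["ldl"] > 130, "advice": "consider a statin"},
]
server = GuidelineServer(rules)
print(server.recommend({"sbp": 170, "ldl": 120}))  # -> ['intensify therapy']
print(server.alerts({"sbp": 150, "ldl": 150}))     # -> []
```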
Computer-Supported Feedback Message Tailoring for Healthcare Providers in Malawi: Proof-of-Concept.
Landis-Lewis, Zach; Douglas, Gerald P; Hochheiser, Harry; Kam, Matthew; Gadabu, Oliver; Bwanali, Mwatha; Jacobson, Rebecca S
2015-01-01
Although performance feedback has the potential to help clinicians improve the quality and safety of care, healthcare organizations generally lack knowledge about how this guidance is best provided. In low-resource settings, tools for theory-informed feedback tailoring may enhance limited clinical supervision resources. Our objectives were to establish proof-of-concept for computer-supported feedback message tailoring in Malawi, Africa. We conducted this research in five stages: clinical performance measurement, modeling the influence of feedback on antiretroviral therapy (ART) performance, creating a rule-based message tailoring process, generating tailored messages for recipients, and finally analysis of performance and message tailoring data. We retrospectively generated tailored messages for 7,448 monthly performance reports from 11 ART clinics. We found that tailored feedback could be routinely generated for four guideline-based performance indicators, with 35% of reports having messages prioritized to optimize the effect of feedback. This research establishes proof-of-concept for a novel approach to improving the use of clinical performance feedback in low-resource settings and suggests possible directions for prospective evaluations comparing alternative designs of feedback messages.
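A minimal sketch of the tailoring step: rules match indicator values and emit prioritized messages, and the highest-priority match wins (indicators, thresholds, and wording below are invented, not the study's actual rules):

```python
RULES = [
    {"indicator": "on_time_visits", "below": 0.60, "priority": 3,
     "message": "Many patients are missing ART visits."},
    {"indicator": "on_time_visits", "below": 0.85, "priority": 2,
     "message": "Visit adherence slipped below target."},
    {"indicator": "complete_records", "below": 0.90, "priority": 1,
     "message": "Some monthly reports are incomplete."},
]

def tailor(report):
    """Return the highest-priority message matching this monthly report."""
    matches = [r for r in RULES if report.get(r["indicator"], 1.0) < r["below"]]
    return max(matches, key=lambda r: r["priority"])["message"] if matches else None

print(tailor({"on_time_visits": 0.70, "complete_records": 0.95}))
# -> 'Visit adherence slipped below target.'
```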
Raney Distributions and Random Matrix Theory
NASA Astrophysics Data System (ADS)
Forrester, Peter J.; Liu, Dang-Zheng
2015-03-01
Recent works have shown that the family of probability distributions with moments given by the Fuss-Catalan numbers permit a simple parameterized form for their density. We extend this result to the Raney distribution which by definition has its moments given by a generalization of the Fuss-Catalan numbers. Such computations begin with an algebraic equation satisfied by the Stieltjes transform, which we show can be derived from the linear differential equation satisfied by the characteristic polynomial of random matrix realizations of the Raney distribution. For the Fuss-Catalan distribution, an equilibrium problem characterizing the density is identified. The Stieltjes transform for the limiting spectral density of the singular values squared of the matrix product formed from inverse standard Gaussian matrices, and standard Gaussian matrices, is shown to satisfy a variant of the algebraic equation relating to the Raney distribution. Supported on , we show that it too permits a simple functional form upon the introduction of an appropriate choice of parameterization. As an application, the leading asymptotic form of the density as the endpoints of the support are approached is computed, and is shown to have some universal features.
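For reference, the Fuss-Catalan numbers and the Raney generalization of the moment sequence are, in the standard parameterization (stated here as a sketch),

\[
FC_p(n) \;=\; \frac{1}{pn+1}\binom{pn+1}{n},
\qquad
R_{p,r}(n) \;=\; \frac{r}{pn+r}\binom{pn+r}{n},
\]

so that $R_{p,1}(n) = FC_p(n)$.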
Computer-Supported Collaborative Learning in Higher Education
ERIC Educational Resources Information Center
Roberts, Tim, Ed.
2005-01-01
"Computer-Supported Collaborative Learning in Higher Education" provides a resource for researchers and practitioners in the area of computer-supported collaborative learning (also known as CSCL); particularly those working within a tertiary education environment. It includes articles of relevance to those interested in both theory and practice in…
A general purpose subroutine for fast fourier transform on a distributed memory parallel machine
NASA Technical Reports Server (NTRS)
Dubey, A.; Zubair, M.; Grosch, C. E.
1992-01-01
One issue which is central in developing a general purpose Fast Fourier Transform (FFT) subroutine on a distributed memory parallel machine is the data distribution. It is possible that different users would like to use the FFT routine with different data distributions. Thus, there is a need to design FFT schemes on distributed memory parallel machines which can support a variety of data distributions. An FFT implementation on a distributed memory parallel machine which works for a number of data distributions commonly encountered in scientific applications is presented. The problem of rearranging the data after computing the FFT is also addressed. The performance of the implementation on a distributed memory parallel machine Intel iPSC/860 is evaluated.
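The distribution question is just a map from global index to owning processor; block and cyclic, two layouts common in scientific codes, look like this (a sketch, with the FFT itself simulated on one machine by numpy rather than on the iPSC/860):

```python
import numpy as np

def block_owner(i, N, P):
    """Contiguous blocks of N//P elements (assumes P divides N)."""
    return i // (N // P)

def cyclic_owner(i, N, P):
    """Round-robin assignment of elements to processors."""
    return i % P

N, P = 16, 4
print("block :", [block_owner(i, N, P) for i in range(N)])
print("cyclic:", [cyclic_owner(i, N, P) for i in range(N)])

# a distribution-aware FFT routine transforms the data and then must
# rearrange the result back into the caller's chosen distribution
x = np.arange(N, dtype=float)
print(np.allclose(np.fft.ifft(np.fft.fft(x)).real, x))  # -> True
```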
Generalizing the extensibility of a dynamic geometry software
NASA Astrophysics Data System (ADS)
Herceg, Đorđe; Radaković, Davorka; Herceg, Dejana
2012-09-01
Plug-and-play visual components in a Dynamic Geometry Software (DGS) enable development of visually attractive, rich and highly interactive dynamic drawings. We are developing SLGeometry, a DGS that contains a custom programming language, a computer algebra system (CAS engine) and a graphics subsystem. The basic extensibility framework of SLGeometry supports dynamic addition of new functions from attribute-annotated classes that implement runtime metadata registration in code. We present a general plug-in framework for dynamic importing of arbitrary Silverlight user interface (UI) controls into SLGeometry at runtime. The CAS engine maintains a metadata storage that describes each imported visual component and enables two-way communication between the expressions stored in the engine and the UI controls on the screen.
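SLGeometry itself targets Silverlight/.NET, but the registration pattern is easy to mimic; here is a Python stand-in where a decorator plays the role of the attribute annotation and a dictionary plays the metadata storage (all names invented):

```python
FUNCTION_REGISTRY = {}

def cas_function(name, arity):
    """Decorator standing in for an attribute annotation: registers the
    function and its metadata with the engine at definition time."""
    def register(fn):
        FUNCTION_REGISTRY[name] = {"impl": fn, "arity": arity}
        return fn
    return register

@cas_function("Midpoint", arity=2)
def midpoint(a, b):
    return tuple((ai + bi) / 2 for ai, bi in zip(a, b))

# the engine can now discover and invoke the plug-in at runtime
meta = FUNCTION_REGISTRY["Midpoint"]
print(meta["arity"], meta["impl"]((0, 0), (4, 2)))  # -> 2 (2.0, 1.0)
```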
Osman, Magda; Wiegmann, Alex
2017-03-01
In this review we make a simple theoretical argument: for theory development, computational modeling, and general frameworks for understanding moral psychology, researchers should build on domain-general principles from reasoning, judgment, and decision-making research. Our approach is radical with respect to typical models in moral psychology, which tend to propose complex innate moral grammars and even evolutionarily guided moral principles. In support of our argument, we show that a simple value-based decision model can capture a range of core moral behaviors. Crucially, we propose that moral situations per se do not require anything specialized or different from other situations in which we have to make decisions, inferences, and judgments in order to figure out how to act.
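The kind of domain-general model the review points to can be stated in a few lines: score each option's subjective value, then choose via a softmax. Option names and value components below are invented for illustration:

```python
import numpy as np

def choice_probabilities(values, temperature=1.0):
    """Softmax over option values (numerically stabilized)."""
    v = np.asarray(values, dtype=float) / temperature
    e = np.exp(v - v.max())
    return e / e.sum()

# a "moral" dilemma scored like any other decision: harm avoided,
# personal cost, and norm compliance folded into one value per option
options = {"intervene": 2.0 - 0.5 + 1.0, "do nothing": 0.0}
p = choice_probabilities(list(options.values()))
print(dict(zip(options, np.round(p, 3))))
```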
First Principles Study of Chemically Functionalized Graphene
NASA Astrophysics Data System (ADS)
Jha, Sanjiv; Vasiliev, Igor
2015-03-01
The electronic, structural and vibrational properties of carbon nanomaterials can be affected by chemical functionalization. We applied ab initio computational methods based on density functional theory to study the covalent functionalization of graphene with benzyne, carboxyl groups and tetracyanoethylene oxide (TCNEO). Our calculations were carried out using the SIESTA and Quantum-ESPRESSO electronic structure codes combined with the local density and generalized gradient approximations for the exchange correlation functional and norm-conserving Troullier-Martins pseudopotentials. The simulated Raman and infrared spectra of graphene functionalized with carboxyl groups and TCNEO were consistent with the available experimental results. The computed vibrational spectra of graphene functionalized with carboxyl groups showed that the presence of point defects near the functionalization site affects the Raman and infrared spectroscopic signatures of functionalized graphene. Supported by NSF CHE-1112388.
NASA Technical Reports Server (NTRS)
Marsh, J. G.; Vincent, S.; Mcclinton, A. T.; Chang, E. S.
1975-01-01
A detailed gravimetric geoid was computed for the Northwest Atlantic Ocean and Caribbean Sea area in support of the calibration and evaluation of the GEOS-C altimeter. This geoid, computed on a 15' x 15' grid, was based upon a combination of surface gravity data with the GSFC GEM-6 satellite-derived gravity data. A comparison of this gravimetric geoid with 10 passes of SKYLAB altimeter data is presented. The agreement of the two data types is quite good, with the differences generally less than 2 meters. Sea surface manifestations of numerous short-wavelength (approximately 100 km) oceanographic features are now indicated in the gravimetric geoid and are also confirmed by the altimetry data.
Computational Intelligence Techniques for Tactile Sensing Systems
Gastaldo, Paolo; Pinna, Luigi; Seminara, Lucia; Valle, Maurizio; Zunino, Rodolfo
2014-01-01
Tactile sensing helps robots interact with humans and objects effectively in real environments. Piezoelectric polymer sensors provide the functional building blocks of the robotic electronic skin, mainly thanks to their flexibility and suitability for detecting dynamic contact events and for recognizing the touch modality. The paper focuses on the ability of tactile sensing systems to support the challenging recognition of certain qualities/modalities of touch. The research applies novel computational intelligence techniques and a tensor-based approach for the classification of touch modalities; its main results consist in providing a procedure to enhance system generalization ability and architecture for multi-class recognition applications. An experimental campaign involving 70 participants using three different modalities in touching the upper surface of the sensor array was conducted, and confirmed the validity of the approach. PMID:24949646
Prediction of ball and roller bearing thermal and kinematic performance by computer analysis
NASA Technical Reports Server (NTRS)
Pirvics, J.; Kleckner, R. J.
1983-01-01
Characteristics of good computerized analysis software are suggested. These general remarks and an overview of representative software precede a more detailed discussion of load support system analysis program structure. Particular attention is directed at a recent cylindrical roller bearing analysis as an example of the available design tools. Selected software modules are then examined to reveal the detail inherent in contemporary analysis. This leads to a brief section on current design computation which seeks to suggest when and why computerized analysis is warranted. An example concludes the argument offered for such design methodology. Finally, remarks are made concerning needs for model development to address effects which are now considered to be secondary but are anticipated to emerge to primary status in the near future.
NASA Astrophysics Data System (ADS)
Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.
2005-12-01
We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general purpose framework for building archival data services, real time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon on our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.
Solar electric propulsion thrust subsystem development
NASA Technical Reports Server (NTRS)
Masek, T. D.
1973-01-01
The Solar Electric Propulsion System developed under this program was designed to demonstrate all the thrust subsystem functions needed on an unmanned planetary vehicle. The demonstration included operation of the basic elements; power matching; input and output voltage regulation; three-axis thrust vector control; subsystem automatic control, including failure detection and correction capability (using a PDP-11 computer); operation of critical elements in thermal-vacuum, zero-gravity-type propellant storage; and data outputs from all subsystem elements. The subsystem elements, functions, unique features, and test setup are described. General features and capabilities of the test-support data system are also presented. The test program culminated in a 1500-h computer-controlled, system-functional demonstration, which included simultaneous operation of two thruster/power conditioner sets. The results of this testing phase satisfied all the program goals.
Computational intelligence techniques for tactile sensing systems.
Gastaldo, Paolo; Pinna, Luigi; Seminara, Lucia; Valle, Maurizio; Zunino, Rodolfo
2014-06-19
Tactile sensing helps robots interact with humans and objects effectively in real environments. Piezoelectric polymer sensors provide the functional building blocks of the robotic electronic skin, mainly thanks to their flexibility and suitability for detecting dynamic contact events and for recognizing the touch modality. The paper focuses on the ability of tactile sensing systems to support the challenging recognition of certain qualities/modalities of touch. The research applies novel computational intelligence techniques and a tensor-based approach for the classification of touch modalities; its main results consist in providing a procedure to enhance system generalization ability and architecture for multi-class recognition applications. An experimental campaign involving 70 participants using three different modalities in touching the upper surface of the sensor array was conducted, and confirmed the validity of the approach.
Data Structures for Extreme Scale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahan, Simon
As computing problems of national importance grow, the government meets the increased demand by funding the development of ever larger systems. The overarching goal of the work supported in part by this grant is to increase the efficiency of programming and performing computations on these large computing systems. In past work, we have demonstrated that some of these computations, once thought to require expensive hardware designs and/or complex, special-purpose programming, may be executed efficiently on low-cost commodity cluster computing systems using a general-purpose “latency-tolerant” programming framework. One important application of the ideas underlying this framework is graph database technology supporting social network pattern matching, used by US intelligence agencies to more quickly identify potential terrorist threats. This database application has been spun out by the Pacific Northwest National Laboratory, a Department of Energy laboratory, into a commercial start-up, Trovares Inc. We explore an alternative application of the same underlying ideas to a well-studied challenge arising in engineering: solving unstructured sparse linear equations. Solving these equations is key to predicting the behavior of large electronic circuits before they are fabricated. Predicting that behavior ahead of fabrication means that designs can be optimized and errors corrected ahead of the expense of manufacture.
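To make the target problem class concrete, here is a minimal sketch of solving a small unstructured sparse linear system with SciPy. The matrix values are illustrative placeholders, not a real circuit netlist, and nothing here reflects the grant's latency-tolerant framework itself:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import spsolve

# Assemble an unstructured sparse matrix in coordinate form, the
# shape of problem that circuit simulation produces (values assumed).
row = np.array([0, 0, 1, 1, 1, 2, 2])
col = np.array([0, 1, 0, 1, 2, 1, 2])
val = np.array([4.0, -1.0, -1.0, 4.0, -1.0, -1.0, 4.0])
A = csr_matrix((val, (row, col)), shape=(3, 3))
b = np.array([1.0, 2.0, 3.0])

x = spsolve(A, b)   # direct sparse solve
print(x)
```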
A comparison of acceleration methods for solving the neutron transport k-eigenvalue problem
NASA Astrophysics Data System (ADS)
Willert, Jeffrey; Park, H.; Knoll, D. A.
2014-10-01
Over the past several years, a number of papers have been written describing modern techniques for numerically computing the dominant eigenvalue of the neutron transport criticality problem. These methods fall into two distinct categories. The first category of methods rewrites the multi-group k-eigenvalue problem as a nonlinear system of equations and solves the resulting system using either a Jacobian-Free Newton-Krylov (JFNK) method or Nonlinear Krylov Acceleration (NKA), a variant of Anderson Acceleration. These methods are generally successful in significantly reducing the number of transport sweeps required to compute the dominant eigenvalue. The second category of methods utilizes Moment-Based Acceleration (or High-Order/Low-Order (HOLO) Acceleration). These methods solve a sequence of modified diffusion eigenvalue problems whose solutions converge to the solution of the original transport eigenvalue problem. This second class of methods is, in our experience, always superior to the first, as most of the computational work is eliminated by the acceleration from the LO diffusion system. In this paper, we review each of these methods. Our computational results support our claim that the choice of which nonlinear solver to use, JFNK or NKA, should be secondary. The primary computational savings result from the implementation of a HOLO algorithm. We display computational results for a series of challenging multi-dimensional test problems.
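For context, the baseline that both categories accelerate is unaccelerated power (source) iteration, where each pass corresponds to one transport sweep. A minimal generic sketch follows; the operator, tolerance, and test matrix are assumptions for illustration, not the paper's multi-group transport implementation:

```python
import numpy as np

def power_iteration(A, tol=1e-10, max_iter=10000):
    """Unaccelerated power iteration for the dominant eigenpair of A.

    Each iteration plays the role of one 'transport sweep' in the
    k-eigenvalue setting; acceleration schemes (JFNK, NKA, HOLO)
    aim to cut the number of such sweeps."""
    n = A.shape[0]
    x = np.ones(n) / np.sqrt(n)      # initial flux guess
    k = 1.0                          # initial eigenvalue guess
    for _ in range(max_iter):
        y = A @ x                    # one sweep: apply the operator
        k_new = np.linalg.norm(y)    # eigenvalue estimate from the iterate
        y /= k_new                   # renormalize
        if abs(k_new - k) < tol:
            return k_new, y
        x, k = y, k_new
    return k, x

# Usage: dominant eigenvalue of a small test matrix (expect about 3.618)
A = np.array([[2.0, 1.0], [1.0, 3.0]])
k_eff, flux = power_iteration(A)
print(k_eff)
```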
The Relationship between Computational Fluency and Student Success in General Studies Mathematics
ERIC Educational Resources Information Center
Hegeman, Jennifer; Waters, Gavin
2012-01-01
Many developmental mathematics programs emphasize computational fluency with the assumption that this is a necessary contributor to student success in general studies mathematics. In an effort to determine which skills are most essential, scores on a computational fluency test were correlated with student success in general studies mathematics at…
FPGA-Based, Self-Checking, Fault-Tolerant Computers
NASA Technical Reports Server (NTRS)
Some, Raphael; Rennels, David
2004-01-01
A proposed computer architecture would exploit the capabilities of commercially available field-programmable gate arrays (FPGAs) to enable computers to detect and recover from bit errors. The main purpose of the proposed architecture is to enable fault-tolerant computing in the presence of single-event upsets (SEUs). [An SEU is a spurious bit flip (also called a soft error) caused by a single impact of ionizing radiation.] The architecture would also enable recovery from some soft errors caused by electrical transients and, to some extent, from intermittent and permanent (hard) errors caused by aging of electronic components. A typical FPGA of the current generation contains one or more complete processor cores, memories, and high-speed serial input/output (I/O) channels, making it possible to shrink a board-level processor node to a single integrated-circuit chip. Custom, highly efficient microcontrollers, general-purpose computers, custom I/O processors, and signal processors can be rapidly and efficiently implemented by use of FPGAs. Unfortunately, FPGAs are susceptible to SEUs. Prior efforts to mitigate the effects of SEUs have yielded solutions that degrade performance of the system and require support from external hardware and software. In comparison with other fault-tolerant-computing architectures (e.g., triple modular redundancy), the proposed architecture could be implemented with less circuitry and lower power demand. Moreover, the fault-tolerant computing functions would require only minimal support from circuitry outside the central processing units (CPUs) of computers, would not require any software support, and would be largely transparent to software and to other computer hardware. There would be two types of modules: a self-checking processor module and a memory system. The self-checking processor module would be implemented on a single FPGA and would be capable of detecting its own internal errors. It would contain two CPUs executing identical programs in lock step, with comparison of their outputs to detect errors. It would also contain various cache and local memory circuits, communication circuits, and configurable special-purpose processors that would use self-checking checkers. (The basic principle of the self-checking checker method is to utilize logic circuitry that generates error signals whenever there is an error in either the checker or the circuit being checked.) The memory system would comprise a main memory and a hardware-controlled check-pointing system (CPS) based on a buffer memory denoted the recovery cache. The main memory would contain random-access memory (RAM) chips and FPGAs that would, in addition to everything else, implement double-error-detecting and single-error-correcting memory functions to enable recovery from single-bit errors.
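As a toy illustration of the self-checking pair concept (two CPUs in lock step with output comparison), the following sketch models the comparator in software. The function names and the simulated bit flip are invented for illustration and say nothing about the actual FPGA logic:

```python
def self_checking_pair(cpu_a, cpu_b, program_input):
    """Run two replicas of the same deterministic computation in
    'lock step' and compare their outputs; a mismatch raises an
    error signal so recovery (e.g., rollback to a checkpoint) can begin."""
    out_a = cpu_a(program_input)
    out_b = cpu_b(program_input)
    if out_a != out_b:
        raise RuntimeError("SEU detected: lock-step outputs disagree")
    return out_a

# Usage: a bit flip in one replica is caught by the comparator
good = lambda x: x * 2
flipped = lambda x: (x * 2) ^ 0b100   # simulated single-event upset
print(self_checking_pair(good, good, 21))    # prints 42
try:
    self_checking_pair(good, flipped, 21)
except RuntimeError as e:
    print(e)
```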
Planning: supporting and optimizing clinical guidelines execution.
Anselma, Luca; Montani, Stefania
2008-01-01
A crucial feature of computerized clinical guidelines (CGs) lies in the fact that they may be used not only as conventional documents (as if they were just free text) describing general procedures that users have to follow. In fact, thanks to a description of their actions and control flow in some semiformal representation language, CGs can also take advantage of Computer Science methods and Information Technology infrastructures and techniques, to become executable documents, in the sense that they may support clinical decision making and clinical procedure execution. In order to reach this goal, some advanced planning techniques, originally developed within the Artificial Intelligence (AI) community, may be (at least partially) resorted to, after a proper adaptation to specific CG needs has been carried out.
Simple systems that exhibit self-directed replication
NASA Technical Reports Server (NTRS)
Reggia, James A.; Armentrout, Steven L.; Chou, Hui-Hsien; Peng, Yun
1993-01-01
Biological experience and intuition suggest that self-replication is an inherently complex phenomenon, and early cellular automata models support that conception. More recently, simpler computational models of self-directed replication called sheathed loops have been developed. It is shown here that 'unsheathing' these structures and altering certain assumptions about the symmetry of their components leads to a family of nontrivial self-replicating structures, some substantially smaller and simpler than those previously reported. The dependence of replication time and transition function complexity on initial structure size, cell state symmetry, and neighborhood is examined. These results support the view that self-replication is not an inherently complex phenomenon but rather an emergent property arising from local interactions in systems that can be much simpler than is generally believed.
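For readers unfamiliar with the machinery, the sketch below shows a generic synchronous cellular-automaton update over a von Neumann neighborhood, the setting in which these replicating loops live. The rule-table interface is an assumption for illustration; the paper's actual transition functions are not reproduced here:

```python
import numpy as np

def ca_step(grid, rule):
    """One synchronous update of a 2-D cellular automaton with a von
    Neumann neighborhood (cell plus 4 orthogonal neighbors, wrapping
    at the edges). `rule` maps a (center, up, right, down, left) state
    tuple to the next state; unlisted neighborhoods keep their state."""
    h, w = grid.shape
    nxt = np.zeros_like(grid)
    for i in range(h):
        for j in range(w):
            nbhd = tuple(int(v) for v in (
                grid[i, j],
                grid[(i - 1) % h, j], grid[i, (j + 1) % w],
                grid[(i + 1) % h, j], grid[i, (j - 1) % w]))
            nxt[i, j] = rule.get(nbhd, nbhd[0])  # default: state persists
    return nxt

# Usage: a trivial rule that grows state 1 into empty cells above it
grid = np.zeros((5, 5), dtype=int); grid[2, 2] = 1
rule = {(0, 0, 0, 1, 0): 1}   # assumed toy rule, not from the paper
print(ca_step(grid, rule))
```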
ɛ-subgradient algorithms for bilevel convex optimization
NASA Astrophysics Data System (ADS)
Helou, Elias S.; Simões, Lucas E. A.
2017-05-01
This paper introduces and studies the convergence properties of a new class of explicit ɛ-subgradient methods for the task of minimizing a convex function over a set of minimizers of another convex minimization problem. The general algorithm specializes to some important cases, such as first-order methods applied to a varying objective function, which have computationally cheap iterations. We present numerical experimentation concerning certain applications where the theoretical framework encompasses efficient algorithmic techniques, enabling the use of the resulting methods to solve very large practical problems arising in tomographic image reconstruction. ES Helou was supported by FAPESP grants 2013/07375-0 and 2013/16508-3 and CNPq grant 311476/2014-7. LEA Simões was supported by FAPESP grants 2011/02219-4 and 2013/14615-7.
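As a rough illustration of the problem class (minimizing one convex function over the set of minimizers of another), here is a toy explicit scheme that descends on the lower-level objective while injecting a vanishing pull from the upper-level one. The step-size schedules and test functions are assumptions; this is not the paper's ɛ-subgradient algorithm:

```python
import numpy as np

def bilevel_subgradient(sub_f, sub_g, x0, n_iter=5000):
    """Generic explicit bilevel iteration: take subgradient steps on the
    lower-level objective g while a vanishing weight on the upper-level
    objective f steers the iterate toward the preferred minimizer of g.
    sub_f and sub_g return subgradients at x."""
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        alpha = 1.0 / k           # diminishing step size (assumed schedule)
        eps = 1.0 / np.sqrt(k)    # vanishing weight on the upper objective
        x = x - alpha * (sub_g(x) + eps * sub_f(x))
    return x

# Usage: minimize f(x) = (x - 2)^2 over argmin of g(x) = max(|x| - 1, 0).
# Minimizers of g form [-1, 1], so the iterate should settle near 1.
sub_g = lambda x: np.sign(x) * (np.abs(x) > 1.0)
sub_f = lambda x: 2.0 * (x - 2.0)
print(bilevel_subgradient(sub_f, sub_g, x0=np.array([5.0])))
```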
NASA Technical Reports Server (NTRS)
Sellers, William L., III; Dwoyer, Douglas L.
1992-01-01
The design of a hypersonic aircraft poses unique challenges to the engineering community. Problems with duplicating flight conditions in ground-based facilities have made performance predictions risky. Computational fluid dynamics (CFD) has been proposed as an additional means of providing design data. At the present time, CFD codes are being validated based on sparse experimental data and then used to predict performance at flight conditions with generally unknown levels of uncertainty. This paper will discuss the facility and measurement techniques that are required to support CFD development for the design of hypersonic aircraft. Illustrations are given of recent success in combining experimental and direct numerical simulation in CFD model development and validation for hypersonic perfect gas flows.
Yaghmaie, Farideh; Jayasuriya, Rohan
2004-01-01
There have been many changes made to information systems in the last decade. Changes in information systems require users constantly to update their computer knowledge and skills. Computer training is a critical issue for any user because it equips them with substantial new skills. The purpose of this study was to measure the effects of 'subjective computer training' and management support on attitudes to computers, computer anxiety and subjective norms to use computers. The data were collected from community health centre staff. The results of the study showed that health staff trained in computer use had more favourable attitudes to computers, less computer anxiety and more awareness of others' expectations about computer use than untrained users. However, there was no relationship between management support and computer attitude, computer anxiety or subjective norms. The lack of computer training for the majority of healthcare staff confirmed the need for more attention to this issue, particularly in health centres.
A Framework for a Computer System to Support Distributed Cooperative Learning
ERIC Educational Resources Information Center
Chiu, Chiung-Hui
2004-01-01
To develop a computer system to support cooperative learning among distributed students, developers should consider the foundations of cooperative learning. This article examines the basic elements that make cooperation work and proposes a framework for such computer supported cooperative learning (CSCL) systems. This framework is constituted of…
Instructional Support Software System. Final Report.
ERIC Educational Resources Information Center
McDonnell Douglas Astronautics Co. - East, St. Louis, MO.
This report describes the development of the Instructional Support System (ISS), a large-scale, computer-based training system that supports both computer-assisted instruction and computer-managed instruction. Written in the Ada programming language, the ISS software package is designed to be machine independent. It is also grouped into functional…
Flexibility in Macro-Scripts for Computer-Supported Collaborative Learning
ERIC Educational Resources Information Center
Dillenbourg, P.; Tchounikine, P.
2007-01-01
In the field of computer-supported collaborative learning (CSCL), scripts are designed to support collaboration among distant learners or co-present learners whose interactions are (at least partially) mediated by a computer. The rationale of scripts is to structure collaborative learning processes in order to trigger group interactions that may…
ERIC Educational Resources Information Center
Engelmann, Tanja; Tergan, Sigmar-Olaf; Hesse, Friedrich W.
2010-01-01
Computer-supported collaboration by spatially distributed group members still involves interaction problems within the group. This article presents an empirical study investigating the question of whether computer-supported collaborative problem solving by spatially distributed group members can be fostered by evoking knowledge and information…
Students' Activity in Computer-Supported Collaborative Problem Solving in Mathematics
ERIC Educational Resources Information Center
Hurme, Tarja-riitta; Jarvela, Sanna
2005-01-01
The purpose of this study was to analyse secondary school students' (N = 16) computer-supported collaborative mathematical problem solving. The problem addressed in the study was: What kinds of metacognitive processes appear during computer-supported collaborative learning in mathematics? Another aim of the study was to consider the applicability…
ERIC Educational Resources Information Center
Berkant, Hasan Güner
2016-01-01
This study investigates faculty of education students' computer self-efficacy beliefs and their attitudes towards computers and implementing computer supported education. This study is descriptive and based on a correlational survey model. The final sample consisted of 414 students studying in the faculty of education of a Turkish university. The…
ERIC Educational Resources Information Center
Vekiri, Ioanna; Chronaki, Anna
2008-01-01
In this study, we examined relations between outside school computer experiences, perceived social support for using computers, and self-efficacy and value beliefs about computer learning for 340 Greek elementary school boys and girls. Participants responded to a questionnaire about their access to computer use outside school (e.g. frequency of…
Computer-Based Indexing on a Small Scale: Bibliography.
ERIC Educational Resources Information Center
Douglas, Kimberly; Wismer, Don
The 131 references on small-scale computer-based indexing cited in this bibliography are subdivided as follows: general, general (computer), index structure, microforms, specific systems, KWIC/KWAC/KWOC, and thesauri. (RAA)
On the generalized VIP time integral methodology for transient thermal problems
NASA Technical Reports Server (NTRS)
Mei, Youping; Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong
1993-01-01
The paper describes the development and applicability of a generalized VIrtual-Pulse (VIP) time integral method of computation for thermal problems. With the advent of high-speed computing technology and the growing importance of parallel computation for efficient use of computing environments, a major motivation for the developments described in this paper is the need for explicit computational procedures with improved accuracy and stability characteristics, in contrast to past approaches for general heat transfer computations. As a consequence, a new and effective VIP methodology is described which inherits these improved characteristics. Numerical illustrative examples are provided to demonstrate the developments and validate the results obtained for thermal problems.
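For context only, the sketch below shows the kind of explicit transient thermal update such methods compete with: a forward step for 1-D conduction with its usual stability restriction. It is a generic illustration, not the VIP method, and all values are assumed:

```python
import numpy as np

def explicit_heat_step(T, alpha, dx, dt):
    """One explicit time-integration step for 1-D transient conduction,
    T_t = alpha * T_xx, with fixed end temperatures. Stability of this
    plain explicit update requires dt <= dx**2 / (2 * alpha)."""
    lam = alpha * dt / dx**2
    T_new = T.copy()
    T_new[1:-1] = T[1:-1] + lam * (T[2:] - 2 * T[1:-1] + T[:-2])
    return T_new

# Usage: relax an initial temperature spike (dt chosen inside the
# stability limit dx**2 / (2 * alpha) = 0.005)
T = np.zeros(51); T[25] = 100.0
for _ in range(200):
    T = explicit_heat_step(T, alpha=1.0, dx=0.1, dt=0.004)
print(T.max())
```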
A Parallel Processing Algorithm for Remote Sensing Classification
NASA Technical Reports Server (NTRS)
Gualtieri, J. Anthony
2005-01-01
A current thread in parallel computation is the use of cluster computers created by networking a few to thousands of commodity general-purpose workstation-level computers using the Linux operating system. For example, on the Medusa cluster at NASA/GSFC, this provides supercomputing performance, 130 Gflops (Linpack benchmark), at moderate cost, $370K. However, to be useful for scientific computing in the area of Earth science, issues of ease of programming, access to existing scientific libraries, and portability of existing code need to be considered. In this paper, I address these issues in the context of tools for rendering earth science remote sensing data into useful products. In particular, I focus on a problem that can be decomposed into a set of independent tasks, which on a serial computer would be performed sequentially, but with a cluster computer can be performed in parallel, giving an obvious speedup. To make the ideas concrete, I consider the problem of classifying hyperspectral imagery where some ground truth is available to train the classifier. In particular, I will use the Support Vector Machine (SVM) approach as applied to hyperspectral imagery. The approach will be to introduce notions about parallel computation and then to restrict the development to the SVM problem. Pseudocode (an outline of the computation) will be described and then details specific to the implementation will be given. Then timing results will be reported to show what speedups are possible using parallel computation. The paper will close with a discussion of the results.
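A minimal sketch of the decomposition described above: train an SVM on labeled pixels, split the image into independent blocks, and classify the blocks in parallel. Local worker processes stand in for cluster nodes, and the random data, class count, and block count are assumptions:

```python
import numpy as np
from multiprocessing import Pool
from sklearn.svm import SVC

def classify_block(args):
    """Classify one block of pixels with a pre-trained SVM. Blocks are
    independent tasks, so they map naturally onto parallel workers."""
    model, block = args
    return model.predict(block)

if __name__ == "__main__":
    # Assumed stand-ins for hyperspectral data: 200-band pixels, a small
    # ground-truth training set, and a larger unlabeled image to classify.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(500, 200))
    y_train = rng.integers(0, 4, size=500)       # 4 ground-cover classes
    image_pixels = rng.normal(size=(20000, 200))

    model = SVC(kernel="rbf").fit(X_train, y_train)

    # Decompose into independent tasks and classify blocks in parallel
    blocks = np.array_split(image_pixels, 8)
    with Pool(processes=8) as pool:
        labels = np.concatenate(
            pool.map(classify_block, [(model, b) for b in blocks]))
    print(labels.shape)
```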
NASA Technical Reports Server (NTRS)
Kunz, Robert F.
2014-01-01
This document represents the evolving formal documentation of the NPHASE-PSU computer code. Version 3.15 is being delivered along with the software to NASA in 2013. Significant upgrades to NPHASE-PSU have been made since the first delivery of draft documentation to DARPA and USNRC in 2006. These include a much lighter, faster, and more memory-efficient face-based front end; support for arbitrary polyhedra in the front end, flow solver, and back end; a generalized homogeneous multiphase capability; and several two-fluid modelling and algorithmic elements. Specific capabilities installed for the NASA Gearbox Windage Aerodynamics NRA are included in this version: the Hybrid Immersed Overset Boundary Method (HOIBM) [Noack et al. (2009)]; periodic boundary conditions for multiple frames of reference; a fully generalized immersed boundary method; fully generalized conjugate heat transfer; droplet deposition, bouncing, and splashing models; and film transport and breakup.
Pinquart, Martin
2014-05-01
The objective of the meta-analysis is to integrate available results on associations of general parenting (not specific to feeding and activity promotion) and parent-child relations with child weight status, eating, and physical activity. Searching in electronic databases and cross-referencing identified 156 empirical studies. Random-effects meta-analysis was computed. A positive parent-child relationship and higher levels of parental responsiveness were associated with lower weight, healthier eating, and more physical activity of the child. Parental demandingness, overprotection, psychological control, inconsistency, and parenting styles showed associations with some of the assessed outcome variables. Most effect sizes were small and varied by study characteristics. The small effects do not support making general parenting styles, parental demandingness, responsiveness, and the quality of the parent-child relationship a main target of preventing and treating obesity. Reducing parental inconsistency may be a better target if available results are replicated in future studies.
16 CFR 1115.14 - Time computations.
Code of Federal Regulations, 2012 CFR
2012-01-01
§ 1115.14 Time computations (16 CFR, Commercial Practices, Product Hazard Reports, General Interpretation). (a) General. Weekends and holidays are excluded from the computation of the time periods in this part. (b) Imputing knowledge. In...
16 CFR 1115.14 - Time computations.
Code of Federal Regulations, 2014 CFR
2014-01-01
§ 1115.14 Time computations (16 CFR, Commercial Practices, Product Hazard Reports, General Interpretation). (a) General. Weekends and holidays are excluded from the computation of the time periods in this part. (b) Imputing knowledge. In...
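A small sketch of the counting rule in paragraph (a), excluding weekends and holidays from a time period. The holiday calendar and dates are caller-supplied assumptions, not part of the regulation text:

```python
from datetime import date, timedelta

def add_business_days(start, days, holidays=frozenset()):
    """Advance `days` countable days from `start`, skipping weekends and
    any dates in `holidays`, mirroring the exclusion rule in (a)."""
    current = start
    remaining = days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5 and current not in holidays:
            remaining -= 1
    return current

# Usage: a 10-day period starting Friday 2024-07-05 with no holidays
# supplied ends on 2024-07-19.
print(add_business_days(date(2024, 7, 5), 10))
```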
Computer Maintenance Operations Center (CMOC), additional computer support equipment - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the needed understanding of application behavior at this scale, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools. This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.
Computers in medicine: patients' attitudes
Cruickshank, P. J.
1984-01-01
Data are presented from two surveys where a 26-item questionnaire was used to measure patients' attitudes to diagnostic computers and to medical computers in general. The first group of respondents were 229 patients who had been given outpatient appointments at a hospital general medical clinic specializing in gastrointestinal problems, where some had experienced a diagnostic computer in use. The second group of respondents were 416 patients attending a group general practice where there was no computer. Patients who had experience of the diagnostic computer or a personal computer had more favourable attitudes to computers in medicine, as did younger people and males. The two samples of patients showed broadly similar attitudes, and a notable finding was that over half of each group believed that, with a computer around, the personal touch of the doctor would be lost. PMID:6471021
Computer Assisted Multi-Center Creation of Medical Knowledge Bases
Giuse, Nunzia Bettinsoli; Giuse, Dario A.; Miller, Randolph A.
1988-01-01
Computer programs which support different aspects of medical care have been developed in recent years. Their capabilities range from diagnosis to medical imaging, and include hospital management systems and therapy prescription. In spite of their diversity these systems have one commonality: their reliance on a large body of medical knowledge in computer-readable form. This knowledge enables such programs to draw inferences, validate hypotheses, and in general to perform their intended task. As has been clear to developers of such systems, however, the creation and maintenance of medical knowledge bases are very expensive. Practical and economical difficulties encountered during this long-term process have discouraged most attempts. This paper discusses knowledge base creation and maintenance, with special emphasis on medical applications. We first describe the methods currently used and their limitations. We then present our recent work on developing tools and methodologies which will assist in the process of creating a medical knowledge base. We focus, in particular, on the possibility of multi-center creation of the knowledge base.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, M.A.; Craig, J.I.
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. Agents are used to implement the overall infrastructure on the computer. Successful agent utilization requires that each agent be made of three components: the resource, the model, and the wrap. Current work is focused on the development of generalized agent schemes and associated demonstration projects. When in place, the technology-independent computing infrastructure will aid the designer in systematically generating knowledge used to facilitate decision-making.
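A speculative sketch of the three-part agent structure named above (resource, model, wrap). The class shape, method name, and example data are assumptions for illustration, not IMAGE's actual interfaces:

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Agent:
    """Three-part agent: a resource (the underlying tool or data),
    a model (how the resource answers a query), and a wrap (the
    interface the infrastructure sees)."""
    resource: Any
    model: Callable[[Any, Any], Any]
    wrap: Callable[[Any], Any]

    def request(self, query: Any) -> Any:
        # Route a query through model and wrap, in that order
        return self.wrap(self.model(self.resource, query))

# Usage: an agent wrapping a simple cost-table lookup (assumed data)
table = {"bolt": 0.10, "plate": 4.25}
agent = Agent(resource=table,
              model=lambda res, part: res.get(part, 0.0),
              wrap=lambda cost: {"unit_cost": cost})
print(agent.request("plate"))
```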
Robust adaptive kinematic control of redundant robots
NASA Technical Reports Server (NTRS)
Tarokh, M.; Zuck, D. D.
1992-01-01
The paper presents a general method for the resolution of redundancy that combines the Jacobian pseudoinverse and augmentation approaches. A direct adaptive control scheme is developed to generate joint angle trajectories for achieving desired end-effector motion as well as additional user-defined tasks. The scheme ensures arbitrarily small errors between the desired and the actual motion of the manipulator. Explicit bounds on the errors are established that are directly related to the mismatch between the actual and estimated pseudoinverse Jacobian matrices, the motion velocity, and the controller gain. It is shown that the scheme is tolerant of the mismatch and consequently only infrequent pseudoinverse computations are needed during a typical robot motion. As a result, the scheme is computationally fast and can be implemented for real-time control of redundant robots. A method is incorporated to cope with robot singularities, allowing the manipulator to get very close to or even pass through a singularity while maintaining good tracking performance and acceptable joint velocities. Computer simulations and experimental results are provided in support of the theoretical developments.
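The classic computation underlying the pseudoinverse approach can be sketched in a few lines: joint velocities that realize the desired end-effector velocity, plus a null-space term that pursues a secondary task without disturbing the end-effector. This is the textbook resolution, not the paper's adaptive scheme; the Jacobian and task vectors below are made up for illustration:

```python
import numpy as np

def redundancy_resolution(J, xdot, z):
    """Pseudoinverse resolution of redundancy:
    qdot = J+ xdot + (I - J+ J) z,
    where the second term projects the secondary objective z onto the
    null space of J, so it does not affect end-effector motion."""
    J_pinv = np.linalg.pinv(J)                    # Moore-Penrose pseudoinverse
    null_proj = np.eye(J.shape[1]) - J_pinv @ J   # null-space projector
    return J_pinv @ xdot + null_proj @ z

# Usage: a 3-DOF planar arm, redundant for a 2-D task (values assumed)
J = np.array([[1.0, 0.5, 0.2],
              [0.0, 1.0, 0.4]])
xdot = np.array([0.1, 0.0])       # desired end-effector velocity
z = np.array([0.0, 0.0, 0.5])     # secondary objective gradient
print(redundancy_resolution(J, xdot, z))
```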