A Computer Program for Crystal Drawing.
ERIC Educational Resources Information Center
Dutch, Steven I.
1981-01-01
Described is a computer program which accepts face data, performs all necessary symmetry operations, and produces a drawing of the resulting crystal. The program shortens computing time to make it suitable for online teaching use or for use in small computers. (Author/DC)
Individual and Sex Differences in the Zone of Acceptable Alternatives.
ERIC Educational Resources Information Center
Leung, S. Alvin; Harmon, Lenore W.
1990-01-01
Examined the zone of acceptable alternatives construct from Gottfredson's theory of career aspiration. College students' (N=246) responses to the Occupations List were coded with measures of sex type and prestige, and indicators of the zone of acceptable alternatives for subjects were computed. Found changes over time and differences related to gender…
Equivalency of Computer-based and Paper-and-pencil Testing.
ERIC Educational Resources Information Center
DeAngelis, Susan
2000-01-01
Dental hygiene students (n=15) took a first examination on computer then paper; 15 others took the paper test first. Computer test scores were higher than paper for the first exam. Student acceptance of the computer format was mixed. Computer exams reduced scoring and grade reporting time. (SK)
Self-calibrating multiplexer circuit
Wahl, Chris P.
1997-01-01
A time domain multiplexer system with automatic determination of acceptable multiplexer output limits, error determination, or correction consists of a time domain multiplexer, a computer, a constant current source capable of at least three distinct current levels, and two series resistances employed for calibration and testing. A two-point linear calibration curve defining acceptable multiplexer voltage limits may be defined by the computer by determining the voltage output of the multiplexer in response to accurately known input signals developed from predetermined current levels across the series resistances. Drift in the multiplexer may be detected by the computer when the output voltage limits expected during normal operation are exceeded, or when the relationship defined by the calibration curve is invalidated.
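The two-point calibration this record describes can be sketched as follows (a minimal illustration; all currents, resistances, and tolerances below are hypothetical, not values from the patent):

```python
def two_point_calibration(i_lo, v_lo, i_hi, v_hi):
    """Fit v = gain * i + offset from two known (current, voltage) points."""
    gain = (v_hi - v_lo) / (i_hi - i_lo)
    offset = v_lo - gain * i_lo
    return gain, offset

def within_limits(v_measured, i_input, gain, offset, tolerance):
    """Drift check: the measured output must lie near the calibration line."""
    v_expected = gain * i_input + offset
    return abs(v_measured - v_expected) <= tolerance

# Hypothetical example: 1 mA and 3 mA through a 1 kOhm series resistance
# produce 1 V and 3 V at the multiplexer output.
gain, offset = two_point_calibration(0.001, 1.0, 0.003, 3.0)
print(round(gain, 3))                                  # 1000.0
print(within_limits(2.05, 0.002, gain, offset, 0.1))   # True
print(within_limits(2.30, 0.002, gain, offset, 0.1))   # False: drift detected
```

Outputs outside the tolerance band around the fitted line are the "invalidated relationship" the abstract refers to.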
How Patient Interactions With a Computer-Based Video Intervention Affect Decisions to Test for HIV.
Aronson, Ian David; Rajan, Sonali; Marsch, Lisa A; Bania, Theodore C
2014-06-01
The current study examines predictors of HIV test acceptance among emergency department patients who received an educational video intervention designed to increase HIV testing. A total of 202 patients in the main treatment areas of a high-volume, urban hospital emergency department used inexpensive netbook computers to watch brief educational videos about HIV testing and respond to pre-postintervention data collection instruments. After the intervention, computers asked participants if they would like an HIV test: Approximately 43% (n = 86) accepted. Participants who accepted HIV tests at the end of the intervention took longer to respond to postintervention questions, which included the offer of an HIV test, F(1, 195) = 37.72, p < .001, compared with participants who did not accept testing. Participants who incorrectly answered pretest questions about HIV symptoms were more likely to accept testing F(14, 201) = 4.48, p < .001. White participants were less likely to accept tests than Black, Latino, or "Other" patients, χ(2)(3, N = 202) = 10.39, p < .05. Time spent responding to postintervention questions emerged as the strongest predictor of HIV testing, suggesting that patients who agreed to test spent more time thinking about their response to the offer of an HIV test. Examining intervention usage data, pretest knowledge deficits, and patient demographics can potentially inform more effective behavioral health interventions for underserved populations in clinical settings. © 2013 Society for Public Health Education.
Wilkie, Diana J; Kim, Young Ok; Suarez, Marie L; Dauw, Colleen M; Stapleton, Stephen J; Gorman, Geraldine; Storfjell, Judith; Zhao, Zhongsheng
2009-07-01
We aimed to determine the acceptability and feasibility of a pentablet-based software program, PAINReportIt-Plus, as a means for patients with cancer in home hospice to report their symptoms and differences in acceptability by demographic variables. Of the 131 participants (mean age = 59 +/- 13, 58% women, 48.1% African American), 44% had never used a computer, but all participants easily used the computerized tool and reported an average computer acceptability score of 10.3 +/- 1.8, indicating high acceptability. Participants required an average of 19.1 +/- 9.5 minutes to complete the pain section, 9.8 +/- 6.5 minutes for the medication section, and 4.8 +/- 2.3 minutes for the symptom section. The acceptability scores were not statistically different by demographic variables but time to complete the tool differed by racial/ethnic groups. Our findings demonstrate that terminally ill patients with cancer are willing and able to utilize computer pentablet technology to record and describe their pain and other symptoms. Visibility of pain and distress is the first step necessary for the hospice team to develop a care plan for improving control of noxious symptoms.
Zheng, Meixun; Bender, Daniel
2018-03-13
Computer-based testing (CBT) has made progress in health sciences education. In 2015, the authors led implementation of a CBT system (ExamSoft) at a dental school in the U.S. Guided by the Technology Acceptance Model (TAM), the purposes of this study were to (a) examine dental students' acceptance of ExamSoft; (b) understand factors impacting acceptance; and (c) evaluate the impact of ExamSoft on students' learning and exam performance. Survey and focus group data revealed that ExamSoft was well accepted by students as a testing tool and acknowledged by most for its potential to support learning. Regression analyses showed that perceived ease of use and perceived usefulness of ExamSoft significantly predicted student acceptance. Prior CBT experience and computer skills did not significantly predict acceptance of ExamSoft. Students reported that ExamSoft promoted learning in the first program year, primarily through timely and rich feedback on examination performance. t-Tests yielded mixed results on whether students performed better on computerized or paper examinations. The study contributes to the literature on CBT and the application of the TAM model in health sciences education. Findings also suggest ways in which health sciences institutions can implement CBT to maximize its potential as an assessment and learning tool.
Orientation/Time Management Skill Training Lesson: Development and Evaluation
1979-07-01
instructional environment. This Orientation/Time Management lesson provides students with appropriate role models for increasing acceptance of their...time savings can be obtained by a combination of this type of orientation and time management skill training with a computer-based progress targeting
The effects of perceived USB-delay for sensor and embedded system development.
Du, J; Kade, D; Gerdtman, C; Ozcan, O; Linden, M
2016-08-01
Perceived delay in computer input devices is a problem that becomes even more pronounced when such devices are used in healthcare applications and/or in small, embedded systems. This paper therefore investigates the amount of delay users find acceptable when using computer input devices. A device was developed to perform a benchmark test for the perception of delay. The delay can be set from 0 to 999 milliseconds (ms) between a receiving computer and an available USB device. The USB device can be a mouse, a keyboard, or another type of USB-connected input device. Feedback from user tests with 36 people forms the basis for determining time limitations for USB data processing in microprocessors and embedded systems without users noticing the delay. For this paper, tests were performed with a personal computer and a common computer mouse, testing the perception of delays between 0 and 500 ms. The results of our user tests show that perceived delays up to 150 ms were acceptable and delays larger than 300 ms were not acceptable at all.
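The reported thresholds can be captured in a small sketch (the 150 ms and 300 ms limits come from the abstract; labeling the 150-300 ms range "marginal" is our assumption, since the paper reports only the two boundary findings):

```python
def delay_acceptability(delay_ms):
    """Classify an input-device delay against the thresholds reported above."""
    if delay_ms <= 150:
        return "acceptable"        # users did not object up to 150 ms
    if delay_ms > 300:
        return "not acceptable"    # above 300 ms, delays were rejected
    return "marginal"              # 150-300 ms: between the reported limits

for d in (0, 100, 150, 200, 350, 500):
    print(d, delay_acceptability(d))
```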
Gender Differences in Teacher Computer Acceptance
ERIC Educational Resources Information Center
Yuen, Allan H. K.; Ma, Will W. K.
2002-01-01
Teachers' computer acceptance is an important factor to the successful use of computers in education. This article explores the gender differences in teacher computer acceptance. The Technology Acceptance Model (TAM) was used as the framework to determine if such differences are present. Survey questionnaires were administered to 186 preservice…
Khalifa, Mohamed
2016-01-01
This study aims at evaluating hospital information system (HIS) acceptance factors among nurses, in order to provide suggestions for successful HIS implementation. The study used mainly quantitative survey methods to collect data directly from nurses through a questionnaire. The availability of computers in the hospital was one of the most influential factors, with special emphasis on the unavailability of laptop computers and computers on wheels to facilitate immediate data entry and retrieval when nurses are at the point of care. Nurses believed that HIS might frequently slow down the process of care delivery and increase the time spent by patients inside the hospital, especially during phases of slow performance and responsiveness. Recommendations were classified into three main areas: improving system performance and the availability of computers in the hospital, increasing organizational support in the form of training and protected time for nurses to learn, and enhancing user feedback by listening to complaints and considering suggestions.
NASA Technical Reports Server (NTRS)
Stocks, Dana R.
1986-01-01
The Dynamic Gas Temperature Measurement System compensation software accepts digitized data from two different diameter thermocouples and computes a compensated frequency response spectrum for one of the thermocouples. Detailed discussions of the physical system, analytical model, and computer software are presented in this volume and in Volume 1 of this report under Task 3. Computer program software restrictions and test cases are also presented. Compensated and uncompensated data may be presented in either the time or frequency domain. Time domain data are presented as instantaneous temperature vs time. Frequency domain data may be presented in several forms such as power spectral density vs frequency.
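For context, a generic first-order compensation step can be sketched as follows (an illustrative sketch only; the report's actual algorithm, which infers the compensation from two thermocouple diameters, is not reproduced here). A thermocouple behaves roughly as a first-order low-pass filter, so a compensated spectrum divides the measured spectrum by 1/(1 + j·2πf·τ):

```python
import math

def compensate(spectrum, freqs, tau):
    """Divide a measured complex spectrum by a first-order filter response.

    spectrum: list of complex spectral values; freqs: matching frequencies (Hz);
    tau: assumed thermocouple time constant (s).
    """
    out = []
    for X, f in zip(spectrum, freqs):
        H = 1.0 / (1.0 + 1j * 2.0 * math.pi * f * tau)  # first-order response
        out.append(X / H)                               # undo the attenuation
    return out

# At 0 Hz the filter is transparent, so compensation changes nothing.
print(compensate([1 + 0j], [0.0], 0.01)[0])   # (1+0j)
```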
The Humanistic Duo: The Park/Recreation Professional and the Computer. (Computer-Can I Use It?).
ERIC Educational Resources Information Center
Weiner, Myron E.
This paper states that there are two fundamental reasons for the comparative absence of computer use for parks and recreation at the present time. These are (1) lack of clear cut cost justification and (2) reluctance on the part of recreation professionals to accept their role as managers and, consequently, to utilize modern management tools. The…
Report on the Acceptance Test of the CRI Y-MP 8128, 10 February - 12 March 1990
NASA Technical Reports Server (NTRS)
Carter, Russell; Kutler, Paul (Technical Monitor)
1998-01-01
The NAS Numerical Aerodynamic Simulation Facility's HSP 2 computer system, a CRI Y-MP 832 SN #1002, underwent a major hardware upgrade in February of 1990. The 32 MWord, 6.3 ns mainframe component of the system was replaced with a 128 MWord, 6.0 ns CRI Y-MP 8128 mainframe, SN #1030. A 30 day Acceptance Test of the computer system was performed by the NAS RND HSP group from 08:00 February 10, 1990 to 08:00 March 12, 1990. Overall responsibility for the RND HSP Acceptance Test was assumed by Duane Carbon. The terms of the contract required that the SN #1030 achieve an effectiveness level of greater than or equal to ninety (90) percent for 30 consecutive days within a 60 day time frame. After the first thirty days, the effectiveness level of SN #1030 was 94.4 percent, hence the acceptance test was passed.
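The acceptance criterion described above (an effectiveness level of at least 90 percent for 30 consecutive days) can be sketched as a sliding-window check (daily figures below are hypothetical; the report gives only the final 94.4 percent result):

```python
def passes_acceptance(daily_effectiveness, window=30, threshold=90.0):
    """Return (passed, best_mean): does any `window`-day run of the series
    (e.g. a 60-day test period) average at or above `threshold` percent?"""
    best = 0.0
    for start in range(len(daily_effectiveness) - window + 1):
        mean = sum(daily_effectiveness[start:start + window]) / window
        best = max(best, mean)
    return best >= threshold, best

# Thirty days at a uniform 94.4% passes immediately.
passed, level = passes_acceptance([94.4] * 30)
print(passed, round(level, 1))   # True 94.4
```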
29 CFR 785.48 - Use of time clocks.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR STATEMENTS OF GENERAL POLICY OR INTERPRETATION NOT DIRECTLY RELATED TO REGULATIONS HOURS WORKED Recording Working Time § 785.48 Use of time... actually work. For enforcement purposes this practice of computing working time will be accepted, provided...
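The excerpt above is truncated. For context, a practice commonly associated with this section (stated here from general knowledge of § 785.48(b), not from the quoted text) is rounding punch times to the nearest five minutes, one-tenth, or one-quarter of an hour, provided the rounding averages out over time. A quarter-hour rounding sketch:

```python
def round_to_quarter_hour(minutes_since_midnight):
    """Round a clock punch to the nearest 15-minute increment."""
    return int(round(minutes_since_midnight / 15.0)) * 15

print(round_to_quarter_hour(8 * 60 + 7))   # 487 min (08:07) -> 480 (08:00)
print(round_to_quarter_hour(8 * 60 + 8))   # 488 min (08:08) -> 495 (08:15)
```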
Exercise Prescribing: Computer Application in Older Adults
ERIC Educational Resources Information Center
Kressig, Reto W.; Echt, Katharina V.
2002-01-01
Purpose: The purpose of this study was to determine if older adults are capable and willing to interact with a computerized exercise promotion interface and to determine to what extent they accept computer-generated exercise recommendations. Design and Methods: Time and requests for assistance were recorded while 34 college-educated volunteers,…
NASA Technical Reports Server (NTRS)
Chu, Y.-Y.; Rouse, W. B.
1979-01-01
As human and computer come to have overlapping decisionmaking abilities, a dynamic or adaptive allocation of responsibilities may be the best mode of human-computer interaction. It is suggested that the computer serve as a backup decisionmaker, accepting responsibility when human workload becomes excessive and relinquishing responsibility when workload becomes acceptable. A queueing theory formulation of multitask decisionmaking is used and a threshold policy for turning the computer on/off is proposed. This policy minimizes event-waiting cost subject to human workload constraints. An experiment was conducted with a balanced design of several subject runs within a computer-aided multitask flight management situation with different task demand levels. It was found that computer aiding enhanced subsystem performance as well as subjective ratings. The queueing model appears to be an adequate representation of the multitask decisionmaking situation, and to be capable of predicting system performance in terms of average waiting time and server occupancy. Server occupancy was further found to correlate highly with the subjective effort ratings.
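The threshold policy described above can be illustrated with a toy discrete-time sketch (the arrival and service probabilities and the threshold are invented here; the paper's actual model is a queueing-theoretic formulation, not this approximation). The computer takes over whenever the human's task queue exceeds a threshold, bounding workload:

```python
import random

def simulate(arrival_p=0.4, service_p=0.3, threshold=3, steps=10000, seed=1):
    """Mean number of tasks waiting on the human under computer aiding."""
    random.seed(seed)
    queue = 0
    waited = 0                       # total task-steps spent waiting
    for _ in range(steps):
        if random.random() < arrival_p:
            queue += 1               # a new task arrives for the human
        if queue > threshold:
            queue = threshold        # computer aiding clears the excess
        if queue and random.random() < service_p:
            queue -= 1               # human completes a task
        waited += queue
    return waited / steps

print(round(simulate(), 2))          # bounded by the threshold of 3
```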
The ultimatum game: Discrete vs. continuous offers
NASA Astrophysics Data System (ADS)
Dishon-Berkovits, Miriam; Berkovits, Richard
2014-09-01
In many experimental setups in the social sciences, psychology, and economics, subjects are asked to accept or dispense monetary compensation, which is usually given in discrete units. Using computational and mathematical modeling we show, in the framework of studying the dynamics of acceptance of proposals in the ultimatum game, that the long-time dynamics of acceptance of offers are completely different for discrete versus continuous offers. For discrete values the dynamics follow an exponential behavior, whereas for continuous offers the dynamics are described by a power law. This is shown using an agent-based computer simulation as well as an analytical solution of a mean-field equation describing the model. These findings have implications for the design and interpretation of socio-economic experiments beyond the ultimatum game.
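One simple mechanism consistent with this discrete-versus-continuous difference can be sketched as follows (our illustrative toy model, not necessarily the authors' mean-field model): each agent accepts the first random offer meeting its private threshold. With continuous offers, agents with thresholds near 1 accept very rarely, producing a heavy tail of acceptance times; discretizing offers bounds the per-round acceptance probability away from zero, giving an exponential-like tail.

```python
import random

def acceptance_times(n_agents=10000, discrete=False, max_rounds=1000, seed=0):
    rng = random.Random(seed)
    times = []
    for _ in range(n_agents):
        threshold = rng.random()              # private acceptance threshold
        for t in range(1, max_rounds + 1):
            offer = rng.random()
            if discrete:
                offer = round(offer * 10) / 10  # restrict offers to a 0.1 grid
            if offer >= threshold:
                times.append(t)
                break
        else:                                 # never accepted within the cap
            times.append(max_rounds)
    return times

def tail_fraction(times, cutoff=100):
    """Fraction of agents still unaccepted after `cutoff` rounds."""
    return sum(t > cutoff for t in times) / len(times)

cont = tail_fraction(acceptance_times(discrete=False))
disc = tail_fraction(acceptance_times(discrete=True))
print(cont > disc)   # the continuous-offer tail is much heavier
```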
Training + Technology: The Future Is Now.
ERIC Educational Resources Information Center
Heathman, Dena J.; Kleiner, Brian H.
1991-01-01
New applications of computer-assisted training being developed include telecommunications, artificial intelligence, soft skills training, and performance support systems. Barriers to acceptance are development time, costs, and lack of human contact. (SK)
Boevé, Anja J; Meijer, Rob R; Albers, Casper J; Beetsma, Yta; Bosker, Roel J
2015-01-01
The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will produce results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and become familiar with this new mode of test administration.
Nonportable computed radiography of the chest--radiologists' acceptance
NASA Astrophysics Data System (ADS)
Gennari, Rose C.; Gur, David; Miketic, Linda M.; Campbell, William L.; Oliver, James H., III; Plunkett, Michael B.
1994-04-01
Following a large ROC study to assess the diagnostic accuracy of PA chest computed radiography (CR) images displayed in a variety of formats, we asked nine experienced radiologists to subjectively assess their acceptance of and preferences for display modes in primary diagnosis of erect PA chest images. Our results indicate that radiologists felt somewhat less comfortable interpreting CR images displayed on either laser-printed films or workstations as compared with conventional films. The use of four minified images was thought both to somewhat decrease diagnostic confidence and to increase the time of interpretation. The reverse-mode (black bone) images increased radiologists' confidence level in the detection of soft tissue abnormalities.
ERIC Educational Resources Information Center
Hussmann, Katja; Grande, Marion; Meffert, Elisabeth; Christoph, Swetlana; Piefke, Martina; Willmes, Klaus; Huber, Walter
2012-01-01
Although generally accepted as an important part of aphasia assessment, detailed analysis of spontaneous speech is rarely carried out in clinical practice mostly due to time limitations. The Aachener Sprachanalyse (ASPA; Aachen Speech Analysis) is a computer-assisted method for the quantitative analysis of German spontaneous speech that allows for…
Palmer, Rebecca; Enderby, Pam; Paterson, Gail
2013-01-01
Speech and language therapy (SLT) for aphasia can be difficult to access in the later stages of stroke recovery, despite evidence of continued improvement with sufficient therapeutic intensity. Computerized aphasia therapy has been reported to be useful for independent language practice, providing new opportunities for continued rehabilitation. The success of this option depends on its acceptability to patients and carers. To investigate factors that affect the acceptability of independent home computerized aphasia therapy practice. An acceptability study of computerized therapy was carried out alongside a pilot randomized controlled trial of computer aphasia therapy versus usual care for people more than 6 months post-stroke. Following language assessment and computer exercise prescription by a speech and language therapist, participants practised three times a week for 5 months at home with monthly volunteer support. Semi-structured interviews were conducted with 14 participants who received the intervention and ten carers (n = 24). Questions from a topic guide were presented and answered using picture, gesture and written support. Interviews were audio recorded, transcribed verbatim and analysed thematically. Three research SLTs identified and cross-checked themes and subthemes emerging from the data. The key themes that emerged were benefits and disadvantages of computerized aphasia therapy, need for help and support, and comparisons with face-to-face therapy. The independence, flexibility and repetition afforded by the computer was viewed as beneficial and the personalized exercises motivated participants to practise. Participants and carers perceived improvements in word-finding and confidence-talking. Computer practice could cause fatigue and interference with other commitments. Support from carers or volunteers for motivation and technical assistance was seen as important. 
Although some participants preferred face-to-face therapy, using a computer for independent language practice was perceived to be an acceptable alternative. Independent computerized aphasia therapy is acceptable to stroke survivors. Acceptability can be maximized by tailoring exercises to personal interests of the individual, ensuring access to support and giving consideration to fatigue and life style when recommending practice schedules. © 2013 Royal College of Speech and Language Therapists.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...
Code of Federal Regulations, 2010 CFR
2010-10-01
..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...
Code of Federal Regulations, 2012 CFR
2012-10-01
..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...
Code of Federal Regulations, 2013 CFR
2013-10-01
..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...
Code of Federal Regulations, 2011 CFR
2011-10-01
..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...
Performance limits and trade-offs in entropy-driven biochemical computers.
Chu, Dominique
2018-04-14
It is now widely accepted that biochemical reaction networks can perform computations. Examples are kinetic proofreading, gene regulation, or signalling networks. For many of these systems it was found that their computational performance is limited by a trade-off between the metabolic cost, the speed, and the accuracy of the computation. In order to gain insight into the origins of these trade-offs, we consider entropy-driven computers as a model of biochemical computation. Using tools from stochastic thermodynamics, we show that entropy-driven computation is subject to a trade-off between accuracy and metabolic cost, but does not involve time trade-offs. Time trade-offs appear when it is taken into account that the result of the computation needs to be measured in order to be known. We argue that this measurement process, although usually ignored, is a major contributor to the cost of biochemical computation. Copyright © 2018 Elsevier Ltd. All rights reserved.
The Acceptance of Computer Technology by Teachers in Early Childhood Education
ERIC Educational Resources Information Center
Jeong, Hye In; Kim, Yeolib
2017-01-01
This study investigated kindergarten teachers' decision-making process regarding the acceptance of computer technology. We incorporated the Technology Acceptance Model framework, in addition to computer self-efficacy, subjective norm, and personal innovativeness in education technology as external variables. The data were obtained from 160…
1982-12-01
Coppens showed great kindness by accepting supervision of this research when time was short. His concern, understanding and direction led to an...related to computer processing time and storage requirements. These factors will not be addressed directly in this research because the processing...computational efficiency. Disadvantages are a uniform mesh and periodic boundary conditions to satisfy the FFT, and filtering of the sound speed profile by
Board-foot and Cubic-foot Volume Computing Equations for Southeastern Tree Species
Mackay B. Bryan; Joe P. McClure
1962-01-01
Wide acceptance of Bitterlich's (2) method of sampling, popularized in this country by Grosenbaugh (3), with adaptations such as the variable plot used by Forest Survey in the Southeast, has opened a new era in forest surveying. The efficiency of these sampling methods, accompanied by the timely availability of electronic computing machines, has made it feasible...
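For background (general forest-mensuration knowledge, not detail from this abstract): in Bitterlich point sampling, each tree subtending more than a fixed critical angle is tallied, and basal area per unit area is estimated as the basal area factor (BAF) times the mean tally count across sample points:

```python
def basal_area_per_acre(tally_counts, baf=10.0):
    """Mean tally across sample points times the BAF (sq ft/acre per tree)."""
    return baf * sum(tally_counts) / len(tally_counts)

# Three sample points tallying 8, 10, and 9 trees with a BAF-10 angle gauge.
print(basal_area_per_acre([8, 10, 9], baf=10.0))   # 90.0
```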
Parental Perceptions and Recommendations of Computing Majors: A Technology Acceptance Model Approach
ERIC Educational Resources Information Center
Powell, Loreen; Wimmer, Hayden
2017-01-01
Currently, there are more technology-related jobs than there are graduates to fill them. Understanding user acceptance of computing degrees is the first step in increasing enrollment in computing fields. Additionally, valid measurement scales for predicting user acceptance of Information Technology degree programs are required. The majority…
Skylab extravehicular mobility unit thermal simulator
NASA Technical Reports Server (NTRS)
Hixon, C. W.; Phillips, M. A.
1974-01-01
The analytical methods, thermal model, and user's instructions for the Skylab Extravehicular Mobility Unit (SEMU) routine are presented. This digital computer program was developed for detailed thermal performance predictions of the SEMU on the NASA-JSC Univac 1108 computer system. It accounts for conductive, convective, and radiant heat transfer as well as fluid flow and special component characterization. The program provides thermal performance predictions for a 967 node thermal model in one thirty-sixth (1/36) of mission time when operated at a calculating interval of three minutes (mission time). The program has the operational flexibility to: (1) accept card or magnetic tape data input for the thermal model describing the SEMU structure, fluid systems, crewman and component performance, (2) accept card and/or magnetic tape input of internally generated heat and heat influx from the space environment, and (3) output tabular or plotted histories of temperature, flow rates, and other parameters describing system operating modes.
ERIC Educational Resources Information Center
Moran, Mark; Hawkes, Mark; El Gayar, Omar
2010-01-01
Many educational institutions have implemented ubiquitous or required laptop, notebook, or tablet personal computing programs for their students. Yet, limited evidence exists to validate integration and acceptance of the technology among student populations. This research examines student acceptance of mobile computing devices using a modification…
Yan, Mian; Or, Calvin
2017-08-01
This study tested a structural model examining the effects of perceived usefulness, perceived ease of use, attitude, subjective norm, perceived behavioral control, health consciousness, and application-specific self-efficacy on the acceptance (i.e. behavioral intention and actual usage) of a computer-based chronic disease self-monitoring system among patients with type 2 diabetes mellitus and/or hypertension. The model was tested using partial least squares structural equation modeling, with 119 observations that were obtained by pooling data across three time points over a 12-week period. The results indicate that all of the seven constructs examined had a significant total effect on behavioral intention and explained 74 percent of the variance. Also, application-specific self-efficacy and behavioral intention had a significant total effect on actual usage and explained 17 percent of the variance. This study demonstrates that technology acceptance is determined by patient characteristics, technology attributes, and social influences. Applying the findings may increase the likelihood of acceptance.
Time-Domain Impedance Boundary Conditions for Computational Aeroacoustics
NASA Technical Reports Server (NTRS)
Tam, Christopher K. W.; Auriault, Laurent
1996-01-01
It is an accepted practice in aeroacoustics to characterize the properties of an acoustically treated surface by a quantity known as impedance. Impedance is a complex quantity. As such, it is designed primarily for frequency-domain analysis. Time-domain boundary conditions that are the equivalent of the frequency-domain impedance boundary condition are proposed. Both single frequency and model broadband time-domain impedance boundary conditions are provided. It is shown that the proposed boundary conditions, together with the linearized Euler equations, form well-posed initial boundary value problems. Unlike ill-posed problems, they are free from spurious instabilities that would render time-marching computational solutions impossible.
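For context, the impedance relation this abstract refers to can be written generically as follows (a sketch only; the paper's specific single-frequency and broadband models are not reproduced here, and the sign of the reactance X depends on the assumed time factor e^(±iωt)):

```latex
% Frequency domain: surface impedance relates pressure to normal velocity.
\hat{p}(\omega) = Z(\omega)\,\hat{v}_n(\omega),
\qquad Z(\omega) = R(\omega) - i\,X(\omega).

% Time-domain equivalent: a convolution with a kernel z(t),
p(t) = \int_{-\infty}^{t} z(t-\tau)\, v_n(\tau)\, d\tau,
\qquad z(t) = 0 \;\; \text{for } t < 0,
```

where the requirement that z be causal and stable is precisely the well-posedness question the abstract addresses for time-marching computation.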
Unsteady three-dimensional thermal field prediction in turbine blades using nonlinear BEM
NASA Technical Reports Server (NTRS)
Martin, Thomas J.; Dulikravich, George S.
1993-01-01
A time- and space-accurate, computationally efficient, fully three-dimensional unsteady temperature field analysis computer code has been developed for truly arbitrary configurations. It uses a boundary element method (BEM) formulation based on an unsteady Green's function approach, multi-point Gaussian quadrature for spatial integration on each panel, and highly clustered time-step integration. The code accepts either temperatures or heat fluxes as boundary conditions, which can vary in time on a point-by-point basis. Comparisons of the BEM numerical results with known analytical unsteady results for simple shapes demonstrate very high accuracy and reliability of the algorithm. An example of computed three-dimensional temperature and heat flux fields in a realistically shaped, internally cooled turbine blade is also discussed.
Ammenwerth, Elske; Mansmann, Ulrich; Iller, Carola; Eichstädter, Ronald
2003-01-01
The documentation of the nursing process is an important but often neglected part of clinical documentation. Paper-based systems have been introduced to support nursing process documentation. Frequently, however, problems such as low quality of documentation are reported. It is unclear whether computer-based documentation systems can reduce these problems and which factors influence their acceptance by users. We introduced a computer-based nursing documentation system on four wards of the University Hospitals of Heidelberg and systematically evaluated its preconditions and its effects in a pretest-posttest intervention study. For the analysis of user acceptance, we concentrated on subjective data drawn from questionnaires and interviews. A questionnaire was developed using items from published questionnaires and items that had to be developed for the special purpose of this study. The quantitative results point to two factors influencing the acceptance of a new computer-based documentation system: prior acceptance of the nursing process and prior self-confidence in using computers. On one ward, several acceptance scores declined sharply after the introduction of the nursing documentation system. Exploratory qualitative analysis on this ward points to further success factors of computer-based nursing documentation systems. Our results can be used to assist the planning and introduction of computer-based nursing documentation systems. They demonstrate the importance of computer experience and acceptance of the nursing process on a ward, but also point to other factors such as the fit between nursing workflow and the functionality of a nursing documentation system.
Mercader, Hannah Faye G; Kabakyenga, Jerome; Katuruba, David Tumusiime; Hobbs, Amy J; Brenner, Jennifer L
2017-02-01
High maternal and child mortality continues in low- and middle-income countries (LMIC). Measurement of maternal, newborn and child health (MNCH) coverage indicators often involves an expensive, complex, and lengthy household data collection process that is especially difficult in less-resourced settings. Computer-assisted personal interviewing (CAPI) has been proposed as a cost-effective and efficient alternative to traditional paper-and-pencil interviewing (PAPI). However, the literature on respondent-level acceptance of CAPI in LMIC has reported mixed outcomes. This is the first study to prospectively examine female respondent acceptance of CAPI and its influencing factors for MNCH data collection in rural Southwest Uganda. Eighteen women aged 15-49 years were randomly selected from 3 rural villages to participate. Each respondent was administered a Women's Questionnaire with half of the survey questions asked using PAPI techniques and the other half using CAPI. Following this PAPI/CAPI exposure, semi-structured focus group discussions (FGDs) assessed respondent attitudes towards PAPI versus CAPI. FGD data analysis involved an immersion/crystallization method (thematic narrative analysis). The sixteen FGD respondents had a median age of 27 years (interquartile range: 24.8, 32.3). The majority (62.5%) had only primary level education. Most respondents (68.8%) owned or regularly used a mobile phone or computer. A minority (31.3%) had previously seen, but not used, a tablet computer. Overall, FGDs revealed CAPI acceptance, and the factors influencing CAPI acceptability were 'familiarity', 'data confidentiality and security', 'data accuracy', and 'modernization and development'. Female survey respondents in our rural Southwest Ugandan setting found CAPI to be acceptable.
Global health planners and implementers considering CAPI for health coverage survey data collection should accommodate influencing factors during survey planning in order to maximize and facilitate acceptance and support by local stakeholders and community participants. Further research is needed to generate best practices for CAPI implementation in LMIC; higher quality, timely, streamlined and budget-friendly collection of MNCH indicators could help direct and improve programming to save the lives of mothers and children. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
10 CFR 35.457 - Therapy-related computer systems.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Therapy-related computer systems. 35.457 Section 35.457... Therapy-related computer systems. The licensee shall perform acceptance testing on the treatment planning system of therapy-related computer systems in accordance with published protocols accepted by nationally...
Lindblom, Katrina; Gregory, Tess; Flight, Ingrid H K; Zajac, Ian
2011-01-01
Objective: This study investigated the efficacy of an internet-based personalized decision support (PDS) tool designed to aid in the decision to screen for colorectal cancer (CRC) using a fecal occult blood test. We tested whether the efficacy of the tool in influencing attitudes to screening was mediated by perceived usability and acceptability, and considered the role of computer self-efficacy and computer anxiety in these relationships. Methods: Eighty-one participants aged 50–76 years worked through the on-line PDS tool and completed questionnaires on computer self-efficacy, computer anxiety, attitudes to and beliefs about CRC screening before and after exposure to the PDS, and perceived usability and acceptability of the tool. Results: Repeated measures ANOVA found that PDS exposure led to a significant increase in knowledge about CRC and screening, and more positive attitudes to CRC screening as measured by factors from the Preventive Health Model. Perceived usability and acceptability of the PDS mediated changes in attitudes toward CRC screening (but not CRC knowledge), and computer self-efficacy and computer anxiety were significant predictors of individuals' perceptions of the tool. Conclusion: Interventions designed to decrease computer anxiety, such as computer courses and internet training, may improve the acceptability of new health information technologies, including internet-based decision support tools, increasing their impact on behavior change. PMID:21857024
Thermal radiation view factor: Methods, accuracy and computer-aided procedures
NASA Technical Reports Server (NTRS)
Kadaba, P. V.
1982-01-01
Computer-aided thermal analysis programs that predict whether orbiting equipment, stationed in various attitudes with respect to the Sun and the Earth, remains within a predetermined acceptable temperature range are examined. The complexity of the surface geometries suggests the use of numerical schemes for determining the view factors. Basic definitions and the standard methods that form the basis for various digital computer methods, along with various numerical methods, are presented. The physical models and the mathematical methods on which a number of available programs are built are summarized. The strengths and weaknesses of the methods employed, the accuracy of the calculations, and the time required for computations are evaluated. The situations where accuracies are important for energy calculations are identified, and methods to save computational time are proposed. A guide to the best use of the available programs at several centers and future choices for efficient use of digital computers are included in the recommendations.
Delayed Slater determinant update algorithms for high efficiency quantum Monte Carlo
McDaniel, Tyler; D’Azevedo, Ed F.; Li, Ying Wai; ...
2017-11-07
Within ab initio Quantum Monte Carlo simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunction. Each Monte Carlo step requires finding the determinant of a dense matrix. This is most commonly iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. The overall computational cost is therefore formally cubic in the number of electrons or matrix size. To improve the numerical efficiency of this procedure, we propose a novel multiple-rank delayed update scheme. This strategy enables probability evaluation with application of accepted moves to the matrices delayed until after a predetermined number of moves, K. The accepted events are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency via matrix-matrix operations instead of matrix-vector operations. This procedure does not change the underlying Monte Carlo sampling or its statistical efficiency. For calculations on large systems and algorithms such as diffusion Monte Carlo where the acceptance ratio is high, order of magnitude improvements in the update time can be obtained on both multi-core CPUs and GPUs.
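The rank-1 Sherman-Morrison step that the abstract describes as the baseline can be sketched as follows. This is an illustrative NumPy fragment (names and conventions are mine, not from the paper): replacing one row of the Slater matrix yields both the determinant ratio for the accept/reject step and the updated inverse in O(N²) work.

```python
import numpy as np

def row_update(Ainv, new_row, i):
    """Replace row i of the Slater matrix A, given only Ainv = A^{-1}.

    Returns the determinant ratio det(A')/det(A) used in the Monte
    Carlo accept/reject step, and the updated inverse A'^{-1} via the
    Sherman-Morrison formula -- O(N^2) per accepted move instead of
    the O(N^3) cost of recomputing the inverse from scratch.
    """
    # Ratio: det(A')/det(A) = new_row . (column i of A^{-1}),
    # since row_i(A) @ A^{-1} = e_i^T cancels the old-row term.
    ratio = new_row @ Ainv[:, i]
    # Sherman-Morrison for A' = A + e_i (new_row - row_i(A))^T:
    # A'^{-1} = A^{-1} - (A^{-1} e_i)(d^T A^{-1}) / ratio,
    # where d^T A^{-1} = new_row @ A^{-1} - e_i^T.
    v = new_row @ Ainv
    v[i] -= 1.0
    Ainv_new = Ainv - np.outer(Ainv[:, i], v) / ratio
    return ratio, Ainv_new
```

The delayed (rank-K) scheme proposed in the paper accumulates K such accepted rows and applies them en bloc with matrix-matrix products; the rank-1 version above is the procedure it accelerates.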
Berman, Margit I.; Buckey, Jay C., Jr.; Hull, Jay G.; Linardatos, Eftihia; Song, Sueyoung L.; McLellan, Robert K.; Hegel, Mark T.
2014-01-01
Computer-based depression interventions lacking live therapist support have difficulty engaging users. This study evaluated the usability, acceptability, credibility, therapeutic alliance and efficacy of a stand-alone multimedia, interactive, computer-based Problem Solving Treatment program (ePST™) for depression. The program simulated live treatment from an expert PST therapist, and delivered 6 ePST™ sessions over 9 weeks. Twenty-nine participants with moderate-to-severe symptoms received the intervention; 23 completed a minimally adequate dose of ePST™ (at least 4 sessions). Program usability, acceptability, credibility, and therapeutic alliance were assessed at treatment midpoint and endpoint. Depressive symptoms and health-related functioning were assessed at baseline, treatment midpoint (4 weeks), and study endpoint (10 weeks). Depression outcomes and therapeutic alliance ratings were also compared to previously published research on live PST and computer-based depression therapy. Participants rated the program as highly usable, acceptable, and credible, and reported a therapeutic alliance with the program comparable to that observed in live therapy. Depressive symptoms improved significantly over time. These findings also provide preliminary evidence that ePST™ may be effective as a depression treatment. Larger clinical trials with diverse samples are indicated. PMID:24680231
NASA Technical Reports Server (NTRS)
Krebs, R. P.
1972-01-01
The computer program described calculates the design-point characteristics of a gas generator or a turbojet lift engine for V/STOL applications. The program computes the dimensions and mass, as well as the thermodynamic performance, of the model engine and its components. The program was written in FORTRAN IV. Provision has been made so that the program accepts input values in either SI units or U.S. customary units. Each engine design-point calculation requires less than 0.5 second of IBM 7094 computer time.
Understanding and enhancing user acceptance of computer technology
NASA Technical Reports Server (NTRS)
Rouse, William B.; Morris, Nancy M.
1986-01-01
Technology-driven efforts to implement computer technology often encounter problems due to lack of acceptance or begrudging acceptance of the personnel involved. It is argued that individuals' acceptance of automation, in terms of either computerization or computer aiding, is heavily influenced by their perceptions of the impact of the automation on their discretion in performing their jobs. It is suggested that desired levels of discretion reflect needs to feel in control and achieve self-satisfaction in task performance, as well as perceptions of inadequacies of computer technology. Discussion of these factors leads to a structured set of considerations for performing front-end analysis, deciding what to automate, and implementing the resulting changes.
Enabling Rapid Naval Architecture Design Space Exploration
NASA Technical Reports Server (NTRS)
Mueller, Michael A.; Dufresne, Stephane; Balestrini-Robinson, Santiago; Mavris, Dimitri
2011-01-01
Well-accepted conceptual ship design tools can be used to explore a design space, but more precise results can be found using detailed models in full-featured computer-aided design programs. However, defining a detailed model is a time-intensive task, so time-sensitive projects have an incentive to use conceptual design tools to explore the design space. In this project, the combination of advanced aerospace systems design methods and an accepted conceptual design tool facilitates the creation of a tool that enables the user not only to visualize ship geometry but also to determine design feasibility and estimate the performance of a design.
Real-time robot deliberation by compilation and monitoring of anytime algorithms
NASA Technical Reports Server (NTRS)
Zilberstein, Shlomo
1994-01-01
Anytime algorithms are algorithms whose quality of results improves gradually as computation time increases. Certainty, accuracy, and specificity are metrics useful in anytime algorithm construction. It is widely accepted that a successful robotic system must trade off between decision quality and the computational resources used to produce it. Anytime algorithms were designed to offer such a trade-off. A model of the compilation and monitoring mechanisms needed to build robots that can efficiently control their deliberation time is presented. This approach simplifies the design and implementation of complex intelligent robots, mechanizes the composition and monitoring processes, and provides independent real-time robotic systems that automatically adjust resource allocation to yield optimal performance.
Kearney, N; Kidd, L; Miller, M; Sage, M; Khorrami, J; McGee, M; Cassidy, J; Niven, K; Gray, P
2006-07-01
Recent changes in cancer service provision mean that many patients spend a limited time in hospital and must therefore cope with and manage treatment-related side effects at home. Information technology can provide innovative solutions in promoting patient care through information provision, enhancing communication, monitoring treatment-related side effects and promoting self-care. The aim of this feasibility study was to evaluate the acceptability of using handheld computers as a symptom assessment and management tool for patients receiving chemotherapy for cancer. A convenience sample of patients (n = 18) and health professionals (n = 9) at one Scottish cancer centre was recruited. Patients used the handheld computer to record and send daily symptom reports to the cancer centre and receive instant, tailored symptom management advice during two treatment cycles. Both patients' and health professionals' perceptions of the handheld computer system were evaluated at baseline and at the end of the project. Patients believed the handheld computer had improved their symptom management and felt comfortable in using it. The health professionals also found the handheld computer to be helpful in assessing and managing patients' symptoms. This project suggests that a handheld-computer-based symptom management tool is feasible and acceptable to both patients and health professionals in complementing the care of patients receiving chemotherapy.
The effect of magnification loupes on the performance of preclinical dental students.
Maggio, Margrit P; Villegas, Hilda; Blatz, Markus B
2011-01-01
Optical magnifying devices such as magnification loupes are increasingly used in clinical practice and educational settings. However, scientific evidence to validate their benefits is limited. This study assessed the effect of dental magnification loupes on psychomotor skill acquisition during a preclinical operative dentistry course. The performance of first-year dental students was assessed during an Advanced Simulation Course (AS) using virtual reality-based technology (VRBT) training. The test group consisted of 116 dental students using magnification loupes (+MAG), while students not using them (-MAG, n = 116) served as the control. The following parameters were evaluated: number of successfully passed preparation procedures per course rotation, amount of time per tooth preparation, number of times students needed computer assistance and evaluation, and amount of time spent in the computer assistance and evaluation mode per procedure. Data were collected on each student through VRBT during the preparation procedure and stored on a closed network server. Unpaired t-tests were used to analyze mean differences between the groups. In addition, student acceptance of magnification loupes was measured and evaluated through survey interpretation. +MAG students completed more preparations, worked faster per procedure, and used the computer-assisted evaluation less frequently and for shorter periods, therefore displaying greater overall performance. The survey revealed a high degree of student acceptance of using magnification. Dental magnification loupes significantly enhanced student performance during preclinical dental education and were considered an effective adjunct by the students who used them.
Rational calculation accuracy in acousto-optical matrix-vector processor
NASA Astrophysics Data System (ADS)
Oparin, V. V.; Tigin, Dmitry V.
1994-01-01
The high speed of parallel computations in a comparatively small-size processor, together with acceptable power consumption, makes the acousto-optic matrix-vector multiplier (AOMVM) attractive for processing large amounts of information in real time. The limited accuracy of computations is an essential disadvantage of such a processor. Reduced accuracy requirements, however, allow for considerable simplification of the AOMVM architecture and a reduction of the demands on its components.
The Impact of Iranian Teachers Cultural Values on Computer Technology Acceptance
ERIC Educational Resources Information Center
Sadeghi, Karim; Saribagloo, Javad Amani; Aghdam, Samad Hanifepour; Mahmoudi, Hojjat
2014-01-01
This study was conducted with the aim of testing the technology acceptance model and the impact of Hofstede cultural values (masculinity/femininity, uncertainty avoidance, individualism/collectivism, and power distance) on computer technology acceptance among teachers at Urmia city (Iran) using the structural equation modeling approach. From among…
NASA Technical Reports Server (NTRS)
Staffanson, F. L.
1981-01-01
The FORTRAN computer program RAWINPROC accepts output from NASA Wallops computer program METPASS1 and produces input for NASA computer program 3.0.0700 (ECC-PRD). The three parts together form a software system for the completely automatic reduction of standard RAWINSONDE sounding data. RAWINPROC pre-edits the 0.1-second data, including time-of-day, azimuth, elevation, and sonde-modulated tone frequency; condenses the data according to successive dwells of the tone frequency; decommutates the condensed data into the proper channels (temperature, relative humidity, high and low references); determines the running baroswitch contact number and computes the associated pressure altitudes; and interpolates the data appropriate for input to ECC-PRD.
Efficiencies of joint non-local update moves in Monte Carlo simulations of coarse-grained polymers
NASA Astrophysics Data System (ADS)
Austin, Kieran S.; Marenz, Martin; Janke, Wolfhard
2018-03-01
In this study four update methods are compared in their performance in a Monte Carlo simulation of polymers in continuum space. The efficiencies of the update methods and combinations thereof are compared with the aid of the autocorrelation time with a fixed (optimal) acceptance ratio. Results are obtained for polymer lengths N = 14, 28 and 42 and temperatures below, at and above the collapse transition. In terms of autocorrelation, the optimal acceptance ratio is approximately 0.4. Furthermore, an overview of the step sizes of the update methods that correspond to this optimal acceptance ratio is given. This shall serve as a guide for future studies that rely on efficient computer simulations.
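The integrated autocorrelation time used above as the efficiency measure can be estimated roughly as follows. This is a generic sketch (FFT-based autocorrelation with a Sokal-style self-consistent window), not the authors' code; the cutoff constant c = 5 is an assumed convention.

```python
import numpy as np

def integrated_autocorrelation_time(x, c=5.0):
    """Estimate the integrated autocorrelation time tau of a Monte
    Carlo time series x. Roughly n / (2 * tau) of the n samples are
    effectively independent, so smaller tau means a more efficient
    update scheme at a given acceptance ratio."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xm = x - x.mean()
    # Linear (non-circular) autocovariance via zero-padded FFT.
    f = np.fft.rfft(xm, n=2 * n)
    acf = np.fft.irfft(f * np.conjugate(f))[:n]
    rho = acf / acf[0]                      # normalized autocorrelation
    tau = 1.0
    for W in range(1, n):
        tau = 1.0 + 2.0 * rho[1:W + 1].sum()
        if W >= c * tau:                    # self-consistent window cutoff
            break
    return tau
```

For an AR(1) process with coefficient phi, the theoretical value is (1 + phi) / (1 - phi), which is a convenient sanity check for the estimator.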
Bayley, Julie E; Brown, Katherine E
2015-12-09
With ongoing concerns about the sexual health and wellbeing of young people, there is increasing need to innovate intervention approaches. Engaging parents as agents to support their children, alongside capitalising on increasingly sophisticated technological options, could jointly enhance support. Converting existing programmes into interactive game-based options has the potential to broaden learning access whilst preserving behaviour change technique fidelity. However, the acceptability of this approach and the viability of adapting resources in this way are yet to be established. This paper reports on the process of converting an existing group programme ("What Should We Tell the Children?") and tests its acceptability within a community setting. Translation of the original programme included selecting exercises and gathering user feedback on character and message framing preferences. For acceptability testing, parents were randomised to either the game (n = 106) or a control (non-interactive webpage) condition (n = 76). At time 1 all participants completed a survey on demographics, computer literacy and Theory of Planned Behaviour (TPB) items. Post intervention (time 2) users repeated the TPB questions in addition to acceptability items. Interviews (n = 17) were conducted 3 months post intervention to gather qualitative feedback on transfer of learning into real life. The process of conversion identified clear preferences for first-person role play, a home setting and realistic characters, alongside positively phrased feedback. Evaluation results show that the game was acceptable to parents on cognitive and emotional dimensions, particularly for parents of younger children. Acceptability was not influenced by baseline demographics, computer skills or baseline TPB variables. MANOVA results and qualitative feedback suggest potential for effective translation of learning into real life. However, attrition was more likely in the game condition, potentially due to feedback text volume. A manualised group programme can be viably converted into a serious game format which is both cognitively and emotionally acceptable. The intervention may be more effectively targeted at parents with younger children, and further game developments must particularly address information dosing. Establishing the viability of digitally converting a group programme is a significant step forward for implementation-focused research.
CSI computer system/remote interface unit acceptance test results
NASA Technical Reports Server (NTRS)
Sparks, Dean W., Jr.
1992-01-01
The validation tests conducted on the Control/Structures Interaction (CSI) Computer System (CCS)/Remote Interface Unit (RIU) are discussed. The CCS/RIU consists of a commercially available, Langley Research Center (LaRC) programmed, space-flight-qualified computer and a flight data acquisition and filtering computer developed at LaRC. The tests were performed in the Space Structures Research Laboratory (SSRL) and included open-loop excitation, closed-loop control, safing, RIU digital filtering, and RIU stand-alone testing with the CSI Evolutionary Model (CEM) Phase-0 testbed. The test results indicated that the CCS/RIU system is comparable to ground-based systems in performing real-time control-structure experiments.
Neural network approach to proximity effect corrections in electron-beam lithography
NASA Astrophysics Data System (ADS)
Frye, Robert C.; Cummings, Kevin D.; Rietman, Edward A.
1990-05-01
The proximity effect, caused by electron beam backscattering during resist exposure, is an important concern in writing submicron features. It can be compensated by appropriate local changes in the incident beam dose, but computation of the optimal correction usually requires a prohibitively long time. We present an example of such a computation on a small test pattern, which we performed by an iterative method. We then used this solution as a training set for an adaptive neural network. After training, the network computed the same correction as the iterative method, but in a much shorter time. Correcting the image with a software-based neural network decreased the computation time by a factor of 30, and a hardware-based network enhanced the computation speed by more than a factor of 1000. Both methods had an acceptably small error of 0.5% compared with the results of the iterative computation. Additionally, we verified that the neural network correctly generalized the solution of the problem to patterns not contained in its training set.
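The kind of iterative dose correction the network was trained to reproduce can be sketched in simplified 1-D form. Everything below is illustrative, not the authors' algorithm: the two-Gaussian point-spread function (narrow forward-scatter plus wide backscatter), its weights and widths, and the multiplicative fixed-point iteration are all assumptions.

```python
import numpy as np

def gaussian(x, sigma):
    """Normalized discrete Gaussian kernel."""
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

# Assumed point-spread function: narrow forward-scatter term plus a
# wide backscatter term (weights/widths chosen for illustration only).
x = np.arange(-50, 51)
psf = 0.7 * gaussian(x, 1.0) + 0.3 * gaussian(x, 8.0)

# Target exposure: 1 inside the written feature, don't-care outside.
n = 200
pattern = np.zeros(n, dtype=bool)
pattern[80:120] = True

# Multiplicative fixed-point iteration: rescale the dose on pattern
# pixels until the delivered exposure there matches the target.
dose = pattern.astype(float)
for _ in range(1000):
    exposure = np.convolve(dose, psf, mode="same")
    dose[pattern] *= 1.0 / exposure[pattern]

exposure = np.convolve(dose, psf, mode="same")
```

The converged dose shows the characteristic proximity-correction signature: pixels near the feature edges, which receive less backscatter from neighbors, end up with a higher dose than interior pixels.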
Powsiri Klinkhachorn; J. Moody; Philip A. Araman
1995-01-01
For the past few decades, researchers have devoted time and effort to applying automation and modern computer technologies to improving the productivity of traditional industries. To be competitive, one must streamline operations and minimize production costs while maintaining an acceptable margin of profit. This paper describes one such endeavor...
Contracting for Computer Software in Standardized Computer Languages
Brannigan, Vincent M.; Dayhoff, Ruth E.
1982-01-01
The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.
Impact of Collaborative Work on Technology Acceptance: A Case Study from Virtual Computing
ERIC Educational Resources Information Center
Konak, Abdullah; Kulturel-Konak, Sadan; Nasereddin, Mahdi; Bartolacci, Michael R.
2017-01-01
Aim/Purpose: This paper utilizes the Technology Acceptance Model (TAM) to examine the extent to which acceptance of Remote Virtual Computer Laboratories (RVCLs) is affected by students' technological backgrounds and the role of collaborative work. Background: RVCLs are widely used in information technology and cyber security education to provide…
Berner, Eta S.; Detmer, Don E.; Simborg, Donald
2005-01-01
For over thirty years, there have been predictions that the widespread clinical use of computers was imminent. Yet the “wave” has never broken. In this article, two broad time periods are examined: the 1960s to the 1980s and the 1980s to the present. Technology immaturity, health administrators' focus on financial systems, application “unfriendliness,” and physician resistance were all barriers to acceptance during the early time period. Although these factors persist, changes in clinicians' economics, greater computer literacy in the general population, and, most importantly, changes in government policies and increased support for clinical computing suggest that the wave may break in the next decade. PMID:15492029
Resistance to Change: Reactions to Workplace Computerization.
ERIC Educational Resources Information Center
Gattiker, Urs E.; Larwood, Laurie
Although past research has suggested that computer acceptance and knowledge are two variables crucial in attaining desired profitability increases with computer-based technology, few studies have examined how these variables occur in organizational settings. A study was undertaken to examine acceptance of, and knowledge about, computer-based…
Performance of VPIC on Trinity
NASA Astrophysics Data System (ADS)
Nystrom, W. D.; Bergen, B.; Bird, R. F.; Bowers, K. J.; Daughton, W. S.; Guo, F.; Li, H.; Nam, H. A.; Pang, X.; Rust, W. N., III; Wohlbier, J.; Yin, L.; Albright, B. J.
2016-10-01
Trinity is a new major DOE computing resource that is going through final acceptance testing at Los Alamos National Laboratory. Trinity has several new and unique architectural features, including two compute partitions, one with dual-socket Intel Haswell Xeon compute nodes and one with Intel Knights Landing (KNL) Xeon Phi compute nodes. Additional unique features include on-package high-bandwidth memory (HBM) for the KNL nodes, the ability to configure the KNL nodes with respect to HBM mode and on-die network topology in a variety of operational modes at run time, and solid-state storage via burst-buffer technology to reduce the time required to perform I/O. An effort is in progress to port and optimize VPIC for Trinity and evaluate its performance. Because VPIC was recently released as open source, it is being used as part of acceptance testing for Trinity and is participating in the Trinity Open Science Program, which has resulted in excellent collaboration with both Cray and Intel. Results of this work will be presented on the performance of VPIC on both the Haswell and KNL partitions, for both single-node runs and runs at scale. Work performed under the auspices of the U.S. Dept. of Energy by Los Alamos National Security, LLC, Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.
NASA Technical Reports Server (NTRS)
Grantham, W. D.; Deal, P. L.
1974-01-01
A fixed-base simulator study was conducted to determine the minimum acceptable level of longitudinal stability for a representative turbofan STOL (short take-off and landing) transport airplane during the landing approach. Real-time digital simulation techniques were used. The computer was programmed with equations of motion for six degrees of freedom, and the aerodynamic inputs were based on measured wind-tunnel data. The primary piloting task was an instrument approach to a breakout at a 60-m (200-ft) ceiling.
Technology Acceptance Predictors among Student Teachers and Experienced Classroom Teachers
ERIC Educational Resources Information Center
Smarkola, Claudia
2007-01-01
This study investigated 160 student teachers' and 158 experienced teachers' self-reported computer usage and their future intentions to use computer applications for school assignments. The Technology Acceptance Model (TAM) was used as the framework to determine computer usage and intentions. Statistically significant results showed that after…
Orientation/Time Management Skill Training Lesson: Development and Evaluation. Final Report.
ERIC Educational Resources Information Center
Dobrovolny, Jacqueline L.; And Others
A lesson was developed containing materials designed to assist students in adapting to the novelties of a computer-assisted or computer-managed instructional environment, providing students with appropriate role models for increasing acceptance of their increased responsibility for learning and introducing a progress-tracking approach to assist…
Computational alternatives to obtain time optimal jet engine control. M.S. Thesis
NASA Technical Reports Server (NTRS)
Basso, R. J.; Leake, R. J.
1976-01-01
Two computational methods are described for determining an open-loop time-optimal control sequence for a simple single-spool turbojet engine modeled by a set of nonlinear differential equations. Both methods are modifications of widely accepted algorithms that solve fixed-time unconstrained optimal control problems with a free right end. The constrained problems considered here have fixed right ends and free final time. Dynamic programming is defined on a standard problem and yields a successive-approximation solution to the time-optimal problem of interest. A feedback control law is obtained and is then used to determine the corresponding open-loop control sequence. The Fletcher-Reeves conjugate gradient method has been selected for adaptation to solve a nonlinear optimal control problem with state-variable and control constraints.
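As an illustration of the dynamic-programming idea, here is a toy minimum-time problem for a scalar first-order system (not a turbojet model; the plant, grid, and control bound are assumptions): value iteration over a state grid yields the minimum time-to-go, from which a feedback law could be read off.

```python
# Toy dynamic-programming sketch of a minimum-time control problem:
# drive a scalar system x[k+1] = x[k] + u*dt into a target set near the
# origin with a bounded control, via value iteration on a state grid.
import numpy as np

dt, u_max = 0.1, 1.0
grid = np.linspace(-2.0, 2.0, 81)            # state discretization
target = np.abs(grid) <= 0.05                # terminal set near origin

T = np.where(target, 0.0, np.inf)            # minimum time-to-go
for _ in range(200):                         # value iteration
    T_new = T.copy()
    for i, x in enumerate(grid):
        if target[i]:
            continue
        best = np.inf
        for u in (-u_max, 0.0, u_max):       # admissible controls
            x_next = np.clip(x + u * dt, grid[0], grid[-1])
            j = np.argmin(np.abs(grid - x_next))
            best = min(best, dt + T[j])      # Bellman backup
        T_new[i] = best
    if np.allclose(T_new, T):                # converged
        break
    T = T_new

# With |u| <= 1, the time-to-go from x = 1.0 should be about 1.0 s.
i0 = np.argmin(np.abs(grid - 1.0))
print(f"minimum time from x=1: {T[i0]:.2f} s")
```

The argmin over controls at each state is exactly the feedback law the abstract mentions; playing it forward from any initial state reconstructs an open-loop control sequence.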
Aktas, Aynur; Hullihen, Barbara; Shrotriya, Shiva; Thomas, Shirley; Walsh, Declan; Estfan, Bassam
2015-03-01
Incorporation of tablet computers (TCs) into patient assessment may facilitate safe and secure data collection. We evaluated the usefulness and acceptability of a TC as an electronic self-report symptom assessment instrument. The Research Electronic Data Capture (REDCap) Web-based application supported data capture. Information was collected and disseminated in real time and in a structured format. Completed questionnaires were printed and given to the physician before the patient visit. Most participants completed the survey without assistance. The completion rate was 100%. The median global quality of life was high for all participants. More than half reported pain. Based on the Edmonton Symptom Assessment System, the three most common symptoms were tiredness, anxiety, and decreased well-being. Patient and physician acceptability of these quick and useful TC-based surveys was excellent.
COEFUV: A Computer Implementation of a Generalized Unmanned Vehicle Cost Model.
1978-10-01
DAS-TR-78-4. COEFUV: A Computer Implementation of a Generalized ... and the time to generate them are important. Many DAS participants supported this effort. The authors wish to acknowledge Richard H. Anderson for ... conflict and the on-going COMBAT ANGEL program at Davis-Monthan Air Force Base, there is not a generally accepted costing methodology for unmanned vehicles
Mullen, Kristin H; Berry, Donna L; Zierler, Brenda K
2004-09-01
To determine the acceptability and usability of a computerized quality-of-life (QOL) and symptom assessment tool and the graphically displayed QOL and symptom output in an ambulatory radiation oncology clinic. Descriptive, cross-sectional. Radiation oncology clinic located in an urban university medical center. 45 patients with cancer being evaluated for radiation therapy and 10 clinicians, who submitted 12 surveys. Acceptability of the computerized assessment was measured with an online, 16-item, Likert-style survey delivered as 45 patients undergoing radiation therapy completed a 25-item QOL and symptom assessment. Usability of the graphic output was assessed with clinician completion of a four-item paper survey. Acceptability and usability of computerized patient assessment. The patient acceptability survey indicated that 70% (n = 28) liked computers and 10% (n = 4) did not. The program was easy to use for 79% (n = 26), easy to understand for 91% (n = 30), and enjoyable for 71% (n = 24). Seventy-six percent (n = 25) believed that the amount of time needed to complete the computerized survey was acceptable. Sixty-six percent (n = 21) responded that they were satisfied with the program, and none of the participants chose the very dissatisfied response. Eighty-three percent (n = 10) of the clinicians found the graphic output helpful in promoting communication with patients, 75% (n = 9) found the output report helpful in identifying appropriate areas of QOL deficits or concerns, and 83% (n = 10) indicated that the output helped guide clinical interactions with patients. The computer-based QOL and symptom assessment tool is acceptable to patients, and the graphically displayed QOL and symptom output is useful to radiation oncology nurses and physicians. Wider application of computerized patient-generated data can continue in various cancer settings and be tested for clinical and organizational outcomes.
Rubidium frequency standard test program for NAVSTAR GPS
NASA Technical Reports Server (NTRS)
Koide, F.; Dederich, D. J.
1978-01-01
Test data from the RFS program in the production phase, together with computer automation, are presented as an essential element in the evaluation of RFS performance in a simulated spacecraft environment. Typical production test data are discussed for stabilities from 1 to 100,000 seconds averaging time, along with a simulated time-error accumulation test. Design considerations in developing the RFS test systems for production acceptance testing are also discussed.
Simulating Nonequilibrium Radiation via Orthogonal Polynomial Refinement
2015-01-07
... measured by the preprocessing time, computer memory space, and average query time. In many search procedures for the number of points np of a data set, a ... analytic expression for the radiative flux density is possible by the commonly accepted local thermal equilibrium (LTE) approximation. A semi ...
Experience using radio frequency laptops to access the electronic medical record in exam rooms.
Dworkin, L. A.; Krall, M.; Chin, H.; Robertson, N.; Harris, J.; Hughes, J.
1999-01-01
Kaiser Permanente, Northwest, evaluated the use of laptop computers to access our existing comprehensive Electronic Medical Record in exam rooms via a wireless radiofrequency (RF) network. Eleven of 22 clinicians who were offered the laptops successfully adopted their use in the exam room. These clinicians were able to increase their exam room time with the patient by almost 4 minutes (25%), apparently without lengthening their overall work day. Patient response to exam room computing was overwhelmingly positive. The RF network response time was similar to the hardwired network. Problems cited by some laptop users and many of the eleven non-adopters included battery issues, different equipment layout and function, and inadequate training. IT support needs for the RF laptops were two to four times greater than for hardwired desktops. Addressing the reliability and training issues should increase clinician acceptance, making a successful general roll-out for exam room computing more likely. PMID:10566458
Accelerated computer generated holography using sparse bases in the STFT domain.
Blinder, David; Schelkens, Peter
2018-01-22
Computer-generated holography at high resolutions is a computationally intensive task. Efficient algorithms are needed to generate holograms at acceptable speeds, especially for real-time and interactive applications such as holographic displays. We propose a novel technique to generate holograms using a sparse basis representation in the short-time Fourier space combined with a wavefront-recording plane placed in the middle of the 3D object. By computing the point spread functions in the transform domain, we update only a small subset of the precomputed largest-magnitude coefficients to significantly accelerate the algorithm over conventional look-up table methods. We implement the algorithm on a GPU, and report a speedup factor of over 30. We show that this transform is superior to wavelet-based approaches, and show quantitative and qualitative improvements over the state-of-the-art WASABI method; we report accuracy gains of 2 dB PSNR, as well as improved view preservation.
Discrete-State Simulated Annealing For Traveling-Wave Tube Slow-Wave Circuit Optimization
NASA Technical Reports Server (NTRS)
Wilson, Jeffrey D.; Bulson, Brian A.; Kory, Carol L.; Williams, W. Dan (Technical Monitor)
2001-01-01
Algorithms based on the global optimization technique of simulated annealing (SA) have proven useful in designing traveling-wave tube (TWT) slow-wave circuits for high RF power efficiency. The characteristic of SA that enables it to determine a globally optimized solution is its ability to accept non-improving moves in a controlled manner. In the initial stages of the optimization, the algorithm moves freely through configuration space, accepting most of the proposed designs. This freedom of movement allows non-intuitive designs to be explored rather than restricting the optimization to local improvement upon the initial configuration. As the optimization proceeds, the rate of acceptance of non-improving moves is gradually reduced until the algorithm converges to the optimized solution. The rate at which the freedom of movement is decreased is known as the annealing or cooling schedule of the SA algorithm. The main disadvantage of SA is that there is no rigorous theoretical foundation for determining the parameters of the cooling schedule. The choice of these parameters is highly problem dependent, and the designer needs to experiment in order to determine values that will provide a good optimization in a reasonable amount of computational time. This experimentation can absorb a large amount of time, especially when the algorithm is being applied to a new type of design. In order to eliminate this disadvantage, a variation of SA known as discrete-state simulated annealing (DSSA) was recently developed. DSSA provides the theoretical foundation for a generic cooling schedule which is problem independent. Results of similar quality to SA can be obtained, but without the extra computational time required to tune the cooling parameters. Two algorithm variations based on DSSA were developed and programmed into a Microsoft Excel spreadsheet graphical user interface (GUI) to the two-dimensional nonlinear multisignal helix traveling-wave amplifier analysis program TWA3.
The algorithms were used to optimize the computed RF efficiency of a TWT by determining the phase velocity profile of the slow-wave circuit. The mathematical theory and computational details of the DSSA algorithms will be presented and results will be compared to those obtained with a SA algorithm.
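The controlled acceptance of non-improving moves described above is commonly implemented with the Metropolis criterion. A minimal sketch with an illustrative objective and a geometric cooling schedule follows; it is neither the TWT circuit model nor the DSSA schedule:

```python
# Minimal simulated-annealing sketch: non-improving moves are accepted
# with a probability that shrinks as the "temperature" T is lowered.
import math
import random

random.seed(1)

def objective(x):
    return x * x  # stand-in cost (e.g., negative RF efficiency)

x, T = 10.0, 5.0
for _ in range(5000):
    cand = x + random.uniform(-1, 1)          # propose a nearby design
    delta = objective(cand) - objective(x)
    # Metropolis criterion: always accept improvements; accept
    # non-improving moves with probability exp(-delta / T).
    if delta <= 0 or random.random() < math.exp(-delta / T):
        x = cand
    T *= 0.999  # geometric cooling schedule (the hand-tuned part)

print(f"final design: {x:.3f}")
```

The cooling factor (0.999 here) is exactly the kind of problem-dependent parameter that, per the abstract, DSSA's generic schedule aims to eliminate.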
A FORTRAN program for multivariate survival analysis on the personal computer.
Mulder, P G
1988-01-01
In this paper a FORTRAN program is presented for multivariate survival or life-table regression analysis in a competing-risks situation. The relevant failure rate (for example, a particular disease or mortality rate) is modelled as a log-linear function of a vector of (possibly time-dependent) explanatory variables. The explanatory variables may also include time itself, which is useful for parameterizing piecewise-exponential time-to-failure distributions in a Gompertz-like or Weibull-like way as a more efficient alternative to Cox's proportional hazards model. Maximum likelihood estimates of the coefficients of the log-linear relationship are obtained by the iterative Newton-Raphson method. The program runs on a personal computer under DOS; running time is quite acceptable, even for large samples.
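A hedged sketch of this model class: a log-linear failure rate lambda_i = exp(beta'x_i) with exponential event times, fitted by Newton-Raphson maximum likelihood. The synthetic data, the absence of censoring, and all variable names are illustrative assumptions, not features of the FORTRAN program.

```python
# Log-linear hazard model: lambda_i = exp(beta' x_i), exponential event
# times, maximum likelihood via Newton-Raphson iteration.
import numpy as np

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([np.ones(n), rng.random(n)])   # intercept + one covariate
beta_true = np.array([-1.0, 0.8])
lam = np.exp(X @ beta_true)
t = rng.exponential(1.0 / lam)                     # uncensored event times

beta = np.zeros(2)
for _ in range(25):
    lam = np.exp(X @ beta)
    # Log-likelihood: sum_i (beta' x_i - lam_i * t_i); its gradient
    # (score) and Hessian follow directly.
    score = X.T @ (1 - lam * t)
    hess = -(X * (lam * t)[:, None]).T @ X
    beta -= np.linalg.solve(hess, score)           # Newton-Raphson step

print("estimated beta:", np.round(beta, 2))
```

Time-dependent covariates, as the abstract notes, can be handled in this framework by including functions of time itself among the columns of X.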
Exploring Complex Social Phenomena with Computer Simulations
ERIC Educational Resources Information Center
Berson, Ilene R.; Berson, Michael J.
2007-01-01
In social studies classes, there is a longstanding interest in how societies evolve and change over time. However, as stories of the past unfold, it is often difficult to identify a direct link between causes and effects, so students are forced to accept at face value the interpretations of economists, political scientists, historians,…
Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard
2011-06-01
Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.
NASA Astrophysics Data System (ADS)
Figl, Michael; Birkfellner, Wolfgang; Watzinger, Franz; Wanschitz, Felix; Hummel, Johann; Hanel, Rudolf A.; Ewers, Rolf; Bergmann, Helmar
2002-05-01
Two main concepts of head-mounted displays (HMDs) for augmented reality (AR) visualization exist: the optical and the video see-through type. Several research groups have pursued both approaches for utilizing HMDs in computer-aided surgery. While the hardware requirements for a video see-through HMD to achieve acceptable time delay and frame rate appear enormous, the clinical acceptance of such a device is doubtful from a practical point of view. Starting from previous work on displaying additional computer-generated graphics in operating microscopes, we have adapted a miniature head-mounted operating microscope for AR by integrating two very small computer displays. To calibrate the projection parameters of this so-called Varioscope AR, we used Tsai's algorithm for camera calibration. Connection to a surgical navigation system was made by defining an open interface to the control unit of the Varioscope AR. The control unit consists of a standard PC with a dual-head graphics adapter to render and display the desired augmentation of the scene. We connected this control unit to a computer-aided surgery (CAS) system via a TCP/IP interface. In this paper we present the control unit for the HMD and its software design. We tested two different optical tracking systems: the Flashpoint (Image Guided Technologies, Boulder, CO), which provided about 10 frames per second, and the Polaris (Northern Digital, Ontario, Canada), which provided at least 30 frames per second, both with a time delay of one frame.
De Cocker, Katrien; De Bourdeaudhuij, Ilse; Cardon, Greet; Vandelanotte, Corneel
2015-09-24
Because of the adverse health effects in adults, interventions to influence workplace sitting, a large contributor to overall daily sedentary time, are needed. Computer-tailored interventions have demonstrated good outcomes in other health behaviours, though few have targeted sitting time at work. Therefore, the present aims were to (1) describe the development of a theory-driven, web-based, computer-tailored advice to influence sitting at work, (2) report on the feasibility of reaching employees, and (3) report on the acceptability of the advice. Employees from a public city service (n = 179) were invited by e-mail to participate. Employees interested to request the advice (n = 112) were sent the website link, a personal login and password. The online advice was based on different aspects of the Theory of Planned Behaviour, Self-Determination Theory and Self-Regulation Theory. Logistic regressions were conducted to compare characteristics (gender, age, education, employment status, amount of sitting and psychosocial correlates of workplace sitting) of employees requesting the advice (n = 90, 80.4%) with those who did not. Two weeks after visiting the website, 47 employees (52.2%) completed an online acceptability questionnaire. Those with a high education were more likely to request the advice than those with a low education (OR = 2.4, CI = 1.0-5.8), and those with a part-time job were more likely to request the advice compared to full-time employees (OR = 2.9, CI = 1.2-7.1). The majority found the advice interesting (n = 36/47, 76.6%), relevant (n = 33/47, 70.2%) and motivating (n = 29/47, 61.7%). Fewer employees believed the advice was practicable (n = 15/47, 31.9%). After completing the advice, 58.0% (n = 25/43) reported to have started interrupting their sitting and 32.6% (n = 17/43) additionally intended to do so; 14.0 % (n = 6/43) reported to have reduced their sitting and another 51.2% (n = 22/43) intended to do so. 
More efforts are needed to reach lower educated and full-time workers. Further research should examine the effects of this intervention in a rigorous randomised controlled trial. It is feasible to reach employees with this tool. Most of the employees who requested the advice found it acceptable and reported they changed their behaviour or intended to change it. Interrupting sitting appeared more achievable than reducing workplace sitting.
Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Douglas; Ziegler, Andrew C.
2009-01-01
In-stream continuous turbidity and streamflow data, calibrated with measured suspended-sediment concentration data, can be used to compute a time series of suspended-sediment concentration and load at a stream site. Development of a simple linear (ordinary least squares) regression model for computing suspended-sediment concentrations from instantaneous turbidity data is the first step in the computation process. If the model standard percentage error (MSPE) of the simple linear regression model meets a minimum criterion, this model should be used to compute a time series of suspended-sediment concentrations. Otherwise, a multiple linear regression model using paired instantaneous turbidity and streamflow data is developed and compared to the simple regression model. If the inclusion of the streamflow variable proves to be statistically significant and the uncertainty associated with the multiple regression model results in an improvement over that for the simple linear model, the turbidity-streamflow multiple linear regression model should be used to compute a suspended-sediment concentration time series. The computed concentration time series is subsequently used with its paired streamflow time series to compute suspended-sediment loads by standard U.S. Geological Survey techniques. Once an acceptable regression model is developed, it can be used to compute suspended-sediment concentration beyond the period of record used in model development with proper ongoing collection and analysis of calibration samples. Regression models to compute suspended-sediment concentrations are generally site specific and should never be considered static, but they represent a set period in a continually dynamic system in which additional data will help verify any change in sediment load, type, and source.
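The two-step model selection described above can be sketched as follows. The synthetic data and the improvement threshold are assumptions for illustration; the actual USGS procedure judges the simple model by its model standard percentage error (MSPE) and by the statistical significance of the streamflow term.

```python
# Sketch: fit a simple linear regression of suspended-sediment
# concentration (SSC) on turbidity, then test whether adding streamflow
# as a second explanatory variable improves the fit.
import numpy as np

rng = np.random.default_rng(7)
n = 120
turbidity = rng.uniform(10, 400, n)
streamflow = rng.uniform(1, 50, n)
# Synthetic "measured" SSC depending on both variables, plus noise.
ssc = 0.9 * turbidity + 2.0 * streamflow + rng.normal(0, 10, n)

def ols(X, y):
    """Ordinary least squares; returns coefficients and residual RMSE."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return coef, float(np.sqrt(np.mean(resid ** 2)))

ones = np.ones(n)
_, rmse_simple = ols(np.column_stack([ones, turbidity]), ssc)
_, rmse_multi = ols(np.column_stack([ones, turbidity, streamflow]), ssc)

print(f"simple-model RMSE:   {rmse_simple:.1f}")
print(f"multiple-model RMSE: {rmse_multi:.1f}")
# Illustrative decision rule: keep the turbidity-streamflow model only
# if it clearly reduces the error of the turbidity-only model.
use_multiple = rmse_multi < 0.9 * rmse_simple
```

Whichever model is accepted is then applied to the continuous turbidity (and streamflow) record to produce the SSC time series used for load computation.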
Parametric Study of Pulse-Combustor-Driven Ejectors at High-Pressure
NASA Technical Reports Server (NTRS)
Yungster, Shaye; Paxson, Daniel E.; Perkins, Hugh D.
2015-01-01
Pulse-combustor configurations developed in recent studies have demonstrated performance levels at high-pressure operating conditions comparable to those observed at atmospheric conditions. However, problems related to the way fuel was being distributed within the pulse combustor were still limiting performance. In the first part of this study, new configurations are investigated computationally, aimed at improving the fuel distribution and performance of the pulse combustor. Subsequent sections investigate the performance of various pulse-combustor-driven ejector configurations operating at high-pressure conditions, focusing on the effects of fuel equivalence ratio and ejector throat area. The goal is to design pulse-combustor-ejector configurations that maximize pressure gain while achieving a thermal environment acceptable to a turbine, and at the same time maintain acceptable levels of NOx emissions and flow non-uniformities. The computations presented here have demonstrated pressure gains of up to 2.8%.
Simulation of Clinical Diagnosis: A Comparative Study
de Dombal, F. T.; Horrocks, Jane C.; Staniland, J. R.; Gill, P. W.
1971-01-01
This paper presents a comparison between three different modes of simulation of the diagnostic process—a computer-based system, a verbal mode, and a further mode in which cards were selected from a large board. A total of 34 subjects worked through a series of 444 diagnostic simulations. The verbal mode was found to be most enjoyable and realistic. At the board, considerable amounts of extra irrelevant data were selected. At the computer, the users asked the same questions every time, whether or not they were relevant to the particular diagnosis. They also found the teletype distracting, noisy, and slow. The need for an acceptable simulation system remains, and at present our Minisim and verbal modes are proving useful in training junior clinical students. Future simulators should be flexible, economical, and acceptably realistic—and to us this latter criterion implies the two-way use of speech. We are currently developing and testing such a system. PMID:5579197
ERIC Educational Resources Information Center
Mavrou, Katerina
2012-01-01
This paper discusses the results of peer acceptance in a study investigating the interactions of pairs of disabled and non-disabled pupils working together on computer-based tasks in mainstream primary schools in Cyprus. Twenty dyads of pupils were observed and videotaped while working together at the computer. Data analyses were based on the…
Framework for architecture-independent run-time reconfigurable applications
NASA Astrophysics Data System (ADS)
Lehn, David I.; Hudson, Rhett D.; Athanas, Peter M.
2000-10-01
Configurable Computing Machines (CCMs) have emerged as a technology with the computational benefits of custom ASICs as well as the flexibility and reconfigurability of general-purpose microprocessors. Significant effort from the research community has focused on techniques to move this reconfigurability from a rapid application development tool to a run-time tool. This requires the ability to change the hardware design while the application is executing and is known as Run-Time Reconfiguration (RTR). Widespread acceptance of run-time reconfigurable custom computing depends upon the existence of high-level automated design tools. Such tools must reduce the designer's effort to port applications between different platforms as the architecture, hardware, and software evolve. A Java implementation of a high-level application framework, called Janus, is presented here. In this environment, developers create Java classes that describe the structural behavior of an application. The framework allows hardware and software modules to be freely mixed and interchanged. A compilation phase of the development process analyzes the structure of the application and adapts it to the target platform. Janus is capable of structuring the run-time behavior of an application to take advantage of the memory and computational resources available.
An acceptable role for computers in the aircraft design process
NASA Technical Reports Server (NTRS)
Gregory, T. J.; Roberts, L.
1980-01-01
Some of the reasons why the computerization trend is not wholly accepted are explored for two typical cases: computer use in the technical specialties and computer use in aircraft synthesis. The factors that limit acceptance are traced, in part, to the large resources needed to understand the details of computer programs, the inability to include measured data as input to many of the theoretical programs, and the presentation of final results without supporting intermediate answers. Other factors are due solely to technical issues, such as limited detail in aircraft synthesis and major simplifying assumptions in the technical specialties. These factors and others can be influenced by the technical specialist and aircraft designer. Some of these factors may become less significant as the computerization process evolves, but some issues, such as understanding large integrated systems, may remain issues in the future. Suggestions for improved acceptance include publishing computer programs so that they may be reviewed, edited, and read. Other mechanisms include extensive modularization of programs and ways to include measured information as part of the input to theoretical approaches.
NASA Technical Reports Server (NTRS)
Orr, John L.
1997-01-01
In many ways, the typical approach to the handling of bibliographic material for generating review articles and similar manuscripts has changed little since the use of xerographic reproduction became widespread. The basic approach is to collect reprints of the relevant material and place them in folders or stacks based on their dominant content. As the amount of information available increases with the passage of time, the viability of this mechanical approach to bibliographic management decreases. The personal computer revolution has changed the way we deal with many familiar tasks. For example, word processing on personal computers has supplanted the typewriter for many applications. Similarly, spreadsheets have not only replaced many routine uses of calculators but have also made possible new applications because the cost of calculation is extremely low. Objective: The objective of this research was to use personal computer bibliographic software technology to support the determination of spacecraft maximum acceptable concentration (SMAC) values. Specific Aims: The specific aims were to produce draft SMAC documents for hydrogen sulfide and tetrachloroethylene, taking maximum advantage of the bibliographic software.
ALMA Correlator Real-Time Data Processor
NASA Astrophysics Data System (ADS)
Pisano, J.; Amestica, R.; Perez, J.
2005-10-01
The design of a real-time Linux application utilizing the Real-Time Application Interface (RTAI) to process real-time data from the radio astronomy correlator for the Atacama Large Millimeter Array (ALMA) is described. The correlator is a custom-built digital signal processor which computes the cross-correlation function of two digitized signal streams. ALMA will have 64 antennas with 2080 signal streams, each with a sample rate of 4 giga-samples per second. The correlator's aggregate data output will be 1 gigabyte per second. The software is defined by hard deadlines with high input and processing data rates, while requiring interfaces to non-real-time external computers. The designed computer system, the Correlator Data Processor (CDP), consists of a cluster of 17 SMP computers: 16 compute nodes plus a master controller node, all running real-time Linux kernels. Each compute node uses an RTAI kernel module to interface to a 32-bit parallel interface which accepts raw data at 64 megabytes per second in 1 megabyte chunks every 16 milliseconds. These data are transferred to tasks running on multiple CPUs in hard real time using RTAI's LXRT facility to perform quantization corrections, data windowing, FFTs, and phase corrections for a processing rate of approximately 1 GFLOPS. Highly accurate timing signals are distributed to all seventeen computer nodes in order to synchronize them to other time-dependent devices in the observatory array. RTAI kernel tasks interface to the timing signals, providing sub-millisecond timing resolution. The CDP interfaces, via the master node, to other computer systems on an external intranet for command and control, data storage, and further data (image) processing. The master node accesses these external systems utilizing ALMA Common Software (ACS), a CORBA-based client-server software infrastructure providing logging, monitoring, data delivery, and inter-computer function invocation.
The software is being developed in tandem with the correlator hardware which presents software engineering challenges as the hardware evolves. The current status of this project and future goals are also presented.
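The per-chunk windowing-and-FFT stage described above can be sketched generically. The function name, window choice, and chunk shape below are illustrative assumptions, not ALMA's actual CDP code:

```python
import numpy as np

def process_chunk(lags, window=None):
    """Toy stand-in for the CDP's per-chunk pipeline: taper the raw lag
    (correlation) data with a window, then FFT it into a cross-power
    spectrum. Quantization and phase corrections are omitted."""
    x = np.asarray(lags, dtype=float)
    if window is None:
        window = np.hanning(len(x))  # illustrative taper to limit spectral leakage
    return np.fft.rfft(x * window)
```

In the real system each 1 megabyte chunk would pass through a stage of this kind inside a hard real-time LXRT task, alongside the quantization and phase corrections the abstract mentions.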
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.
Cloud computing is a promising technology to manage and improve utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving there is no need to unnecessarily operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on in heavy load cases to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows estimation of a number of performance measures.
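The hysteresis idea, switching a server on and off at different queue-length thresholds so that brief load fluctuations trigger nothing, can be illustrated with a toy discrete-time simulation. All rates and thresholds below are made up for illustration and are not from the paper's model:

```python
import random

def simulate_hysteresis(steps=20_000, lam=4.0, mu=1.0, up=20, down=5, seed=1):
    """Toy discrete-time queue: a server is switched on when the queue
    length crosses `up` and off when it drops below `down`. Because the
    two thresholds differ, load fluctuations between `down` and `up`
    change nothing (hysteresis)."""
    rng = random.Random(seed)
    queue, servers = 0, 1
    busy = 0.0
    for _ in range(steps):
        # Binomial stand-ins for Poisson arrivals and per-server service.
        queue += sum(rng.random() < lam / 10 for _ in range(10))
        queue -= min(queue, sum(rng.random() < mu / 10 for _ in range(10 * servers)))
        if queue > up:
            servers += 1          # heavy load: activate a server
        elif queue < down and servers > 1:
            servers -= 1          # light load: deactivate a server
        busy += servers
    return queue, servers, busy / steps
```

A full model would add setup costs and noninstantaneous activation delays; this sketch only shows why distinct on/off thresholds damp reaction to momentary load changes.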
Experimental validation of pulsed column inventory estimators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyerlein, A.L.; Geldard, J.F.; Weh, R.
Near-real-time accounting (NRTA) for reprocessing plants relies on the timely measurement of all transfers through the process area and all inventory in the process. It is difficult to measure the inventory of the solvent contactors; therefore, estimation techniques are considered. We have used experimental data obtained at the TEKO facility in Karlsruhe and have applied computer codes developed at Clemson University to analyze these data. For uranium extraction, the computer predictions agree to within 15% of the measured inventories. We believe this study is significant in demonstrating that using theoretical models with a minimum amount of process data may be an acceptable approach to column inventory estimation for NRTA. 15 refs., 7 figs.
A Computational Approach for Probabilistic Analysis of Water Impact Simulations
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.
2009-01-01
NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those addressed in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
An Augmented Lagrangian Filter Method for Real-Time Embedded Optimization
Chiang, Nai -Yuan; Huang, Rui; Zavala, Victor M.
2017-04-17
We present a filter line-search algorithm for nonconvex continuous optimization that combines an augmented Lagrangian function and a constraint violation metric to accept and reject steps. The approach is motivated by real-time optimization applications that need to be executed on embedded computing platforms with limited memory and processor speeds. The proposed method enables primal–dual regularization of the linear algebra system that in turn permits the use of solution strategies with lower computing overheads. We prove that the proposed algorithm is globally convergent and we demonstrate the developments using a nonconvex real-time optimization application for a building heating, ventilation, and air conditioning system. Our numerical tests are performed on a standard processor and on an embedded platform. Lastly, we demonstrate that the approach reduces solution times by a factor of over 1000.
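The two ingredients named above, an augmented Lagrangian merit function and a filter acceptance test on (constraint violation, merit) pairs, can be sketched in their generic textbook form. This is not the authors' implementation, only an illustration of the concepts:

```python
import numpy as np

def aug_lagrangian(f, c, x, lam, rho):
    """Standard augmented Lagrangian for equality constraints c(x) = 0:
    L_A(x) = f(x) + lam^T c(x) + (rho/2) ||c(x)||^2."""
    cx = c(x)
    return f(x) + lam @ cx + 0.5 * rho * (cx @ cx)

def filter_accepts(filt, theta, phi):
    """Simplified filter test: a trial point with constraint violation
    `theta` and merit `phi` is acceptable if no stored pair (t, p)
    dominates it, i.e. it improves on at least one of the two measures
    against every filter entry."""
    return all(theta < t or phi < p for t, p in filt)
```

For example, with f(x) = ||x||^2 and the single constraint x1 + x2 = 1 (a made-up toy problem), a feasible point contributes only its objective value to the merit function.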
Message From the Editor for Contributions to the 2016 Real Time Conference Issue of TNS
NASA Astrophysics Data System (ADS)
Schmeling, Sascha Marc
2017-06-01
This issue of the IEEE Transactions on Nuclear Science (TNS) is devoted to the 20th IEEE-NPSS Real Time Conference (RT2016) on Computing Applications in Nuclear and Plasma Sciences held in Padua, Italy, in June 2016. A total of 90 papers presented at the conference were submitted for possible publication in TNS. This conference issue presents 46 papers, which have been accepted so far after a thorough peer review process. These contributions come from a very broad range of fields of application, including Astrophysics, Medical Imaging, Nuclear and Plasma Physics, Particle Accelerators, and Particle Physics Experiments. Several papers were close to being accepted but did not make it into this special issue. They will be considered for further publication.
Sehlen, Susanne; Ott, Martin; Marten-Mittag, Birgitt; Haimerl, Wolfgang; Dinkel, Andreas; Duehmke, Eckhart; Klein, Christian; Schaefer, Christof; Herschbach, Peter
2012-07-01
This study investigated feasibility and acceptance of computer-based assessment for the identification of psychosocial distress in routine radiotherapy care. 155 cancer patients were assessed using QSC-R10, PO-Bado-SF and Mach-9. The congruence between computerized tablet PC and conventional paper assessment was analysed in 50 patients. The agreement between the two modes was high (ICC 0.869-0.980). Acceptance of computer-based assessment was very high (>95%). Sex, age, education, distress and Karnofsky performance status (KPS) did not influence acceptance. Computerized assessment was rated more difficult by older patients (p = 0.039) and patients with low KPS (p = 0.020). 75.5% of the respondents supported referral for psychosocial intervention for distressed patients. The prevalence of distress was 27.1% (QSC-R10). Computer-based assessment allows easy identification of distressed patients. Level of staff involvement is low, and the results are quickly available for care providers. © Georg Thieme Verlag KG Stuttgart · New York.
High-Productivity Computing in Computational Physics Education
NASA Astrophysics Data System (ADS)
Tel-Zur, Guy
2011-03-01
We describe the development of a new course in Computational Physics at Ben-Gurion University. This elective course for 3rd-year undergraduates and MSc students is taught during one semester. Computational Physics is by now well accepted as the Third Pillar of Science. This paper's claim is that modern Computational Physics education should also deal with High-Productivity Computing. The traditional approach to teaching Computational Physics emphasizes ``Correctness'' and then ``Accuracy''; we add ``Performance.'' Along with topics in Mathematical Methods and case studies in Physics, the course devotes a significant amount of time to ``Mini-Courses'' on topics such as: High-Throughput Computing - Condor, Parallel Programming - MPI and OpenMP, How to Build a Beowulf, Visualization, and Grid and Cloud Computing. The course intends to teach neither new physics nor new mathematics; it is focused on an integrated approach for solving problems, starting from the physics problem, the corresponding mathematical solution, the numerical scheme, writing an efficient computer code, and finally analysis and visualization.
Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability
NASA Astrophysics Data System (ADS)
Thanh, Vo Hong; Priami, Corrado; Zunino, Roberto
2016-06-01
Stochastic simulation of large biochemical reaction networks is often computationally expensive due to the disparate reaction rates and high variability of the populations of chemical species. An approach to accelerate the simulation is to allow multiple reaction firings before performing an update, by assuming that reaction propensities change by a negligible amount during a time interval. Species with small populations in the firings of fast reactions significantly affect both the performance and accuracy of this simulation approach. It is even worse when these small-population species are involved in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and the rejection-based mechanism to select next reaction firings. The reaction is ensured to be selected to fire with an acceptance rate greater than a predefined probability, and the selection becomes exact if the probability is set to one. Our new algorithm improves the computational cost of selecting the next reaction firing and reduces the cost of updating reaction propensities.
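The rejection-based selection underlying this family of algorithms can be sketched as follows: a candidate reaction is drawn in proportion to its propensity upper bound and then accepted with probability a_j/b_j, so the selection remains exact while exact propensities are evaluated only for candidates. This is a simplified sketch of the mechanism, not the authors' bounded-acceptance code:

```python
import random

def select_reaction(bounds, propensity, rng):
    """Rejection-based reaction selection: draw candidate j with
    probability proportional to its propensity upper bound bounds[j],
    then accept with probability propensity(j) / bounds[j]. The
    resulting distribution is exactly proportional to the true
    propensities, yet propensity() is called only for candidates."""
    total = sum(bounds)
    while True:
        r = rng.random() * total
        j, acc = 0, bounds[0]
        while r > acc:           # locate the candidate by cumulative sum
            j += 1
            acc += bounds[j]
        # Acceptance test against the exact propensity a_j.
        if rng.random() * bounds[j] <= propensity(j):
            return j
```

The paper's contribution is to bound this acceptance probability from below so that selection cost stays low while accuracy is controlled; the sketch above corresponds to the exact (probability-one) limit.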
Integrated Sensing and Processing in Missile Systems
2004-03-31
"Nanoprobes: The Geometry of Processing and Sensing," H. A. Schmitt, et al., 5th Asian Control Conference, Melbourne, Australia, 2004, accepted. 17. [CR] ...Computers, special session on "Signal Processing for Agile Sensors," Pacific Grove, CA, 7-10 November 2004, accepted. 20. [CI] "Computational Origami for
Beck, J R; Fung, K; Lopez, H; Mongero, L B; Argenziano, M
2015-01-01
Delayed perfusionist identification of and reaction to abnormal clinical situations has been reported to contribute to increased mortality and morbidity. The use of automated data acquisition and compliance safety alerts has been widely accepted in many industries and its use may improve operator performance. A study was conducted to evaluate the reaction time of perfusionists with and without the use of compliance alerts. A compliance alert is a computer-generated pop-up banner on a pump-mounted computer screen to notify the user of clinical parameters outside of a predetermined range. A proctor monitored and recorded the time from an alert until the perfusionist recognized the parameter was outside the desired range. Group 1 included 10 cases utilizing compliance alerts; Group 2 included 10 cases with the primary perfusionist blinded to the compliance alerts. In Group 1, 97 compliance alerts were identified and, in Group 2, 86 alerts were identified. The average reaction time in the group using compliance alerts was 3.6 seconds. The average reaction time in the group not using the alerts was nearly ten times longer than in the group using computer-assisted, real-time data feedback. Some believe that real-time computer data acquisition and feedback improves perfusionist performance and may allow clinicians to identify and rectify potentially dangerous situations. © The Author(s) 2014.
Computation of Acoustic Waves Through Sliding-Zone Interfaces Using an Euler/Navier-Stokes Code
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.
1996-01-01
The effect of a patched sliding-zone interface on the transmission of acoustic waves is examined for two- and three-dimensional model problems. A simple but general interpolation scheme at the patched boundary passes acoustic waves without distortion, provided that a sufficiently small time step is taken. A guideline is provided for the maximum permissible time step or zone speed that gives an acceptable error introduced by the sliding-zone interface.
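A simple periodic linear interpolation illustrates the kind of operation performed at such a boundary: donor-zone values are sampled at receiver points offset by a fraction of a cell, as when one zone slides past another. This is an assumption-laden sketch, not the paper's exact scheme:

```python
import numpy as np

def interface_values(donor, offset):
    """Periodic linear interpolation sketch for a sliding-zone interface:
    sample the donor-zone array at receiver points shifted by `offset`
    cells (integer part handled by rotation, fractional part by a
    two-point linear blend)."""
    w = offset - np.floor(offset)          # fractional cell offset
    i = int(np.floor(offset))              # whole-cell shift
    return (1 - w) * np.roll(donor, -i) + w * np.roll(donor, -(i + 1))
```

As the abstract notes, the fidelity of any such scheme to a passing acoustic wave depends on keeping the time step (equivalently, the per-step zone motion) small enough.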
Mitten, H.T.; Lines, G.C.; Berenbrock, Charles; Durbin, T.J.
1988-01-01
Because of the imbalance between recharge and pumpage, groundwater levels declined as much as 100 ft in some areas of Borrego Valley, California during 1945-80. As an aid to analyzing the effects of pumping on the groundwater system, a three-dimensional finite-element groundwater flow model was developed. The model was calibrated for both steady-state (1945) and transient-state (1946-79) conditions. For the steady-state calibration, hydraulic conductivities of the three aquifers were varied within reasonable limits to obtain an acceptable match between measured and computed hydraulic heads. Recharge from streamflow infiltration (4,800 acre-ft/yr) was balanced by computed evapotranspiration (3,900 acre-ft/yr) and computed subsurface outflow from the model area (930 acre-ft/yr). For the transient-state calibration, the volumes and distribution of net groundwater pumpage were estimated from land-use data and estimates of consumptive use for irrigated crops. The pumpage was assigned to the appropriate nodes in the model for each of seventeen 2-year time steps representing the period 1946-79. The specific yields of the three aquifers were varied within reasonable limits to obtain an acceptable match between measured and computed hydraulic heads. Groundwater pumpage input to the model was compensated by declines in both the computed evapotranspiration and the amount of groundwater in storage. (USGS)
2014-01-01
Background: Existing instruments for measuring problematic computer and console gaming and internet use are often lengthy and often based on a pathological perspective. The objective was to develop and present a new and short non-clinical measurement tool for perceived problems related to computer use and gaming among adolescents and to study the association between screen time and perceived problems. Methods: Cross-sectional school survey of 11-, 13-, and 15-year-old students in thirteen schools in the City of Aarhus, Denmark; participation rate 89%, n = 2100. The main exposure was time spent on weekdays on computer and console gaming and internet use for communication and surfing. The outcome measures were three indexes on perceived problems related to computer and console gaming and internet use. Results: The three new indexes showed high face validity and acceptable internal consistency. Most schoolchildren with high screen time did not experience problems related to computer use. Still, there was a strong and graded association between time use and perceived problems related to computer gaming, console gaming (boys only) and internet use, odds ratios ranging from 6.90 to 10.23. Conclusion: The three new measures of perceived problems related to computer and console gaming and internet use among adolescents are appropriate, reliable and valid for use in non-clinical surveys about young people's everyday life and behaviour. These new measures do not assess Internet Gaming Disorder as it is listed in the DSM and therefore have no parity with DSM criteria. We found an increasing risk of perceived problems with increasing time spent on gaming and internet use. Nevertheless, most schoolchildren who spent much time with gaming and internet use did not experience problems. PMID:24731270
NASA Astrophysics Data System (ADS)
Newman, Gregory A.
2014-01-01
Many geoscientific applications exploit electrostatic and electromagnetic fields to interrogate and map subsurface electrical resistivity—an important geophysical attribute for characterizing mineral, energy, and water resources. In complex three-dimensional geologies, where many of these resources remain to be found, resistivity mapping requires large-scale modeling and imaging capabilities, as well as the ability to treat significant data volumes, which can easily overwhelm single-core and modest multicore computing hardware. Treating such problems requires large-scale parallel computational resources, necessary for reducing the time to solution to a time frame acceptable to the exploration process. The recognition that significant parallel computing processes must be brought to bear on these problems gives rise to choices that must be made in parallel computing hardware and software. In this review, some of these choices are presented, along with the resulting trade-offs. We also discuss future trends in high-performance computing and the anticipated impact on electromagnetic (EM) geophysics. Topics discussed in this review article include a survey of parallel computing platforms, from graphics processing units to multicore CPUs with a fast interconnect, along with parallel solvers and associated solver libraries effective for inductive EM modeling and imaging.
[Results of the marketing research study "Acceptance of physician's office computer systems"].
Steinhausen, D; Brinkmann, F; Engelhard, A
1998-01-01
We report on a market research study on the acceptance of computer systems in surgeries. 11,000 returned questionnaires from surgeons--users and nonusers--were analysed. We found that most of the surgeons used their computers in a limited way, i.e. as a device for accounting. Concerning the level of utilisation there are differences between men and women, West and East, and young and old. In this study we also analysed the computer-using behaviour of gynaecologic surgeons. As a result, two thirds of all nonusers are not intending to utilise a computer in the future.
Design Your Own Instructional Software: It's Easy.
ERIC Educational Resources Information Center
Pauline, Ronald F.
Computer Assisted Instruction (CAI) is, quite simply, an instance in which instructional content activities are delivered via a computer. Many commercially-available software programs, although excellent programs, may not be acceptable for each individual teacher's classroom. One way to insure that software is not only acceptable but also targets…
Computer vision syndrome (CVS) - Thermographic Analysis
NASA Astrophysics Data System (ADS)
Llamosa-Rincón, L. E.; Jaime-Díaz, J. M.; Ruiz-Cardona, D. F.
2017-01-01
The use of computers has shown exponential growth in the last decades; the possibility of carrying out several tasks for both professional and leisure purposes has contributed to their great acceptance by users. The consequences and impact of uninterrupted work with computer screens or displays on visual health have grabbed researchers' attention. When spending long periods of time in front of a computer screen, human eyes are subjected to great efforts, which in turn triggers a set of symptoms known as Computer Vision Syndrome (CVS). The most common of them are: blurred vision, visual fatigue and Dry Eye Syndrome (DES) due to inappropriate lubrication of the ocular surface when blinking decreases. An experimental protocol was designed and implemented to perform thermographic studies on healthy human eyes during exposure to computer displays, with the main purpose of comparing the existing differences in temperature variations of healthy ocular surfaces.
NASA Astrophysics Data System (ADS)
Delogu, A.; Furini, F.
1991-09-01
Increasing interest in radar cross section (RCS) reduction is placing new demands on theoretical, computational, and graphic techniques for calculating the scattering properties of complex targets. In particular, computer codes capable of predicting the RCS of an entire aircraft at high frequency and of achieving RCS control with modest structural changes are becoming of paramount importance in stealth design. A computer code evaluating the RCS of arbitrarily shaped metallic objects that are computer aided design (CAD) generated, and its validation with measurements carried out using ALENIA RCS test facilities, are presented. The code, based on the physical optics method, is characterized by an efficient integration algorithm with error control, in order to contain the computer time within acceptable limits, and by an accurate parametric representation of the target surface in terms of bicubic splines.
Pierce, B
2000-05-01
This study evaluated the acceptance of using computers to take a medical history by rural Arkansas patients. Sex, age, race, education, previous computer experience and owning a computer were used as variables. Patients were asked a series of questions to rate their comfort level with using a computer to take their medical history. Comfort ratings ranged from 30 to 45, with a mean of 36.8 (SEM = 0.67). Neither sex, race, age, education, owning a personal computer, nor prior computer experience had a significant effect on the comfort rating. This study helps alleviate one of the concerns--patient acceptance--about the increasing use of computers in practicing medicine.
Adaptation of the CVT algorithm for catheter optimization in high dose rate brachytherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poulin, Eric; Fekete, Charles-Antoine Collins; Beaulieu, Luc
2013-11-15
Purpose: An innovative, simple, and fast method to optimize the number and position of catheters is presented for prostate and breast high dose rate (HDR) brachytherapy, both for arbitrary templates and template-free implants (such as robotic templates). Methods: Eight clinical cases were chosen randomly from a bank of patients previously treated in our clinic to test our method. The 2D Centroidal Voronoi Tessellation (CVT) algorithm was adapted to distribute catheters uniformly in space, within the maximum external contour of the planning target volume. The catheter optimization procedure includes the inverse planning simulated annealing algorithm (IPSA). Complete treatment plans can then be generated from the algorithm for different numbers of catheters. The best plan is chosen from different dosimetry criteria and will automatically provide the number of catheters and their positions. After the CVT algorithm parameters were optimized for speed and dosimetric results, the method was validated against prostate clinical cases, using clinically relevant dose parameters. The robustness to implantation error was also evaluated. Finally, the efficiency of the method was tested in breast interstitial HDR brachytherapy cases. Results: The effect of the number and locations of the catheters on prostate cancer patients was studied. Treatment plans with better or equivalent dose distributions could be obtained with fewer catheters. A better or equal prostate V100 was obtained down to 12 catheters. Plans with nine or fewer catheters would not be clinically acceptable in terms of prostate V100 and D90. Implantation errors up to 3 mm were acceptable since no statistical difference was found when compared to 0 mm error (p > 0.05). No significant difference in dosimetric indices was observed for the different combinations of parameters within the CVT algorithm. A linear relation was found between the number of random points and the optimization time of the CVT algorithm.
Because the computation time decreases with the number of points and no effects were observed on the dosimetric indices when varying the number of sampling points and the number of iterations, these were fixed to 2500 and 100, respectively. The computation time to obtain ten complete treatment plans ranging from 9 to 18 catheters, with the corresponding dosimetric indices, was 90 s. However, 93% of the computation time is used by a research version of IPSA. For the breast, on average, the Radiation Therapy Oncology Group recommendations would be satisfied down to 12 catheters. Plans with nine or fewer catheters would not be clinically acceptable in terms of V100, dose homogeneity index, and D90. Conclusions: The authors have devised a simple, fast and efficient method to optimize the number and position of catheters in interstitial HDR brachytherapy. The method was shown to be robust for both prostate and breast HDR brachytherapy. More importantly, the computation time of the algorithm is acceptable for clinical use. Ultimately, this catheter optimization algorithm could be coupled with a 3D ultrasound system to allow real-time guidance and planning in HDR brachytherapy.
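The core of the CVT step is Lloyd's iteration: assign sample points to their nearest seed (a Voronoi partition) and move each seed to its cell centroid. A minimal 2D sketch follows; sample counts and iteration counts are illustrative, and the clinical workflow couples this with IPSA dose optimization:

```python
import numpy as np

def cvt(points, n_catheters, iters=100, seed=0):
    """Lloyd-style CVT sketch: seeds (candidate catheter positions) move
    to the centroid of the sample points in their Voronoi cell, which
    spreads them evenly over the sampled target region."""
    rng = np.random.default_rng(seed)
    seeds = points[rng.choice(len(points), n_catheters, replace=False)]
    for _ in range(iters):
        # Assign each sample to its nearest seed (Voronoi partition).
        d = np.linalg.norm(points[:, None, :] - seeds[None, :, :], axis=2)
        owner = d.argmin(axis=1)
        # Move each seed to the centroid of its cell.
        for k in range(n_catheters):
            cell = points[owner == k]
            if len(cell):
                seeds[k] = cell.mean(axis=0)
    return seeds
```

In the paper's setting the sample points would fall inside the planning target volume contour rather than a unit square, and the number of seeds is swept to find the fewest catheters giving clinically acceptable dosimetry.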
Efficient Mining of Interesting Patterns in Large Biological Sequences
Rashid, Md. Mamunur; Karim, Md. Rezaul; Jeong, Byeong-Soo
2012-01-01
Pattern discovery in biological sequences (e.g., DNA sequences) is one of the most challenging tasks in computational biology and bioinformatics. So far, in most approaches, the number of occurrences is a major measure of determining whether a pattern is interesting or not. In computational biology, however, a pattern that is not frequent may still be considered very informative if its actual support frequency exceeds the prior expectation by a large margin. In this paper, we propose a new interesting measure that can provide meaningful biological information. We also propose an efficient index-based method for mining such interesting patterns. Experimental results show that our approach can find interesting patterns within an acceptable computation time. PMID:23105928
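The notion of a pattern being interesting when its observed support exceeds the expectation under character independence can be sketched as a simple ratio. This is a naive illustration of the idea only, not the authors' index-based measure:

```python
from math import prod

def surprise(pattern, sequence, char_freq):
    """Ratio of observed to expected support: count occurrences of
    `pattern` in `sequence`, then divide by the support expected if
    characters occurred independently with frequencies `char_freq`.
    A ratio well above 1 marks the pattern as 'interesting' even if
    its raw count is low."""
    n = len(sequence) - len(pattern) + 1
    observed = sum(sequence[i:i + len(pattern)] == pattern for i in range(n))
    expected = n * prod(char_freq[c] for c in pattern)
    return observed / expected if expected else float("inf")
```

For a DNA-like alphabet, "AT" in "ATATATAT" occurs at 4 of 7 start positions, against an expectation of 7 × 0.5 × 0.5 = 1.75 under independence, so the ratio exceeds 2.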
ERIC Educational Resources Information Center
Pankow, Lena; Kaiser, Gabriele; Busse, Andreas; König, Johannes; Blömeke, Sigrid; Hoth, Jessica; Döhrmann, Martina
2016-01-01
The paper presents results from a computer-based assessment in which 171 early career mathematics teachers from Germany were asked to anticipate typical student errors on a given mathematical topic and identify them under time constraints. Fast and accurate perception and knowledge-based judgments are widely accepted characteristics of teacher…
Hybrid Nested Partitions and Math Programming Framework for Large-scale Combinatorial Optimization
2010-03-31
optimization problems: 1) exact algorithms and 2) metaheuristic algorithms. This project will integrate concepts from these two technologies to develop...optimal solutions within an acceptable amount of computation time, and 2) metaheuristic algorithms such as genetic algorithms, tabu search, and the...integer programming decomposition approaches, such as Dantzig-Wolfe decomposition and Lagrangian relaxation, and metaheuristics such as the Nested
Data Storage Hierarchy Systems for Data Base Computers
1979-08-01
...with very large capacity and small access time. As part of the INFOPLEX research effort, this thesis is focused on the study of high performance, highly
NASA Astrophysics Data System (ADS)
Yang, Wen; Fung, Richard Y. K.
2014-06-01
This article considers an order acceptance problem in a make-to-stock manufacturing system with multiple demand classes in a finite time horizon. Demands in different periods are random variables and are independent of one another, and replenishments of inventory deviate from the scheduled quantities. The objective of this work is to maximize the expected net profit over the planning horizon by deciding the fraction of the demand that is going to be fulfilled. This article presents a stochastic order acceptance optimization model and analyses the existence of the optimal promising policies. An example of a discrete problem is used to illustrate the policies by applying the dynamic programming method. In order to solve the continuous problems, a heuristic algorithm based on stochastic approximation (HASA) is developed. Finally, the computational results of a case example illustrate the effectiveness and efficiency of the HASA approach, and make the application of the proposed model readily acceptable.
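The dynamic-programming treatment of the discrete problem can be illustrated with a toy finite-horizon model: each period a random demand from two classes arrives, and the decision is how much of each class to accept given on-hand inventory and a scheduled replenishment. All numbers and the two-class structure are hypothetical, not taken from the article.

```python
from functools import lru_cache

T = 3              # planning horizon (periods)
CAP = 5            # inventory capacity
Q = 2              # scheduled replenishment per period
PROFIT = (5, 2)    # unit profit of the two demand classes
DEMAND = (((1, 1), 0.5), ((0, 2), 0.5))   # ((d1, d2), probability)

@lru_cache(maxsize=None)
def value(t, inv):
    """Expected profit-to-go from period t with inventory inv."""
    if t == T:
        return 0.0
    expected = 0.0
    for (d1, d2), p in DEMAND:
        best = float("-inf")
        for a1 in range(d1 + 1):          # accepted class-1 units
            for a2 in range(d2 + 1):      # accepted class-2 units
                if a1 + a2 > inv:
                    continue
                nxt = min(inv - a1 - a2 + Q, CAP)
                best = max(best, PROFIT[0]*a1 + PROFIT[1]*a2 + value(t + 1, nxt))
        expected += p * best
    return expected

print(value(0, 2))
```

The continuous version of such a problem is what motivates the article's stochastic-approximation heuristic, since the state and action spaces no longer enumerate.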
Predicting vibratory stresses from aero-acoustic loads
NASA Astrophysics Data System (ADS)
Shaw, Matthew D.
Sonic fatigue has been a concern of jet aircraft engineers for many years. As engines become more powerful, structures become more lightly damped and complex, and materials become lighter, stiffer, and more complicated, the need to understand and predict structural response to aeroacoustic loads becomes more important. Despite decades of research, vibration in panels caused by random pressure loads, such as those found in a supersonic jet, is still difficult to predict. The work in this research improves on current prediction methods in several ways, in particular for the structural response due to wall pressures induced by supersonic turbulent flows. First, solutions are calculated using time-domain input pressure loads that include shock cells and their interaction with turbulent flow. The solutions include both mean (static) and oscillatory components. Second, time series of stresses are required by many fatigue-assessment counting algorithms; to provide them, a method is developed to compute time-dependent solutions in the frequency domain. The method is first applied to a single-degree-of-freedom system. The equations of motion are derived and solved in both the frequency domain and the time domain. The pressure input is a random (broadband) signal representative of jet flow. The method is then applied to a simply-supported beam vibrating in flexure using a line of pressure inputs computed with computational fluid dynamics (CFD). A modal summation approach is used to compute structural response. The coupling between the pressure field and the structure, through the joint acceptance, is reviewed and discussed for its application to more complicated structures. Results from the new method and from a direct time domain method are compared for method verification. Because the match is good and the new frequency domain method is faster computationally, it is chosen for use in a more complicated structure.
The vibration of a two-dimensional panel loaded by jet nozzle discharge flow is addressed. The surface pressures calculated at Pratt and Whitney using viscous and compressible CFD are analyzed and compared to surface pressure measurements made at the United Technologies Research Center (UTRC). A structural finite element model is constructed to represent a flexible panel also used in the UTRC setup. The mode shapes, resonance frequencies, modal loss factors, and surface pressures are input into the solution method. Displacement time series and power spectral densities are computed and compared to measurement and show good agreement. The concept of joint acceptance is further addressed for two-dimensional plates excited by supersonic jet flow. Static and alternating stresses in the panel are also computed, and the most highly stressed modes are identified. The surface pressures are further analyzed in the wavenumber domain for insight into the physics of sonic fatigue. Most of the energy in the wall pressure wavenumber-frequency spectrum at subsonic speeds is in turbulent structures near the convective wavenumber. In supersonic flow, however, the shock region dominates the spectrum at low frequencies, but convective behavior is still dominant at higher frequencies. When the forcing function wavenumber energy overlaps the modal wavenumbers, the acceptance of energy by the structure from the flow field is greatest. The wavenumber analysis suggests a means of designing structures to minimize overlap of excitation and structural wavenumber peaks to minimize vibration and sonic fatigue.
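The single-degree-of-freedom frequency-domain step described above can be sketched as: transform the broadband load, multiply by the frequency response function, and inverse-transform to recover a displacement time series. Parameter values below are illustrative, not from the thesis.

```python
import numpy as np

# SDOF oscillator: natural frequency 50 Hz, 2% damping (illustrative)
m, zeta = 1.0, 0.02
k = (2 * np.pi * 50.0) ** 2 * m
c = 2 * zeta * np.sqrt(k * m)

fs, n = 2048.0, 4096
rng = np.random.default_rng(0)
p = rng.normal(size=n)                    # broadband random pressure load

f = np.fft.rfftfreq(n, d=1 / fs)
w = 2 * np.pi * f
H = 1.0 / (k - m * w**2 + 1j * c * w)     # receptance FRF

# frequency-domain solution transformed back to a time series
x = np.fft.irfft(np.fft.rfft(p) * H, n)

# the response energy concentrates near the resonance
psd = np.abs(np.fft.rfft(x)) ** 2
print(f[np.argmax(psd[1:]) + 1])
```

For a white input the response spectrum peaks near the 50 Hz resonance, which is why the frequency-domain route is so much cheaper than direct time integration for lightly damped structures.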
GRAMPS: An Automated Ambulatory Geriatric Record
Hammond, Kenric W.; King, Carol A.; Date, Vishvanath V.; Prather, Robert J.; Loo, Lawrence; Siddiqui, Khwaja
1988-01-01
GRAMPS (Geriatric Record and Multidisciplinary Planning System) is an interactive MUMPS system developed for VA outpatient use. It allows physicians to effectively document care in problem-oriented format with structured narrative and free text, eliminating handwritten input. We evaluated the system in a one-year controlled cohort study. When the computer was used, appointment times averaged 8.2 minutes longer (32.6 vs. 24.4 minutes) compared to control visits with the same physicians. Computer use was associated with better quality of care as measured in the management of a common problem, hypertension, as well as decreased overall costs of care. When a faster computer was installed, data entry times improved, suggesting that slower processing had accounted for a substantial portion of the observed difference in appointment lengths. The GRAMPS system was well-accepted by providers. The modular design used in GRAMPS has been extended to medical-care applications in Nursing and Mental Health.
SAPNEW: Parallel finite element code for thin shell structures on the Alliant FX-80
NASA Astrophysics Data System (ADS)
Kamat, Manohar P.; Watson, Brian C.
1992-11-01
The finite element method has proven to be an invaluable tool for analysis and design of complex, high performance systems, such as bladed-disk assemblies in aircraft turbofan engines. However, as the problem size increases, the computation time required by conventional computers can be prohibitively high. Parallel processing computers provide the means to overcome these computation time limits. This report summarizes the results of a research activity aimed at providing a finite element capability for analyzing turbomachinery bladed-disk assemblies in a vector/parallel processing environment. A special purpose code, named with the acronym SAPNEW, has been developed to perform static and eigen analysis of multi-degree-of-freedom blade models built-up from flat thin shell elements. SAPNEW provides a stand-alone capability for static and eigen analysis on the Alliant FX/80, a parallel processing computer. A preprocessor, named with the acronym NTOS, has been developed to accept NASTRAN input decks and convert them to the SAPNEW format to make SAPNEW more readily used by researchers at NASA Lewis Research Center.
An Adaptive Priority Tuning System for Optimized Local CPU Scheduling using BOINC Clients
NASA Astrophysics Data System (ADS)
Mnaouer, Adel B.; Ragoonath, Colin
2010-11-01
Volunteer Computing (VC) is a Distributed Computing model which utilizes idle CPU cycles from computing resources donated by volunteers who are connected through the Internet to form a very large-scale, loosely coupled High Performance Computing environment. Distributed volunteer computing environments such as the BOINC framework are concerned mainly with the efficient scheduling of the available resources to the applications that require them. The BOINC framework thus contains a number of scheduling policies/algorithms, both on the server side and on the client, which work together to maximize the available resources and to provide a degree of QoS in an environment which is highly volatile. This paper focuses on the BOINC client and introduces an adaptive priority tuning client-side middleware application which improves the execution times of Work Units (WUs) while maintaining an acceptable Maximum Response Time (MRT) for the end user. We have conducted extensive experimentation of the proposed system and the results show clear speedup of BOINC applications using our optimized middleware as opposed to running using the original BOINC client.
NASA Technical Reports Server (NTRS)
Krebs, R. P.
1971-01-01
The computer program described in this report calculates the design-point characteristics of a compressed-air generator for use in V/STOL applications such as systems with a tip-turbine-driven lift fan. The program computes the dimensions and mass, as well as the thermodynamic performance of a model air generator configuration which involves a straight through-flow combustor. Physical and thermodynamic characteristics of the air generator components are also given. The program was written in FORTRAN IV language. Provision has been made so that the program will accept input values in either SI units or U.S. customary units. Each air generator design-point calculation requires about 1.5 seconds of 7094 computer time for execution.
Invariance of an Extended Technology Acceptance Model Across Gender and Age Group
ERIC Educational Resources Information Center
Ahmad, Tunku Badariah Tunku; Madarsha, Kamal Basha; Zainuddin, Ahmad Marzuki; Ismail, Nik Ahmad Hisham; Khairani, Ahmad Zamri; Nordin, Mohamad Sahari
2011-01-01
In this study, we examined the likelihood of a TAME (extended technology acceptance model), in which the interrelationships among computer self-efficacy, perceived usefulness, intention to use and self-reported use of computer-mediated technology were tested. In addition, the gender- and age-invariant of its causal structure were evaluated. The…
1984-01-01
working drawings, lists, and miscellaneous information needed for construction and testing (fig. 4). Detail design and construction includes...still in test and evaluation phases, and is currently operational on a CDC computer. Its approach to management of geometric data is a unique and...been to provide the high degree of engineering user flexibility and yet achieve acceptable response times. In late 1983, a test system which has user
Control Law Design in a Computational Aeroelasticity Environment
NASA Technical Reports Server (NTRS)
Newsom, Jerry R.; Robertshaw, Harry H.; Kapania, Rakesh K.
2003-01-01
A methodology for designing active control laws in a computational aeroelasticity environment is given. The methodology involves employing a systems identification technique to develop an explicit state-space model for control law design from the output of a computational aeroelasticity code. The particular computational aeroelasticity code employed in this paper solves the transonic small disturbance aerodynamic equation using a time-accurate, finite-difference scheme. Linear structural dynamics equations are integrated simultaneously with the computational fluid dynamics equations to determine the time responses of the structure. These structural responses are employed as the input to a modern systems identification technique that determines the Markov parameters of an "equivalent linear system". The Eigensystem Realization Algorithm is then employed to develop an explicit state-space model of the equivalent linear system. The Linear Quadratic Gaussian control law design technique is employed to design a control law. The computational aeroelasticity code is modified to accept control laws and perform closed-loop simulations. Flutter control of a rectangular wing model is chosen to demonstrate the methodology. Various cases are used to illustrate the usefulness of the methodology as the nonlinearity of the aeroelastic system is increased through increased angle-of-attack changes.
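The Eigensystem Realization Algorithm step can be sketched on a small discrete-time test system: block-Hankel matrices of Markov parameters are factored by SVD into a state-space realization whose eigenvalues match the underlying dynamics. The two-state system below is illustrative, not the paper's wing model.

```python
import numpy as np

n, r = 2, 10                           # model order, Hankel block count
A = np.array([[0.95, 0.10],
              [-0.10, 0.90]])          # "true" discrete-time dynamics
B = np.array([[1.0], [0.5]])
C = np.array([[1.0, 0.0]])

# Markov parameters (impulse response): Y_k = C A^k B
Y = [C @ np.linalg.matrix_power(A, k) @ B for k in range(2 * r)]

# block-Hankel matrices, shifted by one sample
H0 = np.block([[Y[i + j] for j in range(r)] for i in range(r)])
H1 = np.block([[Y[i + j + 1] for j in range(r)] for i in range(r)])

# SVD truncated to the model order yields the realized state matrix
U, S, Vt = np.linalg.svd(H0)
U, S, Vt = U[:, :n], S[:n], Vt[:n, :]
Si = np.diag(S ** -0.5)
A_era = Si @ U.T @ H1 @ Vt.T @ Si

print(np.sort_complex(np.linalg.eigvals(A_era)))
```

With noise-free Markov parameters the realized eigenvalues reproduce the true ones to machine precision; in the paper's setting the Markov parameters come from system identification on the aeroelastic code's responses.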
Hayman, Melanie; Reaburn, Peter; Browne, Matthew; Vandelanotte, Corneel; Alley, Stephanie; Short, Camille E
2017-03-23
Physical activity (PA) during pregnancy is associated with a variety of health benefits including a reduced risk of pregnancy related conditions such as pre-eclampsia and pregnancy-induced hypertension and leads to greater control over gestational weight gain. Despite these associated health benefits, very few pregnant women are sufficiently active. To improve health outcomes, it is important to explore innovative ways to increase PA among pregnant women. Therefore, the aim of this study was to assess the feasibility, acceptability and efficacy of a four-week web-based computer-tailored PA intervention among pregnant women. Seventy-seven participants were randomised into either: (1) an intervention group that received tailored PA advice and access to a resource library of articles relating to PA during pregnancy; or (2) a standard information group that only received access to the resource library. Objective moderate-to-vigorous physical activity (MVPA) was assessed at baseline and immediately post-intervention. Recruitment, attrition, intervention adherence, and website engagement were assessed. Questions on usability and satisfaction were administered post-intervention. Feasibility was demonstrated through acceptable recruitment (8.5 participants recruited and randomised/month), and attrition (25%). Acceptability among intervention group participants was positive with high intervention adherence (96% of 4 modules completed). High website engagement (participants logged in 1.6 times/week although only required to log in once per week), usability (75/100), and satisfaction outcomes were reported in both groups.
However, participants in the intervention group viewed significantly more pages on the website (p < 0.05), reported that the website felt more personally relevant (p < 0.05), and significantly increased their MVPA from baseline to post-intervention (mean difference = 35.87 min), compared to the control group (mean difference = 9.83 min) (p < 0.05), suggesting efficacy. The delivery of a computer-tailored web-based intervention designed to increase PA in pregnant women is feasible, well accepted and associated with increases in short-term MVPA. Findings suggest the use of computer-tailored information leads to greater website engagement, satisfaction and greater PA levels among pregnant women compared to a generic information only website. The trial was 'retrospectively registered' with the Australian New Zealand Clinical Trials Registry ( ACTRN12614001105639 ) on 17 th October, 2014.
Becker, Annette; Herzberg, Dominikus; Marsden, Nicola; Thomanek, Sabine; Jung, Hartmut; Leonhardt, Corinna
2011-05-01
To develop a computer-based counselling system (CBCS) for the improvement of attitudes towards physical activity in chronically ill patients and to pilot its efficacy and acceptance in primary care. The system is tailored to patients' disease and motivational stage. During a pilot study in five German general practices, patients answered questions before, directly and 6 weeks after using the CBCS. Outcome criteria were attitudes and self-efficacy. Qualitative interviews were performed to identify acceptance indicators. Seventy-nine patients participated (mean age: 64.5 years, 53% males; 38% without previous computer experience). Patients' affective and cognitive attitudes changed significantly, self-efficacy showed only minor changes. Patients mentioned no difficulties in interacting with the CBCS. However, perception of the system's usefulness was inconsistent. Computer-based counselling for physical activity related attitudes in patients with chronic diseases is feasible, but the circumstances of use with respect to the target group and its integration into the management process have to be clarified in future studies. This study adds to the understanding of computer-based counselling in primary health care. Acceptance indicators identified in this study will be validated as part of a questionnaire on technology acceptability in a subsequent study. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Predictors of nurses' acceptance of an intravenous catheter safety device.
Rivers, Dianna Lipp; Aday, Lu Ann; Frankowski, Ralph F; Felknor, Sarah; White, Donna; Nichols, Brenda
2003-01-01
It is important to determine the factors that predict whether nurses accept and use a new intravenous (IV) safety device because there are approximately 800,000 needlesticks per year with the risk of contracting a life-threatening bloodborne disease such as HIV or hepatitis C. To determine the predictors of nurses' acceptance of the Protectiv Plus IV catheter safety needle device at a teaching hospital in Texas. A one-time cross-sectional survey of nurses (N = 742) was conducted using a 34-item questionnaire. A framework was developed identifying organizational and individual predictors of acceptance. The three principal dimensions of acceptance were (a) satisfaction with the device, (b) extent to which the device is always used, and (c) nurse recommendations over other safety devices. Measurements included developing summary subscales for the variables of safety climate and acceptance. Descriptive statistics and multiple linear and logistic regression models were computed. The findings showed widespread acceptance of the device. Nurses who had adequate training and a positive institutional safety climate were more accepting (p
ERIC Educational Resources Information Center
Paiva, Andrea L.; Lipschitz, Jessica M.; Fernandez, Anne C.; Redding, Colleen A.; Prochaska, James O.
2014-01-01
Objective: To examine acceptability and feasibility of a Transtheoretical Model (TTM)-based computer-tailored intervention (CTI) for increasing human papillomavirus (HPV) vaccination in college-aged women. Participants: Two hundred forty-three women aged 18-26 were recruited between February and May of 2011. Methods: Participants completed the…
2014-12-01
observed an ERP system implementation that encountered this exact model. The modified COTS software worked and passed the acceptance tests but never...software-intensive program. We decided to create a very detailed master schedule with multiple supporting subschedules that linked and Implementing...processes in place as part of the COTS implementation. For hardware, COTS can also present some risks. Many programs use COTS computers and servers
ATC simulation of helicopter IFR approaches into major terminal areas using RNAV, MLS, and CDTI
NASA Technical Reports Server (NTRS)
Tobias, L.; Lee, H. Q.; Peach, L. L.; Willett, F. M., Jr.; Obrien, P. J.
1981-01-01
The introduction of independent helicopter IFR routes at hub airports was investigated in a real time air traffic control system simulation involving a piloted helicopter simulator, computer generated air traffic, and air traffic controllers. The helicopter simulator was equipped to fly area navigation (RNAV) routes and microwave landing system approaches. Problems studied included: (1) pilot acceptance of the approach procedure and tracking accuracy; (2) ATC procedures for handling a mix of helicopter and fixed wing traffic; and (3) utility of the cockpit display of traffic information (CDTI) for the helicopter in the hub airport environment. Results indicate that the helicopter routes were acceptable to the subject pilots and were noninterfering with fixed wing traffic. Merging and spacing maneuvers using CDTI were successfully carried out by the pilots, but controllers had some reservations concerning the acceptability of the CDTI procedures.
Cargo Movement Operations System (CMOS) Computer System Operator’s Manual. Draft
1990-06-27
Damiano, Diane L.; Bulea, Thomas C.
2016-01-01
Individuals with cerebral palsy frequently exhibit crouch gait, a pathological walking pattern characterized by excessive knee flexion. Knowledge of the knee joint moment during crouch gait is necessary for the design and control of assistive devices used for treatment. Our goal was to 1) develop statistical models to estimate knee joint moment extrema and dynamic stiffness during crouch gait, and 2) use the models to estimate the instantaneous joint moment during weight-acceptance. We retrospectively computed knee moments from 10 children with crouch gait and used stepwise linear regression to develop statistical models describing the knee moment features. The models explained at least 90% of the response value variability: peak moment in early (99%) and late (90%) stance, and dynamic stiffness of weight-acceptance flexion (94%) and extension (98%). We estimated knee extensor moment profiles from the predicted dynamic stiffness and instantaneous knee angle. This approach captured the timing and shape of the computed moment (root-mean-squared error: 2.64 Nm); including the predicted early-stance peak moment as a correction factor improved model performance (root-mean-squared error: 1.37 Nm). Our strategy provides a practical, accurate method to estimate the knee moment during crouch gait, and could be used for real-time, adaptive control of robotic orthoses. PMID:27101612
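The regression-then-estimate strategy in this abstract can be sketched with synthetic data: fit a linear model for dynamic stiffness from gait predictors, then form the instantaneous extensor moment as stiffness times knee-angle excursion. The predictors, coefficients, and data here are all synthetic stand-ins, not the study's variables.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10                                        # subjects
X = np.column_stack([
    np.ones(n),                               # intercept
    rng.uniform(20, 40, n),                   # e.g. stance knee flexion (deg)
    rng.uniform(20, 50, n),                   # e.g. body mass (kg)
])
true_beta = np.array([0.5, 0.04, 0.02])       # synthetic "population" model
stiffness = X @ true_beta + rng.normal(0, 0.01, n)    # Nm/deg, with noise

# ordinary least squares in place of the paper's stepwise regression
beta, *_ = np.linalg.lstsq(X, stiffness, rcond=None)

# estimated moment profile for a new subject during weight acceptance:
# predicted stiffness times instantaneous knee-angle excursion
k_hat = np.array([1.0, 30.0, 35.0]) @ beta
angle = np.linspace(30, 45, 50)               # deg
moment = k_hat * (angle - angle[0])
print(k_hat)
```

Because the moment estimate needs only the fitted coefficients and the instantaneous angle, it is cheap enough for the real-time orthosis control loop the authors envision.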
Acquisition of gamma camera and physiological data by computer.
Hack, S N; Chang, M; Line, B R; Cooper, J A; Robeson, G H
1986-11-01
We have designed, implemented, and tested a new Research Data Acquisition System (RDAS) that permits a general purpose digital computer to acquire signals from both gamma camera sources and physiological signal sources concurrently. This system overcomes the limited multi-source, high speed data acquisition capabilities found in most clinically oriented nuclear medicine computers. The RDAS can simultaneously input signals from up to four gamma camera sources with a throughput of 200 kHz per source and from up to eight physiological signal sources with an aggregate throughput of 50 kHz. Rigorous testing has found the RDAS to exhibit acceptable linearity and timing characteristics. In addition, flood images obtained by this system were compared with flood images acquired by a commercial nuclear medicine computer system. National Electrical Manufacturers Association performance standards of the flood images were found to be comparable.
Newton, Amanda S; Dow, Nadia; Dong, Kathryn; Fitzpatrick, Eleanor; Cameron Wild, T; Johnson, David W; Ali, Samina; Colman, Ian; Rosychuk, Rhonda J
2017-08-11
This study piloted procedures and obtained data on intervention acceptability to determine the feasibility of a definitive randomised controlled trial (RCT) of the effectiveness of a computer-based brief intervention in the emergency department (ED). Two-arm, multi-site, pilot RCT. Adolescents aged 12-17 years presenting to three Canadian pediatric EDs from July 2010 to January 2013 for an alcohol-related complaint. Standard medical care plus computer-based screening and personalised assessment feedback (experimental group) or standard care plus computer-based sham (control group). ED and research staff, and adolescents were blinded to allocation. Main: change in alcohol consumption from baseline to 1- and 3 months post-intervention. Secondary: recruitment and retention rates, intervention acceptability and feasibility, perception of group allocation among ED and research staff, and change in health and social services utilisation. Of the 340 adolescents screened, 117 adolescents were eligible and 44 participated in the study (37.6% recruitment rate). Adolescents allocated to the intervention found it easy, quick and informative, but were divided on the credibility of the feedback provided (agreed it was credible: 44.4%, disagreed: 16.7%, unsure: 16.7%, no response: 22.2%). We found no evidence of a statistically significant relationship between which interventions adolescents were allocated to and which interventions staff thought they received. Alcohol consumption, and health and social services data were largely incomplete due to modest study retention rates of 47.7% and 40.9% at 1- and 3 months post-intervention, respectively. A computer-based intervention was acceptable to adolescents and delivery was feasible in the ED in terms of time to use and ease of use. However, adjustments are needed to the intervention to improve its credibility. 
A definitive RCT will be feasible if protocol adjustments are made to improve recruitment and retention rates; and increase the number of study sites and research staff. clinicaltrials.gov NCT01146665. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
ERIC Educational Resources Information Center
Gyamfi, Stephen Adu
2016-01-01
This study extends the technology acceptance model to identify factors that influence technology acceptance among pre-service teachers in Ghana. Data from 380 usable questionnaires were tested against the research model. Utilising the extended technology acceptance model (TAM) as a research framework, the study found that: pre-service teachers'…
Resource-constrained scheduling with hard due windows and rejection penalties
NASA Astrophysics Data System (ADS)
Garcia, Christopher
2016-09-01
This work studies a scheduling problem where each job must be either accepted and scheduled to complete within its specified due window, or rejected altogether. Each job has a certain processing time and contributes a certain profit if accepted or penalty cost if rejected. There is a set of renewable resources, and no resource limit can be exceeded at any time. Each job requires a certain amount of each resource when processed, and the objective is to maximize total profit. A mixed-integer programming formulation and three approximation algorithms are presented: a priority rule heuristic, an algorithm based on the metaheuristic for randomized priority search and an evolutionary algorithm. Computational experiments comparing these four solution methods were performed on a set of generated benchmark problems covering a wide range of problem characteristics. The evolutionary algorithm outperformed the other methods in most cases, often significantly, and never significantly underperformed any method.
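A priority-rule heuristic of the kind mentioned can be sketched as follows: rank jobs by (profit plus avoided penalty) per unit processing time, then greedily place each at the earliest feasible start inside its due window without exceeding the resource capacity. The single-resource setting, job data, and this particular priority rule are illustrative, not the paper's exact algorithm.

```python
def schedule(jobs, horizon, capacity):
    """Greedy accept/reject scheduling under one renewable resource."""
    usage = [0] * horizon
    profit, plan = 0, {}
    order = sorted(jobs,
                   key=lambda j: (j["profit"] + j["penalty"]) / j["dur"],
                   reverse=True)
    for j in order:
        placed = False
        # earliest start that completes within the due window
        for start in range(j["release"], j["due"] - j["dur"] + 1):
            window = range(start, start + j["dur"])
            if all(usage[t] + j["req"] <= capacity for t in window):
                for t in window:
                    usage[t] += j["req"]
                plan[j["id"]] = start
                profit += j["profit"]
                placed = True
                break
        if not placed:
            profit -= j["penalty"]        # rejected job incurs its penalty
    return profit, plan

jobs = [
    {"id": "A", "dur": 3, "release": 0, "due": 5, "req": 2, "profit": 10, "penalty": 4},
    {"id": "B", "dur": 2, "release": 0, "due": 4, "req": 2, "profit": 8,  "penalty": 2},
    {"id": "C", "dur": 2, "release": 2, "due": 6, "req": 1, "profit": 5,  "penalty": 1},
]
print(schedule(jobs, horizon=8, capacity=3))
```

Such a rule is fast but myopic, which is why the paper compares it against randomized priority search and an evolutionary algorithm.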
ERIC Educational Resources Information Center
Montrieux, Hannelore; Courtois, Cédric; De Grove, Frederik; Raes, Annelies; Schellens, Tammy; De Marez, Lieven
2014-01-01
This paper examines the school-wide introduction of the tablet computer as a mobile learning tool in a secondary school in Belgium. Drawing upon the Decomposed Theory of Planned Behavior, we question during three waves of data collection which factors influence teachers' and students' acceptance and use of these devices for educational purposes.…
ERIC Educational Resources Information Center
Teo, Timothy
2010-01-01
The purpose of this study is to examine pre-service teachers' attitudes to computers. This study extends the technology acceptance model (TAM) framework by adding subjective norm, facilitating conditions, and technological complexity as external variables. Results show that the TAM and subjective norm, facilitating conditions, and technological…
Mitchell, Shannon Gwin; Monico, Laura B; Gryczynski, Jan; O'Grady, Kevin E; Schwartz, Robert P
2015-01-01
The use of computers for identifying and intervening with stigmatized behaviors, such as drug use, offers promise for underserved, rural areas; however, the acceptability and appropriateness of using computerized brief intervention (CBIs) must be taken into consideration. In the present study, 12 staff members representing a range of clinic roles in two rural, federally qualified health centers completed semi-structured interviews in a qualitative investigation of CBI vs. counselor-delivered individual brief intervention (IBI). Thematic content analysis was conducted using a constant comparative method, examining the range of responses within each interview as well as data across interview respondents. Overall, staff found the idea of providing CBIs both acceptable and appropriate for their patient population. Acceptability by clinic staff centered on the ready availability of the CBI. Staff also believed that patients might be more forthcoming in response to a computer program than a personal interview. However, some staff voiced reservations concerning the appropriateness of CBIs for subsets of patients, including older patients, illiterate individuals, or those unfamiliar with computers. Findings support the potential suitability and potential benefits of providing CBIs to patients in rural health centers.
A new approach for measuring power spectra and reconstructing time series in active galactic nuclei
NASA Astrophysics Data System (ADS)
Li, Yan-Rong; Wang, Jian-Min
2018-05-01
We provide a new approach to measure power spectra and reconstruct time series in active galactic nuclei (AGNs) based on the fact that the Fourier transform of AGN stochastic variations is a series of complex Gaussian random variables. The approach parametrizes a stochastic series in frequency domain and transforms it back to time domain to fit the observed data. The parameters and their uncertainties are derived in a Bayesian framework, which also allows us to compare the relative merits of different power spectral density models. The well-developed fast Fourier transform algorithm together with parallel computation enables an acceptable time complexity for the approach.
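The fact the approach builds on — that the Fourier coefficients of a stochastic series are independent complex Gaussian variables whose variance follows the power spectral density — also gives a direct way to simulate such a series: draw the coefficients and transform back (the Timmer and Koenig recipe). The power-law PSD and parameters below are illustrative.

```python
import numpy as np

def simulate(n, dt, psd, seed=0):
    """Simulate a zero-mean time series with a prescribed PSD by drawing
    complex Gaussian Fourier coefficients and inverse-transforming."""
    rng = np.random.default_rng(seed)
    f = np.fft.rfftfreq(n, d=dt)[1:]            # positive frequencies
    amp = np.sqrt(psd(f) / 2)
    coef = amp * (rng.normal(size=f.size) + 1j * rng.normal(size=f.size))
    spectrum = np.concatenate(([0.0], coef))    # zero DC term -> zero mean
    return np.fft.irfft(spectrum, n)

# a power-law ("red noise") PSD typical of AGN optical variability
series = simulate(4096, 1.0, lambda f: f ** -2.0)
print(series.shape)
```

The paper's approach runs this parametrization in reverse: the frequency-domain parameters are fitted to the observed (irregularly sampled) light curve in a Bayesian framework.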
A Python-based interface to examine motions in time series of solar images
NASA Astrophysics Data System (ADS)
Campos-Rozo, J. I.; Vargas Domínguez, S.
2017-10-01
Python is considered to be a mature programming language, and is widely accepted as an engaging option for scientific analysis in multiple areas, as will be presented in this work for the particular case of solar physics research. SunPy is an open-source library based on Python that has been recently developed to furnish software tools for solar data analysis and visualization. In this work we present a graphical user interface (GUI) based on Python and Qt to effectively compute proper motions for the analysis of time series of solar data. This user-friendly computing interface, which is intended to be incorporated into the SunPy library, uses a local correlation tracking technique and extra tools that allow the selection of different parameters to calculate, visualize, and analyze vector velocity fields of solar data, i.e., time series of solar filtergrams and magnetograms.
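The correlation step behind local correlation tracking can be shown in miniature: estimate the displacement between two image patches from the peak of their FFT-based cross-correlation. A full LCT pipeline would apply this in sliding, apodized windows across the filtergram series; the toy images here are illustrative.

```python
import numpy as np

def shift_estimate(a, b):
    """Integer-pixel displacement of image a relative to b from the
    peak of their circular cross-correlation (computed via FFT)."""
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peak indices into signed shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(2)
img = rng.normal(size=(64, 64))
shifted = np.roll(img, (3, -5), axis=(0, 1))
print(shift_estimate(shifted, img))
```

Real LCT implementations refine this with sub-pixel interpolation of the correlation peak and convert the per-window shifts into a velocity field using the frame cadence and plate scale.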
A neural computational model for animal's time-to-collision estimation.
Wang, Ling; Yao, Dezhong
2013-04-17
The time-to-collision (TTC) is the time that elapses before a looming object hits the subject. An accurate estimation of TTC plays a critical role in the survival of animals in nature and acts as an important factor in artificial intelligence systems that depend on judging and avoiding potential dangers. The theoretical formula for TTC is 1/τ≈θ'/sin θ, where θ and θ' are the visual angle and its variation, respectively, and the widely used approximate computational model is θ'/θ. However, both of these measures are too complex to be implemented by a biological neuronal model. We propose a new simple computational model: 1/τ≈Mθ-P/(θ+Q)+N, where M, P, Q, and N are constants that depend on a predefined visual angle. This model, the weighted summation of visual angle model (WSVAM), can achieve perfect implementation through a widely accepted biological neuronal model. WSVAM has additional merits, including naturally minimal consumption and simplicity. Thus, it yields a precise and neuronally implemented estimation of TTC, which provides a simple and convenient implementation for artificial vision, and represents a potential visual brain mechanism.
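The baseline approximation 1/τ ≈ θ'/θ mentioned above can be checked numerically for an object approaching head-on: the visual angle and its rate of change alone recover the remaining time to contact. The scenario values (object radius, speed, distance) are ours, chosen only for illustration.

```python
import math

def visual_angle(r, d):
    # Full visual angle subtended by an object of radius r at distance d.
    return 2.0 * math.atan(r / d)

def ttc_estimate(theta, theta_dot):
    # Widely used approximation: tau ~= theta / theta', i.e. 1/tau ~= theta'/theta.
    return theta / theta_dot

# Illustrative scenario (values are ours, not from the paper):
r, v = 0.5, 10.0           # object radius (m), approach speed (m/s)
d = 50.0                   # current distance (m)
true_ttc = d / v           # 5 s until collision

eps = 1e-6
theta = visual_angle(r, d)
theta_dot = (visual_angle(r, d - v * eps) - theta) / eps  # numeric time derivative

est = ttc_estimate(theta, theta_dot)
print(est)  # close to the true 5.0 s while the visual angle is small
```

The estimate uses only retinal quantities (θ and θ'), which is what makes models of this family plausible candidates for neuronal implementation.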
An Analogue VLSI Implementation of the Meddis Inner Hair Cell Model
NASA Astrophysics Data System (ADS)
McEwan, Alistair; van Schaik, André
2003-12-01
The Meddis inner hair cell model is a widely accepted but computationally intensive computer model of mammalian inner hair cell function. We have produced an analogue VLSI implementation of this model that operates in real time in the current domain by using translinear and log-domain circuits. The circuit has been fabricated on a chip and tested against the Meddis model for (a) rate level functions for onset and steady-state response, (b) recovery after masking, (c) additivity, (d) two-component adaptation, (e) phase locking, (f) recovery of spontaneous activity, and (g) computational efficiency. The advantage of this circuit over other electronic inner hair cell models is its nearly exact implementation of the Meddis model, which can be tuned to behave similarly to the biological inner hair cell. This has important implications for our ability to simulate the auditory system in real time. Furthermore, the technique of mapping a mathematical model of first-order differential equations to a circuit of log-domain filters allows us to implement real-time neuromorphic signal processors for a host of models using the same approach.
NASA Technical Reports Server (NTRS)
Ko, William L.; Olona, Timothy; Muramoto, Kyle M.
1990-01-01
Different finite element models previously set up for thermal analysis of the space shuttle orbiter structure are discussed and their shortcomings identified. Element density criteria are established for finite element thermal modeling of space shuttle orbiter-type large, hypersonic aircraft structures. These criteria are based on rigorous studies of solution accuracy using different finite element models, with different element densities, set up for one cell of the orbiter wing. Also, a method for optimizing the central processing unit (CPU) time of transient thermal analysis is discussed. Based on the newly established element density criteria, the orbiter wing midspan segment was modeled for the examination of thermal analysis solution accuracy and the extent of computation CPU time requirements. The results showed that the distributions of the structural temperatures and the thermal stresses obtained from this wing segment model were satisfactory and the computation CPU time was at an acceptable level. The studies offer hope that modeling large, hypersonic aircraft structures with high-density elements for transient thermal analysis is feasible if a CPU time optimization technique is used.
NASA Astrophysics Data System (ADS)
Franco, Patrick; Ogier, Jean-Marc; Loonis, Pierre; Mullot, Rémy
Recently we have developed a model for shape description and matching. Based on minimum spanning tree construction and specific stages such as mixture, it appears to have many desirable properties. Recognition invariance under shifted, rotated, and noisy shapes was checked through medium-scale tests on the GREC symbol reference database. Even if extracting the topology of a shape by mapping the shortest path connecting all the pixels is powerful, the construction of the graph incurs a high algorithmic cost. In this article we discuss ways to reduce computing time. An alternative solution based on image compression concepts is provided and evaluated. The model no longer operates in the image space but in a compact space, namely the discrete cosine space. The use of the block discrete cosine transform is discussed and justified. The experimental results obtained on the GREC2003 database show that the proposed method is characterized by good discrimination power and real robustness to noise, with acceptable computing time.
Strategic Performance Management Evaluation for the Navy’s Splice Local Area Networks.
1985-04-01
Communications Agency (DCA)/Federal Data Corporation (FDC) literature; an extensive survey of academic and professional book and article literature... interesting closing note on strategic planning characteristics is that the period during which collapse or disaster develops is of the same order as the... accepted set of standards. In computer performance, such things as paging rates, throughput, input/output channel usage, turnaround time
An automated system for pulmonary function testing
NASA Technical Reports Server (NTRS)
Mauldin, D. G.
1974-01-01
An experiment to quantitate pulmonary function was accepted for the space shuttle concept verification test. The single breath maneuver and the nitrogen washout are combined to reduce the test time. Parameters are defined from the forced vital capacity maneuvers. A spirometer measures the breath volume and a magnetic sector mass spectrometer provides definition of gas composition. Mass spectrometer and spirometer data are analyzed by a PDP-81 digital computer.
Fekete, Szabolcs; Fekete, Jeno; Molnár, Imre; Ganzler, Katalin
2009-11-06
Many different strategies of reversed-phase high performance liquid chromatographic (RP-HPLC) method development are used today. This paper describes a strategy for the systematic development of ultrahigh-pressure liquid chromatographic (UHPLC or UPLC) methods using 5 cm x 2.1 mm columns packed with sub-2 µm particles and computer simulation (the DryLab(R) package). Data for the accuracy of computer modeling in the Design Space under ultrahigh-pressure conditions are reported, and an acceptable accuracy for these predictions of the computer models is presented. This work illustrates a method development strategy focusing on a 3- to 5-fold time reduction compared to conventional HPLC method development, and exhibits parts of the Design Space elaboration as requested by the FDA and ICH Q8R1. Furthermore, this paper demonstrates the accuracy of retention time prediction at elevated pressure (enhanced flow-rate) and shows that computer-assisted simulation can be applied with sufficient precision for UHPLC applications (p > 400 bar). Examples of fast and effective method development in pharmaceutical analysis, for both gradient and isocratic separations, are presented.
NASA Technical Reports Server (NTRS)
Thompson, T. W.; Cutts, J. A.
1981-01-01
A catalog of lunar radar anomalies was generated to provide a base for comparison with Venusian radar signatures. The relationships between lunar radar anomalies and regolith processes were investigated, and a consortium was formed to compare lunar and Venusian radar images of craters. Time was scheduled at the Arecibo Observatory to use the 430 MHz radar to obtain high resolution radar maps of six areas of the lunar surface. Data from 1978 observations of Mare Serenitatis and Plato are being analyzed on a PDP 11/70 computer to construct the computer program library necessary for the eventual reduction of the May 1981 and subsequent data acquisitions. Papers accepted for publication are presented.
NASA Astrophysics Data System (ADS)
Elantkowska, Magdalena; Ruczkowski, Jarosław; Sikorski, Andrzej; Dembczyński, Jerzy
2017-11-01
A parametric analysis of the hyperfine structure (hfs) for the even parity configurations of atomic terbium (Tb I) is presented in this work. We introduce the complete set of 4fN-core states in our high-performance computing (HPC) calculations. For calculations of the huge hyperfine structure matrix, requiring approximately 5000 hours when run on a single CPU, we propose methods utilizing a personal computer cluster or, alternatively, a cluster of Microsoft Azure virtual machines (VMs). These methods give a factor-of-12 performance boost, enabling the calculations to complete in an acceptable time.
An algorithm to compute the sequency ordered Walsh transform
NASA Technical Reports Server (NTRS)
Larsen, H.
1976-01-01
A fast sequency-ordered Walsh transform algorithm is presented; it is complementary to the sequency-ordered fast Walsh transform introduced by Manz (1972), which eliminates Gray code reordering through a modification of the basic fast Hadamard transform structure. The new algorithm retains the advantages of its complement (it is in-place and is its own inverse), while differing in having a decimation-in-time structure, accepting data in normal order, and returning the coefficients in bit-reversed sequency order. Applications include estimation of Walsh power spectra for a random process, sequency filtering, computing logical autocorrelations, and selective bit reversing.
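The butterfly structure underlying both algorithms in the abstract is the basic fast Walsh-Hadamard transform, which can be sketched as follows. This is the plain natural (Hadamard) ordering, not the sequency-ordered variant the paper presents; it serves only to show the in-place, self-inverse structure being modified.

```python
def fwht(a):
    """In-place fast Walsh-Hadamard transform, natural (Hadamard) order.
    Self-inverse up to a factor of n = len(a); n must be a power of two."""
    a = list(a)
    h = 1
    n = len(a)
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                # Butterfly: sum and difference of paired elements.
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

data = [1, 0, 1, 0, 0, 1, 1, 0]
coeffs = fwht(data)
# The transform is its own inverse up to a factor of n:
restored = [c / len(data) for c in fwht(coeffs)]
```

The variants discussed in the abstract differ only in where the reordering (Gray code, bit reversal) is absorbed into this loop structure, not in the arithmetic itself.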
Factors Affecting Students' Acceptance of Tablet PCs: A Study in Italian High Schools
ERIC Educational Resources Information Center
Cacciamani, Stefano; Villani, Daniela; Bonanomi, Andrea; Carissoli, Claudia; Olivari, Maria Giulia; Morganti, Laura; Riva, Giuseppe; Confalonieri, Emanuela
2018-01-01
To maximize the advantages of the tablet personal computer (TPC) at school, this technology needs to be accepted by students as new tool for learning. With reference to the Technology Acceptance Model and the Unified Theory of Acceptance and Use of Technology, the aims of this study were (a) to analyze factors influencing high school students'…
ERIC Educational Resources Information Center
Wang, Chia-Sui; Huang, Yong-Ming
2016-01-01
Face-to-face computer-supported collaborative learning (CSCL) was used extensively to facilitate learning in classrooms. Cloud services not only allow a single user to edit a document, but they also enable multiple users to simultaneously edit a shared document. However, few researchers have compared student acceptance of such services in…
Kurth, Ann E; Chhun, Nok; Cleland, Charles M; Crespo-Fierro, Michele; Parés-Avila, José A; Lizcano, John A; Norman, Robert G; Shedlin, Michele G; Johnston, Barbara E; Sharp, Victoria L
2016-07-13
Human immunodeficiency virus (HIV) disease in the United States disproportionately affects minorities, including Latinos. Barriers including language are associated with lower antiretroviral therapy (ART) adherence seen among Latinos, yet ART and interventions for clinic visit adherence are rarely developed or delivered in Spanish. The aim was to adapt a computer-based counseling tool, demonstrated to reduce HIV-1 viral load and sexual risk transmission in a population of English-speaking adults, for use during routine clinical visits for an HIV-positive Spanish-speaking population (CARE+ Spanish); the Technology Acceptance Model (TAM) was the theoretical framework guiding program development. A longitudinal randomized controlled trial was conducted from June 4, 2010 to March 29, 2012. Participants were recruited from a comprehensive HIV treatment center comprising three clinics in New York City. Eligibility criteria were (1) adults (age ≥18 years), (2) Latino birth or ancestry, (3) speaks Spanish (mono- or multilingual), and (4) on antiretrovirals. Linear and generalized mixed linear effects models were used to analyze primary outcomes, which included ART adherence, sexual transmission risk behaviors, and HIV-1 viral loads. Exit interviews were offered to purposively selected intervention participants to explore cultural acceptability of the tool among participants, and focus groups explored the acceptability and system efficiency issues among clinic providers, using the TAM framework. A total of 494 Spanish-speaking HIV clinic attendees were enrolled and randomly assigned to the intervention (arm A: n=253) or risk assessment-only control (arm B, n=241) group and followed up at 3-month intervals for one year. Gender distribution was 296 (68.4%) male, 110 (25.4%) female, and 10 (2.3%) transgender. By study end, 433 of 494 (87.7%) participants were retained. 
Although intervention participants had reduced viral loads, increased ART adherence and decreased sexual transmission risk behaviors over time, these findings were not statistically significant. We also conducted 61 qualitative exit interviews with participants and two focus groups with a total of 16 providers. A computer-based counseling tool grounded in the TAM theoretical model and delivered in Spanish was acceptable and feasible to implement in a high-volume HIV clinic setting. It was able to provide evidence-based, linguistically appropriate ART adherence support without requiring additional staff time, bilingual status, or translation services. We found that language preferences and cultural acceptability of a computer-based counseling tool exist on a continuum in our urban Spanish-speaking population. Theoretical frameworks of technology's usefulness for behavioral modification need further exploration in other languages and cultures. ClinicalTrials.gov NCT01013935; https://clinicaltrials.gov/ct2/show/NCT01013935 (Archived by WebCite at http://www.webcitation.org/6ikaD3MT7).
Usability of a Low-Cost Head Tracking Computer Access Method following Stroke.
Mah, Jasmine; Jutai, Jeffrey W; Finestone, Hillel; Mckee, Hilary; Carter, Melanie
2015-01-01
Assistive technology devices for computer access can facilitate social reintegration and promote independence for people who have had a stroke. This work describes the exploration of the usefulness and acceptability of a new computer access device called the Nouse™ (Nose-as-mouse). The device uses standard webcam and video recognition algorithms to map the movement of the user's nose to a computer cursor, thereby allowing hands-free computer operation. Ten participants receiving in- or outpatient stroke rehabilitation completed a series of standardized and everyday computer tasks using the Nouse™ and then completed a device usability questionnaire. Task completion rates were high (90%) for computer activities only in the absence of time constraints. Most of the participants were satisfied with ease of use (70%) and liked using the Nouse™ (60%), indicating they could resume most of their usual computer activities apart from word-processing using the device. The findings suggest that hands-free computer access devices like the Nouse™ may be an option for people who experience upper motor impairment caused by stroke and are highly motivated to resume personal computing. More research is necessary to further evaluate the effectiveness of this technology, especially in relation to other computer access assistive technology devices.
Acceptability of a Virtual Patient Educator for Hispanic Women.
Wells, Kristen J; Vàzquez-Otero, Coralia; Bredice, Marissa; Meade, Cathy D; Chaet, Alexis; Rivera, Maria I; Arroyo, Gloria; Proctor, Sara K; Barnes, Laura E
2015-01-01
There are few Spanish language interactive, technology-driven health education programs. Objectives of this feasibility study were to (a) learn more about computer and technology usage among Hispanic women living in a rural community and (b) evaluate acceptability of the concept of using an embodied conversational agent (ECA) computer application among this population. A survey about computer usage history and interest in computers was administered to a convenience sample of 26 women. A sample video prototype of a hospital discharge ECA was administered followed by questions to gauge opinion about the ECA. Data indicate women exhibited both a high level of computer experience and enthusiasm for the ECA. Feedback from community is essential to ensure equity in state of the art dissemination of health information.
Weare, Jonathan; Dinner, Aaron R.; Roux, Benoît
2016-01-01
A multiple time-step integrator based on a dual Hamiltonian and a hybrid method combining molecular dynamics (MD) and Monte Carlo (MC) is proposed to sample systems in the canonical ensemble. The Dual Hamiltonian Multiple Time-Step (DHMTS) algorithm is based on two similar Hamiltonians: a computationally expensive one that serves as a reference and a computationally inexpensive one to which the workload is shifted. The central assumption is that the difference between the two Hamiltonians is slowly varying. Earlier work has shown that such dual Hamiltonian multiple time-step schemes effectively precondition nonlinear differential equations for dynamics by reformulating them into a recursive root finding problem that can be solved by propagating a correction term through an internal loop, analogous to RESPA. Of special interest in the present context, a hybrid MD-MC version of the DHMTS algorithm is introduced to enforce detailed balance via a Metropolis acceptance criterion and ensure consistency with the Boltzmann distribution. The Metropolis criterion suppresses the discretization errors normally associated with the propagation according to the computationally inexpensive Hamiltonian, treating the discretization error as an external work. Illustrative tests are carried out to demonstrate the effectiveness of the method. PMID:26918826
ERIC Educational Resources Information Center
Peelle, Howard A.
Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clemmens, W.B.; Koupal, J.W.; Sabourin, M.A.
1993-07-20
Apparatus is described for detecting motor vehicle exhaust gas catalytic converter deterioration, comprising a first exhaust gas oxygen sensor adapted for communication with an exhaust stream before passage of the exhaust stream through a catalytic converter and a second exhaust gas oxygen sensor adapted for communication with the exhaust stream after passage of the exhaust stream through the catalytic converter; an on-board vehicle computational means, said computational means adapted to accept oxygen content signals from the before and after catalytic converter oxygen sensors and adapted to generate signal threshold values, said computational means adapted to compare over repeated time intervals the oxygen content signals to the signal threshold values and to store the output of the compared oxygen content signals, and in response, after a specified number of time intervals for a specified mode of motor vehicle operation, to determine and indicate a level of catalyst deterioration.
O'Reilly, Robert; Fedorko, Steve; Nicholson, Nigel
1983-01-01
This paper describes a structured interview process for medical school admissions supported by an Apple II computer system which provides feedback to interviewers and the College admissions committee. Presented are the rationale for the system, the preliminary results of analysis of some of the interview data, and a brief description of the computer program and output. The present data show that the structured interview yields very high interrater reliability coefficients, is acceptable to the medical school faculty, and results in quantitative data useful in the admissions process. The system continues in development; a second year of data will shortly be available, and further refinements are being made to the computer program to enhance its utilization and exportability.
Configuring Airspace Sectors with Approximate Dynamic Programming
NASA Technical Reports Server (NTRS)
Bloem, Michael; Gupta, Pramod
2010-01-01
In response to changing traffic and staffing conditions, supervisors dynamically configure airspace sectors by assigning them to control positions. A finite horizon airspace sector configuration problem models this supervisor decision. The problem is to select an airspace configuration at each time step while considering a workload cost, a reconfiguration cost, and a constraint on the number of control positions at each time step. Three algorithms for this problem are proposed and evaluated: a myopic heuristic, an exact dynamic programming algorithm, and a rollouts approximate dynamic programming algorithm. On problem instances from current operations with only dozens of possible configurations, an exact dynamic programming solution gives the optimal cost value. The rollouts algorithm achieves costs within 2% of optimal for these instances, on average. For larger problem instances that are representative of future operations and have thousands of possible configurations, excessive computation time prohibits the use of exact dynamic programming. On such problem instances, the rollouts algorithm reduces the cost achieved by the heuristic by more than 15% on average with an acceptable computation time.
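The exact dynamic programming formulation described above, minimizing workload cost plus a reconfiguration penalty over a finite horizon, can be sketched on a toy instance. The configurations, costs, and penalty below are invented for illustration and do not come from the paper, which also imposes a position-count constraint omitted here.

```python
def plan_configs(workload, reconfig_cost, configs, start):
    """Exact finite-horizon dynamic program for configuration selection.
    workload[t][c]: cost of operating configuration c at time step t.
    reconfig_cost: flat penalty for switching configurations between steps."""
    T = len(workload)
    value = {c: 0.0 for c in configs}        # cost-to-go beyond the horizon
    choice = [{} for _ in range(T)]
    for t in reversed(range(T)):             # backward induction
        new_value = {}
        for prev in configs:
            options = {
                c: workload[t][c]
                + (reconfig_cost if c != prev else 0.0)
                + value[c]
                for c in configs
            }
            best = min(options, key=options.get)
            choice[t][prev], new_value[prev] = best, options[best]
        value = new_value
    seq, cur = [], start                     # forward pass: recover the plan
    for t in range(T):
        cur = choice[t][cur]
        seq.append(cur)
    return seq, value[start]

# Toy instance: two configurations, three time steps.
workload = [{"low": 1, "high": 5},
            {"low": 1, "high": 5},
            {"low": 5, "high": 1}]
seq, cost = plan_configs(workload, reconfig_cost=3.0,
                         configs=("low", "high"), start="low")
print(seq, cost)  # switching to "high" only at the last step is optimal
```

The rollouts algorithm the paper evaluates replaces the exact cost-to-go table `value` with an estimate obtained by simulating a base heuristic forward, which is what makes it tractable when the configuration set grows to thousands.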
Virtual viewpoint synthesis in multi-view video system
NASA Astrophysics Data System (ADS)
Li, Fang; Yang, Shiqiang
2005-07-01
In this paper, we present a virtual viewpoint video synthesis algorithm designed to satisfy three aims: low computational cost, real-time interpolation, and acceptable video quality. In contrast with previous technologies, this method obtains partial 3D structure from neighboring video sources instead of recovering complete 3D information from all video sources, so the computation is greatly reduced. This allows us to demonstrate our interactive multi-view video synthesis algorithm on a personal computer. Furthermore, by choosing feature points to build the correspondence between frames captured by neighboring cameras, we do not require camera calibration. Finally, our method can be used when the angle between neighboring cameras is 25-30 degrees, much larger than in common computer vision experiments. In this way, our method can be applied in many applications such as live sports broadcasting, video conferencing, etc.
NASA Technical Reports Server (NTRS)
Apodaca, Tony; Porter, Tom
1989-01-01
The two worlds of interactive graphics and realistic graphics have remained separate. Fast graphics hardware runs simple algorithms and generates simple looking images. Photorealistic image synthesis software runs slowly on large expensive computers. The time has come for these two branches of computer graphics to merge. The speed and expense of graphics hardware is no longer the barrier to the wide acceptance of photorealism. There is every reason to believe that high quality image synthesis will become a standard capability of every graphics machine, from superworkstation to personal computer. The significant barrier has been the lack of a common language, an agreed-upon set of terms and conditions, for 3-D modeling systems to talk to 3-D rendering systems in computing an accurate rendition of a scene. Pixar has introduced RenderMan to serve as that common language. RenderMan, specifically the extensibility it offers in shading calculations, is discussed.
Computing at H1 - Experience and Future
NASA Astrophysics Data System (ADS)
Eckerlin, G.; Gerhards, R.; Kleinwort, C.; Krüner-Marquis, U.; Egli, S.; Niebergall, F.
The H1 experiment has now been successfully operating at the electron proton collider HERA at DESY for three years. During this time the computing environment has gradually shifted from a mainframe oriented environment to the distributed server/client Unix world. This transition is now almost complete. Computing needs are largely determined by the present amount of 1.5 TB of reconstructed data per year (1994), corresponding to 1.2 × 10^7 accepted events. All data are centrally available at DESY. In addition to data analysis, which is done in all collaborating institutes, most of the centrally organized Monte Carlo production is performed outside of DESY. New software tools to cope with offline computing needs include CENTIPEDE, a tool for the use of distributed batch and interactive resources for Monte Carlo production, and H1 UNIX, a software package for automatic updates of H1 software on all UNIX platforms.
Barriers to Acceptance of Personal Digital Assistants for HIV/AIDS Data Collection in Angola
Cheng, Karen G.; Ernesto, Francisco; Ovalle-Bahamón, Ricardo E.; Truong, Khai N.
2011-01-01
Purpose Handheld computers have potential to improve HIV/AIDS programs in healthcare settings in low-resource countries, by improving the speed and accuracy of collecting data. However, the acceptability of the technology (i.e., user attitude and reaction) is critical for its successful implementation. Acceptability is particularly critical for HIV/AIDS behavioral data, as it depends on respondents giving accurate information about a highly sensitive topic – sexual behavior. Methods To explore the acceptability of handheld computers for HIV/AIDS data collection and to identify potential barriers to acceptance, five focus groups of 8–10 participants each were conducted in Luanda, Angola. Facilitators presented Palm Tungsten E handhelds to the focus groups, probed participants’ perceptions of the handheld computer, and asked how they felt about disclosing intimate sexual behavior to an interviewer using a handheld computer. Discussions were conducted in Portuguese, the official language of Angola, and audio-taped. They were then transcribed and translated into English for analysis. Results In total, 49 people participated in the focus groups. PDAs were understood through the lens of social and cultural beliefs. Themes that emerged were suspicion of outsiders, concern with longevity, views on progress and development, and concern about social status. Conclusions The findings from this study suggest that personal and cultural beliefs influence participant acceptance of PDAs in Angola. While PDAs provide great advantages in terms of speed and efficiency of data collection, these barriers, if left unaddressed, may lead to biased reporting of HIV/AIDS risk data. An understanding of the barriers and why they are relevant in Angola may help researchers and practitioners to reduce the impact of these barriers on HIV/AIDS data collection. PMID:21622022
Measuring older adults' sedentary time: reliability, validity, and responsiveness.
Gardiner, Paul A; Clark, Bronwyn K; Healy, Genevieve N; Eakin, Elizabeth G; Winkler, Elisabeth A H; Owen, Neville
2011-11-01
With evidence that prolonged sitting has deleterious health consequences, decreasing sedentary time is a potentially important preventive health target. High-quality measures, particularly for use with older adults, who are the most sedentary population group, are needed to evaluate the effect of sedentary behavior interventions. We examined the reliability, validity, and responsiveness to change of a self-report sedentary behavior questionnaire that assessed time spent in behaviors common among older adults: watching television, computer use, reading, socializing, transport and hobbies, and a summary measure (total sedentary time). In the context of a sedentary behavior intervention, nonworking older adults (n = 48, age = 73 ± 8 yr (mean ± SD)) completed the questionnaire on three occasions during a 2-wk period (7 d between administrations) and wore an accelerometer (ActiGraph model GT1M) for two periods of 6 d. Test-retest reliability (for the individual items and the summary measure) and validity (self-reported total sedentary time compared with accelerometer-derived sedentary time) were assessed during the 1-wk preintervention period, using Spearman (ρ) correlations and 95% confidence intervals (CI). Responsiveness to change after the intervention was assessed using the responsiveness statistic (RS). Test-retest reliability was excellent for television viewing time (ρ (95% CI) = 0.78 (0.63-0.89)), computer use (ρ (95% CI) = 0.90 (0.83-0.94)), and reading (ρ (95% CI) = 0.77 (0.62-0.86)); acceptable for hobbies (ρ (95% CI) = 0.61 (0.39-0.76)); and poor for socializing and transport (ρ < 0.45). Total sedentary time had acceptable test-retest reliability (ρ (95% CI) = 0.52 (0.27-0.70)) and validity (ρ (95% CI) = 0.30 (0.02-0.54)). Self-report total sedentary time was similarly responsive to change (RS = 0.47) as accelerometer-derived sedentary time (RS = 0.39). 
The summary measure of total sedentary time has good repeatability and modest validity and is sufficiently responsive to change suggesting that it is suitable for use in interventions with older adults.
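The Spearman (ρ) correlations reported above measure agreement between the rankings of two measurements, for example the same questionnaire item on two occasions. A minimal pure-Python sketch, with invented data in place of the study's, looks like this:

```python
def rank(xs):
    """Average ranks (1-based), handling ties by assigning the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho: Pearson correlation of the two rank vectors."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Invented example: self-reported TV hours on two administrations.
week1 = [10, 14, 7, 21, 3]
week2 = [11, 13, 8, 19, 4]   # same ordering of respondents
print(spearman(week1, week2))  # 1.0: perfect rank agreement
```

Because only ranks enter the statistic, it tolerates the skewed distributions typical of self-reported sedentary time, which is why it is the usual choice for test-retest reliability in this literature.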
Technology Acceptance in Social Work Education: Implications for the Field Practicum
ERIC Educational Resources Information Center
Colvin, Alex Don; Bullock, Angela N.
2014-01-01
The exponential growth and sophistication of new information and computer technology (ICT) have greatly influenced human interactions and provided new metaphors for understanding the world. The acceptance and integration of ICT into social work field education are examined here using the technological acceptance model. This article also explores…
Evaluation of Operational Procedures for Using a Time-Based Airborne Inter-arrival Spacing Tool
NASA Technical Reports Server (NTRS)
Oseguera-Lohr, Rosa M.; Lohr, Gary W.; Abbott, Terence S.; Eischeid, Todd M.
2002-01-01
An airborne tool has been developed based on the concept of an aircraft maintaining a time-based spacing interval from the preceding aircraft. The Advanced Terminal Area Approach Spacing (ATAAS) tool uses Automatic Dependent Surveillance-Broadcast (ADS-B) aircraft state data to compute a speed command for the ATAAS-equipped aircraft to obtain a required time interval behind another aircraft. The tool and candidate operational procedures were tested in a high-fidelity, full mission simulator with active airline subject pilots flying an arrival scenario using three different modes for speed control. The objectives of this study were to validate the results of a prior Monte Carlo analysis of the ATAAS algorithm and to evaluate the concept from the standpoint of pilot acceptability and workload. Results showed that the aircraft was able to consistently achieve the target spacing interval within one second (the equivalent of approximately 220 ft at a final approach speed of 130 kt) when the ATAAS speed guidance was autothrottle-coupled, and a slightly greater (4-5 seconds), but consistent interval with the pilot-controlled speed modes. The subject pilots generally rated the workload level with the ATAAS procedure as similar to that with standard procedures, and also rated most aspects of the procedure high in terms of acceptability. Although pilots indicated that the head-down time was higher with ATAAS, the acceptability of head-down time was rated high. Oculometer data indicated slight changes in instrument scan patterns, but no significant change in the amount of time spent looking out the window between the ATAAS procedure versus standard procedures.
NASA Technical Reports Server (NTRS)
Allison, L. J.
1972-01-01
A complete documentation of Nimbus 2 High Resolution Infrared Radiometer (HRIR) data and ESSA-1 and 3 television photographs is presented for the lifetime of Hurricane Inez, 1966. Ten computer-produced radiation charts were analyzed in order to delineate the three-dimensional cloud structure during the formative, mature and dissipating stages of this tropical cyclone. Time sections were drawn throughout the storm's life cycle to relate the warm core development and upper level outflow of the storm with their respective cloud canopies, as shown by the radiation data. Aerial reconnaissance weather reports, radar photographs and conventional weather analyses were used to complement the satellite data. A computer program was utilized to accept Nimbus 2 HRIR equivalent blackbody temperatures within historical maximum and minimum sea surface temperature limits over the tropical Atlantic Ocean.
HGML: a hypertext guideline markup language.
Hagerty, C. G.; Pickens, D.; Kulikowski, C.; Sonnenberg, F.
2000-01-01
Existing text-based clinical practice guidelines can be difficult to put into practice. While a growing number of such documents have gained acceptance in the medical community and contain a wealth of valuable information, the time required to digest them is substantial. Yet the expressive power, subtlety and flexibility of natural language pose challenges when designing computer tools that will help in their application. At the same time, formal computer languages typically lack such expressiveness and the effort required to translate existing documents into these languages may be costly. We propose a method based on the mark-up concept for converting text-based clinical guidelines into a machine-operable form. This allows existing guidelines to be manipulated by machine, and viewed in different formats at various levels of detail according to the needs of the practitioner, while preserving their originally published form. PMID:11079898
Benn, D K; Minden, N J; Pettigrew, J C; Shim, M
1994-08-01
President Clinton's Health Security Act proposes the formation of large-scale health plans with improved quality assurance. Dental radiography consumes 4% ($1.2 billion in 1990) of total dental expenditure, yet regular systematic office quality assurance is not performed. A pilot automated method is described for assessing the density of exposed film and the fogging of unexposed processed film. A workstation and camera were used to input intraoral radiographs. Test images were produced from a phantom jaw with increasing exposure times. Two radiologists subjectively classified the images as too light, acceptable, or too dark. A computer program automatically classified global grey-level histograms from the test images as too light, acceptable, or too dark. The program correctly classified 95% of 88 clinical films. The optical density of unexposed film in the range 0.15 to 0.52, measured by computer, was reliable to better than 0.01. Further work is needed to see whether comprehensive centralized automated radiographic quality assurance systems with feedback to dentists are feasible, able to improve quality, and significantly cheaper than conventional clerical methods.
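The histogram-based classification described above can be sketched as a simple rule on the global grey-level statistics of the image. The thresholds below are invented for illustration and are not the authors' calibrated values:

```python
import numpy as np

# Minimal sketch of a global grey-level classification rule for film
# density. Thresholds are assumptions for illustration only; the
# original program's decision rule and calibration are not shown here.

def classify_film(pixels, light_thresh=180.0, dark_thresh=75.0):
    """Classify a radiograph as 'too light', 'acceptable', or 'too dark'
    from its mean grey level (0 = black, 255 = white)."""
    mean_grey = float(np.mean(pixels))
    if mean_grey > light_thresh:
        return "too light"
    if mean_grey < dark_thresh:
        return "too dark"
    return "acceptable"
```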
Tailoring High Order Time Discretizations for Use with Spatial Discretizations of Hyperbolic PDEs
2015-05-19
Duration of Grant: Sigal Gottlieb, Professor of Mathematics, UMass Dartmouth. Daniel Higgs, Graduate Student, UMass Dartmouth. Zachary Grant, Undergraduate...Grant, and D. Higgs, “Optimal Explicit Strong Stability Preserving Runge–Kutta Methods with High Linear Order and Optimal Nonlinear Order.” Accepted...for publication in Mathematics of Computation. Available on arXiv at http://arxiv.org/abs/1403.6519 4. C. Bresten, S. Gottlieb, Z. Grant, D. Higgs
Cingi Steps for preoperative computer-assisted image editing before reduction rhinoplasty.
Cingi, Can Cemal; Cingi, Cemal; Bayar Muluk, Nuray
2014-04-01
The aim of this work is to provide a stepwise systematic guide for a preoperative photo-editing procedure for rhinoplasty cases involving the cooperation of a graphic artist and a surgeon. One hundred female subjects who planned to undergo a reduction rhinoplasty operation were included in this study. The Cingi Steps for Preoperative Computer Imaging (CS-PCI) program, a stepwise systematic guide for image editing using Adobe Photoshop's "liquify" effect, was applied to the rhinoplasty candidates. The stages of CS-PCI are as follows: (1) lowering the hump; (2) shortening the nose; (3) adjusting the tip projection; (4) perfecting the nasal dorsum; (5) creating a supratip break; and (6) exaggerating the tip projection and/or dorsal slope. Performing the Cingi Steps allows the patient to see what will happen during the operation and observe the final appearance of his or her nose. After the application of the described steps, 71 patients (71%) accepted step 4, and 21 (21%) accepted step 5. Only 10 patients (10%) wanted to make additional changes to their operation plans. The main benefits of using this method are that it decreases the time the surgeon needs to perform a graphic analysis and that it reduces the time required for the patient to reach a decision about the procedure. It is an easy and reliable method that provides improved physician-patient communication, increased patient confidence, and enhanced surgical planning while limiting the time needed for planning. © 2014 ARS-AAOA, LLC.
27 CFR 19.634 - Computer-generated reports and transaction forms.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...
27 CFR 19.634 - Computer-generated reports and transaction forms.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...
27 CFR 19.634 - Computer-generated reports and transaction forms.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...
27 CFR 19.634 - Computer-generated reports and transaction forms.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...
A Computational Approach for Probabilistic Analysis of LS-DYNA Water Impact Simulations
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.
2010-01-01
NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. Because of the computational cost, these tools are often used to evaluate specific conditions and are rarely used for statistical analysis. The challenge is to capture what is learned from a limited number of LS-DYNA simulations and develop models that allow users to interpolate solutions at a fraction of the computational time. For this problem, response surface models are used to predict the system time responses to a water landing as a function of capsule speed, direction, attitude, water speed, and water direction. Furthermore, these models can also be used to ascertain the adequacy of the design in terms of probability measures. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
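The response-surface idea can be illustrated in one variable: fit a low-order polynomial surrogate by least squares to a handful of expensive simulation samples, then evaluate it cheaply at new points. This is a deliberately simplified, hypothetical sketch; the paper's models are multivariate (capsule speed, attitude, water state, and so on).

```python
import numpy as np

# Hedged one-variable illustration of a quadratic response-surface
# surrogate. The "samples" would come from expensive LS-DYNA runs;
# interpolation then costs only a polynomial evaluation.

def fit_response_surface(x_samples, y_samples, degree=2):
    """Least-squares polynomial fit to simulation samples."""
    return np.polyfit(x_samples, y_samples, degree)

def predict(coeffs, x_new):
    """Cheap surrogate evaluation at a new condition."""
    return np.polyval(coeffs, x_new)
```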
Wood, Annabel; Morris, Helen; Emery, Jon; Hall, Per N; Cotton, Symon; Prevost, A Toby; Walter, Fiona M
2008-01-01
Pigmented skin lesions or 'moles' are a common presenting problem in general practice consultations: while the majority are benign, a minority are malignant melanomas. The MoleMate system is a novel diagnostic tool which incorporates spectrophotometric intracutaneous analysis (SIAscopy) within a non-invasive scanning technique and utilises a diagnostic algorithm specifically developed for use in primary care. The MoleMate training program is a short, computer-based course developed to train primary care practitioners to operate the MoleMate diagnostic tool. This pre-trial study used mixed methods to assess the effectiveness and acceptability of a computer-based training program CD-ROM, developed to teach primary care practitioners to identify the seven features of suspicious pigmented lesions (SPLs) seen with the MoleMate system. Twenty-five practitioners worked through the MoleMate training program: data on feature recognition and time taken to conduct the assessment of each lesion were collected. Acceptability of the training program and the MoleMate system in general was assessed by questionnaire. The MoleMate training program improved users' feature recognition by 10% (pre-test median 73.8%, p<0.001), and reduced the time taken to complete assessment of 30 SPLs (pre-test median 21 minutes 53 seconds, median improvement 3 minutes 17 seconds, p<0.001). All practitioners' feature recognition improved (21/21), with most also improving their time (18/21). Practitioners rated the training program as effective and easy to use. The MoleMate training program is a potentially effective and acceptable informatics tool to teach practitioners to recognise the features of SPLs identified by the MoleMate system. It will be used as part of the intervention in a randomised controlled trial to compare the diagnostic accuracy and appropriate referral rates of practitioners using the MoleMate system with best practice in primary care.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roper, J; Ghavidel, B; Godette, K
Purpose: To validate a knowledge-based algorithm for prostate LDR brachytherapy treatment planning. Methods: A dataset of 100 cases was compiled from an active prostate seed implant service. Cases were randomized into 10 subsets. For each subset, the 90 remaining library cases were registered to a common reference frame and then characterized on a point-by-point basis using principal component analysis (PCA). Each test case was converted to PCA vectors using the same process and compared with each library case using a Mahalanobis distance to evaluate similarity. Rank-order PCA scores were used to select the best-matched library case. The seed arrangement was extracted from the best-matched case and used as a starting point for planning the test case. Any subsequent modifications were recorded that required input from a treatment planner to achieve V100>95%, V150<60%, V200<20%. To simulate operating-room planning constraints, seed activity was held constant, and the seed count could not increase. Results: The computational time required to register test-case contours and evaluate PCA similarity across the library was 10 s. Preliminary analysis of 2 subsets shows that 9 of 20 test cases did not require any seed modifications to obtain an acceptable plan. Five test cases required fewer than 10 seed modifications or a grid shift. Another 5 test cases required approximately 20 seed modifications. An acceptable plan was not achieved for 1 outlier, which was substantially larger than its best match. Modifications took between 5 s and 6 min. Conclusion: A knowledge-based treatment planning algorithm for prostate LDR brachytherapy is being cross-validated using 100 prior cases. Preliminary results suggest that for this size library, acceptable plans can be achieved without planner input in about half of the cases, while varying amounts of planner input are needed in the remaining cases. Computational time and planning time are compatible with clinical practice.
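The matching step, as described, can be sketched with NumPy: each case is represented by a PCA score vector, and the library case with the smallest Mahalanobis distance to the test case is selected. Registration and feature extraction are simplified away, so this is a sketch of the ranking step only:

```python
import numpy as np

# Sketch of knowledge-based case matching: rank library cases by
# Mahalanobis distance in PCA-score space and return the best match.
# The covariance is estimated from the library itself; pinv guards
# against a singular covariance for small or degenerate libraries.

def best_match(library_scores, test_score):
    """library_scores: (n_cases, n_components) array of PCA scores.
    Returns the index of the library case closest to test_score."""
    cov = np.cov(library_scores, rowvar=False)
    cov_inv = np.linalg.pinv(np.atleast_2d(cov))
    diffs = library_scores - test_score
    # Squared Mahalanobis distance d_i = diff_i^T * cov_inv * diff_i
    d2 = np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs)
    return int(np.argmin(d2))
```

In the described workflow, the seed arrangement of the returned case would then seed the plan for the new patient.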
Rosas, Lisa G; Trujillo, Celina; Camacho, Jose; Madrigal, Daniel; Bradman, Asa; Eskenazi, Brenda
2014-01-01
Objective To describe the acceptability of an interactive computer kiosk that provides environmental health education to low-income Latina prenatal patients. Methods A mixed-methods approach was used to assess the acceptability of the Prenatal Environmental Health Kiosk among pregnant Latina women in Salinas, CA (n=152). The kiosk is a low-literacy, interactive touch-screen computer program with an audio component and includes graphics and an interactive game. Results The majority had never used a kiosk before. Over 90% of women reported that they learned something new while using the kiosk. Prior to using the kiosk, 22% of women reported a preference for receiving health education from a kiosk over a pamphlet or video, compared with 57% after using the kiosk (p<0.01). Qualitative data revealed: 1) benefit of exposure to computer use; 2) reinforcing strategy of health education; and 3) popularity of the interactive game. Conclusion The Prenatal Environmental Health Kiosk is an innovative patient health education modality that was shown to be acceptable among a population of low-income Latina pregnant women in a prenatal care clinic. Practice Implications This pilot study demonstrated that a health education kiosk was an acceptable strategy for providing Latina prenatal patients with information on pertinent environmental exposures. PMID:25085548
Rosas, Lisa G; Trujillo, Celina; Camacho, Jose; Madrigal, Daniel; Bradman, Asa; Eskenazi, Brenda
2014-11-01
To describe the acceptability of an interactive computer kiosk that provides environmental health education to low-income Latina prenatal patients. A mixed-methods approach was used to assess the acceptability of the Prenatal Environmental Health Kiosk among pregnant Latina women in Salinas, CA (n=152). The kiosk is a low-literacy, interactive touch-screen computer program with an audio component and includes graphics and an interactive game. The majority had never used a kiosk before. Over 90% of women reported that they learned something new while using the kiosk. Prior to using the kiosk, 22% of women reported a preference for receiving health education from a kiosk over a pamphlet or video, compared with 57% after using the kiosk (p<0.01). Qualitative data revealed: (1) benefit of exposure to computer use; (2) reinforcing strategy of health education; and (3) popularity of the interactive game. The Prenatal Environmental Health Kiosk is an innovative patient health education modality that was shown to be acceptable among a population of low-income Latina pregnant women in a prenatal care clinic. This pilot study demonstrated that a health education kiosk was an acceptable strategy for providing Latina prenatal patients with information on pertinent environmental exposures. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Acceptability of the Talking Touchscreen for Health Literacy Assessment
Yost, Kathleen J.; Webster, Kimberly; Baker, David W.; Jacobs, Elizabeth A.; Anderson, Andy; Hahn, Elizabeth A.
2012-01-01
Self-administration of a multimedia health literacy measure in clinic settings is a novel concept. Demonstrated ease of use and acceptability will help predicate the future value of this strategy. We previously demonstrated the acceptability of a “Talking Touchscreen” for health status assessment. For this study, we adapted the touchscreen for self-administration of a new health literacy measure. Primary care patients (n=610) in clinics for underserved populations completed health status and health literacy questions on the Talking Touchscreen and participated in an interview. Participants were 51% female, 10% age 60+, 67% African American, 18% without a high school education, and 14% who had never used a computer. The majority (93%) had no difficulty using the touchscreen, including those who were computer-naïve (87%). Most rated the screen design as very good or excellent (72%), including computer-naïve patients (71%) and older patients (75%). Acceptability of the touchscreen did not differ by health literacy level. The Talking Touchscreen was easy to use and acceptable for self-administration of a new health literacy measure. Self-administration should reduce staff burden and costs, interview bias, and feelings of embarrassment by those with lower literacy. Tools like the Talking Touchscreen may increase exposure of underserved populations to new technologies. PMID:20845195
Nursing acceptance of a speech-input interface: a preliminary investigation.
Dillon, T W; McDowell, D; Norcio, A F; DeHaemer, M J
1994-01-01
Many new technologies are being developed to improve the efficiency and productivity of nursing staffs. User acceptance is a key to the success of these technologies. In this article, the authors present a discussion of nursing acceptance of computer systems, review the basic design issues for creating a speech-input interface, and report preliminary findings of a study of nursing acceptance of a prototype speech-input interface. Results of the study showed that the 19 nursing subjects expressed acceptance of the prototype speech-input interface.
Examining CEGEP Students' Acceptance of Computer-Based Learning Environments: A Test of Two Models
ERIC Educational Resources Information Center
Doleck, Tenzin; Bazelais, Paul; Lemay, David John
2017-01-01
As the use of technology in education advances and broadens, empirical research around its use assumes increased importance. Yet literature investigating technology acceptance in certain populations remains scarce. We recently argued that technology acceptance investigations should also consider the modality of the antecedent belief, to…
An Investigation of Employees' Use of E-Learning Systems: Applying the Technology Acceptance Model
ERIC Educational Resources Information Center
Lee, Yi-Hsuan; Hsieh, Yi-Chuan; Chen, Yen-Hsun
2013-01-01
The purpose of this study is to apply the technology acceptance model to examine the employees' attitudes and acceptance of electronic learning (e-learning) systems in organisations. This study examines four factors (organisational support, computer self-efficacy, prior experience and task equivocality) that are believed to influence employees'…
[Factors on internet game addiction among adolescents].
Park, Hyun Sook; Kwon, Yun Hee; Park, Kyung-Min
2007-08-01
The purpose of this study was to explore factors related to internet game addiction for adolescents. This study was a cross-sectional survey; data were collected through self-report questionnaires and analyzed using the SPSS program. In logistic regression analysis, the risk of being addicted to internet games was 2.22 times higher in males than females. Adolescents with low and middle academic performance also had a higher risk (2.08 times and 2.54 times, respectively) of becoming addicted to internet games. For the location of the computer, the risk of becoming addicted to internet games was 0.01 times lower when the computer was in the living room or a sibling's room than in the adolescent's own room. The risk of becoming addicted to internet games was 1.18 times higher with greater usage time of internet games, 0.49 times lower with a more accepting and autonomous parental rearing attitude, and 0.02 times lower in the high self-efficacy group than in the low group. The results of this study suggest that there are noticeable relationships between internet game addiction and gender, academic performance, location of the computer, usage time of internet games, parents' rearing attitude, and self-efficacy.
NASA Astrophysics Data System (ADS)
Binti Shamsuddin, Norsila
Technology advancement and development in a higher learning institution is a chance for students to be motivated to learn the information technology areas in depth. Students should take hold of the opportunity to blend their skills with these technologies in preparation for graduation. The curriculum itself can raise students' interest and persuade them to be directly involved in the evolution of the technology. The aim of this study is to see how deep the students' involvement is, as well as their acceptance of the technology used in Computer Graphics and Image Processing subjects. The study covers Bachelor students in the Faculty of Industrial Information Technology (FIIT), Universiti Industri Selangor (UNISEL): Bac. in Multimedia Industry, BSc. Computer Science, and BSc. Computer Science (Software Engineering). This study utilizes the Unified Theory of Acceptance and Use of Technology (UTAUT) to further validate the model and enhance our understanding of the adoption of Computer Graphics and Image Processing technologies. Four (4) of the eight (8) independent factors in UTAUT will be studied with respect to the dependent factor.
Neural mechanisms of economic commitment in the human medial prefrontal cortex
Tsetsos, Konstantinos; Wyart, Valentin; Shorkey, S Paul; Summerfield, Christopher
2014-01-01
Neurobiologists have studied decisions by offering successive, independent choices between goods or gambles. However, choices often have lasting consequences, as when investing in a house or choosing a partner. Here, humans decided whether to commit (by acceptance or rejection) to prospects that provided sustained financial return. BOLD signals in the rostral medial prefrontal cortex (rmPFC) encoded stimulus value only when acceptance or rejection was deferred into the future, suggesting a role in integrating value signals over time. By contrast, the dorsal anterior cingulate cortex (dACC) encoded stimulus value only when participants rejected (or deferred accepting) a prospect. dACC BOLD signals reflected two decision biases, deferring commitments until later and weighting potential losses more heavily than gains, that (paradoxically) maximised reward in this task. These findings offer fresh insights into the pressures that shape economic decisions, and into the computation of value in the medial prefrontal cortex. DOI: http://dx.doi.org/10.7554/eLife.03701.001 PMID:25333687
1978-08-01
...accepts piping geometry as one of its basic inputs; whether this geometry comes from arrangement drawings or models is of no real consequence. ...computer. Geometric data is taken from the catalogue and automatically merged with the piping geometry data. Also, fitting orientation is automatically...systems require a number of data manipulation routines to convert raw digitized data into logical pipe geometry acceptable to a computer-aided piping design
ERIC Educational Resources Information Center
Mei, Bing; Brown, Gavin T. L.; Teo, Timothy
2018-01-01
Despite the rapid proliferation of information and communication technologies, there exists a paucity of empirical research on the causes of the current low acceptance of computer-assisted language learning (CALL) by English as a foreign language (EFL) teachers in the People's Republic of China (PRC). This study aims to remedy this situation…
Frisby, Joshua; Smith, Vernon; Traub, Stephen; Patel, Vimla L
2017-01-01
Hospital Emergency Departments (EDs) frequently experience crowding. One of the factors that contributes to this crowding is the "door to doctor time", which is the time from a patient's registration to when the patient is first seen by a physician. This is also one of the Meaningful Use (MU) performance measures that emergency departments report to the Center for Medicare and Medicaid Services (CMS). Current documentation methods for this measure are inaccurate due to imprecision in manual data collection. We describe a method for automatically (in real time) and more accurately documenting the door-to-physician time. Using sensor-based technology, the distance between the physician and the computer is calculated by single-board computers installed in patient rooms that log each time a Bluetooth signal is seen from a device that the physicians carry. This distance is compared automatically with the accepted room radius to determine whether the physician was present in the room at the logged time, providing greater precision. The logged times, accurate to the second, were compared with physicians' handwritten times, showing the automatic recordings to be more precise. This real-time automatic method frees the physician from the extra cognitive load of manually recording data. This method for evaluating performance is generic and can be used in settings outside the ED, and for purposes other than measuring physician time. Copyright © 2016 Elsevier Inc. All rights reserved.
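The presence test described above reduces to comparing an estimated distance against the accepted room radius over a time-ordered log of Bluetooth sightings. A minimal sketch, with invented names and no RSSI-to-distance model:

```python
# Hedged sketch of the "physician in room" decision: a sighting
# counts only when the computed distance is within the accepted
# room radius. The distance estimation itself (from Bluetooth
# signal strength) is assumed to happen upstream.

def in_room(distance_m, room_radius_m):
    """True if the sighting falls inside the accepted room radius."""
    return distance_m <= room_radius_m

def first_seen_time(sightings, room_radius_m):
    """sightings: time-ordered list of (timestamp_s, distance_m).
    Returns the first timestamp with the physician inside the room,
    or None if never seen inside."""
    for ts, dist in sightings:
        if in_room(dist, room_radius_m):
            return ts
    return None
```

Subtracting the patient's registration time from this first-seen timestamp would yield the door-to-physician interval automatically.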
Digital Simulation Of Precise Sensor Degradations Including Non-Linearities And Shift Variance
NASA Astrophysics Data System (ADS)
Kornfeld, Gertrude H.
1987-09-01
Realistic atmospheric and Forward Looking Infrared Radiometer (FLIR) degradations were digitally simulated. Inputs to the routine are environmental observables and the FLIR specifications. It was possible to achieve realism in the thermal domain within acceptable computer time and random access memory (RAM) requirements because a shift-variant recursive convolution algorithm that well describes thermal properties was invented, and because each picture element (pixel) carries radiative temperature, a materials parameter, and range and altitude information. The computer generation steps start with the image synthesis of an undegraded scene; atmospheric and sensor degradation follow. The final result is a realistic representation of an image seen on the display of a specific FLIR.
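A shift-variant recursive convolution can be illustrated in one dimension with a single-pole smoother whose coefficient varies per pixel (for example, with that pixel's range or altitude). This is only an assumed, minimal form of the idea, not the paper's actual algorithm:

```python
import numpy as np

# Illustrative 1-D shift-variant recursive (IIR) blur:
#   y[i] = (1 - a[i]) * x[i] + a[i] * y[i-1]
# where a[i] varies with position. A recursive formulation costs
# O(n) regardless of blur width, which is why such filters fit
# tight compute and memory budgets.

def shift_variant_blur(x, a):
    """x: signal; a: per-sample smoothing coefficient in [0, 1)."""
    y = np.empty_like(x, dtype=float)
    prev = x[0]
    for i, (xi, ai) in enumerate(zip(x, a)):
        prev = (1.0 - ai) * xi + ai * prev
        y[i] = prev
    return y
```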
Delayed flap approach procedures for noise abatement and fuel conservation
NASA Technical Reports Server (NTRS)
Edwards, F. G.; Bull, J. S.; Foster, J. D.; Hegarty, D. M.; Drinkwater, F. J., III
1976-01-01
The NASA/Ames Research Center is currently investigating the delayed flap approach during which pilot actions are determined and prescribed by an onboard digital computer. The onboard digital computer determines the proper timing for the deployment of the landing gear and flaps based on the existing winds and airplane gross weight. Advisory commands are displayed to the pilot. The approach is flown along the conventional ILS glide slope but is initiated at a higher airspeed and in a clean aircraft configuration that allows for low thrust and results in reduced noise and fuel consumption. Topics discussed include operational procedures, pilot acceptability of these procedures, and fuel/noise benefits resulting from flight tests and simulation.
Fenna, D
1977-09-01
For nearly two decades, the development of computerized information systems has struggled for acceptable compromises between the unattainable "total system" and the unacceptable separate applications. Integration of related applications is essential if the computer is to be exploited fully, yet relative simplicity is necessary for systems to be implemented in a reasonable time-scale. This paper discusses a system being progressively developed from minimal beginnings but which, from the outset, had a highly flexible and fully integrated system basis. The system is for batch processing, but can accommodate on-line data input; it is similar in its approach to many transaction-processing real-time systems.
A computer controlled signal preprocessor for laser fringe anemometer applications
NASA Technical Reports Server (NTRS)
Oberle, Lawrence G.
1987-01-01
The operation of most commercially available laser fringe anemometer (LFA) counter-processors assumes that adjustments to the signal processing are made independent of the computer used to reduce the acquired data. Not only does the researcher want a record of these parameters attached to the acquired data, but changes in flow conditions generally require that these settings be changed to improve data quality. Because of this limitation, on-line modification of the data acquisition parameters can be difficult and time consuming. A computer-controlled signal preprocessor has been developed which makes this optimization of the photomultiplier signal possible as a normal part of the data acquisition process. It allows computer control of the filter selection, signal gain, and photomultiplier voltage. The raw signal from the photomultiplier tube is input to the preprocessor which, under the control of a digital computer, filters the signal and amplifies it to an acceptable level. The counter-processor used at Lewis Research Center generates the particle interarrival times, as well as the time-of-flight of the particle through the probe volume. The signal preprocessor allows computer control of the acquisition of these data. Through the preprocessor, the computer can also control the handshaking signals for the interface between itself and the counter-processor. Finally, the signal preprocessor splits the pedestal from the signal before filtering, monitors the photomultiplier dc current, sends a signal proportional to this current to the computer through an analog-to-digital converter, and provides an alarm if the current exceeds a predefined maximum. Complete drawings and explanations are provided in the text, as well as a sample interface program for use with the data acquisition software.
10 CFR 35.657 - Therapy-related computer systems.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At a...
10 CFR 35.657 - Therapy-related computer systems.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At a...
10 CFR 35.657 - Therapy-related computer systems.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At a...
10 CFR 35.657 - Therapy-related computer systems.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 1 2011-01-01 2011-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At a...
10 CFR 35.657 - Therapy-related computer systems.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At a...
A Component-based Programming Model for Composite, Distributed Applications
NASA Technical Reports Server (NTRS)
Eidson, Thomas M.; Bushnell, Dennis M. (Technical Monitor)
2001-01-01
The nature of scientific programming is evolving to larger, composite applications that are composed of smaller element applications. These composite applications are more frequently being targeted for distributed, heterogeneous networks of computers. They are most likely programmed by a group of developers. Software component technology and computational frameworks are being proposed and developed to meet the programming requirements of these new applications. Historically, programming systems have had a hard time being accepted by the scientific programming community. In this paper, a programming model is outlined that attempts to organize the software component concepts and fundamental programming entities into programming abstractions that will be better understood by the application developers. The programming model is designed to support computational frameworks that manage many of the tedious programming details, but also that allow sufficient programmer control to design an accurate, high-performance application.
Kingston, Dawn; McDonald, Sheila; Biringer, Anne; Austin, Marie-Paule; Hegadoren, Kathy; McDonald, Sarah; Giallo, Rebecca; Ohinmaa, Arto; Lasiuk, Gerri; MacQueen, Glenda; Sword, Wendy; Lane-Smith, Marie; van Zanten, Sander Veldhuyzen
2014-01-02
Stress, depression, and anxiety affect 15% to 25% of pregnant women. However, substantial barriers to psychosocial assessment exist, resulting in less than 20% of prenatal care providers assessing and treating mental health problems. Moreover, pregnant women are often reluctant to disclose their mental health concerns to a healthcare provider. Identifying screening and assessment tools and procedures that are acceptable to both women and service providers, cost-effective, and clinically useful is needed. The primary objective of this randomized, parallel-group, superiority trial is to evaluate the feasibility and acceptability of a computer tablet-based prenatal psychosocial assessment (e-screening) compared to paper-based screening. Secondary objectives are to compare the two modes of screening on: (1) the level of detection of prenatal depression and anxiety symptoms and psychosocial risk; (2) the level of disclosure of symptoms; (3) the factors associated with feasibility, acceptability, and disclosure; (4) the psychometric properties of the e-version of the assessment tools; and (5) cost-effectiveness. A sample of 542 women will be recruited from large, primary care maternity clinics and a high-risk antenatal unit in an urban Canadian city. Pregnant women are eligible to participate if they: (1) receive care at one of the recruitment sites; (2) are able to speak/read English; (3) are willing to be randomized to e-screening; and (4) are willing to participate in a follow-up diagnostic interview within 1 week of recruitment. Allocation is by computer-generated randomization. Women in the intervention group will complete an online psychosocial assessment on a computer tablet, while those in the control group will complete the same assessment in paper-based form. All women will complete baseline questionnaires at the time of recruitment and will participate in a diagnostic interview within 1 week of recruitment. 
Research assistants conducting diagnostic interviews and physicians will be blinded. A qualitative descriptive study involving healthcare providers from the recruitment sites and women will provide data on feasibility and acceptability of the intervention. We hypothesize that mental health e-screening in primary care maternity settings and high-risk antenatal units will be as or more feasible, acceptable, and capable of detecting depression, anxiety, and psychosocial risk compared to paper-based screening. ClinicalTrials.gov Identifier: NCT01899534.
Edwards, Sandra L; Slattery, Martha L; Murtaugh, Maureen A; Edwards, Roger L; Bryner, James; Pearson, Mindy; Rogers, Amy; Edwards, Alison M; Tom-Orme, Lillian
2007-06-01
This article describes the development and usability of an audio computer-assisted self-interviewing (ACASI) questionnaire created to collect dietary, physical activity, medical history, and other lifestyle data in a population of American Indians. Study participants were part of a cohort of American Indians living in the southwestern United States. Data were collected between March 2004 and July 2005. Information for evaluating questionnaire usability and acceptability was collected from three different sources: baseline study data, auxiliary background data, and a short questionnaire administered to a subset of study participants. For the subset of participants, 39.6% reported not having used a computer in the past year. The ACASI questionnaires were well accepted: 96.0% of the subset of participants reported finding them enjoyable to use, 97.2% reported that they were easy to use, and 82.6% preferred them for future questionnaires. A lower educational level and infrequent computer use in the past year were predictors of having usability trouble. These results indicate that the ACASI questionnaire is both an acceptable and a preferable mode of data collection in this population.
Onboard Short Term Plan Viewer
NASA Technical Reports Server (NTRS)
Hall, Tim; LeBlanc, Troy; Ulman, Brian; McDonald, Aaron; Gramm, Paul; Chang, Li-Min; Keerthi, Suman; Kivlovitz, Dov; Hadlock, Jason
2011-01-01
Onboard Short Term Plan Viewer (OSTPV) is a computer program for electronic display of mission plans and timelines, both aboard the International Space Station (ISS) and in ISS ground control stations located in several countries. OSTPV was specifically designed both (1) for use within the limited ISS computing environment and (2) to be compatible with computers used in ground control stations. OSTPV supplants a prior system in which, aboard the ISS, timelines were printed on paper and incorporated into files that also contained other paper documents. Hence, the introduction of OSTPV has both reduced the consumption of resources and saved time in updating plans and timelines. OSTPV accepts, as input, the mission timeline output of a legacy, print-oriented, UNIX-based program called "Consolidated Planning System" and converts the timeline information for display in an interactive, dynamic, Windows Web-based graphical user interface that is used by both the ISS crew and ground control teams in real time. OSTPV enables the ISS crew to electronically indicate execution of timeline steps, launch electronic procedures, and efficiently report to ground control teams on the statuses of ISS activities, all by use of laptop computers aboard the ISS.
Medical Signal-Conditioning and Data-Interface System
NASA Technical Reports Server (NTRS)
Braun, Jeffrey; Jacobus, Charles; Booth, Scott; Suarez, Michael; Smith, Derek; Hartnagle, Jeffrey; LePrell, Glenn
2006-01-01
A general-purpose portable, wearable electronic signal-conditioning and data-interface system is being developed for medical applications. The system can acquire multiple physiological signals (e.g., electrocardiographic, electroencephalographic, and electromyographic signals) from sensors on the wearer's body, digitize those signals that are received in analog form, preprocess the resulting data, and transmit the data to one or more remote locations via a radio-communication link and/or the Internet. The system includes a computer running data-object-oriented software that can be programmed to configure the system to accept almost any analog or digital input signals from medical devices. The computing hardware and software implement a general-purpose data-routing-and-encapsulation architecture that supports tagging of input data and routing the data in a standardized way through the Internet and other modern packet-switching networks to one or more computers for review by physicians. The architecture supports multiple-site buffering of data for redundancy and reliability, and supports both real-time and slower-than-real-time collection, routing, and viewing of signal data. Routing and viewing stations support insertion of automated analysis routines to aid in encoding, analysis, viewing, and diagnosis.
Computer and Internet use among Undergraduate Medical Students in Iran
Ayatollahi, Ali; Ayatollahi, Jamshid; Ayatollahi, Fatemeh; Ayatollahi, Reza; Shahcheraghi, Seyed Hossein
2014-01-01
Objective: Although computer technologies are now widely used in medicine, little is known about their use among medical students in Iran. The aim of this study was to determine medical students' competence in and access to computers and the internet. Methods: In this descriptive study, fifth-year medical students of Shahid Sadoughi University of Medical Sciences, Yazd, Iran, were asked to answer a questionnaire during a time-tabled lecture slot. The chi-square test was used to compare the frequency of computer and internet use between the two genders, and the level of statistical significance for all tests was set at 0.05. Results: All the students had a personal computer and internet access. There were no statistically significant differences between men and women in computer and internet access, use of wireless devices to access the internet, having a laptop and an e-mail address, or the difficulties encountered using the internet. The main reason for low internet utilization was the slow speed of data transfer. Conclusions: Because of the wide range of computer skills and internet knowledge among medical students in our institution, a single computer and internet course for all students would be neither useful nor accepted. PMID:25225525
The architecture of a distributed medical dictionary.
Fowler, J; Buffone, G; Moreau, D
1995-01-01
Exploiting high-speed computer networks to provide a national medical information infrastructure is a goal for medical informatics. The Distributed Medical Dictionary under development at Baylor College of Medicine is a model for an architecture that supports collaborative development of a distributed online medical terminology knowledge-base. A prototype is described that illustrates the concept. Issues that must be addressed by such a system include high availability, acceptable response time, support for local idiom, and control of vocabulary.
NASA Technical Reports Server (NTRS)
Lawrence, Stella
1992-01-01
This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure-free operation of a computer program for a specified time and environment.
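The definition above is commonly made concrete with the constant-failure-rate (exponential) model, R(t) = exp(-λt). This is a standard textbook model, not one stated in the abstract; the failure rate and mission time below are illustrative values only.

```python
import math

def reliability(failure_rate_per_hour: float, hours: float) -> float:
    """Probability of failure-free operation over `hours`, assuming a
    constant failure rate lambda (exponential model): R(t) = exp(-lambda*t)."""
    return math.exp(-failure_rate_per_hour * hours)

# e.g. an assumed rate of 1e-4 failures/hour over a 100-hour mission
r = reliability(1e-4, 100.0)  # ≈ 0.990
```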
ERIC Educational Resources Information Center
Butler, Rory
2013-01-01
Internet-enabled mobile devices have increased the accessibility of learning content for students. Given the ubiquitous nature of mobile computing technology, a thorough understanding of the acceptance factors that impact a learner's intention to use mobile technology as an augment to their studies is warranted. Student acceptance of mobile…
ERIC Educational Resources Information Center
Huntington, Heidi; Worrell, Tracy
2013-01-01
Studies show that use of computer-based information communication technologies (ICTs) can have positive impacts on student motivation and learning. The present study examines the issue of ICT adoption in the classroom by expanding the Technology Acceptance Model to identify factors that contribute to teacher acceptance and use of these…
Leyva, Francisco J.; Bakshi, Rahul P.; Fuchs, Edward J.; Li, Liye; Caffo, Brian S.; Goldsmith, Arthur J.; Ventuneac, Ana; Carballo-Diéguez, Alex; Du, Yong; Leal, Jeffrey P.; Lee, Linda A.; Torbenson, Michael S.
2013-01-01
Rectally applied antiretroviral microbicides for preexposure prophylaxis (PrEP) of HIV infection are currently in development. Since enemas (rectal douches) are commonly used by men who have sex with men prior to receptive anal intercourse, a microbicide enema could enhance PrEP adherence by fitting seamlessly within the usual sexual practices. We assessed the distribution, safety, and acceptability of three enema types—hyperosmolar (Fleet), hypoosmolar (distilled water), and isoosmolar (Normosol-R)—in a crossover design. Nine men received each enema type in random order. Enemas were radiolabeled [99mTc-diethylene triamine pentaacetic acid (DTPA)] to assess enema distribution in the colon using single photon emission computed tomography/computed tomography (SPECT/CT) imaging. Plasma 99mTc-DTPA indicated mucosal permeability. Sigmoidoscopic colon tissue biopsies were taken to assess injury as well as tissue penetration of the 99mTc-DTPA. Acceptability was assessed after each product use and at the end of the study. SPECT/CT imaging showed that the isoosmolar enema had greater proximal colonic distribution (up to the splenic flexure) and greater luminal and colon tissue concentrations of 99mTc-DTPA when compared to the other enemas (p<0.01). Colon biopsies also showed that only the hyperosmolar enema caused sloughing of the colonic epithelium (p<0.05). In permeability testing, the hypoosmolar enema had higher plasma 99mTc-DTPA 24-h area under the concentration-time curve and peak concentration compared to the hyperosmolar and isoosmolar enemas, respectively. Acceptability was generally good with no clear preferences among the three enema types. The isoosmolar enema was superior or similar to the other enemas in all categories and is a good candidate for further development as a rectal microbicide vehicle. PMID:23885722
Agreement and Reliability of Tinnitus Loudness Matching and Pitch Likeness Rating
Hoare, Derek J.; Edmondson-Jones, Mark; Gander, Phillip E.; Hall, Deborah A.
2014-01-01
The ability to reproducibly match tinnitus loudness and pitch is important to research and clinical management. Here we examine agreement and reliability of tinnitus loudness matching and pitch likeness ratings when using a computer-based method to measure the tinnitus spectrum and estimate a dominant tinnitus pitch, using tonal or narrowband sounds. Group-level data indicated a significant effect of time between test sessions 1 and 2 for loudness matching, likely reflecting procedural or perceptual learning, which needs to be accounted for in study design. Pitch likeness rating across multiple frequencies appeared inherently more variable, with no systematic effect of time. Dominant pitch estimates reached a level of clinical acceptability when sessions were spaced two weeks apart. However, when dominant tinnitus pitch assessments were separated by three months, acceptable agreement was achieved only for group mean data, not for individual estimates. This has implications for the prescription of some sound-based interventions that rely on accurate measures of individual dominant tinnitus pitch. PMID:25478690
Campbell, Aimee N C; Nunes, Edward V; Pavlicova, Martina; Hatch-Maillette, Mary; Hu, Mei-Chen; Bailey, Genie L; Sugarman, Dawn E; Miele, Gloria M; Rieckmann, Traci; Shores-Wilson, Kathy; Turrigiano, Eva; Greenfield, Shelly F
2015-06-01
Digital technologies show promise for increasing treatment accessibility and improving quality of care, but little is known about gender differences. This secondary analysis uses data from a multi-site effectiveness trial of a computer-assisted behavioral intervention, conducted within NIDA's National Drug Abuse Clinical Trials Network, to explore gender differences in intervention acceptability and treatment outcomes. Men (n=314) and women (n=192) were randomly assigned to 12 weeks of treatment-as-usual (TAU) or modified TAU+Therapeutic Education System (TES), whereby TES substituted for 2 hours of TAU per week. TES is composed of 62 Web-delivered, multimedia modules, covering skills for achieving and maintaining abstinence plus prize-based incentives contingent on abstinence and treatment adherence. Outcomes were: (1) abstinence from drugs and heavy drinking in the last 4 weeks of treatment, (2) retention, (3) social functioning, and (4) drug and alcohol craving. Acceptability was the mean score across five indicators (i.e., interesting, useful, novel, easy to understand, and satisfaction). Gender did not moderate the effect of treatment on any outcome. Women reported higher acceptability scores at week 4 (p=.02), but no gender differences were detected at weeks 8 or 12. Acceptability was positively associated with abstinence, but only among women (p=.01). Findings suggest that men and women derive similar benefits from participating in a computer-assisted intervention, a promising outcome as technology-based treatments expand. Acceptability was associated with abstinence outcomes among women. Future research should explore characteristics of women who report less satisfaction with this modality of treatment and ways to improve overall acceptability. Copyright © 2015 Elsevier Inc. All rights reserved.
Identifying the Factors that Influence Computer Use in the Early Childhood Classroom
ERIC Educational Resources Information Center
Edwards, Suzy
2005-01-01
Computers have become an increasingly accepted learning tool in the early childhood classroom. Despite initial concerns regarding the effect of computers on children's development, past research has indicated that computer use by young children can support their learning and developmental outcomes (Siraj-Blatchford & Whitebread, 2003; Yelland,…
Advanced Certification Program for Computer Graphic Specialists. Final Performance Report.
ERIC Educational Resources Information Center
Parkland Coll., Champaign, IL.
A pioneer program in computer graphics was implemented at Parkland College (Illinois) to meet the demand for specialized technicians to visualize data generated on high performance computers. In summer 1989, 23 students were accepted into the pilot program. Courses included C programming, calculus and analytic geometry, computer graphics, and…
21 CFR 870.1110 - Blood pressure computer.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Blood pressure computer. 870.1110 Section 870.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED... computer. (a) Identification. A blood pressure computer is a device that accepts the electrical signal from...
21 CFR 870.1110 - Blood pressure computer.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Blood pressure computer. 870.1110 Section 870.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED... computer. (a) Identification. A blood pressure computer is a device that accepts the electrical signal from...
21 CFR 870.1110 - Blood pressure computer.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Blood pressure computer. 870.1110 Section 870.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED... computer. (a) Identification. A blood pressure computer is a device that accepts the electrical signal from...
21 CFR 870.1110 - Blood pressure computer.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Blood pressure computer. 870.1110 Section 870.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED... computer. (a) Identification. A blood pressure computer is a device that accepts the electrical signal from...
An interactive data management and analysis system for clinical investigators.
Groner, G F; Hopwood, M D; Palley, N A; Sibley, W L; Baker, W R; Christopher, T G; Thompson, H K
1978-09-01
An interactive minicomputer-based system has been developed that enables the clinical research investigator to personally explore and analyze his research data and, as a consequence of these explorations, to acquire more information. This system, which does not require extensive training or computer programming, enables the investigator to describe his data interactively in his own terms, enter data values while having them checked for validity, store time-oriented patient data in a carefully controlled on-line data base, retrieve data by patient, variable, and time, create subsets of patients with common characteristics, perform statistical analyses, and produce tables and graphs. It also permits data to be transferred to and from other computers. The system is well accepted and is being used by a variety of medical specialists at the three clinical research centers where it is operational. Reported benefits include less elapsed and nonproductive time, more thorough analysis of more data, greater and earlier insight into the meaning of research data, and increased publishable results.
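The core capabilities described above (validity-checked entry, time-oriented storage, retrieval by patient, variable, and time) can be sketched as a minimal data structure. This is a hypothetical illustration of the concept, not the system's actual implementation; all names and values are invented.

```python
from collections import defaultdict

class TimeOrientedStore:
    """Minimal sketch of a time-oriented patient data base:
    values are indexed by (patient, variable, time)."""
    def __init__(self):
        self._data = defaultdict(dict)   # (patient, variable) -> {time: value}

    def enter(self, patient, variable, time, value, valid=lambda v: True):
        """Enter a value while checking it for validity."""
        if not valid(value):
            raise ValueError(f"{variable}={value!r} failed validity check")
        self._data[(patient, variable)][time] = value

    def retrieve(self, patient, variable):
        """All values for one patient and variable, in time order."""
        return dict(sorted(self._data[(patient, variable)].items()))

store = TimeOrientedStore()
store.enter("pt01", "glucose", 1, 95, valid=lambda v: 20 <= v <= 600)
store.enter("pt01", "glucose", 2, 110)
series = store.retrieve("pt01", "glucose")   # {1: 95, 2: 110}
```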
Cloud GPU-based simulations for SQUAREMR.
Kantasis, George; Xanthis, Christos G; Haris, Kostas; Heiberg, Einar; Aletras, Anthony H
2017-01-01
Quantitative Magnetic Resonance Imaging (MRI) is a research tool, used more and more in clinical practice, as it provides objective information with respect to the tissues being imaged. Pixel-wise T1 quantification (T1 mapping) of the myocardium is one such application with diagnostic significance. A number of mapping sequences have been developed for myocardial T1 mapping with a wide range in terms of measurement accuracy and precision. Furthermore, measurement results obtained with these pulse sequences are affected by errors introduced by the particular acquisition parameters used. SQUAREMR is a new method which has the potential of improving the accuracy of these mapping sequences through the use of massively parallel simulations on Graphical Processing Units (GPUs) by taking into account different acquisition parameter sets. This method has been shown to be effective in myocardial T1 mapping; however, execution times may exceed 30 min, which is prohibitively long for clinical applications. The purpose of this study was to accelerate the construction of SQUAREMR's multi-parametric database to more clinically acceptable levels. The aim of this study was to develop a cloud-based cluster in order to distribute the computational load to several GPU-enabled nodes and accelerate SQUAREMR. This would accommodate high demands for computational resources without the need for major upfront equipment investment. Moreover, the parameter space explored by the simulations was optimized in order to reduce the computational load without compromising the T1 estimates compared to a non-optimized parameter space approach. A cloud-based cluster with 16 nodes resulted in a speedup of up to 13.5 times compared to a single-node execution. Finally, the optimized parameter set approach allowed for an execution time of 28 s using the 16-node cluster, without compromising the T1 estimates by more than 10 ms.
The developed cloud-based cluster and optimization of the parameter set reduced the execution time of the simulations involved in constructing the SQUAREMR multi-parametric database thus bringing SQUAREMR's applicability within time frames that would be likely acceptable in the clinic. Copyright © 2016 Elsevier Inc. All rights reserved.
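The distribution strategy described above (dealing a multi-parametric simulation grid out to several nodes) can be sketched as follows. The grid values and the `partition` helper are illustrative assumptions, not SQUAREMR's actual parameter space or scheduler.

```python
from itertools import product

def partition(work, n_nodes):
    """Deal the simulation grid out as n_nodes near-equal chunks."""
    return [work[i::n_nodes] for i in range(n_nodes)]

# Hypothetical multi-parametric grid: candidate T1 values crossed with one
# acquisition parameter (values invented for illustration).
t1_values = list(range(100, 2100, 100))    # ms
flip_angles = list(range(10, 60, 10))      # degrees
grid = list(product(t1_values, flip_angles))

chunks = partition(grid, 16)               # one chunk per GPU-enabled node
# Each node simulates its chunk independently; the ideal speedup is bounded
# by the number of nodes and by the size of the largest chunk.
```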
Agreement processing and attraction errors in aging: evidence from subject-verb agreement in German.
Reifegerste, Jana; Hauer, Franziska; Felser, Claudia
2017-11-01
Effects of aging on lexical processing are well attested, but the picture is less clear for grammatical processing. Where age differences emerge, these are usually ascribed to working-memory (WM) decline. Previous studies on the influence of WM on agreement computation have yielded inconclusive results, and work on aging and subject-verb agreement processing is lacking. In two experiments (Experiment 1: timed grammaticality judgment, Experiment 2: self-paced reading + WM test), we investigated older (OA) and younger (YA) adults' susceptibility to agreement attraction errors. We found longer reading latencies and judgment reaction times (RTs) for OAs. Further, OAs, particularly those with low WM scores, were more accepting of sentences with attraction errors than YAs. OAs showed longer reading latencies for ungrammatical sentences, again modulated by WM, than YAs. Our results indicate that OAs have greater difficulty blocking intervening nouns from interfering with the computation of agreement dependencies. WM can modulate this effect.
Computational Multiqubit Tunnelling in Programmable Quantum Annealers
2016-08-25
ARTICLE Received 3 Jun 2015 | Accepted 26 Nov 2015 | Published 7 Jan 2016 Computational multiqubit tunnelling in programmable quantum annealers...state itself. Quantum tunnelling has been hypothesized as an advantageous physical resource for optimization in quantum annealing. However, computational ...qubit tunnelling plays a computational role in a currently available programmable quantum annealer. We devise a probe for tunnelling, a computational
A Computer-Controlled Laser Bore Scanner
NASA Astrophysics Data System (ADS)
Cheng, Charles C.
1980-08-01
This paper describes the design and engineering of a laser scanning system for production applications. The laser scanning techniques, the timing control, the logic design of the pattern recognition subsystem, the digital computer servo control for the loading and unloading of parts, and the laser probe rotation and its synchronization will be discussed. The laser inspection machine is designed to automatically inspect the surface of precision-bored holes, such as those in automobile master cylinders, without contacting the machined surface. Although the controls are relatively sophisticated, operation of the laser inspection machine is simple. A laser light beam from a commercially available gas laser, directed through a probe, scans the entire surface of the bore. Reflected light, picked up through optics by photoelectric sensors, generates signals that are fed to a mini-computer for processing. A pattern-recognition program in the computer determines acceptance or rejection of the part being inspected. The system's acceptance specifications are adjustable and are set to the user's established tolerances. However, the computer-controlled laser system is capable of defining from 10 to 75 rms surface finish, and voids or flaws from 0.0005 to 0.020 inch. Following the successful demonstration with an engineering prototype, the described laser machine has proved its capability to consistently ensure high-quality master brake cylinders. It thus provides a safety improvement for the automotive braking system. Flawless, smooth cylinder bores eliminate premature wearing of the rubber seals, resulting in a longer-lasting master brake cylinder and a safer and more reliable automobile. The results obtained from use of this system, which has been in operation for about a year, replacing a tedious manual operation on one of the high-volume lines at the Bendix Hydraulics Division, have been very satisfactory.
The Adam language: Ada extended with support for multiway activities
NASA Technical Reports Server (NTRS)
Charlesworth, Arthur
1993-01-01
The Adam language is an extension of Ada that supports multiway activities, which are cooperative activities involving two or more processes. This support is provided by three new constructs: diva procedures, meet statements, and multiway accept statements. Diva procedures are recursive generic procedures having a particular restrictive syntax that facilitates translation for parallel computers. Meet statements and multiway accept statements provide two ways to express a multiway rendezvous, which is an n-way rendezvous generalizing Ada's 2-way rendezvous. While meet statements tend to have simpler rules than multiway accept statements, the latter approach is a more straightforward extension of Ada. The only nonnull statements permitted within meet statements and multiway accept statements are calls on instantiated diva procedures. A call on an instantiated diva procedure is also permitted outside a multiway rendezvous; thus sequential Adam programs using diva procedures can be written. Adam programs are translated into Ada programs appropriate for use on parallel computers.
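The n-way rendezvous that Adam's meet and multiway accept statements express can be illustrated by analogy in Python, where a barrier with an attached action plays the role of the rendezvous point. This is a conceptual sketch only, not Adam or Ada syntax; the participant count and values are invented.

```python
import threading

# Three "processes" meet at an n-way rendezvous: each blocks until all have
# arrived, then a shared action runs exactly once over their contributions.
N = 3
inputs = [0] * N
results = []
barrier = threading.Barrier(N, action=lambda: results.append(sum(inputs)))

def participant(i, value):
    inputs[i] = value      # contribute a value before the rendezvous
    barrier.wait()         # the multiway rendezvous point

threads = [threading.Thread(target=participant, args=(i, (i + 1) * 10))
           for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# results == [60]: the rendezvous action saw all three contributions
```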
Abernethy, Amy P; Herndon, James E; Wheeler, Jane L; Day, Jeannette M; Hood, Linda; Patwardhan, Meenal; Shaw, Heather; Lyerly, Herbert Kim
2009-06-01
Programmed, notebook-style, personal computers ("e/Tablets") can collect symptom and quality-of-life (QOL) data at the point of care. Patients use an e/Tablet in the clinic waiting area to complete electronic surveys. Information then travels wirelessly to a server, which generates a real-time report for use during the clinical visit. The objective of this study was to determine whether academic oncology patients find e/Tablets logistically acceptable and a satisfactory means of communicating symptoms to providers during repeated clinic visits. Sixty-six metastatic breast cancer patients at Duke Breast Cancer Clinic participated. E/Tablets were customized to electronically administer a satisfaction/acceptability survey, several validated questionnaires, and the Patient Care Monitor (PCM) review of symptoms survey. At each of the four visits within six months, participants completed the patient satisfaction/acceptability survey, which furnished data for the current analysis. Participant demographics were: mean age of 54 years, 77% Caucasian, and 47% with less than a college education. Participants reported that e/Tablets were easy to read (94%), easy to navigate (99%), and had a comfortable weight (90%); they found it easy to respond to questions using the e/Tablet (98%). Seventy-five percent initially indicated satisfaction with PCM for reporting symptoms; this proportion increased over time. By the last visit, 88% of participants indicated that they would recommend the PCM to other patients; 74% felt that the e/Tablet helped them remember symptoms to report to their clinician. E/Tablets offered a feasible and acceptable method for collecting longitudinal patient-reported symptom and QOL data within an academic, tertiary care, breast cancer clinic.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roper, J; Bradshaw, B; Godette, K
Purpose: To create a knowledge-based algorithm for prostate LDR brachytherapy treatment planning that standardizes plan quality using seed arrangements tailored to individual physician preferences while being fast enough for real-time planning. Methods: A dataset of 130 prior cases was compiled for a physician with an active prostate seed implant practice. Ten cases were randomly selected to test the algorithm. Contours from the 120 library cases were registered to a common reference frame. Contour variations were characterized on a point-by-point basis using principal component analysis (PCA). A test case was converted to PCA vectors using the same process and then compared with each library case using a Mahalanobis distance to evaluate similarity. Rank order PCA scores were used to select the best-matched library case. The seed arrangement was extracted from the best-matched case and used as a starting point for planning the test case. Computational time was recorded. Any subsequent modifications were recorded that required input from a treatment planner to achieve an acceptable plan. Results: The computational time required to register contours from a test case and evaluate PCA similarity across the library was approximately 10 s. Five of the ten test cases did not require any seed additions, deletions, or moves to obtain an acceptable plan. The remaining five test cases required on average 4.2 seed modifications. The time to complete manual plan modifications was less than 30 s in all cases. Conclusion: A knowledge-based treatment planning algorithm was developed for prostate LDR brachytherapy based on principal component analysis. Initial results suggest that this approach can be used to quickly create treatment plans that require few if any modifications by the treatment planner. In general, test case plans have seed arrangements which are very similar to prior cases, and thus are inherently tailored to physician preferences.
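The PCA-and-Mahalanobis matching step described above can be sketched numerically. The library here is random stand-in data (the real input is registered contour points), and the feature count and number of retained components are assumptions; since PCA components are uncorrelated, the Mahalanobis distance in score space reduces to a variance-weighted Euclidean distance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in library: 120 prior cases, each a flattened contour-point vector
# assumed already registered to a common reference frame (60 features here).
library = rng.normal(size=(120, 60))

# PCA via SVD of the mean-centred library.
mean = library.mean(axis=0)
centred = library - mean
_, s, vt = np.linalg.svd(centred, full_matrices=False)
k = 10                                     # retained principal components
scores = centred @ vt[:k].T                # per-case PCA scores
var = s[:k] ** 2 / (len(library) - 1)      # variance along each component

def best_match(test_case):
    """Rank library cases by Mahalanobis distance in PCA score space
    (diagonal covariance, since components are uncorrelated)."""
    z = (test_case - mean) @ vt[:k].T
    d2 = ((scores - z) ** 2 / var).sum(axis=1)
    return int(np.argmin(d2))

assert best_match(library[7]) == 7         # a library case matches itself
```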
Students' Acceptance of Tablet PCs in Italian High Schools: Profiles and Differences
ERIC Educational Resources Information Center
Villani, Daniela; Morganti, Laura; Carissoli, Claudia; Gatti, Elena; Bonanomi, Andrea; Cacciamani, Stefano; Confalonieri, Emanuela; Riva, Giuseppe
2018-01-01
The tablet PC represents a very popular mobile computing device, and together with other technologies it is changing the world of education. This study aimed to explore the acceptance of tablet PC of Italian high school students in order to outline the typical students' profiles and to compare the acceptance conveyed in two types of use (learning…
Experimental investigation of the persuasive impact of computer generated presentation graphics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vogel, D.R.
1986-01-01
Computer generated presentation graphics are increasingly becoming a tool to aid management in communicating information and to cause an audience to accept a point of view or take action. Unfortunately, technological capability significantly exceeds current levels of user understanding and effective application. This research examines experimentally one aspect of this problem: the persuasive impact of characteristics of computer generated presentation graphics. The research was founded in theory based on the message-learning approach to persuasion. Characteristics examined were color versus black and white, text versus image enhancement, and overhead transparencies versus 35 mm slides. Treatments were presented in association with a videotaped presentation intended to persuade subjects to invest time and money in a set of time management seminars. Data were collected using pre-measure, post-measure, and post-measure follow-up questionnaires. Presentation support had a direct impact on perceptions of the presenter as well as on components of persuasion, i.e., attention, comprehension, yielding, and retention. Further, a strong positive relationship existed between enhanced perceptions of the presenter and attention and yielding.
Dynamic modeling of Tampa Bay urban development using parallel computing
Xian, G.; Crane, M.; Steinwand, D.
2005-01-01
Urban land use and land cover have changed significantly in the environs of Tampa Bay, Florida, over the past 50 years. Extensive urbanization has created substantial change to the region's landscape and ecosystems. This paper uses a dynamic urban-growth model, SLEUTH, which applies six geospatial data themes (slope, land use, exclusion, urban extent, transportation, hillshade), to study the process of urbanization and associated land use and land cover change in the Tampa Bay area. To reduce processing time and complete the modeling process within an acceptable period, the model is recoded and ported to a Beowulf cluster. The parallel-processing computer system accomplishes the massive amount of computation the modeling simulation requires; the SLEUTH calibration process for the Tampa Bay urban-growth simulation takes only 10 h of CPU time. The model predicts future land use/cover change trends for Tampa Bay from 1992 to 2025. Urban extent is predicted to double in the Tampa Bay watershed between 1992 and 2025. Results show an upward trend of urbanization at the expense of declines of 58% and 80% in agricultural and forested lands, respectively.
Application of multiphase modelling for vortex occurrence in vertical pump intake - a review
NASA Astrophysics Data System (ADS)
Samsudin, M. L.; Munisamy, K. M.; Thangaraju, S. K.
2015-09-01
Vortex formation within the pump intake is one of the common problems faced by power plant cooling water systems. This phenomenon, categorised as surface and sub-surface vortices, can lead to several operational problems and increased maintenance costs. A physical model study is recommended by published guidelines but proves time- and resource-consuming. Hence, the use of Computational Fluid Dynamics (CFD) is an attractive alternative in managing the problem. At an early stage, flow analysis was conducted using single-phase simulation and showed good agreement with observations from physical model studies. With the development of computing power, multiphase simulation further improved the accuracy of results in representing air entrainment and sub-surface vortices, which were earlier not well predicted by single-phase simulation. The purpose of this paper is to describe the application of multiphase modelling with CFD analysis for investigating vortex formation in a vertical pump intake. In applying multiphase modelling, a balance must be struck between acceptable computational time and resources and the degree of accuracy and realism expected from the analysis.
2016-04-01
the DOD will put DOD systems and data at a risk level comparable to that of their neighbors in the cloud. Just as a user browses a Web page on the...proxy servers for controlling user access to Web pages, and large-scale storage for data management. Each of these devices allows access to the...user to develop applications. Acunetics.com describes Web applications as “computer programs allowing Website visitors to submit and retrieve data
1979-08-21
Appendix 8 - Outline and Draft Material for Proposed Triservice Interim Guideline on Application of Software Acceptance Criteria....... 269 Appendix 9...AND DRAFT MATERIAL FOR PROPOSED TRISERVICE INTERIM GUIDELINE ON APPLICATION OF SOFTWARE ACCEPTANCE CRITERIA I I INTRODUCTION The purpose of this guide...contract item (CPCI) (code) 5. CPCI test plan 6. CPCI test procedures 7. CPCI test report 8. Handbooks and manuals. Although additional material does
40 CFR 86.005-17 - On-board diagnostics.
Code of Federal Regulations, 2013 CFR
2013-07-01
... other available operating parameters), and functionality checks for computer output components (proper... considered acceptable. (e) Storing of computer codes. The OBD system shall record and store in computer... monitors that can be considered continuously operating monitors (e.g., misfire monitor, fuel system monitor...
40 CFR 86.005-17 - On-board diagnostics.
Code of Federal Regulations, 2012 CFR
2012-07-01
... other available operating parameters), and functionality checks for computer output components (proper... considered acceptable. (e) Storing of computer codes. The OBD system shall record and store in computer... monitors that can be considered continuously operating monitors (e.g., misfire monitor, fuel system monitor...
Automatic design of optical systems by digital computer
NASA Technical Reports Server (NTRS)
Casad, T. A.; Schmidt, L. F.
1967-01-01
Computer program uses geometrical optical techniques and a least squares optimization method employing computing equipment for the automatic design of optical systems. It evaluates changes in various optical parameters, provides comprehensive ray-tracing, and generally determines the acceptability of the optical system characteristics.
Cohall, Alwyn T; Dini, Sheila; Senathirajah, Yalini; Nye, Andrea; Neu, Natalie; Powell, Donald; Powell, Borris; Hyden, Christel
2008-01-01
Significant advances in the treatment of human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) place a premium on early detection and linkage to care. Recognizing the need to efficiently yet comprehensively provide HIV counseling, we assessed the feasibility of using audio computer-assisted self-inventory (A-CASI) in a community-based HIV counseling and testing facility. A convenience sample of 50 adults presenting for HIV testing was recruited to complete an 85-item computerized HIV Assessment of Risk Inventory (HARI) containing domains of demographics, sexual behaviors, alcohol and substance use, emotional well-being, past experiences with HIV testing, and attitudes about taking HARI. Client acceptance rate was limited by the completion time outlined during the intake process. However, the majority of respondents who completed HARI felt that it took only a short to moderate time to complete and was easy to understand. A majority also reported a preference for using a computerized format in the future. Further, HARI identified a number of risk-taking behaviors, including unprotected anal sex and substance use prior to past sexual encounters. Additionally, more than half of the sample reported moderate to severe depressive symptoms. Those respondents who had time to complete the survey accepted the A-CASI interview, and it was successful at identifying a substantial level of risk-taking behaviors. A-CASI has the potential to guide HIV counselors in providing risk-reduction counseling and referral activities. However, results suggested the need to shorten the instrument, and further studies are needed to determine applicability in other HIV testing sites.
Lloyd, Tom; Buck, Harleah; Foy, Andrew; Black, Sara; Pinter, Antony; Pogash, Rosanne; Eismann, Bobby; Balaban, Eric; Chan, John; Kunselman, Allen; Smyth, Joshua; Boehmer, John
2017-05-01
The Penn State Heart Assistant, a web-based, tablet computer-accessed, secure application was developed to conduct a proof of concept test, targeting patient self-care activities of heart failure patients including daily medication adherence, weight monitoring, and aerobic activity. Patients (n = 12) used the tablet computer-accessed program for 30 days-recording their information and viewing a short educational video. Linear random coefficient models assessed the relationship between weight and time and exercise and time. Good medication adherence (66% reporting taking 75% of prescribed medications) was reported. Group compliance over 30 days for weight and exercise was 84 percent. No persistent weight gain over 30 days, and some indication of weight loss (slope of weight vs time was negative (-0.17; p value = 0.002)), as well as increased exercise (slope of exercise vs time was positive (0.08; p value = 0.04)) was observed. This study suggests that mobile technology is feasible, acceptable, and has potential for cost-effective opportunities to manage heart failure patients safely at home.
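The slope-versus-time relationships reported above are ordinary least-squares fits; a minimal sketch of the slope computation (variable names assumed, not from the study):

```python
def ols_slope(xs, ys):
    """Ordinary-least-squares slope of y against x (e.g., weight vs. study day)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# A negative slope, as in the weight series, indicates a downward trend over time.
trend = ols_slope([0, 1, 2, 3], [10.0, 9.0, 8.0, 7.0])
```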
Li, Jian; Bloch, Pavel; Xu, Jing; Sarunic, Marinko V; Shannon, Lesley
2011-05-01
Fourier domain optical coherence tomography (FD-OCT) provides faster line rates, better resolution, and higher sensitivity for noninvasive, in vivo biomedical imaging compared to traditional time domain OCT (TD-OCT). However, because the signal processing for FD-OCT is computationally intensive, real-time FD-OCT applications demand powerful computing platforms to deliver acceptable performance. Graphics processing units (GPUs) have been used as coprocessors to accelerate FD-OCT by leveraging their relatively simple programming model to exploit thread-level parallelism. Unfortunately, GPUs do not "share" memory with their host processors, requiring additional data transfers between the GPU and CPU. In this paper, we implement a complete FD-OCT accelerator on a consumer grade GPU/CPU platform. Our data acquisition system uses spectrometer-based detection and a dual-arm interferometer topology with numerical dispersion compensation for retinal imaging. We demonstrate that the maximum line rate is dictated by the memory transfer time and not the processing time due to the GPU platform's memory model. Finally, we discuss how the performance trends of GPU-based accelerators compare to the expected future requirements of FD-OCT data rates.
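The finding that memory transfer time, not kernel time, bounds the maximum line rate can be illustrated with a toy throughput model (the function name and numbers are hypothetical, not from the paper):

```python
def effective_line_rate(lines_per_batch, transfer_s, compute_s, pipelined=True):
    """A-lines per second when host-GPU transfers and kernels overlap (or not).

    With perfect pipelining the slower stage dominates; without it, stages add.
    """
    batch_time = max(transfer_s, compute_s) if pipelined else transfer_s + compute_s
    return lines_per_batch / batch_time

# If transfer (2 ms) exceeds compute (1 ms), faster kernels cannot raise throughput.
rate = effective_line_rate(1000, 0.002, 0.001)
```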
Prefetching in file systems for MIMD multiprocessors
NASA Technical Reports Server (NTRS)
Kotz, David F.; Ellis, Carla Schlatter
1990-01-01
The question of whether prefetching blocks of the file into the block cache can effectively reduce overall execution time of a parallel computation, even under favorable assumptions, is considered. Experiments were conducted with an interleaved file system testbed on the Butterfly Plus multiprocessor. Results of these experiments suggest that (1) the hit ratio, the accepted measure in traditional caching studies, may not be an adequate measure of performance when the workload consists of parallel computations and parallel file access patterns, (2) caching with prefetching can significantly improve the hit ratio and the average time to perform an I/O (input/output) operation, and (3) an improvement in overall execution time has been observed in most cases. In spite of these gains, prefetching sometimes results in increased execution times (a negative result, given the optimistic nature of the study). The authors explore why it is not trivial to translate savings on individual I/O requests into consistently better overall performance and identify the key problems that need to be addressed in order to improve the potential of prefetching techniques in this environment.
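The hit-ratio effect can be sketched with a simple LRU block cache whose hit ratio on a sequential access pattern jumps when the next block is prefetched. This is an illustration of the idea only, not the testbed's design:

```python
from collections import OrderedDict

def run_trace(trace, cache_size, prefetch=False):
    """Return the hit ratio of an LRU block cache, optionally prefetching block+1."""
    cache = OrderedDict()
    hits = 0

    def touch(block):
        # Insert or refresh a block; evict the least recently used on overflow.
        if block in cache:
            cache.move_to_end(block)
        else:
            cache[block] = True
            if len(cache) > cache_size:
                cache.popitem(last=False)

    for block in trace:
        if block in cache:
            hits += 1
        touch(block)
        if prefetch:
            touch(block + 1)  # one-block-ahead prefetch on every access
    return hits / len(trace)
```

On a purely sequential scan, demand fetching alone never hits, while one-block-ahead prefetching hits on every access after the first.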
Arrival Metering Precision Study
NASA Technical Reports Server (NTRS)
Prevot, Thomas; Mercer, Joey; Homola, Jeffrey; Hunt, Sarah; Gomez, Ashley; Bienert, Nancy; Omar, Faisal; Kraut, Joshua; Brasil, Connie; Wu, Minghong G.
2015-01-01
This paper describes the background, method, and results of the Arrival Metering Precision Study (AMPS) conducted in the Airspace Operations Laboratory at NASA Ames Research Center in May 2014. The simulation study measured delivery accuracy, flight efficiency, controller workload, and acceptability of time-based metering operations to a meter fix at the terminal area boundary for different resolution levels of metering delay times displayed to the air traffic controllers and different levels of airspeed information made available to the Time-Based Flow Management (TBFM) system computing the delay. The results show that the resolution of the delay countdown timer (DCT) on the controllers' display has a significant impact on the delivery accuracy at the meter fix. The 10-second-rounded and 1-minute-rounded DCT resolutions resulted in more accurate delivery than the 1-minute-truncated resolution and were preferred by the controllers. Using the speeds the controllers entered into the fourth line of the data tag to update the delay computation in TBFM in high- and low-altitude sectors increased air traffic control efficiency and reduced fuel burn for arriving aircraft during time-based metering.
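The three display resolutions compared in the study can be sketched as rounding rules; the exact display semantics are assumed here for illustration:

```python
def dct_label(delay_s, mode):
    """Delay (in seconds) as it would appear at each DCT display resolution (assumed semantics)."""
    if mode == "10s-rounded":
        return round(delay_s / 10) * 10
    if mode == "1min-rounded":
        return round(delay_s / 60) * 60
    if mode == "1min-truncated":
        return (delay_s // 60) * 60
    raise ValueError(mode)
```

A 119-second delay, for example, displays as two minutes when rounded but only one minute when truncated, which is consistent with truncation delivering the least accurate guidance.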
Distributed MRI reconstruction using Gadgetron-based cloud computing.
Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S
2015-03-01
To expand the open source Gadgetron reconstruction framework to support distributed computing and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and l1-SPIRiT reconstruction of nine high temporal resolution real-time, cardiac short axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm(3) isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed computing enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.
Reconciliation of the cloud computing model with US federal electronic health record regulations
Schweitzer, Eugene J
2011-01-01
Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing. PMID:21727204
Cassetta, Michele; Altieri, Federica; Pandolfi, Stefano; Giansanti, Matteo
2017-01-01
The aim of this case report was to describe an innovative orthodontic treatment method that combined surgical and orthodontic techniques. The novel method was used to achieve a positive result in a case of moderate crowding by employing a computer-guided piezocision procedure followed by the use of clear aligners. A 23-year-old woman had a malocclusion with moderate crowding. Her periodontal indices, oral health-related quality of life (OHRQoL), and treatment time were evaluated. The treatment included interproximal corticotomy cuts extending through the entire thickness of the cortical layer, without a full-thickness flap reflection. This was achieved with a three-dimensionally printed surgical guide using computer-aided design and computer-aided manufacturing. Orthodontic force was applied to the teeth immediately after surgery by using clear appliances for better control of tooth movement. The total treatment time was 8 months. The periodontal indices improved after crowding correction, but the oral health impact profile showed a slight deterioration of OHRQoL during the 3 days following surgery. At the 2-year retention follow-up, the stability of treatment was excellent. The reduction in surgical time and patient discomfort, increased periodontal safety and patient acceptability, and accurate control of orthodontic movement without the risk of losing anchorage may encourage the use of this combined technique in appropriate cases. PMID:28337422
NASA Technical Reports Server (NTRS)
Hewett, Marle D.; Tartt, David M.; Duke, Eugene L.; Antoniewicz, Robert F.; Brumbaugh, Randal W.
1988-01-01
The development of an automated flight test management system (ATMS) as a component of a rapid-prototyping flight research facility for AI-based flight systems concepts is described. The rapid-prototyping facility includes real-time high-fidelity simulators, numeric and symbolic processors, and high-performance research aircraft modified to accept commands from a ground-based remotely augmented vehicle facility. The flight system configuration of the ATMS includes three computers: the TI Explorer LX and two Gould SEL 32/27s.
Military Standard: Technical Reviews and Audits for Systems, Equipments, and Computer Software
1985-06-04
Concept Exploration or Demonstration and Validation phase. Such reviews may be conducted at any time but normally will be conducted after the...method of resolution shall also be reviewed." All proposed environmental tests shall be reviewed for compatibility with the specified natura...accepted. See Attachment _ for comments. Attached is a list of deficiencies. Signature(s) of FCA Team Member(s) "Sub-Team Chairperson". Figure 3
da Costa, Rosa Maria Esteves Moreira; de Carvalho, Luís Alfredo Vidal
2004-03-01
This study presents a process of virtual environment development supported by a cognitive model that is specific to the cognitive deficits of diverse disorders or traumatic brain injury, and evaluates the acceptance of computer devices by a group of schizophrenic patients. The subjects who participated in this experiment agreed to work with computers and immersive glasses and demonstrated a high level of interest in the proposed tasks. No illness-related problems were observed. This experiment indicated that further research projects must be carried out to verify the value of virtual reality technology for cognitive rehabilitation of psychiatric patients. The results of the current study represent a small but necessary step in the realization of that potential.
Large holographic displays for real-time applications
NASA Astrophysics Data System (ADS)
Schwerdtner, A.; Häussler, R.; Leister, N.
2008-02-01
Holography is generally accepted as the ultimate approach to displaying three-dimensional scenes or objects. In principle, the reconstruction of an object from a perfect hologram would appear indistinguishable from viewing the corresponding real-world object. Up to now, two main obstacles have prevented large-screen Computer-Generated Holograms (CGH) from achieving a satisfactory laboratory prototype, not to mention a marketable one. The reason is a small CGH cell pitch, resulting in a huge number of hologram cells and a very high computational load for encoding the CGH. These seemingly inevitable technological hurdles have for a long time limited the use of holography to special applications, such as optical filtering, interference, beam forming, digital holography for capturing the 3-D shape of objects, and others. SeeReal Technologies has developed a new approach for real-time capable CGH using the so-called Tracked Viewing Windows technology to overcome these problems. The paper will show that today's state-of-the-art reconfigurable Spatial Light Modulators (SLM), especially today's feasible LCD panels, are suited for reconstructing large 3-D scenes that can be observed from large viewing angles. To achieve this, the original holographic concept of encoding information from the entire scene in each part of the CGH has been abandoned. This substantially reduces the hologram resolution and thus the computational load by several orders of magnitude, making real-time computation possible. A monochrome real-time prototype measuring 20 inches has been built and demonstrated at the SID conference and exhibition 2007 and at several other events.
Soft computing techniques toward modeling the water supplies of Cyprus.
Iliadis, L; Maris, F; Tachos, S
2011-10-01
This research effort aims at the application of soft computing techniques to water resources management. More specifically, the target is the development of reliable soft computing models capable of estimating the water supply for the case of the "Germasogeia" mountainous watersheds in Cyprus. Initially, ε-Regression Support Vector Machines (ε-RSVM) and fuzzy weighted ε-RSVM models were developed that accept five input parameters. At the same time, reliable artificial neural networks were developed to perform the same job. The 5-fold cross validation approach was employed in order to eliminate bad local behaviors and to produce a more representative training data set. Thus, the fuzzy weighted Support Vector Regression (SVR) combined with the fuzzy partition has been employed in an effort to enhance the quality of the results. Several rational and reliable models have been produced that can enhance the efficiency of water policy designers. Copyright © 2011 Elsevier Ltd. All rights reserved.
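The 5-fold cross-validation step can be sketched independently of the SVR models themselves; the interleaved fold assignment below is an assumption for illustration:

```python
def k_fold_splits(data, k=5):
    """Yield (train, validation) partitions for k-fold cross validation.

    Each sample appears in exactly one validation fold, so every model
    candidate is scored on data it was not trained on.
    """
    folds = [data[i::k] for i in range(k)]  # interleaved fold assignment
    for i in range(k):
        validation = folds[i]
        train = [x for j, fold in enumerate(folds) if j != i for x in fold]
        yield train, validation
```

Averaging a model's error over the five validation folds is what screens out the "bad local behaviors" the abstract mentions.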
Computational Analysis on Performance of Thermal Energy Storage (TES) Diffuser
NASA Astrophysics Data System (ADS)
Adib, M. A. H. M.; Adnan, F.; Ismail, A. R.; Kardigama, K.; Salaam, H. A.; Ahmad, Z.; Johari, N. H.; Anuar, Z.; Azmi, N. S. N.
2012-09-01
Application of a thermal energy storage (TES) system reduces cost and energy consumption. The performance of the overall operation is affected by diffuser design. In this study, computational analysis is used to determine the thermocline thickness. Three-dimensional simulations with different tank height-to-diameter (H/D) ratios, diffuser openings, and numbers of diffuser holes are investigated. Simulations of medium-H/D tanks with a double-ring octagonal diffuser show good thermocline behavior and a clear distinction between warm and cold water. The results show that the best thermocline thickness at 50% charging time occurs in a medium tank with a height-to-diameter ratio of 4.0 and a double-ring octagonal diffuser with 48 holes; a 9 mm (~60%) opening is acceptable compared with 6 mm (~40%) and 12 mm (~80%) openings. The conclusion is that computational analysis methods are very useful in studying the performance of thermal energy storage (TES).
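One crude way to extract a thermocline thickness from a simulated vertical temperature profile is to measure the depth range over which the temperature transitions between the cold and warm bodies; the thresholds and sampling here are illustrative assumptions, not the study's post-processing:

```python
def thermocline_thickness(depths, temps, t_cold, t_warm):
    """Vertical extent of the zone where temperature lies strictly between the two cutoffs.

    A thinner zone means a sharper separation of warm and cold water,
    i.e., better stratification in the storage tank.
    """
    zone = [d for d, t in zip(depths, temps) if t_cold < t < t_warm]
    return max(zone) - min(zone) if zone else 0.0
```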
The acceptability of computer applications to group practices.
Zimmerman, J; Gordon, R S; Tao, D K; Boxerman, S B
1978-01-01
Of the 72 identified group practices in a midwest urban environment, 39 were found to use computers. The practices had been influenced strongly by vendors in their selection of an automated system or service, and had usually spent less than a work-month analyzing their needs and reviewing alternate ways in which those needs could be met. Ninety-seven percent of the practices had some financial applications and 64% had administrative applications, but only 2.5% had medical applications. For half the practices at least 2 months elapsed from the time the automated applications were put into operation until they were considered to be integrated into the office routine. Advantages experienced by at least a third of the practices using computers were that the work was done faster, information was more readily available, and costs were reduced. The most common disadvantage was inflexibility. Most (89%) of the practices believed that automation was preferable to their previous manual system.
Permittivity and conductivity parameter estimations using full waveform inversion
NASA Astrophysics Data System (ADS)
Serrano, Jheyston O.; Ramirez, Ana B.; Abreo, Sergio A.; Sadler, Brian M.
2018-04-01
Full waveform inversion of Ground Penetrating Radar (GPR) data is a promising strategy to estimate quantitative characteristics of the subsurface such as permittivity and conductivity. In this paper, we propose a methodology that uses Full Waveform Inversion (FWI) in the time domain of 2D GPR data to obtain highly resolved images of the permittivity and conductivity parameters of the subsurface. FWI is an iterative method that requires a cost function to measure the misfit between observed and modeled data, a wave propagator to compute the modeled data, and an initial velocity model that is updated at each iteration until an acceptable decrease of the cost function is reached. The use of FWI with GPR is computationally expensive because it is based on the computation of the full electromagnetic wave propagation. Also, the commercially available acquisition systems use only one transmitter and one receiver antenna at zero offset, requiring a large number of shots to scan a single line.
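The iterative misfit-reduction loop at the heart of FWI can be sketched on a toy one-parameter forward model; the real method inverts full electromagnetic wavefields, which this deliberately does not attempt:

```python
def l2_misfit(observed, modeled):
    """Least-squares misfit between observed and modeled traces (the FWI cost function)."""
    return 0.5 * sum((o - m) ** 2 for o, m in zip(observed, modeled))

def invert_scale(observed, template, steps=200, lr=0.01):
    """Toy gradient-descent 'inversion' for a single scale factor a, with modeled = a * template.

    Each step moves a against the gradient of the L2 misfit, mimicking the
    model update FWI performs at each iteration.
    """
    a = 0.0
    for _ in range(steps):
        grad = sum((a * t - o) * t for o, t in zip(observed, template))
        a -= lr * grad
    return a
```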
Are You Listening to Your Computer?
ERIC Educational Resources Information Center
Shugg, Alan
1992-01-01
Accepting the great motivational value of computers in second-language learning, this article describes ways to use authentic language recorded on a computer with HyperCard. Graphics, sound, and hardware/software requirements are noted, along with brief descriptions of programming with sound and specific programs. (LB)
Integrated software system for low level waste management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Worku, G.
1995-12-31
In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications.
It pays to compare: an experimental study on computational estimation.
Star, Jon R; Rittle-Johnson, Bethany
2009-04-01
Comparing and contrasting examples is a core cognitive process that supports learning in children and adults across a variety of topics. In this experimental study, we evaluated the benefits of supporting comparison in a classroom context for children learning about computational estimation. Fifth- and sixth-grade students (N=157) learned about estimation either by comparing alternative solution strategies or by reflecting on the strategies one at a time. At posttest and retention test, students who compared were more flexible problem solvers on a variety of measures. Comparison also supported greater conceptual knowledge, but only for students who already knew some estimation strategies. These findings indicate that comparison is an effective learning and instructional practice in a domain with multiple acceptable answers.
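Two of the alternative solution strategies students might compare side by side can be sketched as code; the specific strategies are illustrative, not taken from the study's materials:

```python
def round_both(a, b):
    """Estimate a*b by rounding each operand to the nearest ten, then multiplying."""
    return round(a, -1) * round(b, -1)

def truncate_both(a, b):
    """Estimate a*b by dropping the ones digit of each operand, then multiplying."""
    return (a // 10 * 10) * (b // 10 * 10)
```

For 27 x 43 (exactly 1161), rounding gives 1200 while truncating gives 800: both are acceptable estimates, which is exactly the kind of multiple-acceptable-answer domain the study examines.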
Computer aided fixture design - A case based approach
NASA Astrophysics Data System (ADS)
Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom
2017-11-01
Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system is developed in which, once fixturing surfaces and points are specified, modular fixture components are automatically selected to generate fixture units and placed into position subject to assembly constraints. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a purpose-built retrieval system is proposed. The Visual Basic (VB) programming language is integrated with the SolidWorks API (Application Programming Interface) module to improve the retrieval procedure and reduce computation time. These properties are incorporated in a numerical simulation to determine the best fit for practical use.
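Case-based retrieval of this kind can be illustrated with a toy sketch. Everything below (the feature names, weights, and three stored cases) is hypothetical, standing in for the SolidWorks/VB retrieval system the abstract describes: a query part is matched against stored fixture cases by a weighted per-feature similarity, and the best-scoring case is retrieved for adaptation.

```python
# Hypothetical feature weights and case base, for illustration only.
WEIGHTS = {"faces": 0.3, "holes": 0.3, "material": 0.4}

CASES = [
    {"id": "bracket-A", "faces": 6,  "holes": 2, "material": "steel",
     "fixture": "vise + two locating pins"},
    {"id": "housing-B", "faces": 10, "holes": 6, "material": "aluminium",
     "fixture": "modular plate + edge clamps"},
    {"id": "shaft-C",   "faces": 3,  "holes": 1, "material": "steel",
     "fixture": "v-blocks + toe clamps"},
]

def similarity(query, case):
    """Weighted sum of per-feature similarities (nearest-neighbour retrieval)."""
    score = 0.0
    for feat, w in WEIGHTS.items():
        a, b = query[feat], case[feat]
        if isinstance(a, str):                      # categorical: exact match
            score += w * (1.0 if a == b else 0.0)
        else:                                       # numeric: normalised distance
            score += w * (1.0 - abs(a - b) / max(a, b, 1))
    return score

def retrieve(query):
    """Return the stored case most similar to the query part description."""
    return max(CASES, key=lambda c: similarity(query, c))

best = retrieve({"faces": 5, "holes": 2, "material": "steel"})
```

A real CAFD retrieval system would use far richer part descriptions (surface types, tolerances, machining features extracted through the CAD API), but the argmax-over-weighted-similarity core is the same.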
Stochastic Evolutionary Algorithms for Planning Robot Paths
NASA Technical Reports Server (NTRS)
Fink, Wolfgang; Aghazarian, Hrand; Huntsberger, Terrance; Terrile, Richard
2006-01-01
A computer program implements stochastic evolutionary algorithms for planning and optimizing collision-free paths for robots and their jointed limbs. Stochastic evolutionary algorithms can be made to produce acceptably close approximations to exact, optimal solutions for path-planning problems while often demanding much less computation than do exhaustive-search and deterministic inverse-kinematics algorithms that have been used previously for this purpose. Hence, the present software is better suited for application aboard robots having limited computing capabilities (see figure). The stochastic aspect lies in the use of simulated annealing to (1) prevent trapping of an optimization algorithm in local minima of an energy-like error measure by which the fitness of a trial solution is evaluated while (2) ensuring that the entire multidimensional configuration and parameter space of the path-planning problem is sampled efficiently with respect to both robot joint angles and computation time. Simulated annealing is an established technique for avoiding local minima in multidimensional optimization problems, but has not, until now, been applied to planning collision-free robot paths by use of low-power computers.
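The simulated-annealing idea described above can be sketched in miniature. The sketch below is hypothetical (a 2-D waypoint path, a single circular obstacle, a made-up temperature schedule), not the flight software: an energy combining path length and a collision penalty is minimized with the Metropolis rule, so occasional uphill moves let the search escape local minima.

```python
import math
import random

def energy(path, obstacle, radius):
    # Path length plus a stiff penalty for each waypoint inside the obstacle.
    length = sum(math.dist(path[i], path[i + 1]) for i in range(len(path) - 1))
    penalty = sum(10.0 for p in path if math.dist(p, obstacle) < radius)
    return length + penalty

def anneal_path(path, obstacle, radius, t0=1.0, cooling=0.999, iters=4000):
    random.seed(3)
    current = [p[:] for p in path]
    e_cur = energy(current, obstacle, radius)
    best, e_best = [p[:] for p in current], e_cur
    t = t0
    for _ in range(iters):
        cand = [p[:] for p in current]
        i = random.randrange(1, len(cand) - 1)        # endpoints stay fixed
        cand[i][0] += random.gauss(0, 0.1)
        cand[i][1] += random.gauss(0, 0.1)
        e_new = energy(cand, obstacle, radius)
        # Metropolis rule: always accept improvements; sometimes accept
        # worse moves, with probability decaying as the temperature cools.
        if e_new < e_cur or random.random() < math.exp((e_cur - e_new) / t):
            current, e_cur = cand, e_new
            if e_cur < e_best:
                best, e_best = [p[:] for p in current], e_cur
        t *= cooling
    return best, e_best

start_path = [[0.0, 0.0], [0.5, 0.0], [1.0, 0.0], [1.5, 0.0], [2.0, 0.0]]
obstacle, radius = (1.0, 0.0), 0.3    # middle waypoint starts in collision
e_init = energy(start_path, obstacle, radius)
best_path, e_final = anneal_path(start_path, obstacle, radius)
```

A real robot planner would anneal over joint angles with kinematic collision checks rather than 2-D points, but the accept/reject structure is the part the abstract credits for avoiding local minima cheaply.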
A fast object-oriented Matlab implementation of the Reproducing Kernel Particle Method
NASA Astrophysics Data System (ADS)
Barbieri, Ettore; Meo, Michele
2012-05-01
Novel numerical methods, known as Meshless Methods or Meshfree Methods and, in a wider perspective, Partition of Unity Methods, promise to overcome most of the disadvantages of traditional finite element techniques. The absence of a mesh makes meshfree methods very attractive for problems involving large deformations, moving boundaries and crack propagation. However, meshfree methods still have significant limitations that prevent their acceptance among researchers and engineers, namely their computational cost. This paper presents an in-depth analysis of computational techniques to speed up the computation of the shape functions in the Reproducing Kernel Particle Method and Moving Least Squares, with particular focus on their bottlenecks: the neighbour search, the inversion of the moment matrix and the assembly of the stiffness matrix. The paper presents numerous computational solutions aimed at a considerable reduction of the computational times: the use of kd-trees for the neighbour search, sparse indexing of the nodes-points connectivity and, most importantly, the explicit and vectorized inversion of the moment matrix without using loops and numerical routines.
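The kd-tree neighbour search named above as a bottleneck-breaker can be sketched as follows. This is a minimal illustrative 2-D implementation (the point set and support radius are made up), not the authors' Matlab code: the tree prunes the far side of a splitting plane whenever that plane is farther from the query point than the kernel support radius, so most nodes are never visited.

```python
import math
import random

def build_kdtree(pts, depth=0):
    # Recursively build a 2-D kd-tree, alternating the splitting axis.
    if not pts:
        return None
    axis = depth % 2
    pts = sorted(pts, key=lambda p: p[axis])
    mid = len(pts) // 2
    return {"pt": pts[mid], "axis": axis,
            "left": build_kdtree(pts[:mid], depth + 1),
            "right": build_kdtree(pts[mid + 1:], depth + 1)}

def radius_search(node, q, r, found):
    # Collect all points within distance r of q (the kernel support).
    if node is None:
        return
    p, axis = node["pt"], node["axis"]
    if math.dist(p, q) <= r:
        found.append(p)
    diff = q[axis] - p[axis]
    near, far = ((node["left"], node["right"]) if diff < 0
                 else (node["right"], node["left"]))
    radius_search(near, q, r, found)
    if abs(diff) <= r:          # far side only if the plane is within reach
        radius_search(far, q, r, found)

random.seed(1)
nodes = [(random.random(), random.random()) for _ in range(500)]
tree = build_kdtree(nodes)
neighbours = []
radius_search(tree, (0.5, 0.5), 0.1, neighbours)
brute = [p for p in nodes if math.dist(p, (0.5, 0.5)) <= 0.1]
```

For each shape-function evaluation point, the found neighbours are exactly the nodes whose kernels overlap it, which is what the moment matrix is then assembled from.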
ERIC Educational Resources Information Center
Lee, Woong-Kyu
2012-01-01
The principal objective of this study was to gain insight into attitude changes occurring during IT acceptance from the perspective of elaboration likelihood model (ELM). In particular, the primary target of this study was the process of IT acceptance through an education program. Although the Internet and computers are now quite ubiquitous, and…
ERIC Educational Resources Information Center
Huber, Christian; Gerullis, Anita; Gebhardt, Markus; Schwab, Susanne
2018-01-01
This computer-based study evaluates whether teacher feedback can have an effect on the acceptance of children with and without disabilities in inclusive, special and regular schools. The social acceptance of four children shown in photo vignettes (child with Down Syndrome, child in a wheelchair, child with migrant background and child with no…
Patient attitudes toward using computers to improve health services delivery.
Sciamanna, Christopher N; Diaz, Joseph; Myne, Puja
2002-09-11
The aim of this study was to examine the acceptability of point of care computerized prompts to improve health services delivery among a sample of primary care patients. Primary data collection. Cross-sectional survey. Patients were surveyed after their visit with a primary care provider. Data were obtained from patients of ten community-based primary care practices in the spring of 2001. Almost all patients reported that they would support using a computer before each visit to prompt their doctor to: "do health screening tests" (92%), "counsel about health behaviors (like diet and exercise)" (92%) and "change treatments for health conditions" (86%). In multivariate testing, the only variable that was associated with acceptability of the point of care computerized prompts was patient's confidence in their ability to answer questions about their health using a computer (beta = 0.39, p = .001). Concerns about data security were expressed by 36.3% of subjects, but were not related to acceptability of the prompts. Support for using computers to generate point of care prompts to improve quality-oriented processes of care was high in our sample, but may be contingent on patients feeling familiar with their personal medical history.
Computer Game-Based Learning: Perceptions and Experiences of Senior Chinese Adults
ERIC Educational Resources Information Center
Wang, Feihong; Lockee, Barbara B.; Burton, John K.
2012-01-01
The purpose of this study was to investigate senior Chinese adults' potential acceptance of computer game-based learning (CGBL) by probing their perceptions of computer game play and their perceived impacts of game play on their learning of computer skills and life satisfaction. A total of 60 senior adults from a local senior adult learning center…
ERIC Educational Resources Information Center
Teo, T.; Lee, C. B.; Chai, C. S.
2008-01-01
Computers are increasingly widespread, influencing many aspects of our social and work lives. As we move into a technology-based society, it is important that classroom experiences with computers are made available for all students. The purpose of this study is to examine pre-service teachers' attitudes towards computers. This study extends the…
12 CFR 615.5206 - Permanent capital ratio computation.
Code of Federal Regulations, 2010 CFR
2010-01-01
Title 12, Banks and Banking. § 615.5206 Permanent capital ratio computation. (a) The institution's permanent capital ratio is determined on the basis of the financial statements of the institution prepared in accordance with generally accepted accounting principles...
Ifinedo, Princely
2016-03-01
The purpose of this study was to examine the moderating effects of demographic characteristics (i.e., educational level and age) and individual characteristics (i.e., years of nursing experience and computer knowledge) on nurses' acceptance of information systems (IS). The technology acceptance model (TAM) with its constituent variables such as perceived usefulness (PUSS) and perceived ease of use (PEOU) was the theoretical framework used for this study. A cross-sectional study was conducted in Nova Scotia, Canada. Usable data was collected from 197 registered nurses (RNs). Relevant hypotheses were formulated and the partial least squares (PLS) technique was used for data analysis. The results of the hypothesized relationships showed that education and computer knowledge have positive moderating effects on the influences of PEOU and PUSS on nurses' attitudes toward IS (ATTI). The factors of nurses' years of nursing experience and age did not yield meaningful results. ATTI impacted behavioral intentions to use IS, which positively impacted nurses' use of IS. The nurses sampled in the study have positive IS use behaviors. This study demonstrates that relevant demographic factors and individual characteristics, if incorporated into frameworks used for investigating nurses' acceptance of IS, could permit the emergence of useful insights for practitioners and researchers. Specifically, this study showed that nurses with higher educational attainments and greater basic computer knowledge readily accept implemented IS at work. Hospital administrators benefit from insights such as the one presented in this study. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Lessons learned from the usability assessment of home-based telemedicine systems.
Agnisarman, Sruthy Orozhiyathumana; Chalil Madathil, Kapil; Smith, Kevin; Ashok, Aparna; Welch, Brandon; McElligott, James T
2017-01-01
At-home telemedicine visits are quickly becoming an acceptable alternative to in-person patient visits. However, little work has been done to understand the usability of these home-based telemedicine solutions. It is critical for user acceptance and real-world applicability to evaluate available telemedicine solutions within the context-specific needs of the users of this technology. To address this need, this study evaluated the usability of four home-based telemedicine software platforms: Doxy.me, Vidyo, VSee, and Polycom. Using a within-subjects experimental design, twenty participants were asked to complete a telemedicine session involving several tasks using the four platforms. Upon completion of these tasks for each platform, participants completed the IBM computer system usability questionnaire (CSUQ) and the NASA Task Load Index test. Upon completing the tasks on all four platforms, the participants completed a final post-test subjective questionnaire ranking the platforms based on their preference. Of the twenty participants, 19 completed the study. Statistically significant differences among the telemedicine software platforms were found for task completion time, total workload, mental demand, effort, frustration, preference ranking and computer system usability scores. Usability problems with installation and account creation led to high mental demand and task completion time, suggesting the participants preferred a system without such requirements. The majority of the usability issues were identified at the telemedicine initiation phase. The findings from this study can be used by software developers to develop user-friendly telemedicine systems. Copyright © 2016 Elsevier Ltd. All rights reserved.
Computer-Based Arithmetic Test Generation
ERIC Educational Resources Information Center
Trocchi, Robert F.
1973-01-01
The computer can be a welcome partner in the instructional process, but only if there is man-machine interaction. Man should not compromise system design because of available hardware; the computer must fit the system design for the result to represent an acceptable solution to instructional technology. The Arithmetic Test Generator system fits…
College Students' Use of the Internet.
ERIC Educational Resources Information Center
McFadden, Anna C.
1999-01-01
Studied use of the Internet by college students by determining sites selected on 6 of 70 computers in a college computer laboratory. The overwhelming use of the Internet in this open lab conformed to university acceptable-use policy, with almost no use of the computers to contact pornographic sites. (SLD)
Traditional vs. Innovative Uses of Computers among Mathematics Pre-Service Teachers in Serbia
ERIC Educational Resources Information Center
Teo, Timothy; Milutinovic, Verica; Zhou, Mingming; Bankovic, Dragic
2017-01-01
This study examined pre-service teachers' intentions to use computers in traditional and innovative teaching practices in primary mathematics classrooms. It extended the technology acceptance model (TAM) by adding as external variables pre-service teachers' experience with computers and their technological pedagogical content knowledge (TPCK).…
Tubaishat, Ahmad
2017-09-18
Electronic health records (EHRs) are increasingly being implemented in healthcare organizations, but little attention has been paid to the degree to which nurses as end-users will accept these systems and subsequently use them. To explore nurses' perceptions of usefulness and ease-of-use of EHRs. The relationship between these constructs was examined, and its predictors were studied. A national exploratory study was conducted with 1539 nurses from 15 randomly selected hospitals, representative of different regions and healthcare sectors in Jordan. Data were collected using a self-administered questionnaire, which was based on the Technology Acceptance Model. Correlations and linear multiple regression were utilized to analyze the data. Jordanian nurses demonstrated a positive perception of the usefulness and ease-of-use of EHRs, and subsequently accepted the technology. Significant positive correlations were found between these two constructs. The variables that predict usefulness were the gender, professional rank, EHR experience, and computer skills of the nurses. The perceived ease-of-use was affected by nursing and EHR experience, and computer skills. This study adds to the growing body of knowledge on issues related to the acceptance of technology in the health informatics field, focusing on nurses' acceptance of EHRs.
Zhang, Jiangheng; Chen, Yangxi; Zhou, Xiukun
2002-09-01
The characteristics of the lip-mouth region, including the soft and hard tissues, in the smiling position were studied with frontal fixed-position photographic computer-aided analysis. The subjects were 80 persons (40 males and 40 females, age range 17 to approximately 25 years) with acceptable faces and individual normal occlusions. The subjects were asked to assume the maximum smiling position for photographic measurement with computer-aided analysis. The maximum smile line could be divided into 3 categories: low smile line (16.25%), average smile line (68.75%), and high smile line (15%). The method of adopting the maximum smiling position to study the lip-mouth region is reproducible and comparable. This study should help provide a quantitative reference for clinical investigation, diagnosis, treatment and efficacy appraisal.
Mackenzie, Kelly; Goyder, Elizabeth; Eves, Francis
2015-12-24
Prolonged sedentary time is linked with poor health, independent of physical activity levels. Workplace sitting significantly contributes to sedentary time, but there is limited research evaluating low-cost interventions targeting reductions in workplace sitting. Current evidence supports the use of multi-modal interventions developed using participative approaches. This study aimed to explore the acceptability and feasibility of a low-cost, co-produced, multi-modal intervention to reduce workplace sitting. The intervention was developed with eleven volunteers from a large university department in the UK using participative approaches and "brainstorming" techniques. Main components of the intervention included: emails suggesting ways to "sit less" e.g. walking and standing meetings; free reminder software to install onto computers; social media to increase awareness; workplace champions; management support; and point-of-decision prompts e.g. by lifts encouraging stair use. All staff (n = 317) were invited to take part. Seventeen participated in all aspects of the evaluation, completing pre- and post-intervention sitting logs and questionnaires. The intervention was delivered over four weeks from 7th July to 3rd August 2014. Pre- and post-intervention difference in daily workplace sitting time was presented as a mean ± standard deviation. Questionnaires were used to establish awareness of the intervention and its various elements, and to collect qualitative data regarding intervention acceptability and feasibility. Mean baseline sitting time of 440 min/workday was reported with a mean reduction of 26 ± 54 min/workday post-intervention (n = 17, 95 % CI = -2 to 53). All participants were aware of the intervention as a whole, although there was a range of awareness for individual elements of the intervention. The intervention was generally felt to be both acceptable and feasible. 
Management support was perceived to be a strength, whilst specific strategies that were encouraged, including walking and standing meetings, received mixed feedback. This small-scale pilot provides encouragement for the acceptability and feasibility of low-cost, multi-modal interventions to reduce workplace sitting in UK settings. Evaluation of this intervention provides useful information to support participatory approaches during intervention development and the potential for more sustainable low-cost interventions. Findings may be limited in terms of generalisability as this pilot was carried out within a health-related academic setting.
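The reported confidence interval can be reproduced from the summary statistics alone. A minimal check, assuming a paired t interval with a critical value of about 2.120 for 16 degrees of freedom (the abstract does not state which interval formula the authors used):

```python
import math

def ci95_paired(mean_diff, sd_diff, n, t_crit=2.120):
    # 95% CI for a mean paired difference: mean ± t * sd / sqrt(n)
    margin = t_crit * sd_diff / math.sqrt(n)
    return mean_diff - margin, mean_diff + margin

# Reported in the study: mean reduction 26 ± 54 min/workday, n = 17.
lo, hi = ci95_paired(26, 54, 17)
```

The result, roughly (-1.8, 53.8), matches the reported 95% CI of -2 to 53 once rounding of the summary statistics is allowed for, and makes clear why the interval spans zero despite the positive mean reduction.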
NASA Astrophysics Data System (ADS)
Schneider, E. A.; Deinert, M. R.; Cady, K. B.
2006-10-01
The balance of isotopes in a nuclear reactor core is key to understanding the overall performance of a given fuel cycle. This balance is in turn most strongly affected by the time and energy-dependent neutron flux. While many large and involved computer packages exist for determining this spectrum, a simplified approach amenable to rapid computation is missing from the literature. We present such a model, which accepts as inputs the fuel element/moderator geometry and composition, reactor geometry, fuel residence time and target burnup and we compare it to OECD/NEA benchmarks for homogeneous MOX and UOX LWR cores. Collision probability approximations to the neutron transport equation are used to decouple the spatial and energy variables. The lethargy dependent neutron flux, governed by coupled integral equations for the fuel and moderator/coolant regions is treated by multigroup thermalization methods, and the transport of neutrons through space is modeled by fuel to moderator transport and escape probabilities. Reactivity control is achieved through use of a burnable poison or adjustable control medium. The model calculates the buildup of 24 actinides, as well as fission products, along with the lethargy dependent neutron flux and the results of several simulations are compared with benchmarked standards.
Qualification and Approval of Personal Computer-Based Aviation Training Devices
DOT National Transportation Integrated Search
1997-05-12
This Advisory Circular (AC) provides information and guidance to potential training device manufacturers and aviation training consumers concerning a means, acceptable to the Administrator, by which personal computer-based aviation training devices (...
ERIC Educational Resources Information Center
Venkatesh, Vijay P.
2013-01-01
The current computing landscape owes its roots to the birth of hardware and software technologies from the 1940s and 1950s. Since then, the advent of mainframes, miniaturized computing, and internetworking has given rise to the now prevalent cloud computing era. In the past few months just after 2010, cloud computing adoption has picked up pace…
Hutchesson, Melinda J; Rollo, Megan E; Callister, Robin; Collins, Clare E
2015-01-01
Adherence and accuracy of self-monitoring of dietary intake influences success in weight management interventions. Information technologies such as computers and smartphones have the potential to improve adherence and accuracy by reducing the burden associated with monitoring dietary intake using traditional paper-based food records. We evaluated the acceptability and accuracy of three different 7-day food record methods (online accessed via computer, online accessed via smartphone, and paper-based). Young women (N=18; aged 23.4±2.9 years; body mass index 24.0±2.2) completed the three 7-day food records in random order with 7-day washout periods between each method. Total energy expenditure (TEE) was derived from resting energy expenditure (REE) measured by indirect calorimetry and physical activity level (PAL) derived from accelerometers (TEE=REE×PAL). Accuracy of the three methods was assessed by calculating absolute (energy intake [EI]-TEE) and percentage difference (EI/TEE×100) between self-reported EI and TEE. Acceptability was assessed via questionnaire. Mean±standard deviation TEE was 2,185±302 kcal/day and EI was 1,729±249 kcal/day, 1,675±287 kcal/day, and 1,682±352 kcal/day for computer, smartphone, and paper records, respectively. There were no significant differences between absolute and percentage differences between EI and TEE for the three methods: computer, -510±389 kcal/day (78%); smartphone, -456±372 kcal/day (80%); and paper, -503±513 kcal/day (79%). Half of participants (n=9) preferred computer recording, 44.4% preferred smartphone, and 5.6% preferred paper-based records. Most participants (89%) least preferred the paper-based record. Because online food records completed on either computer or smartphone were as accurate as paper-based records but more acceptable to young women, they should be considered when self-monitoring of intake is recommended to young women. Copyright © 2015 Academy of Nutrition and Dietetics. Published by Elsevier Inc. 
All rights reserved.
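The accuracy measures used in the study are simple to state in code. A sketch with hypothetical single-participant values chosen near the reported means (the study computed these quantities per participant before averaging):

```python
def total_energy_expenditure(ree_kcal, pal):
    # TEE = REE x PAL, as in the study's calorimetry/accelerometer protocol.
    return ree_kcal * pal

def reporting_error(ei_kcal, tee_kcal):
    absolute = ei_kcal - tee_kcal             # negative => under-reporting
    percent = ei_kcal / tee_kcal * 100.0      # EI as a percentage of TEE
    return absolute, percent

tee = total_energy_expenditure(1456, 1.5)     # hypothetical REE and PAL
abs_diff, pct = reporting_error(1729, tee)    # hypothetical 7-day mean EI
```

With these made-up inputs the participant would appear to report about 79% of expenditure, i.e. roughly 455 kcal/day of under-reporting, the same order as the deficits reported for all three record methods.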
Stochastic Modeling Approach to the Incubation Time of Prionic Diseases
NASA Astrophysics Data System (ADS)
Ferreira, A. S.; da Silva, M. A.; Cressoni, J. C.
2003-05-01
Transmissible spongiform encephalopathies are neurodegenerative diseases for which prions are the attributed pathogenic agents. A widely accepted theory assumes that prion replication is due to a direct interaction between the pathologic (PrPSc) form and the host-encoded (PrPC) conformation, in a kind of autocatalytic process. Here we show that the overall features of the incubation time of prion diseases are readily obtained if the prion reaction is described by a simple mean-field model. An analytical expression for the incubation time distribution then follows by associating the rate constant to a stochastic variable log normally distributed. The incubation time distribution is then also shown to be log normal and fits the observed BSE (bovine spongiform encephalopathy) data very well. Computer simulation results also yield the correct BSE incubation time distribution at low PrPC densities.
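The paper's central claim, that a log-normally distributed rate constant makes the incubation time itself log-normal, can be checked numerically. A sketch with made-up parameters (the mu, sigma and proportionality constant are hypothetical, and the mean-field model is reduced to incubation time varying inversely with the rate constant):

```python
import math
import random
import statistics

random.seed(42)
MU_LNK, SIGMA_LNK = 0.0, 0.5   # hypothetical log-normal rate parameters
A = 100.0                       # hypothetical proportionality constant

# Sample replication rate constants k ~ LogNormal(mu, sigma).
ks = [random.lognormvariate(MU_LNK, SIGMA_LNK) for _ in range(20000)]

# In the simplified mean-field picture, incubation time t ∝ 1/k.
times = [A / k for k in ks]

# Then ln t = ln A - ln k is normal, i.e. t is itself log-normal.
log_times = [math.log(t) for t in times]
```

The sample mean of ln t lands at ln A - mu and its standard deviation at sigma, confirming that the reciprocal of a log-normal variable is again log-normal, which is what lets the model fit the BSE incubation-time data with a log-normal curve.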
Computer-Aided Techniques for Providing Operator Performance Measures.
ERIC Educational Resources Information Center
Connelly, Edward M.; And Others
This report documents the theory, structure, and implementation of a performance processor (written in FORTRAN IV) that can accept performance demonstration data representing various levels of operator's skill and, under user control, analyze data to provide candidate performance measures and validation test results. The processor accepts two…
GPU-accelerated regularized iterative reconstruction for few-view cone beam CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matenine, Dmitri, E-mail: dmitri.matenine.1@ulaval.ca; Goussard, Yves, E-mail: yves.goussard@polymtl.ca; Després, Philippe, E-mail: philippe.despres@phy.ulaval.ca
2015-04-15
Purpose: The present work proposes an iterative reconstruction technique designed for x-ray transmission computed tomography (CT). The main objective is to provide a model-based solution to the cone-beam CT reconstruction problem, yielding accurate low-dose images via few-views acquisitions in clinically acceptable time frames. Methods: The proposed technique combines a modified ordered subsets convex (OSC) algorithm and the total variation minimization (TV) regularization technique and is called OSC-TV. The number of subsets of each OSC iteration follows a reduction pattern in order to ensure the best performance of the regularization method. Considering the high computational cost of the algorithm, it is implemented on a graphics processing unit, using parallelization to accelerate computations. Results: The reconstructions were performed on computer-simulated as well as human pelvic cone-beam CT projection data and image quality was assessed. In terms of convergence and image quality, OSC-TV performs well in reconstruction of low-dose cone-beam CT data obtained via a few-view acquisition protocol. It compares favorably to the few-view TV-regularized projections onto convex sets (POCS-TV) algorithm. It also appears to be a viable alternative to full-dataset filtered backprojection. Execution times are of 1–2 min and are compatible with the typical clinical workflow for nonreal-time applications. Conclusions: Considering the image quality and execution times, this method may be useful for reconstruction of low-dose clinical acquisitions. It may be of particular benefit to patients who undergo multiple acquisitions by reducing the overall imaging radiation dose and associated risks.
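The TV half of OSC-TV can be illustrated in isolation. The sketch below is hypothetical (a 1-D signal, a smoothed TV term, made-up step size and regularization weight); the paper's actual algorithm interleaves ordered-subsets convex updates with TV minimization on CT volumes, which this toy does not attempt:

```python
import math
import random

def tv(x, eps=1e-6):
    """Smoothed total variation of a 1-D signal."""
    return sum(math.sqrt((x[i + 1] - x[i]) ** 2 + eps)
               for i in range(len(x) - 1))

def tv_denoise(y, lam=0.5, step=0.01, iters=2000, eps=1e-6):
    """Gradient descent on 0.5*||x - y||^2 + lam * TV(x)."""
    x = list(y)
    for _ in range(iters):
        g = [xi - yi for xi, yi in zip(x, y)]       # data-fidelity gradient
        for i in range(len(x) - 1):
            d = (x[i + 1] - x[i]) / math.sqrt((x[i + 1] - x[i]) ** 2 + eps)
            g[i] -= lam * d                         # TV gradient, left node
            g[i + 1] += lam * d                     # TV gradient, right node
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

random.seed(0)
clean = [0.0] * 20 + [1.0] * 20                     # piecewise-constant profile
noisy = [v + random.gauss(0, 0.2) for v in clean]
denoised = tv_denoise(noisy)
```

TV regularization suppresses the pixel-to-pixel noise that few-view reconstructions amplify while preserving the sharp edge, which is why it pairs well with low-dose acquisition protocols.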
Augmented Reality-Guided Lumbar Facet Joint Injections.
Agten, Christoph A; Dennler, Cyrill; Rosskopf, Andrea B; Jaberg, Laurenz; Pfirrmann, Christian W A; Farshad, Mazda
2018-05-08
The aim of this study was to assess feasibility and accuracy of augmented reality-guided lumbar facet joint injections. A spine phantom completely embedded in hardened opaque agar with 3 ring markers was built. A 3-dimensional model of the phantom was uploaded to an augmented reality headset (Microsoft HoloLens). Two radiologists independently performed 20 augmented reality-guided and 20 computed tomography (CT)-guided facet joint injections each: for each augmented reality-guided injection, the hologram was manually aligned with the phantom container using the ring markers. The radiologists targeted the virtual facet joint and tried to place the needle tip in the holographic joint space. Computed tomography was performed after each needle placement to document final needle tip position. Time needed from grabbing the needle to final needle placement was measured for each simulated injection. An independent radiologist rated images of all needle placements in a randomized order blinded to modality (augmented reality vs CT) and performer as perfect, acceptable, incorrect, or unsafe. Accuracy and time to place needles were compared between augmented reality-guided and CT-guided facet joint injections. In total, 39/40 (97.5%) of augmented reality-guided needle placements were either perfect or acceptable compared with 40/40 (100%) CT-guided needle placements (P = 0.5). One augmented reality-guided injection missed the facet joint space by 2 mm. No unsafe needle placements occurred. Time to final needle placement was substantially faster with augmented reality guidance (mean 14 ± 6 seconds vs 39 ± 15 seconds, P < 0.001 for both readers). Augmented reality-guided facet joint injections are feasible and accurate without potentially harmful needle placement in an experimental setting.
Knowledge-based tracking algorithm
NASA Astrophysics Data System (ADS)
Corbeil, Allan F.; Hawkins, Linda J.; Gilgallon, Paul F.
1990-10-01
This paper describes the Knowledge-Based Tracking (KBT) algorithm for which a real-time flight test demonstration was recently conducted at Rome Air Development Center (RADC). In KBT processing, the radar signal in each resolution cell is thresholded at a lower than normal setting to detect low RCS targets. This lower threshold produces a larger than normal false alarm rate. Therefore, additional signal processing including spectral filtering, CFAR and knowledge-based acceptance testing are performed to eliminate some of the false alarms. TSC's knowledge-based Track-Before-Detect (TBD) algorithm is then applied to the data from each azimuth sector to detect target tracks. In this algorithm, tentative track templates are formed for each threshold crossing and knowledge-based association rules are applied to the range, Doppler, and azimuth measurements from successive scans. Lastly, an M-association out of N-scan rule is used to declare a detection. This scan-to-scan integration enhances the probability of target detection while maintaining an acceptably low output false alarm rate. For a real-time demonstration of the KBT algorithm, the L-band radar in the Surveillance Laboratory (SL) at RADC was used to illuminate a small Cessna 310 test aircraft. The received radar signal was digitized and processed by an ST-100 Array Processor and VAX computer network in the lab. The ST-100 performed all of the radar signal processing functions, including Moving Target Indicator (MTI) pulse cancelling, FFT Doppler filtering, and CFAR detection. The VAX computers performed the remaining range-Doppler clustering, beamsplitting and TBD processing functions. The KBT algorithm provided a 9.5 dB improvement relative to single scan performance with a nominal real time delay of less than one second between illumination and display.
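The M-association-out-of-N-scan rule is easy to state explicitly. A toy sketch with hypothetical hit sequences (1 meaning a threshold crossing was associated with a tentative track on that scan); the real KBT pipeline applies this only after spectral filtering, CFAR and the association rules have pruned the candidates:

```python
def m_of_n_detect(hits, m, n):
    # Declare a detection when at least m associated threshold crossings
    # occur within any window of n successive scans.
    return any(sum(hits[i:i + n]) >= m
               for i in range(len(hits) - n + 1))

# Hypothetical per-scan association flags for two resolution cells.
noisy_false_alarms = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]   # sporadic crossings
real_target        = [0, 1, 1, 0, 1, 1, 1, 0, 1, 1]   # persistent crossings
```

With a 4-of-5 rule the persistent sequence triggers a detection while the sporadic one does not, which is the scan-to-scan integration trade-off the abstract describes: higher detection probability at a low threshold, with false alarms suppressed by persistence.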
Visualization of spiral and scroll waves in simulated and experimental cardiac tissue
NASA Astrophysics Data System (ADS)
Cherry, E. M.; Fenton, F. H.
2008-12-01
The heart is a nonlinear biological system that can exhibit complex electrical dynamics, complete with period-doubling bifurcations and spiral and scroll waves that can lead to fibrillatory states that compromise the heart's ability to contract and pump blood efficiently. Despite the importance of understanding the range of cardiac dynamics, studying how spiral and scroll waves can initiate, evolve, and be terminated is challenging because of the complicated electrophysiology and anatomy of the heart. Nevertheless, over the last two decades advances in experimental techniques have improved access to experimental data and have made it possible to visualize the electrical state of the heart in more detail than ever before. During the same time, progress in mathematical modeling and computational techniques has facilitated using simulations as a tool for investigating cardiac dynamics. In this paper, we present data from experimental and simulated cardiac tissue and discuss visualization techniques that facilitate understanding of the behavior of electrical spiral and scroll waves in the context of the heart. The paper contains many interactive media, including movies and interactive two- and three-dimensional Java applets.
A Strategic Approach to Network Defense: Framing the Cloud
2011-03-10
accepted network defensive principles, to reduce risks associated with emerging virtualization capabilities and scalability of cloud computing. This expanded ... defensive framework can assist enterprise networking and cloud computing architects to better design more secure systems.
Prosodic alignment in human-computer interaction
NASA Astrophysics Data System (ADS)
Suzuki, N.; Katagiri, Y.
2007-06-01
Androids that replicate humans in form also need to replicate them in behaviour to achieve a high level of believability or lifelikeness. We explore the minimal social cues that can induce in people the human tendency for social acceptance, or ethopoeia, toward artifacts, including androids. It has been observed that people exhibit a strong tendency to adjust to each other, through a number of speech and language features in human-human conversational interactions, to obtain communication efficiency and emotional engagement. We investigate in this paper the phenomena related to prosodic alignment in human-computer interactions, with particular focus on human-computer alignment of speech characteristics. We found that people exhibit unidirectional and spontaneous short-term alignment of loudness and response latency in their speech in response to computer-generated speech. We believe this phenomenon of prosodic alignment provides one of the key components for building social acceptance of androids.
An introduction to computer forensics.
Furneaux, Nick
2006-07-01
This paper provides an introduction to the discipline of Computer Forensics. With computers being involved in an increasing number, and type, of crimes, the trace data left on electronic media can play a vital part in the legal process. To ensure acceptance by the courts, accepted processes and procedures have to be adopted and demonstrated, which are not dissimilar to the issues surrounding traditional forensic investigations. This paper provides a straightforward overview of the three steps involved in the examination of digital media: acquisition of data, investigation of evidence, and reporting and presentation of evidence. Although many of the traditional readers of Medicine, Science and the Law are those involved in the biological aspects of forensics, I believe that both disciplines can learn from each other, with electronic evidence being more readily sought and considered by the legal community and the long, tried and tested scientific methods of the forensic community being shared and adopted by the computer forensic world.
ERIC Educational Resources Information Center
Ishola, Bashiru Abayomi
2017-01-01
Cloud computing has recently emerged as a potential alternative to the traditional on-premise computing that businesses can leverage to achieve operational efficiencies. Consequently, technology managers are often tasked with the responsibilities to analyze the barriers and variables critical to organizational cloud adoption decisions. This…
Religious Studies as a Test-Case For Computer-Assisted Instruction In The Humanities.
ERIC Educational Resources Information Center
Jones, Bruce William
Experiences with computer-assisted instructional (CAI) programs written for religious studies indicate that CAI has contributions to offer the humanities and social sciences. The usefulness of the computer for presentation, drill and review of factual material and its applicability to quantifiable data is well accepted. There now exist…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-24
... Digital Computer-Based Instrumentation and Control Systems.'' This BTP is to be cited as the acceptance criteria for Diversity and Defense-in-Depth in Digital Computer-Based Instrumentation and Control Systems... Evaluation of Diversity and Defense-in-Depth in Digital Computer-Based Instrumentation and Control Systems...
NASA Astrophysics Data System (ADS)
King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.
2015-12-01
The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.
NASA Technical Reports Server (NTRS)
Farhat, Charbel
1998-01-01
In this grant, we have proposed a three-year research effort focused on developing High Performance Computation and Communication (HPCC) methodologies for structural analysis on parallel processors and clusters of workstations, with emphasis on reducing the structural design cycle time. Besides consolidating and further improving the FETI solver technology to address plate and shell structures, we have proposed to tackle the following design related issues: (a) parallel coupling and assembly of independently designed and analyzed three-dimensional substructures with non-matching interfaces, (b) fast and smart parallel re-analysis of a given structure after it has undergone design modifications, (c) parallel evaluation of sensitivity operators (derivatives) for design optimization, and (d) fast parallel analysis of mildly nonlinear structures. While our proposal was accepted, support was provided only for one year.
Dark chocolate acceptability: influence of cocoa origin and processing conditions.
Torres-Moreno, Miriam; Tarrega, Amparo; Costell, Elvira; Blanch, Consol
2012-01-30
Chocolate properties can vary depending on cocoa origin, composition and manufacturing procedure, which affect consumer acceptability. The aim of this work was to study the effect of two cocoa origins (Ghana and Ecuador) and two processing conditions (roasting time and conching time) on dark chocolate acceptability. Overall acceptability and acceptability for different attributes (colour, flavour, odour and texture) were evaluated by 95 consumers. Differences in acceptability among dark chocolates were mainly related to differences in flavour acceptability. The use of a long roasting time lowered chocolate acceptability in Ghanaian samples, while it had no effect on acceptability of Ecuadorian chocolates. This response was observed for most consumers (two subgroups with different dark chocolate consumption frequencies). However, for a third group of consumers identified as distinguishers, the most acceptable dark chocolate samples were those produced with specific combinations of roasting time and conching time for each of the cocoa geographical origins considered. To produce dark chocolates from a single origin it is important to know the target market preferences and to select the appropriate roasting and conching conditions. Copyright © 2011 Society of Chemical Industry.
Students' Acceptance of Tablet PCs in the Classroom
ERIC Educational Resources Information Center
Ifenthaler, Dirk; Schweinbenz, Volker
2016-01-01
In recent years digital technologies, such as tablet personal computers (TPCs), have become an integral part of a school's infrastructure and are seen as a promising way to facilitate students' learning processes. This study empirically tested a theoretical model derived from the technology acceptance model containing key constructs developed in…
NASA Astrophysics Data System (ADS)
Devaraj, Rajesh; Sarkar, Arnab; Biswas, Santosh
2015-11-01
In the article 'Supervisory control for fault-tolerant scheduling of real-time multiprocessor systems with aperiodic tasks', Park and Cho presented a systematic way of computing a largest fault-tolerant and schedulable language that provides information on whether the scheduler (i.e., supervisor) should accept or reject a newly arrived aperiodic task. The computation of such a language depends mainly on the task execution model presented in their paper. However, that task execution model is unable to capture the situation in which a processor fault occurs even before the task has arrived. Consequently, under a task execution model that does not capture this fact, a task may be assigned for execution on a faulty processor. This problem is illustrated with an appropriate example. The task execution model of Park and Cho is then modified to enforce the requirement that no task is assigned for execution on a faulty processor.
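The corrected requirement — never dispatch a task to a processor that has already faulted — can be sketched as a simple admission test. This is an illustrative bookkeeping model, not Park and Cho's automaton-based supervisory-control formulation; `Processor`, `AperiodicTask`, and `admit` are hypothetical names:

```python
from dataclasses import dataclass

@dataclass
class Processor:
    pid: int
    faulted: bool = False

@dataclass
class AperiodicTask:
    name: str
    arrival: int
    wcet: int       # worst-case execution time
    deadline: int   # relative deadline

def admit(task, processors, load):
    """Accept `task` only if some NON-FAULTY processor can finish it by its
    deadline; otherwise reject it (the supervisor's accept/reject decision).
    `load[pid]` is the earliest time processor pid becomes free."""
    candidates = [p for p in processors if not p.faulted]  # the modified model's requirement
    for p in sorted(candidates, key=lambda p: load[p.pid]):
        start = max(task.arrival, load[p.pid])
        if start + task.wcet <= task.arrival + task.deadline:
            load[p.pid] = start + task.wcet
            return p.pid          # accepted: assigned to a healthy processor
    return None                   # rejected

procs = [Processor(0, faulted=True), Processor(1)]
load = {0: 0, 1: 0}
first = admit(AperiodicTask("t1", arrival=0, wcet=3, deadline=5), procs, load)
```

Even though processor 0 is idle, the faulty-processor filter forces the task onto processor 1 or rejects it outright.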
Graphical Interface for the Study of Gas-Phase Reaction Kinetics: Cyclopentene Vapor Pyrolysis
NASA Astrophysics Data System (ADS)
Marcotte, Ronald E.; Wilson, Lenore D.
2001-06-01
The undergraduate laboratory experiment on the pyrolysis of gaseous cyclopentene has been modernized to improve safety, speed, and precision and to better reflect the current practice of physical chemistry. It now utilizes virtual instrument techniques to create a graphical computer interface for the collection and display of experimental data. An electronic pressure gauge has replaced the mercury manometer formerly needed in proximity to the 500 °C pyrolysis oven. Students have much better real-time information available to them and no longer require multiple lab periods to get rate constants and acceptable Arrhenius parameters. The time saved on manual data collection is used to give the students a tour of the computer interfacing hardware and software and a hands-on introduction to gas-phase reagent preparation using a research-grade high-vacuum system. This includes loading the sample, degassing it by the freeze-pump-thaw technique, handling liquid nitrogen and working through the logic necessary for each reconfiguration of the diffusion pump section and the submanifolds.
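The data-analysis step the abstract mentions (extracting rate constants and Arrhenius parameters) rests on the standard linearized fit ln k = ln A - Ea/(RT). A minimal sketch with synthetic rate constants (illustrative values near the ~500 °C range, not the actual cyclopentene data):

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def arrhenius_fit(T, k):
    """Least-squares fit of ln k = ln A - Ea/(R T); returns (A, Ea)."""
    x = 1.0 / np.asarray(T, dtype=float)
    y = np.log(np.asarray(k, dtype=float))
    slope, intercept = np.polyfit(x, y, 1)  # slope = -Ea/R, intercept = ln A
    return np.exp(intercept), -slope * R

# Synthetic data generated from assumed parameters, then recovered by the fit
Ea_true, A_true = 250e3, 1e13                 # J/mol, 1/s (illustrative)
T = np.array([740.0, 760.0, 780.0, 800.0])    # K
k = A_true * np.exp(-Ea_true / (R * T))

A_fit, Ea_fit = arrhenius_fit(T, k)
```

Because the synthetic points lie exactly on the Arrhenius line, the fit recovers the assumed parameters to machine precision; real lab data would scatter about it.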
Auditory presentation and synchronization in Adobe Flash and HTML5/JavaScript Web experiments.
Reimers, Stian; Stewart, Neil
2016-09-01
Substantial recent research has examined the accuracy of presentation durations and response time measurements for visually presented stimuli in Web-based experiments, with a general conclusion that accuracy is acceptable for most kinds of experiments. However, many areas of behavioral research use auditory stimuli instead of, or in addition to, visual stimuli. Much less is known about auditory accuracy using standard Web-based testing procedures. We used a millisecond-accurate Black Box Toolkit to measure the actual durations of auditory stimuli and the synchronization of auditory and visual presentation onsets. We examined the distribution of timings for 100 presentations of auditory and visual stimuli across two computers with different specs, three commonly used browsers, and code written in either Adobe Flash or JavaScript. We also examined different coding options for attempting to synchronize the auditory and visual onsets. Overall, we found that auditory durations were very consistent, but that the lags between visual and auditory onsets varied substantially across browsers and computer systems.
GPU implementation of prior image constrained compressed sensing (PICCS)
NASA Astrophysics Data System (ADS)
Nett, Brian E.; Tang, Jie; Chen, Guang-Hong
2010-04-01
The Prior Image Constrained Compressed Sensing (PICCS) algorithm (Med. Phys. 35, pg. 660, 2008) has been applied to several computed tomography applications with both standard CT systems and flat-panel based systems designed for guiding interventional procedures and radiation therapy treatment delivery. The PICCS algorithm typically utilizes a prior image which is reconstructed via the standard Filtered Backprojection (FBP) reconstruction algorithm. The algorithm then iteratively solves for the image volume that matches the measured data, while simultaneously assuring the image is similar to the prior image. The PICCS algorithm has demonstrated utility in several applications, including improved temporal resolution reconstruction, 4D respiratory-phase-specific reconstructions for radiation therapy, and cardiac reconstruction from data acquired on an interventional C-arm. One disadvantage of the PICCS algorithm, as with other iterative algorithms, is the long computation time typically associated with reconstruction. For an algorithm to gain clinical acceptance, reconstruction must be achievable in minutes rather than hours. In this work the PICCS algorithm has been implemented on the GPU in order to significantly reduce its reconstruction time. The Compute Unified Device Architecture (CUDA) was used in this implementation.
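The PICCS idea — solve for an image consistent with the measured data while staying close to the prior image — can be sketched in a few lines. This toy replaces the algorithm's L1/sparsifying-transform terms with quadratic penalties so plain gradient descent suffices; `A`, `y`, and `x_prior` are a made-up system matrix, measurement vector, and FBP-style prior, not the CT geometry or the GPU implementation:

```python
import numpy as np

def piccs_toy(A, y, x_prior, alpha=0.5, lam=0.1, lr=0.01, iters=500):
    """Toy PICCS-style reconstruction: gradient descent on
        0.5*||A x - y||^2 + lam*(alpha*||x - x_prior||^2 + (1-alpha)*||x||^2).
    The real PICCS uses L1/TV sparsity terms under a data-consistency
    constraint; the quadratic penalties here just keep the sketch smooth."""
    x = x_prior.copy()
    for _ in range(iters):
        grad = A.T @ (A @ x - y) + 2*lam*(alpha*(x - x_prior) + (1-alpha)*x)
        x -= lr * grad
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20)) / np.sqrt(40)   # undersampled "system matrix"
x_true = rng.standard_normal(20)
x_prior = x_true + 0.1*rng.standard_normal(20)    # prior image: FBP-like, slightly noisy
y = A @ x_true                                    # simulated measurements
x_rec = piccs_toy(A, y, x_prior)
```

The GPU speedup in the paper comes from parallelizing exactly this kind of repeated forward/backprojection (`A @ x`, `A.T @ r`) inner loop.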
Computer Description of the Field Artillery Ammunition Supply Vehicle
1983-04-01
Keywords: Combinatorial Geometry (COM-GEOM); GIFT Computer Code; Computer Target Description. The vehicle is represented by a Combinatorial Geometry (COM-GEOM) description. The "Geometric Information for Targets" (GIFT) computer code accepts the COM-GEOM description as input and generates target vulnerability data.
About the Beginnings of Medical Informatics in Europe
Roger France, Francis
2014-01-01
The term “Informatics” was created in 1962 from two words, information and automatic, and covers all techniques, information concepts and applications of computers. Among these application fields, medicine is the one for which we describe some factors of development in Europe since the late sixties. It took some time for this new terminology to gain worldwide acceptance, but today medical informatics is a well-defined discipline that has undergone tremendous development over recent decades. This paper recalls the context and events from the beginning of medical informatics in Europe. PMID:24648614
Wealth and price distribution by diffusive approximation in a repeated prediction market
NASA Astrophysics Data System (ADS)
Bottazzi, Giulio; Giachini, Daniele
2017-04-01
The approximate invariant densities of agents' wealth and price in a repeated prediction market model are derived using the Fokker-Planck equation of the associated continuous-time jump process. We show that the approximation obtained from the evolution of the log-wealth difference can be reliably exploited to compute all the quantities of interest over the whole acceptable parameter space. When the risk aversion of the trader is high enough, we are able to derive an explicit closed-form solution for the price distribution which is asymptotically correct.
Explicit finite-difference simulation of optical integrated devices on massive parallel computers.
Sterkenburgh, T; Michels, R M; Dress, P; Franke, H
1997-02-20
An explicit method for the numerical simulation of optical integrated circuits by means of the finite-difference time-domain (FDTD) method is presented. This method, based on an explicit solution of Maxwell's equations, is well established in microwave technology. Although the simulation areas are small, we verified the behavior of three interesting problems, especially nonparaxial problems, with typical aspects of integrated optical devices. Because numerical losses are within acceptable limits, we suggest the use of the FDTD method to achieve promising quantitative simulation results.
Analyzing Pulse-Code Modulation On A Small Computer
NASA Technical Reports Server (NTRS)
Massey, David E.
1988-01-01
System for analysis of pulse-code modulation (PCM) comprises personal computer, computer program, and peripheral interface adapter on circuit board that plugs into expansion bus of computer. Functions essentially as "snapshot" PCM decommutator, which accepts and stores thousands of frames of PCM data and sifts through them repeatedly, processing them according to routines specified by operator. Enables faster testing and involves less equipment than older testing systems.
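The core "snapshot decommutator" operation — locking onto a sync word in a stored bitstream and slicing out fixed-length frames — can be sketched as follows; the sync word and frame layout here are hypothetical, for illustration only:

```python
def decommutate(bits, sync, frame_len):
    """Scan a bitstream for the sync pattern, then slice consecutive
    fixed-length frames, keeping only frames that retain sync lock.
    Returns each frame's data bits (sync word stripped)."""
    s = "".join(map(str, bits))
    pat = "".join(map(str, sync))
    start = s.find(pat)          # acquire initial sync lock
    if start < 0:
        return []
    frames = []
    i = start
    while i + frame_len <= len(s):
        frame = s[i:i+frame_len]
        if frame.startswith(pat):                      # verify lock per frame
            frames.append([int(b) for b in frame[len(pat):]])
        i += frame_len
    return frames

sync = [1, 0, 1, 1, 0]
data = [0, 1, 1, 0, 0, 1, 0, 1]            # 8 hypothetical data bits per frame
stream = [0, 0, 1] + (sync + data) * 3     # noise prefix, then 3 frames
frames = decommutate(stream, sync, len(sync) + len(data))
```

A real decommutator would also handle bit slips and word boundaries, but the accept-store-sift pattern is the same.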
Novel continuous authentication using biometrics
NASA Astrophysics Data System (ADS)
Dubey, Prakash; Patidar, Rinku; Mishra, Vikas; Norman, Jasmine; Mangayarkarasi, R.
2017-11-01
We explore whether a classifier can consistently verify clients interacting with the computer using the camera and the user's behavior. In this paper we propose a new way of authenticating a user that captures many images of the user at random times and analyzes the user's touch biometric behavior. In this system the touch behavior of a client/user recorded during an enrollment stage is stored in the database, and the mean behavior is checked over equal partitions of time. This touch behavior is used to accept or reject the user, making biometric authentication more accurate. The planned workflow is as follows: the user is asked a single time to allow the system to take a picture before login; the system then takes images of the user automatically, without further permission, and stores them in the database; these images are compared with the user's existing image, and acceptance or rejection depends on that comparison. The user's touch behavior, with the number of touches made in equal intervals of time, continues to be stored. Together, the touch behavior and the images perform authentication of the user automatically.
NASA Technical Reports Server (NTRS)
Jackson, Dan E.
2010-01-01
Time-Tag Generation Script (TTaGS) is an application program, written in the AWK scripting language, for generating commands for aiming one Ku-band antenna and two S-band antennas for communicating with spacecraft. TTaGS saves between 2 and 4 person-hours per every 24 hours by automating the repetitious process of building between 150 and 180 antenna-control commands. TTaGS reads a text database of communication satellite schedules and a text database of satellite rise and set times and cross-references items in the two databases. It then compares the scheduled start and stop with the geometric rise and set to compute the times to execute antenna control commands. While so doing, TTaGS determines whether to generate commands for guidance, navigation, and control computers to tell them which satellites to track. To help prevent Ku-band irradiation of the Earth, TTaGS accepts input from the user about horizon tolerance and accordingly restricts activation and effects deactivation of the transmitter. TTaGS can be modified easily to enable tracking of additional satellites and for such other tasks as reading Sun-rise/set tables to generate commands to point the solar photovoltaic arrays of the International Space Station at the Sun.
Real time flight simulation methodology
NASA Technical Reports Server (NTRS)
Parrish, E. A.; Cook, G.; Mcvey, E. S.
1976-01-01
An example sensitivity study is presented to demonstrate how a digital autopilot designer could make a decision on minimum sampling rate for computer specification. It consists of comparing the simulated step response of an existing analog autopilot and its associated aircraft dynamics to the digital version operating at various sampling frequencies, and specifying a sampling frequency that results in an acceptable change in relative stability. In general, the zero-order hold introduces phase lag, which will increase overshoot and settling time. It should be noted that this solution is for substituting a digital autopilot for a continuous autopilot. A complete redesign could yield results that more closely resemble the continuous results or that conform better to original design goals.
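The zero-order-hold phase lag behind this sampling-rate decision can be quantified: at frequency ω (rad/s), a ZOH with sampling period T contributes roughly ωT/2 radians of extra lag. A small sketch that turns a lag budget at the loop crossover into a minimum sampling rate (the numbers are illustrative, not the autopilot in the study):

```python
import math

def zoh_phase_lag_deg(omega, fs):
    """Phase lag (degrees) added by a zero-order hold at frequency omega
    (rad/s) for sampling frequency fs (Hz): approximately omega*T/2, T=1/fs."""
    return math.degrees(omega / (2.0 * fs))

def min_sampling_rate(omega_c, max_lag_deg):
    """Smallest fs (Hz) whose ZOH lag at crossover omega_c stays in budget."""
    return omega_c / (2.0 * math.radians(max_lag_deg))

# Example: 10 rad/s crossover, allow 5 degrees of added lag
fs = min_sampling_rate(10.0, 5.0)   # about 57 Hz
```

This mirrors the study's procedure: rather than fixing the lag budget analytically, the designer sweeps fs and watches the simulated overshoot and settling time degrade.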
Computer laboratory in medical education for medical students.
Hercigonja-Szekeres, Mira; Marinović, Darko; Kern, Josipa
2009-01-01
Five generations of second-year students at the Zagreb University School of Medicine were interviewed through an anonymous questionnaire on their use of personal computers, the Internet, computer laboratories and computer-assisted education in general. Results show an advance in students' usage of information and communication technology during the period from 1998/99 to 2002/03. However, their positive opinion of the computer laboratory depends on the installed capacities: the better the computer laboratory technology, the better the students' acceptance and use of it.
Maintaining Privacy in Pervasive Computing - Enabling Acceptance of Sensor-based Services
NASA Astrophysics Data System (ADS)
Soppera, A.; Burbridge, T.
During the 1980s, Mark Weiser [1] predicted a world in which computing was so pervasive that devices embedded in the environment could sense their relationship to us and to each other. These tiny ubiquitous devices would continually feed information from the physical world into the information world. Twenty years ago, this vision was the exclusive territory of academic computer scientists and science fiction writers. Today this subject has become of interest to business, government, and society. Governmental authorities exercise their power through the networked environment. Credit card databases maintain our credit history and decide whether we are allowed to rent a house or obtain a loan. Mobile telephones can locate us in real time so that we do not miss calls. Within another 10 years, all sorts of devices will be connected through the network. Our fridge, our food, together with our health information, may all be networked for the purpose of maintaining diet and well-being. The Internet will move from being an infrastructure to connect computers, to being an infrastructure to connect everything [2, 3].
V/STOLAND avionics system flight-test data on a UH-1H helicopter
NASA Technical Reports Server (NTRS)
Baker, F. A.; Jaynes, D. N.; Corliss, L. D.; Liden, S.; Merrick, R. B.; Dugan, D. C.
1980-01-01
The flight-acceptance test results obtained during the 1977 acceptance tests of the V/STOLAND (versatile simplex digital avionics) system on a Bell UH-1H helicopter at Ames Research Center are presented. The system provides navigation, guidance, control, and display functions for NASA terminal-area VTOL research programs and for Army handling-qualities research programs at Ames Research Center. The acceptance test verified system performance and contractual acceptability. The V/STOLAND hardware and the navigation, guidance, and control laws resident in the digital computers are described. Typical flight-test data are shown and discussed as documentation of the system performance at acceptance from the contractor.
NASA Astrophysics Data System (ADS)
Takemiya, Tetsushi
In modern aerospace engineering, the physics-based computational design method is becoming more important, as it is more efficient than experiments and because it is more suitable in designing new types of aircraft (e.g., unmanned aerial vehicles or supersonic business jets) than the conventional design method, which heavily relies on historical data. To enhance the reliability of the physics-based computational design method, researchers have made tremendous efforts to improve the fidelity of models. However, high-fidelity models require longer computational time, so the advantage of efficiency is partially lost. This problem has been overcome with the development of variable fidelity optimization (VFO). In VFO, different fidelity models are simultaneously employed in order to improve the speed and the accuracy of convergence in an optimization process. Among the various types of VFO methods, one of the most promising methods is the approximation management framework (AMF). In the AMF, objective and constraint functions of a low-fidelity model are scaled at a design point so that the scaled functions, which are referred to as "surrogate functions," match those of a high-fidelity model. Since scaling functions and the low-fidelity model constitutes surrogate functions, evaluating the surrogate functions is faster than evaluating the high-fidelity model. Therefore, in the optimization process, in which gradient-based optimization is implemented and thus many function calls are required, the surrogate functions are used instead of the high-fidelity model to obtain a new design point. The best feature of the AMF is that it may converge to a local optimum of the high-fidelity model in much less computational time than the high-fidelity model. 
However, through literature surveys and implementations of the AMF, the author found that (1) the AMF is very vulnerable when the computational analysis models have numerical noise, which is very common in high-fidelity models, and that (2) the AMF terminates optimization erroneously when the optimization problems have constraints. The first problem is due to inaccuracy in computing derivatives in the AMF, and the second problem is due to erroneous treatment of the trust region ratio, which sets the size of the domain for an optimization in the AMF. In order to solve the first problem of the AMF, the automatic differentiation (AD) technique, which reads the codes of analysis models and automatically generates new derivative codes based on mathematical rules, is applied. If derivatives are computed with the generated derivative code, they are analytical, and the required computational time is independent of the number of design variables, which is very advantageous for realistic aerospace engineering problems. However, if analysis models implement iterative computations such as computational fluid dynamics (CFD), which solves systems of partial differential equations iteratively, computing derivatives through AD requires a massive memory size. The author solved this deficiency by modifying the AD approach and developing a more efficient implementation with CFD, and successfully applied AD to general CFD software. In order to solve the second problem of the AMF, the governing equation of the trust region ratio, which is very strict against the violation of constraints, is modified so that it can accept constraint violations within some tolerance. By accepting violations of constraints during the optimization process, the AMF can continue optimization without terminating prematurely and eventually find the true optimum design point. 
With these modifications, the AMF is referred to as "Robust AMF," and it is applied to airfoil and wing aerodynamic design problems using Euler CFD software. The former problem has 21 design variables, and the latter 64. In both problems, derivatives computed with the proposed AD method are first compared with those computed with the finite differentiation (FD) method, and then, the Robust AMF is implemented along with the sequential quadratic programming (SQP) optimization method with only high-fidelity models. The proposed AD method computes derivatives more accurately and faster than the FD method, and the Robust AMF successfully optimizes shapes of the airfoil and the wing in a much shorter time than SQP with only high-fidelity models. These results clearly show the effectiveness of the Robust AMF. Finally, the feasibility of reducing computational time for calculating derivatives and the necessity of AMF with an optimum design point always in the feasible region are discussed as future work.
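The relaxed trust-region acceptance behind the Robust AMF can be sketched in a few lines. The thresholds and function names below are illustrative defaults, not the thesis's actual update rule:

```python
def accept_step(rho, violation, rho_min=0.1, viol_tol=1e-3):
    """Step acceptance in the spirit of the Robust AMF: accept when the
    trust-region ratio rho (actual improvement / predicted improvement) is
    good enough AND the constraint violation stays within a tolerance,
    instead of rejecting any violated step outright and terminating."""
    return rho >= rho_min and violation <= viol_tol

def update_radius(radius, rho, shrink=0.5, grow=2.0, rho_good=0.75):
    """Standard trust-region radius update: shrink on poor model agreement,
    grow on very good agreement, otherwise keep the radius."""
    if rho < 0.1:
        return shrink * radius
    if rho > rho_good:
        return grow * radius
    return radius

assert accept_step(rho=0.8, violation=5e-4)        # good step, tiny violation
assert not accept_step(rho=0.8, violation=1e-2)    # violation beyond tolerance
```

The key difference from a strict feasibility rule is the `viol_tol` slack: intermediate iterates may be slightly infeasible, so the optimization keeps moving toward a feasible optimum instead of halting.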
ERIC Educational Resources Information Center
Bhatiasevi, Veera; Naglis, Michael
2016-01-01
This research is one of the first few to investigate the adoption and usage of cloud computing in higher education in the context of developing countries, in this case Thailand. It proposes extending the technology acceptance model to integrate subjective norm, perceived convenience, trust, computer self-efficacy, and software functionality in…
ERIC Educational Resources Information Center
Brandell, Gerd; Carlsson, Svante; Eklbom, Hakan; Nord, Ann-Charlotte
1997-01-01
Describes the process of starting a new program in computer science and engineering that is heavily based on applied mathematics and only open to women. Emphasizes that success requires considerable interest in mathematics and curiosity about computer science among female students at the secondary level and the acceptance of the single-sex program…
ERIC Educational Resources Information Center
Larbi-Apau, Josephine; Oti-Boadi, Mabel; Tetteh, Albert
2018-01-01
Both computer attitude and eLearning self-efficacy are critical complementary factors in determining confidence levels and behavioral belief systems, and can directly affect students' actions, performances and achievements. This study applied a multidimensional construct in validating computer attitude and eLearning self-efficacy of Psychology…
Real-time inspection by submarine images
NASA Astrophysics Data System (ADS)
Tascini, Guido; Zingaretti, Primo; Conte, Giuseppe
1996-10-01
A real-time application of computer vision concerning tracking and inspection of a submarine pipeline is described. The objective is to develop automatic procedures for supporting human operators in the real-time analysis of images acquired by means of cameras mounted on underwater remotely operated vehicles (ROVs). Implementation of such procedures gives rise to a human-machine system for underwater pipeline inspection that can automatically detect and signal the presence of the pipe, of its structural or accessory elements, and of dangerous or alien objects in its neighborhood. The possibility of modifying the image acquisition rate in the simulations performed on video-recorded images is used to prove that the system performs all necessary processing with acceptable robustness, working in real time up to a speed of about 2.5 kn, well above what actual ROVs and their safety constraints allow.
ERIC Educational Resources Information Center
Fernandez, Anne, Ed.; Sproats, Lee, Ed.; Sorensen, Stacey, Ed.
2000-01-01
The science community has been trying to use computers in teaching for many years. There has been much conformity in how this was to be achieved, and the wheel has been re-invented again and again as enthusiast after enthusiast has "done their bit" towards getting computers accepted. Computers are now used by science undergraduates (as well as…
Mora, Emanuel C.; Macías, Silvio; Hechavarría, Julio; Vater, Marianne; Kössl, Manfred
2013-01-01
Echolocating bats use the time elapsed from biosonar pulse emission to the arrival of the echo (defined as echo-delay) to assess target-distance. Target-distance is represented in the brain by delay-tuned neurons that are classified as either “heteroharmonic” or “homoharmonic.” Heteroharmonic neurons respond more strongly to pulse-echo pairs in which the timing of the pulse is given by the fundamental biosonar harmonic while the timing of echoes is provided by one (or several) of the higher order harmonics. On the other hand, homoharmonic neurons are tuned to the echo delay between similar harmonics in the emitted pulse and echo. It is generally accepted that heteroharmonic computations are advantageous over homoharmonic computations; i.e., heteroharmonic neurons receive information from call and echo in different frequency-bands, which helps to avoid jamming between pulse and echo signals. Heteroharmonic neurons have been found in two species of the family Mormoopidae (Pteronotus parnellii and Pteronotus quadridens) and in Rhinolophus rouxi. Recently, it was proposed that heteroharmonic target-range computations are a primitive feature of the genus Pteronotus that was preserved in the evolution of the genus. Here, we review recent findings on the evolution of echolocation in Mormoopidae, and try to link those findings to the evolution of the heteroharmonic computation strategy (HtHCS). We stress the hypothesis that the ability to perform heteroharmonic computations evolved separately from the ability to use long constant-frequency echolocation calls, high duty cycle echolocation, and Doppler Shift Compensation. Also, we present the idea that heteroharmonic computations might have been of advantage for categorizing prey size, hunting eared insects, and living in large conspecific colonies. We make five testable predictions that might help future investigations to clarify the evolution of heteroharmonic echolocation in Mormoopidae and other families. PMID:23781209
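The echo-delay-to-distance mapping that delay-tuned neurons encode is simple geometry: the pulse travels out to the target and the echo travels back, so d = c·τ/2. A minimal sketch (speed of sound in air assumed to be about 343 m/s; the bat's actual medium and temperature shift this slightly):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C (assumed)

def target_distance(echo_delay_s):
    """Target distance from echo delay: sound travels out AND back,
    hence the factor of two: d = c * tau / 2."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

def echo_delay(distance_m):
    """Inverse mapping: the delay a delay-tuned neuron matched to a
    target at `distance_m` would respond to."""
    return 2.0 * distance_m / SPEED_OF_SOUND

# A 5 ms echo delay corresponds to a target under a metre away
d = target_distance(0.005)
```

Heteroharmonic versus homoharmonic tuning concerns *which* harmonics supply the pulse and echo timing marks; the delay-to-distance arithmetic itself is the same in both cases.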
H(2)- and H(infinity)-design tools for linear time-invariant systems
NASA Technical Reports Server (NTRS)
Ly, Uy-Loi
1989-01-01
Recent advances in optimal control have brought design techniques based on optimization of H(2) and H(infinity) norm criteria closer to being attractive alternatives to single-loop design methods for linear time-invariant systems. Significant steps forward in this technology are a deeper understanding of the performance and robustness issues of these design procedures and means to perform design trade-offs. However, acceptance of the technology is hindered by the lack of convenient design tools to exercise these powerful multivariable techniques while still allowing single-loop design formulation. Presented is a unique computer tool for designing arbitrary low-order linear time-invariant controllers that encompasses both performance and robustness issues via the familiar H(2) and H(infinity) norm optimization. Application to disturbance rejection design for a commercial transport is demonstrated.
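As a minimal sketch of the H(2) norm computation that such design tools optimize: for a stable LTI system, the H(2) norm follows from the controllability Gramian, obtained by solving a Lyapunov equation. The system matrices below are illustrative assumptions, not taken from the paper's tool.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative stable LTI system dx/dt = A x + B w, z = C x
# (assumed example matrices, not from the paper).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

# Controllability Gramian P solves the Lyapunov equation A P + P A^T = -B B^T
P = solve_continuous_lyapunov(A, -B @ B.T)

# H2 norm of the transfer function from w to z: sqrt(trace(C P C^T))
h2_norm = float(np.sqrt(np.trace(C @ P @ C.T)))
print(h2_norm)  # about 0.2887 for these matrices
```

The same Gramian machinery underlies H(2)-optimal synthesis; H(infinity) norms instead require an iterative bound search.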
ERIC Educational Resources Information Center
Yeom, Soonja; Choi-Lundberg, Derek L.; Fluck, Andrew Edward; Sale, Arthur
2017-01-01
Purpose: This study aims to evaluate factors influencing undergraduate students' acceptance of a computer-aided learning resource using the Phantom Omni haptic stylus to enable rotation, touch and kinaesthetic feedback and display of names of three-dimensional (3D) human anatomical structures on a visual display. Design/methodology/approach: The…
ERIC Educational Resources Information Center
Agbatogun, Alaba Olaoluwakotansibe
2014-01-01
This study examined the predictive power of teachers' perceived usefulness (PU), perceived ease of use (PEU), behavioural intention (BI) to use personal response system (PRS) and computer experience (CE) on teachers' acceptance and attitude towards using PRS in improving communicative competence in the classroom where English is taught as a second…
Learning Computerese: The Role of Second Language Learning Aptitude in Technology Acceptance
ERIC Educational Resources Information Center
Warner, Janis A.; Koufteros, Xenophon; Verghese, Anto
2014-01-01
This article introduces a new construct coined as Computer User Learning Aptitude (CULA). To establish construct validity, CULA is embedded in a nomological network that extends the technology acceptance model (TAM). Specifically, CULA is posited to affect perceived usefulness and perceived ease of use, the two underlying TAM constructs.…
A Causal Model of Teacher Acceptance of Technology
ERIC Educational Resources Information Center
Chang, Jui-Ling; Lieu, Pang-Tien; Liang, Jung-Hui; Liu, Hsiang-Te; Wong, Seng-lee
2012-01-01
This study proposes a causal model for investigating teacher acceptance of technology. We received 258 effective replies from teachers at public and private universities in Taiwan. A questionnaire survey was utilized to test the proposed model, and LISREL was applied to test the proposed hypotheses. The result shows that computer self-efficacy has…
Williams, Marie A; Soiza, Roy L; Jenkinson, Alison McE; Stewart, Alison
2010-09-13
Falls management programmes have been instituted to attempt to reduce falls. This pilot study was undertaken to determine whether the Nintendo® WiiFit was a feasible and acceptable intervention in community-dwelling older fallers. Community-dwelling fallers over 70 years were recruited and attended for computer-based exercises (n = 15) or standard care (n = 6). Balance and fear of falling were assessed at weeks 0, 4 and 12. Participants were interviewed on completion of the study to determine whether the intervention was acceptable. Eighty percent of participants attended 75% or more of the exercise sessions. An improvement in Berg Score was seen at four weeks (p = 0.02) and in Wii Age at 12 weeks (p = 0.03) in the intervention group. There was no improvement in balance scores in the standard care group. WiiFit exercise is acceptable in self-referred older people with a history of falls. The WiiFit has the potential to improve balance but further work is required. ClinicalTrials.gov - NCT01082042.
Eye Tracking Based Control System for Natural Human-Computer Interaction
Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan
2017-01-01
Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. To improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (article searching and multimedia web browsing) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design. PMID:29403528
Extravehicular mobility unit thermal simulator
NASA Technical Reports Server (NTRS)
Hixon, C. W.; Phillips, M. A.
1973-01-01
The analytical methods, thermal model, and user's instructions for the SIM bay extravehicular mobility unit (EMU) routine are presented. This digital computer program was developed for detailed thermal performance predictions of the crewman performing a command module extravehicular activity during transearth coast. It accounts for conductive, convective, and radiative heat transfer as well as fluid flow and associated flow control components. The program is a derivative of the Apollo lunar surface EMU digital simulator. It has the operational flexibility to accept card or magnetic tape for both the input data and program logic. Output can be tabular and/or plotted and the mission simulation can be stopped and restarted at the discretion of the user. The program was developed for the NASA-JSC Univac 1108 computer system and several of the capabilities represent utilization of unique features of that system. Analytical methods used in the computer routine are based on finite difference approximations to differential heat and mass balance equations which account for temperature or time dependent thermo-physical properties.
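The finite-difference heat-balance approach mentioned above can be sketched generically. The explicit 1D conduction step below illustrates the method with assumed material values; it is unrelated to the actual EMU thermal model.

```python
import numpy as np

# Generic explicit finite-difference step for 1D heat conduction,
#   dT/dt = alpha * d2T/dx2,
# illustrating the finite-difference heat-balance approach.
alpha = 1e-4            # thermal diffusivity (m^2/s), assumed value
dx, dt = 0.01, 0.1      # grid spacing (m) and time step (s)
r = alpha * dt / dx**2  # must be <= 0.5 for stability of the explicit scheme

T = np.full(50, 20.0)        # initial temperature profile (deg C)
T[0], T[-1] = 100.0, 20.0    # fixed boundary temperatures

for _ in range(1000):
    # Interior nodes updated from the discrete heat balance
    T[1:-1] = T[1:-1] + r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
```

Time- or temperature-dependent properties, as in the EMU routine, would simply make `alpha` a function of the local state at each step.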
Robust adaptive kinematic control of redundant robots
NASA Technical Reports Server (NTRS)
Tarokh, M.; Zuck, D. D.
1992-01-01
The paper presents a general method for the resolution of redundancy that combines the Jacobian pseudoinverse and augmentation approaches. A direct adaptive control scheme is developed to generate joint angle trajectories for achieving desired end-effector motion as well as additional user defined tasks. The scheme ensures arbitrarily small errors between the desired and the actual motion of the manipulator. Explicit bounds on the errors are established that are directly related to the mismatch between actual and estimated pseudoinverse Jacobian matrix, motion velocity and the controller gain. It is shown that the scheme is tolerant of the mismatch and consequently only infrequent pseudoinverse computations are needed during a typical robot motion. As a result, the scheme is computationally fast, and can be implemented for real-time control of redundant robots. A method is incorporated to cope with the robot singularities allowing the manipulator to get very close or even pass through a singularity while maintaining a good tracking performance and acceptable joint velocities. Computer simulations and experimental results are provided in support of the theoretical developments.
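The pseudoinverse-plus-null-space step at the heart of such redundancy resolution can be sketched as follows; the 3-link planar arm, task vectors, and joint angles are illustrative assumptions, not the authors' adaptive controller.

```python
import numpy as np

# Minimal sketch of pseudoinverse-based redundancy resolution for a
# 3-link planar arm (unit link lengths) tracking a 2D end-effector velocity.
q = np.array([0.1, 0.4, 0.3])  # joint angles (rad), assumed

def jacobian(q):
    s = np.cumsum(q)  # absolute link angles
    # Planar position Jacobian (2 x 3) for unit link lengths
    return np.array([
        [-np.sin(s[0]) - np.sin(s[1]) - np.sin(s[2]),
         -np.sin(s[1]) - np.sin(s[2]), -np.sin(s[2])],
        [ np.cos(s[0]) + np.cos(s[1]) + np.cos(s[2]),
          np.cos(s[1]) + np.cos(s[2]),  np.cos(s[2])],
    ])

J = jacobian(q)
J_pinv = np.linalg.pinv(J)

x_dot = np.array([0.2, -0.1])      # desired end-effector velocity
z_dot = np.array([0.0, 0.0, 0.5])  # secondary, user-defined joint motion

# Primary task via the pseudoinverse; the secondary task is projected into
# the null space of J so it cannot disturb the end-effector motion.
N = np.eye(3) - J_pinv @ J
q_dot = J_pinv @ x_dot + N @ z_dot
```

In the adaptive scheme, an estimated `J_pinv` would be reused over many steps and corrected only infrequently, which is what makes real-time operation feasible.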
NASA Technical Reports Server (NTRS)
Lind, Richard C. (Inventor); Brenner, Martin J.
2001-01-01
A structured singular value (mu) analysis method of computing flutter margins assesses the robust stability of a linear aeroelastic model with uncertainty operators (Delta). Flight data are used to update the uncertainty operators to accurately account for errors in the computed model and for the observed range of dynamics of the aircraft under test caused by time-varying aircraft parameters, nonlinearities, and flight anomalies such as test nonrepeatability. By introducing mu as a flutter margin parameter, this approach computes predicted flutter margins that are worst case with respect to the modeling uncertainty. These margins are used to determine when the aircraft is approaching a flutter condition and to define an expanded safe flight envelope that can be accepted with more confidence than envelopes from traditional methods, which do not update the analysis with flight data. The mu parameter also presents several advantages over tracking damping trends as a measure of a tendency toward instability.
NASA Astrophysics Data System (ADS)
Pua, Rizza; Park, Miran; Wi, Sunhee; Cho, Seungryong
2016-12-01
We propose a hybrid metal artifact reduction (MAR) approach for computed tomography (CT) that is computationally more efficient than a fully iterative reconstruction method, but at the same time achieves superior image quality to the interpolation-based in-painting techniques. Our proposed MAR method, an image-based artifact subtraction approach, utilizes an intermediate prior image reconstructed via PDART to recover the background information underlying the high density objects. For comparison, prior images generated by total-variation minimization (TVM) algorithm, as a realization of fully iterative approach, were also utilized as intermediate images. From the simulation and real experimental results, it has been shown that PDART drastically accelerates the reconstruction to an acceptable quality of prior images. Incorporating PDART-reconstructed prior images in the proposed MAR scheme achieved higher quality images than those by a conventional in-painting method. Furthermore, the results were comparable to the fully iterative MAR that uses high-quality TVM prior images.
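The interpolation-based in-painting baseline that the hybrid method is compared against can be illustrated on a single projection; the clean curve and metal trace below are synthetic assumptions, not the PDART method itself.

```python
import numpy as np

# Toy in-painting of one sinogram projection: detector bins corrupted by a
# metal trace are replaced by linear interpolation from the trace boundaries.
proj = np.sin(np.linspace(0, np.pi, 64))  # assumed clean projection
metal = np.zeros(64, dtype=bool)
metal[28:36] = True                       # detector bins shadowed by metal
corrupted = proj.copy()
corrupted[metal] = 5.0                    # metal drives these values far too high

idx = np.arange(64)
inpainted = corrupted.copy()
inpainted[metal] = np.interp(idx[metal], idx[~metal], corrupted[~metal])
```

The residual error left in the interpolated bins is exactly the background information that a prior image (here, PDART-reconstructed) is meant to restore.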
Coster, Wendy J.; Haley, Stephen M.; Ni, Pengsheng; Dumas, Helene M.; Fragala-Pinkham, Maria A.
2009-01-01
Objective To examine score agreement, validity, precision, and response burden of a prototype computer adaptive testing (CAT) version of the Self-Care and Social Function scales of the Pediatric Evaluation of Disability Inventory (PEDI) compared to the full-length version of these scales. Design Computer simulation analysis of cross-sectional and longitudinal retrospective data; cross-sectional prospective study. Settings Pediatric rehabilitation hospital, including inpatient acute rehabilitation, day school program, outpatient clinics; community-based day care, preschool, and children’s homes. Participants Four hundred sixty-nine children with disabilities and 412 children with no disabilities (analytic sample); 38 children with disabilities and 35 children without disabilities (cross-validation sample). Interventions Not applicable. Main Outcome Measures Summary scores from prototype CAT applications of each scale using 15-, 10-, and 5-item stopping rules; scores from the full-length Self-Care and Social Function scales; time (in seconds) to complete assessments and respondent ratings of burden. Results Scores from both computer simulations and field administration of the prototype CATs were highly consistent with scores from full-length administration (all r’s between .94 and .99). Using computer simulation of retrospective data, discriminant validity and sensitivity to change of the CATs closely approximated that of the full-length scales, especially when the 15- and 10-item stopping rules were applied. In the cross-validation study the time to administer both CATs was 4 minutes, compared to over 16 minutes to complete the full-length scales. Conclusions Self-care and Social Function score estimates from CAT administration are highly comparable to those obtained from full-length scale administration, with small losses in validity and precision and substantial decreases in administration time. PMID:18373991
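The fixed item-count stopping rules can be illustrated with a toy adaptive-testing loop; everything below (item bank, responses, ability update) is a simulated stand-in under a one-parameter Rasch model, not the PEDI-CAT algorithm or its data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy computer-adaptive testing (CAT) loop with a fixed item-count
# stopping rule, analogous to the 15-, 10-, and 5-item rules above.
bank = rng.uniform(-3.0, 3.0, 200)  # simulated item difficulties
true_theta = 1.0                    # simulated ability of one respondent

def administer(stop_after):
    theta, used = 0.0, set()
    for _ in range(stop_after):
        # Select the unused item whose difficulty is closest to the estimate
        i = min(set(range(len(bank))) - used, key=lambda j: abs(bank[j] - theta))
        used.add(i)
        p = 1.0 / (1.0 + np.exp(-(true_theta - bank[i])))  # Rasch probability
        correct = rng.random() < p
        # Crude shrinking-step update (a stand-in for a full ML/EAP update)
        theta += (1.0 if correct else -1.0) * 2.0 / (1 + len(used))
    return theta

theta_15, theta_5 = administer(15), administer(5)
```

Stopping earlier trades precision for time, which is the trade-off the 15-, 10-, and 5-item comparisons quantify.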
Radley, S C; Jones, G L; Tanguy, E A; Stevens, V G; Nelson, C; Mathers, N J
2006-02-01
To develop and evaluate a Web-based, electronic pelvic floor symptoms assessment questionnaire (e-PAQ) for women. A cross-sectional study in primary and secondary care. Two general practices, two community health clinics and a secondary care urogynaecology clinic. A total of 432 women (204 in primary care and 228 in secondary care) were recruited between June 2003 and January 2004. The e-PAQ was located on a workstation (computer, touchscreen and printer). Women completed the e-PAQ prior to their appointment. Untreated women in primary care were asked to return seven days later to complete the e-PAQ a second time (test-retest). Factor analysis, reliability, validity, patient satisfaction, completion times and system costs. In secondary care, factor analysis identified 14 domains within the four dimensions (urinary, bowel, vaginal and sexual symptoms), with internal consistency (Cronbach's alpha) ≥ 0.7 in 11 of these. In primary care, alpha values were all ≥ 0.7 and test-retest analysis found acceptable intraclass correlations of 0.50-0.95 (P < 0.001) for all domains. A measure of face validity and utility was gained using a nine-item questionnaire, which yielded strongly positive patient views on relevance and acceptability. The e-PAQ offers a user-friendly clinical tool, which provides valid and reliable data. The system offers comprehensive symptoms and quality of life evaluation and may enhance the clinical episode as well as the quality of care for women with pelvic floor disorders.
Advanced Map For Real-Time Process Control
NASA Astrophysics Data System (ADS)
Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto
1987-10-01
MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicasting communication voluntarily and periodically in the priority order of data to be exchanged.
Wan, Xia; Raymond, H Fisher; Wen, Tiancai; Ding, Ding; Wang, Qian; Shin, Sanghyuk S; Yang, Gonghuan; Chai, Wanxing; Zhang, Peng; Novotny, Thomas E
2013-06-26
Handheld computers for data collection (HCDC) and management have become increasingly common in health research. However, current knowledge about the use of HCDC in health research in China is very limited. In this study, we administered a survey to a hard-to-reach population in China using HCDC and assessed the acceptability and adoption of HCDC in China. Handheld computers operating Windows Mobile and Questionnaire Development Studio (QDS) software (Nova Research Company) were used for this survey. Questions on tobacco use and susceptibility were drawn from the Global Adult Tobacco Survey (GATS) and other validated instruments, and these were programmed in Chinese characters by local staff. We conducted a half-day training session for survey supervisors and a three-day training session for 20 interviewers and 9 supervisors. After the training, all trainees completed a self-assessment of their skill level using HCDC. The main study was implemented in fall 2010 in 10 sites, with data managed centrally in Beijing. Study interviewers completed a post-survey evaluation questionnaire on the acceptability and utility of HCDC in survey research. Twenty-nine trainees completed post-training surveys, and 20 interviewers completed post-data collection questionnaires. After training, more than 90% felt confident about their ability to collect survey data using HCDC, to transfer study data from a handheld computer to a laptop, and to encrypt the survey data file. After data collection, 80% of the interviewers thought data collection and management were easy and 60% of staff felt confident they could solve problems they might encounter. Overall, after data collection, nearly 70% of interviewers reported that they would prefer to use handheld computers for future surveys. More than half (55%) felt the HCDC was a particularly useful data collection tool for studies conducted in China. We successfully conducted a health-related survey using HCDC. 
Using handheld computers for data collection was a feasible and acceptable method, and was preferred by Chinese interviewers. Despite minor technical issues that occurred during data collection, HCDC is a promising methodology to be used in survey-based research in China. PMID:23802988
Man, V; Polzer, S; Gasser, T C; Novotny, T; Bursa, J
2018-03-01
Biomechanics-based assessment of Abdominal Aortic Aneurysm (AAA) rupture risk has gained considerable scientific and clinical momentum. However, computation of peak wall stress (PWS) using state-of-the-art finite element models is time demanding. This study investigates which features of the constitutive description of AAA wall are decisive for achieving acceptable stress predictions in it. Influence of five different isotropic constitutive descriptions of AAA wall is tested; models reflect realistic non-linear, artificially stiff non-linear, or artificially stiff pseudo-linear constitutive descriptions of AAA wall. Influence of the AAA wall model is tested on idealized (n=4) and patient-specific (n=16) AAA geometries. Wall stress computations consider a (hypothetical) load-free configuration and include residual stresses homogenizing the stresses across the wall. Wall stress differences amongst the different descriptions were statistically analyzed. When the qualitatively similar non-linear response of the AAA wall with low initial stiffness and subsequent strain stiffening was taken into consideration, wall stress (and PWS) predictions did not change significantly. Keeping this non-linear feature when using an artificially stiff wall can save up to 30% of the computational time, without significant change in PWS. In contrast, a stiff pseudo-linear elastic model may underestimate the PWS and is not reliable for AAA wall stress computations. Copyright © 2018 IPEM. Published by Elsevier Ltd. All rights reserved.
Chen, Hong-Ren; Tseng, Hsiao-Fen
2012-08-01
Web-based e-learning is not restricted by time or place and can provide teachers with a learning environment that is flexible and convenient, enabling them to efficiently learn, quickly develop their professional expertise, and advance professionally. Many research reports on web-based e-learning have neglected the role of the teacher's perspective in the acceptance of using web-based e-learning systems for in-service education. We distributed questionnaires to 402 junior high school teachers in central Taiwan. This study used the Technology Acceptance Model (TAM) as our theoretical foundation and employed the Structure Equation Model (SEM) to examine factors that influenced intentions to use in-service training conducted through web-based e-learning. The results showed that motivation to use and Internet self-efficacy were significantly positively associated with behavioral intentions regarding the use of web-based e-learning for in-service training through the factors of perceived usefulness and perceived ease of use. The factor of computer anxiety had a significantly negative effect on behavioral intentions toward web-based e-learning in-service training through the factor of perceived ease of use. Perceived usefulness and motivation to use were the primary reasons for the acceptance by junior high school teachers of web-based e-learning systems for in-service training. Copyright © 2011 Elsevier Ltd. All rights reserved.
Espejo-Trung, Luciana Cardoso; Elian, Silvia Nagib; Luz, Maria Aparecia Alves De Cerqueira
2015-11-01
Learning objects (LOs) associated with augmented reality have been used as attractive new technologic tools in the educational process. However, the acceptance of new LOs must be verified with the purpose of using these innovations in the learning process in general. The aim of this study was to develop a new LO and investigate the acceptance of gold onlay in teaching preparation design at a dental school in Brazil. Questionnaires were designed to assess, first, the users' computational ability and knowledge of computers (Q1) and, second, the users' acceptance of the new LO (Q2). For both questionnaires, the internal consistency index was calculated to determine whether the questions were measuring the same construct. The reliability of Q2 was measured with a retest procedure. The LO was tested by dental students (n=28), professors and postgraduate students in dentistry and prosthetics (n=30), and dentists participating in a continuing education or remedial course in dentistry and/or prosthetics (n=19). Analyses of internal consistency (Kappa coefficient and Cronbach's alpha) demonstrated a high degree of confidence in the questionnaires. Tests for simple linear regressions were conducted between the response variable (Q2) and the following explanative variables: the Q1 score, age, gender, and group. The results showed wide acceptance regardless of the subjects' computational ability (p=0.99; R2=0), gender (p=0.27; R2=1.6%), age (p=0.27; R2=0.1%), or group (p=0.53; R2=1.9%). The methodology used enabled the development of an LO with a high index of acceptance for all groups.
Discrete-time stability of continuous-time controller designs for large space structures
NASA Technical Reports Server (NTRS)
Balas, M. J.
1982-01-01
In most of the stable control designs for flexible structures, continuous time is assumed. However, in view of the implementation of controllers by on-line digital computers, the discrete-time stability of such controllers is an important consideration. In the case of direct velocity feedback (DVFB), involving negative feedback from collocated force actuators and velocity sensors, it is not immediately apparent how much delay due to digital implementation can be tolerated without loss of stability. The present investigation addresses such questions through a study of the discrete-time stability of DVFB in which the time derivative is approximated by Euler's method. The result gives an indication of the acceptable time-step size for stable digital implementation of DVFB. A further result on the discrete-time stability of stable continuous-time systems provides a general condition under which the digital implementation of such a system remains stable.
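The dependence of discrete-time stability on step size can be illustrated on a single structural mode under velocity feedback discretized by Euler's method; the frequency, gain, and step sizes below are assumed values, not results from the paper.

```python
import numpy as np

# One structural mode under direct velocity feedback (DVFB),
#   x'' = -w^2 x - g x',
# discretized with Euler's method: z[k+1] = (I + dt*A) z[k].
# The discrete system is stable iff the spectral radius of (I + dt*A) < 1.
w, g = 2.0, 0.5  # assumed modal frequency and feedback gain
A = np.array([[0.0, 1.0],
              [-w**2, -g]])

def spectral_radius(dt):
    M = np.eye(2) + dt * A
    return max(abs(np.linalg.eigvals(M)))

print(spectral_radius(0.01))  # < 1: stable at a small time step
print(spectral_radius(0.5))   # > 1: the same controller destabilized by dt alone
```

This is the phenomenon the paper quantifies: a feedback law that is unconditionally stable in continuous time acquires a maximum tolerable time step once implemented digitally.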
Predictions of vacuum loss of evacuated vials from initial air leak rates.
Prisco, Michael R; Ochoa, Jorge A; Yardimci, Atif M
2013-08-01
Container closure integrity is a critical factor for maintaining product sterility and stability. Therefore, closure systems (found in vials, syringes, and cartridges) are designed to provide a seal between rubber stoppers and glass containers. To ensure that the contained product has maintained its sterility and stability at the time of deployment, the seal must remain intact within acceptable limits. To this end, a mathematical model has been developed to describe vacuum loss in evacuated drug vials. The model computes the equivalent leak diameter corresponding to the initial air leak rate as well as vacuum loss as a function of time and vial size. The theory accounts for three flow regimes that may be encountered. Initial leak rates from 10^-8 to 10^3 sccm (standard cubic centimeters per minute) were investigated for vials ranging from 1 to 100 mL. Corresponding leak diameters of 0.25-173 μm were predicted. The time for a vial to lose half of its vacuum, the T50 value, ranged from many years at the lowest leak rates and largest vials, to fractions of a second at the highest leak rates and smallest vials. These results may be used to determine what level of initial vacuum leak is acceptable for a given product. Copyright © 2013 Wiley Periodicals, Inc.
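A deliberately simplified single-regime version of such a model already shows the qualitative T50 behavior. The assumption below (leak flow proportional to the pressure differential, giving exponential decay) is ours for illustration; the paper's model covers three distinct flow regimes.

```python
import math

# Rough single-regime estimate of the time for an evacuated vial to lose
# half of its vacuum (T50), assuming leak flow proportional to the
# remaining pressure differential.
def t50_minutes(vial_volume_cm3, initial_leak_sccm,
                p_atm=1.0, p_initial=0.0, p_std=1.0):
    # The differential decays exponentially with tau = V * dP0 / (P_std * L0),
    # so T50 = ln(2) * tau (L0 in sccm, V in cm^3, pressures in atm).
    dp0 = p_atm - p_initial
    tau = vial_volume_cm3 * dp0 / (p_std * initial_leak_sccm)  # minutes
    return math.log(2.0) * tau

# 10 mL vial evacuated to full vacuum with a 1e-3 sccm initial leak:
print(t50_minutes(10.0, 1e-3) / (60 * 24))  # roughly 4.8 days
```

Even this crude scaling reproduces the reported spread: large vials with tiny leaks give T50 values of many years, small vials with large leaks give fractions of a second.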
King, W. E.; Anderson, A. T.; Ferencz, R. M.; ...
2015-12-29
The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.
A predictive software tool for optimal timing in contrast enhanced carotid MR angiography
NASA Astrophysics Data System (ADS)
Moghaddam, Abbas N.; Balawi, Tariq; Habibi, Reza; Panknin, Christoph; Laub, Gerhard; Ruehm, Stefan; Finn, J. Paul
2008-03-01
A clear understanding of the first pass dynamics of contrast agents in the vascular system is crucial in synchronizing data acquisition of 3D MR angiography (MRA) with arrival of the contrast bolus in the vessels of interest. We implemented a computational model to simulate contrast dynamics in the vessels using the theory of linear time-invariant systems. The algorithm calculates a patient-specific impulse response for the contrast concentration from time-resolved images following a small test bolus injection. This is performed for a specific region of interest and through deconvolution of the intensity curve using the long division method. Since high spatial resolution 3D MRA is not time-resolved, the method was validated on time-resolved arterial contrast enhancement in Multi Slice CT angiography. For 20 patients, the timing of the contrast enhancement of the main bolus was predicted by our algorithm from the response to the test bolus, and then for each case the predicted time of maximum intensity was compared to the corresponding time in the actual scan which resulted in an acceptable agreement. Furthermore, as a qualitative validation, the algorithm's predictions of the timing of the carotid MRA in 20 patients with high quality MRA were correlated with the actual timing of those studies. We conclude that the above algorithm can be used as a practical clinical tool to eliminate guesswork and to replace empiric formulae by a priori computation of patient-specific timing of data acquisition for MR angiography.
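The long-division deconvolution step can be sketched on synthetic curves: recover an impulse response from a test-bolus measurement, then predict the main-bolus enhancement by convolution. All signals below are assumed toy data, not patient measurements.

```python
import numpy as np
from scipy.signal import deconvolve

# LTI sketch: the measured test-bolus curve is the injection profile
# convolved with a patient-specific impulse response; deconvolve (which
# performs polynomial long division) recovers that response.
true_h = np.array([0.0, 0.2, 0.5, 0.9, 0.6, 0.3, 0.1])  # impulse response
test_bolus = np.array([1.0, 1.0])                       # short test injection
measured = np.convolve(test_bolus, true_h)              # time-resolved signal

h_est, remainder = deconvolve(measured, test_bolus)     # long-division step

main_bolus = np.ones(8)                                 # longer main injection
predicted = np.convolve(main_bolus, h_est)              # predicted enhancement
t_peak = int(np.argmax(predicted))                      # predicted time of maximum
```

The predicted peak time is what synchronizes the 3D MRA acquisition with bolus arrival; in practice the measured curve is noisy, which is why the division is performed on a smoothed region-of-interest intensity curve.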
ERIC Educational Resources Information Center
Ekufu, ThankGod K.
2012-01-01
Organizations are finding it difficult in today's economy to implement the vast information technology infrastructure required to effectively conduct their business operations. Despite the fact that some of these organizations are leveraging on the computational powers and the cost-saving benefits of computing on the Internet cloud, others…
Teachers' Changing Roles in Computer Assisted Roles in Kenyan Secondary Schools
ERIC Educational Resources Information Center
Tanui, Edward K.; Kiboss, Joel K.; Walaba, Aggrey A.; Nassiuma, Dankit
2008-01-01
The use of computer technology in Kenyan schools is a relatively new approach that is currently being included in the school curriculum. The introduction of computer technology for use in teaching does not always seem to be accepted outright by most teachers. The purpose of the study reported in this paper was to investigate the teachers' changing…
What Does CALL Have to Offer Computer Science and What Does Computer Science Have to Offer CALL?
ERIC Educational Resources Information Center
Cushion, Steve
2006-01-01
We will argue that CALL can usefully be viewed as a subset of computer software engineering and can profit from adopting some of the recent progress in software development theory. The unified modelling language has become the industry standard modelling technique and the accompanying unified process is rapidly gaining acceptance. The manner in…
Tablet computers in assessing performance in a high stakes exam: opinion matters.
Currie, G P; Sinha, S; Thomson, F; Cleland, J; Denison, A R
2017-06-01
Background Tablet computers have emerged as a tool to capture, process and store data in examinations, yet evidence relating to their acceptability and usefulness in assessment is limited. Methods We performed an observational study to explore opinions and attitudes relating to tablet computer use in recording performance in a final year objective structured clinical examination at a single UK medical school. Examiners completed a short questionnaire encompassing background, forced-choice and open questions. Forced-choice questions were analysed using descriptive statistics and open questions by framework analysis. Results Ninety-two examiners (97% response rate) completed the questionnaire, of whom 85% had previously used tablet computers. Ninety per cent felt checklist mark allocation was 'very/quite easy', while approximately half considered recording 'free-type' comments 'easy/very easy'. Greater overall efficiency of marking and resource savings were considered the main advantages of tablet computers, while concerns were raised about technological failure and the ability to record free-type comments. Discussion In a context where examiners were familiar with tablet computers, they were preferred to paper checklists, although concerns were raised. This study adds to the limited literature underpinning the use of electronic devices as acceptable tools in objective structured clinical examinations.
100 or 30 years after Janeway or Bartter, Healthwatch helps avoid 'flying blind'.
Cornélissen, Germaine; Halberg, Franz; Bakken, Earl; Singh, Ram B; Otsuka, Kuniaki; Tomlinson, Brian; Delcourt, Alain; Toussaint, Guy; Bathina, Srilakshmi; Schwartzkopff, Othild; Wang, Zhengrong; Tarquini, Roberto; Perfetto, Federico; Pantaleoni, Giancarlo; Jozsa, Rita; Delmore, Patrick A; Nolley, Ellis
2004-10-01
Longitudinal records of blood pressure (BP) and heart rate (HR) around the clock for days, weeks, months, years, and even decades obtained by manual self-measurements (during waking) and/or automatically by ambulatory monitoring reveal, in addition to well-known large within-day variation, also considerable day-to-day variability in most people, whether normotensive or hypertensive. As a first step, the circadian rhythm is considered along with gender differences and changes as a function of age to derive time-specified reference values (chronodesms), while reference values accumulate to also account for the circaseptan variation. Chronodesms serve for the interpretation of single measurements and of circadian and other rhythm parameters. Refined diagnoses can thus be obtained, namely MESOR-hypertension when the chronome-adjusted mean value (MESOR) of BP is above the upper limit of acceptability, excessive pulse pressure (EPP) when the difference in MESOR between the systolic (S) and diastolic (D) BP is too large, CHAT (circadian hyper-amplitude tension) when the circadian BP amplitude is excessive, DHRV (decreased heart rate variability) when the standard deviation (SD) of HR is below the acceptable range, and/or ecphasia when the overall high values recurring each day occur at an odd time (a condition also contributing to the risk associated with 'non-dipping'). A non-parametric approach consisting of a computer comparison of the subject's profile with the time-varying limits of acceptability further serves as a guide to optimize the efficacy of any needed treatment by timing its administration (chronotherapy) and selecting a treatment schedule best suited to normalize abnormal patterns in BP and/or HR. The merit of the proposed chronobiological approach to BP screening, diagnosis and therapy (chronotheranostics) is assessed in the light of outcome studies. 
Elevated risk associated with abnormal patterns of BP and/or HR variability, even when most if not all measurements lie within the range of acceptable values, becomes amenable to treatment as a critical step toward prevention (prehabilitation) to reduce the need for rehabilitation (the latter often after costly surgical intervention).
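The circadian rhythm parameters discussed above (MESOR, amplitude, acrophase) are conventionally estimated by single-component cosinor analysis. A minimal least-squares sketch, assuming uniformly weighted samples and a fixed 24-hour trial period (not the authors' software):

```python
import math
import numpy as np

def cosinor_fit(t, y, period=24.0):
    """Single-component cosinor: fit y ~ M + beta*cos(w t) + gamma*sin(w t)
    by linear least squares. Returns (MESOR, amplitude, acrophase), where
    acrophase is the phase phi in y = M + A*cos(w t + phi)."""
    w = 2.0 * math.pi / period
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    (M, beta, gamma), *_ = np.linalg.lstsq(X, y, rcond=None)
    amplitude = math.hypot(beta, gamma)
    acrophase = math.atan2(-gamma, beta)
    return M, amplitude, acrophase
```

A blood-pressure MESOR above the time-specified reference limit would then indicate MESOR-hypertension; an excessive circadian amplitude would indicate CHAT.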
High-efficiency wavefunction updates for large scale Quantum Monte Carlo
NASA Astrophysics Data System (ADS)
Kent, Paul; McDaniel, Tyler; Li, Ying Wai; D'Azevedo, Ed
Within ab initio Quantum Monte Carlo (QMC) simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunctions. The evaluation of each Monte Carlo move requires finding the determinant of a dense matrix, which is traditionally evaluated iteratively using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. For calculations with thousands of electrons, this operation dominates the execution profile. We propose a novel rank-k delayed update scheme. This strategy enables probability evaluation for multiple successive Monte Carlo moves, with application of accepted moves to the matrices delayed until after a predetermined number of moves, k. Accepted events grouped in this manner are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency. This procedure does not change the underlying Monte Carlo sampling or the sampling efficiency. For large systems and algorithms such as diffusion Monte Carlo where the acceptance ratio is high, order-of-magnitude speedups can be obtained on both multi-core CPUs and GPUs, making this algorithm highly advantageous for current petascale and future exascale computations.
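The rank-1 Sherman-Morrison row-replacement update that the delayed scheme generalizes can be sketched as follows. This is an illustrative implementation of the standard update, not the proposed rank-k code:

```python
import numpy as np

def row_update_inverse(Ainv, old_row, new_row, r):
    """Sherman-Morrison update of Ainv when row r of A is replaced by
    new_row, i.e. A' = A + e_r v^T with v = new_row - old_row.
    Returns (updated inverse, det(A')/det(A)); the determinant ratio is
    the quantity used to accept or reject the Monte Carlo move."""
    v = new_row - old_row
    vAinv = v @ Ainv                    # v^T A^{-1}, an O(n^2) operation
    ratio = 1.0 + vAinv[r]              # det(A')/det(A)
    Ainv_er = Ainv[:, r].copy()         # A^{-1} e_r
    Ainv_new = Ainv - np.outer(Ainv_er, vAinv) / ratio
    return Ainv_new, ratio
```

Each accepted move costs O(n²) instead of the O(n³) of recomputing the inverse; the delayed scheme batches k such updates into a single matrix-matrix operation with better arithmetic intensity.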
Adapting a computer-delivered brief alcohol intervention for veterans with Hepatitis C.
Cucciare, Michael A; Jamison, Andrea L; Combs, Ann S; Joshi, Gauri; Cheung, Ramsey C; Rongey, Catherine; Huggins, Joe; Humphreys, Keith
2017-12-01
This study adapted an existing computer-delivered brief alcohol intervention (cBAI) for use in Veterans with the hepatitis C virus (HCV) and examined its acceptability and feasibility in this patient population. A four-stage model consisting of initial pilot testing, qualitative interviews with key stakeholders, development of a beta version of the cBAI, and usability testing was used to achieve the study objectives. In-depth interviews gathered feedback for modifying the cBAI, including adding HCV-related content such as the health effects of alcohol on liver functioning, immune system functioning, and management of HCV, a preference for concepts to be displayed through "newer looking" graphics, and limiting the use of text to convey key concepts. Results from usability testing indicated that the modified cBAI was acceptable and feasible for use in this patient population. The development model used in this study is effective for gathering actionable feedback that can inform the development of a cBAI and can result in the development of an acceptable and feasible intervention for use in this population. Findings also have implications for developing computer-delivered interventions targeting behavior change more broadly.
Tailor, Vijay K; Glaze, Selina; Khandelwal, Payal; Davis, Alison; Adams, Gillian G W; Xing, Wen; Bunce, Catey; Dahlmann-Noor, Annegret
2015-01-01
Amblyopia ("lazy eye") is the commonest vision deficit in children. If not fully corrected by glasses, amblyopia is treated by patching or blurring the better-seeing eye. Compliance with patching is often poor. Computer-based activities are increasingly topical, both as an adjunct to standard treatment and as a platform for novel treatments. Acceptability by families has not been explored, and feasibility of a randomised controlled trial (RCT) using computer games in terms of recruitment and treatment acceptability is uncertain. We carried out a pilot RCT to test whether computer-based activities are acceptable and accessible to families and to test trial methods such as recruitment and retention rates, randomisation, trial-specific data collection tools and analysis. The trial had three arms: standard near activity advice, Eye Five, a package developed for children with amblyopia, and an off-the-shelf handheld games console with pre-installed games. We enrolled 60 children age 3-8 years with moderate or severe amblyopia after completion of optical treatment. This trial was registered as UKCRN-ID 11074. Pre-screening of 3600 medical notes identified 189 potentially eligible children, of whom 60 remained eligible after optical treatment, and were enrolled between April 2012 and March 2013. One participant was randomised twice and withdrawn from the study. Of the 58 remaining, 37 were boys. The mean (SD) age was 4.6 (1.7) years. Thirty-seven had moderate and 21 severe amblyopia. Three participants were withdrawn at week 6, and in total, four were lost to follow-up at week 12. Most children and parents/carers found the study procedures, i.e. occlusion treatment, usage of the allocated near activity and completion of a study diary, easy. The prescribed cumulative dose of near activity was 84 h at 12 weeks. Reported near activity usage numbers were close to prescribed numbers in moderate amblyopes (94 % of prescribed) but markedly less in severe amblyopes (64 %). 
Reported occlusion usage at 12 weeks was 90 % of prescribed dose for moderate and 33 % for severe amblyopes. Computer-based games and activities appear acceptable to families as part of their child's amblyopia treatment. Trial methods were appropriate and accepted by families.
Ethical and Professional Issues in Computer-Assisted Therapy.
ERIC Educational Resources Information Center
Ford, B. Douglas
1993-01-01
Discusses ethical and professional issues in psychology regarding computer-assisted therapy (CAT). Topics addressed include an explanation of CAT; whether CAT is psychotherapy; software, including independent use, validation of effectiveness, and restricted access; clinician resistance; client acceptance; the impact on ethical standards; and a…
The Role of Crop Systems Simulation in Agriculture and Environment
USDA-ARS?s Scientific Manuscript database
Over the past 30 to 40 years, simulation of crop systems has advanced from a neophyte science with inadequate computing power into a robust and increasingly accepted science supported by improved software, languages, development tools, and computer capabilities. Crop system simulators contain mathe...
Concept of Operations Evaluation for Using Remote-Guidance Ultrasound for Exploration Spaceflight.
Hurst, Victor W; Peterson, Sean; Garcia, Kathleen; Ebert, Douglas; Ham, David; Amponsah, David; Dulchavsky, Scott
2015-12-01
Remote-guidance (RG) techniques aboard the International Space Station (ISS) have enabled astronauts to collect diagnostic-level ultrasound (US) images. Exploration-class missions will likely require nonformally trained sonographers to operate with greater autonomy given longer communication delays (> 6 s for missions beyond the Moon) and blackouts. Training requirements for autonomous collection of US images by non-US experts are being determined. Novice US operators were randomly assigned to one of three groups to collect standardized US images while drawing expertise from A) RG only, B) a computer training tool only, or C) both RG and a computer training tool. Images were assessed for quality and examination duration. All operators were given a 10-min standardized generic training session in US scanning. The imaging task included: 1) bone fracture assessment in a phantom and 2) Focused Assessment with Sonography in Trauma (FAST) examination in a healthy volunteer. A human factors questionnaire was also completed. Mean FAST examination time was shorter for group B than for the other groups (20.4 vs. 22.7 min). Group B's image quality scores were lower than those of groups A or C, but all groups produced images of acceptable diagnostic quality. RG produces US images of higher quality than those produced with only computer-based instruction. Extended communication delays in exploration missions will eliminate the option of real-time guidance, thus requiring autonomous operation. The computer program used appears effective and could be a model for future digital US expertise banks. Terrestrially, it also provides adequate self-training and mentoring mechanisms.
Gender Divide and Acceptance of Collaborative Web 2.0 Applications for Learning in Higher Education
ERIC Educational Resources Information Center
Huang, Wen-Hao David; Hood, Denice Ward; Yoo, Sun Joo
2013-01-01
Situated in the gender digital divide framework, this survey study investigated the role of computer anxiety in influencing female college students' perceptions toward Web 2.0 applications for learning. Based on 432 college students' "Web 2.0 for learning" perception ratings collected by relevant categories of "Unified Theory of Acceptance and Use…
Investigating the Determinants and Age and Gender Differences in the Acceptance of Mobile Learning
ERIC Educational Resources Information Center
Wang, Yi-Shun; Wu, Ming-Cheng; Wang, Hsiu-Yuan
2009-01-01
With the proliferation of mobile computing technology, mobile learning (m-learning) will play a vital role in the rapidly growing electronic learning market. M-learning is the delivery of learning to students anytime and anywhere through the use of wireless Internet and mobile devices. However, acceptance of m-learning by individuals is critical…
ERIC Educational Resources Information Center
Dutton, Donna H.
The paper describes three strategies featuring a microcomputer to promote the integration and acceptance of students with disabilities among their nondisabled peers. The first strategy is a cross-age tutoring program in which disabled, learning disabled, emotionally disabled, or mildly retarded students demonstrate computer use to nondisabled…
The Influence of Demographic Factor on Personal Innovativeness towards Technology Acceptance
ERIC Educational Resources Information Center
Noh, Noraini Mohamed; Hamzah, Mahizer; Abdullah, Norazilawati
2016-01-01
Library and Media Teachers' (LMT) readiness to accept and use technology innovations earlier than their colleagues could expedite the introduction of technology innovation into the school education system. The aim of this paper is to report on a study that explored the impact of experience in using computers and the level of ICT knowledge towards…
Teachers Left Behind: Acceptance and Use of Technology in Lebanese Public High Schools
ERIC Educational Resources Information Center
Baytiyeh, Hoda
2014-01-01
Nowadays, the use of computers in education is increasing worldwide. Information technology is deemed essential for the digital generation's classrooms. However, the adoption of technology in teaching and learning largely depends on the culture and social context. The aim of this research study is to evaluate the acceptance and use of technology…
Development of the sonic pump levitator
NASA Technical Reports Server (NTRS)
Dunn, S. A.
1985-01-01
The process and mechanism involved in producing glass microballoons (GMBs) of acceptable quality for laser-triggered inertial fusion through use of gas jet levitation and manipulation are considered. Positioning is provided by timely and appropriate application of gas momentum from one or more of six gas-jet devices, called sonic pumps, which are arranged orthogonally in opposed pairs about the levitation region and are activated by an electrooptical, computer-controlled feedback system. The levitation device was fabricated, and its associated control systems were assembled into a package and tested in the reduced-gravity flight regime of the NASA KC-135 aircraft.
Computer graphics for quality control in the INAA of geological samples
Grossman, J.N.; Baedecker, P.A.
1987-01-01
A data reduction system for the routine instrumental activation analysis of samples is described, with particular emphasis on interactive graphics capabilities for evaluating analytical quality. Graphics procedures have been developed to interactively control the analysis of selected photopeaks during spectral analysis, and to evaluate detector performance during a given counting cycle. Graphics algorithms are also used to compare the data on reference samples with accepted values, to prepare quality control charts to evaluate long term precision and to search for systematic variations in data on reference samples as a function of time. © 1987 Akadémiai Kiadó.
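A comparison of reference-sample data against accepted values, of the kind used for the quality control charts described, might be sketched as follows. The 2σ warning and 3σ action limits are conventional control-chart choices, assumed here rather than taken from the paper:

```python
def control_chart_flags(measured, accepted, sigma, warn=2.0, action=3.0):
    """Compare reference-sample measurements with accepted values.
    Returns a list of (z_score, status), where status is 'ok',
    'warning' (beyond warn*sigma) or 'out of control' (beyond action*sigma)."""
    out = []
    for m, a, s in zip(measured, accepted, sigma):
        z = (m - a) / s
        if abs(z) > action:
            status = "out of control"
        elif abs(z) > warn:
            status = "warning"
        else:
            status = "ok"
        out.append((z, status))
    return out
```

Plotting the z-scores against counting date would reveal the systematic time trends the graphics algorithms search for.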
Fan-beam scanning laser optical computed tomography for large volume dosimetry
NASA Astrophysics Data System (ADS)
Dekker, K. H.; Battista, J. J.; Jordan, K. J.
2017-05-01
A prototype scanning-laser fan beam optical CT scanner is reported which is capable of high resolution, large volume dosimetry with reasonable scan time. An acylindrical, asymmetric aquarium design is presented which serves to 1) generate parallel-beam scan geometry, 2) focus light towards a small acceptance angle detector, and 3) avoid interference fringe-related artifacts. Preliminary experiments with uniform solution phantoms (11 and 15 cm diameter) and finger phantoms (13.5 mm diameter FEP tubing) demonstrate that the design allows accurate optical CT imaging, with optical CT measurements agreeing within 3% of independent Beer-Lambert law calculations.
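The independent Beer-Lambert check mentioned above follows from I = I0·exp(−μL). A minimal sketch, with assumed transmission values rather than the paper's measurements:

```python
import math

def attenuation_from_transmission(I, I0, path_length_cm):
    """Mean linear attenuation coefficient (cm^-1) from the Beer-Lambert law:
    I = I0 * exp(-mu * L)  =>  mu = -ln(I / I0) / L."""
    return -math.log(I / I0) / path_length_cm

# Hypothetical example: half the light transmitted through an 11 cm phantom.
mu = attenuation_from_transmission(I=0.5, I0=1.0, path_length_cm=11.0)
```

The reported 3% agreement refers to comparing optical CT reconstruction values against μ computed this way from independent transmission measurements.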
Demand Response Resource Quantification with Detailed Building Energy Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, Elaine; Horsey, Henry; Merket, Noel
Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.
Data acquisition using the 168/E. [CERN ISR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carroll, J.T.; Cittolin, S.; Demoulin, M.
1983-03-01
Event sizes and data rates at the CERN antiproton-proton (p̄p) collider pose a formidable environment for a high level trigger. A system using three 168/E processors for experiment UA1 real-time event selection is described. With 168/E data memory expanded to 512K bytes, each processor holds a complete event, allowing a FORTRAN trigger algorithm access to data from the entire detector. A smart CAMAC interface reads five Remus branches in parallel, transferring one word to the target processor every 0.5 µs. The NORD host computer can simultaneously read an accepted event from another processor.
Probabilistic Reasoning for Robustness in Automated Planning
NASA Technical Reports Server (NTRS)
Schaffer, Steven; Clement, Bradley; Chien, Steve
2007-01-01
A general-purpose computer program for planning the actions of a spacecraft or other complex system has been augmented by incorporating a subprogram that reasons about uncertainties in such continuous variables as times taken to perform tasks and amounts of resources to be consumed. This subprogram computes parametric probability distributions for time and resource variables on the basis of user-supplied models of actions and the resources that they consume. The current system accepts bounded Gaussian distributions over action duration and resource use. The distributions are then combined during planning to determine the net probability distribution of each resource at any time point. In addition to a full combinatoric approach, several approximations for arriving at these combined distributions are available, including maximum-likelihood and pessimistic algorithms. Each such probability distribution can then be integrated to obtain a probability that execution of the plan under consideration would violate any constraints on the resource. The key idea is to use these probabilities of conflict to score potential plans and drive a search toward planning low-risk actions. An output plan provides a balance between the user's specified aversion to risk and other measures of optimality.
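For Gaussian resource models, combining independent uses and integrating the tail gives the constraint-violation probability. A simplified sketch that ignores the bounding of the distributions and the approximation algorithms mentioned in the abstract:

```python
import math

def violation_probability(uses, capacity):
    """Probability that the sum of independent Gaussian resource uses
    exceeds capacity. `uses` is a list of (mean, stddev) pairs.
    Sums of independent Gaussians are Gaussian: means add, variances add."""
    mu = sum(m for m, _ in uses)
    sigma = math.sqrt(sum(s * s for _, s in uses))
    # Tail integral P(N(mu, sigma^2) > capacity) via the complementary
    # error function.
    return 0.5 * math.erfc((capacity - mu) / (sigma * math.sqrt(2.0)))
```

A planner can score a candidate plan by such probabilities and prefer action sequences whose worst resource-conflict risk stays below the user's risk tolerance.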
Gap Acceptance During Lane Changes by Large-Truck Drivers—An Image-Based Analysis
Nobukawa, Kazutoshi; Bao, Shan; LeBlanc, David J.; Zhao, Ding; Peng, Huei; Pan, Christopher S.
2016-01-01
This paper presents an analysis of rearward gap acceptance characteristics of drivers of large trucks in highway lane change scenarios. The range between the vehicles was inferred from camera images using the estimated lane width obtained from the lane tracking camera as the reference. Six-hundred lane change events were acquired from a large-scale naturalistic driving data set. The kinematic variables from the image-based gap analysis were filtered by the weighted linear least squares in order to extrapolate them at the lane change time. In addition, the time-to-collision and required deceleration were computed, and potential safety threshold values are provided. The resulting range and range rate distributions showed directional discrepancies, i.e., in left lane changes, large trucks are often slower than other vehicles in the target lane, whereas they are usually faster in right lane changes. Video observations have confirmed that major motivations for changing lanes are different depending on the direction of move, i.e., moving to the left (faster) lane occurs due to a slower vehicle ahead or a merging vehicle on the right-hand side, whereas right lane changes are frequently made to return to the original lane after passing. PMID:26924947
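The two kinematic safety metrics computed in the analysis, time-to-collision and required deceleration, can be sketched as follows (sign convention assumed here: a negative range rate means the gap is closing):

```python
def time_to_collision(range_m, range_rate_mps):
    """Time-to-collision in seconds; defined only while the gap is closing."""
    if range_rate_mps >= 0:
        return float("inf")  # gap opening or constant: no collision course
    return range_m / -range_rate_mps

def required_deceleration(range_m, range_rate_mps):
    """Constant deceleration (m/s^2) that brings the relative speed to zero
    exactly as the gap reaches zero: a = v_rel^2 / (2 * range)."""
    if range_rate_mps >= 0:
        return 0.0
    return range_rate_mps ** 2 / (2.0 * range_m)
```

Threshold values on these quantities separate accepted gaps that are comfortably safe from those that would force hard braking by the trailing vehicle.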
Computer-Assisted Learning for the Hearing Impaired: An Interactive Written Language Environment.
ERIC Educational Resources Information Center
Ward, R. D.; Rostron, A. B.
1983-01-01
To help hearing-impaired children develop their linguistic competence, a computer system that can process sentences and give feedback about their acceptability was developed. Suggestions are made of ways to use the system as an environment for interactive written communication. (Author/CL)
19 CFR 4.99 - Forms; substitution.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... the instructions shall be followed. (c) The port director, in his discretion, may accept a computer printout instead of Customs Form 1302 for use at a specific port. However, to ensure that computer...
19 CFR 4.99 - Forms; substitution.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... the instructions shall be followed. (c) The port director, in his discretion, may accept a computer printout instead of Customs Form 1302 for use at a specific port. However, to ensure that computer...
19 CFR 4.99 - Forms; substitution.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... the instructions shall be followed. (c) The port director, in his discretion, may accept a computer printout instead of Customs Form 1302 for use at a specific port. However, to ensure that computer...
19 CFR 4.99 - Forms; substitution.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... the instructions shall be followed. (c) The port director, in his discretion, may accept a computer printout instead of Customs Form 1302 for use at a specific port. However, to ensure that computer...
19 CFR 4.99 - Forms; substitution.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... the instructions shall be followed. (c) The port director, in his discretion, may accept a computer printout instead of Customs Form 1302 for use at a specific port. However, to ensure that computer...
A Mixed-Methods Exploration of an Environment for Learning Computer Programming
ERIC Educational Resources Information Center
Mather, Richard
2015-01-01
A mixed-methods approach is evaluated for exploring collaborative behaviour, acceptance and progress surrounding an interactive technology for learning computer programming. A review of literature reveals a compelling case for using mixed-methods approaches when evaluating technology-enhanced-learning environments. Here, ethnographic approaches…
Objective speech quality evaluation of real-time speech coders
NASA Astrophysics Data System (ADS)
Viswanathan, V. R.; Russell, W. H.; Huggins, A. W. F.
1984-02-01
This report describes the work performed in two areas: subjective testing of a real-time 16 kbit/s adaptive predictive coder (APC) and objective speech quality evaluation of real-time coders. The speech intelligibility of the APC coder was tested using the Diagnostic Rhyme Test (DRT), and the speech quality was tested using the Diagnostic Acceptability Measure (DAM) test, under eight operating conditions involving channel error, acoustic background noise, and tandem link with two other coders. The test results showed that the DRT and DAM scores of the APC coder equalled or exceeded the corresponding test scores of the 32 kbit/s CVSD coder. In the area of objective speech quality evaluation, the report describes the development, testing, and validation of a procedure for automatically computing several objective speech quality measures, given only the tape-recordings of the input speech and the corresponding output speech of a real-time speech coder.
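One classic objective quality measure computable from paired input/output recordings is segmental SNR; the report's exact measures are not specified here, so this is an illustrative sketch with conventional per-frame clamping:

```python
import math

def segmental_snr(reference, degraded, frame_len=160, floor_db=-10.0, ceil_db=35.0):
    """Mean per-frame SNR in dB between a reference signal and a coded
    version of it, with each frame clamped to [floor_db, ceil_db] so that
    silent or near-perfect frames do not dominate the average."""
    snrs = []
    n = min(len(reference), len(degraded))
    for start in range(0, n - frame_len + 1, frame_len):
        ref = reference[start:start + frame_len]
        deg = degraded[start:start + frame_len]
        sig = sum(r * r for r in ref)
        err = sum((r - d) ** 2 for r, d in zip(ref, deg))
        if err == 0:
            snr = ceil_db
        elif sig == 0:
            snr = floor_db
        else:
            snr = 10.0 * math.log10(sig / err)
            snr = max(floor_db, min(ceil_db, snr))
        snrs.append(snr)
    return sum(snrs) / len(snrs) if snrs else 0.0
```

A frame length of 160 samples corresponds to 20 ms at the 8 kHz sampling rate typical of telephone-band coders.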
Real-time aerodynamic heating and surface temperature calculations for hypersonic flight simulation
NASA Technical Reports Server (NTRS)
Quinn, Robert D.; Gong, Leslie
1990-01-01
A real-time heating algorithm was derived and installed on the Ames Research Center Dryden Flight Research Facility real-time flight simulator. This program can calculate two- and three-dimensional stagnation point surface heating rates and surface temperatures. The two-dimensional calculations can be made with or without leading-edge sweep. In addition, upper and lower surface heating rates and surface temperatures for flat plates, wedges, and cones can be calculated. Laminar or turbulent heating can be calculated, with boundary-layer transition made a function of free-stream Reynolds number and free-stream Mach number. Real-time heating rates and surface temperatures calculated for a generic hypersonic vehicle are presented and compared with more exact values computed by a batch aeroheating program. As these comparisons show, the heating algorithm used on the flight simulator calculates surface heating rates and temperatures well within the accuracy required to evaluate flight profiles for acceptable heating trajectories.
A prospective study of acceptance of pain and patient functioning with chronic pain.
McCracken, Lance M; Eccleston, Christopher
2005-11-01
Acceptance of chronic pain is emerging as an important concept in understanding ways that chronic pain sufferers can remain engaged with valued aspects of life. Recent studies have relied heavily on cross-sectional investigations at a single time point. The present study sought to prospectively investigate relations between acceptance of chronic pain and patient functioning. A sample of adults referred for interdisciplinary treatment of severe and disabling chronic pain was assessed twice, an average of 3.9 months apart. Results showed that pain and acceptance were largely unrelated. Pain at Time 2 was weakly related to measures of functioning at Time 2. On the other hand, acceptance at Time 1 was consistently related to patient functioning at Time 2. Those patients who reported greater acceptance at Time 1 reported better emotional, social, and physical functioning, less medication consumption, and better work status at Time 2. These data suggest that willingness to have pain, and to engage in activity regardless of pain, can lead to healthy functioning for patients with chronic pain. Treatment outcome and process studies may demonstrate the potential for acceptance-based clinical methods for chronic pain management.
ERIC Educational Resources Information Center
Teo, Timothy; Luan, Wong Su; Sing, Chai Ching
2008-01-01
As computers becomes more ubiquitous in our everyday lives, educational settings are being transformed where educators and students are expected to teach and learn, using computers (Lee, 2003). This study, therefore, explored pre-service teachers' self reported future intentions to use computers in Singapore and Malaysia. A survey methodology was…
FY 1978 Budget, FY 1979 Authorization Request and FY 1978-1982 Defense Programs,
1977-01-17
technological opportunities with defense applications -- such as long-range cruise missiles and guidance, improved sensors, miniaturization, and computer ... Various methods exist for computing the number of theater nuclear weapons needed to perform these missions with an acceptable level of confidence ... foreign military forces. Mini-micro computers are especially interesting. -- Finally, since geography remains important, we must recognize that the
Computational System For Rapid CFD Analysis In Engineering
NASA Technical Reports Server (NTRS)
Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.
1995-01-01
Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.
Precision of computer-assisted core decompression drilling of the femoral head.
Beckmann, J; Goetz, J; Baethis, H; Kalteis, T; Grifka, J; Perlick, L
2006-08-01
Osteonecrosis of the femoral head is a local destructive disease that progresses through devastating stages. Left untreated, it mostly leads to severe secondary osteoarthrosis and early endoprosthetic joint replacement. Core decompression by exact drilling into the ischemic areas can be performed in early stages according to Ficat or ARCO. Computer-aided surgery might enhance the precision of the drilling and lower the radiation exposure time for both staff and patients. The aim of this study was to evaluate the precision of the fluoroscopically based VectorVision navigation system in an in vitro model. Thirty sawbones were prepared with a defect filled with a radiopaque gypsum sphere mimicking the osteonecrosis. Twenty sawbones were drilled under guidance of the intraoperative navigation system VectorVision (BrainLAB, Munich, Germany) and 10 sawbones under fluoroscopic control only. No gypsum sphere was missed. There was a statistically significant difference in the three-dimensional deviation (Euclidean norm) as well as the maximum deviation in the x-, y- or z-direction (maximum norm) from the desired mid-point of the lesion, with means of 0.51 and 0.4 mm in the navigated group and 1.1 and 0.88 mm in the control group, respectively. Furthermore, significant differences were found in the number of drilling corrections and in the radiation time needed: no second drilling or correction of drilling direction was necessary in the navigated group, compared to a mean of 1.4 in the control group, and the radiation time needed was less than 1 s compared to 3.1 s, respectively. The fluoroscopy-based VectorVision navigation system demonstrates the feasibility of computer-guided drilling with a clear reduction of radiation exposure time and can therefore be integrated into clinical routine. The additional time needed is acceptable given the simultaneous reduction of radiation time.
New methodology for fast prediction of wheel wear evolution
NASA Astrophysics Data System (ADS)
Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.
2017-07-01
In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics with time. However, one of the principal drawbacks of the existing methodologies for calculating wear evolution is their computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. The methodology is based on two main steps. The first is to replace calculations over the whole network with calculations of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred. The second is to replace the dynamic calculation (time-integration calculations) with a quasi-static calculation (solving the quasi-static situation of a vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction of the computational cost while maintaining an acceptable level of accuracy (errors on the order of 5-10%). Several case studies are analysed in the paper with the objective of assessing the proposed methodology. The results obtained in the case studies support the conclusion that the proposed methodology is valid for an arbitrary vehicle running through an arbitrary track layout.
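The quasi-static simplification described in the abstract, i.e. dropping the acceleration (and rate) terms from the equations of motion, can be sketched for a toy system. The matrices and values below are illustrative assumptions, not data from the paper:

```python
import numpy as np

# Hypothetical 2-DOF vehicle model (illustrative values, not from the paper):
# full dynamics is M x'' + C x' + K x = f; the quasi-static calculation
# neglects the acceleration and velocity terms, leaving K x = f.
M = np.array([[1200.0, 0.0], [0.0, 600.0]])       # mass matrix [kg]
C = np.array([[800.0, -200.0], [-200.0, 400.0]])  # damping [N s/m]
K = np.array([[5.0e5, -1.0e5], [-1.0e5, 3.0e5]])  # stiffness [N/m]
f = np.array([2000.0, -500.0])                    # external load [N]

# Quasi-static solution: a single linear solve instead of time integration.
x_qs = np.linalg.solve(K, f)
print(x_qs)
```

Under a slowly varying load the full dynamic response stays close to this equilibrium, which is why the simplification can keep errors small while avoiding time integration entirely.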
Fully implicit adaptive mesh refinement algorithm for reduced MHD
NASA Astrophysics Data System (ADS)
Philip, Bobby; Pernice, Michael; Chacon, Luis
2006-10-01
In the macroscopic simulation of plasmas, the numerical modeler is faced with the challenge of dealing with multiple time and length scales. Traditional approaches based on explicit time integration techniques and fixed meshes are not suitable for this challenge, as such approaches prevent the modeler from using realistic plasma parameters to keep the computation feasible. We propose here a novel approach, based on implicit methods and structured adaptive mesh refinement (SAMR). Our emphasis is on both accuracy and scalability with the number of degrees of freedom. As a proof-of-principle, we focus on the reduced resistive MHD model as a basic MHD model paradigm, which is truly multiscale. The approach taken here is to adapt mature physics-based technology to AMR grids, and employ AMR-aware multilevel techniques (such as fast adaptive composite grid (FAC) algorithms) for scalability. We demonstrate that the concept is indeed feasible, featuring near-optimal scalability under grid refinement. Results of fully implicit, dynamically adaptive AMR simulations in challenging dissipation regimes will be presented on a variety of problems that benefit from this capability, including tearing modes, the island coalescence instability, and the tilt mode instability. L. Chacón et al., J. Comput. Phys. 178 (1), 15-36 (2002); B. Philip, M. Pernice, and L. Chacón, Lecture Notes in Computational Science and Engineering, accepted (2006).
A new scheme of the time-domain fluorescence tomography for a semi-infinite turbid medium
NASA Astrophysics Data System (ADS)
Prieto, Kernel; Nishimura, Goro
2017-04-01
A new scheme for reconstruction of a fluorophore target embedded in a semi-infinite medium was proposed and evaluated. In this scheme, we neglected the presence of the fluorophore target for the excitation light and used an analytical solution of the time-dependent radiative transfer equation (RTE) for the excitation light in a homogeneous semi-infinite medium instead of solving the RTE numerically in the forward calculation. The inverse problem of imaging the fluorophore target was solved using the Landweber-Kaczmarz method with the concept of adjoint fields. Numerical experiments show that the proposed scheme provides acceptable reconstructions of the shape and location of the target. The computation times of the solution of the forward problem and of the whole reconstruction process were reduced by about 40% and 15%, respectively.
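The Landweber-Kaczmarz iteration named in the abstract cycles through the data one block (e.g. one source-detector pair) at a time, applying a relaxed adjoint correction per block. A minimal sketch on a toy linear system follows; the function name, relaxation choice, and test problem are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def landweber_kaczmarz(A_blocks, y_blocks, x0, relax=0.9, sweeps=2000):
    """Cyclic Landweber-Kaczmarz sketch: for each data block j, take a
    relaxed adjoint step x <- x + relax * A_j^T (y_j - A_j x) / ||A_j||_2^2,
    so that each step is a contraction toward the block's solution set."""
    x = x0.copy()
    for _ in range(sweeps):
        for A_j, y_j in zip(A_blocks, y_blocks):
            scale = max(np.linalg.norm(A_j, 2) ** 2, 1e-12)
            x = x + relax * (A_j.T @ (y_j - A_j @ x)) / scale
    return x

# Toy consistent system standing in for the linearized fluorescence problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4))
x_true = rng.standard_normal(4)
y = A @ x_true
A_blocks = [A[i:i + 1] for i in range(8)]   # one "measurement" per block
y_blocks = [y[i:i + 1] for i in range(8)]
x_rec = landweber_kaczmarz(A_blocks, y_blocks, np.zeros(4))
```

For consistent systems this row-cyclic scheme converges to a solution; in tomography the same structure lets each source-detector pair contribute its own adjoint-field update.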
Kent, Alexander Dale [Los Alamos, NM
2008-09-02
Methods and systems in a data/computer network for authenticating identifying data transmitted from a client to a server through use of a gateway interface system, where the three are communicatively coupled to each other, are disclosed. An authentication packet transmitted from a client to a server of the data network is intercepted by the interface, wherein the authentication packet is encrypted with a one-time password for transmission from the client to the server. The one-time password associated with the authentication packet can be verified utilizing a one-time password token system. The authentication packet can then be modified for acceptance by the server, wherein the response packet generated by the server is thereafter intercepted, verified and modified for transmission back to the client in a similar but reverse process.
High-resolution short-exposure small-animal laboratory x-ray phase-contrast tomography
NASA Astrophysics Data System (ADS)
Larsson, Daniel H.; Vågberg, William; Yaroshenko, Andre; Yildirim, Ali Önder; Hertz, Hans M.
2016-12-01
X-ray computed tomography of small animals and their organs is an essential tool in basic and preclinical biomedical research. In both phase-contrast and absorption tomography, high spatial resolution and short exposure times are of key importance. However, the observable spatial resolutions and achievable exposure times are presently limited by system parameters rather than more fundamental constraints like, e.g., dose. Here we demonstrate laboratory tomography with few-ten-μm spatial resolution and few-minute exposure time at an acceptable dose for small-animal imaging, both with absorption contrast and phase contrast. The method relies on a magnifying imaging scheme in combination with a high-power small-spot liquid-metal-jet electron-impact source. The tomographic imaging is demonstrated on an intact mouse, phantoms, and excised lungs, both healthy and with pulmonary emphysema.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Zuwei; Zhao, Haibo, E-mail: klinsmannzhb@163.com; Zheng, Chuguang
2015-01-15
This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining a Markov jump model, a weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of the particle size distribution with low statistical noise over the full size range and to reduce the number of time loopings as far as possible. Here three coagulation rules are highlighted, and it is found that constructing an appropriate coagulation rule provides a route to a compromise between accuracy and cost of PBMC methods. Further, in order to avoid double looping over all simulation particles when considering two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates used in acceptance-rejection processes by single-looping over all particles, and meanwhile the mean time-step of a coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is greatly reduced, becoming proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are processed in parallel by multiple cores on a GPU that can execute massively threaded data-parallel tasks to obtain a remarkable speedup ratio (compared with CPU computation, the speedup ratio of GPU parallel computing is as high as 200 in a case of 100 cells with 10 000 simulation particles per cell).
These accelerating approaches of PBMC are demonstrated in a physically realistic Brownian coagulation case. The computational accuracy is validated against the benchmark solution of the discrete-sectional method. The simulation results show that the comprehensive approach can attain a very favorable improvement in cost without sacrificing computational accuracy.
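The majorant-kernel acceptance-rejection idea described above can be sketched concretely: pairs are drawn uniformly and accepted with probability K/K̂, where K̂ is a cheap upper bound on the true kernel, so no double loop over all pairs is needed. The kernel form and the particular majorant below are illustrative assumptions, not the paper's exact expressions:

```python
import math
import random

def brownian_kernel(v_i, v_j):
    """Simplified free-molecule Brownian coagulation kernel (illustrative)."""
    return (v_i ** (1 / 3) + v_j ** (1 / 3)) ** 2 * math.sqrt(1 / v_i + 1 / v_j)

def majorant_kernel(v_i, v_j):
    """Cheap upper bound K_hat >= K, valid because (a+b)^2 <= 2(a^2+b^2)
    and sqrt(x+y) <= sqrt(x)+sqrt(y). The bound is an assumption for this
    sketch, not the paper's weighted majorant."""
    return 2 * (v_i ** (2 / 3) + v_j ** (2 / 3)) * (v_i ** -0.5 + v_j ** -0.5)

def select_pair(volumes, rng=random):
    """Acceptance-rejection pair selection with a single loop:
    draw (i, j) uniformly, accept with probability K/K_hat."""
    n = len(volumes)
    while True:
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        k = brownian_kernel(volumes[i], volumes[j])
        if rng.random() < k / majorant_kernel(volumes[i], volumes[j]):
            return i, j
```

Because K̂ dominates K everywhere, the accepted pairs are distributed in proportion to the true kernel, while each candidate evaluation stays O(1).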
Impact of Media Richness and Flow on E-Learning Technology Acceptance
ERIC Educational Resources Information Center
Liu, Su-Houn; Liao, Hsiu-Li; Pratt, Jean A.
2009-01-01
Advances in e-learning technologies parallel a general increase in sophistication by computer users. The use of just one theory or model, such as the technology acceptance model, is no longer sufficient to study the intended use of e-learning systems. Rather, a combination of theories must be integrated in order to fully capture the complexity of…
ERIC Educational Resources Information Center
Khechine, Hager; Lakhal, Sawsen
2018-01-01
Aim/Purpose: We aim to bring a better understanding of technology use in the educational context. More specifically, we investigate the determinants of webinar acceptance by university students and the effects of this acceptance on students' outcomes in the presence of personal characteristics such as anxiety, attitude, computer self-efficacy, and…
ERIC Educational Resources Information Center
Kline, Terence R.; Kneen, Harold; Barrett, Eric; Kleinschmidt, Andy; Doohan, Doug
2012-01-01
Differences in vegetable production methods utilized by American growers create distinct challenges for Extension personnel providing food safety training to producer groups. A program employing computers and projectors will not be accepted by an Amish group that does not accept modern technology. We have developed an outreach program that covers…
ERIC Educational Resources Information Center
Hsu, Liwei
2016-01-01
This study aims to explore the structural relationships among the variables of EFL (English as a foreign language) learners' perceptual learning styles and Technology Acceptance Model (TAM). Three hundred and forty-one (n = 341) EFL learners were invited to join a self-regulated English pronunciation training program through automatic speech…
A study on acceptance of mobileschool at secondary schools in Malaysia: Urban vs rural
NASA Astrophysics Data System (ADS)
Hashim, Ahmad Sobri; Ahmad, Wan Fatimah Wan; Sarlan, Aliza
2017-10-01
Developing countries are in a dilemma: sophisticated technologies advance faster than their people's ways of thinking. In education, many novel approaches and technologies have been introduced. However, minimal effort has been put into applying them in our education system. MobileSchool is a mobile learning (m-learning) management system developed for administrative, teaching and learning processes at secondary schools in Malaysia. The paper presents the acceptance of MobileSchool at urban and rural secondary schools in Malaysia. The research framework was designed based on the Technology Acceptance Model (TAM). The constructs of the framework include computer anxiety, self-efficacy, facilitating condition, technological complexity, perceived behavioral control, perceived ease of use, perceived usefulness, attitude and behavioral intention. A questionnaire was used as the research instrument, involving 373 students from four secondary schools (two urban and two rural) in Perak. Inferential analyses using hypothesis testing and t-tests, and descriptive analyses using means and percentages, were used to analyze the data. Results showed no large differences (<20%) in any acceptance construct between urban and rural secondary schools except computer anxiety.
Standard practices for the implementation of computer software
NASA Technical Reports Server (NTRS)
Irvine, A. P. (Editor)
1978-01-01
A standard approach to the development of computer programs is provided that covers the life cycle of software development from the planning and requirements phase through the software acceptance testing phase. All documents necessary to provide the required visibility into the software life cycle process are discussed in detail.
Implementing Project Based Learning in Computer Classroom
ERIC Educational Resources Information Center
Asan, Askin; Haliloglu, Zeynep
2005-01-01
Project-based learning offers the opportunity to apply theoretical and practical knowledge and to develop students' group-working and collaboration skills. In this paper we present a design for an effective computer class that implements the well-known and highly accepted project-based learning paradigm. A pre-test/post-test control group…
REST: a computer system for estimating logging residue by using the line-intersect method
A. Jeff Martin
1975-01-01
A computer program was designed to accept logging-residue measurements obtained by line-intersect sampling and transform them into summaries useful for the land manager. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
Spectrum/Orbit-Utilization Program
NASA Technical Reports Server (NTRS)
Miller, Edward F.; Sawitz, Paul; Zusman, Fred
1988-01-01
Interferences among geostationary satellites determine allocations. Spectrum/Orbit Utilization Program (SOUP) is analytical computer program for determining mutual interferences among geostationary-satellite communication systems operating in given scenario. Major computed outputs are carrier-to-interference ratios at receivers at specified stations on Earth. Information enables determination of acceptability of planned communication systems. Written in FORTRAN.
An effective support system of emergency medical services with tablet computers.
Yamada, Kosuke C; Inoue, Satoshi; Sakamoto, Yuichiro
2015-02-27
There were over 5,000,000 ambulance dispatches during 2010 in Japan, and transportation times have been increasing; it took over 37 minutes from dispatch to arrival at the hospital. One way to reduce ambulance transportation time is to shorten the time spent searching for an appropriate facility/hospital during the prehospital phase. Although an information system linking medical institutions and emergency medical services (EMS) was established in 2003 in Saga Prefecture, Japan, it had not been utilized efficiently. In April 2011, the Saga Prefectural Government renewed the previous system to make it a real-time support system that can efficiently manage emergency demand and acceptance, the first of its kind in Japan. The objective of this study was to evaluate whether the new system promotes efficient emergency transportation for critically ill patients and provides valuable epidemiological data. The new system enables both the emergency personnel in the ambulance, or at the scene, and the medical staff in each hospital to share up-to-date information about available hospitals by means of cloud computing. All 55 ambulances in Saga are equipped with tablet computers connected through third generation/long term evolution networks. When the emergency personnel arrive on the scene and discern the type of the patient's illness, they can search for an appropriate facility/hospital with their tablet computers based on the patient's symptoms and the available medical specialists. Data were collected prospectively over a three-year period from April 1, 2011 to March 31, 2013. The transportation time by ambulance in Saga was shortened for the first time since statistics were first kept in 1999; the mean time was 34.3 minutes in 2010 (based on administrative statistics) and 33.9 minutes (95% CI 33.6-34.1) in 2011.
The ratio of transportation to the tertiary care facilities in Saga has decreased by 3.12% from the year before, 32.7% in 2010 (regional average) and 29.58% (9085/30,709) in 2011. The system entry completion rate by the emergency personnel was 100.00% (93,110/93,110) and by the medical staff was 46.11% (14,159/30,709) to 47.57% (14,639/30,772) over a three-year period. Finally, the new system reduced the operational costs by 40,000,000 yen (about $400,000 US dollars) a year. The transportation time by ambulance was shorter following the implementation of the tablet computer in the current support system of EMS in Saga Prefecture, Japan. The cloud computing reduced the cost of the EMS system.
de Lima, Camila; Salomão Helou, Elias
2018-01-01
Iterative methods for tomographic image reconstruction have the computational cost of each iteration dominated by the computation of the (back)projection operator, which takes roughly O(N³) floating point operations (flops) for N × N pixel images. Furthermore, classical iterative algorithms may take too many iterations to achieve acceptable images, making these techniques impractical for high-resolution images. Techniques have been developed in the literature to reduce the computational cost of the (back)projection operator to O(N² log N) flops. Also, incremental algorithms have been devised that reduce by an order of magnitude the number of iterations required to achieve acceptable images. The present paper introduces an incremental algorithm with a cost of O(N² log N) flops per iteration and applies it to the reconstruction of very large tomographic images obtained from synchrotron-light-illuminated data.
The European computer model for optronic system performance prediction (ECOMOS)
NASA Astrophysics Data System (ADS)
Repasi, Endre; Bijl, Piet; Labarre, Luc; Wittenstein, Wolfgang; Bürsing, Helge
2017-05-01
ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the visible or thermal infrared (IR). The project involves close co-operation of defence and security industry and public research institutes from France, Germany, Italy, The Netherlands and Sweden. ECOMOS uses and combines well-accepted existing European tools to build up a strong competitive position. These include two TA models: the analytical TRM4 model and the image-based TOD model. In addition, it uses the atmosphere model MATISSE. In this paper, the central idea of ECOMOS is presented. The overall software structure and the underlying models are shown and elucidated. The status of the project development is given, as well as a short outlook on validation tests and the future potential of simulation for sensor assessment.
Smith, W; Bedayse, S; Lalwah, S L; Paryag, A
2009-08-01
The University of the West Indies (UWI) Dental School is planning to implement computer-based information systems to manage student and patient data. In order to measure the acceptance of the proposed implementation and to determine the degree of training that would be required, a survey was undertaken of the computer literacy and attitudes of all staff and students. Data were collected via 230 questionnaires distributed to all staff and students. A 78% response rate was obtained. The computer literacy of the majority of respondents was ranked as 'more than adequate' compared to other European dental schools. Respondents < 50 years had significantly higher computer literacy scores than older age groups (P < 0.05). Similarly, respondents who owned an email address or a computer, or were members of online social networking sites, had significantly higher computer literacy scores than those who did not (P < 0.05). Sex, nationality and student/staff status were not significant factors. Most respondents felt that computer literacy should be a part of every modern undergraduate curriculum; that computer-assisted learning applications and web-based learning activities could effectively supplement the traditional undergraduate curriculum; and that a suitable information system would improve efficiency in the school's management of students, teaching and clinics. The implementation of a computer-based information system is likely to have widespread acceptance among students and staff at the UWI Dental School. The computer literacy of the students and staff is on par with that of schools in the US and Europe.
Performance Prediction Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chennupati, Gopinath; Santhi, Nanadakishore; Eidenbenz, Stephen
The Performance Prediction Toolkit (PPT) is a scalable co-design tool that contains the hardware and middleware models, which accept proxy applications as input for runtime prediction. PPT relies on Simian, a parallel discrete event simulation engine in Python or Lua, that uses the process concept, where each computing unit (host, node, core) is a Simian entity. Processes perform their task through message exchanges to remain active, sleep, wake up, begin and end. The PPT hardware model of a compute core (such as a Haswell core) consists of a set of parameters, such as clock speed, memory hierarchy levels, their respective sizes, cache lines, access times for different cache levels, average cycle counts of ALU operations, etc. These parameters are ideally read off a spec sheet or are learned using regression models trained on hardware counter (PAPI) data. The compute core model offers an API to the software model, a function called time_compute(), which takes as input a tasklist. A tasklist is an unordered set of ALU and other CPU-type operations (in particular virtual memory loads and stores). The PPT application model mimics the loop structure of the application and replaces the computational kernels with a call to the hardware model's time_compute() function, giving tasklists as input that model the compute kernel. A PPT application model thus consists of tasklists representing kernels and the higher-level loop structure that we like to think of as pseudo code. The key challenge for the hardware model's time_compute() function is to translate virtual memory accesses into actual cache hierarchy level hits and misses. PPT also contains another CPU core level hardware model, the Analytical Memory Model (AMM). The AMM solves this challenge soundly, whereas our previous alternatives explicitly included the L1, L2, L3 hit rates as inputs to the tasklists.
Explicit hit rates inevitably only reflect the application modeler's best guess, perhaps informed by a few small test problems using hardware counters; also, hard-coded hit rates make the hardware model insensitive to changes in cache sizes. Alternatively, we use reuse distance distributions in the tasklists. In general, reuse profiles require the application modeler to run a very expensive trace analysis on the real code, which realistically can be done at best for small examples.
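The time_compute() idea described above, i.e. converting a tasklist of operation counts into a predicted runtime via per-level cache costs, can be sketched as follows. The parameter names, the dictionary-based tasklist shape, and all numeric values are illustrative assumptions, not PPT's actual API (which, as the text notes, derives hit fractions from reuse-distance distributions rather than taking them as input):

```python
# Hypothetical Haswell-like core parameters, in the spirit of PPT's
# spec-sheet inputs (names and values are assumptions for this sketch).
CORE = {
    "clock_hz": 2.3e9,
    "cycles_per_alu_op": 1.0,
    "cache_access_cycles": {"L1": 4, "L2": 12, "L3": 36, "DRAM": 200},
}

def time_compute(tasklist, hit_fractions, core=CORE):
    """Predict a kernel's runtime from an unordered tasklist.
    tasklist: {"alu_ops": int, "mem_ops": int}
    hit_fractions: fraction of memory ops served at each level (sums to 1)."""
    cycles = tasklist["alu_ops"] * core["cycles_per_alu_op"]
    for level, frac in hit_fractions.items():
        cycles += tasklist["mem_ops"] * frac * core["cache_access_cycles"][level]
    return cycles / core["clock_hz"]

# A kernel with 1e9 ALU ops and 2e8 loads/stores, mostly L1-resident:
t = time_compute({"alu_ops": 1_000_000_000, "mem_ops": 200_000_000},
                 {"L1": 0.9, "L2": 0.07, "L3": 0.02, "DRAM": 0.01})
```

The sketch makes the trade-off concrete: hard-coding the hit fractions bakes one cache configuration into the model, which is exactly the insensitivity the AMM's reuse-distance approach avoids.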
Scheduling periodic jobs that allow imprecise results
NASA Technical Reports Server (NTRS)
Chung, Jen-Yao; Liu, Jane W. S.; Lin, Kwei-Jay
1990-01-01
The problem of scheduling periodic jobs in hard real-time systems that support imprecise computations is discussed. Two workload models of imprecise computations are presented. These models differ from traditional models in that a task may be terminated any time after it has produced an acceptable result. Each task is logically decomposed into a mandatory part followed by an optional part. In a feasible schedule, the mandatory part of every task is completed before the deadline of the task. The optional part refines the result produced by the mandatory part to reduce the error in the result. Applications are classified as type N and type C, according to the undesirable effects of errors, and the two workload models characterize the two types of applications. The optional parts of the tasks in a type-N job need not ever be completed. The resulting quality of each type-N job is measured in terms of the average error in the results over several consecutive periods. A class of preemptive, priority-driven algorithms that leads to feasible schedules with small average error is described and evaluated.
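The mandatory/optional decomposition above can be made concrete with a toy single-period scheduler: mandatory parts must fit in the period, leftover slack refines optional parts in priority order, and a task's error is its unfinished optional fraction. This is a simplified sketch of the workload model, not the paper's actual algorithms:

```python
def run_period(tasks, budget):
    """One period of an imprecise-computation schedule (simplified sketch).
    tasks: (mandatory, optional, priority) tuples of time units, where a
    smaller priority number means higher priority. All mandatory parts must
    fit in the budget; remaining slack refines optional parts in priority
    order. Returns each task's error (unfinished optional fraction), in
    priority order."""
    mandatory_total = sum(m for m, _, _ in tasks)
    if mandatory_total > budget:
        raise ValueError("infeasible: mandatory work exceeds the period")
    slack = budget - mandatory_total
    errors = []
    for m, o, _ in sorted(tasks, key=lambda t: t[2]):  # high priority first
        done = min(o, slack)
        slack -= done
        errors.append((o - done) / o if o else 0.0)
    return errors  # averaging over periods gives a type-N job's quality

errs = run_period([(2, 4, 0), (1, 2, 1), (3, 0, 2)], budget=8)
```

With 6 units of mandatory work in an 8-unit period, 2 units of slack go to the highest-priority optional part, so the three errors are 0.5, 1.0, and 0.0; a type-N job's quality would average such errors over consecutive periods.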
An electron beam linear scanning mode for industrial limited-angle nano-computed tomography.
Wang, Chengxiang; Zeng, Li; Yu, Wei; Zhang, Lingli; Guo, Yumeng; Gong, Changcheng
2018-01-01
Nano-computed tomography (nano-CT), a high-spatial-resolution, non-destructive research technique that utilizes X-rays to investigate the inner structure of small objects, has been widely utilized in biomedical research, electronic technology, geology, material sciences, etc. A traditional nano-CT scanning mode requires very high mechanical precision and stability of the object manipulator for high-resolution imaging, which is difficult to achieve when the scanned object is continuously rotated. To reduce the scanning time and attain stable, high-resolution imaging in industrial non-destructive testing, we study an electron beam linear scanning mode for a nano-CT system that avoids the mechanical vibration and object movement caused by continuous rotation of the object. Furthermore, to further save scanning time and study how small the scanning range can be made with acceptable spatial resolution, an alternating iterative algorithm based on ℓ0 minimization is applied to the limited-angle nano-CT reconstruction problem with the electron beam linear scanning mode. The experimental results confirm the feasibility of the electron beam linear scanning mode of the nano-CT system.
A multi-user real time inventorying system for radioactive materials: a networking approach.
Mehta, S; Bandyopadhyay, D; Hoory, S
1998-01-01
A computerized system for radioisotope management and real-time inventory coordinated across a large organization is reported. It handles hundreds of individual users and their separate inventory records. Use of highly efficient computer network and database technologies makes it possible to accept, maintain, and furnish all records related to receipt, usage, and disposal of radioactive materials for the users separately and collectively. The system's central processor is an HP-9000/800 G60 RISC server, and users from across the organization use their personal computers to log in to this server using the TCP/IP networking protocol, which makes distributed use of the system possible. Radioisotope decay is automatically calculated by the program, so that it can make the up-to-date radioisotope inventory data of an entire institution available immediately. The system is specifically designed to allow use by large numbers of users (about 300) and accommodates high volumes of data input and retrieval without compromising simplicity and accuracy. Overall, it is an example of a true multi-user, on-line, relational database information system that makes the functioning of a radiation safety department efficient.
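The automatic decay calculation mentioned above is the standard exponential-decay correction, A(t) = A₀ · 2^(−t/T½). A minimal sketch of how an inventory system might apply it follows; the function name and table are illustrative (the half-life values are standard nuclide data, not taken from the paper):

```python
# Decay correction as an inventory system might apply it:
# A(t) = A0 * 2**(-t / half_life).
HALF_LIFE_DAYS = {"P-32": 14.29, "I-125": 59.4, "S-35": 87.4}

def current_activity(a0_mci, nuclide, days_elapsed):
    """Activity (mCi) remaining after days_elapsed, given initial activity
    a0_mci at the reference date stored in the inventory record."""
    return a0_mci * 2.0 ** (-days_elapsed / HALF_LIFE_DAYS[nuclide])

# 1 mCi of P-32 after one half-life (14.29 days) is 0.5 mCi:
print(current_activity(1.0, "P-32", 14.29))  # → 0.5
```

Applying this correction at query time is what lets such a system report up-to-date institution-wide activity totals without any manual bookkeeping between audits.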
Lyngholm, Ann Marie; Pedersen, Begitte H; Petersen, Lars J
2008-09-01
Intestinal activity near the inferior myocardial wall is an issue for assessment of myocardial perfusion imaging (MPI) with 99mTc-labelled tracers. The aim of this study was to investigate the effect of time and food on upper abdominal activity in 99mTc-tetrofosmin MPI. The study population consisted of 152 consecutive patients referred for routine MPI. All patients underwent 2-day stress-rest 99mTc-tetrofosmin single-photon emission computed tomography MPI. Before stress testing, patients were randomized in a factorial design to four different regimens. Group A: early scan (image acquisition initiated within 15 min after injection of the tracer) and no food; group B: early scan and food (two pieces of white bread with butter and a minimum of 450 ml of water); group C: late scan (image acquisition 30-60 min after injection of the tracer) and no food; and group D: late scan and food. Patients underwent a standard bicycle exercise or pharmacological stress test. The degree of upper abdominal activity was evaluated by trained observers blinded to the randomization code. The primary endpoint was the proportion of accepted scans in the intention-to-treat population in stress MPI. The results showed a statistically significant impact of both time and food on upper abdominal activity. On the primary endpoint, the acceptance rate improved from 55% in group A to 100% in group D. An early scan reduced the acceptance rate by 30% versus a late scan (hazard ratio 0.70, 95% confidence interval 0.58-0.84; P<0.0001), whereas the addition of food improved the success rate versus no food by 27% (hazard ratio 1.27, 95% confidence interval 1.07-1.51; P=0.006). No significant interaction between food and time was observed. An analysis of accepted scans according to the actual scan time and food consumption confirmed the findings of the intention-to-treat analysis.
In addition, similar findings were seen in the 116 of 152 patients with a rest MPI (success rate of 53% in group A vs. 96% in group D). A combination of solid food and water administered after injection of the tracer and delayed image acquisition led to a significant and clinically relevant decrease of interfering upper abdominal activity in 99mTc-tetrofosmin MPI.
Computer-generated graphical presentations: use of multimedia to enhance communication.
Marks, L S; Penson, D F; Maller, J J; Nielsen, R T; deKernion, J B
1997-01-01
Personal computers may be used to create, store, and deliver graphical presentations. With computer-generated combinations of the five media (text, images, sound, video, and animation)--that is, multimedia presentations--the effectiveness of message delivery can be greatly increased. The basic tools are (1) a personal computer; (2) presentation software; and (3) a projector to enlarge the monitor images for audience viewing. Use of this new method has grown rapidly in the business-conference world, but has yet to gain widespread acceptance at medical meetings. We review herein the rationale for multimedia presentations in medicine (vis-à-vis traditional slide shows) as an improved means for increasing audience attention, comprehension, and retention. The evolution of multimedia is traced from earliest times to the present. The steps involved in making a multimedia presentation are summarized, emphasizing advances in technology that bring the new method within practical reach of busy physicians. Specific attention is given to software, digital image processing, storage devices, and delivery methods. Our development of a urology multimedia presentation--delivered May 4, 1996, before the Society for Urology and Engineering and now Internet-accessible at http://www.usrf.org--was the impetus for this work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goebel, J
2004-02-27
Without stable hardware any program will fail. The frustration and expense of supporting bad hardware can drain an organization, delay progress, and frustrate everyone involved. At Stanford Linear Accelerator Center (SLAC), we have created a testing method that helps our group, SLAC Computer Services (SCS), weed out potentially bad hardware and purchase the best hardware at the best possible cost. Commodity hardware changes often, so new evaluations happen each time we purchase systems, and minor re-evaluations happen for revised systems for our clusters, about twice a year. This general framework helps SCS perform correct, efficient evaluations. This article outlines SCS's computer testing methods and our system acceptance criteria. We expanded the basic ideas to other evaluations such as storage, and we think the methods outlined in this article have helped us choose hardware that is much more stable and supportable than our previous purchases. We have found that commodity hardware ranges in quality, so a systematic method and tools for hardware evaluation were necessary. This article is based on one instance of a hardware purchase, but the guidelines apply to the general problem of purchasing commodity computer systems for production computational work.
Ladan, Muhammad Awwal; Wharrad, Heather; Windle, Richard
2018-03-09
Technologies have been recognised globally to improve productivity across different areas of practice, including healthcare. This has been achieved through the expansion of computers and other forms of information technology. Despite this advancement, there has also been a growing challenge in the adoption and use of these technologies within practice, especially in healthcare. The evolution of information technologies, and more specifically e-health, within healthcare practice has its own barriers and facilitators. This paper describes a pilot study to explore the factors that influence information technology adoption and use by health professionals in clinical areas in Sub-Saharan Africa. We report on the use of Q-methodology and models of technology acceptance in combination for the first time. The methodology used for this study aims to explore the subjectivity of healthcare professionals and present their shared views (factors) on their adoption and use of e-health within clinical practice.
Evaluation of a transfinite element numerical solution method for nonlinear heat transfer problems
NASA Technical Reports Server (NTRS)
Cerro, J. A.; Scotti, S. J.
1991-01-01
Laplace transform techniques have been widely used to solve linear, transient field problems. A transform-based algorithm enables calculation of the response at selected times of interest without the need for stepping in time as required by conventional time integration schemes. The elimination of time stepping can substantially reduce computer time when transform techniques are implemented in a numerical finite element program. The coupling of transform techniques with spatial discretization techniques such as the finite element method has resulted in what are known as transfinite element methods. Recently attempts have been made to extend the transfinite element method to solve nonlinear, transient field problems. This paper examines the theoretical basis and numerical implementation of one such algorithm, applied to nonlinear heat transfer problems. The problem is linearized and solved by requiring a numerical iteration at selected times of interest. While shown to be acceptable for weakly nonlinear problems, this algorithm is ineffective as a general nonlinear solution method.
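The transform step described above can be written compactly. For a semi-discrete linear transient problem (our notation, not the paper's), the Laplace transform replaces time stepping with a family of algebraic solves, one per selected time of interest:

```latex
% Sketch of the transfinite element idea for the linear case (notation ours):
\begin{align*}
  \mathbf{C}\,\dot{\mathbf{T}}(t) + \mathbf{K}\,\mathbf{T}(t) &= \mathbf{f}(t)
    && \text{(semi-discrete transient heat equation)} \\
  (s\,\mathbf{C} + \mathbf{K})\,\widetilde{\mathbf{T}}(s) &=
    \widetilde{\mathbf{f}}(s) + \mathbf{C}\,\mathbf{T}(0)
    && \text{(Laplace transform in time)} \\
  \mathbf{T}(t^{\ast}) &=
    \mathcal{L}^{-1}\!\left[\widetilde{\mathbf{T}}(s)\right](t^{\ast})
    && \text{(numerical inversion at selected } t^{\ast}\text{)}
\end{align*}
```

For the nonlinear problems the paper studies, the system matrices depend on the solution, which is why the algorithm must linearize and iterate at each selected time.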
Computer land use mapping via TV waveform analysis of space photography
NASA Technical Reports Server (NTRS)
1972-01-01
An instrumentation and computer system which offers the potential for analyzing photogeographic distributions is described. To satisfy the requirement for computer acceptance, a television and waveform system was developed to transpose pictorial, or iconic, photo forms into analytic form. A video conversion was accomplished, and each pattern visible on the original photography was represented by a certain range of percentages. With spatial occurrences in digital form, a computer program was developed that could identify, analyze, and map geographic inputs.
Rectal temperature-based death time estimation in infants.
Igari, Yui; Hosokai, Yoshiyuki; Funayama, Masato
2016-03-01
In determining the time of death in infants based on rectal temperature, the same methods used in adults are generally applied. However, whether the methods for adults are suitable for infants is unclear. In this study, we examined the following 3 methods in 20 infant death cases: computer simulation of rectal temperature based on the infinite cylinder model (Ohno's method), computer-based double exponential approximation based on Marshall and Hoare's double exponential model with Henssge's parameter determination (Henssge's method), and computer-based collinear approximation based on extrapolation of the rectal temperature curve (collinear approximation). The interval between the last time the infant was seen alive and the time that he/she was found dead was defined as the death time interval and compared with the estimated time of death. In Ohno's method, 7 cases were within the death time interval, and the average deviation in the other 12 cases was approximately 80 min. The results of both Henssge's method and collinear approximation were apparently inferior to those of Ohno's method. The corrective factor was set within the range of 0.7-1.3 in Henssge's method, and a modified program was newly developed to make it possible to change the corrective factors. Modification A, in which the upper limit of the corrective factor range was set as the maximum value for each body weight, produced the best results: 8 cases were within the death time interval, and the average deviation in the other 12 cases was approximately 80 min. There was a possibility that the influence of thermal isolation on the actual infants was stronger than that previously shown by Henssge. We conclude that Ohno's method and Modification A are useful for death time estimation in infants. However, it is important to accept the estimated time of death with a certain latitude considering other circumstances. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
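As a rough illustration of the kind of computation Henssge's method involves, the sketch below inverts the commonly published double-exponential cooling formula (valid for ambient temperatures up to about 23 °C) by bisection. The parameter values are the standard published ones; the infant-specific corrective-factor modification the study develops is not reproduced here.

```python
import math

def henssge_q(t_h, weight_kg, corrective=1.0):
    """Normalized rectal cooling Q(t) after t_h hours post mortem."""
    b = -1.2815 * (corrective * weight_kg) ** -0.625 + 0.0284
    return 1.25 * math.exp(b * t_h) - 0.25 * math.exp(5 * b * t_h)

def estimate_pmi(t_rectal, t_ambient, weight_kg, corrective=1.0):
    """Invert Q(t) = (T_rectal - T_amb)/(37.2 - T_amb) by bisection."""
    q_meas = (t_rectal - t_ambient) / (37.2 - t_ambient)
    lo, hi = 0.0, 72.0          # search window in hours
    for _ in range(60):         # Q(t) decreases monotonically in t
        mid = 0.5 * (lo + hi)
        if henssge_q(mid, weight_kg, corrective) > q_meas:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, `estimate_pmi(30.0, 20.0, 5.0)` estimates the post-mortem interval for a 5 kg infant found with a 30 °C rectal temperature in a 20 °C room; the numbers are purely illustrative.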
NASA Technical Reports Server (NTRS)
Winfree, William P.; Howell, Patricia A.; Zalameda, Joseph N.
2014-01-01
Flaw detection and characterization with thermographic techniques in graphite polymer composites are often limited by localized variations in the thermographic response. Variations in properties such as acceptable porosity, fiber volume content and surface polymer thickness result in variations in the thermal response that in general cause significant variations in the initial thermal response. These result in a "noise" floor that increases the difficulty of detecting and characterizing deeper flaws. A method is presented for computationally removing a significant amount of the "noise" from near surface porosity by diffusing the early time response, then subtracting it from subsequent responses. Simulations of the thermal response of a composite are utilized in defining the limitations of the technique. This method for reducing the data is shown to give considerable improvement characterizing both the size and depth of damage. Examples are shown for data acquired on specimens with fabricated delaminations and impact damage.
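The "diffuse the early-time response, then subtract it from subsequent responses" idea can be sketched as follows. A simple box filter stands in for the diffusion step, and the frame layout and function names are our assumptions, not the authors' implementation.

```python
import numpy as np

def box_blur(img, k=5):
    """Separable k x k box filter (k odd), reflect-padded."""
    pad = k // 2
    out = np.pad(img, pad, mode="reflect").astype(float)
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, "valid"), 1, out)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, "valid"), 0, out)
    return out

def suppress_early_pattern(frames):
    """frames: (T, H, W) thermal sequence.

    Estimate the fixed near-surface pattern from the first frame by spatially
    diffusing (blurring) it, then subtract that estimate from later frames.
    """
    pattern = box_blur(frames[0])
    return frames[1:] - pattern[None, :, :]
```

A physically motivated diffusion kernel whose width grows with time would track the actual heat spread more faithfully than this fixed blur; the sketch only conveys the estimate-and-subtract structure.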
Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, L.M.; Hochstedler, R.D.
1997-02-01
Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
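One of the named techniques, replacing linear searches with binary versions, can be illustrated on the sorted-grid interval lookup that Monte Carlo transport codes perform constantly (e.g., locating an energy in a cross-section table). The names and grid below are ours, not the ITS source:

```python
import bisect

def find_interval_linear(grid, x):
    """Return i such that grid[i] <= x < grid[i+1], via an O(n) scan."""
    for i in range(len(grid) - 1):
        if grid[i] <= x < grid[i + 1]:
            return i
    raise ValueError("x outside grid")

def find_interval_binary(grid, x):
    """Same result via bisection, in O(log n) comparisons."""
    if not grid[0] <= x < grid[-1]:
        raise ValueError("x outside grid")
    return bisect.bisect_right(grid, x) - 1
```

In an inner loop executed billions of times per run, the O(n) to O(log n) change alone can account for much of a speed-up factor like those reported.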
NASA Astrophysics Data System (ADS)
Winfree, William P.; Howell, Patricia A.; Zalameda, Joseph N.
2014-05-01
Flaw detection and characterization with thermographic techniques in graphite polymer composites are often limited by localized variations in the thermographic response. Variations in properties such as acceptable porosity, fiber volume content and surface polymer thickness result in variations in the thermal response that in general cause significant variations in the initial thermal response. These result in a "noise" floor that increases the difficulty of detecting and characterizing deeper flaws. A method is presented for computationally removing a significant amount of the "noise" from near surface porosity by diffusing the early time response, then subtracting it from subsequent responses. Simulations of the thermal response of a composite are utilized in defining the limitations of the technique. This method for reducing the data is shown to give considerable improvement characterizing both the size and depth of damage. Examples are shown for data acquired on specimens with fabricated delaminations and impact damage.
Is gross moist stability a useful quantity for studying the moisture mode theory?
NASA Astrophysics Data System (ADS)
Inoue, K.; Back, L. E.
2016-12-01
The idea that the Madden-Julian Oscillation (MJO) is a moisture mode is growing and gaining acceptance. Along with the appearance of the moisture mode theory, a conceptual quantity called the gross moist stability (GMS) has gained increasing attention. However, the GMS is a vexing quantity because it can be interpreted in different ways, depending on the size of the spatial domain over which the GMS is computed and on the computation methodology. We present a few different illustrations of the GMS using satellite observations. We first show GMS variability as a phase transition on a phase plane that we refer to as the GMS plane. Second, we demonstrate that GMS variability shown as a time series, as presented in much past literature, is most likely not relevant to the moisture mode theory. In this talk, we present a protocol for moisture-mode-oriented GMS analyses with satellite observations.
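For orientation, one common normalized form of the GMS is shown below; as the abstract stresses, several definitions are in use, and this one is only illustrative:

```latex
% One common definition (conventions vary, which is part of why the GMS is
% "vexing"): \langle\cdot\rangle denotes a mass-weighted vertical integral,
% h is moist static energy, s is dry static energy, \mathbf{v} the
% horizontal wind.
\Gamma \;=\;
  \frac{\left\langle \nabla \cdot (h\,\mathbf{v}) \right\rangle}
       {\left\langle \nabla \cdot (s\,\mathbf{v}) \right\rangle}
```

Because both the numerator and denominator fluctuate and can pass through zero, a pointwise time series of this ratio can behave very differently from the phase-plane view the abstract advocates.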
Multi-Mounted X-Ray Computed Tomography.
Fu, Jian; Liu, Zhenzhong; Wang, Jingzheng
2016-01-01
Most existing X-ray computed tomography (CT) techniques work in single-mounted mode and need to scan the inspected objects one by one. This is time-consuming and not acceptable for inspection at a large scale. In this paper, we report a multi-mounted CT method and its first engineering implementation. It consists of a multi-mounted scanning geometry and the corresponding algebraic iterative reconstruction algorithm. This approach permits CT rotation scanning of multiple objects simultaneously without an increase in penetration thickness or signal crosstalk. Compared with conventional single-mounted methods, it has the potential to improve imaging efficiency and suppress artifacts from beam hardening and scatter. This work comprises a numerical study of the method and its experimental verification using a dataset measured with a developed multi-mounted X-ray CT prototype system. We believe that this technique is of particular interest for pushing the engineering applications of X-ray CT.
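The reconstruction relies on an algebraic iterative algorithm; a minimal Kaczmarz-style ART sweep, the textbook member of that family, is sketched below. The small, well-conditioned matrix is a random stand-in for the multi-mounted system matrix, not a real CT projector.

```python
import numpy as np

def art_sweeps(A, b, n_sweeps=200, relax=1.0):
    """Kaczmarz / ART: project the iterate onto each row's hyperplane in turn."""
    x = np.zeros(A.shape[1])
    row_norms = (A ** 2).sum(axis=1)
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            x = x + relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

rng = np.random.default_rng(0)
A = np.eye(5) + 0.1 * rng.standard_normal((5, 5))  # well-conditioned toy system
x_true = rng.standard_normal(5)
x_hat = art_sweeps(A, A @ x_true)
```

In the multi-mounted setting, the rows of A would encode ray sums through all simultaneously mounted objects, and the same row-by-row update applies unchanged.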
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-03
... DEPARTMENT OF THE INTERIOR Bureau of Reclamation Time Extension To Accept Proposals, Select One Lessee, and Contract for Hydroelectric Power Development at the Pueblo Dam River Outlet, a Feature of the... period for accepting written proposals detailed in the Notice of Intent to Accept Proposals, Select One...
Wright-Berryman, Jennifer L; Salyers, Michelle P; O'Halloran, James P; Kemp, Aaron S; Mueser, Kim T; Diazoni, Amanda J
2013-12-01
To explore mental health consumer and provider responses to a computerized version of the Illness Management and Recovery (IMR) program. Semistructured interviews were conducted to gather data from 6 providers and 12 consumers who participated in a computerized prototype of the IMR program. An inductive, consensus-based approach was used to analyze the interview responses. Qualitative analysis revealed that consumers perceived various personal benefits and ease of use afforded by the new technology platform. Consumers also highly valued provider assistance and offered several suggestions to improve the program. The largest perceived barriers to future implementation were lack of computer skills and access to computers. Similarly, IMR providers commented on its ease and convenience, and the reduction of time-intensive material preparation. Providers also expressed that the use of technology creates more options for the consumer to access treatment. The technology was acceptable, easy to use, and well-liked by consumers and providers. Clinician assistance with technology was viewed as helpful to get clients started with the program, as lack of computer skills and access to computers was a concern. Access to materials between sessions appears to be desired; however, given the perceived barriers of computer skills and computer access, additional supports may be needed for consumers to achieve the full benefits of a computerized version of IMR. PsycINFO Database Record (c) 2013 APA, all rights reserved.
US EPA - A*Star Partnership - Accelerating the Acceptance of ...
The path for incorporating new alternative methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. Some of these challenges include development of relevant and predictive test systems and computational models to integrate and extrapolate experimental data, and rapid characterization and acceptance of these systems and models. The series of presentations will highlight a collaborative effort between the U.S. Environmental Protection Agency (EPA) and the Agency for Science, Technology and Research (A*STAR) that is focused on developing and applying experimental and computational models for predicting chemical-induced liver and kidney toxicity, brain angiogenesis, and blood-brain-barrier formation. In addressing some of these challenges, the U.S. EPA and A*STAR collaboration will provide a glimpse of what chemical risk assessments could look like in the 21st century. Presentation on US EPA – A*STAR Partnership at international symposium on Accelerating the acceptance of next-generation sciences and their application to regulatory risk assessment in Singapore.
Approximate Bayesian Computation in the estimation of the parameters of the Forbush decrease model
NASA Astrophysics Data System (ADS)
Wawrzynczak, A.; Kopka, P.
2017-12-01
Realistic modeling of a complicated phenomenon such as the Forbush decrease of the galactic cosmic ray intensity is quite a challenging task. One aspect is the numerical solution of the Fokker-Planck equation in five-dimensional space (three spatial variables, the time, and particle energy). The second difficulty arises from a lack of detailed knowledge about the spatial and time profiles of the parameters responsible for the creation of the Forbush decrease. Among these parameters, the diffusion coefficient plays the central role. Assessment of the correctness of the proposed model can be done only by comparison of the model output with the experimental observations of the galactic cosmic ray intensity. We apply the Approximate Bayesian Computation (ABC) methodology to match the Forbush decrease model to experimental data. The ABC method is becoming increasingly exploited for complex dynamic problems in which the likelihood function is costly to compute. The main idea of all ABC methods is to accept a sample as an approximate posterior draw if its associated modeled data are close enough to the observed data. In this paper, we present an application of the Sequential Monte Carlo Approximate Bayesian Computation algorithm scanning the space of the diffusion coefficient parameters. The proposed algorithm is adopted to create the model of the Forbush decrease observed by neutron monitors at the Earth in March 2002. The model of the Forbush decrease is based on the stochastic approach to the solution of the Fokker-Planck equation.
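The accept-if-close-enough idea at the heart of ABC can be sketched with a plain rejection sampler. The Gaussian-mean "model" below is a stand-in for the expensive Fokker-Planck Forbush-decrease simulation, and every name and tolerance is our illustrative choice:

```python
import random

def abc_rejection(obs_stat, simulate, prior_sample, distance, eps, n_accept):
    """Keep prior draws whose simulated summary lands within eps of the data."""
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample()
        if distance(simulate(theta), obs_stat) <= eps:
            accepted.append(theta)
    return accepted

random.seed(1)
observed = [random.gauss(2.0, 1.0) for _ in range(50)]  # "data", true mu = 2
obs_mean = sum(observed) / len(observed)

post = abc_rejection(
    obs_stat=obs_mean,
    simulate=lambda mu: sum(random.gauss(mu, 1.0) for _ in range(50)) / 50,
    prior_sample=lambda: random.uniform(-5.0, 5.0),
    distance=lambda a, b: abs(a - b),
    eps=0.2,
    n_accept=200,
)
```

The Sequential Monte Carlo variant the paper uses keeps this acceptance rule but shrinks eps over a sequence of weighted particle populations, which is far more sample-efficient when each simulation is costly.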
Hu, Jun; Mercer, Jay; Peyton, Liam; Kantarcioglu, Murat; Malin, Bradley; Buckeridge, David; Samet, Saeed; Earle, Craig
2011-01-01
Background Providers have been reluctant to disclose patient data for public-health purposes. Even if patient privacy is ensured, the desire to protect provider confidentiality has been an important driver of this reluctance. Methods Six requirements for a surveillance protocol were defined that satisfy the confidentiality needs of providers and ensure utility to public health. The authors developed a secure multi-party computation protocol using the Paillier cryptosystem to allow the disclosure of stratified case counts and denominators to meet these requirements. The authors evaluated the protocol in a simulated environment on its computation performance and ability to detect disease outbreak clusters. Results Theoretical and empirical assessments demonstrate that all requirements are met by the protocol. A system implementing the protocol scales linearly in terms of computation time as the number of providers is increased. The absolute time to perform the computations was 12.5 s for data from 3000 practices. This is acceptable performance, given that the reporting would normally be done at 24 h intervals. The accuracy of disease outbreak cluster detection was unchanged compared with a non-secure distributed surveillance protocol, with an F-score higher than 0.92 for outbreaks involving 500 or more cases. Conclusion The protocol and associated software provide a practical method for providers to disclose patient data for sentinel, syndromic or other indicator-based surveillance while protecting patient privacy and the identity of individual providers. PMID:21486880
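The additive homomorphism of the Paillier cryptosystem is what allows stratified counts from many providers to be summed without any party seeing the individual counts. A textbook toy implementation is sketched below; the key sizes are insecure and this is not the authors' protocol code:

```python
import math
import random

def keygen(p=1_000_003, q=1_000_033):
    """Toy Paillier keys from two small primes (demonstration only)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1                 # standard simple choice of generator
    mu = pow(lam, -1, n)      # valid decryption constant when g = n + 1
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(g, m, n * n) * pow(r, n, n * n) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    ell = (pow(c, lam, n * n) - 1) // n   # the L(x) = (x - 1)/n function
    return ell * mu % n

def add_encrypted(pub, c1, c2):
    """Multiplying ciphertexts adds plaintexts: E(m1)*E(m2) decrypts to m1+m2."""
    n, _ = pub
    return c1 * c2 % (n * n)
```

A coordinator can multiply the providers' ciphertext counts together and only the holder of the private key ever sees the aggregate total.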
A Real-Time Imaging System for Stereo Atomic Microscopy at SPring-8's BL25SU
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matsushita, Tomohiro; Guo, Fang Zhun; Muro, Takayuki
2007-01-19
We have developed a real-time photoelectron angular distribution (PEAD) and Auger-electron angular distribution (AEAD) imaging system at SPring-8 BL25SU, Japan. In addition, a real-time imaging system for circular dichroism (CD) studies of PEAD/AEAD has been newly developed. Two PEAD images recorded with left- and right-circularly polarized light can be regarded as a stereo image of the atomic arrangement. A two-dimensional display-type mirror analyzer (DIANA) has been installed at the beamline, making it possible to record PEAD/AEAD patterns with an acceptance angle of ±60° in real time. The twin-helical undulators at BL25SU enable helicity switching of the circularly polarized light at 10 Hz, 1 Hz or 0.1 Hz. In order to realize real-time measurements of the CD of the PEAD/AEAD, the CCD camera must be synchronized to the switching frequency. The VME computer that controls the ID is connected to the measurement computer with two BNC cables, and the helicity information is sent using TTL signals. For maximum flexibility, rather than using a hardware shutter synchronized with the TTL signal, we have developed software to synchronize the CCD shutter with the TTL signal. We have succeeded in synchronizing the CCD camera in both the 1 Hz and 0.1 Hz modes.
Schwenke, Michael; Georgii, Joachim; Preusser, Tobias
2017-07-01
Focused ultrasound (FUS) is rapidly gaining clinical acceptance for several target tissues in the human body. Yet, treating liver targets is not clinically applied due to the high complexity of the procedure (noninvasiveness, target motion, complex anatomy, blood cooling effects, shielding by ribs, and limited image-based monitoring). To reduce the complexity, numerical FUS simulations can be utilized for both treatment planning and execution. These use cases demand highly accurate and computationally efficient simulations. We propose a numerical method for the simulation of abdominal FUS treatments during respiratory motion of the organs and target. In particular, a novel approach is proposed to simulate the heating during motion by solving Pennes' bioheat equation in a computational reference space, i.e., the equation is mathematically transformed to the reference. The approach allows for motion discontinuities, e.g., the sliding of the liver along the abdominal wall. Implementing the solver completely on the graphics processing unit and combining it with an atlas-based ultrasound simulation approach yields a simulation performance faster than real time (less than 50 s of computing time for 100 s of treatment time) on a modern off-the-shelf laptop. The simulation method is incorporated into a treatment planning demonstration application that allows simulation of real patient cases including respiratory motion. The high performance of the presented simulation method opens the door to clinical applications. The methods bear the potential to enable the application of FUS for moving organs.
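For reference, Pennes' bioheat equation, which the proposed solver transforms into the computational reference space, reads (symbol choices ours):

```latex
% rho, c: tissue density and specific heat; k: thermal conductivity;
% w_b: blood perfusion rate; rho_b, c_b: blood density and specific heat;
% T_a: arterial blood temperature; Q: absorbed acoustic power density.
\rho c \,\frac{\partial T}{\partial t}
  \;=\; \nabla \cdot \left( k \,\nabla T \right)
  \;+\; \rho_b c_b\, w_b \left( T_a - T \right)
  \;+\; Q
```

The perfusion term is the "blood cooling effect" the abstract lists among the liver-specific complications, and the transformation to reference coordinates is what lets the same equation be solved while the anatomy moves and slides.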
Cimperman, Miha; Makovec Brenčič, Maja; Trkman, Peter
2016-06-01
Although telehealth offers an improved approach to providing healthcare services, its adoption by end users remains slow. With an older population as the main target, these traditionally conservative users pose a big challenge to the successful implementation of innovative telehealth services. The objective of this study was to develop and empirically test a model for predicting the factors affecting older users' acceptance of Home Telehealth Services (HTS). A survey instrument was administered to 400 participants aged 50 years and above from both rural and urban environments in Slovenia. Structural equation modeling was applied to analyze the causal effect of seven hypothesized predicting factors. HTS were introduced as a bundle of functionalities representing future services that do not currently exist. This enabled users' perceptions to be measured on the conceptual level, rather than attitudes to a specific technical solution. Six relevant predictors were confirmed in older users' HTS acceptance behavior, with Performance Expectancy (r=0.30), Effort Expectancy (r=0.49), Facilitating Conditions (r=0.12), and Perceived Security (r=0.16) having a direct impact on behavioral intention to use HTS. In addition, Computer Anxiety is positioned as an antecedent of Effort Expectancy with a strong negative influence (r=-0.61), and Doctor's Opinion showed a strong impact on Performance Expectancy (r=0.31). The results also indicate that Social Influence is an irrelevant predictor of acceptance behavior. The model of six predictors explained 77% of the total variance in the measured Behavioral Intention to Use HTS by older adults. The level at which HTS are perceived as easy to use and manage is the leading predictor of older users' HTS acceptance. Together with Perceived Usefulness and Perceived Security, these three factors represent the key influence on older people's HTS acceptance behavior.
When promoting HTS, interventions should focus on portraying it as secure. Marketing interventions should also focus on promoting HTS among health professionals, using them as social agents to frame the services as useful and beneficial. The important role of computer anxiety may result in a need to use different equipment, such as a tablet computer, to access HTS. Finally, this paper introduces important methodological guidelines for measuring perceptions, on a conceptual level, of future services that currently do not exist. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Test Takers' Attitudes about the TOEFL iBT[TM]. TOEFL iBT Research Report. RR-10-2
ERIC Educational Resources Information Center
Stricker, Lawrence J.; Attali, Yigal
2010-01-01
The principal aims of this study, a conceptual replication of an earlier investigation of the TOEFL[R] computer-based test, or TOEFL CBT, in Buenos Aires, Cairo, and Frankfurt, were to assess test takers' reported acceptance of the TOEFL Internet-based test, or TOEFL iBT[TM], and its associations with possible determinants of this acceptance and…
Total knee arthroplasty with a computer-navigated saw: a pilot study.
Garvin, Kevin L; Barrera, Andres; Mahoney, Craig R; Hartman, Curtis W; Haider, Hani
2013-01-01
Computer-aided surgery aims to improve implant alignment in TKA but has only been adopted by a minority for routine use. A novel approach, navigated freehand bone cutting (NFC), is intended to achieve wider acceptance by eliminating the need for cumbersome, implant-specific mechanical jigs and avoiding the expense of navigation. We determined cutting time, surface quality, implant fit, and implant alignment after NFC of synthetic femoral specimens and the feasibility and alignment of a complete TKA performed with NFC technology in cadaveric specimens. Seven surgeons prepared six synthetic femoral specimens each, using our custom NFC system. Cutting times, quality of bone cuts, and implant fit and alignment were assessed quantitatively by CT surface scanning and computational measurements. Additionally, a single surgeon performed a complete TKA on two cadaveric specimens using the NFC system, with cutting time and implant alignment analyzed through plain radiographs and CT. For the synthetic specimens, femoral coronal alignment was within ± 2° of neutral in 94% of the specimens. Sagittal alignment was within 0° to 5° of flexion in all specimens. Rotation was within ± 1° of the epicondylar axis in 97% of the specimens. The mean time to make cuts improved from 13 minutes for the first specimen to 9 minutes for the fourth specimen. TKA was performed in two cadaveric specimens without complications and implants were well aligned. TKA is feasible with NFC, which eliminates the need for implant-specific instruments. We observed a fast learning curve. NFC has the potential to improve TKA alignment, reduce operative time, and reduce the number of instruments in surgery. Fewer instruments and less sterilization could reduce costs associated with TKA.
Taborri, Juri; Rossi, Stefano; Palermo, Eduardo; Patanè, Fabrizio; Cappa, Paolo
2014-09-02
In this work, we applied a hierarchical weighted decision, proposed and used in other research fields, to the recognition of gait phases. The developed and validated novel distributed classifier is based on a hierarchical weighted decision from the outputs of scalar Hidden Markov Models (HMM) applied to the angular velocities of the foot, shank, and thigh. The angular velocities of ten healthy subjects were acquired via three uni-axial gyroscopes embedded in inertial measurement units (IMUs) during one walking task, repeated three times, on a treadmill. After validating with a cross-validation the novel distributed classifier and the scalar and vectorial classifiers already proposed in the literature, the classifiers were compared for sensitivity, specificity, and computational load for all combinations of the three targeted anatomical segments. Moreover, the performance of the novel distributed classifier in the estimation of gait variability, in terms of mean time and coefficient of variation, was evaluated. The highest values of specificity and sensitivity (>0.98) for the three classifiers examined here were obtained when the angular velocity of the foot was processed. Distributed and vectorial classifiers reached acceptable values (>0.95) when the angular velocities of the shank and thigh were analyzed. Distributed and scalar classifiers showed values of computational load about 100 times lower than that obtained with the vectorial classifier. In addition, distributed classifiers showed excellent reliability for the evaluation of mean time and good/excellent reliability for the coefficient of variation. In conclusion, due to the better performance and the small computational load, the novel distributed classifier proposed here can be implemented in real-time applications of gait phase recognition, such as evaluating gait variability in patients or controlling active orthoses for the recovery of mobility of lower limb joints.
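The hierarchical weighted decision can be sketched as a weighted vote over the per-segment classifier outputs; the labels and weights below are illustrative stand-ins (e.g., a classifier's validation accuracy could serve as its weight), not the paper's actual values.

```python
def weighted_decision(votes, weights):
    """Combine per-classifier predictions by weighted majority.

    votes:   predicted labels, one per scalar classifier (foot, shank, thigh).
    weights: matching reliability weights for each classifier.
    """
    score = {}
    for label, w in zip(votes, weights):
        score[label] = score.get(label, 0.0) + w
    return max(score, key=score.get)
```

For instance, if the foot and thigh classifiers vote "stance" and the shank classifier votes "swing", the decision follows the larger summed weight.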
Bedouch, Pierrick; Tessier, Alexandre; Baudrant, Magalie; Labarere, José; Foroni, Luc; Calop, Jean; Bosson, Jean-Luc; Allenet, Benoît
2012-08-01
To analyse pharmacists' interventions in a setting where a computerized physician order entry system (CPOE) is in use and a pharmacist works on the ward. A prospective cohort study was conducted in seven wards of a French teaching hospital using CPOE along with the presence of a full-time on-ward pharmacy resident. We documented the characteristics of pharmacists' interventions communicated to physicians during the medication order validation process whenever a drug-related problem was identified. Independent predictors of the physician's acceptance of the pharmacist's intervention were assessed using multiple logistic regression analysis. The 448 pharmacists' interventions concerned: non-conformity to guidelines or contraindications (22%), too high doses (19%), drug interactions (15%) and improper administration (15%). The interventions consisted of changes in drug choice (41%), dose adjustment (23%), drug monitoring (19%) and optimization of administration (17%). Interventions were communicated via the CPOE in 57% of cases and 43% orally. The rate of physicians' acceptance was 79.2%. In multivariate analysis, acceptance was significantly associated with the physician's status [higher for residents vs. seniors: OR = 7.23, CI 95 (2.37-22.10), P < 0.01], method of communication [higher for oral vs. computer communication: OR = 12.5, CI 95 (4.16-37.57), P < 0.01] and type of recommendation [higher for drug monitoring vs. drug choice recommendations: OR = 10.32, CI 95 (3.20-33.29), P < 0.01]. When a clinical pharmacist is present on a ward in which a CPOE is in use, the pharmacists' interventions are well accepted by physicians. Specific predictors of the acceptance by physicians emerge, but further research as to the impact of CPOE on pharmacist-physician communication is needed. © 2011 Blackwell Publishing Ltd.
Ehrler, Frederic; Ducloux, Pascal; Wu, Danny T Y; Lovis, Christian; Blondon, Katherine
2018-01-01
Supporting caregivers' workflow with mobile applications (apps) is a growing trend. At the bedside, apps can provide new ways to support the documentation process rather than using a desktop computer in a nursing office. Although these applications show potential, few existing reports have studied the real impact of such solutions. At the University Hospitals of Geneva, we developed BEDside Mobility, a mobile application supporting nurses' daily workflow. In a pilot study, the app was trialed in two wards for a period of one month. We collected data on actual usage of the app and asked the users to complete a tailored technology acceptance model questionnaire at the end of the study period. Results show that participation remained stable over time, with participants using the tool for an average of almost 29 minutes per day. The technology acceptance questionnaires revealed high usability of the app and good promotion by the institution, although users did not perceive any increase in productivity. Overall, intent of use diverged between promoters and antagonists. Furthermore, some participants considered the tool an addition to their workload. This evaluation underlines the importance of helping all end users perceive the benefits of a new intervention, since coworkers strongly influence each other.
Adhami, A; Rabiee, A; Adhami, M
2015-01-01
This paper's aim was to develop a conceptual overview of SMS marketing and delineate the effects of new communications technologies on business practice. This study, a descriptive survey, was built on primary and secondary data sources, including a literature review of SMS marketing; a questionnaire was used as the primary means of data collection. The sample size of 300 patients was determined according to the Cochran formula. Data analysis was performed in SPSS using linear regression, chi-square, t-tests, and the binomial test. According to the research, sex, age, education, relevance, timeliness, reliability of the sender, and sense of control were variables affecting SMS marketing acceptance. This paper was qualitative and provides a solid conceptual foundation for future empirical research on e-marketing. A potential limitation relates to the broad use of computers and mobile phones. In this research, we treated SMS marketing, mobile marketing, and SMS advertising as the same subject. This research will be a useful resource with important insight into the factors that may encourage or determine consumer acceptance of this new form of direct marketing. This paper addressed an important and timely issue, and added to the body of literature and knowledge on e-marketing.
ERIC Educational Resources Information Center
Schlag, Myriam; Imhof, Margarete
2017-01-01
The aim of this study is to contribute to a better understanding of challenges and factors which influence learning efficiency with electronic-portfolios. Based on the "Technology Acceptance Model" (TAM; Davis, Bagozzi, & Warshaw, 1989) we analyzed "external variables" (e.g., computer-anxiety) that influence technology…
ERIC Educational Resources Information Center
Gezgin, Deniz Mertkan; Adnan, Muge; Acar Guvendir, Meltem
2018-01-01
Mobile learning has started to perform an increasingly significant role in improving learning outcomes in education. Successful and efficient implementation of m-learning in higher education, as with all educational levels, depends on users' acceptance of this technology. This study focuses on investigating the attitudes of undergraduate students…
The Ideology of Computer Literacy in Schools.
ERIC Educational Resources Information Center
Mangan, J. Marshall
This research project brings a critical perspective to the examination of computer literacy as an ideological form through a study of the reactions of high school teachers and students. On-site interviews with teachers and students found both acceptance of and resistance to the message of adjustment to an inevitable future of vocational and…
The Effect of Emotional Feedback on Behavioral Intention to Use Computer Based Assessment
ERIC Educational Resources Information Center
Terzis, Vasileios; Moridis, Christos N.; Economides, Anastasios A.
2012-01-01
This study introduces emotional feedback as a construct in an acceptance model. It explores the effect of emotional feedback on behavioral intention to use Computer Based Assessment (CBA). A female Embodied Conversational Agent (ECA) with empathetic encouragement behavior was displayed as emotional feedback. More specifically, this research aims…
The Reality of Computers at the Community College Level.
ERIC Educational Resources Information Center
Leone, Stephen J.
Writing teachers at the community college level who teach using a computer have come to accept the fact that it is more than "just teaching" composition. Such teaching often requires instructors to be as knowledgeable as some of the technicians. Two-year college students and faculty are typically given little support in using computers…
Interdisciplinary Facilities that Support Collaborative Teaching and Learning
ERIC Educational Resources Information Center
Asoodeh, Mike; Bonnette, Roy
2006-01-01
It has become widely accepted that the computer is an indispensable tool in the study of science and technology. Thus, in recent years curricular programs such as Industrial Technology and associated scientific disciplines have been adopting and adapting the computer as a tool in new and innovative ways to support teaching, learning, and research.…
The Importance of Computer Programming Skills to Educational Researchers.
ERIC Educational Resources Information Center
Lawson, Stephen
The use of the modern computer has revolutionized the field of educational research. Software packages are currently available that allow almost anyone to analyze data efficiently and rapidly. Yet, caution must temper the widespread acceptance and use of these programs. It is recommended that the researcher not rely solely on the use of…
Utility and Usability as Factors Influencing Teacher Decisions about Software Integration
ERIC Educational Resources Information Center
Okumus, Samet; Lewis, Lindsey; Wiebe, Eric; Hollebrands, Karen
2016-01-01
Given the importance of teachers in the implementation of computer technology in classrooms, the technology acceptance model and TPACK model were used to better understand the decision-making process teachers use in determining how, when, and where computer software is used in mathematics classrooms. Thirty-four (34) teachers implementing…
ERIC Educational Resources Information Center
Mather, Richard
2015-01-01
This paper explores the application of canonical gradient analysis to evaluate and visualize student performance and acceptance of a learning system platform. The subject of evaluation is a first year BSc module for computer programming. This uses "Ceebot," an animated and immersive game-like development environment. Multivariate…
Trends in Handheld Computing Among Medical Students
Grasso, Michael A.; Yen, M. Jim; Mintz, Matthew L.
2005-01-01
The purpose of this study was to identify trends in the utilization and acceptance of handheld computers (personal digital assistants) among medical students during preclinical and clinical training. These results can be used to identify differences between preclinical and clinical users, differences between current use and idealized use, and perceived limitations of these devices. PMID:16779255
Phantom feet on digital radionuclide images and other scary computer tales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freitas, J.E.; Dworkin, H.J.; Dees, S.M.
1989-09-01
Malfunction of a computer-assisted digital gamma camera is reported. Despite what appeared to be adequate acceptance testing, an error in the system gave rise to switching of images and identification text. A suggestion is made for using a hot marker, which would avoid the potential error of misinterpretation of patient images.
Hale, Leigh A; Satherley, Jessica A; McMillan, Nicole J; Milosavljevic, Stephan; Hijmans, Juha M; King, Marcus J
2012-01-01
This article reports on the perceptions of 14 adults with chronic stroke who participated in a pilot study to determine the utility, acceptability, and potential efficacy of using an adapted CyWee Z handheld game controller to play a variety of computer games aimed at improving upper-limb function. Four qualitative in-depth interviews and two focus groups explored participant perceptions. Data were thematically analyzed with the general inductive approach. Participants enjoyed playing the computer games with the technology. The perceived benefits included improved upper-limb function, concentration, and balance; however, six participants reported shoulder and/or arm pain or discomfort, which presented while they were engaged in play but appeared to ease during rest. Participants suggested changes to the games and provided opinions on the use of computer games in rehabilitation. Using an adapted CyWee Z controller and computer games in upper-limb rehabilitation for people with chronic stroke is an acceptable and potentially beneficial adjunct to rehabilitation. The development of shoulder pain was a negative side effect for some participants and requires further investigation.
Kurth, Ann E.; Severynen, Anneleen; Spielberg, Freya
2014-01-01
HIV testing in emergency departments (EDs) remains underutilized. We evaluated a computer tool to facilitate rapid HIV testing in an urban ED. Randomly assigned non-acute adult ED patients to computer tool (‘CARE’) and rapid HIV testing before standard visit (n=258) or to standard visit (n=259) with chart access. Assessed intervention acceptability and compared noted HIV risks. Participants were 56% non-white, 58% male; median age 37 years. In the CARE arm nearly all (251/258) completed the session and received HIV results; 4 declined test consent. HIV risks were reported by 54% of users and there was one confirmed HIV-positive and 2 false-positives (seroprevalence 0.4%, 95% CI 0.01–2.2%). Half (55%) preferred computerized, over face-to-face, counseling for future HIV testing. In standard arm, one HIV test and 2 referrals for testing occurred. Computer-facilitated HIV testing appears acceptable to ED patients. Future research should assess cost-effectiveness compared with staff-delivered approaches. PMID:23837807
39 CFR 121.1 - First-Class Mail.
Code of Federal Regulations, 2010 CFR
2010-07-01
... First-Class Mail® pieces properly accepted before the day-zero Critical Entry Time at origin, the..., if that mail is accepted before the day-zero Critical Entry Time at origin, if sufficient customer... domestic First-Class Mail pieces properly accepted before the day-zero Critical Entry Time at origin if a 1...
Computer Controlled Portable Greenhouse Climate Control System for Enhanced Energy Efficiency
NASA Astrophysics Data System (ADS)
Datsenko, Anthony; Myer, Steve; Petties, Albert; Hustek, Ryan; Thompson, Mark
2010-04-01
This paper discusses a student project at Kettering University focusing on the design and construction of an energy efficient greenhouse climate control system. In order to maintain acceptable temperatures and stabilize temperature fluctuations in a portable plastic greenhouse economically, a computer controlled climate control system was developed to capture and store thermal energy incident on the structure during daylight periods and release the stored thermal energy during dark periods. The thermal storage mass for the greenhouse system consisted of a water-filled base unit. The heat exchanger consisted of a system of PVC tubing. The control system used a programmable LabVIEW computer interface to meet functional specifications that minimized temperature fluctuations and recorded data during operation. The greenhouse was a portable-sized unit with a 5' x 5' footprint. Control inputs were temperature, water-level, and humidity sensors; output control devices were fan-actuating relays and water-fill solenoid valves. A Graphical User Interface was developed to monitor the system, set control parameters, and provide programmable data recording times and intervals.
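The relay-driven control logic described above is essentially threshold control with hysteresis, which keeps actuators from chattering when the temperature hovers near a setpoint. A minimal sketch in Python (the project itself used LabVIEW; the thresholds here are invented for illustration):

```python
def fan_command(temp_c, fan_on, on_above=30.0, off_below=27.0):
    """Hysteresis band: turn the fan relay on above one threshold,
    off below a lower one, and hold state in between."""
    if temp_c > on_above:
        return True
    if temp_c < off_below:
        return False
    return fan_on  # inside the band: hold the previous relay state

state = False
for t in [25.0, 29.0, 31.0, 28.0, 26.0]:
    state = fan_command(t, state)
    print(state)  # → False, False, True, True, False
```

The same pattern applies to the water-fill solenoid valves, with level-sensor readings in place of temperature.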
An expert system for spectroscopic analysis of rocket engine plumes
NASA Technical Reports Server (NTRS)
Reese, Greg; Valenti, Elizabeth; Alphonso, Keith; Holladay, Wendy
1991-01-01
The expert system described in this paper analyzes spectral emissions of rocket engine exhaust plumes and shows major promise for use in engine health diagnostics. Plume emission spectroscopy is an important tool for diagnosing engine anomalies, but it is time-consuming and requires highly skilled personnel. The expert system was created to alleviate such problems. The system accepts a spectral plot in the form of wavelength vs intensity pairs and finds the emission peaks in the spectrum, lists the elemental emitters present in the data and deduces the emitter that produced each peak. The system consists of a conventional language component and a commercially available inference engine that runs on an Apple Macintosh computer. The expert system has undergone limited preliminary testing. It detects elements well and significantly decreases analysis time.
Integration of energy management concepts into the flight deck
NASA Technical Reports Server (NTRS)
Morello, S. A.
1981-01-01
The rapid rise of fuel costs has become a major concern of the commercial aviation industry, and it has become mandatory to seek means by which to conserve fuel. A research program was initiated in 1979 to investigate the integration of fuel-conservative energy/flight management computations and information into today's and tomorrow's flight deck. One completed effort within this program has been the development and flight testing of a fuel-efficient, time-based metering descent algorithm in a research cockpit environment. Research flights have demonstrated that time guidance and control in the cockpit was acceptable to both pilots and ATC controllers. Proper descent planning and energy management can save fuel for the individual aircraft as well as the fleet by helping to maintain a regularized flow into the terminal area.
Moving Force Identification: a Time Domain Method
NASA Astrophysics Data System (ADS)
Law, S. S.; Chan, T. H. T.; Zeng, Q. H.
1997-03-01
The solution for the vertical dynamic interaction forces between a moving vehicle and the bridge deck is analytically derived and experimentally verified. The deck is modelled as a simply supported beam with viscous damping, and the vehicle/bridge interaction force is modelled as a one-point or two-point load with fixed axle spacing, moving at constant speed. The method is based on modal superposition and is developed to identify the forces in the time domain. Both cases of one-point and two-point forces moving on a simply supported beam are simulated. Results of laboratory tests on the identification of the vehicle/bridge interaction forces are presented. Computer simulations and laboratory tests show that the method is effective, and acceptable results can be obtained by combining the use of bending-moment and acceleration measurements.
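The modal-superposition model referred to above admits a compact textbook form. The following is a standard formulation for a single point load P(t) moving at constant speed c on a simply supported damped beam, given here to show the structure being inverted in the time domain; it is not necessarily the authors' exact notation:

```latex
v(x,t) = \sum_{n=1}^{N} \sin\!\frac{n\pi x}{L}\, q_n(t), \qquad
\ddot{q}_n + 2\xi_n \omega_n \dot{q}_n + \omega_n^2 q_n
  = \frac{2\,P(t)}{\rho A L}\, \sin\!\frac{n\pi c t}{L}
```

Force identification then recovers P(t) from measured bending moments and accelerations by discretizing these modal equations over the measurement window and solving the resulting inverse problem.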
Lehane, Christine M; Nielsen, Tine; Wittich, Walter; Langer, Shelby; Dammeyer, Jesper
2018-03-30
Hearing-, vision-, and dual-sensory loss have been linked to relational and psychological distress among adults with sensory loss (AWSLs) and their spouses. Regardless, research on factors associated with couples' adjustment is lacking. This study examined the stability and strength of associations between self-acceptance of sensory loss, perceived partner acceptance of sensory loss, and relationship satisfaction and psychological distress among AWSLs and their spouses over time. A total of 122 AWSLs and their spouses completed an online survey at two time points over a 6-month period. A multigroup (i.e., time 1 and time 2) actor-partner interdependence model assessed the stability and strength of actor and partner effects of self-acceptance and perceived partner acceptance on each partner's relationship satisfaction and psychological distress over time. No moderation by time was identified, indicating stability in associations over the 6-month period. Overall, both actor and partner effects were evident. Specifically, self-acceptance among AWSLs was inversely associated with own psychological distress and the relationship satisfaction of spouses. Self-acceptance by spouses was inversely associated with the psychological distress of AWSLs and spouses. Perception of spouse acceptance by AWSLs was positively associated with own and spouse relationship satisfaction. Interventions targeting acceptance that incorporate a family systems perspective may be beneficial in alleviating psychological and relational distress among couples coping with sensory loss. Statement of contribution What is already known on this subject? The experience of hearing and/or vision loss has been linked to heightened distress both psychologically and within intimate relationships. Prior research has demonstrated a link between an individual's ability to accept their sensory loss and healthier well-being. What does this study add? 
This is the first dyadic study of sensory loss acceptance and its link to relationship satisfaction and distress. Acceptance operates interpersonally protecting against distress for those with sensory loss and their spouses. Perceiving that one's spouse accepts the sensory loss is important for both partner's relationship satisfaction. © 2018 The British Psychological Society.
CMSA: a heterogeneous CPU/GPU computing system for multiple similar RNA/DNA sequence alignment.
Chen, Xi; Wang, Chen; Tang, Shanjiang; Yu, Ce; Zou, Quan
2017-06-24
The multiple sequence alignment (MSA) is a classic and powerful technique for sequence analysis in bioinformatics. With the rapid growth of biological datasets, MSA parallelization becomes necessary to keep its running time at an acceptable level. Although there is much work on MSA problems, existing approaches are either insufficient or contain implicit assumptions that limit their generality. First, the characteristics of users' sequences, including dataset size and sequence length, can take arbitrary values and are generally unknown before submission, which previous work unfortunately ignores. Second, the center star strategy is suited to aligning similar sequences, but its first stage, center sequence selection, is highly time-consuming and requires further optimization. Moreover, given the heterogeneous CPU/GPU platform, prior studies consider MSA parallelization on GPU devices only, leaving the CPUs idle during the computation. Co-run computation, however, can maximize the utilization of the computing resources by enabling workload computation on both CPU and GPU simultaneously. This paper presents CMSA, a robust and efficient MSA system for large-scale datasets on the heterogeneous CPU/GPU platform. It performs and optimizes multiple sequence alignment automatically for users' submitted sequences without any assumptions. CMSA adopts the co-run computation model so that both CPU and GPU devices are fully utilized. Moreover, CMSA proposes an improved center star strategy that reduces the time complexity of its center sequence selection process from O(mn²) to O(mn). The experimental results show that CMSA achieves up to an 11× speedup and outperforms state-of-the-art software. CMSA focuses on multiple similar RNA/DNA sequence alignment and proposes a novel bitmap-based algorithm to improve the center star strategy.
We can conclude that harvesting the high performance of modern GPU is a promising approach to accelerate multiple sequence alignment. Besides, adopting the co-run computation model can maximize the entire system utilization significantly. The source code is available at https://github.com/wangvsa/CMSA .
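The first stage of the center star strategy, as described above, picks as center the sequence with the minimal total distance to all the others. A naive sketch of that selection using dynamic-programming edit distance (this is the costly baseline the paper optimizes; the bitmap-based algorithm itself is not reproduced here):

```python
def edit_distance(a, b):
    # Standard O(|a||b|) dynamic-programming (Levenshtein) edit distance,
    # keeping only one row of the table at a time.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # (mis)match
        prev = cur
    return prev[-1]

def center_sequence(seqs):
    """Index of the sequence minimizing the sum of distances to all others."""
    totals = [sum(edit_distance(s, t) for t in seqs) for s in seqs]
    return totals.index(min(totals))

seqs = ["ACGTACGT", "ACGAACGT", "ACGTACGA", "TTTTACGT"]
print(center_sequence(seqs))  # → 0
```

With m sequences of length n this naive version costs O(m²n²) pairwise comparisons, which is why reducing the per-comparison cost, as CMSA's bitmap algorithm does, matters for large similar-sequence datasets.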
Acceptance of Internet Banking Systems among Young Managers
NASA Astrophysics Data System (ADS)
Ariff, Mohd Shoki Md; M, Yeow S.; Zakuan, Norhayati; Zaidi Bahari, Ahamad
2013-06-01
The aim of this paper is to determine acceptance of internet banking systems among potential young users, specifically future young managers. The relationships and effects of computer self-efficacy (CSE) and an extended technology acceptance model (TAM) on behavioural intention (BI) to use internet banking systems were examined. Measures of CSE, TAM, and BI were adapted from previous studies; however, the TAM construct was extended by adding a new variable, perceived credibility (PC). A questionnaire survey was conducted to determine acceptance levels of CSE, TAM, and BI. Data were obtained from 275 Technology Management students pursuing undergraduate studies at a Malaysian public university. Confirmatory factor analysis identified four determinant factors of internet banking acceptance: computer self-efficacy (CSE) and three TAM constructs, namely perceived usefulness (PU), perceived ease of use (PE), and perceived credibility (PC). The findings indicated that CSE has a positive effect on the PU and PE of internet banking systems. Respondents' CSE also positively affected their PC of the systems, indicating that the higher one's computer skills, the greater the concern for the security and privacy issues captured by PC. Multiple regression analysis indicated that only two TAM constructs, PU and PC, were significantly associated with BI. The future managers' CSE indirectly affected their BI to use internet banking systems through the PU and PC of TAM, while TAM had direct effects on respondents' BI to use the systems. Both CSE and the PU and PC of TAM were good predictors for understanding individual responses to information technology. Surprisingly, the PE of the original TAM was an insignificant predictor of users' attitudes towards the use of information technology systems.
NASA Technical Reports Server (NTRS)
Saracino, G.; Greenberg, N. L.; Shiota, T.; Corsi, C.; Lamberti, C.; Thomas, J. D.
2002-01-01
Real-time three-dimensional echocardiography (RT3DE) is an innovative cardiac imaging modality. However, partly due to lack of user-friendly software, RT3DE has not been widely accepted as a clinical tool. The object of this study was to develop and implement a fast and interactive volume renderer of RT3DE datasets designed for a clinical environment where speed and simplicity are not secondary to accuracy. Thirty-six patients (20 regurgitation, 8 normal, 8 cardiomyopathy) were imaged using RT3DE. Using our newly developed software, all 3D data sets were rendered in real-time throughout the cardiac cycle and assessment of cardiac function and pathology was performed for each case. The real-time interactive volume visualization system is user friendly and instantly provides consistent and reliable 3D images without expensive workstations or dedicated hardware. We believe that this novel tool can be used clinically for dynamic visualization of cardiac anatomy.
Ritchie, David W; Kozakov, Dima; Vajda, Sandor
2008-09-01
Predicting how proteins interact at the molecular level is a computationally intensive task. Many protein docking algorithms begin by using fast Fourier transform (FFT) correlation techniques to find putative rigid-body docking orientations. Most such approaches use 3D Cartesian grids and are therefore limited to computing three-dimensional (3D) translational correlations. However, translational FFTs can speed up the calculation in only three of the six rigid-body degrees of freedom, and they cannot easily incorporate prior knowledge about a complex to focus and hence further accelerate the calculation. Furthermore, several groups have developed multi-term interaction potentials and others use multi-copy approaches to simulate protein flexibility, both of which add to the computational cost of FFT-based docking algorithms. Hence there is a need to develop more powerful and more versatile FFT docking techniques. This article presents a closed-form 6D spherical polar Fourier correlation expression from which arbitrary multi-dimensional, multi-property, multi-resolution FFT correlations may be generated. The approach is demonstrated by calculating 1D, 3D and 5D rotational correlations of 3D shape and electrostatic expansions up to polynomial order L=30 on a 2 GB personal computer. As expected, 3D correlations are found to be considerably faster than 1D correlations but, surprisingly, 5D correlations are often slower than 3D correlations. Nonetheless, we show that 5D correlations will be advantageous when calculating multi-term knowledge-based interaction potentials. When docking the 84 complexes of the Protein Docking Benchmark, blind 3D shape plus electrostatic correlations take around 30 minutes on a contemporary personal computer and find acceptable solutions within the top 20 in 16 cases. Applying a simple angular constraint to focus the calculation around the receptor binding site produces acceptable solutions within the top 20 in 28 cases.
Further constraining the search to the ligand binding site gives up to 48 solutions within the top 20, with calculation times of just a few minutes per complex. Hence the approach described provides a practical and fast tool for rigid body protein-protein docking, especially when prior knowledge about one or both binding sites is available.
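The translational FFT trick underlying such docking methods scores every relative shift of two grids at once via the correlation theorem. A minimal 3D example with NumPy (a generic illustration of the Cartesian-grid baseline discussed above, not the paper's spherical polar formulation):

```python
import numpy as np

def all_translation_scores(receptor, ligand):
    """Cyclic cross-correlation of two 3D grids over every shift at once:
    score[d] = sum_r receptor[r] * ligand[r - d], computed via FFTs."""
    F = np.fft.fftn(receptor)
    G = np.fft.fftn(ligand)
    return np.real(np.fft.ifftn(F * np.conj(G)))

rng = np.random.default_rng(0)
A = rng.random((8, 8, 8))
B = np.roll(A, shift=(2, 1, 3), axis=(0, 1, 2))  # "ligand" = shifted copy

scores = all_translation_scores(A, B)
peak = tuple(int(i) for i in np.unravel_index(np.argmax(scores), scores.shape))
print(peak)  # → (6, 7, 5): the shift (-2, -1, -3) mod 8 that realigns B with A
```

One pair of forward FFTs plus one inverse FFT replaces an explicit loop over all N³ translations, which is exactly why the remaining rotational degrees of freedom, not covered by this trick, dominate the cost that the paper's higher-dimensional correlations address.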
Quantitative Prediction of Computational Quality (so the S and C Folks will Accept it)
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Luckring, James M.; Morrison, Joseph H.
2004-01-01
Our choice of title may seem strange but we mean each word. In this talk, we are not going to be concerned with computations made "after the fact", i.e. those for which data are available and which are being conducted for explanation and insight. Here we are interested in preventing S&C design problems by finding them through computation before data are available. For such a computation to have any credibility with those who absorb the risk, it is necessary to quantitatively PREDICT the quality of the computational results.
A general derivation and quantification of the third law of thermodynamics.
Masanes, Lluís; Oppenheim, Jonathan
2017-03-14
The most accepted version of the third law of thermodynamics, the unattainability principle, states that no process can reach absolute zero temperature in a finite number of steps and within a finite time. Here, we provide a derivation of the principle that applies to arbitrary cooling processes, even those exploiting the laws of quantum mechanics or involving an infinite-dimensional reservoir. We quantify the resources needed to cool a system to any temperature, and translate these resources into the minimal time or number of steps, by considering the notion of a thermal machine that obeys similar restrictions to universal computers. We generally find that the obtainable temperature can scale as an inverse power of the cooling time. Our results also clarify the connection between two versions of the third law (the unattainability principle and the heat theorem), and place ultimate bounds on the speed at which information can be erased.
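The inverse-power scaling stated above can be written schematically as follows; the exponent and constant depend on the machine's resources as quantified in the paper, so this is a gloss of the abstract rather than a quoted result:

```latex
T_{\min}(t) \;\gtrsim\; \frac{C}{t^{\alpha}}, \qquad \alpha > 0
```

Since T_min tends to 0 only as t tends to infinity, this is the unattainability principle in quantitative form: absolute zero is approached, but never reached, in finite time.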
High-resolution short-exposure small-animal laboratory x-ray phase-contrast tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsson, Daniel H.; Vågberg, William; Yaroshenko, Andre
X-ray computed tomography of small animals and their organs is an essential tool in basic and preclinical biomedical research. In both phase-contrast and absorption tomography, high spatial resolution and short exposure times are of key importance. However, the observable spatial resolutions and achievable exposure times are presently limited by system parameters rather than by more fundamental constraints such as dose. Here we demonstrate laboratory tomography with few-ten-μm spatial resolution and few-minute exposure time at an acceptable dose for small-animal imaging, with both absorption contrast and phase contrast. The method relies on a magnifying imaging scheme in combination with a high-power small-spot liquid-metal-jet electron-impact source. Lastly, the tomographic imaging is demonstrated on an intact mouse, phantoms, and excised lungs, both healthy and with pulmonary emphysema.
NASA Technical Reports Server (NTRS)
Clement, Warren F.; Gorder, Pater J.; Jewell, Wayne F.; Coppenbarger, Richard
1990-01-01
Developing a single-pilot all-weather NOE capability requires fully automatic NOE navigation and flight control. Innovative guidance and control concepts are being investigated to (1) organize the onboard computer-based storage and real-time updating of NOE terrain profiles and obstacles; (2) define a class of automatic anticipative pursuit guidance algorithms to follow the vertical, lateral, and longitudinal guidance commands; (3) automate a decision-making process for unexpected obstacle avoidance; and (4) provide several rapid response maneuvers. Acquired knowledge from the sensed environment is correlated with the recorded environment which is then used to determine an appropriate evasive maneuver if a nonconformity is observed. This research effort has been evaluated in both fixed-base and moving-base real-time piloted simulations thereby evaluating pilot acceptance of the automated concepts, supervisory override, manual operation, and reengagement of the automatic system.
High-resolution short-exposure small-animal laboratory x-ray phase-contrast tomography
Larsson, Daniel H.; Vågberg, William; Yaroshenko, Andre; ...
2016-12-13
X-ray computed tomography of small animals and their organs is an essential tool in basic and preclinical biomedical research. In both phase-contrast and absorption tomography, high spatial resolution and short exposure times are of key importance. However, the observable spatial resolutions and achievable exposure times are presently limited by system parameters rather than by more fundamental constraints such as dose. Here we demonstrate laboratory tomography with few-ten-μm spatial resolution and few-minute exposure times at an acceptable dose for small-animal imaging, with both absorption contrast and phase contrast. The method relies on a magnifying imaging scheme in combination with a high-power small-spot liquid-metal-jet electron-impact source. Lastly, the tomographic imaging is demonstrated on an intact mouse, phantoms, and excised lungs, both healthy and with pulmonary emphysema.
In-flight demonstration of a Real-Time Flush Airdata Sensing (RT-FADS) system
NASA Technical Reports Server (NTRS)
Whitmore, Stephen A.; Davis, Roy J.; Fife, John Michael
1995-01-01
A prototype real-time flush airdata sensing (RT-FADS) system has been developed and flight tested at the NASA Dryden Flight Research Center. This system uses a matrix of pressure orifices on the vehicle nose to estimate airdata parameters in real time using nonlinear regression. The algorithm is robust to sensor failures and noise in the measured pressures. The RT-FADS system has been calibrated using inertial trajectory measurements that were bootstrapped for atmospheric conditions using meteorological data. Mach numbers as high as 1.6 and angles of attack greater than 45 deg have been tested. The system performance has been evaluated by comparing the RT-FADS to the ship system airdata computer measurements to give a quantitative evaluation relative to an accepted measurement standard. Nominal agreements of approximately 0.003 in Mach number and 0.20 deg in angle of attack and angle of sideslip have been achieved.
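The nonlinear-regression idea behind a flush airdata system can be sketched as follows. This is an illustrative toy model, not the RT-FADS flight algorithm: the port angles, the cos² pressure model, and the grid-search-plus-linear-fit solver are all assumptions made for the example.

```python
# Toy flush-airdata estimation: recover angle of attack (alpha), dynamic
# pressure (q), and static pressure (p_inf) from a ring of nose pressure
# ports via least squares. Port model (assumed, not the flight model):
#   p_i = q * cos^2(theta_i - alpha) + p_inf
import numpy as np

PORT_ANGLES = np.deg2rad([-40, -20, 0, 20, 40])  # hypothetical port locations

def simulate_pressures(alpha, q, p_inf):
    """Forward model: pressures at each port for given airdata state."""
    return q * np.cos(PORT_ANGLES - alpha) ** 2 + p_inf

def estimate_airdata(pressures):
    """Grid-search alpha; for each candidate, (q, p_inf) is a linear fit."""
    best = None
    for alpha in np.deg2rad(np.linspace(-30, 30, 601)):  # 0.1 deg steps
        A = np.column_stack([np.cos(PORT_ANGLES - alpha) ** 2,
                             np.ones_like(PORT_ANGLES)])
        coef, *_ = np.linalg.lstsq(A, pressures, rcond=None)
        r = pressures - A @ coef
        sse = float(r @ r)
        if best is None or sse < best[0]:
            best = (sse, alpha, coef[0], coef[1])
    _, alpha, q, p_inf = best
    return alpha, q, p_inf
```

Given noiseless pressures from the same forward model, the estimator recovers the state to within the grid resolution; a real system would iterate a Gauss-Newton step and add fault screening of the measured pressures.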
A real-time navigation monitoring expert system for the Space Shuttle Mission Control Center
NASA Technical Reports Server (NTRS)
Wang, Lui; Fletcher, Malise
1993-01-01
The ONAV (Onboard Navigation) Expert System has been developed as a real-time console assistant for use by ONAV flight controllers in the Mission Control Center at the Johnson Space Center. This expert knowledge-based system is used to monitor the Space Shuttle onboard navigation system, detect faults, and advise flight operations personnel. This application is the first knowledge-based system to use both telemetry and trajectory data from the Mission Operations Computer (MOC). To arrive at this stage, from a prototype to a real-world application, the ONAV project has had to deal with not only AI issues but also operating-environment issues. The AI issues included the maturity of AI languages and debugging tools, verification, and the availability, stability, and size of the expert pool. The environmental issues included real-time data acquisition, hardware suitability, and how to achieve acceptance by users and management.
Beam dynamics validation of the Halbach Technology FFAG Cell for Cornell-BNL Energy Recovery Linac
NASA Astrophysics Data System (ADS)
Méot, F.; Tsoupas, N.; Brooks, S.; Trbojevic, D.
2018-07-01
The Cornell-BNL Electron Test Accelerator (CBETA), a 150 MeV energy recovery linac (ERL) now under construction at Cornell, employs a fixed-field alternating gradient (FFAG) optics return loop: a single beam line comprised of FFAG cells that accepts four recirculated energies. The CBETA FFAG cell uses Halbach permanent-magnet technology; its design studies have covered an extended period of time, supported by extensive particle-dynamics simulations using computed 3-D field-map models. This approach is discussed and illustrated here, based on the final stage in these beam dynamics studies, namely the validation of an ultimate, optimized design of the Halbach cell.
NASA Astrophysics Data System (ADS)
Langenberg, J. H.; Bucur, I. B.; Archirel, P.
1997-09-01
We show that in the simple case of van der Waals ionic clusters, the optimisation of orbitals within VB can easily be simulated with the help of pseudopotentials. The procedure yields the ground and the first excited states of the cluster simultaneously. This makes the calculation of potential energy surfaces for tri- and tetraatomic clusters possible, with very acceptable computation times. We give potential curves for (ArCO)⁺, (ArN₂)⁺ and N₄⁺. An application to the simulation of the SCF method is shown for Na⁺·H₂O.
ASTRYD: A new numerical tool for aircraft cabin and environmental noise prediction
NASA Astrophysics Data System (ADS)
Berhault, J.-P.; Venet, G.; Clerc, C.
ASTRYD is an analytical tool, developed originally for underwater applications, that computes acoustic pressure distribution around three-dimensional bodies in closed spaces like aircraft cabins. The program accepts data from measurements or other simulations, processes them in the time domain, and delivers temporal evolutions of the acoustic pressures and accelerations, as well as the radiated/diffracted pressure at arbitrary points located in the external/internal space. A typical aerospace application is prediction of acoustic load on satellites during the launching phase. An aeronautic application is engine noise distribution on a business jet body for prediction of environmental and cabin noise.
Parallel Simulation of Unsteady Turbulent Flames
NASA Technical Reports Server (NTRS)
Menon, Suresh
1996-01-01
Time-accurate simulation of turbulent flames in high Reynolds number flows is a challenging task since both fluid dynamics and combustion must be modeled accurately. To numerically simulate this phenomenon, very large computer resources (both time and memory) are required. Although current vector supercomputers are capable of providing adequate resources for simulations of this nature, their high cost and limited availability make practical use of such machines less than satisfactory. At the same time, the explicit time integration algorithms used in unsteady flow simulations often possess a very high degree of parallelism, making them very amenable to efficient implementation on large-scale parallel computers. Under these circumstances, distributed memory parallel computers offer an excellent near-term solution for greatly increased computational speed and memory, at a cost that may render unsteady simulations of the type discussed above more feasible and affordable. This paper discusses the study of unsteady turbulent flames using a simulation algorithm that is capable of retaining high parallel efficiency on distributed memory parallel architectures. Numerical studies are carried out using large-eddy simulation (LES). In LES, the scales larger than the grid are computed using a time- and space-accurate scheme, while the unresolved small scales are modeled using eddy viscosity based subgrid models. This is acceptable for the moment/energy closure since the small scales primarily provide a dissipative mechanism for the energy transferred from the large scales. However, for combustion to occur, the species must first undergo mixing at the small scales and then come into molecular contact. Therefore, global models cannot be used.
Recently, a new model for turbulent combustion was developed, in which the combustion is modeled, within the subgrid (small-scales) using a methodology that simulates the mixing and the molecular transport and the chemical kinetics within each LES grid cell. Finite-rate kinetics can be included without any closure and this approach actually provides a means to predict the turbulent rates and the turbulent flame speed. The subgrid combustion model requires resolution of the local time scales associated with small-scale mixing, molecular diffusion and chemical kinetics and, therefore, within each grid cell, a significant amount of computations must be carried out before the large-scale (LES resolved) effects are incorporated. Therefore, this approach is uniquely suited for parallel processing and has been implemented on various systems such as: Intel Paragon, IBM SP-2, Cray T3D and SGI Power Challenge (PC) using the system independent Message Passing Interface (MPI) compiler. In this paper, timing data on these machines is reported along with some characteristic results.
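The eddy-viscosity subgrid closure mentioned for the moment/energy terms can be illustrated with a standard Smagorinsky model. This is a generic sketch, not the paper's subgrid combustion model; the model constant and filter width below are assumed values.

```python
# Smagorinsky eddy viscosity for a 1-D resolved shear profile u(y):
#   nu_t = (cs * delta)^2 * |S|,  with |S| ~ |du/dy| in 1-D.
# cs (Smagorinsky constant) and delta (filter width) are assumed here.
import numpy as np

def smagorinsky_nu_t(u, dy, delta, cs=0.17):
    """Subgrid eddy viscosity from the resolved strain rate."""
    dudy = np.gradient(u, dy)          # resolved strain rate du/dy
    return (cs * delta) ** 2 * np.abs(dudy)
```

For a linear shear profile u = k·y the resolved strain rate is constant, so the model returns the uniform value (cs·delta)²·k, which is how such closures provide a dissipative sink scaled to the local resolved gradients.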
Foong, Rachel E.; Harper, Alana J.; King, Louise; Turkovic, Lidija; Davis, Miriam; Clem, Charles C.; Davis, Stephanie D.; Ranganathan, Sarath; Hall, Graham L.
2018-01-01
The lung clearance index (LCI) from the multiple-breath washout (MBW) test is a promising surveillance tool for pre-school children with cystic fibrosis (CF). Current guidelines for MBW testing recommend that three acceptable trials are required. However, success rates to achieve these criteria are low in children aged <7 years, and feasibility may improve with modified pre-school criteria that accept tests with two acceptable trials. This study aimed to determine if relationships between LCI and clinical outcomes of CF lung disease differ when only two acceptable MBW trials are assessed. Healthy children and children with CF aged 3–6 years were recruited for MBW testing. Children with CF also underwent bronchoalveolar lavage fluid collection and a chest computed tomography scan. MBW feasibility increased from 46% to 75% when tests with two trials were deemed acceptable compared with tests where three acceptable trials were required. Relationships between MBW outcomes and markers of pulmonary inflammation, infection and structural lung disease were not different between tests with three acceptable trials compared with tests with two acceptable trials. This study indicates that pre-school MBW data from two acceptable trials may provide sufficient information on ventilation distribution if three acceptable trials are not possible. PMID:29707562
Control of Cellular Structural Networks Through Unstructured Protein Domains
2016-07-01
Excerpts (recovered from OCR fragments of the report form): a paper accepted to Scientific Reports, and a follow-up paper accepted to Integrative Biology extending these ideas to human pluripotent stem cells (hPSCs), including embryonic and induced pluripotent stem cells; a 2012 Stem Cells Young Investigator Award; project aims include examining morphology, mechanics, and neurogenesis in neural stem cells, and developing and using multiscale computational models.
A historical perspective of the popular use of electric and magnetic therapy.
Basford, J R
2001-09-01
To review the history of the therapeutic use of static electric and magnetic fields and to understand its implications for current popular and medical acceptance of these and other alternative and complementary therapies. Comprehensive MEDLINE (1960-2000) and CINAHL (1982-2000) computer literature searches by using key words such as electricity, magnetism, electromagnetic, therapy, medicine, EMF, history of medicine, and fields. Additional references were obtained from the bibliographies of the selected articles. In addition, discussions were held with curators of medical history museums and supplemental searches were made of Internet sources through various search engines. Primary references were used whenever possible. In a few instances, secondary references, particularly those requiring translations of early texts, were used. The use of electric and magnetic forces to treat disease has intrigued the general public and the scientific community since at least the time of the ancient Greeks. The popularity of these therapies has waxed and waned over the millennia, but at all times the popular imagination, often spurred by dynamic and colorful practitioners of pseudoscience, has been more excited than the medical or political establishment. In fact, a pattern seems to reappear. In each era, unsophisticated public acceptance is met first with medical disdain, then with investigation, and, finally, with a failure to find objective evidence of efficacy. This pattern continues today with the public acceptance of magnetic therapy (and alternative and complementary medicine in general) far outstripping acceptance by the medical community. The therapeutic implications of applying electrical and magnetic fields to heal disease have continually captured the popular imagination. Approaches thousands of years apart can be remarkably similar, but, in each era, proof has been lacking and the prevailing medical establishment has remained unconvinced. Interest persists today. 
Although these agents may have a future role in the healing of human disease, their history and minimal scientific rationale make it unlikely that the dichotomy between the hopes of the public and medical skepticism will disappear.
Improving Individual Acceptance of Health Clouds through Confidentiality Assurance.
Ermakova, Tatiana; Fabian, Benjamin; Zarnekow, Rüdiger
2016-10-26
Cloud computing promises to essentially improve healthcare delivery performance. However, shifting sensitive medical records to third-party cloud providers could create an adoption hurdle because of security and privacy concerns. This study examines the effect of confidentiality assurance in a cloud-computing environment on individuals' willingness to accept the infrastructure for inter-organizational sharing of medical data. We empirically investigate our research question with a survey with over 260 full responses. For the setting with a high confidentiality assurance, we build on a recent multi-cloud architecture which provides very high confidentiality assurance through a secret-sharing mechanism: Health information is cryptographically encoded and distributed in a way that no single and no small group of cloud providers is able to decode it. Our results indicate the importance of confidentiality assurance in individuals' acceptance of health clouds for sensitive medical data. Specifically, this finding holds for a variety of practically relevant circumstances, i.e., in the absence and despite the presence of conventional offline alternatives and along with pseudonymization. On the other hand, we do not find support for the effect of confidentiality assurance in individuals' acceptance of health clouds for non-sensitive medical data. These results could support the process of privacy engineering for health-cloud solutions.
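The secret-sharing idea can be illustrated with a minimal XOR-based n-of-n scheme: a record is split so that all shares are needed to reconstruct it, and any proper subset is indistinguishable from random bytes. This is a simplified stand-in, not the architecture used in the study, and the function names are hypothetical.

```python
# XOR-based n-of-n secret sharing: n-1 shares are random one-time pads,
# the last share is the record XORed with all of them. Each cloud
# provider would hold one share; reconstruction requires every share.
import secrets

def split(record: bytes, n: int) -> list[bytes]:
    """Split a record into n shares, all of which are needed to recover it."""
    shares = [secrets.token_bytes(len(record)) for _ in range(n - 1)]
    last = bytearray(record)
    for s in shares:
        for i, b in enumerate(s):
            last[i] ^= b
    return shares + [bytes(last)]

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original record."""
    out = bytearray(len(shares[0]))
    for s in shares:
        for i, b in enumerate(s):
            out[i] ^= b
    return bytes(out)
```

A threshold scheme such as Shamir's would additionally tolerate the loss of some providers; the XOR variant is the simplest construction with the "no small group can decode" property.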
Benaroia, Mark; Elinson, Roman; Zarnke, Kelly
2007-04-01
Patients can be used as a resource to enter their own pertinent medical information. This study evaluated the feasibility of an intelligent computer medical history-taking device directed at patients in the emergency department (ED). Two of the authors (MB, RE) developed an expert system that takes patient-directed medical histories. Patients interacted with the computer in the ED waiting room while it gathered a medical history based on chief complaint (CC). A survey was completed after each history. A sub-study assessed the computer's ability to take an adequate history for an index CC; we compared the computer and emergency-physician histories for the presence or absence of important historical elements. Sixty-seven patients used the interactive computer system. The mean time to complete the history was 5 min 32 s ± 1 min 21 s. The patient response rate was 97%. Over 83% felt that the computer was very easy to use, and over 92% would very much use the computer again. A total of 15 patients with abdominal pain (the index CC) were evaluated for the sub-study. The computer history asked 90 ± 7%, and the emergency physician asked 55 ± 18%, of the important historical elements; these groups were statistically different, with p < 0.00001. This feasibility study has shown that the computer history-taking device is well accepted by patients and that such a system can be integrated into the normal process of patient triage without delaying patient care. Such a system can serve as an initial mode for documentation and data acquisition directly from the patient.
Computers as learning resources in the health sciences: impact and issues.
Ellis, L B; Hannigan, G G
1986-01-01
Starting with two computer terminals in 1972, the Health Sciences Learning Resources Center of the University of Minnesota Bio-Medical Library expanded its instructional facilities to ten terminals and thirty-five microcomputers by 1985. Computer use accounted for 28% of total center circulation. The impact of these resources on health sciences curricula is described and issues related to use, support, and planning are raised and discussed. Judged by their acceptance and educational value, computers are successful health sciences learning resources at the University of Minnesota. PMID:3518843
Yoon, Hyung-In; Han, Jung-Suk
2016-02-01
The fabrication of dental prostheses with computer-aided design and computer-aided manufacturing shows acceptable marginal fits and favorable treatment outcomes. This clinical report describes the management of a patient who had undergone a mandibulectomy and received an implant-supported fixed prosthesis by using additive manufacturing for the framework and subtractive manufacturing for the monolithic zirconia restorations.
McCarthy, Peter M.
2006-01-01
The Yellowstone River is very important in a variety of ways to the residents of southeastern Montana; however, it is especially vulnerable to spilled contaminants. In 2004, the U.S. Geological Survey, in cooperation with Montana Department of Environmental Quality, initiated a study to develop a computer program to rapidly estimate instream travel times and concentrations of a potential contaminant in the Yellowstone River using regression equations developed in 1999 by the U.S. Geological Survey. The purpose of this report is to describe these equations and their limitations, describe the development of a computer program to apply the equations to the Yellowstone River, and provide detailed instructions on how to use the program. This program is available online at [http://pubs.water.usgs.gov/sir2006-5057/includes/ytot.xls]. The regression equations provide estimates of instream travel times and concentrations in rivers where little or no contaminant-transport data are available. Equations were developed and presented for the most probable flow velocity and the maximum probable flow velocity. These velocity estimates can then be used to calculate instream travel times and concentrations of a potential contaminant. The computer program was developed so estimation equations for instream travel times and concentrations can be solved quickly for sites along the Yellowstone River between Corwin Springs and Sidney, Montana. The basic types of data needed to run the program are spill data, streamflow data, and data for locations of interest along the Yellowstone River. Data output from the program includes spill location, river mileage at specified locations, instantaneous discharge, mean-annual discharge, drainage area, and channel slope. Travel times and concentrations are provided for estimates of the most probable velocity of the peak concentration and the maximum probable velocity of the peak concentration. 
Verification of estimates of instream travel times and concentrations for the Yellowstone River requires information about the flow velocity throughout the 520 mi of river in the study area. Dye-tracer studies would provide the best data about flow velocities and would provide the best verification of instream travel times and concentrations estimated from this computer program; however, data from such studies do not currently (2006) exist, and new studies would be expensive and time-consuming. An alternative approach used in this study for verification of instream travel times is based on the use of flood-wave velocities determined from recorded streamflow hydrographs at selected mainstem streamflow-gaging stations along the Yellowstone River. The ratios of flood-wave velocity to the most probable velocity for the base flow estimated from the computer program are within the accepted range of 2.5 to 4.0 and indicate that flow velocities estimated from the computer program are reasonable for the Yellowstone River. The ratios of flood-wave velocity to the maximum probable velocity are within a range of 1.9 to 2.8 and indicate that the maximum probable flow velocities estimated from the computer program, which correspond to the shortest travel times and maximum probable concentrations, are conservative and reasonable for the Yellowstone River.
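The verification check described above, comparing flood-wave velocity against estimated flow velocity, can be sketched as a simple ratio test against the accepted 2.5 to 4.0 range. The numbers in the usage note are hypothetical, not values from the report.

```python
# Verification sketch: flood-wave velocity is inferred from the timing of a
# hydrograph peak between two gaging stations, then compared with the
# program's estimated flow velocity via the accepted ratio range.
def flood_wave_velocity(distance_mi: float, peak_travel_hr: float) -> float:
    """Flood-wave velocity (mi/h) from gage spacing and peak arrival lag."""
    return distance_mi / peak_travel_hr

def ratio_ok(v_wave: float, v_flow: float, lo: float = 2.5, hi: float = 4.0):
    """Return (within_accepted_range, ratio) for v_wave / v_flow."""
    r = v_wave / v_flow
    return lo <= r <= hi, r
```

For example, a hypothetical peak traveling 7.2 mi between gages in 2.4 h gives a 3.0 mi/h flood-wave velocity; against an estimated 1.0 mi/h base-flow velocity the ratio is 3.0, inside the accepted range.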
Numerical Technology for Large-Scale Computational Electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharpe, R; Champagne, N; White, D
The key bottleneck of implicit computational electromagnetics tools for large complex geometries is the solution of the resulting linear system of equations. The goal of this effort was to research and develop critical numerical technology that alleviates this bottleneck for large-scale computational electromagnetics (CEM). The mathematical operators and numerical formulations used in this arena of CEM yield linear equations that are complex valued, unstructured, and indefinite. Also, simultaneously applying multiple mathematical modeling formulations to different portions of a complex problem (hybrid formulations) results in a mixed-structure linear system, further increasing the computational difficulty. Typically, these hybrid linear systems are solved using a direct solution method, which was acceptable for Cray-class machines but does not scale adequately for ASCI-class machines. Additionally, LLNL's previously existing linear solvers were not well suited for the linear systems that are created by hybrid implicit CEM codes. Hence, a new approach was required to make effective use of ASCI-class computing platforms and to enable next-generation design capabilities. Multiple approaches were investigated, including the latest sparse-direct methods developed by our ASCI collaborators. In addition, approaches that combine domain decomposition (or matrix partitioning) with general-purpose iterative methods and special-purpose preconditioners were investigated. Special-purpose preconditioners that take advantage of the structure of the matrix were adapted and developed based on intimate knowledge of the matrix properties. Finally, new operator formulations were developed that radically improve the conditioning of the resulting linear systems, thus greatly reducing solution time. The goal was to enable the solution of CEM problems that are 10 to 100 times larger than our previous capability.
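The general pattern of pairing an iterative Krylov method with a preconditioner can be sketched on a small symmetric positive-definite system. The complex, indefinite systems in CEM require methods such as GMRES with far more sophisticated preconditioning; this Jacobi-preconditioned conjugate-gradient example only illustrates the structure of the approach.

```python
# Jacobi-preconditioned conjugate gradients: the preconditioner M = diag(A)
# is applied at each iteration to accelerate convergence. Illustrative of
# the iterative-method-plus-preconditioner pattern, not a CEM solver.
import numpy as np

def pcg(A, b, tol=1e-10, maxiter=200):
    """Solve A x = b for SPD A with Jacobi-preconditioned CG."""
    M_inv = 1.0 / np.diag(A)           # inverse of the Jacobi preconditioner
    x = np.zeros_like(b)
    r = b - A @ x                      # initial residual
    z = M_inv * r                      # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p      # new search direction
        rz = rz_new
    return x
```

The preconditioner trades a cheap approximate inverse for fewer iterations; the structure-aware preconditioners described above play the same role for the harder hybrid CEM systems.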
A Computer Solution of the Parking Lot Problem.
ERIC Educational Resources Information Center
Rumble, Richard T.
A computer program has been developed that will accept as inputs the physical description of a portion of land, and the parking design standards to be followed. The program will then give as outputs the numerical and graphical descriptions of the maximum-density parking lot for that portion of land. The problem has been treated as a standard…
Educators' Theories and Beliefs and the Use of Computers in Secondary Schools
ERIC Educational Resources Information Center
Naicker, Visvanathan
2011-01-01
Schools and educators are under considerable pressure to change. Educators have reported varying attitudes to the use of computers, ranging from supportive to negative. However, there is an acceptance that cannot simply be overlooked. It will also be important to consider all aspects of the educators' beliefs, resistance and the anxieties that…
ERIC Educational Resources Information Center
Dominguez, Alfredo
2013-01-01
Cloud computing has emerged as a new paradigm for on-demand delivery and consumption of shared IT resources over the Internet. Research has predicted that small and medium organizations (SMEs) would be among the earliest adopters of cloud solutions; however, this projection has not materialized. This study set out to investigate if behavior…
ERIC Educational Resources Information Center
Paquet, Katherine G.
2013-01-01
Cloud computing may provide cost benefits for organizations by eliminating the overhead costs of software, hardware, and maintenance (e.g., license renewals, upgrading software, servers and their physical storage space, administration along with funding a large IT department). In addition to the promised savings, the organization may require…
Cloud Computing Adoption and Usage in Community Colleges
ERIC Educational Resources Information Center
Behrend, Tara S.; Wiebe, Eric N.; London, Jennifer E.; Johnson, Emily C.
2011-01-01
Cloud computing is gaining popularity in higher education settings, but the costs and benefits of this tool have gone largely unexplored. The purpose of this study was to examine the factors that lead to technology adoption in a higher education setting. Specifically, we examined a range of predictors and outcomes relating to the acceptance of a…
Biosensor Technologies for Augmented Brain-Computer Interfaces in the Next Decades
2012-05-13
Keywords: augmented brain–computer interface (ABCI); biosensor; cognitive-state monitoring; electroencephalogram (EEG); human brain imaging. Manuscript received November 28, 2011; accepted December 20... Modalities discussed include functional magnetic resonance imaging (fMRI) [1], positron emission tomography (PET) [2], electroencephalograms (EEGs) and optical brain imaging techniques (i.e
Examining a Web-Based Peer Feedback System in an Introductory Computer Literacy Course
ERIC Educational Resources Information Center
Adiguzel, Tufan; Varank, Ilhan; Erkoç, Mehmet Fatih; Buyukimdat, Meryem Koskeroglu
2017-01-01
This study focused on formative use of peer feedback in an online system that was used in basic computer literacy for word processing assignment-related purposes. Specifically, the effect of quantity, modality and satisfaction of peer feedback provided through the online system on students' performance, self-efficacy, and technology acceptance was…
ROMI-RIP: Rough Mill RIP-first simulator user's guide
R. Edward Thomas
1995-01-01
The ROugh Mill RIP-first simulator (ROMI-RIP) is a computer software package for IBM compatible personal computers that simulates current industrial practices for gang-ripping lumber. This guide shows the user how to set and examine the results of simulations regarding current or proposed mill practices. ROMI-RIP accepts cutting bills with up to 300 different part...
Automated Estimation Of Software-Development Costs
NASA Technical Reports Server (NTRS)
Roush, George B.; Reini, William
1993-01-01
COSTMODL is an automated software development-estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
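COSTMODL's internal model is not given in the abstract; the basic COCOMO organic-mode equations below illustrate the kind of effort-and-schedule computation such an estimation tool performs from a product-size description.

```python
# Basic COCOMO, organic mode: effort and schedule from estimated size in
# thousands of delivered source lines (KLOC). Illustrative of the class of
# model, not COSTMODL's actual equations or coefficients.
def cocomo_organic(kloc: float):
    """Return (effort in person-months, schedule in calendar months)."""
    effort = 2.4 * kloc ** 1.05        # person-months
    schedule = 2.5 * effort ** 0.38    # calendar months
    return effort, schedule
```

A 10 KLOC product, for instance, estimates to roughly 27 person-months over about 8.7 calendar months, from which average staffing (effort divided by schedule) follows directly.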
How Patient Interactions with a Computer-Based Video Intervention Affect Decisions to Test for HIV
ERIC Educational Resources Information Center
Aronson, Ian David; Rajan, Sonali; Marsch, Lisa A.; Bania, Theodore C.
2014-01-01
The current study examines predictors of HIV test acceptance among emergency department patients who received an educational video intervention designed to increase HIV testing. A total of 202 patients in the main treatment areas of a high-volume, urban hospital emergency department used inexpensive netbook computers to watch brief educational…
ERIC Educational Resources Information Center
Udoh, Emmanuel E.
2010-01-01
Advances in grid technology have enabled some organizations to harness enormous computational power on demand. However, the prediction of widespread adoption of the grid technology has not materialized despite the obvious grid advantages. This situation has encouraged intense efforts to close the research gap in the grid adoption process. In this…
26 CFR 1.669(b)-1A - Tax on distribution.
Code of Federal Regulations, 2010 CFR
2010-04-01
... section (the “exact” method), or (2) The tax computed under paragraph (c) of this section (the “short-cut..., the method used in the return shall be accepted as the method that produces the lesser tax. The... tax imposed by section 668(a)(2). (b) Computation of partial tax by the exact method. The partial tax...
Computer-Assisted Career Guidance Systems: A Part of NCDA History
ERIC Educational Resources Information Center
Harris-Bowlsbey, JoAnn
2013-01-01
The first computer-assisted career planning systems were developed in the late 1960s and were based soundly on the best of career development and decision-making theory. Over the years, this tradition has continued as the technology that delivers these systems' content has improved dramatically and as they have been universally accepted as…