Comfort and experience with online learning: trends over nine years and associations with knowledge.
Cook, David A; Thompson, Warren G
2014-07-01
Some evidence suggests that attitude toward computer-based instruction is an important determinant of success in online learning. We sought to determine how comfort using computers and perceptions of prior online learning experiences have changed over the past decade, and how these associate with learning outcomes. Each year from 2003-2011 we conducted a prospective trial of online learning. As part of each year's study, we asked medicine residents about their comfort using computers and if their previous experiences with online learning were favorable. We assessed knowledge using a multiple-choice test. We used regression to analyze associations and changes over time. 371 internal medicine and family medicine residents participated. Neither comfort with computers nor perceptions of prior online learning experiences showed a significant change across years (p > 0.61), with mean comfort rating 3.96 (maximum 5 = very comfortable) and mean experience rating 4.42 (maximum 6 = strongly agree [favorable]). Comfort showed no significant association with knowledge scores (p = 0.39) but perceptions of prior experiences did, with a 1.56% rise in knowledge score for a 1-point rise in experience score (p = 0.02). Correlations among comfort, perceptions of prior experiences, and number of prior experiences were all small and not statistically significant. Comfort with computers and perceptions of prior experience with online learning remained stable over nine years. Prior good experiences (but not comfort with computers) demonstrated a modest association with knowledge outcomes, suggesting that prior course satisfaction may influence subsequent learning.
ERIC Educational Resources Information Center
Wheeler, Lindsay B.; Chiu, Jennie L.; Grisham, Charles M.
2016-01-01
This article explores how integrating computational tools into a general chemistry laboratory course can influence student perceptions of programming and investigates relationships among student perceptions, prior experience, and student outcomes.
1997-02-01
application with a strong resemblance to a video game, concern has been raised that prior video game experience might have a moderating effect on scores. Much...such as spatial ability. The effects of computer or video game experience on work sample scores have not been systematically investigated. The purpose...of this study was to evaluate the incremental validity of prior video game experience over that of general aptitude as a predictor of work sample test
TREAT Reactor Control and Protection System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lipinski, W.C.; Brookshier, W.K.; Burrows, D.R.
1985-01-01
The main control algorithm of the Transient Reactor Test Facility (TREAT) Automatic Reactor Control System (ARCS) resides in Read Only Memory (ROM), and only experiment-specific parameters are input via keyboard entry. Prior to executing an experiment, the software and hardware of the control computer are tested by a closed-loop real-time simulation. Two computers with parallel processing are used for the reactor simulation, and another computer is used for simulation of the control rod system. A monitor computer, used as a redundant diverse reactor protection channel, uses more conservative setpoints and reduces challenges to the Reactor Trip System (RTS). The RTS consists of triplicated hardwired channels with one-out-of-three logic. The RTS is automatically tested by a digital Dedicated Microprocessor Tester (DMT) prior to the execution of an experiment. 6 refs., 5 figs., 1 tab.
ERIC Educational Resources Information Center
Johnson, Donald M.; Ferguson, James A.; Lester, Melissa L.
1999-01-01
Of 175 freshmen agriculture students, 74% had prior computer courses, 62% owned computers. The number of computer topics studied predicted both computer self-efficacy and computer knowledge. A substantial positive correlation was found between self-efficacy and computer knowledge. (SK)
Effect of computer game playing on baseline laparoscopic simulator skills.
Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd
2013-08-01
Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and a relationship between computer game playing and baseline performance on laparoscopic simulators has not been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. Setting: local high school, Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and for prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.
Knowledge Structures of Entering Computer Networking Students and Their Instructors
ERIC Educational Resources Information Center
DiCerbo, Kristen E.
2007-01-01
Students bring prior knowledge to their learning experiences. This prior knowledge is known to affect how students encode and later retrieve new information learned. Teachers and content developers can use information about students' prior knowledge to create more effective lessons and materials. In many content areas, particularly the sciences,…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-04
...--Intersection of Cloud Computing and Mobility Forum and Workshop AGENCY: National Institute of Standards and.../intersection-of-cloud-and-mobility.cfm . SUPPLEMENTARY INFORMATION: NIST hosted six prior Cloud Computing Forum... interoperability, portability, and security, discuss the Federal Government's experience with cloud computing...
ERIC Educational Resources Information Center
Lai, Ming-Ling
2008-01-01
Purpose: This study aims to assess the state of technology readiness of professional accounting students in Malaysia, to examine their level of internet self-efficacy, to assess their prior computing experience, and to explore if they are satisfied with the professional course that they are pursuing in improving their technology skills.…
Prior Consent: Not-So-Strange Bedfellows Plan Library/Computing Partnerships.
ERIC Educational Resources Information Center
McDonough, Kristin
The increasing sophistication of information technologies and the nearly universal access to computing have blurred distinctions among information delivery units on college campuses, forcing institutions to rethink the separate organizational structures that evolved when computing in academe was more localized and less prevalent. Experiences in…
If I Survey You Again Today, Will You Still Love Me Tomorrow?
ERIC Educational Resources Information Center
Webster, Sarah P.
1989-01-01
Description of academic computing services at Syracuse University focuses on surveys of students and faculty that have identified hardware and software use, problems encountered, prior computer experience, and attitudes toward computers. Advances in microcomputers, word processing, and graphics are described; resource allocation is discussed; and…
The Effects of Computer Usage on Computer Screen Reading Rate.
ERIC Educational Resources Information Center
Clausing, Carolyn S.; Schmitt, Dorren Rafael
This study investigated the differences in the reading rate of eighth grade students on a cloze reading exercise involving the reading of text from a computer monitor. Several different modes of presentation were used in order to determine the effect of prior experience with computers on the students' reading rate. Subjects were 240 eighth grade…
NASA Astrophysics Data System (ADS)
Aiken, John; Schatz, Michael; Burk, John; Caballero, Marcos; Thoms, Brian
2012-03-01
We describe the assessment of computational modeling in a ninth grade classroom in the context of the Arizona Modeling Instruction physics curriculum. Using a high-level programming environment (VPython), students develop computational models to predict the motion of objects under a variety of physical situations (e.g., constant net force), to simulate real-world phenomena (e.g., a car crash), and to visualize abstract quantities (e.g., acceleration). The impact of teaching computation is evaluated through a proctored assignment that asks the students to complete a provided program to represent the correct motion. Using questions isomorphic to the Force Concept Inventory, we gauge students' understanding of force in relation to the simulation. The students are given an open-ended essay question that asks them to explain the steps they would use to model a physical situation. We also investigate the attitudes and prior experiences of each student using the Computation Modeling in Physics Attitudinal Student Survey (COMPASS) developed at Georgia Tech, as well as a prior computational experiences survey.
ERIC Educational Resources Information Center
Kautz, Karlheinz; Kofoed, Uffe
2004-01-01
Teachers at universities are facing an increasing disparity in students' prior IT knowledge and, at the same time, experience a growing disengagement of the students with regard to involvement in study activities. As computer science teachers in a joint programme in computer science and business administration, we made a number of similar…
Nair, Sankaran N; Czaja, Sara J; Sharit, Joseph
2007-06-01
This article explores the role of age, cognitive abilities, prior experience, and knowledge in skill acquisition for a computer-based simulated customer service task. Fifty-two participants aged 50-80 performed the task over 4 consecutive days following training. They also completed a battery that assessed prior computer experience and cognitive abilities. The data indicated that overall quality and efficiency of performance improved with practice. The predictors of initial level of performance and rate of change in performance varied according to the performance parameter assessed. Age and fluid intelligence predicted initial level and rate of improvement in overall quality, whereas crystallized intelligence and age predicted initial e-mail processing time, and crystallized intelligence predicted rate of change in e-mail processing time over days. We discuss the implications of these findings for the design of intervention strategies.
Pierce, B
2000-05-01
This study evaluated the acceptance of using computers to take a medical history by rural Arkansas patients. Sex, age, race, education, previous computer experience and owning a computer were used as variables. Patients were asked a series of questions to rate their comfort level with using a computer to take their medical history. Comfort ratings ranged from 30 to 45, with a mean of 36.8 (SEM = 0.67). Neither sex, race, age, education, owning a personal computer, nor prior computer experience had a significant effect on the comfort rating. This study helps alleviate one of the concerns--patient acceptance--about the increasing use of computers in practicing medicine.
Merlé, Y; Mentré, F
1995-02-01
In this paper, 3 criteria for designing experiments for Bayesian estimation of the parameters of models that are nonlinear in their parameters, when a prior distribution is available, are presented: the determinant of the Bayesian information matrix, the determinant of the pre-posterior covariance matrix, and the expected information provided by an experiment. A procedure to simplify the computation of these criteria is proposed for the case of continuous prior distributions and is compared with the criterion obtained from a linearization of the model about the mean of the prior distribution of the parameters. This procedure is applied to two models commonly encountered in pharmacokinetics and pharmacodynamics: the one-compartment open model with single-dose bolus intravenous injection and the Emax model. Both involve two parameters. Additive as well as multiplicative Gaussian measurement errors are considered, with normal prior distributions. Various combinations of the variances of the prior distribution and of the measurement error are studied. Our attention is restricted to designs with limited numbers of measurements (1 or 2). This situation often occurs in practice when Bayesian estimation is performed. The resulting optimal Bayesian designs vary with the variances of the parameter distribution and with the measurement error. The two-point optimal designs sometimes differ from the D-optimal designs for the mean of the prior distribution and may consist of replicated measurements. For the studied cases, the determinant of the Bayesian information matrix and its linearized form lead to the same optimal designs. In some cases, the pre-posterior covariance matrix can be far from its lower bound, namely the inverse of the Bayesian information matrix, especially for the Emax model with a multiplicative measurement error.
The expected information provided by the experiment and the determinant of the pre-posterior covariance matrix generally lead to the same designs except for the Emax model and the multiplicative measurement error. Results show that these criteria can be easily computed and that they could be incorporated in modules for designing experiments.
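As a rough illustration of the first criterion, the Bayesian information matrix can be approximated by averaging the Fisher information over Monte Carlo draws from the prior, and candidate sampling times ranked by its determinant. The following is a minimal sketch for a one-compartment bolus model; the dose, prior moments, noise variance, and time grid are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-compartment open model, bolus IV dose: C(t) = (D/V) * exp(-k*t).
# Parameters theta = (V, k); the model is nonlinear in k.
# Dose, priors, and noise below are illustrative, not from the paper.
D = 100.0

def fisher(times, V, k, sigma2=1.0):
    """Fisher information for additive Gaussian error at the given times."""
    t = np.asarray(times, dtype=float)
    dV = -(D / V**2) * np.exp(-k * t)      # sensitivity dC/dV
    dk = -(D / V) * t * np.exp(-k * t)     # sensitivity dC/dk
    J = np.stack([dV, dk], axis=1)         # n x 2 Jacobian
    return J.T @ J / sigma2

# Monte Carlo approximation of the Bayesian information matrix:
# E_prior[Fisher(theta)] plus the inverse prior covariance.
prior_mean = np.array([10.0, 0.2])         # (V, k)
prior_cov = np.diag([4.0, 0.01])
samples = rng.multivariate_normal(prior_mean, prior_cov, size=200)
samples = np.abs(samples)                  # keep V and k positive

def bayes_criterion(times):
    M = np.mean([fisher(times, V, k) for V, k in samples], axis=0)
    return np.linalg.det(M + np.linalg.inv(prior_cov))

# Exhaustive search over candidate two-point designs on a coarse time grid.
grid = np.linspace(0.5, 20.0, 15)
best = max(((t1, t2) for t1 in grid for t2 in grid if t1 <= t2),
           key=bayes_criterion)
print("best two-point design:", best)
```

A finer grid or more prior draws sharpens the search at proportional cost, and the same loop accommodates other design criteria by swapping the objective function.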
Evaluating the Usage of Cloud-Based Collaboration Services through Teamwork
ERIC Educational Resources Information Center
Qin, Li; Hsu, Jeffrey; Stern, Mel
2016-01-01
With the proliferation of cloud computing for both organizational and educational use, cloud-based collaboration services are transforming how people work in teams. The authors investigated the determinants of the usage of cloud-based collaboration services including teamwork quality, computer self-efficacy, and prior experience, as well as its…
Sukums, Felix; Mensah, Nathan; Mpembeni, Rose; Kaltschmidt, Jens; Haefeli, Walter E.; Blank, Antje
2014-01-01
Background The QUALMAT (Quality of Maternal and Prenatal Care: Bridging the Know-do Gap) project has introduced an electronic clinical decision support system (CDSS) for pre-natal and maternal care services in rural primary health facilities in Burkina Faso, Ghana, and Tanzania. Objective To report an assessment of health providers' computer knowledge, experience, and attitudes prior to the implementation of the QUALMAT electronic CDSS. Design A cross-sectional study was conducted with providers in 24 QUALMAT project sites. Information was collected using structured questionnaires. Chi-squared tests and one-way ANOVA were used to describe the association between computer knowledge, attitudes, and other factors. Semi-structured interviews and focus groups were conducted to gain further insights. Results A total of 108 providers responded, 63% were from Tanzania and 37% from Ghana. The mean age was 37.6 years, and 79% were female. Only 40% had ever used computers, and 29% had prior computer training. About 80% were computer illiterate or beginners. Educational level, age, and years of work experience were significantly associated with computer knowledge (p<0.01). Most (95.3%) had positive attitudes towards computers – average score (±SD) of 37.2 (±4.9). Females had significantly lower scores than males. Interviews and group discussions showed that although most lacked computer knowledge and experience, they were optimistic about overcoming challenges associated with the introduction of computers in their workplace. Conclusions Given the low levels of computer knowledge among rural health workers in Africa, it is important to provide adequate training and support to ensure the successful uptake of electronic CDSSs in these settings. The positive attitudes to computers found in this study underscore that rural care providers, too, are ready to use such technology. PMID:25361721
Accelerated Application Development: The ORNL Titan Experience
Joubert, Wayne; Archibald, Richard K.; Berrill, Mark A.; ...
2015-05-09
The use of computational accelerators such as NVIDIA GPUs and Intel Xeon Phi processors is now widespread in the high performance computing community, with many applications delivering impressive performance gains. However, programming these systems for high performance, performance portability and software maintainability has been a challenge. In this paper we discuss experiences porting applications to the Titan system. Titan, which began planning in 2009 and was deployed for general use in 2013, was the first multi-petaflop system based on accelerator hardware. To ready applications for accelerated computing, a preparedness effort was undertaken prior to delivery of Titan. In this paper we report experiences and lessons learned from this process and describe how users are currently making use of computational accelerators on Titan.
theoretical mathematics, Western New England University, Springfield, MA
B.S. in computer science, University
Prior Work Experience: Founder, Mobile Makes Sense; Senior IT Specialist, IBM; Graduate Teaching Assistant
Hippocampal Networks Habituate as Novelty Accumulates
ERIC Educational Resources Information Center
Murty, Vishnu P.; Ballard, Ian C.; Macduffie, Katherine E.; Krebs, Ruth M.; Adcock, R. Alison
2013-01-01
Novelty detection, a critical computation within the medial temporal lobe (MTL) memory system, necessarily depends on prior experience. The current study used functional magnetic resonance imaging (fMRI) in humans to investigate dynamic changes in MTL activation and functional connectivity as experience with novelty accumulates. fMRI data were…
The Effect of CRT Screen Design on Learning.
ERIC Educational Resources Information Center
Grabinger, R. Scott; Albers, Starleen
Two computer assisted instruction programs tested the effects of plain and enhanced screen designs with or without information about those designs and task-type on time and learning. Subjects were 140 fourth grade students in Lincoln, Nebraska who had extensive prior experience with computers. The enhanced versions used headings, directive cues,…
ERIC Educational Resources Information Center
Weiss, Charles J.
2017-01-01
The Scientific Computing for Chemists course taught at Wabash College teaches chemistry students to use the Python programming language, Jupyter notebooks, and a number of common Python scientific libraries to process, analyze, and visualize data. Assuming no prior programming experience, the course introduces students to basic programming and…
NASA Technical Reports Server (NTRS)
Sidik, S. M.
1972-01-01
The error variance of the process, prior multivariate normal distributions of the parameters of the models, and prior probabilities of each model being correct are assumed to be specified. A rule for termination of sampling is proposed. Upon termination, the model with the largest posterior probability is chosen as correct. If sampling is not terminated, posterior probabilities of the models and posterior distributions of the parameters are computed. Each experiment is chosen to maximize the expected Kullback-Leibler information function. Monte Carlo simulation experiments were performed to investigate the large- and small-sample behavior of the sequential adaptive procedure.
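A minimal sketch of the core update in such a procedure, computing posterior model probabilities from prior probabilities and Gaussian likelihoods, is shown below; the two candidate models, design points, and noise level are illustrative assumptions, not those of the report.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two candidate regression models for y(x); illustrative forms only.
models = [lambda x: 2.0 * x, lambda x: x ** 2]
prior = np.array([0.5, 0.5])   # prior probabilities that each model is correct
sigma = 0.5                    # known measurement noise std dev

# Simulate data from model 1 (the quadratic) at fixed design points.
xs = np.linspace(0.0, 3.0, 30)
ys = models[1](xs) + sigma * rng.normal(size=xs.size)

# Posterior model probabilities via Bayes' rule with Gaussian likelihoods
# (normalizing constants common to both models cancel).
log_lik = np.array([
    np.sum(-0.5 * ((ys - m(xs)) / sigma) ** 2) for m in models
])
log_post = np.log(prior) + log_lik
post = np.exp(log_post - log_post.max())
post /= post.sum()

# A termination rule would stop sampling once one probability dominates.
print("posterior model probabilities:", post)
print("selected model index:", int(np.argmax(post)))
```

Working in log space before normalizing avoids underflow when many observations make the raw likelihoods vanishingly small.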
Performance monitoring for brain-computer-interface actions.
Schurger, Aaron; Gale, Steven; Gozel, Olivia; Blanke, Olaf
2017-02-01
When presented with a difficult perceptual decision, human observers are able to make metacognitive judgements of subjective certainty. Such judgements can be made independently of and prior to any overt response to a sensory stimulus, presumably via internal monitoring. Retrospective judgements about one's own task performance, on the other hand, require first that the subject perform a task and thus could potentially be made based on motor processes, proprioceptive, and other sensory feedback rather than internal monitoring. With this dichotomy in mind, we set out to study performance monitoring using a brain-computer interface (BCI), with which subjects could voluntarily perform an action - moving a cursor on a computer screen - without any movement of the body, and thus without somatosensory feedback. Real-time visual feedback was available to subjects during training, but not during the experiment where the true final position of the cursor was only revealed after the subject had estimated where s/he thought it had ended up after 6s of BCI-based cursor control. During the first half of the experiment subjects based their assessments primarily on the prior probability of the end position of the cursor on previous trials. However, during the second half of the experiment subjects' judgements moved significantly closer to the true end position of the cursor, and away from the prior. This suggests that subjects can monitor task performance when the task is performed without overt movement of the body. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Perone, Sam P.
The objective of this project has been the development of a successful approach for the incorporation of on-line computer technology into the undergraduate chemistry laboratory. This approach assumes no prior programing, electronics or instrumental analysis experience on the part of the student; it does not displace the chemistry content with…
Technology Tools in the Information Age Classroom. Using Technology in the Classroom Series.
ERIC Educational Resources Information Center
Finkel, LeRoy
This book is designed for use in an introductory, college level course on educational technology, and no prior experience with computers or computing is assumed. The first of a series on technology in the classroom, the text provides a foundation for exploring more specific topics in greater depth. The book is divided into three…
ERIC Educational Resources Information Center
Charleston, LaVar J.; George, Phillis L.; Jackson, Jerlando F. L.; Berhanu, Jonathan; Amechi, Mauriell H.
2014-01-01
Women in the United States have long been underrepresented in computing science disciplines across college campuses and in industry alike (Hanson, 2004; Jackson & Charleston, 2012). This disparity is exacerbated when African American women are scrutinized. Additionally, prior research (e.g., Hanson, 2004; Jackson & Charleston, 2012;…
Pedagogy and Processes for a Computer Programming Outreach Workshop--The Bridge to College Model
ERIC Educational Resources Information Center
Tangney, Brendan; Oldham, Elizabeth; Conneely, Claire; Barrett, Stephen; Lawlor, John
2010-01-01
This paper describes a model for computer programming outreach workshops aimed at second-level students (ages 15-16). Participants engage in a series of programming activities based on the Scratch visual programming language, and a very strong group-based pedagogy is followed. Participants are not required to have any prior programming experience.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morgan, H.S.; Stone, C.M.; Krieg, R.D.
Several large scale in situ experiments in bedded salt formations are currently underway at the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico, USA. In these experiments, the thermal and creep responses of salt around several different underground room configurations are being measured. Data from the tests are to be compared to thermal and structural responses predicted in pretest reference calculations. The purpose of these comparisons is to evaluate computational models developed from laboratory data prior to fielding of the in situ experiments. In this paper, the computational models used in the pretest reference calculation for one of the large scale tests, the Overtest for Defense High Level Waste, are described, and the pretest computed thermal and structural responses are compared to early data from the experiment. The comparisons indicate that computed and measured temperatures for the test agree to within ten percent, but that measured deformation rates are between two and three times greater than corresponding computed rates. 10 figs., 3 tabs.
Mairesse, Olivier; Hofmans, Joeri; Theuns, Peter
2008-05-01
We propose a free, easy-to-use computer program that does not require prior knowledge of computer programming to generate and run experiments using textual or pictorial stimuli. Although the FM Experiment Builder suite was initially programmed for building and conducting FM experiments, it can also be applied to non-FM experiments that necessitate randomized, single, or multifactorial designs. The program is highly configurable, allowing multilingual use and a wide range of different response formats. The outputs of the experiments are Microsoft Excel compatible .xls files that allow easy copy-paste of the results into Weiss's FM CalSTAT program (2006) or any other statistical package. Its Java-based structure is compatible with both Windows and Macintosh operating systems, and its compactness (< 1 MB) makes it easily distributable over the Internet.
The benefits of computer-generated feedback for mathematics problem solving.
Fyfe, Emily R; Rittle-Johnson, Bethany
2016-07-01
The goal of the current research was to better understand when and why feedback has positive effects on learning and to identify features of feedback that may improve its efficacy. In a randomized experiment, second-grade children received instruction on a correct problem-solving strategy and then solved a set of relevant problems. Children were assigned to receive no feedback, immediate feedback, or summative feedback from the computer. On a posttest the following day, feedback resulted in higher scores relative to no feedback for children who started with low prior knowledge. Immediate feedback was particularly effective, facilitating mastery of the material for children with both low and high prior knowledge. Results suggest that minimal computer-generated feedback can be a powerful form of guidance during problem solving. Copyright © 2016 Elsevier Inc. All rights reserved.
Introduction to the World Wide Web and Mosaic
NASA Technical Reports Server (NTRS)
Youngblood, Jim
1994-01-01
This tutorial provides an introduction to some of the terminology related to the use of the World Wide Web and Mosaic. It is assumed that the user has some prior computer experience. References are included to other sources of additional information.
Tractable Experiment Design via Mathematical Surrogates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Brian J.
This presentation summarizes the development and implementation of quantitative design criteria motivated by targeted inference objectives for identifying new, potentially expensive computational or physical experiments. The first application is concerned with estimating features of quantities of interest arising from complex computational models, such as quantiles or failure probabilities. A sequential strategy is proposed for iterative refinement of the importance distributions used to efficiently sample the uncertain inputs to the computational model. In the second application, effective use of mathematical surrogates is investigated to help alleviate the analytical and numerical intractability often associated with Bayesian experiment design. This approach allows for the incorporation of prior information into the design process without the need for gross simplification of the design criterion. Illustrative examples of both design problems will be presented as an argument for the relevance of these research problems.
What you see is what you expect: rapid scene understanding benefits from prior experience.
Greene, Michelle R; Botros, Abraham P; Beck, Diane M; Fei-Fei, Li
2015-05-01
Although we are able to rapidly understand novel scene images, little is known about the mechanisms that support this ability. Theories of optimal coding assert that prior visual experience can be used to ease the computational burden of visual processing. A consequence of this idea is that more probable visual inputs should be facilitated relative to more unlikely stimuli. In three experiments, we compared the perceptions of highly improbable real-world scenes (e.g., an underwater press conference) with common images matched for visual and semantic features. Although the two groups of images could not be distinguished by their low-level visual features, we found profound deficits related to the improbable images: Observers wrote poorer descriptions of these images (Exp. 1), had difficulties classifying the images as unusual (Exp. 2), and even had lower sensitivity to detect these images in noise than to detect their more probable counterparts (Exp. 3). Taken together, these results place a limit on our abilities for rapid scene perception and suggest that perception is facilitated by prior visual experience.
Designing for deeper learning in a blended computer science course for middle school students
NASA Astrophysics Data System (ADS)
Grover, Shuchi; Pea, Roy; Cooper, Stephen
2015-04-01
The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course on Stanford's OpenEdX MOOC platform for blended in-class learning. Unique aspects of FACT include balanced pedagogical designs that address the cognitive, interpersonal, and intrapersonal aspects of "deeper learning"; a focus on pedagogical strategies for mediating and assessing for transfer from block-based to text-based programming; curricular materials for remedying misperceptions of computing; and "systems of assessments" (including formative and summative quizzes and tests, directed as well as open-ended programming assignments, and a transfer test) to get a comprehensive picture of students' deeper computational learning. Empirical investigations, accomplished over two iterations of a design-based research effort with students (aged 11-14 years) in a public school, sought to examine student understanding of algorithmic constructs, and how well students transferred this learning from Scratch to text-based languages. Changes in student perceptions of computing as a discipline were measured. Results and mixed-method analyses revealed that students in both studies (1) achieved substantial learning gains in algorithmic thinking skills, (2) were able to transfer their learning from Scratch to a text-based programming context, and (3) achieved significant growth toward a more mature understanding of computing as a discipline. Factor analyses of prior computing experience, multivariate regression analyses, and qualitative analyses of student projects and artifact-based interviews were conducted to better understand the factors affecting learning outcomes. 
Prior computing experiences (as measured by a pretest) and math ability were found to be strong predictors of learning outcomes.
Colour computer-generated holography for point clouds utilizing the Phong illumination model.
Symeonidou, Athanasia; Blinder, David; Schelkens, Peter
2018-04-16
A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account to achieve natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm provides realistic-looking objects without any noteworthy increase in computational cost.
Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources
NASA Astrophysics Data System (ADS)
Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.
2011-12-01
Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and conclude that it is most cost-effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
HIFiRE-1 Turbulent Shock Boundary Layer Interaction - Flight Data and Computations
NASA Technical Reports Server (NTRS)
Kimmel, Roger L.; Prabhu, Dinesh
2015-01-01
The Hypersonic International Flight Research Experimentation (HIFiRE) program is a hypersonic flight test program executed by the Air Force Research Laboratory (AFRL) and Australian Defence Science and Technology Organisation (DSTO). This flight contained a cylinder-flare induced shock boundary layer interaction (SBLI). Computations of the interaction were conducted for a number of times during the ascent. The DPLR code used for predictions was calibrated against ground test data prior to exercising the code at flight conditions. Generally, the computations predicted the upstream influence and interaction pressures very well. Plateau pressures on the cylinder were predicted well at all conditions. Although the experimental heat transfer showed a large amount of scatter, especially at low heating levels, the measured heat transfer agreed well with computations. The primary discrepancy between the experiment and computation occurred in the pressures measured on the flare during second stage burn. Measured pressures exhibited large overshoots late in the second stage burn, the mechanism of which is unknown. The good agreement between flight measurements and CFD helps validate the philosophy of calibrating CFD against ground test, prior to exercising it at flight conditions.
Learning what to expect (in visual perception)
Seriès, Peggy; Seitz, Aaron R.
2013-01-01
Expectations are known to greatly affect our experience of the world. A growing theory in computational neuroscience is that perception can be successfully described using Bayesian inference models and that the brain is “Bayes-optimal” under some constraints. In this context, expectations are particularly interesting, because they can be viewed as prior beliefs in the statistical inference process. A number of questions remain unsolved, however, for example: How fast do priors change over time? Are there limits in the complexity of the priors that can be learned? How do an individual’s priors compare to the true scene statistics? Can we unlearn priors that are thought to correspond to natural scene statistics? Where and what are the neural substrate of priors? Focusing on the perception of visual motion, we here review recent studies from our laboratories and others addressing these issues. We discuss how these data on motion perception fit within the broader literature on perceptual Bayesian priors, perceptual expectations, and statistical and perceptual learning and review the possible neural basis of priors. PMID:24187536
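The Bayesian view described above treats an expectation as a prior belief that is combined with noisy sensory evidence. As a minimal illustration (not code from the paper; the Gaussian assumption and the numbers are ours), a precision-weighted combination of a "slow-speed" prior with a noisy speed measurement shifts the percept toward the prior:

```python
def gaussian_posterior(prior_mean, prior_var, like_mean, like_var):
    """Combine a Gaussian prior with a Gaussian likelihood.

    Returns the posterior mean and variance: the precision-weighted
    average that Bayes-optimal cue combination prescribes."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / like_var)
    post_mean = post_var * (prior_mean / prior_var + like_mean / like_var)
    return post_mean, post_var

# Hypothetical "slow-speed" prior centered at 0 deg/s; a noisy sensory
# measurement reads 10 deg/s with the same variance as the prior.
post_mean, post_var = gaussian_posterior(0.0, 4.0, 10.0, 4.0)
# Equal variances -> the percept lands halfway: post_mean = 5.0
```

As sensory noise grows relative to the prior's variance, the posterior is drawn further toward the prior, which is the signature of prior-biased perception that studies of this kind test for.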
Proof of concept of a simple computer-assisted technique for correcting bone deformities.
Ma, Burton; Simpson, Amber L; Ellis, Randy E
2007-01-01
We propose a computer-assisted technique for correcting bone deformities using the Ilizarov method. Our technique is an improvement over prior art in that it does not require a tracking system, navigation hardware and software, or intraoperative registration. Instead, we rely on a postoperative CT scan to obtain all of the information necessary to plan the correction and compute a correction schedule for the patient. Our laboratory experiments using plastic phantoms produced deformity corrections accurate to within 3.0 degrees of rotation and 1 mm of lengthening.
Prior schemata transfer as an account for assessing the intuitive use of new technology.
Fischer, Sandrine; Itoh, Makoto; Inagaki, Toshiyuki
2015-01-01
New devices are considered intuitive when they allow users to transfer prior knowledge. Drawing upon fundamental psychology experiments that distinguish prior knowledge transfer from new schema induction, a procedure was specified for assessing intuitive use. This procedure was tested with 31 participants who, prior to using an on-board computer prototype, studied its screenshots in reading vs. schema induction conditions. Distinct patterns of transfer or induction resulted for features of the prototype whose functions were familiar or unfamiliar, respectively. Though moderated by participants' cognitive style, these findings demonstrated a means for quantitatively assessing transfer of prior knowledge as the operation that underlies intuitive use. Implications for interface evaluation and design, as well as potential improvements to the procedure, are discussed. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Kalet, A L; Song, H S; Sarpel, U; Schwartz, R; Brenner, J; Ark, T K; Plass, J
2012-01-01
Well-designed computer-assisted instruction (CAI) can potentially transform medical education. Yet little is known about whether specific design features such as direct manipulation of the content yield meaningful gains in clinical learning. We designed three versions of a multimedia module on the abdominal exam incorporating different types of interactivity. As part of their physical diagnosis course, 162 second-year medical students were randomly assigned (1:1:1) to Watch, Click or Drag versions of the abdominal exam module. First, students' prior knowledge, spatial ability, and prior experience with abdominal exams were assessed. After using the module, students took a posttest; demonstrated the abdominal exam on a standardized patient; and wrote structured notes of their findings. Data from 143 students were analyzed. Baseline measures showed no differences among groups regarding prior knowledge, experience, or spatial ability. Overall there was no difference in knowledge across groups. However, physical exam scores were significantly higher for students in the Click group. A mid-range level of behavioral interactivity was associated with small to moderate improvements in performance of clinical skills. These improvements were likely mediated by enhanced engagement with the material, within the bounds of learners' cognitive capacity. These findings have implications for the design of CAI materials to teach procedural skills.
Familiarity with Technology among First-Year Students in Rwandan Tertiary Education
ERIC Educational Resources Information Center
Byungura, Jean Claude; Hansson, Henrik; Muparasi, Mugabe; Ruhinda, Ben
2018-01-01
The more the students get experienced with technologies, the more the need for tertiary education systems to adopt innovative pedagogical strategies for accommodating different learning needs. Depending on students' prior experience with computer-based tools, they may have different degrees of familiarity with new technologies. At University of…
ERIC Educational Resources Information Center
Witherspoon, Eben B.; Schunn, Christian D.; Higashi, Ross M.; Baehr, Emily C.
2016-01-01
Background: Robotics competitions are increasingly popular and potentially provide an on-ramp to computer science, which is currently highly gender imbalanced. However, within competitive robotics teams, student participation in programming is not universal. This study gathered surveys from over 500 elementary, middle, and high school robotics…
Cortical plasticity as a mechanism for storing Bayesian priors in sensory perception.
Köver, Hania; Bao, Shaowen
2010-05-05
Human perception of ambiguous sensory signals is biased by prior experiences. It is not known how such prior information is encoded, retrieved and combined with sensory information by neurons. Previous authors have suggested dynamic encoding mechanisms for prior information, whereby top-down modulation of firing patterns on a trial-by-trial basis creates short-term representations of priors. Although such a mechanism may well account for perceptual bias arising in the short-term, it does not account for the often irreversible and robust changes in perception that result from long-term, developmental experience. Based on the finding that more frequently experienced stimuli gain greater representations in sensory cortices during development, we reasoned that prior information could be stored in the size of cortical sensory representations. For the case of auditory perception, we use a computational model to show that prior information about sound frequency distributions may be stored in the size of primary auditory cortex frequency representations, read-out by elevated baseline activity in all neurons and combined with sensory-evoked activity to generate a perception that conforms to Bayesian integration theory. Our results suggest an alternative neural mechanism for experience-induced long-term perceptual bias in the context of auditory perception. They make the testable prediction that the extent of such perceptual prior bias is modulated by both the degree of cortical reorganization and the magnitude of spontaneous activity in primary auditory cortex. Given that cortical over-representation of frequently experienced stimuli, as well as perceptual bias towards such stimuli is a common phenomenon across sensory modalities, our model may generalize to sensory perception, rather than being specific to auditory perception.
Adolescents' emotions prior to sexual activity and associations with sexual risk factors.
Houck, Christopher; Swenson, Rebecca; Donenberg, Geri; Papino, Andrew; Emerson, Erin; Brown, Larry K
2014-08-01
The present study examined the link between the emotional context of sexual situations and sexual risk, specifically by examining the relationship of teens' recall of their affective states prior to sex with their sexual risk behaviors and attitudes. Adolescents (ages 13-19) attending therapeutic schools due to emotional and behavioral difficulties (n = 247) completed audio computer-assisted self-interviews regarding sexual behavior, including ratings of their emotions prior to last sexual activity. Positive emotions were most commonly endorsed (43-57% of participants); however, significant proportions (8-23%) also endorsed negative emotions prior to last sex. Both positive and negative emotions were significantly related to risk attitudes and behavior in regression analyses. The affective contexts of sexual experiences may be important predictors of risk in adolescence.
Adolescents’ emotions prior to sexual activity and associations with sexual risk factors
Houck, Christopher; Swenson, Rebecca; Donenberg, Geri; Papino, Andrew; Emerson, Erin; Brown, Larry K.
2014-01-01
The present study examined the link between the emotional context of sexual situations and sexual risk, specifically by examining the relationship of teens' recall of their affective states prior to sex with their sexual risk behaviors and attitudes. Adolescents (ages 13-19) attending therapeutic schools due to emotional and behavioral difficulties (n=247) completed audio computer-assisted self-interviews regarding sexual behavior, including ratings of their emotions prior to last sexual activity. Positive emotions were most commonly endorsed (43-57% of participants); however, significant proportions (8-23%) also endorsed negative emotions prior to last sex. Both positive and negative emotions were significantly related to risk attitudes and behavior in regression analyses. The affective contexts of sexual experiences may be important predictors of risk in adolescence. PMID:24558097
Computational modeling of electrostatic charge and fields produced by hypervelocity impact
Crawford, David A.
2015-05-19
Following prior experimental evidence of electrostatic charge separation and electric and magnetic fields produced by hypervelocity impact, we have developed a model of electrostatic charge separation based on plasma sheath theory and implemented it into the CTH shock physics code. Preliminary assessment of the model shows good qualitative and quantitative agreement between the model and prior experiments, at least in the hypervelocity regime for the porous carbonate material tested. The model agrees with the scaling analysis of experimental data performed in the prior work, suggesting that electric charge separation and the resulting electric and magnetic fields can be a substantial effect at larger scales, higher impact velocities, or both.
Integrating Computational Science Tools into a Thermodynamics Course
NASA Astrophysics Data System (ADS)
Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew
2018-01-01
Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired to computer simulations to implement these modules have a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.
Perceptual learning of degraded speech by minimizing prediction error.
Sohoglu, Ediz; Davis, Matthew H
2016-03-22
Human perception is shaped by past experience on multiple timescales. Sudden and dramatic changes in perception occur when prior knowledge or expectations match stimulus content. These immediate effects contrast with the longer-term, more gradual improvements that are characteristic of perceptual learning. Despite extensive investigation of these two experience-dependent phenomena, there is considerable debate about whether they result from common or dissociable neural mechanisms. Here we test single- and dual-mechanism accounts of experience-dependent changes in perception using concurrent magnetoencephalographic and EEG recordings of neural responses evoked by degraded speech. When speech clarity was enhanced by prior knowledge obtained from matching text, we observed reduced neural activity in a peri-auditory region of the superior temporal gyrus (STG). Critically, longer-term improvements in the accuracy of speech recognition following perceptual learning resulted in reduced activity in a nearly identical STG region. Moreover, short-term neural changes caused by prior knowledge and longer-term neural changes arising from perceptual learning were correlated across subjects with the magnitude of learning-induced changes in recognition accuracy. These experience-dependent effects on neural processing could be dissociated from the neural effect of hearing physically clearer speech, which similarly enhanced perception but increased rather than decreased STG responses. Hence, the observed neural effects of prior knowledge and perceptual learning cannot be attributed to epiphenomenal changes in listening effort that accompany enhanced perception. Instead, our results support a predictive coding account of speech perception; computational simulations show how a single mechanism, minimization of prediction error, can drive immediate perceptual effects of prior knowledge and longer-term perceptual learning of degraded speech.
Perceptual learning of degraded speech by minimizing prediction error
Sohoglu, Ediz
2016-01-01
Human perception is shaped by past experience on multiple timescales. Sudden and dramatic changes in perception occur when prior knowledge or expectations match stimulus content. These immediate effects contrast with the longer-term, more gradual improvements that are characteristic of perceptual learning. Despite extensive investigation of these two experience-dependent phenomena, there is considerable debate about whether they result from common or dissociable neural mechanisms. Here we test single- and dual-mechanism accounts of experience-dependent changes in perception using concurrent magnetoencephalographic and EEG recordings of neural responses evoked by degraded speech. When speech clarity was enhanced by prior knowledge obtained from matching text, we observed reduced neural activity in a peri-auditory region of the superior temporal gyrus (STG). Critically, longer-term improvements in the accuracy of speech recognition following perceptual learning resulted in reduced activity in a nearly identical STG region. Moreover, short-term neural changes caused by prior knowledge and longer-term neural changes arising from perceptual learning were correlated across subjects with the magnitude of learning-induced changes in recognition accuracy. These experience-dependent effects on neural processing could be dissociated from the neural effect of hearing physically clearer speech, which similarly enhanced perception but increased rather than decreased STG responses. Hence, the observed neural effects of prior knowledge and perceptual learning cannot be attributed to epiphenomenal changes in listening effort that accompany enhanced perception. Instead, our results support a predictive coding account of speech perception; computational simulations show how a single mechanism, minimization of prediction error, can drive immediate perceptual effects of prior knowledge and longer-term perceptual learning of degraded speech. PMID:26957596
Computationally modeling interpersonal trust.
Lee, Jin Joo; Knox, W Bradley; Wormwood, Jolie B; Breazeal, Cynthia; Desteno, David
2013-01-01
We present a computational model capable of predicting, above human accuracy, the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to investigate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
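The hidden Markov models described above learn sequences of nonverbal cues. As a rough sketch of the underlying machinery (the states, cue coding, and probabilities below are hypothetical, not the authors' trained models), the forward algorithm scores how likely a cue sequence is under a given model:

```python
def hmm_forward(obs, start, trans, emit):
    """Forward algorithm: total probability of an observation sequence
    under an HMM with the given start, transition, and emission tables."""
    n = len(start)
    # Initialize with the first observation.
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    # Propagate forward through the remaining observations.
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in range(n)) * emit[s][o]
                 for s in range(n)]
    return sum(alpha)

# Hypothetical 2-state model of an interaction: observations are coded
# cues, 0 = neutral gesture, 1 = trust-related gesture.
start = [0.6, 0.4]
trans = [[0.7, 0.3], [0.2, 0.8]]
emit = [[0.8, 0.2], [0.3, 0.7]]
likelihood = hmm_forward([0, 1, 1], start, trans, emit)
```

Comparing such likelihoods across models trained on interactions with different trust levels is one way sequence-based temporal features can feed a predictive model.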
ERIC Educational Resources Information Center
Johnson, A. M.; Ozogul, G.; Reisslein, M.
2015-01-01
An experiment examined the effects of visual signalling to relevant information in multiple external representations and the visual presence of an animated pedagogical agent (APA). Students learned electric circuit analysis using a computer-based learning environment that included Cartesian graphs, equations and electric circuit diagrams. The…
Perceptions of Ability to Program or to Use a Word Processor.
ERIC Educational Resources Information Center
Colley, Ann; And Others
1996-01-01
This study examined 117 undergraduates' perceptions of ability at computer programming and word processing. In particular, it rated the importance of prior experience factors, keyboarding skills, and personal attributes such as enjoyment of problem solving. Those were discovered, in general, to be more important than formal training or aptitude in…
Beyond Introductory Programming: Success Factors for Advanced Programming
ERIC Educational Resources Information Center
Hoskey, Arthur; Maurino, Paula San Millan
2011-01-01
Numerous studies document high drop-out and failure rates for students in computer programming classes. Studies show that even when some students pass programming classes, they still do not know how to program. Many factors have been considered to explain this problem including gender, age, prior programming experience, major, math background,…
An Investigation of Employees' Use of E-Learning Systems: Applying the Technology Acceptance Model
ERIC Educational Resources Information Center
Lee, Yi-Hsuan; Hsieh, Yi-Chuan; Chen, Yen-Hsun
2013-01-01
The purpose of this study is to apply the technology acceptance model to examine the employees' attitudes and acceptance of electronic learning (e-learning) systems in organisations. This study examines four factors (organisational support, computer self-efficacy, prior experience and task equivocality) that are believed to influence employees'…
ERIC Educational Resources Information Center
Zheng, Lanqin; Huang, Ronghuai; Hwang, Gwo-Jen; Yang, Kaicheng
2015-01-01
The purpose of this study is to quantitatively measure the level of knowledge elaboration and explore the relationships between prior knowledge of a group, group performance, and knowledge elaboration in collaborative learning. Two experiments were conducted to investigate the level of knowledge elaboration. The collaborative learning objective in…
Why doctors don't use computers: some empirical findings.
Anderson, J G; Jay, S J; Schweer, H M; Anderson, M M
1986-01-01
The attitudes of 148 medical students, 141 residents, and 644 practising physicians towards computer applications in medicine were studied. The results indicate that physicians recognize the potential of computers to improve patient care, but are concerned about the possibility of increased governmental and hospital control, threats to privacy, and legal and ethical problems. In general, all three groups are uncertain as to the potential effects of computers on their traditional professional role and on the organization of practice. Practising physicians, however, express more concern about these potential effects of computers than do medical students and residents. While attitudes appear to be somewhat independent of prior computer experience, they significantly affect the extent to which physicians use a computer-based hospital information system. This may be a major reason for the slow introduction of clinical computer systems. PMID:3701749
The Use of a Microcomputer in Collecting Data from Cardiovascular Experiments on Muscle Relaxants
Thut, Paul D.; Polansky, Gregg; Pruzansky, Elysa
1983-01-01
The possible association of cardiovascular side-effects from potentially clinically useful non-depolarizing neuromuscular blocking drugs has been studied with the aid of a microcomputer. The maximal changes in heart rate, systolic, diastolic and mean arterial pressure and pulse pressure were recorded in the onset, maximal effect and recovery phases of relaxant activity in dogs anesthetized with isoflurane. The data collection system employed a Gould 2800S polygraph, an Apple II Plus microcomputer, a Cyborg Corp. 'Isaac' 12-bit analog-to-digital converter, two 5 1/4″ floppy disk drives, a 'Videoterm' 80-column display board and a 12″ green phosphor monitor. Prior to development of the computer system, direct analysis of polygraph records took more than three times as long as the actual experiment. With the aid of the computer, analysis of data, tabular and graphic presentation and narrative reports were completed within 15 minutes after the end of the experiment.
NASA Astrophysics Data System (ADS)
Casadei, D.
2014-10-01
The objective Bayesian treatment of a model representing two independent Poisson processes, labelled as "signal" and "background" and both contributing additively to the total number of counted events, is considered. It is shown that the reference prior for the parameter of interest (the signal intensity) can be well approximated by the widely (ab)used flat prior only when the expected background is very high. On the other hand, a very simple approximation (the limiting form of the reference prior for perfect prior background knowledge) can be safely used over a large portion of the background parameter space. The resulting approximate reference posterior is a Gamma density whose parameters are related to the observed counts. This limiting form is simpler than the result obtained with a flat prior, with the additional advantage of representing a much closer approximation to the reference posterior in all cases. Hence such a limiting prior should be considered a better default or conventional prior than the uniform prior. On the computing side, it is shown that a 2-parameter fitting function is able to reproduce the reference prior extremely well for any background prior. Thus, it can be useful in applications requiring evaluation of the reference prior a very large number of times.
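The Gamma-posterior result has a simple conjugate-update flavor. As an illustrative sketch (this is the standard Poisson-Gamma update, not the paper's exact reference-prior derivation, and the counts are invented), a Gamma(a, r) prior on a Poisson rate observed over one unit of exposure yields a Gamma(a + n, r + 1) posterior, so the posterior parameters are directly related to the observed counts:

```python
def poisson_gamma_posterior(n_counts, prior_shape=0.5, prior_rate=0.0):
    """Conjugate update for a Poisson rate: Gamma(shape, rate) prior
    plus n_counts events in one unit of exposure -> Gamma posterior."""
    post_shape = prior_shape + n_counts
    post_rate = prior_rate + 1.0  # one unit of exposure observed
    return post_shape, post_rate

# Seven observed counts; compare a Jeffreys-like prior (shape 1/2)
# against the improper flat prior (shape 1).
shape_j, rate_j = poisson_gamma_posterior(7, prior_shape=0.5)
shape_f, rate_f = poisson_gamma_posterior(7, prior_shape=1.0)
post_mean_jeffreys = shape_j / rate_j  # 7.5
post_mean_flat = shape_f / rate_f  # 8.0
```

The two priors differ only by half a count in the posterior, which hints at why the choice matters most when the expected number of events is small.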
ERIC Educational Resources Information Center
Ocak, Mehmet
2008-01-01
This correlational study examined the relationship between gender and the students' attitude and prior knowledge of using one of the mathematical software programs (MATLAB). Participants were selected from one community college, one state university and one private college. Students were volunteers from three Calculus I classrooms (one class from…
Investigating the impact of spatial priors on the performance of model-based IVUS elastography
Richards, M S; Doyley, M M
2012-01-01
This paper describes methods that provide prerequisite information for computing circumferential stress in modulus elastograms recovered from vascular tissue, information that could help cardiologists detect life-threatening plaques and predict their propensity to rupture. The modulus recovery process is an ill-posed problem; therefore additional information is needed to produce useful elastograms. In this work, prior geometrical information was used to impose hard or soft constraints on the reconstruction process. We conducted simulation and phantom studies to evaluate and compare modulus elastograms computed with soft and hard constraints versus those computed without any prior information. The results revealed that (1) the contrast-to-noise ratio of modulus elastograms achieved using the soft prior and hard prior reconstruction methods exceeded that of elastograms computed without any prior information; (2) the soft prior and hard prior reconstruction methods could tolerate up to 8% measurement noise; and (3) the performance of soft and hard prior modulus elastograms degraded when incomplete spatial priors were employed. This work demonstrates that including spatial priors in the reconstruction process should improve the performance of model-based elastography, and the soft prior approach should enhance the robustness of the reconstruction process to errors in the geometrical information. PMID:22037648
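The "soft constraint" idea can be sketched with a hypothetical linear toy problem (not the authors' elastography formulation): regularized least squares minimizing ||Ax - b||^2 + lam * ||x - x_prior||^2, whose minimizer has a closed form. All quantities below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
x_true = np.ones(n); x_true[8:12] = 3.0        # stiff inclusion in a softer background
A = rng.normal(size=(15, n))                   # under-determined forward operator
b = A @ x_true + 0.05 * rng.normal(size=15)    # noisy measurements

x_prior = np.ones(n); x_prior[7:13] = 3.0      # slightly inaccurate geometric prior
lam = 1.0
# Soft constraint: minimize ||A x - b||^2 + lam * ||x - x_prior||^2 (closed form)
x_soft = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b + lam * x_prior)
x_naive = np.linalg.lstsq(A, b, rcond=None)[0] # minimum-norm solution, no prior

err_soft = np.linalg.norm(x_soft - x_true)
err_naive = np.linalg.norm(x_naive - x_true)
```

Even with a slightly wrong prior geometry, the soft-prior solution recovers the target better than the unregularized one, which mirrors the paper's finding that soft priors improve robustness to errors in the geometrical information.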
2009-06-01
131 cases with 131 biopsy proven masses, of which 27 were malignant and 104 benign. The true locations of the masses were identified by an experienced ...two acquisitions would cause differences in the subtlety of the masses on the FFDMs and SFMs. However, assuming that the differences are random... Lado, M. Souto, and J. J. Vidal, “Computer-aided diagnosis: Automatic detection of malignant masses in digitized mammograms,” Med. Phys. 25, 957–964
2007-06-01
the masses were identified by an experienced Mammography Quality Standards Act (MQSA) radiologist. The no-mass data set contained 98 cases. The time...force, and the difference in time between the two acquisitions would cause differences in the subtlety of the masses on the FFDMs and SFMs. However...images," Medical Physics 18, 955-963 (1991). A. J. Mendez, P. G. Tahoces, M. J. Lado, M. Souto, and J. J. Vidal, "Computer-aided diagnosis: Automatic
Shock compression response of cold-rolled Ni/Al multilayer composites
Specht, Paul E.; Weihs, Timothy P.; Thadhani, Naresh N.
2017-01-06
Uniaxial strain, plate-on-plate impact experiments were performed on cold-rolled Ni/Al multilayer composites and the resulting Hugoniot was determined through time-resolved measurements combined with impedance matching. The experimental Hugoniot agreed with that previously predicted by two dimensional (2D) meso-scale calculations. Additional 2D meso-scale simulations were performed using the same computational method as the prior study to reproduce the experimentally measured free surface velocities and stress profiles. These simulations accurately replicated the experimental profiles, providing additional validation for the previous computational work.
Computer-based learning in neuroanatomy: A longitudinal study of learning, transfer, and retention
NASA Astrophysics Data System (ADS)
Chariker, Julia H.
A longitudinal experiment was conducted to explore computer-based learning of neuroanatomy. Using a realistic 3D graphical model of neuroanatomy, and sections derived from the model, exploratory graphical tools were integrated into interactive computer programs to allow adaptive exploration. Seventy-two participants learned either sectional anatomy alone or whole anatomy followed by sectional anatomy. Sectional anatomy was explored either in perceptually continuous animation or discretely, as in the use of an anatomical atlas. Learning was measured longitudinally to a high performance criterion. After learning, transfer to biomedical images and long-term retention were tested. Learning whole anatomy prior to learning sectional anatomy led to a more efficient learning experience. Learners demonstrated high levels of transfer from whole anatomy to sectional anatomy and from sectional anatomy to complex biomedical images. All learning groups demonstrated high levels of retention at 2-3 weeks.
On Favorable Thermal Fields for Detached Bridgman Growth
NASA Technical Reports Server (NTRS)
Stelian, Carmen; Volz, Martin P.; Derby, Jeffrey J.
2009-01-01
The thermal fields of two Bridgman-like configurations, representative of real systems used in prior experiments for the detached growth of CdTe and Ge crystals, are studied. These detailed heat transfer computations are performed using the CrysMAS code and expand upon our previous analyses [14] that posited a new mechanism involving the thermal field and meniscus position to explain stable conditions for dewetted Bridgman growth. Computational results indicate that heat transfer conditions that led to successful detached growth in both of these systems are in accordance with our prior assertion, namely that the prevention of crystal reattachment to the crucible wall requires the avoidance of any undercooling of the melt meniscus during the growth run. Significantly, relatively simple process modifications that promote favorable thermal conditions for detached growth may overcome detrimental factors associated with meniscus shape and crucible wetting. Thus, these ideas may be important to advance the practice of detached growth for many materials.
The photon identification loophole in EPRB experiments: computer models with single-wing selection
NASA Astrophysics Data System (ADS)
De Raedt, Hans; Michielsen, Kristel; Hess, Karl
2017-11-01
Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al. Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al. Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other "post-selection" is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell's theorem which states that this is impossible. The failure of Bell's theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.
On the Origins of Suboptimality in Human Probabilistic Inference
Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M.
2014-01-01
Humans have been shown to combine noisy sensory information with previous experience (priors), in qualitative and sometimes quantitative agreement with the statistically optimal predictions of Bayesian integration. However, when the prior distribution becomes more complex than a simple Gaussian, such as skewed or bimodal, training takes much longer and performance appears suboptimal. It is unclear whether such suboptimality arises from an imprecise internal representation of the complex prior, or from additional constraints in performing probabilistic computations on complex distributions, even when accurately represented. Here we probe the sources of suboptimality in probabilistic inference using a novel estimation task in which subjects are exposed to an explicitly provided distribution, thereby removing the need to remember the prior. Subjects had to estimate the location of a target given a noisy cue and a visual representation of the prior probability density over locations, which changed on each trial. Different classes of priors were examined (Gaussian, unimodal, bimodal). Subjects' performance was in qualitative agreement with the predictions of Bayesian Decision Theory although generally suboptimal. The degree of suboptimality was modulated by statistical features of the priors but was largely independent of the class of the prior and level of noise in the cue, suggesting that suboptimality in dealing with complex statistical features, such as bimodality, may be due to a problem of acquiring the priors rather than computing with them. We performed a factorial model comparison across a large set of Bayesian observer models to identify additional sources of noise and suboptimality. Our analysis rejects several models of stochastic behavior, including probability matching and sample-averaging strategies. Instead, we show that subjects' response variability was mainly driven by a combination of a noisy estimation of the parameters of the priors, and by variability in the decision process, which we represent as a noisy or stochastic posterior. PMID:24945142
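The "statistically optimal" benchmark referred to above reduces, for the simplest Gaussian case, to a precision-weighted average of prior and cue. A minimal sketch with made-up numbers (not the study's stimuli):

```python
def posterior_gaussian(mu_prior, var_prior, cue, var_cue):
    """Bayes-optimal fusion of a Gaussian prior with a Gaussian-noise cue."""
    w = (1.0 / var_prior) / (1.0 / var_prior + 1.0 / var_cue)  # weight on the prior
    mu_post = w * mu_prior + (1.0 - w) * cue
    var_post = 1.0 / (1.0 / var_prior + 1.0 / var_cue)
    return mu_post, var_post

# Broad prior at 0, reliable cue at 10: the optimal estimate sits close to the cue.
mu, var = posterior_gaussian(mu_prior=0.0, var_prior=4.0, cue=10.0, var_cue=1.0)
# mu == 8.0, var == 0.8
```

With skewed or bimodal priors no such closed form exists, which is exactly where the abstract locates the observed suboptimality.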
Sex differences in effects of testing medium and response format on a visuospatial task.
Cherney, Isabelle D; Rendell, Jariel A
2010-06-01
Sex differences on visuospatial tests are among the most reliably replicated. It is unclear to what extent these performance differences reflect underlying differences in skills or testing factors. To assess whether testing medium and response format affect visuospatial sex differences, performances of introductory psychology students (100 men, 104 women) were examined on a visuospatial task presented in paper-and-pencil and tablet computer forms. Both sexes performed better when tested on paper, although men outperformed women. The introduction of an open-ended component to the visuospatial task eliminated sex differences when prior spatial experiences were controlled, but men outperformed women when prior spatial experiences were not considered. In general, the open-ended version and computerized format of the test diminished performance, suggesting that response format and medium are testing factors that influence visuospatial abilities.
ERIC Educational Resources Information Center
Hilbig, Annemarie; Proske, Antje
2014-01-01
Although academic writing is a complex interplay of comprehending and producing text the aspect of collecting information from source texts is hardly addressed in writing research. This study examined the impact of instructions supporting the collection process on writing quality, as well as the role of prior motivation and computer experience.…
ERIC Educational Resources Information Center
Reisslein, Jana; Seeling, Patrick; Reisslein, Martin
2005-01-01
An important challenge in the introductory communication networks course in electrical and computer engineering curricula is to integrate emerging topics, such as wireless Internet access and network security, into the already content-intensive course. At the same time it is essential to provide students with experiences in online collaboration,…
NASA Astrophysics Data System (ADS)
Li, Zheng; Jiang, Yi-han; Duan, Lian; Zhu, Chao-zhe
2017-08-01
Objective. Functional near-infrared spectroscopy (fNIRS) is a promising brain imaging technology for brain-computer interfaces (BCI). Future clinical uses of fNIRS will likely require operation over long time spans, during which neural activation patterns may change. However, current decoders for fNIRS signals are not designed to handle changing activation patterns. The objective of this study is to test via simulations a new adaptive decoder for fNIRS signals, the Gaussian mixture model adaptive classifier (GMMAC). Approach. GMMAC can simultaneously classify and track activation pattern changes without the need for ground-truth labels. This adaptive classifier uses computationally efficient variational Bayesian inference to label new data points and update mixture model parameters, using the previous model parameters as priors. We test GMMAC in simulations in which neural activation patterns change over time and compare to static decoders and unsupervised adaptive linear discriminant analysis classifiers. Main results. Our simulation experiments show GMMAC can accurately decode under time-varying activation patterns: shifts of activation region, expansions of activation region, and combined contractions and shifts of activation region. Furthermore, the experiments show the proposed method can track the changing shape of the activation region. Compared to prior work, GMMAC performed significantly better than the other unsupervised adaptive classifiers on a difficult activation pattern change simulation: 99% versus <54% in two-choice classification accuracy. Significance. We believe GMMAC will be useful for clinical fNIRS-based brain-computer interfaces, including neurofeedback training systems, where operation over long time spans is required.
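The following toy sketch is not the authors' variational Bayesian algorithm; it uses hypothetical 1-D data and a simple self-labeled running-mean update to illustrate why unsupervised adaptation helps when activation patterns drift over a session:

```python
import numpy as np

def run_session(adaptive):
    """Toy two-class decoding session in which both true class means drift upward."""
    rng = np.random.default_rng(1)          # same trial sequence for both decoders
    means = np.array([-2.0, 2.0])           # decoder's current class-mean estimates
    n_trials, correct = 400, 0
    for t in range(n_trials):
        drift = 3.0 * t / n_trials          # slow shift of the true activation pattern
        label = int(rng.integers(2))
        x = rng.normal((-2.0, 2.0)[label] + drift, 0.7)
        pred = int(np.argmin(np.abs(x - means)))
        correct += (pred == label)
        if adaptive:                        # self-labeled update, no ground-truth labels
            means[pred] = 0.95 * means[pred] + 0.05 * x
    return correct / n_trials

acc_static = run_session(adaptive=False)
acc_adaptive = run_session(adaptive=True)
```

The static decoder degrades as the activation pattern drifts past its fixed decision boundary, while the adaptive decoder tracks the drift, the same qualitative contrast the abstract reports between static decoders and GMMAC.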
Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.
Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar
2015-09-04
The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
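For orientation only, a minimal SED-ML-style document is sketched below. All identifiers, file names, and attribute values are hypothetical, and the element set shown is a simplification of the format described above; the normative schema is defined in the SED-ML specification.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sedML xmlns="http://sed-ml.org/sed-ml/level1/version2" level="1" version="2">
  <listOfModels>
    <!-- which model to use (an SBML file here) -->
    <model id="model1" language="urn:sedml:language:sbml" source="mymodel.xml"/>
  </listOfModels>
  <listOfSimulations>
    <!-- which simulation procedure to apply: a uniform time course -->
    <uniformTimeCourse id="sim1" initialTime="0" outputStartTime="0"
                       outputEndTime="100" numberOfPoints="1000">
      <algorithm kisaoID="KISAO:0000019"/>
    </uniformTimeCourse>
  </listOfSimulations>
  <listOfTasks>
    <!-- bind the model to the simulation -->
    <task id="task1" modelReference="model1" simulationReference="sim1"/>
  </listOfTasks>
</sedML>
```

Real SED-ML files additionally declare data generators and outputs, and, in Level 1 Version 2, the repeated tasks mentioned in the abstract.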
Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.
Kung, Yi-Wen; Chen, Sue-Huei
2012-09-01
This study explored how individuals in Taiwan perceive the risk of earthquakes and the relationship of past earthquake experience and gender to risk perception. Participants (n = 1,405), including earthquake survivors and those in the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of risk perception in regard to earthquakes. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection in regard to earthquakes. The findings indicated that earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research and intervention programs with regard to risk perception are suggested accordingly. © 2012 Society for Risk Analysis.
ERIC Educational Resources Information Center
Winberg, T. Mikael; Hedman, Leif
2008-01-01
Attitudes toward learning (ATL) have been shown to influence students' learning outcomes. However, there is a lack of knowledge about the ways in which the interaction between ATL, the learning situation, and the level of students' prior knowledge influence affective reactions and conceptual change. In this study, a simulation of acid-base…
Use of a computer model in the understanding of erythropoietic control mechanisms
NASA Technical Reports Server (NTRS)
Dunn, C. D. R.
1978-01-01
During an eight-week visit, approximately 200 simulations using the computer model for the regulation of erythropoiesis were carried out in four general areas, including simulation of hypoxia and dehydration with the human model and evaluation of the simulation of dehydration using the mouse model. The experiments led to two proposed modifications to the models: first, a direct relationship between erythropoietin concentration and bone marrow sensitivity to the hormone; second, a partial correction of tissue hypoxia prior to compensation by an increased hematocrit. This latter change in particular produced a better simulation of the effects of hypoxia on plasma erythropoietin concentrations.
Shock compression response of cold-rolled Ni/Al multilayer composites
NASA Astrophysics Data System (ADS)
Specht, Paul E.; Weihs, Timothy P.; Thadhani, Naresh N.
2017-01-01
Uniaxial strain, plate-on-plate impact experiments were performed on cold-rolled Ni/Al multilayer composites and the resulting Hugoniot was determined through time-resolved measurements combined with impedance matching. The experimental Hugoniot agreed with that previously predicted by two dimensional (2D) meso-scale calculations [Specht et al., J. Appl. Phys. 111, 073527 (2012)]. Additional 2D meso-scale simulations were performed using the same computational method as the prior study to reproduce the experimentally measured free surface velocities and stress profiles. These simulations accurately replicated the experimental profiles, providing additional validation for the previous computational work.
Influence of prior information on pain involves biased perceptual decision-making.
Wiech, Katja; Vandekerckhove, Joachim; Zaman, Jonas; Tuerlinckx, Francis; Vlaeyen, Johan W S; Tracey, Irene
2014-08-04
Prior information about features of a stimulus is a strong modulator of perception. For instance, the prospect of more intense pain leads to an increased perception of pain, whereas the expectation of analgesia reduces pain, as shown in placebo analgesia and expectancy modulations during drug administration. This influence is commonly assumed to be rooted in altered sensory processing, and expectancy-related modulations in the spinal cord are often taken as evidence for this notion. Contemporary models of perception, however, suggest that prior information can also modulate perception by biasing perceptual decision-making: the inferential process underlying perception in which prior information is used to interpret sensory information. In this type of bias, the information is already present in the system before the stimulus is observed. Computational models can distinguish between changes in sensory processing and altered decision-making because they result in different response times for incorrect choices in a perceptual decision-making task (Figure S1A,B). Using a drift-diffusion model, we investigated the influence of both processes in two independent experiments. The results of both experiments strongly suggest that these changes in pain perception are predominantly based on altered perceptual decision-making. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
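A crude simulation can show how a drift-diffusion model separates the two accounts. In the sketch below (illustrative parameters, not those fitted in the study), prior information is encoded as a starting-point bias toward the favored boundary; errors then have to travel farther against the drift, so their response times differ systematically from correct responses, the signature the abstract exploits:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_ddm(drift, start, n_trials=2000, bound=1.0, dt=0.001, sigma=1.0, t_max=6.0):
    """Euler simulation of a drift-diffusion race to +/-bound; returns choices and RTs."""
    n_steps = int(t_max / dt)
    x = np.full(n_trials, start, dtype=float)
    rt = np.full(n_trials, t_max)
    choice = np.zeros(n_trials, dtype=int)
    active = np.ones(n_trials, dtype=bool)
    for i in range(1, n_steps + 1):
        x[active] += drift * dt + sigma * np.sqrt(dt) * rng.normal(size=active.sum())
        up = active & (x >= bound)
        down = active & (x <= -bound)
        rt[up | down] = i * dt
        choice[up], choice[down] = 1, -1
        active &= ~(up | down)
        if not active.any():
            break
    return choice, rt

# Upper bound (+1) is the "correct" response favored by the prior.
ch, rt = simulate_ddm(drift=1.5, start=0.3)   # prior encoded as a starting-point bias
rt_correct = rt[ch == 1].mean()
rt_error = rt[ch == -1].mean()                # errors travel farther, so they are slower
```

With an unbiased starting point and symmetric bounds, correct and error RT distributions coincide, so the asymmetry above is diagnostic of a decision-level bias rather than altered sensory evidence.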
Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment
NASA Technical Reports Server (NTRS)
Storey, Jedediah M.; Kirk, Daniel; Marsell, Brandon (Editor); Schallhorn, Paul (Editor)
2017-01-01
Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.
Kleinman, L; Leidy, N K; Crawley, J; Bonomi, A; Schoenfeld, P
2001-02-01
Although most health-related quality of life questionnaires are self-administered by means of paper and pencil, new technologies for automated computer administration are becoming more readily available. Novel methods of instrument administration must be assessed for score equivalence in addition to consistency in reliability and validity. The present study compared the psychometric characteristics (score equivalence and structure, internal consistency, and reproducibility reliability and construct validity) of the Quality of Life in Reflux And Dyspepsia (QOLRAD) questionnaire when self-administered by means of paper and pencil versus touch-screen computer. The influence of age, education, and prior experience with computers on score equivalence was also examined. This crossover trial randomized 134 patients with gastroesophageal reflux disease to 1 of 2 groups: paper-and-pencil questionnaire administration followed by computer administration or computer administration followed by use of paper and pencil. To minimize learning effects and respondent fatigue, administrations were scheduled 3 days apart. A random sample of 32 patients participated in a 1-week reproducibility evaluation of the computer-administered QOLRAD. QOLRAD scores were equivalent across the 2 methods of administration regardless of subject age, education, and prior computer use. Internal consistency levels were very high (alpha = 0.93-0.99). Interscale correlations were strong and generally consistent across methods (r = 0.7-0.87). Correlations between the QOLRAD and Short Form 36 (SF-36) were high, with no significant differences by method. Test-retest reliability of the computer-administered QOLRAD was also very high (ICC = 0.93-0.96). Results of the present study suggest that the QOLRAD is reliable and valid when self-administered by means of computer touch-screen or paper and pencil.
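The internal consistency values quoted above (alpha = 0.93-0.99) are Cronbach's alpha, computed from item variances and the total-score variance. A minimal sketch with simulated data (hypothetical scores, not the QOLRAD items):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Five items tapping a single latent construct with modest item noise.
rng = np.random.default_rng(3)
trait = rng.normal(size=(200, 1))
scores = trait + 0.3 * rng.normal(size=(200, 5))
alpha = cronbach_alpha(scores)   # high, in the range reported above
```

Items that all track one underlying construct with little independent noise push alpha toward 1, which is why parallel administrations of the same instrument are expected to agree.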
Klooster, Nathaniel B.; Cook, Susan W.; Uc, Ergun Y.; Duff, Melissa C.
2015-01-01
Hand gesture, a ubiquitous feature of human interaction, facilitates communication. Gesture also facilitates new learning, benefiting speakers and listeners alike. Thus, gestures must impact cognition beyond simply supporting the expression of already-formed ideas. However, the cognitive and neural mechanisms supporting the effects of gesture on learning and memory are largely unknown. We hypothesized that gesture's ability to drive new learning is supported by procedural memory and that procedural memory deficits will disrupt gesture production and comprehension. We tested this proposal in patients with intact declarative memory, but impaired procedural memory as a consequence of Parkinson's disease (PD), and healthy comparison participants with intact declarative and procedural memory. In separate experiments, we manipulated the gestures participants saw and produced in a Tower of Hanoi (TOH) paradigm. In the first experiment, participants solved the task either on a physical board, requiring high arching movements to manipulate the discs from peg to peg, or on a computer, requiring only flat, sideways movements of the mouse. When explaining the task, healthy participants with intact procedural memory displayed evidence of their previous experience in their gestures, producing higher, more arching hand gestures after solving on a physical board, and smaller, flatter gestures after solving on a computer. In the second experiment, healthy participants who saw high arching hand gestures in an explanation prior to solving the task subsequently moved the mouse with significantly higher curvature than those who saw smaller, flatter gestures prior to solving the task. These patterns were absent in both gesture production and comprehension experiments in patients with procedural memory impairment. These findings suggest that the procedural memory system supports the ability of gesture to drive new learning. PMID:25628556
Perceptions and performance using computer-based testing: One institution's experience.
Bloom, Timothy J; Rich, Wesley D; Olson, Stephanie M; Adams, Michael L
2018-02-01
The purpose of this study was to evaluate student and faculty perceptions of the transition to a required computer-based testing format and to identify any impact of this transition on student exam performance. Separate questionnaires sent to students and faculty asked about perceptions of and problems with computer-based testing. Exam results from program-required courses for two years prior to and two years following the adoption of computer-based testing were compared to determine if this testing format impacted student performance. Responses to Likert-type questions about perceived ease of use showed no difference between students with one and three semesters experience with computer-based testing. Of 223 student-reported problems, 23% related to faculty training with the testing software. Students most commonly reported improved feedback (46% of responses) and ease of exam-taking (17% of responses) as benefits to computer-based testing. Faculty-reported difficulties were most commonly related to problems with student computers during an exam (38% of responses) while the most commonly identified benefit was collecting assessment data (32% of responses). Neither faculty nor students perceived an impact on exam performance due to computer-based testing. An analysis of exam grades confirmed there was no consistent performance difference between the paper and computer-based formats. Both faculty and students rapidly adapted to using computer-based testing. There was no evidence that switching to computer-based testing had any impact on student exam performance. Copyright © 2017 Elsevier Inc. All rights reserved.
Hispanic women overcoming deterrents to computer science: A phenomenological study
NASA Astrophysics Data System (ADS)
Herling, Lourdes
The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines, what attracts women and minorities to computer science, and addressing the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determines whether being subjected to multiple marginalizations---female and Hispanic---played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but to persist as well. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. 
The aptitudes participants commonly believed are needed for success in computer science are the twenty-first-century skills of problem solving, creativity, and critical thinking. While not all of the participants had experience with computers or programming prior to attending college, such experience played a role in the self-confidence of those who did.
Saito, Atsushi; Nawano, Shigeru; Shimizu, Akinobu
2017-05-01
This paper addresses joint optimization for segmentation and shape priors, including translation, to overcome inter-subject variability in the location of an organ. Because a simple extension of the previous exact optimization method is too computationally complex, we propose a fast approximation for optimization. The effectiveness of the proposed approximation is validated in the context of gallbladder segmentation from a non-contrast computed tomography (CT) volume. After spatial standardization and estimation of the posterior probability of the target organ, simultaneous optimization of the segmentation, shape, and location priors is performed using a branch-and-bound method. Fast approximation is achieved by combining sampling in the eigenshape space to reduce the number of shape priors and an efficient computational technique for evaluating the lower bound. Performance was evaluated using threefold cross-validation of 27 CT volumes. Optimization in terms of translation of the shape prior significantly improved segmentation performance. The proposed method achieved a result of 0.623 on the Jaccard index in gallbladder segmentation, which is comparable to that of state-of-the-art methods. The computational efficiency of the algorithm is confirmed to be good enough to allow execution on a personal computer. Joint optimization of the segmentation, shape, and location priors was proposed, and it proved to be effective in gallbladder segmentation with high computational efficiency.
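The reported segmentation score of 0.623 is the Jaccard index: the ratio of the intersection to the union of the predicted and ground-truth voxel sets. A minimal sketch with a tiny made-up pair of binary masks:

```python
import numpy as np

def jaccard(seg, truth):
    """Jaccard index (intersection over union) of two binary masks."""
    seg, truth = np.asarray(seg, dtype=bool), np.asarray(truth, dtype=bool)
    inter = np.logical_and(seg, truth).sum()
    union = np.logical_or(seg, truth).sum()
    return inter / union if union else 1.0

a = np.zeros((4, 4), dtype=int); a[1:3, 1:3] = 1   # predicted mask (4 voxels)
b = np.zeros((4, 4), dtype=int); b[1:3, 2:4] = 1   # ground truth, shifted one column
score = jaccard(a, b)                              # overlap 2, union 6 -> 1/3
```

The index penalizes both over- and under-segmentation, which makes it a common figure of merit for organ segmentation methods like the one above.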
Fincher, Danielle; VanderEnde, Kristin; Colbert, Kia; Houry, Debra; Smith, L Shakiyla; Yount, Kathryn M
2015-03-01
African American women in the United States report intimate partner violence (IPV) more often than the general population of women. Overall, women underreport IPV because of shame, embarrassment, fear of retribution, or low expectation of legal support. African American women may be especially unlikely to report IPV because of poverty, low social support, and past experiences of discrimination. The purpose of this article is to determine the context in which low-income African American women disclose IPV. Consenting African American women receiving Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) services in WIC clinics were randomized to complete an IPV screening (Revised Conflict Tactics Scales-Short Form) via computer-assisted self-interview (CASI) or face-to-face interview (FTFI). Women (n = 368) reported high rates of lifetime and prior-year verbal (48%, 34%), physical (12%, 7%), sexual (10%, 7%), and any (49%, 36%) IPV, as well as IPV-related injury (13%, 7%). Mode of screening, but not interviewer race, affected disclosure. Women screened via FTFI reported significantly more lifetime and prior-year negotiation (adjusted odds ratio [aOR] = 10.54, 3.97) and more prior-year verbal (aOR = 2.10), sexual (aOR = 4.31), and any (aOR = 2.02) IPV than CASI-screened women. African American women in a WIC setting disclosed IPV more often in face-to-face than computer screening, and race-matching of client and interviewer did not affect disclosure. Findings highlight the potential value of face-to-face screening to identify women at risk of IPV. Programs should weigh the costs and benefits of training staff versus using computer-based technologies to screen for IPV in WIC settings. © The Author(s) 2014.
Sensitivity to spatial frequency content is not specific to face perception
Williams, N. Rankin; Willenbockel, Verena; Gauthier, Isabel
2010-01-01
Prior work using a matching task between images that were complementary in spatial frequency and orientation information suggested that the representation of faces, but not objects, retains low-level spatial frequency (SF) information (Biederman & Kalocsai. 1997). In two experiments, we reexamine the claim that faces are uniquely sensitive to changes in SF. In contrast to prior work, we used a design allowing the computation of sensitivity and response criterion for each category, and in one experiment, equalized low-level image properties across object categories. In both experiments, we find that observers are sensitive to SF changes for upright and inverted faces and nonface objects. Differential response biases across categories contributed to a larger sensitivity for faces, but even sensitivity showed a larger effect for faces, especially when faces were upright and in a front-facing view. However, when objects were inverted, or upright but shown in a three-quarter view, the matching of objects and faces was equally sensitive to SF changes. Accordingly, face perception does not appear to be uniquely affected by changes in SF content. PMID:19576237
Blowers, Paul; Hollingshead, Kyle
2009-05-21
In this work, the global warming potential (GWP) of methylene fluoride (CH2F2), or HFC-32, is estimated through computational chemistry methods. We find that our computational chemistry approach accurately reproduces all phenomena important for predicting global warming potentials. Geometries predicted using the B3LYP/6-311g** method were in good agreement with experiment, although some other computational methods performed slightly better. Frequencies needed for partition function calculations in transition-state theory, and infrared intensities needed for radiative forcing estimates, agreed well with experiment compared to other computational methods. A modified CBS-RAD method used to obtain energies led to results superior to all previous heat-of-reaction estimates and to most barrier-height calculations when the B3LYP/6-311g** optimized geometry was used as the base structure. Use of the small-curvature tunneling correction and a hindered-rotor treatment where appropriate led to accurate reaction rate constants and radiative forcing estimates without requiring any experimental data. Atmospheric lifetimes from theory at 277 K were indistinguishable from experimental results, as were the final global warming potentials compared to experiment. This is the first time entirely computational methods have been applied to estimate a global warming potential for a chemical, and we have found the approach to be robust, inexpensive, and accurate compared to prior experimental results. This methodology was subsequently used to estimate GWPs for three additional species [methane (CH4); fluoromethane (CH3F), or HFC-41; and fluoroform (CHF3), or HFC-23], whose estimates also compare favorably to experimental values.
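The lifetime-and-forcing arithmetic that links a computed reaction rate constant to a GWP can be sketched in a few lines. The rate constant and OH concentration below are illustrative assumptions for demonstration, not values from the paper:

```python
import math

# Assumed OH-abstraction rate constant (cm^3 molecule^-1 s^-1) and a
# typical global-mean tropospheric OH concentration (molecules cm^-3).
k_oh = 1.0e-14
oh_conc = 1.0e6

# Atmospheric lifetime against OH loss: tau = 1 / (k_OH * [OH])
lifetime_s = 1.0 / (k_oh * oh_conc)
lifetime_yr = lifetime_s / (365.25 * 24 * 3600)

def agwp(radiative_efficiency, lifetime, horizon):
    """Absolute GWP for single-exponential decay:
    integral of a * exp(-t/tau) over [0, H] = a * tau * (1 - exp(-H/tau))."""
    return radiative_efficiency * lifetime * (1 - math.exp(-horizon / lifetime))
```

The GWP reported for a species is then the ratio of its `agwp` to that of CO2 over the same horizon; CO2's decay is multi-exponential, which is why tabulated reference values are normally used for the denominator.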
NASA Astrophysics Data System (ADS)
Tran, Diem-Trang T.; Le, Ly T.; Truong, Thanh N.
2013-08-01
Drug binding and unbinding are transient processes which are hardly observed by experiment and difficult to analyze by computational techniques. In this paper, we employed a cost-effective method called "pathway docking" in which molecular docking was used to screen ligand-receptor binding free energy surface to reveal possible paths of ligand approaching protein binding pocket. A case study was applied on oseltamivir, the key drug against influenza a virus. The equilibrium pathways identified by this method are found to be similar to those identified in prior studies using highly expensive computational approaches.
Identification of Boolean Network Models From Time Series Data Incorporating Prior Knowledge.
Leifeld, Thomas; Zhang, Zhihua; Zhang, Ping
2018-01-01
Motivation: Mathematical models take an important place in science and engineering. A model can help scientists to explain dynamic behavior of a system and to understand the functionality of system components. Since the length of a time series and the number of replicates are limited by the cost of experiments, Boolean networks, as structurally simple and parameter-free logical models for gene regulatory networks, have attracted the interest of many scientists. To fit the biological context and to lower the data requirements, biological prior knowledge is taken into consideration during the inference procedure. In the literature, the existing identification approaches can only deal with a subset of possible types of prior knowledge. Results: We propose a new approach to identify Boolean networks from time series data incorporating prior knowledge, such as partial network structure, canalizing properties, and positive and negative unateness. Using the vector form of Boolean variables and applying a generalized matrix multiplication called the semi-tensor product (STP), each Boolean function can be equivalently converted into a matrix expression. Based on this, the identification problem is reformulated as an integer linear programming problem to reveal the system matrix of the Boolean model in a computationally efficient way, whose dynamics are consistent with the important dynamics captured in the data. By using prior knowledge the number of candidate functions can be reduced during the inference. Hence, identification incorporating prior knowledge is especially suitable for the case of small time series data sets and data without sufficient stimuli. The proposed approach is illustrated with the help of a biological model of the network of oxidative stress response. 
Conclusions: The combination of efficient reformulation of the identification problem with the possibility to incorporate various types of prior knowledge enables the application of computational model inference to systems with limited amount of time series data. The general applicability of this methodological approach makes it suitable for a variety of biological systems and of general interest for biological and medical research.
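The vector-form encoding and semi-tensor product underlying this reformulation can be sketched as follows. This toy demonstrates only the STP mechanics for a single AND gate, under the standard convention True = [1,0]^T, False = [0,1]^T; it is not the paper's integer-programming identification procedure:

```python
import numpy as np
from math import lcm

def stp(a, b):
    """Semi-tensor product of a (m x n) and b (p x q):
    (A kron I_{t/n}) @ (B kron I_{t/p}) with t = lcm(n, p)."""
    n, p = a.shape[1], b.shape[0]
    t = lcm(n, p)
    return np.kron(a, np.eye(t // n)) @ np.kron(b, np.eye(t // p))

# Boolean values as vectors: True -> [1,0]^T, False -> [0,1]^T
T = np.array([[1], [0]])
F = np.array([[0], [1]])

# Structure matrix of AND: columns are f(T,T), f(T,F), f(F,T), f(F,F)
M_and = np.array([[1, 0, 0, 0],
                  [0, 1, 1, 1]])

# f(x, y) = x AND y evaluated as M_and |x |x y (iterated STP)
result = stp(stp(M_and, T), F)   # True AND False -> False
```

Identification then amounts to searching for the entries of such structure matrices consistent with the observed state transitions, which is where the integer linear program and the prior-knowledge constraints enter.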
Blood Donor Test-Seeking Motivation and Prior HIV Testing Experiences in São Paulo, Brazil.
Truong, Hong-Ha M; Blatyta, Paula F; Santos, Fernanda M; Montebello, Sandra; Esposti, Sandra P D; Hangai, Fatima N; Salles, Nanci Alves; Mendrone, Alfredo; Sabino, Ester C; McFarland, Willi; Gonçalez, Thelma T
2015-09-01
HIV test-seeking behavior among blood donors has been observed worldwide and may pose a threat to the safety of the blood supply. We evaluated current test-seeking motivations and prior alternative HIV testing experiences among blood donors in São Paulo, Brazil. All candidate or potential blood donors were consecutively approached and recruited to participate in the study upon presentation at Fundação Pró-Sangue Hemocentro, the largest blood bank in Brazil. Participants were recruited between August 2012 and May 2013 after they were screened for donor eligibility. Questionnaires were administered through audio computer-assisted self-interview. Among 11,867 donors, 38 % previously tested for HIV apart from blood donation, of whom 47.7 % tested at public facilities and 2.7 % acknowledged getting tested for HIV as the primary reason for donating. Dissatisfaction with prior alternative testing experience was reported by 2.5 % of donors. Current test-seeking motivation was associated with dissatisfaction with prior alternative testing experience and testing at a public alternative facility. The most common reasons for dissatisfaction were too long of a wait to get tested and for results, counseling was too long, lack of privacy, and low confidence in the equipment and accuracy of the test. Lack of awareness about the availability of free and confidential public HIV testing services as well as dissatisfaction with past HIV testing and counseling experiences motivate some individuals to test at blood banks. Test-seeking behavior among blood donors may be best addressed by improving alternative testing programs, particularly with respect to time delays, privacy and perceptions about test accuracy. Educational campaigns on safe blood donation and HIV testing for diagnosis, risk counseling and referral to care are also needed for the general public and for health care providers.
Factors influencing use of an e-health website in a community sample of older adults.
Czaja, Sara J; Sharit, Joseph; Lee, Chin Chin; Nair, Sankaran N; Hernández, Mario A; Arana, Neysarí; Fu, Shih Hua
2013-01-01
The use of the internet as a source of health information and link to healthcare services has raised concerns about the ability of consumers, especially vulnerable populations such as older adults, to access these applications. This study examined the influence of training on the ability of adults (aged 45+ years) to use the Medicare.gov website to solve problems related to health management. The influence of computer experience and cognitive abilities on performance was also examined. Seventy-one participants, aged 47-92, were randomized into a Multimedia training, Unimodal training, or Cold Start condition and completed three healthcare management problems. MEASUREMENT AND ANALYSES: Computer/internet experience was measured via questionnaire, and cognitive abilities were assessed using standard neuropsychological tests. Performance metrics included measures of navigation, accuracy and efficiency. Data were analyzed using analysis of variance, χ2 and regression techniques. The data indicate that there was no difference among the three conditions on measures of accuracy, efficiency, or navigation. However, results of the regression analyses showed that, overall, people who received training performed better on the tasks, as evidenced by greater accuracy and efficiency. Performance was also significantly influenced by prior computer experience and cognitive abilities. Participants with more computer experience and higher cognitive abilities performed better. The findings indicate that training, experience, and abilities are important when using complex health websites. However, training alone is not sufficient. The complexity of web content needs to be considered to ensure successful use of these websites by those with lower abilities.
Factors influencing use of an e-health website in a community sample of older adults
Sharit, Joseph; Lee, Chin Chin; Nair, Sankaran N; Hernández, Mario A; Arana, Neysarí; Fu, Shih Hua
2013-01-01
Objective The use of the internet as a source of health information and link to healthcare services has raised concerns about the ability of consumers, especially vulnerable populations such as older adults, to access these applications. This study examined the influence of training on the ability of adults (aged 45+ years) to use the Medicare.gov website to solve problems related to health management. The influence of computer experience and cognitive abilities on performance was also examined. Design Seventy-one participants, aged 47–92, were randomized into a Multimedia training, Unimodal training, or Cold Start condition and completed three healthcare management problems. Measurement and analyses Computer/internet experience was measured via questionnaire, and cognitive abilities were assessed using standard neuropsychological tests. Performance metrics included measures of navigation, accuracy and efficiency. Data were analyzed using analysis of variance, χ2 and regression techniques. Results The data indicate that there was no difference among the three conditions on measures of accuracy, efficiency, or navigation. However, results of the regression analyses showed that, overall, people who received training performed better on the tasks, as evidenced by greater accuracy and efficiency. Performance was also significantly influenced by prior computer experience and cognitive abilities. Participants with more computer experience and higher cognitive abilities performed better. Conclusions The findings indicate that training, experience, and abilities are important when using complex health websites. However, training alone is not sufficient. The complexity of web content needs to be considered to ensure successful use of these websites by those with lower abilities. PMID:22802269
Understanding the Role of Prior Knowledge in a Multimedia Learning Application
ERIC Educational Resources Information Center
Rias, Riaza Mohd; Zaman, Halimah Badioze
2013-01-01
This study looked at the effects that individual differences in prior knowledge have on student understanding in learning with multimedia in a computer science subject. Students were identified as having either low or high prior knowledge from a series of questions asked in a survey conducted at the Faculty of Computer and Mathematical Sciences at…
Computational Neuropsychology and Bayesian Inference.
Parr, Thomas; Rees, Geraint; Friston, Karl J
2018-01-01
Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine 'prior' beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes optimal pathology - optimal inference with suboptimal priors - and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient's behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology.
Yang, Jin; Lee, Joonyeol; Lisberger, Stephen G.
2012-01-01
Sensory-motor behavior results from a complex interaction of noisy sensory data with priors based on recent experience. By varying the stimulus form and contrast for the initiation of smooth pursuit eye movements in monkeys, we show that visual motion inputs compete with two independent priors: one prior biases eye speed toward zero; the other prior attracts eye direction according to the past several days’ history of target directions. The priors bias the speed and direction of the initiation of pursuit for the weak sensory data provided by the motion of a low-contrast sine wave grating. However, the priors have relatively little effect on pursuit speed and direction when the visual stimulus arises from the coherent motion of a high-contrast patch of dots. For any given stimulus form, the mean and variance of eye speed co-vary in the initiation of pursuit, as expected for signal-dependent noise. This relationship suggests that pursuit implements a trade-off between movement accuracy and variation, reducing both when the sensory signals are noisy. The tradeoff is implemented as a competition of sensory data and priors that follows the rules of Bayesian estimation. Computer simulations show that the priors can be understood as direction specific control of the strength of visual-motor transmission, and can be implemented in a neural-network model that makes testable predictions about the population response in the smooth eye movement region of the frontal eye fields. PMID:23223286
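The competition between a slow-speed prior and noisy sensory data described above follows standard Gaussian cue combination. A minimal sketch with illustrative variances (the paper's priors were estimated from behavior, not set by hand):

```python
def posterior_speed(measured, sigma_sensory, sigma_prior):
    """Posterior mean of a Gaussian prior N(0, sigma_prior^2) on speed
    combined with a Gaussian likelihood N(measured, sigma_sensory^2):
    the estimate is the measurement shrunk toward zero by the
    reliability-weighted factor sigma_prior^2 / (sigma_prior^2 + sigma_sensory^2)."""
    w = sigma_prior ** 2 / (sigma_prior ** 2 + sigma_sensory ** 2)
    return w * measured

target_speed = 20.0  # deg/s

# High-contrast stimulus: reliable sensory data, the prior has little pull.
high = posterior_speed(target_speed, sigma_sensory=1.0, sigma_prior=10.0)

# Low-contrast stimulus: noisy data, the estimate is biased toward zero,
# mirroring the slowed pursuit initiation observed for low-contrast gratings.
low = posterior_speed(target_speed, sigma_sensory=10.0, sigma_prior=10.0)
```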
Modeling of a latent fault detector in a digital system
NASA Technical Reports Server (NTRS)
Nagel, P. M.
1978-01-01
Methods of modeling the detection time, or latency period, of a hardware fault in a digital system are proposed that explain how a computer detects faults in a computational mode. The objectives were to study how software reacts to a fault, to account for as many variables as possible affecting detection, and to forecast a given program's detecting ability prior to computation. A series of experiments was conducted on a small emulated microprocessor with fault-injection capability. Results indicate that the detecting capability of a program depends largely on the instruction subset used during computation and the frequency of its use, and has little direct dependence on such variables as fault mode, number set, degree of branching, and program length. A model is discussed which employs an analogy with balls in an urn to explain the rate at which subsequent repetitions of an instruction or instruction set detect a given fault.
Calibrated birth-death phylogenetic time-tree priors for Bayesian inference.
Heled, Joseph; Drummond, Alexei J
2015-05-01
Here we introduce a general class of multiple calibration birth-death tree priors for use in Bayesian phylogenetic inference. All tree priors in this class separate ancestral node heights into a set of "calibrated nodes" and "uncalibrated nodes" such that the marginal distribution of the calibrated nodes is user-specified whereas the density ratio of the birth-death prior is retained for trees with equal values for the calibrated nodes. We describe two formulations, one in which the calibration information informs the prior on ranked tree topologies, through the (conditional) prior, and the other which factorizes the prior on divergence times and ranked topologies, thus allowing uniform, or any arbitrary prior distribution on ranked topologies. Although the first of these formulations has some attractive properties, the algorithm we present for computing its prior density is computationally intensive. However, the second formulation is always faster and computationally efficient for up to six calibrations. We demonstrate the utility of the new class of multiple-calibration tree priors using both small simulations and a real-world analysis and compare the results to existing schemes. The two new calibrated tree priors described in this article offer greater flexibility and control of prior specification in calibrated time-tree inference and divergence time dating, and will remove the need for indirect approaches to the assessment of the combined effect of calibration densities and tree priors in Bayesian phylogenetic inference. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
Identification of subsurface structures using electromagnetic data and shape priors
NASA Astrophysics Data System (ADS)
Tveit, Svenn; Bakr, Shaaban A.; Lien, Martha; Mannseth, Trond
2015-03-01
We consider the inverse problem of identifying large-scale subsurface structures using the controlled source electromagnetic method. To identify structures in the subsurface where the contrast in electric conductivity can be small, regularization is needed to bias the solution towards preserving structural information. We propose to combine two approaches for regularization of the inverse problem. In the first approach we utilize a model-based, reduced, composite representation of the electric conductivity that is highly flexible, even for a moderate number of degrees of freedom. With a low number of parameters, the inverse problem is efficiently solved using a standard, second-order gradient-based optimization algorithm. Further regularization is obtained using structural prior information, available, e.g., from interpreted seismic data. The reduced conductivity representation is suitable for incorporation of structural prior information. Such prior information cannot, however, be accurately modeled with a Gaussian distribution. To alleviate this, we incorporate the structural information using shape priors. The shape prior technique requires the choice of a kernel function, which is application dependent. We argue for using the conditionally positive definite kernel, which is shown to have computational advantages over the commonly applied Gaussian kernel for our problem. Numerical experiments on various test cases show that the methodology is able to identify fairly complex subsurface electric conductivity distributions while preserving structural prior information during the inversion.
The Impact of Learner's Prior Knowledge on Their Use of Chemistry Computer Simulations: A Case Study
ERIC Educational Resources Information Center
Liu, Han-Chin; Andre, Thomas; Greenbowe, Thomas
2008-01-01
It is complicated to design a computer simulation that adapts to students with different characteristics. This study documented cases that show how college students' prior chemistry knowledge level affected their interaction with peers and their approach to solving problems with the use of computer simulations that were designed to learn…
NASA Astrophysics Data System (ADS)
Pua, Rizza; Park, Miran; Wi, Sunhee; Cho, Seungryong
2016-12-01
We propose a hybrid metal artifact reduction (MAR) approach for computed tomography (CT) that is computationally more efficient than a fully iterative reconstruction method, but at the same time achieves superior image quality to the interpolation-based in-painting techniques. Our proposed MAR method, an image-based artifact subtraction approach, utilizes an intermediate prior image reconstructed via PDART to recover the background information underlying the high density objects. For comparison, prior images generated by total-variation minimization (TVM) algorithm, as a realization of fully iterative approach, were also utilized as intermediate images. From the simulation and real experimental results, it has been shown that PDART drastically accelerates the reconstruction to an acceptable quality of prior images. Incorporating PDART-reconstructed prior images in the proposed MAR scheme achieved higher quality images than those by a conventional in-painting method. Furthermore, the results were comparable to the fully iterative MAR that uses high-quality TVM prior images.
Digital hardware implementation of a stochastic two-dimensional neuron model.
Grassia, F; Kohno, T; Levi, T
2016-11-01
This study explores the feasibility of stochastic neuron simulation in digital systems (FPGA), realizing an implementation of a two-dimensional neuron model. Stochasticity is added by a source of current noise in the silicon neuron using an Ornstein-Uhlenbeck process. This approach uses digital computation to emulate individual neuron behavior using fixed-point arithmetic. The neuron model's computations are performed in arithmetic pipelines. It was designed in VHDL and simulated prior to mapping onto the FPGA. The experimental results confirmed the validity of the developed stochastic FPGA implementation, which makes the implementation of the silicon neuron more biologically plausible for future hybrid experiments. Copyright © 2017 Elsevier Ltd. All rights reserved.
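An Ornstein-Uhlenbeck current-noise source of the kind described above can be sketched with the usual Euler-Maruyama discretization. The parameter values are illustrative assumptions, and the FPGA version would use fixed-point rather than floating-point arithmetic:

```python
import math
import random

def ou_noise(n_steps, dt=1e-4, theta=50.0, mu=0.0, sigma=0.2, seed=1):
    """Discrete OU update: I += theta*(mu - I)*dt + sigma*sqrt(dt)*xi,
    where xi ~ N(0, 1). The process mean-reverts to mu with rate theta
    and has stationary standard deviation sigma / sqrt(2*theta)."""
    random.seed(seed)
    i_noise = 0.0
    trace = []
    for _ in range(n_steps):
        xi = random.gauss(0.0, 1.0)
        i_noise += theta * (mu - i_noise) * dt + sigma * math.sqrt(dt) * xi
        trace.append(i_noise)
    return trace

trace = ou_noise(10000)
```

In hardware, the Gaussian draw is typically replaced by a pseudo-random generator feeding the same update equation inside the arithmetic pipeline.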
Eye-gaze determination of user intent at the computer interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, J.H.; Schryver, J.C.
1993-12-31
Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming in or out. This methodology first collects samples of eye-gaze location looking at controlled stimuli, at 30 Hz, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered, according to user-defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no-zoom conditions. The statistical procedure effectively generates heuristics for future assignments, based upon these variables. Future work will enhance the accuracy and precision of the modeling technique, and will empirically test users in controlled experiments.
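The clustering step described here can be sketched as follows: build a minimum spanning tree over the gaze samples, then cut edges longer than a user-defined threshold so the remaining connected components form the clusters. The point coordinates and threshold below are illustrative assumptions:

```python
import numpy as np

def mst_clusters(points, cut_threshold):
    """Prim's MST over pairwise distances, then cut edges > threshold
    and label the resulting connected components via union-find."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    in_tree = [0]
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or d[i, j] < best[2]):
                    best = (i, j, d[i, j])
        in_tree.append(best[1])
        edges.append(best)
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i, j, w in edges:
        if w <= cut_threshold:          # keep only short MST edges
            parent[find(i)] = find(j)
    labels = [find(i) for i in range(n)]
    return len(set(labels)), labels

# Two well-separated groups of gaze samples (pixel coordinates)
pts = np.array([[100, 100], [102, 101], [99, 103],
                [400, 400], [401, 402], [398, 399]])
n_clusters, labels = mst_clusters(pts, cut_threshold=50.0)
```

Per-cluster statistics (size, centroid position, pupil size) would then be computed from the points sharing a label, as inputs to the discriminant analysis.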
Iterative Region-of-Interest Reconstruction from Limited Data Using Prior Information
NASA Astrophysics Data System (ADS)
Vogelgesang, Jonas; Schorr, Christian
2017-12-01
In practice, computed tomography and computed laminography applications suffer from incomplete data. In particular, when inspecting large objects with extremely different diameters in longitudinal and transversal directions or when high resolution reconstructions are desired, the physical conditions of the scanning system lead to restricted data and truncated projections, also known as the interior or region-of-interest (ROI) problem. To recover the searched-for density function of the inspected object, we derive a semi-discrete model of the ROI problem that inherently allows the incorporation of geometrical prior information in an abstract Hilbert space setting for bounded linear operators. Assuming that the attenuation inside the object is approximately constant, as for fibre reinforced plastics parts or homogeneous objects where one is interested in locating defects like cracks or porosities, we apply the semi-discrete Landweber-Kaczmarz method to recover the inner structure of the object inside the ROI from the measured data resulting in a semi-discrete iteration method. Finally, numerical experiments for three-dimensional tomographic applications with both an inherent restricted source and ROI problem are provided to verify the proposed method for the ROI reconstruction.
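The core building block of such schemes is the classical Landweber iteration x_{k+1} = x_k + w * A^T (b - A x_k); the paper's semi-discrete, prior-constrained Landweber-Kaczmarz method is considerably more involved, so the sketch below only illustrates the basic fixed-point update on a tiny stand-in system:

```python
import numpy as np

def landweber(A, b, n_iter=500, relax=None):
    """Plain Landweber iteration for A x = b. The relaxation parameter
    must satisfy 0 < relax < 2 / sigma_max(A)^2 for convergence."""
    if relax is None:
        relax = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + relax * A.T @ (b - A @ x)   # gradient step on ||Ax - b||^2
    return x

# Tiny consistent system standing in for the discretized ROI problem
A = np.array([[2.0, 1.0], [1.0, 3.0]])
x_true = np.array([1.0, 2.0])
x_rec = landweber(A, A @ x_true)
```

In the Kaczmarz variant, the update cycles over blocks of equations (e.g., per projection angle), and the ROI prior enters as a projection onto the constraint set after each sweep.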
Geraci, Lisa; Hughes, Matthew L; Miller, Tyler M; De Forrest, Ross L
2016-01-01
Negative aging stereotypes can lead older adults to perform poorly on memory tests. Yet, memory performance can be improved if older adults have a single successful experience on a cognitive test prior to participating in a memory experiment (Geraci & Miller, 2013, Psychology and Aging, 28, 340-345). The current study examined the effects of different types of prior task experience on subsequent memory performance. Before participating in a verbal free recall experiment, older adults in Experiment 1 successfully completed either a verbal or a visual cognitive task or no task. In Experiment 2, they successfully completed either a motor task or no task before participating in the free recall experiment. Results from Experiment 1 showed that relative to control (no prior task), participants who had prior success, either on a verbal or a visual task, had better subsequent recall performance. Experiment 2 showed that prior success on a motor task, however, did not lead to a later memory advantage relative to control. These findings demonstrate that older adults' memory can be improved by a successful prior task experience so long as that experience is in a cognitive domain.
Pharmacist Computer Skills and Needs Assessment Survey
Jewesson, Peter J
2004-01-01
Background To use technology effectively for the advancement of patient care, pharmacists must possess a variety of computer skills. We recently introduced a novel applied informatics program in this Canadian hospital clinical service unit to enhance the informatics skills of our members. Objective This study was conducted to gain a better understanding of the baseline computer skills and needs of our hospital pharmacists immediately prior to the implementation of an applied informatics program. Methods In May 2001, an 84-question written survey was distributed by mail to 106 practicing hospital pharmacists in our multi-site, 1500-bed, acute-adult-tertiary care Canadian teaching hospital in Vancouver, British Columbia. Results Fifty-eight surveys (55% of total) were returned within the two-week study period. The survey responses reflected the opinions of licensed BSc and PharmD hospital pharmacists with a broad range of pharmacy practice experience. Most respondents had home access to personal computers, and regularly used computers in the work environment for drug distribution, information management, and communication purposes. Few respondents reported experience with handheld computers. Software use experience varied according to application. Although patient-care information software and e-mail were commonly used, experience with spreadsheet, statistical, and presentation software was negligible. The respondents were familiar with Internet search engines, and these were reported to be the most common method of seeking clinical information online. Although many respondents rated themselves as being generally computer literate and not particularly anxious about using computers, the majority believed they required more training to reach their desired level of computer literacy. Lack of familiarity with computer-related terms was prevalent. Self-reported basic computer skill was typically at a moderate level, and varied depending on the task. 
Specifically, respondents rated their ability to manipulate files, use software help features, and install software as low, but rated their ability to access and navigate the Internet as high. Respondents were generally aware of what online resources were available to them and Clinical Pharmacology was the most commonly employed reference. In terms of anticipated needs, most pharmacists believed they needed to upgrade their computer skills. Medical database and Internet searching skills were identified as those in greatest need of improvement. Conclusions Most pharmacists believed they needed to upgrade their computer skills. Medical database and Internet searching skills were identified as those in greatest need of improvement for the purposes of improving practice effectiveness. PMID:15111277
The utility of multiple synthesized views in the recognition of unfamiliar faces.
Jones, Scott P; Dwyer, Dominic M; Lewis, Michael B
2017-05-01
The ability to recognize an unfamiliar individual on the basis of prior exposure to a photograph is notoriously poor and prone to errors, but recognition accuracy is improved when multiple photographs are available. In applied situations, when only limited real images are available (e.g., from a mugshot or CCTV image), the generation of new images might provide a technological prosthesis for otherwise fallible human recognition. We report two experiments examining the effects of providing computer-generated additional views of a target face. In Experiment 1, provision of computer-generated views supported better target face recognition than exposure to the target image alone and equivalent performance to that for exposure of multiple photograph views. Experiment 2 replicated the advantage of providing generated views, but also indicated an advantage for multiple viewings of the single target photograph. These results strengthen the claim that identifying a target face can be improved by providing multiple synthesized views based on a single target image. In addition, our results suggest that the degree of advantage provided by synthesized views may be affected by the quality of synthesized material.
Incorporating linguistic knowledge for learning distributed word representations.
Wang, Yan; Liu, Zhiyuan; Sun, Maosong
2015-01-01
Combined with neural language models, distributed word representations achieve significant advantages in computational linguistics and text mining. Most existing models estimate distributed word vectors from large-scale data in an unsupervised fashion, which, however, do not take rich linguistic knowledge into consideration. Linguistic knowledge can be represented as either link-based knowledge or preference-based knowledge, and we propose knowledge regularized word representation models (KRWR) to incorporate this prior knowledge for learning distributed word representations. Experimental results demonstrate that our estimated word representations achieve better performance in the task of semantic relatedness ranking. This indicates that our methods can efficiently encode both prior knowledge from knowledge bases and statistical knowledge from large-scale text corpora into a unified word representation model, which will benefit many tasks in text mining.
Incorporating Linguistic Knowledge for Learning Distributed Word Representations
Wang, Yan; Liu, Zhiyuan; Sun, Maosong
2015-01-01
Combined with neural language models, distributed word representations achieve significant advantages in computational linguistics and text mining. Most existing models estimate distributed word vectors from large-scale data in an unsupervised fashion, which, however, do not take rich linguistic knowledge into consideration. Linguistic knowledge can be represented as either link-based knowledge or preference-based knowledge, and we propose knowledge regularized word representation models (KRWR) to incorporate this prior knowledge for learning distributed word representations. Experimental results demonstrate that our estimated word representations achieve better performance in the task of semantic relatedness ranking. This indicates that our methods can efficiently encode both prior knowledge from knowledge bases and statistical knowledge from large-scale text corpora into a unified word representation model, which will benefit many tasks in text mining. PMID:25874581
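The abstract above does not give the KRWR objective itself, so the following is only a minimal sketch of the general idea of knowledge-regularized embeddings: alongside the usual unsupervised objective, a penalty term pulls the vectors of words linked in a knowledge base toward each other. The embedding matrix, the link list, and all constants here are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Toy embedding matrix: 4 words, 3 dimensions (values are arbitrary).
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))

# Hypothetical link-based knowledge: word 0 and word 1 are synonyms.
links = [(0, 1)]
lam, lr = 0.5, 0.1  # regularization weight and learning rate

def link_penalty(W, links):
    # Sum of squared distances between linked word vectors.
    return sum(np.sum((W[i] - W[j]) ** 2) for i, j in links)

before = link_penalty(W, links)
for _ in range(100):
    # Gradient of the regularizer alone; a real model would add
    # the unsupervised language-model gradient here as well.
    grad = np.zeros_like(W)
    for i, j in links:
        d = W[i] - W[j]
        grad[i] += 2 * lam * d
        grad[j] -= 2 * lam * d
    W -= lr * grad
after = link_penalty(W, links)
print(before > after)  # True: linked vectors have been pulled together
```

In a full model this penalty would be summed with the corpus likelihood, so statistical and prior knowledge shape the same vectors.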
Image deblurring using a joint entropy prior in x-ray luminescence computed tomography
NASA Astrophysics Data System (ADS)
Su, Chang; Dutta, Joyita; Zhang, Hui; El Fakhri, Georges; Li, Quanzheng
2017-03-01
X-ray luminescence computed tomography (XLCT) is an emerging hybrid imaging modality that can provide functional and anatomical images at the same time. Traditional narrow beam XLCT can achieve high spatial resolution as well as high sensitivity. However, by treating the CCD camera as a single pixel detector, this scheme resembles first-generation CT scanners, resulting in a long scanning time and a high radiation dose. Although cone beam or fan beam XLCT has the ability to mitigate this problem once an optical propagation model is introduced, image quality is affected because the inverse problem is ill-conditioned. Much effort has been devoted to improving image quality through hardware improvements or by developing new reconstruction techniques for XLCT. The objective of this work is to further enhance the already reconstructed image by introducing anatomical information through retrospective processing. The deblurring process used a spatially variant point spread function (PSF) model and a joint entropy based anatomical prior derived from a CT image acquired using the same XLCT system. A numerical experiment was conducted with a real mouse CT image from the Digimouse phantom used as the anatomical prior. The resultant images of bone and lung regions showed sharp edges and good consistency with the CT image. Activity error was reduced by 52.3% even for nanophosphor lesion sizes as small as 0.8 mm.
Cone beam x-ray luminescence computed tomography reconstruction with a priori anatomical information
NASA Astrophysics Data System (ADS)
Lo, Pei-An; Lin, Meng-Lung; Jin, Shih-Chun; Chen, Jyh-Cheng; Lin, Syue-Liang; Chang, C. Allen; Chiang, Huihua Kenny
2014-09-01
X-ray luminescence computed tomography (XLCT) is a novel molecular imaging modality that reconstructs the optical distribution of x-ray-excited phosphor particles using prior information from the anatomical CT image. The prior information improves the accuracy of image reconstruction, and the system can also present the anatomical CT image itself. The optical system, based on a highly sensitive charge-coupled device (CCD), is mounted perpendicular to the CT system. In the XLCT system, the x-ray beam excites the phosphor in the sample, and the CCD camera acquires the luminescence emitted from the sample over 360 degrees of free-space projections. In this study, a fluorescence diffuse optical tomography (FDOT)-like algorithm was used for image reconstruction; the structural prior information was incorporated in the reconstruction by adding a penalty term to the minimization function. The phosphor used in this study was Gd2O2S:Tb. For both the simulation and the experiments, data were collected from 16 projections. The cylinder phantom was 40 mm in diameter and contained an 8 mm diameter inclusion; the phosphor in the in vivo study was 5 mm in diameter at a depth of 3 mm. In both cases, errors were no more than 5%. Based on the results of these simulation and experimental studies, the novel XLCT method has demonstrated its feasibility for in vivo animal model studies.
The prevalence of computer-related musculoskeletal complaints in female college students.
Hamilton, Audra G; Jacobs, Karen; Orsmond, Gael
2005-01-01
The purpose of this study was to determine the prevalence of computer-related musculoskeletal complaints in female college students. This research also explored whether the number of hours per day spent using a computer, the type of computer used (laptop vs. desktop), or academic major was related to the presence of musculoskeletal complaints. Additionally, "job strain", a measure of job stress which can affect the physical health of an individual, was measured to determine whether students feel stress from the job of "student" and, if so, whether it contributed to these complaints. Two surveys, the Boston University Computer and Health Survey and the Job Content Questionnaire [9], were distributed to 111 female college students to measure musculoskeletal complaints and job strain. Seventy-two surveys were returned. Chi-square tests and logistic regression were used to analyze the data. The results indicated that 80.6% of the participants reported computer-related musculoskeletal complaints in the two weeks prior to completing the survey, although none of the examined factors were associated with the complaints. It is notable, however, that 82% of the students reported spending 0-6 hours/day using a computer, with almost 28% reporting 4-6 hours/day of usage. Eleven percent of the participants reported using the computer more than 8 hours/day. Of those students who use a laptop computer for all computer use, 90.1% reported musculoskeletal complaints. The students reported that they did not experience job strain. Further studies should be performed using a survey specifically intended for college students. The majority of female college students in this study reported musculoskeletal discomfort during or after computer use. Although a statistically significant association could not be established, students using laptop computers reported a higher incidence of musculoskeletal symptoms than those using desktop computers. Additionally, female college students did not seem to experience job strain.
Future research should continue on larger, more diverse samples of students to better understand the prevalence of and contributors to musculoskeletal complaints, how college students experience job strain (stress), and whether these two factors are related.
Academic computer science and gender: A naturalistic study investigating the causes of attrition
NASA Astrophysics Data System (ADS)
Declue, Timothy Hall
Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before entering college, but it is at the college level that the "brain drain" is most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong; these processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered that strengthens theories related to prior experience and to the perception that computer science has a culture hostile to females. Two unanticipated themes emerged, relating to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects the females' ability to succeed in CS-I.
Random Walk Graph Laplacian-Based Smoothness Prior for Soft Decoding of JPEG Images.
Liu, Xianming; Cheung, Gene; Wu, Xiaolin; Zhao, Debin
2017-02-01
Given the prevalence of Joint Photographic Experts Group (JPEG) compressed images, optimizing image reconstruction from the compressed format remains an important problem. Instead of simply reconstructing a pixel block from the centers of indexed discrete cosine transform (DCT) coefficient quantization bins (hard decoding), soft decoding reconstructs a block by selecting appropriate coefficient values within the indexed bins with the help of signal priors. The challenge thus lies in how to define suitable priors and apply them effectively. In this paper, we combine three image priors-Laplacian prior for DCT coefficients, sparsity prior, and graph-signal smoothness prior for image patches-to construct an efficient JPEG soft decoding algorithm. Specifically, we first use the Laplacian prior to compute a minimum mean square error initial solution for each code block. Next, we show that while the sparsity prior can reduce block artifacts, limiting the size of the overcomplete dictionary (to lower computation) would lead to poor recovery of high DCT frequencies. To alleviate this problem, we design a new graph-signal smoothness prior (desired signal has mainly low graph frequencies) based on the left eigenvectors of the random walk graph Laplacian matrix (LERaG). Compared with previous graph-signal smoothness priors, LERaG has desirable image filtering properties with low computation overhead. We demonstrate how LERaG can facilitate recovery of high DCT frequencies of a piecewise smooth signal via an interpretation of low graph frequency components as relaxed solutions to normalized cut in spectral clustering. Finally, we construct a soft decoding algorithm using the three signal priors with appropriate prior weights. Experimental results show that our proposal noticeably outperforms state-of-the-art soft decoding algorithms in both objective and subjective evaluations.
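The LERaG construction above (left eigenvectors, similarity-based patch weights) is not reproduced here; the sketch below only illustrates the underlying object, a random walk graph Laplacian L_rw = I - D^{-1}W, and the quadratic form used as a graph-signal smoothness measure, on an assumed toy path graph standing in for an image patch.

```python
import numpy as np

# 1-D "patch" of 6 pixels on a path graph (a stand-in for an image patch).
n = 6
x_smooth = np.linspace(0.0, 1.0, n)                      # piecewise-smooth signal
x_noisy = x_smooth + np.array([0, .4, -.4, .4, -.4, 0])  # blocky/noisy signal

# Adjacency of a path graph with unit weights (a real soft decoder would
# use similarity-based weights between patch pixels).
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0

D = np.diag(W.sum(axis=1))
L_rw = np.eye(n) - np.linalg.inv(D) @ W   # random walk graph Laplacian

def smoothness(x):
    # Quadratic form used as a smoothness measure: small for signals
    # whose energy sits mainly in low graph frequencies.
    return float(x @ L_rw @ x)

print(smoothness(x_smooth) < smoothness(x_noisy))  # True
```

A soft decoder would penalize candidate blocks by this form, steering the selected DCT coefficients toward piecewise smooth reconstructions.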
Assessing Prior Experience in the Selection of Air Traffic Control Specialists
2013-04-01
Appendix excerpts: Crosstabulation (Appendix B1); Appendix C: IFR Operations Experience x Academy Training Performance Crosstabulation (C1). Survey items include: "Do you have a prior Air Traffic Control Specialist (ATCS) rating?" (n = 9,333) and BQ35, "Do you have prior Instrument Flight Rules (IFR) operations experience?" (n = 9,349).
NASA Astrophysics Data System (ADS)
Lari, Z.; El-Sheimy, N.
2017-09-01
In recent years, the increasing incidence of climate-related disasters has tremendously affected our environment. In order to effectively manage and reduce the dramatic impacts of such events, the development of timely disaster management plans is essential. Since these disasters are spatial phenomena, timely provision of geospatial information is crucial for effective development of response and management plans. Due to the inaccessibility of affected areas and the limited budgets of first responders, timely acquisition of the required geospatial data for these applications is usually possible only using low-cost imaging and georeferencing sensors mounted on unmanned platforms. Despite rapid collection of the required data using these systems, available processing techniques are not yet capable of delivering geospatial information to responders and decision makers in a timely manner. To address this issue, this paper introduces a new technique for dense 3D reconstruction of affected scenes which can deliver and improve the needed geospatial information incrementally. The approach builds on prior 3D knowledge of the scene and employs computationally efficient 2D triangulation, feature description, feature matching, and point verification techniques to optimize and speed up the dense 3D scene reconstruction procedure. To verify the feasibility and computational efficiency of the proposed approach, an experiment is conducted using a set of consecutive images collected onboard a UAV platform together with prior low-density airborne laser scanning over the same area, and step-by-step results are provided. A comparative analysis of the proposed approach and an available image-based dense reconstruction technique is also conducted to prove the computational efficiency and competency of this technique for delivering geospatial information with pre-specified accuracy.
Nadeau, Stephen E; Dobkin, Bruce; Wu, Samuel S; Pei, Qinglin; Duncan, Pamela W
2016-08-01
Background Paresis in stroke is largely a result of damage to descending corticospinal and corticobulbar pathways. Recovery of paresis predominantly reflects the impact on the neural consequences of this white matter lesion by reactive neuroplasticity (mechanisms involved in spontaneous recovery) and experience-dependent neuroplasticity, driven by therapy and daily experience. However, both theoretical considerations and empirical data suggest that type of stroke (large vessel distribution/lacunar infarction, hemorrhage), locus and extent of infarction (basal ganglia, right-hemisphere cerebral cortex), and the presence of leukoaraiosis or prior stroke might influence long-term recovery of walking ability. In this secondary analysis based on the 408 participants in the Locomotor Experience Applied Post-Stroke (LEAPS) study database, we seek to address these possibilities. Methods Lesion type, locus, and extent were characterized by the 2 neurologists in the LEAPS trial on the basis of clinical computed tomography and magnetic resonance imaging scans. A series of regression models was used to test our hypotheses regarding the effects of lesion type, locus, extent, and laterality on 2- to 12-month change in gait speed, controlling for baseline gait speed, age, and Berg Balance Scale score. Results Gait speed change at 1 year was significantly reduced in participants with basal ganglia involvement and prior stroke. There was a trend toward reduction of gait speed change in participants with lacunar infarctions. The presence of right-hemisphere cortical involvement had no significant impact on outcome. Conclusions Type, locus, and extent of lesion, and the loss of substrate for neuroplastic effect as a result of prior stroke may affect long-term outcome of rehabilitation of hemiparetic gait. © The Author(s) 2015.
The Episodic Nature of Experience: A Dynamical Systems Analysis.
Sreekumar, Vishnu; Dennis, Simon; Doxas, Isidoros
2017-07-01
Context is an important construct in many domains of cognition, including learning, memory, and emotion. We used dynamical systems methods to demonstrate the episodic nature of experience by showing a natural separation between the scales over which within-context and between-context relationships operate. To do this, we represented an individual's emails extending over about 5 years in a high-dimensional semantic space and computed the dimensionalities of the subspaces occupied by these emails. Personal discourse has a two-scaled geometry with smaller within-context dimensionalities than between-context dimensionalities. Prior studies have shown that reading experience (Doxas, Dennis, & Oliver, 2010) and visual experience (Sreekumar, Dennis, Doxas, Zhuang, & Belkin, 2014) have a similar two-scaled structure. Furthermore, the recurrence plot of the emails revealed that experience is predictable and hierarchical, supporting the constructs of some influential theories of memory. The results demonstrate that experience is not scale-free and provide an important target for accounts of how experience shapes cognition. Copyright © 2016 Cognitive Science Society, Inc.
Computational Neuropsychology and Bayesian Inference
Parr, Thomas; Rees, Geraint; Friston, Karl J.
2018-01-01
Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine ‘prior’ beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes optimal pathology – optimal inference with suboptimal priors – and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient’s behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology. PMID:29527157
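The notion of a "Bayes optimal pathology" described above can be made concrete with a two-hypothesis toy example: the inference rule (Bayes' rule) is identical for both agents, but an aberrant prior flips which cause best explains the same sensation. The likelihoods and priors below are illustrative numbers, not values from any of the reviewed models.

```python
import numpy as np

# Two hypothetical causes of a sensation, and the likelihood of the
# observed sensation under each (illustrative numbers).
likelihood = np.array([0.7, 0.3])      # P(sensation | cause)

def posterior(prior):
    # Bayes' rule: posterior proportional to prior times likelihood.
    unnorm = prior * likelihood
    return unnorm / unnorm.sum()

healthy_prior = np.array([0.5, 0.5])   # flat prior over causes
aberrant_prior = np.array([0.1, 0.9])  # "pathological" prior

p_healthy = posterior(healthy_prior)
p_aberrant = posterior(aberrant_prior)

# Same data, same inference rule: the aberrant prior flips the most
# probable explanation. Inference is optimal; the prior is not.
print(np.argmax(p_healthy), np.argmax(p_aberrant))  # 0 1
```

Computational phenotyping, in this framing, runs the logic backwards: find the prior under which the patient's inferences would be Bayes optimal.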
NASA Astrophysics Data System (ADS)
Neves, Rui Gomes; Teodoro, Vítor Duarte
2012-09-01
A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.
Probabilistic Modeling and Visualization of the Flexibility in Morphable Models
NASA Astrophysics Data System (ADS)
Lüthi, M.; Albrecht, T.; Vetter, T.
Statistical shape models, and in particular morphable models, have gained widespread use in computer vision, computer graphics and medical imaging. Researchers have started to build models of almost any anatomical structure in the human body. While these models provide a useful prior for many image analysis tasks, relatively little information about the shape represented by the morphable model is exploited. We propose a method for computing and visualizing the remaining flexibility when a part of the shape is fixed. Our method, which is based on Probabilistic PCA, not only leads to an approach for reconstructing the full shape from partial information, but also allows us to investigate and visualize the uncertainty of a reconstruction. To show the feasibility of our approach we performed experiments on a statistical model of the human face and the femur bone. The visualization of the remaining flexibility allows for greater insight into the statistical properties of the shape.
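The paper's Probabilistic PCA derivation is not reproduced here, but the core operation, conditioning a Gaussian shape model on a fixed part, can be sketched generically: the conditional mean is the reconstruction and the conditional covariance is the "remaining flexibility". The toy 5-coordinate model below is an assumption standing in for a real morphable model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "shape model": 5 correlated coordinates with a known Gaussian
# distribution (low-rank plus noise, as in Probabilistic PCA).
A = rng.normal(size=(5, 2))
cov = A @ A.T + 0.01 * np.eye(5)
mu = np.zeros(5)

obs = [0, 1]          # indices of the fixed (observed) part of the shape
hid = [2, 3, 4]       # remaining coordinates to reconstruct
x_obs = np.array([1.0, -0.5])

# Condition the Gaussian on the observed part.
S_oo = cov[np.ix_(obs, obs)]
S_ho = cov[np.ix_(hid, obs)]
S_hh = cov[np.ix_(hid, hid)]
K = S_ho @ np.linalg.inv(S_oo)

mean_hid = mu[hid] + K @ (x_obs - mu[obs])   # reconstruction
cov_hid = S_hh - K @ S_ho.T                  # remaining flexibility

# Fixing part of the shape can only reduce uncertainty: conditional
# variances are no larger than the marginal ones.
print(np.all(np.diag(cov_hid) <= np.diag(S_hh) + 1e-12))  # True
```

Visualizing the eigenvectors of `cov_hid` would show the directions in which the unobserved part of the shape can still vary.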
NASA Technical Reports Server (NTRS)
Crane, J. M.; Boucek, G. P., Jr.; Smith, W. D.
1986-01-01
A flight management computer (FMC) control display unit (CDU) test was conducted to compare two types of input devices: a fixed legend (dedicated) keyboard and a programmable legend (multifunction) keyboard. The task used for comparison was operation of the flight management computer for the Boeing 737-300. The same tasks were performed by twelve pilots on the FMC control display unit configured with a programmable legend keyboard and with the currently used B737-300 dedicated keyboard. Flight simulator work activity levels and input task complexity were varied during each pilot session. Half of the pilots tested were previously familiar with the B737-300 dedicated keyboard CDU and half had no prior experience with it. The data collected included simulator flight parameters, keystroke time and sequences, and pilot questionnaire responses. A timeline analysis was also used for evaluation of the two keyboard concepts.
Exploring Human Cognition Using Large Image Databases.
Griffiths, Thomas L; Abbott, Joshua T; Hsu, Anne S
2016-07-01
Most cognitive psychology experiments evaluate models of human cognition using a relatively small, well-controlled set of stimuli. This approach stands in contrast to current work in neuroscience, perception, and computer vision, which have begun to focus on using large databases of natural images. We argue that natural images provide a powerful tool for characterizing the statistical environment in which people operate, for better evaluating psychological theories, and for bringing the insights of cognitive science closer to real applications. We discuss how some of the challenges of using natural images as stimuli in experiments can be addressed through increased sample sizes, using representations from computer vision, and developing new experimental methods. Finally, we illustrate these points by summarizing recent work using large image databases to explore questions about human cognition in four different domains: modeling subjective randomness, defining a quantitative measure of representativeness, identifying prior knowledge used in word learning, and determining the structure of natural categories. Copyright © 2016 Cognitive Science Society, Inc.
Unbiased, scalable sampling of protein loop conformations from probabilistic priors.
Zhang, Yajia; Hauser, Kris
2013-01-01
Protein loops are flexible structures that are intimately tied to function, but understanding loop motion and generating loop conformation ensembles remain significant computational challenges. Discrete search techniques scale poorly to large loops, optimization and molecular dynamics techniques are prone to local minima, and inverse kinematics techniques can only incorporate structural preferences in an ad hoc fashion. This paper presents Sub-Loop Inverse Kinematics Monte Carlo (SLIKMC), a new Markov chain Monte Carlo algorithm for generating conformations of closed loops according to experimentally available, heterogeneous structural preferences. Our simulation experiments demonstrate that the method computes high-scoring conformations of large loops (>10 residues) orders of magnitude faster than standard Monte Carlo and discrete search techniques. Two new developments contribute to the scalability of the new method. First, structural preferences are specified via a probabilistic graphical model (PGM) that links conformation variables, spatial variables (e.g., atom positions), constraints and prior information in a unified framework. The method uses a sparse PGM that exploits locality of interactions between atoms and residues. Second, a novel method for sampling sub-loops is developed to generate statistically unbiased samples of probability densities restricted by loop-closure constraints. Numerical experiments confirm that SLIKMC generates conformation ensembles that are statistically consistent with specified structural preferences. Protein conformations with 100+ residues are sampled on standard PC hardware in seconds. Application to proteins involved in ion-binding demonstrates its potential as a tool for loop ensemble generation and missing structure completion.
Unbiased, scalable sampling of protein loop conformations from probabilistic priors
2013-01-01
Background Protein loops are flexible structures that are intimately tied to function, but understanding loop motion and generating loop conformation ensembles remain significant computational challenges. Discrete search techniques scale poorly to large loops, optimization and molecular dynamics techniques are prone to local minima, and inverse kinematics techniques can only incorporate structural preferences in an ad hoc fashion. This paper presents Sub-Loop Inverse Kinematics Monte Carlo (SLIKMC), a new Markov chain Monte Carlo algorithm for generating conformations of closed loops according to experimentally available, heterogeneous structural preferences. Results Our simulation experiments demonstrate that the method computes high-scoring conformations of large loops (>10 residues) orders of magnitude faster than standard Monte Carlo and discrete search techniques. Two new developments contribute to the scalability of the new method. First, structural preferences are specified via a probabilistic graphical model (PGM) that links conformation variables, spatial variables (e.g., atom positions), constraints and prior information in a unified framework. The method uses a sparse PGM that exploits locality of interactions between atoms and residues. Second, a novel method for sampling sub-loops is developed to generate statistically unbiased samples of probability densities restricted by loop-closure constraints. Conclusion Numerical experiments confirm that SLIKMC generates conformation ensembles that are statistically consistent with specified structural preferences. Protein conformations with 100+ residues are sampled on standard PC hardware in seconds. Application to proteins involved in ion-binding demonstrates its potential as a tool for loop ensemble generation and missing structure completion. PMID:24565175
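SLIKMC itself samples whole sub-loops under kinematic closure constraints via a sparse PGM; none of that is reproduced here. The sketch below only illustrates the simpler building block of sampling from a specified structural preference with plain Metropolis moves, using an assumed von Mises-like density over a single torsion angle.

```python
import math
import random

random.seed(42)

# Hypothetical structural preference for one torsion angle: an
# unnormalized von Mises-like density peaked at 0 radians.
def preference(theta):
    return math.exp(2.0 * math.cos(theta))

def wrap(theta):
    # Keep the angle in [-pi, pi).
    return (theta + math.pi) % (2.0 * math.pi) - math.pi

# Plain Metropolis sampling of the preference density.
samples, x = [], 0.0
for _ in range(20000):
    y = wrap(x + random.gauss(0.0, 0.5))
    if random.random() < min(1.0, preference(y) / preference(x)):
        x = y
    samples.append(x)

# After burn-in, samples concentrate near the preferred angle 0.
burn = samples[2000:]
frac_near = sum(abs(s) < 1.0 for s in burn) / len(burn)
print(frac_near)
```

The statistical-consistency claim in the abstract corresponds to checking that such sample histograms match the specified preference density.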
Fast algorithms for computing phylogenetic divergence time.
Crosby, Ralph W; Williams, Tiffani L
2017-12-06
The inference of species divergence time is a key step in most phylogenetic studies. Methods have been available for the last ten years to perform the inference, but their performance does not yet scale well to studies with hundreds of taxa and thousands of DNA base pairs. For example, a study of 349 primate taxa was estimated to require over 9 months of processing time. In this work, we present a new algorithm, AncestralAge, that significantly improves the performance of the divergence time process. As part of AncestralAge, we demonstrate a new method for the computation of phylogenetic likelihood, and our experiments show a 90% improvement in likelihood computation time on the aforementioned dataset of 349 primate taxa with over 60,000 DNA base pairs. Additionally, we show that our new method for the computation of the Bayesian prior on node ages reduces the running time for this computation on the 349-taxa dataset by 99%. Through the use of these new algorithms we open up the ability to perform divergence time inference on large phylogenetic studies.
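AncestralAge's own likelihood method is not described in the abstract; for orientation, the standard computation it accelerates can be sketched for a single site on a two-leaf tree under the Jukes-Cantor model, summing over the unobserved root state (Felsenstein pruning in its smallest form). The branch lengths and bases below are illustrative.

```python
import math

# Jukes-Cantor transition probabilities for branch length t
# (expected substitutions per site).
def jc_p(same, t):
    e = math.exp(-4.0 * t / 3.0)
    return 0.25 + 0.75 * e if same else 0.25 - 0.25 * e

BASES = "ACGT"

def site_likelihood(leaf1, leaf2, t1, t2):
    # Pruning on a two-leaf tree: sum over the unobserved root state,
    # with uniform base frequencies (0.25 each).
    total = 0.0
    for root in BASES:
        total += 0.25 * jc_p(root == leaf1, t1) * jc_p(root == leaf2, t2)
    return total

# Identical leaves are more likely under short branches than long ones,
# which is what ties likelihood to divergence time.
short = site_likelihood("A", "A", 0.05, 0.05)
long_ = site_likelihood("A", "A", 1.0, 1.0)
print(short > long_)  # True
```

A real analysis repeats this recursion over thousands of sites and hundreds of internal nodes, which is why its cost dominates divergence time inference.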
Math anxiety: Brain cortical network changes in anticipation of doing mathematics.
Klados, Manousos A; Pandria, Niki; Micheloyannis, Sifis; Margulies, Daniel; Bamidis, Panagiotis D
2017-12-01
Following our previous work regarding the involvement of math anxiety (MA) in math-oriented tasks, this study explores the differences in cerebral network topology between self-reported low math-anxious (LMA) and high math-anxious (HMA) individuals during the anticipation phase prior to a mathematics-related experiment. For this reason, multichannel EEG recordings were adopted, and the solution of the inverse problem was applied to a generic head model in order to obtain the cortical signals. The cortical networks were computed for each band separately, using the magnitude squared coherence metric. The main graph-theoretical parameters showed differences in segregation and integration in almost all EEG bands of the HMAs in comparison to LMAs, indicative of a great influence of anticipatory anxiety prior to mathematical performance. Copyright © 2017 Elsevier B.V. All rights reserved.
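The study's pipeline (source reconstruction, band-wise networks, graph metrics) is not reproduced here; the sketch below only shows the connectivity measure it is built on, magnitude squared coherence between two channels, using `scipy.signal.coherence` on synthetic EEG-like signals. The sampling rate, frequencies, and noise levels are assumptions.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 256.0                      # EEG-like sampling rate (Hz)
t = np.arange(0, 8.0, 1.0 / fs)

# Two "channels" sharing a 10 Hz alpha-band component plus noise,
# and one independent noise-only channel.
shared = np.sin(2 * np.pi * 10 * t)
ch1 = shared + 0.5 * rng.normal(size=t.size)
ch2 = shared + 0.5 * rng.normal(size=t.size)
ch3 = rng.normal(size=t.size)

f, c12 = coherence(ch1, ch2, fs=fs, nperseg=256)
_, c13 = coherence(ch1, ch3, fs=fs, nperseg=256)

band = (f >= 8) & (f <= 12)     # alpha band
print(c12[band].mean() > c13[band].mean())  # True
```

Thresholding such pairwise coherence values per band yields the adjacency matrices on which segregation and integration metrics are computed.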
Diffuse prior monotonic likelihood ratio test for evaluation of fused image quality measures.
Wei, Chuanming; Kaplan, Lance M; Burks, Stephen D; Blum, Rick S
2011-02-01
This paper introduces a novel method to score how well proposed fused image quality measures (FIQMs) indicate the effectiveness of humans to detect targets in fused imagery. The human detection performance is measured via human perception experiments. A good FIQM should relate to perception results in a monotonic fashion. The method computes a new diffuse prior monotonic likelihood ratio (DPMLR) to facilitate the comparison of the H(1) hypothesis that the intrinsic human detection performance is related to the FIQM via a monotonic function against the null hypothesis that the detection and image quality relationship is random. The paper discusses many interesting properties of the DPMLR and demonstrates the effectiveness of the DPMLR test via Monte Carlo simulations. Finally, the DPMLR is used to score FIQMs with test cases considering over 35 scenes and various image fusion algorithms.
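The DPMLR itself is not reproduced here. A far simpler way to quantify how monotonic the FIQM-versus-detection relationship is, sketched below under assumed data, is to fit the best non-decreasing curve with the Pool Adjacent Violators algorithm and inspect the residual: a small residual supports the monotonic (H1-style) hypothesis.

```python
def pava(y):
    # Pool Adjacent Violators: least-squares non-decreasing fit.
    blocks = [[v, 1] for v in y]          # [block mean, block size]
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] > blocks[i + 1][0]:
            m0, n0 = blocks[i]
            m1, n1 = blocks[i + 1]
            blocks[i] = [(m0 * n0 + m1 * n1) / (n0 + n1), n0 + n1]
            del blocks[i + 1]
            i = max(i - 1, 0)             # re-check the previous pair
        else:
            i += 1
    fit = []
    for m, n in blocks:
        fit.extend([m] * n)
    return fit

# Human detection rates ordered by increasing image-quality score
# (hypothetical numbers). A good FIQM should make this near-monotone.
rates = [0.31, 0.45, 0.42, 0.60, 0.71]
fit = pava(rates)
sse = sum((a - b) ** 2 for a, b in zip(rates, fit))
print(all(a <= b for a, b in zip(fit, fit[1:])))  # True
```

The DPMLR goes further by integrating over all monotonic functions under a diffuse prior and comparing against the random-relationship null, rather than committing to a single fit.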
A context-specific latent inhibition effect in a human conditioned suppression task.
Byron Nelson, James; del Carmen Sanjuan, Maria
2006-06-01
Three studies used a computer video game preparation to demonstrate latent inhibition in adult humans. In all studies participants fired torpedoes at a target spaceship by clicking the mouse. Conditioned stimuli (CSs) were presented in the form of coloured "sensors" at the bottom of the screen. Conditioning was conducted by pairing a sensor with an attack from the target spaceship. Participants learned to suppress their rate of mouse clicking in preparation for an attack. In Experiment 1 a total of 10 preexposures to the sensor CS, prior to conditioning, retarded acquisition of suppression. In Experiment 2 the effect of preexposure was shown to be context specific. Experiment 3 showed little generalization of the preexposure effect from one sensor CS to another. Experiment 3 also showed that preexposure did not make the sensor CS inhibitory. Comparisons with conditioned suppression procedures with animals and negative-priming procedures are briefly discussed.
Sinha, Shriprakash
2016-12-01
Simulation studies in systems biology involving computational experiments on Wnt signaling pathways abound in the literature but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition who intend to work on the modeling of the pathway. This paucity might be due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational modeling of the Wnt signaling pathway in a human colorectal cancer dataset using static Bayesian network models is provided. The walkthrough might aid biologists/informaticians in understanding the design of computational experiments and is interleaved with exposition of the Matlab code and causal models from the Bayesian network toolbox. The manuscript elucidates the coding contents of the advance article by Sinha (Integr. Biol. 6:1034-1048, 2014) and takes the reader through a step-by-step process of how (a) the collection and transformation of the available biological information from literature is done, (b) the integration of the heterogeneous data and prior biological knowledge in the network is achieved, (c) the simulation study is designed, (d) the hypothesis regarding a biological phenomenon is transformed into a computational framework, and (e) results and inferences drawn using d-connectivity/separability are reported. The manuscript finally ends with a programming assignment to help the readers get hands-on experience of a perturbation project. Description of Matlab files is made available under GNU GPL v3 license at the Google Code project on https://code.google.com/p/static-bn-for-wnt-signaling-pathway and https://sites.google.com/site/shriprakashsinha/shriprakashsinha/projects/static-bn-for-wnt-signaling-pathway. Latest updates can be found at the latter website.
Insect-Inspired Optical-Flow Navigation Sensors
NASA Technical Reports Server (NTRS)
Thakoor, Sarita; Morookian, John M.; Chahl, Javan; Soccol, Dean; Hines, Butler; Zornetzer, Steven
2005-01-01
Integrated circuits that exploit optical flow to sense motions of computer mice on or near surfaces ("optical mouse chips") are used as navigation sensors in a class of small flying robots now undergoing development for potential use in such applications as exploration, search, and surveillance. The basic principles of these robots were described briefly in "Insect-Inspired Flight Control for Small Flying Robots" (NPO-30545), NASA Tech Briefs, Vol. 29, No. 1 (January 2005), page 61. To recapitulate from the cited prior article: The concept of optical flow can be defined, loosely, as the use of texture in images as a source of motion cues. The flight-control and navigation systems of these robots are inspired largely by the designs and functions of the vision systems and brains of insects, which have been demonstrated to utilize optical flow (as detected by their eyes and brains) resulting from their own motions in the environment. Optical flow has been shown to be very effective as a means of avoiding obstacles and controlling speeds and altitudes in robotic navigation. Prior systems used in experiments on navigating by means of optical flow have involved the use of panoramic optics, high-resolution image sensors, and programmable image-data-processing computers.
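The optical-flow principle behind such sensors can be illustrated with a gradient-based (Lucas-Kanade-style) estimate on a synthetic 1-D texture; the pattern, shift amount, and brightness-constancy linearization below are illustrative, not the chips' actual circuitry:

```python
# Minimal sketch of gradient-based optical flow: recover a known sub-pixel
# shift of a 1-D "texture" between two frames. Real sensors do this in 2-D
# over image patches; the signal and shift here are invented for illustration.
import numpy as np

x = np.linspace(0, 8 * np.pi, 400)
frame0 = np.sin(x) + 0.5 * np.sin(2.3 * x)
true_shift = 0.5                       # pixels between frames
frame1 = np.interp(np.arange(400) - true_shift, np.arange(400), frame0)

# brightness constancy: I_t ≈ -v * I_x  →  v ≈ -sum(I_x * I_t) / sum(I_x^2)
ix = np.gradient(frame0)               # spatial gradient (per pixel)
it = frame1 - frame0                   # temporal difference
v = -np.sum(ix * it) / np.sum(ix ** 2)
print(round(v, 2))
```

The least-squares gradient estimate recovers the shift to within linearization error, which is the same motion cue the insect-inspired controllers use for obstacle avoidance and speed regulation.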
MacNeilage, Paul R.; Ganesan, Narayan; Angelaki, Dora E.
2008-01-01
Spatial orientation is the sense of body orientation and self-motion relative to the stationary environment, fundamental to normal waking behavior and control of everyday motor actions including eye movements, postural control, and locomotion. The brain achieves spatial orientation by integrating visual, vestibular, and somatosensory signals. Over the past years, considerable progress has been made toward understanding how these signals are processed by the brain using multiple computational approaches that include frequency domain analysis, the concept of internal models, observer theory, Bayesian theory, and Kalman filtering. Here we put these approaches in context by examining the specific questions that can be addressed by each technique and some of the scientific insights that have resulted. We conclude with a recent application of particle filtering, a probabilistic simulation technique that aims to generate the most likely state estimates by incorporating internal models of sensor dynamics and physical laws and noise associated with sensory processing as well as prior knowledge or experience. In this framework, priors for low angular velocity and linear acceleration can explain the phenomena of velocity storage and frequency segregation, both of which have been modeled previously using arbitrary low-pass filtering. How Kalman and particle filters may be implemented by the brain is an emerging field. Unlike past neurophysiological research that has aimed to characterize mean responses of single neurons, investigations of dynamic Bayesian inference should attempt to characterize population activities that constitute probabilistic representations of sensory and prior information. PMID:18842952
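A toy version of the particle-filtering idea, with a prior favoring low angular velocity, might look like the following sketch; the dynamics, noise levels, and prior strength are invented for illustration and are not the models discussed above:

```python
# Toy particle filter for angular-velocity estimation with a zero-mean prior
# on velocity. The prior (implemented here as a mild decay toward zero) pulls
# the estimate slightly below the true value, loosely analogous to how
# low-velocity priors produce decaying percepts. All numbers are made up.
import numpy as np

rng = np.random.default_rng(1)
n, steps = 2000, 50
true_v = 1.0                                  # constant true angular velocity
particles = rng.normal(0.0, 2.0, n)           # prior: velocity near zero
for _ in range(steps):
    particles += rng.normal(0.0, 0.05, n)     # process noise
    particles *= 0.98                         # low-velocity prior as decay toward 0
    z = true_v + rng.normal(0.0, 0.5)         # noisy sensor measurement
    w = np.exp(-0.5 * ((z - particles) / 0.5) ** 2)
    w /= w.sum()                              # normalized importance weights
    particles = particles[rng.choice(n, n, p=w)]  # resample

estimate = particles.mean()
print(round(estimate, 2))
```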
Torp, Steffen; Bing-Jonsson, Pia C; Hanson, Elizabeth
2013-09-01
This multi-municipal intervention study explored whether informal carers of frail older people and disabled children living at home made use of information and communication technology (ICT) to gain knowledge about caring and to form informal support networks, thereby improving their health. Seventy-nine informal carers accessed web-based information about caring and an e-based discussion forum via their personal computers. They were able to maintain contact with each other using a web camera and via normal group meetings. After the first 12 months, 17 informal carers participated in focus group interviews and completed a short questionnaire. Four staff members were also interviewed. Participant carers who had prior experiences with a similar ICT-based support network reported greater satisfaction and more extensive use of the network than did participants with no such prior experience. It seems that infrequent usage of the service may be explained by too few other carers to identify with and inappropriate recruitment procedures. Nevertheless, carers of disabled children reported that the intervention had resulted in improved services across the participant municipalities. To achieve optimal effects of an ICT-based support network due attention must be given to recruitment processes and social environment building for which care practitioners require training and support.
Bayesian Inference in the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2008-01-01
This paper provides an elementary tutorial overview of Bayesian inference and its potential for application in aerospace experimentation in general and wind tunnel testing in particular. Bayes Theorem is reviewed and examples are provided to illustrate how it can be applied to objectively revise prior knowledge by incorporating insights subsequently obtained from additional observations, resulting in new (posterior) knowledge that combines information from both sources. A logical merger of Bayesian methods and certain aspects of Response Surface Modeling is explored. Specific applications to wind tunnel testing, computational code validation, and instrumentation calibration are discussed.
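The core update the paper reviews can be shown numerically; the sketch below assumes a Gaussian prior on some measured quantity and known Gaussian measurement noise (a conjugate special case, with made-up numbers):

```python
# Hedged numeric illustration of the Bayes-update idea: a Gaussian prior is
# combined with noisy observations to give a posterior that blends both
# sources. The quantity and numbers are illustrative only.
import numpy as np

mu0, var0 = 0.50, 0.10 ** 2         # prior belief (e.g., a force coefficient)
obs = np.array([0.57, 0.61, 0.55])  # subsequent measurements
var_obs = 0.05 ** 2                 # known measurement variance

# conjugate normal-normal update: precision-weighted average of prior and data
prec_post = 1 / var0 + len(obs) / var_obs
mu_post = (mu0 / var0 + obs.sum() / var_obs) / prec_post
print(round(mu_post, 3), round(1 / prec_post, 5))   # → 0.571 0.00077
```

The posterior mean sits between the prior and the sample mean, weighted by their precisions, and the posterior variance is smaller than either source alone, which is the "objectively revised prior knowledge" the abstract describes.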
Mugler, Emily M; Ruf, Carolin A; Halder, Sebastian; Bensch, Michael; Kubler, Andrea
2010-12-01
An electroencephalographic (EEG) brain-computer interface (BCI) internet browser was designed and evaluated with 10 healthy volunteers and three individuals with advanced amyotrophic lateral sclerosis (ALS), all of whom were given tasks to execute on the internet using the browser. Participants with ALS achieved an average accuracy of 73% and a subsequent information transfer rate (ITR) of 8.6 bits/min, and healthy participants with no prior BCI experience achieved over 90% accuracy and an ITR of 14.4 bits/min. We define additional criteria for unrestricted internet access for evaluation of the presented and future internet browsers, and we provide a review of the existing browsers in the literature. The P300-based browser provides unrestricted access and enables free web surfing for individuals with paralysis.
Real-time simulation of biological soft tissues: a PGD approach.
Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F
2013-05-01
We introduce here a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at the kilohertz feedback rates necessary for haptic rendering. This approach is based upon the use of proper generalized decomposition (PGD) techniques, a generalization of proper orthogonal decompositions (PODs). Proper generalized decomposition techniques can be considered a means of a priori model order reduction and provide a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase in which the results are obtained in real time. Results are provided that show the potential of the proposed technique, together with some benchmark tests that show the accuracy of the method. Copyright © 2013 John Wiley & Sons, Ltd.
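The offline/online split can be illustrated with a POD-style reduced model (PGD generalizes this idea to separated representations); everything below, from the parametric snapshot family to the number of retained modes, is an invented toy, not the paper's formulation:

```python
# Offline: snapshots of a parametric solution are compressed into a small
# basis. Online: new parameter values are evaluated cheaply in that basis.
# The snapshot family u(x; k) = exp(-k x) is purely illustrative.
import numpy as np

# offline phase: collect snapshots and extract a reduced basis via SVD
x = np.linspace(0, 1, 200)
ks = np.linspace(0.5, 5.0, 30)
snapshots = np.exp(-np.outer(ks, x))          # 30 snapshots x 200 points
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
basis = Vt[:4]                                # keep 4 orthonormal modes

# online phase: project an unseen parameter value onto the reduced basis
u_new = np.exp(-2.72 * x)
coeffs = basis @ u_new                        # 4 numbers instead of 200
u_approx = coeffs @ basis
err = np.linalg.norm(u_new - u_approx) / np.linalg.norm(u_new)
print(f"{err:.2e}")
```

The online step costs only a few inner products, which is what makes kilohertz-rate evaluation plausible once the expensive meta-model has been built offline.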
Structuring and extracting knowledge for the support of hypothesis generation in molecular biology
Roos, Marco; Marshall, M Scott; Gibson, Andrew P; Schuemie, Martijn; Meij, Edgar; Katrenko, Sophia; van Hage, Willem Robert; Krommydas, Konstantinos; Adriaans, Pieter W
2009-01-01
Background Hypothesis generation in molecular and cellular biology is an empirical process in which knowledge derived from prior experiments is distilled into a comprehensible model. The requirement of automated support is exemplified by the difficulty of considering all relevant facts that are contained in the millions of documents available from PubMed. Semantic Web provides tools for sharing prior knowledge, while information retrieval and information extraction techniques enable its extraction from literature. Their combination makes prior knowledge available for computational analysis and inference. While some tools provide complete solutions that limit the control over the modeling and extraction processes, we seek a methodology that supports control by the experimenter over these critical processes. Results We describe progress towards automated support for the generation of biomolecular hypotheses. Semantic Web technologies are used to structure and store knowledge, while a workflow extracts knowledge from text. We designed minimal proto-ontologies in OWL for capturing different aspects of a text mining experiment: the biological hypothesis, text and documents, text mining, and workflow provenance. The models fit a methodology that allows focus on the requirements of a single experiment while supporting reuse and posterior analysis of extracted knowledge from multiple experiments. Our workflow is composed of services from the 'Adaptive Information Disclosure Application' (AIDA) toolkit as well as a few others. The output is a semantic model with putative biological relations, with each relation linked to the corresponding evidence. Conclusion We demonstrated a 'do-it-yourself' approach for structuring and extracting knowledge in the context of experimental research on biomolecular mechanisms. The methodology can be used to bootstrap the construction of semantically rich biological models using the results of knowledge extraction processes. 
Models specific to particular experiments can be constructed that, in turn, link with other semantic models, creating a web of knowledge that spans experiments. Mapping mechanisms can link to other knowledge resources such as OBO ontologies or SKOS vocabularies. AIDA Web Services can be used to design personalized knowledge extraction procedures. In our example experiment, we found three proteins (NF-Kappa B, p21, and Bax) potentially playing a role in the interplay between nutrients and epigenetic gene regulation. PMID:19796406
ERIC Educational Resources Information Center
Bernacki, Matthew
2010-01-01
This study examined how learners construct textbase and situation model knowledge in hypertext computer-based learning environments (CBLEs) and documented the influence of specific self-regulated learning (SRL) tactics, prior knowledge, and characteristics of the learner on posttest knowledge scores from exposure to a hypertext. A sample of 160…
The Role of Prior Experience in Feedback of Beginning Teachers
ERIC Educational Resources Information Center
Blount, Tametra Danielle
2010-01-01
This causal-comparative, mixed-methods study examined the role of prior experience in the mentoring needs of first-year teachers from alternative certification programs in three Tennessee counties. Teachers examined were: teachers from traditional teacher education programs, teachers with no prior teacher education experience, teachers with prior…
Müller, H; Naujoks, F; Dietz, S
2002-08-01
Problems encountered during the installation and introduction of an automated anaesthesia documentation system are discussed. Difficulties are to be expected in staff training, owing to heterogeneous experience with computer usage, and in the online documentation of vital signs. Network administration and hardware configuration, as well as general administrative issues, are further possible sources of drawbacks. System administration and reliable support provided by personnel of the anaesthesiology department, which sustain staff motivation and reduce system downtime, require adequately staffed departments. Based on our own experience, we recommend that anaesthesiology departments considering the future installation and use of an automated anaesthesia documentation system verify sufficient personnel capacity prior to their decision.
Robots Learn to Recognize Individuals from Imitative Encounters with People and Avatars
NASA Astrophysics Data System (ADS)
Boucenna, Sofiane; Cohen, David; Meltzoff, Andrew N.; Gaussier, Philippe; Chetouani, Mohamed
2016-02-01
Prior to language, human infants are prolific imitators. Developmental science grounds infant imitation in the neural coding of actions, and highlights the use of imitation for learning from and about people. Here, we used computational modeling and a robot implementation to explore the functional value of action imitation. We report 3 experiments using a mutual imitation task between robots, adults, typically developing children, and children with Autism Spectrum Disorder. We show that a particular learning architecture - specifically one combining artificial neural nets for (i) extraction of visual features, (ii) the robot’s motor internal state, (iii) posture recognition, and (iv) novelty detection - is able to learn from an interactive experience involving mutual imitation. This mutual imitation experience allowed the robot to recognize the interactive agent in a subsequent encounter. These experiments using robots as tools for modeling human cognitive development, based on developmental theory, confirm the promise of developmental robotics. Additionally, findings illustrate how person recognition may emerge through imitative experience, intercorporeal mapping, and statistical learning.
Robots Learn to Recognize Individuals from Imitative Encounters with People and Avatars
Boucenna, Sofiane; Cohen, David; Meltzoff, Andrew N.; Gaussier, Philippe; Chetouani, Mohamed
2016-01-01
Prior to language, human infants are prolific imitators. Developmental science grounds infant imitation in the neural coding of actions, and highlights the use of imitation for learning from and about people. Here, we used computational modeling and a robot implementation to explore the functional value of action imitation. We report 3 experiments using a mutual imitation task between robots, adults, typically developing children, and children with Autism Spectrum Disorder. We show that a particular learning architecture - specifically one combining artificial neural nets for (i) extraction of visual features, (ii) the robot’s motor internal state, (iii) posture recognition, and (iv) novelty detection - is able to learn from an interactive experience involving mutual imitation. This mutual imitation experience allowed the robot to recognize the interactive agent in a subsequent encounter. These experiments using robots as tools for modeling human cognitive development, based on developmental theory, confirm the promise of developmental robotics. Additionally, findings illustrate how person recognition may emerge through imitative experience, intercorporeal mapping, and statistical learning. PMID:26844862
Effects of prior aversive experience upon retrograde amnesia induced by hypothermia.
Jensen, R A; Riccio, D C; Gehres, L
1975-08-01
Two experiments examined the extent to which retrograde amnesia (RA) is attenuated by prior learning experiences. In Experiment 1, rats initially received either passive avoidance training in a step-through apparatus, exposure to the apparatus, or noncontingent footshock. When training on a second but different passive avoidance task was followed by hypothermia treatment, RA was obtained only in the latter two groups. In Experiment 2, one-way active avoidance training, yoked noncontingent shocks, or apparatus exposure constituted the initial experience. Subsequent step-down passive avoidance training and amnestic treatment resulted in memory loss for the prior apparatus exposure group, but not for either of the preshocked conditions. These experiments demonstrate that certain types of prior aversive experience can substantially modify the magnitude of RA, and, in conjunction with other familiarization studies, emphasize a paradox for interpretations of RA based solely upon CNS disruption. The possibility that hypothermia treatment serves as an important contextual or encoding cue necessary for memory retrieval was considered. It was suggested that prior experience may block RA by enabling rats to differentiate training and treatment conditions.
Efficient experimental design for uncertainty reduction in gene regulatory networks.
Dehghannasiri, Roozbeh; Yoon, Byung-Jun; Dougherty, Edward R
2015-01-01
An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/.
Efficient experimental design for uncertainty reduction in gene regulatory networks
2015-01-01
Background An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. Results The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Conclusions Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/. PMID:26423515
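The MOCU quantity defined above can be illustrated on a made-up uncertainty class; the costs, prior, and interventions below are arbitrary toy numbers:

```python
# Toy illustration of MOCU: an uncertainty class of models, a set of
# interventions with model-dependent costs, and the expected cost increase
# from having to commit to one robust intervention under uncertainty.
import numpy as np

# cost[i, j] = cost of intervention j if model i is the true network
cost = np.array([[1.0, 3.0],
                 [4.0, 2.0],
                 [1.5, 2.5]])
p = np.array([0.5, 0.3, 0.2])                # prior over the uncertainty class

best_per_model = cost.min(axis=1)            # cost achievable with perfect knowledge
robust = np.argmin(p @ cost)                 # intervention minimizing expected cost
mocu = p @ (cost[:, robust] - best_per_model)
print(robust, round(mocu, 3))                # → 0 0.6
```

Experimental design would then score each candidate experiment by the expected remaining MOCU after its outcome collapses part of the prior, conducting first the experiment with the smallest expected remainder.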
Bayesian least squares deconvolution
NASA Astrophysics Data System (ADS)
Asensio Ramos, A.; Petit, P.
2015-11-01
Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
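The GP-regularized deconvolution can be sketched in a few lines; the line pattern, kernel, and noise level below are illustrative assumptions rather than the published algorithm (which also uses the linear-algebra accelerations mentioned above):

```python
# Sketch of Bayesian LSD: the observed spectrum is modeled as a line pattern
# applied to a common profile Z, with a Gaussian-process prior on Z; the
# posterior mean is a regularized least-squares deconvolution. All sizes,
# line depths, and noise levels are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
nv, npix = 40, 300
v = np.arange(nv)
z_true = np.exp(-0.5 * ((v - 20) / 4.0) ** 2)   # common line profile

# M: each spectral line contributes a scaled, shifted copy of Z
M = np.zeros((npix, nv))
for center, depth in [(60, 0.9), (140, 0.6), (220, 0.3)]:
    M[center:center + nv, :] += depth * np.eye(nv)
y = M @ z_true + rng.normal(0, 0.05, npix)      # noisy observed spectrum

# GP prior on Z (squared-exponential kernel) and Gaussian noise
K = np.exp(-0.5 * (np.subtract.outer(v, v) / 5.0) ** 2)
sigma2 = 0.05 ** 2
z_post = K @ M.T @ np.linalg.solve(M @ K @ M.T + sigma2 * np.eye(npix), y)
err = np.abs(z_post - z_true).max()
print(round(err, 2))
```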
2017-01-01
entrants’ prior work experience. This dissertation utilizes DoD personnel data to (1) describe civilian DAW cohorts in terms of past work experience and...illustrate how the hiring surge has changed cohort past-work-experience characteristics; (2) evaluate how prior work experience relates to retention...growth initiative was fueled mainly by outside hires with no prior DoD experience, and some evidence suggests that these DoD newcomers—in general
Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.
2017-01-01
Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.
Does Cation Size Affect Occupancy and Electrostatic Screening of the Nucleic Acid Ion Atmosphere?
2016-01-01
Electrostatics are central to all aspects of nucleic acid behavior, including their folding, condensation, and binding to other molecules, and the energetics of these processes are profoundly influenced by the ion atmosphere that surrounds nucleic acids. Given the highly complex and dynamic nature of the ion atmosphere, understanding its properties and effects will require synergy between computational modeling and experiment. Prior computational models and experiments suggest that cation occupancy in the ion atmosphere depends on the size of the cation. However, the computational models have not been independently tested, and the experimentally observed effects were small. Here, we evaluate a computational model of ion size effects by experimentally testing a blind prediction made from that model, and we present additional experimental results that extend our understanding of the ion atmosphere. Giambasu et al. developed and implemented a three-dimensional reference interaction site (3D-RISM) model for monovalent cations surrounding DNA and RNA helices, and this model predicts that Na+ would outcompete Cs+ by 1.8–2.1-fold; i.e., with Cs+ in 2-fold excess of Na+ the ion atmosphere would contain an equal number of each cation (Nucleic Acids Res. 2015, 43, 8405). However, our ion counting experiments indicate that there is no significant preference for Na+ over Cs+. There is an ∼25% preferential occupancy of Li+ over larger cations in the ion atmosphere but, counter to general expectations from existing models, no size dependence for the other alkali metal ions. Further, we followed the folding of the P4–P6 RNA and showed that differences in folding with different alkali metal ions observed at high concentration arise from cation–anion interactions and not cation size effects. 
Overall, our results provide a critical test of a computational prediction, fundamental information about ion atmosphere properties, and parameters that will aid in the development of next-generation nucleic acid computational models. PMID:27479701
NASA Astrophysics Data System (ADS)
Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D.; Ha, Dong-Gwang; Einzinger, Markus; Wu, Tony; Baldo, Marc A.; Aspuru-Guzik, Alán
2016-09-01
Discovering new OLED emitters requires many experiments to synthesize candidates and test their performance in devices. Large-scale computer simulation can greatly speed this search process, but the problem remains challenging enough that brute-force application of massive computing power is not enough to successfully identify novel structures. We report a successful high-throughput virtual screening study that leveraged a range of methods to optimize the search process. The generation of candidate structures was constrained to contain the combinatorial explosion. Simulations were tuned to the specific problem and calibrated with experimental results. Experimentalists and theorists actively collaborated such that experimental feedback was regularly utilized to update and shape the computational search. Supervised machine learning methods prioritized candidate structures prior to quantum chemistry simulation to avoid wasting compute on likely poor performers. With this combination of techniques, each multiplying the strength of the search, this effort managed to navigate an area of molecular space and identify hundreds of promising OLED candidate structures. An experimentally validated selection of this set shows emitters with external quantum efficiencies as high as 22%.
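Generically, the prioritization step amounts to training a cheap surrogate on past results and sending only top-ranked candidates to expensive simulation; the features, labels, and ridge surrogate below are synthetic stand-ins, not the study's actual models:

```python
# Screening pattern: a cheap supervised surrogate ranks candidates so that
# only the most promising fraction goes on to the expensive simulation step.
# Features, labels, and the linear surrogate are synthetic for illustration.
import numpy as np

rng = np.random.default_rng(3)
n_known, n_cand, d = 200, 1000, 10
w_true = rng.standard_normal(d)               # hidden structure-property map

X_known = rng.standard_normal((n_known, d))
y_known = X_known @ w_true + rng.normal(0, 0.5, n_known)  # past calibrated results

# ridge-regression surrogate trained on the known results
lam = 1.0
w = np.linalg.solve(X_known.T @ X_known + lam * np.eye(d), X_known.T @ y_known)

X_cand = rng.standard_normal((n_cand, d))
scores = X_cand @ w
top = np.argsort(scores)[::-1][:50]           # simulate only the top 5%

# sanity check: mean true value of the selected set vs the full pool
true_vals = X_cand @ w_true
print(true_vals[top].mean() > true_vals.mean())
```

Each round of expensive simulation then yields new labels, so the surrogate can be retrained and the ranking refined, mirroring the experiment-theory feedback loop the abstract describes.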
Experiment in Onboard Synthetic Aperture Radar Data Processing
NASA Technical Reports Server (NTRS)
Holland, Matthew
2011-01-01
Single event upsets (SEUs) are a threat to any computing system running on hardware that has not been physically radiation hardened. In addition to mandating the use of performance-limited, hardened heritage equipment, prior techniques for dealing with the SEU problem often involved hardware-based error detection and correction (EDAC). With limited computing resources, software-based EDAC, or any more elaborate recovery methods, were often not feasible. Synthetic aperture radars (SARs), when operated in the space environment, are interesting due to their relevance to NASA's objectives, but problematic in the sense of producing prodigious amounts of raw data. Prior implementations of the SAR data processing algorithm have been too slow and too computationally intensive, and have required too much application memory, for onboard execution to be a realistic option when using the type of heritage processing technology described above. This standard C-language implementation of SAR data processing is distributed over many cores of a Tilera Multicore Processor, and employs novel Radiation Hardening by Software (RHBS) techniques designed to protect the component processes (one per core) and their shared application memory from the sort of SEUs expected in the space environment. The source code includes calls to Tilera APIs, and a specialized Tilera compiler is required to produce a Tilera executable. The compiled application reads input data describing the position and orientation of a radar platform, as well as its radar-burst data, over time and writes out processed data in a form that is useful for analysis of the radar observations.
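One generic RHBS pattern consistent with the description above is software triple-modular redundancy with majority voting; the sketch below is a hypothetical illustration of that pattern, not the actual flight code:

```python
# Triple-modular redundancy in software: run a computation redundantly and
# majority-vote the results, so a single upset copy is outvoted. Generic
# illustration only; the `corrupt_index` hook mimics an SEU for testing.
from collections import Counter

def tmr(compute, corrupt_index=None):
    """Run `compute` three times; optionally corrupt one copy to mimic an SEU."""
    results = [compute() for _ in range(3)]
    if corrupt_index is not None:
        results[corrupt_index] ^= 0xFF        # inject a bit flip into one copy
    winner, votes = Counter(results).most_common(1)[0]
    return winner if votes >= 2 else None     # None: no majority, recompute

value = tmr(lambda: 2 + 2, corrupt_index=1)
print(value)                                  # → 4
```

The same voting idea extends to protecting shared memory (scrubbing replicated copies) at the cost of roughly tripling compute, which is why a many-core processor makes the approach practical.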
The Structure and Properties of Silica Glass Nanostructures using Novel Computational Systems
NASA Astrophysics Data System (ADS)
Doblack, Benjamin N.
The structure and properties of silica glass nanostructures are examined using computational methods in this work. Standard synthesis methods of silica and its associated material properties are first discussed in brief. A review of prior experiments on this amorphous material is also presented. Background and methodology for the simulation of mechanical tests on amorphous bulk silica and nanostructures are later presented. A new computational system for the accurate and fast simulation of silica glass is also presented, using an appropriate interatomic potential for this material within the open-source molecular dynamics computer program LAMMPS. This alternative computational method uses modern graphics processors, Nvidia CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model select materials, this enhancement allows the addition of accelerated molecular dynamics simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal of this project is to investigate the structure and size-dependent mechanical properties of silica glass nanohelical structures under tensile MD conditions using the innovative computational system. Specifically, silica nanoribbons and nanosprings were evaluated and revealed unique size-dependent elastic moduli when compared with the bulk material. For the nanoribbons, the tensile behavior differed widely between the models simulated, with distinct characteristic extended elastic regions. In the case of the nanosprings simulated, clearer trends were observed. In particular, larger nanospring wire cross-sectional radii (r) led to larger Young's moduli, while larger helical diameters (2R) resulted in smaller Young's moduli.
Structural transformations and theoretical models are also analyzed to identify possible factors which might affect the mechanical response of silica nanostructures under tension. The work presented outlines an innovative simulation methodology, and discusses how results can be validated against prior experimental and simulation findings. The ultimate goal is to develop new computational methods for the study of nanostructures which will make the field of materials science more accessible, cost effective and efficient.
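In a tensile MD test of this kind, the elastic modulus is typically extracted as the slope of the initial, linear portion of the stress-strain curve. The sketch below shows that extraction step with a least-squares fit on toy data; the numbers are illustrative and not results from this work.

```python
# Hedged sketch: Young's modulus as the least-squares slope of the
# elastic (initial, linear) region of a stress-strain curve.

def youngs_modulus(strain, stress):
    """Least-squares slope of stress vs. strain."""
    n = len(strain)
    mx = sum(strain) / n
    my = sum(stress) / n
    num = sum((x - mx) * (y - my) for x, y in zip(strain, stress))
    den = sum((x - mx) ** 2 for x in strain)
    return num / den

# Toy elastic region: stress (GPa) proportional to strain, slope 70 GPa.
strain = [0.000, 0.005, 0.010, 0.015, 0.020]
stress = [70.0 * e for e in strain]
E = youngs_modulus(strain, stress)
```

The size dependence reported above would appear as different fitted slopes for nanostructures of different wire radius r or helical diameter 2R.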
An approach for reduction of false predictions in reverse engineering of gene regulatory networks.
Khan, Abhinandan; Saha, Goutam; Pal, Rajat Kumar
2018-05-14
A gene regulatory network discloses the regulatory interactions amongst genes under a particular condition of the human body. The accurate reconstruction of such networks from time-series genetic expression data using computational tools offers a stiff challenge for contemporary computer scientists. This is crucial to facilitate the understanding of the proper functioning of a living organism. Unfortunately, the computational methods produce many false predictions along with the correct predictions, which is undesirable. Investigations in the domain focus on the identification of as many correct regulations as possible in the reverse engineering of gene regulatory networks, to make them more reliable and biologically relevant. One way to achieve this is to reduce the number of incorrect predictions in the reconstructed networks. In the present investigation, we have proposed a novel scheme to decrease the number of false predictions by suitably combining several metaheuristic techniques. We have also implemented the scheme using a dataset ensemble approach (i.e., combining multiple datasets). We have employed the proposed methodology on real-world experimental datasets of the SOS DNA Repair network of Escherichia coli and the IMRA network of Saccharomyces cerevisiae. Subsequently, we have experimented upon somewhat larger, in silico networks, namely, DREAM3 and DREAM4 Challenge networks, and 15-gene and 20-gene networks extracted from the GeneNetWeaver database. To study the effect of multiple datasets on the quality of the inferred networks, we have used four datasets in each experiment. The obtained results are encouraging: the proposed methodology can significantly reduce the number of false predictions, without using any supplementary prior biological information, for larger gene regulatory networks. It is also observed that if a small amount of prior biological information is incorporated, the results improve further with respect to the prediction of true positives.
Copyright © 2018 Elsevier Ltd. All rights reserved.
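The core consensus idea behind combining several inference runs can be sketched simply: keep an edge only if enough independent inferences (different metaheuristics and/or datasets) agree on it. This is an illustrative reduction of the approach, not the paper's actual combination scheme.

```python
# Sketch of consensus filtering to cut false-positive edges:
# an edge survives only if it receives at least `min_votes` votes
# across independently inferred networks.

from collections import Counter

def consensus_edges(predicted_networks, min_votes):
    votes = Counter(e for net in predicted_networks for e in net)
    return {e for e, v in votes.items() if v >= min_votes}

runs = [
    {("g1", "g2"), ("g2", "g3"), ("g1", "g4")},   # metaheuristic / dataset 1
    {("g1", "g2"), ("g2", "g3"), ("g3", "g4")},   # metaheuristic / dataset 2
    {("g1", "g2"), ("g1", "g4"), ("g2", "g3")},   # metaheuristic / dataset 3
]
kept = consensus_edges(runs, min_votes=3)   # unanimous edges only
```

Raising `min_votes` trades recall for precision, which is exactly the false-prediction reduction the abstract targets.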
Computer Literacy Project. A General Orientation in Basic Computer Concepts and Applications.
ERIC Educational Resources Information Center
Murray, David R.
This paper proposes a two-part, basic computer literacy program for university faculty, staff, and students with no prior exposure to computers. The program described would introduce basic computer concepts and computing center service programs and resources; provide fundamental preparation for other computer courses; and orient faculty towards…
Adams, Audrey; Timmins, Fiona
2006-01-01
This paper describes students' experiences of a Web-based innovation at one university. This paper reports on the first phase of this development where two Web-based modules were developed. Using a survey approach (n=44) students' access to and use of computer technology were explored. Findings revealed that students' prior use of computers and Internet technologies was higher than previously reported, although use of databases was low. Skills in this area increased during the programme, with a significant rise in database, email, search engine and word processing use. Many specific computer skills were learned during the programme, with high numbers reporting ability to deal adequately with files and folders. Overall, the experience was a positive one for students. While a sense of student isolation was not reported, as many students kept in touch by phone and class attendance continued, some individual students did appear to isolate themselves. This teaching methodology has much to offer in the provision of convenient easy to access programmes that can be easily adapted to the individual lifestyle. However, student support mechanisms need careful consideration for students who are at risk of becoming isolated. Staff also need to supported in the provision of this methodology and face-to-face contact with teachers for some part of the programme is preferable.
Bayard, David S.; Neely, Michael
2016-01-01
An experimental design approach is presented for individualized therapy in the special case where the prior information is specified by a nonparametric (NP) population model. Here, a nonparametric model refers to a discrete probability model characterized by a finite set of support points and their associated weights. An important question arises as to how to best design experiments for this type of model. Many experimental design methods are based on Fisher Information or other approaches originally developed for parametric models. While such approaches have been used with some success across various applications, it is interesting to note that they largely fail to address the fundamentally discrete nature of the nonparametric model. Specifically, the problem of identifying an individual from a nonparametric prior is more naturally treated as a problem of classification, i.e., to find a support point that best matches the patient’s behavior. This paper studies the discrete nature of the NP experiment design problem from a classification point of view. Several new insights are provided including the use of Bayes Risk as an information measure, and new alternative methods for experiment design. One particular method, denoted as MMopt (Multiple-Model Optimal), will be examined in detail and shown to require minimal computation while having distinct advantages compared to existing approaches. Several simulated examples, including a case study involving oral voriconazole in children, are given to demonstrate the usefulness of MMopt in pharmacokinetics applications. PMID:27909942
Bayard, David S; Neely, Michael
2017-04-01
An experimental design approach is presented for individualized therapy in the special case where the prior information is specified by a nonparametric (NP) population model. Here, a NP model refers to a discrete probability model characterized by a finite set of support points and their associated weights. An important question arises as to how to best design experiments for this type of model. Many experimental design methods are based on Fisher information or other approaches originally developed for parametric models. While such approaches have been used with some success across various applications, it is interesting to note that they largely fail to address the fundamentally discrete nature of the NP model. Specifically, the problem of identifying an individual from a NP prior is more naturally treated as a problem of classification, i.e., to find a support point that best matches the patient's behavior. This paper studies the discrete nature of the NP experiment design problem from a classification point of view. Several new insights are provided including the use of Bayes Risk as an information measure, and new alternative methods for experiment design. One particular method, denoted as MMopt (multiple-model optimal), will be examined in detail and shown to require minimal computation while having distinct advantages compared to existing approaches. Several simulated examples, including a case study involving oral voriconazole in children, are given to demonstrate the usefulness of MMopt in pharmacokinetics applications.
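The classification view of the discrete NP design problem can be illustrated with a crude proxy: choose the sample time at which the support points' predicted responses are most separated, weighted by their prior probabilities. This is a hedged sketch of the flavor of the idea only, not the actual MMopt/Bayes-Risk criterion; the one-compartment model and all numbers are illustrative.

```python
# Hedged sketch: pick a sample time that best separates the discrete
# support points of a nonparametric prior (a proxy related to, but
# simpler than, minimizing Bayes risk).

import math
from itertools import combinations

def separation_score(t, support, weights, model):
    """Prior-weighted sum of pairwise differences in predicted response."""
    preds = [model(theta, t) for theta in support]
    return sum(weights[i] * weights[j] * abs(preds[i] - preds[j])
               for i, j in combinations(range(len(support)), 2))

def best_sample_time(times, support, weights, model):
    return max(times, key=lambda t: separation_score(t, support, weights, model))

# Toy one-compartment model: concentration after a unit bolus dose.
model = lambda k, t: math.exp(-k * t)
support, weights = [0.1, 0.5, 1.0], [0.3, 0.4, 0.3]   # elimination rates
t_star = best_sample_time([0.5, 2.0, 8.0], support, weights, model)
```

Very early samples see all support points near the dose and very late samples see them all near zero, so an intermediate time discriminates best, mirroring the intuition that good design separates candidate models.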
Jackson, Suzanne F.; Cole, Donald C.
2013-01-01
The Dalla Lana School of Public Health uses an “add-on” or concentration model of global health education. Records of masters’ graduate cohorts across five disciplinary fields from 2006 to 2009 were classified as to prior experience at application and completion of global health concentration requirements. Alumni from the first two cohorts (2006-08 and 2007-09) were interviewed using a semi-structured interview guide. Prior experience was not linked consistently with the number of elective courses, location of practica or completion of requirements. Successful completion of the global health requirements depended more on the student’s base disciplinary program. Interviewed alumni with medium prior experience reported greater satisfaction with the concentration. Alumni with lower prior experience wanted more courses and support with practica. The pros and cons of a concentration model of global public health graduate education are discussed. PMID:23618475
ERIC Educational Resources Information Center
Ocampo, Amber C.; Squire, Larry R.; Clark, Robert E.
2018-01-01
Prior experience has been shown to improve learning in both humans and animals, but it is unclear what aspects of recent experience are necessary to produce beneficial effects. Here, we examined the capacity of rats with complete hippocampal lesions, restricted CA1 lesions, or sham surgeries to benefit from prior experience. Animals were tested in…
Exploring hurdles to transfer : student experiences of applying knowledge across disciplines
NASA Astrophysics Data System (ADS)
Lappalainen, Jouni; Rosqvist, Juho
2015-04-01
This paper explores the ways students perceive the transfer of learned knowledge to new situations - often a surprisingly difficult prospect. The novel aspect compared to the traditional transfer studies is that the learning phase is not a part of the experiment itself. The intention was only to activate acquired knowledge relevant to the transfer target using a short primer immediately prior to the situation where the knowledge was to be applied. Eight volunteer students from either mathematics or computer science curricula were given a task of designing an adder circuit using logic gates: a new context in which to apply knowledge of binary arithmetic and Boolean algebra. The results of a phenomenographic classification of the views presented by the students in their post-experiment interviews are reported. The degree to which the students were conscious of the acquired knowledge they employed and how they applied it in a new context emerged as the differentiating factors.
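The transfer target handed to the students can be made concrete in code: a 1-bit full adder built from logic gates, chained into a ripple-carry adder, which is exactly where binary arithmetic and Boolean algebra meet. This sketch is an illustration of the task, not material from the study itself.

```python
# A 1-bit full adder from logic gates, chained into a ripple-carry adder.

def full_adder(a, b, cin):
    s = a ^ b ^ cin                      # sum bit: XOR of the three inputs
    cout = (a & b) | (cin & (a ^ b))     # carry-out
    return s, cout

def ripple_add(x_bits, y_bits):
    """Add two little-endian bit lists of equal length."""
    out, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 3 (little-endian 1,1,0) + 6 (0,1,1) = 9 (1,0,0,1)
total = ripple_add([1, 1, 0], [0, 1, 1])
```

Recognizing that the carry expression is just Boolean algebra applied to binary addition is precisely the prior knowledge the primer was meant to activate.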
Computational Methods for MOF/Polymer Membranes.
Erucar, Ilknur; Keskin, Seda
2016-04-01
Metal-organic framework (MOF)/polymer mixed matrix membranes (MMMs) have received significant interest in the last decade. MOFs are incorporated into polymers to make MMMs that exhibit improved gas permeability and selectivity compared with pure polymer membranes. The fundamental challenge in this area is to choose the appropriate MOF/polymer combinations for a gas separation of interest. Even if a single polymer is considered, there are thousands of MOFs that could potentially be used as fillers in MMMs. As a result, there has been a large demand for computational studies that can accurately predict the gas separation performance of MOF/polymer MMMs prior to experiments. We have developed computational approaches to assess gas separation potentials of MOF/polymer MMMs and used them to identify the most promising MOF/polymer pairs. In this Personal Account, we aim to provide a critical overview of current computational methods for modeling MOF/polymer MMMs. We give our perspective on the background, successes, and failures that led to developments in this area and discuss the opportunities and challenges of using computational methods for MOF/polymer MMMs. © 2016 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
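A common starting point for predicting MMM permeability from the pure-component properties, widely used in this literature, is the Maxwell model. The sketch below evaluates it for illustrative numbers; the values are not from any specific MOF/polymer pair, and whether the authors' own methods use Maxwell or a refinement of it is not stated in this abstract.

```python
# Maxwell-model estimate of mixed matrix membrane (MMM) permeability.
# p_poly: continuous polymer phase, p_mof: dispersed MOF filler,
# phi: filler volume fraction (the model is best at low loadings).

def maxwell_permeability(p_poly, p_mof, phi):
    num = p_mof + 2 * p_poly - 2 * phi * (p_poly - p_mof)
    den = p_mof + 2 * p_poly + phi * (p_poly - p_mof)
    return p_poly * num / den

# Illustrative permeabilities (Barrer): a fast MOF filler in a slow polymer.
p_mmm = maxwell_permeability(p_poly=10.0, p_mof=100.0, phi=0.2)
```

Screening then amounts to evaluating such a model over thousands of MOF fillers per polymer and ranking the resulting permeability/selectivity pairs.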
Cognitive biases, linguistic universals, and constraint-based grammar learning.
Culbertson, Jennifer; Smolensky, Paul; Wilson, Colin
2013-07-01
According to classical arguments, language learning is both facilitated and constrained by cognitive biases. These biases are reflected in linguistic typology (the distribution of linguistic patterns across the world's languages) and can be probed with artificial grammar experiments on child and adult learners. Beginning with a widely successful approach to typology (Optimality Theory), and adapting techniques from computational approaches to statistical learning, we develop a Bayesian model of cognitive biases and show that it accounts for the detailed pattern of results of artificial grammar experiments on noun-phrase word order (Culbertson, Smolensky, & Legendre, 2012). Our proposal has several novel properties that distinguish it from prior work in the domains of linguistic theory, computational cognitive science, and machine learning. This study illustrates how ideas from these domains can be synthesized into a model of language learning in which biases range in strength from hard (absolute) to soft (statistical), and in which language-specific and domain-general biases combine to account for data from the macro-level scale of typological distribution to the micro-level scale of learning by individuals. Copyright © 2013 Cognitive Science Society, Inc.
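The hard-versus-soft bias distinction can be made concrete with a log-linear (MaxEnt-style) constraint grammar, a standard formalization in this tradition: each candidate pattern is scored by weighted constraint violations, and a soft bias is simply a finite constraint weight (a hard bias behaves like an effectively infinite one). The constraints and weights below are illustrative, not the paper's model.

```python
# Hedged sketch of a MaxEnt-style constraint grammar: pattern
# probabilities fall off exponentially with weighted violations.

import math

def pattern_probs(violations, weights):
    """violations: {pattern: [v1, v2, ...]} violation counts per constraint."""
    score = {p: math.exp(-sum(w * v for w, v in zip(weights, vs)))
             for p, vs in violations.items()}
    z = sum(score.values())
    return {p: s / z for p, s in score.items()}

# Two toy constraints over hypothetical noun-phrase word orders.
probs = pattern_probs(
    {"Adj-N Num-N": [0, 0],     # violates nothing
     "N-Adj Num-N": [1, 2]},    # violates both constraints
    weights=[1.0, 0.5],
)
```

Fitting such weights to learners' productions is one way a Bayesian model can place graded biases between "impossible" and "merely dispreferred".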
Sobie, Eric A
2011-09-13
This two-part lecture introduces students to the scientific computing language MATLAB. Prior computer programming experience is not required. The lectures present basic concepts of computer programming logic that tend to cause difficulties for beginners in addition to concepts that relate specifically to the MATLAB language syntax. The lectures begin with a discussion of vectors, matrices, and arrays. Because many types of biological data, such as fluorescence images and DNA microarrays, are stored as two-dimensional objects, processing these data is a form of array manipulation, and MATLAB is especially adept at handling such array objects. The students are introduced to basic commands in MATLAB, as well as built-in functions that provide useful shortcuts. The second lecture focuses on the differences between MATLAB scripts and MATLAB functions and describes when one method of programming organization might be preferable to the other. The principles are illustrated through the analysis of experimental data, specifically measurements of intracellular calcium concentration in live cells obtained using confocal microscopy.
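The lectures' central point, that processing image-like biological data is just array manipulation, carries over directly to other languages. The sketch below shows the idea in Python rather than MATLAB (to keep one language across this document's examples): a fluorescence image is a 2-D array, and background subtraction plus thresholding are elementwise operations. The toy image is invented for illustration.

```python
# Array-manipulation view of image processing, sketched in Python:
# background subtraction and thresholding on a toy 3x3 "image".

def subtract_background(image, background):
    return [[max(px - background, 0) for px in row] for row in image]

def threshold(image, level):
    return [[1 if px >= level else 0 for px in row] for row in image]

img = [[10, 52, 11],
       [12, 90, 13],
       [10, 48, 12]]
mask = threshold(subtract_background(img, 10), 30)   # bright column survives
```

In MATLAB the same two steps collapse to vectorized expressions like `max(img - bg, 0)` and a logical comparison, which is the economy of notation the lectures emphasize.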
Sobie, Eric A.
2014-01-01
This two-part lecture introduces students to the scientific computing language MATLAB. Prior computer programming experience is not required. The lectures present basic concepts of computer programming logic that tend to cause difficulties for beginners in addition to concepts that relate specifically to the MATLAB language syntax. The lectures begin with a discussion of vectors, matrices, and arrays. Because many types of biological data, such as fluorescence images and DNA microarrays, are stored as two-dimensional objects, processing these data is a form of array manipulation, and MATLAB is especially adept at handling such array objects. The students are introduced to basic commands in MATLAB, as well as built-in functions that provide useful shortcuts. The second lecture focuses on the differences between MATLAB scripts and MATLAB functions and describes when one method of programming organization might be preferable to the other. The principles are illustrated through the analysis of experimental data, specifically measurements of intracellular calcium concentration in live cells obtained using confocal microscopy. PMID:21934110
Minkara, Mona S; Weaver, Michael N; Gorske, Jim; Bowers, Clifford R; Merz, Kenneth M
2015-08-11
There exists a sparse representation of blind and low-vision students in science, technology, engineering and mathematics (STEM) fields. This is due in part to these individuals being discouraged from pursuing STEM degrees as well as a lack of appropriate adaptive resources in upper level STEM courses and research. Mona Minkara is a rising fifth year graduate student in computational chemistry at the University of Florida. She is also blind. This account presents efforts conducted by an expansive team of university and student personnel in conjunction with Mona to adapt different portions of the graduate student curriculum to meet Mona's needs. The most important consideration is prior preparation of materials to assist with coursework and cumulative exams. Herein we present an account of the first four years of Mona's graduate experience hoping this will assist in the development of protocols for future blind and low-vision graduate students in computational chemistry.
Bayet, Laurie; Pascalis, Olivier; Quinn, Paul C.; Lee, Kang; Gentaz, Édouard; Tanaka, James W.
2015-01-01
Angry faces are perceived as more masculine by adults. However, the developmental course and underlying mechanism (bottom-up stimulus driven or top-down belief driven) associated with the angry-male bias remain unclear. Here we report that anger biases face gender categorization toward “male” responding in children as young as 5–6 years. The bias is observed for both own- and other-race faces, and is remarkably unchanged across development (into adulthood) as revealed by signal detection analyses (Experiments 1–2). The developmental course of the angry-male bias, along with its extension to other-race faces, combine to suggest that it is not rooted in extensive experience, e.g., observing males engaging in aggressive acts during the school years. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Our findings are thus consistent with stimulus-driven and stereotyped-belief-driven accounts of the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling. PMID:25859238
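The signal detection analyses mentioned above standardly separate sensitivity (d') from response bias (criterion c), computed from hit and false-alarm rates. The sketch below shows those textbook formulas; the rates are illustrative, not the study's data.

```python
# Signal detection measures from hit and false-alarm rates:
# d' = z(H) - z(FA), criterion c = -(z(H) + z(FA)) / 2.

from statistics import NormalDist

def dprime(hit_rate, fa_rate):
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

def criterion(hit_rate, fa_rate):
    z = NormalDist().inv_cdf
    return -0.5 * (z(hit_rate) + z(fa_rate))

d = dprime(0.84, 0.16)            # symmetric rates: unbiased responding
c = criterion(0.84, 0.16)
```

A shift toward "male" responding for angry faces would surface as a criterion shift (c moving away from zero) even when sensitivity d' is unchanged, which is how a bias can be stable across development.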
A Bayesian account of ‘hysteria’
Adams, Rick A.; Brown, Harriet; Pareés, Isabel; Friston, Karl J.
2012-01-01
This article provides a neurobiological account of symptoms that have been called ‘hysterical’, ‘psychogenic’ or ‘medically unexplained’, which we will call functional motor and sensory symptoms. We use a neurobiologically informed model of hierarchical Bayesian inference in the brain to explain functional motor and sensory symptoms in terms of perception and action arising from inference based on prior beliefs and sensory information. This explanation exploits the key balance between prior beliefs and sensory evidence that is mediated by (body focused) attention, symptom expectations, physical and emotional experiences and beliefs about illness. Crucially, this furnishes an explanation at three different levels: (i) underlying neuromodulatory (synaptic) mechanisms; (ii) cognitive and experiential processes (attention and attribution of agency); and (iii) formal computations that underlie perceptual inference (representation of uncertainty or precision). Our explanation involves primary and secondary failures of inference; the primary failure is the (autonomous) emergence of a percept or belief that is held with undue certainty (precision) following top-down attentional modulation of synaptic gain. This belief can constitute a sensory percept (or its absence) or induce movement (or its absence). The secondary failure of inference is when the ensuing percept (and any somatosensory consequences) is falsely inferred to be a symptom to explain why its content was not predicted by the source of attentional modulation. This account accommodates several fundamental observations about functional motor and sensory symptoms, including: (i) their induction and maintenance by attention; (ii) their modification by expectation, prior experience and cultural beliefs and (iii) their involuntary and symptomatic nature. PMID:22641838
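The precision balance at the heart of this account has a simple Gaussian form: the posterior percept is a precision-weighted average of the prior belief and the sensory evidence. The sketch below illustrates only that arithmetic, under the assumption of Gaussian prior and likelihood; the numbers are invented.

```python
# Precision-weighted Bayesian cue combination (Gaussian case):
# posterior mean = (pi_prior * mu_prior + pi_sens * x_sens)
#                  / (pi_prior + pi_sens),
# where pi denotes precision (inverse variance).

def posterior_mean(mu_prior, pi_prior, x_sens, pi_sens):
    return (pi_prior * mu_prior + pi_sens * x_sens) / (pi_prior + pi_sens)

# Equal precisions: percept sits halfway between belief and evidence.
balanced = posterior_mean(mu_prior=1.0, pi_prior=1.0, x_sens=0.0, pi_sens=1.0)

# Attention-inflated prior precision: percept tracks the belief,
# largely ignoring the senses, as posited for functional symptoms.
prior_dominated = posterior_mean(mu_prior=1.0, pi_prior=99.0,
                                 x_sens=0.0, pi_sens=1.0)
```

The "primary failure of inference" above corresponds to the second case: undue prior precision (via attentional gain) lets a belief dominate perception and action.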
2014-12-07
Optimal injection time and locations can be determined for given process parameters of resin viscosity and preform permeability prior to resin gelation. However, there could be significant variations in these two parameters during actual manufacturing: differences in resin batches, mixes, temperature, and ambient conditions affect viscosity, while differences in the preform rolls affect permeability.
Ranganathan, Rajiv; Wieser, Jon; Mosier, Kristine M; Mussa-Ivaldi, Ferdinando A; Scheidt, Robert A
2014-06-11
Prior learning of a motor skill creates motor memories that can facilitate or interfere with learning of new, but related, motor skills. One hypothesis of motor learning posits that for a sensorimotor task with redundant degrees of freedom, the nervous system learns the geometric structure of the task and improves performance by selectively operating within that task space. We tested this hypothesis by examining if transfer of learning between two tasks depends on shared dimensionality between their respective task spaces. Human participants wore a data glove and learned to manipulate a computer cursor by moving their fingers. Separate groups of participants learned two tasks: a prior task that was unique to each group and a criterion task that was common to all groups. We manipulated the mapping between finger motions and cursor positions in the prior task to define task spaces that either shared or did not share the task space dimensions (x-y axes) of the criterion task. We found that if the prior task shared task dimensions with the criterion task, there was an initial facilitation in criterion task performance. However, if the prior task did not share task dimensions with the criterion task, there was prolonged interference in learning the criterion task due to participants finding inefficient task solutions. These results show that the nervous system learns the task space through practice, and that the degree of shared task space dimensionality influences the extent to which prior experience transfers to subsequent learning of related motor skills. Copyright © 2014 the authors.
Improving older adults' memory performance using prior task success.
Geraci, Lisa; Miller, Tyler M
2013-06-01
Holding negative aging stereotypes can lead older adults to perform poorly on memory tests. We attempted to improve older adults' memory performance by giving them task experience that would counter their negative performance expectations. Before participating in a memory experiment, younger and older adults were given a cognitive task that they could either successfully complete, not successfully complete, or they were given no prior task. For older adults, recall was significantly higher and self-reported anxiety was significantly lower for the prior task success group relative to the other groups. There was no effect of prior task experience on younger adults' memory performance. Results suggest that older adults' memory can be improved with a single successful prior task experience. PsycINFO Database Record (c) 2013 APA, all rights reserved.
A Neural Computational Model of Incentive Salience
Zhang, Jun; Berridge, Kent C.; Tindell, Amy J.; Smith, Kyle S.; Aldridge, J. Wayne
2009-01-01
Incentive salience is a motivational property with ‘magnet-like’ qualities. When attributed to reward-predicting stimuli (cues), incentive salience triggers a pulse of ‘wanting’ and an individual is pulled toward the cues and reward. A key computational question is how incentive salience is generated during a cue re-encounter, which combines both learning and the state of limbic brain mechanisms. Learning processes, such as temporal-difference models, provide one way for stimuli to acquire cached predictive values of rewards. However, empirical data show that subsequent incentive values are also modulated on the fly by dynamic fluctuation in physiological states, altering cached values in ways requiring additional motivation mechanisms. Dynamic modulation of incentive salience for a Pavlovian conditioned stimulus (CS or cue) occurs during certain states, without necessarily requiring (re)learning about the cue. In some cases, dynamic modulation of cue value occurs during states that are quite novel, never having been experienced before, and even prior to experience of the associated unconditioned reward in the new state. Such cases can include novel drug-induced mesolimbic activation and addictive incentive-sensitization, as well as natural appetite states such as salt appetite. Dynamic enhancement specifically raises the incentive salience of an appropriate CS, without necessarily changing that of other CSs. Here we suggest a new computational model that modulates incentive salience by integrating changing physiological states with prior learning. We support the model with behavioral and neurobiological data from empirical tests that demonstrate dynamic elevations in cue-triggered motivation (involving natural salt appetite, and drug-induced intoxication and sensitization). Our data call for a dynamic model of incentive salience, such as presented here. Computational models can adequately capture fluctuations in cue-triggered ‘wanting’ only by incorporating modulation of previously learned values by natural appetite and addiction-related states. PMID:19609350
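The core move of such a model can be sketched in one line: the 'wanting' triggered by a cue is not the cached learned value alone, but that value re-weighted on the fly by a physiological gain factor. This is a deliberately simplified illustration of the idea, not the paper's actual formulation.

```python
# Hedged sketch: cue-triggered 'wanting' as a cached Pavlovian value
# dynamically modulated by a physiological-state gain kappa, with no
# new learning required (kappa > 1: e.g. salt appetite or drug
# sensitization; kappa < 1: e.g. satiety).

def cue_triggered_wanting(cached_value, kappa):
    return cached_value * kappa

v = 2.0                                        # value learned in a normal state
neutral = cue_triggered_wanting(v, 1.0)        # neutral state: cached value
elevated = cue_triggered_wanting(v, 3.0)       # novel appetite state: amplified
```

Because kappa multiplies the cue's own cached value, the enhancement is cue-specific, matching the observation that the salience of an appropriate CS rises without changing that of other CSs.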
Wild, Katherine V.; Mattek, Nora; Maxwell, Shoshana A.; Dodge, Hiroko H.; Jimison, Holly B.; Kaye, Jeffrey A.
2012-01-01
Background This study examines differences in computer related self-efficacy and anxiety in subgroups of older adults, and changes in those measures following exposure to a systematic training program and subsequent computer use. Methods Participants were volunteers in the Intelligent Systems for Assessment of Aging Changes Study (ISAAC) carried out by the Oregon Center for Aging and Technology. Participants were administered two questionnaires prior to training and again one year later, related to computer self-efficacy and anxiety. Continuous recording of computer use was also assessed for a subset of participants. Results Baseline comparisons by gender, age, education, living arrangement, and computer proficiency, but not cognitive status, yielded significant differences in confidence and anxiety related to specific aspects of computer use. At one-year follow-up, participants reported less anxiety and greater confidence. However, the benefits of training and exposure varied by group and task. Comparisons based on cognitive status showed that the cognitively intact participants benefited more from training and/or experience with computers than did participants with Mild Cognitive Impairment (MCI), who after one year continued to report less confidence and more anxiety regarding certain aspects of computer use. Conclusion After one year of consistent computer use, cognitively intact participants in this study reported reduced levels of anxiety and increased self-confidence in their ability to perform specific computer tasks. Participants with MCI at baseline were less likely to demonstrate increased efficacy or confidence than their cognitively intact counterparts. PMID:23102124
NASA Astrophysics Data System (ADS)
Wissing, Dennis Robert
The purpose of this research was to explore undergraduates' conceptual development for oxygen transport and utilization, as a component of a cardiopulmonary physiology and advanced respiratory care course in the allied health program. This exploration focused on the student's development of knowledge and the presence of alternative conceptions, prior to, during, and after completing cardiopulmonary physiology and advanced respiratory care courses. Using the simulation program SimBioSys™ (Samsel, 1994), student-participants completed a series of laboratory exercises focusing on cardiopulmonary disease states. This study examined data gathered from: (1) a novice group receiving the simulation program prior to instruction, (2) a novice group that experienced the simulation program following course completion in cardiopulmonary physiology, and (3) an intermediate group who experienced the simulation program following completion of formal education in Respiratory Care. This research was based on the theory of Human Constructivism as described by Mintzes, Wandersee, and Novak (1997). Data-gathering techniques were based on theories supported by Novak (1984), Wandersee (1997), and Chi (1997). Data were generated by exams, interviews, verbal analysis (Chi, 1997), and concept mapping. Results suggest that simulation may be an effective instructional method for assessing conceptual development and diagnosing alternative conceptions in undergraduates enrolled in a cardiopulmonary science program. Use of simulation in conjunction with clinical interview and concept mapping may assist in verifying gaps in learning and conceptual knowledge. This study found only limited evidence to support the use of computer simulation prior to lecture to augment learning. However, it was demonstrated that students' prelecture experience with the computer simulation helped the instructor assess what the learner knew so he or she could be taught accordingly. In addition, use of computer simulation after formal instruction was shown to be useful in aiding students identified by the instructor as needing remediation.
Adaptive zooming in X-ray computed tomography.
Dabravolski, Andrei; Batenburg, Kees Joost; Sijbers, Jan
2014-01-01
In computed tomography (CT), the source-detector system commonly rotates around the object in a circular trajectory. Such a trajectory does not allow the detector to be fully exploited when scanning elongated objects. The aim of this work was to increase the spatial resolution of the reconstructed image by optimal zooming during scanning. A new approach is proposed in which the full width of the detector is exploited for every projection angle. This approach is based on the use of prior information about the object's convex hull to move the source as close as possible to the object while avoiding truncation of the projections. Experiments show that the proposed approach can significantly improve reconstruction quality, producing reconstructions with smaller errors and revealing more details in the object. The proposed approach can lead to more accurate reconstructions and increased spatial resolution in the object compared to the conventional circular trajectory.
Zhang, Guanglei; Liu, Fei; Zhang, Bin; He, Yun; Luo, Jianwen; Bai, Jing
2013-04-01
Pharmacokinetic rates have the potential to provide quantitative physiological and pathological information for biological studies and drug development. Fluorescence molecular tomography (FMT) is an attractive imaging tool for three-dimensionally resolving fluorophore distribution in small animals. In this letter, pharmacokinetic rates of indocyanine green (ICG) in mouse liver are imaged with a hybrid FMT and x-ray computed tomography (XCT) system. A recently developed FMT method using structural priors from an XCT system is adopted to improve the quality of FMT reconstruction. In the in vivo experiments, images of uptake and excretion rates of ICG in mouse liver are obtained, which can be used to quantitatively evaluate liver function. The accuracy of the results is validated by a fiber-based fluorescence measurement system.
NASA Astrophysics Data System (ADS)
Tsujimura, Norio; Yoshida, Tadayoshi; Yashima, Hiroshi
The criticality accident alarm system (CAAS), which was recently developed and installed at the Japan Atomic Energy Agency's Tokai Reprocessing Plant, consists of a plastic scintillator combined with a cadmium-lined polyethylene moderator and thereby responds to both neutrons and gamma rays. To evaluate the neutron absorbed dose rate response of the CAAS detector, a 24 keV quasi-monoenergetic neutron irradiation experiment was performed at the B-1 facility of the Kyoto University Research Reactor. The detector's evaluated neutron response was confirmed to agree reasonably well with prior computer-predicted responses.
Dong, Han; Sharma, Diksha; Badano, Aldo
2014-12-01
Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridmantis, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webmantis and visualmantis, to facilitate the setup of computational experiments via hybridmantis. The visualization tools visualmantis and webmantis enable the user to control simulation properties through a user interface. In the case of webmantis, control via a web browser allows access through mobile devices such as smartphones or tablets. webmantis acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. The output consists of the point response, pulse-height spectrum, and optical transport statistics generated by hybridmantis. The users can download the output images and statistics through a zip file for future reference. In addition, webmantis provides a visualization window that displays a few selected optical photon paths as they are transported through the detector columns and allows the user to trace the history of the optical photons. The visualization tools visualmantis and webmantis provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments.
The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying input parameters to receiving visual feedback for the model predictions.
Manza, Peter; Hu, Sien; Ide, Jaime S; Farr, Olivia M; Zhang, Sheng; Leung, Hoi-Chung; Li, Chiang-shan R
2016-03-01
To adapt flexibly to a rapidly changing environment, humans must anticipate conflict and respond to surprising, unexpected events. To this end, the brain estimates upcoming conflict on the basis of prior experience and computes unsigned prediction error (UPE). Although much work implicates catecholamines in cognitive control, little is known about how pharmacological manipulation of catecholamines affects the neural processes underlying conflict anticipation and UPE computation. We addressed this issue by imaging 24 healthy young adults who received a 45 mg oral dose of methylphenidate (MPH) and 62 matched controls who did not receive MPH prior to performing the stop-signal task. We used a Bayesian Dynamic Belief Model to make trial-by-trial estimates of conflict and UPE during task performance. Replicating previous research, the control group showed anticipation-related activation in the presupplementary motor area and deactivation in the ventromedial prefrontal cortex and parahippocampal gyrus, as well as UPE-related activations in the dorsal anterior cingulate, insula, and inferior parietal lobule. In group comparison, MPH increased anticipation activity in the bilateral caudate head and decreased UPE activity in each of the aforementioned regions. These findings highlight distinct effects of catecholamines on the neural mechanisms underlying conflict anticipation and UPE, signals critical to learning and adaptive behavior. © The Author(s) 2016.
Optimizing Requirements Decisions with KEYS
NASA Technical Reports Server (NTRS)
Jalali, Omid; Menzies, Tim; Feather, Martin
2008-01-01
Recent work with NASA's Jet Propulsion Laboratory has allowed external access to five of JPL's real-world requirements models, anonymized to conceal proprietary information but retaining their computational nature. Experimentation with these models, reported herein, demonstrates a dramatic speedup in the computations performed on them. These models have a well-defined goal: select the fewest mitigations that retire risks, which in turn increases the number of attainable requirements. Such non-linear optimization is a well-studied problem; however, identifying not only (a) the optimal solution(s) but also (b) the key factors leading to them is less well studied. Our technique, called KEYS, rapidly identifies the solutions and their key factors simultaneously. KEYS improves on prior work by several orders of magnitude: prior experiments with simulated annealing or treatment learning took tens of minutes to hours to terminate, whereas, for one model, KEYS ran 13,000 times faster than treatment learning (40 minutes versus 0.18 seconds). With this paper, we challenge other members of the PROMISE community to improve on our results with other techniques.
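The selection problem this abstract describes (choose a small set of mitigations that retires risks and thereby makes requirements attainable) can be illustrated with a toy greedy baseline. This is not the KEYS algorithm; the model below (mitigations m1-m3, risks r1-r3, requirements req_a-req_d) is invented purely for illustration.

```python
# Toy requirements model. A requirement becomes attainable once every
# risk that blocks it has been retired by some chosen mitigation.
blocked_by = {           # risk -> requirements it blocks
    "r1": {"req_a", "req_b"},
    "r2": {"req_b", "req_c"},
    "r3": {"req_d"},
}
retires = {              # mitigation -> risks it retires
    "m1": {"r1"},
    "m2": {"r2", "r3"},
    "m3": {"r1", "r2"},
}

def attainable(chosen):
    """Requirements no longer blocked by any un-retired risk."""
    retired = set().union(*(retires[m] for m in chosen)) if chosen else set()
    reqs = set().union(*blocked_by.values())
    return {q for q in reqs
            if all(q not in blocked_by[r] for r in blocked_by if r not in retired)}

# Greedy baseline: repeatedly add the mitigation with the best marginal gain.
chosen = []
while True:
    base = len(attainable(chosen))
    gains = {m: len(attainable(chosen + [m])) - base
             for m in retires if m not in chosen}
    best = max(gains, key=gains.get, default=None)
    if best is None or gains[best] <= 0:
        break
    chosen.append(best)

print(chosen, sorted(attainable(chosen)))
```

On this toy model the greedy loop selects m3 and then m2, after which all four requirements are attainable; KEYS itself goes further by also reporting the key factors driving such solutions.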
Wofford, J L; Currin, D; Michielutte, R; Wofford, M M
2001-04-20
Inadequate reading literacy is a major barrier to better educating patients. Despite its high prevalence, practical solutions for detecting and overcoming low literacy in a busy clinical setting remain elusive. In exploring the potential role of the multimedia computer in improving office-based patient education, we compared the accuracy of information captured from audio-computer interviewing of patients with that obtained from subsequent verbal questioning. The study took place in an adult medicine clinic at an urban community health center, with a convenience sample of patients awaiting clinic appointments (n = 59). Exclusion criteria included obvious psychoneurologic impairment or a primary language other than English. The intervention was a multimedia computer presentation that used audio-computer interviewing with localized imagery and voices to elicit responses to 4 questions on prior computer use and cancer risk perceptions. Three patients refused or were unable to interact with the computer at all, and 3 patients required restarting the presentation from the beginning but ultimately completed the computerized survey. Of the 51 evaluable patients (72.5% African-American, 66.7% female, mean age 47.5 [+/- 18.1]), the mean time in the computer presentation was significantly longer with older age and with no prior computer use but did not differ by gender or race. Despite a high proportion of patients with no prior computer use (60.8%), there was a high rate of agreement (88.7% overall) between audio-computer interviewing and subsequent verbal questioning. Audio-computer interviewing is feasible in this urban community health center. The computer offers a partial solution for overcoming literacy barriers inherent in written patient education materials and provides an efficient means of data collection that can be used to better target patients' educational needs.
Specific Previous Experience Affects Perception of Harmony and Meter
ERIC Educational Resources Information Center
Creel, Sarah C.
2011-01-01
Prior knowledge shapes our experiences, but which prior knowledge shapes which experiences? This question is addressed in the domain of music perception. Three experiments were used to determine whether listeners activate specific musical memories during music listening. Each experiment provided listeners with one of two musical contexts that was…
López, Florente; Menez, Marina
2012-07-01
In two experiments we examined the influence of response and time factors on the speed of acquisition of temporal control on fixed-interval (FI) schedules. In Experiment 1, prior exposure to fixed-time (FT) schedules accelerated the development of temporal control on FI schedules of the same temporal value. It was also found that the slower acquisition on FI with prior random-time (RT) exposure was similar to that of rats with prior standard training. In Experiment 2, prior exposure to FT accelerated the development of temporal control on an FI schedule with a threefold increase in temporal value. Additionally, it was found that with prior FI 30-s training, acquisition of temporal control on FI 90-s was even faster than with prior FT 30-s. Measures of head-entries into the feeder across the experiments indicated that temporal control had already developed during the periodic but not the non-periodic histories, and that this control transferred to lever pressing during the FI testing phase. Copyright © 2012 Elsevier B.V. All rights reserved.
Bayesian bivariate meta-analysis of diagnostic test studies with interpretable priors.
Guo, Jingyi; Riebler, Andrea; Rue, Håvard
2017-08-30
In a bivariate meta-analysis, the number of diagnostic studies involved is often very low, so that frequentist methods may result in problems. Using Bayesian inference is particularly attractive, as informative priors that add a small amount of information can stabilise the analysis without overwhelming the data. However, Bayesian analysis is often computationally demanding, and the selection of the prior for the covariance matrix of the bivariate structure is crucial with little data. The integrated nested Laplace approximations method provides an efficient solution to the computational issues by avoiding any sampling, but the important question of priors remains. We explore the penalised complexity (PC) prior framework for specifying informative priors for the variance parameters and the correlation parameter. PC priors facilitate model interpretation and hyperparameter specification as expert knowledge can be incorporated intuitively. We conduct a simulation study to compare the properties and behaviour of differently defined PC priors to currently used priors in the field. The simulation study shows that the PC prior seems beneficial for the variance parameters. The use of PC priors for the correlation parameter results in more precise estimates when specified in a sensible neighbourhood around the truth. To investigate the usage of PC priors in practice, we reanalyse a meta-analysis using the telomerase marker for the diagnosis of bladder cancer and compare the results with those obtained by other commonly used modelling approaches. Copyright © 2017 John Wiley & Sons, Ltd.
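For a variance parameter expressed as a standard deviation σ, the PC prior discussed in this literature reduces to an exponential density on σ whose rate is fixed by a user-supplied tail statement P(σ > U) = α, which is how expert knowledge enters intuitively. A minimal sketch, with the choice U = 1, α = 0.01 invented for illustration:

```python
import math

def pc_prior_sd(sigma, u, alpha):
    """PC prior density for a standard deviation sigma: exponential with
    rate lam chosen so that the tail probability P(sigma > u) equals alpha."""
    lam = -math.log(alpha) / u
    return lam * math.exp(-lam * sigma)

# Encode the (invented) expert statement "P(sigma > 1) = 0.01" and check
# that the implied exponential tail mass above u = 1 recovers alpha:
lam = -math.log(0.01) / 1.0
print(pc_prior_sd(0.0, 1.0, 0.01))   # density at the base model, equals lam
print(math.exp(-lam * 1.0))          # recovers alpha, up to rounding
```

The density is maximal at σ = 0 (the base model with no extra variability) and decays monotonically, which is the shrinkage-toward-simplicity behaviour that stabilises the analysis with few studies.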
Phantom torso experiment on the international space station; flight measurements and calculations
NASA Astrophysics Data System (ADS)
Atwell, W.; Semones, E.; Cucinotta, F.
The Phantom Torso Experiment (PTE) first flew on the 10-day Space Shuttle mission STS-91 in June 1998, during a period near solar minimum. The PTE was re-flown on the International Space Station (ISS) Increment 2 mission from April-August 2001, during a period near solar maximum. The experiment was located with a suite of other radiation experiments in the US Lab module Human Research Facility (HRF) rack. The objective of the experiment was to measure space radiation exposures at several radiosensitive critical body organs (brain, thyroid, heart/lung, stomach, and colon) and two locations on the surface (skin) of a modified Rando™ phantom. Prior to flight, active solid-state silicon dosimeters were located at the Rando™ critical body organ locations and passive dosimeters were placed at the two surface locations. Using a mathematically modified Computerized Anatomical Male (CAM) model, shielding distributions were generated for the five critical body organ and two skin locations. These shielding distributions were then combined with the ISS HRF rack shielding distribution to account for the total shielding "seen" by the PTE. Using trapped proton and galactic cosmic radiation environment models and high-energy particle transport codes, absorbed dose, dose equivalent, and LET (linear energy transfer) values were computed for the seven dose point locations of interest. The results of these computations are compared with the actual flight measurements.
Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes
2014-01-01
The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNAseq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to the specific dataset at hand. We developed software to create publication-quality heatmaps using the R programming language, the C++ programming language, and the OpenGL application programming interface (API) to create industry-grade high-performance graphics. Our graphical user interface (GUI) software package, HeatmapGenerator, for Windows OS and Mac OS X, offers an intuitive, user-friendly alternative for researchers with minimal prior coding experience, allowing them to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. HeatmapGenerator requires only that the user upload a preformatted input file and download the publicly available R software language, along with a few other operating-system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/.
The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
Explanation and Prior Knowledge Interact to Guide Learning
ERIC Educational Resources Information Center
Williams, Joseph J.; Lombrozo, Tania
2013-01-01
How do explaining and prior knowledge contribute to learning? Four experiments explored the relationship between explanation and prior knowledge in category learning. The experiments independently manipulated whether participants were prompted to explain the category membership of study observations and whether category labels were informative in…
Kolodny, Oren; Lotem, Arnon; Edelman, Shimon
2015-03-01
We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this manner takes the form of a directed weighted graph, whose nodes are recursively (hierarchically) defined patterns over the elements of the input stream. We evaluated the model in seventeen experiments, grouped into five studies, which examined, respectively, (a) the generative ability of grammar learned from a corpus of natural language, (b) the characteristics of the learned representation, (c) sequence segmentation and chunking, (d) artificial grammar learning, and (e) certain types of structure dependence. The model's performance largely vindicates our design choices, suggesting that progress in modeling language acquisition can be made on a broad front, from issues of generativity to the replication of human experimental findings, by bringing biological and computational considerations, as well as lessons from prior efforts, to bear on the modeling approach. Copyright © 2014 Cognitive Science Society, Inc.
Moving beyond qualitative evaluations of Bayesian models of cognition.
Hemmer, Pernille; Tauber, Sean; Steyvers, Mark
2015-06-01
Bayesian models of cognition provide a powerful way to understand the behavior and goals of individuals from a computational point of view. Much of the focus in the Bayesian cognitive modeling approach has been on qualitative model evaluations, where predictions from the models are compared to data that is often averaged over individuals. In many cognitive tasks, however, there are pervasive individual differences. We introduce an approach to directly infer individual differences related to subjective mental representations within the framework of Bayesian models of cognition. In this approach, Bayesian data analysis methods are used to estimate cognitive parameters and motivate the inference process within a Bayesian cognitive model. We illustrate this integrative Bayesian approach on a model of memory. We apply the model to behavioral data from a memory experiment involving the recall of heights of people. A cross-validation analysis shows that the Bayesian memory model with inferred subjective priors predicts withheld data better than a Bayesian model where the priors are based on environmental statistics. In addition, the model with inferred priors at the individual subject level led to the best overall generalization performance, suggesting that individual differences are important to consider in Bayesian models of cognition.
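The core idea above, that recall blends a noisy memory trace with a prior over heights, can be sketched with a conjugate Gaussian update. This is a deliberate simplification of the paper's hierarchical model, and all numbers are invented:

```python
def posterior_recall(trace, prior_mean, prior_var, noise_var):
    """Conjugate Gaussian update: the recalled value is the precision-weighted
    blend of the noisy memory trace and the prior over heights."""
    w = prior_var / (prior_var + noise_var)     # weight on the trace
    mean = w * trace + (1 - w) * prior_mean
    var = prior_var * noise_var / (prior_var + noise_var)
    return mean, var

# Invented numbers: prior belief that heights average 170 cm (variance 49),
# and an equally noisy trace of a 185 cm person (noise variance 49).
mean, var = posterior_recall(185.0, 170.0, 49.0, 49.0)
print(mean, var)   # 177.5 24.5 -> recall shrunk halfway toward the prior
```

Under the inferred-subjective-prior approach described above, prior_mean and prior_var would be estimated per individual from behavior, rather than fixed from environmental statistics.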
Rivard, Justin D; Vergis, Ashley S; Unger, Bertram J; Hardy, Krista M; Andrew, Chris G; Gillman, Lawrence M; Park, Jason
2014-06-01
Computer-based surgical simulators capture a multitude of metrics based on different aspects of performance, such as speed, accuracy, and movement efficiency. However, without rigorous assessment, it may be unclear whether all, some, or none of these metrics actually reflect technical skill, which can compromise educational efforts on these simulators. We assessed the construct validity of individual performance metrics on the LapVR simulator (Immersion Medical, San Jose, CA, USA) and used these data to create task-specific summary metrics. Medical students with no prior laparoscopic experience (novices, N = 12), junior surgical residents with some laparoscopic experience (intermediates, N = 12), and experienced surgeons (experts, N = 11) all completed three repetitions of four LapVR simulator tasks. The tasks included three basic skills (peg transfer, cutting, clipping) and one procedural skill (adhesiolysis). We selected 36 individual metrics on the four tasks that assessed six different aspects of performance, including speed, motion path length, respect for tissue, accuracy, task-specific errors, and successful task completion. Four of seven individual metrics assessed for peg transfer, six of ten metrics for cutting, four of nine metrics for clipping, and three of ten metrics for adhesiolysis discriminated between experience levels. Time and motion path length were significant on all four tasks. We used the validated individual metrics to create summary equations for each task, which successfully distinguished between the different experience levels. Educators should maintain some skepticism when reviewing the plethora of metrics captured by computer-based simulators, as some but not all are valid. We showed the construct validity of a limited number of individual metrics and developed summary metrics for the LapVR. The summary metrics provide a succinct way of assessing skill with a single metric for each task, but require further validation.
EGASP: the human ENCODE Genome Annotation Assessment Project
Guigó, Roderic; Flicek, Paul; Abril, Josep F; Reymond, Alexandre; Lagarde, Julien; Denoeud, France; Antonarakis, Stylianos; Ashburner, Michael; Bajic, Vladimir B; Birney, Ewan; Castelo, Robert; Eyras, Eduardo; Ucla, Catherine; Gingeras, Thomas R; Harrow, Jennifer; Hubbard, Tim; Lewis, Suzanna E; Reese, Martin G
2006-01-01
Background We present the results of EGASP, a community experiment to assess the state-of-the-art in genome annotation within the ENCODE regions, which span 1% of the human genome sequence. The experiment had two major goals: the assessment of the accuracy of computational methods to predict protein coding genes; and the overall assessment of the completeness of the current human genome annotations as represented in the ENCODE regions. For the computational prediction assessment, eighteen groups contributed gene predictions. We evaluated these submissions against each other based on a 'reference set' of annotations generated as part of the GENCODE project. These annotations were not available to the prediction groups prior to the submission deadline, so that their predictions were blind and an external advisory committee could perform a fair assessment. Results The best methods had at least one gene transcript correctly predicted for close to 70% of the annotated genes. Nevertheless, the multiple transcript accuracy, taking into account alternative splicing, reached only approximately 40% to 50% accuracy. At the coding nucleotide level, the best programs reached an accuracy of 90% in both sensitivity and specificity. Programs relying on mRNA and protein sequences were the most accurate in reproducing the manually curated annotations. Experimental validation shows that only a very small percentage (3.2%) of the selected 221 computationally predicted exons outside of the existing annotation could be verified. Conclusion This is the first such experiment in human DNA, and we have followed the standards established in a similar experiment, GASP1, in Drosophila melanogaster. We believe the results presented here contribute to the value of ongoing large-scale annotation projects and should guide further experimental methods when being scaled up to the entire human genome sequence. PMID:16925836
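The nucleotide-level accuracy figures quoted above are computed per base against the reference annotation; note that the gene-finding literature conventionally defines specificity as TP/(TP+FP), i.e., what is elsewhere called precision. A toy sketch with invented 0/1 coding masks:

```python
def sn_sp(reference, predicted):
    """Nucleotide-level sensitivity and specificity for gene prediction.
    Inputs are equal-length 0/1 masks marking coding bases. Specificity
    here follows the gene-finding convention TP/(TP+FP)."""
    tp = sum(1 for r, p in zip(reference, predicted) if r and p)
    fn = sum(1 for r, p in zip(reference, predicted) if r and not p)
    fp = sum(1 for r, p in zip(reference, predicted) if p and not r)
    return tp / (tp + fn), tp / (tp + fp)

ref  = [1, 1, 1, 0, 0, 0, 1, 1]   # invented reference coding mask
pred = [1, 1, 0, 0, 0, 1, 1, 1]   # invented prediction
print(sn_sp(ref, pred))           # (0.8, 0.8)
```

Scaled up to every base in the ENCODE regions, this pair of numbers is what "90% in both sensitivity and specificity" summarizes for the best programs.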
Experimental Results from the Thermal Energy Storage-1 (TES-1) Flight Experiment
NASA Technical Reports Server (NTRS)
Wald, Lawrence W.; Tolbert, Carol; Jacqmin, David
1995-01-01
The Thermal Energy Storage-1 (TES-1) is a flight experiment that flew on the Space Shuttle Columbia (STS-62) in March 1994 as part of the OAST-2 mission. TES-1 is the first experiment in a four-experiment suite designed to provide data for understanding the long-duration microgravity behavior of thermal energy storage fluoride salts that undergo repeated melting and freezing. Such data have never been obtained before and have direct application for the development of space-based solar dynamic (SD) power systems. These power systems will store solar energy in a thermal energy storage salt such as lithium fluoride or calcium fluoride. The stored energy is extracted during the shade portion of the orbit. This enables the solar dynamic power system to provide constant electrical power over the entire orbit. Analytical computer codes have been developed for predicting the performance of a space-based solar dynamic power system. Experimental verification of the analytical predictions is needed prior to using the analytical results for future space power design applications. The four TES flight experiments will be used to obtain the needed experimental data. This paper will focus on the flight results from the first experiment, TES-1, in comparison to the predicted results from the Thermal Energy Storage Simulation (TESSIM) analytical computer code. The TES-1 conceptual development, hardware design, final development, and system verification testing were accomplished at the NASA Lewis Research Center (LeRC). TES-1 was developed under the In-Space Technology Experiment Program (IN-STEP), which sponsors NASA, industry, and university flight experiments designed to enable and enhance space flight technology. The IN-STEP Program is sponsored by the Office of Space Access and Technology (OSAT).
2007-06-01
Video game-based environments are an increasingly popular medium for training Soldiers. This research investigated how various strategies for modifying task difficulty over the progression of an instructional video game impact learner performance and motivation. Further, the influence of prior video game experience on these learning outcomes was examined, as well as the role prior experience played in determining the optimal approach for…
Lee, Moon J; Kang, Hannah
2018-05-01
To test whether message framing (i.e., gain vs loss) and risk type (i.e., health vs appearance risk) in skin cancer prevention messages interact with one's prior experience. Two online experiments with a 2 (message framing: gain vs loss) × 2 (risk type: health vs appearance risk) factorial design were conducted. The participants were given a URL to the experiment website via e-mail. On the first page of the website, the participants were told that they would be asked to evaluate a skin cancer print public service announcement (PSA). A total of 397 individuals participated (236 in experiment 1 and 161 in experiment 2). Four versions of the skin cancer print PSA were developed, identical except for the 2 manipulated components: message framing and risk type. Measures were adopted from Cho and Boster (message framing), Jones and Leary and Kiene et al. (risk type), De Vries, Mesters, van't Riet, Willems, and Reubsaet and Knight, Kirincich, Farmer, and Hood (prior experience), and Hammond, Fong, Zanna, Thrasher, and Borland and Hoffner and Ye (behavioral intent). General linear models were used to test the hypotheses. Three-way interactions among message framing, risk type, and prior experience were found: when the intent of the message was to encourage sunscreen use, the effects of message framing and risk type ran in exactly the opposite directions from when the intent was to discourage indoor/outdoor tanning. To discourage tanning among those with prior experience, messages emphasizing losses to one's health will work better. For those with no prior experience, messages emphasizing potential appearance losses will work better for discouraging tanning, while messages emphasizing gains such as improved appearance will do a better job of encouraging sunscreen use.
Does prediction error drive one-shot declarative learning?
Greve, Andrea; Cooper, Elisa; Kaula, Alexander; Anderson, Michael C; Henson, Richard
2017-06-01
The role of prediction error (PE) in driving learning is well-established in fields such as classical and instrumental conditioning, reward learning and procedural memory; however, its role in human one-shot declarative encoding is less clear. According to one recent hypothesis, PE reflects the divergence between two probability distributions: one reflecting the prior probability (from previous experiences) and the other reflecting the sensory evidence (from the current experience). Assuming unimodal probability distributions, PE can be manipulated in three ways: (1) the distance between the mode of the prior and evidence, (2) the precision of the prior, and (3) the precision of the evidence. We tested these three manipulations across five experiments, in terms of people's ability to encode a single presentation of a scene-item pairing as a function of previous exposures to that scene and/or item. Memory was probed by presenting the scene together with three choices for the previously paired item, in which the two foil items were from other pairings within the same condition as the target item. In Experiment 1, we manipulated the evidence to be either consistent or inconsistent with prior expectations, predicting PE to be larger, and hence memory better, when the new pairing was inconsistent. In Experiments 2a-c, we manipulated the precision of the priors, predicting better memory for a new pairing when the (inconsistent) priors were more precise. In Experiment 3, we manipulated both visual noise and prior exposure for unfamiliar faces, before pairing them with scenes, predicting better memory when the sensory evidence was more precise. In all experiments, the PE hypotheses were supported. We discuss alternative explanations of individual experiments, and conclude that the Predictive Interactive Multiple Memory Signals (PIMMS) framework provides the most parsimonious account of the full pattern of results.
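The hypothesis that PE is the divergence between a prior distribution and the sensory evidence can be made concrete for unimodal distributions by taking both to be univariate Gaussians. The sketch below uses the KL divergence purely as one illustrative choice of divergence; the functional form and the numbers are assumptions, not the paper's:

```python
import math

def prediction_error(mu_e, var_e, mu_p, var_p):
    """PE sketched as KL(evidence || prior) for univariate Gaussians.
    mu_e/var_e: mode and variance of the sensory evidence;
    mu_p/var_p: mode and variance of the prior."""
    return (0.5 * math.log(var_p / var_e)
            + (var_e + (mu_e - mu_p) ** 2) / (2 * var_p) - 0.5)

consistent  = prediction_error(0.0, 1.0, 0.0, 1.0)    # matching prior: PE = 0
far_mode    = prediction_error(3.0, 1.0, 0.0, 1.0)    # (1) larger mode distance
sharp_prior = prediction_error(3.0, 1.0, 0.0, 0.25)   # (2) more precise prior
sharp_evid  = prediction_error(3.0, 0.25, 0.0, 1.0)   # (3) more precise evidence
print(consistent, far_mode, sharp_prior, sharp_evid)
```

All three manipulations (mode distance, precision of an inconsistent prior, precision of the evidence) increase PE relative to the consistent case, matching the direction of the memory predictions tested in the five experiments.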
Nelson, Jonathon H; Deutsch, Nina; Cohen, Ira T; Reddy, Srijaya K
2017-01-01
Anesthesiology residency programs commonly have rotations at free-standing children's hospitals to provide and/or supplement their residents' training in pediatric anesthesia. Length and timing of these rotations differ from program to program, as can residents' existing medical knowledge and clinical skills. We predicted that residents with prior pediatric anesthesia experience who rotate at our pediatric institution for two consecutive months would score higher on an exit exam than residents without prior pediatric experience or those who rotate for only one month. A 50-question multiple choice test was created using pediatric questions released from The American Board of Anesthesiology (ABA) written examinations. The test was administered and proctored at the end of each rotation. Study participants came from three different programs: Program A offers prior pediatric anesthesia experience and a one-month rotation; Program B offers prior pediatric anesthesia experience and a two-month rotation; and Program C does not offer prior pediatric anesthesia experience but includes a two-month rotation. The 2014-2015 cohort consisted of 26 rotating second-year clinical anesthesia (CA-2) residents. One resident's exam scores were excluded from this study due to protocol violation. Mean exam scores for Programs A, B, and C were 70.5% ± 5.7, 64.2% ± 7.0, and 67.3% ± 4.3, respectively. There was no statistically significant difference in the exit exam scores among the three groups. Neither prior pediatric anesthesia experience nor length of the subspecialty rotation was associated with any significant difference in exit exam scores for CA-2 residents.
A comparison of traditional textbook and interactive computer learning of neuromuscular block.
Ohrn, M A; van Oostrom, J H; van Meurs, W L
1997-03-01
We designed an educational software package, RELAX, for teaching first-year anesthesiology residents about the pharmacology and clinical management of neuromuscular blockade. The software uses an interactive, problem-based approach and moves the user through cases in an operating room environment. It can be run on personal computers with Microsoft Windows (Microsoft Corp., Redmond, WA) and combines video, graphics, and text with mouse-driven user input. We utilized test scores 1) to determine whether our software was beneficial to the educational progress of anesthesiology residents and 2) to compare computer-based learning with textbook learning. Twenty-three residents were divided into two groups matched for age and sex, and a pretest was administered to all 23 residents. There was no significant difference (P > 0.05) in the pretest scores of the two groups. Three weeks later, both groups were subjected to an educational intervention; one with our computer software and the other with selected textbooks. Both groups took a posttest immediately after the intervention. The test scores of the computer group improved significantly more (P < 0.05) than those of the textbook group. Although prior to the study the two groups showed no statistical difference in their familiarity with computers, the computer group reported much higher satisfaction with their learning experience than did the textbook group (P < 0.0001).
Ale, Angelique; Schulz, Ralf B; Sarantopoulos, Athanasios; Ntziachristos, Vasilis
2010-05-01
We study the performance of two newly introduced and previously suggested methods that incorporate priors into inversion schemes for data from a recently developed hybrid x-ray computed tomography and fluorescence molecular tomography system, the latter based on CCD-camera photon detection. The unique data set provides accurately registered, highly spatially sampled photon fields propagating through tissue along 360-degree projections. Structural prior information was incorporated into the inverse problem by adding a penalty term to the minimization function used for image reconstruction. The approaches were compared, using simulated and experimental data from a lung-inflammation animal model, against inversions that did not use priors. The importance of using priors over stand-alone inversions is also showcased with highly spatially sampled simulated and experimental data, and the approach with optimal performance for resolving fluorescent biodistribution in small animals is discussed. Including prior information from x-ray CT data in the reconstruction of the fluorescence biodistribution leads to improved agreement between reconstruction and validation images for both simulated and experimental data.
[The Computer Competency of Nurses in Long-Term Care Facilities and Related Factors].
Chang, Ya-Ping; Kuo, Huai-Ting; Li, I-Chuan
2016-12-01
It is important for nurses who work in long-term care facilities (LTCFs) to have an adequate level of computer competency due to the multidisciplinary and comprehensive nature of long-term care services. Thus, it is important to understand the current computer competency of nursing staff in LTCFs and the factors that relate to this competency. To explore the computer competency of LTCF nurses and to identify the demographic and computer-usage characteristics that relate significantly to computer competency in the LTCF environment. A cross-sectional research design and a self-report questionnaire were used to collect data from 185 nurses working at LTCFs in Taipei. The results show that the variables of frequency of computer use (β = .33), age (β = -.30), type(s) of software used at work (β = .28), hours of on-the-job training (β = -.14), prior work experience at other LTCFs (β = -.14), and Internet use at home (β = .12) explain 58.0% of the variance in the computer competency of participants. The results of the present study suggest that the following measures may help increase the computer competency of LTCF nurses: (1) Nurses should be encouraged to use electronic nursing records rather than handwritten records. (2) On-the-job training programs should emphasize participant competency in the Excel software package in order to maintain efficient, good-quality LTC services after implementation of the LTC insurance policy.
34 CFR 642.32 - Prior experience.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Training Program project under title IV-A-4 of the Higher Education Act within the three fiscal years prior... (34 CFR 642.32, Prior experience; Regulations of the Offices of the Department of Education, Office of Postsecondary Education...)
26 CFR 1.181-4 - Special rules.
Code of Federal Regulations, 2014 CFR
2014-04-01
... before computing gain or loss from the disposition. (2) Principal photography not commencing prior to the... for which principal photography does not commence prior to the date of expiration of section 181, the...
When does prior knowledge disproportionately benefit older adults’ memory?
Badham, Stephen P.; Hay, Mhairi; Foxon, Natasha; Kaur, Kiran; Maylor, Elizabeth A.
2016-01-01
Material consistent with knowledge/experience is generally more memorable than material inconsistent with knowledge/experience – an effect that can be more extreme in older adults. Four experiments investigated knowledge effects on memory with young and older adults. Memory for familiar and unfamiliar proverbs (Experiment 1) and for common and uncommon scenes (Experiment 2) showed similar knowledge effects across age groups. Memory for person-consistent and person-neutral actions (Experiment 3) showed a greater benefit of prior knowledge in older adults. For cued recall of related and unrelated word pairs (Experiment 4), older adults benefited more from prior knowledge only when it provided uniquely useful additional information beyond the episodic association itself. The current data and literature suggest that prior knowledge has the age-dissociable mnemonic properties of (1) improving memory for the episodes themselves (age invariant), and (2) providing conceptual information about the tasks/stimuli extrinsically to the actual episodic memory (particularly aiding older adults). PMID:26473767
Of bits and wows: A Bayesian theory of surprise with applications to attention.
Baldi, Pierre; Itti, Laurent
2010-06-01
The amount of information contained in a piece of data can be measured by the effect this data has on its observer. Fundamentally, this effect is to transform the observer's prior beliefs into posterior beliefs, according to Bayes' theorem. Thus the amount of information can be measured in a natural way by the distance (relative entropy) between the prior and posterior distributions of the observer over the available space of hypotheses. This facet of information, termed "surprise", is important in dynamic situations where beliefs change, in particular during learning and adaptation. Surprise can often be computed analytically, for instance in the case of distributions from the exponential family, or it can be numerically approximated. During sequential Bayesian learning, surprise decreases as the inverse of the number of training examples. Theoretical properties of surprise are discussed, in particular how it differs from and complements Shannon's definition of information. A computer vision neural network architecture is then presented capable of computing surprise over images and video stimuli. Hypothesizing that surprising data ought to attract natural or artificial attention systems, the output of this architecture is used in a psychophysical experiment to analyze human eye movements in the presence of natural video stimuli. Surprise is found to yield robust performance at predicting human gaze (ROC-like ordinal dominance score approximately 0.7, compared to approximately 0.8 for human inter-observer repeatability, approximately 0.6 for a simpler intensity contrast-based predictor, and 0.5 for chance). The resulting theory of surprise is applicable across different spatio-temporal scales, modalities, and levels of abstraction. Copyright 2010 Elsevier Ltd. All rights reserved.
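The surprise measure just described has a closed form for exponential-family models. As a sketch (ours, not the paper's code): for a Beta-Bernoulli observer, surprise is the KL divergence between posterior and prior Beta distributions, and a repeated observation grows less surprising as evidence accumulates. The Beta-Beta KL formula and the digamma approximation below are standard results.

```python
from math import lgamma, log

def digamma(x):
    """Digamma via recurrence plus asymptotic series (positive x)."""
    r = 0.0
    while x < 6.0:          # shift argument up to where the series is accurate
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + log(x) - 0.5 / x - f * (1/12 - f * (1/120 - f / 252))

def beta_kl(a1, b1, a2, b2):
    """KL( Beta(a1,b1) || Beta(a2,b2) ) in nats: the 'surprise' when a
    Beta(a2,b2) prior is updated to a Beta(a1,b1) posterior."""
    return (lgamma(a1 + b1) - lgamma(a1) - lgamma(b1)
            - lgamma(a2 + b2) + lgamma(a2) + lgamma(b2)
            + (a1 - a2) * digamma(a1)
            + (b1 - b2) * digamma(b1)
            + (a2 + b2 - a1 - b1) * digamma(a1 + b1))

# A naive observer (flat Beta(1,1) prior) sees one success: large surprise.
early = beta_kl(2, 1, 1, 1)        # = ln 2 - 1/2, about 0.193 nats
# After 100 identical observations, one more barely surprises:
late = beta_kl(102, 1, 101, 1)
```

Note how `late` is orders of magnitude smaller than `early`, matching the abstract's claim that surprise decays with the number of training examples.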
Attention in a Bayesian Framework
Whiteley, Louise; Sahani, Maneesh
2012-01-01
The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models of perception, and use this observation to frame a new computational account of the need for, and action of, attention – unifying diverse attentional phenomena in a way that goes beyond previous inferential, probabilistic and Bayesian models. Attentional effects are most evident in cluttered environments, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental settings, where cues shape expectations about a small number of upcoming stimuli and thus convey “prior” information about clearly defined objects. While operationally consistent with the experiments it seeks to describe, this view of attention as prior seems to miss many essential elements of both its selective and integrative roles, and thus cannot be easily extended to complex environments. We suggest that the resource bottleneck stems from the computational intractability of exact perceptual inference in complex settings, and that attention reflects an evolved mechanism for approximate inference which can be shaped to refine the local accuracy of perception. We show that this approach extends the simple picture of attention as prior, so as to provide a unified and computationally driven account of both selective and integrative attentional phenomena. PMID:22712010
An Interview with Matthew P. Greving, PhD. Interview by Vicki Glaser.
Greving, Matthew P
2011-10-01
Matthew P. Greving is Chief Scientific Officer at Nextval Inc., a company founded in early 2010 that has developed a discovery platform called MassInsight™. He received his PhD in Biochemistry from Arizona State University, and prior to that he spent nearly 7 years working as a software engineer. This experience in solving complex computational problems fueled his interest in developing technologies and algorithms related to acquisition and analysis of high-dimensional biochemical data. To address the existing problems associated with label-based microarray readouts, he began work on a technique for label-free mass spectrometry (MS) microarray readout compatible with both matrix-assisted laser desorption/ionization (MALDI) and matrix-free nanostructure initiator mass spectrometry (NIMS). This is the core of Nextval's MassInsight technology, which utilizes picoliter noncontact deposition of high-density arrays on mass-readout substrates along with computational algorithms for high-dimensional data processing and reduction.
Development of a computational model for astronaut reorientation.
Stirling, Leia; Willcox, Karen; Newman, Dava
2010-08-26
The ability to model astronaut reorientations computationally provides a simple way to develop and study human motion control strategies. Since the cost of experimenting in microgravity is high, and underwater training can lead to motions inappropriate for microgravity, these techniques allow for motions to be developed and well-understood prior to any microgravity exposure. By including a model of the current space suit, we have the ability to study both intravehicular and extravehicular activities. We present several techniques for rotating about the axes of the body and show that motions performed by the legs create a greater net rotation than those performed by the arms. Adding a space suit to the motions was seen to increase the resistance torque and limit the available range of motion. While rotations about the body axes can be performed in the current space suit, the resulting motions generated a reduced rotation when compared to the unsuited configuration. 2010 Elsevier Ltd. All rights reserved.
Computational fluid dynamics: Transition to design applications
NASA Technical Reports Server (NTRS)
Bradley, R. G.; Bhateley, I. C.; Howell, G. A.
1987-01-01
The development of aerospace vehicles, over the years, was an evolutionary process in which engineering progress in the aerospace community was based, generally, on prior experience and data bases obtained through wind tunnel and flight testing. Advances in the fundamental understanding of flow physics, wind tunnel and flight test capability, and mathematical insights into the governing flow equations were translated into improved air vehicle design. The modern day field of Computational Fluid Dynamics (CFD) is a continuation of the growth in analytical capability and the digital mathematics needed to solve the more rigorous form of the flow equations. Some of the technical and managerial challenges that result from rapidly developing CFD capabilities, some of the steps being taken by the Fort Worth Division of General Dynamics to meet these challenges, and some of the specific areas of application for high performance air vehicles are presented.
Computational model of polarized actin cables and cytokinetic actin ring formation in budding yeast
Tang, Haosu; Bidone, Tamara C.
2015-01-01
The budding yeast actin cables and contractile ring are important for polarized growth and division, revealing basic aspects of cytoskeletal function. To study these formin-nucleated structures, we built a 3D computational model with actin filaments represented as beads connected by springs. Polymerization by formins at the bud tip and bud neck, crosslinking, severing, and myosin pulling, are included. Parameter values were estimated from prior experiments. The model generates actin cable structures and dynamics similar to those of wild type and formin deletion mutant cells. Simulations with increased polymerization rate result in long, wavy cables. Simulated pulling by type V myosin stretches actin cables. Increasing the affinity of actin filaments for the bud neck together with reduced myosin V pulling promotes the formation of a bundle of antiparallel filaments at the bud neck, which we suggest as a model for the assembly of actin filaments to the contractile ring. PMID:26538307
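The bead-and-spring filament representation described above can be illustrated with a minimal Hookean-chain force computation. This is a generic sketch (function name, parameters, and values are ours, not those of the published 3D model):

```python
import math

def spring_forces(beads, k=1.0, rest_len=0.1):
    """Hookean forces on a chain of beads given as (x, y, z) tuples.

    Consecutive beads are joined by springs; each force pushes the
    bead spacing toward rest_len. Returns one force vector per bead.
    """
    n = len(beads)
    forces = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n - 1):
        dx = [beads[i + 1][d] - beads[i][d] for d in range(3)]
        dist = math.sqrt(sum(c * c for c in dx))
        if dist == 0.0:            # coincident beads: direction undefined
            continue
        mag = k * (dist - rest_len)   # > 0 if stretched, < 0 if compressed
        unit = [c / dist for c in dx]
        for d in range(3):
            forces[i][d] += mag * unit[d]       # pulled toward neighbor
            forces[i + 1][d] -= mag * unit[d]   # equal and opposite
    return forces

# Two beads stretched beyond the rest length attract each other:
f = spring_forces([(0.0, 0.0, 0.0), (0.2, 0.0, 0.0)], k=1.0, rest_len=0.1)
```

In a full simulation these forces would be combined with crosslinking, severing, and myosin-pulling terms and integrated over time; here only the spring term is shown.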
Tremel, Joshua J; Ortiz, Daniella M; Fiez, Julie A
2018-06-01
When making a decision, we have to identify, collect, and evaluate relevant bits of information to ensure an optimal outcome. How we approach a given choice can be influenced by prior experience. Contextual factors and structural elements of these past decisions can cause a shift in how information is encoded and can in turn influence later decision-making. In this two-experiment study, we sought to manipulate declarative memory efficacy and decision-making in a concurrent discrimination learning task by altering the amount of information to be learned. Subjects learned correct responses to pairs of items across several repetitions of a 50- or 100-pair set and were tested for memory retention. In one experiment, this memory test interrupted learning after an initial encoding experience in order to test for early encoding differences and associate those differences with changes in decision-making. In a second experiment, we used fMRI to probe neural differences between the two list-length groups related to decision-making across learning and assessed subsequent memory retention. We found that a striatum-based system was associated with decision-making patterns when learning a longer list of items, while a medial cortical network was associated with patterns when learning a shorter list. Additionally, the hippocampus was exclusively active for the shorter list group. Altogether, these behavioral, computational, and imaging results provide evidence that multiple types of mnemonic representations contribute to experience-based decision-making. Moreover, contextual and structural factors of the task and of prior decisions can influence what types of evidence are drawn upon during decision-making. Copyright © 2018 Elsevier Ltd. All rights reserved.
Active Prior Tactile Knowledge Transfer for Learning Tactual Properties of New Objects
Feng, Di
2018-01-01
Reusing the tactile knowledge of some previously-explored objects (prior objects) helps us to easily recognize the tactual properties of new objects. In this paper, we enable a robotic arm equipped with multi-modal artificial skin, like humans, to actively transfer the prior tactile exploratory action experiences when it learns the detailed physical properties of new objects. These experiences, or prior tactile knowledge, are built by the feature observations that the robot perceives from multiple sensory modalities, when it applies the pressing, sliding, and static contact movements on objects with different action parameters. We call our method Active Prior Tactile Knowledge Transfer (APTKT), and systematically evaluated its performance by several experiments. Results show that the robot improved the discrimination accuracy by around 10% when it used only one training sample with the feature observations of prior objects. By further incorporating the predictions from the observation models of prior objects as auxiliary features, our method improved the discrimination accuracy by over 20%. The results also show that the proposed method is robust against transferring irrelevant prior tactile knowledge (negative knowledge transfer). PMID:29466300
Computers and Instruction: Implications of the Rising Tide of Criticism for Reading Education.
ERIC Educational Resources Information Center
Balajthy, Ernest
1988-01-01
Examines two major reasons that schools have adopted computers without careful prior examination and planning. Surveys a variety of criticisms targeted toward some aspects of computer-based instruction in reading in an effort to direct attention to the beneficial implications of computers in the classroom. (MS)
The Role of Prior Knowledge in Learning from Analogies in Science Texts
ERIC Educational Resources Information Center
Braasch, Jason L. G.; Goldman, Susan R.
2010-01-01
Two experiments examined whether inconsistent effects of analogies in promoting new content learning from text are related to prior knowledge of the analogy "per se." In Experiment 1, college students who demonstrated little understanding of weather systems and different levels of prior knowledge (more vs. less) of an analogous everyday…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Jared M; Ferber, Aaron E; Greenlee, Elliot D
Akatosh is a highly configurable system based on the integration of the capabilities of one or more Intrusion Detection Systems (IDS) and automated forensic analysis. Akatosh reduces the false positive rates of IDSs and alleviates costs of incident response by pointing forensic personnel to the root cause of an incident on affected endpoint devices. Akatosh is able to analyze a computer system in near real-time and provide operations and forensic analyst personnel with continuous feedback on the impact of malware and software on deployed systems. Additionally, Akatosh provides the ability to look back into any prior state in the history of the computer system along with the ability to compare one or more prior system states with any other prior state.
Parallelized Bayesian inversion for three-dimensional dental X-ray imaging.
Kolehmainen, Ville; Vanne, Antti; Siltanen, Samuli; Järvenpää, Seppo; Kaipio, Jari P; Lassas, Matti; Kalke, Martti
2006-02-01
Diagnostic and operational tasks based on dental radiology often require three-dimensional (3-D) information that is not available in a single X-ray projection image. Comprehensive 3-D information about tissues can be obtained by computerized tomography (CT) imaging. However, in dental imaging a conventional CT scan may not be available or practical because of high radiation dose, low resolution, or the cost of the CT scanner equipment. In this paper, we consider a novel type of 3-D imaging modality for dental radiology. We consider situations in which projection images of the teeth are taken from a few sparsely distributed projection directions using the dentist's regular (digital) X-ray equipment and the 3-D X-ray attenuation function is reconstructed. A complication in these experiments is that the reconstruction of the 3-D structure based on a few projection images becomes an ill-posed inverse problem. Bayesian inversion is a well-suited framework for reconstruction from such incomplete data. In Bayesian inversion, the ill-posed reconstruction problem is formulated in a well-posed probabilistic form in which a priori information is used to compensate for the incomplete information of the projection data. In this paper we propose a Bayesian method for 3-D reconstruction in dental radiology. The method is partially based on Kolehmainen et al. (2003). The prior model for dental structures consists of a weighted l1- and total variation (TV)-prior together with the positivity prior. The inverse problem is stated as finding the maximum a posteriori (MAP) estimate. To make the 3-D reconstruction computationally feasible, a parallelized version of an optimization algorithm is implemented for a Beowulf cluster computer. The method is tested with projection data from dental specimens and patient data. Tomosynthetic reconstructions are given as reference for the proposed method.
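The MAP estimation problem described in the abstract can be sketched in generic notation (our symbols, which may differ from the paper's exact weighting: A the projection operator, m the measured projections, W a diagonal weight matrix, alpha and beta prior weights, and a positivity constraint on x):

```latex
\hat{x}_{\mathrm{MAP}}
  = \arg\min_{x \,\ge\, 0}
    \left\{
      \frac{1}{2\sigma^{2}}\,\lVert A x - m \rVert_{2}^{2}
      \;+\; \alpha\,\lVert W x \rVert_{1}
      \;+\; \beta\,\mathrm{TV}(x)
    \right\},
\qquad
\mathrm{TV}(x) = \sum_{i} \lVert (\nabla x)_{i} \rVert_{2}.
```

The weighted l1 term promotes sparsity, the TV term favors piecewise-constant structures such as teeth against soft tissue, and the constraint x >= 0 is the positivity prior.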
Development of schemas revealed by prior experience and NMDA receptor knock-out
Dragoi, George; Tonegawa, Susumu
2013-01-01
Prior experience accelerates acquisition of novel, related information through processes like assimilation into mental schemas, but the underlying neuronal mechanisms are poorly understood. We investigated the roles that prior experience and hippocampal CA3 N-methyl-D-aspartate receptor (NMDAR)-dependent synaptic plasticity play in CA1 place cell sequence encoding and learning during novel spatial experiences. We found that specific representations of de novo experiences on linear environments were formed on a framework of preconfigured network activity expressed in the preceding sleep and were rapidly, flexibly adjusted via NMDAR-dependent activity. This prior experience accelerated encoding of subsequent experiences on contiguous or isolated novel tracks, significantly decreasing their NMDAR-dependence. Similarly, de novo learning of an alternation task was facilitated by CA3 NMDARs; this experience accelerated subsequent learning of related tasks, independent of CA3 NMDARs, consistent with schema-based learning. These results reveal the existence of distinct neuronal encoding schemes which could explain why hippocampal dysfunction results in anterograde amnesia while sparing recollection of old, schema-based memories. DOI: http://dx.doi.org/10.7554/eLife.01326.001 PMID:24327561
CT Imaging, Data Reduction, and Visualization of Hardwood Logs
Daniel L. Schmoldt
1996-01-01
Computed tomography (CT) is a mathematical technique that, combined with noninvasive scanning such as x-ray imaging, has become a powerful tool to nondestructively test materials prior to use or to evaluate materials prior to processing. In the current context, hardwood lumber processing can benefit greatly from knowing what a log looks like prior to initial breakdown...
Effects of Prior Experience on Shelter-Seeking Behavior of Juvenile American Lobsters.
Bayer, Skylar R; Bianchi, Katherine M; Atema, Jelle; Jacobs, Molly W
2017-04-01
Shelter-seeking behaviors are vital for survival for a range of juvenile benthic organisms. These behaviors may be innate or they may be affected by prior experience. After hatching, American lobsters Homarus americanus likely first come into contact with shelter during the late postlarval (decapodid) stage, known as stage IV. After the subsequent molt to the first juvenile stage (stage V), they are entirely benthic and are thought to be highly cryptic. We hypothesized that postlarval (stage IV) experience with shelter would carry over into the first juvenile stage (stage V) and reduce the time needed for juveniles to locate and enter shelters (sheltering). We found some evidence of a carryover effect, but not the one we predicted: stage V juveniles with postlarval shelter experience took significantly longer to initiate sheltering. We also hypothesized that stage V juveniles would demonstrate learning by relocating shelters more quickly with immediate prior experience. Our findings were mixed. In a maze, juveniles with immediate prior experience were faster to regain visual contact with shelter, suggesting that they had learned the location of the shelter. In contrast, there was no significant effect of immediate prior experience on time to initiate sheltering in an open arena, or in the maze after juveniles had regained visual contact. We conclude that very young (stage V) juvenile lobsters modify their shelter-seeking behavior based on prior experiences across several timescales. Ecologically relevant variation in habitat exposure among postlarval and early juvenile lobsters may influence successful recruitment in this culturally and commercially important fishery species.
Littel, Marianne; van Schie, Kevin; van den Hout, Marcel A.
2017-01-01
Background: Eye movement desensitization and reprocessing (EMDR) is an effective psychological treatment for posttraumatic stress disorder. Recalling a memory while simultaneously making eye movements (EM) decreases a memory's vividness and/or emotionality. It has been argued that non-specific factors, such as treatment expectancy and experimental demand, may contribute to EMDR's effectiveness. Objective: The present study was designed to test whether expectations about the working mechanism of EMDR would alter the memory attenuating effects of EM. Two experiments were conducted. In Experiment 1, we examined the effects of pre-existing (non-manipulated) knowledge of EMDR in participants with and without prior knowledge. In Experiment 2, we experimentally manipulated prior knowledge by providing participants without prior knowledge with correct or incorrect information about EMDR's working mechanism. Method: Participants in both experiments recalled two aversive, autobiographical memories during brief sets of EM (Recall+EM) or keeping eyes stationary (Recall Only). Before and after the intervention, participants scored their memories on vividness and emotionality. A Bayesian approach was used to compare two competing hypotheses on the effects of (existing/given) prior knowledge: (1) prior (correct) knowledge increases the effects of Recall+EM vs. Recall Only, vs. (2) prior knowledge does not affect the effects of Recall+EM. Results: Recall+EM caused greater reductions in memory vividness and emotionality than Recall Only in all groups, including the incorrect information group. In Experiment 1, both hypotheses were supported by the data: prior knowledge boosted the effects of EM, but only modestly. In Experiment 2, the second hypothesis was clearly supported over the first: providing knowledge of the underlying mechanism of EMDR did not alter the effects of EM. Conclusions: Recall+EM appears to be quite robust against the effects of prior expectations. As Recall+EM is the core component of EMDR, expectancy effects probably contribute little to the effectiveness of EMDR treatment. PMID:29038685
Hierarchical clustering method for improved prostate cancer imaging in diffuse optical tomography
NASA Astrophysics Data System (ADS)
Kavuri, Venkaiah C.; Liu, Hanli
2013-03-01
We investigate the feasibility of trans-rectal near-infrared (NIR)-based diffuse optical tomography (DOT) for early detection of prostate cancer using a transrectal ultrasound (TRUS)-compatible imaging probe. For this purpose, we designed a TRUS-compatible, NIR-based imaging system (780 nm), in which the photodiodes were placed on the trans-rectal probe. DC signals were recorded and used for estimating the absorption coefficient. We validated the system using laboratory phantoms. For further improvement, we also developed a hierarchical clustering method (HCM) to improve the accuracy of image reconstruction with limited prior information. We demonstrated the method using computer simulations and laboratory phantom experiments.
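As a generic illustration of hierarchical (agglomerative) clustering of the kind named above (not the authors' HCM, whose details are not given here), a minimal single-linkage pass over sorted 1-D estimates of an optical property:

```python
def single_linkage(points, threshold):
    """Greedy single-linkage agglomerative clustering of 1-D values.

    Adjacent clusters merge while the gap between them is within
    `threshold`; clusters are kept in sorted order throughout.
    """
    clusters = [[p] for p in sorted(points)]
    merged = True
    while merged and len(clusters) > 1:
        merged = False
        for i in range(len(clusters) - 1):
            gap = clusters[i + 1][0] - clusters[i][-1]
            if gap <= threshold:
                clusters[i] = clusters[i] + clusters[i + 1]
                del clusters[i + 1]
                merged = True
                break
    return clusters

# Two well-separated groups of absorption-coefficient estimates (mm^-1,
# illustrative values only) fall out as two clusters:
groups = single_linkage([0.01, 0.012, 0.011, 0.05, 0.052], threshold=0.005)
```

Grouping voxels or estimates this way is one standard means of injecting soft structural priors into a reconstruction when exact anatomy is unknown.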
Students' explanations in complex learning of disciplinary programming
NASA Astrophysics Data System (ADS)
Vieira, Camilo
Computational Science and Engineering (CSE) has been called the third pillar of science and a set of important skills for solving the problems of a global society. Along with the theoretical and the experimental approaches, computation offers a third alternative to solve complex problems that require processing large amounts of data, or representing complex phenomena that are not easy to experiment with. Despite the relevance of CSE, current professionals and scientists are not well prepared to take advantage of this set of tools and methods. Computation is usually taught in isolation from engineering disciplines, and therefore, engineers do not know how to exploit CSE affordances. This dissertation introduces computational tools and methods contextualized within the Materials Science and Engineering curriculum. Considering that learning how to program is a complex task, the dissertation explores effective pedagogical practices that can support student disciplinary and computational learning. Two case studies are evaluated to identify the characteristics of effective worked examples in the context of CSE. Specifically, this dissertation explores students' explanations of these worked examples in two engineering courses with different levels of transparency: a programming course in materials science and engineering (glass box) and a thermodynamics course involving computational representations (black box). Results from this study suggest that students benefit in different ways from writing in-code comments. These benefits include but are not limited to: connecting individual lines of code to the overall problem, getting familiar with the syntax, learning effective algorithm design strategies, and connecting computation with their discipline. Students in the glass box context generate higher-quality explanations than students in the black box context. These explanations are related to students' prior experiences. Specifically, students with low programming ability engage in a more thorough explanation process than students with high ability. This dissertation concludes by proposing an adaptation of the instructional principles of worked examples for the context of CSE education.
GPU implementation of prior image constrained compressed sensing (PICCS)
NASA Astrophysics Data System (ADS)
Nett, Brian E.; Tang, Jie; Chen, Guang-Hong
2010-04-01
The Prior Image Constrained Compressed Sensing (PICCS) algorithm (Med. Phys. 35, pg. 660, 2008) has been applied to several computed tomography applications with both standard CT systems and flat-panel based systems designed for guiding interventional procedures and radiation therapy treatment delivery. The PICCS algorithm typically utilizes a prior image which is reconstructed via the standard Filtered Backprojection (FBP) reconstruction algorithm. The algorithm then iteratively solves for the image volume that matches the measured data, while simultaneously ensuring the image is similar to the prior image. The PICCS algorithm has demonstrated utility in several applications including: improved temporal resolution reconstruction, 4D respiratory phase-specific reconstructions for radiation therapy, and cardiac reconstruction from data acquired on an interventional C-arm. One disadvantage of the PICCS algorithm, as with other iterative algorithms, is the long computation time typically associated with reconstruction. For an algorithm to gain clinical acceptance, reconstruction must be achievable in minutes rather than hours. In this work the PICCS algorithm has been implemented on the GPU in order to significantly reduce its reconstruction time. The Compute Unified Device Architecture (CUDA) was used in this implementation.
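The abstract does not restate the PICCS objective; as a sketch drawn from the PICCS literature (not from this abstract itself), the reconstruction it describes is typically posed as a constrained minimization that balances similarity to the FBP prior image against sparsity of the image itself:

```latex
% PICCS objective (sketch): x is the reconstructed image, x_p the FBP
% prior image, A the system matrix, y the measured projection data, and
% \Psi_1, \Psi_2 sparsifying transforms (often discrete gradients / TV).
\min_{x} \; \alpha \left\| \Psi_1 (x - x_p) \right\|_1
        + (1 - \alpha) \left\| \Psi_2 x \right\|_1
\quad \text{subject to} \quad A x = y, \qquad \alpha \in [0, 1]
```

The per-iteration cost is dominated by repeated applications of $A$ and $A^{T}$ (forward and backprojection), which is why a GPU implementation of exactly those operators yields the large speedups the abstract targets.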
The impact of the rate prior on Bayesian estimation of divergence times with multiple Loci.
Dos Reis, Mario; Zhu, Tianqi; Yang, Ziheng
2014-07-01
Bayesian methods provide a powerful way to estimate species divergence times by combining information from molecular sequences with information from the fossil record. With the explosive increase of genomic data, divergence time estimation increasingly uses data of multiple loci (genes or site partitions). Widely used computer programs to estimate divergence times use independent and identically distributed (i.i.d.) priors on the substitution rates for different loci. The i.i.d. prior is problematic. As the number of loci (L) increases, the prior variance of the average rate across all loci goes to zero at the rate 1/L. As a consequence, the rate prior dominates posterior time estimates when many loci are analyzed, and if the rate prior is misspecified, the estimated divergence times will converge to wrong values with very narrow credibility intervals. Here we develop a new prior on the locus rates based on the Dirichlet distribution that corrects the problematic behavior of the i.i.d. prior. We use computer simulation and real data analysis to highlight the differences between the old and new priors. For a dataset for six primate species, we show that with the old i.i.d. prior, if the prior rate is too high (or too low), the estimated divergence times are too young (or too old), outside the bounds imposed by the fossil calibrations. In contrast, with the new Dirichlet prior, posterior time estimates are insensitive to the rate prior and are compatible with the fossil calibrations. We re-analyzed a phylogenomic data set of 36 mammal species and show that using many fossil calibrations can alleviate the adverse impact of a misspecified rate prior to some extent. We recommend the use of the new Dirichlet prior in Bayesian divergence time estimation. [Bayesian inference, divergence time, relaxed clock, rate prior, partition analysis.]. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
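The 1/L shrinkage described above can be illustrated with a minimal simulation. The gamma hyperparameters and the specific Dirichlet construction below are illustrative stand-ins, not the paper's actual settings: under the i.i.d. prior the variance of the average rate collapses as loci are added, while placing the prior on the mean rate and partitioning it with a Dirichlet keeps that variance fixed.

```python
import random
import statistics

random.seed(42)

SHAPE, SCALE = 2.0, 0.5          # illustrative gamma hyperparameters

def iid_mean_rate(L):
    """Average rate across L loci under the i.i.d. gamma prior."""
    return sum(random.gammavariate(SHAPE, SCALE) for _ in range(L)) / L

def dirichlet_mean_rate(L, conc=1.0):
    """Average rate when the gamma prior is placed on the mean rate mu and
    locus rates are L * mu times a Dirichlet-distributed partition."""
    mu = random.gammavariate(SHAPE, SCALE)
    g = [random.gammavariate(conc, 1.0) for _ in range(L)]
    total = sum(g)
    rates = [L * mu * gi / total for gi in g]
    return sum(rates) / L        # equals mu up to rounding

N = 4000
var_1   = statistics.pvariance([iid_mean_rate(1) for _ in range(N)])
var_100 = statistics.pvariance([iid_mean_rate(100) for _ in range(N)])
var_dir = statistics.pvariance([dirichlet_mean_rate(100) for _ in range(N)])

# i.i.d.: prior variance of the mean rate shrinks roughly 100-fold at L=100;
# Dirichlet: it stays at the prior variance of mu, so the data (and fossil
# calibrations), not the rate prior, dominate the posterior.
print(var_1, var_100, var_dir)
```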
NASA Astrophysics Data System (ADS)
Pohlman, Nicholas A.; Hynes, Eric; Kutz, April
2015-11-01
Lectures in introductory fluid mechanics at NIU enroll a combination of students with standard enrollment and students seeking honors credit for an enriching experience. Most honors students dread the additional homework problems or extra paper assigned by the instructor. During the past three years, honors students in my class have instead collaborated to design wet-lab experiments for their peers to predict variable volume flow rates of open reservoirs driven by gravity. Rather than doing extra work, the honors students learn the Bernoulli head-loss equation earlier so that they can design appropriate systems for an experimental wet lab. Prior designs incorporated minor-loss features such as a sudden contraction or multiple unions and valves. The honors students from Spring 2015 expanded the repertoire of available options by developing large-scale set-ups with multiple pipe networks that could be combined to test the flexibility of the student teams' computational programs. Bridging theory with practice engaged all of the students, and multiple teams were able to predict performance within 4% accuracy. The challenges, schedules, and cost estimates of incorporating the experimental lab into an introductory fluid mechanics course will be reported.
Quantifying natural delta variability using a multiple-point geostatistics prior uncertainty model
NASA Astrophysics Data System (ADS)
Scheidt, Céline; Fernandes, Anjali M.; Paola, Chris; Caers, Jef
2016-10-01
We address the question of quantifying uncertainty associated with autogenic pattern variability in a channelized transport system by means of a modern geostatistical method. This question has considerable relevance for practical subsurface applications as well, particularly those related to uncertainty quantification relying on Bayesian approaches. Specifically, we show how the autogenic variability in a laboratory experiment can be represented and reproduced by a multiple-point geostatistical prior uncertainty model. The latter geostatistical method requires selection of a limited set of training images from which a possibly infinite set of geostatistical model realizations, mimicking the training image patterns, can be generated. To that end, we investigate two methods to determine how many and which training images should be provided to reproduce natural autogenic variability. The first method relies on distance-based clustering of overhead snapshots of the experiment; the second relies on rate-of-change quantification by means of a computer vision algorithm termed the demon algorithm. We show quantitatively that with either training image selection method, we can statistically reproduce the natural variability of the delta formed in the experiment. In addition, we study the nature of the patterns represented in the set of training images as a representation of the "eigenpatterns" of the natural system. The eigenpatterns in the training image sets display patterns consistent with previous physical interpretations of the fundamental modes of this type of delta system: a highly channelized, incisional mode; a poorly channelized, depositional mode; and an intermediate mode between the two.
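As a toy illustration of the first selection method (distance-based clustering of snapshots), the sketch below clusters synthetic "snapshots" and keeps the cluster medoids as training images. The Euclidean distance, the farthest-point seeding, the single medoid-refinement pass, and all data are invented for the example; the paper's actual clustering of overhead delta images is more elaborate.

```python
import random

random.seed(0)

def dist(a, b):
    """Euclidean distance between two flattened snapshots."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Toy "snapshots": two pattern families plus noise, standing in for
# overhead images of the evolving delta surface
snapshots = []
for base in ([0.0] * 5, [10.0] * 5):
    for _ in range(10):
        snapshots.append([v + random.gauss(0.0, 0.5) for v in base])

def select_training_images(snaps, k):
    """Distance-based selection: greedy farthest-point seeding followed by
    one k-medoids refinement pass; the medoids become the training images."""
    medoids = [0]
    while len(medoids) < k:
        far = max(range(len(snaps)),
                  key=lambda i: min(dist(snaps[i], snaps[m]) for m in medoids))
        medoids.append(far)
    clusters = {m: [] for m in medoids}
    for i, s in enumerate(snaps):
        clusters[min(medoids, key=lambda m: dist(s, snaps[m]))].append(i)
    # each cluster's medoid minimizes the total within-cluster distance
    return [min(members,
                key=lambda c: sum(dist(snaps[c], snaps[j]) for j in members))
            for members in clusters.values()]

training = select_training_images(snapshots, k=2)
```

With well-separated pattern families, the two selected medoids land one per family, i.e., one representative training image per recurring pattern.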
Own and Others' Prior Experiences Influence Children's Imitation of Causal Acts
ERIC Educational Resources Information Center
Williamson, Rebecca A.; Meltzoff, Andrew N.
2011-01-01
Young children learn from others' examples, and they do so selectively. We examine whether the efficacy of prior experiences influences children's imitation. Thirty-six-month-olds had initial experience on a causal learning task either by performing the task themselves or by watching an adult perform it. The nature of the experience was…
Altan, Irem; Charbonneau, Patrick; Snell, Edward H.
2016-01-01
Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed. PMID:26792536
College students and computers: assessment of usage patterns and musculoskeletal discomfort.
Noack-Cooper, Karen L; Sommerich, Carolyn M; Mirka, Gary A
2009-01-01
A limited number of studies have focused on computer-use-related MSDs in college students, though risk factor exposure may be similar to that of workers who use computers. This study examined computer use patterns of college students and made comparisons to a group of previously studied computer-using professionals. 234 students completed a web-based questionnaire concerning computer use habits and physical discomfort that respondents specifically associated with computer use. As a group, students reported their computer use to be at least 'Somewhat likely' 18 out of 24 h/day, compared to 12 h for the professionals. Students reported more uninterrupted work behaviours than the professionals. Younger graduate students reported 33.7 average weekly computing hours, similar to hours reported by younger professionals. Students generally reported more frequent upper extremity discomfort than the professionals. Frequent assumption of awkward postures was associated with frequent discomfort. The findings signal a need for intervention, including training and education, prior to entry into the workforce. Students are future workers, so it is important to determine whether their increasing exposure to computers, prior to entering the workforce, may cause them to enter already injured or to avoid their chosen profession due to upper extremity MSDs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Churchfield, M.; Wang, Q.; Scholbrock, A.
Here, we describe the process of using large-eddy simulations of wind turbine wake flow to help design a wake measurement campaign. The main goal of the experiment is to measure wakes and wake deflection that result from intentional yaw misalignment under a variety of atmospheric conditions at the Scaled Wind Farm Technology facility operated by Sandia National Laboratories in Lubbock, Texas. Prior simulation studies have shown that wake deflection may be used for wind-plant control that maximizes plant power output. In this study, simulations are performed to characterize wake deflection and general behavior before the experiment is performed to ensure better upfront planning. Beyond characterizing the expected wake behavior, we also use the large-eddy simulation to test a virtual version of the lidar we plan to use to measure the wake and better understand our lidar scan strategy options. This work is an excellent example of a 'simulation-in-the-loop' measurement campaign.
Improving collaborative learning in online software engineering education
NASA Astrophysics Data System (ADS)
Neill, Colin J.; DeFranco, Joanna F.; Sangwan, Raghvinder S.
2017-11-01
Team projects are commonplace in software engineering education. They address a key educational objective, provide students critical experience relevant to their future careers, allow instructors to set problems of greater scale and complexity than could be tackled individually, and are a vehicle for socially constructed learning. While all student teams experience challenges, those in fully online programmes must also deal with remote working, asynchronous coordination, and computer-mediated communications all of which contribute to greater social distance between team members. We have developed a facilitation framework to aid team collaboration and have demonstrated its efficacy, in prior research, with respect to team performance and outcomes. Those studies indicated, however, that despite experiencing improved project outcomes, students working in effective software engineering teams did not experience significantly improved individual achievement. To address this deficiency we implemented theoretically grounded refinements to the collaboration model based upon peer-tutoring research. Our results indicate a modest, but statistically significant (p = .08), improvement in individual achievement using this refined model.
NASA Astrophysics Data System (ADS)
Churchfield, M.; Wang, Q.; Scholbrock, A.; Herges, T.; Mikkelsen, T.; Sjöholm, M.
2016-09-01
We describe the process of using large-eddy simulations of wind turbine wake flow to help design a wake measurement campaign. The main goal of the experiment is to measure wakes and wake deflection that result from intentional yaw misalignment under a variety of atmospheric conditions at the Scaled Wind Farm Technology facility operated by Sandia National Laboratories in Lubbock, Texas. Prior simulation studies have shown that wake deflection may be used for wind-plant control that maximizes plant power output. In this study, simulations are performed to characterize wake deflection and general behavior before the experiment is performed to ensure better upfront planning. Beyond characterizing the expected wake behavior, we also use the large-eddy simulation to test a virtual version of the lidar we plan to use to measure the wake and better understand our lidar scan strategy options. This work is an excellent example of a “simulation-in-the-loop” measurement campaign.
Objectified quantification of uncertainties in Bayesian atmospheric inversions
NASA Astrophysics Data System (ADS)
Berchet, A.; Pison, I.; Chevallier, F.; Bousquet, P.; Bonne, J.-L.; Paris, J.-D.
2015-05-01
Classical Bayesian atmospheric inversions process atmospheric observations and prior emissions, the two being connected by an observation operator picturing mainly the atmospheric transport. These inversions rely on prescribed errors in the observations, the prior emissions and the observation operator. When observations are sparse, inversion results are very sensitive to the prescribed error distributions, which are not accurately known. The classical Bayesian framework experiences difficulties in quantifying the impact of mis-specified error distributions on the optimized fluxes. In order to cope with this issue, we rely on recent research results to enhance the classical Bayesian inversion framework through a marginalization on a large set of plausible errors that can be prescribed in the system. The marginalization consists of computing inversions for all possible error distributions, weighted by the probability of occurrence of the error distributions. The posterior distribution of the fluxes calculated by the marginalization is not explicitly describable. As a consequence, we carry out a Monte Carlo sampling based on an approximation of the probability of occurrence of the error distributions. This approximation is deduced from the well-tested method of maximum likelihood estimation. Thus, the marginalized inversion relies on an automatic, objectified diagnosis of the error statistics, without any prior knowledge about the matrices. It robustly accounts for the uncertainties on the error distributions, contrary to what is classically done with frozen expert-knowledge error statistics. Some expert knowledge is still used in the method for the choice of an emission aggregation pattern and of a sampling protocol in order to reduce the computation cost. The relevance and robustness of the method are tested on a case study: the inversion of methane surface fluxes at the mesoscale with virtual observations on a realistic network in Eurasia.
Observing system simulation experiments are carried out with different transport patterns, flux distributions and total prior amounts of emitted methane. The method proves to consistently reproduce the known "truth" in most cases, with satisfactory tolerance intervals. Additionally, the method explicitly provides influence scores and posterior correlation matrices. An in-depth interpretation of the inversion results is then possible. The more objective quantification of the influence of the observations on the fluxes proposed here allows us to evaluate the impact of the observation network on the characterization of the surface fluxes. The explicit correlations between emission aggregates reveal the mis-separated regions, hence the typical temporal and spatial scales the inversion can analyse. These scales are consistent with the chosen aggregation patterns.
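The marginalization idea can be sketched in a scalar toy problem: instead of freezing one expert-chosen error setup, average the Gaussian posterior over candidate prior/observation error variances, weighting each candidate by its marginal likelihood of the data (the maximum-likelihood-style criterion the abstract invokes). All numbers and candidate grids below are invented for illustration.

```python
import math

# Toy linear observation: y = h * x_true + noise, with x the "flux"
h, x_true = 1.0, 5.0
x0 = 3.0                  # prior flux estimate
y = 5.3                   # one (synthetic) observation

def posterior_mean(b, r):
    """Gaussian posterior mean for prior variance b and obs-error variance r."""
    k = b * h / (h * h * b + r)          # Kalman-style gain
    return x0 + k * (y - h * x0)

def marginal_likelihood(b, r):
    """Likelihood of y under N(h*x0, h^2*b + r): how plausible this error
    setup makes the observed data."""
    s = h * h * b + r
    return math.exp(-(y - h * x0) ** 2 / (2 * s)) / math.sqrt(2 * math.pi * s)

# Candidate error settings, in place of a single frozen expert choice
candidates = [(b, r) for b in (0.5, 1.0, 2.0, 4.0) for r in (0.1, 0.5, 1.0)]
weights = [marginal_likelihood(b, r) for b, r in candidates]
total = sum(weights)
weights = [w / total for w in weights]

# Marginalized estimate: posterior means averaged over plausible error setups
x_marg = sum(w * posterior_mean(b, r) for w, (b, r) in zip(weights, candidates))
```

In the paper this averaging is done by Monte Carlo over full error covariance matrices rather than an exhaustive scalar grid, but the weighting principle is the same.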
2007-03-01
They conclude that the most important contributors to turnover in human services fields are stress, burnout, and lack of job satisfaction... as it relates to performance, satisfaction, and turnover. Lucas (1999) conducted a controlled experiment with college students in an attempt to... dental benefits of service in the military. 6. Prior-Enlisted Status. The prior-enlisted status variable is dichotomous, where non-prior = 0 and prior = 1.
NASA Astrophysics Data System (ADS)
Shi, X.
2015-12-01
As NSF indicated, "Theory and experimentation have for centuries been regarded as two fundamental pillars of science. It is now widely recognized that computational and data-enabled science forms a critical third pillar." Geocomputation is the third pillar of GIScience and the geosciences. With the exponential growth of geodata, the challenge of scalable and high-performance computing for big data analytics has become urgent, because many research activities are constrained by software or tools that cannot complete the computation process. Heterogeneous geodata integration and analytics obviously magnify the complexity and operational time frame. Many large-scale geospatial problems may not be processable at all if the computer system does not have sufficient memory or computational power. Emerging computer architectures, such as Intel's Many Integrated Core (MIC) Architecture and the Graphics Processing Unit (GPU), and advanced computing technologies provide promising solutions that employ massive parallelism and hardware resources to achieve scalability and high performance for data-intensive computing over large spatiotemporal and social media data. Exploring novel algorithms and deploying solutions in massively parallel computing environments to achieve scalable, high-performance data processing and analytics over large-scale, complex, and heterogeneous geodata with consistent quality has been the central theme of our research team in the Department of Geosciences at the University of Arkansas (UARK). New multi-core architectures combined with application accelerators hold the promise of achieving scalability and high performance by exploiting task- and data-level parallelism that is not supported by conventional computing systems.
Such a parallel or distributed computing environment is particularly suitable for large-scale geocomputation over big data, as demonstrated by our prior work, while the potential of such advanced infrastructure remains unexplored in this domain. In this presentation, our prior and ongoing initiatives will be summarized to exemplify how we exploit multicore CPUs, GPUs, and MICs, and clusters of CPUs, GPUs, and MICs, to accelerate geocomputation in different applications.
45 CFR 1616.3 - Qualifications.
Code of Federal Regulations, 2010 CFR
2010-10-01
...) Academic training and performance; (b) The nature and extent of prior legal experience; (c) Knowledge and understanding of the legal problems and needs of the poor; (d) Prior working experience in the client community...
Nonstationary homogeneous nucleation
NASA Technical Reports Server (NTRS)
Harstad, K. G.
1974-01-01
The theory of homogeneous condensation is reviewed and equations describing this process are presented. Numerical computer solutions to transient problems in nucleation (relaxation to steady state) are presented and compared to a prior computation.
U.S. emissions of HFC-134a derived for 2008-2012 from an extensive flask-air sampling network
NASA Astrophysics Data System (ADS)
Hu, Lei; Montzka, Stephen A.; Miller, John B.; Andrews, Aryln E.; Lehman, Scott J.; Miller, Benjamin R.; Thoning, Kirk; Sweeney, Colm; Chen, Huilin; Godwin, David S.; Masarie, Kenneth; Bruhwiler, Lori; Fischer, Marc L.; Biraud, Sebastien C.; Torn, Margaret S.; Mountain, Marikate; Nehrkorn, Thomas; Eluszkiewicz, Janusz; Miller, Scot; Draxler, Roland R.; Stein, Ariel F.; Hall, Bradley D.; Elkins, James W.; Tans, Pieter P.
2015-01-01
National and regional emissions of HFC-134a are derived for 2008-2012 based on atmospheric observations from ground and aircraft sites across the U.S. and a newly developed regional inverse model. Synthetic data experiments were first conducted to optimize the model assimilation design and to assess model-data mismatch errors and prior flux error covariances computed using a maximum likelihood estimation technique. The synthetic data experiments also tested the sensitivity of derived national and regional emissions to a range of assumed prior emissions, with the goal of designing a system that was minimally reliant on the prior. We then explored the influence of additional sources of error in inversions with actual observations, such as those associated with background mole fractions and transport uncertainties. Estimated emissions of HFC-134a range from 52 to 61 Gg yr-1 for the contiguous U.S. during 2008-2012 for inversions using air transport from the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model driven by 12 km resolution meteorological data from the North American Mesoscale Forecast System (NAM12), and for all tested combinations of prior emissions and background mole fractions. Estimated emissions for 2008-2010 were 20% lower when specifying alternative transport from the Stochastic Time-Inverted Lagrangian Transport (STILT) model driven by the Weather Research and Forecasting (WRF) meteorology. Our estimates (for HYSPLIT-NAM12) are consistent with annual emissions reported by the U.S. Environmental Protection Agency for the full study interval. The results suggest a 10-20% drop in U.S. national HFC-134a emissions in 2009, coincident with a reduction in transportation-related fossil fuel CO2 emissions, perhaps related to the economic recession. All inversions show seasonal variation in national HFC-134a emissions in all years, with summer emissions greater than winter emissions by 20-50%.
Pupils, Teachers & Palmtop Computers.
ERIC Educational Resources Information Center
Robertson, S. I.; And Others
1996-01-01
To examine the effects of introducing portable computers into secondary schools, a study was conducted regarding information technology skills and attitudes of staff and eighth grade students prior to and after receiving individual portable computers. Knowledge and use of word processing, spreadsheets, and database applications increased for both…
14 CFR 147.31 - Attendance and enrollment, tests, and credit for prior instruction or experience.
Code of Federal Regulations, 2013 CFR
2013-01-01
... credit for prior instruction or experience. 147.31 Section 147.31 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) SCHOOLS AND OTHER CERTIFICATED AGENCIES AVIATION... instruction or experience. (a) A certificated aviation maintenance technician school may not require any...
14 CFR 147.31 - Attendance and enrollment, tests, and credit for prior instruction or experience.
Code of Federal Regulations, 2014 CFR
2014-01-01
... credit for prior instruction or experience. 147.31 Section 147.31 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) SCHOOLS AND OTHER CERTIFICATED AGENCIES AVIATION... instruction or experience. (a) A certificated aviation maintenance technician school may not require any...
14 CFR 147.31 - Attendance and enrollment, tests, and credit for prior instruction or experience.
Code of Federal Regulations, 2011 CFR
2011-01-01
... credit for prior instruction or experience. 147.31 Section 147.31 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) SCHOOLS AND OTHER CERTIFICATED AGENCIES AVIATION... instruction or experience. (a) A certificated aviation maintenance technician school may not require any...
14 CFR 147.31 - Attendance and enrollment, tests, and credit for prior instruction or experience.
Code of Federal Regulations, 2012 CFR
2012-01-01
... credit for prior instruction or experience. 147.31 Section 147.31 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) SCHOOLS AND OTHER CERTIFICATED AGENCIES AVIATION... instruction or experience. (a) A certificated aviation maintenance technician school may not require any...
Dynamic tensile-failure-induced velocity deficits in rock
NASA Technical Reports Server (NTRS)
Rubin, Allan M.; Ahrens, Thomas J.
1991-01-01
Planar impact experiments were employed to induce dynamic tensile failure in Bedford limestone. Rock disks were impacted with aluminum and polymethyl methacrylate (PMMA) flyer plates at velocities of 10 to 25 m/s. Tensile stress magnitudes and durations were chosen so as to induce a range of microcrack growth insufficient to cause complete spalling of the samples. Ultrasonic P- and S-wave velocities of recovered targets were compared to the velocities prior to impact. Velocity reduction, and by inference microcrack production, occurred in samples subjected to stresses above 35 MPa in the 1.3 microsec PMMA experiments and 60 MPa in the 0.5 microsec aluminum experiments. Using a simple model for the time-dependent stress-intensity factor at the tips of existing flaws, apparent fracture toughnesses of 2.4 and 2.5 MPa√m are computed for the 1.3 and 0.5 microsec experiments. These are a factor of about 2 to 3 greater than quasi-static values. The greater dynamic fracture toughness observed may result from microcrack interaction during tensile failure. Data for water-saturated and dry targets are indistinguishable.
ERIC Educational Resources Information Center
Sanseau, Pierre-Yves; Ansart, Sandrine
2013-01-01
In this paper, the researchers analyse how lifelong learning can be enriched and develop a different perspective based on the experiment involving the accreditation of prior experiential learning (APEL) conducted in France at the university level. The French system for the accreditation of prior experiential learning, called Validation des Acquis…
Yu, Zhicong; Leng, Shuai; Li, Zhoubo; McCollough, Cynthia H.
2016-01-01
Photon-counting computed tomography (PCCT) is an emerging imaging technique that enables multi-energy imaging with only a single scan acquisition. To enable multi-energy imaging, the detected photons corresponding to the full x-ray spectrum are divided into several subgroups of bin data that correspond to narrower energy windows. Consequently, noise in each energy bin increases compared to the full-spectrum data. This work proposes an iterative reconstruction algorithm for noise suppression in the narrower energy bins used in PCCT imaging. The algorithm is based on the framework of prior image constrained compressed sensing (PICCS) and is called spectral PICCS; it uses the full-spectrum image reconstructed using conventional filtered back-projection as the prior image. The spectral PICCS algorithm is implemented using a constrained optimization scheme with adaptive iterative step sizes such that only two tuning parameters are required in most cases. The algorithm was first evaluated using computer simulations, and then validated by both physical phantoms and in vivo swine studies using a research PCCT system. Results from both computer-simulation and experimental studies showed substantial image noise reduction in narrow energy bins (43-73%) without sacrificing CT number accuracy or spatial resolution. PMID:27551878
ERIC Educational Resources Information Center
Van Dusseldorp, Ralph
1984-01-01
Describes the successful, low-cost program for infusion of computer competencies into the curriculum of the School of Education at the University of Alaska, Anchorage, where all students are required to become computer competent prior to graduation. Computer competency goals for students in school's certification programs are outlined. (MBR)
Orthonormal filters for identification in active control systems
NASA Astrophysics Data System (ADS)
Mayer, Dirk
2015-12-01
Many active noise and vibration control systems require models of the control paths. When the controlled system changes slightly over time, adaptive digital filters for the identification of the models are useful. This paper investigates a special class of adaptive digital filters: orthonormal filter banks possess the robust and simple adaptation of the widely applied finite impulse response (FIR) filters, but at a lower model order, which is important when considering implementation on embedded systems. However, the filter banks require prior knowledge about the resonance frequencies and damping of the structure. This knowledge can be assumed to be of limited precision, since uncertainties in the structural parameters exist in many practical systems. In this work, a procedure using a number of training systems to find the fixed parameters for the filter banks is applied. The effect of uncertainties in the prior knowledge on the model error is examined both with a basic example and in an experiment. Furthermore, the possibility of compensating for imprecise prior knowledge with a higher filter order is investigated. Comparisons with FIR filters are also implemented in order to assess the possible advantages of the orthonormal filter banks. Numerical and experimental investigations show that significantly lower computational effort can be achieved by the filter banks under certain conditions.
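One common orthonormal filter bank is the Laguerre network: a first-order low-pass followed by a chain of identical first-order all-passes, whose single pole encodes the prior knowledge about the dominant dynamics. Whether the paper uses exactly this structure is not stated in the abstract, so the sketch below is an illustrative assumption; the "control path" to be identified is built inside the model class purely for the demo, and only the branch weights are adapted (by LMS), which is what keeps the adaptation as simple and robust as for an FIR filter.

```python
import math
import random

random.seed(1)

def laguerre_outputs(x, a, K):
    """Branch outputs of a K-branch Laguerre filter bank with pole a."""
    c = math.sqrt(1.0 - a * a)
    n = len(x)
    y = [[0.0] * n for _ in range(K)]
    for t in range(n):
        for k in range(K):
            u = x[t] if k == 0 else y[k - 1][t]
            u1 = (x[t - 1] if k == 0 else y[k - 1][t - 1]) if t > 0 else 0.0
            y1 = y[k][t - 1] if t > 0 else 0.0
            if k == 0:
                y[k][t] = a * y1 + c * u        # sqrt(1-a^2) / (1 - a z^-1)
            else:
                y[k][t] = a * y1 + u1 - a * u   # all-pass (z^-1 - a) / (1 - a z^-1)
    return y

N, K, a, mu = 4000, 8, 0.5, 0.01
x = [random.gauss(0.0, 1.0) for _ in range(N)]
phi = laguerre_outputs(x, a, K)

# Synthetic "control path" constructed inside the model class for the demo
w_true = [1.0, -0.5, 0.25, 0.1, 0.0, 0.0, 0.0, 0.0]
d = [sum(w_true[k] * phi[k][t] for k in range(K)) for t in range(N)]

# LMS adaptation of the branch weights (the only adapted parameters)
w = [0.0] * K
sq_err = []
for t in range(N):
    e = d[t] - sum(w[k] * phi[k][t] for k in range(K))
    sq_err.append(e * e)
    for k in range(K):
        w[k] += mu * e * phi[k][t]

early = sum(sq_err[:500]) / 500   # error power during initial adaptation
late = sum(sq_err[-500:]) / 500   # error power after convergence
```

Because the branches are orthonormal, white input excites them with roughly equal power, so plain LMS converges without the eigenvalue-spread problems of a long FIR model of the same accuracy.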
Improving the accuracy of burn-surface estimation.
Nichter, L S; Williams, J; Bryant, C A; Edlich, R F
1985-09-01
A user-friendly computer-assisted method of calculating total body surface area burned (TBSAB) has been developed. This method is more accurate, faster, and subject to less error than conventional methods. For comparison, the ability of 30 physicians to estimate TBSAB was tested. Parameters studied included the effect of prior burn care experience, the influence of burn size, the ability to accurately sketch the size of burns on standard burn charts, and the ability to estimate percent TBSAB from the sketches. Despite the ability of physicians at all levels of training to accurately sketch TBSAB, significant burn-size overestimation (p < 0.01) and large interrater variability of potential consequence were noted. Direct benefits of a computerized system are many. These include the need for minimal user experience and the ability for wound-trend analysis, permanent record storage, calculation of fluid and caloric requirements, hemodynamic parameters, and the ability to compare different treatment protocols meaningfully.
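The kind of calculation such a system automates can be sketched with the standard adult rule of nines and the Parkland fluid formula. Both are common clinical conventions used here for illustration; the paper's actual chart-based method is not specified in the abstract.

```python
# Adult rule-of-nines percentages per body region (a common convention,
# standing in for the paper's unspecified burn-chart method).
RULE_OF_NINES = {
    "head": 9.0, "anterior_trunk": 18.0, "posterior_trunk": 18.0,
    "left_arm": 9.0, "right_arm": 9.0,
    "left_leg": 18.0, "right_leg": 18.0, "perineum": 1.0,
}

def tbsab(burned_fractions):
    """Total body surface area burned (%), given the fraction of each
    region burned, e.g. {"head": 1.0, "left_arm": 0.5}."""
    return sum(RULE_OF_NINES[r] * f for r, f in burned_fractions.items())

def parkland_fluid_ml(weight_kg, tbsa_percent):
    """Parkland formula: 4 mL x body weight (kg) x %TBSA over the
    first 24 hours."""
    return 4.0 * weight_kg * tbsa_percent
```

For example, a fully burned head plus half of one arm gives 9 + 4.5 = 13.5% TBSAB, and a 70 kg patient would need 4 x 70 x 13.5 = 3780 mL in the first 24 hours.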
How previous experience shapes perception in different sensory modalities
Snyder, Joel S.; Schwiedrzik, Caspar M.; Vitela, A. Davi; Melloni, Lucia
2015-01-01
What has transpired immediately before has a strong influence on how sensory stimuli are processed and perceived. In particular, temporal context effects (TCEs) can be contrastive, repelling perception away from the interpretation of the context stimulus, or attractive, whereby perception repeats upon successive presentations of the same stimulus. For decades, scientists have documented contrastive and attractive temporal context effects mostly with simple visual stimuli. But both types of effects also occur in other modalities, e.g., audition and touch, and for stimuli of varying complexity, raising the possibility that context effects reflect general computational principles of sensory systems. Neuroimaging shows that contrastive and attractive context effects arise from neural processes in different areas of the cerebral cortex, suggesting two separate operations with distinct functional roles. Bayesian models can provide a functional account of both context effects, whereby prior experience adjusts sensory systems to optimize perception of future stimuli. PMID:26582982
Student Teachers' Reflections on Prior Experiences of Learning Geography
ERIC Educational Resources Information Center
Dolan, Anne M.; Waldron, Fionnuala; Pike, Susan; Greenwood, Richard
2014-01-01
Primary geography education is an important part of initial teacher education. The importance of prior experiences in the development of student teachers has long been recognised and there is growing evidence of the nature of those experiences in areas such as geography. This paper reports the findings of research conducted with one cohort of…
The effect of technology on student science achievement
NASA Astrophysics Data System (ADS)
Hilton, June Kraft
2003-10-01
Prior research indicates that technology has had little effect on raising student achievement. Little empirical research exists, however, studying the effects of technology as a tool to improve student achievement through development of higher order thinking skills. Also, prior studies have not focused on the manner in which technology is being used in the classroom and at home to enhance teaching and learning. Empirical data from a secondary school representative of those in California were analyzed to determine the effects of technology on student science achievement. The quantitative analysis methods for the school data study included a multiple linear path analysis, using final course grade as the ultimate endogenous variable. In addition, empirical data from a nationwide survey on how Americans use the Internet were disaggregated by age and analyzed to determine the relationships between computer and Internet experience and (a) Internet use at home for school assignments and (b) more general computer use at home for school assignments for school age children. Analysis of data collected from the "A Nation Online" survey conducted by the United States Census Bureau assessed these relationships via correlations and cross-tabulations. Finally, results from these data analyses were assessed in conjunction with systemic reform efforts from 12 states designed to address improvements in science and mathematics education in light of the Third International Mathematics and Science Study (TIMSS). Examination of the technology efforts in those states provided a more nuanced understanding of the impact technology has on student achievement.
Key findings included evidence that technology training for teachers increased their use of the computer for instruction but students' final science course grade did not improve; school age children across the country did not use the computer at home for such higher-order cognitive activities as graphics and design or spreadsheets/databases; and states whose systemic reform initiatives included a mix of capacity building and alignment to state standards realized improved student achievement on the 2000 NAEP Science Assessment.
Experiments in Computing: A Survey
Tedre, Matti; Moisseinen, Nella
2014-01-01
Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general. PMID:24688404
Brunner, Stephen; Nett, Brian E; Tolakanahalli, Ranjini; Chen, Guang-Hong
2011-02-21
X-ray scatter is a significant problem in cone-beam computed tomography when thicker objects and larger cone angles are used, as scattered radiation can lead to reduced contrast and CT number inaccuracy. Advances have been made in x-ray computed tomography (CT) by incorporating a high quality prior image into the image reconstruction process. In this paper, we extend this idea to correct scatter-induced shading artifacts in cone-beam CT image-guided radiation therapy. Specifically, this paper presents a new scatter correction algorithm which uses a prior image with low scatter artifacts to reduce shading artifacts in cone-beam CT images acquired under conditions of high scatter. The proposed correction algorithm begins with an empirical hypothesis that the target image can be written as a weighted summation of a series of basis images that are generated by raising the raw cone-beam projection data to different powers, and then, reconstructing using the standard filtered backprojection algorithm. The weight for each basis image is calculated by minimizing the difference between the target image and the prior image. The performance of the scatter correction algorithm is qualitatively and quantitatively evaluated through phantom studies using a Varian 2100 EX System with an on-board imager. Results show that the proposed scatter correction algorithm using a prior image with low scatter artifacts can substantially mitigate scatter-induced shading artifacts in both full-fan and half-fan modes.
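The core fitting step described above, choosing weights so that a combination of basis images best matches the low-scatter prior image, reduces to linear least squares. In this minimal sketch, plain arrays stand in for the FBP-reconstructed basis images (generated in the paper by raising the raw projection data to different powers, a step omitted here).

```python
import numpy as np

def fit_basis_weights(basis_images, prior_image):
    """Weights c minimizing || sum_i c_i * B_i - prior ||_2 over the
    image pixels, solved as an ordinary least-squares problem."""
    A = np.stack([b.ravel() for b in basis_images], axis=1)
    c, *_ = np.linalg.lstsq(A, prior_image.ravel(), rcond=None)
    return c

def corrected_image(basis_images, c):
    """Scatter-corrected target image as the weighted sum of bases."""
    return sum(ci * b for ci, b in zip(c, basis_images))
```

In the actual algorithm the prior is a reconstruction with low scatter artifacts, so the fitted combination inherits its shading-free appearance while retaining the geometry of the cone-beam acquisition.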
NASA Astrophysics Data System (ADS)
Grujicic, M.; Yavari, R.; Ramaswami, S.; Snipes, J. S.; Yen, C.-F.; Cheeseman, B. A.
2013-11-01
A comprehensive all-atom molecular-level computational investigation is carried out in order to identify and quantify: (i) the effect of prior longitudinal-compressive or axial-torsional loading on the longitudinal-tensile behavior of p-phenylene terephthalamide (PPTA) fibrils/fibers; and (ii) the role various microstructural/topological defects play in affecting this behavior. Experimental and computational results available in the relevant open literature were utilized to construct various defects within the molecular-level model and to assign the concentration to these defects consistent with the values generally encountered under "prototypical" PPTA-polymer synthesis and fiber fabrication conditions. When quantifying the effect of the prior longitudinal-compressive/axial-torsional loading on the longitudinal-tensile behavior of PPTA fibrils, the stochastic nature of the size/potency of these defects was taken into account. The results obtained revealed that: (a) due to the stochastic nature of the defect type, concentration/number density and size/potency, the PPTA fibril/fiber longitudinal-tensile strength is a statistical quantity possessing a characteristic probability density function; (b) application of the prior axial compression or axial torsion to the PPTA imperfect single-crystalline fibrils degrades their longitudinal-tensile strength and only slightly modifies the associated probability density function; and (c) introduction of the fibril/fiber interfaces into the computational analyses showed that prior axial torsion can induce major changes in the material microstructure, causing significant reductions in the PPTA-fiber longitudinal-tensile strength and appreciable changes in the associated probability density function.
Continuous recognition of spatial and nonspatial stimuli in hippocampal-lesioned rats.
Jackson-Smith, P; Kesner, R P; Chiba, A A
1993-03-01
The present experiments compared the performance of hippocampal-lesioned rats to control rats on a spatial continuous recognition task and an analogous nonspatial task with similar processing demands. Daily sessions for Experiment 1 involved sequential presentation of individual arms on a 12-arm radial maze. Each arm contained a Froot Loop reinforcement the first time it was presented, and latency to traverse the arm was measured. A subset of the arms were repeated, but did not contain reinforcement. Repeated arms were presented with lags ranging from 0 to 6 (0 to 6 different arm presentations occurred between the first and the repeated presentation). Difference scores were computed by subtracting the latency on first presentations from the latency on repeated presentations, and these scores were high in all rats prior to surgery, with a decreasing function across lag. There were no differences in performance following cortical control or sham surgery. However, there was a total deficit in performance following large electrolytic lesions of the hippocampus. The second experiment employed the same continuous recognition memory procedure, but used three-dimensional visual objects (toys, junk items, etc., in various shapes, sizes, and textures) as stimuli on a flat runway. As in Experiment 1, the stimuli were presented successively and latency to run to and move the object was measured. Objects were repeated with lags ranging from 0 to 4. Performance on this task following surgery did not differ from performance prior to surgery for either the control group or the hippocampal lesion group. These results provide support for Kesner's attribute model of hippocampal function in that the hippocampus is assumed to mediate data-based memory for spatial locations, but not three-dimensional visual objects.
Effects of offenders' age and health on sentencing decisions.
Mueller-Johnson, Katrin U; Dhami, Mandeep K
2010-01-01
Two experiments investigated the effects of age and health on mock judges' sentencing decisions. The effects of these variables on length of prison sentence were examined in the context of offense severity and prior convictions. Experiment 1 involved a violent crime. Main effects were observed for age, health, offense severity and prior convictions. There was also an age by offense severity interaction. Experiment 2 involved a child sexual abuse case. Main effects were observed for health, offense severity, and prior convictions. In addition, an age by offense severity by prior convictions interaction effect was found. Thus, across both experiments, the age leniency effect was moderated by legal factors, suggesting that extra-legal factors affect sentencing in the context of legal factors. Further, for both offenses, offenders in poor health received shorter sentences than offenders in good health, suggesting that health deserves further research attention as an extra-legal variable.
Counterfactual and Factual Reflection: The Influence of Past Misdeeds on Future Immoral Behavior.
Gaspar, Joseph P; Seabright, Mark A; Reynolds, Scott J; Yam, Kai Chi
2015-01-01
Though the decision to behave immorally is situated within the context of prior immoral behavior, research has provided contradictory insights into this process. In a series of experiments, we demonstrate that the effects of prior immoral behavior depend on how individuals think about, or reflect on, their immoral behavior. In Experiment 1, participants who reflected counterfactually on their prior moral lapses morally disengaged (i.e., rationalized) less than participants who reflected factually. In Experiment 2, participants who reflected counterfactually on their prior moral lapses experienced more guilt than those who reflected factually. Finally, in Experiments 3 and 4, participants who reflected counterfactually lied less on unrelated tasks with real monetary stakes than those who reflected factually. Our studies provide important insights into moral rationalization and moral compensation processes and demonstrate the profound influence of reflection in everyday moral life.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, Han; Sharma, Diksha; Badano, Aldo, E-mail: aldo.badano@fda.hhs.gov
2014-12-15
Purpose: Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridMANTIS, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webMANTIS and visualMANTIS, to facilitate the setup of computational experiments via hybridMANTIS. Methods: The visualization tools visualMANTIS and webMANTIS enable the user to control simulation properties through a user interface. In the case of webMANTIS, control via a web browser allows access through mobile devices such as smartphones or tablets. webMANTIS acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. Results: The output consists of point response, pulse-height spectrum, and optical transport statistics generated by hybridMANTIS. The users can download the output images and statistics through a zip file for future reference. In addition, webMANTIS provides a visualization window that displays a few selected optical photon paths as they get transported through the detector columns and allows the user to trace the history of the optical photons. Conclusions: The visualization tools visualMANTIS and webMANTIS provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying input parameters to receiving visual feedback for the model predictions.
Technology Teaching or Mediated Learning, Part I: Are Computers Skinnerian or Vygotskian?
ERIC Educational Resources Information Center
Coufal, Kathy L.
2002-01-01
This article highlights the theoretical framework that dominated speech-language pathology prior to the widespread introduction of microcomputers and poses questions regarding the application of computers in assessment and intervention for children with language-learning impairments. It discusses implications of computer use in the context of…
The Zero Boil-Off Tank Experiment: Ground Testing and Verification of Fluid and Thermal Performance
NASA Technical Reports Server (NTRS)
Chato, David J.; Kassemi, Mohammad; Kahwaji, Michel; Kieckhafer, Alexander
2016-01-01
The Zero Boil-Off Technology (ZBOT) Experiment involves performing a small-scale International Space Station (ISS) experiment to study tank pressurization and pressure control in microgravity. The ZBOT experiment consists of a vacuum jacketed test tank filled with an inert fluorocarbon simulant liquid. Heaters and thermo-electric coolers are used in conjunction with an axial jet mixer flow loop to study a range of thermal conditions within the tank. The objective is to provide a high quality database of low gravity fluid motions and thermal transients which will be used to validate Computational Fluid Dynamic (CFD) modeling. This CFD can then be used in turn to predict behavior in larger systems with cryogens. This paper will discuss the work that has been done to demonstrate that the ZBOT experiment is capable of performing the functions required to produce meaningful and accurate results, prior to its launch to the International Space Station. Main systems discussed are expected to include the thermal control system, the optical imaging system, and the tank filling system. This work is sponsored by NASA's Human Exploration Mission Directorate's Physical Sciences Research program.
Epigenetic priors for identifying active transcription factor binding sites.
Cuellar-Partida, Gabriel; Buske, Fabian A; McLeay, Robert C; Whitington, Tom; Noble, William Stafford; Bailey, Timothy L
2012-01-01
Accurate knowledge of the genome-wide binding of transcription factors in a particular cell type or under a particular condition is necessary for understanding transcriptional regulation. Using epigenetic data, such as histone modification and DNase I accessibility data, has been shown to improve motif-based in silico methods for predicting such binding, but this approach has not yet been fully explored. We describe a probabilistic method for combining one or more tracks of epigenetic data with a standard DNA sequence motif model to improve our ability to identify active transcription factor binding sites (TFBSs). We convert each data type into a position-specific probabilistic prior and combine these priors with a traditional probabilistic motif model to compute a log-posterior odds score. Our experiments, using histone modifications H3K4me1, H3K4me3, H3K9ac and H3K27ac, as well as DNase I sensitivity, show conclusively that the log-posterior odds score consistently outperforms a simple binary filter based on the same data. We also show that our approach performs competitively with a more complex method, CENTIPEDE, and suggest that the relative simplicity of the log-posterior odds scoring method makes it an appealing and very general method for identifying functional TFBSs on the basis of DNA and epigenetic evidence. FIMO, part of the MEME Suite software toolkit, now supports log-posterior odds scoring using position-specific priors for motif search. A web server and source code are available at http://meme.nbcr.net. Utilities for creating priors are at http://research.imb.uq.edu.au/t.bailey/SD/Cuellar2011. t.bailey@uq.edu.au Supplementary data are available at Bioinformatics online.
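The scoring rule described, combining a motif model with a position-specific prior into a log-posterior odds score, has a compact form: treating the prior as the probability that a position is a binding site, the combination is additive in log space. The base-2 logarithm here is an illustrative choice, not necessarily the one used by the paper's software.

```python
import math

def log_posterior_odds(motif_llr, prior):
    """Combine a motif log-likelihood ratio (site vs. background, base 2)
    with a position-specific prior probability of binding into a
    log-posterior odds score. A prior of 0.5 is uninformative."""
    return motif_llr + math.log2(prior / (1.0 - prior))
```

A position with strong epigenetic evidence (say prior = 0.8) gains 2 bits over an uninformative position, so weak motif matches in accessible chromatin can outrank strong matches in closed chromatin, which is the behavior the binary-filter baseline cannot express.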
Own and Others' Prior Experiences Influence Children's Imitation of Causal Acts.
Williamson, Rebecca A; Meltzoff, Andrew N
2011-07-01
Young children learn from others' examples, and they do so selectively. We examine whether the efficacy of prior experiences influences children's imitation. Thirty-six-month-olds had initial experience on a causal learning task either by performing the task themselves or by watching an adult perform it. The nature of the experience was manipulated such that the actor had either an easy or a difficult experience completing the task. Next, a second adult demonstrated an innovative technique for completing it. Children who had a difficult first-person experience, and those who had witnessed another person having difficulty, were significantly more likely to adopt and imitate the adult's innovation than those who had or witnessed an easy experience. Children who observed another were also more likely to imitate than were those who had the initial experience themselves. Imitation is influenced by prior experience, both when it is obtained through one's own hands-on motor manipulation and when it derives from observing the acts of others.
ERIC Educational Resources Information Center
Plumlee, Tucker; Klein-Collins, Rebecca
2017-01-01
In 2015, the U.S. Department of Labor invited postsecondary institutions to participate in an experiment to learn how federal financial aid might be used to cover the costs of prior learning assessment (PLA). PLA is the process of evaluating a student's prior workplace and experiential learning for academic credit. While the experiment is still…
Bayesian analysis of caustic-crossing microlensing events
NASA Astrophysics Data System (ADS)
Cassan, A.; Horne, K.; Kains, N.; Tsapras, Y.; Browne, P.
2010-06-01
Aims: Caustic-crossing binary-lens microlensing events are important anomalous events because they are capable of detecting an extrasolar planet companion orbiting the lens star. Fast and robust modelling methods are thus of prime interest in helping to decide whether a planet is detected by an event. Cassan introduced a new set of parameters to model binary-lens events, which are closely related to properties of the light curve. In this work, we explain how Bayesian priors can be added to this framework, and investigate several interesting options. Methods: We develop a mathematical formulation that allows us to compute analytically the priors on the new parameters, given some previous knowledge about other physical quantities. We explicitly compute the priors for a number of interesting cases, and show how this can be implemented in a fully Bayesian, Markov chain Monte Carlo algorithm. Results: Using Bayesian priors can accelerate microlens fitting codes by reducing the time spent considering physically implausible models, and helps us to discriminate between alternative models based on the physical plausibility of their parameters.
Computerized screening for cognitive impairment in patients with COPD.
Campman, Carlijn; van Ranst, Dirk; Meijer, Jan Willem; Sitskoorn, Margriet
2017-01-01
COPD is associated with cognitive impairment. These impairments should be diagnosed, but due to time and budget constraints, they are often not investigated. The aim of this study is to examine the viability of a brief computerized cognitive test battery, Central Nervous System Vital Signs (CNSVS), in COPD patients. Patients with COPD referred to tertiary pulmonary rehabilitation were included. Cognitive functioning of patients was assessed with CNSVS before pulmonary rehabilitation and compared with age-corrected CNSVS norms. CNSVS is a 30-minute computerized test battery that includes tests of verbal and visual memory, psychomotor speed, processing speed, cognitive flexibility, complex attention, executive functioning, and reaction time. CNSVS was fully completed by 205 (93.2%, 105 females, 100 males) of the total group of patients (n=220, 116 females, 104 males). Z-tests showed that COPD patients performed significantly worse than the norms on all CNSVS cognitive domains. Slightly more than half of the patients (51.8%) had impaired functioning on 1 or more cognitive domains. Patients without computer experience performed significantly worse on CNSVS than patients using the computer frequently. The completion rate of CNSVS was high and cognitive dysfunctions measured with this screening were similar to the results found in prior research, including research using paper-and-pencil cognitive tests. These results support the viability of this brief computerized cognitive screening in COPD patients, which may lead to better care for these patients. Cognitive performance of patients with little computer experience should be interpreted carefully. Future research on this issue is needed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, T.K.
1977-01-01
Field studies were undertaken to determine the nature and extent of melanism in two populations of the cryptic moth, Panthea furcilla. Melanic frequencies significantly increased over a three-year period in both populations of P. furcilla sampled. Predation experiments showed that melanics suffer less predation than typicals. However, life expectancies for typical and melanic morphs were nearly equal as computed from mark-release-recapture data. Accordingly, it is suggested that one advantage melanics enjoy is their greater vigor prior to the imaginal stage. Acid rainfall, as a Northeast regional problem, is advanced as a possible cause for the increase in melanic frequencies.
Study on Spacelab software development and integration concepts
NASA Technical Reports Server (NTRS)
1974-01-01
A study was conducted to define the complexity and magnitude of the Spacelab software challenge. The study was based on current Spacelab program concepts, anticipated flight schedules, and ground operation plans. The study was primarily directed toward identifying and solving problems related to the experiment flight application and tests and checkout software executing in the Spacelab onboard command and data management subsystem (CDMS) computers and electrical ground support equipment (EGSE). The study provides a conceptual base from which it is possible to proceed into the development phase of the Software Test and Integration Laboratory (STIL) and establishes guidelines for the definition of standards which will ensure that the total Spacelab software is understood prior to entering development.
ERIC Educational Resources Information Center
Palagi, Patti
2017-01-01
To determine the qualities associated with effective building leaders, this study identifies how those tasked with hiring principals value core leadership practices and past educational experience. The inclusion of prior classroom teaching experience as a pre-requisite for applying to a principal preparation program provided the impetus for this…
Covariance specification and estimation to improve top-down Greenhouse Gas emission estimates
NASA Astrophysics Data System (ADS)
Ghosh, S.; Lopez-Coto, I.; Prasad, K.; Whetstone, J. R.
2015-12-01
The National Institute of Standards and Technology (NIST) operates the North-East Corridor (NEC) project and the Indianapolis Flux Experiment (INFLUX) in order to develop measurement methods to quantify sources of Greenhouse Gas (GHG) emissions as well as their uncertainties in urban domains using a top-down inversion method. Top-down inversion updates prior knowledge using observations in a Bayesian way. One primary consideration in a Bayesian inversion framework is the covariance structure of (1) the emission prior residuals and (2) the observation residuals (i.e. the difference between observations and model predicted observations). These covariance matrices are respectively referred to as the prior covariance matrix and the model-data mismatch covariance matrix. It is known that the choice of these covariances can have a large effect on estimates. The main objective of this work is to determine the impact of different covariance models on inversion estimates and their associated uncertainties in urban domains. We use a pseudo-data Bayesian inversion framework using footprints (i.e. sensitivities of tower measurements of GHGs to surface emissions) and emission priors (based on the Hestia project to quantify fossil-fuel emissions) to estimate posterior emissions using different covariance schemes. The posterior emission estimates and uncertainties are compared to the hypothetical truth. We find that, if we correctly specify spatial variability and spatio-temporal variability in prior and model-data mismatch covariances respectively, then we can compute more accurate posterior estimates. We discuss a few covariance models to introduce space-time interacting mismatches along with estimation of the involved parameters. We then compare several candidate prior spatial covariance models from the Matérn covariance class and estimate their parameters with specified mismatches. We find that best-fitted prior covariances are not always best in recovering the truth.
To achieve accuracy, we perform a sensitivity study to further tune covariance parameters. Finally, we introduce a shrinkage based sample covariance estimation technique for both prior and mismatch covariances. This technique allows us to achieve similar accuracy nonparametrically in a more efficient and automated way.
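The shrinkage-based covariance estimation mentioned in the conclusion follows a standard recipe: blend the (often singular) sample covariance with a simple structured target. The scaled-identity target and fixed shrinkage intensity below are illustrative assumptions; estimators such as Ledoit-Wolf choose the intensity from the data itself.

```python
import numpy as np

def shrink_covariance(X, lam):
    """Shrink the sample covariance of X (rows = observations) toward a
    scaled-identity target: (1 - lam)*S + lam*mu*I, where mu is the mean
    of S's diagonal. lam in [0, 1]; lam=0 returns the sample covariance."""
    S = np.cov(X, rowvar=False)
    p = S.shape[0]
    mu = np.trace(S) / p
    return (1.0 - lam) * S + lam * mu * np.eye(p)
```

The practical payoff in an inversion setting is that even when there are fewer samples than dimensions, the shrunk estimate is positive definite and therefore invertible, which the raw sample covariance is not.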
Nonlocal Intuition: Replication and Paired-subjects Enhancement Effects
Mirzaei, Maryam; Zali, Mohammad Reza
2014-01-01
This article reports the results of a study of repeat entrepreneurs in Tehran, Iran, in which nonlocal intuition was investigated in a replication and extension of an earlier experiment using measures of heart rate variability (HRV). Nonlocal intuition is the perception of information about a distant or future event by the body's psychophysiological systems, which is not based on reason or memories of prior experience. This study follows up on the McCraty, Radin, and Bradley studies, which found evidence of nonlocal intuition. We used Radin's experimental protocol, with the addition of HRV measures as in the McCraty studies, involving computer administration of a random sequence of calm and emotional pictures as the stimulus, and conducted two experiments on mutually exclusive samples—the first on a group of single participants (N=15) and the second on a group of co-participant pairs (N=30)—to investigate the question of the “amplification” of intuition effects by social connection. Each experiment was conducted over 45 trials while heart rate rhythm activity was recorded continuously. Results, using random permutation analysis, a statistically conservative procedure, show significant pre-stimulus results—that is, for the period before the computer had randomly selected the picture stimulus—for both experiments. Moreover, while significant separation between the emotional and calm HRV curves was observed in the single-participant experiment, an even larger separation was apparent for the experiment on co-participant pairs; the difference between the two groups was also significant. Overall, the results of the single-participant experiment confirm previous findings: that electrophysiological measures, especially changes in the heart rhythm, can detect intuitive foreknowledge. This result is notable because it constitutes cross-cultural corroboration in a non-Western context—namely, Iran.
In addition, the results for co-participant pairs offer new evidence on the amplification of the nonlocal intuition signal. PMID:24808977
Isaacs, Alex N; Walton, Alison M; Nisly, Sarah A
2015-04-25
To implement and evaluate interactive web-based learning modules prior to advanced pharmacy practice experiences (APPEs) on inpatient general medicine. Three clinical web-based learning modules were developed for use prior to APPEs in 4 health care systems. The aim of the interactive modules was to strengthen baseline clinical knowledge before the APPE to enable the application of learned material through the delivery of patient care. For the primary endpoint, postassessment scores increased overall and for each individual module compared to preassessment scores. Postassessment scores were similar among the health care systems. The survey demonstrated positive student perceptions of this learning experience. Prior to inpatient general medicine APPEs, web-based learning enabled the standardization and assessment of baseline student knowledge across 4 health care systems.
Divilov, Konstantin; Wiesner-Hanks, Tyr; Barba, Paola; Cadle-Davidson, Lance; Reisch, Bruce I
2017-12-01
Quantitative phenotyping of downy mildew sporulation is frequently used in plant breeding and genetic studies, as well as in studies focused on pathogen biology such as chemical efficacy trials. In these scenarios, phenotyping a large number of genotypes or treatments can be advantageous but is often limited by time and cost. We present a novel computational pipeline dedicated to estimating the percent area of downy mildew sporulation from images of inoculated grapevine leaf discs in a manner that is time and cost efficient. The pipeline was tested on images from leaf disc assay experiments involving two F1 grapevine families, one that had glabrous leaves (Vitis rupestris B38 × 'Horizon' [RH]) and another that had leaf trichomes ('Horizon' × V. cinerea B9 [HC]). Correlations between computer vision and manual visual ratings reached 0.89 in the RH family and 0.43 in the HC family. Additionally, we were able to use the computer vision system prior to sporulation to measure the percent leaf trichome area. We estimate that an experienced rater scoring sporulation would spend at least 90% less time using the computer vision system compared with the manual visual method. This will allow more treatments to be phenotyped in order to better understand the genetic architecture of downy mildew resistance and of leaf trichome density. We anticipate that this computer vision system will find applications in other pathosystems or traits where responses can be imaged with sufficient contrast from the background.
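The percent-area estimate at the core of such a pipeline can be sketched with simple intensity thresholding (a hypothetical simplification; the paper's actual segmentation method, thresholds, and masking steps are not specified here, and the synthetic image below is invented):

```python
import numpy as np

def percent_bright_area(gray, threshold=0.6, disc_mask=None):
    """Percent of pixels above an intensity threshold inside a leaf-disc mask.

    gray: 2-D float array scaled to [0, 1]; disc_mask: boolean array
    restricting the computation to the disc (defaults to the whole image).
    """
    if disc_mask is None:
        disc_mask = np.ones_like(gray, dtype=bool)
    roi = gray[disc_mask]
    return 100.0 * np.count_nonzero(roi > threshold) / roi.size

# Synthetic 10x10 "disc" image: 25 of 100 pixels bright (sporulating).
img = np.zeros((10, 10))
img[:5, :5] = 1.0
pct = percent_bright_area(img)  # → 25.0
```

A real pipeline would add color-space conversion, disc detection, and trichome handling, but the area fraction itself reduces to counting pixels inside the disc mask.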
Le Couteulx, S; Caudron, J; Dubourg, B; Cauchois, G; Dupré, M; Michelin, P; Durand, E; Eltchaninoff, H; Dacher, J-N
2018-05-01
To evaluate intra- and inter-observer variability of multidetector computed tomography (MDCT) sizing of the aortic annulus before transcatheter aortic valve replacement (TAVR) and the effect of observer experience, aortic valve calcification and image quality. MDCT examinations of 52 consecutive patients with tricuspid aortic valve (30 women, 22 men) with a mean age of 83±7 (SD) years (range: 64-93 years) were evaluated retrospectively. The maximum and minimum diameters, area and circumference of the aortic annulus were measured twice at diastole and systole with a standardized approach by three independent observers with different levels of experience (expert [observer 1]; resident with intensive 6 months practice [observer 2]; trained resident with starting experience [observer 3]). Observers were requested to recommend the valve prosthesis size. Calcification volume of the aortic valve and signal to noise ratio were evaluated. Intra- and inter-observer reproducibility was excellent for all aortic annulus dimensions, with an intraclass correlation coefficient ranging respectively from 0.84 to 0.98 and from 0.82 to 0.97. Agreement for selection of prosthesis size was almost perfect between the two most experienced observers (k=0.82) and substantial with the inexperienced observer (k=0.67). Aortic valve calcification did not influence intra-observer reproducibility. Image quality influenced reproducibility of the inexperienced observer. Intra- and inter-observer variability of aortic annulus sizing by MDCT is low. Nevertheless, the less experienced observer showed lower reliability suggesting a learning curve. Copyright © 2017. Published by Elsevier Masson SAS.
A computational visual saliency model based on statistics and machine learning.
Lin, Ru-Je; Lin, Wei-Song
2014-08-01
Identifying the type of stimuli that attracts human visual attention has been an appealing topic for scientists for many years. In particular, marking the salient regions in images is useful for both psychologists and many computer vision applications. In this paper, we propose a computational approach for producing saliency maps using statistics and machine learning methods. Based on four assumptions, three properties (Feature-Prior, Position-Prior, and Feature-Distribution) can be derived and combined by a simple intersection operation to obtain a saliency map. These properties are implemented by a similarity computation, support vector regression (SVR) technique, statistical analysis of training samples, and information theory using low-level features. This technique is able to learn the preferences of human visual behavior while simultaneously considering feature uniqueness. Experimental results show that our approach performs better in predicting human visual attention regions than 12 other models in two test databases. © 2014 ARVO.
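The combination of the three properties by a simple intersection operation can be illustrated as an element-wise product of normalized maps (a hedged sketch; the model's feature extraction, SVR, and information-theoretic stages are omitted, and the toy maps below are invented):

```python
import numpy as np

def combine_maps(feature_prior, position_prior, feature_dist):
    """Combine the three property maps by an element-wise product
    ("intersection") and rescale so the most salient location equals 1."""
    s = feature_prior * position_prior * feature_dist
    m = s.max()
    return s / m if m > 0 else s

# Invented 2x2 property maps, each with values in [0, 1].
fp = np.array([[0.2, 0.8], [0.5, 0.9]])
pp = np.array([[1.0, 0.5], [0.5, 1.0]])
fd = np.array([[0.1, 0.9], [0.9, 0.6]])
sal = combine_maps(fp, pp, fd)  # peak at the bottom-right location
```

The product form means a location is salient only if all three properties agree, which matches the intersection interpretation described in the abstract.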
Design and Performance of a Spectrometer for Deployment on MISSE 7
NASA Technical Reports Server (NTRS)
Pippin, Gary; Beymer, Jim; Robb, Andrew; Longino, James; Perry, George; Stewart, Alan; Finkenor, Miria
2009-01-01
A spectrometer for reflectance and transmission measurements of samples exposed to the space environment has been developed for deployment on the Materials on the International Space Station Experiment (MISSE) 7. The instrument incorporates a miniature commercial fiber-optic-coupled spectrometer with a computer control system for detector operation, sample motion, and illumination. A set of three spectrometers was recently integrated on the MISSE 7 platform, with launch and deployment on the International Space Station scheduled for summer of this year. The instrument is one of many active experiments on the platform. The performance of the instrument prior to launch will be discussed. Data from samples measured in the laboratory will be compared to those from the instrument prior to launch. These comparisons will illustrate the capabilities of the current design. The space environment challenges many materials. When in operation on the MISSE 7 platform, the new spectrometer will provide real-time data on how the space environment affects the optical properties of thermal control paints and optical coatings. Data obtained from comparison of pre- and post-flight measurements on hundreds of samples exposed on previous MISSE platforms have been reported at these meetings. With the new spectrometer and the ability to correlate measured changes with time on orbit and the occurrence of both natural events and human activities, a better understanding of the processes responsible for degradation of materials in space will be possible.
Unintended/Unexpected Outcomes of Computer Usage in Higher Education. AIR 1987 Annual Forum Paper.
ERIC Educational Resources Information Center
Muffo, John A.; Conner, Mark E.
Unpredicted ways in which the use of computers has affected social interactions in colleges and universities are considered. Information was gathered from a literature review and from personal observations. One outcome of introducing computers into an academic or administrative unit is the development of alliances depending on prior experience…
ERIC Educational Resources Information Center
Kumi, Richard; Reychav, Iris; Sabherwal, Rajiv
2016-01-01
Many educational institutions are integrating mobile-computing technologies (MCT) into the classroom to improve learning outcomes. There is also a growing interest in research to understand how MCT influence learning outcomes. The diversity of results in prior research indicates that computer-mediated learning has different effects on various…
An Adaptive Evaluation Structure for Computer-Based Instruction.
ERIC Educational Resources Information Center
Welsh, William A.
Adaptive Evaluation Structure (AES) is a set of linked computer programs designed to increase the effectiveness of interactive computer-assisted instruction at the college level. The package has four major features, the first of which is based on a prior cognitive inventory and on the accuracy and pace of student responses. AES adjusts materials…
Predictors of Enrollment in High School Computer Courses.
ERIC Educational Resources Information Center
Campbell, N. Jo; Perry, Katye M.
Factors affecting the motivation of high school students to learn to use computers were examined in this study. The subjects were 160 students enrolled in a large city high school, 89 females and 71 males who represented five ethnic groups--White, Black, Hispanic, Asian, and American Indian. The majority of subjects had prior computer coursework…
Computer Communications and Operations--Intermediate, Data Processing Technology: 8025.21.
ERIC Educational Resources Information Center
Dade County Public Schools, Miami, FL.
The following course outline is a guide which presents students with the basic programing and operation concepts for developing the skills necessary to become proficient in the area of computer communications and operation. The student must have met the objectives of Introduction to Computer Programming prior to enrollment in this course. The…
Computer-Assisted Diagnostic Decision Support: History, Challenges, and Possible Paths Forward
ERIC Educational Resources Information Center
Miller, Randolph A.
2009-01-01
This paper presents a brief history of computer-assisted diagnosis, including challenges and future directions. Some ideas presented in this article on computer-assisted diagnostic decision support systems (CDDSS) derive from prior work by the author and his colleagues (see list in Acknowledgments) on the INTERNIST-1 and QMR projects. References…
The effects of prior combat experience on the expression of somatic and affective symptoms in deploying soldiers
Journal of Psychosomatic Research
William…
2006-01-01
…rates of somatic complaints compared with combat-naive soldiers. Methods: Self-reports of posttraumatic stress disorder (PTSD) and affective and somatic … identical for the experienced and inexperienced groups, scores on the Affective and Somatic scales differed as a function of prior combat history. Previous…
The effect of loudness on the reverberance of music: reverberance prediction using loudness models.
Lee, Doheon; Cabrera, Densil; Martens, William L
2012-02-01
This study examines the auditory attribute that describes the perceived amount of reverberation, known as "reverberance." Listening experiments were performed using two signals commonly heard in auditoria: excerpts of orchestral music and western classical singing. Listeners adjusted the decay rate of room impulse responses prior to convolution with these signals, so as to match the reverberance of each stimulus to that of a reference stimulus. The analysis examines the hypothesis that reverberance is related to the loudness decay rate of the underlying room impulse response. This hypothesis is tested using computational models of time varying or dynamic loudness, from which parameters analogous to conventional reverberation parameters (early decay time and reverberation time) are derived. The results show that listening level significantly affects reverberance, and that the loudness-based parameters outperform related conventional parameters. Results support the proposed relationship between reverberance and the computationally predicted loudness decay function of sound in rooms. © 2012 Acoustical Society of America
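The conventional parameters mentioned here can be computed from a room impulse response by Schroeder backward integration; a minimal sketch for early decay time (EDT), assuming a discrete impulse response `ir` sampled at `fs` Hz (the synthetic decay below is illustrative only, not the study's stimuli or its loudness-based parameters):

```python
import numpy as np

def early_decay_time(ir, fs):
    """EDT from a room impulse response: backward-integrate the squared IR
    (Schroeder integration), convert to dB, find the time to decay 10 dB,
    and scale by 6 to the 60 dB-equivalent decay time."""
    edc = np.cumsum(ir[::-1] ** 2)[::-1]          # energy decay curve
    edc_db = 10.0 * np.log10(edc / edc[0])
    i = np.argmax(edc_db <= -10.0)                # first sample 10 dB down
    return 6.0 * i / fs

# Synthetic IR whose amplitude decays 60 dB over 1.2 s.
fs, rt = 1000, 1.2
t = np.arange(int(2 * fs * rt)) / fs
ir = 10.0 ** (-3.0 * t / rt)                      # 20*log10(ir) = -60 dB at t = rt
edt = early_decay_time(ir, fs)                    # ≈ 1.2 s
```

The loudness-based parameters studied in the paper replace the dB energy decay with a computed loudness decay function, but the decay-rate extraction follows the same pattern.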
Retif, Paul; Reinhard, Aurélie; Paquot, Héna; Jouan-Hureaux, Valérie; Chateau, Alicia; Sancey, Lucie; Barberi-Heyob, Muriel; Pinel, Sophie; Bastogne, Thierry
2016-01-01
This article addresses the in silico–in vitro prediction issue of organometallic nanoparticles (NPs)-based radiosensitization enhancement. The goal was to carry out computational experiments to quickly identify efficient nanostructures and then to preferentially select the most promising ones for the subsequent in vivo studies. To this aim, this interdisciplinary article introduces a new theoretical Monte Carlo computational ranking method and tests it using 3 different organometallic NPs in terms of size and composition. While the ranking predicted in a classical theoretical scenario did not fit the reference results at all, in contrast, we showed for the first time how our accelerated in silico virtual screening method, based on basic in vitro experimental data (which takes into account the NPs cell biodistribution), was able to predict a relevant ranking in accordance with in vitro clonogenic efficiency. This corroborates the pertinence of such a prior ranking method that could speed up the preclinical development of NPs in radiation therapy. PMID:27920524
Bayesian statistical ionospheric tomography improved by incorporating ionosonde measurements
NASA Astrophysics Data System (ADS)
Norberg, Johannes; Virtanen, Ilkka I.; Roininen, Lassi; Vierinen, Juha; Orispää, Mikko; Kauristie, Kirsti; Lehtinen, Markku S.
2016-04-01
We validate two-dimensional ionospheric tomography reconstructions against EISCAT incoherent scatter radar measurements. Our tomography method is based on Bayesian statistical inversion with prior distribution given by its mean and covariance. We employ ionosonde measurements for the choice of the prior mean and covariance parameters and use the Gaussian Markov random fields as a sparse matrix approximation for the numerical computations. This results in a computationally efficient tomographic inversion algorithm with clear probabilistic interpretation. We demonstrate how this method works with simultaneous beacon satellite and ionosonde measurements obtained in northern Scandinavia. The performance is compared with results obtained with a zero-mean prior and with the prior mean taken from the International Reference Ionosphere 2007 model. In validating the results, we use EISCAT ultra-high-frequency incoherent scatter radar measurements as the ground truth for the ionization profile shape. We find that in comparison to the alternative prior information sources, ionosonde measurements improve the reconstruction by adding accurate information about the absolute value and the altitude distribution of electron density. With an ionosonde at continuous disposal, the presented method enhances stand-alone near-real-time ionospheric tomography for the given conditions significantly.
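The Bayesian inversion underlying the method is the standard linear-Gaussian update of a prior mean and covariance by ray-integral measurements; a minimal dense sketch (the paper's implementation uses sparse Gaussian Markov random field approximations, and the toy geometry and numbers below are invented):

```python
import numpy as np

def posterior_mean(A, y, m0, C, R):
    """Posterior mean of x for y = A x + e, with prior x ~ N(m0, C) and
    measurement noise e ~ N(0, R) (standard linear-Gaussian update)."""
    S = A @ C @ A.T + R                      # innovation covariance
    return m0 + C @ A.T @ np.linalg.solve(S, y - A @ m0)

# Toy example: one slant-TEC-like ray summing a two-pixel electron-density
# profile, with a prior mean standing in for the ionosonde-based prior.
A = np.array([[1.0, 1.0]])                   # ray passes through both pixels
m0 = np.array([2.0, 2.0])
C = np.eye(2)
R = np.array([[1e-9]])                       # nearly noise-free measurement
y = np.array([6.0])
x = posterior_mean(A, y, m0, C, R)           # ≈ [3, 3]
```

Because the ray only constrains the sum of the pixels, the prior mean and covariance decide how the update is distributed in altitude, which is exactly where the ionosonde information enters in the paper's method.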
Youk, Shin-Young; Lee, Jee-Ho; Heo, Seong-Joo; Roh, Hyun-Ki; Park, Eun-Jin; Shin, Im Hee
2014-01-01
PURPOSE This study aims to investigate the degree of subjective pain and the satisfaction of patients who underwent implant treatment using a computer-guided template. MATERIALS AND METHODS A survey was conducted of 135 patients who underwent implant surgery, with or without the use of a computer-guided template, during 2012 and 2013 in university hospitals, dental hospitals, and dental clinics that performed implant surgery using computer-guided templates. Likert-scale and VAS scores were used in the survey questions, and the independent t-test and one-way ANOVA were performed (α=.05). RESULTS The subjects were most commonly introduced to computer-guided implant surgery using a surgical template through their dentists' advice, and the most common reason for choosing such surgery was that it was accurate and safe. Most answered that they were willing to recommend it to others. The patients who underwent computer-guided implant surgery felt less pain during the operation and reported higher satisfaction than those who underwent conventional implant surgery. Among the patients who underwent computer-guided implant surgery, those who also had prior experience of surgery without a computer-guided template expressed higher satisfaction with the former (P<.05). CONCLUSION This study found that patients who underwent computer-guided implant surgery employing a surgical template felt less pain and reported higher satisfaction than those who had conventional surgery, and that the dentist's explanation can provide confidence in the safety of the surgery. PMID:25352962
Is automatic speech-to-text transcription ready for use in psychological experiments?
Ziman, Kirsten; Heusser, Andrew C; Fitzpatrick, Paxton C; Field, Campbell E; Manning, Jeremy R
2018-04-23
Verbal responses are a convenient and naturalistic way for participants to provide data in psychological experiments (Salzinger, The Journal of General Psychology, 61(1),65-94:1959). However, audio recordings of verbal responses typically require additional processing, such as transcribing the recordings into text, as compared with other behavioral response modalities (e.g., typed responses, button presses, etc.). Further, the transcription process is often tedious and time-intensive, requiring human listeners to manually examine each moment of recorded speech. Here we evaluate the performance of a state-of-the-art speech recognition algorithm (Halpern et al., 2016) in transcribing audio data into text during a list-learning experiment. We compare transcripts made by human annotators to the computer-generated transcripts. Both sets of transcripts matched to a high degree and exhibited similar statistical properties, in terms of the participants' recall performance and recall dynamics that the transcripts captured. This proof-of-concept study suggests that speech-to-text engines could provide a cheap, reliable, and rapid means of automatically transcribing speech data in psychological experiments. Further, our findings open the door for verbal response experiments that scale to thousands of participants (e.g., administered online), as well as a new generation of experiments that decode speech on the fly and adapt experimental parameters based on participants' prior responses.
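The transcript comparison can be illustrated with a toy recall-scoring function that extracts study-list words from either a human or a computer-generated transcript (a hypothetical sketch; the study's scoring and matching procedures are more involved, and the word lists below are invented):

```python
import re

def recalled_words(transcript, study_list):
    """Study-list words present in a transcript, ignoring case and punctuation."""
    tokens = set(re.findall(r"[a-z']+", transcript.lower()))
    return [w for w in study_list if w in tokens]

study = ["apple", "river", "candle", "mirror"]
human = "Apple... um, river, and I think candle"   # human annotator's transcript
asr = "apple river candle"                          # speech-to-text output
human_recall = recalled_words(human, study)         # → ['apple', 'river', 'candle']
asr_recall = recalled_words(asr, study)
```

When the two transcripts yield the same recalled-word lists, downstream recall-performance and recall-dynamics analyses agree, which is the sense in which the study reports the transcripts "matched to a high degree."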
ERIC Educational Resources Information Center
Metraglia, Riccardo; Villa, Valerio; Baronio, Gabriele; Adamini, Riccardo
2015-01-01
Today's students enter engineering colleges with different technical backgrounds and prior graphics experience. This may be due to the high school they attended, which can be technical or non-technical. Prior experience affects students' ability to learn and hence their motivation and self-efficacy beliefs. This study intended to evaluate…
ERIC Educational Resources Information Center
Schindler, Maike; Hußmann, Stephan; Nilsson, Per; Bakker, Arthur
2017-01-01
Negative numbers are among the first formalizations students encounter in their mathematics learning that clearly differ from out-of-school experiences. What has not sufficiently been addressed in previous research is the question of how students draw on their prior experiences when reasoning on negative numbers and how they infer from these…
ERIC Educational Resources Information Center
Waldron, Fionnuala; Pike, Susan; Varley, Janet; Murphy, Colette; Greenwood, Richard
2007-01-01
Research into student teachers' perceptions, attitudes and prior experiences of learning suggests that these experiences can exert an influence on practice which can be relatively undisturbed by their initial teacher education. This article is based on the initial findings of an all-Ireland survey of all first-year students on B.Ed. courses in…
Motor Skills Enhance Procedural Memory Formation and Protect against Age-Related Decline
Müller, Nils C. J.; Genzel, Lisa; Konrad, Boris N.; Pawlowski, Marcel; Neville, David; Fernández, Guillén; Steiger, Axel
2016-01-01
The ability to consolidate procedural memories declines with increasing age. Prior knowledge enhances learning and memory consolidation of novel but related information in various domains. Here, we present evidence that prior motor experience (in our case, piano skills) increases procedural learning and has a protective effect against age-related decline for the consolidation of novel but related manual movements. In our main experiment, we tested 128 participants with a sequential finger-tapping motor task during two sessions 24 hours apart. We observed enhanced online learning speed and offline memory consolidation for piano players. Enhanced memory consolidation was driven by a strong effect in older participants, whereas younger participants did not benefit significantly from prior piano experience. In a follow-up independent control experiment, this compensatory effect of piano experience was not visible after a brief offline period of 30 minutes, hence requiring an extended consolidation window potentially involving sleep. Through a further control experiment, we rejected the possibility that the decreased effect in younger participants was caused by training saturation. We discuss our results in the context of the neurobiological schema approach and suggest that prior experience has the potential to rescue memory consolidation from age-related cognitive decline. PMID:27333186
Melittin Aggregation in Aqueous Solutions: Insight from Molecular Dynamics Simulations.
Liao, Chenyi; Esai Selvan, Myvizhi; Zhao, Jun; Slimovitch, Jonathan L; Schneebeli, Severin T; Shelley, Mee; Shelley, John C; Li, Jianing
2015-08-20
Melittin is a natural peptide that aggregates in aqueous solutions with paradigmatic monomer-to-tetramer and coil-to-helix transitions. Since little is known about the molecular mechanisms of melittin aggregation in solution, we simulated its self-aggregation process under various conditions. After confirming the stability of a melittin tetramer in solution, we observed—for the first time in atomistic detail—that four separated melittin monomers aggregate into a tetramer. Our simulated dependence of melittin aggregation on peptide concentration, temperature, and ionic strength is in good agreement with prior experiments. We propose that melittin mainly self-aggregates via a mechanism involving the sequential addition of monomers, which is supported by both qualitative and quantitative evidence obtained from unbiased and metadynamics simulations. Moreover, by combining computer simulations and a theory of the electrical double layer, we provide evidence to suggest why melittin aggregation in solution likely stops at the tetramer, rather than forming higher-order oligomers. Overall, our study not only explains prior experimental results at the molecular level but also provides quantitative mechanistic information that may guide the engineering of melittin for higher efficacy and safety.
[Inferential evaluation of intimacy based on observation of interpersonal communication].
Kimura, Masanori
2015-06-01
How do people inferentially evaluate others' levels of intimacy with friends? We examined the inferential evaluation of intimacy based on the observation of interpersonal communication. In Experiment 1, participants (N = 41) responded to questions after observing conversations between friends. Results indicated that participants inferentially evaluated not only goodness of communication, but also intimacy between friends, using an expressivity heuristic approach. In Experiment 2, we investigated how inferential evaluation of intimacy was affected by prior information about relationships and by individual differences in face-to-face interactional ability. Participants (N = 64) were divided into prior- and no-prior-information groups and all performed the same task as in Experiment 1. Additionally, their interactional ability was assessed. In the prior-information group, individual differences had no effect on inferential evaluation of intimacy. On the other hand, in the no-prior-information group, face-to-face interactional ability partially influenced evaluations of intimacy. Finally, we discuss the fact that to understand one's social environment, it is important to observe others' interpersonal communications.
Data mining and statistical inference in selective laser melting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamath, Chandrika
2016-01-11
Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
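The surrogate-modeling step described here can be sketched as a least-squares polynomial fit over a sampled design space (all names, parameter ranges, and the toy density function below are invented for illustration; the paper's surrogates and sampling schemes are more sophisticated):

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented process map: relative part density vs. laser power (W) and scan speed (mm/s).
def density(power, speed):
    return 99.0 - 2e-3 * (power - 250.0) ** 2 - 4e-5 * (speed - 1200.0) ** 2

# Space-filling sample of the design space (uniform random stands in for the
# improved sampling techniques discussed in the paper).
X = rng.uniform([150.0, 800.0], [350.0, 1600.0], size=(40, 2))
y = density(X[:, 0], X[:, 1])

# Quadratic polynomial surrogate fitted by least squares.
P = np.column_stack([np.ones(len(X)), X, X ** 2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(P, y, rcond=None)

def surrogate(power, speed):
    p = np.array([1.0, power, speed, power ** 2, speed ** 2, power * speed])
    return p @ coef

pred = surrogate(250.0, 1200.0)  # ≈ 99.0, the toy optimum
```

Once fitted, the cheap surrogate can be queried thousands of times to explore the process window, reserving expensive simulations and experiments for the most promising parameter settings.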
Perturbation Biology: Inferring Signaling Networks in Cellular Systems
Miller, Martin L.; Gauthier, Nicholas P.; Jing, Xiaohong; Kaushik, Poorvi; He, Qin; Mills, Gordon; Solit, David B.; Pratilas, Christine A.; Weigt, Martin; Braunstein, Alfredo; Pagnani, Andrea; Zecchina, Riccardo; Sander, Chris
2013-01-01
We present a powerful experimental-computational technology for inferring network models that predict the response of cells to perturbations, and that may be useful in the design of combinatorial therapy against cancer. The experiments are systematic series of perturbations of cancer cell lines by targeted drugs, singly or in combination. The response to perturbation is quantified in terms of relative changes in the measured levels of proteins, phospho-proteins and cellular phenotypes such as viability. Computational network models are derived de novo, i.e., without prior knowledge of signaling pathways, and are based on simple non-linear differential equations. The prohibitively large solution space of all possible network models is explored efficiently using a probabilistic algorithm, Belief Propagation (BP), which is three orders of magnitude faster than standard Monte Carlo methods. Explicit executable models are derived for a set of perturbation experiments in SKMEL-133 melanoma cell lines, which are resistant to the therapeutically important inhibitor of RAF kinase. The resulting network models reproduce and extend known pathway biology. They empower potential discoveries of new molecular interactions and predict efficacious novel drug perturbations, such as the inhibition of PLK1, which is verified experimentally. This technology is suitable for application to larger systems in diverse areas of molecular biology. PMID:24367245
Zuromski, Kelly L; Resnick, Heidi; Price, Matthew; Galea, Sandro; Kilpatrick, Dean G; Ruggiero, Kenneth
2018-05-07
The current study examined variables, including prior traumatic events, disaster exposure, and current mental health symptomatology, associated with suicidal ideation following experience of a natural disaster. Utilizing a sample of 2,000 adolescents exposed to the spring 2011 tornadoes in the areas surrounding Tuscaloosa, Alabama, and Joplin, Missouri, we hypothesized that prior interpersonal violence (IPV), more so than other prior traumatic events or other symptoms, would be associated with suicidal ideation after the disaster. Suicidal ideation was reported by approximately 5% of the sample. Results of binary logistic regression were consistent with hypotheses in that prior IPV exposure emerged as the variable most robustly related to presence of postdisaster suicidal ideation, even accounting for current symptoms (i.e., posttraumatic stress disorder and depression). Moreover, neither prior accident nor prior natural disaster exposure was significantly associated with postdisaster suicidal ideation, suggesting that something specific to IPV may be conferring risk for suicidality. No other variables, including disaster exposure variables or demographic characteristics, emerged as significantly related. Our results suggest that individuals who have a history of IPV may be particularly vulnerable following experience of additional traumatic events and that for suicide risk, the experience of prior IPV may be more relevant to consider in the aftermath of natural disasters beyond variables related to the index trauma or current symptomatology. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Boundary Condition for Modeling Semiconductor Nanostructures
NASA Technical Reports Server (NTRS)
Lee, Seungwon; Oyafuso, Fabiano; von Allmen, Paul; Klimeck, Gerhard
2006-01-01
A recently proposed boundary condition for atomistic computational modeling of semiconductor nanostructures (particularly, quantum dots) is an improved alternative to two prior such boundary conditions. As explained, this boundary condition helps to reduce the amount of computation while maintaining accuracy.
Integrating Retraction Modeling Into an Atlas-Based Framework for Brain Shift Prediction
Chen, Ishita; Ong, Rowena E.; Simpson, Amber L.; Sun, Kay; Thompson, Reid C.
2015-01-01
In recent work, an atlas-based statistical model for brain shift prediction, which accounts for uncertainty in the intraoperative environment, has been proposed. Previous work reported in the literature using this technique did not account for local deformation caused by surgical retraction. It is challenging to precisely localize the retractor location prior to surgery and the retractor is often moved in the course of the procedure. This paper proposes a technique that involves computing the retractor-induced brain deformation in the operating room through an active model solve and linearly superposing the solution with the precomputed deformation atlas. As a result, the new method takes advantage of the atlas-based framework’s accounting for uncertainties while also incorporating the effects of retraction with minimal intraoperative computing. This new approach was tested using simulation and phantom experiments. The results showed an improvement in average shift correction from 50% (ranging from 14 to 81%) for gravity atlas alone to 80% using the active solve retraction component (ranging from 73 to 85%). This paper presents a novel yet simple way to integrate retraction into the atlas-based brain shift computation framework. PMID:23864146
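The linear superposition at the heart of the method above can be sketched as follows. The displacement values are hypothetical, and the exact form of the percent-correction metric is an assumption for illustration:

```python
import numpy as np

def percent_correction(u_measured, u_predicted):
    """Average shift correction as a percentage (assumed metric:
    100 * (1 - mean residual magnitude / mean measured shift magnitude))."""
    residual = np.linalg.norm(u_measured - u_predicted, axis=1).mean()
    measured = np.linalg.norm(u_measured, axis=1).mean()
    return 100.0 * (1.0 - residual / measured)

# Hypothetical per-node displacement vectors (mm): the precomputed
# atlas (gravity) solution and the intraoperative retraction solve
# are linearly superposed to give the total predicted brain shift.
u_atlas = np.array([[1.0, 0.0, 0.5],
                    [0.8, 0.2, 0.4]])
u_retraction = np.array([[0.3, -0.1, 0.0],
                         [0.5, 0.0, 0.1]])
u_total = u_atlas + u_retraction
```

Because only the retraction component is solved intraoperatively and simply added to the precomputed atlas field, the extra computing in the operating room stays minimal.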
Use of knowledge-sharing web-based portal in gross and microscopic anatomy.
Durosaro, Olayemi; Lachman, Nirusha; Pawlina, Wojciech
2008-12-01
Changes in worldwide healthcare delivery require review of current medical school curricular structure to develop learning outcomes that ensure mastery of knowledge and clinical competency. In the last 3 years, Mayo Medical School implemented an outcomes-based curriculum to encompass new graduate outcomes. Standard courses were replaced by 6-week clinically integrated didactic blocks separated by student-selected academic enrichment activities. Gross and microscopic anatomy were integrated with radiology and genetics, respectively. Laboratory components include virtual microscopy and anatomical dissection. Students assigned to teams utilise computer portals to share learning experiences. High-resolution computed tomographic (CT) scans of cadavers taken prior to dissection were made available for correlative learning between the cadaveric material and radiologic images. Students work in teams on assigned presentations spanning histology, cell and molecular biology, genetics and genomics, using the Nexus Portal, based on DrupalEd, to share their observations, reflections and dissection findings. The new generation of medical students is clearly comfortable utilising web-based programmes that maximise their learning potential in conceptually difficult and labor-intensive courses. A team-based learning approach emphasising knowledge-sharing computer portals maximises opportunities for students to master their knowledge and improve the cognitive skills that ensure clinical competency.
Computational crystallization.
Altan, Irem; Charbonneau, Patrick; Snell, Edward H
2016-07-15
Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.
The effect of force feedback on student reasoning about gravity, mass, force and motion
NASA Astrophysics Data System (ADS)
Bussell, Linda
The purpose of this study was to examine whether force feedback within a computer simulation had an effect on reasoning by fifth grade students about gravity, mass, force, and motion, concepts which can be difficult for learners to grasp. Few studies have been done on cognitive learning and haptic feedback, particularly with young learners, but there is an extensive base of literature on children's conceptions of science and a number of studies focus specifically on children's conceptions of force and motion. This case study used a computer-based paddleball simulation with guided inquiry as the primary stimulus. Within the simulation, the learner could adjust the mass of the ball and the gravitational force. The experimental group used the simulation with visual and force feedback; the control group used the simulation with visual feedback but without force feedback. The proposition was that there would be differences in reasoning between the experimental and control groups, with force feedback being helpful with concepts that are more obvious when felt. Participants were 34 fifth-grade students from three schools. Students completed a modal (visual, auditory, and haptic) learning preference assessment and a pretest. The sessions, including participant experimentation and interviews, were audio recorded and observed. The interviews were followed by a written posttest. These data were analyzed to determine whether there were differences based on treatment, learning style, demographics, prior gaming experience, force feedback experience, or prior knowledge. Work with the simulation, regardless of group, was found to increase students' understanding of key concepts. The experimental group appeared to benefit from the supplementary help that force feedback provided. Those in the experimental group scored higher on the posttest than those in the control group. The greatest difference between mean group scores was on a question concerning the effects of increased gravitational force.
NASA Astrophysics Data System (ADS)
Powell, Rita Manco
Currently women are underrepresented in departments of computer science, making up approximately 18% of the undergraduate enrollment in selective universities. Most attrition in computer science occurs early in this major, in the freshman and sophomore years, and women drop out in disproportionately greater numbers than their male counterparts. Taking an ethnographic approach to investigating women's experiences and progress in the first year courses in the computer science major at the University of Pennsylvania, this study examined the pre-college influences that led these women to the major and the nature of their experiences in and outside of class with faculty, peers, and academic support services. This study sought an understanding of the challenges these women faced in the first year of the major with the goal of informing institutional practice about how to best support their persistence. The research reviewed for this study included patterns of leaving majors in science, math and engineering (Seymour & Hewitt, 1997), the high school preparation needed to pursue math and engineering majors in college (Strenta, Elliott, Adair, Matier, & Scott, 1994), and intervention programs that have positively impacted persistence of women in computer science (Margolis & Fisher, 2002). The research method of this study employed a series of personal interviews over the course of one calendar year with fourteen first year women who had either declared or intended to declare the computer science major in the School of Engineering and Applied Science at the University of Pennsylvania. Other data sources were focus groups and personal interviews with faculty, administrators, admissions and student life professionals, teaching assistants, female graduate students, and male first year students at the University of Pennsylvania. This study found that the women in this study came to the University of Pennsylvania with a thorough grounding in mathematics, but many either had an inadequate background in computer science, or at least perceived inadequacies in their background, which prevented them from beginning the major on an equal footing with their mostly male peers and caused some to lose confidence and consequently interest in the major. Issues also emanated from their gender-minority status in the Computer and Information Science Department, causing them to be socially isolated from their peers and further weakening their resolve to persist. These findings suggest that female first year students could benefit from multiple pathways into the major designed for students with varying degrees of prior experience with computer science. In addition, a computer science community within the department characterized by more frequent interaction and collaboration with faculty and peers could positively impact women's persistence in the major.
The Impact of Prior Risk Experiences on Subsequent Risky Decision-Making: The Role of the Insula
Xue, Gui; Lu, Zhonglin; Levin, Irwin P.; Bechara, Antoine
2010-01-01
Risky decision-making is significantly affected by homeostatic states associated with different prior risk experiences, yet the neural mechanisms have not been well understood. Using functional MRI, we examined how gambling decisions and their underlying neural responses were modulated by prior risk experiences, with a focus on the insular cortex since it has been implicated in interoception, emotion and risky decision-making. Fourteen healthy young participants were scanned while performing a gambling task that was designed to simulate daily-life risk taking. Prior risk experience was manipulated by presenting participants with gambles that they were very likely to accept or gambles that they were unlikely to accept. A probe gamble, which was sensitive to an individual's risk preference, was presented to examine the effect of prior risk experiences (Risk vs. Norisk) on subsequent risky decisions. Compared to passing on a gamble (Norisk), taking a gamble, especially winning a gamble (Riskwin), was associated with significantly stronger activation in the insular and dorsal medial prefrontal cortices. Decision making after Norisk was more risky and more likely to recruit activation of the insular and anterior cingulate cortices. This insular activity during decision making predicted the extent of risky decisions both within- and across-subjects, and was also correlated with an individual's personality trait of urgency. These findings suggest that the insula plays an important role in activating representations of homeostatic states associated with the experience of risk, which in turn exerts an influence on subsequent decisions. PMID:20045470
Variability in individual assessment behaviour and its implications for collective decision-making.
O'Shea-Wheller, Thomas A; Masuda, Naoki; Sendova-Franks, Ana B; Franks, Nigel R
2017-02-08
Self-organized systems of collective behaviour have been demonstrated in a number of group-living organisms. There is, however, less research relating to how variation in individual assessments may facilitate group decision-making. Here, we investigate this using the decentralized system of collective nest choice behaviour employed by the ant Temnothorax albipennis, combining experimental results with computational modelling. In experiments, isolated workers of this species were allowed to investigate new nest sites of differing quality, and it was found that for any given nest quality, there was wide variation among individuals in the durations that they spent within each nest site. Additionally, individual workers were consistent in spending more time in nest sites of higher quality, and less time in those of lower quality. Hence, the time spent in a new nest site must have included an assessment of nest quality. As nest site visit durations (henceforth termed assessment durations) are linked to recruitment, it is possible that the variability we observed may influence the collective decision-making process of colonies. Thus, we explored this further using a computational model of nest site selection, and found that heterogeneous nest assessments conferred a number of potential benefits. Furthermore, our experiments showed that nest quality assessments were flexible, being influenced by experience of prior options. Our findings help to elucidate the potential mechanisms underlying group behaviour, and highlight the importance of heterogeneity among individuals, rather than precise calibration, in shaping collective decision-making. © 2017 The Author(s).
CBT competence in novice therapists improves anxiety outcomes.
Brown, Lily A; Craske, Michelle G; Glenn, Daniel E; Stein, Murray B; Sullivan, Greer; Sherbourne, Cathy; Bystritsky, Alexander; Welch, Stacy S; Campbell-Sills, Laura; Lang, Ariel; Roy-Byrne, Peter; Rose, Raphael D
2013-02-01
This study explores the relationships between therapist variables (cognitive behavioral therapy [CBT] competence and CBT adherence) and clinical outcomes of computer-assisted CBT for anxiety disorders delivered by novice therapists in a primary care setting. Participants were recruited for a randomized controlled trial of evidence-based treatment, including computer-assisted CBT, versus treatment as usual. Therapists (anxiety clinical specialists; ACSs) were nonexpert clinicians, many of whom had no prior experience in delivering psychotherapy (and in particular, very little experience with CBT). Trained raters reviewed randomly selected treatment sessions from 176 participants and rated therapists on measures of CBT competence and CBT adherence. Patients were assessed at baseline and at 6-, 12-, and 18-month follow-ups on measures of anxiety, depression, and functioning, and an average Reliable Change Index was calculated as a composite measure of outcome. CBT competence and CBT adherence were entered as predictors of outcome, after controlling for baseline covariates. Higher CBT competence was associated with better clinical outcomes whereas CBT adherence was not. Also, CBT competence was inversely correlated with years of clinical experience and trended downward (though not significantly) as the study progressed. CBT adherence was inversely correlated with therapist tenure in the study. Therapist competence was related to improved clinical outcomes when CBT for anxiety disorders was delivered by novice clinicians with technology assistance. The results highlight the value of the initial training for novice therapists as well as booster training to limit declines in therapist adherence. © 2012 Wiley Periodicals, Inc.
CBT competence in novice therapists improves anxiety outcomes
Brown, Lily A.; Craske, Michelle G.; Glenn, Daniel E.; Stein, Murray B.; Sullivan, Greer; Sherbourne, Cathy; Bystritsky, Alexander; Welch, Stacy S.; Campbell-Sills, Laura; Lang, Ariel; Roy-Byrne, Peter; Rose, Raphael D.
2013-01-01
Objective This study explores the relationships between therapist variables (CBT competence and CBT adherence) and clinical outcomes of computer-assisted CBT for anxiety disorders delivered by novice therapists in a primary care setting. Methods Participants were recruited for a randomized controlled trial of evidence-based treatment, including computer-assisted CBT, versus treatment as usual. Therapists (Anxiety Clinical Specialists; ACSs) were non-expert clinicians, many of whom had no prior experience in delivering psychotherapy (and in particular, very little experience with CBT). Trained raters reviewed randomly selected treatment sessions from 176 participants and rated therapists on measures of CBT-competence and CBT-adherence. Patients were assessed at baseline and at 6-, 12-, and 18-month follow-ups on measures of anxiety, depression, and functioning, and an average reliable change index was calculated as a composite measure of outcome. CBT-competence and CBT-adherence were entered as predictors of outcome, after controlling for baseline covariates. Results Higher CBT-competence was associated with better clinical outcomes whereas CBT-adherence was not. Also, CBT-competence was inversely correlated with years of clinical experience and trended downward (though not significantly) as the study progressed. CBT-adherence was inversely correlated with therapist tenure in the study. Conclusions Therapist competence was related to improved clinical outcomes when CBT for anxiety disorders was delivered by novice clinicians with technology assistance. The results highlight the value of the initial training for novice therapists as well as booster training to limit declines in therapist adherence. PMID:23225338
Hemmerich, Joshua A; Elstein, Arthur S; Schwarze, Margaret L; Moliski, Elizabeth G; Dale, William
2013-01-01
The present study tested predictions derived from the Risk as Feelings hypothesis about the effects of prior patients' negative treatment outcomes on physicians' subsequent treatment decisions. Two experiments at The University of Chicago, U.S.A., utilized a computer simulation of an abdominal aortic aneurysm (AAA) patient with enhanced realism to present participants with one of three experimental conditions: AAA rupture causing a watchful waiting death (WWD), perioperative death (PD), or a successful operation (SO), as well as the statistical treatment guidelines for AAA. Experiment 1 tested effects of these simulated outcomes on (n=76) laboratory participants' (university student sample) self-reported emotions, and their ratings of valence and arousal of the AAA rupture simulation and other emotion inducing picture stimuli. Experiment 2 tested two hypotheses: 1) that experiencing a patient WWD in the practice trial's experimental condition would lead physicians to choose surgery earlier, and 2) experiencing a patient PD would lead physicians to choose surgery later with the next patient. Experiment 2 presented (n=132) physicians (surgeons and geriatricians) with the same experimental manipulation and a second simulated AAA patient. Physicians then chose to either go to surgery or continue watchful waiting. The results of Experiment 1 demonstrated that the WWD experimental condition significantly increased anxiety, and was rated similarly to other negative and arousing pictures. The results of Experiment 2 demonstrated that, after controlling for demographics, baseline anxiety, intolerance for uncertainty, risk attitudes, and the influence of simulation characteristics, the WWD experimental condition significantly expedited decisions to choose surgery for the next patient. The results support the Risk as Feelings hypothesis on physicians' treatment decisions in a realistic AAA patient computer simulation. Bad outcomes affected emotions and decisions, even with statistical AAA rupture risk guidance present. These results suggest that bad patient outcomes cause physicians to experience anxiety and regret that influences their subsequent treatment decision-making for the next patient. PMID:22571890
Hemmerich, Joshua A; Elstein, Arthur S; Schwarze, Margaret L; Moliski, Elizabeth Ghini; Dale, William
2012-07-01
The present study tested predictions derived from the Risk as Feelings hypothesis about the effects of prior patients' negative treatment outcomes on physicians' subsequent treatment decisions. Two experiments at The University of Chicago, U.S.A., utilized a computer simulation of an abdominal aortic aneurysm (AAA) patient with enhanced realism to present participants with one of three experimental conditions: AAA rupture causing a watchful waiting death (WWD), perioperative death (PD), or a successful operation (SO), as well as the statistical treatment guidelines for AAA. Experiment 1 tested effects of these simulated outcomes on (n = 76) laboratory participants' (university student sample) self-reported emotions, and their ratings of valence and arousal of the AAA rupture simulation and other emotion-inducing picture stimuli. Experiment 2 tested two hypotheses: 1) that experiencing a patient WWD in the practice trial's experimental condition would lead physicians to choose surgery earlier, and 2) experiencing a patient PD would lead physicians to choose surgery later with the next patient. Experiment 2 presented (n = 132) physicians (surgeons and geriatricians) with the same experimental manipulation and a second simulated AAA patient. Physicians then chose to either go to surgery or continue watchful waiting. The results of Experiment 1 demonstrated that the WWD experimental condition significantly increased anxiety, and was rated similarly to other negative and arousing pictures. The results of Experiment 2 demonstrated that, after controlling for demographics, baseline anxiety, intolerance for uncertainty, risk attitudes, and the influence of simulation characteristics, the WWD experimental condition significantly expedited decisions to choose surgery for the next patient. The results support the Risk as Feelings hypothesis on physicians' treatment decisions in a realistic AAA patient computer simulation. Bad outcomes affected emotions and decisions, even with statistical AAA rupture risk guidance present. These results suggest that bad patient outcomes cause physicians to experience anxiety and regret that influences their subsequent treatment decision-making for the next patient. Copyright © 2012 Elsevier Ltd. All rights reserved.
Computational Psychiatry: towards a mathematically informed understanding of mental illness
Huys, Quentin J M; Roiser, Jonathan P
2016-01-01
Computational Psychiatry aims to describe the relationship between the brain's neurobiology, its environment and mental symptoms in computational terms. In so doing, it may improve psychiatric classification and the diagnosis and treatment of mental illness. It can unite many levels of description in a mechanistic and rigorous fashion, while avoiding biological reductionism and artificial categorisation. We describe how computational models of cognition can infer the current state of the environment and weigh up future actions, and how these models provide new perspectives on two example disorders, depression and schizophrenia. Reinforcement learning describes how the brain can choose and value courses of action according to their long-term future value. Some depressive symptoms may result from aberrant valuations, which could arise from prior beliefs about the loss of agency (‘helplessness’), or from an inability to inhibit the mental exploration of aversive events. Predictive coding explains how the brain might perform Bayesian inference about the state of its environment by combining sensory data with prior beliefs, each weighted according to their certainty (or precision). Several cortical abnormalities in schizophrenia might reduce precision at higher levels of the inferential hierarchy, biasing inference towards sensory data and away from prior beliefs. We discuss whether striatal hyperdopaminergia might have an adaptive function in this context, and also how reinforcement learning and incentive salience models may shed light on the disorder. Finally, we review some of Computational Psychiatry's applications to neurological disorders, such as Parkinson's disease, and some pitfalls to avoid when applying its methods. PMID:26157034
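The precision-weighted combination of prior beliefs and sensory data described above has a standard Gaussian form, sketched here with purely illustrative numbers:

```python
def precision_weighted_posterior(prior_mean, prior_precision, obs, obs_precision):
    """Combine a Gaussian prior and a Gaussian observation; each source
    is weighted by its precision (inverse variance)."""
    post_precision = prior_precision + obs_precision
    post_mean = (prior_precision * prior_mean + obs_precision * obs) / post_precision
    return post_mean, post_precision

# When sensory precision dominates (the imbalance the abstract links to
# schizophrenia), inference is pulled strongly toward the data:
mean_biased, _ = precision_weighted_posterior(0.0, 0.5, 1.0, 4.0)
# With a strong prior, the same observation moves the estimate far less:
mean_prior, _ = precision_weighted_posterior(0.0, 4.0, 1.0, 0.5)
```

Reducing precision at higher (prior) levels of the hierarchy, as hypothesised for schizophrenia, corresponds to shrinking `prior_precision` and letting sensory data dominate the posterior.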
Virtual navigation performance: the relationship to field of view and prior video gaming experience.
Richardson, Anthony E; Collaer, Marcia L
2011-04-01
Two experiments examined whether learning a virtual environment was influenced by field of view and how it related to prior video gaming experience. In the first experiment, participants (42 men, 39 women; M age = 19.5 yr., SD = 1.8) performed worse on a spatial orientation task displayed with a narrow field of view in comparison to medium and wide field-of-view displays. Counter to initial hypotheses, wide field-of-view displays did not improve performance over medium displays, and this was replicated in a second experiment (30 men, 30 women; M age = 20.4 yr., SD = 1.9) presenting a more complex learning environment. Self-reported video gaming experience correlated with several spatial tasks: virtual environment pointing and tests of Judgment of Line Angle and Position, mental rotation, and Useful Field of View (with correlations between .31 and .45). When prior video gaming experience was included as a covariate, sex differences in spatial tasks disappeared.
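The covariate logic in the final result above can be sketched with simulated data, in which a group difference in spatial scores disappears once a gaming-experience covariate is included (all coefficients and noise levels are invented for illustration; this is not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
group = rng.integers(0, 2, n)               # hypothetical binary group label
gaming = 2.0 * group + rng.normal(0, 1, n)  # gaming experience differs by group
score = 3.0 * gaming + rng.normal(0, 1, n)  # spatial score driven by gaming only

def ols_coefs(X, y):
    """Ordinary least-squares coefficients, with an intercept column added."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

# Group effect without the covariate is large; with gaming included it vanishes.
b_group_only = ols_coefs(group[:, None], score)[1]
b_with_covariate = ols_coefs(np.column_stack([group, gaming]), score)[1]
```

Because the simulated group difference operates entirely through the covariate, the adjusted group coefficient is near zero, mirroring how sex differences disappeared when gaming experience was controlled.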
The Effects of Modern Mathematics Computer Games on Mathematics Achievement and Class Motivation
ERIC Educational Resources Information Center
Kebritchi, Mansureh; Hirumi, Atsusi; Bai, Haiyan
2010-01-01
This study examined the effects of a computer game on students' mathematics achievement and motivation, and the role of prior mathematics knowledge, computer skill, and English language skill on their achievement and motivation as they played the game. A total of 193 students and 10 teachers participated in this study. The teachers were randomly…
Beall, D P; Feldman, R G; Gordon, M L; Gruber, B L; Lane, J M; Valenzuela, G; Yim, D; Alam, J; Krege, J H; Krohn, K
2016-03-01
In patients in the Direct Assessment of Nonvertebral Fractures in Community Experience (DANCE) observational study with and without a prior vertebral or hip fracture, the incidence of nonvertebral fractures was lower with >6 months of teriparatide treatment than during the first 6 months. Clinical evidence on the effect of teriparatide in patients with prior fracture is limited. In the DANCE observational study, the incidence of nonvertebral fragility fractures (NVFX) decreased significantly in patients receiving teriparatide for >6 months (6-24 months) versus >0 to ≤6 months (reference period). We performed a post hoc analysis to assess the effect of teriparatide 20 μg/day in patients who entered DANCE with prior vertebral or hip fractures. The incidence of patients experiencing a NVFX for four 6-month intervals during and after treatment was compared with the reference period. Overall, 4085 patients received ≥1 dose of teriparatide. Of 3720 with sufficient data for efficacy analysis, 692 had prior vertebral fracture, including 179 with previous kyphoplasty/vertebroplasty; 290 had prior hip fracture. These patients were older, and those with prior vertebral fractures had more comorbid conditions at baseline than those without prior vertebral fractures. The incidence of patients experiencing NVFX declined over time in all patient groups. The fracture incidence rate declined 49 and 46%, respectively, in patients with and without prior vertebral fracture and was 63 and 46% lower in patients with previous kyphoplasty/vertebroplasty and without prior vertebral fracture. NVFX declined 43 and 48% in patients with and without prior hip fracture. The reduced incidence over time was consistent in the subgroups (all interaction p values >0.05). Patients with prior fracture were more likely to experience serious adverse events. The incidence of NVFX decreased over time in patients receiving teriparatide in DANCE regardless of prior fracture status.
ERIC Educational Resources Information Center
Gray, Christina; Pascoe, Robin; Wright, Peter
2018-01-01
Pre-service drama teachers enter teacher training with established ideas and beliefs about teaching. These beliefs, based on experience, are informed by many hours spent in schools, and the pedagogies--both effective and ineffective--utilised by their teachers. This research explores the influence of some of these prior experiences on pre-service…
Bootstrapping and Maintaining Trust in the Cloud
2016-12-01
proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means...IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud ...elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup
Curtin, Lindsay B; Finn, Laura A; Czosnowski, Quinn A; Whitman, Craig B; Cawley, Michael J
2011-08-10
To assess the impact of computer-based simulation on the achievement of student learning outcomes during mannequin-based simulation. Participants were randomly assigned to rapid response teams of 5-6 students, and teams were then randomly assigned to complete either the computer-based or the mannequin-based simulation cases first. In both simulations, students used their critical thinking skills and selected interventions independent of facilitator input. A predetermined rubric was used to record and assess students' performance in the mannequin-based simulations. Feedback and student performance scores were generated by the software in the computer-based simulations. More of the teams that completed the computer-based simulation before the mannequin-based simulation achieved the primary outcome for the exercise, survival of the simulated patient (41.2% vs. 5.6%). The majority of students (>90%) recommended the continuation of simulation exercises in the course. Students in both groups felt the computer-based simulation should be completed prior to the mannequin-based simulation. The use of computer-based simulation prior to mannequin-based simulation improved the achievement of learning goals and outcomes. In addition to improving participants' skills, completing the computer-based simulation first may improve participants' confidence during the more real-life setting achieved in the mannequin-based simulation.
Social Inequalities, Meta-Awareness and Literacy in Mathematics Education
ERIC Educational Resources Information Center
Kleve, Bodil
2013-01-01
Pupils start school with different prior understandings about its activities and goals. They have different experiences with books, literature and calculation, and different affinities in relation to letters and numbers. These prior understandings, which encompass experiences, language, habits, affinities and feelings, constitute what Gee (2003)…
NASA Astrophysics Data System (ADS)
Pittman, E. R.; Gustavsen, R. L.; Hagelberg, C. R.; Schmidt, J. H.
2017-06-01
The focus of this set of experiments is the development of data on the Hugoniot for the overdriven products equation of state (EOS) of PBX 9501 (95 weight % HMX, 5 weight % plastic binder) and to extend data from which current computational EOS models draw. This series of shots was conducted using the two-stage gas guns at Los Alamos and aimed to gather data in the 30 to 120 GPa pressure regime. Experiments were simulated with FLAG, a Lagrangian multiphysics code, using a one-dimensional setup that employs the Wescott-Stewart-Davis (WSD) reactive burn model. Prior to this study, data did not extend above 90 GPa, so the new data allowed the model to be re-evaluated. A comparison of the simulations with the experimental data shows that the model fits well below 80 GPa. However, the model did not fall within the error bars of the data for higher pressures. This is an indication that the PBX 9501 overdriven EOS products model could be modified to better match the data.
Computed Tomography Support for Microgravity Materials Science Experiments
NASA Technical Reports Server (NTRS)
Gillies, Donald C.; Engel, H. Peter; Whitaker, Ann F. (Technical Monitor)
2001-01-01
The accurate measurement of density in both liquid and solid samples is of considerable interest to Principal Investigators with materials science experiments slated for the ISS. The work to be described is an innovative application of a conventional industrial nondestructive evaluation instrument. Traditional applications of industrial computed tomography (CT) rely on reconstructing cross sections of large structures to provide two-dimensional planar views which can identify defects such as porosity, or other material anomalies. This has been done on microgravity materials science experiments to check the integrity of ampoule-cartridge assemblies for safety purposes. With a substantially monoenergetic flux, as can be obtained with a radioactive cobalt source, there will be a direct correlation between absorption and density. Under such conditions it then becomes possible to make accurate measurements of density throughout a sample, and even when the sample itself is enclosed within a furnace and a safety required cartridge. Such a system has been installed at Kennedy Space Center (KSC) and is available to PIs to examine samples before and after flight. The CT system is being used to provide density information for two purposes. Firstly, the determination of density changes from liquid to solid is vital information to the PI for purposes of modeling the solidification behavior of his sample, and to engineers who have to design containment ampoules and must allow for shrinkage and other volume changes that may occur during processing. While such information can be obtained by pycnometric measurements, the possibility of using a furnace installed on the CT system enables one to examine potentially dangerous materials having high vapor pressures, while not needing visible access to the material. 
In addition, uniform temperature can readily be obtained, and the system can be controlled to ramp up, hold, and ramp down while collecting data over a wide range of parameters automatically. Results of initial tests on low melting point elements such as gallium, indium and tin will be presented, and the intent is to proceed to compounds such as InSb, HgCdTe and CdTe. Alloys such as Pb-Sb (PI - Poirier, U AZ) and Cu-Al (PI - Trivedi, Ames Lab.), which are the subjects of flight experiments, will also be examined. The second application is the conversion of measured density values directly to composition. This was successfully done with the mercury cadmium telluride alloys grown on the second and fourth United States Microgravity Payload (USMP-2 and USMP-4) missions by Lehoczky. CdTe values along the length of the boules were obtained at KSC prior to cutting the sample, and could have been obtained prior to its removal from the cartridge and ampoule. Examples of the data obtained will be shown. It is anticipated that several of the materials science PIs will avail themselves of the technique described, initially for determining densities prior to flight, and then to acquire early quantitative data on the compositional variation within their samples.
Murdoch-Eaton, D; Manning, D; Kwizera, E; Burch, V; Pell, G; Whittle, S
2012-01-01
Medical education faces challenges posed by widening access to training, a demand for globally competent healthcare workers and progress towards harmonisation of standards. To explore potential challenges arising from variation in diversity and educational background of medical school entrants. This study investigated the reported experience and confidence, in a range of 31 generic skills underpinning learning, of 2606 medical undergraduates entering 14 medical schools in England and South Africa, using a validated questionnaire. Responses suggest that there is considerable similarity in prior educational experience and confidence skills profiles on entry to South African and English medical schools. South African entrants reported significantly more experience in 'Technical skills', 'Managing their own Learning', and 'Presentation', while English students reported increased experience in 'IT' skills. South African undergraduates reported more confidence in 'Information Handling', while English students were more confident in 'IT' skills. The most noticeable difference, in 'IT' skills, is probably due to documented differences in access to computer facilities at high school level. Differences between individual schools within each country are noticeable. Educators need to acquire a good understanding of their incoming cohorts, and ensure necessary tailored support for skills development.
Data-Driven Software Framework for Web-Based ISS Telescience
NASA Technical Reports Server (NTRS)
Tso, Kam S.
2005-01-01
Software that enables authorized users to monitor and control scientific payloads aboard the International Space Station (ISS) from diverse terrestrial locations equipped with Internet connections is undergoing development. This software reflects a data-driven approach to distributed operations. A Web-based software framework leverages prior developments in Java and Extensible Markup Language (XML) to create portable code and portable data, to which one can gain access via Web-browser software on almost any common computer. Open-source software is used extensively to minimize cost; the framework also accommodates enterprise-class server software to satisfy needs for high performance and security. To accommodate the diversity of ISS experiments and users, the framework emphasizes openness and extensibility. Users can take advantage of available viewer software to create their own client programs according to their particular preferences, and can upload these programs for custom processing of data, generation of views, and planning of experiments. The same software system, possibly augmented with a subset of data and additional software tools, could be used for public outreach by enabling public users to replay telescience experiments, conduct their experiments with simulated payloads, and create their own client programs and other custom software.
Use of prior odds for missing persons identifications.
Budowle, Bruce; Ge, Jianye; Chakraborty, Ranajit; Gill-King, Harrell
2011-06-27
Identification of missing persons from mass disasters is based on evaluation of a number of variables and observations regarding the combination of features derived from these variables. DNA typing now is playing a more prominent role in the identification of human remains, and particularly so for highly decomposed and fragmented remains. The strength of genetic associations, by either direct or kinship analyses, is often quantified by calculating a likelihood ratio. The likelihood ratio can be multiplied by prior odds based on nongenetic evidence to calculate the posterior odds, that is, by applying Bayes' Theorem, to arrive at a probability of identity. For the identification of human remains, the path creating the set and intersection of variables that contribute to the prior odds needs to be appreciated and well defined. Other than considering the total number of missing persons, the forensic DNA community has been silent on specifying the elements of prior odds computations. The variables include the number of missing individuals, eyewitness accounts, anthropological features, demographics and other identifying characteristics. The assumptions, supporting data and reasoning that are used to establish a prior probability that will be combined with the genetic data need to be considered and justified. Otherwise, data may be unintentionally or intentionally manipulated to achieve a probability of identity that cannot be supported and can thus misrepresent the uncertainty with associations. The forensic DNA community needs to develop guidelines for objectively computing prior odds.
In silico model-based inference: a contemporary approach for hypothesis testing in network biology
Klinke, David J.
2014-01-01
Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900’s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. PMID:25139179
In silico model-based inference: a contemporary approach for hypothesis testing in network biology.
Klinke, David J
2014-01-01
Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. © 2014 American Institute of Chemical Engineers.
Accessible switching of electronic defect type in SrTiO3 via biaxial strain
NASA Astrophysics Data System (ADS)
Chi, Yen-Ting; Youssef, Mostafa; Sun, Lixin; Van Vliet, Krystyn J.; Yildiz, Bilge
2018-05-01
Elastic strain is used widely to alter the mobility of free electronic carriers in semiconductors, but a predictive relationship between elastic lattice strain and the extent of charge localization of electronic defects is still underdeveloped. Here we considered SrTiO3, a prototypical perovskite as a model functional oxide for thin film electronic devices and nonvolatile memories. We assessed the effects of biaxial strain on the stability of electronic defects at finite temperature by combining density functional theory (DFT) and quasiharmonic approximation (QHA) calculations. We constructed a predominance diagram for free electrons and small electron polarons in this material, as a function of biaxial strain and temperature. We found that biaxial tensile strain in SrTiO3 can stabilize the small polaron, leading to a thermally activated and slower electronic transport, consistent with prior experimental observations on SrTiO3 and distinct from our prior theoretical assessment of the response of SrTiO3 to hydrostatic stress. These findings also resolved apparent conflicts between prior atomistic simulations and conductivity experiments for biaxially strained SrTiO3 thin films. Our computational approach can be extended to other functional oxides, and for the case of SrTiO3 our findings provide concrete guidance for conditions under which strain engineering can shift the electronic defect type and concentration to modulate electronic transport in thin films.
Biedermann, Alex; Taroni, Franco; Margot, Pierre
2012-01-31
Prior probabilities represent a core element of the Bayesian probabilistic approach to relatedness testing. This letter offers opinions on the commentary "Use of prior odds for missing persons identifications" by Budowle et al., published recently in this journal. Contrary to Budowle et al., we argue that the concept of prior probabilities (i) is not endowed with the notion of objectivity, (ii) is not a case for computation, and (iii) does not require new guidelines edited by the forensic DNA community, as long as probability is properly considered as an expression of personal belief.
A quasi-experimental study of after-event reviews and leadership development.
Derue, D Scott; Nahrgang, Jennifer D; Hollenbeck, John R; Workman, Kristina
2012-09-01
We examine how structured reflection through after-event reviews (AERs) promotes experience-based leadership development and how people's prior experiences and personality attributes influence the impact of AERs on leadership development. We test our hypotheses in a time-lagged, quasi-experimental study that followed 173 research participants for 9 months and across 4 distinct developmental experiences. Findings indicate that AERs have a positive effect on leadership development, and this effect is accentuated when people are conscientious, open to experience, and emotionally stable and have a rich base of prior developmental experiences.
Free-choice family learning experiences at informal astronomy observing events
NASA Astrophysics Data System (ADS)
Wenger, Matthew C.
This qualitative study is an exploratory look at family experiences at nighttime telescope observing events, often called star parties. Four families participated in this study which looked at their expectations, experiences and agendas as well as the roles that identity and family culture played in the negotiation of meaning. Two families who had prior experience with attending star parties were recruited ahead of time and two other families who were first time visitors were recruited on-site at the observing event. Data were collected at two star parties. At each event, one experienced family was paired with an on-site family for the purposes of facilitating conversations about expectations and prior experiences. The results of this study showed that learning is constantly occurring among families, and that star parties and family culture were mediational means for making meaning. Expectations and agendas were found to affect the families' star party experiences and differences were observed between the expectations and experiences of families based on their prior experiences with star parties. These data also showed that family members are actively negotiating their individual and family identities. These families use their cultural history together to make sense of their star party experiences; however, the meaning that families were negotiating was often focused more on developing family and individual identity rather than science content. The families in this study used the star party context as a way to connect with each other, to make sense of their prior experiences, and as raw material for making sense of future experiences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pointer, William David
The objective of this effort is to establish a strategy and process for generation of suitable computational mesh for computational fluid dynamics simulations of departure from nucleate boiling in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
Ye, Chuyang; Murano, Emi; Stone, Maureen; Prince, Jerry L
2015-10-01
The tongue is a critical organ for a variety of functions, including swallowing, respiration, and speech. It contains intrinsic and extrinsic muscles that play an important role in changing its shape and position. Diffusion tensor imaging (DTI) has been used to reconstruct tongue muscle fiber tracts. However, previous studies have been unable to reconstruct the crossing fibers that occur where the tongue muscles interdigitate, which is a large percentage of the tongue volume. To resolve crossing fibers, multi-tensor models on DTI and more advanced imaging modalities, such as high angular resolution diffusion imaging (HARDI) and diffusion spectrum imaging (DSI), have been proposed. However, because of the involuntary nature of swallowing, there is insufficient time to acquire a sufficient number of diffusion gradient directions to resolve crossing fibers while the in vivo tongue is in a fixed position. In this work, we address the challenge of distinguishing interdigitated tongue muscles from limited diffusion magnetic resonance imaging by using a multi-tensor model with a fixed tensor basis and incorporating prior directional knowledge. The prior directional knowledge provides information on likely fiber directions at each voxel, and is computed with anatomical knowledge of tongue muscles. The fiber directions are estimated within a maximum a posteriori (MAP) framework, and the resulting objective function is solved using a noise-aware weighted ℓ1-norm minimization algorithm. Experiments were performed on a digital crossing phantom and in vivo tongue diffusion data including three control subjects and four patients with glossectomies. On the digital phantom, effects of parameters, noise, and prior direction accuracy were studied, and parameter settings for real data were determined. The results on the in vivo data demonstrate that the proposed method is able to resolve interdigitated tongue muscles with limited gradient directions. 
The distributions of the computed fiber directions in both the controls and the patients were also compared, suggesting a potential clinical use for this imaging and image analysis methodology. Copyright © 2015 Elsevier Ltd. All rights reserved.
Prior Learning Assessment: U.S. Experience Facilitating Lifelong Learning.
ERIC Educational Resources Information Center
Mann, Carolyn M.
This paper focuses on the role of prior learning assessment in the life long learning of adults in the United States. The introduction stresses the increasing importance of life long learning in American society. The second section reviews prior learning and its assessment. Prior learning is formally defined as learning which has been acquired…
76 FR 75427 - Farm Loan Programs Loan Making Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-02
... prior farming experience of the applicant. This amendment is required by sections 5001 and 5101 of the... years prior to the date of the application, if all prior farming occurred more than five years prior to...: (1) Is a beginning farmer or socially disadvantaged farmer engaged primarily in farming in the United...
Caffeine Use and Extroversion.
ERIC Educational Resources Information Center
Landrum, R. Eric; Meliska, Charles J.
Some research on the stimulant effect of caffeine suggests that the amount of behavioral enhancement produced by caffeine may depend on subjects' prior experience with the task and the drug. A study was undertaken to test whether prior experience with a task while under the influence of caffeine would facilitate performance of that task. Male…
34 CFR 644.22 - How does the Secretary evaluate prior experience?
Code of Federal Regulations, 2010 CFR
2010-07-01
34 Education 3 2010-07-01 How does the Secretary evaluate prior experience? Section 644.22, Regulations of the Offices of the Department of Education (Continued), Office of Postsecondary Education, Department of Education, Educational Opportunity Centers.
34 CFR 647.22 - How does the Secretary evaluate prior experience?
Code of Federal Regulations, 2010 CFR
2010-07-01
34 Education 3 2010-07-01 How does the Secretary evaluate prior experience? Section 647.22, Regulations of the Offices of the Department of Education (Continued), Office of Postsecondary Education, Department of Education, Ronald E. McNair Postbaccalaureate Achievement…
Second-Order Conditioning during a Compound Extinction Treatment
ERIC Educational Resources Information Center
Pineno, Oskar; Zilski, Jessica M.; Schachtman, Todd R.
2007-01-01
Two conditioned taste aversion experiments with rats were conducted to establish if a target taste that had received a prior pairing with illness could be subject to second-order conditioning during extinction treatment in compound with a flavor that also received prior conditioning. In these experiments, the occurrence of second-order…
Effects of Students' Prior Knowledge on Scientific Reasoning in Density.
ERIC Educational Resources Information Center
Yang, Il-Ho; Kwon, Yong-Ju; Kim, Young-Shin; Jang, Myoung-Duk; Jeong, Jin-Woo; Park, Kuk-Tae
2002-01-01
Investigates the effects of students' prior knowledge on the scientific reasoning processes of performing the task of controlling variables with computer simulation and identifies a number of problems that students encounter in scientific discovery. Involves (n=27) 5th grade students and (n=33) 7th grade students. Indicates that students' prior…
Horn-Ritzinger, Sabine; Bernhardt, Johannes; Horn, Michael; Smolle, Josef
2011-04-01
The importance of inductive instruction in medical education is increasingly growing. Little is known about the relevance of prior knowledge regarding students' inductive reasoning abilities. The purpose is to evaluate this inductive teaching method as a means of fostering higher levels of learning and to explore how individual differences in prior knowledge (high [HPK] vs. low [LPK]) contribute to students' inductive reasoning skills. Twenty-six LPK and 18 HPK students could train twice with an interactive computer-based training object to discover the underlying concept before doing the final comprehension check. Students had a median of 76.9% of correct answers in the first, 90.9% in the second training, and answered 92% of the final assessment questions correctly. More important, 86% of all students succeeded with inductive learning, among them 83% of the HPK students and 89% of the LPK students. Prior knowledge did not predict performance on overall comprehension. This inductive instructional strategy fostered students' deep approaches to learning in a time-effective way.
Voss, Georgina
2013-09-01
This paper examines how young people's lived experiences with personal technologies can be used to teach engineering ethics in a way which facilitates greater engagement with the subject. Engineering ethics can be challenging to teach: as a form of practical ethics, it is framed around future workplace experience in a professional setting which students are assumed to have no prior experience of. Yet the current generations of engineering students, who have been described as 'digital natives', do however have immersive personal experience with digital technologies; and experiential learning theory describes how students learn ethics more successfully when they can draw on personal experience which gives context and meaning to abstract theories. This paper reviews current teaching practices in engineering ethics; and examines young people's engagement with technologies including cell phones, social networking sites, digital music and computer games to identify social and ethical elements of these practices which have relevance for the engineering ethics curricula. From this analysis three case studies are developed to illustrate how facets of the use of these technologies can be drawn on to teach topics including group work and communication; risk and safety; and engineering as social experimentation. Means for bridging personal experience and professional ethics when teaching these cases are discussed. The paper contributes to research and curriculum development in engineering ethics education, and to wider education research about methods of teaching 'the net generation'.
The cradle of causal reasoning: newborns' preference for physical causality.
Mascalzoni, Elena; Regolin, Lucia; Vallortigara, Giorgio; Simion, Francesca
2013-05-01
Perception of mechanical (i.e. physical) causality, in terms of a cause-effect relationship between two motion events, appears to be a powerful mechanism in our daily experience. In spite of a growing interest in the earliest causal representations, the role of experience in the origin of this sensitivity is still a matter of dispute. Here, we asked whether causal perception has an innate origin, a question never before tested at birth. Three experiments were carried out to investigate sensitivity at birth to some visual spatiotemporal cues present in a launching event. Newborn babies, only a few hours old, showed that they significantly preferred a physical causality event (i.e. Michotte's Launching effect) when matched to a delay event (i.e. a delayed launching; Experiment 1) or to a non-causal event completely identical to the causal one except for the order of the displacements of the two objects involved which was swapped temporally (Experiment 3). This preference for the launching event, moreover, also depended on the continuity of the trajectory between the objects involved in the event (Experiment 2). These results support the hypothesis that the human system possesses an early available, possibly innate basic mechanism to compute causality, such a mechanism being sensitive to the additive effect of certain well-defined spatiotemporal cues present in the causal event independently of any prior visual experience. © 2013 Blackwell Publishing Ltd.
Effect of subliminal visual material on an auditory signal detection task.
Moroney, E; Bross, M
1984-02-01
An experiment assessed the effect of subliminally embedded, visual material on an auditory detection task. 22 women and 19 men were presented tachistoscopically with words designated as "emotional" or "neutral" on the basis of prior GSRs and a Word Rating List under four conditions: (a) Unembedded Neutral, (b) Embedded Neutral, (c) Unembedded Emotional, and (d) Embedded Emotional. On each trial subjects made forced choices concerning the presence or absence of an auditory tone (1000 Hz) at threshold level; hits and false alarm rates were used to compute non-parametric indices for sensitivity (A') and response bias (B"). While overall analyses of variance yielded no significant differences, further examination of the data suggests the presence of subliminally "receptive" and "non-receptive" subpopulations.
Red blood cell sedimentation of Apheresis Granulocytes.
Lodermeier, Michelle A; Byrne, Karen M; Flegel, Willy A
2017-10-01
Sedimentation of Apheresis Granulocyte components removes red blood cells. It is used to increase the blood donor pool when blood group-compatible donors cannot be recruited for a patient because of a major ABO incompatibility or incompatible red blood cell antibodies in the recipient. Because granulocytes have little ABO and few other red blood cell antigens on their membrane, such incompatibility lies mostly with the contaminating red blood cells. Video Clip S1 shows the process of red blood cell sedimentation of an Apheresis Granulocyte component. This video was filmed with a single smart phone attached to a commercial tripod and was edited on a tablet computer with free software by an amateur videographer without prior video experience. © 2017 AABB.
Youngs, Noah; Penfold-Brown, Duncan; Drew, Kevin; Shasha, Dennis; Bonneau, Richard
2013-05-01
Computational biologists have demonstrated the utility of using machine learning methods to predict protein function from an integration of multiple genome-wide data types. Yet, even the best performing function prediction algorithms rely on heuristics for important components of the algorithm, such as choosing negative examples (proteins without a given function) or determining key parameters. The improper choice of negative examples, in particular, can hamper the accuracy of protein function prediction. We present a novel approach for choosing negative examples, using a parameterizable Bayesian prior computed from all observed annotation data, which also generates priors used during function prediction. We incorporate this new method into the GeneMANIA function prediction algorithm and demonstrate improved accuracy of our algorithm over current top-performing function prediction methods on the yeast and mouse proteomes across all metrics tested. Code and Data are available at: http://bonneaulab.bio.nyu.edu/funcprop.html
Computer-assisted diagnostic decision support: history, challenges, and possible paths forward.
Miller, Randolph A
2009-09-01
This paper presents a brief history of computer-assisted diagnosis, including challenges and future directions. Some ideas presented in this article on computer-assisted diagnostic decision support systems (CDDSS) derive from prior work by the author and his colleagues (see list in Acknowledgments) on the INTERNIST-1 and QMR projects. References indicate the original sources of many of these ideas.
Media multitasking behavior: concurrent television and computer usage.
Brasel, S Adam; Gips, James
2011-09-01
Changes in the media landscape have made simultaneous usage of the computer and television increasingly commonplace, but little research has explored how individuals navigate this media multitasking environment. Prior work suggests that self-insight may be limited in media consumption and multitasking environments, reinforcing a rising need for direct observational research. A laboratory experiment recorded both younger and older individuals as they used a computer and television concurrently, multitasking across television and Internet content. Results show that individuals are attending primarily to the computer during media multitasking. Although gazes last longer on the computer when compared to the television, the overall distribution of gazes is strongly skewed toward very short gazes only a few seconds in duration. People switched between media at an extreme rate, averaging more than 4 switches per min and 120 switches over the 27.5-minute study exposure. Participants had little insight into their switching activity and recalled their switching behavior at an average of only 12 percent of their actual switching rate revealed in the objective data. Younger individuals switched more often than older individuals, but other individual differences such as stated multitasking preference and polychronicity had little effect on switching patterns or gaze duration. This overall pattern of results highlights the importance of exploring new media environments, such as the current drive toward media multitasking, and reinforces that self-monitoring, post hoc surveying, and lay theory may offer only limited insight into how individuals interact with media.
Media Multitasking Behavior: Concurrent Television and Computer Usage
Gips, James
2011-01-01
Abstract Changes in the media landscape have made simultaneous usage of the computer and television increasingly commonplace, but little research has explored how individuals navigate this media multitasking environment. Prior work suggests that self-insight may be limited in media consumption and multitasking environments, reinforcing a rising need for direct observational research. A laboratory experiment recorded both younger and older individuals as they used a computer and television concurrently, multitasking across television and Internet content. Results show that individuals are attending primarily to the computer during media multitasking. Although gazes last longer on the computer when compared to the television, the overall distribution of gazes is strongly skewed toward very short gazes only a few seconds in duration. People switched between media at an extreme rate, averaging more than 4 switches per min and 120 switches over the 27.5-minute study exposure. Participants had little insight into their switching activity and recalled their switching behavior at an average of only 12 percent of their actual switching rate revealed in the objective data. Younger individuals switched more often than older individuals, but other individual differences such as stated multitasking preference and polychronicity had little effect on switching patterns or gaze duration. This overall pattern of results highlights the importance of exploring new media environments, such as the current drive toward media multitasking, and reinforces that self-monitoring, post hoc surveying, and lay theory may offer only limited insight into how individuals interact with media. PMID:21381969
Computerized screening for cognitive impairment in patients with COPD
Campman, Carlijn; van Ranst, Dirk; Meijer, Jan Willem; Sitskoorn, Margriet
2017-01-01
Purpose COPD is associated with cognitive impairment. These impairments should be diagnosed, but for reasons of time and budget they are often not investigated. The aim of this study is to examine the viability of a brief computerized cognitive test battery, Central Nervous System Vital Signs (CNSVS), in COPD patients. Patients and methods Patients with COPD referred to tertiary pulmonary rehabilitation were included. Cognitive functioning of patients was assessed with CNSVS before pulmonary rehabilitation and compared with age-corrected CNSVS norms. CNSVS is a 30-minute computerized test battery that includes tests of verbal and visual memory, psychomotor speed, processing speed, cognitive flexibility, complex attention, executive functioning, and reaction time. Results CNSVS was fully completed by 205 (93.2%, 105 females, 100 males) of the total group of patients (n=220, 116 females, 104 males). Z-tests showed that COPD patients performed significantly worse than the norms on all CNSVS cognitive domains. Slightly more than half of the patients (51.8%) had impaired functioning on 1 or more cognitive domains. Patients without computer experience performed significantly worse on CNSVS than patients who used the computer frequently. Conclusion The completion rate of CNSVS was high, and the cognitive dysfunctions measured with this screening were similar to the results found in prior research using paper-and-pencil cognitive tests. These results support the viability of this brief computerized cognitive screening in COPD patients, which may lead to better care for these patients. Cognitive performance of patients with little computer experience should be interpreted carefully. Future research on this issue is needed. PMID:29089756
Computational Simulation of a Water-Cooled Heat Pump
NASA Technical Reports Server (NTRS)
Bozarth, Duane
2008-01-01
A Fortran-language computer program for simulating the operation of a water-cooled vapor-compression heat pump in any orientation with respect to gravity has been developed by modifying a prior general-purpose heat-pump design code used at Oak Ridge National Laboratory (ORNL).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pagani, J.J.; Hayman, L.A.; Bigelow, R.H.
1983-04-01
The effect of 5 mg of intravenous diazepam (Valium) on the incidence of contrast media-associated seizures was studied in a randomized controlled trial involving 284 patients with known or suspected brain metastases undergoing cerebral computed tomography. Of these patients, 188 were found to have brain metastases, and it is estimated that for this subgroup prophylactic diazepam reduces the risk of contrast-associated seizure by a factor of 0.26. Seizures occurred in three of 96 patients with metastases on diazepam and in 14 of 92 patients with metastases but without diazepam. Factors related to increased risk of contrast media-associated seizures are: (1) prior seizure history due to brain metastases and/or prior contrast, (2) progressive cerebral metastases, and (3) prior or concurrent brain antineoplastic therapy. Factors not related to an increased risk of these seizures are: (1) contrast media dosage, chemical composition, or osmolarity, (2) computed tomographic appearance of metastases, and (3) type of primary malignancy. Concomitant therapeutic levels of diphenylhydantoin (Dilantin) do not protect completely against contrast media-associated seizures. The pathophysiology of contrast media-associated seizures is discussed in view of the risk factors determined by this study.
Early Results in Capella's Prior Learning Assessment Experimental Site Initiative
ERIC Educational Resources Information Center
Klein, Jillian
2017-01-01
In July 2014, the U.S. Department of Education announced a new round of experimental sites focusing on competency-based education. Capella University was selected to participate in three of the Department of Education's competency-based education (CBE) experiments and began by implementing the prior learning assessment experiment, which allows…
Random Access: The Latino Student Experience with Prior Learning Assessment
ERIC Educational Resources Information Center
Klein-Collins, Rebecca; Olson, Richard
2014-01-01
Many Latinos come to higher education as adults. One degree completion strategy that is particularly suited to adult students in higher education is prior learning assessment (PLA). PLA provides opportunities to evaluate a student's learning from work or life experience for the purpose of awarding college credit. For students whose…
Autonomous entropy-based intelligent experimental design
NASA Astrophysics Data System (ADS)
Malakar, Nabin Kumar
2011-07-01
The aim of this thesis is to explore the application of probability and information theory in experimental design, and to do so in a way that combines what we know about inference and inquiry in a comprehensive and consistent manner. Present day scientific frontiers involve data collection at an ever-increasing rate. This requires that we find a way to collect the most relevant data in an automated fashion. By following the logic of the scientific method, we couple an inference engine with an inquiry engine to automate the iterative process of scientific learning. The inference engine involves Bayesian machine learning techniques to estimate model parameters based upon both prior information and previously collected data, while the inquiry engine implements data-driven exploration. By choosing an experiment whose distribution of expected results has the maximum entropy, the inquiry engine selects the experiment that maximizes the expected information gain. The coupled inference and inquiry engines constitute an autonomous learning method for scientific exploration. We apply it to a robotic arm to demonstrate the efficacy of the method. Optimizing inquiry involves searching for an experiment that promises, on average, to be maximally informative. If the set of potential experiments is described by many parameters, the search involves a high-dimensional entropy space. In such cases, a brute force search method will be slow and computationally expensive. We develop an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment. This helps to reduce the number of computations necessary to find the optimal experiment. We also extended the method of maximizing entropy, and developed a method of maximizing joint entropy so that it could be used as a principle of collaboration between two robots. 
This is a major achievement of this thesis, as it enables information-based collaboration between two robotic units toward a shared goal in an automated fashion.
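The experiment-selection principle in this abstract (pick the experiment whose distribution of predicted outcomes has maximum entropy) can be sketched numerically. This is a minimal illustration, not the thesis's implementation: the toy linear model, the posterior samples, and the Gaussian approximation to the predictive entropy are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def predictive_entropy(predictions):
    """Gaussian approximation to the differential entropy (nats) of predictions."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(predictions))

def select_experiment(posterior_samples, candidate_experiments, model):
    """Return the index of the candidate whose predicted outcomes are most spread out."""
    entropies = [
        predictive_entropy(np.array([model(theta, x) for theta in posterior_samples]))
        for x in candidate_experiments
    ]
    return int(np.argmax(entropies))

# Toy model y = theta * x: with an uncertain posterior over theta, larger |x|
# spreads the predictions more, so it promises the larger information gain.
posterior = rng.normal(1.0, 0.5, size=500)
candidates = [0.1, 1.0, 5.0]
best = select_experiment(posterior, candidates, lambda th, x: th * x)
print(best)
```

A full inquiry engine would search this entropy landscape (the thesis's nested entropy sampling) rather than enumerate a short list, but the selection criterion is the same.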
Using Alice 2.0 to Design Games for People with Stroke.
Proffitt, Rachel; Kelleher, Caitlin; Baum, M Carolyn; Engsberg, Jack
2012-08-01
Computer and videogames are gaining in popularity as rehabilitation tools. Unfortunately, most systems still require extensive programming/engineering knowledge to create, something that therapists, as novice programmers, do not possess. There is software designed to allow novice programmers to create storyboards and games through simple drag-and-drop formats; however, its applications for therapeutic game development have not been studied. The purpose of this study was to have an occupational therapy (OT) student with no prior computer programming experience learn how to create computer games for persons with stroke using Alice 2.0, a drag-and-drop editor designed by Carnegie Mellon University (Pittsburgh, PA). The OT student learned how to use Alice 2.0 through a textbook, tutorials, and assistance from computer science students. She kept a journal of her process, detailing her successes and challenges. The OT student created three games for people with stroke using Alice 2.0. She found that although there were many supports in Alice for creating stories, it lacked critical pieces necessary for game design. Her recommendations for a future programming environment for therapists were that it (1) be efficient, (2) include basic game design pieces so therapists do not have to create them, (3) provide technical support, and (4) be simple. With the incorporation of these recommendations, a future programming environment for therapists will be an effective tool for therapeutic game development.
NASA Astrophysics Data System (ADS)
Frieze, Carol; Quesenberry, Jeria L.; Kemp, Elizabeth; Velázquez, Anthony
2012-08-01
Gender difference approaches to the participation of women in computing have not provided adequate explanations for women's declining interest in computer science (CS) and related technical fields. Indeed, the search for gender differences can work against diversity which we define as a cross-gender spectrum of characteristics, interests, abilities, experiences, beliefs and identities. Our ongoing case studies at Carnegie Mellon University (CMU) provide evidence to show that a focus on culture offers the most insightful and effective approach for investigating women's participation in CS. In this paper, we illustrate this approach and show the significance of cultural factors by describing a new case study which examines the attitudes of CS majors at CMU. Our analysis found that most men and women felt comfortable in the school, believed they could be successful in the CS environment at CMU, and thought they fit in socially and academically. In brief, we did not see any evidence of a strong gender divide in student attitudes towards fitting in or feeling like they could be successful; indeed we found that the Women-CS fit remained strong from prior years. Hence, our research demonstrates that women, alongside their male peers, can fit successfully into a CS environment and help shape that environment and computing culture, for the benefit of everyone, without accommodating presumed gender differences or any compromises to academic integrity.
Ruth, Veikko; Kolditz, Daniel; Steiding, Christian; Kalender, Willi A
2017-06-01
The performance of metal artifact reduction (MAR) methods in x-ray computed tomography (CT) suffers from incorrect identification of metallic implants in the artifact-affected volumetric images. The aim of this study was to investigate potential improvements of state-of-the-art MAR methods by using prior information on geometry and material of the implant. The influence of a novel prior knowledge-based segmentation (PS) compared with threshold-based segmentation (TS) on 2 MAR methods (linear interpolation [LI] and normalized-MAR [NORMAR]) was investigated. The segmentation is the initial step of both MAR methods. Prior knowledge-based segmentation uses 3-dimensional registered computer-aided design (CAD) data as prior knowledge to estimate the correct position and orientation of the metallic objects. Threshold-based segmentation uses an adaptive threshold to identify metal. Subsequently, for LI and NORMAR, the selected voxels are projected into the raw data domain to mark metal areas. Attenuation values in these areas are replaced by different interpolation schemes followed by a second reconstruction. Finally, the previously selected metal voxels are replaced by the metal voxels determined by PS or TS in the initial reconstruction. First, we investigated in an elaborate phantom study if the knowledge of the exact implant shape extracted from the CAD data provided by the manufacturer of the implant can improve the MAR result. Second, the leg of a human cadaver was scanned using a clinical CT system before and after the implantation of an artificial knee joint. The results were compared regarding segmentation accuracy, CT number accuracy, and the restoration of distorted structures. The use of PS improved the efficacy of LI and NORMAR compared with TS. Artifacts caused by insufficient segmentation were reduced, and additional information was made available within the projection data. 
The estimation of the implant shape was more exact and not dependent on a threshold value. Consequently, the visibility of structures was improved when comparing the new approach to the standard method. This was further confirmed by improved CT value accuracy and reduced image noise. The PS approach based on prior implant information provides image quality which is superior to TS-based MAR, especially when the shape of the metallic implant is complex. The new approach can be useful for improving MAR methods and dose calculations within radiation therapy based on the MAR corrected CT images.
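The sinogram-interpolation step shared by LI-type MAR methods (replacing attenuation values in the bins marked as the metal trace) can be sketched in one dimension. The toy data and this NumPy routine are illustrative only, not the authors' implementation, which operates on full projection data after segmentation and forward projection.

```python
import numpy as np

def interpolate_metal_trace(row, metal_mask):
    """Replace metal-flagged bins of a sinogram row by linear interpolation
    between the nearest unaffected bins (the LI step of classic MAR)."""
    row = row.astype(float).copy()
    bins = np.arange(row.size)
    row[metal_mask] = np.interp(bins[metal_mask], bins[~metal_mask], row[~metal_mask])
    return row

# Bins 2 and 3 pass through metal and carry inflated attenuation values.
row = np.array([10.0, 11.0, 50.0, 52.0, 12.0, 13.0])
mask = np.array([False, False, True, True, False, False])
print(interpolate_metal_trace(row, mask))  # bins 2 and 3 bridged to about 11.33 and 11.67
```

The paper's point is that the quality of `metal_mask` dominates the result: a prior-knowledge (CAD-based) segmentation marks the trace more accurately than a simple threshold, so the interpolation corrupts fewer valid measurements.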
Impact of Previous Pharmacy Work Experience on Pharmacy School Academic Performance
Mar, Ellena; T-L Tang, Terrill; Sasaki-Hill, Debra; Kuperberg, James R.; Knapp, Katherine
2010-01-01
Objectives To determine whether students' previous pharmacy-related work experience was associated with their pharmacy school performance (academic and clinical). Methods The following measures of student academic performance were examined: pharmacy grade point average (GPA), scores on cumulative high-stakes examinations, and advanced pharmacy practice experience (APPE) grades. The quantity and type of pharmacy-related work experience each student performed prior to matriculation was solicited through a student survey instrument. Survey responses were correlated with academic measures, and demographic-based stratified analyses were conducted. Results No significant difference in academic or clinical performance between those students with prior pharmacy experience and those without was identified. Subanalyses by work setting, position type, and substantial pharmacy work experience did not reveal any association with student performance. A relationship was found, however, between age and work experience, ie, older students tended to have more work experience than younger students. Conclusions Prior pharmacy work experience did not affect students' overall academic or clinical performance in pharmacy school. The lack of significant findings may have been due to the inherent practice limitations of nonpharmacist positions, changes in pharmacy education, and the limitations of survey responses. PMID:20498735
Prior experiences associated with residents' scores on a communication and interpersonal skill OSCE.
Yudkowsky, Rachel; Downing, Steven M; Ommert, Dennis
2006-09-01
This exploratory study investigated whether prior task experience and comfort correlate with scores on an assessment of patient-centered communication. A six-station standardized patient exam assessed the patient-centered communication of 79 PGY2-3 residents in Internal Medicine and Family Medicine. A survey provided information on prior experiences. t-tests, correlations, and multi-factorial ANOVA explored the relationships between scores and experiences. Experience with a task predicted comfort but did not predict communication scores. Comfort was moderately correlated with communication scores for some tasks; residents who were less comfortable were indeed less skilled, but greater comfort did not predict higher scores. Female gender and medical school experiences with standardized patients, along with training in patient-centered interviewing, were associated with higher scores. Residents without standardized patient experiences in medical school were almost five times more likely to be rejected by patients. Task experience alone does not guarantee better communication, and may instill a false sense of confidence. Experiences with standardized patients during medical school, especially in combination with interviewing courses, may provide an element of "deliberate practice" and have a long-term impact on communication skills. The combination of didactic courses and practice with standardized patients may promote a patient-centered approach.
Beeson, Tishra; Jester, Michelle; Proser, Michelle; Shin, Peter
2014-04-01
Despite community health centers' substantial role in local communities and in the broader safety-net healthcare system, very limited research has been conducted on community health center research experience, infrastructure, or needs from a national perspective. A national survey of 386 community health centers was conducted in 2011 and 2012 to assess research engagement among community health centers and their perceived needs, barriers, challenges, and facilitators with respect to their involvement in public health and health services research. This paper analyzes the differences between health centers that currently conduct or participate in research and health centers that have no prior research experience to determine whether prior research experience is indicative of different perceived challenges and research needs in community health center settings. © 2014 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Laune, Jordan; Tzeferacos, Petros; Feister, Scott; Fatenejad, Milad; Yurchak, Roman; Flocke, Norbert; Weide, Klaus; Lamb, Donald
2017-10-01
Thermodynamic and opacity properties of materials are necessary to accurately simulate laser-driven laboratory experiments. Such data are compiled in tabular format, since the thermodynamic range that needs to be covered cannot be described with one single theoretical model. Moreover, tabulated data can be made available prior to runtime, reducing both compute cost and code complexity. This approach is employed by the FLASH code. Equation of state (EoS) and opacity data come in various formats, matrix layouts, and file structures. We discuss recent developments on opacplot2, an open-source Python module that manipulates tabulated EoS and opacity data. We present software that builds upon opacplot2 and enables easy-to-use conversion of different table formats into the IONMIX format, the native tabular input used by FLASH. Our work enables FLASH users to take advantage of a wider range of accurate EoS and opacity tables in simulating HEDP experiments at the National Laser User Facilities.
Home Literacy Environments of Young Children with Down Syndrome: Findings from a Web-based Survey.
Al Otaiba, Stephanie; Lewis, Sandra; Whalon, Kelly; Dyrlund, Alison; McKenzie, Amy
2009-03-01
Early home literacy experiences, including parent-child book reading, account for a significant amount of children's later reading achievement. Yet, there is a very limited research base about the home literacy environments and experiences of children with cognitive disabilities. The purpose of this study is to describe findings from a web-based survey of home literacy environments of young children with Down syndrome. Respondents (n = 107) were mostly mothers; a majority were well-educated. Findings suggest that respondents gave literacy a higher priority than reported in prior research on children with disabilities. Over 70% of respondents had 50 or more children's books and also had literacy materials including flashcards, magnetic letters, and educational videos or computer games. Most parents read to their children and used these literacy materials 10-30 minutes per day. Respondents reported that their children had reached many important early literacy milestones, and they also described having relatively ambitious life-long literacy goals for their children. Important implications for research and practice are discussed.
Calculation of Rate Spectra from Noisy Time Series Data
Voelz, Vincent A.; Pande, Vijay S.
2011-01-01
As the resolution of experiments to measure folding kinetics continues to improve, it has become imperative to avoid bias that may come with fitting data to a predetermined mechanistic model. Towards this end, we present a rate spectrum approach to analyze timescales present in kinetic data. Computing rate spectra of noisy time series data via numerical discrete inverse Laplace transform is an ill-conditioned inverse problem, so a regularization procedure must be used to perform the calculation. Here, we show the results of different regularization procedures applied to noisy multi-exponential and stretched exponential time series, as well as data from time-resolved folding kinetics experiments. In each case, the rate spectrum method recapitulates the relevant distribution of timescales present in the data, with different priors on the rate amplitudes naturally corresponding to common biases toward simple phenomenological models. These results suggest an attractive alternative to the “Occam’s razor” philosophy of simply choosing models with the fewest number of relaxation rates. PMID:22095854
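The regularized inverse-Laplace idea in this abstract can be sketched as a ridge (Tikhonov) fit of exponential amplitudes over a fixed log-spaced rate grid. The grid size, rate range, noise level, and regularization weight below are illustrative assumptions, not values from the paper, which also explores other priors on the amplitudes.

```python
import numpy as np

def rate_spectrum(t, y, rates, lam=1e-2):
    """Fit y(t) ~ sum_k a_k * exp(-rates[k] * t) with Tikhonov regularization.

    The discrete inverse Laplace transform is ill-conditioned, so a ridge
    penalty sqrt(lam)*I is appended to the kernel before least squares.
    """
    K = np.exp(-np.outer(t, rates))                   # kernel: exp(-rate * t)
    A = np.vstack([K, np.sqrt(lam) * np.eye(rates.size)])
    b = np.concatenate([y, np.zeros(rates.size)])
    amps, *_ = np.linalg.lstsq(A, b, rcond=None)      # regularized solution
    return amps

# Synthetic two-exponential decay with additive noise.
t = np.linspace(0, 5, 200)
rng = np.random.default_rng(1)
y = 1.0 * np.exp(-1.0 * t) + 0.5 * np.exp(-10.0 * t) + rng.normal(0, 0.01, t.size)
rates = np.logspace(-1, 2, 50)
amps = rate_spectrum(t, y, rates)
# amplitude mass should concentrate near the true rates of 1 and 10
```

The choice of regularizer is exactly the bias the paper discusses: a strong ridge prior smears the spectrum toward few broad peaks, mimicking the "fewest relaxation rates" philosophy without committing to a fixed mechanistic model.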
Cipolletta, Sabrina; Mocellin, Damiano
2017-01-09
Online counseling may be defined as an interaction between users and mental health professionals that takes place through computer mediated communication technology. This study aimed to investigate the attitudes of Italian psychologists towards different aspects of online counseling provided via email, chat, forums, and videoconference. An online questionnaire was administered to a sample of 289 licensed psychologists in the Veneto Region (Italy) in order to collect opinions, preferences, and intentions to use online modalities, along with prior knowledge and practice experiences. Only 18.3% of the respondents had previous experience with online counseling. Overall, the majority of psychologists (62.6%) were favorable towards online counseling, but they also had several reservations about the provision of online diagnosis and therapeutic interventions. Results showed a consistent lack of clarity regarding ethical and penal issues concerning online modalities. More efforts must be directed to deepening the application of new technologies in the field of psychology in order to enable an ethical and professional practice of online counseling in Italy.
Apparent minification in an imaging display under reduced viewing conditions.
Meehan, J W
1993-01-01
When extended outdoor scenes are imaged with magnification of 1 in optical, electronic, or computer-generated displays, scene features appear smaller and farther than in direct view. This has been shown to occur in various periscopic and camera-viewfinder displays outdoors in daylight. In four experiments it was found that apparent minification of the size of a planar object at a distance of 3-9 m indoors occurs in the viewfinder display of an SLR camera both in good light and in darkness with only the luminous object visible. The effect is robust and survives changes in the relationship between object luminance in the display and in direct view and occurs in the dark when subjects have no prior knowledge of room dimensions, object size or object distance. The results of a fifth experiment suggest that the effect is an instance of reduced visual size constancy consequent on elimination of cues for size, which include those for distance.
Tangible User Interfaces and Contrasting Cases as a Preparation for Future Learning
NASA Astrophysics Data System (ADS)
Schneider, Bertrand; Blikstein, Paulo
2018-04-01
In this paper, we describe an experiment that compared the use of a Tangible User Interface (physical objects augmented with digital information) and a set of Contrasting Cases as a preparation for future learning. We carried out an experiment (N = 40) with a 2 × 2 design: the first factor compared traditional instruction ("Tell & Practice") with a constructivist activity designed using the Preparation for Future Learning framework (PFL). The second factor contrasted state-of-the-art PFL learning activity (i.e., students studying Contrasting Cases) with an interactive tabletop featuring digitally enhanced manipulatives. In agreement with prior work, we found that dyads of students who followed the PFL activity achieved significantly higher learning gains compared to their peers who followed a traditional "Tell & Practice" instruction (large effect size). A similar effect was found in favor of the interactive tabletop compared to the Contrasting Cases (small-to-moderate effect size). We discuss implications for designing socio-constructivist activities using new computer interfaces.
26 CFR 20.2056A-7 - Allowance of prior transfer credit under section 2013.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 2013. 20.2056A-7 Section 20.2056A-7 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE... Taxable Estate § 20.2056A-7 Allowance of prior transfer credit under section 2013. (a) Property subject to QDOT election. Section 2056(d)(3) provides special rules for computing the section 2013 credit allowed...
A Model for New Linkages for Prior Learning Assessment
ERIC Educational Resources Information Center
Kalz, Marco; van Bruggen, Jan; Giesbers, Bas; Waterink, Wim; Eshuis, Jannes; Koper, Rob
2008-01-01
Purpose: The purpose of this paper is twofold: first the paper aims to sketch the theoretical basis for the use of electronic portfolios for prior learning assessment; second it endeavours to introduce latent semantic analysis (LSA) as a powerful method for the computation of semantic similarity between texts and a basis for a new observation link…
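The LSA step the paper proposes (computing semantic similarity between texts, e.g. portfolio evidence versus course descriptions) can be sketched as a truncated SVD of a term-document matrix followed by cosine similarity in the reduced space. The toy corpus and the latent dimension are assumptions for illustration only.

```python
import numpy as np

# Toy corpus: two prior-learning texts and one unrelated course description.
docs = [
    "portfolio of prior learning and work experience",
    "assessment of prior learning from work experience",
    "semantic analysis of course descriptions",
]
vocab = sorted({w for d in docs for w in d.split()})
# Term-document count matrix (rows: terms, columns: documents).
X = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

# Truncated SVD: keep the k strongest latent "topics".
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # document coordinates in latent space

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The two prior-learning documents should be closer to each other
# than either is to the course-description document.
print(cos(doc_vecs[0], doc_vecs[1]) > cos(doc_vecs[0], doc_vecs[2]))
```

A production pipeline would add tf-idf weighting and a much larger corpus, but the observation link the paper proposes rests on this similarity computation.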
ERIC Educational Resources Information Center
Campbell, Donald P.
2013-01-01
This study investigated the effect of student prior knowledge and feedback type on student achievement and satisfaction in an introductory managerial accounting course using computer-based formative assessment tools. The study involved a redesign of the existing Job Order Costing unit using the ADDIE model of instructional design. The…
Factors associated with simulator-assessed laparoscopic surgical skills of veterinary students.
Kilkenny, Jessica J; Singh, Ameet; Kerr, Carolyn L; Khosa, Deep K; Fransson, Boel A
2017-06-01
OBJECTIVE To determine whether simulator-assessed laparoscopic skills of veterinary students were associated with training level and prior experience performing nonlaparoscopic veterinary surgery and other activities requiring hand-eye coordination and manual dexterity. DESIGN Experiment. SAMPLE 145 students without any prior laparoscopic surgical or fundamentals of laparoscopic surgery (FLS) simulator experience in years 1 (n = 39), 2 (34), 3 (39), and 4 (33) at a veterinary college. PROCEDURES A questionnaire was used to collect data from participants regarding experience performing veterinary surgery, playing video games, and participating in other activities. Participants performed a peg transfer, pattern cutting, and ligature loop-placement task on an FLS simulator, and FLS scores were assigned by an observer. Scores were compared among academic years, and correlations between amounts of veterinary surgical experience and FLS scores were assessed. A general linear model was used to identify predictors of FLS scores. RESULTS Participants were predominantly female (75%), right-hand dominant (92%), and between 20 and 29 years of age (98%). No significant differences were identified among academic years in FLS scores for individual tasks or total FLS score. Scores were not significantly associated with prior surgical or video game experience. Participants reporting no handicraft experience had significantly lower total FLS scores and FLS scores for task 2 than did participants reporting a lot of handicraft experience. CONCLUSIONS AND CLINICAL RELEVANCE Prior veterinary surgical and video game experience had no influence on FLS scores in this group of veterinary students, suggesting that proficiency of veterinary students in FLS may require specific training.
Microprocessors in the Curriculum and the Classroom.
ERIC Educational Resources Information Center
Summers, M. K.
1978-01-01
This article, directed at teachers concerned with computer science courses at sixth-form level with no prior knowledge of microprocessors, provides a basic introduction, and describes possible applications of a microprocessor development system as a teaching aid in computer sciences courses in UK secondary school. (Author/RAO)
ERIC Educational Resources Information Center
Rourke, Martha; Rourke, Patrick
1974-01-01
The school district business manager can make sound, cost-conscious decisions in the purchase of computer equipment by developing a list of cost-justified applications for automation, considering the software, writing performance specifications for bidding or negotiating a contract, and choosing the vendor wisely prior to the purchase; and by…
2013-01-01
Objective. This study compared the relationship between computer experience and performance on computerized cognitive tests and a traditional paper-and-pencil cognitive test in a sample of older adults (N = 634). Method. Participants completed computer experience and computer attitudes questionnaires, three computerized cognitive tests (Useful Field of View (UFOV) Test, Road Sign Test, and Stroop task) and a paper-and-pencil cognitive measure (Trail Making Test). Multivariate analysis of covariance was used to examine differences in cognitive performance across the four measures between those with and without computer experience after adjusting for confounding variables. Results. Although computer experience had a significant main effect across all cognitive measures, the effect sizes were similar. After controlling for computer attitudes, the relationship between computer experience and UFOV was fully attenuated. Discussion. Findings suggest that computer experience is not uniquely related to performance on computerized cognitive measures compared with paper-and-pencil measures. Because the relationship between computer experience and UFOV was fully attenuated by computer attitudes, this may imply that motivational factors are more influential to UFOV performance than computer experience. Our findings support the hypothesis that computer use is related to cognitive performance, and this relationship is not stronger for computerized cognitive measures. Implications and directions for future research are provided. PMID:22929395
NASA Astrophysics Data System (ADS)
Antoine, Marilyn V.
2011-12-01
The purpose of this research was to extend earlier research on sources of self-efficacy (Lent, Lopez, & Bieschke, 1991; Usher & Pajares, 2009) to the information technology domain. The principal investigator examined how Bandura's (1977) sources of self-efficacy information---mastery experience, vicarious experience, verbal persuasion, and physiological states---shape computer self-efficacy beliefs and influence the decision to use or not use computers. The study took place at a mid-sized Historically Black College or University in the South. A convenience sample of 105 undergraduates was drawn from students enrolled in multiple sections of two introductory computer courses. There were 67 females and 38 males. This research was a correlational study of the following variables: sources of computer self-efficacy, general computer self-efficacy, outcome expectations, computer anxiety, and intention to use computers. The principal investigator administered a survey questionnaire containing 52 Likert items to measure the major study variables. Additionally, the survey instrument collected demographic variables such as gender, age, race, intended major, classification, technology use, technology adoption category, and whether the student owns a computer. The results reveal the following: (1) Mastery experience and verbal persuasion had statistically significant relationships to general computer self-efficacy, while vicarious experience and physiological states had non-significant relationships. Mastery experience had the strongest correlation to general computer self-efficacy. (2) All of the sources of computer self-efficacy had statistically significant relationships to personal outcome expectations. Vicarious experience had the strongest correlation to personal outcome expectations. (3) All of the sources of self-efficacy had statistically significant relationships to performance outcome expectations. 
Vicarious experience had the strongest correlation to performance outcome expectations. (4) Mastery experience and physiological states had statistically significant relationships to computer anxiety, while vicarious experience and verbal persuasion had non-significant relationships. Physiological states had the strongest correlation to computer anxiety. (5) Mastery experience, vicarious experience, and physiological states had statistically significant relationships to intention to use computers, while verbal persuasion had a non-significant relationship. Mastery experience had the strongest correlation to intention to use computers. Gender-related findings indicate that females reported higher average mastery experience, vicarious experience, physiological states, and intention to use computers than males. Females reported lower average general computer self-efficacy, computer anxiety, verbal persuasion, personal outcome expectations, and performance outcome expectations than males. The results of this study can be used to develop strategies for increasing general computer self-efficacy, outcome expectations, and intention to use computers. The results can also be used to develop strategies for reducing computer anxiety.
Heuristics as Bayesian inference under extreme priors.
Parpart, Paula; Jones, Matt; Love, Bradley C
2018-05-01
Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
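The continuum the abstract describes, from ordinary regression to a heuristic as the prior becomes infinitely strong, can be illustrated with a small sketch. This is an illustrative reconstruction, not the authors' exact formulation: it uses penalized least squares whose prior mean is an equal-weights (tallying-like) vector. The data, the penalty strength `lam`, and the helper name `shrunken_regression` are all assumptions for the demonstration; at `lam = 0` the estimate is ordinary least squares, and as `lam` grows it converges to the heuristic's equal weights.

```python
import numpy as np

def shrunken_regression(X, y, w0, lam):
    """Penalized least squares with prior mean w0: minimizes
    ||y - X w||^2 + lam * ||w - w0||^2 (closed-form solution)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y + lam * w0)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([2.0, 1.0, 0.5]) + rng.normal(scale=0.5, size=50)

w0 = np.ones(3)                              # tallying-like prior: equal unit weights
ols = shrunken_regression(X, y, w0, 0.0)     # lam = 0: ordinary least squares
tally = shrunken_regression(X, y, w0, 1e6)   # lam -> infinity: the prior (heuristic) wins
```

Intermediate values of `lam` trace out the continuum of models the abstract refers to, with the best predictive performance typically somewhere between the two extremes.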
Bootstrapping and Maintaining Trust in the Cloud
2016-12-01
simultaneous cloud nodes. 1. INTRODUCTION The proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as...Amazon Web Services and Google Compute Engine means more cloud tenants are hosting sensitive, private, and business-critical data and applications in the...thousands of IaaS resources as they are elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features
Patient Experiences with Surveillance Endoscopy: A Qualitative Study
Arney, Jennifer; Hinojosa-Lindsey, Marilyn; Street, Richard L.; Hou, Jason; El-Serag, Hashem B.; Naik, Aanand D.
2014-01-01
Background Prior studies examining patterns of esophagogastroduodenoscopy (EGD) surveillance in patients with Barrett’s Esophagus (BE) demonstrate variable adherence to practice guidelines. In prior studies, memories of endoscopic experiences shaped patients’ overall perceptions and subsequent adherence behaviors, but the specific elements of that experience are unclear. We therefore sought to identify elements of the EGD experience that frame patients’ memories and overall perceptions of surveillance. Methods We conducted structured in-depth, qualitative interviews with BE patients in a single regional medical center. We recruited patients with a range of severity of BE (non-dysplastic, low-grade and high-grade dysplasia) who recently completed an EGD. Data collection continued until we reached thematic saturation (n=20). We applied principles of framework analysis to identify emerging themes regarding patients’ salient EGD experiences. We validated our coding scheme through multidisciplinary consensus meetings comprised of clinician (gastroenterologist and internist) and non-clinician investigators (sociologist and public health expert). Results Patient experiences can be conceptualized within a temporal model of surveillance EGD: prior to endoscopy, during the endoscopy procedure, and after endoscopy. Within this model, the most memorable aspects of the EGD experience include physician-patient communication prior to EGD, wait time at the endoscopy center, interpersonal interactions at the time of the EGD, level of pain or discomfort with the procedure, level of trust in the physician following EGD, and gaining a sense of control over BE. Conclusions We identified six salient memories before, during, and after the procedure that shape patients’ perceptions of the EGD experience. We offer recommendations for measuring the patient experience of EGD using a composite of validated survey items. 
Future studies should test the relation of patient experience measures and adherence to surveillance EGD. PMID:24500449
Zonal Acoustic Velocimetry in 30-cm, 60-cm, and 3-m Laboratory Models of the Outer Core
NASA Astrophysics Data System (ADS)
Rojas, R.; Doan, M. N.; Adams, M. M.; Mautino, A. R.; Stone, D.; Lekic, V.; Lathrop, D. P.
2016-12-01
Knowledge of zonal flows and shear is key to understanding magnetic field dynamics in the Earth and in laboratory experiments with Earth-like geometries. Traditional techniques for measuring fluid flow using visualization and particle tracking are not well-suited to liquid metal flows. This has led us to develop a flow measurement technique based on acoustic mode velocimetry adapted from helioseismology. As a first step prior to measurements in the liquid sodium experiments, we implement this technique in our 60-cm diameter spherical Couette experiment in air. To account for a more realistic experimental geometry, including deviations from spherical symmetry, we compute predicted frequencies of acoustic normal modes using the finite element method. The higher accuracy of the predicted frequencies allows the identification of over a dozen acoustic modes, and mode identification is further aided by the use of multiple microphones and by analyzing spectra together with those obtained at a variety of nearby Rossby numbers. Differences between the predicted and observed mode frequencies are caused by differences in flow patterns present in the experiment. We compare acoustic mode frequency splittings with theoretical predictions for stationary-fluid and solid-body flow conditions, finding excellent agreement. We also use this technique to estimate the zonal shear in those experiments across a range of Rossby numbers. Finally, we report on initial attempts to use this technique in liquid sodium in the 3-meter diameter experiment and parallel experiments performed in water in the 30-cm diameter experiment.
Computational Modeling of Magnetically Actuated Propellant Orientation
NASA Technical Reports Server (NTRS)
Hochstein, John I.
1996-01-01
Unlike terrestrial applications where gravity positions liquid at the "bottom" of the tank, the location of liquid propellant in spacecraft tanks is uncertain unless specific actions are taken or special features are built into the tank. Some mission events require knowledge of liquid position prior to a particular action: liquid must be positioned over the tank outlet prior to starting the main engines and must be moved away from the tank vent before vapor can be released overboard to reduce pressure. It may also be desirable to positively position liquid to improve propulsion system performance: moving liquid away from the tank walls will dramatically decrease the rate of heat transfer to the propellant, suppressing the boil-off rate, thereby reducing overall mission propellant requirements. The process of moving propellant to a desired position is referred to as propellant orientation or reorientation. Propulsive reorientation relies on small auxiliary thrusters to accelerate the tank. The inertia of the liquid causes it to collect in the aft-end of the tank if the acceleration is forward. Liquid Acquisition Devices (LAD's) rely on surface tension to hold the liquid within special geometries, (i.e. vanes, wire-mesh channels, start-baskets), to positively position propellants. Both of these technologies add significant weight and complexity to the spacecraft and can be limiting systems for long duration missions. The subject of the present research is an alternate technique for positively positioning liquid within spacecraft propellant tanks: magnetic fields. LOX is paramagnetic (attracted toward a magnet) and LH2 is diamagnetic (repelled from a magnet). Order-of-magnitude analyses, performed in the 1960's to determine required magnet size, concluded that the magnets would be prohibitively massive and this option has remained dormant during the intervening years. 
Recent advances in high-temperature superconducting materials hold the promise of electromagnets with sufficient performance to support cryogenic propellant management tasks. In late 1992, NASA MSFC began a new investigation in this technology commencing with the design of the Magnetically-Actuated Propellant Orientation (MAPO) experiment. A mixture of ferrofluid and water is used to simulate the paramagnetic properties of LOX and the experiment is being flown on the KC-135 aircraft to provide a reduced gravity environment. The influence of a 0.4 Tesla ring magnet on flow into and out of a subscale Plexiglas tank is being recorded on video tape. The most efficient approach to evaluating the feasibility of MAPO is to complement the experimental program with development of a computational tool to model the process of interest. The goal of the present research is to develop such a tool. Once confidence in its fidelity is established by comparison to data from the MAPO experiment, it can be used to assist in the design of future experiments and to study the parameter space of the process. Ultimately, it is hoped that the computational model can serve as a design tool for full-scale spacecraft applications.
ERIC Educational Resources Information Center
Gurlitt, Johannes; Renkl, Alexander
2010-01-01
Two experiments investigated the effects of characteristic features of concept mapping used for prior knowledge activation. Characteristic demands of concept mapping include connecting lines representing the relationships between concepts and labeling these lines, specifying the type of the semantic relationships. In the first experiment,…
Media Choice for Intra-School Communication: The Role of Environment, User, and Medium
ERIC Educational Resources Information Center
Caspi, Avner; Blau, Ina
2011-01-01
The influence of media richness, media attentional load, social influence and users' prior experience with media on selection of media to transmit different messages to peers within an educational organization was tested. Media were discriminated by all potential variables. Support was found for the role of prior experience and social influence in…
ERIC Educational Resources Information Center
Krieg, Dana Balsink
2013-01-01
Increasing numbers of students are experiencing difficulty adjusting to college. Violated expectations of college may increase the stress experienced across the college career. Therefore, 36 college students were assessed prior to matriculation, during the first year and during the senior year. Expectations and experiences of academics, social…
Transitions to Adulthood in a Changing Economy: No Work, No Family, No Future?
ERIC Educational Resources Information Center
Booth, Alan, Ed.; Crouter, Ann C., Ed.; Shanahan, Michael J., Ed.
This book contains 17 papers devoted to the following four aspects of the transition to adulthood: effects of alterations in the structure of opportunity; effects of prior experiences in the family; effects of prior experience in the workplace; and career development and marriage formation during a period of rising inequality. The following papers…
Random Access: The Latino Student Experience with Prior Learning Assessment. Executive Summary
ERIC Educational Resources Information Center
Klein-Collins, Rebecca; Olson, Richard
2014-01-01
Many Latinos come to higher education as adults. One degree completion strategy that is particularly suited to adult students in higher education is prior learning assessment (PLA). PLA provides opportunities to evaluate a student's learning from work or life experience for the purpose of awarding college credit. For students whose…
A Fair and Balanced Look at the News: What Affects Memory for Controversial Arguments?
ERIC Educational Resources Information Center
Wiley, J.
2005-01-01
This research demonstrates how prior knowledge may allow for qualitative differences in representation of texts about controversial issues. People often experience a memory bias in favor of information with which they agree. In several experiments it was found that individuals with high prior knowledge about the topic were better able to recall…
ERIC Educational Resources Information Center
García-Carmona, Antonio; Criado, Ana M.; Cruz-Guzmán, Marta
2018-01-01
A diagnostic study is presented of the prior experiences, conceptions, and pedagogical valuations of prospective primary teachers (PPTs) about experimental activities (ExA's) in science education. The participants were 121 PPTs who, in small teams, responded to various questions related to ExA. Their responses were analysed interpretively with…
ERIC Educational Resources Information Center
Friedrichsen, Patricia J.; Abell, Sandra K.; Pareja, Enrique M.; Brown, Patrick L.; Lankford, Deanna M.; Volkmann, Mark J.
2009-01-01
Alternative certification programs (ACPs) have been proposed as a viable way to address teacher shortages, yet we know little about how teacher knowledge develops within such programs. The purpose of this study was to investigate prior knowledge for teaching among students entering an ACP, comparing individuals with teaching experience to those…
Trajectories in Teacher Education: Recognising Prior Learning in Practice
ERIC Educational Resources Information Center
Andersson, Per; Hellberg, Kristina
2009-01-01
This article analyses the trajectories into teacher education of a group of child minders who are studying to become pre-school teachers. The specific focus is what impact their prior experiences and learning from pre-school have on their trajectories, and how these experiences and learning are recognised in the first year of teacher education. A…
The influence of performance on action-effect integration in sense of agency.
Wen, Wen; Yamashita, Atsushi; Asama, Hajime
2017-08-01
Sense of agency refers to the subjective feeling of being able to control an outcome through one's own actions or will. Prior studies have shown that both sensory processing (e.g., comparisons between sensory feedback and predictions based on one's motor intentions) and high-level cognitive/constructive processes (e.g., inferences based on one's performance or the consequences of one's actions) contribute to judgments of sense of agency. However, it remains unclear how these two types of processes interact, which is important for clarifying the mechanisms underlying sense of agency. Thus, we examined whether performance-based inferences influence action-effect integration in sense of agency using a delay detection paradigm in two experiments. In both experiments, participants pressed left and right arrow keys to control the direction in which a moving dot was travelling. The dot's response delay was manipulated randomly on 7 levels (0-480 ms) between the trials; for each trial, participants were asked to judge whether the dot response was delayed and to rate their level of agency over the dot. In Experiment 1, participants tried to direct the dot to reach a destination on the screen as quickly as possible. Furthermore, the computer assisted participants by ignoring erroneous commands for half of the trials (assisted condition), while in the other half, all of the participants' commands were executed (self-control condition). In Experiment 2, participants directed the dot as they pleased (without a specific goal), but, in half of the trials, the computer randomly ignored 32% of their commands (disturbed condition) rather than assisted them. The results from the two experiments showed that performance enhanced action-effect integration. Specifically, when task performance was improved through the computer's assistance in Experiment 1, delay detection was reduced in the 480-ms delay condition, despite the fact that 32% of participants' commands were ignored. 
Conversely, when no feedback on task performance was given (as in Experiment 2), the participants reported greater delay when some of their commands were randomly ignored. Furthermore, the results of a logistic regression analysis showed that the threshold of delay detection was greater in the assisted condition than in the self-control condition in Experiment 1, which suggests a wider time window for action-effect integration. A multivariate analysis also revealed that assistance was related to reduced delay detection via task performance, while reduced delay detection was directly correlated with a better sense of agency. These results indicate an association between the implicit and explicit aspects of sense of agency. Copyright © 2017 Elsevier Inc. All rights reserved.
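The threshold analysis mentioned above can be sketched with simulated data. This is a hedged illustration, not the study's actual analysis or parameter values: it fits a logit-linear model to per-level detection proportions (a simple stand-in for full maximum-likelihood logistic regression) and reads off the delay at which the fitted detection probability crosses 50%. The seven delay levels, the trial count, the slope, and the 240-ms "true" threshold are made-up values for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)
levels = np.linspace(0.0, 0.48, 7)                # 7 delay levels in seconds (0-480 ms design)
delays = np.repeat(levels, 200)                   # 200 simulated trials per level (assumption)
true_p = 1 / (1 + np.exp(-25 * (delays - 0.24)))  # assumed psychometric curve, 50% point at 240 ms
detected = rng.random(delays.size) < true_p       # simulated yes/no "was it delayed?" responses

# Logit-transform the per-level detection proportions, then fit a straight line.
props = np.array([detected[delays == lv].mean() for lv in levels]).clip(0.01, 0.99)
b1, b0 = np.polyfit(levels, np.log(props / (1 - props)), 1)

threshold = -b0 / b1  # delay (s) at which the fitted detection probability is 50%
```

A larger fitted `threshold` corresponds to a wider time window for action-effect integration, which is the comparison the abstract draws between the assisted and self-control conditions.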
Cloud Computing: Virtual Clusters, Data Security, and Disaster Recovery
NASA Astrophysics Data System (ADS)
Hwang, Kai
Dr. Kai Hwang is a Professor of Electrical Engineering and Computer Science and Director of the Internet and Cloud Computing Lab at the Univ. of Southern California (USC). He received his Ph.D. in Electrical Engineering and Computer Science from the Univ. of California, Berkeley. Prior to joining USC, he taught at Purdue Univ. for many years. He has also served as a visiting Chair Professor at Minnesota, Hong Kong Univ., Zhejiang Univ., and Tsinghua Univ. He has published 8 books and over 210 scientific papers in computer science/engineering.
Koele-Schmidt, Lindsey; Vasquez, Margarita M
2016-04-01
Competency rates in neonatal intubation among pediatric residents are low and currently not meeting ACGME/AAP standards. The aim of this study was to compare standard bedside teaching of neonatal endotracheal intubation to a computer module, as well as introduce residents to the emerging technology of videolaryngoscopy. The study population consisted of The University of Texas Health Science Center at San Antonio Pediatric interns/residents and PGY-1 Anesthesia interns rotating through the NICU. Prior to participating in the study, the residents completed a survey addressing past experiences with intubation, comfort level, and prior use of direct and videolaryngoscopy. Participants then performed timed trials of both direct and videolaryngoscopy on the SimNewB(®). They had up to three attempts to successfully intubate, with up to 30 s on each attempt. After randomization, participants received one of the following teaching interventions: standard, computer module, or both. This was followed by a second intubation trial and survey completion. Thirty residents were enrolled in the study. There was significant improvement in time to successful intubation in both methods after any teaching intervention (direct 22.0 ± 13.4 s vs 14.7 ± 5.9 s, P = 0.002 and videolaryngoscopy 42.2 ± 29.3 s vs 26.8 ± 18.6 s, P = 0.003). No differences were found between the types of teaching. Residents were faster at intubating with direct laryngoscopy compared to videolaryngoscopy before and after teaching. By the end of the study, only 33% of residents preferred using videolaryngoscopy over direct laryngoscopy, but 76% felt videolaryngoscopy was better to teach intubation. Both standard teaching and computer module teaching of neonatal intubation on a mannequin model results in improved time to successful intubation and overall improved resident confidence with intubation equipment and technique. 
Although intubation times were lower with direct laryngoscopy compared to videolaryngoscopy, the participating residents felt that videolaryngoscopy is an important educational tool. © 2015 John Wiley & Sons Ltd.
Using Perturbation Theory to Reduce Noise in Diffusion Tensor Fields
Bansal, Ravi; Staib, Lawrence H.; Xu, Dongrong; Laine, Andrew F.; Liu, Jun; Peterson, Bradley S.
2009-01-01
We propose the use of Perturbation theory to reduce noise in Diffusion Tensor (DT) fields. Diffusion Tensor Imaging (DTI) encodes the diffusion of water molecules along different spatial directions in a positive-definite, 3 × 3 symmetric tensor. Eigenvectors and eigenvalues of DTs allow the in vivo visualization and quantitative analysis of white matter fiber bundles across the brain. The validity and reliability of these analyses are limited, however, by the low spatial resolution and low Signal-to-Noise Ratio (SNR) in DTI datasets. Our procedures can be applied to improve the validity and reliability of these quantitative analyses by reducing noise in the tensor fields. We model a tensor field as a three-dimensional Markov Random Field and then compute the likelihood and the prior terms of this model using Perturbation theory. The prior term constrains the tensor field to be smooth, whereas the likelihood term constrains the smoothed tensor field to be similar to the original field. Thus, the proposed method generates a smoothed field that is close in structure to the original tensor field. We evaluate the performance of our method both visually and quantitatively using synthetic and real-world datasets. We quantitatively assess the performance of our method by computing the SNR for eigenvalues and the coherence measures for eigenvectors of DTs across tensor fields. In addition, we quantitatively compare the performance of our procedures with the performance of one method that uses a Riemannian distance to compute the similarity between two tensors, and with another method that reduces noise in tensor fields by anisotropically filtering the diffusion weighted images that are used to estimate diffusion tensors. 
These experiments demonstrate that our method significantly increases the coherence of the eigenvectors and the SNR of the eigenvalues, while simultaneously preserving the fine structure and boundaries between homogeneous regions, in the smoothed tensor field. PMID:19540791
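The likelihood-plus-prior trade-off described in the abstract can be illustrated with a scalar analogue. This is a toy sketch, not the authors' tensor-field method or their perturbation-theory computation: a MAP estimate that balances fidelity to the noisy data (the likelihood term) against a smoothness prior on neighbouring values. The `fidelity` weight, the sine-wave test signal, and the function name `smooth_map` are assumptions for the illustration.

```python
import numpy as np

def smooth_map(noisy, fidelity):
    """MAP estimate under a Gaussian likelihood and a smoothness prior:
    minimizes fidelity * ||s - noisy||^2 + ||D s||^2, where D is the
    first-difference operator. Solved in closed form."""
    n = len(noisy)
    D = np.diff(np.eye(n), axis=0)          # (n-1) x n first-difference matrix
    A = fidelity * np.eye(n) + D.T @ D      # normal equations of the MAP objective
    return np.linalg.solve(A, fidelity * noisy)

t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2 * np.pi * t)                                       # ground truth
noisy = clean + np.random.default_rng(2).normal(scale=0.3, size=t.size)
smoothed = smooth_map(noisy, fidelity=0.5)
```

As in the abstract, the prior term pulls the estimate toward smoothness while the likelihood term keeps it close to the observed field; raising `fidelity` trusts the data more, lowering it smooths more aggressively.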
Individual Differences in Learning Computer Programming: A Social Cognitive Approach
ERIC Educational Resources Information Center
Akar, Sacide Guzin Mazman; Altun, Arif
2017-01-01
The purpose of this study is to investigate and conceptualize the ranks of importance of social cognitive variables on university students' computer programming performances. Spatial ability, working memory, self-efficacy, gender, prior knowledge and the universities students attend were taken as variables to be analyzed. The study has been…
DEVELOPMENT AND APPLICATIONS OF CFD SIMULATIONS SUPPORTING URBAN AIR QUALITY AND HOMELAND SECURITY
Prior to September 11, 2001 developments of Computational Fluid Dynamics (CFD) were begun to support air quality applications. CFD models are emerging as a promising technology for such assessments, in part due to the advancing power of computational hardware and software. CFD si...
Factors Promoting Engaged Exploration with Computer Simulations
ERIC Educational Resources Information Center
Podolefsky, Noah S.; Perkins, Katherine K.; Adams, Wendy K.
2010-01-01
This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration; a process that involves students actively interacting with educational materials, sense making, and exploring primarily via their own questioning. We analyze…
Computer Utilization by Schools: An Example.
ERIC Educational Resources Information Center
Tondow, Murray
1968-01-01
The Educational Data Services Department of the Palo Alto Unified School District is responsible for implementing data processing needs to improve the quality of education in Palo Alto, California. Information from the schools enters the Department data library to be scanned, coded, and corrected prior to IBM 1620 computer input. Operating 17…
Computer-Based Imaginary Sciences and Research on Concept Acquisition.
ERIC Educational Resources Information Center
Allen, Brockenbrough S.
To control for interactions in learning research due to subjects' prior knowledge of the instructional material presented, an imaginary curriculum was presented with a computer-assisted technique based on Carl Bereiter's imaginary science of Xenograde systems. The curriculum consisted of a classification system for ten conceptual classes of…
Elementary Teachers' Simulation Adoption and Inquiry-Based Use Following Professional Development
ERIC Educational Resources Information Center
Gonczi, Amanda; Maeng, Jennifer; Bell, Randy
2017-01-01
The purpose of this study was to characterize and compare 64 elementary science teachers' computer simulation use prior to and following professional development (PD) aligned with Innovation Adoption Theory. The PD highlighted computer simulation affordances that elementary teachers might find particularly useful. Qualitative and quantitative…
A sex difference in effect of prior experience on object-mediated problem-solving in gibbons.
Cunningham, Clare; Anderson, James; Mootnick, Alan
2011-07-01
Understanding the functionally relevant properties of objects is likely facilitated by learning with a critical role for past experience. However, current evidence is conflicting regarding the effect of prior object exposure on acquisition of object manipulation skills. This may be due to the influence of life history variables on the capacity to benefit from such experience. This study assessed effect of task-relevant object exposure on object-mediated problem-solving in 22 gibbons using a raking-in task. Despite not using tools habitually, 14 gibbons spontaneously used a rake to obtain a reward. Having prior experience with the rake in an unrewarded context did not improve learning efficiency in males. However, females benefitted significantly from the opportunity to interact with the rake before testing, with reduced latencies to solution compared to those with no previous exposure. These results reflect potential sex differences in approach to novelty that moderate the possible benefits of prior experience. Due to their relatively high energetic requirements, reproductively active females may be highly motivated to explore potential resources; however, increased investment in developing offspring could make them more guarded in their investigations. Previous exposure that allows females to learn of an object's neutrality can offset this cautious exploration.
A study of usability principles and interface design for mobile e-books.
Wang, Chao-Ming; Huang, Ching-Hua
2015-01-01
This study examined usability principles and interface designs in order to understand the relationship between the intentions of mobile e-book interface designs and users' perceptions. First, this study summarised 4 usability principles and 16 interface attributes, in order to conduct usability testing and questionnaire survey by referring to Nielsen (1993), Norman (2002), and Yeh (2010), who proposed the usability principles. Second, this study used the interviews to explore the perceptions and behaviours of user operations through senior users of multi-touch prototype devices. The results of this study are as follows: (1) users' behaviour of operating an interactive interface is related to user prior experience; (2) users' rating of the visibility principle is related to users' subjective perception but not related to user prior experience; however, users' ratings of the ease, efficiency, and enjoyment principles are related to user prior experience; (3) the interview survey reveals that the key attributes affecting users' behaviour of operating an interface include aesthetics, achievement, and friendliness. This study conducts experiments to explore the effects of users’ prior multi-touch experience on users’ behaviour of operating a mobile e-book interface and users’ rating of usability principles. Both qualitative and quantitative data analyses were performed. By applying protocol analysis, key attributes affecting users’ behaviour of operation were determined.
Camargo, Affonso H L A; Cooperberg, Matthew R; Ershoff, Brent D; Rubenstein, Jonathan N; Meng, Maxwell V; Stoller, Marshall L
2005-05-01
To report our experience and review published reports on the laparoscopic management of peripelvic renal cysts. Peripelvic renal cysts represent a unique subset of renal cysts, as they are rare, commonly symptomatic, and more difficult to treat than simple peripheral renal cysts. Minimally invasive methods for the treatment of peripelvic renal cysts, including laparoscopic decortication, have recently become more common. Four patients who presented with symptomatic peripelvic cysts underwent laparoscopic decortication at our institution. All four were men aged 47 to 65 years. One patient had undergone an unsuccessful prior cyst aspiration. All patients underwent preoperative computed tomography and retrograde pyelography. The mean number of peripelvic cysts per patient was 3.0, and the mean cyst size was 7.1 cm. The mean operative time was 259 minutes (range 240 to 293), and the mean estimated blood loss was 30 mL (range 10 to 50). No evidence of cystic renal cell carcinoma was found on aspiration cytology or cyst wall pathologic examination. The mean hospital stay was 1.3 days. No inadvertent collecting system injuries and no intraoperative or postoperative complications occurred. All 4 patients achieved symptomatic relief and were determined to have radiologic success as determined by the 6-month postoperative computed tomography findings. Laparoscopic ablation of peripelvic renal cysts is more difficult than that of simple peripheral renal cysts and demands a heightened awareness of potential complications and, therefore, more advanced surgical skills. In addition to our experience, a thorough review of published reports found this procedure to be safe and effective with appropriate patient selection.
Commissioning of the PRIOR proton microscope
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varentsov, D.; Antonov, O.; Bakhmutova, A.; ...
2016-02-18
Recently, a new high energy proton microscopy facility PRIOR (Proton Microscope for FAIR, the Facility for Antiproton and Ion Research) has been designed, constructed, and successfully commissioned at GSI Helmholtzzentrum für Schwerionenforschung (Darmstadt, Germany). As a result of the experiments with 3.5–4.5 GeV proton beams delivered by the heavy ion synchrotron SIS-18 of GSI, 30 μm spatial and 10 ns temporal resolutions of the proton microscope have been demonstrated. A new pulsed power setup for studying properties of matter under extremes has been developed for the dynamic commissioning of the PRIOR facility. This study describes the PRIOR setup as well as the results of the first static and dynamic proton radiography experiments performed at GSI.
The Importance of Prior Knowledge.
ERIC Educational Resources Information Center
Cleary, Linda Miller
1989-01-01
Recounts a college English teacher's experience of reading and rereading Noam Chomsky, building up a greater store of prior knowledge. Argues that Frank Smith provides a theory for the importance of prior knowledge and Chomsky's work provided a personal example with which to interpret and integrate that theory. (RS)
External priors for the next generation of CMB experiments
Manzotti, Alessandro; Dodelson, Scott; Park, Youngsoo
2016-03-28
Planned cosmic microwave background (CMB) experiments can dramatically improve what we know about neutrino physics, inflation, and dark energy. The low level of noise, together with improved angular resolution, will increase the signal to noise of the CMB polarized signal as well as the reconstructed lensing potential of high redshift large scale structure. Projected constraints on cosmological parameters are extremely tight, but these can be improved even further with information from external experiments. Here, we examine quantitatively the extent to which external priors can lead to improvement in projected constraints from a CMB-Stage IV (S4) experiment on neutrino and dark energy properties. We find that CMB S4 constraints on neutrino mass could be strongly enhanced by external constraints on the cold dark matter density $\Omega_{c}h^{2}$ and the Hubble constant $H_{0}$. If polarization on the largest scales ($\ell<50$) will not be measured, an external prior on the primordial amplitude $A_{s}$ or the optical depth $\tau$ will also be important. A CMB constraint on the number of relativistic degrees of freedom, $N_{\rm eff}$, will benefit from an external prior on the spectral index $n_{s}$ and the baryon energy density $\Omega_{b}h^{2}$. Lastly, an external prior on $H_{0}$ will help constrain the dark energy equation of state ($w$).
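The mechanism by which an external prior tightens such a forecast can be illustrated with a toy Fisher-matrix calculation (the matrix values below are invented for illustration, not taken from the paper): an independent Gaussian prior of width σ on one parameter adds 1/σ² to the corresponding diagonal element of the Fisher matrix, which also shrinks the marginalized error on any parameter degenerate with it.

```python
import numpy as np

# Toy 2-parameter Fisher matrix (hypothetical values, not the paper's):
# think of theta = (neutrino mass, H0) with a strong degeneracy.
F = np.array([[4.0, 3.0],
              [3.0, 4.0]])

def marginalized_sigma(F, i):
    """1-sigma marginalized uncertainty on parameter i from a Fisher matrix."""
    return np.sqrt(np.linalg.inv(F)[i, i])

sigma_no_prior = marginalized_sigma(F, 0)

# An external prior enters as an independent Gaussian measurement:
# add 1/sigma_prior^2 to the corresponding diagonal element.
sigma_H0_prior = 0.5
F_with_prior = F.copy()
F_with_prior[1, 1] += 1.0 / sigma_H0_prior**2

sigma_with_prior = marginalized_sigma(F_with_prior, 0)
# The H0 prior also shrinks the marginalized error on the first,
# degenerate parameter, even though it constrains only the second.
```

This is the standard Fisher-forecast bookkeeping; the actual S4 analysis in the abstract uses full CMB power-spectrum derivatives rather than a 2×2 toy matrix.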
Multiple Embedded Processors for Fault-Tolerant Computing
NASA Technical Reports Server (NTRS)
Bolotin, Gary; Watson, Robert; Katanyoutanant, Sunant; Burke, Gary; Wang, Mandy
2005-01-01
A fault-tolerant computer architecture has been conceived in an effort to reduce vulnerability to single-event upsets (spurious bit flips caused by impingement of energetic ionizing particles or photons). As in some prior fault-tolerant architectures, the redundancy needed for fault tolerance is obtained by use of multiple processors in one computer. Unlike prior architectures, the multiple processors are embedded in a single field-programmable gate array (FPGA). What makes this new approach practical is the recent commercial availability of FPGAs that are capable of having multiple embedded processors. A working prototype (see figure) consists of two embedded IBM PowerPC 405 processor cores and a comparator built on a Xilinx Virtex-II Pro FPGA. This relatively simple instantiation of the architecture implements an error-detection scheme. A planned future version, incorporating four processors and two comparators, would correct some errors in addition to detecting them.
Computing Spacecraft Solar-Cell Damage by Charged Particles
NASA Technical Reports Server (NTRS)
Gaddy, Edward M.
2006-01-01
General EQFlux is a computer program that converts the measure of the damage done to solar cells in outer space by impingement of electrons and protons having many different kinetic energies into the measure of the damage done by an equivalent fluence of electrons, each having kinetic energy of 1 MeV. Prior to the development of General EQFlux, there was no single computer program offering this capability: For a given type of solar cell, it was necessary to either perform the calculations manually or to use one of three Fortran programs, each of which was applicable to only one type of solar cell. The problem in developing General EQFlux was to rewrite and combine the three programs into a single program that could perform the calculations for three types of solar cells and run in a Windows environment with a Windows graphical user interface. In comparison with the three prior programs, General EQFlux is easier to use.
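The conversion that EQFlux-type tools perform can be sketched generically: each energy bin of the incident particle spectrum is weighted by a relative damage coefficient and summed into a single 1 MeV-electron-equivalent fluence. The coefficients and fluences below are placeholders for illustration, not published values for any real cell type.

```python
# Illustrative 1 MeV equivalent-fluence sum. The damage coefficients are
# made-up placeholders, not the published relative damage values used by
# General EQFlux for any actual solar cell.
def equivalent_fluence(spectrum, damage_coeff):
    """spectrum: {energy_MeV: particle fluence per cm^2};
    damage_coeff: {energy_MeV: damage relative to a 1 MeV electron}.
    Returns the 1 MeV-electron-equivalent fluence."""
    return sum(phi * damage_coeff[e] for e, phi in spectrum.items())

electron_spectrum = {0.5: 1e13, 1.0: 5e12, 3.0: 1e12}   # placeholder fluences
electron_coeff = {0.5: 0.4, 1.0: 1.0, 3.0: 3.2}          # placeholder coefficients

eq_fluence = equivalent_fluence(electron_spectrum, electron_coeff)
# 0.4*1e13 + 1.0*5e12 + 3.2*1e12 = 1.22e13 equivalent 1 MeV electrons/cm^2
```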
Ethics and the 7 P's of computer use policies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, T.J.; Voss, R.B.
1994-12-31
A Computer Use Policy (CUP) defines who can use the computer facilities for what. The CUP is the institution's official position on the ethical use of computer facilities. The authors believe that writing a CUP provides an ideal platform to develop a group ethic for computer users. In prior research, the authors have developed a seven phase model for writing CUPs, entitled the 7 P's of Computer Use Policies. The purpose of this paper is to present the model and discuss how the 7 P's can be used to identify and communicate a group ethic for the institution's computer users.
26 CFR 20.2056A-7 - Allowance of prior transfer credit under section 2013.
Code of Federal Regulations, 2010 CFR
2010-04-01
... section 2013. 20.2056A-7 Section 20.2056A-7 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE... Taxable Estate § 20.2056A-7 Allowance of prior transfer credit under section 2013. (a) Property subject to QDOT election. Section 2056(d)(3) provides special rules for computing the section 2013 credit allowed...
1995-05-01
ForeSight (MECFS). Prior to joining TASC, Mr. Stanzione served as the deputy director of the Semi-Automated Forces group at Loral Advanced Distributed... TASC's other Synthetic Environment programs, including Weather in DIS (WINDS) and Multi-Echelon CFOR with ForeSight (MECFS).
20 CFR 404.1263 - When fractional part of a cent may be disregarded-for wages paid prior to 1987.
Code of Federal Regulations, 2010 CFR
2010-04-01
... cent shall be used in computing the total of contributions. If a State Fails To Make Timely Payments... disregarded-for wages paid prior to 1987. 404.1263 Section 404.1263 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Coverage of Employees of State and...
NASA Astrophysics Data System (ADS)
Vogelgesang, Jonas; Schorr, Christian
2016-12-01
We present a semi-discrete Landweber-Kaczmarz method for solving linear ill-posed problems and its application to Cone Beam tomography and laminography. Using a basis function-type discretization in the image domain, we derive a semi-discrete model of the underlying scanning system. Based on this model, the proposed method provides an approximate solution of the reconstruction problem, i.e. reconstructing the density function of a given object from its projections, in suitable subspaces equipped with basis function-dependent weights. This approach intuitively allows the incorporation of additional information about the inspected object, leading to a more accurate model of the X-rays through the object. Also, physical conditions of the scanning geometry, such as flat detectors in computed tomography for non-destructive testing, as well as non-regular scanning curves, e.g. those appearing in computed laminography (CL) applications, are directly taken into account during the modeling process. Finally, numerical experiments of a typical CL application in three dimensions are provided to verify the proposed method. The introduction of geometric prior information leads to a significantly increased image quality and superior reconstructions compared to standard iterative methods.
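The row-action idea underlying Kaczmarz-type reconstruction can be shown in its textbook, fully discrete form: each iteration projects the current image estimate onto the hyperplane defined by one projection equation. This sketch is only the classical special case, not the paper's semi-discrete variant with basis-function-dependent weights.

```python
import numpy as np

def kaczmarz(A, b, sweeps=500, relax=1.0):
    """Classical (fully discrete) Kaczmarz row-action iteration for A x = b.
    The paper's method is a semi-discrete Landweber-Kaczmarz variant; this
    sketch only illustrates the basic row-by-row projection update."""
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            # Project the current iterate onto the hyperplane a . x = b[i].
            x = x + relax * (b[i] - a @ x) / (a @ a) * a
    return x

# Tiny consistent system standing in for a discretized projection operator.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
x_rec = kaczmarz(A, A @ x_true)
```

For a consistent system the iterates converge to the solution; the relaxation parameter plays the role of the Landweber step size.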
Variational Gaussian approximation for Poisson data
NASA Astrophysics Data System (ADS)
Arridge, Simon R.; Ito, Kazufumi; Jin, Bangti; Zhang, Chen
2018-02-01
The Poisson model is frequently employed to describe count data, but in a Bayesian context it leads to an analytically intractable posterior probability distribution. In this work, we analyze a variational Gaussian approximation to the posterior distribution arising from the Poisson model with a Gaussian prior. This is achieved by seeking an optimal Gaussian distribution minimizing the Kullback-Leibler divergence from the posterior distribution to the approximation, or equivalently maximizing the lower bound for the model evidence. We derive an explicit expression for the lower bound, and show the existence and uniqueness of the optimal Gaussian approximation. The lower bound functional can be viewed as a variant of classical Tikhonov regularization that penalizes also the covariance. Then we develop an efficient alternating direction maximization algorithm for solving the optimization problem, and analyze its convergence. We discuss strategies for reducing the computational complexity via low rank structure of the forward operator and the sparsity of the covariance. Further, as an application of the lower bound, we discuss hierarchical Bayesian modeling for selecting the hyperparameter in the prior distribution, and propose a monotonically convergent algorithm for determining the hyperparameter. We present extensive numerical experiments to illustrate the Gaussian approximation and the algorithms.
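A toy scalar version makes the variational bound concrete. Assuming a log-link Poisson model y ~ Poisson(e^x) with Gaussian prior x ~ N(μ0, s0²) and approximation q = N(m, v), the identity E_q[e^x] = e^(m+v/2) makes the lower bound explicit; all numbers below are illustrative.

```python
import numpy as np

# Toy scalar variational Gaussian bound for Poisson data (log link assumed):
# y ~ Poisson(exp(x)), prior x ~ N(mu0, s0^2), approximation q = N(m, v).
y, mu0, s02 = 5.0, 0.0, 1.0

def elbo(m, v):
    """Evidence lower bound, up to the constant -log(y!)."""
    expected_loglik = y * m - np.exp(m + v / 2.0)   # E_q[y x - exp(x)]
    kl = 0.5 * ((v + (m - mu0) ** 2) / s02 - 1.0 + np.log(s02 / v))
    return expected_loglik - kl

# Crude grid maximization of the lower bound over (m, v).
ms = np.linspace(-2.0, 3.0, 501)
vs = np.linspace(0.01, 2.0, 200)
M, V = np.meshgrid(ms, vs, indexing="ij")
i, j = np.unravel_index(np.argmax(elbo(M, V)), (ms.size, vs.size))
m_opt, v_opt = ms[i], vs[j]
# The optimal variance satisfies 1/v = exp(m + v/2) + 1/s0^2, i.e. the
# curvature of the log posterior, recovering a Laplace-like approximation.
```

The paper replaces this grid search with an alternating direction maximization that scales to high-dimensional covariances.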
Ding, Chao; Yang, Lijun; Wu, Meng
2017-01-01
Due to the unattended nature and poor security guarantees of wireless sensor networks (WSNs), adversaries can easily make replicas of compromised nodes and place them throughout the network to launch various types of attacks. Such an attack is dangerous because it enables the adversaries to control large numbers of nodes and extend the damage of attacks to most of the network at quite limited cost. To stop the node replica attack, we propose a location similarity-based detection scheme using deployment knowledge. Compared with prior solutions, our scheme provides extra functionalities that prevent replicas from generating false location claims without deploying resource-consuming localization techniques on the resource-constrained sensor nodes. We evaluate the security performance of our proposal under different attack strategies through heuristic analysis, and show that our scheme achieves secure and robust replica detection by increasing the cost of node replication. Additionally, we evaluate the impact of the network environment on the proposed scheme through theoretic analysis and simulation experiments, and indicate that our scheme achieves effectiveness and efficiency with substantially lower communication, computational, and storage overhead than prior works under different situations and attack strategies. PMID:28098846
Seghouane, Abd-Krim; Iqbal, Asif
2017-09-01
Sequential dictionary learning algorithms have been successfully applied to functional magnetic resonance imaging (fMRI) data analysis. fMRI data sets are, however, structured data matrices with the notion of temporal smoothness in the column direction. This prior information, which can be converted into a constraint of smoothness on the learned dictionary atoms, has seldom been included in classical dictionary learning algorithms when applied to fMRI data analysis. In this paper, we tackle this problem by proposing two new sequential dictionary learning algorithms dedicated to fMRI data analysis that account for this prior information. These algorithms differ from the existing ones in their dictionary update stage. The steps of this stage are derived as a variant of the power method for computing the SVD. The proposed algorithms generate regularized dictionary atoms via the solution of a left regularized rank-one matrix approximation problem where temporal smoothness is enforced via regularization through basis expansion and sparse basis expansion in the dictionary update stage. Applications on synthetic data experiments and real fMRI data sets illustrating the performance of the proposed algorithms are provided.
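The core idea, a rank-one approximation of the residual where the temporal atom is regularized toward smoothness, can be sketched with a generic alternating (power-method-style) update. The second-difference penalty below is a common smoothness regularizer chosen for illustration; it is not the paper's exact basis-expansion scheme.

```python
import numpy as np

def smooth_rank1(E, lam=5.0, iters=50):
    """Rank-one approximation E ~ d @ u.T with a smooth temporal atom d,
    by alternating least squares (a power-method variant). `lam` weights a
    second-difference (curvature) penalty on d; a generic sketch, not the
    paper's basis-expansion update."""
    T, N = E.shape
    D = np.diff(np.eye(T), n=2, axis=0)        # second-difference operator
    rng = np.random.default_rng(0)
    u = rng.standard_normal(N)
    for _ in range(iters):
        u /= np.linalg.norm(u)
        # d-update: (I + lam D^T D) d = E u   (ridge on curvature)
        d = np.linalg.solve(np.eye(T) + lam * D.T @ D, E @ u)
        # u-update: ordinary least squares given d
        u = E.T @ d / (d @ d)
    return d, u

# Smooth temporal atom mixed into noisy data (stand-in for fMRI columns).
t = np.linspace(0, 1, 60)
d_true = np.sin(2 * np.pi * t)
rng = np.random.default_rng(1)
u_true = rng.standard_normal(20)
E = np.outer(d_true, u_true) + 0.05 * rng.standard_normal((60, 20))
d_est, u_est = smooth_rank1(E)
```

Because the penalty acts only on curvature, a slowly varying atom like the sine survives almost unshrunk while high-frequency noise in the estimate is suppressed.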
Zheng, Meixun; Bender, Daniel
2018-03-13
Computer-based testing (CBT) has made progress in health sciences education. In 2015, the authors led implementation of a CBT system (ExamSoft) at a dental school in the U.S. Guided by the Technology Acceptance Model (TAM), the purposes of this study were to (a) examine dental students' acceptance of ExamSoft; (b) understand factors impacting acceptance; and (c) evaluate the impact of ExamSoft on students' learning and exam performance. Survey and focus group data revealed that ExamSoft was well accepted by students as a testing tool and acknowledged by most for its potential to support learning. Regression analyses showed that perceived ease of use and perceived usefulness of ExamSoft significantly predicted student acceptance. Prior CBT experience and computer skills did not significantly predict acceptance of ExamSoft. Students reported that ExamSoft promoted learning in the first program year, primarily through timely and rich feedback on examination performance. t-Tests yielded mixed results on whether students performed better on computerized or paper examinations. The study contributes to the literature on CBT and the application of the TAM model in health sciences education. Findings also suggest ways in which health sciences institutions can implement CBT to maximize its potential as an assessment and learning tool.
How Haptic Size Sensations Improve Distance Perception
Battaglia, Peter W.; Kersten, Daniel; Schrater, Paul R.
2011-01-01
Determining distances to objects is one of the most ubiquitous perceptual tasks in everyday life. Nevertheless, it is challenging because the information from a single image confounds object size and distance. Though our brains frequently judge distances accurately, the underlying computations employed by the brain are not well understood. Our work illuminates these computations by formulating a family of probabilistic models that encompass a variety of distinct hypotheses about distance and size perception. We compare these models' predictions to a set of human distance judgments in an interception experiment and use Bayesian analysis tools to quantitatively select the best hypothesis on the basis of its explanatory power and robustness over experimental data. The central question is whether, and how, human distance perception incorporates size cues to improve accuracy. Our conclusions are: 1) humans incorporate haptic object size sensations for distance perception, 2) the incorporation of haptic sensations is suboptimal given their reliability, 3) humans use environmentally accurate size and distance priors, 4) distance judgments are produced by perceptual “posterior sampling”. In addition, we compared our model's estimated sensory and motor noise parameters with previously reported measurements in the perceptual literature and found good correspondence between them. Taken together, these results represent a major step forward in establishing the computational underpinnings of human distance perception and the role of size information. PMID:21738457
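The size-distance confound and its resolution by a haptic cue can be shown with a small grid-based Bayesian model: the visual angle constrains only the ratio s/d, so a measurement of size s also sharpens the posterior over distance d. All priors and noise levels below are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

# Toy grid model: visual angle theta ~ s/d confounds size s and distance d.
s = np.linspace(0.02, 0.30, 300)[:, None]   # object size (m), column axis
d = np.linspace(0.2, 2.0, 400)[None, :]     # distance (m), row axis

def gauss(x, mu, sigma):
    """Unnormalized Gaussian density (normalization cancels on the grid)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

prior = gauss(s, 0.10, 0.05) * gauss(d, 1.0, 0.5)   # size and distance priors
theta_obs = 0.10                                     # observed angle (rad)
like_vision = gauss(theta_obs, s / d, 0.01)          # angle likelihood

post_v = prior * like_vision                         # vision only
post_vh = post_v * gauss(0.08, s, 0.01)              # add haptic size cue

def dist_mean(post):
    p = post.sum(axis=0)                             # marginalize out size
    return float((p * d.ravel()).sum() / p.sum())

d_vision = dist_mean(post_v)
d_both = dist_mean(post_vh)
# The haptic cue pins down s, so theta ~ s/d now localizes d near
# s_haptic / theta = 0.8 m instead of the broad prior-driven estimate.
```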
Mathematics understanding and anxiety in collaborative teaching
NASA Astrophysics Data System (ADS)
Ansari, B. I.; Wahyu, N.
2017-12-01
This study aims to examine students' mathematical understanding and anxiety under collaborative teaching. The sample consists of 51 students in the 7th grade of MTs N Jeureula, one of the Islamic public junior high schools in Jeureula, Aceh, Indonesia. A test of mathematics understanding was administered to the students twice during the period of two months. The result suggests that there is a significant increase in mathematical understanding between the pre-test and post-test. We categorized the students into high, intermediate, and low levels of prior mathematics knowledge. At the high level of prior knowledge, there is no difference in mathematical understanding between the experiment and control groups. Meanwhile, at the intermediate and low levels of prior knowledge, there is a significant difference in mathematical understanding between the experiment and control groups. Mathematics anxiety is at an intermediate level in the experiment class and at a high level in the control group. There is no interaction between the learning model and the students' prior knowledge on mathematical understanding, but there is an interaction on mathematics anxiety. This indicates that the collaborative teaching model and the students' prior knowledge do not jointly affect mathematics understanding, but they do jointly affect mathematics anxiety.
NASA Astrophysics Data System (ADS)
Suess, Daniel; Rudnicki, Łukasz; Maciel, Thiago O.; Gross, David
2017-09-01
The outcomes of quantum mechanical measurements are inherently random. It is therefore necessary to develop stringent methods for quantifying the degree of statistical uncertainty about the results of quantum experiments. For the particularly relevant task of quantum state tomography, it has been shown that a significant reduction in uncertainty can be achieved by taking the positivity of quantum states into account. However—the large number of partial results and heuristics notwithstanding—no efficient general algorithm is known that produces an optimal uncertainty region from experimental data, while making use of the prior constraint of positivity. Here, we provide a precise formulation of this problem and show that the general case is NP-hard. Our result leaves room for the existence of efficient approximate solutions, and therefore does not in itself imply that the practical task of quantum uncertainty quantification is intractable. However, it does show that there exists a non-trivial trade-off between optimality and computational efficiency for error regions. We prove two versions of the result: one for frequentist and one for Bayesian statistics.
Hancox, S H; Sinnott, J D; Kirkland, P; Lipscomb, D; Owens, E; Howlett, D C
2018-03-01
A parathyroid multidisciplinary team meeting was set up at East Sussex Healthcare Trust, from November 2014 to November 2015, in order to improve and streamline services for patients with parathyroid pathology. Data were collected on all new referrals for hyperparathyroidism, and on the outcomes for each patient discussed at the meeting, including the number of operations and management outcomes. A survey was sent out to the members of the multidisciplinary team meeting to determine their perception of its effectiveness. Seventy-nine new referrals were discussed throughout the year; 43 per cent were recommended for surgery, 41 per cent had a trial of conservative or medical management before re-discussion, and 16 per cent required further imaging. Ninety-two per cent of patients underwent an ultrasound, single-photon emission computed tomography/computed tomography or nuclear medicine (sestamibi) scan prior to the meeting. All ultrasound scans were performed by a consultant radiologist. The multidisciplinary team meeting has been successful, with perceived benefits for patients, improved imaging evaluation and efficiency of referral pathways, leading to more appropriate patient management.
ACM TOMS replicated computational results initiative
Heroux, Michael Allen
2015-06-03
The scientific community relies on the peer review process for assuring the quality of published material, the goal of which is to build a body of work we can trust. Computational journals such as The ACM Transactions on Mathematical Software (TOMS) use this process for rigorously promoting the clarity and completeness of content, and citation of prior work. At the same time, it is unusual to independently confirm computational results.
Bayesian X-ray computed tomography using a three-level hierarchical prior model
NASA Astrophysics Data System (ADS)
Wang, Li; Mohammad-Djafari, Ali; Gac, Nicolas
2017-06-01
In recent decades, X-ray Computed Tomography (CT) image reconstruction has been extensively developed in both the medical and industrial domains. In this paper, we propose using the Bayesian inference approach with a new hierarchical prior model. In the proposed model, a generalised Student-t distribution is used to enforce sparsity of the Haar transformation of images. Comparisons with some state-of-the-art methods are presented. It is shown that by using the proposed model, the sparsity of the sparse representation of images is enforced, so that edges of images are preserved. Simulation results are also provided to demonstrate the effectiveness of the new hierarchical model for reconstruction with fewer projections.
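Why a heavy-tailed prior on transform coefficients preserves edges can be shown with a 1-D stand-in: a Student-t prior on one-level Haar detail coefficients, optimized by iteratively reweighted ridge shrinkage (the Gaussian scale-mixture view of the t distribution). This is a generic sketch, not the paper's three-level model or its CT forward operator.

```python
import numpy as np

def haar(x):
    """One-level orthonormal Haar transform (length must be even)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def ihaar(a, d):
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise_student_t(y, sigma2=0.01, a=1.0, b=1e-3, iters=30):
    """MAP-style denoising with a Student-t prior on Haar detail
    coefficients, via iteratively reweighted ridge shrinkage. A 1-D,
    one-level stand-in for the paper's hierarchical CT model."""
    ca, cd = haar(y)
    for _ in range(iters):
        w = (2 * a + 1) / (2 * b + cd ** 2)   # posterior mean of 1/variance
        cd = haar(y)[1] / (1 + sigma2 * w)    # per-coefficient ridge shrinkage
    return ihaar(ca, cd)

# Piecewise-constant signal: its edges live in very few Haar coefficients,
# so the t prior shrinks noise details hard while sparing large ones.
x_true = np.concatenate([np.zeros(32), np.ones(32)])
rng = np.random.default_rng(0)
y = x_true + 0.1 * rng.standard_normal(64)
x_hat = denoise_student_t(y)
```

Small (noise-dominated) detail coefficients get large weights and are shrunk strongly, while large (edge) coefficients are left almost untouched, which is the edge-preserving behavior the abstract describes.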
Freyer, Marcus; Ale, Angelique; Schulz, Ralf B; Zientkowska, Marta; Ntziachristos, Vasilis; Englmeier, Karl-Hans
2010-01-01
The recent development of hybrid imaging scanners that integrate fluorescence molecular tomography (FMT) and x-ray computed tomography (XCT) allows the utilization of x-ray information as image priors for improving optical tomography reconstruction. To fully capitalize on this capacity, we consider a framework for the automatic and fast detection of different anatomic structures in murine XCT images. To accurately differentiate between different structures such as bone, lung, and heart, a combination of image processing steps including thresholding, seed growing, and signal detection are found to offer optimal segmentation performance. The algorithm and its utilization in an inverse FMT scheme that uses priors is demonstrated on mouse images.
Computational efficiency improvements for image colorization
NASA Astrophysics Data System (ADS)
Yu, Chao; Sharma, Gaurav; Aly, Hussein
2013-03-01
We propose an efficient algorithm for colorization of greyscale images. As in prior work, colorization is posed as an optimization problem: a user specifies the color for a few scribbles drawn on the greyscale image and the color image is obtained by propagating color information from the scribbles to surrounding regions, while maximizing the local smoothness of colors. In this formulation, colorization is obtained by solving a large sparse linear system, which normally requires substantial computation and memory resources. Our algorithm improves the computational performance through three innovations over prior colorization implementations. First, the linear system is solved iteratively without explicitly constructing the sparse matrix, which significantly reduces the required memory. Second, we formulate each iteration in terms of integral images obtained by dynamic programming, reducing repetitive computation. Third, we use a coarse-to-fine framework, where a lower resolution subsampled image is first colorized and this low resolution color image is upsampled to initialize the colorization process for the fine level. The improvements we develop provide significant speedup and memory savings compared to the conventional approach of solving the linear system directly using off-the-shelf sparse solvers, and allow us to colorize images with typical sizes encountered in realistic applications on typical commodity computing platforms.
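The matrix-free idea can be sketched on a tiny grid: solve the scribble-propagation system by Jacobi iteration, where each unknown pixel moves toward the average of its neighbours and scribbled pixels stay fixed. This minimal sketch uses uniform weights and none of the integral-image or coarse-to-fine accelerations the abstract describes (real colorization weights neighbours by greyscale similarity).

```python
import numpy as np

def propagate_scribbles(mask, values, iters=2000):
    """Matrix-free Jacobi solution of a scribble-propagation system:
    unknown pixels relax toward the average of their 4-neighbours while
    scribbled pixels (mask=True) stay fixed at their given values.
    Uniform weights for brevity; not intensity-adaptive."""
    u = np.where(mask, values, 0.0).astype(float)
    for _ in range(iters):
        avg = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1)) / 4.0
        u = np.where(mask, values, avg)   # re-impose the scribble constraints
    return u

# 16x16 image: one chroma value scribbled on the left edge, another on the
# right; the solution interpolates smoothly between them.
n = 16
mask = np.zeros((n, n), bool)
values = np.zeros((n, n))
mask[:, 0], values[:, 0] = True, 1.0
mask[:, -1], values[:, -1] = True, 0.0
chroma = propagate_scribbles(mask, values)
```

The iteration touches only the image arrays, so the large sparse matrix of the direct formulation is never built, which is the memory saving the first innovation refers to.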
Menarche: Prior Knowledge and Experience.
ERIC Educational Resources Information Center
Skandhan, K. P.; And Others
1988-01-01
Recorded menstruation information among 305 young women in India, assessing the differences between those who did and did not have knowledge of menstruation prior to menarche. Those with prior knowledge considered menarche to be a normal physiological function and had a higher rate of regularity, lower rate of dysmenorrhea, and earlier onset of…
NASA Astrophysics Data System (ADS)
Kim, R. S.; Durand, M. T.; Li, D.; Baldo, E.; Margulis, S. A.; Dumont, M.; Morin, S.
2017-12-01
This paper presents a newly proposed snow depth retrieval approach for mountainous deep snow using airborne multifrequency passive microwave (PM) radiance observations. In contrast to previous snow depth estimations using satellite PM radiance assimilation, the newly proposed method utilized a single flight observation and deployed snow hydrologic models. This method is promising since satellite-based retrieval methods have difficulty estimating snow depth due to their coarse resolution and computational effort. Indeed, this approach consists of a particle filter using combinations of multiple PM frequencies and a multi-layer snow physical model (i.e., Crocus) to resolve melt-refreeze crusts. The method was performed over the NASA Cold Land Processes Experiment (CLPX) area in Colorado during 2002 and 2003. Results showed that there was a significant improvement over the prior snow depth estimates and the capability to reduce the prior snow depth biases. When applying our snow depth retrieval algorithm using a combination of four PM frequencies (10.7, 18.7, 37.0 and 89.0 GHz), the RMSE values were reduced by 48% at the snow depth transect sites where forest density was less than 5%, despite deep snow conditions. This method displayed a sensitivity to different combinations of frequencies, model stratigraphy (i.e., different numbers of layering schemes for the snow physical model) and estimation methods (particle filter and Kalman filter). The prior RMSE values at the forest-covered areas were reduced by 37-42% even in the presence of forest cover.
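The particle-filter update at the heart of such a retrieval can be sketched in miniature: weight a prior ensemble of snow depths by a Gaussian likelihood of the multifrequency brightness temperatures, then resample. The observation operator below is a made-up linear toy, not Crocus or any real emission model, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Prior ensemble of snow depth (m); deliberately biased shallow.
particles = rng.normal(1.0, 0.4, 5000).clip(0.05)
depth_true = 1.6

def tb_model(depth, freq_ghz):
    """Toy brightness-temperature operator: deeper snow scatters more,
    and more strongly at higher frequency. Purely illustrative."""
    return 260.0 - 8.0 * freq_ghz / 37.0 * depth

freqs = [10.7, 18.7, 37.0, 89.0]
obs = np.array([tb_model(depth_true, f) for f in freqs])
obs = obs + rng.normal(0.0, 1.0, len(freqs))   # 1 K observation noise

# Gaussian likelihood over all frequencies -> importance weights.
w = np.ones_like(particles)
for f, y in zip(freqs, obs):
    w *= np.exp(-0.5 * ((y - tb_model(particles, f)) / 1.0) ** 2)
w /= w.sum()

posterior = rng.choice(particles, 5000, p=w)   # resampling step
# The weighted update pulls the biased prior ensemble toward the truth.
```

Combining several frequencies sharpens the update because each adds an independent constraint on depth, which mirrors the sensitivity to frequency combinations reported in the abstract.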
Progress on the Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Box, Dennis; Boyd, Joseph; Dykstra, Dave; Garzoglio, Gabriele; Herner, Kenneth; Kirby, Michael; Kreymer, Arthur; Levshina, Tanya; Mhashilkar, Parag; Sharma, Neha
2015-12-01
The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including new job submission services, software and reference data distribution through CVMFS repositories, flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.
Computational Fluid Dynamics (CFD) Simulations of Jet Mixing in Tanks of Different Scales
NASA Technical Reports Server (NTRS)
Breisacher, Kevin; Moder, Jeffrey
2010-01-01
For long-duration in-space storage of cryogenic propellants, an axial jet mixer is one concept for controlling tank pressure and reducing thermal stratification. Extensive ground-test data from the 1960s to the present exist for tank diameters of 10 ft or less. The design of axial jet mixers for tanks on the order of 30 ft diameter, such as those planned for the Ares V Earth Departure Stage (EDS) LH2 tank, will require scaling of available experimental data from much smaller tanks, as well as designing for microgravity effects. This study will assess the ability of Computational Fluid Dynamics (CFD) to handle a change of scale of this magnitude by performing simulations of existing ground-based axial jet mixing experiments at two tank sizes differing by a factor of ten. Simulations of several axial jet configurations for an Ares V scale EDS LH2 tank during low Earth orbit (LEO) coast are evaluated and selected results are also presented. Data from jet mixing experiments performed in the 1960s by General Dynamics with water at two tank sizes (1 and 10 ft diameter) are used to evaluate CFD accuracy. Jet nozzle diameters ranged from 0.032 to 0.25 in. for the 1 ft diameter tank experiments and from 0.625 to 0.875 in. for the 10 ft diameter tank experiments. Thermally stratified layers were created in both tanks prior to turning on the jet mixer. Jet mixer efficiency was determined by monitoring the temperatures on thermocouple rakes in the tanks to time when the stratified layer was mixed out. Dye was frequently injected into the stratified tank and its penetration recorded. There were no velocities or turbulence quantities available in the experimental data. A commercially available, time accurate, multi-dimensional CFD code with free surface tracking (FLOW-3D from Flow Science, Inc.) is used for the simulations presented. Comparisons are made between computed temperatures at various axial locations in the tank at different times and those observed experimentally.
The affect of various modeling parameters on the agreement obtained are assessed.
ERIC Educational Resources Information Center
Roelle, Julian; Lehmkuhl, Nina; Beyer, Martin-Uwe; Berthold, Kirsten
2015-01-01
In 2 experiments we examined the role of (a) specificity, (b) the type of targeted learning activities, and (c) learners' prior knowledge for the effects of relevance instructions on learning from instructional explanations. In Experiment 1, we recruited novices regarding the topic of atomic structure (N = 80) and found that "specific"…
ERIC Educational Resources Information Center
Artino, Anthony R., Jr.
2007-01-01
Using a social cognitive framework, the present study investigated the relations between two motivational constructs, prior experience, and several adaptive outcomes. Participants (n = 204) completed a survey that assessed their perceived task value, self-efficacy, prior experience, and a collection of outcomes that included their satisfaction,…
Reis, Shmuel; Sagi, Doron; Eisenberg, Orit; Kuchnir, Yosi; Azuri, Joseph; Shalev, Varda; Ziv, Amitai
2013-12-01
Even though Electronic Medical Records (EMRs) are increasingly used in healthcare organizations, there is surprisingly little theoretical work or educational programming in this field. This study compares two training programs for doctor-patient-computer communication (DPCC). 36 Family Medicine Residents (FMRs) participated. All FMRs went through twelve identical simulated encounters, six before and six after training. The experimental group received simulation-based training (SBT) while the control group received traditional lecture-based training. Performance, attitude, and sense of competence improved in all FMRs, but no difference was found between the experimental and control groups. FMRs in the experimental group rated the contribution of the training phase higher than the control group did, and showed higher satisfaction. We assume that the mere exposure to simulation served as a learning experience and enabled deliberate practice that was more powerful than the training itself. Because DPCC is a new field, all participants in such studies, including instructors and raters, should receive basic training in DPCC skills. Simulation enhances DPCC skills. Future studies of this kind should control for exposure to simulation prior to the training phase. Training and assessment of clinical communication should include EMR-related skills. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Konduri, Niranjan; Sawyer, Kelly; Nizova, Nataliya
2017-04-01
Ukraine has successfully implemented e-TB Manager nationwide as its mandatory national tuberculosis registry after first introducing it in 2009. Our objective was to perform an end-of-programme evaluation after formal handover of the registry administration to Ukraine's Centre for Disease Control in 2015. We conducted a nationwide, cross-sectional, anonymous, 18-point user experience survey, and stratified the registry's transaction statistics to demonstrate usability. Contrary to initial implementation experience, older users (aged >50 years), often with limited or no computer proficiency prior to using the registry, had significantly better user experience scores for at least six of the 12 measures compared to younger users (aged 18-29 years). Using the registry for >3 years was associated with significantly higher scores for having capacity, adequacy of training received and satisfaction with the registry. Of the 5.9 million transactions over a 4-year period, nine out of 24 oblasts (regions) and Kiev city accounted for 62.5% of all transactions, and corresponded to 59% of Ukraine's tuberculosis burden. There were 437 unique active users in 486 rayons (districts) of Ukraine, demonstrating extensive reach. Our key findings complement the World Health Organization and European Respiratory Society's agenda for action on digital health to help implement the End TB Strategy.
Applying Standard Interfaces to a Process-Control Language
NASA Technical Reports Server (NTRS)
Berthold, Richard T.
2005-01-01
A method of applying open-operating-system standard interfaces to the NASA User Interface Language (UIL) has been devised. UIL is a computing language that can be used in monitoring and controlling automated processes: for example, the Timeliner computer program, written in UIL, is a general-purpose software system for monitoring and controlling sequences of automated tasks in a target system. In providing the major elements of connectivity between UIL and the target system, the present method offers advantages over the prior method. Most notably, unlike in the prior method, the software description of the target system can be made independent of the applicable compiler software and need not be linked to the applicable executable compiler image. Also unlike in the prior method, it is not necessary to recompile the source code and relink the source code to a new executable compiler image. Abstraction of the description of the target system to a data file can be defined easily, with intuitive syntax, and knowledge of the source-code language is not needed for the definition.
Real-Time Very High-Resolution Regional 4D Assimilation in Supporting CRYSTAL-FACE Experiment
NASA Technical Reports Server (NTRS)
Wang, Donghai; Minnis, Patrick
2004-01-01
To better understand tropical cirrus cloud physical properties and formation processes with a view toward the successful modeling of the Earth's climate, the CRYSTAL-FACE (Cirrus Regional Study of Tropical Anvils and Cirrus Layers - Florida Area Cirrus Experiment) field experiment took place over southern Florida from 1 July to 29 July 2002. During the entire field campaign, a very high-resolution numerical weather prediction (NWP) and assimilation system was run in support of the mission, with supercomputing resources provided by the NASA Center for Computational Sciences (NCCS). Using the NOAA NCEP Eta forecast for boundary conditions and as a first guess for initial conditions assimilated with all available observations, two nested 15/3 km grids were employed over the CRYSTAL-FACE experiment area. The 15-km grid covers the southeast US domain and was run twice daily for a 36-hour forecast starting at 0000 UTC and 1200 UTC. The nested 3-km grid covering only southern Florida was used for 9-hour and 18-hour forecasts starting at 1500 and 0600 UTC, respectively. The forecasting system provided more accurate and higher spatial and temporal resolution forecasts of 4-D atmospheric fields over the experiment area than available from standard weather forecast models. These forecasts were essential for flight planning during both the afternoon prior to a flight day and the morning of a flight day. The forecasts were used to help decide takeoff times and the optimal flight areas for accomplishing the mission objectives. See more detailed products on the web site http://asd-www.larc.nasa.gov/mode/crystal. The model/assimilation output gridded data are archived on the NCCS UniTree system in HDF format at 30-min intervals for real-time forecasts or 5-min intervals for the post-mission case studies. In particular, the data set includes the 3-D cloud fields (cloud liquid water, rain water, cloud ice, snow and graupel/hail).
Air-Induced Drag Reduction at High Reynolds Numbers: Velocity and Void Fraction Profiles
NASA Astrophysics Data System (ADS)
Elbing, Brian; Mäkiharju, Simo; Wiggins, Andrew; Dowling, David; Perlin, Marc; Ceccio, Steven
2010-11-01
The injection of air into a turbulent boundary layer forming over a flat plate can reduce the skin friction. With sufficient volumetric fluxes an air layer can separate the solid surface from the flowing liquid, which can produce drag reduction in excess of 80%. Several large scale experiments have been conducted at the US Navy's Large Cavitation Channel on a 12.9 m long flat plate model investigating bubble drag reduction (BDR), air layer drag reduction (ALDR) and the transition between BDR and ALDR. The most recent experiment acquired phase velocities and void fraction profiles at three downstream locations (3.6, 5.9 and 10.6 m downstream from the model leading edge) for a single flow speed (˜6.4 m/s). The profiles were acquired with a combination of electrode point probes, time-of-flight sensors, Pitot tubes and an LDV system. Additional diagnostics included skin-friction sensors and flow-field image visualization. During this experiment the inlet flow was perturbed with vortex generators immediately upstream of the injection location to assess the robustness of the air layer. From these, and prior measurements, computational models can be refined to help assess the viability of ALDR for full-scale ship applications.
GWASinlps: Nonlocal prior based iterative SNP selection tool for genome-wide association studies.
Sanyal, Nilotpal; Lo, Min-Tzu; Kauppi, Karolina; Djurovic, Srdjan; Andreassen, Ole A; Johnson, Valen E; Chen, Chi-Hua
2018-06-19
Multiple-marker analysis of genome-wide association study (GWAS) data has gained ample attention in recent years. However, because of the ultra-high dimensionality of GWAS data, such analysis is challenging. Frequently used penalized regression methods often lead to a large number of false positives, whereas Bayesian methods are computationally very expensive. Motivated to ameliorate these issues simultaneously, we consider the novel approach of using nonlocal priors in an iterative variable selection framework. We develop a variable selection method, named iterative nonlocal prior based selection for GWAS (GWASinlps), that combines, in an iterative variable selection framework, the computational efficiency of the screen-and-select approach based on some association learning and the parsimonious uncertainty quantification provided by the use of nonlocal priors. The hallmark of our method is the introduction of a 'structured screen-and-select' strategy that considers hierarchical screening, based not only on response-predictor associations but also on response-response associations, and concatenates variable selection within that hierarchy. Extensive simulation studies with SNPs having realistic linkage disequilibrium structures demonstrate the advantages of our computationally efficient method compared to several frequentist and Bayesian variable selection methods, in terms of true positive rate, false discovery rate, mean squared error, and effect size estimation error. Further, we provide an empirical power analysis useful for study design. Finally, a real GWAS data application was considered with human height as phenotype. An R package implementing the GWASinlps method is available at https://cran.r-project.org/web/packages/GWASinlps/index.html. Supplementary data are available at Bioinformatics online.
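The screen-and-select idea described in this abstract can be illustrated with a generic sketch. The actual GWASinlps package is written in R and scores candidates with nonlocal-prior model selection; the marginal-correlation screening rule, the coefficient threshold, and all names below are simplified assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def iterative_screen_and_select(X, y, screen_size=50, max_iter=5, tol=1e-6):
    """Simplified screen-and-select loop: repeatedly screen predictors by
    marginal correlation with the current residual, then keep those whose
    least-squares coefficient exceeds a crude threshold. Illustrative only."""
    n, p = X.shape
    selected = []
    residual = y - y.mean()
    for _ in range(max_iter):
        remaining = [j for j in range(p) if j not in selected]
        if not remaining:
            break
        # Screening step: rank unselected predictors by |X_j . residual|
        corrs = np.abs(X[:, remaining].T @ residual)
        top = [remaining[i] for i in np.argsort(corrs)[::-1][:screen_size]]
        # Selection step: fit OLS on the screened set, keep large coefficients
        beta, *_ = np.linalg.lstsq(X[:, top], residual, rcond=None)
        keep = [top[i] for i in range(len(top)) if abs(beta[i]) > 0.1]
        if not keep:
            break
        selected.extend(keep)
        # Refit on the full selected set and update the residual
        coef, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        new_residual = y - X[:, selected] @ coef
        if np.linalg.norm(residual - new_residual) < tol:
            break
        residual = new_residual
    return sorted(selected)
```

On synthetic data with a couple of strong predictors, the loop recovers them after one or two screening rounds; GWASinlps replaces the crude OLS threshold with nonlocal-prior posterior model probabilities.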
Novice and expert teachers' conceptions of learners' prior knowledge
NASA Astrophysics Data System (ADS)
Meyer, Helen
2004-11-01
This study presents comparative case studies of preservice and first-year teachers' and expert teachers' conceptions of prior knowledge. Kelly's (The Psychology of Personal Constructs, New York: W.W. Norton, 1955) theory of personal constructs, as discussed by Akerson, Flick, and Lederman (Journal of Research in Science Teaching, 2000, 37, 363-385) in relation to prior knowledge, underpins the study. Six teachers were selected to participate in the case studies based upon their level of experience teaching science and their willingness to take part. The comparative case studies of the novice and expert teachers provide insights into (a) how novice and expert teachers understand the concept of prior knowledge and (b) how they use this knowledge to make instructional decisions. Data collection consisted of interviews, classroom observations, and document analysis. Findings suggest that novice teachers hold conceptions of prior knowledge and its role in instruction that are insufficient to effectively implement constructivist teaching practices, while expert teachers hold a complex conception of prior knowledge and make use of their students' prior knowledge in significant ways during instruction. A second finding was an apparent mismatch between the novice teachers' beliefs about their urban students' life experiences and prior knowledge and the wealth of knowledge the expert teachers found to draw upon.
"Small Talk Is Not Cheap": Phatic Computer-Mediated Communication in Intercultural Classes
ERIC Educational Resources Information Center
Maíz-Arévalo, Carmen
2017-01-01
The present study aims to analyse the phatic exchanges performed by a class of nine intercultural Master's students during a collaborative assignment which demanded online discussion using English as a lingua franca (ELF). Prior studies on the use of phatic communication in computer-mediated communication have concentrated on social networking…
Computer-Based Learning: Interleaving Whole and Sectional Representation of Neuroanatomy
ERIC Educational Resources Information Center
Pani, John R.; Chariker, Julia H.; Naaz, Farah
2013-01-01
The large volume of material to be learned in biomedical disciplines requires optimizing the efficiency of instruction. In prior work with computer-based instruction of neuroanatomy, it was relatively efficient for learners to master whole anatomy and then transfer to learning sectional anatomy. It may, however, be more efficient to continuously…
Algebraic Functions, Computer Programming, and the Challenge of Transfer
ERIC Educational Resources Information Center
Schanzer, Emmanuel Tanenbaum
2015-01-01
Students' struggles with algebra are well documented. Prior to the introduction of functions, mathematics is typically focused on applying a set of arithmetic operations to compute an answer. The introduction of functions, however, marks the point at which mathematics begins to focus on building up abstractions as a way to solve complex problems.…
Computer Generated Optical Illusions: A Teaching and Research Tool.
ERIC Educational Resources Information Center
Bailey, Bruce; Harman, Wade
Interactive computer-generated simulations that highlight psychological principles were investigated in this study in which 33 female and 19 male undergraduate college student volunteers of median age 21 matched line and circle sizes in six variations of Ponzo's illusion. Prior to working with the illusions, data were collected based on subjects'…
ERIC Educational Resources Information Center
Winberg, T. Mikael; Berg, C. Anders R.
2007-01-01
To enhance the learning outcomes achieved by students, learners undertook a computer-simulated activity based on an acid-base titration prior to a university-level chemistry laboratory activity. Students were categorized with respect to their attitudes toward learning. During the laboratory exercise, questions that students asked their assistant…
49 CFR 383.73 - State procedures.
Code of Federal Regulations, 2011 CFR
2011-10-01
... endorsement knowledge tests; (iv) Allow only a group-specific passenger (P) and school bus (S) endorsement and... verification. (1) Prior to issuing a CLP or a CDL to a person the State must verify the name, date of birth... of issuance of the CLP or CDL. (n) Computer system controls. The State must establish computer system...
49 CFR 383.73 - State procedures.
Code of Federal Regulations, 2012 CFR
2012-10-01
... endorsement knowledge tests; (iv) Allow only a group-specific passenger (P) and school bus (S) endorsement and... verification. (1) Prior to issuing a CLP or a CDL to a person the State must verify the name, date of birth... of issuance of the CLP or CDL. (n) Computer system controls. The State must establish computer system...
ERIC Educational Resources Information Center
Mitzel, Harold E.; Brandon, George L.
A series of five reports is presented which describes the activities carried out by the Pennsylvania State University group engaged in research in computer-assisted instruction (CAI) in vocational-technical education. The reports cover the period January 1968-June 1968 and deal with: 1) prior knowledge and individualized instruction; 2) numerical…
NASA Astrophysics Data System (ADS)
Chertkov, Yu B.; Disyuk, V. V.; Pimenov, E. Yu; Aksenova, N. V.
2017-01-01
Within the framework of research into the possibility and prospects of power density equalization in boiling water reactors (exemplified by the WB-50), work was undertaken to improve the prior computational model of the WB-50 reactor implemented in the MCU-RR software. Analysis of prior work showed that critical-state calculations have a deviation of calculated reactivity exceeding ±0.3% (ΔKef/Kef) for minimum concentrations of boric acid in the reactor water, reaching 2% for maximum concentration values. The axial coefficient of nonuniform burnup distribution reaches high values in the WB-50 reactor; thus, the computational model needed refinement to account for burnup inhomogeneity along the fuel assembly height. At this stage, computational results with a mean square deviation of less than 0.7% (ΔKef/Kef) and a dispersion of design values of ±1% (ΔK/K) shall be deemed acceptable. Further lowering of these parameters apparently requires root-cause analysis of such large values and more attention to experimental measurement techniques.
The inaction effect in the psychology of regret.
Zeelenberg, Marcel; van den Bos, Kees; van Dijk, Eric; Pieters, Rik
2002-03-01
Previous research showed that decisions to act (i.e., actions) produce more regret than decisions not to act (i.e., inactions). This previous research focused on decisions made in isolation and ignored that decisions are often made in response to earlier outcomes. The authors show in 4 experiments that these prior outcomes may promote action and hence make inaction more abnormal. They manipulated information about a prior outcome. As hypothesized, when prior outcomes were positive or absent, people attributed more regret to action than to inaction. However, as predicted and counter to previous research, following negative prior outcomes, more regret was attributed to inaction, a finding that the authors label the inaction effect. Experiment 4, showing differential effects for regret and disappointment, demonstrates the need for emotion-specific predictions.
Improving zero-training brain-computer interfaces by mixing model estimators
NASA Astrophysics Data System (ADS)
Verhoeven, T.; Hübner, D.; Tangermann, M.; Müller, K. R.; Dambre, J.; Kindermans, P. J.
2017-06-01
Objective. Brain-computer interfaces (BCI) based on event-related potentials (ERP) incorporate a decoder to classify recorded brain signals and subsequently select a control signal that drives a computer application. Standard supervised BCI decoders require a tedious calibration procedure prior to every session. Several unsupervised classification methods have been proposed that tune the decoder during actual use and as such omit this calibration. Each of these methods has its own strengths and weaknesses. Our aim is to improve overall accuracy of ERP-based BCIs without calibration. Approach. We consider two approaches for unsupervised classification of ERP signals. Learning from label proportions (LLP) was recently shown to be guaranteed to converge to a supervised decoder when enough data is available. In contrast, the formerly proposed expectation maximization (EM) based decoding for ERP-BCI does not have this guarantee. However, while this decoder has high variance due to random initialization of its parameters, it obtains a higher accuracy faster than LLP when the initialization is good. We introduce a method to optimally combine these two unsupervised decoding methods, letting one method’s strengths compensate for the weaknesses of the other and vice versa. The new method is compared to the aforementioned methods in a resimulation of an experiment with a visual speller. Main results. Analysis of the experimental results shows that the new method exceeds the performance of the previous unsupervised classification approaches in terms of ERP classification accuracy and symbol selection accuracy during the spelling experiment. Furthermore, the method shows less dependency on random initialization of model parameters and is consequently more reliable. Significance. Improving the accuracy and subsequent reliability of calibrationless BCIs makes these systems more appealing for frequent use.
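One simple way to let a low-variance estimator stabilize a higher-variance one, in the spirit of the mixing described in this abstract, is inverse-variance weighting of the two decoders' weight vectors. This is an illustrative sketch under that assumption, not the paper's exact combination rule, and the variance inputs are treated as known scalars.

```python
import numpy as np

def mix_estimators(w_llp, w_em, var_llp, var_em):
    """Inverse-variance combination of two decoder weight estimates:
    the estimate with smaller variance receives the larger weight.
    w_llp, w_em are weight vectors; var_llp, var_em are scalar variances."""
    g = var_em / (var_llp + var_em)  # weight assigned to the LLP estimate
    return g * w_llp + (1.0 - g) * w_em
```

For example, if the EM estimate is three times as variable as the LLP estimate, the mixed decoder places weight 0.75 on LLP and 0.25 on EM.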
A computational method for estimating the PCR duplication rate in DNA and RNA-seq experiments.
Bansal, Vikas
2017-03-14
PCR amplification is an important step in the preparation of DNA sequencing libraries prior to high-throughput sequencing. PCR amplification introduces redundant reads in the sequence data and estimating the PCR duplication rate is important to assess the frequency of such reads. Existing computational methods do not distinguish PCR duplicates from "natural" read duplicates that represent independent DNA fragments and therefore, over-estimate the PCR duplication rate for DNA-seq and RNA-seq experiments. In this paper, we present a computational method to estimate the average PCR duplication rate of high-throughput sequence datasets that accounts for natural read duplicates by leveraging heterozygous variants in an individual genome. Analysis of simulated data and exome sequence data from the 1000 Genomes project demonstrated that our method can accurately estimate the PCR duplication rate on paired-end as well as single-end read datasets which contain a high proportion of natural read duplicates. Further, analysis of exome datasets prepared using the Nextera library preparation method indicated that 45-50% of read duplicates correspond to natural read duplicates likely due to fragmentation bias. Finally, analysis of RNA-seq datasets from individuals in the 1000 Genomes project demonstrated that 70-95% of read duplicates observed in such datasets correspond to natural duplicates sampled from genes with high expression and identified outlier samples with a 2-fold greater PCR duplication rate than other samples. The method described here is a useful tool for estimating the PCR duplication rate of high-throughput sequence datasets and for assessing the fraction of read duplicates that correspond to natural read duplicates. An implementation of the method is available at https://github.com/vibansal/PCRduplicates .
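The core correction described in this abstract can be sketched as follows: at a heterozygous site, a duplicate read pair carrying different alleles must come from independent DNA fragments (a natural duplicate), whereas a PCR duplicate always carries the same allele; since independent fragments match alleles roughly half the time, the natural-duplicate count is about twice the mismatching count. The function below is a deliberately simplified toy version of that reasoning, not the paper's estimator.

```python
def estimate_pcr_duplication_rate(total_reads, duplicate_reads,
                                  het_dups_same_allele, het_dups_diff_allele):
    """Toy PCR-duplication estimator: discount the observed duplicate rate
    by the fraction of duplicates inferred to be natural, using allele
    (mis)matches of duplicate pairs at heterozygous sites."""
    observed_rate = duplicate_reads / total_reads
    het_total = het_dups_same_allele + het_dups_diff_allele
    if het_total == 0:
        return observed_rate  # no het-site evidence; no correction possible
    # Natural duplicates match alleles ~half the time, so double the mismatches
    natural_fraction = min(1.0, 2.0 * het_dups_diff_allele / het_total)
    return observed_rate * (1.0 - natural_fraction)
```

With 100 duplicates among 1000 reads and 20 of 80 het-site duplicate pairs showing mismatched alleles, half the duplicates are inferred natural and the corrected PCR rate is 0.05 rather than 0.10.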
NASA Astrophysics Data System (ADS)
Tucker, G. E.
1997-05-01
This NSF-supported program, emphasizing hands-on learning and observation with modern instruments, is described in its pilot phase, prior to being launched nationally. A group of 14-year-old students uses a small (21 cm) computer-controlled telescope and CCD camera to: (1) conduct a 'sky survey' of brighter celestial objects, finding, identifying, and learning about them, and accumulating a portfolio of images; (2) perform photometry of variable stars, reducing the data to get a light curve; and (3) learn modern computer-based communication/dissemination skills by posting images and data to a Web site they are designing (http://www.javanet.com/ sky) and contributing data to archives (e.g. AAVSO) via the Internet. To attract more interest to astronomy and science in general and to have a wider impact on the school and surrounding community, peer teaching is used as a pedagogical technique and families are encouraged to participate. Students teach astronomy, software and computers, the Internet, instrumentation, and observing to other students, parents, and the community by means of daytime presentations of their results (images and data) and evening public viewing at the telescope, operating the equipment themselves. Students can contribute scientifically significant data and experience the 'discovery' aspect of science through observing projects where a measurement is made. Their 'informal education' activities also help improve the perception of science in general, and astronomy in particular, in society at large. This program could benefit from collaboration with astronomers wanting to organize geographically distributed observing campaigns coordinated over the Internet and willing to advise on promising observational programs for small telescopes in the context of current science.
NASA Astrophysics Data System (ADS)
Tandon, K.; Egbert, G.; Siripunvaraporn, W.
2003-12-01
We are developing a modular system for three-dimensional inversion of electromagnetic (EM) induction data, using an object-oriented programming approach. This approach allows us to modify the individual components of the proposed inversion scheme and to reuse the components for a variety of problems in earth science computing, however diverse they might be. In particular, the modularity allows us to (a) change modeling codes independently of inversion algorithm details; (b) experiment with new inversion algorithms; and (c) modify the way prior information is imposed in the inversion to test competing hypotheses and techniques required to solve an earth science problem. Our initial code development is for EM induction equations on a staggered grid, using iterative solution techniques in 3D. An example illustrated here is an experiment with the sensitivity of 3D magnetotelluric inversion to uncertainties in the boundary conditions required for regional induction problems. These boundary conditions should reflect the large-scale geoelectric structure of the study area, which is usually poorly constrained. In general, for inversion of MT data one fixes boundary conditions at the edge of the model domain and adjusts the earth's conductivity structure within the modeling domain. Allowing for errors in specification of the open boundary values is simple in principle, but no existing inversion codes that we are aware of have this feature. Adding such a feature is straightforward within the context of the modular approach. More generally, a modular approach provides an efficient methodology for setting up earth science computing problems to test various ideas. As a concrete illustration relevant to EM induction problems, we investigate the sensitivity of MT data near the San Andreas Fault at Parkfield (California) to uncertainties in the regional geoelectric structure.
Bill, Johannes; Buesing, Lars; Habenschuss, Stefan; Nessler, Bernhard; Maass, Wolfgang; Legenstein, Robert
2015-01-01
During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning and learning of mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these calculations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model, that accommodates both local competitive and large-scale associative aspects of neural information processing, under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition in response to distributed spiking input, can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity on the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters which equate with Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience on correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input. PMID:26284370
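The interplay of Hebbian synaptic and homeostatic plasticity mentioned in this abstract can be illustrated with a generic textbook-style update: strengthen weights between co-active pre- and postsynaptic units, then divisively normalize each neuron's incoming weights so competition is preserved. The function name, learning rate, and normalization choice are illustrative assumptions, not the specific rules derived in the cited study.

```python
import numpy as np

def hebbian_step_with_normalization(W, pre, post, eta=0.05):
    """One plasticity step: a Hebbian term proportional to pre/post
    co-activation, followed by divisive normalization that keeps each
    postsynaptic neuron's total incoming weight equal to 1."""
    W = W + eta * np.outer(post, pre)        # Hebbian co-activation term
    return W / W.sum(axis=1, keepdims=True)  # homeostatic normalization
```

Repeated application concentrates each row's weight on the inputs that reliably co-fire with that neuron while the total incoming weight stays fixed.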
Psychological impact on house staff of an initial versus subsequent emergency medicine rotation.
Alagappan, K; Grlic, N; Steinberg, M; Pollack, S
2001-01-01
The objective of this study was to assess the psychological impact of a 4-week emergency medicine (EM) rotation on residents undergoing their first EM experience. These findings were compared to the psychological impact the rotation had on residents with prior EM experience. Data were obtained from a post hoc analysis of a previous study. Prerotation and postrotation psychological distress levels were assessed over a 4-week EM rotation. Anxiety and depressive symptoms were evaluated by the Brief Symptom Inventory and the Dissociative Experience Scale that together comprise a total of 14 psychometric scales. All scales were given at the beginning and end of the initial EM rotation for the academic year of 1994-1995. All information was coded and confidential. Eighteen junior residents (9/18 EM [50%]) were analyzed as a group and compared to 53 residents (34/51 EM [66%]) with prior exposure to the authors' emergency department. Residents doing their first EM rotation (N = 18) showed improvement in 13 of 14 scales (P = .002). Of the 13 scales that improved, 3 improved significantly: Brief Symptom Inventory = anxiety (P = .002) and Dissociative Experience Scale = absorption (P = .001) and other (P = .001). Residents with prior EM experience (N = 53) displayed worsening in 9 of 13 scales (P = not significant) and no change in 1. Residents undergoing their first EM rotation showed a significant decrease in psychological distress over the 4-week period. Residents with prior EM experience did not show a similar change.
Kisch, Annika; Bolmsjö, Ingrid; Lenhoff, Stig; Bengtsson, Mariette
2015-10-01
There is a lack of knowledge about sibling stem cell donors' experiences pre-donation, and the waiting period before the donation might have been long. The donors and their corresponding sibling recipients were simultaneously included in two different interview studies; the results from the recipient study have been presented in a separate paper. The aim was to explore the experiences of being a stem cell donor for a sibling, prior to donation. Ten adult sibling donors were interviewed prior to stem cell donation. The interviews were digitally recorded, transcribed verbatim, and subjected to qualitative content analysis. The main theme, Being a cog in a big wheel, describes the complex process of being a sibling donor prior to donation, covering a mixture of emotions and thoughts. The four subthemes, Being available, Being anxious, Being concerned, and Being obliged, cover the various experiences. The sibling donors' experiences are influenced by the quality of the relationship with the sick sibling. Sibling stem cell donors go through a complex process that they have become involved in by chance: they have been asked to become a donor; it was not a voluntary choice. In caring for sibling stem cell donors, nurses should be aware of the complexity of the process the donors experience and take into consideration their personal situation and needs. Providing optimal care for both sibling donors and their corresponding recipients is a challenge, and further improvement and exploration are needed. Copyright © 2015 Elsevier Ltd. All rights reserved.
Optimising electron microscopy experiment through electron optics simulation.
Kubo, Y; Gatel, C; Snoeck, E; Houdellier, F
2017-04-01
We developed a new type of electron trajectory simulation based on a complete model of a modern transmission electron microscope (TEM). Our model incorporates the precise, real design of each element constituting a TEM: the field emission (FE) cathode, the extraction optics and acceleration stages of a 300 kV cold field emission gun, the illumination lenses, the objective lens, and the intermediate and projection lenses. Full trajectories can be computed using magnetically saturated or non-saturated round lenses, magnetic deflectors, and even elements without cylindrical symmetry such as an electrostatic biprism. This multi-scale model combines nanometer-sized components (the FE tip) with meter-scale parts (the illumination and projection systems). We demonstrate that non-trivial TEM experiments requiring specific and complex optical configurations can be simulated and optimized prior to any experiment using such a model. We show that the currents set in all optical elements of the simulated column can be implemented in the real column (I2TEM in CEMES) and used as a starting alignment for the requested experiment. We argue that combining such complete electron trajectory simulations of the whole TEM column with automatic optimization of the microscope parameters for optimal experimental data (images, diffraction, spectra) drastically simplifies the implementation of complex experiments in TEM and will facilitate advanced use of the electron microscope in the near future. Copyright © 2017 Elsevier B.V. All rights reserved.
Model verification of large structural systems
NASA Technical Reports Server (NTRS)
Lee, L. T.; Hasselman, T. K.
1977-01-01
A methodology was formulated, and a general computer code implemented for processing sinusoidal vibration test data to simultaneously make adjustments to a prior mathematical model of a large structural system, and resolve measured response data to obtain a set of orthogonal modes representative of the test model. The derivation of estimator equations is shown along with example problems. A method for improving the prior analytic model is included.
Prior-based artifact correction (PBAC) in computed tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heußer, Thorsten, E-mail: thorsten.heusser@dkfz-heidelberg.de; Brehm, Marcus; Ritschl, Ludwig
2014-02-15
Purpose: Image quality in computed tomography (CT) often suffers from artifacts which may reduce the diagnostic value of the image. In many cases, these artifacts result from missing or corrupt regions in the projection data, e.g., in the case of metal, truncation, and limited angle artifacts. The authors propose a generalized correction method for different kinds of artifacts resulting from missing or corrupt data by making use of available prior knowledge to perform data completion. Methods: The proposed prior-based artifact correction (PBAC) method requires prior knowledge in the form of a planning CT of the same patient or a CT scan of a different patient showing the same body region. In both cases, the prior image is registered to the patient image using a deformable transformation. The registered prior is forward projected and data completion of the patient projections is performed using smooth sinogram inpainting. The obtained projection data are used to reconstruct the corrected image. Results: The authors investigate metal and truncation artifacts in patient data sets acquired with a clinical CT and limited angle artifacts in an anthropomorphic head phantom data set acquired with a gantry-based flat detector CT device. In all cases, the corrected images obtained by PBAC are nearly artifact-free. Compared to conventional correction methods, PBAC achieves better artifact suppression while preserving the patient-specific anatomy at the same time. Further, the authors show that prominent anatomical details in the prior image seem to have only minor impact on the correction result. Conclusions: The results show that PBAC has the potential to effectively correct for metal, truncation, and limited angle artifacts if adequate prior data are available. Since the proposed method makes use of a generalized algorithm, PBAC may also be applicable to other artifacts resulting from missing or corrupt data.
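The data-completion step described in the abstract — fill corrupt projection bins from the registered prior's forward projection, with a smooth transition at the gap borders — can be sketched for a single sinogram row as follows. This is a simplified one-dimensional illustration, not the authors' implementation; the linear offset blend and the function name are assumptions made for the example.

```python
import numpy as np

def smooth_inpaint_row(patient, prior, corrupt):
    """Fill corrupt bins of one sinogram row from the registered prior's
    forward projection, blending the patient/prior offset across each gap."""
    out = np.asarray(patient, dtype=float).copy()
    prior = np.asarray(prior, dtype=float)
    n = len(out)
    i = 0
    while i < n:
        if not corrupt[i]:
            i += 1
            continue
        j = i
        while j < n and corrupt[j]:
            j += 1
        # offsets between measured data and the prior at the gap borders
        left = out[i - 1] - prior[i - 1] if i > 0 else 0.0
        right = out[j] - prior[j] if j < n else 0.0
        # linear ramp between the border offsets, one value per gap bin,
        # so the filled-in prior data joins the measurements smoothly
        ramp = np.linspace(left, right, j - i + 2)[1:-1]
        out[i:j] = prior[i:j] + ramp
        i = j
    return out
```

A full PBAC pipeline would apply this per detector row after deformable registration and forward projection of the prior, then reconstruct from the completed sinogram.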
Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander
2015-01-01
Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons" enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC).
DIRT is a high volume central depository and high-throughput RSA trait computation platform for plant scientists working on crop roots. It enables scientists to store, manage and share crop root images with metadata and compute RSA traits from thousands of images in parallel. It makes high-throughput RSA trait computation available to the community with just a few button clicks. As such it enables plant scientists to spend more time on science rather than on technology. All stored and computed data is easily accessible to the public and broader scientific community. We hope that easy data accessibility will attract new tool developers and spur creative data usage that may even be applied to other fields of science.
Progress on the FabrIc for Frontier Experiments project at Fermilab
Box, Dennis; Boyd, Joseph; Dykstra, Dave; ...
2015-12-23
The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects are discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.
2011 Report to the Legislature: Credit for Prior Learning Experience in Washington
ERIC Educational Resources Information Center
Washington Higher Education Coordinating Board, 2011
2011-01-01
The Higher Education Opportunity Act (E2SHB 1795), passed by the Legislature in 2011, identified prior learning assessment (PLA) as an innovative means for improving degree and certificate attainment and improving cost effectiveness and efficiency within Washington's higher education system. The Act defines prior learning as "the knowledge…
A Study about Placement Support Using Semantic Similarity
ERIC Educational Resources Information Center
Katz, Marco; van Bruggen, Jan; Giesbers, Bas; Waterink, Wim; Eshuis, Jannes; Koper, Rob
2014-01-01
This paper discusses Latent Semantic Analysis (LSA) as a method for the assessment of prior learning. The Accreditation of Prior Learning (APL) is a procedure to offer learners an individualized curriculum based on their prior experiences and knowledge. The placement decisions in this process are based on the analysis of student material by domain…
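As a rough illustration of the kind of computation LSA involves — reduce a term-document matrix with a truncated SVD, then compare documents by cosine similarity in the latent space — here is a minimal sketch. The toy matrix, the choice of k, and all names are invented for the example and have no connection to the APL study's data.

```python
import numpy as np

# Toy term-document count matrix: rows = terms, columns = documents.
# Docs 1 and 2 share vocabulary ("learning", "prior"); doc 3 does not.
X = np.array([[2.0, 1.0, 0.0],   # "learning"
              [1.0, 2.0, 0.0],   # "prior"
              [0.0, 0.0, 2.0],   # "cell"
              [0.0, 0.0, 1.0]])  # "biology"

# Truncated SVD projects the documents into a k-dimensional latent space.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # one row per document

def cosine(a, b):
    """Cosine similarity between two latent document vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Documents with overlapping vocabulary end up close in latent space.
print(cosine(doc_vecs[0], doc_vecs[1]))  # high
print(cosine(doc_vecs[0], doc_vecs[2]))  # near zero
```

In a placement-support setting, a student's submitted material would be one column and the course descriptions the others; high latent similarity would then suggest a candidate exemption.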
ERIC Educational Resources Information Center
Smith, Derick Graham
2012-01-01
This study sought to answer the question: "To what extent do prior beliefs about and experiences of teaching and learning influence the instructional practices of new independent school teachers," who are generally not required to have any formal pedagogical training or hold teacher certification prior to beginning full-time employment.…
ERIC Educational Resources Information Center
Hsiao, E-Ling
2010-01-01
The aim of this study is to explore whether presentation format and prior knowledge affect the effectiveness of worked examples. The experiment was conducted through a specially designed online instrument. A 2×2×3 factorial before-and-after design was used. Three-way ANOVA was employed for data analysis. The result showed first, that prior…
NASA Technical Reports Server (NTRS)
King, James A.
1987-01-01
The goal is to explain Case-Based Reasoning as a vehicle to establish knowledge-based systems based on experiential reasoning for possible space applications. This goal will be accomplished through an examination of reasoning based on prior experience in a sample domain, and also through a presentation of proposed space applications which could utilize Case-Based Reasoning techniques.
ERIC Educational Resources Information Center
Uppal, Nishant; Mishra, Sushanta Kumar
2014-01-01
The study investigates the relationship between prior job experience and current academic performance among management students in India. It further explores the impact of individual and situational factors on the above relationship. Based on a longitudinal study spanning over nine months in the academic year 2010-11 among a sample of 324…
2001-03-28
The Aerostructures Test Wing (ATW) experiment consisted of an 18-inch carbon fiber test wing with surface-mounted piezoelectric strain actuators, undergoing ground testing prior to flight on Dryden's F-15B Research Testbed aircraft.
Improved Collision-Detection Method for Robotic Manipulator
NASA Technical Reports Server (NTRS)
Leger, Chris
2003-01-01
An improved method has been devised for the computational prediction of a collision between (1) a robotic manipulator and (2) another part of the robot or an external object in the vicinity of the robot. The method is intended to be used to test commanded manipulator trajectories in advance so that execution of the commands can be stopped before damage is done. The method involves utilization of both (1) mathematical models of the robot and its environment constructed manually prior to operation and (2) similar models constructed automatically from sensory data acquired during operation. The representation of objects in this method is simpler and more efficient (with respect to both computation time and computer memory), relative to the representations used in most prior methods. The present method was developed especially for use on a robotic land vehicle (rover) equipped with a manipulator arm and a vision system that includes stereoscopic electronic cameras. In this method, objects are represented and collisions detected by use of a previously developed technique known in the art as the method of oriented bounding boxes (OBBs). As the name of this technique indicates, an object is represented approximately, for computational purposes, by a box that encloses its outer boundary. Because many parts of a robotic manipulator are cylindrical, the OBB method has been extended in this method to enable the approximate representation of cylindrical parts by use of octagonal or other multiple-OBB assemblies denoted oriented bounding prisms (OBPs), as in the example of Figure 1. Unlike prior methods, the OBB/OBP method does not require any divisions or transcendental functions; this feature leads to greater robustness and numerical accuracy. The OBB/OBP method was selected for incorporation into the present method because it offers the best compromise between accuracy on the one hand and computational efficiency (and thus computational speed) on the other hand.
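The core of the OBB technique named above is a separating-axis test: two boxes overlap only if no candidate axis (an edge direction of either box) separates their projections. The sketch below restricts the test to 2D boxes given by center, half-extents, and rotation angle; the names and the 2D simplification are illustrative and are not taken from the rover implementation.

```python
import math

def box_axes(angle):
    """Unit edge directions of a box rotated by `angle` (radians)."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c, s), (-s, c)]

def projection_radius(half_extents, axes, axis):
    """Half-length of the box's projection onto a unit axis."""
    return sum(h * abs(ax[0] * axis[0] + ax[1] * axis[1])
               for h, ax in zip(half_extents, axes))

def obb_overlap(c1, h1, a1, c2, h2, a2):
    """True if two oriented boxes (center, half-extents, angle) overlap.
    A collision is reported only when no candidate axis separates them."""
    axes1, axes2 = box_axes(a1), box_axes(a2)
    d = (c2[0] - c1[0], c2[1] - c1[1])
    for axis in axes1 + axes2:
        center_dist = abs(d[0] * axis[0] + d[1] * axis[1])
        if center_dist > (projection_radius(h1, axes1, axis) +
                          projection_radius(h2, axes2, axis)):
            return False  # found a separating axis: no collision
    return True
```

The 3D version tests 15 candidate axes (3 face normals per box plus 9 edge cross-products); an OBP as described in the abstract would simply run this test over each OBB in the multi-box assembly approximating a cylinder.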
Meffert, Harma; Thornton, Laura C; Tyler, Patrick M; Botkin, Mary L; Erway, Anna K; Kolli, Venkata; Pope, Kayla; White, Stuart F; Blair, R James R
2018-02-12
Previous work has shown that amygdala responsiveness to fearful expressions is inversely related to level of callous-unemotional (CU) traits (i.e. reduced guilt and empathy) in youth with conduct problems. However, some research has suggested that the relationship between pathophysiology and CU traits may be different in those youth with significant prior trauma exposure. In experiment 1, 72 youth with varying levels of disruptive behavior and trauma exposure performed a gender discrimination task while viewing morphed fear expressions (0%, 50%, 100%, 150% fear) and Blood Oxygenation Level Dependent responses were recorded. In experiment 2, 66 of these youth performed the Social Goals Task, which measures self-reports of the importance of specific social goals to the participant in provoking social situations. In experiment 1, a significant CU traits-by-trauma exposure interaction was observed within right amygdala; fear intensity-modulated amygdala responses negatively predicted CU traits for those youth with low levels of trauma but positively predicted CU traits for those with high levels of trauma. In experiment 2, a bootstrapped model revealed that the indirect effect of fear intensity amygdala response on social goal importance through CU traits is moderated by prior trauma exposure. This study, while exploratory, indicates that the pathophysiology associated with CU traits differs in youth as a function of prior trauma exposure. These data suggest that prior trauma exposure should be considered when evaluating potential interventions for youth with high CU traits.
Parasitic current collection by PASP Plus solar arrays
NASA Technical Reports Server (NTRS)
Davis, Victoria Ann; Gardner, Barbara M.
1995-01-01
Solar cells at potentials positive with respect to a surrounding plasma collect electrons. Current is collected by the exposed high voltage surfaces: the interconnects and the sides of the solar cells. This current is a drain on the array power that can be significant for high-power arrays. In addition, this current influences the current balance that determines the floating potential of the spacecraft. One of the objectives of the Air Force (PL/GPS) PASP Plus (Photovoltaic Array Space Power Plus Diagnostics) experiment is an improved understanding of parasitic current collection. We have done computer modeling of parasitic current collection and have examined current collection flight data from the first year of operations. Prior to the flight we did computer modeling to improve our understanding of the physical processes that control parasitic current collection. At high potentials, the current rapidly rises due to a phenomenon called snapover. Under snapover conditions, the equilibrium potential distribution across the dielectric surface is such that part of the area is at potentials greater than the first crossover of the secondary yield curve. Therefore, each incident electron generates more than one secondary electron. The net effect is that the high potential area and the collecting area increase. We did two-dimensional calculations for the various geometries to be flown. The calculations span the space of anticipated plasma conditions, applied potential, and material parameters. We used the calculations and early flight data to develop an analytic formula for the dependence of the current on the primary problem variables. The analytic formula was incorporated into the EPSAT computer code. EPSAT allows us to easily extend the results to other conditions. PASP Plus is the principal experiment integrated onto the Advanced Photovoltaic and Electronics Experiments (APEX) satellite bus. The experiment is testing twelve different solar array designs.
Parasitic current collection is being measured for eight of the designs under various operational and environment conditions. We examined the current collected as a function of the various parameters for the six non-concentrator designs. The results are similar to those obtained in previous experiments and predicted by the calculations. We are using the flight data to validate the analytic formula developed. The formula can be used to quantify the parasitic current collected. Anticipating the parasitic current value allows the spacecraft designer to include this interaction when developing the design.
Arnold, Jeffrey
2018-05-14
Floating-point computations are at the heart of much of the computing done in high energy physics. The correctness, speed and accuracy of these computations are of paramount importance. The lack of any of these characteristics can mean the difference between new, exciting physics and an embarrassing correction. This talk will examine practical aspects of IEEE 754-2008 floating-point arithmetic as encountered in HEP applications. After describing the basic features of IEEE floating-point arithmetic, the presentation will cover: common hardware implementations (SSE, x87); techniques for improving the accuracy of summation, multiplication, and data interchange; compiler options for gcc and icc affecting floating-point operations; and hazards to be avoided. About the speaker: Jeffrey M Arnold is a Senior Software Engineer in the Intel Compiler and Languages group at Intel Corporation. He has been part of the Digital->Compaq->Intel compiler organization for nearly 20 years; part of that time, he worked on both low- and high-level math libraries. Prior to that, he was in the VMS Engineering organization at Digital Equipment Corporation. In the late 1980s, Jeff spent 2½ years at CERN as part of the CERN/Digital Joint Project. In 2008, he returned to CERN to spend 10 weeks working with CERN/openlab. Since that time, he has returned to CERN multiple times to teach at openlab workshops and consult with various LHC experiments. Jeff received his Ph.D. in physics from Case Western Reserve University.
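A standard technique for improving summation accuracy of the kind the talk mentions is compensated (Kahan) summation, which carries a running correction for the low-order bits lost in each addition. A minimal sketch, not taken from the talk itself:

```python
def kahan_sum(values):
    """Compensated summation: track the rounding error of each
    addition in `c` and feed it back into the next term."""
    s = 0.0  # running sum
    c = 0.0  # running compensation for lost low-order bits
    for x in values:
        y = x - c          # apply the accumulated correction
        t = s + y          # big + small: low-order bits of y are lost
        c = (t - s) - y    # algebraically zero; recovers what was lost
        s = t
    return s

# Adding a million tiny terms to 1.0: naive summation never moves,
# while compensated summation recovers the true result 1.0 + 1e-10.
values = [1.0] + [1e-16] * 1_000_000
naive = 0.0
for x in values:
    naive += x
print(naive)              # prints 1.0
print(kahan_sum(values))  # ~1.0000000001
```

The compensation step only works if the compiler does not "simplify" `(t - s) - y` to zero, which is exactly the kind of floating-point hazard the talk's discussion of compiler options addresses.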
Mobility analysis, simulation, and scale model testing for the design of wheeled planetary rovers
NASA Technical Reports Server (NTRS)
Lindemann, Randel A.; Eisen, Howard J.
1993-01-01
The use of computer based techniques to model and simulate wheeled rovers on rough natural terrains is considered. Physical models of a prototype vehicle can be used to test the correlation of the simulations in scaled testing. The computer approaches include a quasi-static planar or two dimensional analysis and design tool based on the traction necessary for the vehicle to have imminent mobility. The computer program modeled a six by six wheel drive vehicle of original kinematic configuration, called the Rocker Bogie. The Rocker Bogie was optimized using the quasi-static software with respect to its articulation parameters prior to fabrication of a prototype. In another approach used, the dynamics of the Rocker Bogie vehicle in 3-D space was modeled on an engineering workstation using commercial software. The model included the complex and nonlinear interaction of the tire and terrain. The results of the investigation yielded numerical and graphical results of the rover traversing rough terrain on the earth, moon, and Mars. In addition, animations of the rover excursions were also generated. A prototype vehicle was then used in a series of testbed and field experiments. Correspondence was then established between the computer models and the physical model. The results indicated the utility of the quasi-static tool for configurational design, as well as the predictive ability of the 3-D simulation to model the dynamic behavior of the vehicle over short traverses.
Davis, Matthew H.
2016-01-01
Successful perception depends on combining sensory input with prior knowledge. However, the underlying mechanism by which these two sources of information are combined is unknown. In speech perception, as in other domains, two functionally distinct coding schemes have been proposed for how expectations influence representation of sensory evidence. Traditional models suggest that expected features of the speech input are enhanced or sharpened via interactive activation (Sharpened Signals). Conversely, Predictive Coding suggests that expected features are suppressed so that unexpected features of the speech input (Prediction Errors) are processed further. The present work is aimed at distinguishing between these two accounts of how prior knowledge influences speech perception. By combining behavioural, univariate, and multivariate fMRI measures of how sensory detail and prior expectations influence speech perception with computational modelling, we provide evidence in favour of Prediction Error computations. Increased sensory detail and informative expectations have additive behavioural and univariate neural effects because they both improve the accuracy of word report and reduce the BOLD signal in lateral temporal lobe regions. However, sensory detail and informative expectations have interacting effects on speech representations shown by multivariate fMRI in the posterior superior temporal sulcus. When prior knowledge was absent, increased sensory detail enhanced the amount of speech information measured in superior temporal multivoxel patterns, but with informative expectations, increased sensory detail reduced the amount of measured information. Computational simulations of Sharpened Signals and Prediction Errors during speech perception could both explain these behavioural and univariate fMRI observations. However, the multivariate fMRI observations were uniquely simulated by a Prediction Error and not a Sharpened Signal model. 
The interaction between prior expectation and sensory detail provides evidence for a Predictive Coding account of speech perception. Our work establishes methods that can be used to distinguish representations of Prediction Error and Sharpened Signals in other perceptual domains. PMID:27846209
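The two coding schemes contrasted in this abstract can be caricatured in a few lines. This schematic is not the paper's computational model; it only shows the qualitative signature the study exploits: a prediction-error code is silenced when input matches expectation, while a sharpened code remains active.

```python
import numpy as np

def sharpened(signal, expectation):
    # Interactive activation: expected features are enhanced,
    # then the representation is renormalized.
    rep = signal * expectation
    return rep / rep.sum()

def prediction_error(signal, expectation):
    # Predictive coding: expected features are subtracted out,
    # so only unexpected features are passed on.
    return signal - expectation

speech = np.array([0.7, 0.2, 0.1])  # toy feature evidence for one word

# A fully informative (matching) expectation silences prediction error...
print(prediction_error(speech, speech))  # → [0. 0. 0.]
# ...but leaves a sharpened representation fully informative.
print(sharpened(speech, speech))
```

This divergence under informative expectations is what lets the multivariate fMRI analysis distinguish the two accounts: measured information should shrink under Prediction Error but not under Sharpened Signals.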
NASA Astrophysics Data System (ADS)
Chien, Cheng-Chih
In the past thirty years, the effectiveness of computer-assisted learning has been found to vary across individual studies. Today, with dramatic technical improvements, computers are widespread in schools and used in a variety of ways. In this study, a design model involving educational technology, pedagogy, and content domain is proposed for effective use of computers in learning. Computer simulation, constructivist and Vygotskian perspectives, and circular motion are the three elements of the specific Chain Model for instructional design. The goal of the physics course is to help students discard ideas that are not consistent with those of the physics community and rebuild new knowledge. To achieve the learning goal, the strategies of using conceptual conflicts and using language to internalize specific tasks into mental functions were included. Computer simulations and accompanying worksheets were used to help students explore their own ideas and to generate questions for discussion. Using animated images to describe the dynamic processes involved in circular motion may reduce the complexity and possible miscommunication resulting from verbal explanations. The effectiveness of the instructional material on student learning is evaluated. The results of problem solving activities show that students using computer simulations had significantly higher scores than students not using computer simulations. For conceptual understanding, on the pretest students in the non-simulation group had significantly higher scores than students in the simulation group. There was no significant difference observed between the two groups on the posttest. The relations of gender, prior physics experience, and frequency of computer use outside the course to student achievement were also studied. There were fewer female students than male students and fewer students using computer simulations than students not using computer simulations.
These characteristics affect the statistical power for detecting differences. For future research, additional simulation-based interventions may be introduced to explore the potential of computer simulation in helping students learn. A test of conceptual understanding with more problems and an appropriate difficulty level may be needed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Box, D.; Boyd, J.; Di Benedetto, V.
2016-01-01
The FabrIc for Frontier Experiments (FIFE) project is an initiative within the Fermilab Scientific Computing Division designed to steer the computing model for non-LHC Fermilab experiments across multiple physics areas. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying size, needs, and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of solutions for high throughput computing, data management, database access and collaboration management within an experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid compute sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including a common job submission service, software and reference data distribution through CVMFS repositories, flexible and robust data transfer clients, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken the leading role in defining the computing model for Fermilab experiments, aided in the design of experiments beyond those hosted at Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.
New graduate students' baseline knowledge of the responsible conduct of research.
Heitman, Elizabeth; Olsen, Cara H; Anestidou, Lida; Bulger, Ruth Ellen
2007-09-01
To assess (1) new biomedical science graduate students' baseline knowledge of core concepts and standards in responsible conduct of research (RCR), (2) differences in graduate students' baseline knowledge overall and across the Office of Research Integrity's nine core areas, and (3) demographic and educational factors in these differences. A 30-question, computer-scored multiple-choice test on core concepts and standards of RCR was developed following content analysis of 20 United States-published RCR texts, and combined with demographic questions on undergraduate experience with RCR developed from graduate student focus groups. Four hundred two new graduate students at three health science universities were recruited for Scantron and online testing before beginning RCR instruction. Two hundred fifty-one of 402 eligible trainees (62%) at three universities completed the test; scores ranged from 26.7% to 83.3%, with a mean of 59.5%. Only seven (3%) participants scored 80% or above. Students who received their undergraduate education outside the United States scored significantly lower (mean 52.0%) than those with U.S. bachelor's degrees (mean 60.5%, P < .001). Participants with prior graduate biomedical or health professions education scored marginally higher than new students, but both groups' mean scores were well below 80%. The mean score of 16 participants who reported previous graduate-level RCR instruction was 67.7%. Participants' specific knowledge varied, but overall scores were universally low. New graduate biomedical sciences students have inadequate and inconsistent knowledge of RCR, irrespective of their prior education or experience. Incoming trainees with previous graduate RCR education may also have gaps in core knowledge.
What is the perception of biological risk by undergraduate nursing students?
Moreno-Arroyo, Mª Carmen; Puig-Llobet, Montserrat; Falco-Pegueroles, Anna; Lluch-Canut, Maria Teresa; García, Irma Casas; Roldán-Merino, Juan
2016-01-01
Objective: to analyze undergraduate nursing students' perception of biological risk and its relationship with their prior practical training. Method: a descriptive cross-sectional study was conducted among undergraduate nursing students enrolled in clinical practice courses in the academic year 2013-2014 at the School of Nursing at the University of Barcelona. Variables: sociodemographic variables, employment, training, clinical experience and other variables related to the assessment of perceived biological risk were collected. Both a newly developed tool and the Dimensional Assessment of Risk Perception at the worker level scale (Escala de Evaluación Dimensional del Riesgo Percibido por el Trabajador, EDRP-T) were used. Statistical analysis: descriptive and univariate analysis were used to identify differences between the perception of biological risk of the EDRP-T scale items and sociodemographic variables. Results: students without prior practical training had weaker perceptions of biological risk compared to students with prior practical training (p=0.05 and p=0.04, respectively). Weaker perceptions of biological risk were also found among students with prior work experience. Conclusion: practical training and work experience influence the perception of biological risk among nursing students. PMID:27384468
The Utility of Gas Gun Experiments in Developing Equations of State
NASA Astrophysics Data System (ADS)
Pittman, Emily; Hagelberg, Carl; Ramsey, Scott
2016-11-01
Gas gun experiments have the potential to investigate material properties under various well-defined shock conditions, making them a valuable research tool for the development of equations of state (EOS) and the study of material response under shock loading. Gas guns can create shocks for loading to pressures ranging from MPa to GPa. A variety of diagnostic techniques can be used to gather data from gas gun experiments; the resulting data are applicable to many fields of study. The focus of this set of experiments is the development of Hugoniot data for the overdriven products EOS of PBX 9501, extending the data from which current computational EOS models draw. This series of shots was conducted by M-9 using the two-stage gas guns at LANL and aimed to gather data within the 30-120 GPa pressure regime. The experiment was replicated using FLAG, a Lagrangian multiphysics code, with a one-dimensional setup that employs the Wescott Stewart Davis (WSD) reactive burn model. Prior to this series, data did not extend into this higher range, so the new data allowed the model to be re-evaluated. A comparison of the results to the experimental data reveals that the model is a good fit to the data below 40 GPa. However, the model did not fall within the error bars for pressures above this region. This is an indication that the material model or burn model could be modified to better match the data.
The Use of Virtual Reality Computer Simulation in Learning Port-A Cath Injection
ERIC Educational Resources Information Center
Tsai, Sing-Ling; Chai, Sin-Kuo; Hsieh, Li-Feng; Lin, Shirling; Taur, Fang-Meei; Sung, Wen-Hsu; Doong, Ji-Liang
2008-01-01
Cost-benefit management trends in Taiwan healthcare settings have led nurses to perform more invasive skills, such as Port-A cath administration of medications. Accordingly, nurses must be well prepared prior to training under the mentor-and-supervision method. The purpose of the current study was to develop a computer-assisted protocol using virtual…
The Use of Reverse Engineering to Analyse Student Computer Programs.
ERIC Educational Resources Information Center
Vanneste, Philip; And Others
1996-01-01
Discusses how the reverse engineering approach can generate feedback on computer programs without the user having any prior knowledge of what the program was designed to do. This approach uses the cognitive model of programming knowledge to interpret both context independent and dependent errors in the same words and concepts as human programmers.…
ERIC Educational Resources Information Center
Waight, Noemi; Gillmeister, Kristina
2014-01-01
This study examined teachers' and students' initial conceptions of computer-based models--Flash and NetLogo models--and documented how teachers and students reconciled notions of multiple representations featuring macroscopic, submicroscopic and symbolic representations prior to actual intervention in eight high school chemistry…
Tangential scanning of hardwood logs: developing an industrial computer tomography scanner
Nand K. Gupta; Daniel L. Schmoldt; Bruce Isaacson
1999-01-01
It is generally believed that noninvasive scanning of hardwood logs such as computer tomography (CT) scanning prior to initial breakdown will greatly improve the processing of logs into lumber. This belief, however, has not translated into rapid development and widespread installation of industrial CT scanners for log processing. The roadblock has been more operational...
ERIC Educational Resources Information Center
Pinkard, Nichole; Erete, Sheena; Martin, Caitlin K.; McKinney de Royston, Maxine
2017-01-01
Women use technology to mediate numerous aspects of their professional and personal lives. Yet, few design and create these technologies given that women, especially women of color, are grossly underrepresented in computer science and engineering courses. Decisions about participation in STEM are frequently made prior to high school, and these…
Sinha, Shriprakash
2017-12-04
Ever since the accidental discovery of Wingless [Sharma R.P., Drosophila Information Service, 1973, 50, p 134], research on the Wnt signaling pathway has taken significant strides in wet-lab experiments and various cancer clinical trials, augmented by recent developments in advanced computational modeling of the pathway. Information-rich gene expression profiles reveal various aspects of the signaling pathway and help in studying different issues simultaneously. To date, few computational studies incorporate the simultaneous study of these issues. This manuscript ∙ explores the strength of contributing factors in the signaling pathway, ∙ analyzes the existing causal relations among the inter/extracellular factors affecting the pathway based on prior biological knowledge, and ∙ investigates the deviations in fold changes in the recently found prevalence of psychophysical laws working in the pathway. To achieve this goal, local and global sensitivity analyses are conducted on the (non)linear responses between the factors obtained from static and time-series expression profiles, using density-based (Hilbert-Schmidt Information Criterion) and variance-based (Sobol) sensitivity indices. The results show the advantage of density-based indices over variance-based indices, mainly due to the former's use of distance measures and the kernel trick via a reproducing kernel Hilbert space (RKHS), which capture nonlinear relations among the intra/extracellular factors of the pathway in a higher-dimensional space. In time-series data, these indices make it possible to observe where in time particular factors are influenced and contribute to the pathway as the concentrations of other factors change.
This synergy of prior biological knowledge, sensitivity analysis, and representations in higher-dimensional spaces can facilitate time-based administration of targeted therapeutic drugs and reveal hidden biological information within colorectal cancer samples.
Dual-contrast agent photon-counting computed tomography of the heart: initial experience.
Symons, Rolf; Cork, Tyler E; Lakshmanan, Manu N; Evers, Robert; Davies-Venn, Cynthia; Rice, Kelly A; Thomas, Marvin L; Liu, Chia-Ying; Kappler, Steffen; Ulzheimer, Stefan; Sandfort, Veit; Bluemke, David A; Pourmorteza, Amir
2017-08-01
To determine the feasibility of dual-contrast agent imaging of the heart using photon-counting detector (PCD) computed tomography (CT) to simultaneously assess both first-pass and late enhancement of the myocardium. An occlusion-reperfusion canine model of myocardial infarction was used. Gadolinium-based contrast was injected 10 min prior to PCD CT. Iodinated contrast was infused immediately prior to PCD CT, thus capturing late gadolinium enhancement as well as first-pass iodine enhancement. Gadolinium and iodine maps were calculated using a linear material decomposition technique and compared to single-energy (conventional) images. PCD images were compared to in vivo and ex vivo magnetic resonance imaging (MRI) and histology. For infarct versus remote myocardium, contrast-to-noise ratio (CNR) was maximal on late enhancement gadolinium maps (CNR 9.0 ± 0.8, 6.6 ± 0.7, and 0.4 ± 0.4, p < 0.001 for gadolinium maps, single-energy images, and iodine maps, respectively). For infarct versus blood pool, CNR was maximal for iodine maps (CNR 11.8 ± 1.3, 3.8 ± 1.0, and 1.3 ± 0.4, p < 0.001 for iodine maps, gadolinium maps, and single-energy images, respectively). Combined first-pass iodine and late gadolinium maps allowed quantitative separation of blood pool, scar, and remote myocardium. MRI and histology analysis confirmed accurate PCD CT delineation of scar. Simultaneous multi-contrast agent cardiac imaging is feasible with photon-counting detector CT. These initial proof-of-concept results may provide incentives to develop new k-edge contrast agents, to investigate possible interactions between multiple simultaneously administered contrast agents, and to ultimately bring them to clinical practice.
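The linear material decomposition mentioned above can be illustrated as a per-voxel two-by-two solve: attenuation measured in two energy bins is modeled as a linear mix of iodine and gadolinium basis attenuations. The calibration coefficients below are invented for illustration and are not values from the study:

```python
# Sketch of linear two-material decomposition, as used conceptually in
# dual-contrast photon-counting CT. Per voxel, attenuation in two energy
# bins is a linear combination of iodine and gadolinium contributions;
# solving the 2x2 system recovers the two concentrations.
# All coefficients are illustrative, not calibrated basis attenuations.

def decompose(mu_bin1, mu_bin2, a):
    """Solve [mu1, mu2]^T = A @ [c_iodine, c_gadolinium]^T by Cramer's rule.

    a = ((a11, a12), (a21, a22)): per-unit-concentration attenuation of
    iodine (column 1) and gadolinium (column 2) in each energy bin.
    """
    (a11, a12), (a21, a22) = a
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("basis materials are not separable in these bins")
    c_i = (mu_bin1 * a22 - a12 * mu_bin2) / det
    c_gd = (a11 * mu_bin2 - mu_bin1 * a21) / det
    return c_i, c_gd

# Synthetic voxel mixed from c_iodine = 0.5, c_gadolinium = 0.2:
A = ((2.0, 1.0),
     (1.0, 3.0))
mu1 = 2.0 * 0.5 + 1.0 * 0.2
mu2 = 1.0 * 0.5 + 3.0 * 0.2
print(decompose(mu1, mu2, A))   # recovers approximately (0.5, 0.2)
```

In practice the bins are chosen to straddle the k-edges of the two agents so the matrix is well conditioned; in a clinical pipeline this solve runs over every voxel, typically with regularization against noise.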
Automated Agatston score computation in non-ECG gated CT scans using deep learning
NASA Astrophysics Data System (ADS)
Cano-Espinosa, Carlos; González, Germán.; Washko, George R.; Cazorla, Miguel; San José Estépar, Raúl
2018-03-01
Introduction: The Agatston score is a well-established metric of cardiovascular disease related to clinical outcomes. It is computed from CT scans by (a) measuring the volume and intensity of the atherosclerotic plaques and (b) aggregating that information into an index. Objective: To generate a convolutional neural network that takes a non-contrast chest CT scan as input and outputs the associated Agatston score directly, without a prior segmentation of coronary artery calcifications (CAC). Materials and methods: We use a database of 5973 non-contrast, non-ECG-gated chest CT scans for which the Agatston score has been computed manually. The heart in each scan is cropped automatically using an object detector. The database is split into 4973 cases for training and 1000 for testing. We train a 3D deep convolutional neural network to regress the Agatston score directly from the extracted hearts. Results: The proposed method yields a Pearson correlation coefficient of r = 0.93 (p <= 0.0001) against the manual reference standard in the 1000 test cases. It further correctly stratifies 72.6% of the cases into standard risk groups. This compares with more complex state-of-the-art methods based on prior segmentation of the CACs, which achieve r = 0.94 in ECG-gated pulmonary CT. Conclusions: A convolutional neural network can regress the Agatston score from the image of the heart directly, without a prior segmentation of the CACs. This is a new and simpler paradigm for Agatston score computation that yields results similar to the state-of-the-art literature.
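For context, the manual reference scores the network learns to reproduce follow the conventional Agatston definition: each calcified lesion's area is weighted by a density factor derived from its peak attenuation, and the weighted areas are summed. A simplified sketch of that rule, operating on pre-segmented lesion summaries rather than on images (real implementations score lesions slice by slice on segmented CT):

```python
# Sketch of the conventional Agatston computation underlying the manual
# reference standard. Thresholds and weights follow the standard definition;
# lesions are given here as (area_mm2, peak_hu) pairs for illustration.

def density_weight(peak_hu):
    """Standard Agatston density factor from a lesion's peak attenuation (HU)."""
    if peak_hu < 130:
        return 0   # below the calcification threshold; not scored
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    """Sum over lesions of area (mm^2) times the density weight."""
    return sum(area * density_weight(peak) for area, peak in lesions)

# Example: three plaques of increasing density
lesions = [(10.0, 150), (4.0, 350), (2.5, 450)]
print(agatston_score(lesions))   # -> 10*1 + 4*3 + 2.5*4 = 32.0
```

The end-to-end network in the abstract skips this lesion-level pipeline entirely, regressing the final scalar from the cropped heart volume.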
ERIC Educational Resources Information Center
Boddey, Kerrie; de Berg, Kevin
2015-01-01
Nursing students have typically found the study of chemistry to be one of their major challenges in a nursing course. This mixed method study was designed to explore how prior experiences in chemistry might impact chemistry achievement during a health science unit. Nursing students (N = 101) studying chemistry as part of a health science unit were…
ERIC Educational Resources Information Center
Clark, Mary Kristen; Kamhi, Alan G.
2014-01-01
Purpose: In 2 experiments, we examined the influence of prior knowledge and interest on 4th- and 5th-grade students' passage comprehension scores on the Qualitative Reading Inventory-4 (QRI-4) and 2 experimenter constructed passages. Method: In Experiment 1, 4th- and 5th-grade students were administered 4 Level 4 passages or 4 Level 5…