ERIC Educational Resources Information Center
Nottingham, Sara; Verscheure, Susan
2010-01-01
Active learning is a teaching methodology with a focus on student-centered learning that engages students in the educational process. This study implemented active learning techniques in an orthopedic assessment laboratory and examined the effects of these teaching techniques. Mean scores from written exams, practical exams, and final course evaluations…
Dallora, Ana Luiza; Eivazzadeh, Shahryar; Mendes, Emilia; Berglund, Johan; Anderberg, Peter
2017-01-01
Dementia is a complex disorder characterized by poor outcomes for the patients and high costs of care. After decades of research little is known about its mechanisms. Having prognostic estimates about dementia can help researchers, patients and public entities in dealing with this disorder. Thus, health data, machine learning and microsimulation techniques could be employed in developing prognostic estimates for dementia. The goal of this paper is to present evidence on the state of the art of studies investigating the prognosis of dementia using machine learning and microsimulation techniques. To achieve our goal we carried out a systematic literature review, in which three large databases (PubMed, Scopus, and Web of Science) were searched to select studies that employed machine learning or microsimulation techniques for the prognosis of dementia. A single backward snowballing step was done to identify further studies. A quality checklist was also employed to assess the quality of the evidence presented by the selected studies, and low-quality studies were removed. Finally, data from the final set of studies were extracted into summary tables. In total, 37 papers were included. The data summary results showed that current research is focused on the investigation of patients with mild cognitive impairment who will convert to Alzheimer's disease, using machine learning techniques. Microsimulation studies were concerned with cost estimation and had a population focus. Neuroimaging was the most commonly used variable. Prediction of conversion from MCI to AD is the dominant theme in the selected studies. Most studies used ML techniques on neuroimaging data. Only a few data sources have been recruited by most studies and the ADNI database is the one most commonly used. Only two studies have investigated the prediction of epidemiological aspects of dementia using either ML or MS techniques. Finally, care should be taken when interpreting the reported accuracy of ML techniques, given studies' different contexts.
Mendes, Emilia; Berglund, Johan; Anderberg, Peter
2017-01-01
Background Dementia is a complex disorder characterized by poor outcomes for the patients and high costs of care. After decades of research little is known about its mechanisms. Having prognostic estimates about dementia can help researchers, patients and public entities in dealing with this disorder. Thus, health data, machine learning and microsimulation techniques could be employed in developing prognostic estimates for dementia. Objective The goal of this paper is to present evidence on the state of the art of studies investigating the prognosis of dementia using machine learning and microsimulation techniques. Method To achieve our goal we carried out a systematic literature review, in which three large databases (PubMed, Scopus, and Web of Science) were searched to select studies that employed machine learning or microsimulation techniques for the prognosis of dementia. A single backward snowballing step was done to identify further studies. A quality checklist was also employed to assess the quality of the evidence presented by the selected studies, and low-quality studies were removed. Finally, data from the final set of studies were extracted into summary tables. Results In total, 37 papers were included. The data summary results showed that current research is focused on the investigation of patients with mild cognitive impairment who will convert to Alzheimer's disease, using machine learning techniques. Microsimulation studies were concerned with cost estimation and had a population focus. Neuroimaging was the most commonly used variable. Conclusions Prediction of conversion from MCI to AD is the dominant theme in the selected studies. Most studies used ML techniques on neuroimaging data. Only a few data sources have been recruited by most studies and the ADNI database is the one most commonly used. Only two studies have investigated the prediction of epidemiological aspects of dementia using either ML or MS techniques. Finally, care should be taken when interpreting the reported accuracy of ML techniques, given studies' different contexts. PMID:28662070
Learning by Teaching: Implementation of a Multimedia Project in Astro 101
NASA Astrophysics Data System (ADS)
Perrodin, D.; Lommen, A.
2011-09-01
Astro 101 students have deep-seated pre-conceptions regarding such topics as the cause of moon phases or the seasons. Beyond exploring the topics in a learner-centered fashion, the "learning by teaching" philosophy enables students to truly master concepts. In order to make students teach the cause of moon phases, we created a multimedia project where groups of students taught other students and filmed the session. They were to produce a 10-minute final movie highlighting their teaching techniques and showing students in the process of learning the concepts. This "experiment" turned out to be a great success for a few reasons. First, students gained experience explaining conceptually-challenging topics, making them learn the material better. Additionally, they learned to apply learner-centered techniques, most likely learning to teach for the first time. Finally, this project provided the students a connection between the classroom and the rest of the college, making them responsible for applying and sharing their knowledge with their peers.
ERIC Educational Resources Information Center
McCormick, Sandra; Cooper, John O.
The study reported in this paper investigated the effects of a frequently recommended study technique on the comprehension of expository text by high-school students having learning disabilities. The instructional procedure studied was "Survey, Question, Read, Recite, Review" (SQ3R). Six experiments were conducted over a 3-year period,…
Bullying in Virtual Learning Communities.
Nikiforos, Stefanos; Tzanavaris, Spyros; Kermanidis, Katia Lida
2017-01-01
Bullying through the internet has been investigated and analyzed mainly in the field of social media. In this paper, we attempt to analyze bullying in Virtual Learning Communities using Natural Language Processing (NLP) techniques, mainly in the context of sociocultural learning theories. To this end, four case studies were conducted. We aim to apply NLP techniques to speech analysis on communication data of online communities. Emphasis is given to qualitative data, taking into account the subjectivity of the collaborative activity. Finally, this is the first time such an analysis has been attempted on Greek data.
A review on machine learning principles for multi-view biological data integration.
Li, Yifeng; Wu, Fang-Xiang; Ngom, Alioune
2018-03-01
Driven by high-throughput sequencing techniques, modern genomic and clinical studies are in strong need of integrative machine learning models to make better use of vast volumes of heterogeneous information in the deep understanding of biological systems and the development of predictive models. How data from multiple sources (called multi-view data) are incorporated in a learning system is a key step for successful analysis. In this article, we provide a comprehensive review of omics and clinical data integration techniques, from a machine learning perspective, for various analyses such as prediction, clustering, dimension reduction and association. We show that Bayesian models are able to use prior information and model measurements with various distributions; tree-based methods can either build a tree with all features or collectively make a final decision based on trees learned from each view; kernel methods fuse the similarity matrices learned from individual views into a final similarity matrix or learning model; network-based fusion methods are capable of inferring direct and indirect associations in a heterogeneous network; matrix factorization models have the potential to learn interactions among features from different views; and a range of deep neural networks can be integrated in multi-modal learning for capturing the complex mechanisms of biological systems.
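To make one of the reviewed fusion strategies concrete, the sketch below shows kernel-based integration in its simplest form: a similarity (kernel) matrix is computed per view and the matrices are averaged into a single fused kernel that drives a standard kernel classifier. This is a minimal illustration on synthetic data, not code from the reviewed studies; the two "views", the RBF kernels, and all parameter values are assumptions.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Two synthetic "views" of the same 200 samples (e.g., expression and methylation).
n = 200
labels = rng.integers(0, 2, size=n)
view_expr = rng.normal(size=(n, 50)) + labels[:, None] * 0.5
view_meth = rng.normal(size=(n, 30)) + labels[:, None] * 0.3

# Kernel fusion: compute one similarity matrix per view, then average them.
K_expr = rbf_kernel(view_expr, gamma=0.02)
K_meth = rbf_kernel(view_meth, gamma=0.05)
K_fused = (K_expr + K_meth) / 2.0

# Train an SVM directly on the fused (precomputed) kernel.
train, test = np.arange(150), np.arange(150, n)
clf = SVC(kernel="precomputed").fit(K_fused[np.ix_(train, train)], labels[train])
acc = clf.score(K_fused[np.ix_(test, train)], labels[test])
print(f"fused-kernel accuracy: {acc:.2f}")
```

More elaborate schemes weight the views or learn the combination, but the averaging step above captures the basic idea of fusing per-view similarities before learning.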
Applications of Deep Learning and Reinforcement Learning to Biological Data.
Mahmud, Mufti; Kaiser, Mohammed Shamim; Hussain, Amir; Vassanelli, Stefano
2018-06-01
Rapid advances in hardware-based technologies during the past decades have opened up new possibilities for life scientists to gather multimodal data in various application domains, such as omics, bioimaging, medical imaging, and (brain/body)-machine interfaces. These have generated novel opportunities for development of dedicated data-intensive machine learning techniques. In particular, recent research in deep learning (DL), reinforcement learning (RL), and their combination (deep RL) promise to revolutionize the future of artificial intelligence. The growth in computational power accompanied by faster and increased data storage, and declining computing costs have already allowed scientists in various fields to apply these techniques on data sets that were previously intractable owing to their size and complexity. This paper provides a comprehensive survey on the application of DL, RL, and deep RL techniques in mining biological data. In addition, we compare the performances of DL techniques when applied to different data sets across various application domains. Finally, we outline open issues in this challenging research area and discuss future development perspectives.
NASA Astrophysics Data System (ADS)
Miller, H. R.; Sell, K. S.; Herbert, B. E.
2004-12-01
Shifts in learning goals in introductory earth science courses toward greater emphasis on critical thinking and the nature of science have led to the adoption of new pedagogical techniques, including inquiry-based learning (IBL). IBL is thought to support understanding of the nature of science and foster development of scientific reasoning and critical thinking skills by modeling authentic science inquiry. Implementation of new pedagogical techniques does not occur in isolation: instruction and learning occur in a complex learning environment, comprising the social, physical, mental, and pedagogical contexts. This study characterized the impact of an IBL module versus a traditionally structured laboratory exercise in an introductory physical geology class at Texas A&M University. Student activities in this study included manipulation of large-scale data sets, use of multiple representations, and exposure to ill-constrained problems common to the Texas Gulf Coast system. Formative assessment data collected included an initial survey of self-efficacy, student demographics, content knowledge, and a pre-mental model expression. Summative data collected included a post-test, post-mental model expression, final laboratory report, and a post-survey on student attitudes toward the module. Mental model expressions and final reports were scored according to a validated rubric instrument (Cronbach's alpha: 0.84-0.98). Nine lab sections were randomized into experimental and control groups. Experimental groups were taught using IBL pedagogical techniques, while the control groups were taught using traditional laboratory "workbook" techniques. Preliminary assessment based on rubric scores for pre-tests using Student's t-test (N ˜ 140) indicated that the experimental and control groups were not significantly different (p > 0.05); therefore, the learning environment likely impacted students' ability to succeed. A non-supportive learning environment, including student attitudes, teaching assistant attitudes, the lack of scaffolded learning, limited pedagogical content knowledge, and departmental oversight, all of which were encountered during this study, can have an effect on students' attitudes and achievements during the course. Data collected showed an overall improvement in content knowledge (38% increase), while performance effort clearly declined, as seen in post-mental model expressions (a 24.8% decline in performance) and the percentage of assignments turned in (39% of all students turned in the required final report). A non-supportive learning environment was also evident in student comments on the final survey: "I think that all the TA's and the professor have forgotten that we are an intro class". A non-supportive environment clearly does not encourage critical thinking and completion of work. This pilot study showed that the complex learning environment can play a significant role in student learning. It also illustrates the need for future studies of IBL with supportive learning environments in order for students to achieve academic excellence and develop scientific reasoning and critical thinking skills.
Jet-images — deep learning edition
de Oliveira, Luke; Kagan, Michael; Mackey, Lester; ...
2016-07-13
Building on the notion of a particle physics detector as a camera and the collimated streams of high energy particles, or jets, it measures as an image, we investigate the potential of machine learning techniques based on deep learning architectures to identify highly boosted W bosons. Modern deep learning algorithms trained on jet images can out-perform standard physically-motivated feature driven approaches to jet tagging. We develop techniques for visualizing how these features are learned by the network and what additional information is used to improve performance. Finally, this interplay between physically-motivated feature driven tools and supervised learning algorithms is general and can be used to significantly increase the sensitivity to discover new particles and new forces, and gain a deeper understanding of the physics within jets.
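As a rough illustration of the approach described above, the sketch below trains a small convolutional network to separate two classes of single-channel "jet images". The 25x25 image size, the network architecture, and the random training data are assumptions for demonstration only and do not reproduce the networks or simulated samples used in the paper.

```python
import torch
import torch.nn as nn

# Minimal CNN for binary jet-image tagging (signal W jet vs. QCD background).
# The 25x25 single-channel "images" here are random stand-ins for calorimeter data.
class JetTagger(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(16 * 6 * 6, 1)  # 25 -> 12 -> 6 after two poolings

    def forward(self, x):
        h = self.features(x)
        return self.classifier(h.flatten(1)).squeeze(1)

images = torch.randn(64, 1, 25, 25)          # batch of jet images
labels = torch.randint(0, 2, (64,)).float()  # 1 = W jet, 0 = background

model = JetTagger()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(5):                         # a few illustrative gradient steps
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss {loss.item():.3f}")
```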
Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing
NASA Technical Reports Server (NTRS)
Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)
2001-01-01
The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is basically a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes uncertainty and vagueness typically found in image analysis. ANN provides learning capabilities, and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along slide basket cables utilized by the astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another current, potentially viable application at NASA is in detecting anomalies of the NASA Space Shuttle Orbiter's radiator panels.
Evaluation of a Small-Group Technique as a Teacher Training Instrument. Final Report.
ERIC Educational Resources Information Center
Whipple, Babette S.
An exploratory study was designed to determine whether the use of a new, small group technique adds significantly to the level of training in early childhood education. Two groups of five student teachers learned the technique and were then evaluated. The evaluation procedure was designed to measure changes in their educational objectives, their…
Introducing Social Stratification and Inequality: An Active Learning Technique.
ERIC Educational Resources Information Center
McCammon, Lucy
1999-01-01
Summarizes literature on techniques for teaching social stratification. Describes the three parts of an exercise that enables students to understand economic and political inequality: students are given a family scenario, create household budgets, and finally rework the national budget with their family scenario groups. Discusses student…
Research on Mathematical Techniques in Psychology. Final Report.
ERIC Educational Resources Information Center
Gulliksen, Harold
Mathematical techniques are developed for studying psychological problems in three fields: (1) psychological scaling, (2) learning and concept formation, and (3) mental measurement. Psychological scaling procedures are demonstrated to be useful in many areas, ranging from sensory discrimination of physical stimuli, such as colors, sounds, etc.,…
Learning and Optimization of Cognitive Capabilities. Final Project Report.
ERIC Educational Resources Information Center
Lumsdaine, A.A.; And Others
The work of a three-year series of experimental studies of human cognition is summarized in this report. Problem solving and learning in man-machine interaction was investigated, as well as relevant variables and processes. The work included four separate projects: (1) computer-aided problem solving, (2) computer-aided instruction techniques, (3)…
Helping Learning Disabled Adults through Special Tutorial Techniques. Final Report. 1992-1993.
ERIC Educational Resources Information Center
Reading Area Community Coll., PA.
A project offered special training to instructors and volunteer tutors for adult basic education classes in recognizing and helping adults who are enrolled in adult education programs with learning disabilities. These instructors and tutors were taught the necessary skills through a series of three 3-hour inservice sessions. The regular…
An Approach Based on Social Network Analysis Applied to a Collaborative Learning Experience
ERIC Educational Resources Information Center
Claros, Iván; Cobos, Ruth; Collazos, César A.
2016-01-01
The Social Network Analysis (SNA) techniques allow modelling and analysing the interaction among individuals based on their attributes and relationships. This approach has been used by several researchers in order to measure the social processes in collaborative learning experiences. But oftentimes such measures were calculated at the final state…
Research Handbook on Children's Language Learning. Preliminary Edition. Final Report.
ERIC Educational Resources Information Center
Dato, Daniel P.
This handbook serves as an introduction to the study of children's language development and as a supplementary aid in the training of research workers in the field of children's language learning. As a teaching aid, it is suggested this work be used with a film entitled "Psycholinguistic Research Techniques: Children's Language." Major chapters…
A service based adaptive U-learning system using UX.
Jeong, Hwa-Young; Yi, Gangman
2014-01-01
In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new technology in the development of application systems includes both cloud and ubiquitous computing. Cloud computing can support learning system processes by using services, while ubiquitous computing can provide system operation and management via a high-performance technical process and network. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing the learning material and processes of courses by learning units using services in a ubiquitous computing environment. We also investigate functions that support users' tailored materials according to their learning style. That is, we analyzed the users' data and their characteristics in accordance with their user experience. We subsequently adapted the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system delivers better learning effects to learners than existing techniques.
A Service Based Adaptive U-Learning System Using UX
Jeong, Hwa-Young
2014-01-01
In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new technology in the development of application systems includes both cloud and ubiquitous computing. Cloud computing can support learning system processes by using services, while ubiquitous computing can provide system operation and management via a high-performance technical process and network. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing the learning material and processes of courses by learning units using services in a ubiquitous computing environment. We also investigate functions that support users' tailored materials according to their learning style. That is, we analyzed the users' data and their characteristics in accordance with their user experience. We subsequently adapted the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system delivers better learning effects to learners than existing techniques. PMID:25147832
A Machine Learning Concept for DTN Routing
NASA Technical Reports Server (NTRS)
Dudukovich, Rachel; Hylton, Alan; Papachristou, Christos
2017-01-01
This paper discusses the concept and architecture of a machine learning based router for delay tolerant space networks. The techniques of reinforcement learning and Bayesian learning are used to supplement the routing decisions of the popular Contact Graph Routing algorithm. An introduction to the concepts of Contact Graph Routing, Q-routing and Naive Bayes classification are given. The development of an architecture for a cross-layer feedback framework for DTN (Delay-Tolerant Networking) protocols is discussed. Finally, initial simulation setup and results are given.
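The sketch below illustrates the Q-routing idea mentioned above: each node keeps a table of estimated delivery delays per (destination, neighbor) pair and updates it from observed hop delays plus the chosen neighbor's own best estimate. This is a generic, minimal Q-routing sketch in the spirit of Boyan and Littman's formulation, not the router architecture developed in the paper; the node names, learning rate, and epsilon-greedy policy are assumptions.

```python
import random

# Minimal Q-routing sketch: each node keeps Q[destination][neighbor], an estimate of
# the delivery time to `destination` if the bundle is forwarded via `neighbor`.
class QRouter:
    def __init__(self, neighbors, destinations, alpha=0.5):
        self.alpha = alpha
        self.q = {d: {n: 0.0 for n in neighbors} for d in destinations}

    def choose_next_hop(self, destination, epsilon=0.1):
        # Epsilon-greedy: usually pick the neighbor with the lowest estimated delay.
        if random.random() < epsilon:
            return random.choice(list(self.q[destination]))
        return min(self.q[destination], key=self.q[destination].get)

    def update(self, destination, neighbor, transit_delay, neighbor_best_estimate):
        # Q-routing update: new target = observed hop delay + neighbor's best remaining estimate.
        target = transit_delay + neighbor_best_estimate
        self.q[destination][neighbor] += self.alpha * (target - self.q[destination][neighbor])

# Usage: node A with neighbors B and C routing toward destination D.
router = QRouter(neighbors=["B", "C"], destinations=["D"])
hop = router.choose_next_hop("D")
router.update("D", hop, transit_delay=3.0, neighbor_best_estimate=7.0)
print(hop, router.q["D"])
```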
Introducing 12 Year-Olds to Elementary Particles
ERIC Educational Resources Information Center
Wiener, Gerfried J.; Schmeling, Sascha M.; Hopf, Martin
2017-01-01
We present a new learning unit, which introduces 12 year-olds to the subatomic structure of matter. The learning unit was iteratively developed as a design-based research project using the technique of probing acceptance. We give a brief overview of the unit's final version, discuss its key ideas and main concepts, and conclude by highlighting the…
ERIC Educational Resources Information Center
Skokie School District 68, IL.
A Chicago suburban public school with approximately 450 children per grade level demonstrated a system-wide program for identification, diagnosis, and educational treatment of children with learning disabilities in grades 2 through 6. Children were judged to underachieve when achievement measures in language or mathematics fell more than 10% below…
Initial Skill Acquisition of Handrim Wheelchair Propulsion: A New Perspective.
Vegter, Riemer J K; de Groot, Sonja; Lamoth, Claudine J; Veeger, Dirkjan Hej; van der Woude, Lucas H V
2014-01-01
To gain insight into cyclic motor learning processes, handrim wheelchair propulsion is a suitable cyclic task, to be learned during early rehabilitation and novel to almost every individual. To propel in an energy-efficient manner, wheelchair users must learn to control the bimanually applied forces onto the rims, preserving both speed and direction of locomotion. The purpose of this study was to evaluate mechanical efficiency and propulsion technique during the initial stage of motor learning. Therefore, 70 naive able-bodied men received 12 min of uninstructed wheelchair practice, consisting of three 4-min blocks separated by 2 min of rest. Practice was performed on a motor-driven treadmill at a fixed belt speed and constant power output relative to body mass. Energy consumption and the kinetics of propulsion technique were continuously measured. Participants significantly increased their mechanical efficiency and changed their propulsion technique from a high-frequency mode with a lot of negative work to a longer-slower movement pattern with fewer power losses. Furthermore, a multilevel model showed propulsion technique to be related to mechanical efficiency. Finally, improvers and non-improvers were identified. The non-improving group was already more efficient and had a better propulsion technique in the first block of practice (i.e., the fourth minute). These findings link propulsion technique to mechanical efficiency, support the importance of a correct propulsion technique for wheelchair users, and show motor learning differences.
Crowdsourcing: A Primer and Its implications for Systems Engineering
2012-08-01
detailing areas to be improved within current crowdsourcing frameworks. Finally, an agent-based simulation using machine learning techniques is defined, preliminary results are presented, and future research directions are described.
Geometry-based ensembles: toward a structural characterization of the classification boundary.
Pujol, Oriol; Masip, David
2009-06-01
This paper introduces a novel binary discriminative learning technique based on the approximation of the nonlinear decision boundary by a piecewise linear smooth additive model. The decision border is geometrically defined by means of the characterizing boundary points, i.e., points that belong to the optimal boundary under a certain notion of robustness. Based on these points, a set of locally robust linear classifiers is defined and assembled by means of a Tikhonov regularized optimization procedure in an additive model to create a final lambda-smooth decision rule. As a result, a very simple and robust classifier with a strong geometrical meaning and nonlinear behavior is obtained. The simplicity of the method allows its extension to cope with some of today's machine learning challenges, such as online learning, large-scale learning or parallelization, with linear computational complexity. We validate our approach on the UCI database, comparing with several state-of-the-art classification techniques. Finally, we apply our technique in online and large-scale scenarios and in six real-life computer vision and pattern recognition problems: gender recognition based on face images, intravascular ultrasound tissue classification, speed traffic sign detection, Chagas' disease myocardial damage severity detection, old musical scores clef classification, and action recognition using 3D accelerometer data from a wearable device. The results are promising and this paper opens a line of research that deserves further attention.
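The sketch below gives a heavily simplified flavor of such a geometry-based additive ensemble: midpoints of nearby opposite-class pairs stand in for the characterizing boundary points, a regularized linear classifier is fitted in the neighborhood of each, and their decision functions are averaged into one additive rule. It is an approximation for illustration only; the robustness notion, the Tikhonov-regularized assembly, and the lambda-smoothing of the original method are not reproduced, and all parameters are assumptions.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.linear_model import RidgeClassifier

# Simplified sketch of a geometry-based additive ensemble: pick pairs of nearby
# points with opposite labels, treat their midpoints as "boundary points", fit a
# regularized linear classifier around each, and sum the decision functions.
X, y = make_moons(n_samples=300, noise=0.2, random_state=0)

def boundary_midpoints(X, y, k=10):
    pos, neg = X[y == 1], X[y == 0]
    d = np.linalg.norm(pos[:, None, :] - neg[None, :, :], axis=2)
    idx = np.dstack(np.unravel_index(np.argsort(d, axis=None)[:k], d.shape))[0]
    return [(pos[i] + neg[j]) / 2.0 for i, j in idx]  # midpoints near the boundary

def fit_local_models(X, y, centers, radius=0.7):
    models = []
    for c in centers:
        mask = np.linalg.norm(X - c, axis=1) < radius
        if len(np.unique(y[mask])) == 2:               # need both classes locally
            models.append(RidgeClassifier(alpha=1.0).fit(X[mask], y[mask]))
    return models

centers = boundary_midpoints(X, y)
models = fit_local_models(X, y, centers)

# Additive decision rule: average the signed linear scores of the local models.
scores = np.mean([m.decision_function(X) for m in models], axis=0)
accuracy = np.mean((scores > 0).astype(int) == y)
print(f"local models: {len(models)}, training accuracy: {accuracy:.2f}")
```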
Modeling Temporal Crowd Work Quality with Limited Supervision
2015-11-11
Keywords: crowdsourcing, human computation, prediction, uncertainty-aware learning, time-series modeling. While crowdsourcing offers a cost...individual correctness. As discussed earlier, such a strategy is difficult to employ in a live setting because it is unrealistic to assume that all...et al. 2014). Finally, there are interesting opportunities to investigate at the intersection of live task-routing with active-learning techniques
Asadi, Hamed; Kok, Hong Kuan; Looby, Seamus; Brennan, Paul; O'Hare, Alan; Thornton, John
2016-12-01
To identify factors influencing outcome in brain arteriovenous malformations (BAVM) treated with endovascular embolization. We also assessed the feasibility of using machine learning techniques to prognosticate and predict outcome and compared this with conventional statistical analyses. A retrospective study of patients undergoing endovascular treatment of BAVM during a 22-year period in a national neuroscience center was performed. Clinical presentation, imaging, procedural details, complications, and outcome were recorded. The data were analyzed with artificial intelligence techniques to identify predictors of outcome and assess accuracy in predicting clinical outcome at final follow-up. One hundred ninety-nine patients underwent treatment for BAVM, with a mean follow-up duration of 63 months. The commonest clinical presentation was intracranial hemorrhage (56%). During the follow-up period, there were 51 further hemorrhagic events, comprising spontaneous hemorrhage (n = 27) and procedure-related hemorrhage (n = 24). All spontaneous events occurred in previously embolized BAVMs remote from the procedure. Complications included ischemic stroke in 10%, symptomatic hemorrhage in 9.8%, and a mortality rate of 4.7%. The standard regression analysis model had an accuracy of 43% in predicting final outcome (mortality), with the type of treatment complication identified as the most important predictor. The machine learning model showed superior accuracy of 97.5% in predicting outcome and identified the presence or absence of nidal fistulae as the most important factor. BAVMs can be treated successfully by endovascular techniques, alone or combined with surgery and radiosurgery, with an acceptable risk profile. Machine learning techniques can predict final outcome with greater accuracy and may help individualize treatment based on key predicting factors. Copyright © 2016 Elsevier Inc. All rights reserved.
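For readers unfamiliar with the comparison being made, the sketch below contrasts a conventional logistic regression with a nonlinear ensemble model under cross-validation on a synthetic stand-in dataset of the same size. It is illustrative only: the clinical variables, the paper's actual machine learning model, and the reported accuracies are not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the clinical dataset: 199 "patients", binary outcome.
X, y = make_classification(n_samples=199, n_features=12, n_informative=5,
                           random_state=42)

# Conventional model vs. a nonlinear ensemble, compared with 5-fold cross-validation.
logit = LogisticRegression(max_iter=1000)
forest = RandomForestClassifier(n_estimators=200, random_state=42)

print("logistic regression:", cross_val_score(logit, X, y, cv=5).mean().round(3))
print("random forest      :", cross_val_score(forest, X, y, cv=5).mean().round(3))

# Feature importances give a first look at the strongest predictors of outcome.
forest.fit(X, y)
print("top predictor index:", int(np.argmax(forest.feature_importances_)))
```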
Cascade Error Projection: A Learning Algorithm for Hardware Implementation
NASA Technical Reports Server (NTRS)
Duong, Tuan A.; Daud, Taher
1996-01-01
In this paper, we work out a detailed mathematical analysis for a new learning algorithm termed Cascade Error Projection (CEP) and a general learning framework. This framework can be used to obtain the cascade correlation learning algorithm by choosing a particular set of parameters. Furthermore, the CEP learning algorithm operates on only one layer, whereas the other set of weights can be calculated deterministically. In association with the dynamical stepsize change concept to convert the weight update from an infinite space into a finite space, the relation between the current stepsize and the previous energy level is also given, and the estimation procedure for the optimal stepsize is used for validation of our proposed technique. Weight values of zero are used to start the learning for every layer, and a single hidden unit is applied instead of a pool of candidate hidden units as in the cascade correlation scheme. Therefore, simplicity in hardware implementation is also obtained. Furthermore, this analysis allows us to select from other methods (such as conjugate gradient descent or Newton's second-order method) one which will be a good candidate for the learning technique. The choice of learning technique depends on the constraints of the problem (e.g., speed, performance, and hardware implementation); one technique may be more suitable than others. Moreover, for a discrete weight space, the theoretical analysis presents the capability of learning with limited weight quantization. Finally, 5- to 8-bit parity and chaotic time series prediction problems are investigated; the simulation results demonstrate that 4-bit or more weight quantization is sufficient for learning a neural network using CEP. In addition, it is demonstrated that this technique is able to compensate for lower-bit weight resolution by incorporating additional hidden units. However, generalization results may suffer somewhat with lower-bit weight quantization.
The training and learning process of transseptal puncture using a modified technique.
Yao, Yan; Ding, Ligang; Chen, Wensheng; Guo, Jun; Bao, Jingru; Shi, Rui; Huang, Wen; Zhang, Shu; Wong, Tom
2013-12-01
As the transseptal (TS) puncture has become an integral part of many types of cardiac interventional procedures, its technique, initially reported for measurement of left atrial pressure in the 1950s, continues to evolve. Our laboratory adopted a modified technique which uses only the coronary sinus catheter as the landmark to accomplish TS punctures under fluoroscopy. The aim of this study was to prospectively evaluate the training and learning process for TS puncture guided by this modified technique. Guided by the training protocol, TS puncture was performed in 120 consecutive patients by three trainees without previous personal experience in TS catheterization and one experienced trainer as a controller. We analysed the following parameters: one-puncture success rate, total procedure time, fluoroscopic time, and radiation dose. The learning curve was analysed using curve-fitting methodology. The first attempt at TS crossing was successful in 74 patients (82%), a second attempt was successful in 11 (12%), and in 5 patients the interatrial septum could not be punctured. The average starting process time was 4.1 ± 0.8 min, and the estimated mean learning plateau was 1.2 ± 0.2 min. The estimated mean learning rate for process time was 25 ± 3 cases. Important aspects of the learning curve can be estimated by fitting inverse curves for TS puncture. The study demonstrated that this technique is a simple, safe, economical, and effective approach for learning TS puncture. Based on the statistical analysis, approximately 29 TS punctures will be needed for a trainee to pass the steepest area of the learning curve.
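The inverse-curve fitting mentioned above can be sketched as follows: procedure time is modeled as a plateau plus a term that decays with case number, and the parameters are estimated by nonlinear least squares. The synthetic data, the exact functional form, and the 10% criterion for "reaching" the plateau are assumptions for illustration, not the study's analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Inverse learning-curve model: time(case) = plateau + b / case_number.
def inverse_curve(case, plateau, b):
    return plateau + b / case

# Synthetic procedure times (minutes) drifting down toward a plateau.
cases = np.arange(1, 41)
rng = np.random.default_rng(1)
times = inverse_curve(cases, plateau=1.2, b=3.0) + rng.normal(0, 0.2, cases.size)

params, _ = curve_fit(inverse_curve, cases, times, p0=(1.0, 1.0))
plateau, b = params
print(f"estimated plateau: {plateau:.2f} min, learning parameter b: {b:.2f}")

# A simple read-off of the learning rate: first case where the fitted curve is
# within 10% of the estimated plateau.
fitted = inverse_curve(cases, *params)
learned_at = int(cases[np.argmax(fitted <= 1.1 * plateau)])
print(f"within 10% of plateau by case {learned_at}")
```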
Peer Learning in a MATLAB Programming Course
NASA Astrophysics Data System (ADS)
Reckinger, Shanon
2016-11-01
Three forms of research-based peer learning were implemented in the design of a MATLAB programming course for mechanical engineering undergraduate students. First, a peer learning program was initiated. These undergraduate peer learning leaders played two roles in the course: (I) they were in the classroom helping students with their work, and (II) they led optional two-hour help sessions outside of class time. The second form of peer learning was implemented through the inclusion of a peer discussion period following in-class clicker quizzes. The third form of peer learning had the students creating video project assignments and posting them on YouTube to explain course topics to their peers. Several other more informal techniques were used to encourage peer learning. Student feedback in the form of both instructor-designed survey responses and formal course evaluations (quantitative and narrative) will be presented. Finally, effectiveness will be measured by formal assessment, both direct and indirect, of these peer learning methods. This will include both academic data/grades and pre/post test scores. Overall, the course design and its inclusion of these peer learning techniques demonstrate effectiveness.
A lightweight network anomaly detection technique
Kim, Jinoh; Yoo, Wucherl; Sim, Alex; ...
2017-03-13
While the network anomaly detection is essential in network operations and management, it becomes further challenging to perform the first line of detection against the exponentially increasing volume of network traffic. In this paper, we develop a technique for the first line of online anomaly detection with two important considerations: (i) availability of traffic attributes during the monitoring time, and (ii) computational scalability for streaming data. The presented learning technique is lightweight and highly scalable with the beauty of approximation based on the grid partitioning of the given dimensional space. With the public traffic traces of KDD Cup 1999 and NSL-KDD, we show that our technique yields 98.5% and 83% of detection accuracy, respectively, only with a couple of readily available traffic attributes that can be obtained without the help of post-processing. Finally, the results are at least comparable with the classical learning methods including decision tree and random forest, with approximately two orders of magnitude faster learning performance.
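A minimal sketch of the grid-partitioning idea is given below: each traffic attribute is discretized into fixed bins, the cells occupied by (assumed mostly normal) training traffic are counted, and records falling into rarely seen cells are flagged. This is a simplified illustration, not the authors' algorithm; the two attributes, bin count, and rarity threshold are assumptions.

```python
import numpy as np
from collections import Counter

# Grid-partitioning sketch: discretize each traffic attribute into fixed bins,
# remember which cells normal traffic occupies, and flag records in rare cells.
class GridAnomalyDetector:
    def __init__(self, bins=10):
        self.bins = bins
        self.edges = None
        self.cell_counts = Counter()

    def fit(self, X):
        # Per-attribute bin edges over the observed (assumed mostly normal) traffic.
        self.edges = [np.linspace(col.min(), col.max(), self.bins + 1)[1:-1]
                      for col in X.T]
        for row in X:
            self.cell_counts[self._cell(row)] += 1
        return self

    def _cell(self, row):
        return tuple(int(np.digitize(v, e)) for v, e in zip(row, self.edges))

    def predict(self, X, min_count=2):
        # Anomalous = lands in a grid cell rarely (or never) seen during training.
        return np.array([self.cell_counts[self._cell(r)] < min_count for r in X])

# Usage with two readily available attributes (e.g., byte count and flow duration).
rng = np.random.default_rng(0)
normal = rng.normal(loc=[500, 1.0], scale=[50, 0.2], size=(2000, 2))
attack = rng.normal(loc=[5000, 0.01], scale=[100, 0.005], size=(20, 2))
detector = GridAnomalyDetector(bins=12).fit(normal)
print("fraction of attacks flagged:", detector.predict(attack).mean())
```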
ERIC Educational Resources Information Center
Overlock, Terrence H., Sr.
To determine the effect of collaborative learning methods on the success rate of physics students at Northern Maine Technical College (NMTC), a study was undertaken to compare the mean final exam scores of a students in a physics course taught by traditional lecture/lab methods to those in a group taught by collaborative techniques. The…
Contemporary machine learning: techniques for practitioners in the physical sciences
NASA Astrophysics Data System (ADS)
Spears, Brian
2017-10-01
Machine learning is the science of using computers to find relationships in data without explicitly knowing or programming those relationships in advance. Often without realizing it, we employ machine learning every day as we use our phones or drive our cars. Over the last few years, machine learning has found increasingly broad application in the physical sciences. This most often involves building a model relationship between a dependent, measurable output and an associated set of controllable, but complicated, independent inputs. The methods are applicable both to experimental observations and to databases of simulated output from large, detailed numerical simulations. In this tutorial, we will present an overview of current tools and techniques in machine learning - a jumping-off point for researchers interested in using machine learning to advance their work. We will discuss supervised learning techniques for modeling complicated functions, beginning with familiar regression schemes, then advancing to more sophisticated decision trees, modern neural networks, and deep learning methods. Next, we will cover unsupervised learning and techniques for reducing the dimensionality of input spaces and for clustering data. We'll show example applications from both magnetic and inertial confinement fusion. Along the way, we will describe methods for practitioners to help ensure that their models generalize from their training data to as-yet-unseen test data. We will finally point out some limitations to modern machine learning and speculate on some ways that practitioners from the physical sciences may be particularly suited to help. This work was performed by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
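As a concrete companion to the tutorial's two themes, the sketch below shows (i) a supervised regression fitted and checked on a held-out split, and (ii) an unsupervised pipeline that reduces dimensionality and clusters the result. The synthetic data and all model choices are assumptions, intended only to show the workflow.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# --- Supervised learning: model a complicated input-output relationship. ---
X = rng.uniform(-2, 2, size=(500, 3))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=500)

# A held-out test split guards against merely memorizing the training data.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))

# --- Unsupervised learning: reduce dimensionality, then cluster. ---
Z = PCA(n_components=2).fit_transform(X)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
print("cluster sizes:", np.bincount(clusters))
```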
[Therapeutic education didactic techniques].
Valverde, Maite; Vidal, Mercè; Jansa, Margarida
2012-10-01
This article includes an introduction to the role of Therapeutic Education in diabetes treatment according to the recommendations of the American Diabetes Association (ADA), the Diabetes Education Study Group (DESG) of the European Association for the Study of Diabetes (EASD), and the Clinical Practice Guidelines (CPG) of the Spanish Ministry of Health. We analyze theoretical models and the differences between teaching and learning, as well as current trends (including the Internet) that can facilitate meaningful learning by people with diabetes and their families and relatives. We analyze the differences, similarities, advantages and disadvantages of individual and group education. Finally, we describe different educational techniques (metaplan, case method, brainstorming, role playing, games, seminars, autobiography, forums, chats, ...) applicable to individual, group or virtual education and their application depending on the learning objective.
NASA Astrophysics Data System (ADS)
Ghasem, Nayef
2016-07-01
This paper illustrates a teaching technique used in computer applications in chemical engineering employed for designing various unit operation processes, where the students learn about unit operations by designing them. The aim of the course is not to teach design, but rather to teach the fundamentals and the function of unit operation processes through simulators. A case study presenting the teaching method was evaluated using student surveys and faculty assessments, which were designed to measure the quality and effectiveness of the teaching method. The results of the questionnaire conclusively demonstrate that this method is an extremely efficient way of teaching a simulator-based course. In addition to that, this teaching method can easily be generalised and used in other courses. A student's final mark is determined by a combination of in-class assessments conducted based on cooperative and peer learning, progress tests and a final exam. Results revealed that peer learning can improve the overall quality of student learning and enhance student understanding.
NASA Technical Reports Server (NTRS)
Jani, Yashvant
1993-01-01
As part of the RICIS project, the reinforcement learning techniques developed at Ames Research Center are being applied to proximity and docking operations using the Shuttle and Solar Maximum Mission (SMM) satellite simulation. In utilizing these fuzzy learning techniques, we use the Approximate Reasoning based Intelligent Control (ARIC) architecture, and so we use these two terms interchangeably to imply the same. This activity is carried out in the Software Technology Laboratory utilizing the Orbital Operations Simulator (OOS) and programming/testing support from other contractor personnel. This report is the final deliverable D4 in our milestones and project activity. It provides the test results for the special test case of the approach/docking scenario for the Shuttle and SMM satellite. Based on our experience and analysis with the attitude and translational controllers, we have modified the basic configuration of the reinforcement learning algorithm in ARIC. The Shuttle translational controller and its implementation in ARIC are described in our deliverable D3. In order to simulate the final approach and docking operations, we have set up this special test case as described in section 2. The ARIC performance results for these operations are discussed in section 3, and conclusions are provided in section 4 along with the summary for the project.
Using cooperative learning for a drug information assignment.
Earl, Grace L
2009-11-12
To implement a cooperative learning activity to engage students in analyzing tertiary drug information resources in a literature evaluation course. The class was divided into 4 sections to form expert groups and each group researched a different set of references using the jigsaw technique. Each member of each expert group was reassigned to a jigsaw group so that each new group was composed of 4 students from 4 different expert groups. The jigsaw groups met to discuss search strategies and rate the usefulness of the references. In addition to group-based learning, teaching methods included students' writing an independent research paper to enhance their abilities to search and analyze drug information resources. The assignment and final course grades improved after implementation of the activity. Students agreed that class discussions were a useful learning experience and 75% (77/102) said they would use the drug information references for other courses. The jigsaw technique was successful in engaging students in cooperative learning to improve critical thinking skills regarding drug information.
Introducing 12 year-olds to elementary particles
NASA Astrophysics Data System (ADS)
Wiener, Gerfried J.; Schmeling, Sascha M.; Hopf, Martin
2017-07-01
We present a new learning unit, which introduces 12 year-olds to the subatomic structure of matter. The learning unit was iteratively developed as a design-based research project using the technique of probing acceptance. We give a brief overview of the unit’s final version, discuss its key ideas and main concepts, and conclude by highlighting the main implications of our research, which we consider to be most promising for use in the physics classroom.
Imaging nanoscale lattice variations by machine learning of x-ray diffraction microscopy data
Laanait, Nouamane; Zhang, Zhan; Schlepütz, Christian M.
2016-08-09
In this paper, we present a novel methodology based on machine learning to extract lattice variations in crystalline materials, at the nanoscale, from an x-ray Bragg diffraction-based imaging technique. By employing a full-field microscopy setup, we capture real space images of materials, with imaging contrast determined solely by the x-ray diffracted signal. The data sets that emanate from this imaging technique are a hybrid of real space information (image spatial support) and reciprocal lattice space information (image contrast), and are intrinsically multidimensional (5D). By a judicious application of established unsupervised machine learning techniques and multivariate analysis to this multidimensional data cube, we show how to extract features that can be ascribed physical interpretations in terms of common structural distortions, such as lattice tilts and dislocation arrays. Finally, we demonstrate this 'big data' approach to x-ray diffraction microscopy by identifying structural defects present in an epitaxial ferroelectric thin-film of lead zirconate titanate.
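The general pattern behind this kind of analysis, unfolding a multidimensional image stack so that each pixel becomes a sample described by its diffraction response and then applying unsupervised decomposition and clustering, can be sketched as follows. The synthetic array, component count, and cluster count are assumptions; this is not the authors' pipeline or data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Synthetic stand-in for a diffraction-microscopy data set: a stack of real-space
# images (64x64 pixels) recorded at 20 points in reciprocal space.
rng = np.random.default_rng(0)
stack = rng.normal(size=(20, 64, 64))
stack[:, 20:40, 20:40] += np.linspace(0, 2, 20)[:, None, None]  # a "defect" region

# Unfold: each pixel becomes a sample; its 20-point diffraction response is the feature vector.
n_q, ny, nx = stack.shape
pixels = stack.reshape(n_q, ny * nx).T            # shape (4096, 20)

# Unsupervised analysis: compress the responses, then group similar pixels.
components = PCA(n_components=3).fit_transform(pixels)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(components)
label_map = labels.reshape(ny, nx)                # spatial map of structural classes
print("pixels per class:", np.bincount(labels))
```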
Accurate monitoring leads to effective control and greater learning of patient education materials.
Rawson, Katherine A; O'Neil, Rochelle; Dunlosky, John
2011-09-01
Effective management of chronic diseases (e.g., diabetes) can depend on the extent to which patients can learn and remember disease-relevant information. In two experiments, we explored a technique motivated by theories of self-regulated learning for improving people's learning of information relevant to managing a chronic disease. Materials were passages from patient education booklets on diabetes from NIDDK. Session 1 included an initial study trial, Session 2 included self-regulated restudy, and Session 3 included a final memory test. The key manipulation concerned the kind of support provided for self-regulated learning during Session 2. In Experiment 1, participants either were prompted to self-test and then evaluate their learning before selecting passages to restudy, were shown the prompt questions but did not overtly self-test or evaluate learning prior to selecting passages, or were not shown any prompts and were simply given the menu for selecting passages to restudy. Participants who self-tested and evaluated learning during Session 2 had a small but significant advantage over the other groups on the final test. Secondary analyses provided evidence that the performance advantage may have been modest because of inaccurate monitoring. Experiment 2 included a group who also self-tested but who evaluated their learning using idea-unit judgments (i.e., by checking their responses against a list of key ideas from the correct response). Participants who self-tested and made idea-unit judgments exhibited a sizable advantage on final test performance. Secondary analyses indicated that the performance advantage was attributable in part to more accurate monitoring and more effective self-regulated learning. An important practical implication is that learning of patient education materials can be enhanced by including appropriate support for learners' self-regulatory processes. (c) 2011 APA, all rights reserved.
Learning New Basic Movements for Robotics
NASA Astrophysics Data System (ADS)
Kober, Jens; Peters, Jan
Obtaining novel skills is one of the most important problems in robotics. Machine learning techniques may be a promising approach for automatic and autonomous acquisition of movement policies. However, this requires both an appropriate policy representation and suitable learning algorithms. Employing the most recent form of the dynamical systems motor primitives originally introduced by Ijspeert et al. [1], we show how both discrete and rhythmic tasks can be learned using a concerted approach of both imitation and reinforcement learning, and present our current best performing learning algorithms. Finally, we show that it is possible to include a start-up phase in rhythmic primitives. We apply our approach to two elementary movements, i.e., Ball-in-a-Cup and Ball-Paddling, which can be learned on a real Barrett WAM robot arm at a pace similar to human learning.
Silicon photonics for neuromorphic information processing
NASA Astrophysics Data System (ADS)
Bienstman, Peter; Dambre, Joni; Katumba, Andrew; Freiberger, Matthias; Laporte, Floris; Lugnan, Alessio
2018-02-01
We present our latest results on silicon photonics neuromorphic information processing based, among others, on techniques like reservoir computing. We will discuss aspects like scalability, novel architectures for enhanced power efficiency, as well as all-optical readout. Additionally, we will touch upon new machine learning techniques to operate these integrated readouts. Finally, we will show how these systems can be used for high-speed, low-power information processing for applications like recognition of biological cells.
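For readers new to reservoir computing, the sketch below shows the idea in software form: a fixed random recurrent "reservoir" transforms the input, and only a linear readout is trained (here by ridge regression) on a short memory task. The reservoir size, spectral radius, task, and regularization are assumptions, and the photonic hardware itself is not modeled.

```python
import numpy as np

# Minimal echo-state-style reservoir: fixed random recurrent dynamics; only the
# linear readout is trained, mirroring the reservoir-computing idea used in
# photonic implementations.
rng = np.random.default_rng(0)
n_res, n_steps = 100, 2000

W_in = rng.uniform(-0.5, 0.5, size=(n_res, 1))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

u = rng.uniform(-1, 1, size=n_steps)              # input signal
target = np.roll(u, 3)                            # task: recall the input 3 steps back

states = np.zeros((n_steps, n_res))
x = np.zeros(n_res)
for t in range(n_steps):
    x = np.tanh(W @ x + W_in[:, 0] * u[t])        # untrained reservoir update
    states[t] = x

# Train the readout with ridge regression on the collected reservoir states.
ridge = 1e-4
readout = np.linalg.solve(states.T @ states + ridge * np.eye(n_res), states.T @ target)
prediction = states @ readout
print("readout correlation:", np.corrcoef(prediction[100:], target[100:])[0, 1].round(3))
```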
Ye, Qing; Pan, Hao; Liu, Changhua
2015-01-01
This research proposes a novel framework for final drive simultaneous failure diagnosis comprising feature extraction, training of paired diagnostic models, generation of a decision threshold, and recognition of simultaneous failure modes. In the feature extraction module, wavelet packet transform and fuzzy entropy are adopted to reduce noise interference and extract representative features of each failure mode. Single-failure samples are used to construct probability classifiers based on a paired sparse Bayesian extreme learning machine, which is trained only on single failure modes and inherits the high generalization and sparsity of the sparse Bayesian learning approach. To generate the optimal decision threshold, which converts the probability outputs obtained from the classifiers into final simultaneous failure modes, this research proposes using samples containing both single and simultaneous failure modes together with a grid search method, which is superior to traditional techniques in global optimization. Compared with other frequently used diagnostic approaches based on support vector machines and probabilistic neural networks, experimental results based on the F1-measure verify that the diagnostic accuracy and efficiency of the proposed framework, which are crucial for simultaneous failure diagnosis, are superior to those of the existing approaches. PMID:25722717
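The threshold-generation step described above can be sketched as follows: per-mode probability outputs are binarized with a single threshold, and the threshold is chosen by grid search to maximize a multi-label F1 score. The synthetic probabilities and the micro-averaged F1 criterion are assumptions; the paper's paired sparse Bayesian extreme learning machine classifiers are not reproduced.

```python
import numpy as np
from sklearn.metrics import f1_score

# Sketch: per-mode classifiers emit probabilities; a single decision threshold,
# chosen by grid search, converts them into predicted (possibly simultaneous) failure modes.
rng = np.random.default_rng(0)
n_samples, n_modes = 300, 4

true_modes = rng.integers(0, 2, size=(n_samples, n_modes))      # multi-label ground truth
# Synthetic probability outputs: informative but noisy.
probs = np.clip(true_modes * 0.7 + rng.uniform(0, 0.5, size=true_modes.shape), 0, 1)

def evaluate(threshold):
    predicted = (probs >= threshold).astype(int)
    return f1_score(true_modes, predicted, average="micro")

# Grid search over candidate thresholds for the best micro-averaged F1.
grid = np.linspace(0.05, 0.95, 19)
best = max(grid, key=evaluate)
print(f"best threshold: {best:.2f}, F1: {evaluate(best):.3f}")
```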
310 Individualized Teacher Practicum. Final Report, 1979-80.
ERIC Educational Resources Information Center
Barabe, Rosemeri, Comp.; And Others
Objectives and program descriptions are presented for the Scottsdale Adult Learning Center (Arizona) which in 1979-80 conducted a number of practicums for adult educators on individualized techniques, Adult Basic Education (ABE), High School Equivalency (GED), and English as a Second Language (ESL). First described is a paid internship program for…
Production Techniques for Computer-Based Learning Material.
ERIC Educational Resources Information Center
Moonen, Jef; Schoenmaker, Jan
Experiences in the development of educational software in the Netherlands have included the use of individual and team approaches, the determination of software content and how it should be presented, and the organization of the entire development process, from experimental programs to prototype to final product. Because educational software is a…
Behavioral Science Elementary Teacher Education Program. Final Report. Volume II.
ERIC Educational Resources Information Center
Michigan State Univ., East Lansing.
Volume II separately details two model components: Scholarly Modes of Knowledge, in which problem-solving techniques and the applicability of subject content to teaching are emphasized, and Professional Use of Knowledge, in which the student translates what he knows about various disciplines and human learning into instructional strategies. Each…
Johnson, Jason K.; Oyen, Diane Adele; Chertkov, Michael; ...
2016-12-01
Inference and learning of graphical models are both well-studied problems in statistics and machine learning that have found many applications in science and engineering. However, exact inference is intractable in general graphical models, which suggests the problem of seeking the best approximation to a collection of random variables within some tractable family of graphical models. In this paper, we focus on the class of planar Ising models, for which exact inference is tractable using techniques of statistical physics. Based on these techniques and recent methods for planarity testing and planar embedding, we propose a greedy algorithm for learning the best planar Ising model to approximate an arbitrary collection of binary random variables (possibly from sample data). Given the set of all pairwise correlations among variables, we select a planar graph and optimal planar Ising model defined on this graph to best approximate that set of correlations. Finally, we demonstrate our method in simulations and for two applications: modeling senate voting records and identifying geo-chemical depth trends from Mars rover data.
Dipnall, Joanna F.
2016-01-01
Background Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. Methods The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted regression algorithm and logistic regression, to identify key biomarkers associated with depression in the National Health and Nutrition Examination Study (2009–2010). Depression was measured using the Patient Health Questionnaire-9 and 67 biomarkers were analysed. Covariates in this study included gender, age, race, smoking, food security, Poverty Income Ratio, Body Mass Index, physical activity, alcohol use, medical conditions and medications. The final imputed weighted multiple logistic regression model included possible confounders and moderators. Results After the creation of 20 imputation data sets from multiple chained regression sequences, machine learning boosted regression initially identified 21 biomarkers associated with depression. Using traditional logistic regression methods, including controlling for possible confounders and moderators, a final set of three biomarkers were selected. The final three biomarkers from the novel hybrid variable selection methodology were red cell distribution width (OR 1.15; 95% CI 1.01, 1.30), serum glucose (OR 1.01; 95% CI 1.00, 1.01) and total bilirubin (OR 0.12; 95% CI 0.05, 0.28). Significant interactions were found between total bilirubin with Mexican American/Hispanic group (p = 0.016), and current smokers (p<0.001). Conclusion The systematic use of a hybrid methodology for variable selection, fusing data mining techniques using a machine learning algorithm with traditional statistical modelling, accounted for missing data and complex survey sampling methodology and was demonstrated to be a useful tool for detecting three biomarkers associated with depression for future hypothesis generation: red cell distribution width, serum glucose and total bilirubin. PMID:26848571
Dipnall, Joanna F; Pasco, Julie A; Berk, Michael; Williams, Lana J; Dodd, Seetal; Jacka, Felice N; Meyer, Denny
2016-01-01
Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted regression algorithm and logistic regression, to identify key biomarkers associated with depression in the National Health and Nutrition Examination Study (2009-2010). Depression was measured using the Patient Health Questionnaire-9 and 67 biomarkers were analysed. Covariates in this study included gender, age, race, smoking, food security, Poverty Income Ratio, Body Mass Index, physical activity, alcohol use, medical conditions and medications. The final imputed weighted multiple logistic regression model included possible confounders and moderators. After the creation of 20 imputation data sets from multiple chained regression sequences, machine learning boosted regression initially identified 21 biomarkers associated with depression. Using traditional logistic regression methods, including controlling for possible confounders and moderators, a final set of three biomarkers were selected. The final three biomarkers from the novel hybrid variable selection methodology were red cell distribution width (OR 1.15; 95% CI 1.01, 1.30), serum glucose (OR 1.01; 95% CI 1.00, 1.01) and total bilirubin (OR 0.12; 95% CI 0.05, 0.28). Significant interactions were found between total bilirubin with Mexican American/Hispanic group (p = 0.016), and current smokers (p<0.001). The systematic use of a hybrid methodology for variable selection, fusing data mining techniques using a machine learning algorithm with traditional statistical modelling, accounted for missing data and complex survey sampling methodology and was demonstrated to be a useful tool for detecting three biomarkers associated with depression for future hypothesis generation: red cell distribution width, serum glucose and total bilirubin.
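An illustrative, hedged sketch of a three-step hybrid pipeline of the kind described (imputation, boosted screening of many biomarkers, then a small interpretable logistic model). Column names and data are synthetic placeholders; survey weights, multiple imputation chains, and the NHANES design are not modeled here.

```python
# Hybrid variable-selection sketch: impute -> boosted screening -> logistic regression.
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = pd.DataFrame(rng.normal(size=(500, 20)),
                 columns=[f"biomarker_{i}" for i in range(20)])
X = X.mask(rng.random(X.shape) < 0.05)             # inject missingness
y = rng.integers(0, 2, size=500)                   # stand-in depression indicator

X_imp = SimpleImputer(strategy="mean").fit_transform(X)           # step 1: (single) imputation

gbm = GradientBoostingClassifier(random_state=0).fit(X_imp, y)    # step 2: boosted screening
top = np.argsort(gbm.feature_importances_)[::-1][:3]              # keep the strongest markers

logit = LogisticRegression(max_iter=1000).fit(X_imp[:, top], y)   # step 3: interpretable model
print("screened columns:", list(X.columns[top]))
print("logistic coefficients:", logit.coef_.round(3))
```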
NASA Astrophysics Data System (ADS)
Alsadoon, Abeer; Prasad, P. W. C.; Beg, Azam
2017-09-01
Helping students understand the theoretical concepts of digital logic design is one of the major challenges faced by academics; teachers have therefore tried different techniques to link theoretical information to practical knowledge. The use of software simulation is a technique for learning and practice that can be applied to many different disciplines. Experimenting with different computer hardware components and integrated circuits through simulators enhances student learning. The simulators can be rather simplistic or quite complex. This paper reports our evaluation of different simulators available for use in higher education institutions. We also describe our experience of incorporating selected tools in teaching introductory courses in computer systems. We assessed the effectiveness of incorporating the simulators into computer systems courses using a student survey and final grade results.
Deep Learning in Nuclear Medicine and Molecular Imaging: Current Perspectives and Future Directions.
Choi, Hongyoon
2018-04-01
Recent advances in deep learning have impacted various scientific and industrial fields. Due to the rapid application of deep learning in biomedical data, molecular imaging has also started to adopt this technique. In this regard, it is expected that deep learning will potentially affect the roles of molecular imaging experts as well as clinical decision making. This review firstly offers a basic overview of deep learning particularly for image data analysis to give knowledge to nuclear medicine physicians and researchers. Because of the unique characteristics and distinctive aims of various types of molecular imaging, deep learning applications can be different from other fields. In this context, the review deals with current perspectives of deep learning in molecular imaging particularly in terms of development of biomarkers. Finally, future challenges of deep learning application for molecular imaging and future roles of experts in molecular imaging will be discussed.
Laursen, Jannie
2014-01-01
Background. When implementing a new surgical technique, the best method for didactic learning has not been settled. There are basically two scenarios: the trainee goes to the teacher's clinic and learns the new technique hands-on, or the teacher goes to the trainee's clinic and performs the teaching there. Methods. An informal literature review was conducted to provide a basis for discussing pros and cons. We also wanted to discuss how many surgeons can be trained in a day and the importance of the demand for a new surgical procedure to ensure a high adoption rate, and finally to apply these issues to a discussion of barriers to adoption of the new ONSTEP technique for inguinal hernia repair after initial training. Results and Conclusions. The optimal training method would include moving the teacher to the trainee's department to obtain team-training effects simultaneously with surgical technical training of the trainee surgeon. The training should also include a theoretical presentation and discussion along with the practical training. Importantly, the training visit should probably be followed by a scheduled visit to clear up misunderstandings and fine-tune the technique after an initial self-learning period. PMID:25506078
Applying machine learning classification techniques to automate sky object cataloguing
NASA Astrophysics Data System (ADS)
Fayyad, Usama M.; Doyle, Richard J.; Weir, W. Nick; Djorgovski, Stanislav
1993-08-01
We describe the application of Artificial Intelligence machine learning techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Mt. Palomar Northern Sky Survey is nearly completed. This survey provides comprehensive coverage of the northern celestial hemisphere in the form of photographic plates. The plates are being transformed into digitized images whose quality will probably not be surpassed in the next ten to twenty years. The images are expected to contain on the order of 10^7 galaxies and 10^8 stars. Astronomers wish to determine which of these sky objects belong to various classes of galaxies and stars. Unfortunately, the size of this data set precludes analysis in an exclusively manual fashion. Our approach is to develop a software system which integrates the functions of independently developed techniques for image processing and data classification. Digitized sky images are passed through image processing routines to identify sky objects and to extract a set of features for each object. These routines are used to help select a useful set of attributes for classifying sky objects. Then GID3 (Generalized ID3) and O-B Tree, two inductive learning techniques, learn classification decision trees from examples. These classifiers will then be applied to new data. This development process is highly interactive, with astronomer input playing a vital role. Astronomers refine the feature set used to construct sky object descriptions, and evaluate the performance of the automated classification technique on new data. This paper gives an overview of the machine learning techniques with an emphasis on their general applicability, describes the details of our specific application, and reports the initial encouraging results. The results indicate that our machine learning approach is well-suited to the problem. The primary benefit of the approach is increased data reduction throughput. Another benefit is consistency of classification. The classification rules which are the product of the inductive learning techniques will form an objective, examinable basis for classifying sky objects. A final, not to be underestimated benefit is that astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems based on automatically catalogued data.
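A minimal, hedged sketch of the classification stage only: a decision tree is learned from per-object features. Feature values and labels are synthetic toys; the actual pipeline extracts attributes from digitized plates and uses GID3/O-B Tree rather than the CART-style tree shown here.

```python
# Decision-tree classification of star vs. galaxy from object features (illustrative only).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 1000
features = np.column_stack([
    rng.normal(18, 2, n),        # stand-in magnitude
    rng.random(n),               # stand-in ellipticity
    rng.normal(3, 1, n),         # stand-in image area / PSF ratio
])
labels = (features[:, 2] + 0.3 * rng.normal(size=n) > 3).astype(int)  # toy rule: 1 = galaxy, 0 = star

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print(f"holdout accuracy: {tree.score(X_te, y_te):.3f}")
```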
NASA Astrophysics Data System (ADS)
Taylor, Stephen R.; Simon, Joseph; Sampson, Laura
2017-01-01
The final parsec of supermassive black-hole binary evolution is subject to the complex interplay of stellar loss-cone scattering, circumbinary disk accretion, and gravitational-wave emission, with binary eccentricity affected by all of these. The strain spectrum of gravitational-waves in the pulsar-timing band thus encodes rich information about the binary population's response to these various environmental mechanisms. Current spectral models have heretofore followed basic analytic prescriptions, and attempt to investigate these final-parsec mechanisms in an indirect fashion. Here we describe a new technique to directly probe the environmental properties of supermassive black-hole binaries through "Bayesian model-emulation". We perform black-hole binary population synthesis simulations at a restricted set of environmental parameter combinations, compute the strain spectra from these, then train a Gaussian process to learn the shape of the spectrum at any point in parameter space. We describe this technique, demonstrate its efficacy with a program of simulated datasets, then illustrate its power by directly constraining final-parsec physics in a Bayesian analysis of the NANOGrav 5-year dataset. The technique is fast, flexible, and robust.
NASA Astrophysics Data System (ADS)
Taylor, Stephen; Simon, Joseph; Sampson, Laura
2017-01-01
The final parsec of supermassive black-hole binary evolution is subject to the complex interplay of stellar loss-cone scattering, circumbinary disk accretion, and gravitational-wave emission, with binary eccentricity affected by all of these. The strain spectrum of gravitational-waves in the pulsar-timing band thus encodes rich information about the binary population's response to these various environmental mechanisms. Current spectral models have heretofore followed basic analytic prescriptions, and attempt to investigate these final-parsec mechanisms in an indirect fashion. Here we describe a new technique to directly probe the environmental properties of supermassive black-hole binaries through "Bayesian model-emulation". We perform black-hole binary population synthesis simulations at a restricted set of environmental parameter combinations, compute the strain spectra from these, then train a Gaussian process to learn the shape of the spectrum at any point in parameter space. We describe this technique, demonstrate its efficacy with a program of simulated datasets, then illustrate its power by directly constraining final-parsec physics in a Bayesian analysis of the NANOGrav 5-year dataset. The technique is fast, flexible, and robust.
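A hedged sketch of the emulation step: a Gaussian process maps environmental parameters to a (log) strain spectrum and then predicts at unseen parameter values. The "population synthesis" outputs below are a synthetic toy function, not NANOGrav data, and the parameter names are placeholders.

```python
# Gaussian-process emulation of a spectrum over environmental parameters (illustrative only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(4)
freqs = np.logspace(-9, -7, 30)                    # pulsar-timing-band frequencies (Hz)

def toy_spectrum(env):                             # placeholder for a population-synthesis run
    ecc, rho = env
    return -2.0 / 3.0 * np.log10(freqs / 1e-8) - 15 - 0.5 * ecc - 0.2 * rho

train_env = rng.random((25, 2))                    # restricted set of (eccentricity, density) combos
train_spec = np.array([toy_spectrum(e) for e in train_env])

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[0.3, 0.3]),
                              normalize_y=True).fit(train_env, train_spec)

pred_spec = gp.predict(np.array([[0.5, 0.1]]))     # emulated log-spectrum at an unseen point
print(pred_spec.shape)                             # (1, 30): one spectrum over 30 frequencies
```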
NASA Astrophysics Data System (ADS)
Hoffmann, Achim; Mahidadia, Ashesh
The purpose of this chapter is to present fundamental ideas and techniques of machine learning suitable for the field of this book, i.e., for automated scientific discovery. The chapter focuses on those symbolic machine learning methods that produce results which are suitable to be interpreted and understood by humans. This is particularly important in the context of automated scientific discovery as the scientific theories to be produced by machines are usually meant to be interpreted by humans. This chapter contains some of the most influential ideas and concepts in machine learning research to give the reader a basic insight into the field. After the introduction in Sect. 1, general ideas of how learning problems can be framed are given in Sect. 2. The section provides useful perspectives to better understand what learning algorithms actually do. Section 3 presents the Version space model, which is an early learning algorithm as well as a conceptual framework that provides important insight into the general mechanisms behind most learning algorithms. In Sect. 4, a family of learning algorithms, the AQ family for learning classification rules, is presented. The AQ family belongs to the early approaches in machine learning. Next, Sect. 5 presents the basic principles of decision tree learners. Decision tree learners belong to the most influential class of inductive learning algorithms today. Finally, a more recent group of learning systems is presented in Sect. 6; these learn relational concepts within the framework of logic programming. This is a particularly interesting group of learning systems since the framework also allows the incorporation of background knowledge, which may assist in generalisation. Section 7 discusses Association Rules - a technique that comes from the related field of Data mining. Section 8 presents the basic idea of the Naive Bayesian Classifier. While this is a very popular learning technique, the learning result is not well suited for human comprehension as it is essentially a large collection of probability values. In Sect. 9, we present a generic method for improving the accuracy of a given learner by generating multiple classifiers using variations of the training data. While this works well in most cases, the resulting classifiers have significantly increased complexity and, hence, tend to destroy the human readability of the learning result that a single learner may produce. Section 10 contains a summary, briefly mentions other techniques not discussed in this chapter, and presents an outlook on the potential of machine learning in the future.
Dunlosky, John; Rawson, Katherine A; Marsh, Elizabeth J; Nathan, Mitchell J; Willingham, Daniel T
2013-01-01
Many students are being left behind by an educational system that some people believe is in crisis. Improving educational outcomes will require efforts on many fronts, but a central premise of this monograph is that one part of a solution involves helping students to better regulate their learning through the use of effective learning techniques. Fortunately, cognitive and educational psychologists have been developing and evaluating easy-to-use learning techniques that could help students achieve their learning goals. In this monograph, we discuss 10 learning techniques in detail and offer recommendations about their relative utility. We selected techniques that were expected to be relatively easy to use and hence could be adopted by many students. Also, some techniques (e.g., highlighting and rereading) were selected because students report relying heavily on them, which makes it especially important to examine how well they work. The techniques include elaborative interrogation, self-explanation, summarization, highlighting (or underlining), the keyword mnemonic, imagery use for text learning, rereading, practice testing, distributed practice, and interleaved practice. To offer recommendations about the relative utility of these techniques, we evaluated whether their benefits generalize across four categories of variables: learning conditions, student characteristics, materials, and criterion tasks. Learning conditions include aspects of the learning environment in which the technique is implemented, such as whether a student studies alone or with a group. Student characteristics include variables such as age, ability, and level of prior knowledge. Materials vary from simple concepts to mathematical problems to complicated science texts. Criterion tasks include different outcome measures that are relevant to student achievement, such as those tapping memory, problem solving, and comprehension. We attempted to provide thorough reviews for each technique, so this monograph is rather lengthy. However, we also wrote the monograph in a modular fashion, so it is easy to use. In particular, each review is divided into the following sections: (1) general description of the technique and why it should work; (2) how general are the effects of this technique?, covering (2a) learning conditions, (2b) student characteristics, (2c) materials, and (2d) criterion tasks; (3) effects in representative educational contexts; (4) issues for implementation; and (5) overall assessment. The review for each technique can be read independently of the others, and particular variables of interest can be easily compared across techniques. To foreshadow our final recommendations, the techniques vary widely with respect to their generalizability and promise for improving student learning. Practice testing and distributed practice received high utility assessments because they benefit learners of different ages and abilities and have been shown to boost students' performance across many criterion tasks and even in educational contexts. Elaborative interrogation, self-explanation, and interleaved practice received moderate utility assessments. The benefits of these techniques do generalize across some variables, yet despite their promise, they fell short of a high utility assessment because the evidence for their efficacy is limited.
For instance, elaborative interrogation and self-explanation have not been adequately evaluated in educational contexts, and the benefits of interleaving have just begun to be systematically explored, so the ultimate effectiveness of these techniques is currently unknown. Nevertheless, the techniques that received moderate-utility ratings show enough promise for us to recommend their use in appropriate situations, which we describe in detail within the review of each technique. Five techniques received a low utility assessment: summarization, highlighting, the keyword mnemonic, imagery use for text learning, and rereading. These techniques were rated as low utility for numerous reasons. Summarization and imagery use for text learning have been shown to help some students on some criterion tasks, yet the conditions under which these techniques produce benefits are limited, and much research is still needed to fully explore their overall effectiveness. The keyword mnemonic is difficult to implement in some contexts, and it appears to benefit students for a limited number of materials and for short retention intervals. Most students report rereading and highlighting, yet these techniques do not consistently boost students' performance, so other techniques should be used in their place (e.g., practice testing instead of rereading). Our hope is that this monograph will foster improvements in student learning, not only by showcasing which learning techniques are likely to have the most generalizable effects but also by encouraging researchers to continue investigating the most promising techniques. Accordingly, in our closing remarks, we discuss some issues for how these techniques could be implemented by teachers and students, and we highlight directions for future research. © The Author(s) 2013.
Two neural network algorithms for designing optimal terminal controllers with open final time
NASA Technical Reports Server (NTRS)
Plumer, Edward S.
1992-01-01
Multilayer neural networks, trained by the backpropagation through time algorithm (BPTT), have been used successfully as state-feedback controllers for nonlinear terminal control problems. Current BPTT techniques, however, are not able to deal systematically with open final-time situations such as minimum-time problems. Two approaches which extend BPTT to open final-time problems are presented. In the first, a neural network learns a mapping from initial-state to time-to-go. In the second, the optimal number of steps for each trial run is found using a line-search. Both methods are derived using Lagrange multiplier techniques. This theoretical framework is used to demonstrate that the derived algorithms are direct extensions of forward/backward sweep methods used in N-stage optimal control. The two algorithms are tested on a Zermelo problem and the resulting trajectories compare favorably to optimal control results.
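A hedged toy sketch of the first approach only: regress time-to-go on the initial state with a small neural network. The training pairs come from a simple distance/speed rule rather than from BPTT rollouts, and the state dimensions are placeholders.

```python
# Learning a mapping from initial state to time-to-go (illustrative regression only).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
states = rng.uniform(-1, 1, size=(400, 2))                 # initial states (x, y)
time_to_go = np.linalg.norm(states, axis=1) / 0.1          # toy minimum-time proxy (distance / speed)

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(states, time_to_go)
print("predicted time-to-go for state (0.5, -0.2):", net.predict([[0.5, -0.2]])[0])
```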
Blending Individual and Group Assessment: A Model for Measuring Student Performance
ERIC Educational Resources Information Center
Reiser, Elana
2017-01-01
Two sections of a college discrete mathematics class were taught using cooperative learning techniques throughout the semester. The 33 students attending these sections were randomly assigned into groups of three. Their final examination consisted of an individual and group blended examination where students worked in their groups and discussed…
Self-Reflective Journaling: A Tool for Assessment
ERIC Educational Resources Information Center
Giguere, Miriam
2012-01-01
This article outlines suggestions for the use of self-reflective journaling as an assessment method in dance technique classes. The use of self-reflection makes assessment a part of the learning process, not an imposed evaluation of a student's final product, particularly when it is related to personal goal setting. The article provides practical…
Deep Learning for Computer Vision: A Brief Review
Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios
2018-01-01
Over the last few years, deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, namely Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein. PMID:29487619
Gopaldas, Raja R; Rohatgi, Chand
2009-04-01
A major limitation of conventional laparoscopic surgery is the placement of an intracorporeal (IC) knot, which requires a significant amount of training and practice. An easier technique of IC knot tying using a 90-degree grasper is compared with the conventional technique (CLT). The new axial-spin technique (AST) uses the spin of the instrument shaft to tie IC knots. Fourteen participants stratified into 3 training levels were instructed to tie 50 reef IC knots using each technique on trainers in 3 sessions. The final 5 knots tied using each technique were deemed to be representative of maximal performance efficiency (PE) and were randomly subjected to tensile strength measurements using a tensiometer at 50 mm/s distraction. Mean knot execution time (mKET) measured in seconds (s), normalized KE time (nET=group mean/mKET), knot holding capacity, relative knot security (RKS), and PE (PE=RKS/nET) of the knots tied were computed and analyzed using paired t-tests and analysis of variance. Variables included knot-tying session, technique, and training level. On completion of the study, junior residents (JR) averaged 51.72 seconds more, senior residents (SR) averaged 26.22 seconds more, and attendings (ATT) averaged 19.17 seconds less to tie using the CLT compared with the AST (F=40.52, P=0.0001). Across all levels, the CLT took 83.26 seconds on average to execute an IC knot, compared with 59.08 seconds with the AST (t=2.784, P=0.015). Learning curves revealed that JRs significantly improved their mean KE times across sessions with the AST (first session vs. final session: 473.8 s vs. 55.9 s) and with the CLT (672.5 s vs. 107.6 s), to a greater extent than participants at advanced levels of training. The RKS of knots executed with the AST was significantly higher (13.1 vs. 5.44 N, t=4.9, P=0.0001). The PE of knots executed using the CLT increased geometrically across training levels (JR: 1.35%, SR: 5.58%, ATT: 11.22%), whereas that of the AST showed a linear trend (17.09%, 17.11%, and 13.95%). The AST follows a linear pattern of learning across training levels compared with the steep exponential learning of the CLT. Surprisingly, inexperienced JRs were 1.5 times more efficient with the AST, and 8 times less efficient with the CLT, than ATT using the CLT to execute the same knot. The AST is significantly easier to learn for JRs and could serve as a platform before acquiring more advanced knot-tying skills. Overall, with the AST, execution times are significantly shorter whereas the RKS and PE are significantly higher. JRs achieve a level of proficiency comparable with that of senior-level residents and ATT after participating in a reasonable training session consisting of at least 25 knots.
Liu, Chunming; Xu, Xin; Hu, Dewen
2013-04-29
Reinforcement learning is a powerful mechanism for enabling agents to learn in an unknown environment, and most reinforcement learning algorithms aim to maximize some numerical value, which represents only one long-term objective. However, multiple long-term objectives are exhibited in many real-world decision and control problems; therefore, recently, there has been growing interest in solving multiobjective reinforcement learning (MORL) problems with multiple conflicting objectives. The aim of this paper is to present a comprehensive overview of MORL. In this paper, the basic architecture, research topics, and naive solutions of MORL are introduced at first. Then, several representative MORL approaches and some important directions of recent research are reviewed. The relationships between MORL and other related research are also discussed, which include multiobjective optimization, hierarchical reinforcement learning, and multi-agent reinforcement learning. Finally, research challenges and open problems of MORL techniques are highlighted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phinney, N.
The SLAC Linear Collider (SLC) is the first example of an entirely new type of lepton collider. Many years of effort were required to develop the understanding and techniques needed to approach design luminosity. This paper discusses some of the key issues and problems encountered in producing a working linear collider. These include the polarized source, techniques for emittance preservation, extensive feedback systems, and refinements in beam optimization in the final focus. The SLC experience has been invaluable for testing concepts and developing designs for a future linear collider.
Deep Learning and Its Applications in Biomedicine.
Cao, Chensi; Liu, Feng; Tan, Hai; Song, Deshou; Shu, Wenjie; Li, Weizhong; Zhou, Yiming; Bo, Xiaochen; Xie, Zhi
2018-02-01
Advances in biological and medical technologies have been providing us with explosive volumes of biological and physiological data, such as medical images, electroencephalography, and genomic and protein sequences. Learning from these data facilitates the understanding of human health and disease. Developed from artificial neural networks, deep learning-based algorithms show great promise in extracting features and learning patterns from complex data. The aim of this paper is to provide an overview of deep learning techniques and some of the state-of-the-art applications in the biomedical field. We first introduce the development of artificial neural networks and deep learning. We then describe two main components of deep learning, i.e., deep learning architectures and model optimization. Subsequently, some examples are demonstrated for deep learning applications, including medical image classification, genomic sequence analysis, as well as protein structure classification and prediction. Finally, we offer our perspectives for the future directions in the field of deep learning. Copyright © 2018. Production and hosting by Elsevier B.V.
Less is more: Sampling chemical space with active learning
NASA Astrophysics Data System (ADS)
Smith, Justin S.; Nebgen, Ben; Lubbers, Nicholas; Isayev, Olexandr; Roitberg, Adrian E.
2018-06-01
The development of accurate and transferable machine learning (ML) potentials for predicting molecular energetics is a challenging task. The process of data generation to train such ML potentials is a task neither well understood nor researched in detail. In this work, we present a fully automated approach for the generation of datasets with the intent of training universal ML potentials. It is based on the concept of active learning (AL) via Query by Committee (QBC), which uses the disagreement between an ensemble of ML potentials to infer the reliability of the ensemble's prediction. QBC allows the presented AL algorithm to automatically sample regions of chemical space where the ML potential fails to accurately predict the potential energy. AL improves the overall fitness of ANAKIN-ME (ANI) deep learning potentials in rigorous test cases by mitigating human biases in deciding what new training data to use. AL also reduces the training set size to a fraction of the data required when using naive random sampling techniques. To provide validation of our AL approach, we develop the COmprehensive Machine-learning Potential (COMP6) benchmark (publicly available on GitHub) which contains a diverse set of organic molecules. Active learning-based ANI potentials outperform the original random sampled ANI-1 potential with only 10% of the data, while the final active learning-based model vastly outperforms ANI-1 on the COMP6 benchmark after training to only 25% of the data. Finally, we show that our proposed AL technique develops a universal ANI potential (ANI-1x) that provides accurate energy and force predictions on the entire COMP6 benchmark. This universal ML potential achieves a level of accuracy on par with the best ML potentials for single molecules or materials, while remaining applicable to the general class of organic molecules composed of the elements CHNO.
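A hedged sketch of query-by-committee selection as described: train an ensemble on the current data, score unlabeled candidates by the ensemble's disagreement, and pick the most contentious points for labeling. The "energy" function, candidate pool, and ensemble type below are toy stand-ins, not the ANI models or ab initio data.

```python
# Query-by-committee active learning step (illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(6)

def toy_energy(x):                                  # placeholder for reference calculations
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

labeled = rng.uniform(-1, 1, size=(40, 2))          # current training set
pool = rng.uniform(-1, 1, size=(2000, 2))           # unlabeled candidate configurations
y = toy_energy(labeled)

committee = [RandomForestRegressor(n_estimators=30, random_state=s).fit(labeled, y)
             for s in range(4)]
preds = np.stack([m.predict(pool) for m in committee])     # (committee, pool) predictions
disagreement = preds.std(axis=0)                            # ensemble disagreement per candidate

query_idx = np.argsort(disagreement)[-10:]          # the 10 most uncertain candidates to label next
print("selected pool indices:", query_idx)
```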
Human semi-supervised learning.
Gibson, Bryan R; Rogers, Timothy T; Zhu, Xiaojin
2013-01-01
Most empirical work in human categorization has studied learning in either fully supervised or fully unsupervised scenarios. Most real-world learning scenarios, however, are semi-supervised: Learners receive a great deal of unlabeled information from the world, coupled with occasional experiences in which items are directly labeled by a knowledgeable source. A large body of work in machine learning has investigated how learning can exploit both labeled and unlabeled data provided to a learner. Using equivalences between models found in human categorization and machine learning research, we explain how these semi-supervised techniques can be applied to human learning. A series of experiments are described which show that semi-supervised learning models prove useful for explaining human behavior when exposed to both labeled and unlabeled data. We then discuss some machine learning models that do not have familiar human categorization counterparts. Finally, we discuss some challenges yet to be addressed in the use of semi-supervised models for modeling human categorization. Copyright © 2013 Cognitive Science Society, Inc.
Hands-on Simulation versus Traditional Video-learning in Teaching Microsurgery Technique
SAKAMOTO, Yusuke; OKAMOTO, Sho; SHIMIZU, Kenzo; ARAKI, Yoshio; HIRAKAWA, Akihiro; WAKABAYASHI, Toshihiko
2017-01-01
Bench model hands-on learning may be more effective than traditional didactic practice in some surgical fields. However, this has not been reported for microsurgery. Our study objective was to demonstrate the efficacy of bench model hands-on learning in acquiring microsuturing skills. The secondary objective was to evaluate the aptitude for microsurgery based on personality assessment. Eighty-six medical students comprising 62 men and 24 women were randomly assigned to either 20 min of hands-on learning with a bench model simulator or 20 min of video-learning using an instructional video. They then practiced microsuturing for 40 min. Each student then made three knots, and the time to complete the task was recorded. The final products were scored by two independent graders in a blind fashion. All participants then took a personality test, and their microsuture test scores and the time to complete the task were compared. The time to complete the task was significantly shorter in the simulator group than in the video-learning group. The final product scores tended to be higher with simulator-learning than with video-learning, but the difference was not significant. Students with high “extraversion” scores on the personality inventory took a shorter time to complete the suturing test. Simulator-learning was more effective for microsurgery training than video instruction, especially in understanding the procedure. There was a weak association between personality traits and microsurgery skill. PMID:28381653
Retinal blood vessel segmentation using fully convolutional network with transfer learning.
Jiang, Zhexin; Zhang, Hao; Wang, Yi; Ko, Seok-Bum
2018-04-26
Since the retinal blood vessel has been acknowledged as an indispensable element in both ophthalmological and cardiovascular disease diagnosis, accurate segmentation of the retinal vessel tree has become a prerequisite step for automated or computer-aided diagnosis systems. In this paper, a supervised method is presented based on a pre-trained fully convolutional network adapted through transfer learning. The proposed method simplifies the typical retinal vessel segmentation problem from full-size image segmentation to regional vessel element recognition and result merging. Meanwhile, additional unsupervised image post-processing techniques are applied to refine the final result. Extensive experiments have been conducted on the DRIVE, STARE, CHASE_DB1 and HRF databases, and the accuracy of the cross-database test on these four databases is state-of-the-art, which also demonstrates the high robustness of the proposed approach. This successful result has not only contributed to the area of automated retinal blood vessel segmentation but also supports the effectiveness of transfer learning when applying deep learning techniques to medical imaging. Copyright © 2018 Elsevier Ltd. All rights reserved.
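A hedged sketch of the "regional recognition plus result merging" idea: a placeholder patch-level predictor is run over overlapping patches and the overlaps are averaged back into a full-size probability map. The pretrained fully convolutional network and the transfer-learning step themselves are not shown; `patch_vessel_probability` is a hypothetical stand-in.

```python
# Patch-wise prediction and result merging for a full-size segmentation map (illustrative only).
import numpy as np
from sklearn.feature_extraction.image import extract_patches_2d, reconstruct_from_patches_2d

def patch_vessel_probability(patch):                # hypothetical stand-in for a fine-tuned network
    return (patch > patch.mean()).astype(float)     # toy rule: brighter pixels flagged as "vessel"

rng = np.random.default_rng(7)
image = rng.random((64, 64))                        # stand-in grayscale fundus image

patches = extract_patches_2d(image, (16, 16))
pred_patches = np.stack([patch_vessel_probability(p) for p in patches])
prob_map = reconstruct_from_patches_2d(pred_patches, image.shape)   # merge by averaging overlaps

segmentation = prob_map > 0.5                       # simple post-processing threshold
print(segmentation.shape, segmentation.mean())
```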
An underwater turbulence degraded image restoration algorithm
NASA Astrophysics Data System (ADS)
Furhad, Md. Hasan; Tahtali, Murat; Lambert, Andrew
2017-09-01
Underwater turbulence occurs due to random fluctuations of temperature and salinity in the water. These fluctuations are responsible for variations in water density, refractive index and attenuation. They impose random geometric distortions, spatio-temporally varying blur, limited range visibility and limited contrast on the acquired images. Several restoration techniques have been developed to address this problem, such as image-registration-based, lucky-region-based and centroid-based image restoration algorithms. Although these methods demonstrate better results in terms of removing turbulence, they require computationally intensive image registration and impose higher CPU and memory loads. Thus, in this paper, a simple patch-based dictionary learning algorithm is proposed to restore the image while avoiding the costly image registration step. Dictionary learning is a machine learning technique which builds a dictionary of non-zero atoms derived from the sparse representation of an image or signal. The image is divided into several patches, and the sharp patches among them are detected. Next, dictionary learning is performed on these patches to estimate the restored image. Finally, an image deconvolution algorithm is applied to the estimated restored image to remove the remaining noise.
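A hedged sketch of the patch-based dictionary-learning idea: atoms are learned from a subset of patches (standing in for the "sharp" patches), then every patch is sparse-coded and reconstructed. Sharpness-based patch selection and the final deconvolution step from the paper are omitted, and the degraded frame is synthetic.

```python
# Patch-based dictionary learning and reconstruction (illustrative only).
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d, reconstruct_from_patches_2d

rng = np.random.default_rng(8)
degraded = rng.random((64, 64))                     # stand-in turbulence-degraded frame
patch_size = (8, 8)

patches = extract_patches_2d(degraded, patch_size)  # every overlapping patch
flat = patches.reshape(len(patches), -1)
mean = flat.mean(axis=1, keepdims=True)
centered = flat - mean

train_idx = rng.choice(len(centered), size=500, replace=False)   # stand-in for "sharp" patches
dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0, random_state=0)
dico.fit(centered[train_idx])                       # learn the dictionary atoms

codes = dico.transform(centered)                    # sparse-code every patch
restored_flat = codes @ dico.components_ + mean     # reconstruct patches from the codes
restored = reconstruct_from_patches_2d(restored_flat.reshape(patches.shape), degraded.shape)
print(restored.shape)
```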
ERIC Educational Resources Information Center
Jonesboro School District 1, AR.
The Jonesboro Area Vocational High School was the Arkansas pilot site for the Southern Regional Education Board (SREB) initiative to improve the basic competencies of high school vocational students. The project aimed to revise vocational courses to incorporate academic content, to revise course requirements, and to encourage vocational and…
Developing Specialized Programs for Singing in the Elementary School. Final Report.
ERIC Educational Resources Information Center
Gould, A. Oren
The objectives of this research into ways of helping children who have difficulty finding and using their singing voices were (1) to make an intensive study of singing difficulties of elementary-school children, (2) to discover, observe, and analyze successful techniques and materials for use with children who have not learned to "carry a tune,"…
Automation in Vocational Training of the Mentally Retarded. Final Report.
ERIC Educational Resources Information Center
Platt, Henry; And Others
Various uses of automation in teaching were studied with mentally retarded (IQ 70 to 90) and/or emotionally disturbed (IQ 80 to 90) youth aged 16 to 20. Programed instruction was presented by six audiovisual devices and techniques: the Devereux Model 50 Teaching Aid, the Learn-Ease Teaching Device, the Mast Teaching Machine, the Graflex…
Learner-Adaptive Educational Technology for Simulation in Healthcare: Foundations and Opportunities.
Lineberry, Matthew; Dev, Parvati; Lane, H Chad; Talbot, Thomas B
2018-06-01
Despite evidence that learners vary greatly in their learning needs, practical constraints tend to favor ''one-size-fits-all'' educational approaches, in simulation-based education as elsewhere. Adaptive educational technologies - devices and/or software applications that capture and analyze relevant data about learners to select and present individually tailored learning stimuli - are a promising aid in learners' and educators' efforts to provide learning experiences that meet individual needs. In this article, we summarize and build upon the 2017 Society for Simulation in Healthcare Research Summit panel discussion on adaptive learning. First, we consider the role of adaptivity in learning broadly. We then outline the basic functions that adaptive learning technologies must implement and the unique affordances and challenges of technology-based approaches for those functions, sharing an illustrative example from healthcare simulation. Finally, we consider future directions for accelerating research, development, and deployment of effective adaptive educational technology and techniques in healthcare simulation.
Learning Negotiation Policies Using IB3 and Bayesian Networks
NASA Astrophysics Data System (ADS)
Nalepa, Gislaine M.; Ávila, Bráulio C.; Enembreck, Fabrício; Scalabrin, Edson E.
This paper presents an intelligent offer policy in a negotiation environment, in which each agent involved learns the preferences of its opponent in order to improve its own performance. Each agent must also be able to detect drifts in the opponent's preferences so as to quickly adjust itself to the opponent's new offer policy. For this purpose, two simple learning techniques were first evaluated: (i) based on instances (IB3) and (ii) based on Bayesian Networks. Additionally, as it is known that in theory group learning produces better results than individual/single learning, the efficiency of IB3 and Bayesian classifier groups was also analyzed. Finally, each decision model was evaluated in moments of concept drift, with the drift being gradual, moderate or abrupt. Results showed that both groups of classifiers were able to effectively detect drifts in the opponent's preferences.
Multi-objects recognition for distributed intelligent sensor networks
NASA Astrophysics Data System (ADS)
He, Haibo; Chen, Sheng; Cao, Yuan; Desai, Sachi; Hohil, Myron E.
2008-04-01
This paper proposes an innovative approach to multi-object recognition for homeland security and defense oriented intelligent sensor networks. Unlike conventional information analysis, data mining in such networks is typically characterized by high information ambiguity/uncertainty, data redundancy, high dimensionality and real-time constraints. Furthermore, since a typical military network normally includes multiple mobile sensor platforms, ground forces, fortified tanks, combat flights, and other resources, it is critical to develop intelligent data mining approaches that fuse different information resources to understand dynamic environments, to support decision-making processes, and finally to achieve the goals. This paper aims to address these issues with a focus on multi-object recognition. Instead of classifying a single object as in traditional image classification problems, the proposed method can automatically learn multiple objects simultaneously. Image segmentation techniques are used to identify the interesting regions in the field, which correspond to multiple objects such as soldiers or tanks. Since different objects come with different feature sizes, we propose a feature scaling method to represent each object with the same number of dimensions. This is achieved by linear/nonlinear scaling and sampling techniques. Finally, support vector machine (SVM) based learning algorithms are developed to learn and build the associations for different objects, and this knowledge is adaptively accumulated for object recognition in the testing stage. We test the effectiveness of the proposed method in different simulated military environments.
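A hedged sketch of the feature-scaling plus SVM idea: variable-length object feature vectors are linearly resampled to a common dimensionality and then classified. The segmentation step and the real imagery are not reproduced; the features and labels below are synthetic placeholders.

```python
# Feature scaling to a fixed dimensionality followed by SVM classification (illustrative only).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(9)

def rescale_features(vec, target_dim=32):
    """Linearly resample a 1-D feature vector to a fixed number of dimensions."""
    old_x = np.linspace(0.0, 1.0, num=len(vec))
    new_x = np.linspace(0.0, 1.0, num=target_dim)
    return np.interp(new_x, old_x, vec)

# Objects of different "sizes" produce feature vectors of different lengths.
raw_objects = [rng.random(rng.integers(20, 80)) + label
               for label in (0, 1) for _ in range(50)]
labels = np.array([0] * 50 + [1] * 50)

X = np.vstack([rescale_features(v) for v in raw_objects])
clf = SVC(kernel="rbf").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```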
Creating Turbulent Flow Realizations with Generative Adversarial Networks
NASA Astrophysics Data System (ADS)
King, Ryan; Graf, Peter; Chertkov, Michael
2017-11-01
Generating valid inflow conditions is a crucial, yet computationally expensive, step in unsteady turbulent flow simulations. We demonstrate a new technique for rapid generation of turbulent inflow realizations that leverages recent advances in machine learning for image generation using a deep convolutional generative adversarial network (DCGAN). The DCGAN is an unsupervised machine learning technique consisting of two competing neural networks that are trained against each other using backpropagation. One network, the generator, tries to produce samples from the true distribution of states, while the discriminator tries to distinguish between true and synthetic samples. We present results from a fully-trained DCGAN that is able to rapidly draw random samples from the full distribution of possible inflow states without needing to solve the Navier-Stokes equations, eliminating the costly process of spinning up inflow turbulence. This suggests a new paradigm in physics informed machine learning where the turbulence physics can be encoded in either the discriminator or generator. Finally, we also propose additional applications such as feature identification and subgrid scale modeling.
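A heavily reduced, hedged GAN sketch in PyTorch standing in for the DCGAN described: a small convolutional generator maps noise to a 2-D field while a convolutional discriminator is trained against it. The "real" samples are random arrays, not turbulence snapshots, and the network sizes are arbitrary illustrations.

```python
# Minimal generator/discriminator adversarial training loop (illustrative only).
import torch
import torch.nn as nn

latent_dim = 16

generator = nn.Sequential(
    nn.ConvTranspose2d(latent_dim, 32, 4, 1, 0), nn.ReLU(),   # 1x1 -> 4x4
    nn.ConvTranspose2d(32, 16, 4, 2, 1), nn.ReLU(),           # 4x4 -> 8x8
    nn.ConvTranspose2d(16, 1, 4, 2, 1), nn.Tanh(),            # 8x8 -> 16x16 "inflow" field
)
discriminator = nn.Sequential(
    nn.Conv2d(1, 16, 4, 2, 1), nn.LeakyReLU(0.2),             # 16x16 -> 8x8
    nn.Conv2d(16, 32, 4, 2, 1), nn.LeakyReLU(0.2),            # 8x8 -> 4x4
    nn.Conv2d(32, 1, 4, 1, 0), nn.Flatten(),                  # 4x4 -> 1 logit
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(8, 1, 16, 16)                   # placeholder "true" velocity snapshots
for _ in range(5):                                 # a few illustrative training steps
    z = torch.randn(8, latent_dim, 1, 1)
    fake = generator(z)

    # Discriminator step: distinguish real snapshots from detached fake samples.
    d_loss = bce(discriminator(real), torch.ones(8, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(8, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to fool the discriminator.
    g_loss = bce(discriminator(fake), torch.ones(8, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print("generated field shape:", generator(torch.randn(1, latent_dim, 1, 1)).shape)
```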
Is a Team-based Learning Approach to Anatomy Teaching Superior to Didactic Lecturing?
Ghorbani, Naghme; Karbalay-Doust, Saied; Noorafshan, Ali
2014-02-01
Team-based learning (TBL) is used in the medical field to implement interactive learning in small groups. The learning of anatomy and its subsequent application requires the students to recall a great deal of factual content. The aims of this study were to evaluate the students' satisfaction, engagement and knowledge gain in anatomy through the medium of TBL in comparison to the traditional lecture method. This study, carried out from February to June 2012, included 30 physical therapy students of the Shiraz University of Medical Science, School of Rehabilitation Sciences. Classic TBL techniques were modified to cover lower limb anatomy topics in the first year of the physical therapy curriculum. Anatomy lectures were replaced with TBL, which required the preparation of assigned content, specific discussion topics, an individual self-assessment test (IRAT) and the analysis of discussion topics. The teams then subsequently retook the assessment test as a group (GRAT). The first eight weeks of the curriculum were taught using traditional didactic lecturing, while during the second eight weeks the modified TBL method was used. The students evaluated these sessions through a questionnaire. The impact of TBL on student engagement and educational achievement was determined using numerical data, including the IRAT, GRAT and final examination scores. Students had a higher satisfaction rate with the TBL teaching according to the Likert scale. Additionally, higher scores were obtained in the TBL-based final examination in comparison to the lecture-based midterm exam. The students' responses showed that the TBL technique could be used alone or in conjunction with traditional didactic lecturing in order to teach anatomy more effectively.
Incremental online learning in high dimensions.
Vijayakumar, Sethu; D'Souza, Aaron; Schaal, Stefan
2005-12-01
Locally weighted projection regression (LWPR) is a new algorithm for incremental nonlinear function approximation in high-dimensional spaces with redundant and irrelevant input dimensions. At its core, it employs nonparametric regression with locally linear models. In order to stay computationally efficient and numerically robust, each local model performs the regression analysis with a small number of univariate regressions in selected directions in input space in the spirit of partial least squares regression. We discuss when and how local learning techniques can successfully work in high-dimensional spaces and review the various techniques for local dimensionality reduction before finally deriving the LWPR algorithm. The properties of LWPR are that it (1) learns rapidly with second-order learning methods based on incremental training, (2) uses statistically sound stochastic leave-one-out cross validation for learning without the need to memorize training data, (3) adjusts its weighting kernels based on only local information in order to minimize the danger of negative interference of incremental learning, (4) has a computational complexity that is linear in the number of inputs, and (5) can deal with a large number of (possibly redundant) inputs, as shown in various empirical evaluations with up to 90-dimensional data sets. For a probabilistic interpretation, predictive variance and confidence intervals are derived. To our knowledge, LWPR is the first truly incremental spatially localized learning method that can successfully and efficiently operate in very high-dimensional spaces.
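A hedged sketch of the core idea behind locally weighted regression only (not the full LWPR algorithm with projection directions and incremental updates): a prediction comes from a linear fit weighted by a Gaussian kernel centred on the query point. Data are synthetic and one-dimensional for clarity.

```python
# Kernel-weighted (locally weighted) linear regression at a query point (illustrative only).
import numpy as np

rng = np.random.default_rng(10)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

def lwr_predict(x_query, X, y, bandwidth=0.5):
    w = np.exp(-0.5 * ((X[:, 0] - x_query) / bandwidth) ** 2)   # local weights
    A = np.column_stack([np.ones(len(X)), X[:, 0]])             # [1, x] design matrix
    W = np.diag(w)
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)            # weighted least squares
    return beta[0] + beta[1] * x_query

print("f(1.0) ~", lwr_predict(1.0, X, y))                        # should land near sin(1.0) ~ 0.84
```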
Segmenting overlapping nano-objects in atomic force microscopy image
NASA Astrophysics Data System (ADS)
Wang, Qian; Han, Yuexing; Li, Qing; Wang, Bing; Konagaya, Akihiko
2018-01-01
Recently, techniques for nanoparticles have been rapidly developed for various fields, such as materials science, medicine, and biology. In particular, methods of image processing have been widely used to automatically analyze nanoparticles. A technique to automatically segment overlapping nanoparticles with image processing and machine learning is proposed. Here, two tasks are necessary: elimination of image noise and separation of the overlapping shapes. For the first task, mean square error and the seed fill algorithm are adopted to remove noise and improve the quality of the original image. For the second task, four steps are needed to segment the overlapping nanoparticles. First, possible split lines are obtained by connecting high-curvature pixels on the contours. Second, the candidate split lines are classified with a machine learning algorithm. Third, the overlapping regions are detected with the method of density-based spatial clustering of applications with noise (DBSCAN). Finally, the best split lines are selected with a constrained minimum value. We give some experimental examples and compare our technique with two other methods. The results show the effectiveness of the proposed technique.
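A hedged sketch of the DBSCAN step only: candidate high-curvature contour points are clustered so that points belonging to the same overlapping region group together while isolated points are marked as noise. The coordinates below are synthetic; curvature estimation and split-line classification are not shown.

```python
# DBSCAN clustering of candidate contour points into overlap regions (illustrative only).
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(11)
# Two tight clusters of candidate points (two overlap regions) plus scattered noise points.
region_a = rng.normal(loc=(20, 20), scale=1.0, size=(15, 2))
region_b = rng.normal(loc=(60, 45), scale=1.0, size=(12, 2))
noise = rng.uniform(0, 80, size=(10, 2))
points = np.vstack([region_a, region_b, noise])

labels = DBSCAN(eps=3.0, min_samples=4).fit_predict(points)
print("clusters found:", set(labels) - {-1}, "| noise points:", int((labels == -1).sum()))
```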
ERIC Educational Resources Information Center
EACHUS, HERBERT T.; KING, PHILIP H.
An experiment tested the relative effectiveness of two techniques for training United States Air Force military advisors in cross-cultural communication skills. Retention of skills over time and effects of attitude on learning were also studied. Subjects played the role of an Air Force captain interacting with a foreign counterpart, played by a…
Sparsity-aware tight frame learning with adaptive subspace recognition for multiple fault diagnosis
NASA Astrophysics Data System (ADS)
Zhang, Han; Chen, Xuefeng; Du, Zhaohui; Yang, Boyuan
2017-09-01
It is a challenging problem to design excellent dictionaries to sparsely represent diverse fault information and simultaneously discriminate different fault sources. Therefore, this paper describes and analyzes a novel multiple feature recognition framework which incorporates the tight frame learning technique with an adaptive subspace recognition strategy. The proposed framework consists of four stages. Firstly, by introducing the tight frame constraint into the popular dictionary learning model, the proposed tight frame learning model can be formulated as a nonconvex optimization problem which can be solved by alternately applying a hard thresholding operation and singular value decomposition. Secondly, noise is effectively eliminated through transform sparse coding techniques. Thirdly, the denoised signal is decoupled into discriminative feature subspaces by each tight frame filter. Finally, guided by elaborately designed fault-related sensitive indexes, latent fault feature subspaces can be adaptively recognized and multiple faults diagnosed simultaneously. Extensive numerical experiments are subsequently implemented to investigate the sparsifying capability of the learned tight frame as well as its comprehensive denoising performance. Most importantly, the feasibility and superiority of the proposed framework are verified through multiple fault diagnosis of motor bearings. Compared with state-of-the-art fault detection techniques, some important advantages have been observed: firstly, the proposed framework incorporates physical priors into the data-driven strategy, so multiple fault features with similar oscillation morphology can naturally be adaptively decoupled. Secondly, the tight frame dictionary directly learned from the noisy observation can significantly promote the sparsity of fault features compared to analytical tight frames. Thirdly, a satisfactory complete signal-space description property is guaranteed, and thus the weak-feature leakage problem is avoided compared to typical learning methods.
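A heavily hedged, simplified sketch of the alternation described in the abstract (hard thresholding of the analysis coefficients followed by an SVD-based transform update). A square orthonormal transform stands in for a general tight frame filter bank, the signals are synthetic, and the update is a plain orthogonal Procrustes step rather than the paper's exact formulation.

```python
# Alternating hard thresholding and SVD-based transform update (simplified illustration).
import numpy as np

rng = np.random.default_rng(12)
Y = rng.normal(size=(16, 400))                     # columns are signal patches
W = np.linalg.qr(rng.normal(size=(16, 16)))[0]     # initial orthonormal transform

def hard_threshold(C, k=4):
    out = np.zeros_like(C)
    idx = np.argsort(np.abs(C), axis=0)[-k:]       # keep the k largest coefficients per column
    np.put_along_axis(out, idx, np.take_along_axis(C, idx, axis=0), axis=0)
    return out

for _ in range(10):
    C = hard_threshold(W @ Y)                      # sparse coding by hard thresholding
    U, _, Vt = np.linalg.svd(C @ Y.T)              # Procrustes update: argmin ||C - W Y||_F, W orthogonal
    W = U @ Vt

print("W^T W close to identity:", np.allclose(W.T @ W, np.eye(16), atol=1e-8))
```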
Expressive facial animation synthesis by learning speech coarticulation and expression spaces.
Deng, Zhigang; Neumann, Ulrich; Lewis, J P; Kim, Tae-Yong; Bulut, Murtaza; Narayanan, Shrikanth
2006-01-01
Synthesizing expressive facial animation is a very challenging topic within the graphics community. In this paper, we present an expressive facial animation synthesis system enabled by automated learning from facial motion capture data. Accurate 3D motions of the markers on the face of a human subject are captured while he/she recites a predesigned corpus, with specific spoken and visual expressions. We present a novel motion capture mining technique that "learns" speech coarticulation models for diphones and triphones from the recorded data. A Phoneme-Independent Expression Eigenspace (PIEES) that encloses the dynamic expression signals is constructed by motion signal processing (phoneme-based time-warping and subtraction) and Principal Component Analysis (PCA) reduction. New expressive facial animations are synthesized as follows: First, the learned coarticulation models are concatenated to synthesize neutral visual speech according to novel speech input, then a texture-synthesis-based approach is used to generate a novel dynamic expression signal from the PIEES model, and finally the synthesized expression signal is blended with the synthesized neutral visual speech to create the final expressive facial animation. Our experiments demonstrate that the system can effectively synthesize realistic expressive facial animation.
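A hedged sketch of the expression-eigenspace construction: the (time-warped) neutral component is subtracted from expressive motion signals and PCA yields a low-dimensional expression space from which frames can be re-generated. Marker data here are synthetic placeholders; the coarticulation models and texture-synthesis stage are not shown.

```python
# Building a phoneme-independent expression eigenspace with PCA (illustrative only).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(13)
n_frames, n_marker_coords = 600, 90                 # e.g. 30 markers x (x, y, z)
expressive = rng.normal(size=(n_frames, n_marker_coords))
neutral = rng.normal(scale=0.2, size=(n_frames, n_marker_coords))   # time-warped neutral speech

expression_signal = expressive - neutral            # phoneme-independent expression residual
pca = PCA(n_components=10).fit(expression_signal)

coords = pca.transform(expression_signal)           # frames expressed in the eigenspace
resynth = pca.inverse_transform(coords[:1])         # a frame re-generated from the space
print("explained variance:", pca.explained_variance_ratio_.sum().round(3), "| frame:", resynth.shape)
```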
Learning directed acyclic graphs from large-scale genomics data.
Nikolay, Fabio; Pesavento, Marius; Kritikos, George; Typas, Nassos
2017-09-20
In this paper, we consider the problem of learning the genetic interaction map, i.e., the topology of a directed acyclic graph (DAG) of genetic interactions from noisy double-knockout (DK) data. Based on a set of well-established biological interaction models, we detect and classify the interactions between genes. We propose a novel linear integer optimization program called the Genetic-Interactions-Detector (GENIE) to identify the complex biological dependencies among genes and to compute the DAG topology that matches the DK measurements best. Furthermore, we extend the GENIE program by incorporating genetic interaction profile (GI-profile) data to further enhance the detection performance. In addition, we propose a sequential scalability technique for large sets of genes under study, in order to provide statistically significant results for real measurement data. Finally, we show via numeric simulations that the GENIE program and the GI-profile data extended GENIE (GI-GENIE) program clearly outperform the conventional techniques and present real data results for our proposed sequential scalability technique.
NASA Astrophysics Data System (ADS)
Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em
2017-09-01
Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively inexpensive auxiliary simulator, we can effectively fill in the missing spatial data at the required times on the fly by a statistical learning technique, multi-level Gaussian process regression; this has been demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, which detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in exascale simulations.
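A simplified two-level Gaussian process regression, the statistical learning technique named above for filling in missing data from a cheap auxiliary simulator, might be sketched as follows; the 1-D synthetic data, kernel, and level structure are assumptions, and the paper's actual multi-level, multi-fidelity machinery is considerably richer.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic 1-D example: a cheap, biased "coarse" model and a few expensive "fine" samples.
x_fine = np.array([0.0, 0.3, 0.55, 0.8, 1.0]).reshape(-1, 1)   # surviving fine-grid data
y_fine = np.sin(6 * x_fine).ravel()                             # expensive simulator output
x_coarse = np.linspace(0, 1, 50).reshape(-1, 1)
y_coarse = np.sin(6 * x_coarse).ravel() + 0.2                   # biased auxiliary simulator

kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=1e-4)

# Level 1: learn the coarse trend everywhere.
gp_coarse = GaussianProcessRegressor(kernel=kernel).fit(x_coarse, y_coarse)

# Level 2: learn the discrepancy between fine samples and the coarse prediction.
residual = y_fine - gp_coarse.predict(x_fine)
gp_resid = GaussianProcessRegressor(kernel=kernel).fit(x_fine, residual)

# Fill in missing fine-grid values as coarse prediction + learned correction.
x_query = np.linspace(0, 1, 200).reshape(-1, 1)
y_filled = gp_coarse.predict(x_query) + gp_resid.predict(x_query)
print(y_filled[:5])
```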
Finding Waldo: Learning about Users from their Interactions.
Brown, Eli T; Ottley, Alvitta; Zhao, Helen; Quan Lin; Souvenir, Richard; Endert, Alex; Chang, Remco
2014-12-01
Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user's interactions with a system reflect a large amount of the user's reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user's task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task, and apply well-known machine learning algorithms to three encodings of the users' interaction data. We achieve, depending on algorithm and encoding, between 62% and 83% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that using the same techniques, we can infer aspects of the user's personality factors, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time: in one case 95% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.
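The prediction step described here, classifying users as fast or slow from encoded interaction data, reduces to standard supervised learning. The sketch below illustrates that workflow with a synthetic feature matrix and a random forest; the features, labels, and classifier choice are assumptions, not the study's actual encodings or algorithms.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical encoding: one row per user, columns are interaction statistics
# (e.g. click count, hover time, pan/zoom events); labels are fast (1) / slow (0).
rng = np.random.default_rng(42)
X = rng.normal(size=(60, 5))          # 60 users, 5 interaction features (synthetic)
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # accuracy at predicting fast vs slow
print(f"cross-validated accuracy: {scores.mean():.2f}")
```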
Dynamic adaptive learning for decision-making supporting systems
NASA Astrophysics Data System (ADS)
He, Haibo; Cao, Yuan; Chen, Sheng; Desai, Sachi; Hohil, Myron E.
2008-03-01
This paper proposes a novel adaptive learning method for data mining in support of decision-making systems. Due to the inherent information ambiguity/uncertainty, high dimensionality, and noise in many homeland security and defense applications, such as surveillance, monitoring, the net-centric battlefield, and others, it is critical to develop autonomous learning methods that efficiently learn useful information from raw data to support the decision-making process. The proposed method is based on a dynamic learning principle in the feature spaces. Generally speaking, conventional approaches to learning from high-dimensional data sets include various feature extraction (principal component analysis, wavelet transform, and others) and feature selection (embedded approach, wrapper approach, filter approach, and others) methods. However, very limited understanding of adaptive learning across different feature spaces has been achieved. We propose an integrative approach that takes advantage of feature selection and hypothesis ensemble techniques to achieve our goal. Based on the training data distributions, a feature score function is used to measure the importance of different features for learning. Multiple hypotheses are then iteratively developed in different feature spaces according to their learning capabilities. Unlike the preset iteration steps in many existing ensemble learning approaches, such as the adaptive boosting (AdaBoost) method, the iterative learning process stops automatically when the intelligent system cannot provide a better understanding than a random guess in that particular subset of feature spaces. Finally, a voting algorithm combines all the decisions from the different hypotheses to provide the final prediction results. Simulation analyses of the proposed method on the classification of different US military aircraft databases show its effectiveness.
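A toy rendering of the described loop, score the features, grow hypotheses in progressively larger feature subspaces, stop when a hypothesis is no better than chance, and combine the rest by voting, could look like the following; the scoring function, base learner, and stopping threshold are assumptions rather than the paper's exact procedure.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import f_classif
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

scores, _ = f_classif(X_tr, y_tr)            # feature score function
order = np.argsort(scores)[::-1]             # features ranked by importance

hypotheses = []
for k in range(1, len(order) + 1):
    feats = order[:k]                        # grow the feature subspace
    h = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr[:, feats], y_tr)
    if h.score(X_tr[:, feats], y_tr) <= 0.5: # no better than a random guess: stop
        break
    hypotheses.append((feats, h))

# Majority vote over the retained hypotheses.
votes = np.array([h.predict(X_te[:, f]) for f, h in hypotheses])
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("ensemble accuracy:", (y_pred == y_te).mean())
```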
Behavior Knowledge Space-Based Fusion for Copy-Move Forgery Detection.
Ferreira, Anselmo; Felipussi, Siovani C; Alfaro, Carlos; Fonseca, Pablo; Vargas-Munoz, John E; Dos Santos, Jefersson A; Rocha, Anderson
2016-07-20
The detection of copy-move image tampering is of paramount importance nowadays, mainly due to its potential use for misleading the opinion-forming process of the general public. In this paper, we go beyond traditional forgery detectors and aim at combining different properties of copy-move detection approaches by modeling the problem on a multiscale behavior knowledge space, which encodes the output combinations of different techniques as a priori probabilities considering multiple scales of the training data. Afterwards, the missing entries of the conditional probabilities are properly estimated through generative models applied to the existing training data. Finally, we propose different techniques that exploit the multi-directionality of the data to generate the final outcome detection map in a machine learning decision-making fashion. Experimental results on complex datasets, comparing the proposed techniques with a gamut of copy-move detection approaches and other fusion methodologies in the literature, show the effectiveness of the proposed method and its suitability for real-world applications.
Walker, Sandra; Rossi, Dolene; Anastasi, Jennifer; Gray-Ganter, Gillian; Tennent, Rebeka
2016-08-01
In Australia, Bachelor of Nursing programmes are delivered via both internal and distance modes, yet little is known about the indicators of undergraduate nursing students' satisfaction with the learning journey. This integrative review was undertaken to uncover the indicators of undergraduate nursing students' satisfaction with their learning journey. Integrative review. A review of key papers was undertaken. Only peer-reviewed papers published in scholarly journals from 2008 onwards were included in this integrative review. Pubmed, CINAHL, Google Scholar, Cochrane, Wiley Online and ProQuest Central databases were searched for relevant papers. 49 papers were appraised by a minimum of two team members. CASP tools were used when evaluating qualitative research, systematic and integrated reviews, while survey research was evaluated using a tool developed specifically for this purpose by the research team. All tools used to assess the quality of the research studies contained comprehensive checklists and questions relevant to the particular type of study. Data related to these checklists were extracted and the research team appraised the quality of each article based on its relevance to the topic, internal and external validity, appropriateness of data analysis technique(s), and whether ethical considerations were addressed. Seventeen papers were included in the final analysis. Data analysis involved a systematic approach using content analysis techniques. This integrative review sought to identify indicators of nursing students' satisfaction with their learning journey. Authentic learning, motivation, resilience, support, and collaborative learning were identified as key to nursing students' satisfaction with their learning journey. Sub-themes were identified within each of these themes that help explain nursing students' views of their learning journey. The findings showed that higher satisfaction levels are attained when nursing students feel included and supported during their learning journey. Copyright © 2016 Elsevier Ltd. All rights reserved.
Neville, Michael W; Palmer, Russ; Elder, Deborah; Fulford, Michael; Morris, Steve; Sappington, Kellie
2015-08-25
To evaluate how flexible learning via online video review affects the ability and confidence of first-year (P1) pharmacy students to accurately compound aseptic preparations. Customary instructions and assignments for aseptic compounding were provided to students, who were given unlimited access to 5 short review videos in addition to customary instruction. Student self-confidence was assessed online, and faculty members evaluated students' aseptic technique at the conclusion of the semester. No significant difference in final assessment scores was observed between those who viewed videos and those who did not. Student self-confidence scores increased significantly from baseline, but were not significantly higher for those who viewed videos than for those who did not. First-year students performed well on final aseptic compounding assessments, and those who viewed videos had a slight, though not statistically significant, advantage. Student self-confidence improved over the semester regardless of whether or not students accessed the review videos.
An automatic taxonomy of galaxy morphology using unsupervised machine learning
NASA Astrophysics Data System (ADS)
Hocking, Alex; Geach, James E.; Sun, Yi; Davey, Neil
2018-01-01
We present an unsupervised machine learning technique that automatically segments and labels galaxies in astronomical imaging surveys using only pixel data. Distinct from previous unsupervised machine learning approaches used in astronomy we use no pre-selection or pre-filtering of target galaxy type to identify galaxies that are similar. We demonstrate the technique on the Hubble Space Telescope (HST) Frontier Fields. By training the algorithm using galaxies from one field (Abell 2744) and applying the result to another (MACS 0416.1-2403), we show how the algorithm can cleanly separate early and late type galaxies without any form of pre-directed training for what an 'early' or 'late' type galaxy is. We then apply the technique to the HST Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS) fields, creating a catalogue of approximately 60 000 classifications. We show how the automatic classification groups galaxies of similar morphological (and photometric) type and make the classifications public via a catalogue, a visual catalogue and galaxy similarity search. We compare the CANDELS machine-based classifications to human-classifications from the Galaxy Zoo: CANDELS project. Although there is not a direct mapping between Galaxy Zoo and our hierarchical labelling, we demonstrate a good level of concordance between human and machine classifications. Finally, we show how the technique can be used to identify rarer objects and present lensed galaxy candidates from the CANDELS imaging.
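One generic way to realize unsupervised morphology grouping from pixel data alone is to cluster pixel feature vectors, summarize each galaxy image by its distribution over pixel clusters, and then cluster those summaries. The sketch below uses k-means at both levels purely as a stand-in; the paper's actual algorithm and its HST inputs are not reproduced here, and the synthetic cutouts are placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic stand-in for multi-band imaging: (n_pixels, n_bands) per cutout.
rng = np.random.default_rng(1)
images = [rng.random((64 * 64, 4)) for _ in range(10)]   # 10 small 4-band cutouts

# Step 1: cluster pixel feature vectors pooled across all images.
pixel_km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(np.vstack(images))

# Step 2: describe each image by its normalised histogram of pixel-cluster labels.
def image_signature(img):
    labels = pixel_km.predict(img)
    return np.bincount(labels, minlength=8) / labels.size

signatures = np.array([image_signature(img) for img in images])

# Step 3: cluster the image signatures to obtain morphology groups.
galaxy_groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(signatures)
print(galaxy_groups)
```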
Wireless device connection problems and design solutions
NASA Astrophysics Data System (ADS)
Song, Ji-Won; Norman, Donald; Nam, Tek-Jin; Qin, Shengfeng
2016-09-01
Users, especially non-expert users, commonly experience problems when connecting multiple interoperable devices. While studies of multiple device connections have mostly concentrated on spontaneous device association techniques with a focus on security aspects, research on user interaction for device connection is still limited. More research into understanding people is needed for designers to devise usable techniques. This research applies the Research-through-Design method and studies non-expert users' interactions in establishing wireless connections between devices. The "Learning from Examples" concept is adopted to develop a study focus line by learning from expert users' interaction with devices. This focus line is then used to guide researchers in exploring the non-expert users' difficulties at each stage of the focus line. Finally, the Research-through-Design approach is used to understand the users' difficulties, gain insights into design problems and suggest usable solutions. When connecting a device, the user is required to manage not only the device's functionality but also the interaction between devices. Based on learning from failures, an important insight is that the existing design approach of improving single-device interaction issues, such as improvements to graphical user interfaces or computer guidance, cannot help users to handle problems between multiple devices. This study finally proposes a desirable user-device interaction in which images of two devices function together with a system image to provide the user with feedback on the status of the connection, allowing them to infer any required actions.
Posttest analysis of beta (Na/S) cells from chloride silent power, limited. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Battles, J.E.; Mrazek, F.C.
Researchers have developed a unique methodology for examining sodium/sulfur cells after testing to learn more about their behavior. The new techniques described in this report allow scientists to discern the physical and chemical states of these high-energy cells and to develop hypotheses about degradation mechanisms. This information may provide a basis for building cells with longer lives.
ERIC Educational Resources Information Center
Schure, Alexander
A computer-based system model for the monitoring and management of the instructional process was conceived, developed and refined through the techniques of systems analysis. This report describes the various aspects and components of this project in a series of independent and self-contained units. The first unit provides an overview of the entire…
Howe, Lisa M; Boothe, Harry W; Hartsfield, Sandee M
2005-01-01
At Texas A&M University, introductory-level surgical lecture and laboratory notes were converted to a CD-ROM format that included illustrative photographs as well as instructional videos demonstrating the basic surgical skills that all students were required to master. The CD-ROM was distributed to all students in place of traditional paper notes in the second-year surgical class in the professional veterinary curriculum. The study reported here was designed to evaluate the educational benefits of the use of the CD-ROM in place of traditional paper notes by examining the attitudes and practices of students before and after exposure to the CD-ROM format. An anonymous survey was distributed to students in the second-year introductory surgery course on the first day of class and again on the last day of class. Responses to questions were tabulated, response frequencies determined, and Chi-square analysis performed to determine differences between initial and final responses. On the final survey, 89 per cent of students responded that the instructional videos definitely helped them prepare for the laboratory, and 77 per cent responded that they were more likely to practice techniques learned from the CD-ROM videos than those learned from traditional study materials. The majority of students believed that the CD-ROM improved both the course (60 per cent) and their learning experience (62 per cent) as compared to traditional paper notes. Including instructional videos on the CD-ROM enhanced the educational experience of the students by promoting preparedness for laboratories and promoting practice of techniques learned from the videos outside of the laboratory.
NASA Astrophysics Data System (ADS)
Sasmita, E.; Edriati, S.; Yunita, A.
2018-04-01
This study is motivated by the low first-semester mathematics scores (below the KKM, the minimum mastery criterion) in a seventh-grade class at MTsN Model Padang, attributed to students feeling insufficiently involved in the learning process because the teacher did not assess the discussions. The proposed solution is discussion assessment within the Numbered Head Together (NHT) type of cooperative learning model. This study aims to determine whether discussion assessment in NHT affects the learning outcomes of class VII students at MTsN Model Padang. The instruments used were a discussion assessment and final tests. The data were analyzed with simple linear regression. The hypothesis test yielded an F-count greater than the F-table value, so the study's hypothesis was accepted. It is therefore concluded that discussion assessment in NHT affects the learning outcomes of class VII students at MTsN Model Padang.
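The statistical step described, regressing final-test scores on discussion-assessment scores and comparing the computed F value with the tabulated one, is a one-predictor ordinary least squares fit. The scores below are made up solely to show the calculation; they are not the study's data.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: discussion-assessment scores (x) and final-test scores (y).
x = np.array([70, 75, 80, 65, 85, 90, 60, 78, 82, 88], dtype=float)
y = np.array([68, 74, 83, 62, 88, 91, 58, 80, 84, 90], dtype=float)

model = sm.OLS(y, sm.add_constant(x)).fit()   # simple linear regression
print("F =", model.fvalue, "p =", model.f_pvalue)
# The hypothesis is accepted when the computed F exceeds the critical F value
# (equivalently, when the p-value falls below the chosen significance level).
```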
Supporting Solar Physics Research via Data Mining
NASA Astrophysics Data System (ADS)
Angryk, Rafal; Banda, J.; Schuh, M.; Ganesan Pillai, K.; Tosun, H.; Martens, P.
2012-05-01
In this talk we will briefly introduce three pillars of data mining (i.e. frequent patterns discovery, classification, and clustering), and discuss some possible applications of known data mining techniques which can directly benefit solar physics research. In particular, we plan to demonstrate applicability of frequent patterns discovery methods for the verification of hypotheses about co-occurrence (in space and time) of filaments and sigmoids. We will also show how classification/machine learning algorithms can be utilized to verify human-created software modules to discover individual types of solar phenomena. Finally, we will discuss applicability of clustering techniques to image data processing.
Active Learning Techniques Applied to an Interdisciplinary Mineral Resources Course.
NASA Astrophysics Data System (ADS)
Aird, H. M.
2015-12-01
An interdisciplinary active learning course was introduced at the University of Puget Sound entitled 'Mineral Resources and the Environment'. Various formative assessment and active learning techniques that have been effective in other courses were adapted and implemented to improve student learning, increase retention and broaden knowledge and understanding of course material. This was an elective course targeted towards upper-level undergraduate geology and environmental majors. The course provided an introduction to the mineral resources industry, discussing geological, environmental, societal and economic aspects, legislation and the processes involved in exploration, extraction, processing, reclamation/remediation and recycling of products. Lectures and associated weekly labs were linked in subject matter; relevant readings from the recent scientific literature were assigned and discussed in the second lecture of the week. Peer-based learning was facilitated through weekly reading assignments with peer-led discussions and through group research projects, in addition to in-class exercises such as debates. Writing and research skills were developed through student groups designing, carrying out and reporting on their own semester-long research projects around the lasting effects of the historical Ruston Smelter on the biology and water systems of Tacoma. The writing of their mini grant proposals and final project reports was carried out in stages to allow for feedback before the deadline. Speakers from industry were invited to share their specialist knowledge as guest lecturers, and students were encouraged to interact with them, with a view to employment opportunities. Formative assessment techniques included jigsaw exercises, gallery walks, placemat surveys, think pair share and take-home point summaries. Summative assessment included discussion leadership, exams, homeworks, group projects, in-class exercises, field trips, and pre-discussion reading exercises.
Parameter estimation using meta-heuristics in systems biology: a comprehensive review.
Sun, Jianyong; Garibaldi, Jonathan M; Hodgman, Charlie
2012-01-01
This paper gives a comprehensive review of the application of meta-heuristics to optimization problems in systems biology, mainly focussing on the parameter estimation problem (also called the inverse problem or model calibration). It is intended for either the system biologist who wishes to learn more about the various optimization techniques available and/or the meta-heuristic optimizer who is interested in applying such techniques to problems in systems biology. First, the parameter estimation problems emerging from different areas of systems biology are described from the point of view of machine learning. Brief descriptions of various meta-heuristics developed for these problems follow, along with outlines of their advantages and disadvantages. Several important issues in applying meta-heuristics to the systems biology modelling problem are addressed, including the reliability and identifiability of model parameters, optimal design of experiments, and so on. Finally, we highlight some possible future research directions in this field.
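As a concrete illustration of meta-heuristic model calibration, the sketch below fits two kinetic parameters of a toy exponential-decay model to noisy observations with differential evolution; the model, bounds, and data are assumptions chosen only to show the workflow, not an example taken from the review.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy model: y(t) = a * exp(-k * t), with "measured" data generated at a=2, k=0.5.
t = np.linspace(0, 10, 40)
rng = np.random.default_rng(0)
y_obs = 2.0 * np.exp(-0.5 * t) + 0.05 * rng.standard_normal(t.size)

def sse(params):
    a, k = params
    return np.sum((y_obs - a * np.exp(-k * t)) ** 2)   # calibration objective

# Meta-heuristic global search over bounded parameter space.
result = differential_evolution(sse, bounds=[(0.0, 5.0), (0.0, 2.0)], seed=1)
print("estimated a, k:", result.x)
```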
Laviolette, Steven R
2007-07-01
The neural regulation of emotional perception, learning, and memory is essential for normal behavioral and cognitive functioning. Many of the symptoms displayed by individuals with schizophrenia may arise from fundamental disturbances in the ability to accurately process emotionally salient sensory information. The neurotransmitter dopamine (DA) and its ability to modulate neural regions involved in emotional learning, perception, and memory formation has received considerable research attention as a potential final common pathway to account for the aberrant emotional regulation and psychosis present in the schizophrenic syndrome. Evidence from both human neuroimaging studies and animal-based research using neurodevelopmental, behavioral, and electrophysiological techniques have implicated the mesocorticolimbic DA circuit as a crucial system for the encoding and expression of emotionally salient learning and memory formation. While many theories have examined the cortical-subcortical interactions between prefrontal cortical regions and subcortical DA substrates, many questions remain as to how DA may control emotional perception and learning and how disturbances linked to DA abnormalities may underlie the disturbed emotional processing in schizophrenia. Beyond the mesolimbic DA system, increasing evidence points to the amygdala-prefrontal cortical circuit as an important processor of emotionally salient information and how neurodevelopmental perturbances within this circuitry may lead to dysregulation of DAergic modulation of emotional processing and learning along this cortical-subcortical emotional processing circuit.
Liang, Liang; Liu, Minliang; Martin, Caitlin; Sun, Wei
2018-01-01
Structural finite-element analysis (FEA) has been widely used to study the biomechanics of human tissues and organs, as well as tissue-medical device interactions, and treatment strategies. However, patient-specific FEA models usually require complex procedures to set up and long computing times to obtain final simulation results, preventing prompt feedback to clinicians in time-sensitive clinical applications. In this study, by using machine learning techniques, we developed a deep learning (DL) model to directly estimate the stress distributions of the aorta. The DL model was designed and trained to take the input of FEA and directly output the aortic wall stress distributions, bypassing the FEA calculation process. The trained DL model is capable of predicting the stress distributions with average errors of 0.492% and 0.891% in the Von Mises stress distribution and peak Von Mises stress, respectively. This study marks, to our knowledge, the first study that demonstrates the feasibility and great potential of using the DL technique as a fast and accurate surrogate of FEA for stress analysis. © 2018 The Author(s).
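In spirit, the surrogate maps the inputs an FEA model would take to the stress field it would output, after being trained on precomputed FEA results. The sketch below is a drastically simplified stand-in using a small multilayer perceptron on synthetic data; the paper's deep network and aortic geometries are not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 6 geometry/load parameters -> stress sampled at 50 wall nodes.
rng = np.random.default_rng(0)
X = rng.random((400, 6))
Y = X @ rng.random((6, 50)) + 0.01 * rng.standard_normal((400, 50))  # fake "FEA" outputs

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)
surrogate = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=2000, random_state=0)
surrogate.fit(X_tr, Y_tr)                      # train on precomputed simulation results

# At inference time the surrogate bypasses the FEA solve entirely.
rel_err = np.abs(surrogate.predict(X_te) - Y_te).mean() / np.abs(Y_te).mean()
print("relative error:", rel_err)
```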
Machine learning approaches for estimation of prediction interval for the model output.
Shrestha, Durga L; Solomatine, Dimitri P
2006-03-01
A novel method for estimating prediction uncertainty using machine learning techniques is presented. Uncertainty is expressed in the form of the two quantiles (constituting the prediction interval) of the underlying distribution of prediction errors. The idea is to partition the input space into different zones or clusters having similar model errors using fuzzy c-means clustering. The prediction interval is constructed for each cluster on the basis of empirical distributions of the errors associated with all instances belonging to the cluster under consideration and propagated from each cluster to the examples according to their membership grades in each cluster. Then a regression model is built for in-sample data using computed prediction limits as targets, and finally, this model is applied to estimate the prediction intervals (limits) for out-of-sample data. The method was tested on artificial and real hydrologic data sets using various machine learning techniques. Preliminary results show that the method is superior to other methods estimating the prediction interval. A new method for evaluating performance for estimating prediction interval is proposed as well.
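The mechanics of the method, cluster the input space, form empirical error quantiles per cluster, and regress the resulting limits so they can be produced for unseen data, can be sketched as follows. For brevity the sketch uses hard k-means in place of fuzzy c-means membership grades, so it is only an approximation of the described approach, and the data are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y_true = np.sin(X).ravel()
y_pred = y_true + rng.normal(0, 0.1 + 0.1 * np.abs(X).ravel())  # heteroscedastic errors
errors = y_pred - y_true

# 1. Partition the input space into clusters with similar model errors.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
labels = km.labels_

# 2. Empirical error quantiles per cluster become the target prediction limits.
lo = np.array([np.quantile(errors[labels == c], 0.05) for c in range(4)])
hi = np.array([np.quantile(errors[labels == c], 0.95) for c in range(4)])

# 3. Regress the limits on the inputs so they can be estimated for out-of-sample data.
lower_model = LinearRegression().fit(X, lo[labels])
upper_model = LinearRegression().fit(X, hi[labels])

x_new = np.array([[1.5]])
width = (upper_model.predict(x_new) - lower_model.predict(x_new))[0]
print("90% prediction interval width:", width)
```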
Elicitation of neurological knowledge with argument-based machine learning.
Groznik, Vida; Guid, Matej; Sadikov, Aleksander; Možina, Martin; Georgiev, Dejan; Kragelj, Veronika; Ribarič, Samo; Pirtošek, Zvezdan; Bratko, Ivan
2013-02-01
The paper describes the use of expert's knowledge in practice and the efficiency of a recently developed technique called argument-based machine learning (ABML) in the knowledge elicitation process. We are developing a neurological decision support system to help the neurologists differentiate between three types of tremors: Parkinsonian, essential, and mixed tremor (comorbidity). The system is intended to act as a second opinion for the neurologists, and most importantly to help them reduce the number of patients in the "gray area" that require a very costly further examination (DaTSCAN). We strive to elicit comprehensible and medically meaningful knowledge in such a way that it does not come at the cost of diagnostic accuracy. To alleviate the difficult problem of knowledge elicitation from data and domain experts, we used ABML. ABML guides the expert to explain critical special cases which cannot be handled automatically by machine learning. This very efficiently reduces the expert's workload, and combines expert's knowledge with learning data. 122 patients were enrolled into the study. The classification accuracy of the final model was 91%. Equally important, the initial and the final models were also evaluated for their comprehensibility by the neurologists. All 13 rules of the final model were deemed as appropriate to be able to support its decisions with good explanations. The paper demonstrates ABML's advantage in combining machine learning and expert knowledge. The accuracy of the system is very high with respect to the current state-of-the-art in clinical practice, and the system's knowledge base is assessed to be very consistent from a medical point of view. This opens up the possibility to use the system also as a teaching tool. Copyright © 2012 Elsevier B.V. All rights reserved.
Das, Dev Kumar; Ghosh, Madhumala; Pal, Mallika; Maiti, Asok K; Chakraborty, Chandan
2013-02-01
The aim of this paper is to address the development of computer-assisted malaria parasite characterization and classification using a machine learning approach based on light microscopic images of peripheral blood smears. In doing this, microscopic image acquisition from stained slides, illumination correction and noise reduction, erythrocyte segmentation, feature extraction, feature selection and finally classification of different stages of malaria (Plasmodium vivax and Plasmodium falciparum) have been investigated. The erythrocytes are segmented using marker-controlled watershed transformation, and subsequently a total of ninety-six features describing the shape, size and texture of erythrocytes are extracted with respect to parasitemia-infected versus non-infected cells. Ninety-four features are found to be statistically significant in discriminating the six classes. Here, a feature selection-cum-classification scheme has been devised by combining the F-statistic with statistical learning techniques, i.e., Bayesian learning and support vector machines (SVM), in order to provide higher classification accuracy using the best set of discriminating features. Results show that the Bayesian approach provides the highest accuracy, i.e., 84%, for malaria classification by selecting the 19 most significant features, while SVM provides its highest accuracy, i.e., 83.5%, with the 9 most significant features. Finally, the performance of these two classifiers under the feature selection framework has been compared for malaria parasite classification. Copyright © 2012 Elsevier Ltd. All rights reserved.
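The selection-then-classification pipeline described, F-statistic ranking followed by Bayesian or SVM classification with 19 and 9 retained features respectively, has a direct analogue in standard tooling; the synthetic feature matrix below is only a placeholder for the ninety-six erythrocyte descriptors, so the reported accuracies will not be reproduced.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Placeholder for 96 shape/size/texture features over 6 infection-stage classes.
X, y = make_classification(n_samples=600, n_features=96, n_informative=20,
                           n_classes=6, n_clusters_per_class=1, random_state=0)

# F-statistic feature ranking feeding a Bayesian or an SVM classifier.
bayes = make_pipeline(SelectKBest(f_classif, k=19), GaussianNB())
svm = make_pipeline(SelectKBest(f_classif, k=9), SVC(kernel="rbf"))

print("Bayesian:", cross_val_score(bayes, X, y, cv=5).mean())
print("SVM:", cross_val_score(svm, X, y, cv=5).mean())
```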
NASA Astrophysics Data System (ADS)
Shao, Haidong; Jiang, Hongkai; Zhang, Haizhou; Duan, Wenjing; Liang, Tianchen; Wu, Shuaipeng
2018-02-01
The vibration signals collected from rolling bearing are usually complex and non-stationary with heavy background noise. Therefore, it is a great challenge to efficiently learn the representative fault features of the collected vibration signals. In this paper, a novel method called improved convolutional deep belief network (CDBN) with compressed sensing (CS) is developed for feature learning and fault diagnosis of rolling bearing. Firstly, CS is adopted for reducing the vibration data amount to improve analysis efficiency. Secondly, a new CDBN model is constructed with Gaussian visible units to enhance the feature learning ability for the compressed data. Finally, exponential moving average (EMA) technique is employed to improve the generalization performance of the constructed deep model. The developed method is applied to analyze the experimental rolling bearing vibration signals. The results confirm that the developed method is more effective than the traditional methods.
Leveraging Experiential Learning Techniques for Transfer
ERIC Educational Resources Information Center
Furman, Nate; Sibthorp, Jim
2013-01-01
Experiential learning techniques can be helpful in fostering learning transfer. Techniques such as project-based learning, reflective learning, and cooperative learning provide authentic platforms for developing rich learning experiences. In contrast to more didactic forms of instruction, experiential learning techniques foster a depth of learning…
ERIC Educational Resources Information Center
Morehead State Univ., KY.
Three types of instruction were used in the Ohio Module Project: traditional classes, programmed learning centers, and home instruction. Four major objectives of the project are: (1) to determine the kind of training program necessary to prepare paraprofessionals to operate an instructional program utilizing programmed materials, (2) to compare…
Great Computational Intelligence in the Formal Sciences via Analogical Reasoning
2017-05-08
Computational harnessing of traditional mathematical statistics (as covered, e.g., in Hogg, Craig & McKean 2005) is used to power statistical learning techniques. [Report AFRL-AFOSR-VA-TR-2017-0099, "Great Computational Intelligence in the Formal Sciences via Analogical Reasoning", Selmer Bringsjord, Rensselaer Polytechnic...; report type: Final Performance; dates covered: 15 Oct 2011 to 31 Dec 2016.]
NASA Astrophysics Data System (ADS)
Wan, Xiaoqing; Zhao, Chunhui; Gao, Bing
2017-11-01
The integration of an edge-preserving filtering technique in the classification of a hyperspectral image (HSI) has been proven effective in enhancing classification performance. This paper proposes an ensemble strategy for HSI classification using an edge-preserving filter along with a deep learning model and edge detection. First, an adaptive guided filter is applied to the original HSI to reduce the noise in degraded images and to extract powerful spectral-spatial features. Second, the extracted features are fed as input to a stacked sparse autoencoder to adaptively exploit more invariant and deep feature representations; then, a random forest classifier is applied to fine-tune the entire pretrained network and determine the classification output. Third, a Prewitt compass operator is further applied to the HSI to extract the edges of the first principal component after dimension reduction. Moreover, a region-growing rule is applied to the resulting edge logical image to determine the local region for each unlabeled pixel. Finally, the categories of the corresponding neighborhood samples are determined in the original classification map; then, a majority voting mechanism is implemented to generate the final output. Extensive experiments demonstrate that the proposed method achieves competitive performance compared with several traditional approaches.
[How to design workshops to promote health in community groups].
Hernández-Díaz, Josefina; Paredes-Carbonell, Joan J; Marín Torrens, Rosa
2014-01-01
One strategy of health promotion is to develop people's life skills, treating people themselves as the main health resource. A workshop has to help its participants become «assets» who make decisions and create health, focusing on the development and acquisition of skills in a motivating group setting in order to achieve health objectives. The concepts behind the design of a workshop are: participatory planning, training, meaningful learning, group learning and participatory techniques. The steps to follow in designing a workshop, and to facilitate its application, are: stage 0, founding; the initial stage, welcome and initial evaluation; the central or construction stage, with learning based on the acquisition of knowledge, attitudes and skills; and the final or evaluation stage. Copyright © 2013 Elsevier España, S.L. All rights reserved.
What We Do and Do Not Know about Teaching Medical Image Interpretation.
Kok, Ellen M; van Geel, Koos; van Merriënboer, Jeroen J G; Robben, Simon G F
2017-01-01
Educators in medical image interpretation have difficulty finding scientific evidence as to how they should design their instruction. We review and comment on 81 papers that investigated instructional design in medical image interpretation. We distinguish between studies that evaluated complete offline courses and curricula, studies that evaluated e-learning modules, and studies that evaluated specific educational interventions. Twenty-three percent of all studies evaluated the implementation of complete courses or curricula, and 44% of the studies evaluated the implementation of e-learning modules. We argue that these studies have encouraging results but provide little information for educators: too many differences exist between conditions to unambiguously attribute the learning effects to specific instructional techniques. Moreover, concepts are not uniformly defined and methodological weaknesses further limit the usefulness of evidence provided by these studies. Thirty-two percent of the studies evaluated a specific interventional technique. We discuss three theoretical frameworks that informed these studies: diagnostic reasoning, cognitive schemas and study strategies. Research on diagnostic reasoning suggests teaching students to start with non-analytic reasoning and subsequently applying analytic reasoning, but little is known on how to train non-analytic reasoning. Research on cognitive schemas investigated activities that help the development of appropriate cognitive schemas. Finally, research on study strategies supports the effectiveness of practice testing, but more study strategies could be applicable to learning medical image interpretation. Our commentary highlights the value of evaluating specific instructional techniques, but further evidence is required to optimally inform educators in medical image interpretation.
NASA Astrophysics Data System (ADS)
Wolf, Nils; Hof, Angela
2012-10-01
Urban sprawl driven by shifts in tourism development produces new suburban landscapes of water consumption on Mediterranean coasts. Golf courses, ornamental 'Atlantic' gardens and swimming pools are the most striking artefacts of this transformation, threatening the local water supply systems and exacerbating water scarcity. In the face of climate change, urban landscape irrigation is becoming increasingly important from a resource management point of view. This paper applies urban remote sensing in a targeted mapping approach that uses machine learning techniques and high-resolution satellite imagery (WorldView-2) to generate GIS-ready information for urban water consumption studies. Swimming pools, vegetation and - as a subgroup of vegetation - turf grass are extracted as important determinants of water consumption. For image analysis, the complex nature of urban environments suggests spatial-spectral classification, i.e. the complementary use of the spectral signature and spatial descriptors. Multiscale image segmentation provides the means to extract the spatial descriptors - namely object feature layers - which can be concatenated at pixel level to the spectral signature. This study assesses the value of object features using different machine learning techniques and amounts of labeled information for learning. The results indicate the benefit of the spatial-spectral approach if combined with appropriate classifiers, such as tree-based ensembles or support vector machines, which can handle high dimensionality. Finally, a Random Forest classifier was chosen to deliver the classified input data for the estimation of evaporative water loss and net landscape irrigation requirements.
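The spatial-spectral idea amounts to concatenating object-level descriptors to each pixel's spectral signature before classification. The sketch below, with synthetic bands, local-mean descriptors standing in for object feature layers, and placeholder labels, shows only the concatenation and Random Forest steps; it is not the WorldView-2 workflow.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
bands = rng.random((8, 100, 100))                 # synthetic 8-band image

# Spatial descriptors: per-band local means act as simple object feature layers.
spatial = np.stack([uniform_filter(b, size=7) for b in bands])

# Concatenate spectral signature and spatial features at pixel level.
features = np.concatenate([bands, spatial]).reshape(16, -1).T   # (n_pixels, 16)

labels = rng.integers(0, 3, size=features.shape[0])  # placeholder: pool/turf/other
train_idx = rng.choice(features.shape[0], size=2000, replace=False)

rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(features[train_idx], labels[train_idx])
classified_map = rf.predict(features).reshape(100, 100)
print(classified_map.shape)
```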
Karin, Janet
2016-01-01
The process of transmitting ballet’s complex technique to young dancers can interfere with the innate processes that give rise to efficient, expressive and harmonious movement. With the intention of identifying possible solutions, this article draws on research across the fields of neurology, psychology, motor learning, and education, and considers their relevance to ballet as an art form, a technique, and a training methodology. The integration of dancers’ technique and expressivity is a core theme throughout the paper. A brief outline of the historical development of ballet’s aesthetics and training methods leads into factors that influence dancers’ performance. An exploration of the role of the neuromotor system in motor learning and the acquisition of expert skills reveals the roles of sensory awareness, imagery, and intention in cuing efficient, expressive movement. It also indicates potentially detrimental effects of conscious muscle control, explicit learning and persistent naïve beliefs. Finally, the paper presents a new theory regarding the acquisition of ballet skills. Recontextualization theory proposes that placing a problematic task within a new context may engender a new conceptual approach and/or sensory intention, and hence the genesis of new motor programs; and that these new programs may lead to performance that is more efficient, more rewarding for the dancer, more pleasing aesthetically, and more expressive. From an anecdotal point of view, this theory appears to be supported by the progress of many dancers at various stages of their dancing lives. PMID:27047437
Finding Waldo: Learning about Users from their Interactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Eli T.; Ottley, Alvitta; Zhao, Helen
Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user’s interactions with a system reflect a large amount of the user’s reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user’s task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task and we apply well-known machine learning algorithms to three encodings of the users’ interaction data. We achieve, depending on algorithm and encoding, between 62% and 96% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that using the same techniques, we can infer aspects of the user’s personality factors, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time; in some cases, 82% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.
Application and evaluation of a combination of Socratic and learning through discussion techniques.
van Aswegen, E J; Brink, H I; Steyn, P J
2001-11-01
This article has its genesis in the inquirer's interest in the need for internalizing critical thinking, creative thinking and reflective skills in adult learners. As part of a broader study, the inquirer used a combination of two techniques over a period of nine months, namely Socratic discussion/questioning and the Learning Through Discussion technique. The inquirer elected mainly qualitative methods, because they were seen as more adaptable to dealing with multiple realities and more sensitive to the many shaping influences and value patterns that may be encountered (Lincoln & Guba, 1989). Purposive sampling was used, and the sample size (n = 10) was determined by the willingness of potential participants to enlist in the chosen techniques. Feedback from participants was obtained: (1) verbally after each discussion session, and (2) in written format after completion of the course content. The final/summative evaluation was obtained through a semi-structured questionnaire; this was deemed necessary because the participants were already studying for the end-of-year examination. For the purpose of this condensed report, the inquirer reflected only on the feedback obtained with the help of the questionnaire. The empirical study showed that, in spite of various adaptation problems experienced, eight (8) of the ten (10) participants felt positive toward the applied techniques.
Neuroimaging of Fear-Associated Learning
Greco, John A; Liberzon, Israel
2016-01-01
Fear conditioning has been commonly used as a model of emotional learning in animals and, with the introduction of functional neuroimaging techniques, has proven useful in establishing the neurocircuitry of emotional learning in humans. Studies of fear acquisition suggest that regions such as amygdala, insula, anterior cingulate cortex, and hippocampus play an important role in acquisition of fear, whereas studies of fear extinction suggest that the amygdala is also crucial for safety learning. Extinction retention testing points to the ventromedial prefrontal cortex as an essential region in the recall of the safety trace, and explicit learning of fear and safety associations recruits additional cortical and subcortical regions. Importantly, many of these findings have implications in our understanding of the pathophysiology of psychiatric disease. Recent studies using clinical populations have lent insight into the changes in regional activity in specific disorders, and treatment studies have shown how pharmaceutical and other therapeutic interventions modulate brain activation during emotional learning. Finally, research investigating individual differences in neurotransmitter receptor genotypes has highlighted the contribution of these systems in fear-associated learning. PMID:26294108
Network-based high level data classification.
Silva, Thiago Christiano; Zhao, Liang
2012-06-01
Traditional supervised data classification considers only physical features (e.g., distance or similarity) of the input data. Here, this type of learning is called low level classification. On the other hand, the human (animal) brain performs both low and high orders of learning and it has facility in identifying patterns according to the semantic meaning of the input data. Data classification that considers not only physical attributes but also the pattern formation is, here, referred to as high level classification. In this paper, we propose a hybrid classification technique that combines both types of learning. The low level term can be implemented by any classification technique, while the high level term is realized by the extraction of features of the underlying network constructed from the input data. Thus, the former classifies the test instances by their physical features or class topologies, while the latter measures the compliance of the test instances to the pattern formation of the data. Our study shows that the proposed technique not only can realize classification according to the pattern formation, but also is able to improve the performance of traditional classification techniques. Furthermore, as the class configuration's complexity increases, such as the mixture among different classes, a larger portion of the high level term is required to get correct classification. This feature confirms that the high level classification has a special importance in complex situations of classification. Finally, we show how the proposed technique can be employed in a real-world application, where it is capable of identifying variations and distortions of handwritten digit images. As a result, it supplies an improvement in the overall pattern recognition rate.
Dental students' preferences and performance in crown design: conventional wax-added versus CAD.
Douglas, R Duane; Hopp, Christa D; Augustin, Marcus A
2014-12-01
The purpose of this study was to evaluate dental students' perceptions of traditional waxing vs. computer-aided crown design and to determine the effectiveness of either technique through comparative grading of the final products. On one of two identical tooth preparations, second-year students at one dental school fabricated a wax pattern for a full contour crown; on the second tooth preparation, the same students designed and fabricated an all-ceramic crown using computer-aided design (CAD) and computer-aided manufacturing (CAM) technology. Projects were graded for occlusion and anatomic form by three faculty members. On completion of the projects, 100 percent of the students (n=50) completed an eight-question, five-point Likert scale survey designed to assess their perceptions of and learning associated with the two design techniques. The average grades for the crown design projects were 78.3 (CAD) and 79.1 (wax design). The mean numbers of occlusal contacts were 3.8 (CAD) and 2.9 (wax design), which was significantly higher for CAD (p=0.02). The survey results indicated that students enjoyed designing a full contour crown using CAD as compared to using conventional wax techniques and spent less time designing the crown using CAD. From a learning perspective, students felt that they learned more about position and the size/strength of occlusal contacts using CAD. However, students recognized that CAD technology has limits in terms of representing anatomic contours and excursive occlusion compared to conventional wax techniques. The results suggest that crown design using CAD could be considered as an adjunct to conventional wax-added techniques in preclinical fixed prosthodontic curricula.
Effective and ineffective supervision in postgraduate dental education: a qualitative study.
Subramanian, J; Anderson, V R; Morgaine, K C; Thomson, W M
2013-02-01
Research suggests that students' perceptions should be considered in any discussion of their education, but there has been no systematic examination of New Zealand postgraduate dental students' learning experiences. This study aimed to obtain in-depth qualitative insights into student and graduate perceptions of effective and ineffective learning in postgraduate dental education. Data were collected in 2010 using semi-structured individual interviews. Participants included final-year students and graduates of the University of Otago Doctor of Clinical Dentistry programme. Using the Critical Incident Technique, participants were asked to describe at least one effective and one ineffective learning experience in detail. Interview transcripts were analysed using a general inductive approach. Broad themes which emerged included supervisory approaches, characteristics of the learning process, and the physical learning environment. This paper considers students' and graduates' perceptions of postgraduate supervision in dentistry as it promotes or precludes effective learning. Effective learning was associated by participants with approachable and supportive supervisory practices, and technique demonstrations accompanied by explicit explanations. Ineffective learning was associated with minimal supervisor demonstrations and guidance (particularly when beginning postgraduate study), and aggressive, discriminatory and/or culturally insensitive supervisory approaches. Participants' responses provided rich, in-depth insights into their reflections and understandings of effective and ineffective approaches to supervision as it influenced their learning in the clinical and research settings. These findings provide a starting point for the development of curriculum and supervisory practices, enhancement of supervisory and mentoring approaches, and the design of continuing education programmes for supervisors at an institutional level. Additionally, these findings might also stimulate topics for reflection and discussion amongst dental educators and administrators more broadly. © 2012 John Wiley & Sons A/S.
NASA Astrophysics Data System (ADS)
Silversides, Katherine L.; Melkumyan, Arman
2017-03-01
Machine learning techniques such as Gaussian Processes can be used to identify stratigraphically important features in geophysical logs. The marker shales in the banded iron formation hosted iron ore deposits of the Hamersley Ranges, Western Australia, form distinctive signatures in the natural gamma logs. The identification of these marker shales is important for stratigraphic identification of unit boundaries for the geological modelling of the deposit. Machine learning techniques each have different unique properties that will impact the results. For Gaussian Processes (GPs), the output values are inclined towards the mean value, particularly when there is not sufficient information in the library. The impact that these inclinations have on the classification can vary depending on the parameter values selected by the user. Therefore, when applying machine learning techniques, care must be taken to fit the technique to the problem correctly. This study focuses on optimising the settings and choices for training a GPs system to identify a specific marker shale. We show that the final results converge even when different, but equally valid starting libraries are used for the training. To analyse the impact on feature identification, GP models were trained so that the output was inclined towards a positive, neutral or negative output. For this type of classification, the best results were when the pull was towards a negative output. We also show that the GP output can be adjusted by using a standard deviation coefficient that changes the balance between certainty and accuracy in the results.
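The setup described, training a GP on gamma-log windows labelled positive where the marker shale is present and negative elsewhere, with predictions reverting to the prior mean away from the training library, can be sketched as below; the window length, kernel, and synthetic logs are assumptions and do not reproduce the study's Hamersley data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

def gamma_window(has_marker):
    """Synthetic 32-sample natural-gamma window; the marker shale adds a broad peak."""
    base = rng.normal(0.0, 0.1, 32)
    if has_marker:
        base += np.exp(-0.5 * ((np.arange(32) - 16) / 4.0) ** 2)
    return base

# Training library: windows labelled +1 (marker present) or -1 (absent).
X_train = np.array([gamma_window(i % 2 == 0) for i in range(40)])
y_train = np.array([1.0 if i % 2 == 0 else -1.0 for i in range(40)])

gp = GaussianProcessRegressor(RBF(length_scale=3.0) + WhiteKernel(1e-2)).fit(X_train, y_train)

# Far from the library, predictions revert to the prior mean (0, i.e. "neutral"),
# which is the pull toward the mean discussed in the abstract.
X_test = np.array([gamma_window(True), gamma_window(False)])
print(gp.predict(X_test))   # classify by the sign of the output
```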
Automatic welding detection by an intelligent tool pipe inspection
NASA Astrophysics Data System (ADS)
Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.
2015-07-01
This work provides a model based on machine learning techniques for weld recognition, using signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets and the performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
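A minimal sketch of a comparable pipeline, assuming a hypothetical feature matrix of signal attributes rather than the authors' inspection data: scaling, attribute selection and an SVM are chained together and scored with cross-validated ROC AUC.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Hypothetical feature matrix: one row per pre-processed signal segment from
# the inspection tool, with summary attributes (amplitude/energy statistics).
X = rng.normal(size=(300, 20))
y = rng.integers(0, 2, size=300)          # 1 = weld present, 0 = no weld
X[y == 1, :5] += 1.0                      # give "weld" segments a separable signature

clf = make_pipeline(
    StandardScaler(),                     # normalisation after denoising
    SelectKBest(f_classif, k=8),          # attribute-selection step
    SVC(kernel="rbf"),                    # support vector machine classifier
)

# Cross-validated ROC AUC, in the spirit of the validation described above.
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```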
Image Change Detection via Ensemble Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Benjamin W; Vatsavai, Raju
2013-01-01
The concept of geographic change detection is relevant in many areas. Changes in geography can reveal much information about a particular location. For example, analysis of changes in geography can identify regions of population growth, change in land use, and potential environmental disturbance. A common way to perform change detection is to use a simple method such as differencing to detect regions of change. Though these techniques are simple, often the application of these techniques is very limited. Recently, use of machine learning methods such as neural networks for change detection has been explored with great success. In this work, we explore the use of ensemble learning methodologies for detecting changes in bitemporal synthetic aperture radar (SAR) images. Ensemble learning uses a collection of weak machine learning classifiers to create a stronger classifier which has higher accuracy than the individual classifiers in the ensemble. The strength of the ensemble lies in the fact that the individual classifiers in the ensemble create a mixture of experts in which the final classification made by the ensemble classifier is calculated from the outputs of the individual classifiers. Our methodology leverages this aspect of ensemble learning by training collections of weak decision tree based classifiers to identify regions of change in SAR images collected over a region in the Staten Island, New York area during Hurricane Sandy. Preliminary studies show that the ensemble method has approximately 11.5% higher change detection accuracy than an individual classifier.
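The ensemble idea can be illustrated with a boosted collection of weak, depth-one decision trees; the features and labels below are synthetic stand-ins for bitemporal SAR statistics, not the Hurricane Sandy imagery used in the study.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(2)

# Synthetic per-pixel features standing in for bitemporal SAR statistics,
# e.g. a log-ratio of intensities plus a few local texture measures.
X = rng.normal(size=(2000, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Ensemble of weak (depth-1) decision trees: the boosted combination acts as
# a stronger classifier than any individual stump in the ensemble.
# (The `estimator` argument is named `base_estimator` in older scikit-learn.)
ensemble = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),
    n_estimators=200,
    random_state=0,
).fit(X_tr, y_tr)

print("change-detection accuracy:", accuracy_score(y_te, ensemble.predict(X_te)))
```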
Gross, Douglas P; Zhang, Jing; Steenstra, Ivan; Barnsley, Susan; Haws, Calvin; Amell, Tyler; McIntosh, Greg; Cooper, Juliette; Zaiane, Osmar
2013-12-01
To develop a classification algorithm and accompanying computer-based clinical decision support tool to help categorize injured workers toward optimal rehabilitation interventions based on unique worker characteristics. Population-based historical cohort design. Data were extracted from a Canadian provincial workers' compensation database on all claimants undergoing work assessment between December 2009 and January 2011. Data were available on: (1) numerous personal, clinical, occupational, and social variables; (2) type of rehabilitation undertaken; and (3) outcomes following rehabilitation (receiving time loss benefits or undergoing repeat programs). Machine learning, concerned with the design of algorithms to discriminate between classes based on empirical data, was the foundation of our approach to build a classification system with multiple independent and dependent variables. The population included 8,611 unique claimants. Subjects were predominantly employed (85 %) males (64 %) with diagnoses of sprain/strain (44 %). Baseline clinician classification accuracy was high (ROC = 0.86) for selecting programs that lead to successful return-to-work. Classification performance for machine learning techniques outperformed the clinician baseline classification (ROC = 0.94). The final classifiers were multifactorial and included the variables: injury duration, occupation, job attachment status, work status, modified work availability, pain intensity rating, self-rated occupational disability, and 9 items from the SF-36 Health Survey. The use of machine learning classification techniques appears to have resulted in classification performance better than clinician decision-making. The final algorithm has been integrated into a computer-based clinical decision support tool that requires additional validation in a clinical sample.
Matsunuma, Mitsuyasu
2009-04-01
This study examined why some high achievers on the course final exam were unsuccessful on the proficiency exam in English. We hypothesized that the learning motives and learning behaviors (learning strategy, learning time) had different effects on the outcomes of the exams. First, the relation between the variables was investigated using structural equation modeling. Second, the learning behaviors of students who got good marks on both exams were compared with students who did well only on the course final exam. The results were as follows. (a) Learning motives influenced test performance via learning behaviors. (b) Content-attached motives influenced all variables concerning learning behaviors. (c) Content-detached motives influenced all variables concerning learning behaviors that were related only to the course final exam. (d) The students who got good marks on both exams performed the learning behaviors that were useful on the proficiency exam more frequently than the students who did well only on the course final exam.
A Learner-Centered Spiral Knowledge Approach to Teaching Isotope Geology
NASA Astrophysics Data System (ADS)
Reid, M. R.
2006-12-01
Aided by the insights I gained by participation in the Arizona Board of Regents Tri-University Collaboration on Learner-Centered Practice, I made major changes to a graduate course in isotope geology (GLG617), including: 1) implementation of a spiral knowledge approach (e.g., Bruner, 1990; Dyar et al., 2004); 2) incorporation of more learner-centered in-class activities; and 3) more explicit emphasis on skills that I regarded as important for success in geochemistry. In the geosciences, the field of isotope geology is now an essential area of inquiry with implications for geologic timescales, climate information, tracing geochemical processes, and biological evolution, to name a few. The traditional approach to teaching isotope geology suffers from the fact that learning tends to be compartmentalized by technique/approach and one subfield (e.g., stable or radiogenic isotopes) is usually favored by appearing earlier in the semester. To make learning more integrated, I employed a simplified spiral learning approach so that common principles could be revisited several times over the course of the semester and, in so doing, students' grasp of the fundamental principles could be scaffolded into greater understanding. Other learner-centered changes to the course included more explicit emphasis on helping students become comfortable with interpreting data displayed graphically and explicit emphasis on helping students give and evaluate oral presentations that rely on isotope data. I also developed a detailed grading rubric for the final paper and allowed students to have a draft of their final papers evaluated and graded (guided by Huba and Freed, 2000). A number of cooperative learning activities developed specifically for this course (19 in all) enabled me to gain a better appreciation for students' learning. Activities included pair share, round-robin, small group explorations of techniques and case studies (sometimes as introduction to, sometimes as review of material), and Jeopardy-style review sessions. Student learning was also encouraged by brief take-home assignments (graded for "participation") and announced and unannounced quizzes on reading and lecture material. On assessment questionnaires completed after three major milestones in the course (the mid-term exam, the first oral presentation, and the final paper), students ranked the in-class cooperative learning activities as on par with lectures and homework exercises in facilitating their learning. Students recorded improvements in perceived comfort levels with the three major goals identified for the course as the semester progressed even though they were not explicitly reminded of these goals at the time of assessment. Exam performance was better than average and student evaluations indicated greater instructor satisfaction. I enjoyed teaching the class as much as any I have ever taught. References cited: Bruner, J., 1990. Acts of Meaning. Harvard University Press.; Dyar, M.D., Gunter, M.E., Davis, J.C., and Odell, M.R., 2004. Integration of new methods into teaching mineralogy; Huba, M.E. and Freed, J.E., 2000. Learner-centered Assessment on College Campus: Shifting the Focus from Teaching to Learning. Allyn and Bacon.
Part-Task Training Strategies in Simulated Carrier Landing Final Approach Training
1983-11-01
received a large amount of attention in the recent past. However, the notion that the value of flight simulation may be enhanced when principles of ... as training devices through the application of principles of learning. The research proposed here is based on this point of view. THIS EXPERIMENT The ... tracking. Following Goldstein's suggestion, one should look for training techniques suggested by learning principles developed from research on
NASA Technical Reports Server (NTRS)
Reif, John H.
1987-01-01
A parallel compression algorithm for the 16,384 processor MPP machine was developed. The serial version of the algorithm can be viewed as a combination of on-line dynamic lossless text compression techniques (which employ simple learning strategies) and vector quantization. These concepts are described. How these concepts are combined to form a new strategy for performing dynamic on-line lossy compression is discussed. Finally, the implementation of this algorithm in a massively parallel fashion on the MPP is discussed.
Palmer, Russ; Elder, Deborah; Fulford, Michael; Morris, Steve; Sappington, Kellie
2015-01-01
Objective. To evaluate how flexible learning via online video review affects the ability and confidence of first-year (P1) pharmacy students to accurately compound aseptic preparations. Design. Customary instructions and assignments for aseptic compounding were provided to students, who were given unlimited access to 5 short review videos in addition to customary instruction. Student self-confidence was assessed online, and faculty members evaluated students’ aseptic technique at the conclusion of the semester. Assessment. No significant difference on final assessment scores was observed between those who viewed videos and those who did not. Student self-confidence scores increased significantly from baseline, but were not significantly higher for those who viewed videos than for those who did not. Conclusion. First-year students performed well on final aseptic compounding assessments, and those who viewed videos had a slight advantage. Student self-confidence improved over the semester regardless of whether or not students accessed review videos. PMID:26430278
Improving basic surgical skills for final year medical students: the value of a rural weekend.
House, A K; House, J
2000-05-01
Hospitals employing medical graduates often express concern at the inexperience of new interns in basic surgical skills. In self assessment questionnaires, our senior medical students reported little clinical procedural experience. A practical skills workshop was staged in order to set learning goals for the final study year. This gave the students an opportunity to learn, revise and practice basic surgical techniques. The Bruce Rock rural community sponsored a surgical camp at the beginning of the academic year. Ninety-five (80%) of the class registered at the workshop, which rotated them through teaching modules, with private study opportunities and the capacity to cater for varied skill levels. Eight teaching stations with multiple access points were provided, and ten mock trauma scenarios were staged to augment the learning process. The teaching weekend was rated by students on an evaluative entrance and exit questionnaire. Sixty-five (73%) students returned questionnaires. They recorded significant improvement (P < 0.05) in their ability to handle the teaching stations. All students had inserted intravenous lines in practice prior to the camp, so the rating change in intravenous line insertion ability was not statistically significant. The weekend retreat offers students a chance to focus on surgical skills, free from the pressures of a clinical setting or the classroom. The emphasis was on the value of practice and primary skills learning. Students endorsed the camp as relevant, practical and an enjoyable learning experience for basic surgical skills.
Solving and Learning Soft Temporal Constraints: Experimental Setting and Results
NASA Technical Reports Server (NTRS)
Rossi, F.; Sperduti, A.; Venable, K. B.; Khatib, L.; Morris, P.; Morris, R.; Clancy, Daniel (Technical Monitor)
2002-01-01
Soft temporal constraint problems allow one to describe, in a natural way, scenarios where events happen over time and preferences are associated with event distances and durations. However, sometimes such local preferences are difficult to set, and it may be easier instead to associate preferences with some complete solutions of the problem. Machine learning techniques can be useful in this respect. In this paper we describe two solvers (one more general and the other more efficient) for tractable subclasses of soft temporal problems, and we show some experimental results. The random generator used to build the problems on which the tests are performed is also described. We also compare the two solvers, highlighting the tradeoff between performance and representational power. Finally, we present a learning module and we show its behavior on randomly generated examples.
Classification of fMRI resting-state maps using machine learning techniques: A comparative study
NASA Astrophysics Data System (ADS)
Gallos, Ioannis; Siettos, Constantinos
2017-11-01
We compare the efficiency of Principal Component Analysis (PCA) and nonlinear manifold learning algorithms (ISOMAP and Diffusion Maps) for classifying brain maps between groups of schizophrenia patients and healthy controls, using fMRI scans acquired during a resting-state experiment. After a standard pre-processing pipeline, we applied spatial Independent Component Analysis (ICA) to reduce (a) the noise and (b) the spatial-temporal dimensionality of the fMRI maps. On the cross-correlation matrix of the ICA components, we applied PCA, ISOMAP and Diffusion Maps to find an embedded low-dimensional space. Finally, support-vector-machine (SVM) and k-NN algorithms were used to evaluate the performance of the algorithms in classifying between the two groups.
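A hedged sketch of this kind of comparison using scikit-learn: PCA and ISOMAP embeddings of hypothetical connectivity features are each paired with SVM and k-NN classifiers. The feature matrix is random with an injected group difference, so the numbers are meaningless; in a real analysis the embedding should also be re-fit inside each cross-validation fold.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Hypothetical input: per-subject feature vectors built from the upper triangle
# of the cross-correlation matrix of ICA components (here 20 components).
n_subjects, n_ic = 80, 20
feats = rng.normal(size=(n_subjects, n_ic * (n_ic - 1) // 2))
labels = rng.integers(0, 2, size=n_subjects)   # 0 = control, 1 = patient
feats[labels == 1, :10] += 0.8                 # inject a weak group difference

for name, reducer in [("PCA", PCA(n_components=5)),
                      ("Isomap", Isomap(n_components=5, n_neighbors=10))]:
    low_dim = reducer.fit_transform(feats)     # embedded low-dimensional space
    for clf_name, clf in [("SVM", SVC()), ("k-NN", KNeighborsClassifier(5))]:
        acc = cross_val_score(clf, low_dim, labels, cv=5).mean()
        print(f"{name} + {clf_name}: {acc:.2f}")
```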
Pattern perception and computational complexity: introduction to the special issue
Fitch, W. Tecumseh; Friederici, Angela D.; Hagoort, Peter
2012-01-01
Research on pattern perception and rule learning, grounded in formal language theory (FLT) and using artificial grammar learning paradigms, has exploded in the last decade. This approach marries empirical research conducted by neuroscientists, psychologists and ethologists with the theory of computation and FLT, developed by mathematicians, linguists and computer scientists over the last century. Of particular current interest are comparative extensions of this work to non-human animals, and neuroscientific investigations using brain imaging techniques. We provide a short introduction to the history of these fields, and to some of the dominant hypotheses, to help contextualize these ongoing research programmes, and finally briefly introduce the papers in the current issue. PMID:22688630
Incorporating an ERP Project into Undergraduate Instruction
Nyhus, Erika; Curtis, Nancy
2016-01-01
Electroencephalogram (EEG) is a relatively non-invasive, simple technique, and recent advances in open source analysis tools make it feasible to implement EEG as a component in undergraduate neuroscience curriculum. We have successfully led students to design novel experiments, record EEG data, and analyze event-related potentials (ERPs) during a one-semester laboratory course for undergraduates in cognitive neuroscience. First, students learned how to set up an EEG recording and completed an analysis tutorial. Students then learned how to set up a novel EEG experiment; briefly, they formed groups of four and designed an EEG experiment on a topic of their choice. Over the course of two weeks students collected behavioral and EEG data. Each group then analyzed their behavioral and ERP data and presented their results both as a presentation and as a final paper. Upon completion of the group project students reported a deeper understanding of cognitive neuroscience methods and a greater appreciation for the strengths and weaknesses of the EEG technique. Although recent advances in open source software made this project possible, it also required access to EEG recording equipment and proprietary software. Future efforts should be directed at making publicly available datasets to learn ERP analysis techniques and making publicly available EEG recording and analysis software to increase the accessibility of hands-on research experience in undergraduate cognitive neuroscience laboratory courses. PMID:27385925
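The core ERP computation the students carried out is simple enough to show in a few lines; the sketch below averages synthetic single-channel epochs to recover an evoked deflection. The sampling rate, epoch window and amplitudes are invented for illustration, not taken from the course data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical epoched EEG for one channel: 60 trials x 300 samples
# (-100 to +498 ms at 500 Hz), with a small evoked deflection added
# around 150 ms post-stimulus on top of trial-by-trial noise.
fs, n_trials, n_samples = 500, 60, 300
times_ms = np.arange(n_samples) / fs * 1000 - 100
epochs = rng.normal(scale=10.0, size=(n_trials, n_samples))      # µV-scale noise
epochs += 5.0 * np.exp(-((times_ms - 150) / 40) ** 2)            # evoked component

# Core ERP step: averaging across trials cancels activity that is not
# time-locked to the stimulus and leaves the event-related potential.
erp = epochs.mean(axis=0)
peak_ms = times_ms[np.argmax(erp)]
print(f"ERP peak at ~{peak_ms:.0f} ms, amplitude {erp.max():.1f} µV")
```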
Meng, Qier; Kitasaka, Takayuki; Nimura, Yukitaka; Oda, Masahiro; Ueno, Junji; Mori, Kensaku
2017-02-01
Airway segmentation plays an important role in analyzing chest computed tomography (CT) volumes for computerized lung cancer detection, emphysema diagnosis and pre- and intra-operative bronchoscope navigation. However, obtaining a complete 3D airway tree structure from a CT volume is quite a challenging task. Several researchers have proposed automated airway segmentation algorithms based mainly on region growing and machine learning techniques. However, these methods fail to detect the peripheral bronchial branches, which results in a large amount of leakage. This paper presents a novel approach for more accurate extraction of the complex airway tree. The proposed segmentation method is composed of three steps. First, Hessian analysis is utilized to enhance the tube-like structures in CT volumes; then, an adaptive multiscale cavity enhancement filter is employed to detect cavity-like structures with different radii. In the second step, support vector machine learning is utilized to remove the false positive (FP) regions from the result obtained in the previous step. Finally, the graph-cut algorithm is used to refine the candidate voxels to form an integrated airway tree. A test dataset including 50 standard-dose chest CT volumes was used for evaluating our proposed method. The average extraction rate was about 79.1 % with a significantly decreased FP rate. A new method of airway segmentation based on local intensity structure and machine learning techniques was developed. The method was shown to be feasible for airway segmentation in a computer-aided diagnosis system for a lung and bronchoscope guidance system.
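The first (tube-enhancement) step can be approximated with an off-the-shelf Hessian-based filter such as scikit-image's frangi; this is only an analogue of the multiscale enhancement described above, not the authors' filter. The toy volume below contains a synthetic bright tube rather than a CT scan (real airway lumens are dark, which would call for black_ridges=True).

```python
import numpy as np
from skimage.filters import frangi

rng = np.random.default_rng(0)

# Toy volume with a bright 1-voxel-wide "tube" along z plus noise.
vol = rng.normal(scale=0.05, size=(40, 40, 40))
vol[:, 20, 20] += 1.0

# Hessian-based multiscale filter: responds strongly to tube-like structures
# at the radii listed in `sigmas` (in voxels).
tubeness = frangi(vol, sigmas=(1, 2, 3), black_ridges=False)
candidates = tubeness > 0.5 * tubeness.max()   # candidate airway voxels
print("candidate voxel count:", int(candidates.sum()))
```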
Machine Learning and Data Mining for Comprehensive Test Ban Treaty Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Russell, S; Vaidya, S
2009-07-30
The Comprehensive Test Ban Treaty (CTBT) is gaining renewed attention in light of growing worldwide interest in mitigating risks of nuclear weapons proliferation and testing. Since the International Monitoring System (IMS) installed the first suite of sensors in the late 1990's, the IMS network has steadily progressed, providing valuable support for event diagnostics. This progress was highlighted at the recent International Scientific Studies (ISS) Conference in Vienna in June 2009, where scientists and domain experts met with policy makers to assess the current status of the CTBT Verification System. A strategic theme within the ISS Conference centered on exploring opportunities for further enhancing the detection and localization accuracy of low magnitude events by drawing upon modern tools and techniques for machine learning and large-scale data analysis. Several promising approaches for data exploitation were presented at the Conference. These are summarized in a companion report. In this paper, we introduce essential concepts in machine learning and assess techniques which could provide both incremental and comprehensive value for event discrimination by increasing the accuracy of the final data product, refining On-Site-Inspection (OSI) conclusions, and potentially reducing the cost of future network operations.
Leyva-Moral, Juan M; Riu Camps, Marta
2016-05-01
To adapt nursing studies to the European Higher Education Area, new teaching methods have been included that assign maximum importance to student-centered learning and collaborative work. The Jigsaw Technique is based on collaborative learning and everyone in the group must play their part because each student's mark depends on the other students. Home group members are given the responsibility to become experts in a specific area of knowledge. Experts meet together to reach an agreement and improve skills. Finally, experts return to their home groups to share all their findings. The aim of this study was to evaluate nursing student satisfaction with the Jigsaw Technique used in the context of a compulsory course in research methods for nursing. A cross-sectional study was conducted using a self-administered anonymous questionnaire administered to students who completed the Research Methods course during the 2012-13 and 2013-14 academic years. The questionnaire was developed taking into account the learning objectives, competencies and skills that should be acquired by students, as described in the course syllabus. The responses were compared by age group (younger or older than 22 years). A total of 89.6% of nursing students under 22 years believed that this methodology helped them to develop teamwork, while this figure was 79.6% in older students. Nursing students also believed it helped them to work independently, with differences according to age, 79.7% and 58% respectively (p=0.010). Students disagreed with the statement "The Jigsaw Technique involves little workload", with percentages of 88.5% in the group under 22 years and 80% in older students. Most believed that this method should not be employed in upcoming courses, although there were differences by age, with 44.3% of the younger group being against and 62% of the older group (p=0.037). The method was not highly valued by students, mainly by those older than 22 years, who concluded that they did not learn more with it than with other traditional techniques. The results of this study question whether this form of learning meets students' learning needs and its compatibility with individual and group realities. Copyright © 2016 Elsevier Ltd. All rights reserved.
Mincholé, Ana; Martínez, Juan Pablo; Laguna, Pablo; Rodriguez, Blanca
2018-01-01
Widely developed for clinical screening, electrocardiogram (ECG) recordings capture the cardiac electrical activity from the body surface. ECG analysis can therefore be a crucial first step to help diagnose, understand and predict cardiovascular disorders responsible for 30% of deaths worldwide. Computational techniques, and more specifically machine learning techniques and computational modelling are powerful tools for classification, clustering and simulation, and they have recently been applied to address the analysis of medical data, especially ECG data. This review describes the computational methods in use for ECG analysis, with a focus on machine learning and 3D computer simulations, as well as their accuracy, clinical implications and contributions to medical advances. The first section focuses on heartbeat classification and the techniques developed to extract and classify abnormal from regular beats. The second section focuses on patient diagnosis from whole recordings, applied to different diseases. The third section presents real-time diagnosis and applications to wearable devices. The fourth section highlights the recent field of personalized ECG computer simulations and their interpretation. Finally, the discussion section outlines the challenges of ECG analysis and provides a critical assessment of the methods presented. The computational methods reported in this review are a strong asset for medical discoveries and their translation to the clinical world may lead to promising advances. PMID:29321268
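As a small, self-contained example of the beat-level processing such methods typically start from, the sketch below detects R peaks in a synthetic single-lead trace and cuts fixed windows around them; the resulting beat matrix is what a heartbeat classifier would then consume. The signal, sampling rate and thresholds are all invented and unrelated to the review above.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)

# Synthetic single-lead "ECG": one sharp R-like peak per second on a noisy
# baseline, sampled at 250 Hz (not real patient data).
fs = 250
ecg = 0.05 * rng.normal(size=10 * fs)
ecg[::fs] += 1.0

# Beat segmentation: locate R peaks, then cut a fixed window around each one.
r_peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
beats = np.stack([ecg[p - 25:p + 50] for p in r_peaks
                  if 25 <= p <= ecg.size - 50])
print(beats.shape)   # (n_beats, samples_per_beat) -> features for a classifier
```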
An Algebra-Based Introductory Computational Neuroscience Course with Lab.
Fink, Christian G
2017-01-01
A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.
WE-A-BRE-01: Debate: To Measure or Not to Measure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moran, J; Miften, M; Mihailidis, D
2014-06-15
Recent studies have highlighted some of the limitations of patient-specific pre-treatment IMRT QA measurements with respect to assessing plan deliverability. Pre-treatment QA measurements are frequently performed with detectors in phantoms that do not involve any patient heterogeneities or with an EPID without a phantom. Other techniques have been developed where measurement results are used to recalculate the patient-specific dose volume histograms. Measurements continue to play a fundamental role in understanding the initial and continued performance of treatment planning and delivery systems. Less attention has been focused on the role of computational techniques in a QA program, such as calculation with independent dose calculation algorithms or recalculation of the delivery with machine log files or EPID measurements. This session will explore the role of pre-treatment measurements compared to other methods such as computational and transit dosimetry techniques. Efficiency and practicality of the two approaches will also be presented and debated. The speakers will present a history of IMRT quality assurance and debate each other regarding which types of techniques are needed today and for future quality assurance. Examples will be shared of situations where overall quality needed to be assessed with calculation techniques in addition to measurements. Elements where measurements continue to be crucial, such as for a thorough end-to-end test involving measurement, will be discussed. Operational details that can reduce the gamma tool effectiveness and accuracy for patient-specific pre-treatment IMRT/VMAT QA will be described. Finally, a vision for the future of IMRT and VMAT plan QA will be discussed from a safety perspective. Learning Objectives: Understand the advantages and limitations of measurement and calculation approaches for pre-treatment measurements for IMRT and VMAT planning. Learn about the elements of a balanced quality assurance program involving modulated techniques. Learn how to use tools and techniques such as an end-to-end test to enhance your IMRT and VMAT QA program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Searcy, Jacob; Huang, Lillian; Pleier, Marc-Andre
The unitarization of the longitudinal vector boson scattering (VBS) cross section by the Higgs boson is a fundamental prediction of the Standard Model which has not been experimentally verified. One of the most promising ways to measure VBS uses events containing two leptonically decaying same-electric-charge W bosons produced in association with two jets. However, the angular distributions of the leptons in the W boson rest frame, which are commonly used to fit polarization fractions, are not readily available in this process due to the presence of two neutrinos in the final state. In this paper we present a method to alleviate this problem by using a deep machine learning technique to recover these angular distributions from measurable event kinematics and demonstrate how the longitudinal-longitudinal scattering fraction could be studied. Furthermore, we show that this method doubles the expected sensitivity when compared to previous proposals.
Searcy, Jacob; Huang, Lillian; Pleier, Marc-Andre; ...
2016-05-27
The unitarization of the longitudinal vector boson scattering (VBS) cross section by the Higgs boson is a fundamental prediction of the Standard Model which has not been experimentally verified. One of the most promising ways to measure VBS uses events containing two leptonically decaying same-electric-charge W bosons produced in association with two jets. However, the angular distributions of the leptons in the W boson rest frame, which are commonly used to fit polarization fractions, are not readily available in this process due to the presence of two neutrinos in the final state. In this paper we present a method to alleviate this problem by using a deep machine learning technique to recover these angular distributions from measurable event kinematics and demonstrate how the longitudinal-longitudinal scattering fraction could be studied. Furthermore, we show that this method doubles the expected sensitivity when compared to previous proposals.
Assessing the use of multiple sources in student essays.
Hastings, Peter; Hughes, Simon; Magliano, Joseph P; Goldman, Susan R; Lawless, Kimberly
2012-09-01
The present study explored different approaches for automatically scoring student essays that were written on the basis of multiple texts. Specifically, these approaches were developed to classify whether or not important elements of the texts were present in the essays. The first was a simple pattern-matching approach called "multi-word" that allowed for flexible matching of words and phrases in the sentences. The second technique was latent semantic analysis (LSA), which was used to compare student sentences to original source sentences using its high-dimensional vector-based representation. Finally, the third was a machine-learning technique, support vector machines, which learned a classification scheme from the corpus. The results of the study suggested that the LSA-based system was superior for detecting the presence of explicit content from the texts, but the multi-word pattern-matching approach was better for detecting inferences outside or across texts. These results suggest that the best approach for analyzing essays of this nature should draw upon multiple natural language processing approaches.
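An LSA-style scorer of the kind described can be sketched as a TF-IDF → truncated SVD → linear SVM pipeline; the toy sentences and labels below are invented and far too few for any real evaluation, they only show the wiring.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical toy corpus: student sentences labelled 1 if they express a
# target content element from the source texts, 0 otherwise.
sentences = [
    "greenhouse gases trap heat in the atmosphere",
    "carbon dioxide levels have risen since the industrial era",
    "my favourite subject this semester is history",
    "the atmosphere traps heat because of greenhouse gases",
    "i wrote this essay the night before it was due",
    "rising co2 concentrations warm the planet over time",
]
labels = [1, 1, 0, 1, 0, 1]

# LSA-style pipeline: TF-IDF -> truncated SVD (latent semantic space) -> SVM.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    TruncatedSVD(n_components=4, random_state=0),
    LinearSVC(),
).fit(sentences, labels)

print(model.predict(["heat is trapped by gases in the atmosphere"]))
```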
NASA Astrophysics Data System (ADS)
Mandal, Shyamapada; Santhi, B.; Sridhar, S.; Vinolia, K.; Swaminathan, P.
2017-06-01
In this paper, an online fault detection and classification method is proposed for thermocouples used in nuclear power plants. In the proposed method, fault data are detected by the classification method, which separates fault data from normal data. A deep belief network (DBN), a deep learning technique, is applied to classify the fault data. The DBN has a multilayer feature extraction scheme, which is highly sensitive to small variations in the data. Since the classification method cannot identify which sensor is faulty, a technique is proposed to identify the faulty sensor from the fault data. Finally, a composite statistical hypothesis test, namely the generalized likelihood ratio test, is applied to compute the fault pattern of the faulty sensor signal based on the magnitude of the fault. The performance of the proposed method is validated with field data obtained from thermocouple sensors of the fast breeder test reactor.
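The final step, a generalized likelihood ratio test for a constant bias in Gaussian noise, reduces to a closed form that is easy to sketch; the baseline temperature, noise level and injected fault below are hypothetical values, not reactor data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical thermocouple readings (°C): nominal noise around a known
# baseline, with a constant bias fault injected halfway through the window.
baseline, sigma, n = 350.0, 0.5, 200
signal = baseline + rng.normal(scale=sigma, size=n)
signal[n // 2:] += 2.0                      # injected bias fault of +2 °C

window = signal[n // 2:] - baseline
# GLRT for an unknown constant bias in Gaussian noise with known variance:
# the maximum-likelihood bias estimate is the sample mean, and the test
# statistic N * mean^2 / sigma^2 is compared against a chi-square threshold.
bias_hat = window.mean()
glrt_stat = window.size * bias_hat**2 / sigma**2
print(f"estimated fault magnitude: {bias_hat:.2f} °C, GLRT statistic: {glrt_stat:.1f}")
```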
NASA Astrophysics Data System (ADS)
Morales, Francisco J.; Reyes, Antonio; Cáceres, Noelia; Romero, Luis M.; Benitez, Francisco G.; Morgado, Joao; Duarte, Emanuel; Martins, Teresa
2017-09-01
A large percentage of transport infrastructures are composed of linear assets, such as roads and rail tracks. The large social and economic relevance of these constructions forces the stakeholders to ensure prolonged health/durability. Even so, inevitable malfunctions, breakdowns, and out-of-service periods arise randomly during the life cycle of the infrastructure. Predictive maintenance techniques tend to diminish the occurrence of unpredicted failures and the execution of needed corrective interventions, by envisaging the adequate interventions to be conducted before failures show up. This communication presents: i) a procedural approach for collecting the relevant information regarding the evolving state condition of the assets involved in all maintenance interventions; this reported and stored information constitutes a rich historical database for training Machine Learning algorithms to generate reliable predictions of the interventions to be carried out in future time scenarios; ii) a schematic flow chart of the automatic learning procedure; and iii) self-learning rules derived automatically from false positives/negatives. The description, testing, automatic learning approach and the outcomes of a pilot case are presented; finally, some conclusions are outlined regarding the methodology proposed for improving the self-learning predictive capability.
Sixty-five years of the long march in protein secondary structure prediction: the final stretch?
Yang, Yuedong; Gao, Jianzhao; Wang, Jihua; Heffernan, Rhys; Hanson, Jack; Paliwal, Kuldip; Zhou, Yaoqi
2018-01-01
Protein secondary structure prediction began in 1951 when Pauling and Corey predicted helical and sheet conformations for the protein polypeptide backbone even before the first protein structure was determined. Sixty-five years later, powerful new methods breathe new life into this field. The highest three-state accuracy without relying on structure templates is now at 82–84%, a number unthinkable just a few years ago. These improvements came from increasingly larger databases of protein sequences and structures for training, the use of template secondary structure information and more powerful deep learning techniques. As we approach the theoretical limit of three-state prediction (88–90%), alternatives to secondary structure prediction (prediction of backbone torsion angles and Cα-atom-based angles and torsion angles) not only have more room for further improvement but also allow direct prediction of three-dimensional fragment structures with constantly improved accuracy. About 20% of all 40-residue fragments in a database of 1199 non-redundant proteins have <6 Å root-mean-squared distance from the native conformations by SPIDER2. More powerful deep learning methods with improved capability of capturing long-range interactions begin to emerge as the next generation of techniques for secondary structure prediction. The time has come to finish off the final stretch of the long march towards protein secondary structure prediction. PMID:28040746
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aziz, H. M. Abdul; Zhu, Feng; Ukkusuri, Satish V.
Here, this research applies an R-Markov Average Reward Technique-based reinforcement learning (RL) algorithm, namely RMART, to the vehicular signal control problem, leveraging information sharing among signal controllers in a connected vehicle environment. We implemented the algorithm in a network of 18 signalized intersections and compared the performance of RMART with fixed and adaptive schemes and with variants of the RL schemes. Results show significant improvement in system performance for the RMART algorithm with information sharing over both traditional fixed signal timing plans and real-time adaptive control schemes. Additionally, the comparison with reinforcement learning algorithms including Q-learning and SARSA indicates that RMART performs better at higher congestion levels. Further, a multi-reward structure is proposed that dynamically adjusts the reward function with varying congestion states at the intersection. Finally, the results from test networks show significant reductions in emissions (CO, CO2, NOx, VOC, PM10) when RL algorithms are implemented compared to fixed signal timings and adaptive schemes.
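RMART itself is not reproduced here, but the average-reward (R-learning) update it builds on can be sketched on a toy single-intersection model; the state encoding, queue dynamics and learning rates below are placeholders, not the paper's 18-intersection network or its multi-reward structure.

```python
import numpy as np

rng = np.random.default_rng(4)

# Minimal average-reward (R-learning) sketch for one signalised intersection.
n_states, n_actions = 8, 2            # discretised queue level x 2 signal phases
alpha, beta, eps = 0.1, 0.01, 0.1     # step sizes and exploration rate
R = np.zeros((n_states, n_actions))   # relative (average-adjusted) action values
rho = 0.0                             # running estimate of the average reward

def step(state, action):
    """Toy transition: serving the 'right' phase clears part of the queue."""
    queue = state - (2 if action == state % 2 else 0) + rng.integers(0, 2)
    next_state = int(np.clip(queue, 0, n_states - 1))
    return next_state, -next_state    # reward = negative remaining queue

state = 0
for _ in range(20000):
    greedy = int(np.argmax(R[state]))
    action = rng.integers(n_actions) if rng.random() < eps else greedy
    next_state, reward = step(state, action)
    delta = reward - rho + R[next_state].max() - R[state, action]
    R[state, action] += alpha * delta
    if action == greedy:              # update the average-reward estimate
        rho += beta * delta           # only on greedy steps (R-learning rule)
    state = next_state

print("estimated average reward:", round(rho, 2))
```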
Smart Training, Smart Learning: The Role of Cooperative Learning in Training for Youth Services.
ERIC Educational Resources Information Center
Doll, Carol A.
1997-01-01
Examines cooperative learning in youth services and adult education. Discusses characteristics of cooperative learning techniques; specific cooperative learning techniques (brainstorming, mini-lecture, roundtable technique, send-a-problem problem solving, talking chips technique, and three-step interview); and the role of the trainer. (AEF)
Dang, Shilpa; Chaudhury, Santanu; Lall, Brejesh; Roy, Prasun Kumar
2017-06-15
Determination of effective connectivity (EC) among brain regions using fMRI is helpful in understanding the underlying neural mechanisms. Dynamic Bayesian Networks (DBNs) are an appropriate class of probabilistic graphical temporal models that have been used in the past to model EC from fMRI, specifically those of order one. High-order DBNs (HO-DBNs) have still not been explored for fMRI data. A fundamental problem faced in the structure learning of HO-DBNs is the high computational burden and low accuracy of the existing heuristic search techniques used for EC detection from fMRI. In this paper, we propose using the dynamic programming (DP) principle, along with integration of properties of the scoring function, to reduce the search space for structure learning of HO-DBNs and, finally, for identifying EC from fMRI, which has not been done yet to the best of our knowledge. The proposed exact search-&-score learning approach, HO-DBN-DP, is an extension of a technique originally devised for learning a BN's structure from static data (Singh and Moore, 2005). Its effectiveness in structure learning is shown on a synthetic fMRI dataset. The algorithm reaches the globally optimal solution in appreciably reduced time complexity compared with the static counterpart, due to the integration of properties. A proof of optimality is provided. The results demonstrate that HO-DBN-DP is more accurate and faster than the structure-learning algorithms currently used for identifying EC from fMRI. The EC obtained from real data with HO-DBN-DP shows greater consistency with previous literature than the classical Granger Causality method. Hence, the DP algorithm can be employed for reliable EC estimates from experimental fMRI data. Copyright © 2017 Elsevier B.V. All rights reserved.
Exploiting range imagery: techniques and applications
NASA Astrophysics Data System (ADS)
Armbruster, Walter
2009-07-01
Practically no applications exist for which automatic processing of 2D intensity imagery can equal human visual perception. This is not the case for range imagery. The paper gives examples of 3D laser radar applications, for which automatic data processing can exceed human visual cognition capabilities and describes basic processing techniques for attaining these results. The examples are drawn from the fields of helicopter obstacle avoidance, object detection in surveillance applications, object recognition at high range, multi-object-tracking, and object re-identification in range image sequences. Processing times and recognition performances are summarized. The techniques used exploit the bijective continuity of the imaging process as well as its independence of object reflectivity, emissivity and illumination. This allows precise formulations of the probability distributions involved in figure-ground segmentation, feature-based object classification and model based object recognition. The probabilistic approach guarantees optimal solutions for single images and enables Bayesian learning in range image sequences. Finally, due to recent results in 3D-surface completion, no prior model libraries are required for recognizing and re-identifying objects of quite general object categories, opening the way to unsupervised learning and fully autonomous cognitive systems.
Interactive computer simulations of knee-replacement surgery.
Gunther, Stephen B; Soto, Gabriel E; Colman, William W
2002-07-01
Current surgical training programs in the United States are based on an apprenticeship model. This model is outdated because it does not provide conceptual scaffolding, promote collaborative learning, or offer constructive reinforcement. Our objective was to create a more useful approach by preparing students and residents for operative cases using interactive computer simulations of surgery. Total-knee-replacement surgery (TKR) is an ideal procedure to model on the computer because there is a systematic protocol for the procedure. Also, this protocol is difficult to learn by the apprenticeship model because of the multiple instruments that must be used in a specific order. We designed an interactive computer tutorial to teach medical students and residents how to perform knee-replacement surgery. We also aimed to reinforce the specific protocol of the operative procedure. Our final goal was to provide immediate, constructive feedback. We created a computer tutorial by generating three-dimensional wire-frame models of the surgical instruments. Next, we applied a surface to the wire-frame models using three-dimensional modeling. Finally, the three-dimensional models were animated to simulate the motions of an actual TKR. The tutorial is a step-by-step tutorial that teaches and tests the correct sequence of steps in a TKR. The student or resident must select the correct instruments in the correct order. The learner is encouraged to learn the stepwise surgical protocol through repetitive use of the computer simulation. Constructive feedback is acquired through a grading system, which rates the student's or resident's ability to perform the task in the correct order. The grading system also accounts for the time required to perform the simulated procedure. We evaluated the efficacy of this teaching technique by testing medical students who learned by the computer simulation and those who learned by reading the surgical protocol manual. Both groups then performed TKR on manufactured bone models using real instruments. Their technique was graded with the standard protocol. The students who learned on the computer simulation performed the task in a shorter time and with fewer errors than the control group. They were also more engaged in the learning process. Surgical training programs generally lack a consistent approach to preoperative education related to surgical procedures. This interactive computer tutorial has allowed us to make a quantum leap in medical student and resident teaching in our orthopedic department because the students actually participate in the entire process. Our technique provides a linear, sequential method of skill acquisition and direct feedback, which is ideally suited for learning stepwise surgical protocols. Since our initial evaluation has shown the efficacy of this program, we have implemented this teaching tool into our orthopedic curriculum. Our plans for future work with this simulator include modeling procedures involving other anatomic areas of interest, such as the hip and shoulder.
Mental models, metaphors and their use in the education of nurses.
Burke, L M; Wilson, A M
1997-11-01
A great deal of nurses' confidence in the use of information technology (IT) depends both on the way computers are introduced to students in the college and how such education is continued and applied when they are practitioners. It is therefore vital that teachers of IT assist nurses to discover ways of learning to utilize and apply computers within their workplace with whatever methods are available. One method which has been introduced with success in other fields is the use of mental models and metaphors. Mental models and metaphors enable individuals to learn by building on past learning. Concepts and ideas which have already been internalized from past experience can be transferred and adapted for usage in a new learning situation with computers and technology. This article explores the use of mental models and metaphors for the technological education of nurses. The concepts themselves will be examined, followed by suggestions for possible applications specifically in the field of nursing and health care. Finally the role of the teacher in enabling improved learning as a result of these techniques will be addressed.
Using Hollywood techniques to teach freshman astronomy over the Internet
NASA Astrophysics Data System (ADS)
Friedberg, R.; Lipnick, D.; Vila Migliaro, M.
We use interactive 'click and drag' learning, bold colors, high graphic design standards, cartooning, animations and videos. We present Astronomy material in three languages, written English, written Spanish, and written and spoken Navajo. This distance-learning course is specifically designed for students with limited proficiency in the English language. We have both a lecture and laboratory series in the course that may be found at www.ibe.ncc.cc.nm.us and http://yoda.phys.unm.edu/ast100. It carries 4 hours of credit as Astronomy 100. To paraphrase John Ford, the great Hollywood director, a good movie should be able to stand with no dialogue. We have tried to meet his standard. We have borrowed heavily from the style of the Pvt. Snafu World War II military training films produced principally by the Walt Disney Studios. We have also used the graphic design techniques that I learned many years ago as a technical briefing officer at the Chief of Naval Operation's Briefing Room at the Pentagon, Washington D.C.. Finally, we use elements of 'Programmed Learning' developed by the American Management Association thirty odd years ago. Elements that make our web course unique are: A laboratory on Navajo Astronomy, lectures translated into Spanish, and many collateral resources for student use both internal to our web site and as external links on the Internet. Much of this work was underwritten by NASA grant NAG5-10254.
Dropout Prediction in E-Learning Courses through the Combination of Machine Learning Techniques
ERIC Educational Resources Information Center
Lykourentzou, Ioanna; Giannoukos, Ioannis; Nikolopoulos, Vassilis; Mpardis, George; Loumos, Vassili
2009-01-01
In this paper, a dropout prediction method for e-learning courses, based on three popular machine learning techniques and detailed student data, is proposed. The machine learning techniques used are feed-forward neural networks, support vector machines and probabilistic ensemble simplified fuzzy ARTMAP. Since a single technique may fail to…
NASA Astrophysics Data System (ADS)
Petukhov, A. M.; Soldatov, E. Yu
2017-12-01
Separation of the electroweak component from the strong component of associated Zγ production at hadron colliders is a very challenging task due to the identical final states of these processes. The only difference is the origin of the two leading jets in the two processes. Rectangular cuts on jet kinematic variables from the ATLAS/CMS 8 TeV Zγ experimental analyses were improved using machine learning techniques. New selection variables were also tested. The expected significance of the separation under LHC experimental conditions in the second data-taking period (Run 2) with 120 fb-1 of data reaches more than 5σ. A future experimental observation of electroweak Zγ production can also lead to the observation of physics beyond the Standard Model.
Designing Cancer-Killing Artificial Viruses to Improve Student Understanding of Microbiology
Kuniyuki, Andy; Sharp, Gwen
2011-01-01
Our objective was to assess the effectiveness of a “learning by designing” group project used in a lower-division Microbiology course. Students used knowledge gained from the course to design an artificial virus that would kill cancer cells. The assignment required groups to integrate the individual course topics into a unified, complex understanding of the field of microbiology. Throughout the course, students and the instructor collaborated in creating a rubric to evaluate the groups’ final presentations. This paper reports the results of an assessment of the project by comparing the instructor’s and the students’ scores for the presentations. Students’ and the instructor’s scores were correlated; the Pearson coefficient of 0.52 was statistically significant. The results indicate that students gained sufficient knowledge to accurately evaluate proposed designs. Additionally, the overall course grade distribution improved compared to the semester before the project was introduced. Finally, in order to engage students in thinking about their own learning process, they completed a reflection assignment that required them to discuss the changes in their understanding of microbiology over the course of the semester. Our assessment indicates that a design project can serve as an effective and useful learning technique in undergraduate Microbiology courses, though modifications are suggested. PMID:23653757
Modular neural networks: a survey.
Auda, G; Kamel, M
1999-04-01
Modular Neural Networks (MNNs) are a rapidly growing field of artificial Neural Networks (NNs) research. This paper surveys the different motivations for creating MNNs: biological, psychological, hardware, and computational. Then, the general stages of MNN design are outlined and surveyed as well, viz., task decomposition techniques, learning schemes and multi-module decision-making strategies. Advantages and disadvantages of the surveyed methods are pointed out, and an assessment with respect to practical potential is provided. Finally, some general recommendations for future designs are presented.
The FAMULATUR PLUS as an innovative approach for teaching physical examination skills.
Jerg, Achim; Öchsner, Wolfgang; Wander, Henriette; Traue, Harald C; Jerg-Bretzke, Lucia
2016-01-01
The FAMULATUR PLUS is an innovative approach to teaching physical examination skills. The concept is aimed at medical students during the clinical part of their studies and includes a clinical traineeship (English for "Famulatur") extended to include various courses ("PLUS"). The courses are divided into clinical examination courses and problem-based learning (PBL) seminars. The concept's special feature is the full integration of these courses into a 30-day hospital traineeship. The aim is to facilitate the transfer of knowledge from the courses into daily practice. Each week of the FAMULATUR PLUS is structured in line with the courses and focuses on a particular part of the body (e.g., abdomen). A physical examination course under the supervision of a physician is offered at the beginning of the week. Here, medical students learn the relevant examination techniques by practicing on each other (partner exercises). Subsequently, the techniques taught are applied independently during everyday work on the ward, corrected by the supervisor, if necessary, and thereby reinforced. The final PBL seminar takes place towards the end of the week. Possible differential diagnoses are developed based on a clinical case study. The goal is to check these by taking a fictitious medical history and performing a physical examination, as well as to make a preliminary diagnosis. Finally, during the PBL seminar, medical students will be shown how physical examination techniques can be efficiently applied in the diagnosis of common cardinal symptoms (e.g., abdominal pain). The initial implementation of the FAMULATUR PLUS proved the practical feasibility of the concept. In addition, the accompanying evaluation showed that the participants of the pilot project improved with regard to their practical physical examination skills.
The FAMULATUR PLUS as an innovative approach for teaching physical examination skills
Jerg, Achim; Öchsner, Wolfgang; Wander, Henriette; Traue, Harald C.; Jerg-Bretzke, Lucia
2016-01-01
The FAMULATUR PLUS is an innovative approach to teaching physical examination skills. The concept is aimed at medical students during the clinical part of their studies and includes a clinical traineeship (English for “Famulatur”) extended to include various courses (“PLUS”). The courses are divided into clinical examination courses and problem-based learning (PBL) seminars. The concept’s special feature is the full integration of these courses into a 30-day hospital traineeship. The aim is to facilitate the transfer of knowledge from the courses into daily practice. Each week of the FAMULATUR PLUS is structured in line with the courses and focuses on a particular part of the body (e.g., abdomen). A physical examination course under the supervision of a physician is offered at the beginning of the week. Here, medical students learn the relevant examination techniques by practicing on each other (partner exercises). Subsequently, the techniques taught are applied independently during everyday work on the ward, corrected by the supervisor, if necessary, and thereby reinforced. The final PBL seminar takes place towards the end of the week. Possible differential diagnoses are developed based on a clinical case study. The goal is to check these by taking a fictitious medical history and performing a physical examination, as well as to make a preliminary diagnosis. Finally, during the PBL seminar, medical students will be shown how physical examination techniques can be efficiently applied in the diagnosis of common cardinal symptoms (e.g., abdominal pain). The initial implementation of the FAMULATUR PLUS proved the practical feasibility of the concept. In addition, the accompanying evaluation showed that the participants of the pilot project improved with regard to their practical physical examination skills. PMID:26958652
NASA Astrophysics Data System (ADS)
Zeilik, M.; Mathieu, R. D.; National InstituteScience Education; College Level-One Team
2000-12-01
Even the most dedicated college faculty often discover that their students fail to learn what was taught in their courses and that much of what students do learn is quickly forgotten after the final exam. To help college faculty improve student learning in college Science, Mathematics, Engineering and Technology (SMET), the College Level-One Team of the National Institute for Science Education has created the "FLAG", a Field-tested Learning Assessment Guide for SMET faculty. Developed with funding from the National Science Foundation, the FLAG presents in guidebook format a diverse and robust collection of field-tested classroom assessment techniques (CATs), with supporting information on how to apply them in the classroom. Faculty can download the tools and techniques from the website, which also provides a goals clarifier, an assessment primer, a searchable database, and links to additional resources. The CATs and tools have been reviewed by an expert editorial board and the NISE team. These assessment strategies can help faculty improve the learning environments in their SMET courses, especially the crucial introductory courses that most strongly shape students' college learning experiences. In addition, the FLAG includes the web-based Student Assessment of Learning Gains. The SALG offers a convenient way to evaluate the impact of your courses on students. It is based on findings that students' estimates of what they gained are more reliable and informative than their observations of what they liked about the course or teacher. It offers accurate feedback on how well the different aspects of teaching helped the students to learn. Students complete the SALG online after a generic template has been modified to fit the learning objectives and activities of your course. The results are presented to the teacher as summary statistics automatically. The FLAG can be found at the NISE "Innovations in SMET Education" website at www.wcer.wisc.edu/nise/cl1
Stanger-Hall, Kathrin F.; Shockley, Floyd W.; Wilson, Rachel E.
2011-01-01
We implemented a “how to study” workshop for small groups of students (6–12) for N = 93 consenting students, randomly assigned from a large introductory biology class. The goal of this workshop was to teach students self-regulating techniques with visualization-based exercises as a foundation for learning and critical thinking in two areas: information processing and self-testing. During the workshop, students worked individually or in groups and received immediate feedback on their progress. Here, we describe two individual workshop exercises, report their immediate results, describe students’ reactions (based on the workshop instructors’ experience and student feedback), and report student performance on workshop-related questions on the final exam. Students rated the workshop activities highly and performed significantly better on workshop-related final exam questions than the control groups. This was the case for both lower- and higher-order thinking questions. Student achievement (i.e., grade point average) was significantly correlated with overall final exam performance but not with workshop outcomes. This long-term (10 wk) retention of a self-testing effect across question levels and student achievement is a promising endorsement for future large-scale implementation and further evaluation of this “how to study” workshop as a study support for introductory biology (and other science) students. PMID:21633067
NASA Astrophysics Data System (ADS)
Roche-Lima, Abiel; Thulasiram, Ruppa K.
2012-02-01
Finite automata in which each transition is augmented with an output label, in addition to the familiar input label, are known as finite-state transducers. Transducers have been used to analyze some fundamental issues in bioinformatics. Weighted finite-state transducers have been proposed for pairwise alignments of DNA and protein sequences, as well as for developing kernels for computational biology. Machine learning algorithms for conditional transducers have been implemented and used for DNA sequence analysis. Transducer learning algorithms are based on conditional probability computation, which is calculated using techniques such as pair-database creation, normalization (with maximum-likelihood normalization) and parameter optimization (with Expectation-Maximization, EM). These techniques are intrinsically costly to compute, and even more so when applied to bioinformatics, because the database sizes are large. In this work, we describe a parallel implementation of an algorithm that learns conditional transducers using these techniques. The algorithm is oriented to bioinformatics applications such as alignments, phylogenetic trees, and other genome evolution studies. Several experiments were carried out with the parallel and sequential algorithms on Westgrid (specifically, on the Breezy cluster). The results show that our parallel algorithm is scalable: execution times are reduced considerably when the data size parameter is increased. A further experiment varied the precision parameter; in this case, smaller execution times were obtained with the parallel algorithm. Finally, the number of threads used to execute the parallel algorithm on the Breezy cluster was varied. In this last experiment, speedup increased considerably when more threads were used; however, it converged for 16 or more threads.
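The learning step sketched in the abstract rests on counting input/output pairs from a pair database and then applying maximum-likelihood normalization, with the parallel speedup coming from distributing that work across workers. The following Python sketch illustrates only that simplified counting-and-normalization step under made-up data (it is not the authors' EM implementation; the pair database, worker count, and symbol alphabet are assumptions):

# Illustrative sketch (not the authors' code): parallel counting of
# (input, output) pairs from a pair database, followed by
# maximum-likelihood normalization into conditional probabilities.
from collections import Counter
from multiprocessing import Pool

def count_chunk(pairs):
    """Count (input, output) co-occurrences in one chunk of the pair database."""
    return Counter(pairs)

def normalize(counts):
    """Maximum-likelihood normalization: P(output | input)."""
    totals = Counter()
    for (x, _), c in counts.items():
        totals[x] += c
    return {(x, y): c / totals[x] for (x, y), c in counts.items()}

if __name__ == "__main__":
    # Toy pair database of aligned DNA symbols (hypothetical data).
    pair_db = [("A", "A"), ("A", "G"), ("C", "C"), ("C", "C"), ("G", "A")] * 1000
    chunks = [pair_db[i::4] for i in range(4)]          # split work across 4 workers
    with Pool(4) as pool:
        partial_counts = pool.map(count_chunk, chunks)   # parallel counting
    merged = sum(partial_counts, Counter())              # reduce step
    cond_prob = normalize(merged)
    print(cond_prob[("A", "A")])                         # e.g. P(A | A)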
WE-G-BRC-02: Risk Assessment for HDR Brachytherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayadev, J.
2016-06-15
Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures, including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: (1) understand the general concept of failure mode and effects analysis; (2) learn how to characterize new equipment for safety; (3) be able to identify potential failure modes for specific procedures and learn mitigation techniques; (4) be able to customize FMEA examples and templates for use in any clinic.
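FMEA as promoted by TG-100 is usually operationalized by scoring each failure mode for occurrence, severity, and lack of detectability and ranking the modes by their risk priority number (RPN = O x S x D). The session abstract does not give specific numbers, so the failure modes and scores in the Python sketch below are hypothetical examples meant only to show the arithmetic:

# Minimal FMEA scoring sketch. TG-100-style FMEA typically ranks failure modes
# by a risk priority number RPN = O x S x D (occurrence, severity, lack of
# detectability, each scored on a 1-10 scale). Steps and scores below are
# hypothetical, not values from this session.
failure_modes = [
    {"step": "HDR source calibration", "O": 3, "S": 9, "D": 4},
    {"step": "Plan transfer to afterloader", "O": 2, "S": 8, "D": 6},
    {"step": "Applicator reconstruction", "O": 4, "S": 7, "D": 5},
]

for fm in failure_modes:
    fm["RPN"] = fm["O"] * fm["S"] * fm["D"]

# Highest-RPN failure modes are addressed first in the quality management program.
for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
    print(f"{fm['step']}: RPN = {fm['RPN']}")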
WE-G-BRC-01: Risk Assessment for Radiosurgery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, G.
2016-06-15
Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures, including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: (1) understand the general concept of failure mode and effects analysis; (2) learn how to characterize new equipment for safety; (3) be able to identify potential failure modes for specific procedures and learn mitigation techniques; (4) be able to customize FMEA examples and templates for use in any clinic.
WE-G-BRC-03: Risk Assessment for Physics Plan Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parker, S.
2016-06-15
Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures, including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: (1) understand the general concept of failure mode and effects analysis; (2) learn how to characterize new equipment for safety; (3) be able to identify potential failure modes for specific procedures and learn mitigation techniques; (4) be able to customize FMEA examples and templates for use in any clinic.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures, including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: (1) understand the general concept of failure mode and effects analysis; (2) learn how to characterize new equipment for safety; (3) be able to identify potential failure modes for specific procedures and learn mitigation techniques; (4) be able to customize FMEA examples and templates for use in any clinic.
ERIC Educational Resources Information Center
Rapchak, Marcia E.; Brungard, Allison B.; Bergfelt, Theodore W.
2016-01-01
Using the Information Literacy VALUE Rubric provided by the AAC&U, this study compares thirty final capstone assignments in a research course in a learning community with thirty final assignments from students not in learning communities. Results indicated higher performance of the non-learning community students; however, transfer skills…
ERIC Educational Resources Information Center
Chiou, Chei-Chang; Lee, Li-Tze; Tien, Li-Chu; Wang, Yu-Min
2017-01-01
This study explored the effectiveness of different concept mapping techniques on the learning achievement of senior accounting students and whether achievements attained using various techniques are affected by different learning styles. The techniques are computer-assisted construct-by-self-concept mapping (CACSB), computer-assisted…
Small-sided games in football as a method to improve high school students’ instep passing skills
NASA Astrophysics Data System (ADS)
Ridwan, M.; Darmawan, G.; Fuadi, Z.
2018-01-01
This study analyzed the influence of applying small-sided games on improving instep passing skills in football. The research used a one-group pretest-posttest design. Data were obtained from a weekly 135-minute small-sided games session held for four weeks, with a final test at the final meeting. The descriptive results showed an increase in the mean score. The effect of the small-sided games application appeared not only in the descriptive means but also in the result of the t-test: its significance was 0.000, which is less than 0.05, so the hypothesis Ha was accepted and Ho was rejected. The data indicate a 48.15% improvement attributable to the small-sided games application. Small-sided games were thus shown to be an appropriate tool for improving the instep passing technique in football. We suggest applying this kind of game to football learning in physical education, especially for pre-university students.
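The analysis described is a pretest-posttest comparison evaluated with a t-test. As a hedged illustration of how such a paired comparison could be computed (the scores below are hypothetical, not the study's data), a minimal Python sketch:

# Illustrative paired t-test for a one-group pretest-posttest design,
# using SciPy. Scores are made-up placeholders.
from scipy import stats

pretest  = [55, 60, 48, 62, 58, 50, 65, 57, 61, 53]
posttest = [70, 78, 66, 80, 75, 68, 82, 74, 79, 71]

t_stat, p_value = stats.ttest_rel(posttest, pretest)
mean_gain = sum(b - a for a, b in zip(pretest, posttest)) / len(pretest)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, mean gain = {mean_gain:.1f}")
# A p-value below 0.05 would, as in the study, lead to accepting Ha.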
A Survey on Ambient Intelligence in Health Care
Acampora, Giovanni; Cook, Diane J.; Rashidi, Parisa; Vasilakos, Athanasios V.
2013-01-01
Ambient Intelligence (AmI) is a new paradigm in information technology aimed at empowering people’s capabilities by means of digital environments that are sensitive, adaptive, and responsive to human needs, habits, gestures, and emotions. This futuristic vision of the daily environment will enable innovative human-machine interactions characterized by pervasive, unobtrusive and anticipatory communications. Such innovative interaction paradigms make ambient intelligence technology a suitable candidate for developing various real-life solutions, including in the health care domain. This survey will discuss the emergence of ambient intelligence (AmI) techniques in the health care domain, in order to provide the research community with the necessary background. We will examine the infrastructure and technology required for achieving the vision of ambient intelligence, such as smart environments and wearable medical devices. We will summarize the state-of-the-art artificial intelligence methodologies used for developing AmI systems in the health care domain, including various learning techniques (for learning from user interaction), reasoning techniques (for reasoning about users’ goals and intentions) and planning techniques (for planning activities and interactions). We will also discuss how AmI technology might support people affected by various physical or mental disabilities or chronic disease. Finally, we will point to some of the successful case studies in the area and we will look at the current and future challenges to draw upon the possible future research paths. PMID:24431472
NASA Astrophysics Data System (ADS)
Deng, Chengbin; Wu, Changshan
2013-12-01
Urban impervious surface information is essential for urban and environmental applications at the regional/national scales. As a popular image processing technique, spectral mixture analysis (SMA) has rarely been applied to coarse-resolution imagery due to the difficulty of deriving endmember spectra using traditional endmember selection methods, particularly within heterogeneous urban environments. To address this problem, we derived endmember signatures through a least squares solution (LSS) technique with known abundances of sample pixels, and integrated these endmember signatures into SMA for mapping large-scale impervious surface fraction. In addition, with the same sample set, we carried out objective comparative analyses among SMA (i.e. fully constrained and unconstrained SMA) and machine learning (i.e. Cubist regression tree and Random Forests) techniques. Analysis of results suggests three major conclusions. First, with the extrapolated endmember spectra from stratified random training samples, the SMA approaches performed relatively well, as indicated by small MAE values. Second, Random Forests yields more reliable results than Cubist regression tree, and its accuracy is improved with increased sample sizes. Finally, comparative analyses suggest a tentative guide for selecting an optimal approach for large-scale fractional imperviousness estimation: unconstrained SMA might be a favorable option with a small number of samples, while Random Forests might be preferred if a large number of samples are available.
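Both steps highlighted above are linear least-squares problems: endmember spectra can be extrapolated from sample pixels with known abundances, and unconstrained SMA then inverts the linear mixing model per pixel. The NumPy sketch below illustrates those two steps with synthetic numbers (band counts, endmember classes, and noise levels are assumptions, not the study's data):

# Sketch of the two least-squares steps: (1) derive endmember spectra from
# sample pixels with known abundances, (2) unconstrained SMA for a new pixel.
import numpy as np

bands, n_samples, n_endmembers = 6, 200, 3   # e.g. impervious / vegetation / soil
rng = np.random.default_rng(0)

true_E = rng.uniform(0.05, 0.6, size=(bands, n_endmembers))        # endmember spectra
A = rng.dirichlet(np.ones(n_endmembers), size=n_samples)           # known abundances
X = A @ true_E.T + rng.normal(0, 0.005, size=(n_samples, bands))   # sample spectra

# (1) Least-squares solution for endmember signatures: X ~= A @ E.T
E_hat = np.linalg.lstsq(A, X, rcond=None)[0].T                     # (bands, endmembers)

# (2) Unconstrained SMA for a new pixel: solve E_hat @ f ~= pixel for fractions f
pixel = 0.5 * true_E[:, 0] + 0.3 * true_E[:, 1] + 0.2 * true_E[:, 2]
fractions = np.linalg.lstsq(E_hat, pixel, rcond=None)[0]
print("estimated fractions:", np.round(fractions, 2))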
A Survey on Ambient Intelligence in Health Care.
Acampora, Giovanni; Cook, Diane J; Rashidi, Parisa; Vasilakos, Athanasios V
2013-12-01
Ambient Intelligence (AmI) is a new paradigm in information technology aimed at empowering people's capabilities by means of digital environments that are sensitive, adaptive, and responsive to human needs, habits, gestures, and emotions. This futuristic vision of the daily environment will enable innovative human-machine interactions characterized by pervasive, unobtrusive and anticipatory communications. Such innovative interaction paradigms make ambient intelligence technology a suitable candidate for developing various real-life solutions, including in the health care domain. This survey will discuss the emergence of ambient intelligence (AmI) techniques in the health care domain, in order to provide the research community with the necessary background. We will examine the infrastructure and technology required for achieving the vision of ambient intelligence, such as smart environments and wearable medical devices. We will summarize the state-of-the-art artificial intelligence methodologies used for developing AmI systems in the health care domain, including various learning techniques (for learning from user interaction), reasoning techniques (for reasoning about users' goals and intentions) and planning techniques (for planning activities and interactions). We will also discuss how AmI technology might support people affected by various physical or mental disabilities or chronic disease. Finally, we will point to some of the successful case studies in the area and we will look at the current and future challenges to draw upon the possible future research paths.
Blavier, Adélaïde; Gaudissart, Quentin; Cadière, Guy-Bernard; Nyssen, Anne-Sophie
2007-07-01
The purpose of this study was to evaluate the perceptual (2-dimensional [2D] vs. 3-dimensional [3D] view) and instrumental (classical vs. robotic) impacts of a new robotic system on learning curves. Forty medical students without any surgical experience were randomized into 4 groups (classical laparoscopy with 3D-direct view or with 2D-indirect view, robotic system in 3D or in 2D) and repeated a laparoscopic task 6 times. After these 6 repetitions, they performed 2 trials with the same technique but in the other viewing condition (perceptive switch). Finally, subjects performed the last 3 trials with the technique they had never used (technical switch). Subjects evaluated their performance by answering a questionnaire (impressions of mastery, familiarity, satisfaction, self-confidence, and difficulty). Our study showed better performance and improvement in the 3D view than in the 2D view, regardless of the instrumental aspect. Participants reported less mastery, familiarity, and self-confidence and more difficulty in classical laparoscopy with 2D-indirect view than in the other conditions. Robotic surgery improves surgical performance and learning, particularly through the advantage of the 3D view. However, the perceptive and technical switches emphasize the need to adapt and pursue training also with traditional technology to prevent risks in conversion procedures.
Learning experience in endodontics: Brazilian students' perceptions.
Seijo, Marilia O S; Ferreira, Efigênia F; Ribeiro Sobrinho, Antônio P; Paiva, Saul M; Martins, Renata C
2013-05-01
Including students' perceptions in the educational process is considered a key component in monitoring the quality of academic programs. This study aimed to evaluate the concept of one's learning experience in endodontic teaching from the perspective of a group of Brazilian students. A total of 126 self-administered, structured questionnaires were distributed to undergraduate dental students enrolled in endodontics courses during the second semester of the 2009 academic year. The questionnaires were administered during final examinations and focused on students' opinions concerning learning during endodontic treatments, time spent during endodontic treatments, difficulties found during endodontic treatments, quality of endodontic treatments performed, characteristics of the technique employed, and suggestions to improve endodontic teaching. Ninety-one percent of the questionnaires were returned for evaluation. The obtained answers were discussed and analyzed, thereby generating quantitative and qualitative data showing students' perceptions of their experiences in endodontics courses. The main points that can affect the teaching of endodontics, according to the undergraduate students, included patients' absences and delays, selection of patients, preclinical and clinical training, difficulties found, type of technique employed, and teachers' orientation during endodontic treatment. The students' perceptions provided valuable information about the development of the course and the teacher-student relationship, together with the added intention of enhancing the teaching of endodontics as well as other courses.
A closer look at cross-validation for assessing the accuracy of gene regulatory networks and models.
Tabe-Bordbar, Shayan; Emad, Amin; Zhao, Sihai Dave; Sinha, Saurabh
2018-04-26
Cross-validation (CV) is a technique to assess the generalizability of a model to unseen data. This technique relies on assumptions that may not be satisfied when studying genomics datasets. For example, random CV (RCV) assumes that a randomly selected set of samples, the test set, represents unseen data well. This assumption does not hold when samples are obtained from different experimental conditions and the goal is to learn regulatory relationships among the genes that generalize beyond the observed conditions. In this study, we investigated how the CV procedure affects the assessment of supervised learning methods used to learn gene regulatory networks (or in other applications). We compared the performance of a regression-based method for gene expression prediction estimated using RCV with that estimated using a clustering-based CV (CCV) procedure. Our analysis illustrates that RCV can produce over-optimistic estimates of the model's generalizability compared to CCV. Next, we defined the 'distinctness' of the test set from the training set and showed that this measure is predictive of the performance of the regression method. Finally, we introduced a simulated annealing method to construct partitions with gradually increasing distinctness and showed that the performance of different gene expression prediction methods can be better evaluated using this method.
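The contrast drawn above, random folds versus holding out whole clusters of similar samples, can be sketched with scikit-learn. The snippet below is a hedged illustration of the general idea only (synthetic data, Ridge regression as a stand-in model; it is not the authors' pipeline or their CCV construction):

# Sketch: random K-fold CV vs. a clustering-based CV in which whole
# K-means clusters are held out together via GroupKFold.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold, GroupKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))                            # e.g. regulator expression levels
y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=300)     # target gene expression

model = Ridge()

# Random CV: test folds are drawn uniformly at random.
rcv = cross_val_score(model, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0))

# Clustering-based CV: cluster the samples, then hold out whole clusters.
clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
ccv = cross_val_score(model, X, y, cv=GroupKFold(n_splits=5), groups=clusters)

print("random CV    R^2:", rcv.mean().round(3))
print("clustered CV R^2:", ccv.mean().round(3))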
Ghorbani, Ahmad; Ghazvini, Kiarash
2016-03-01
Many studies have emphasized the incorporation of active learning into classrooms to reinforce didactic lectures for physiology courses. This work aimed to determine if presenting classic papers during didactic lectures improves the learning of physiology among undergraduate students. Twenty-two students of health information technology were randomly divided into the following two groups: 1) didactic lecture only (control group) and 2) didactic lecture plus paper presentation breaks (DLPP group). In the control group, main topics of gastrointestinal and endocrine physiology were taught using only the didactic lecture technique. In the DLPP group, some topics were presented by the didactic lecture method (similar to the control group) and some topics were taught by the DLPP technique (first, concepts were covered briefly in a didactic format and then reinforced with presentation of a related classic paper). The combination of didactic lecture and paper breaks significantly improved learning so that students in the DLPP group showed higher scores on related topics compared with those in the control group (P < 0.001). Comparison of the scores of topics taught by only the didactic lecture and those using both the didactic lecture and paper breaks showed significant improvement only in the DLPP group (P < 0.001). Data obtained from the final exam showed that in the DLPP group, the mean score of the topics taught by the combination of didactic lecture and paper breaks was significantly higher than those taught by only didactic lecture (P < 0.05). In conclusion, the combination of paper presentation breaks and didactic lectures improves the learning of physiology. Copyright © 2016 The American Physiological Society.
Yu, Shu; Yang, Kuei-Feng
2006-08-01
Public health nurses (PHNs) often cannot receive in-service education due to limitations of time and space. Learning through the Internet has been a widely used technique in many professional and clinical nursing fields. The learner's attitude is the most important indicator that promotes learning. The purpose of this study was to investigate PHNs' attitudes toward web-based learning and their determinants. The study used a cross-sectional research design, with the 369 health centers in Taiwan as the setting. The population comprised 2398 PHNs, from whom a random sample was drawn. Finally, 329 PHNs completed the questionnaire, with a response rate of 84.0%. Data were collected by mailing the questionnaire. Most PHNs revealed a positive attitude toward web-based learning (mean ± SD = 55.02 ± 6.39). PHNs who worked at village health centers with a service population of less than 10,000, who had access to computer facilities and on-line hardware in their health centers, and who had better computer competence revealed more positive attitudes (p<0.01). Web-based learning is an important new way of providing in-service education; however, its success and hindering factors require further investigation. Individual computer competence is the main target for improvement, and educators should also consider how to establish a user-friendly learning environment on the Internet.
Learning curve for laparoscopic Heller myotomy and Dor fundoplication for achalasia
Omura, Nobuo; Tsuboi, Kazuto; Hoshino, Masato; Yamamoto, Seryung; Akimoto, Shunsuke; Masuda, Takahiro; Kashiwagi, Hideyuki; Yanaga, Katsuhiko
2017-01-01
Purpose Although laparoscopic Heller myotomy and Dor fundoplication (LHD) is widely performed to address achalasia, little is known about the learning curve for this technique. We assessed the learning curve for performing LHD. Methods Of the 514 cases of LHD performed between August 1994 and March 2016, the surgical outcomes of 463 cases were evaluated after excluding 50 cases with reduced port surgery and one case with the simultaneous performance of laparoscopic distal partial gastrectomy. A receiver operating characteristic (ROC) curve analysis was used to identify the cut-off value for the number of surgical experiences necessary to become proficient with LHD, which was defined as the completion of the learning curve. Results We defined the completion of the learning curve when the following 3 conditions were satisfied. 1) The operation time was less than 165 minutes. 2) There was no blood loss. 3) There was no intraoperative complication. In order to establish the appropriate number of surgical experiences required to complete the learning curve, the cut-off value was evaluated by using a ROC curve (AUC 0.717, p < 0.001). Finally, we identified the cut-off value as 16 surgical cases (sensitivity 0.706, specificity 0.646). Conclusion The learning curve appears to be completed after performing 16 cases. PMID:28686640
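The cut-off analysis above treats case number as a score for discriminating proficient from non-proficient cases and reads the cut-off from an ROC curve. A hedged Python sketch of that kind of analysis follows; the simulated series and the use of the Youden index as the cut-off criterion are assumptions, since the paper only states that an ROC curve was used:

# Sketch: choose a case-number cut-off for proficiency from an ROC curve.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
case_number = np.arange(1, 464)                       # consecutive LHD cases
# 1 = case met all three proficiency conditions (more likely later in the series)
proficient = (rng.random(463) < np.clip(case_number / 60, 0.1, 0.8)).astype(int)

fpr, tpr, thresholds = roc_curve(proficient, case_number)
print("AUC:", round(auc(fpr, tpr), 3))

youden = tpr - fpr                                    # one common cut-off criterion
best = np.argmax(youden)
print("cut-off (cases):", int(thresholds[best]),
      "sensitivity:", round(tpr[best], 3),
      "specificity:", round(1 - fpr[best], 3))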
Learning curve for laparoscopic Heller myotomy and Dor fundoplication for achalasia.
Yano, Fumiaki; Omura, Nobuo; Tsuboi, Kazuto; Hoshino, Masato; Yamamoto, Seryung; Akimoto, Shunsuke; Masuda, Takahiro; Kashiwagi, Hideyuki; Yanaga, Katsuhiko
2017-01-01
Although laparoscopic Heller myotomy and Dor fundoplication (LHD) is widely performed to address achalasia, little is known about the learning curve for this technique. We assessed the learning curve for performing LHD. Of the 514 cases of LHD performed between August 1994 and March 2016, the surgical outcomes of 463 cases were evaluated after excluding 50 cases with reduced port surgery and one case with the simultaneous performance of laparoscopic distal partial gastrectomy. A receiver operating characteristic (ROC) curve analysis was used to identify the cut-off value for the number of surgical experiences necessary to become proficient with LHD, which was defined as the completion of the learning curve. We defined the completion of the learning curve when the following 3 conditions were satisfied. 1) The operation time was less than 165 minutes. 2) There was no blood loss. 3) There was no intraoperative complication. In order to establish the appropriate number of surgical experiences required to complete the learning curve, the cut-off value was evaluated by using a ROC curve (AUC 0.717, p < 0.001). Finally, we identified the cut-off value as 16 surgical cases (sensitivity 0.706, specificity 0.646). The learning curve appears to be completed after performing 16 cases.
Robust Visual Tracking via Online Discriminative and Low-Rank Dictionary Learning.
Zhou, Tao; Liu, Fanghui; Bhaskar, Harish; Yang, Jie
2017-09-12
In this paper, we propose a novel and robust tracking framework based on online discriminative and low-rank dictionary learning. The primary aim of this paper is to obtain compact and low-rank dictionaries that can provide good discriminative representations of both target and background. We accomplish this by exploiting the recovery ability of low-rank matrices. That is, if we assume that the data from the same class are linearly correlated, then the basis vectors learned from the training set of each class render the dictionary approximately low-rank. The proposed dictionary learning technique incorporates a reconstruction error that improves the reliability of classification. Also, a multiconstraint objective function is designed to enable active learning of a discriminative and robust dictionary. Further, an optimal solution is obtained by iteratively computing the dictionary and coefficients while simultaneously learning the classifier parameters. Finally, a simple yet effective likelihood function is implemented to estimate the optimal state of the target during tracking. Moreover, to make the dictionary adaptive to the variations of the target and background during tracking, an online update criterion is employed while learning the new dictionary. Experimental results on a publicly available benchmark dataset have demonstrated that the proposed tracking algorithm performs better than other state-of-the-art trackers.
The Development of Teaching and Learning in Bright-Field Microscopy Technique
ERIC Educational Resources Information Center
Iskandar, Yulita Hanum P.; Mahmud, Nurul Ethika; Wahab, Wan Nor Amilah Wan Abdul; Jamil, Noor Izani Noor; Basir, Nurlida
2013-01-01
E-learning should be pedagogically-driven rather than technologically-driven. The objectives of this study are to develop an interactive learning system in bright-field microscopy technique in order to support students' achievement of their intended learning outcomes. An interactive learning system on bright-field microscopy technique was…
Space Shuttle GN and C Development History and Evolution
NASA Technical Reports Server (NTRS)
Zimpfer, Douglas; Hattis, Phil; Ruppert, John; Gavert, Don
2011-01-01
Completion of the final Space Shuttle flight marks the end of a significant era in Human Spaceflight. Developed in the 1970s and first launched in 1981, the Space Shuttle embodies many significant engineering achievements. One of these is the development and operation of the first extensive fly-by-wire human space transportation Guidance, Navigation and Control (GN&C) System. Development of the Space Shuttle GN&C represented the first-time inclusion of modern techniques for electronics, software, algorithms, systems and management in a complex system. Numerous technical design trades and lessons learned continue to drive current vehicle development. For example, the Space Shuttle GN&C system incorporated redundant systems, complex algorithms and flight software rigorously verified through integrated vehicle simulations and avionics integration testing techniques. Over the past thirty years, the Shuttle GN&C went through a series of upgrades to improve safety, performance and to enable the complex flight operations required for assembly of the International Space Station. Upgrades to the GN&C ranged from the addition of nose wheel steering to modifications that extended capabilities to control the large flexible configurations while docked to the Space Station. This paper provides a history of the development and evolution of the Space Shuttle GN&C system. Emphasis is placed on key architecture decisions, design trades and the lessons learned for future complex space transportation system developments. Finally, some of the interesting flight operations experience is provided to inform future developers of flight experiences.
Riley, William; Parsons, Helen; McCoy, Kim; Burns, Debra; Anderson, Donna; Lee, Suhna; Sainfort, François
2009-10-01
To test the feasibility and assess the preliminary impact of a unique statewide quality improvement (QI) training program designed for public health departments. One hundred and ninety-five public health employees/managers from 38 local health departments throughout Minnesota were selected to participate in a newly developed QI training program and 65 of those engaged in and completed eight expert-supported QI projects over a period of 10 months from June 2007 through March 2008. As part of the Minnesota Quality Improvement Initiative, a structured distance education QI training program was designed and deployed in a first large-scale pilot. To evaluate the preliminary impact of the program, a mixed-method evaluation design was used based on four dimensions: learner reaction, knowledge, intention to apply, and preliminary outcomes. Subjective ratings of three dimensions of training quality were collected from participants after each of the scheduled learning sessions. Pre- and post-QI project surveys were administered to collect participant reactions, knowledge, future intention to apply learning, and perceived outcomes. Monthly and final QI project reports were collected to further inform success and preliminary outcomes of the projects. The participants reported (1) high levels of satisfaction with the training sessions, (2) increased perception of the relevance of the QI techniques, (3) increased perceived knowledge of all specific QI methods and techniques, (4) increased confidence in applying QI techniques on future projects, (5) increased intention to apply techniques on future QI projects, and (6) high perceived success of, and satisfaction with, the projects. Finally, preliminary outcomes data show moderate to large improvements in quality and/or efficiency for six out of eight projects. QI methods and techniques can be successfully implemented in local public health agencies on a statewide basis using the collaborative model through distance training and expert facilitation. This unique training can improve both core and support processes and lead to favorable staff reactions, increased knowledge, and improved health outcomes. The program can be further improved and deployed and holds great promise to facilitate the successful dissemination of proven QI methods throughout local public health departments.
Relationship between Learning Style and Academic Status of Babol Dental Students
Nasiri, Zahra; Gharekhani, Samane; Ghasempour, Maryam
2016-01-01
Introduction Identifying and employing students’ learning styles could play an important role in selecting appropriate teaching methods in order to improve education. The aim of this study was to determine the relationship between the students’ final exam scores and the learning style preferences of dental students at Babol University of Medical Sciences. Methods This cross-sectional study was conducted on 88 dental students studying in their fourth, fifth, and sixth years using the visual–aural–reading/writing–kinesthetic (VARK) learning styles’ questionnaire. The data were analyzed with IBM SPSS, version 21, using the chi-squared test and the t-test. Results Of the 88 participants who responded to the questionnaire, 87 preferred multimodal learning styles. There was no significant difference between the mean of the final exam scores in students who did and did not prefer the aural learning style (p = 0.86), the reading/writing learning style (p = 0.20), and the kinesthetic learning style (p = 0.32). In addition, there was no significant difference between the scores on the final clinical course among the students who had different preferences for learning style. However, there was a significant difference between the mean of the final exam scores in students with and without visual learning style preference (p = 0.03), with the former having higher mean scores. There was no significant relationship between preferred learning styles and gender (p > 0.05). Conclusion The majority of dental students preferred multimodal learning styles, and there was a significant difference between the mean of the final exam scores for students with and without a preference for the visual learning style. In addition, there were no differences in the preferred learning styles between male and female students. PMID:27382442
Howlett, David; Vincent, Tim; Watson, Gillian; Owens, Emma; Webb, Richard; Gainsborough, Nicola; Fairclough, Jil; Taylor, Nick; Miles, Ken; Cohen, Jon; Vincent, Richard
2011-06-01
To review the initial experience of blending a variety of online educational techniques with traditional face-to-face or contact-based teaching methods to deliver final year undergraduate radiology content at a UK Medical School. The Brighton and Sussex Medical School opened in 2003 and offers a 5-year undergraduate programme, with the final year (Year 5) spent in several regional centres. Year 5 involves several core clinical specialities with onsite radiology teaching provided at regional centres in the form of small-group tutorials, imaging seminars and also a one-day course. An online educational module was introduced in 2007 to facilitate equitable delivery of the year 5 curriculum between the regional centres and to support students on placement. This module had a strong radiological emphasis, with a combination of imaging integrated into clinical cases to reflect everyday practice and also dedicated radiology cases. In 2008, for the second cohort of year 5 students, two additional media-rich online initiatives were introduced to complement the online module, comprising imaging tutorials and an online case discussion room. In the first year, for the 2007/2008 cohort, 490 cases were written, edited and delivered via the Medical School managed learning environment as part of the online module. 253 cases contained a form of image media, of which 195 cases had a radiological component with a total of 325 radiology images. Important aspects of radiology practice (e.g. consent, patient safety, contrast toxicity, ionising radiation) were also covered. There were 274,000 student hits on cases in the first year, with students completing a mean of 169 cases each. High levels of student satisfaction were recorded in relation to the online module and also the additional online radiology teaching initiatives. Online educational techniques can be effectively blended with other forms of teaching to allow successful undergraduate delivery of radiology. Efficient IT links and good image quality are essential ingredients for successful student/clinician engagement. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.
Applying data fusion techniques for benthic habitat mapping and monitoring in a coral reef ecosystem
NASA Astrophysics Data System (ADS)
Zhang, Caiyun
2015-06-01
Accurate mapping and effective monitoring of benthic habitat in the Florida Keys are critical in developing management strategies for this valuable coral reef ecosystem. For this study, a framework was designed for automated benthic habitat mapping by combining multiple data sources (hyperspectral, aerial photography, and bathymetry data) and four contemporary imagery processing techniques (data fusion, Object-based Image Analysis (OBIA), machine learning, and ensemble analysis). In the framework, the 1-m digital aerial photography was first merged with the 17-m hyperspectral imagery and 10-m bathymetry data using a pixel/feature-level fusion strategy. The fused dataset was then preclassified by three machine learning algorithms (Random Forest, Support Vector Machines, and k-Nearest Neighbor). Final object-based habitat maps were produced through ensemble analysis of the outcomes from the three classifiers. The framework was tested for classifying group-level (3-class) and code-level (9-class) habitats in a portion of the Florida Keys. Informative and accurate habitat maps were achieved, with an overall accuracy of 88.5% and 83.5% for the group-level and code-level classifications, respectively.
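The ensemble step described above, three classifiers trained on the fused features and combined by agreement among their outputs, can be illustrated with a majority-vote ensemble in scikit-learn. The sketch below uses synthetic stand-ins for the fused per-object features and habitat labels; the exact ensemble rule and classifier settings of the study are not reproduced:

# Sketch: majority-vote ensemble of Random Forest, SVM, and k-NN classifiers.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 12))                               # fused per-object features
y = (X[:, 0] + X[:, 1] > 0).astype(int) + (X[:, 2] > 1)      # 3 hypothetical habitat classes

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
                ("svm", SVC(kernel="rbf", random_state=0)),
                ("knn", KNeighborsClassifier(n_neighbors=5))],
    voting="hard")                                           # majority vote of class labels
ensemble.fit(Xtr, ytr)
print("overall accuracy:", round(accuracy_score(yte, ensemble.predict(Xte)), 3))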
Bias-Free Chemically Diverse Test Sets from Machine Learning.
Swann, Ellen T; Fernandez, Michael; Coote, Michelle L; Barnard, Amanda S
2017-08-14
Current benchmarking methods in quantum chemistry rely on databases that are built using a chemist's intuition. It is not fully understood how diverse or representative these databases truly are. Multivariate statistical techniques like archetypal analysis and K-means clustering have previously been used to summarize large sets of nanoparticles; however, molecules are more diverse and not as easily characterized by descriptors. In this work, we compare three sets of descriptors based on the one-, two-, and three-dimensional structure of a molecule. Using data from the NIST Computational Chemistry Comparison and Benchmark Database and machine learning techniques, we demonstrate the functional relationship between these structural descriptors and the electronic energy of molecules. Archetypes and prototypes found with topological or Coulomb matrix descriptors can be used to identify smaller, statistically significant test sets that better capture the diversity of chemical space. We apply this same method to find a diverse subset of organic molecules to demonstrate how the methods can easily be reapplied to individual research projects. Finally, we use our bias-free test sets to assess the performance of density functional theory and quantum Monte Carlo methods.
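One of the ideas above, selecting a small but diverse test set by clustering molecules in descriptor space and keeping the prototype nearest each cluster centre, can be sketched briefly. The descriptor matrix below is a random placeholder rather than Coulomb matrices from the benchmark database, and K-means prototypes are shown instead of archetypal analysis:

# Sketch: pick a diverse test set as the molecules nearest each K-means centroid.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(500, 30))     # one descriptor vector per molecule

k = 20                                       # desired size of the diverse test set
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(descriptors)

# index of the molecule closest to each centroid
dists = np.linalg.norm(descriptors[:, None, :] - km.cluster_centers_[None, :, :], axis=2)
test_set = np.argmin(dists, axis=0)
print("selected molecule indices:", sorted(test_set.tolist()))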
On the design of learning outcomes for the undergraduate engineer's final year project
NASA Astrophysics Data System (ADS)
Thambyah, Ashvin
2011-03-01
The course for the final year project for engineering students, because of its strongly research-based, open-ended format, tends not to have well-defined learning outcomes, which are also not aligned with any accepted pedagogical philosophy or learning technology. To address this problem, the revised Bloom's taxonomy table of Anderson and Krathwohl (2001) is utilised, as suggested previously by Lee and Lai (2007), to design new learning outcomes for the final year project course in engineering education. Based on the expectations of the engineering graduate, and integrating these graduate expectations into the six cognitive processes and four knowledge dimensions of the taxonomy table, 24 learning outcomes have been designed. It is proposed that these 24 learning outcomes be utilised as a suitable working template to inspire more critical evaluation of what is expected to be learnt by engineering students undertaking final year research or capstone projects.
Neural architecture design based on extreme learning machine.
Bueno-Crespo, Andrés; García-Laencina, Pedro J; Sancho-Gómez, José-Luis
2013-12-01
Selection of the optimal neural architecture to solve a pattern classification problem entails choosing the relevant input units, the number of hidden neurons and the corresponding interconnection weights. This problem has been widely studied, but existing solutions usually involve excessive computational cost and do not provide a unique solution. This paper proposes a new technique to efficiently design the MultiLayer Perceptron (MLP) architecture for classification using the Extreme Learning Machine (ELM) algorithm. The proposed method provides a high generalization capability and a unique solution for the architecture design. Moreover, the selected final network only retains those input connections that are relevant for the classification task. Experimental results show these advantages. Copyright © 2013 Elsevier Ltd. All rights reserved.
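The ELM idea underlying the paper is that the hidden-layer weights of a single-hidden-layer network are chosen at random and fixed, so only the output weights need to be computed, in closed form, by least squares. A minimal Python sketch of that basic ELM step follows (toy data and hidden-layer size are assumptions; this is not the architecture-design procedure proposed in the paper):

# Minimal Extreme Learning Machine sketch: random fixed hidden layer,
# output weights from a pseudo-inverse.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
y = (X[:, 0] * X[:, 1] > 0).astype(float)        # toy binary classification target

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))      # random input-to-hidden weights
b = rng.normal(size=n_hidden)                    # random hidden biases

H = np.tanh(X @ W + b)                           # hidden-layer activations
beta = np.linalg.pinv(H) @ y                     # output weights (least squares)

pred = (H @ beta > 0.5).astype(float)
print("training accuracy:", round((pred == y).mean(), 3))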
Socrates was not a pimp: changing the paradigm of questioning in medical education.
Kost, Amanda; Chen, Frederick M
2015-01-01
The slang term "pimping" is widely recognized by learners and educators in the clinical learning environment as the act of more senior members of the medical team publicly asking questions of more junior members. Although questioning as a pedagogical practice has many benefits, pimping, as described in the literature, evokes negative emotions in learners and leads to an environment that is not conducive to adult learning. Medical educators may employ pimping as a pedagogic technique because of beliefs that it is a Socratic teaching method. Although problems with pimping have previously been identified, no alternative techniques for questioning in the clinical environment were suggested. The authors posit that using the term "pimping" to describe questioning in medical education is harmful and unprofessional, and they propose clearly defining pimping as "questioning with the intent to shame or humiliate the learner to maintain the power hierarchy in medical education." Explicitly separating pimping from the larger practice of questioning allows the authors to make three recommendations for improving questioning practices. First, educators should examine the purpose of each question they pose to learners. Second, they should apply historic and modern interpretations of Socratic teaching methods that promote critical thinking skills. Finally, they should consider adult learning theories to make concrete changes to their questioning practices. These changes can result in questioning that is more learner centered, aids in the acquisition of knowledge and skills, performs helpful formative and summative assessments of the learner, and improves community in the clinical learning environment.
Tracking Active Learning in the Medical School Curriculum: A Learning-Centered Approach.
McCoy, Lise; Pettit, Robin K; Kellar, Charlyn; Morgan, Christine
2018-01-01
Medical education is moving toward active learning during large group lecture sessions. This study investigated the saturation and breadth of active learning techniques implemented in first year medical school large group sessions. Data collection involved retrospective curriculum review and semistructured interviews with 20 faculty. The authors piloted a taxonomy of active learning techniques and mapped learning techniques to attributes of learning-centered instruction. Faculty implemented 25 different active learning techniques over the course of 9 first year courses. Of 646 hours of large group instruction, 476 (74%) involved at least 1 active learning component. The frequency and variety of active learning components integrated throughout the year 1 curriculum reflect faculty familiarity with active learning methods and their support of an active learning culture. This project has sparked reflection on teaching practices and facilitated an evolution from teacher-centered to learning-centered instruction.
Tracking Active Learning in the Medical School Curriculum: A Learning-Centered Approach
McCoy, Lise; Pettit, Robin K; Kellar, Charlyn; Morgan, Christine
2018-01-01
Background: Medical education is moving toward active learning during large group lecture sessions. This study investigated the saturation and breadth of active learning techniques implemented in first year medical school large group sessions. Methods: Data collection involved retrospective curriculum review and semistructured interviews with 20 faculty. The authors piloted a taxonomy of active learning techniques and mapped learning techniques to attributes of learning-centered instruction. Results: Faculty implemented 25 different active learning techniques over the course of 9 first year courses. Of 646 hours of large group instruction, 476 (74%) involved at least 1 active learning component. Conclusions: The frequency and variety of active learning components integrated throughout the year 1 curriculum reflect faculty familiarity with active learning methods and their support of an active learning culture. This project has sparked reflection on teaching practices and facilitated an evolution from teacher-centered to learning-centered instruction. PMID:29707649
Deep learning with convolutional neural network in radiology.
Yasaka, Koichiro; Akai, Hiroyuki; Kunimatsu, Akira; Kiryu, Shigeru; Abe, Osamu
2018-04-01
Deep learning with a convolutional neural network (CNN) is gaining attention recently for its high performance in image recognition. Images themselves can be utilized in a learning process with this technique, and feature extraction in advance of the learning process is not required; important features can be learned automatically. Thanks to the development of hardware and software, in addition to techniques regarding deep learning, applications of this technique to radiological images for predicting clinically useful information, such as the detection and evaluation of lesions, are beginning to be investigated. This article illustrates basic technical knowledge regarding deep learning with CNNs along the actual workflow (collecting data, implementing CNNs, and the training and testing phases). Pitfalls regarding this technique and how to manage them are also illustrated. We also describe some advanced topics of deep learning, results of recent clinical studies, and the future directions of clinical application of deep learning techniques.
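To make the "implementing CNNs" step concrete, the sketch below shows a small convolutional classifier in PyTorch operating on grayscale image patches. The 64x64 input size, two-class output (e.g. lesion present/absent), and layer sizes are illustrative assumptions, not a model from the article:

# Minimal CNN sketch in PyTorch for two-class image classification.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):                        # x: (batch, 1, 64, 64) grayscale patches
        h = self.features(x)
        return self.classifier(h.flatten(1))

model = SmallCNN()
dummy = torch.randn(4, 1, 64, 64)                # stand-in for a batch of image patches
logits = model(dummy)
loss = nn.CrossEntropyLoss()(logits, torch.tensor([0, 1, 0, 1]))
loss.backward()                                  # gradients for one training step
print(logits.shape)                              # torch.Size([4, 2])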
Carrasco, Gonzalo A; Behling, Kathryn C; Lopez, Osvaldo J
2018-04-01
Student participation is important for the success of active learning strategies, but participation is often linked to the level of preparation. At our institution, we use two types of active learning activities, a modified case-based learning exercise called active learning groups (ALG) and team-based learning (TBL). These strategies have different assessment and incentive structures for participation. Non-cognitive skills are assessed in ALG using a subjective five-point Likert scale. In TBL, assessment of individual student preparation is based on a multiple choice quiz conducted at the beginning of each session. We studied first-year medical student participation and performance in ALG and TBL as well as performance on course final examinations. Student performance in TBL, but not in ALG, was strongly correlated with final examination scores. Additionally, in students who performed in the upper 33rd percentile on the final examination, there was a positive correlation between final examination performance and participation in TBL and ALG. This correlation was not seen in students who performed in the lower 33rd percentile on the final examinations. Our results suggest that assessments of medical knowledge during active learning exercises could supplement non-cognitive assessments and could be good predictors of performance on summative examinations.
Automated Tissue Classification Framework for Reproducible Chronic Wound Assessment
Mukherjee, Rashmi; Manohar, Dhiraj Dhane; Das, Dev Kumar; Achar, Arun; Mitra, Analava; Chakraborty, Chandan
2014-01-01
The aim of this paper was to develop a computer assisted tissue classification (granulation, necrotic, and slough) scheme for chronic wound (CW) evaluation using medical image processing and statistical machine learning techniques. The red-green-blue (RGB) wound images captured by a normal digital camera were first transformed into the HSI (hue, saturation, and intensity) color space, and the “S” component of the HSI color channels was subsequently selected as it provided higher contrast. Wound areas from 6 different types of CW were segmented from whole images using fuzzy divergence based thresholding by minimizing edge ambiguity. A set of color and textural features describing granulation, necrotic, and slough tissues in the segmented wound area were extracted using various mathematical techniques. Finally, statistical learning algorithms, namely, Bayesian classification and support vector machine (SVM), were trained and tested for wound tissue classification in different CW images. The performance of the wound area segmentation protocol was further validated by ground truth images labeled by clinical experts. It was observed that SVM with 3rd order polynomial kernel provided the highest accuracies, that is, 86.94%, 90.47%, and 75.53%, for classifying granulation, slough, and necrotic tissues, respectively. The proposed automated tissue classification technique achieved the highest overall accuracy, that is, 87.61%, with highest kappa statistic value (0.793). PMID:25114925
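Two steps of the pipeline above translate into short code: extracting the saturation ("S") component of an RGB image in HSI space, and training a 3rd-order polynomial-kernel SVM on color/texture feature vectors. The sketch below uses a random image and random feature table as placeholders for wound data, and a simple mean threshold instead of the fuzzy divergence thresholding of the paper:

# Sketch: HSI saturation channel extraction and a polynomial-kernel SVM.
import numpy as np
from sklearn.svm import SVC

def hsi_saturation(rgb):
    """S component of HSI: S = 1 - 3*min(R,G,B)/(R+G+B)."""
    rgb = rgb.astype(float)
    return 1.0 - 3.0 * rgb.min(axis=-1) / (rgb.sum(axis=-1) + 1e-8)

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(128, 128, 3))
s_channel = hsi_saturation(image)                 # higher-contrast channel
wound_mask = s_channel > s_channel.mean()         # crude stand-in for fuzzy thresholding

# Color/texture features per region and tissue labels (0 granulation, 1 slough, 2 necrotic)
features = rng.normal(size=(300, 10))             # placeholder feature table
labels = rng.integers(0, 3, size=300)

clf = SVC(kernel="poly", degree=3)                # 3rd-order polynomial kernel, as reported
clf.fit(features, labels)
print("training accuracy:", round(clf.score(features, labels), 3))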
Neurophysiological mechanisms involved in language learning in adults
Rodríguez-Fornells, Antoni; Cunillera, Toni; Mestres-Missé, Anna; de Diego-Balaguer, Ruth
2009-01-01
Little is known about the brain mechanisms involved in word learning during infancy and in second language acquisition and about the way these new words become stable representations that sustain language processing. In several studies we have adopted the human simulation perspective, studying the effects of brain-lesions and combining different neuroimaging techniques such as event-related potentials and functional magnetic resonance imaging in order to examine the language learning (LL) process. In the present article, we review this evidence focusing on how different brain signatures relate to (i) the extraction of words from speech, (ii) the discovery of their embedded grammatical structure, and (iii) how meaning derived from verbal contexts can inform us about the cognitive mechanisms underlying the learning process. We compile these findings and frame them into an integrative neurophysiological model that tries to delineate the major neural networks that might be involved in the initial stages of LL. Finally, we propose that LL simulations can help us to understand natural language processing and how the recovery from language disorders in infants and adults can be accomplished. PMID:19933142
Wei, Jianming; Zhang, Youan; Sun, Meimei; Geng, Baoliang
2017-09-01
This paper presents an adaptive iterative learning control scheme for a class of nonlinear systems with unknown time-varying delays and control direction preceded by unknown nonlinear backlash-like hysteresis. A boundary layer function is introduced to construct an auxiliary error variable, which relaxes the identical initial condition assumption of iterative learning control. For the controller design, an integral Lyapunov function candidate is used, which avoids the possible singularity problem by introducing the hyperbolic tangent function. After compensating for uncertainties with time-varying delays by combining an appropriate Lyapunov-Krasovskii function with Young's inequality, an adaptive iterative learning control scheme is designed through a neural approximation technique and the Nussbaum function method. On the basis of the hyperbolic tangent function's characteristics, the system output is proved to converge to a small neighborhood of the desired trajectory by constructing a Lyapunov-like composite energy function (CEF) in two cases, while keeping all the closed-loop signals bounded. Finally, a simulation example is presented to verify the effectiveness of the proposed approach. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Saqr, Mohammed; Fors, Uno; Tedre, Matti
2017-07-01
Learning analytics (LA) is an emerging discipline that aims at analyzing students' online data in order to improve the learning process and optimize learning environments. It has as-yet unexplored potential in the field of medical education, where it can be particularly helpful in the early prediction and identification of under-achieving students. The aim of this study was to identify quantitative markers collected from students' online activities that may correlate with students' final performance and to investigate the possibility of predicting the potential risk of a student failing or dropping out of a course. This study included 133 students enrolled in a blended medical course where they were free to use the learning management system at will. We extracted their online activity data using database queries and Moodle plugins. Data included logins, views, forums, time, formative assessment, and communications at different points of time. Five engagement indicators reflecting self-regulation and engagement were also calculated. Students who scored less than 5% above the passing mark were considered potentially at risk of under-achieving. At the end of the course, we were able to predict the final grade with 63.5% accuracy, and identify 53.9% of at-risk students. Using a binary logistic model improved prediction to 80.8%. Using data recorded until the mid-course, prediction accuracy was 42.3%. The most important predictors were factors reflecting the engagement of the students and the consistency of using the online resources. The analysis of students' online activities in a blended medical education course by means of LA techniques can help predict under-achieving students early, and can be used as an early warning sign for timely intervention.
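A minimal sketch of the kind of binary logistic model mentioned above, flagging at-risk students from engagement indicators, is shown below; the indicator names and data are hypothetical and do not reproduce the study's variables.

```python
# Sketch of a binary logistic model for flagging at-risk students from
# LMS engagement indicators; all columns and labels are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 133
X = np.column_stack([
    rng.poisson(50, n),    # logins
    rng.poisson(300, n),   # resource views
    rng.poisson(10, n),    # forum posts
    rng.uniform(0, 1, n),  # regularity/consistency index
])
at_risk = rng.integers(0, 2, n)  # 1 = scored near or below the pass mark

model = LogisticRegression(max_iter=1000)
print("CV accuracy:", cross_val_score(model, X, at_risk, cv=5).mean())
```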
A Deep Learning based Approach to Reduced Order Modeling of Fluids using LSTM Neural Networks
NASA Astrophysics Data System (ADS)
Mohan, Arvind; Gaitonde, Datta
2017-11-01
Reduced Order Modeling (ROM) can be used as a surrogate for prohibitively expensive simulations to model flow behavior over long time periods. ROM is predicated on extracting dominant spatio-temporal features of the flow from CFD or experimental datasets. We explore ROM development with a deep learning approach, which consists of learning functional relationships between different variables in large datasets for predictive modeling. Although deep learning and related artificial intelligence based predictive modeling techniques have shown varied success in other fields, such approaches are in their initial stages of application to fluid dynamics. Here, we explore the application of the Long Short Term Memory (LSTM) neural network to sequential data, specifically to predict the time coefficients of Proper Orthogonal Decomposition (POD) modes of the flow for future timesteps, by training it on data at previous timesteps. The approach is demonstrated by constructing ROMs of several canonical flows. Additionally, we show that statistical estimates of stationarity in the training data can indicate a priori how amenable a given flow-field is to this approach. Finally, the potential and limitations of deep learning based ROM approaches will be elucidated and further developments discussed.
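A bare-bones sketch of the LSTM setup described above, mapping a window of past POD time coefficients to the next-step coefficients, might look like the following; the layer sizes, window length and data are assumptions, not the authors' configuration.

```python
# Rough sketch: train an LSTM to predict POD time coefficients at the next
# timestep from a window of previous timesteps. Data are random placeholders.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

n_modes, window = 8, 20
a = np.random.randn(5000, n_modes)           # placeholder POD coefficients a_k(t)

X = np.stack([a[i:i + window] for i in range(len(a) - window)])
y = a[window:]                               # next-step coefficients

model = Sequential([
    LSTM(64, input_shape=(window, n_modes)),
    Dense(n_modes),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)
pred = model.predict(X[:1])                  # one-step-ahead prediction
```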
Students' Experiences of Clinic-Based Learning during a Final Year Veterinary Internship Programme
ERIC Educational Resources Information Center
Matthew, Susan M.; Taylor, Rosanne M.; Ellis, Robert A.
2010-01-01
This study investigated veterinary students' experiences of clinic-based learning (CBL) during a comprehensive final year internship programme. Open-ended surveys (n = 93) were used to gather qualitative data about students' conceptions of what is learned during CBL and their approaches to learning in clinics. Phenomenography was used for detailed…
Genetic programming based ensemble system for microarray data classification.
Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To
2015-01-01
Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a genetic programming (GP) based new ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and they become more and more accurate in the evolutionary process. The feature selection technique and balanced subsampling technique are applied to increase the diversity in each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary class and six multiclass microarray datasets, and results show that the algorithm can achieve better results in most cases compared with some other ensemble systems. By using elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved.
Genetic Programming Based Ensemble System for Microarray Data Classification
Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To
2015-01-01
Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a genetic programming (GP) based new ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and they become more and more accurate in the evolutionary process. The feature selection technique and balanced subsampling technique are applied to increase the diversity in each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary class and six multiclass microarray datasets, and results show that the algorithm can achieve better results in most cases compared with some other ensemble systems. By using elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved. PMID:25810748
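To make the Min, Max and Average combination step described above concrete, the sketch below trains a few decision-tree base classifiers on resampled data and merges their class probabilities with those three operators; the GP evolution, feature selection and balanced subsampling of GPES itself are omitted, and the data are synthetic.

```python
# Simplified illustration of combining decision-tree base classifiers with
# Min, Max and Average operators over class probabilities (not GPES itself).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=50, n_informative=10,
                           n_classes=3, random_state=0)

rng = np.random.default_rng(0)
trees = []
for _ in range(7):
    idx = rng.choice(len(X), size=len(X), replace=True)   # resampling stands in for balanced subsampling
    trees.append(DecisionTreeClassifier(max_depth=5).fit(X[idx], y[idx]))

probas = np.stack([t.predict_proba(X) for t in trees])    # (n_trees, n_samples, n_classes)
combiners = {"min": probas.min(0), "max": probas.max(0), "avg": probas.mean(0)}
for name, p in combiners.items():
    print(name, "training accuracy:", (p.argmax(1) == y).mean())
```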
Mozer, M C; Wolniewicz, R; Grimes, D B; Johnson, E; Kaushansky, H
2000-01-01
Competition in the wireless telecommunications industry is fierce. To maintain profitability, wireless carriers must control churn, which is the loss of subscribers who switch from one carrier to another. We explore techniques from statistical machine learning to predict churn and, based on these predictions, to determine what incentives should be offered to subscribers to improve retention and maximize profitability to the carrier. The techniques include logit regression, decision trees, neural networks, and boosting. Our experiments are based on a database of nearly 47,000 U.S. domestic subscribers that includes information about their usage, billing, credit, application, and complaint history. Our experiments show that under a wide variety of assumptions concerning the cost of intervention and the retention rate resulting from intervention, using predictive techniques to identify potential churners and offering incentives can yield significant savings to a carrier. We also show the importance of a data representation crafted by domain experts. Finally, we report on a real-world test of the techniques that validates our simulation experiments.
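For illustration, one of the techniques named above (boosting) applied to invented churn-style features might look like the following sketch; the columns and data are placeholders, not the subscriber database described in the study.

```python
# Hedged sketch of a churn model using gradient boosting on usage/billing
# style features; all data and column meanings are invented.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 5000
X = np.column_stack([
    rng.normal(300, 80, n),    # monthly minutes of use
    rng.normal(60, 20, n),     # monthly bill
    rng.integers(0, 5, n),     # complaint count
    rng.integers(300, 850, n), # credit score
])
churned = rng.integers(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, churned, test_size=0.25, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```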
Liu, Zongcheng; Dong, Xinmin; Xue, Jianping; Li, Hongbo; Chen, Yong
2016-09-01
This brief addresses the adaptive control problem for a class of pure-feedback systems with nonaffine functions possibly being nondifferentiable. Without using the mean value theorem, the difficulty of the control design for pure-feedback systems is overcome by modeling the nonaffine functions appropriately. With the help of neural network approximators, an adaptive neural controller is developed by combining the dynamic surface control (DSC) and minimal learning parameter (MLP) techniques. The key features of our approach are that, first, the restrictive assumptions on the partial derivative of nonaffine functions are removed, second, the DSC technique is used to avoid "the explosion of complexity" in the backstepping design, and the number of adaptive parameters is reduced significantly using the MLP technique, third, smooth robust compensators are employed to circumvent the influences of approximation errors and disturbances. Furthermore, it is proved that all the signals in the closed-loop system are semiglobal uniformly ultimately bounded. Finally, the simulation results are provided to demonstrate the effectiveness of the designed method.
Learning styles and approaches to learning among medical undergraduates and postgraduates
2013-01-01
Background The challenge of imparting a large amount of knowledge within a limited time period in a way it is retained, remembered and effectively interpreted by a student is considerable. This has resulted in crucial changes in the field of medical education, with a shift from didactic teacher centered and subject based teaching to the use of interactive, problem based, student centered learning. This study tested the hypothesis that learning styles (visual, auditory, read/write and kinesthetic) and approaches to learning (deep, strategic and superficial) differ among first and final year undergraduate medical students, and postgraduate medical trainees. Methods We used self administered VARK and ASSIST questionnaires to assess the differences in learning styles and approaches to learning among medical undergraduates of the University of Colombo and postgraduate trainees of the Postgraduate Institute of Medicine, Colombo. Results A total of 147 participated: 73 (49.7%) first year students, 40 (27.2%) final year students and 34 (23.1%) postgraduate students. The majority (69.9%) of first year students had multimodal learning styles. Among final year students, the majority (67.5%) had multimodal learning styles, and among postgraduates, the majority were unimodal (52.9%) learners. Among all three groups, the predominant approach to learning was strategic. Postgraduates had significantly higher mean scores for deep and strategic approaches than first years or final years (p < 0.05). Mean scores for the superficial approach did not differ significantly between groups. Conclusions The learning approaches suggest a positive shift towards deep and strategic learning in postgraduate students. However a similar difference was not observed in undergraduate students from first year to final year, suggesting that their curriculum may not have influenced learning methodology over a five year period. PMID:23521845
Learning styles and approaches to learning among medical undergraduates and postgraduates.
Samarakoon, Lasitha; Fernando, Tharanga; Rodrigo, Chaturaka
2013-03-25
The challenge of imparting a large amount of knowledge within a limited time period in a way it is retained, remembered and effectively interpreted by a student is considerable. This has resulted in crucial changes in the field of medical education, with a shift from didactic teacher centered and subject based teaching to the use of interactive, problem based, student centered learning. This study tested the hypothesis that learning styles (visual, auditory, read/write and kinesthetic) and approaches to learning (deep, strategic and superficial) differ among first and final year undergraduate medical students, and postgraduate medical trainees. We used self administered VARK and ASSIST questionnaires to assess the differences in learning styles and approaches to learning among medical undergraduates of the University of Colombo and postgraduate trainees of the Postgraduate Institute of Medicine, Colombo. A total of 147 participated: 73 (49.7%) first year students, 40 (27.2%) final year students and 34 (23.1%) postgraduate students. The majority (69.9%) of first year students had multimodal learning styles. Among final year students, the majority (67.5%) had multimodal learning styles, and among postgraduates, the majority were unimodal (52.9%) learners. Among all three groups, the predominant approach to learning was strategic. Postgraduates had significantly higher mean scores for deep and strategic approaches than first years or final years (p < 0.05). Mean scores for the superficial approach did not differ significantly between groups. The learning approaches suggest a positive shift towards deep and strategic learning in postgraduate students. However a similar difference was not observed in undergraduate students from first year to final year, suggesting that their curriculum may not have influenced learning methodology over a five year period.
Masud, Tahir; Blundell, Adrian; Gordon, Adam Lee; Mulpeter, Ken; Roller, Regina; Singler, Katrin; Goeldlin, Adrian; Stuck, Andreas
2014-01-01
Introduction: the rise in the number of older, frail adults necessitates that future doctors are adequately trained in the skills of geriatric medicine. Few countries have dedicated curricula in geriatric medicine at the undergraduate level. The aim of this project was to develop a consensus among geriatricians on a curriculum with the minimal requirements that a medical student should achieve by the end of medical school. Methods: a modified Delphi process was used. First, educational experts and geriatricians proposed a set of learning objectives based on a literature review. Second, three Delphi rounds involving a panel with 49 experts representing 29 countries affiliated to the European Union of Medical Specialists (UEMS) was used to gain consensus for a final curriculum. Results: the number of disagreements following Delphi Rounds 1 and 2 were 81 and 53, respectively. Complete agreement was reached following the third round. The final curriculum consisted of detailed objectives grouped under 10 overarching learning outcomes. Discussion: a consensus on the minimum requirements of geriatric learning objectives for medical students has been agreed by European geriatricians. Major efforts will be needed to implement these requirements, given the large variation in the quality of geriatric teaching in medical schools. This curriculum is a first step to help improve teaching of geriatrics in medical schools, and will also serve as a basis for advancing postgraduate training in geriatrics across Europe. PMID:24603283
Additive Manufacturing Design Considerations for Liquid Engine Components
NASA Technical Reports Server (NTRS)
Whitten, Dave; Hissam, Andy; Baker, Kevin; Rice, Darron
2014-01-01
The Marshall Space Flight Center's Propulsion Systems Department has gained significant experience in the last year designing, building, and testing liquid engine components using additive manufacturing. The department has developed valve, duct, turbo-machinery, and combustion device components using this technology. Many valuable lessons were learned during this process. These lessons will be the focus of this presentation. We will present criteria for selecting part candidates for additive manufacturing. Some part characteristics are 'tailor made' for this process. Selecting the right parts for the process is the first step to maximizing productivity gains. We will also present specific lessons we learned about feature geometry that can and cannot be produced using additive manufacturing machines. Most liquid engine components were made using a two-step process. The base part was made using additive manufacturing and then traditional machining processes were used to produce the final part. The presentation will describe design accommodations needed to make the base part and lessons we learned about which features could be built directly and which require the final machine process. Tolerance capabilities, surface finish, and material thickness allowances will also be covered. Additive Manufacturing can produce internal passages that cannot be made using traditional approaches. It can also eliminate a significant amount of manpower by reducing part count and leveraging model-based design and analysis techniques. Information will be shared about performance enhancements and design efficiencies we experienced for certain categories of engine parts.
Goldstein, Benjamin A.; Navar, Ann Marie; Carter, Rickey E.
2017-01-01
Abstract Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for development of risk prediction models. Typically presented as black box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider the task of predicting mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction for those working on risk modelling to approach the diffuse field of machine learning. PMID:27436868
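As one hedged example of the workflow discussed above, a tree-based model for post-MI mortality built on a small set of laboratory markers, with simple handling of missing values, is sketched below; the markers and outcomes are synthetic stand-ins, not the institutional EHR data.

```python
# Illustrative sketch only: random forest on lab-marker features with simple
# median imputation for missing values. Data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n, n_markers = 2000, 13
X = rng.normal(size=(n, n_markers))
X[rng.random(X.shape) < 0.1] = np.nan      # mimic missing lab values
died = rng.integers(0, 2, n)

model = make_pipeline(SimpleImputer(strategy="median"),
                      RandomForestClassifier(n_estimators=200, random_state=0))
print("CV AUC:", cross_val_score(model, X, died, cv=5, scoring="roc_auc").mean())
```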
Koohestani, Hamid Reza; Baghcheghi, Nayereh
2016-01-01
Background: Team-based learning is a structured type of cooperative learning that is becoming increasingly more popular in nursing education. This study compares levels of nursing students' perception of the psychosocial climate of the classroom between a conventional lecture group and a team-based learning group. Methods: In a quasi-experimental study with a pretest-posttest design, 38 second-year nursing students participated. One half of the 16 sessions of the cardiovascular disease nursing course was taught by lectures and the second half with team-based learning. The modified college and university classroom environment inventory (CUCEI) was used to measure the perception of classroom environment. This was completed after the final lecture and TBL sessions. Results: Results revealed a significant difference in the mean scores of psycho-social climate for the TBL method (Mean (SD): 179.8 (8.27)) versus the mean score for the lecture method (Mean (SD): 154.2 (13.44)). Also, the results showed significant differences between the two groups in the innovation (p<0.001), student cohesiveness (p=0.01), cooperation (p<0.001) and equity (p=0.03) sub-scale scores (p<0.05). Conclusion: This study provides evidence that team-based learning does have a positive effect on nursing students' perceptions of their psycho-social climate of the classroom.
Deep convolutional neural network based antenna selection in multiple-input multiple-output system
NASA Astrophysics Data System (ADS)
Cai, Jiaxin; Li, Yan; Hu, Ying
2018-03-01
Antenna selection in wireless communication systems has attracted increasing attention due to the challenge of keeping a balance between communication performance and computational complexity in large-scale Multiple-Input Multiple-Output antenna systems. Recently, deep learning based methods have achieved promising performance for large-scale data processing and analysis in many application fields. This paper is the first attempt to introduce the deep learning technique into the field of Multiple-Input Multiple-Output antenna selection in wireless communications. First, the label of the attenuation-coefficient channel matrix is generated by minimizing the key performance indicator of the training antenna systems. Then, a deep convolutional neural network that explicitly exploits the massive latent cues of the attenuation coefficients is learned on the training antenna systems. Finally, we use the trained deep convolutional neural network to classify the channel matrix labels of test antennas and select the optimal antenna subset. Simulation results demonstrate that our method can achieve better performance than the state-of-the-art baselines for data-driven wireless antenna selection.
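The idea above, treating the channel (attenuation-coefficient) matrix as an image and learning a CNN that predicts the label of the optimal antenna subset, can be caricatured as follows; the architecture, matrix sizes and labels are illustrative assumptions rather than the paper's design.

```python
# Minimal sketch: a small CNN that classifies |channel matrix| samples into
# antenna-subset labels. Sizes, labels and data are invented placeholders.
import numpy as np
from tensorflow.keras import layers, models

n_rx, n_tx, n_subsets = 8, 8, 28                    # e.g. choose 2 of 8 antennas
H = np.abs(np.random.randn(2000, n_rx, n_tx, 1))    # |channel matrix| samples
labels = np.random.randint(0, n_subsets, 2000)      # index of optimal subset

model = models.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=(n_rx, n_tx, 1)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(n_subsets, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(H, labels, epochs=3, batch_size=64, verbose=0)
```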
Optimal structure and parameter learning of Ising models
Lokhov, Andrey; Vuffray, Marc Denis; Misra, Sidhant; ...
2018-03-16
Reconstruction of the structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning. The focus of the research community shifted toward developing universal reconstruction algorithms that are both computationally efficient and require the minimal amount of expensive data. Here, we introduce a new method, interaction screening, which accurately estimates model parameters using local optimization problems. The algorithm provably achieves perfect graph structure recovery with an information-theoretically optimal number of samples, notably in the low-temperature regime, which is known to be the hardest for learning. Here, the efficacy of interaction screening is assessed through extensive numerical tests on synthetic Ising models of various topologies with different types of interactions, as well as on real data produced by a D-Wave quantum computer. Finally, this study shows that the interaction screening method is an exact, tractable, and optimal technique that universally solves the inverse Ising problem.
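For reference, the interaction screening estimator is typically written, for each spin i, as the minimizer of a convex local objective of roughly the following form (notation simplified; the regularization and exact formulation in the paper may differ):

```latex
% Interaction Screening Objective for spin i, estimated from n samples sigma^{(k)}
\mathcal{S}_n(\theta_i) \;=\; \frac{1}{n}\sum_{k=1}^{n}
  \exp\!\Big(-\sigma_i^{(k)}\Big(\sum_{j\neq i}\theta_{ij}\,\sigma_j^{(k)} + \theta_i\Big)\Big),
\qquad
\hat{\theta}_i \;=\; \operatorname*{arg\,min}_{\theta_i}\;
  \mathcal{S}_n(\theta_i) \;+\; \lambda\,\lVert\theta_i\rVert_1 .
```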
Public Participation, Education, and Engagement in Drought Planning
NASA Astrophysics Data System (ADS)
Bathke, D. J.; Wall, N.; Haigh, T.; Smith, K. H.; Bernadt, T.
2014-12-01
Drought is a complex problem that typically goes beyond the capacity, resources, and jurisdiction of any single person, program, organization, political boundary, or sector. Thus, by nature, monitoring, planning for, and reducing drought risk must be a collaborative process. The National Drought Mitigation Center, in partnership with the National Integrated Drought Information System (NIDIS) Program Office and others, provides active engagement and education drought professionals, stakeholders, and the general public about managing drought-related risks through resilience planning, monitoring, and education. Using case studies, we discuss recruitment processes, network building, participation techniques, and educational methods as they pertain to a variety of unique audiences with distinct objectives. Examples include collaborative decision-making at a World Meteorological Organization conference; planning, and peer-learning among drought professionals in a community of practice; drought condition monitoring through citizen science networks; research and education dissemination with stakeholder groups; and informal learning activities for all ages. Finally, we conclude with evaluation methods, indicators of success, and lessons learned for increasing the effectiveness of our programs in increasing drought resilience.
Optimal structure and parameter learning of Ising models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lokhov, Andrey; Vuffray, Marc Denis; Misra, Sidhant
Reconstruction of the structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning. The focus of the research community shifted toward developing universal reconstruction algorithms that are both computationally efficient and require the minimal amount of expensive data. Here, we introduce a new method, interaction screening, which accurately estimates model parameters using local optimization problems. The algorithm provably achieves perfect graph structure recovery with an information-theoretically optimal number of samples, notably in the low-temperature regime, which is known to be the hardest for learning. Here, the efficacy of interaction screening is assessed through extensive numerical tests on synthetic Ising models of various topologies with different types of interactions, as well as on real data produced by a D-Wave quantum computer. Finally, this study shows that the interaction screening method is an exact, tractable, and optimal technique that universally solves the inverse Ising problem.
Peer assisted learning as a formal instructional tool.
Naqi, Syed Asghar
2014-03-01
To explore the utility of peer assisted learning (PAL) in medical schools as a formal instructional tool. Grounded theory approach. King Edward Medical University, Lahore, from July 2011 to December 2011. A study was designed using semi-structured in-depth interviews to collect data from final year medical students (n=6), residents (n=4) and faculty members (n=3), selected on the basis of non-probability purposive sampling. The qualitative data thus generated was first translated in English and transcribed and organized into major categories by using a coding framework. Participants were interviewed two more times to further explore their perceptions and experiences related to emergent categories. An iterative process was employed using grounded theory analysis technique to eventually generate theory. PAL was perceived as rewarding in terms of fostering higher order thinking, effective teaching skills and in improving self efficacy among learners. PAL can offer learning opportunity to medical students, residents and faculty members. It can improve depth of their knowledge and skills.
NASA Astrophysics Data System (ADS)
Otsuka, Yuichi; Ohta, Kazuhide; Noguchi, Hiroshi
The 21st Century Center of Excellence (COE) program in the Department of Mechanical Engineering Science at Kyushu University constructed a training framework for learning "Integrating Techniques" through research presentations given to students in different majors and accident analyses of practical cases carried out by Ph.D. course students. The training framework is composed of three processes: 1) peer review of the presentations among Ph.D. course students, 2) instruction by teachers to improve the quality of the presentations based on the results of the peer reviews, and 3) final evaluation of the improved presentations by teachers and students. This research quantified the effectiveness of the framework through questionnaire-based evaluations of the presentations. Furthermore, a survey of the course students showed a positive correlation between the perceived significance of integrating techniques and enthusiasm for participating in the course, which supports the efficacy of the proposed learning framework.
NASA Astrophysics Data System (ADS)
Han, Ke-Zhen; Feng, Jian; Cui, Xiaohong
2017-10-01
This paper considers the fault-tolerant optimised tracking control (FTOTC) problem for an unknown discrete-time linear system. A research scheme is proposed on the basis of data-based parity space identification, reinforcement learning and residual compensation techniques. The main characteristic of this research scheme lies in the parity-space-identification-based simultaneous tracking control and residual compensation. The specific technical line consists of four main contents: apply a subspace-aided method to design an observer-based residual generator; use a reinforcement Q-learning approach to solve the optimised tracking control policy; rely on robust H∞ theory to achieve noise attenuation; adopt fault estimation triggered by the residual generator to perform fault compensation. To clarify the design and implementation procedures, an integrated algorithm is further constructed to link up these four functional units. Detailed analysis and proofs are subsequently given to explain the guaranteed FTOTC performance of the proposed scheme. Finally, a case simulation is provided to verify its effectiveness.
Genotype-phenotype association study via new multi-task learning model
Huo, Zhouyuan; Shen, Dinggang
2018-01-01
Research on the associations between genetic variations and imaging phenotypes is developing with the advance in high-throughput genotype and brain image techniques. Regression analysis of single nucleotide polymorphisms (SNPs) and imaging measures as quantitative traits (QTs) has been proposed to identify the quantitative trait loci (QTL) via multi-task learning models. Recent studies consider the interlinked structures within SNPs and imaging QTs through group lasso, e.g. ℓ2,1-norm, leading to better predictive results and insights of SNPs. However, group sparsity is not enough for representing the correlation between multiple tasks and ℓ2,1-norm regularization is not robust either. In this paper, we propose a new multi-task learning model to analyze the associations between SNPs and QTs. We suppose that low-rank structure is also beneficial to uncover the correlation between genetic variations and imaging phenotypes. Finally, we conduct regression analysis of SNPs and QTs. Experimental results show that our model is more accurate in prediction than compared methods and presents new insights of SNPs. PMID:29218896
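A generic form of such a multi-task objective, assuming a squared loss with both a group-sparse (ℓ2,1) penalty and a low-rank (nuclear-norm) penalty, is sketched below; the exact weighting and constraints used in the paper may differ.

```latex
% X: SNP genotype matrix, Y: imaging QT matrix, W: regression coefficients
\min_{W}\; \lVert XW - Y\rVert_F^{2}
  \;+\; \lambda_{1}\,\lVert W\rVert_{2,1}
  \;+\; \lambda_{2}\,\lVert W\rVert_{*},
\qquad
\lVert W\rVert_{2,1} \;=\; \sum_{i}\Big(\sum_{j} W_{ij}^{2}\Big)^{1/2} .
```

Here the ℓ2,1 term encourages row-wise (SNP-wise) sparsity, while the nuclear norm encourages a shared low-rank structure across the imaging QTs.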
OBSERVATIONS ABOUT HOW WE LEARN ABOUT METHODOLOGY AND STATISTICS.
Jose, Paul E
2017-06-01
The overarching theme of this monograph is to encourage developmental researchers to acquire cutting-edge and innovative design and statistical methods so that we can improve the studies that we execute on the topic of change. Card, the editor of the monograph, challenges the reader to think about works such as the present one as contributing to the new subdiscipline of developmental methodology within the broader field of developmental science. This thought-provoking stance served as the stimulus for the present commentary, which is a collection of observations on "how we learn about methodology and statistics." The point is made that we often learn critical new information from our colleagues, from seminal writings in the literature, and from conferences and workshop participation. It is encouraged that researchers pursue all three of these pathways as ways to acquire innovative knowledge and techniques. Finally, the role of developmental science societies in supporting the dissemination and uptake of this type of knowledge is discussed. © 2017 The Society for Research in Child Development, Inc.
Zheng, Shiqi; Tang, Xiaoqi; Song, Bao; Lu, Shaowu; Ye, Bosheng
2013-07-01
In this paper, a stable adaptive PI control strategy based on the improved just-in-time learning (IJITL) technique is proposed for a permanent magnet synchronous motor (PMSM) drive. Firstly, the traditional JITL technique is improved. The new IJITL technique has a lower computational burden than traditional JITL and is therefore more suitable for online identification of the PMSM drive system, which has demanding real-time requirements. In this way, the PMSM drive system is identified by the IJITL technique, which provides information to an adaptive PI controller. Secondly, the adaptive PI controller is designed in the discrete-time domain and is composed of a PI controller and a supervisory controller. The PI controller is capable of automatically tuning the control gains online based on the gradient descent method, and the supervisory controller is developed to eliminate the effect of the approximation error introduced by the PI controller upon the system stability in the Lyapunov sense. Finally, experimental results on the PMSM drive system show accurate identification and favorable tracking performance. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
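The gain-adaptation idea above, tuning the PI gains by gradient descent on the squared tracking error with a plant sensitivity supplied by the local (just-in-time) model, can be caricatured by the toy loop below; the plant, constants and update form are assumptions for illustration, not the authors' algorithm.

```python
# Bare-bones illustration of online PI gain tuning by gradient descent on
# 0.5*e^2. dy_du stands in for the local plant sensitivity that a
# just-in-time-learning model would supply. Plant and constants are made up.
def simulate(kp=0.5, ki=0.1, eta=1e-3, steps=500, ref=1.0):
    y, integ = 0.0, 0.0
    a, b = 0.9, 0.2                     # toy first-order plant: y_next = a*y + b*u
    for _ in range(steps):
        e = ref - y
        integ += e
        u = kp * e + ki * integ
        dy_du = b                       # sensitivity (from the identified local model)
        # dJ/dKp = e * de/dKp = -e * dy_du * e ; gradient descent flips the sign
        kp += eta * e * dy_du * e
        ki += eta * e * dy_du * integ
        y = a * y + b * u
    return y, kp, ki

print(simulate())
```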
Long-term spacing effect benefits in developmental amnesia: case experiments in rehabilitation.
Green, Janet L; Weston, Tina; Wiseheart, Melody; Rosenbaum, R Shayna
2014-09-01
The spacing effect describes the typical finding that repeated items are remembered best when additional items are introduced between each repetition than when the repetitions occur in immediate succession. In this study, we investigated the nature and limits of the spacing effect in the developmental amnesic case H.C. In Experiment 1, we compared the performance of H.C. to that of controls on a short-term, free recall, verbal learning spacing paradigm while controlling for retention interval (timing of item review and recall). In Experiment 2, we compared the performance of H.C. to that of controls on a multiday, cued recall, verbal learning spacing paradigm, in which memory was assessed after 1 week. In both experiments, H.C. demonstrated a spacing effect comparable to the effect exhibited by controls. In Experiment 1, her final recall memory for long-lag (spaced) items was better than recall for no-lag (massed) items t(23) = 10.99, p < .001, d = 2.5. In Experiment 2, her final cued recall memory for next-day-reviewed (spaced) items was better than cued recall for same-day-reviewed (massed) items, t(20) = 17.6, p < .001, d = 4.1. This study demonstrates the spacing effect in a person with impaired episodic memory development and is the first to show long-term benefits of spacing in amnesia. Substantially slower learning-to-criterion suggests an alternate mechanism supporting the spacing effect, perhaps independent of the hippocampus. Spacing should be considered as a candidate memory intervention technique given its effectiveness in both short- and long-term learning settings. (c) 2014 APA, all rights reserved.
Pimmer, Christoph; Brysiewicz, Petra; Linxen, Sebastian; Walters, Fiona; Chipps, Jennifer; Gröhbiel, Urs
2014-11-01
With the proliferation of portable digital technology, mobile learning is becoming increasingly popular in nursing education and practice. Most of the research in this field has been concentrated on small-scale projects in high income countries. Very little is known about the ways in which nurses and midwives use mobile technology in remote and resource poor areas in informal learning contexts in low and middle income countries. To address this gap, this study investigates whether nurses use mobile phones as effective educational tools in marginalized and remote areas, and if so, how and why. In rural South Africa, 16 nurses who attended an advanced midwifery education program, facilitators and clinical managers were interviewed about their use of digital mobile technology for learning. Techniques of qualitative content analysis were used to examine the data. Several rich "organically-grown", learning practices were identified: mobile phone usage facilitated (1) authentic problem solving; (2) reflective practice; (3) emotional support and belongingness; (4) the realization of unpredictable teaching situations; and (5) life-long learning. It is concluded that mobile phones, and the convergence of mobile phones and social media, in particular, change learning environments. In addition, these tools are suitable to connect learners and learning distributed in marginalized areas. Finally, a few suggestions are made about how these insights from informal settings can inform the development of more systematic mobile learning formats. Copyright © 2014 Elsevier Ltd. All rights reserved.
Wise, Christopher H; Schenk, Ronald J; Lattanzi, Jill Black
2016-07-01
Despite emerging evidence to support the use of high velocity thrust manipulation in the management of lumbar spinal conditions, utilization of thrust manipulation among clinicians remains relatively low. One reason for the underutilization of these procedures may be related to disparity in training in the performance of these techniques at the professional and post professional levels. To assess the effect of using a new model of active learning on participant confidence in the performance of spinal thrust manipulation and the implications for its use in the professional and post-professional training of physical therapists. A cohort of 15 DPT students in their final semester of entry-level professional training participated in an active training session emphasizing a sequential partial task practice (SPTP) strategy in which participants engaged in partial task practice over several repetitions with different partners. Participants' level of confidence in the performance of these techniques was determined through comparison of pre- and post-training session surveys and a post-session open-ended interview. The increase in scores across all items of the individual pre- and post-session surveys suggests that this model was effective in changing overall participant perception regarding the effectiveness and safety of these techniques and in increasing student confidence in their performance. Interviews revealed that participants greatly preferred the SPTP strategy, which enhanced their confidence in technique performance. Results indicate that this new model of psychomotor training may be effective at improving confidence in the performance of spinal thrust manipulation and, subsequently, may be useful for encouraging the future use of these techniques in the care of individuals with impairments of the spine. Inasmuch, this method of instruction may be useful for training of physical therapists at both the professional and post-professional levels.
Opportunities to Create Active Learning Techniques in the Classroom
ERIC Educational Resources Information Center
Camacho, Danielle J.; Legare, Jill M.
2015-01-01
The purpose of this article is to contribute to the growing body of research that focuses on active learning techniques. Active learning techniques require students to consider a given set of information, analyze, process, and prepare to restate what has been learned--all strategies are confirmed to improve higher order thinking skills. Active…
Factors related to student performance in statistics courses in Lebanon
NASA Astrophysics Data System (ADS)
Naccache, Hiba Salim
The purpose of the present study was to identify factors that may contribute to business students in Lebanese universities having difficulty in introductory and advanced statistics courses. Two statistics courses are required for business majors at Lebanese universities. Students are not required to take any math courses prior to the statistics courses. Drawing on recent educational research, this dissertation attempted to identify the relationship between (1) students’ scores on Lebanese university math admissions tests; (2) students’ scores on a test of very basic mathematical concepts; (3) students’ scores on the survey of attitude toward statistics (SATS); (4) course performance as measured by students’ final scores in the course; and (5) their scores on the final exam. Data were collected from 561 students enrolled in multiple sections of two courses: 307 students in the introductory statistics course and 260 in the advanced statistics course, across seven campuses in Lebanon over one semester. The multiple regression results revealed four significant relationships at the introductory level: between students’ scores on the math quiz and (1) their final exam scores and (2) their final averages, and between the Cognitive subscale of the SATS and (3) their final exam scores and (4) their final averages. These four significant relationships were also found at the advanced level. In addition, two more significant relationships were found between students’ final averages and the Effort (5) and Affect (6) subscales. No relationship was found between students’ scores on the admission math tests and either their final exam scores or their final averages in both the introductory and advanced level courses. Although these results were consistent across course formats and instructors, they may encourage Lebanese universities to assess the effectiveness of prerequisite math courses. Moreover, these findings may lead the Lebanese Ministry of Education to make changes to the admissions exams, course prerequisites, and course content. Finally, to enhance the attitude of students, new learning techniques, such as group work during class meetings, can be helpful, and future research should aim to test the effectiveness of these pedagogical techniques on students’ attitudes toward statistics.
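The core analysis described above amounts to a multiple regression of exam outcomes on the math-quiz score and SATS subscales; a sketch with invented column names and synthetic data follows.

```python
# Sketch of the kind of multiple regression described above; the column
# names, coefficients and data are placeholders, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 307
df = pd.DataFrame({
    "math_quiz": rng.normal(12, 3, n),
    "sats_cognitive": rng.normal(5, 1, n),
    "sats_effort": rng.normal(5, 1, n),
    "sats_affect": rng.normal(4, 1, n),
})
df["final_exam"] = (40 + 2.0 * df["math_quiz"] + 3.0 * df["sats_cognitive"]
                    + rng.normal(0, 8, n))

X = sm.add_constant(df[["math_quiz", "sats_cognitive", "sats_effort", "sats_affect"]])
print(sm.OLS(df["final_exam"], X).fit().summary())
```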
MicroRNA based Pan-Cancer Diagnosis and Treatment Recommendation.
Cheerla, Nikhil; Gevaert, Olivier
2017-01-13
The current state-of-the-art in cancer diagnosis and treatment is not ideal; diagnostic tests are accurate but invasive, and treatments are "one-size-fits-all" instead of being personalized. Recently, miRNAs have garnered significant attention as cancer biomarkers, owing to their ease of access (circulating miRNA in the blood) and stability. There have been many studies showing the effectiveness of miRNA data in diagnosing specific cancer types, but few studies explore the role of miRNA in predicting treatment outcome. Here we go a step further, using tissue miRNA and clinical data across 21 cancers from The Cancer Genome Atlas (TCGA) database. We use machine learning techniques to create an accurate pan-cancer diagnosis system, and a prediction model for treatment outcomes. Finally, using these models, we create a web-based tool that diagnoses cancer and recommends the best treatment options. We achieved 97.2% accuracy for classification using a support vector machine classifier with a radial basis function kernel. The accuracies improved to 99.9-100% when climbing up the embryonic tree and classifying cancers at different stages. We define the accuracy as the ratio of the total number of instances correctly classified to the total instances. The classifier also performed well, achieving greater than 80% sensitivity for many cancer types on independent validation datasets. Many miRNAs selected by our feature selection algorithm had strong previous associations to various cancers and tumor progression. Then, using miRNA, clinical and treatment data and encoding them in a machine-learning-readable format, we built a prognosis predictor model to predict the outcome of treatment with 85% accuracy. We used this model to create a tool that recommends personalized treatment regimens. Both the diagnosis and prognosis models, incorporating semi-supervised learning techniques to improve their accuracies with repeated use, were uploaded online for easy access. Our research is a step towards the final goal of diagnosing cancer and predicting treatment recommendations using non-invasive blood tests.
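As a hedged sketch (synthetic data, not TCGA), the pan-cancer classification step with an RBF-kernel support vector machine could look like this:

```python
# Sketch of a pan-cancer classifier on miRNA expression profiles using an
# RBF-kernel SVM; the expression matrix and labels are synthetic stand-ins.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_samples, n_mirnas, n_cancer_types = 1000, 400, 21
X = rng.normal(size=(n_samples, n_mirnas))       # miRNA expression matrix
y = rng.integers(0, n_cancer_types, n_samples)   # cancer-type labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```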
Molecular electronics: insight from first-principles transport simulations.
Paulsson, Magnus; Frederiksen, Thomas; Brandbyge, Mads
2010-01-01
Conduction properties of nanoscale contacts can be studied using first-principles simulations. Such calculations give insight into details behind the conductance that is not readily available in experiments. For example, we may learn how the bonding conditions of a molecule to the electrodes affect the electronic transport. Here we describe key computational ingredients and discuss these in relation to simulations for scanning tunneling microscopy (STM) experiments with C60 molecules where the experimental geometry is well characterized. We then show how molecular dynamics simulations may be combined with transport calculations to study more irregular situations, such as the evolution of a nanoscale contact with the mechanically controllable break-junction technique. Finally we discuss calculations of inelastic electron tunnelling spectroscopy as a characterization technique that reveals information about the atomic arrangement and transport channels.
Mason, Robert A; Just, Marcel Adam
2015-05-01
Incremental instruction on the workings of a set of mechanical systems induced a progression of changes in the neural representations of the systems. The neural representations of four mechanical systems were assessed before, during, and after three phases of incremental instruction (which first provided information about the system components, then provided partial causal information, and finally provided full functional information). In 14 participants, the neural representations of four systems (a bathroom scale, a fire extinguisher, an automobile braking system, and a trumpet) were assessed using three recently developed techniques: (1) machine learning and classification of multi-voxel patterns; (2) localization of consistently responding voxels; and (3) representational similarity analysis (RSA). The neural representations of the systems progressed through four stages, or states, involving spatially and temporally distinct multi-voxel patterns: (1) initially, the representation was primarily visual (occipital cortex); (2) it subsequently included a large parietal component; (3) it eventually became cortically diverse (frontal, parietal, temporal, and medial frontal regions); and (4) at the end, it demonstrated a strong frontal cortex weighting (frontal and motor regions). At each stage of knowledge, it was possible for a classifier to identify which one of four mechanical systems a participant was thinking about, based on their brain activation patterns. The progression of representational states was suggestive of progressive stages of learning: (1) encoding information from the display; (2) mental animation, possibly involving imagining the components moving; (3) generating causal hypotheses associated with mental animation; and finally (4) determining how a person (probably oneself) would interact with the system. This interpretation yields an initial, cortically-grounded, theory of learning of physical systems that potentially can be related to cognitive learning theories by suggesting links between cortical representations, stages of learning, and the understanding of simple systems. Copyright © 2015 Elsevier Inc. All rights reserved.
Stone, James R; Wilde, Elisabeth A; Taylor, Brian A; Tate, David F; Levin, Harvey; Bigler, Erin D; Scheibel, Randall S; Newsome, Mary R; Mayer, Andrew R; Abildskov, Tracy; Black, Garrett M; Lennon, Michael J; York, Gerald E; Agarwal, Rajan; DeVillasante, Jorge; Ritter, John L; Walker, Peter B; Ahlers, Stephen T; Tustison, Nicholas J
2016-01-01
White matter hyperintensities (WMHs) are foci of abnormal signal intensity in white matter regions seen with magnetic resonance imaging (MRI). WMHs are associated with normal ageing and have shown prognostic value in neurological conditions such as traumatic brain injury (TBI). The impracticality of manually quantifying these lesions limits their clinical utility and motivates the utilization of machine learning techniques for automated segmentation workflows. This study develops a concatenated random forest framework with image features for segmenting WMHs in a TBI cohort. The framework is built upon the Advanced Normalization Tools (ANTs) and ANTsR toolkits. MR (3D FLAIR, T2- and T1-weighted) images from 24 service members and veterans scanned in the Chronic Effects of Neurotrauma Consortium's (CENC) observational study were acquired. Manual annotations were employed for both training and evaluation using a leave-one-out strategy. Performance measures include sensitivity, positive predictive value, [Formula: see text] score and relative volume difference. Final average results were: sensitivity = 0.68 ± 0.38, positive predictive value = 0.51 ± 0.40, [Formula: see text] = 0.52 ± 0.36, relative volume difference = 43 ± 26%. In addition, three lesion size ranges are selected to illustrate the variation in performance with lesion size. Paired with correlative outcome data, supervised learning methods may allow for identification of imaging features predictive of diagnosis and prognosis in individual TBI patients.
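A much-reduced sketch of the voxelwise classification idea above is shown below: each voxel becomes a feature vector of multi-contrast intensities and a random forest is trained against manual lesion labels. The ANTs/ANTsR feature images and the concatenated (cascaded) training stages of the actual framework are omitted, and the data are synthetic.

```python
# Reduced sketch of voxelwise lesion classification with a random forest;
# features, labels and the train/test split are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(6)
n_voxels = 20000
X = np.column_stack([
    rng.normal(size=n_voxels),   # FLAIR intensity
    rng.normal(size=n_voxels),   # T1 intensity
    rng.normal(size=n_voxels),   # T2 intensity
    rng.normal(size=n_voxels),   # local mean / texture feature
])
y = (rng.random(n_voxels) < 0.05).astype(int)    # 1 = lesion voxel (manual label)

rf = RandomForestClassifier(n_estimators=100, class_weight="balanced",
                            random_state=0).fit(X[:15000], y[:15000])
pred = rf.predict(X[15000:])
print("sensitivity:", recall_score(y[15000:], pred, zero_division=0))
print("PPV:", precision_score(y[15000:], pred, zero_division=0))
```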
How to start a minimal access mitral valve program.
Hunter, Steven
2013-11-01
The seven pillars of governance established by the National Health Service in the United Kingdom provide a useful framework for the process of introducing new procedures to a hospital. Drawing from local experience, the author presents guidance for institutions considering establishing a minimal access mitral valve program. The seven pillars of governance apply to the practice of minimally invasive mitral valve surgery, based on the principle of patient-centred practice. The author delineates the benefits of minimally invasive mitral valve surgery in terms of: "clinical effectiveness", including reduced length of hospital stay, "risk management effectiveness", including conversion to sternotomy and aortic dissection, "patient experience", including improved cosmesis and quicker recovery, and the effectiveness of communication, resources and strategies in the implementation of minimally invasive mitral valve surgery. Finally, the author has identified seven learning curves experienced by surgeons involved in introducing a minimal access mitral valve program. The learning curves are defined as: techniques of mitral valve repair, Transoesophageal Echocardiography-guided cannulation, incisions, instruments, visualization, aortic occlusion and cardiopulmonary bypass strategies. From local experience, the author provides advice on how to reduce the learning curves, such as practising with the specialised instruments and visualization techniques during sternotomy cases. Underpinning the NHS pillars are the principles of systems awareness, teamwork, communication, ownership and leadership, all of which are paramount to performing any surgery but more so with minimal access surgery, as will be highlighted throughout this paper.
As above, so below? Towards understanding inverse models in BCI
NASA Astrophysics Data System (ADS)
Lindgren, Jussi T.
2018-02-01
Objective. In brain-computer interfaces (BCI), measurements of the user’s brain activity are classified into commands for the computer. With EEG-based BCIs, the origins of the classified phenomena are often considered to be spatially localized in the cortical volume and mixed in the EEG. We investigate if more accurate BCIs can be obtained by reconstructing the source activities in the volume. Approach. We contrast the physiology-driven source reconstruction with data-driven representations obtained by statistical machine learning. We explain these approaches in a common linear dictionary framework and review the different ways to obtain the dictionary parameters. We consider the effect of source reconstruction on some major difficulties in BCI classification, namely information loss, feature selection and nonstationarity of the EEG. Main results. Our analysis suggests that the approaches differ mainly in their parameter estimation. Physiological source reconstruction may thus be expected to improve BCI accuracy if machine learning is not used or where it produces less optimal parameters. We argue that the considered difficulties of surface EEG classification can remain in the reconstructed volume and that data-driven techniques are still necessary. Finally, we provide some suggestions for comparing approaches. Significance. The present work illustrates the relationships between source reconstruction and machine learning-based approaches for EEG data representation. The provided analysis and discussion should help in understanding, applying, comparing and improving such techniques in the future.
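In the linear dictionary framework referred to above, both routes can be written as the same generative model with different ways of estimating its parameters (a schematic form, not the paper's exact notation):

```latex
% Linear mixing of source activities s(t) into EEG channels x(t)
x(t) \;=\; A\,s(t) + n(t), \qquad \hat{s}(t) \;=\; W\,x(t),
```

where A is the dictionary (forward or mixing matrix), n(t) is noise and W is an unmixing or spatial-filter matrix; physiological source reconstruction obtains A from a head model (lead fields), whereas data-driven methods such as ICA or CSP estimate A or W statistically from the EEG itself.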
Building a GPS Receiver for Space Lessons Learned
NASA Technical Reports Server (NTRS)
Sirotzky, Steve; Heckler, G. W.; Boegner, G.; Roman, J.; Wennersten, M.; Butler, R.; Davis, M.; Lanham, A.; Winternitz, L.; Thompson, W.;
2008-01-01
Over the past 4 years the Component Systems and Hardware branch at NASA GSFC has pursued an inhouse effort to build a unique space-flight GPS receiver. This effort has resulted in the Navigator GPS receiver. Navigator's first flight opportunity will come with the STS-125 HST-SM4 mission in August 2008. This paper covers the overall hardware design for the receiver and the difficulties encountered during the transition from the breadboard design to the final flight hardware design. Among the different lessons learned, the paper stresses the importance of selecting and verifying parts that are appropriate for space applications, as well as what happens when these parts are not accurately characterized by their datasheets. Additionally, the paper discusses what analysis needs to be performed when deciding system frequencies and filters. The presentation also covers how to prepare for thermal vacuum testing, and problems that may arise during vibration testing. It also contains what criteria should be considered when determining which portions of a design to create in-house, and which portions to license from a third party. Finally, the paper shows techniques which have proven to be extraordinarily helpful in debugging and analysis.
Simple Versus Elaborate Feedback in a Nursing Science Course
NASA Astrophysics Data System (ADS)
Elder, Betty L.; Brooks, David W.
2008-08-01
Feedback techniques, including computer-assisted feedback, have had mixed results in improving student learning outcomes. This project addresses the effect of type of feedback, simple or elaborate, for both short-term comprehension and long-term outcomes. A sample of 75 graduate nursing students was given a total of ten examinations. Four examinations provided tutorials in which the students received one of two types of feedback, simple or elaborate. Five examinations provided tutorials with no feedback. A comprehensive final examination compared initial content and final scores. This study found no significant differences between the types of feedback the students received. The mean scores were significantly higher on the four examinations where the students received feedback than on the five examinations with no feedback on tutorials. The comparison between the individual examinations and the similar content portion of the final examination indicated a significant drop in each of the four examinations where feedback was given and a significant improvement in four of the five examinations where no feedback was given.
Figure Analysis: A Teaching Technique to Promote Visual Literacy and Active Learning
ERIC Educational Resources Information Center
Wiles, Amy M.
2016-01-01
Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based…
NASA Astrophysics Data System (ADS)
Makahinda, T.
2018-02-01
The purpose of this research is to determine the effect of a technology-based learning model and assessment technique on thermodynamics achievement, controlling for students' intelligence. This research is an experimental study. The sample was taken through cluster random sampling, with a total of 80 student respondents. The results show that the thermodynamics learning outcomes of students taught with the environmental-utilization learning model are higher than those of students taught with animation simulation, after controlling for student intelligence. There is an interaction effect between the technology-based learning model and the assessment technique on students' thermodynamics learning outcomes, after controlling for student intelligence. Based on these findings, thermodynamics lectures should use the environment-based learning model together with the project assessment technique.
ERIC Educational Resources Information Center
Rabanaque, Samuel; Martinez-Fernandez, J. Reinaldo
2009-01-01
Three conceptions of learning (rote, interpretative and constructive), and two aspects of motivation (level and value of motivation) were identified in 258 Spanish psychology undergraduates classified in three different academic levels (initial, intermediate and final course). Results about conceptions of learning showed final-course students are…
[Ultrasound-guided peripheral catheterization].
Salleras-Duran, Laia; Fuentes-Pumarola, Concepció
2016-01-01
Peripheral catheterization is a technique that can be difficult in some patients. Some studies have recently described the use of ultrasound to guide venous catheterization. The aim was to describe the success rate, time required, and complications of ultrasound-guided peripheral venous catheterization, as well as patient and professional satisfaction. The search was performed in databases (Medline-PubMed, Cochrane Library, CINAHL and Cuiden Plus) for published studies on ultrasound-guided peripheral venous catheterization that reported results on the success of the technique, complications, time used, patient satisfaction and the type of professional who performed the technique. A total of 21 studies were included. Most of them reported a success rate above 80% for ultrasound-guided catheterization, with procedure times no longer than those of the traditional technique. The complications analyzed, arterial and nerve puncture, occurred at rates below 10%. In all studies that measured and compared patient satisfaction, it was greater with the ultrasound-guided technique. Various professional groups perform the technique. The use of ultrasound for peripheral lines has a high success rate, complications are rare, and the time used is similar to that of the traditional technique. The technique of inserting catheters under ultrasound guidance may be learned by any professional group that performs venipuncture. Finally, the high patient satisfaction with the use of this technique is noteworthy. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
MO-E-18A-01: Imaging: Best Practices In Pediatric Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willis, C; Strauss, K; MacDougall, R
This imaging educational program will focus on solutions to common pediatric imaging challenges. The speakers will present collective knowledge on best practices in pediatric imaging from their experience at dedicated children's hospitals. Areas of focus will include general radiography, the use of manual and automatic dose management in computed tomography, and enterprise-wide radiation dose management in the pediatric practice. The educational program will begin with a discussion of the complexities of exposure factor control in pediatric projection radiography. Following this introduction will be two lectures addressing the challenges of computed tomography (CT) protocol optimization in the pediatric population. The first will address manual CT protocol design in order to establish a managed radiation dose for any pediatric exam on any CT scanner. The second CT lecture will focus on the intricacies of automatic dose modulation in pediatric imaging with an emphasis on getting reliable results in algorithm-based technique selection. The fourth and final lecture will address the key elements needed to develop a comprehensive radiation dose management program for the pediatric environment with particular attention paid to new regulations and obligations of practicing medical physicists. Learning Objectives: To understand how general radiographic techniques can be optimized using exposure indices in order to improve pediatric radiography. To learn how to establish diagnostic dose reference levels for pediatric patients as a function of the type of examination, patient size, and individual design characteristics of the CT scanner. To learn how to predict the patient's radiation dose prior to the exam and manually adjust technique factors if necessary to match the patient's dose to the department's established dose reference levels. To learn how to utilize manufacturer-provided automatic dose modulation technology to consistently achieve patient doses within the department's established size-based diagnostic reference range. To understand the key components of an enterprise-wide pediatric dose management program that integrates the expanding responsibilities of medical physicists in the new era of dose monitoring.
ERIC Educational Resources Information Center
Devaraj, Nirupama; Raman, Jaishankar
2014-01-01
We investigate the impact of active learning techniques, specifically experiment based learning, in a Principles of Economics class. Our case study demonstrates that when using pedagogical techniques intended to facilitate active learning, teachers should be intentional about incorporating components of learning that appeal to students with…
Utterance-final position and pitch marking aid word learning in school-age children
Laaha, Sabine; Fitch, W. Tecumseh
2017-01-01
We investigated the effects of word order and prosody on word learning in school-age children. Third graders viewed photographs belonging to one of three semantic categories while hearing four-word nonsense utterances containing a target word. In the control condition, all words had the same pitch and, across trials, the position of the target word was varied systematically within each utterance. The only cue to word–meaning mapping was the co-occurrence of target words and referents. This cue was present in all conditions. In the Utterance-final condition, the target word always occurred in utterance-final position, and at the same fundamental frequency as all the other words of the utterance. In the Pitch peak condition, the position of the target word was varied systematically within each utterance across trials, and produced with pitch contrasts typical of infant-directed speech (IDS). In the Pitch peak + Utterance-final condition, the target word always occurred in utterance-final position, and was marked with a pitch contrast typical of IDS. Word learning occurred in all conditions except the control condition. Moreover, learning performance was significantly higher than that observed with simple co-occurrence (control condition) only for the Pitch peak + Utterance-final condition. We conclude that, for school-age children, the combination of words' utterance-final alignment and pitch enhancement boosts word learning. PMID:28878961
Utterance-final position and pitch marking aid word learning in school-age children.
Filippi, Piera; Laaha, Sabine; Fitch, W Tecumseh
2017-08-01
We investigated the effects of word order and prosody on word learning in school-age children. Third graders viewed photographs belonging to one of three semantic categories while hearing four-word nonsense utterances containing a target word. In the control condition, all words had the same pitch and, across trials, the position of the target word was varied systematically within each utterance. The only cue to word-meaning mapping was the co-occurrence of target words and referents. This cue was present in all conditions. In the Utterance-final condition, the target word always occurred in utterance-final position, and at the same fundamental frequency as all the other words of the utterance. In the Pitch peak condition, the position of the target word was varied systematically within each utterance across trials, and produced with pitch contrasts typical of infant-directed speech (IDS). In the Pitch peak + Utterance-final condition, the target word always occurred in utterance-final position, and was marked with a pitch contrast typical of IDS. Word learning occurred in all conditions except the control condition. Moreover, learning performance was significantly higher than that observed with simple co-occurrence ( control condition) only for the Pitch peak + Utterance-final condition. We conclude that, for school-age children, the combination of words' utterance-final alignment and pitch enhancement boosts word learning.
Lifelong learning strategies in nursing: A systematic review.
Qalehsari, Mojtaba Qanbari; Khaghanizadeh, Morteza; Ebadi, Abbas
2017-10-01
Lifelong learning is an expectation in the professional performance of nurses, which is directly related to the success of students in nursing schools. In spite of the considerable attention paid to this issue, lifelong learning strategies are not fully understood. The aim of this study was to clarify lifelong learning strategies of nursing students with respect to international experience. In this systematic review, an extensive investigation was carried out using Persian and English studies in PubMed, ProQuest, Cochrane, Ovid, Scopus, Web of Science, SID, and Iran Doc using the following keywords: lifelong learning, self-directed learning, lifelong learning model, continuing education, nursing education, and lifelong program. Finally, 22 articles published from 1994 to 2016 were selected for the final analysis. Data extracted from the selected articles was summarized and classified based on the research questions. In this study, 8 main themes, namely intellectual and practical independence, collaborative (cooperative) learning, researcher thinking, persistence in learning, need-based learning, learning management, suitable learning environment, and inclusive growth, were extracted from the article data. Having identified and clarified lifelong learning strategies in nursing, it is recommended to use the research findings in the programs and teaching systems of nursing schools. The use of lifelong learning strategies will lead to increased quality of education, development of nursing competency and, finally, increased quality of patient care.
Statistical Machine Learning for Structured and High Dimensional Data
2014-09-17
AFRL-OSR-VA-TR-2014-0234: Statistical Machine Learning for Structured and High Dimensional Data. Final report, Dec 2009 - Aug 2014, Carnegie Mellon University (Larry Wasserman; report contact John Lafferty). Research under this award included work in the area of resource-constrained statistical estimation; keywords: machine learning, high-dimensional statistics.
ERIC Educational Resources Information Center
Harrison, Holly
This final report describes achievements and activities of Project SELF (Supports for Early Learning Foundations), a federally funded project in New Mexico which developed, evaluated, and replicated an innovative model that provides strategies for early interventionists and families to support early learning foundations. The project identified…
Learning during a Collaborative Final Exam
ERIC Educational Resources Information Center
Dahlstrom, Orjan
2012-01-01
Collaborative testing has been suggested to serve as a good learning activity, for example, compared to individual testing. The aim of the present study was to measure learning at different levels of knowledge during a collaborative final exam in a course in basic methods and statistical procedures. Results on pre- and post-tests taken…
Lifelong Learning NCES Task Force: Final Report, Volume II. Working Paper Series.
ERIC Educational Resources Information Center
National Center for Education Statistics (ED), Washington, DC.
This document contains the eight appendixes from the National Center for Education Statistics's (NCES's) final report on lifelong learning in the United States. Appendix A discusses the considerations that entered into the formulation of the definition of lifelong learning adopted for the NCES study. Appendix B, "Literature Review on Lifelong…
Cascaded K-means convolutional feature learner and its application to face recognition
NASA Astrophysics Data System (ADS)
Zhou, Daoxiang; Yang, Dan; Zhang, Xiaohong; Huang, Sheng; Feng, Shu
2017-09-01
Considerable effort has been devoted to devising image representations. However, handcrafted methods need strong domain knowledge and show low generalization ability, while conventional feature learning methods require enormous amounts of training data and extensive parameter-tuning experience. A lightweight feature learner is presented to address these problems, with an application to face recognition, sharing a similar topology with a convolutional neural network. Our model is divided into three components: a cascaded convolution filter bank learning layer, a nonlinear processing layer, and a feature pooling layer. Specifically, in the filter learning layer, we use K-means to learn convolution filters. Features are extracted by convolving images with the learned filters. Afterward, in the nonlinear processing layer, the hyperbolic tangent is employed to capture nonlinear features. In the feature pooling layer, to remove redundant information and incorporate the spatial layout, we exploit a multilevel spatial pyramid second-order pooling technique to pool the features in subregions and concatenate them as the final representation. Extensive experiments on four representative datasets demonstrate the effectiveness and robustness of our model to various variations, yielding competitive recognition results on extended Yale B and FERET. In addition, our method achieves the best identification performance on AR and labeled faces in the wild datasets among the comparative methods.
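The pipeline described above (K-means filter learning, convolution, tanh nonlinearity, spatial pooling) can be sketched in a few lines. The sketch below uses plain average pooling over a 2x2 grid rather than the paper's multilevel spatial pyramid second-order pooling, and the patch size, filter count, and input format (a list of 2D grayscale arrays larger than the patch) are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.signal import convolve2d

def learn_filters(images, patch=7, n_filters=8, n_patches=5000, seed=0):
    """Learn convolution filters by clustering random, normalized image patches with K-means."""
    rng = np.random.default_rng(seed)
    patches = []
    for _ in range(n_patches):
        img = images[rng.integers(len(images))]
        r = rng.integers(img.shape[0] - patch)
        c = rng.integers(img.shape[1] - patch)
        p = img[r:r + patch, c:c + patch].ravel()
        patches.append((p - p.mean()) / (p.std() + 1e-8))  # per-patch normalization
    km = KMeans(n_clusters=n_filters, n_init=10, random_state=seed).fit(np.array(patches))
    return km.cluster_centers_.reshape(n_filters, patch, patch)

def extract_features(img, filters, grid=(2, 2)):
    """Convolve with learned filters, apply tanh, then average-pool over a spatial grid."""
    feats = []
    for f in filters:
        fmap = np.tanh(convolve2d(img, f, mode='valid'))
        gh, gw = grid
        h, w = fmap.shape
        for i in range(gh):
            for j in range(gw):
                cell = fmap[i * h // gh:(i + 1) * h // gh, j * w // gw:(j + 1) * w // gw]
                feats.append(cell.mean())
    return np.array(feats)  # concatenated pooled responses as the final representation
```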
Cant, Robyn; Young, Susan; Cooper, Simon J; Porter, Joanne
2015-03-01
This study explores preregistration nursing students' views of a Web-based simulation program: FIRST ACTWeb (Feedback Incorporating Review and Simulation Techniques to Act on Clinical Trends-Web). The multimedia program incorporating three videoed scenarios portrayed by a standardized patient (human actor) aims to improve students' recognition and management of hospital patient deterioration. Participants were 367 final-year nursing students from three universities who completed an online evaluation survey and 19 students from two universities who attended one of five focus groups. Two researchers conducted a thematic analysis of the transcribed narratives. Three core themes identified were as follows: "ease of program use," "experience of e-Simulation," and "satisfaction with the learning experience." The Web-based clinical learning environment was endorsed as functional, feasible, and easy to use and was reported to have high fidelity and realism. Feedback in both focus groups and surveys showed high satisfaction with the learning experience. Overall, evaluation suggested that the Web-based simulation program successfully integrated elements essential for blended learning. Although Web-based educational applications are resource intensive to develop, positive appraisal of program quality, plus program accessibility and repeatability, appears to provide important educational benefits. Further research is needed to determine the transferability of these learning experiences into real-world practice.
Distributed learning and multi-objectivity in traffic light control
NASA Astrophysics Data System (ADS)
Brys, Tim; Pham, Tong T.; Taylor, Matthew E.
2014-01-01
Traffic jams and suboptimal traffic flows are ubiquitous in modern societies, and they create enormous economic losses each year. Delays at traffic lights alone account for roughly 10% of all delays in US traffic. As most traffic light scheduling systems currently in use are static, set up by human experts rather than being adaptive, the interest in machine learning approaches to this problem has increased in recent years. Reinforcement learning (RL) approaches are often used in these studies, as they require little pre-existing knowledge about traffic flows. Distributed constraint optimisation approaches (DCOP) have also been shown to be successful, but are limited to cases where the traffic flows are known. The distributed coordination of exploration and exploitation (DCEE) framework was recently proposed to introduce learning in the DCOP framework. In this paper, we present a study of DCEE and RL techniques in a complex simulator, illustrating the particular advantages of each, comparing them against standard isolated traffic actuated signals. We analyse how learning and coordination behave under different traffic conditions, and discuss the multi-objective nature of the problem. Finally we evaluate several alternative reward signals in the best performing approach, some of these taking advantage of the correlation between the problem-inherent objectives to improve performance.
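As a point of reference for the reinforcement learning side of this comparison, the toy sketch below shows a tabular Q-learning update for a single traffic light. The state/action/reward model is entirely hypothetical and far simpler than the simulator and the DCEE/DCOP methods studied in the paper.

```python
import numpy as np

n_states, n_actions = 16, 2          # e.g. discretized queue lengths; keep current phase vs switch
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.1   # learning rate, discount factor, exploration rate
rng = np.random.default_rng(0)

def env_step(state, action):
    """Hypothetical environment: returns (next_state, reward); reward = -total queue length."""
    next_state = rng.integers(n_states)
    return next_state, -float(next_state)

state = rng.integers(n_states)
for t in range(10_000):
    action = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[state]))
    next_state, reward = env_step(state, action)
    # standard Q-learning update toward the bootstrapped target
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state
```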
Prostate Cancer Probability Prediction By Machine Learning Technique.
Jović, Srđan; Miljković, Milica; Ivanović, Miljan; Šaranović, Milena; Arsić, Milena
2017-11-26
The main goal of the study was to explore the possibility of prostate cancer prediction by machine learning techniques. In order to improve the survival probability of prostate cancer patients, it is essential to build suitable prediction models for the disease. Relevant predictions make it easier to design suitable treatment based on the prediction results. Machine learning techniques are the most common techniques for the creation of predictive models. Therefore, in this study several machine learning techniques were applied and compared. The obtained results were analyzed and discussed. It was concluded that machine learning techniques could be used for the relevant prediction of prostate cancer.
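The abstract does not name the specific models, so the sketch below only illustrates the general pattern of applying and comparing several classifiers with cross-validation; the synthetic data and the particular models (logistic regression, random forest, RBF SVM) are assumptions, not the study's choices.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

# synthetic stand-in for a clinical dataset
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "svm_rbf": SVC(kernel="rbf"),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: AUC = {scores.mean():.3f} +/- {scores.std():.3f}")
```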
NASA Astrophysics Data System (ADS)
Ellins, K. K.; Eriksson, S. C.; Samsel, F.; Lavier, L.
2017-12-01
A new undergraduate, upper-level geoscience course was developed and taught by faculty and staff of the UT Austin Jackson School of Geosciences, the Center for Agile Technology, and the Texas Advanced Computational Center. The course examined the role of the visual arts in placing the scientific process and knowledge in a broader context and introduced students to innovations in the visual arts that promote scientific investigation through collaboration between geoscientists and artists. The course addressed (1) the role of the visual arts in teaching geoscience concepts and promoting geoscience learning; (2) the application of innovative visualization and artistic techniques to large volumes of geoscience data to enhance scientific understanding and to move scientific investigation forward; and (3) the illustrative power of art to communicate geoscience to the public. In-class activities and discussions, computer lab instruction on the application of Paraview software, reading assignments, lectures, and group projects with presentations comprised the two-credit, semester-long "special topics" course, which was taken by geoscience, computer science, and engineering students. Assessment of student learning was carried out by the instructors, and course evaluation was done by an external evaluator using rubrics, Likert-scale surveys and focus groups. The course achieved its goal of having students learn the concepts and techniques of the visual arts. The final projects demonstrated this, along with the communication of geologic concepts using what students had learned in the course. The basic skill of sketching for learning and best practices in visual communication were used extensively and, in most cases, very effectively. The use of an advanced visualization tool, Paraview, received mixed reviews because of the lack of time to really learn the tool and the fact that it is not a tool used routinely in geoscience. Senior students with advanced computer skills saw the importance of this tool. Students worked in teams, more or less effectively, and made suggestions for improving future offerings of the course.
A Global Health Elective Course in a PharmD Curriculum
Dutta, Arjun; Kovera, Craig
2014-01-01
Objective. To describe the design, development, and the first 4 implementations of a Global Health elective course intended to prepare pharmacy students to pursue global health careers and to evaluate student perceptions of the instructional techniques used and of skills developed during the course. Design. Following the blended curriculum model used at Touro College of Pharmacy, the Global Health course combined team-based learning (TBL) sessions in class, out-of-class team projects, and online self-directed learning with classroom teaching and discussion sessions. Assessment. Student performance was assessed with TBL sessions, team projects, class presentations, online quizzes, and final examinations. A precourse and postcourse survey showed improvement in global health knowledge and attitudes, and in the perception of pharmacists' role and career opportunities in global health. Significant improvement in skills applicable to global health work was reported, and students rated highly the instructional techniques, value, and relevance of the course. Conclusion. The Global Health elective course is on track to achieve its intended goal of equipping pharmacy students with the requisite knowledge and applicable skills to pursue global health careers and opportunities. After taking this course, students have gone on to pursue global field experiences. PMID:25657374
Okamoto, Yasuyuki
2003-04-01
I propose a postgraduate common clinical training program to be provided by the department of laboratory medicine in our prefectural medical university hospital. The program has three purposes: first, mastering basic laboratory tests; second, developing the skills necessary to accurately interpret laboratory data; third, learning specific techniques in the field of laboratory medicine. For the first purpose, it is important that medical trainees perform testing of their own patients at bedside or in the central clinical laboratory. When testing at the central clinical laboratory, instruction by expert laboratory technicians is helpful. The teaching doctors in the department of laboratory medicine are asked to advise the trainees on the interpretation of data. Consultation will be received via interview or e-mail. In addition, the trainees can participate in various conferences, seminars, and meetings held at the central clinical laboratory. Finally, in order to learn specific techniques in the field of laboratory medicine, several special courses lasting a few months will be prepared. I think this program should be closely linked to the training program in internal medicine.
Smith, D. R. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Bell, R. E. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Podesta, M. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Smith, D. R. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Fonck, R. J. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); McKee, G. R. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Diallo, A. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Kaye, S. M. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); LeBlanc, B. P. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Sabbagh, S. A. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States)
2015-09-01
We implement unsupervised machine learning techniques to identify characteristic evolution patterns and associated parameter regimes in edge localized mode (ELM) events observed on the National Spherical Torus Experiment. Multi-channel, localized measurements spanning the pedestal region capture the complex evolution patterns of ELM events on Alfven timescales. Some ELM events are active for less than 100 microseconds, but others persist for up to 1 ms. Also, some ELM events exhibit a single dominant perturbation, but others are oscillatory. Clustering calculations with time-series similarity metrics indicate the ELM database contains at least two and possibly three groups of ELMs with similar evolution patterns. The identified ELM groups trigger similar stored energy loss, but the groups occupy distinct parameter regimes for ELM-relevant quantities like plasma current, triangularity, and pedestal height. Notably, the pedestal electron pressure gradient is not an effective parameter for distinguishing the ELM groups, but the ELM groups segregate in terms of electron density gradient and electron temperature gradient. The ELM evolution patterns and corresponding parameter regimes can shape the formulation or validation of nonlinear ELM models. Finally, the techniques and results demonstrate an application of unsupervised machine learning at a data-rich fusion facility.
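The abstract does not specify the similarity metric or clustering algorithm, so the sketch below only illustrates the general approach of grouping time traces with a similarity metric and hierarchical clustering; the correlation distance, average linkage, cluster count, and the synthetic stand-in traces are all assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# stand-in for ELM time traces: 40 events x 200 samples (random walks)
events = rng.standard_normal((40, 200)).cumsum(axis=1)

# similarity metric: correlation distance (1 - Pearson correlation) between traces
dist = pdist(events, metric="correlation")
Z = linkage(dist, method="average")
labels = fcluster(Z, t=3, criterion="maxclust")   # ask for at most three groups
print(np.bincount(labels)[1:])                    # size of each candidate ELM group
```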
A Corpus-Based Approach for Automatic Thai Unknown Word Recognition Using Boosting Techniques
NASA Astrophysics Data System (ADS)
Techo, Jakkrit; Nattee, Cholwich; Theeramunkong, Thanaruk
While classification techniques can be applied for automatic unknown word recognition in a language without word boundaries, they face the problem of unbalanced datasets, where the number of positive unknown-word candidates is dominantly smaller than that of negative candidates. To solve this problem, this paper presents a corpus-based approach that introduces a so-called group-based ranking evaluation technique into ensemble learning in order to generate a sequence of classification models that later collaborate to select the most probable unknown word from multiple candidates. Given a classification model, the group-based ranking evaluation (GRE) is applied to construct a training dataset for learning the succeeding model, by weighing each of its candidates according to their ranks and correctness when the candidates of an unknown word are considered as one group. A number of experiments have been conducted on a large Thai medical text to evaluate the performance of the proposed group-based ranking evaluation approach, namely V-GRE, compared to the conventional naïve Bayes classifier and our vanilla version without ensemble learning. As a result, the proposed method achieves an accuracy of 90.93±0.50% when the first rank is selected, while it gains 97.26±0.26% when the top-ten candidates are considered; that is an 8.45% and 6.79% improvement over the conventional record-based naïve Bayes classifier and the vanilla version, respectively. Another result, applying only the best features, shows 93.93±0.22% and up to 98.85±0.15% accuracy for top-1 and top-10, respectively. These are 3.97% and 9.78% improvements over naïve Bayes and the vanilla version. Finally, an error analysis is given.
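The group-based ranking evaluation (V-GRE) described above is a custom ensemble scheme; as a rough point of reference only, the sketch below shows a standard boosting ensemble (scikit-learn's AdaBoost) on an imbalanced binary task of the kind the paper describes. It is not an implementation of V-GRE, and the candidate features are synthetic.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# unbalanced candidates: few true unknown-word candidates among many negatives
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```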
Learn-as-you-go acceleration of cosmological parameter estimates
NASA Astrophysics Data System (ADS)
Aslanyan, Grigor; Easther, Richard; Price, Layne C.
2015-09-01
Cosmological analyses can be accelerated by approximating slow calculations using a training set, which is either precomputed or generated dynamically. However, this approach is only safe if the approximations are well understood and controlled. This paper surveys issues associated with the use of machine-learning based emulation strategies for accelerating cosmological parameter estimation. We describe a learn-as-you-go algorithm that is implemented in the Cosmo++ code and (1) trains the emulator while simultaneously estimating posterior probabilities; (2) identifies unreliable estimates, computing the exact numerical likelihoods if necessary; and (3) progressively learns and updates the error model as the calculation progresses. We explicitly describe and model the emulation error and show how this can be propagated into the posterior probabilities. We apply these techniques to the Planck likelihood and the calculation of ΛCDM posterior probabilities. The computation is significantly accelerated without a pre-defined training set and uncertainties in the posterior probabilities are subdominant to statistical fluctuations. We have obtained a speedup factor of 6.5 for Metropolis-Hastings and 3.5 for nested sampling. Finally, we discuss the general requirements for a credible error model and show how to update them on-the-fly.
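To make the learn-as-you-go idea concrete, here is a schematic loop assuming a Gaussian-process emulator of an expensive log-likelihood: the emulator is trusted only when its predicted uncertainty is below a tolerance, otherwise the exact value is computed and added to the training set. This is an illustration of the idea only, not the Cosmo++ implementation; the toy likelihood, the tolerance, and the sampler-free loop are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def exact_loglike(theta):                 # stand-in for a slow likelihood evaluation
    return -0.5 * np.sum(theta ** 2)

rng = np.random.default_rng(0)
X_train = rng.normal(size=(10, 2))
y_train = np.array([exact_loglike(t) for t in X_train])
gp = GaussianProcessRegressor().fit(X_train, y_train)

tolerance = 0.05
for _ in range(200):                      # e.g. points proposed by an MCMC sampler
    theta = rng.normal(size=(1, 2))
    mean, std = gp.predict(theta, return_std=True)
    if std[0] > tolerance:                # emulator not trusted here: evaluate exactly
        X_train = np.vstack([X_train, theta])
        y_train = np.append(y_train, exact_loglike(theta[0]))
        gp.fit(X_train, y_train)          # progressively learn as you go
    # else: use mean[0] as the approximate log-likelihood and propagate std as its error
```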
Adaptive Modeling of the International Space Station Electrical Power System
NASA Technical Reports Server (NTRS)
Thomas, Justin Ray
2007-01-01
Software simulations provide NASA engineers the ability to experiment with spacecraft systems in a computer-imitated environment. Engineers currently develop software models that encapsulate spacecraft system behavior. These models can be inaccurate due to invalid assumptions, erroneous operation, or system evolution. Increasing accuracy requires manual calibration and domain-specific knowledge. This thesis presents a method for automatically learning system models without any assumptions regarding system behavior. Data stream mining techniques are applied to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). We also explore a knowledge fusion approach that uses traditional engineered EPS models to supplement the learned models. We observed that these engineered EPS models provide useful background knowledge to reduce predictive error spikes when confronted with making predictions in situations that are quite different from the training scenarios used when learning the model. Evaluations using ISS sensor data and existing EPS models demonstrate the success of the adaptive approach. Our experimental results show that adaptive modeling provides reductions in model error anywhere from 80% to 96% over these existing models. Final discussions include impending use of adaptive modeling technology for ISS mission operations and the need for adaptive modeling in future NASA lunar and Martian exploration.
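A minimal sketch of the adaptive-modelling idea, assuming streaming telemetry and using scikit-learn's incremental SGDRegressor as a stand-in for the thesis's data stream mining methods: an engineered model supplies a prior prediction and the learned model adapts the residual online. The sensor signal and the engineered model here are made up.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

def engineered_model(x):                  # hypothetical hand-built physics model
    return 2.0 * x[0] - 0.5 * x[1]

rng = np.random.default_rng(0)
learned = SGDRegressor(learning_rate="constant", eta0=0.01)

for t in range(5000):                     # telemetry arriving one sample at a time
    x = rng.normal(size=2)
    y = 2.3 * x[0] - 0.4 * x[1] + 0.1 * rng.normal()   # "true" system behaviour
    prior = engineered_model(x)           # background knowledge from the engineered model
    # learn the residual so the engineered model fills in when data are unfamiliar
    learned.partial_fit(x.reshape(1, -1), np.array([y - prior]))
    prediction = prior + learned.predict(x.reshape(1, -1))[0]  # adaptive prediction
```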
NASA Astrophysics Data System (ADS)
Tang, Jie; Liu, Rong; Zhang, Yue-Li; Liu, Mou-Ze; Hu, Yong-Fang; Shao, Ming-Jie; Zhu, Li-Jun; Xin, Hua-Wen; Feng, Gui-Wen; Shang, Wen-Jun; Meng, Xiang-Guang; Zhang, Li-Rong; Ming, Ying-Zi; Zhang, Wei
2017-02-01
Tacrolimus has a narrow therapeutic window and considerable variability in clinical use. Our goal was to compare the performance of multiple linear regression (MLR) and eight machine learning techniques in pharmacogenetic algorithm-based prediction of tacrolimus stable dose (TSD) in a large Chinese cohort. A total of 1,045 renal transplant patients were recruited, 80% of which were randomly selected as the “derivation cohort” to develop dose-prediction algorithm, while the remaining 20% constituted the “validation cohort” to test the final selected algorithm. MLR, artificial neural network (ANN), regression tree (RT), multivariate adaptive regression splines (MARS), boosted regression tree (BRT), support vector regression (SVR), random forest regression (RFR), lasso regression (LAR) and Bayesian additive regression trees (BART) were applied and their performances were compared in this work. Among all the machine learning models, RT performed best in both derivation [0.71 (0.67-0.76)] and validation cohorts [0.73 (0.63-0.82)]. In addition, the ideal rate of RT was 4% higher than that of MLR. To our knowledge, this is the first study to use machine learning models to predict TSD, which will further facilitate personalized medicine in tacrolimus administration in the future.
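A hedged sketch of the 80%/20% derivation/validation comparison described above, contrasting multiple linear regression with a regression tree on synthetic covariates; the study's actual clinical and pharmacogenetic predictors, and its "ideal rate" metric, are not reproduced here.

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

# synthetic stand-in for 1,045 patients with a handful of dose-relevant covariates
X, y = make_regression(n_samples=1045, n_features=8, noise=10.0, random_state=0)
X_der, X_val, y_der, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

for name, model in [("MLR", LinearRegression()),
                    ("regression_tree", DecisionTreeRegressor(max_depth=4, random_state=0))]:
    model.fit(X_der, y_der)
    print(name, "validation R^2 =", round(model.score(X_val, y_val), 2))
```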
Learn-as-you-go acceleration of cosmological parameter estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aslanyan, Grigor; Easther, Richard; Price, Layne C., E-mail: g.aslanyan@auckland.ac.nz, E-mail: r.easther@auckland.ac.nz, E-mail: lpri691@aucklanduni.ac.nz
2015-09-01
Cosmological analyses can be accelerated by approximating slow calculations using a training set, which is either precomputed or generated dynamically. However, this approach is only safe if the approximations are well understood and controlled. This paper surveys issues associated with the use of machine-learning based emulation strategies for accelerating cosmological parameter estimation. We describe a learn-as-you-go algorithm that is implemented in the Cosmo++ code and (1) trains the emulator while simultaneously estimating posterior probabilities; (2) identifies unreliable estimates, computing the exact numerical likelihoods if necessary; and (3) progressively learns and updates the error model as the calculation progresses. We explicitly describe and model the emulation error and show how this can be propagated into the posterior probabilities. We apply these techniques to the Planck likelihood and the calculation of ΛCDM posterior probabilities. The computation is significantly accelerated without a pre-defined training set and uncertainties in the posterior probabilities are subdominant to statistical fluctuations. We have obtained a speedup factor of 6.5 for Metropolis-Hastings and 3.5 for nested sampling. Finally, we discuss the general requirements for a credible error model and show how to update them on-the-fly.
From Sky to Earth: Data Science Methodology Transfer
NASA Astrophysics Data System (ADS)
Mahabal, Ashish A.; Crichton, Daniel; Djorgovski, S. G.; Law, Emily; Hughes, John S.
2017-06-01
We describe here the parallels between astronomy and earth science datasets, their analyses, and the opportunities for methodology transfer from astroinformatics to geoinformatics. Using the example of hydrology, we emphasize how metadata and ontologies are crucial in such an undertaking. Using the infrastructure being designed for EarthCube - the Virtual Observatory for the earth sciences - we discuss essential steps for better transfer of tools and techniques in the future, e.g., domain adaptation. Finally, we point out that it is never a one-way process and there is enough for astroinformatics to learn from geoinformatics as well.
Improving Word Learning in Children Using an Errorless Technique
ERIC Educational Resources Information Center
Warmington, Meesha; Hitch, Graham J.; Gathercole, Susan E.
2013-01-01
The current experiment examined the relative advantage of an errorless learning technique over an errorful one in the acquisition of novel names for unfamiliar objects in typically developing children aged between 7 and 9 years. Errorless learning led to significantly better learning than did errorful learning. Processing speed and vocabulary…
ERIC Educational Resources Information Center
Hung, Jui-Long; Crooks, Steven M.
2009-01-01
The student learning process is important in online learning environments. If instructors can "observe" online learning behaviors, they can provide adaptive feedback, adjust instructional strategies, and assist students in establishing patterns of successful learning activities. This study used data mining techniques to examine and…
Action Research to Improve the Learning Space for Diagnostic Techniques.
Ariel, Ellen; Owens, Leigh
2015-12-01
The module described and evaluated here was created in response to perceived learning difficulties in diagnostic test design and interpretation for students in third-year Clinical Microbiology. Previously, the activities in lectures and laboratory classes in the module fell into the lower cognitive operations of "knowledge" and "understanding." The new approach was to exchange part of the traditional activities with elements of interactive learning, where students had the opportunity to engage in deep learning using a variety of learning styles. The effectiveness of the new curriculum was assessed by means of on-course student assessment throughout the module, a final exam, an anonymous questionnaire on student evaluation of the different activities and a focus group of volunteers. Although the new curriculum enabled a major part of the student cohort to achieve higher pass grades (p < 0.001), it did not meet the requirements of the weaker students, and the proportion of the students failing the module remained at 34%. The action research applied here provided a number of valuable suggestions from students on how to improve future curricula from their perspective. Most importantly, an interactive online program that facilitated flexibility in the learning space for the different reagents and their interaction in diagnostic tests was proposed. The methods applied to improve and assess a curriculum refresh by involving students as partners in the process, as well as the outcomes, are discussed. Journal of Microbiology & Biology Education.
Goldstein, Benjamin A; Navar, Ann Marie; Carter, Rickey E
2017-06-14
Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors, which operate in the same way on everyone and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for the development of risk prediction models. Typically presented as black-box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider trying to predict mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction for those working on risk modelling to approach the diffuse field of machine learning. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Cardiology.
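The review's themes (missing data, tuning parameters, variable importance) can be illustrated with a brief, hedged sketch: the data are synthetic, and the imputation strategy, random-forest model, and parameter grid are assumptions rather than anything taken from the review's EHR example.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# synthetic stand-in: 13 "laboratory markers" with some values missing
X, y = make_classification(n_samples=1000, n_features=13, random_state=0)
X[np.random.default_rng(0).random(X.shape) < 0.05] = np.nan

pipe = Pipeline([("impute", SimpleImputer(strategy="median")),
                 ("rf", RandomForestClassifier(random_state=0))])
grid = GridSearchCV(pipe,
                    {"rf__n_estimators": [100, 300], "rf__max_depth": [3, None]},
                    scoring="roc_auc", cv=5).fit(X, y)
print("best cross-validated AUC:", round(grid.best_score_, 3))
print("variable importances:", grid.best_estimator_.named_steps["rf"].feature_importances_[:5])
```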
Koohestani, Hamid Reza; Baghcheghi, Nayereh
2016-01-01
Background: Team-based learning is a structured type of cooperative learning that is becoming increasingly popular in nursing education. This study compares levels of nursing students' perception of the psychosocial climate of the classroom between a conventional lecture group and a team-based learning group. Methods: In a quasi-experimental study with a pretest-posttest design, 38 second-year nursing students participated. One half of the 16 sessions of the cardiovascular disease nursing course was taught by lectures and the second half with team-based learning. The modified College and University Classroom Environment Inventory (CUCEI) was used to measure the perception of the classroom environment. This was completed after the final lecture and TBL sessions. Results: Results revealed a significant difference in the mean scores of psychosocial climate for the TBL method (Mean (SD): 179.8 (8.27)) versus the lecture method (Mean (SD): 154.2 (13.44)). Also, the results showed significant differences between the two groups in the innovation (p<0.001), student cohesiveness (p=0.01), cooperation (p<0.001) and equity (p=0.03) subscale scores. Conclusion: This study provides evidence that team-based learning has a positive effect on nursing students' perceptions of the psychosocial climate of the classroom. PMID:28210602
Classification of breast tumour using electrical impedance and machine learning techniques.
Al Amin, Abdullah; Parvin, Shahnaj; Kadir, M A; Tahmid, Tasmia; Alam, S Kaisar; Siddique-e Rabbani, K
2014-06-01
When a breast lump is detected through palpation, mammography or ultrasonography, the final test for characterization of the tumour, whether it is malignant or benign, is biopsy. This is invasive and carries the hazards associated with any surgical procedure. The present work was undertaken to study the feasibility of such characterization using non-invasive electrical impedance measurements and machine learning techniques. Because of changes in cell morphology of malignant and benign tumours, changes are expected in impedance at a fixed frequency and in impedance versus frequency of measurement. Tetrapolar impedance measurement (TPIM), using four electrodes at the corners of a square region with 4 cm sides, was used for zone localization. Data on impedance in two orthogonal directions, measured at 5 and 200 kHz from 19 subjects, and their respective slopes with frequency were subjected to machine learning procedures through the use of feature plots. These patients had single or multiple tumours of various types in one or both breasts, and four of them had malignant tumours, as diagnosed by core biopsy. Although the size and depth of the tumours are expected to affect the measurements, this preliminary work ignored these effects. Selecting 12 features from the above measurements, feature plots were drawn for the 19 patients, which displayed considerable overlap between malignant and benign cases. However, based on the observed qualitative trend of the measured values, when all the feature values were divided by the respective ages, the two types of tumours separated out reasonably well. Using the K-NN classification method, the results obtained were: positive prediction value: 60%, negative prediction value: 93%, sensitivity: 75%, specificity: 87% and efficacy: 84%, which are very good for such a test on a small sample size. A study on a larger sample is expected to give confidence in this technique, and further improvement of the technique may give it the ability to replace biopsy.
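A small sketch of the age-normalization and K-NN classification step described above; the feature matrix is random stand-in data rather than the study's impedance measurements, and the choice of k, the scaling, and the 3-fold cross-validation are assumptions.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
features = rng.normal(size=(19, 12))                 # 12 impedance-derived features per subject
ages = rng.integers(25, 70, size=19).astype(float)
labels = np.array([1] * 4 + [0] * 15)                # 4 malignant, 15 benign

X = StandardScaler().fit_transform(features / ages[:, None])   # divide by age, then scale
pred = cross_val_predict(KNeighborsClassifier(n_neighbors=3), X, labels, cv=3)
print(confusion_matrix(labels, pred))                # basis for sensitivity/specificity
```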
Kossioni, A E; Lyrakos, G; Ntinalexi, I; Varela, R; Economu, I
2014-05-01
The aim of this study was to develop and validate according to psychometric standards a self-administered instrument to measure the students' self-perceptions of the undergraduate clinical dental environment (DECLEI). The initial questionnaire was developed using feedback from dental students, experts' opinion and an extensive literature review. Critical incident technique (CIT) analysis was used to generate items and identify domains. Thirty clinical dental students participated in a pilot validation that generated a 67-item questionnaire. To develop a shorter and more practical version of the instrument, DECLEI-67 was distributed to 153 clinical students at the University of Athens and its English version to 51 students from various dental schools, attending the 2012 European Dental Students Association meeting. This final procedure aimed to select items, identify subscales and measure internal consistency and discriminant validity. A total of 202 students returned the questionnaires (response rate 99%). The final instrument included 24 items divided into three subscales: (i) organisation and learning opportunities, (ii) professionalism and communication and (iii) satisfaction and commitment to the dental studies. Cronbach's α for the total questionnaire was 0.89. The interscale correlations ranged from 0.39 to 0.48. The instrument identified differences related to school of origin, age and duration of clinical experience. An interpretation of the scores (range 0–100) has been proposed. The 24-item DECLEI seemed to be a practical and valid instrument to measure a dental school's undergraduate clinical learning environment.
Boxwala, Aziz A; Kim, Jihoon; Grillo, Janice M; Ohno-Machado, Lucila
2011-01-01
To determine whether statistical and machine-learning methods, when applied to electronic health record (EHR) access data, could help identify suspicious (ie, potentially inappropriate) access to EHRs. From EHR access logs and other organizational data collected over a 2-month period, the authors extracted 26 features likely to be useful in detecting suspicious accesses. Selected events were marked as either suspicious or appropriate by privacy officers, and served as the gold standard set for model evaluation. The authors trained logistic regression (LR) and support vector machine (SVM) models on 10-fold cross-validation sets of 1291 labeled events. The authors evaluated the sensitivity of final models on an external set of 58 events that were identified as truly inappropriate and investigated independently from this study using standard operating procedures. The area under the receiver operating characteristic curve of the models on the whole data set of 1291 events was 0.91 for LR, and 0.95 for SVM. The sensitivity of the baseline model on this set was 0.8. When the final models were evaluated on the set of 58 investigated events, all of which were determined as truly inappropriate, the sensitivity was 0 for the baseline method, 0.76 for LR, and 0.79 for SVM. The LR and SVM models may not generalize because of interinstitutional differences in organizational structures, applications, and workflows. Nevertheless, our approach for constructing the models using statistical and machine-learning techniques can be generalized. An important limitation is the relatively small sample used for the training set due to the effort required for its construction. The results suggest that statistical and machine-learning methods can play an important role in helping privacy officers detect suspicious accesses to EHRs.
Kim, Jihoon; Grillo, Janice M; Ohno-Machado, Lucila
2011-01-01
Objective To determine whether statistical and machine-learning methods, when applied to electronic health record (EHR) access data, could help identify suspicious (ie, potentially inappropriate) access to EHRs. Methods From EHR access logs and other organizational data collected over a 2-month period, the authors extracted 26 features likely to be useful in detecting suspicious accesses. Selected events were marked as either suspicious or appropriate by privacy officers, and served as the gold standard set for model evaluation. The authors trained logistic regression (LR) and support vector machine (SVM) models on 10-fold cross-validation sets of 1291 labeled events. The authors evaluated the sensitivity of final models on an external set of 58 events that were identified as truly inappropriate and investigated independently from this study using standard operating procedures. Results The area under the receiver operating characteristic curve of the models on the whole data set of 1291 events was 0.91 for LR, and 0.95 for SVM. The sensitivity of the baseline model on this set was 0.8. When the final models were evaluated on the set of 58 investigated events, all of which were determined as truly inappropriate, the sensitivity was 0 for the baseline method, 0.76 for LR, and 0.79 for SVM. Limitations The LR and SVM models may not generalize because of interinstitutional differences in organizational structures, applications, and workflows. Nevertheless, our approach for constructing the models using statistical and machine-learning techniques can be generalized. An important limitation is the relatively small sample used for the training set due to the effort required for its construction. Conclusion The results suggest that statistical and machine-learning methods can play an important role in helping privacy officers detect suspicious accesses to EHRs. PMID:21672912
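A minimal sketch of the modelling setup described in the two records above: logistic regression and an SVM evaluated with 10-fold cross-validated AUC on labelled access events. The 26 real access-log features are replaced by synthetic, imbalanced data, so the numbers it prints bear no relation to the reported results.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

# 1291 labelled events, 26 features, suspicious accesses as the minority class
X, y = make_classification(n_samples=1291, n_features=26, weights=[0.9, 0.1], random_state=0)

for name, model in [("LR", LogisticRegression(max_iter=1000)),
                    ("SVM", SVC(kernel="rbf"))]:
    auc = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
    print(name, "mean AUC:", round(auc.mean(), 2))
```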
Machine Learning Techniques in Clinical Vision Sciences.
Caixinha, Miguel; Nunes, Sandrina
2017-01-01
This review presents and discusses the contribution of machine learning techniques to diagnosis and disease monitoring in the context of clinical vision science. Many ocular diseases leading to blindness can be halted or delayed when detected and treated at their earliest stages. With the recent developments in diagnostic devices, imaging and genomics, new sources of data for early disease detection and patient management are now available. Machine learning techniques emerged in the biomedical sciences as clinical decision-support techniques to improve the sensitivity and specificity of disease detection and monitoring, making the clinical decision-making process more objective. This manuscript presents a review of multimodal ocular disease diagnosis and monitoring based on machine learning approaches. In the first section, the technical issues related to the different machine learning approaches are presented. Machine learning techniques are used to automatically recognize complex patterns in a given dataset. These techniques allow creating homogeneous groups (unsupervised learning), or creating a classifier predicting group membership of new cases (supervised learning), when a group label is available for each case. To ensure good performance of machine learning techniques on a given dataset, all possible sources of bias should be removed or minimized. For that, the representativeness of the input dataset for the true population should be confirmed, noise should be removed, missing data should be treated, and the data dimensionality (i.e., the number of parameters/features and the number of cases in the dataset) should be adjusted. The application of machine learning techniques in ocular disease diagnosis and monitoring is presented and discussed in the second section of this manuscript. To show the clinical benefits of machine learning in clinical vision sciences, several examples are presented in glaucoma, age-related macular degeneration, and diabetic retinopathy, these ocular pathologies being the major causes of irreversible visual impairment.
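A compact illustration of the supervised/unsupervised distinction drawn above. The iris dataset is used purely as a convenient stand-in for ocular measurements, and K-means and an SVM are arbitrary representatives of the two families.

```python
from sklearn.datasets import load_iris
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# unsupervised: group cases without using labels
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
# supervised: predict group membership from labelled cases
accuracy = cross_val_score(SVC(), X, y, cv=5).mean()

print("cluster sizes:", [int((clusters == k).sum()) for k in range(3)])
print("supervised accuracy:", round(accuracy, 3))
```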
LaDage, Lara D; Tornello, Samantha L; Vallejera, Jennilyn M; Baker, Emily E; Yan, Yue; Chowdhury, Anik
2018-03-01
There are many pedagogical techniques used by educators in higher education; however, some techniques and activities have been shown to be more beneficial to student learning than others. Research has demonstrated that active learning and learning in which students cognitively engage with the material in a multitude of ways result in better understanding and retention. The aim of the present study was to determine which of three pedagogical techniques led to improvement in learning and retention in undergraduate college students. Subjects partook in one of three different types of pedagogical engagement: hands-on learning with a model, observing someone else manipulate the model, and traditional lecture-based presentation. Students were then asked to take an online quiz that tested their knowledge of the new material, both immediately after learning the material and 2 wk later. Students who engaged in direct manipulation of the model scored higher on the assessment immediately after learning the material compared with the other two groups. However, there were no differences among the three groups when assessed after a 2-wk retention interval. Thus active learning techniques that involve direct interaction with the material can lead to learning benefits; however, how these techniques benefit long-term retention of the information is equivocal.
Placement-Based Learning and Learner Engagement: Findings from a New University in the UK
ERIC Educational Resources Information Center
Murphy, Timothy R. N.; Folgueiras Bertomeu, Pilar; Mannix McNamara, Patricia
2016-01-01
This paper addresses the potential for engaged learning among final-year undergraduate Education Studies students at a new, post-1992 university in the UK. It discusses a case study analysis of a "Directed Experiential Learning" (DEL) intervention in the final year of an education studies degree designed to engage and motivate students and emphasise the…
ERIC Educational Resources Information Center
Khan, S.
2011-01-01
The purpose of this article is to report on empirical work, related to a techniques module, undertaken with the dental students of the University of the Western Cape, South Africa. I will relate how a range of different active learning techniques (tutorials; question papers and mock tests) assisted students to adopt a deep approach to learning in…
Transcranial magnetic stimulation and neuroplasticity.
Pascual-Leone, A; Tarazona, F; Keenan, J; Tormos, J M; Hamilton, R; Catala, M D
1999-02-01
We review past results and present novel data to illustrate different ways in which TMS can be used to study neural plasticity. Procedural learning during the serial reaction time task (SRTT) is used as a model of neural plasticity to illustrate the applications of TMS. These different applications of TMS represent principles of use that we believe are applicable to studies of cognitive neuroscience in general and exemplify the great potential of TMS in the study of brain and behavior. We review the use of TMS for (1) cortical output mapping using focal, single-pulse TMS; (2) identification of the mechanisms underlying neuroplasticity using paired-pulse TMS techniques; (3) enhancement of the information of other neuroimaging techniques by transient disruption of cortical function using repetitive TMS; and finally (4) modulation of cortical function with repetitive TMS to influence behavior and guide plasticity.
Teaching laboratory neuroscience at bowdoin: the laboratory instructor perspective.
Hauptman, Stephen; Curtis, Nancy
2009-01-01
Bowdoin College is a small liberal arts college that offers a comprehensive Neuroscience major. The laboratory experience is an integral part of the major, and many students progress through three stages. A core course offers a survey of concepts and techniques. Four upper-level courses function to give students more intensive laboratory research experience in neurophysiology, molecular neurobiology, social behavior, and learning and memory. Finally, many majors choose to work in the individual research labs of the Neuroscience faculty. We, as laboratory instructors, are vital to the process, and are actively involved in all aspects of the lab-based courses. We provide student instruction in state of the art techniques in neuroscience research. By sharing laboratory teaching responsibilities with course professors, we help to prepare students for careers in laboratory neuroscience and also support and facilitate faculty research programs.
Developing and testing lay literature about breast cancer screening for African American women.
Coleman, Elizabeth Ann; Coon, Sharon; Mohrmann, Carolyn; Hardin, Susan; Stewart, Beth; Gibson, Regina Shoate; Cantrell, Mary; Lord, Janet; Heard, Jeanne
2003-01-01
Written materials about breast cancer screening for African American women with low literacy skills are needed. Available materials were not at or below third-grade reading levels, were not culturally sensitive, and were not accurate in illustrating correct breast self-examination (BSE) techniques. Focus groups representing the target population helped the authors design a pamphlet describing how to perform BSE and a motivational picture book to help women overcome barriers to screening. The authors chose a food theme for the cover of the pamphlet written at a third-grade level and suggested a photographic version. In the motivational book, two women address barriers to screening and replace myths and fears with facts and actions. Data from 162 women showed that they learned from both the photographic and illustrated versions. Women in the photographic group found significantly more lumps in the silicone models, so the authors chose that version to use in final testing. Finally, nurses pretested a group of patients before they reviewed the materials and post-tested another group after they reviewed them. The group who had reviewed the materials had greater knowledge of and intent to follow the guidelines and received higher scores on BSE techniques.
NASA Astrophysics Data System (ADS)
Tasich, C. M.; Duncan, L. L.; Duncan, B. R.; Burkhardt, B. L.; Benneyworth, L. M.
2015-12-01
Dual-listed courses will persist in higher education because of resource limitations. The pedagogical differences between undergraduate and graduate STEM student groups and the underlying distinction in intellectual development levels between the two student groups complicate the inclusion of undergraduates in these courses. Active learning techniques are a possible remedy to the hardships undergraduate students experience in graduate-level courses. Through an analysis of both undergraduate and graduate student experiences while enrolled in a dual-listed course, we implemented a variety of learning techniques used to complement the learning of both student groups and enhance deep discussion. Here, we provide details concerning the implementation of four active learning techniques - role play, game, debate, and small group - that were used to help undergraduate students critically discuss primary literature. Student perceptions were gauged through an anonymous, end-of-course evaluation that contained basic questions comparing the course to other courses at the university and other salient aspects of the course. These were given as a Likert scale on which students rated a variety of statements (1 = strongly disagree, 3 = no opinion, and 5 = strongly agree). Undergraduates found active learning techniques to be preferable to traditional techniques with small-group discussions being rated the highest in both enjoyment and enhanced learning. The graduate student discussion leaders also found active learning techniques to improve discussion. In hindsight, students of all cultures may be better able to take advantage of such approaches and to critically read and discuss primary literature when written assignments are used to guide their reading. Applications of active learning techniques can not only address the gap between differing levels of students, but also serve as a complement to student engagement in any science course design.
ERIC Educational Resources Information Center
Firdausiah Mansur, Andi Besse; Yusof, Norazah
2013-01-01
Clustering on Social Learning Networks is still not widely explored, especially when the network focuses on an e-learning system. Conventional methods are not well suited to e-learning data. SNA requires content analysis, which involves human intervention and needs to be carried out manually. Some of the previous clustering techniques need…
Hathout, Rania M; Metwally, Abdelkader A
2016-11-01
This study represents one of a series applying computer-oriented processes and tools to digging for information, analysing data and finally extracting correlations and meaningful outcomes. In this context, binding energies could be used to model and predict the mass of loaded drugs in solid lipid nanoparticles after molecular docking of literature-gathered drugs, using the MOE® software package, on tripalmitin matrices molecularly simulated using GROMACS®. Consequently, Gaussian processes, a supervised machine learning artificial intelligence technique, were used to correlate the drugs' descriptors (e.g. M.W., xLogP, TPSA and fragment complexity) with their molecular docking binding energies. A lower percentage bias was obtained compared to previous studies, which allows the accurate estimation of the loaded mass of any drug in the investigated solid lipid nanoparticles by simply projecting its chemical structure onto its main features (descriptors). Copyright © 2016 Elsevier B.V. All rights reserved.
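A hedged sketch of the Gaussian-process step described above: regressing docking binding energies on the four named descriptors (M.W., xLogP, TPSA, fragment complexity). The numbers are random placeholders rather than the literature-gathered data, and the kernel choice is an assumption.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(30, 4))        # columns: M.W., xLogP, TPSA, fragment complexity
# placeholder "binding energies" with an arbitrary dependence on the descriptors
binding_energy = -5.0 + descriptors @ np.array([0.8, -0.3, 0.5, 0.2]) + 0.1 * rng.normal(size=30)

X = StandardScaler().fit_transform(descriptors)
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True).fit(X, binding_energy)
pred, std = gp.predict(X[:3], return_std=True)   # predicted energies with uncertainty
print(np.round(pred, 2), np.round(std, 3))
```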
Visualizing Dataflow Graphs of Deep Learning Models in TensorFlow.
Wongsuphasawat, Kanit; Smilkov, Daniel; Wexler, James; Wilson, Jimbo; Mane, Dandelion; Fritz, Doug; Krishnan, Dilip; Viegas, Fernanda B; Wattenberg, Martin
2018-01-01
We present a design study of the TensorFlow Graph Visualizer, part of the TensorFlow machine intelligence platform. This tool helps users understand complex machine learning architectures by visualizing their underlying dataflow graphs. The tool works by applying a series of graph transformations that enable standard layout techniques to produce a legible interactive diagram. To declutter the graph, we decouple non-critical nodes from the layout. To provide an overview, we build a clustered graph using the hierarchical structure annotated in the source code. To support exploration of nested structure on demand, we perform edge bundling to enable stable and responsive cluster expansion. Finally, we detect and highlight repeated structures to emphasize a model's modular composition. To demonstrate the utility of the visualizer, we describe example usage scenarios and report user feedback. Overall, users find the visualizer useful for understanding, debugging, and sharing the structures of their models.
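For readers who want to try the visualizer, the sketch below shows one common way to export a Keras model's dataflow graph to TensorBoard logs so the graph view can render it; the model, the random data, and the log directory are placeholders, not part of the paper.

```python
# Sketch: writing a Keras model's dataflow graph to TensorBoard logs so the
# graph visualizer can render it; inspect with `tensorboard --logdir logs/graph_demo`.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(32,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# write_graph=True stores the graph definition alongside the training summaries
tb = tf.keras.callbacks.TensorBoard(log_dir="logs/graph_demo", write_graph=True)

x = np.random.rand(256, 32).astype("float32")
y = np.random.randint(0, 10, size=256)
model.fit(x, y, epochs=1, callbacks=[tb], verbose=0)
```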
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aimone, James Bradley; Bernard, Michael Lewis; Vineyard, Craig Michael
2014-10-01
Adult neurogenesis in the hippocampus region of the brain is a neurobiological process that is believed to contribute to the brain's advanced abilities in complex pattern recognition and cognition. Here, we describe how realistic-scale simulations of the neurogenesis process can offer both a unique perspective on the biological relevance of this process and confer computational insights that are suggestive of novel machine learning techniques. First, supercomputer-based scaling studies of the neurogenesis process demonstrate how a small fraction of adult-born neurons have a uniquely larger impact in biologically realistic scaled networks. Second, we describe a novel technical approach by which the information content of ensembles of neurons can be estimated. Finally, we illustrate several examples of broader algorithmic impact of neurogenesis, including both extending existing machine learning approaches and novel approaches for intelligent sensing.
Graph theory for feature extraction and classification: a migraine pathology case study.
Jorge-Hernandez, Fernando; Garcia Chimeno, Yolanda; Garcia-Zapirain, Begonya; Cabrera Zubizarreta, Alberto; Gomez Beldarrain, Maria Angeles; Fernandez-Ruanova, Begonya
2014-01-01
Graph theory is widely used to represent and characterize brain connectivity networks, as is machine learning for classifying groups based on features extracted from images. Many of these studies use different techniques, such as preprocessing, correlations, features or algorithms. This paper proposes an automatic tool to perform a standard process using Magnetic Resonance Imaging (MRI) images. The process includes pre-processing, building a graph per subject with different correlations and atlases, extracting relevant features according to the literature, and finally providing a set of machine learning algorithms that can produce analyzable results for physicians or specialists. In order to verify the process, a set of images from prescription drug abusers and patients with migraine was used. In this way, the proper functioning of the tool was demonstrated, with success rates of 87% and 92% depending on the classifier used.
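A hedged sketch of the general pipeline described above (connectivity matrix, graph construction, graph-theoretic features, classifier) on synthetic matrices; the threshold, the three metrics, and the SVM are assumptions standing in for the paper's exact choices.

```python
# Sketch: per-subject connectivity matrices -> graph-theoretic features ->
# classifier (synthetic matrices; threshold and metrics are illustrative).
import numpy as np
import networkx as nx
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def graph_features(conn, threshold=0.3):
    conn = (conn + conn.T) / 2                    # symmetrize the matrix
    adj = (np.abs(conn) > threshold).astype(int)
    np.fill_diagonal(adj, 0)
    g = nx.from_numpy_array(adj)
    return [nx.density(g), nx.average_clustering(g), nx.global_efficiency(g)]

rng = np.random.default_rng(0)
n_subjects, n_rois = 40, 20
conns = rng.uniform(-1, 1, size=(n_subjects, n_rois, n_rois))
labels = rng.integers(0, 2, size=n_subjects)      # e.g. migraine vs. control

X = np.array([graph_features(c) for c in conns])
print(cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean())
```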
ERIC Educational Resources Information Center
Buditjahjanto, I. G. P. Asto; Nurlaela, Luthfiyah; Ekohariadi; Riduwan, Mochamad
2017-01-01
Programming technique is one of the subjects at Vocational High Schools in Indonesia. This subject covers the theory and application of programming using Visual Programming. Students experience some difficulties with purely textual learning. Therefore, it is necessary to develop media as a tool to deliver learning materials. The objectives of this…
How Students Learn: Improving Teaching Techniques for Business Discipline Courses
ERIC Educational Resources Information Center
Cluskey, Bob; Elbeck, Matt; Hill, Kathy L.; Strupeck, Dave
2011-01-01
The focus of this paper is to familiarize business discipline faculty with cognitive psychology theories of how students learn together with teaching techniques to assist and improve student learning. Student learning can be defined as the outcome from the retrieval (free recall) of desired information. Student learning occurs in two processes.…
Navigating the Active Learning Swamp: Creating an Inviting Environment for Learning.
ERIC Educational Resources Information Center
Johnson, Marie C.; Malinowski, Jon C.
2001-01-01
Reports on a survey of faculty members (n=29) asking them to define active learning, to rate how effectively different teaching techniques contribute to active learning, and to list the three teaching techniques they use most frequently. Concludes that active learning requires establishing an environment rather than employing a specific teaching…
Tests Enhance the Transfer of Learning
ERIC Educational Resources Information Center
Rohrer, Doug; Taylor, Kelli; Sholar, Brandon
2010-01-01
Numerous learning studies have shown that if the period of time devoted to studying information (e.g., casa-house) includes at least 1 test (casa-?), performance on a final test is improved--a finding known as the "testing effect". In most of these studies, however, the final test is identical to the initial test. If the final test…
Lifelong learning strategies in nursing: A systematic review
Qalehsari, Mojtaba Qanbari; Khaghanizadeh, Morteza; Ebadi, Abbas
2017-01-01
Background Lifelong learning is an expectation in the professional performance of nurses, which is directly related to the success of students in nursing schools. In spite of the considerable attention paid to this issue, lifelong learning strategies are not fully understood. Objective The aim of this study was to clarify lifelong learning strategies of nursing students with respect to international experience. Methods In this systematic review, an extensive investigation was carried out using Persian and English studies in PubMed, ProQuest, Cochrane, Ovid, Scopus, Web of Science, SID, and Iran Doc using the following keywords: lifelong learning, self-directed learning, lifelong learning model, continuing education, nursing education, and lifelong program. Finally, 22 articles published from 1994 to 2016 were selected for the final analysis. Data extracted from the selected articles were summarized and classified based on the research questions. Results In this study, 8 main themes, namely intellectual and practical independence, collaborative (cooperative) learning, researcher thinking, persistence in learning, need-based learning, learning management, suitable learning environment, and inclusive growth, were extracted from the article data. Conclusion Having identified and clarified lifelong learning strategies in nursing, it is recommended that the research findings be used in the programs and teaching systems of nursing schools. Use of lifelong learning strategies will lead to increased quality of education, development of nursing competency and, finally, increased quality of patient care. PMID:29238496
Moon, Sungrim; McInnes, Bridget; Melton, Genevieve B
2015-01-01
Although acronyms and abbreviations in clinical text are used widely on a daily basis, relatively little research has focused upon word sense disambiguation (WSD) of acronyms and abbreviations in the healthcare domain. Since clinical notes have distinctive characteristics, it is unclear whether techniques effective for acronym and abbreviation WSD from biomedical literature are sufficient. The authors discuss feature selection for automated techniques and challenges with WSD of acronyms and abbreviations in the clinical domain. There are significant challenges associated with the informal nature of clinical text, such as typographical errors and incomplete sentences; difficulty with insufficient clinical resources, such as clinical sense inventories; and obstacles with privacy and security for conducting research with clinical text. Although we anticipated that using sophisticated techniques, such as biomedical terminologies, semantic types, part-of-speech, and language modeling, would be needed for feature selection with automated machine learning approaches, we found instead that simple techniques, such as bag-of-words, were quite effective in many cases. Factors, such as majority sense prevalence and the degree of separateness between sense meanings, were also important considerations. The first lesson is that a comprehensive understanding of the unique characteristics of clinical text is important for automatic acronym and abbreviation WSD. The second lesson learned is that investigators may find that using simple approaches is an effective starting point for these tasks. Finally, similar to other WSD tasks, an understanding of baseline majority sense rates and separateness between senses is important. Further studies and practical solutions are needed to better address these issues.
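The finding that simple features often suffice can be illustrated with a minimal bag-of-words disambiguator; the example contexts, the "RA" abbreviation, and the classifier choice are invented for illustration only.

```python
# Sketch: bag-of-words sense disambiguation for a clinical abbreviation
# ("RA"); contexts, senses, and classifier are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

contexts = [
    "pt presents with elevated RA pressure on echo",
    "RA factor positive, joints swollen and tender",
    "severe RA flare treated with methotrexate",
    "RA enlargement noted on cardiac imaging",
]
senses = ["right atrium", "rheumatoid arthritis",
          "rheumatoid arthritis", "right atrium"]

wsd = make_pipeline(CountVectorizer(), MultinomialNB())
wsd.fit(contexts, senses)
print(wsd.predict(["RA dilated with tricuspid regurgitation"]))
```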
Experimentally induced innovations lead to persistent culture via conformity in wild birds
Aplin, L.M.; Farine, D.R.; Morand-Ferron, J.; Cockburn, A.; Thornton, A.; Sheldon, B.C.
2014-01-01
In human societies, cultural norms arise when behaviours are transmitted with high-fidelity social learning through social networks1. However a paucity of experimental studies has meant that there is no comparable understanding of the process by which socially transmitted behaviours may spread and persist in animal populations2,3. Here, we introduce alternative novel foraging techniques into replicated wild sub-populations of great tits (Parus major), and employ automated tracking to map the diffusion, establishment and long-term persistence of seeded behaviours. We further use social network analysis to examine social factors influencing diffusion dynamics. From just two trained birds in each sub-population, information spread rapidly through social network ties to reach an average of 75% of individuals, with 508 knowledgeable individuals performing 58,975 solutions. Sub-populations were heavily biased towards the technique originally introduced, resulting in established local arbitrary traditions that were stable over two generations, despite high population turnover. Finally, we demonstrate a strong effect of social conformity, with individuals disproportionately adopting the most frequent local variant when first learning, but then also continuing to favour social over personal information by matching their technique to the majority variant. Cultural conformity is thought to be a key factor in the evolution of complex culture in humans4-7. In providing the first experimental demonstration of conformity in a wild non-primate, and of cultural norms in foraging techniques in any wild animal, our results suggest a much wider evolutionary occurrence of such apparently complex cultural behaviour. PMID:25470065
The application of machine learning techniques in the clinical drug therapy.
Meng, Huan-Yu; Jin, Wan-Lin; Yan, Cheng-Kai; Yang, Huan
2018-05-25
The development of a novel drug is an extremely complicated process that includes target identification, design and manufacture, and proper therapy with the novel drug, as well as drug dose selection, drug efficacy evaluation, and adverse drug reaction control. Because drug development faces limited resources, high costs, long durations, and low hit-to-lead ratios, and with the development of pharmacogenetics and computer technology, machine learning techniques have assisted novel drug development and have gradually received more attention from researchers. According to current research, machine learning techniques are widely applied in the discovery of new drugs and novel drug targets, decisions surrounding proper therapy and drug dose, and the prediction of drug efficacy and adverse drug reactions. In this article, we discuss the history, workflow, and advantages and disadvantages of machine learning techniques in the processes mentioned above. Although the advantages of machine learning techniques are fairly obvious, their application is currently limited. With further research, the application of machine learning techniques in drug development could become much more widespread and could potentially be one of the major methods used in drug development. Copyright© Bentham Science Publishers.
NASA Astrophysics Data System (ADS)
Julianto, Tatang Shabur; Fitriastuti, Dhina; Diniaty, Artina; Fauzi'ah, Lina; Arlianty, Widinda Normalia; Febriana, Beta Wulan; Muhaimin
2017-12-01
Phytochemistry is one of the courses in the Chemistry Department's curriculum; it covers the biosynthetic pathways of secondary metabolites in plants, the classification of secondary metabolite compounds, isolation techniques, and identification analysis. The course is expected to help prepare a generation with expertise in managing Indonesia's plant-based natural resources. In this research, we evaluated the implementation of a case-study learning method on students' understanding in the phytochemistry course. Learning was conducted in two cycles, before and after the midterm. The first seven themes, taught before the midterm, used the case-study method, and the next seven themes used the same method assisted by a module. The results showed an enhancement of students' understanding in class D, obtained from a comparison of midterm and final tests. In contrast, students in class C showed no significant enhancement. In addition, the improvement in understanding appeared to be strongly influenced by students' life skills and motivation, particularly the academic-skills aspect.
Semisupervised Clustering by Iterative Partition and Regression with Neuroscience Applications
Qian, Guoqi; Wu, Yuehua; Ferrari, Davide; Qiao, Puxue; Hollande, Frédéric
2016-01-01
Regression clustering is a mixture of unsupervised and supervised statistical learning and data mining method which is found in a wide range of applications including artificial intelligence and neuroscience. It performs unsupervised learning when it clusters the data according to their respective unobserved regression hyperplanes. The method also performs supervised learning when it fits regression hyperplanes to the corresponding data clusters. Applying regression clustering in practice requires means of determining the underlying number of clusters in the data, finding the cluster label of each data point, and estimating the regression coefficients of the model. In this paper, we review the estimation and selection issues in regression clustering with regard to the least squares and robust statistical methods. We also provide a model selection based technique to determine the number of regression clusters underlying the data. We further develop a computing procedure for regression clustering estimation and selection. Finally, simulation studies are presented for assessing the procedure, together with analyzing a real data set on RGB cell marking in neuroscience to illustrate and interpret the method. PMID:27212939
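A minimal sketch of the iterative partition-and-regression idea (least-squares version) on synthetic data; the initialization, iteration count, and two-line generating model are assumptions, not the paper's estimator.

```python
# Sketch: alternately assign points to the regression hyperplane that fits
# them best, then refit each hyperplane on its own cluster.
import numpy as np
from sklearn.linear_model import LinearRegression

def regression_clustering(X, y, k=2, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, size=len(y))          # random initial partition
    models = [LinearRegression() for _ in range(k)]
    for _ in range(n_iter):
        for j in range(k):                            # refit each cluster's model
            mask = labels == j
            if mask.sum() >= 2:
                models[j].fit(X[mask], y[mask])
        residuals = np.column_stack([np.abs(y - m.predict(X)) for m in models])
        labels = residuals.argmin(axis=1)             # reassign by best fit
    return labels, models

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.where(rng.random(200) < 0.5, 2.0 * X[:, 0] + 1.0, -1.5 * X[:, 0] + 4.0)
y += rng.normal(scale=0.2, size=200)
labels, models = regression_clustering(X, y, k=2)
print([m.coef_ for m in models])
```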
Discriminative dictionary learning for abdominal multi-organ segmentation.
Tong, Tong; Wolz, Robin; Wang, Zehan; Gao, Qinquan; Misawa, Kazunari; Fujiwara, Michitaka; Mori, Kensaku; Hajnal, Joseph V; Rueckert, Daniel
2015-07-01
An automated segmentation method is presented for multi-organ segmentation in abdominal CT images. Dictionary learning and sparse coding techniques are used in the proposed method to generate target specific priors for segmentation. The method simultaneously learns dictionaries which have reconstructive power and classifiers which have discriminative ability from a set of selected atlases. Based on the learnt dictionaries and classifiers, probabilistic atlases are then generated to provide priors for the segmentation of unseen target images. The final segmentation is obtained by applying a post-processing step based on a graph-cuts method. In addition, this paper proposes a voxel-wise local atlas selection strategy to deal with high inter-subject variation in abdominal CT images. The segmentation performance of the proposed method with different atlas selection strategies are also compared. Our proposed method has been evaluated on a database of 150 abdominal CT images and achieves a promising segmentation performance with Dice overlap values of 94.9%, 93.6%, 71.1%, and 92.5% for liver, kidneys, pancreas, and spleen, respectively. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
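As a hedged illustration of the dictionary-learning and sparse-coding building block used above, the sketch below learns a small patch dictionary and encodes new patches against it; random patches stand in for atlas image patches, and the atom count and sparsity level are arbitrary.

```python
# Sketch: learn a patch dictionary, then sparse-code new (target) patches
# against it with scikit-learn (random data stand in for CT patches).
import numpy as np
from sklearn.decomposition import DictionaryLearning, sparse_encode

rng = np.random.default_rng(0)
patches = rng.normal(size=(500, 64))          # 500 flattened 8x8 training patches

dico = DictionaryLearning(n_components=32, transform_algorithm="omp",
                          transform_n_nonzero_coefs=5, random_state=0)
codes = dico.fit_transform(patches)           # sparse codes for training patches
D = dico.components_                          # learned dictionary atoms

new_patches = rng.normal(size=(10, 64))       # patches from an unseen target image
new_codes = sparse_encode(new_patches, D, algorithm="omp", n_nonzero_coefs=5)
print(codes.shape, new_codes.shape)
```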
Analysis of Big Data in Gait Biomechanics: Current Trends and Future Directions.
Phinyomark, Angkoon; Petri, Giovanni; Ibáñez-Marcelo, Esther; Osis, Sean T; Ferber, Reed
2018-01-01
The increasing amount of data in biomechanics research has greatly increased the importance of developing advanced multivariate analysis and machine learning techniques, which are better able to handle "big data". Consequently, advances in data science methods will expand the knowledge for testing new hypotheses about biomechanical risk factors associated with walking and running gait-related musculoskeletal injury. This paper begins with a brief introduction to an automated three-dimensional (3D) biomechanical gait data collection system: 3D GAIT, followed by how the studies in the field of gait biomechanics fit the quantities in the 5 V's definition of big data: volume, velocity, variety, veracity, and value. Next, we provide a review of recent research and development in multivariate and machine learning methods-based gait analysis that can be applied to big data analytics. These modern biomechanical gait analysis methods include several main modules such as initial input features, dimensionality reduction (feature selection and extraction), and learning algorithms (classification and clustering). Finally, a promising big data exploration tool called "topological data analysis" and directions for future research are outlined and discussed.
A Study of Hand Back Skin Texture Patterns for Personal Identification and Gender Classification
Xie, Jin; Zhang, Lei; You, Jane; Zhang, David; Qu, Xiaofeng
2012-01-01
Human hand back skin texture (HBST) is often consistent for a person and distinctive from person to person. In this paper, we study the HBST pattern recognition problem with applications to personal identification and gender classification. A specially designed system is developed to capture HBST images, and an HBST image database was established, which consists of 1,920 images from 80 persons (160 hands). An efficient texton-learning-based method is then presented to classify the HBST patterns. First, textons are learned in the space of filter bank responses from a set of training images using the l1-minimization-based sparse representation (SR) technique. Then, under the SR framework, we represent the feature vector at each pixel over the learned dictionary to construct a representation coefficient histogram. Finally, the coefficient histogram is used as skin texture feature for classification. Experiments on personal identification and gender classification are performed by using the established HBST database. The results show that HBST can be used to assist human identification and gender classification. PMID:23012512
An adaptive learning control system for large flexible structures
NASA Technical Reports Server (NTRS)
Thau, F. E.
1985-01-01
The objective of the research has been to study the design of adaptive/learning control systems for the control of large flexible structures. In the first activity an adaptive/learning control methodology for flexible space structures was investigated. The approach was based on using a modal model of the flexible structure dynamics and an output-error identification scheme to identify modal parameters. In the second activity, a least-squares identification scheme was proposed for estimating both modal parameters and modal-to-actuator and modal-to-sensor shape functions. The technique was applied to experimental data obtained from the NASA Langley beam experiment. In the third activity, a separable nonlinear least-squares approach was developed for estimating the number of excited modes, shape functions, modal parameters, and modal amplitude and velocity time functions for a flexible structure. In the final research activity, a dual-adaptive control strategy was developed for regulating the modal dynamics and identifying modal parameters of a flexible structure. A min-max approach was used for finding an input to provide modal parameter identification while not exceeding reasonable bounds on modal displacement.
2005-01-01
Students are most motivated and learn best when they are immersed in an environment that causes them to realize why they should learn. Perhaps nowhere is this truer than when teaching the biological sciences to engineers. Transitioning from a traditionally mathematics-based to a traditionally knowledge-based pedagogical style can challenge student learning and engagement. To address this, human pathologies were used as a problem-based context for teaching knowledge-based cell biological mechanisms. Lectures were divided into four modules. First, a disease was presented from clinical, economic, and etiological standpoints. Second, fundamental concepts of cell and molecular biology were taught that were directly relevant to that disease. Finally, we discussed the cellular and molecular basis of the disease based on these fundamental concepts, together with current clinical approaches to the disease. The basic science is thus presented within a “shrink wrap” of disease application. Evaluation of this contextual technique suggests that it is very useful in improving undergraduate student focus and motivation, and offers many advantages to the instructor as well. PMID:15917872
MO-DE-207-04: Imaging educational program on solutions to common pediatric imaging challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnamurthy, R.
This imaging educational program will focus on solutions to common pediatric imaging challenges. The speakers will present collective knowledge on best practices in pediatric imaging from their experience at dedicated children’s hospitals. The educational program will begin with a detailed discussion of the optimal configuration of fluoroscopes for general pediatric procedures. Following this introduction will be a focused discussion on the utility of Dual Energy CT for imaging children. The third lecture will address the substantial challenge of obtaining consistent image post-processing in pediatric digital radiography. The fourth and final lecture will address best practices in pediatric MRI including a discussion of ancillary methods to reduce sedation and anesthesia rates. Learning Objectives: To learn techniques for optimizing radiation dose and image quality in pediatric fluoroscopy; to become familiar with the unique challenges and applications of Dual Energy CT in pediatric imaging; to learn solutions for consistent post-processing quality in pediatric digital radiography; to understand the key components of an effective MRI safety and quality program for the pediatric practice.
Xia, Youshen; Kamel, Mohamed S
2007-06-01
Identification of a general nonlinear noisy system viewed as an estimation of a predictor function is studied in this article. A measurement fusion method for the predictor function estimate is proposed. In the proposed scheme, observed data are first fused by using an optimal fusion technique, and then the optimal fused data are incorporated in a nonlinear function estimator based on a robust least squares support vector machine (LS-SVM). A cooperative learning algorithm is proposed to implement the proposed measurement fusion method. Compared with related identification methods, the proposed method can minimize both the approximation error and the noise error. The performance analysis shows that the proposed optimal measurement fusion function estimate has a smaller mean square error than the LS-SVM function estimate. Moreover, the proposed cooperative learning algorithm can converge globally to the optimal measurement fusion function estimate. Finally, the proposed measurement fusion method is applied to ARMA signal and spatial temporal signal modeling. Experimental results show that the proposed measurement fusion method can provide a more accurate model.
Healthcare Learning Community and Student Retention
ERIC Educational Resources Information Center
Johnson, Sherryl W.
2014-01-01
Teaching, learning, and retention processes have evolved historically to include multifaceted techniques beyond the traditional lecture. This article presents related results of a study using a healthcare learning community in a southwest Georgia university. The value of novel techniques and tools in promoting student learning and retention…
Automation of energy demand forecasting
NASA Astrophysics Data System (ADS)
Siddique, Sanzad
Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
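A small sketch of the automated model-search idea: score a few candidate regressors on lagged demand features with time-series cross-validation and keep the best. The candidate set, lag length, and synthetic series are assumptions, not the thesis's actual model space.

```python
# Sketch: automated search over a small candidate model space for a demand
# series, scored by time-series cross-validation (synthetic data).
import numpy as np
from sklearn.model_selection import TimeSeriesSplit, cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(400)
demand = 100 + 0.05 * t + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 2, 400)

lags = 24                                   # predict demand from the previous 24 values
X = np.array([demand[i - lags:i] for i in range(lags, len(demand))])
y = demand[lags:]

candidates = {
    "linear": LinearRegression(),
    "random_forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "mlp": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
}
scores = {name: cross_val_score(m, X, y, cv=TimeSeriesSplit(n_splits=5),
                                scoring="neg_mean_absolute_error").mean()
          for name, m in candidates.items()}
best = max(scores, key=scores.get)          # keep the best-scoring candidate
print(best, scores[best])
```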
Machine learning for outcome prediction of acute ischemic stroke post intra-arterial therapy.
Asadi, Hamed; Dowling, Richard; Yan, Bernard; Mitchell, Peter
2014-01-01
Stroke is a major cause of death and disability. Accurately predicting stroke outcome from a set of predictive variables may identify high-risk patients and guide treatment approaches, leading to decreased morbidity. Logistic regression models allow for the identification and validation of predictive variables. However, advanced machine learning algorithms offer an alternative, in particular, for large-scale multi-institutional data, with the advantage of easily incorporating newly available data to improve prediction performance. Our aim was to design and compare different machine learning methods, capable of predicting the outcome of endovascular intervention in acute anterior circulation ischaemic stroke. We conducted a retrospective study of a prospectively collected database of acute ischaemic stroke treated by endovascular intervention. Using SPSS®, MATLAB®, and Rapidminer®, classical statistics as well as artificial neural network and support vector algorithms were applied to design a supervised machine capable of classifying these predictors into potential good and poor outcomes. These algorithms were trained, validated and tested using randomly divided data. We included 107 consecutive acute anterior circulation ischaemic stroke patients treated by endovascular technique. Sixty-six were male, and the mean age was 65.3 years. All the available demographic, procedural and clinical factors were included in the models. The final confusion matrix of the neural network demonstrated an overall congruency of ∼80% between the target and output classes, with favourable receiver operating characteristics. However, after optimisation, the support vector machine had a relatively better performance, with a root mean squared error of 2.064 (SD: ±0.408). We showed promising accuracy of outcome prediction using supervised machine learning algorithms, with potential for incorporation of larger multicenter datasets, likely further improving prediction. Finally, we propose that a robust machine learning system can potentially optimise the selection process for endovascular versus medical treatment in the management of acute stroke.
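In the same spirit as the comparison above, the sketch below trains a small neural network and an SVM on simulated, dichotomised outcomes and compares their AUCs; the features, labels, and hyperparameters are placeholders rather than the study's clinical data or tuned models.

```python
# Sketch: compare an ANN and an SVM classifier on simulated dichotomised
# stroke outcomes (synthetic features stand in for clinical predictors).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(107, 12))                 # demographic/procedural/clinical features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=1.0, size=107) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0,
                                          stratify=y)
for name, clf in [("ANN", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                         random_state=0)),
                  ("SVM", SVC(probability=True, random_state=0))]:
    model = make_pipeline(StandardScaler(), clf)
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(name, round(auc, 3))
```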
NASA Astrophysics Data System (ADS)
Lestariani, Ida; Sujadi, Imam; Pramudya, Ikrar
2018-05-01
Portfolio assessment can show the development of learners' abilities over a period through their work, so that the learning progress of each learner can be monitored. The purpose of this research was to describe the implementation of portfolio assessment in the mathematics learning process, with senior high school mathematics teachers of class X as the subjects, given the importance of applying assessment to track learners' progress. This is a descriptive qualitative study. Data were collected through observation, interviews, and documentation, and then validated using triangulation of these three techniques. Data were analysed through data reduction, data presentation, and conclusion drawing. The results showed that the steps taken by teachers in applying portfolio assessment focused on learning outcomes, which include homework and daily tests. Based on the results, it can be concluded that portfolio assessment was implemented in the form of scored learning results; teachers had not yet implemented other portfolio assessment techniques, such as collections of student work.
NASA Astrophysics Data System (ADS)
Shamkhali Chenar, S.; Deng, Z.
2017-12-01
Pathogenic viruses pose a significant public health threat and cause economic losses to the shellfish industry in the coastal environment. Norovirus is a contagious virus and the leading cause of epidemic gastroenteritis following consumption of oysters harvested from sewage-contaminated waters. While it is challenging to detect noroviruses in coastal waters due to the lack of sensitive and routine diagnostic methods, machine learning techniques allow us to prevent, or at least reduce, the risks by developing effective predictive models. This study attempts to develop a predictive relationship between historical norovirus outbreak reports and environmental parameters including water temperature, solar radiation, water level, salinity, precipitation, and wind. For this purpose, the Random Forests statistical technique was utilized to select relevant environmental parameters, and their various combinations with different time lags, controlling the virus distribution in oyster harvesting areas along the Louisiana Coast. An Artificial Neural Network (ANN) approach was then used to predict the outbreaks using a final set of input variables. Finally, a sensitivity analysis was conducted to evaluate the relative importance and contribution of the input variables to the model output. Findings demonstrated that the developed model was capable of reproducing historical oyster norovirus outbreaks along the Louisiana Coast with an overall accuracy of more than 99.83%, demonstrating the efficacy of the model. Moreover, according to the sensitivity analysis results, increases in water temperature, solar radiation, water level, and salinity, and decreases in wind and rainfall, are associated with a reduction in the model-predicted risk of a norovirus outbreak. In conclusion, the presented machine learning approach provides a reliable tool for predicting potential norovirus outbreaks and could be used for early detection of possible outbreaks, reducing the risk of norovirus to public health and the seafood industry.
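A hedged sketch of the two-stage idea (Random Forest ranking followed by an ANN on the selected predictors); the variable names echo the abstract, but the data, importance cut-off, and network size are invented for illustration.

```python
# Sketch: rank environmental predictors with a Random Forest, then fit an
# ANN on the top-ranked subset (all data here are synthetic placeholders).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
feature_names = ["water_temp", "solar_rad", "water_level",
                 "salinity", "precip", "wind"]
X = rng.normal(size=(500, len(feature_names)))
y = (X[:, 0] - 0.8 * X[:, 5] + rng.normal(scale=0.5, size=500) < -0.5).astype(int)

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
ranked = np.argsort(rf.feature_importances_)[::-1]
top = ranked[:3]                                  # keep the three strongest predictors
print("selected:", [feature_names[i] for i in top])

ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=3000, random_state=0)
print("CV accuracy:", cross_val_score(ann, X[:, top], y, cv=5).mean())
```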
NASA Technical Reports Server (NTRS)
Cosentino, Gary B.
2007-01-01
Several examples from the past decade of success stories involving the design and flight test of three true X-planes will be described: in particular, X-plane design techniques that relied heavily upon computational fluid dynamics (CFD). Three specific examples chosen from the author's personal experience are presented: the X-36 Tailless Fighter Agility Research Aircraft, the X-45A Unmanned Combat Air Vehicle, and, most recently, the X-48B Blended Wing Body Demonstrator Aircraft. An overview will be presented of the uses of CFD analysis, comparisons and contrasts with wind tunnel testing, and information derived from the CFD analysis that directly related to successful flight test. Some lessons learned on the proper application, and misapplication, of CFD are illustrated. Finally, some highlights of the flight-test results of the three example X-planes will be presented. This overview paper will discuss some of the author's experience with taking an aircraft shape from early concept and three-dimensional modeling through CFD analysis, wind tunnel testing, further refined CFD analysis, and, finally, flight. An overview of the key roles in which CFD plays well during this process, and some other roles in which it does not, are discussed. How wind tunnel testing complements, calibrates, and verifies CFD analysis is also covered. Lessons learned on where CFD results can be misleading are also given. Strengths and weaknesses of the various types of flow solvers, including panel methods, Euler, and Navier-Stokes techniques, are discussed. The paper concludes with the three specific examples, including some flight test video footage of the X-36, the X-45A, and the X-48B.
ERIC Educational Resources Information Center
Massie, DeAnna
2017-01-01
College instructors are content experts but ineffective at creating engaging and productive learning environments. This mixed methods study explored how improvisational theatre techniques affect college instructors' ability to increase student engagement and learning. Theoretical foundations included engagement, active learning, collaboration and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madelaine Marquez; Neil Stillings
The grant supported four projects that involved professional development for teachers and enrichment programs for students from under-funded and under-served school districts. The projects involved long-term partnerships between Hampshire College and the districts. All projects were concerned with the effective implementation of inquiry-based science learning and its alignment with state and national curriculum and assessment standards. One project, The Collaboration for Excellence in Science Education (CESE), was designed to support research on the development of concepts in the physical sciences, specifically energy and waves. Extensive data from student interviews and written essays supported the neo-Piagetian hierarchical complexity theory of this area of conceptual development. New assessment techniques that can be used by teachers were also developed. The final report includes a full presentation of the methods and results of the research.
Problem based learning with scaffolding technique on geometry
NASA Astrophysics Data System (ADS)
Bayuningsih, A. S.; Usodo, B.; Subanti, S.
2018-05-01
Geometry, as one of the branches of mathematics, has an important role in the study of mathematics. This research aims to explore the effectiveness of Problem Based Learning (PBL) with a scaffolding technique, viewed from self-regulated learning, on students' mathematics learning achievement. The research data were obtained through a mathematics learning achievement test and a self-regulated learning (SRL) questionnaire. This was a quasi-experimental study. The subjects were junior high school students in Banyumas, Central Java. The results showed that the PBL model with the scaffolding technique is more effective in improving students' mathematics learning achievement than direct learning (DL), because in the PBL model students think more actively and creatively. Students in the high SRL category achieved better mathematics learning outcomes than those in the middle and low SRL categories, and the middle SRL category achieved better outcomes than the low SRL category. Thus, there is an interaction between the learning model and self-regulated learning in increasing mathematics learning achievement.
(Machine-)Learning to analyze in vivo microscopy: Support vector machines.
Wang, Michael F Z; Fernandez-Gonzalez, Rodrigo
2017-11-01
The development of new microscopy techniques for super-resolved, long-term monitoring of cellular and subcellular dynamics in living organisms is revealing new fundamental aspects of tissue development and repair. However, new microscopy approaches present several challenges. In addition to unprecedented requirements for data storage, the analysis of high resolution, time-lapse images is too complex to be done manually. Machine learning techniques are ideally suited for the (semi-)automated analysis of multidimensional image data. In particular, support vector machines (SVMs), have emerged as an efficient method to analyze microscopy images obtained from animals. Here, we discuss the use of SVMs to analyze in vivo microscopy data. We introduce the mathematical framework behind SVMs, and we describe the metrics used by SVMs and other machine learning approaches to classify image data. We discuss the influence of different SVM parameters in the context of an algorithm for cell segmentation and tracking. Finally, we describe how the application of SVMs has been critical to study protein localization in yeast screens, for lineage tracing in C. elegans, or to determine the developmental stage of Drosophila embryos to investigate gene expression dynamics. We propose that SVMs will become central tools in the analysis of the complex image data that novel microscopy modalities have made possible. This article is part of a Special Issue entitled: Biophysics in Canada, edited by Lewis Kay, John Baenziger, Albert Berghuis and Peter Tieleman. Copyright © 2017 Elsevier B.V. All rights reserved.
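The basic ingredient of such pipelines, an SVM classifying pixels from a few local features, can be sketched as follows; the three features and the synthetic labels are assumptions, not a specific published segmentation model.

```python
# Sketch: an SVM classifying pixels (e.g. cell edge vs. background) from a
# few simple local features; the per-pixel features here are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# Per-pixel features: intensity, local gradient magnitude, local variance
n_pixels = 2000
X = rng.normal(size=(n_pixels, 3))
y = (1.5 * X[:, 1] + X[:, 2] + rng.normal(scale=0.8, size=n_pixels) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```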
Kim, D H; MacKinnon, T
2018-05-01
This study aimed to identify the extent to which transfer learning from deep convolutional neural networks (CNNs), pre-trained on non-medical images, can be used for automated fracture detection on plain radiographs. The top layer of the Inception v3 network was re-trained using lateral wrist radiographs to produce a model for the classification of new studies as either "fracture" or "no fracture". The model was trained on a total of 11,112 images, after an eightfold data augmentation technique, from an initial set of 1,389 radiographs (695 "fracture" and 694 "no fracture"). The training data set was split 80:10:10 into training, validation, and test groups, respectively. An additional 100 wrist radiographs, comprising 50 "fracture" and 50 "no fracture" images, were used for final testing and statistical analysis. The area under the receiver operating characteristic curve (AUC) for this test was 0.954. Setting the diagnostic cut-off at a threshold designed to maximise both sensitivity and specificity resulted in values of 0.9 and 0.88, respectively. The AUC scores for this test were comparable to the state of the art, providing proof of concept for transfer learning from CNNs in fracture detection on plain radiographs. This was achieved using only a moderate sample size. This technique is largely transferable, and therefore, has many potential applications in medical imaging, which may lead to significant improvements in workflow productivity and in clinical risk reduction. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
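A minimal sketch of the retrain-the-top-layer recipe with a Keras Inception v3 base; the directory layout, class labels, and hyperparameters are assumptions, and this is not the authors' exact training setup.

```python
# Sketch: freeze an ImageNet-pretrained Inception v3 and train only a new
# binary head ("fracture" / "no fracture"); folders are hypothetical.
import tensorflow as tf

base = tf.keras.applications.InceptionV3(include_top=False, weights="imagenet",
                                          input_shape=(299, 299, 3), pooling="avg")
base.trainable = False                       # keep the pretrained layers fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC()])

train_ds = tf.keras.utils.image_dataset_from_directory(
    "wrist_xrays/train", image_size=(299, 299), batch_size=32, label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    "wrist_xrays/val", image_size=(299, 299), batch_size=32, label_mode="binary")

rescale = tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1)   # Inception expects [-1, 1]
train_ds = train_ds.map(lambda x, y: (rescale(x), y))
val_ds = val_ds.map(lambda x, y: (rescale(x), y))

model.fit(train_ds, validation_data=val_ds, epochs=5)
```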
How learning to abstract shapes neural sound representations
Ley, Anke; Vroomen, Jean; Formisano, Elia
2014-01-01
The transformation of acoustic signals into abstract perceptual representations is the essence of the efficient and goal-directed neural processing of sounds in complex natural environments. While the human and animal auditory system is perfectly equipped to process the spectrotemporal sound features, adequate sound identification and categorization require neural sound representations that are invariant to irrelevant stimulus parameters. Crucially, what is relevant and irrelevant is not necessarily intrinsic to the physical stimulus structure but needs to be learned over time, often through integration of information from other senses. This review discusses the main principles underlying categorical sound perception with a special focus on the role of learning and neural plasticity. We examine the role of different neural structures along the auditory processing pathway in the formation of abstract sound representations with respect to hierarchical as well as dynamic and distributed processing models. Whereas most fMRI studies on categorical sound processing employed speech sounds, the emphasis of the current review lies on the contribution of empirical studies using natural or artificial sounds that enable separating acoustic and perceptual processing levels and avoid interference with existing category representations. Finally, we discuss the opportunities of modern analyses techniques such as multivariate pattern analysis (MVPA) in studying categorical sound representations. With their increased sensitivity to distributed activation changes—even in absence of changes in overall signal level—these analyses techniques provide a promising tool to reveal the neural underpinnings of perceptually invariant sound representations. PMID:24917783
How to start a minimal access mitral valve program
2013-01-01
The seven pillars of governance established by the National Health Service in the United Kingdom provide a useful framework for the process of introducing new procedures to a hospital. Drawing from local experience, the author presents guidance for institutions considering establishing a minimal access mitral valve program. The seven pillars of governance apply to the practice of minimally invasive mitral valve surgery, based on the principle of patient-centred practice. The author delineates the benefits of minimally invasive mitral valve surgery in terms of: “clinical effectiveness”, including reduced length of hospital stay, “risk management effectiveness”, including conversion to sternotomy and aortic dissection, “patient experience”, including improved cosmesis and quicker recovery, and the effectiveness of communication, resources and strategies in the implementation of minimally invasive mitral valve surgery. Finally, the author has identified seven learning curves experienced by surgeons involved in introducing a minimal access mitral valve program. The learning curves are defined as: techniques of mitral valve repair, Transoesophageal Echocardiography-guided cannulation, incisions, instruments, visualization, aortic occlusion and cardiopulmonary bypass strategies. From local experience, the author provides advice on how to reduce the learning curves, such as practising with the specialised instruments and visualization techniques during sternotomy cases. Underpinning the NHS pillars are the principles of systems awareness, teamwork, communication, ownership and leadership, all of which are paramount to performing any surgery but more so with minimal access surgery, as will be highlighted throughout this paper. PMID:24349981
Mirghani, Hisham M; Ezimokhai, Mutairu; Shaban, Sami; van Berkel, Henk J M
2014-01-01
Students' learning approaches have a significant impact on the success of the educational experience, and a mismatch between instructional methods and the learning approach is very likely to create an obstacle to learning. Educational institutes' understanding of students' learning approaches allows those institutes to introduce changes in their curriculum content, instructional format, and assessment methods that will allow students to adopt deep learning techniques and critical thinking. The objective of this study was to determine and compare learning approaches among medical students following an interdisciplinary integrated curriculum. This was a cross-sectional study in which an electronic questionnaire using the Biggs two-factor Study Process Questionnaire (SPQ) with 20 questions was administered. Of a total of 402 students at the medical school, 214 (53.2%) completed the questionnaire. There was a significant difference in the mean score of superficial approach, motive and strategy between students in the six medical school years. However, no significant difference was observed in the mean score of deep approach, motive and strategy. The mean score for years 1 and 2 showed a significantly higher surface approach, surface motive and surface strategy when compared with students in years 4-6 in medical school. The superficial approach to learning was mostly preferred among first and second year medical students, and the least preferred among students in the final clinical years. These results may be useful in creating future teaching, learning and assessment strategies aiming to enhance a deep learning approach among medical students. Future studies are needed to investigate the reason for the preferred superficial approach among medical students in their early years of study.
Baldwin, Lydia J L; Jones, Christopher M; Hulme, Jonathan; Owen, Andrew
2015-11-01
Feedback is vital for the effective delivery of skills-based education. We sought to compare the sandwich technique and learning conversation structured methods of feedback delivery in competency-based basic life support (BLS) training. Open randomised crossover study undertaken between October 2014 and March 2015 at the University of Birmingham, United Kingdom. Six-hundred and forty healthcare students undertaking a European Resuscitation Council (ERC) BLS course were enrolled, each of whom was randomised to receive teaching using either the sandwich technique or the learning conversation. Fifty-eight instructors were randomised to initially teach using either the learning conversation or sandwich technique, prior to crossing-over and teaching with the alternative technique after a pre-defined time period. Outcome measures included skill acquisition as measured by an end-of-course competency assessment, instructors' perception of teaching with each feedback technique and candidates' perception of the feedback they were provided with. Scores assigned to use of the learning conversation by instructors were significantly more favourable than for the sandwich technique across all but two assessed domains relating to instructor perception of the feedback technique, including all skills-based domains. No difference was seen in either assessment pass rates (80.9% sandwich technique vs. 77.2% learning conversation; OR 1.2, 95% CI 0.85-1.84; p=0.29) or any domain relating to candidates' perception of their teaching technique. This is the first direct comparison of two feedback techniques in clinical medical education using both quantitative and qualitative methodology. The learning conversation is preferred by instructors providing competency-based life support training and is perceived to favour skills acquisition. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Attallah, Omneya; Karthikesalingam, Alan; Holt, Peter J E; Thompson, Matthew M; Sayers, Rob; Bown, Matthew J; Choke, Eddie C; Ma, Xianghong
2017-08-03
Feature selection (FS) is essential in the medical area as it reduces the effort and time needed for physicians to measure unnecessary features. Choosing useful variables is a difficult task in the presence of censoring, which is the unique characteristic of survival analysis. Most survival FS methods depend on Cox's proportional hazards model; machine learning techniques (MLT) would be preferred but are not commonly used because of censoring. Techniques that have been proposed to adapt MLT to perform FS with survival data cannot be used with a high level of censoring. The researcher's previous publications proposed a technique to deal with the high level of censoring and used existing FS techniques to reduce dataset dimension. In this paper, however, a new FS technique is proposed and combined with feature transformation and the proposed uncensoring approaches to select a reduced set of features and produce a stable predictive model. Specifically, an FS technique based on an artificial neural network (ANN) MLT is proposed to deal with highly censored Endovascular Aortic Repair (EVAR) survival data. EVAR datasets were collected between 2004 and 2010 from two vascular centers in order to produce a final stable model; they contain almost 91% censored patients. The proposed approach used a wrapper FS method with an ANN to select a reduced subset of features that predict the risk of EVAR re-intervention after 5 years for patients from two different centers located in the United Kingdom, allowing it to be potentially applied to cross-center predictions. The proposed model is compared with two popular FS techniques, the Akaike and Bayesian information criteria (AIC, BIC), that are used with Cox's model. The final model outperforms the other methods in distinguishing the high- and low-risk groups, as it has both a concordance index and an estimated AUC better than Cox's model based on the AIC, BIC, Lasso, and SCAD approaches. These models have p-values lower than 0.05, meaning that patients in different risk groups can be separated significantly and those who would need re-intervention can be correctly predicted. The proposed approach will save the time and effort physicians spend collecting unnecessary variables. The final reduced model was able to predict the long-term risk of aortic complications after EVAR. This predictive model can help clinicians decide patients' future observation plans.
You, Zhu-Hong; Lei, Ying-Ke; Zhu, Lin; Xia, Junfeng; Wang, Bing
2013-01-01
Protein-protein interactions (PPIs) play crucial roles in the execution of various cellular processes and form the basis of biological mechanisms. Although large amounts of PPI data for different species have been generated by high-throughput experimental techniques, current PPI pairs obtained with experimental methods cover only a fraction of the complete PPI networks, and further, the experimental methods for identifying PPIs are both time-consuming and expensive. Hence, it is urgent and challenging to develop automated computational methods to efficiently and accurately predict PPIs. We present here a novel hierarchical PCA-EELM (principal component analysis-ensemble extreme learning machine) model to predict protein-protein interactions using only the information of protein sequences. In the proposed method, 11188 protein pairs retrieved from the DIP database were encoded into feature vectors by using four kinds of protein sequence information. Focusing on dimension reduction, an effective feature extraction method, PCA, was then employed to construct the most discriminative new feature set. Finally, multiple extreme learning machines were trained and then aggregated into a consensus classifier by majority voting. The ensembling of extreme learning machines removes the dependence of results on initial random weights and improves the prediction performance. When performed on the PPI data of Saccharomyces cerevisiae, the proposed method achieved 87.00% prediction accuracy with 86.15% sensitivity at a precision of 87.59%. Extensive experiments were performed to compare our method with the state-of-the-art Support Vector Machine (SVM) technique. Experimental results demonstrate that the proposed PCA-EELM outperforms the SVM method under 5-fold cross-validation. Besides, PCA-EELM performs faster than the PCA-SVM based method. Consequently, the proposed approach can be considered a promising and powerful new tool for predicting PPIs with excellent performance in less time.
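A rough sketch of the PCA plus ensemble extreme-learning-machine idea: each base learner uses a random hidden layer with a closed-form ridge solution for its output weights, and the ensemble takes a majority vote. The synthetic features, hidden-layer size, and vote count are assumptions, not the paper's configuration.

```python
# Sketch: PCA-reduced features fed to an ensemble of minimal extreme
# learning machines, combined by majority vote (synthetic data).
import numpy as np
from sklearn.decomposition import PCA

class ELM:
    def __init__(self, n_hidden=200, alpha=1e-2, seed=0):
        self.n_hidden, self.alpha, self.seed = n_hidden, alpha, seed

    def fit(self, X, y):
        rng = np.random.default_rng(self.seed)
        self.W = rng.normal(size=(X.shape[1], self.n_hidden))   # random hidden weights
        self.b = rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        T = np.eye(2)[y]                                        # one-hot targets
        self.beta = np.linalg.solve(H.T @ H + self.alpha * np.eye(self.n_hidden),
                                    H.T @ T)                    # ridge output weights
        return self

    def predict(self, X):
        return (np.tanh(X @ self.W + self.b) @ self.beta).argmax(axis=1)

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 100))                     # sequence-derived feature vectors
y = (X[:, :5].sum(axis=1) > 0).astype(int)          # interacting / non-interacting

X_red = PCA(n_components=20).fit_transform(X)       # discriminative reduced features
votes = np.array([ELM(seed=s).fit(X_red[:500], y[:500]).predict(X_red[500:])
                  for s in range(7)])
consensus = (votes.mean(axis=0) >= 0.5).astype(int) # majority vote
print("test accuracy:", (consensus == y[500:]).mean())
```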
Drug-target interaction prediction using ensemble learning and dimensionality reduction.
Ezzat, Ali; Wu, Min; Li, Xiao-Li; Kwoh, Chee-Keong
2017-10-01
Experimental prediction of drug-target interactions is expensive, time-consuming and tedious. Fortunately, computational methods help narrow down the search space for interaction candidates to be further examined via wet-lab techniques. Nowadays, the number of attributes/features for drugs and targets, as well as the amount of their interactions, are increasing, making these computational methods inefficient or occasionally prohibitive. This motivates us to derive a reduced feature set for prediction. In addition, since ensemble learning techniques are widely used to improve the classification performance, it is also worthwhile to design an ensemble learning framework to enhance the performance for drug-target interaction prediction. In this paper, we propose a framework for drug-target interaction prediction leveraging both feature dimensionality reduction and ensemble learning. First, we conducted feature subspacing to inject diversity into the classifier ensemble. Second, we applied three different dimensionality reduction methods to the subspaced features. Third, we trained homogeneous base learners with the reduced features and then aggregated their scores to derive the final predictions. For base learners, we selected two classifiers, namely Decision Tree and Kernel Ridge Regression, resulting in two variants of ensemble models, EnsemDT and EnsemKRR, respectively. In our experiments, we utilized AUC (Area under ROC Curve) as an evaluation metric. We compared our proposed methods with various state-of-the-art methods under 5-fold cross validation. Experimental results showed EnsemKRR achieving the highest AUC (94.3%) for predicting drug-target interactions. In addition, dimensionality reduction helped improve the performance of EnsemDT. In conclusion, our proposed methods produced significant improvements for drug-target interaction prediction. Copyright © 2017 Elsevier Inc. All rights reserved.
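A hedged sketch of the ensemble recipe above: random feature subspacing, a per-subspace dimensionality-reduction step, decision-tree base learners, and score averaging. The data, subspace size, and tree depth are placeholders, not the paper's settings.

```python
# Sketch: feature subspacing + PCA + decision-tree base learners, with the
# base learners' scores averaged for the final prediction (synthetic data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 200))                  # concatenated drug + target features
y = (X[:, :10].sum(axis=1) + rng.normal(scale=2, size=1000) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

scores = np.zeros(len(y_te))
n_learners, subspace = 25, 80
for i in range(n_learners):
    cols = rng.choice(X.shape[1], size=subspace, replace=False)  # feature subspacing
    pca = PCA(n_components=20).fit(X_tr[:, cols])                # reduce the subspace
    tree = DecisionTreeClassifier(max_depth=8, random_state=i)
    tree.fit(pca.transform(X_tr[:, cols]), y_tr)
    scores += tree.predict_proba(pca.transform(X_te[:, cols]))[:, 1]

print("ensemble AUC:", roc_auc_score(y_te, scores / n_learners))
```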
Dental education and evidence-based educational best practices: bridging the great divide.
Masella, Richard S; Thompson, Thomas J
2004-12-01
Research about educational best practices is negatively perceived by many dental faculty. Separation between teaching and learning strategies commonly employed in dental education and evidence-based educational techniques is real and caused by a variety of factors: the often incomprehensible jargon of educational specialists; traditional academic dominance of research, publication, and grantsmanship in faculty promotions; institutional undervaluing of teaching and the educational process; and departmentalization of dental school governance with resultant narrowness of academic vision. Clinician-dentists hired as dental school faculty may model teaching activities on decades-old personal experiences, ignoring recent educational evidence and the academic culture. Dentistry's twin internal weaknesses--factionalism and parochialism--contribute to academic resistance to change and unwillingness to share power. Dental accreditation is a powerful impetus toward inclusion of best teaching and learning evidence in dental education. This article will describe how the gap between traditional educational strategies and research-based practices can be reduced by several approaches including dental schools' promotion of learning cultures that encourage and reward faculty who earn advanced degrees in education, regular evaluation of teaching by peers and educational consultants with inclusion of the results of these evaluations in promotion and tenure committee deliberations, creating tangible reward systems to recognize and encourage teaching excellence, and basing faculty development programs on adult learning principles. Leadership development should be part of faculty enrichment, as effective administration is essential to dental school mission fulfillment. Finally, faculty who investigate the effectiveness of educational techniques need to make their research more available by publishing it, more understandable by reducing educational jargon, and more relevant to the day-to-day teaching issues that dental school faculty encounter in classrooms, labs, and clinics.
Martin, Aifric Isabel; Devasahayam, Rajnesh; Hodge, Christopher; Cooper, Simon; Sutton, Gerard L
2017-09-01
This study is the first to establish a learning curve for pre-cut corneal endothelial graft preparation by a single technician. Preparation of pre-cut corneal endothelial grafts commenced at the Lions New South Wales Eye Bank in December 2014. The primary objective of this study was to review the safety and reliability of the preparation method during the first year of production. This is a hospital-based, prospective case series of 234 consecutive donor corneal lenticules. Donor lenticules were prepared by a single operator using a linear cutting microkeratome. Immediately prior to cutting, central corneal thickness values were recorded. Measurements of the corneal bed were taken immediately following lenticule preparation. Outcomes were separated by blade size, and intended thickness was compared to actual thickness for each setting. Early specimens were compared to later ones to assess for a learning curve within the technique. The main parameter measured was the mean difference from the intended lamellar cut thickness. The mean final cut thickness was 122.36 ± 20.35 μm, and the mean difference from the intended cut was 30.17 ± 37.45 μm. No significant difference was found between results achieved with early specimens versus those achieved with later specimens (P = 0.425). Thin, reproducible endothelial grafts can routinely be produced by trained technicians at their respective eye banks without significant concerns about an extended learning curve. This service can reduce perioperative surgical complexity, the required surgical paraphernalia and theatre times. The consistent preparation of single-pass, ultrathin pre-cut corneas may have additional advantages for surgeons seeking to introduce lamellar techniques. © 2017 Royal Australian and New Zealand College of Ophthalmologists.
Constraints on the Transfer of Perceptual Learning in Accented Speech
Eisner, Frank; Melinger, Alissa; Weber, Andrea
2013-01-01
The perception of speech sounds can be re-tuned through a mechanism of lexically driven perceptual learning after exposure to instances of atypical speech production. This study asked whether this re-tuning is sensitive to the position of the atypical sound within the word. We investigated perceptual learning using English voiced stop consonants, which are commonly devoiced in word-final position by Dutch learners of English. After exposure to a Dutch learner’s productions of devoiced stops in word-final position (but not in any other positions), British English (BE) listeners showed evidence of perceptual learning in a subsequent cross-modal priming task, where auditory primes with devoiced final stops (e.g., “seed”, pronounced [si:th]), facilitated recognition of visual targets with voiced final stops (e.g., SEED). In Experiment 1, this learning effect generalized to test pairs where the critical contrast was in word-initial position, e.g., auditory primes such as “town” facilitated recognition of visual targets like DOWN. Control listeners, who had not heard any stops by the speaker during exposure, showed no learning effects. The generalization to word-initial position did not occur when participants had also heard correctly voiced, word-initial stops during exposure (Experiment 2), and when the speaker was a native BE speaker who mimicked the word-final devoicing (Experiment 3). The readiness of the perceptual system to generalize a previously learned adjustment to other positions within the word thus appears to be modulated by distributional properties of the speech input, as well as by the perceived sociophonetic characteristics of the speaker. The results suggest that the transfer of pre-lexical perceptual adjustments that occur through lexically driven learning can be affected by a combination of acoustic, phonological, and sociophonetic factors. PMID:23554598
Nanomaterial characterization through image treatment, 3D reconstruction and AI techniques
NASA Astrophysics Data System (ADS)
Lopez de Uralde Huarte, Juan Jose
Nanotechnology is not only the science of the future, but it is indeed the science of today. It is used in all sectors, from health to energy, including information technologies and transport. For the present investigation, we have taken carbon black as a use case. This nanomaterial is mixed with a wide variety of materials to improve their properties, such as abrasion resistance, wear resistance in tires and plastics, or tinting strength in pigments. Nowadays, indirect methods of analysis, such as oil absorption or nitrogen adsorption, are the most common techniques in the nanomaterial industry. These procedures measure the change in physical state as oil or nitrogen is added. In this way, the surface area is estimated and related to the properties of the material. Nevertheless, we have chosen to improve the existing direct methods, which consist of analysing microscopy images of nanomaterials. We have made progress in the image processing treatments and in the extracted features; in fact, some of them outperform existing features in the literature. In addition, we have applied, for the first time in the literature, machine learning to aggregate categorization. In this way, we automatically identify their morphology, which determines the final properties of the material they are mixed with. Finally, we have presented an aggregate reconstruction genetic algorithm that, with only two orthogonal images, provides more information than a tomography, which requires many images. To summarize, we have improved the state of the art in direct analysis techniques, paving the way for the replacement of the current indirect techniques in the near future.
NASA Astrophysics Data System (ADS)
Juliane, C.; Arman, A. A.; Sastramihardja, H. S.; Supriana, I.
2017-03-01
Motivation to learn is a requirement for a successful learning process and needs to be maintained properly. This study aims to measure learning motivation, especially in the process of electronic learning (e-learning). Here, a data mining approach was chosen as the research method. For the testing process, a comparative accuracy study of different testing techniques, Cross Validation and Percentage Split, was conducted. The best accuracy was achieved by the J48 algorithm with the percentage split technique, reaching 92.19%. This study provides an overview of how to detect the presence of learning motivation in the context of e-learning. It is expected to be a useful contribution to education and to alert teachers to the students who need to be given motivation.
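The study compares evaluation protocols in Weka with J48 (a C4.5 implementation); the sketch below reproduces only the idea of the comparison (percentage split versus cross-validation accuracy for a decision tree). scikit-learn's CART tree stands in for C4.5, and the data are synthetic placeholders.

```python
# Hedged sketch: comparing percentage-split and cross-validation accuracy
# for a decision tree, analogous to the Weka J48 experiment described above
# (CART is used here as an approximation of C4.5; data are synthetic).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 8))            # e.g. LMS activity features (toy)
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # 1 = motivated, 0 = not (toy label)

tree = DecisionTreeClassifier(max_depth=4, random_state=0)

# percentage split (here 66% train / 34% test)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.34, random_state=0)
split_acc = accuracy_score(y_te, tree.fit(X_tr, y_tr).predict(X_te))

# 10-fold cross-validation on the same data
cv_acc = cross_val_score(tree, X, y, cv=10).mean()

print(f"percentage split: {split_acc:.3f}, 10-fold CV: {cv_acc:.3f}")
```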
Subramanian, J; Anderson, V R; Morgaine, K C; Thomson, W M
2013-02-01
Research suggests that students' perceptions should be considered in any discussion of their education. However, to date, there has been no systematic examination of New Zealand postgraduate dental students' learning processes in both the research and clinical settings. This study aimed to obtain in-depth qualitative insights into student and graduate perspectives of effective and ineffective learning experiences during their postgraduate dental education. Data were collected in 2010 using semi-structured individual interviews. Participants included 2010 final-year students and 2009 graduates of the University of Otago Doctor of Clinical Dentistry programme. Using the Critical Incident Technique, participants were asked to describe at least one effective and one ineffective learning experience in detail. Interview transcripts were analysed using a general inductive approach. Broad themes which emerged included supervisory approaches, characteristics of the learning process and characteristics of the physical learning environment. The focus of this article is to report and discuss the learning processes that participants identified as promoting and precluding effective learning experiences in the clinical and research settings. Students and graduates in the study had largely similar perspectives of learning processes likely to result in effective clinical and research learning. These included self-directed and collaborative learning; timely, constructive and detailed feedback with directions for further improvement; and discreet clinical feedback. Learning processes that precluded effective learning included unsupported and isolated learning, delayed and overly critical/destructive feedback and open criticism in the clinical context. The in-depth findings of this study contribute to the scientific literature that identifies learning process characteristics which facilitate effective learning from New Zealand postgraduate students' and graduates' perspectives. Additional cross-sectional and longitudinal studies (both qualitative and quantitative) would lead to a better understanding of what constitutes effective teaching in postgraduate dental education. © 2012 John Wiley & Sons A/S.
Effects of Enhancement Techniques on L2 Incidental Vocabulary Learning
ERIC Educational Resources Information Center
Duan, Shiping
2018-01-01
Enhancement Techniques are conducive to incidental vocabulary learning. This study investigated the effects of two types of enhancement techniques-multiple-choice glosses (MC) and L1 single-gloss (SG) on L2 incidental learning of new words and retention of them. A total of 89 university learners of English as a Foreign Language (EFL) were asked to…
Modified UTAUT2 Model for M-Learning among Students in India
ERIC Educational Resources Information Center
Bharati, V. Jayendra; Srikanth, R.
2018-01-01
Ubiquitous technologies have a great potential to enrich students' academic experience. Students are more interested in using interactive learning techniques apart from the traditional learning techniques. Several research studies for m-learning has been done in the USA, UK concentrating on students undergoing a graduation degree, especially…
NASA Astrophysics Data System (ADS)
Drakopoulou, E.; Cowan, G. A.; Needham, M. D.; Playfer, S.; Taani, M.
2018-04-01
The application of machine learning techniques to the reconstruction of lepton energies in water Cherenkov detectors is discussed and illustrated for TITUS, a proposed intermediate detector for the Hyper-Kamiokande experiment. It is found that applying these techniques leads to an improvement of more than 50% in the energy resolution for all lepton energies compared to an approach based upon lookup tables. Machine learning techniques can be easily applied to different detector configurations and the results are comparable to likelihood-function based techniques that are currently used.
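The abstract does not name the specific learners or features used; purely as an illustration of the regression task it describes (mapping reconstructed detector quantities to lepton energy), a generic gradient-boosting sketch on invented summary features is shown below.

```python
# Illustrative sketch only: regressing lepton energy from summary detector
# features with gradient boosting. The paper's actual features and learners
# are not reproduced; the synthetic data exist only to show the shape.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000
total_charge = rng.uniform(100, 5000, n)      # hypothetical PMT charge sum
n_hits = rng.integers(50, 2000, n)            # hypothetical hit count
X = np.column_stack([total_charge, n_hits])
E_true = 0.4 * total_charge + 0.1 * n_hits + rng.normal(0, 30, n)  # MeV, toy

X_tr, X_te, y_tr, y_te = train_test_split(X, E_true, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, max_depth=3).fit(X_tr, y_tr)

# fractional energy resolution on held-out events (toy number)
resolution = np.std((model.predict(X_te) - y_te) / y_te)
print(f"fractional energy resolution (toy): {resolution:.3f}")
```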
Choosing the Most Effective Pattern Classification Model under Learning-Time Constraint.
Saito, Priscila T M; Nakamura, Rodrigo Y M; Amorim, Willian P; Papa, João P; de Rezende, Pedro J; Falcão, Alexandre X
2015-01-01
Nowadays, large datasets are common and demand faster and more effective pattern analysis techniques. However, methodologies to compare classifiers usually do not take into account the learning-time constraints required by applications. This work presents a methodology to compare classifiers with respect to their ability to learn from classification errors on a large learning set, within a given time limit. Faster techniques may acquire more training samples, but only when they are more effective will they achieve higher performance on unseen testing sets. We demonstrate this result using several techniques, multiple datasets, and typical learning-time limits required by applications.
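A minimal sketch of the idea of comparing classifiers under a fixed learning-time budget follows: each learner keeps consuming training batches until the time limit expires, then is scored on unseen data. The budget, batch size and models are assumptions for illustration, not the authors' protocol.

```python
# Sketch of time-budgeted classifier comparison: each model trains on as
# many batches as the budget allows, then is evaluated on a held-out set.
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=60000, n_features=20, random_state=0)
X_pool, y_pool, X_test, y_test = X[:50000], y[:50000], X[50000:], y[50000:]

def accuracy_within_budget(model, budget_s=1.0, batch=2000):
    classes = np.unique(y_pool)
    used, start, first = 0, time.monotonic(), True
    while used < len(X_pool) and time.monotonic() - start < budget_s:
        sl = slice(used, used + batch)
        if first:                         # classes only needed on first call
            model.partial_fit(X_pool[sl], y_pool[sl], classes=classes)
            first = False
        else:
            model.partial_fit(X_pool[sl], y_pool[sl])
        used += batch
    return used, model.score(X_test, y_test)

for name, mdl in [("SGD (linear)", SGDClassifier()), ("GaussianNB", GaussianNB())]:
    n_used, acc = accuracy_within_budget(mdl)
    print(f"{name}: trained on {n_used} samples, accuracy {acc:.3f}")
```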
Learning and Tuning of Fuzzy Rules
NASA Technical Reports Server (NTRS)
Berenji, Hamid R.
1997-01-01
In this chapter, we review some of the current techniques for learning and tuning fuzzy rules. For clarity, we refer to the process of generating rules from data as the learning problem and distinguish it from tuning an already existing set of fuzzy rules. For learning, we touch on unsupervised learning techniques such as fuzzy c-means, fuzzy decision tree systems, fuzzy genetic algorithms, and linear fuzzy rules generation methods. For tuning, we discuss Jang's ANFIS architecture, Berenji-Khedkar's GARIC architecture and its extensions in GARIC-Q. We show that the hybrid techniques capable of learning and tuning fuzzy rules, such as CART-ANFIS, RNN-FLCS, and GARIC-RB, are desirable in development of a number of future intelligent systems.
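Of the techniques listed, fuzzy c-means is the most compact to illustrate. A from-scratch sketch is given below (m is the fuzzifier and U the membership matrix); it makes no claim of matching any particular implementation discussed in the chapter.

```python
# From-scratch sketch of fuzzy c-means, one of the unsupervised learning
# techniques mentioned above. m is the fuzzifier, U the membership matrix.
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                 # memberships sum to 1
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted cluster means
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))                # inverse-distance update
        U /= U.sum(axis=1, keepdims=True)             # renormalize rows
    return centers, U

data_rng = np.random.default_rng(1)
X = np.vstack([data_rng.normal(mu, 0.3, (50, 2)) for mu in (0, 3, 6)])
centers, U = fuzzy_c_means(X, c=3)
print(np.round(centers, 2))
```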
Limited transfer of long-term motion perceptual learning with double training.
Liang, Ju; Zhou, Yifeng; Fahle, Manfred; Liu, Zili
2015-01-01
A significant recent development in visual perceptual learning research is the double training technique. With this technique, Xiao, Zhang, Wang, Klein, Levi, and Yu (2008) have found complete transfer in tasks that had previously been shown to be stimulus specific. The significance of this finding is that this technique has since been successful in all tasks tested, including motion direction discrimination. Here, we investigated whether or not this technique could generalize to longer-term learning, using the method of constant stimuli. Our task was learning to discriminate motion directions of random dots. The second leg of training was contrast discrimination along a new average direction of the same moving dots. We found that, although exposure of moving dots along a new direction facilitated motion direction discrimination, this partial transfer was far from complete. We conclude that, although perceptual learning is transferrable under certain conditions, stimulus specificity also remains an inherent characteristic of motion perceptual learning.
Recent developments in machine learning applications in landslide susceptibility mapping
NASA Astrophysics Data System (ADS)
Lun, Na Kai; Liew, Mohd Shahir; Matori, Abdul Nasir; Zawawi, Noor Amila Wan Abdullah
2017-11-01
While predicting the spatial distribution of potential landslide occurrences is a primary interest in landslide hazard mitigation, it remains a challenging task. To overcome the scarcity of complete, sufficiently detailed geomorphological attributes and environmental conditions, various machine-learning techniques are increasingly applied to effectively map landslide susceptibility for large regions. Nevertheless, few review papers are devoted to this field, particularly to the various domain-specific applications of machine learning techniques. The available literature often reports relatively good predictive performance; however, papers discussing the limitations of each approach are quite uncommon. The foremost aim of this paper is to narrow these gaps in the literature and to review up-to-date machine learning and ensemble learning techniques applied in landslide susceptibility mapping. It provides new readers with an introductory understanding of the subject matter and researchers with a contemporary review of machine learning advancements, alongside the future direction of these techniques in the landslide mitigation field.
Backåberg, Sofia; Gummesson, Christina; Brunt, David; Rask, Mikael
2015-01-01
Healthcare staff and students are at great risk of developing musculoskeletal symptoms. One cause of this is heavy-load-related work activities such as manual handling, in which the quality of individual work technique may play a major role. Preventive interventions and well-defined educational strategies to support movement awareness and long-lasting movement changes need to be developed. The aim of the present study was to explore nursing students' experiences of a newly developed interactive learning model for movement awareness. The learning model, which is based on a life-world perspective with a focus on interpersonal interaction, has been used with 11 undergraduate students from the second and final years. Each student participated in three individual video sessions with a facilitator. Two individual interviews were carried out with each student during the learning process and one interview 12-18 months after the last session. The interviews were audio-recorded and transcribed verbatim, and a phenomenological hermeneutic method inspired by Paul Ricoeur and described by Lindseth and Norberg was used to interpret the interviews and diary notes. The interpretation resulted in three key themes and nine subthemes. The key themes were: "Obtaining better preconditions for bodily awareness," "Experiencing changes in one's own movement," and "Experiencing challenges in the learning process." The interactive learning model entails a powerful and challenging experience that develops movement awareness. The experience of meaningfulness and usefulness emerges increasingly and alternates with a feeling of discomfort. The learning model may contribute to the body of knowledge of well-defined educational strategies in movement awareness and learning in, for example, preventive interventions and ergonomic education. It may also be valuable in other practical learning situations where movement awareness is required.
Short-range quantitative precipitation forecasting using Deep Learning approaches
NASA Astrophysics Data System (ADS)
Akbari Asanjan, A.; Yang, T.; Gao, X.; Hsu, K. L.; Sorooshian, S.
2017-12-01
Predicting short-range quantitative precipitation is very important for flood forecasting, early flood warning and other hydrometeorological purposes. This study aims to improve precipitation forecasting skill using a recently developed and advanced machine learning technique named Long Short-Term Memory (LSTM). The proposed LSTM learns the changing patterns of clouds from Cloud-Top Brightness Temperature (CTBT) images, retrieved from the infrared channel of the Geostationary Operational Environmental Satellite (GOES), using a sophisticated and effective learning method. After learning the dynamics of clouds, the LSTM model predicts the upcoming rainy CTBT events. The proposed model is then merged with a precipitation estimation algorithm termed Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) to provide precipitation forecasts. The results of the LSTM merged with PERSIANN are compared to those of an Elman-type Recurrent Neural Network (RNN) merged with PERSIANN and to the Final Analysis of the Global Forecast System model over the states of Oklahoma, Florida and Oregon. The performance of each model is investigated during three storm events, each located over one of the study regions. The results indicate that the merged LSTM forecasts outperform the numerical and statistical baselines in terms of Probability of Detection (POD), False Alarm Ratio (FAR), Critical Success Index (CSI), RMSE and correlation coefficient, especially in convective systems. The proposed method shows superior capabilities in short-term forecasting over the compared methods.
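As a greatly simplified illustration of the LSTM forecasting step described above (learning cloud dynamics from an image sequence and predicting the next frame), a toy Keras sketch is shown below. The layer sizes, flattened-patch representation and synthetic data are placeholder assumptions, not the authors' configuration, and the PERSIANN merging step is omitted.

```python
# Greatly simplified sketch of LSTM-based next-frame forecasting in the
# spirit of the CTBT/LSTM approach above; shapes, layers and synthetic data
# are placeholders, not the authors' configuration.
import numpy as np
import tensorflow as tf

T, H, W = 8, 16, 16                     # sequence length, patch height/width
X = np.random.rand(256, T, H * W).astype("float32")   # past CTBT frames
y = np.random.rand(256, H * W).astype("float32")      # next frame (flattened)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(T, H * W)),
    tf.keras.layers.LSTM(128),          # learns temporal cloud dynamics
    tf.keras.layers.Dense(H * W),       # predicts the next (flattened) frame
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

next_frame = model.predict(X[:1]).reshape(H, W)   # forecast frame, which a
print(next_frame.shape)                           # rainfall estimator would use
```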
The use of wireless laptop computers for computer-assisted learning in pharmacokinetics.
Munar, Myrna Y; Singh, Harleen; Belle, Donna; Brackett, Carolyn C; Earle, Sandra B
2006-02-15
To implement computer-assisted learning workshops into pharmacokinetics courses in a doctor of pharmacy (PharmD) program. Workshops were designed for students to utilize computer software programs on laptop computers to build pharmacokinetic models to predict drug concentrations resulting from various dosage regimens. In addition, students were able to visualize through graphing programs how altering different parameters changed drug concentration-time curves. Surveys were conducted to measure students' attitudes toward computer technology before and after implementation. Finally, traditional examinations were used to evaluate student learning. Doctor of pharmacy students responded favorably to the use of wireless laptop computers in problem-based pharmacokinetic workshops. Eighty-eight percent (n = 61/69) and 82% (n = 55/67) of PharmD students completed surveys before and after computer implementation, respectively. Prior to implementation, 95% of students agreed that computers would enhance learning in pharmacokinetics. After implementation, 98% of students strongly agreed (p < 0.05) that computers enhanced learning. Examination results were significantly higher after computer implementation (89% with computers vs. 84% without computers; p = 0.01). Implementation of wireless laptop computers in a pharmacokinetic course enabled students to construct their own pharmacokinetic models that could respond to changing parameters. Students had greater comprehension and were better able to interpret results and provide appropriate recommendations. Computer-assisted pharmacokinetic techniques can be powerful tools when making decisions about drug therapy.
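A minimal example of the kind of model such workshops have students build, a one-compartment IV-bolus pharmacokinetic model with repeated dosing handled by superposition, is sketched below. The dose, volume of distribution and elimination rate constant are arbitrary teaching numbers, not values from the article.

```python
# Minimal example of a one-compartment IV-bolus model with repeated dosing
# by superposition: C(t) = (D/V) * exp(-k*t) summed over prior doses.
# Parameter values are arbitrary illustrative numbers.
import numpy as np

def concentration(t_h, dose_mg=500, V_L=40.0, k_per_h=0.173, tau_h=8.0):
    """Plasma concentration (mg/L) at times t_h with a dose every tau_h hours."""
    t = np.asarray(t_h, dtype=float)
    c = np.zeros_like(t)
    n_doses = int(t.max() // tau_h) + 1
    for i in range(n_doses):                      # superpose each dose
        dt = t - i * tau_h
        c += np.where(dt >= 0, (dose_mg / V_L) * np.exp(-k_per_h * dt), 0.0)
    return c

times = np.arange(0, 48.1, 0.5)
c = concentration(times)
print(f"trough just before the 4th dose: {concentration([23.9])[0]:.2f} mg/L")
print(f"peak shortly after the dose at 40 h: {c[times == 40.0][0]:.2f} mg/L")
```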
The Use of Wireless Laptop Computers for Computer-Assisted Learning in Pharmacokinetics
Munar, Myrna Y.; Singh, Harleen; Belle, Donna; Brackett, Carolyn C.; Earle, Sandra B.
2006-01-01
Objective To implement computer-assisted learning workshops into pharmacokinetics courses in a doctor of pharmacy (PharmD) program. Design Workshops were designed for students to utilize computer software programs on laptop computers to build pharmacokinetic models to predict drug concentrations resulting from various dosage regimens. In addition, students were able to visualize through graphing programs how altering different parameters changed drug concentration-time curves. Surveys were conducted to measure students’ attitudes toward computer technology before and after implementation. Finally, traditional examinations were used to evaluate student learning. Assessment Doctor of pharmacy students responded favorably to the use of wireless laptop computers in problem-based pharmacokinetic workshops. Eighty-eight percent (n = 61/69) and 82% (n = 55/67) of PharmD students completed surveys before and after computer implementation, respectively. Prior to implementation, 95% of students agreed that computers would enhance learning in pharmacokinetics. After implementation, 98% of students strongly agreed (p < 0.05) that computers enhanced learning. Examination results were significantly higher after computer implementation (89% with computers vs. 84% without computers; p = 0.01). Conclusion Implementation of wireless laptop computers in a pharmacokinetic course enabled students to construct their own pharmacokinetic models that could respond to changing parameters. Students had greater comprehension and were better able to interpret results and provide appropriate recommendations. Computer-assisted pharmacokinetic techniques can be powerful tools when making decisions about drug therapy. PMID:17136147
Ji, Hongwei; He, Jiangping; Yang, Xin; Deklerck, Rudi; Cornelis, Jan
2013-05-01
In this paper, we present an autocontext model (ACM)-based automatic liver segmentation algorithm, which combines ACM, multiatlases, and mean-shift techniques to segment the liver from 3-D CT images. Our algorithm is a learning-based method and can be divided into two stages. At the first stage, i.e., the training stage, ACM is performed to learn a sequence of classifiers in each atlas space (based on each atlas and other aligned atlases). With the use of multiple atlases, multiple sequences of ACM-based classifiers are obtained. At the second stage, i.e., the segmentation stage, the test image is segmented in each atlas space by applying each sequence of ACM-based classifiers. The final segmentation result is obtained by fusing the segmentation results from all atlas spaces via a multiclassifier fusion technique. Specifically, in order to speed up segmentation, given a test image, we first use an improved mean-shift algorithm to perform over-segmentation and then implement region-based image labeling instead of the original, inefficient pixel-based image labeling. The proposed method is evaluated on the datasets of the MICCAI 2007 liver segmentation challenge. The experimental results show that the average volume overlap error and the average surface distance achieved by our method are 8.3% and 1.5 mm, respectively, which are comparable to the results reported in the existing state-of-the-art work on liver segmentation.
Medical student web-based formative assessment tool for renal pathology.
Bijol, Vanesa; Byrne-Dugan, Cathryn J; Hoenig, Melanie P
2015-01-01
Background Web-based formative assessment tools have become widely recognized in medical education as valuable resources for self-directed learning. Objectives To explore the educational value of formative assessment using online quizzes for kidney pathology learning in our renal pathophysiology course. Methods Students were given unrestricted and optional access to quizzes. Performance on quizzed and non-quizzed materials of those who used ('quizzers') and did not use the tool ('non-quizzers') was compared. Frequency of tool usage was analyzed and satisfaction surveys were utilized at the end of the course. Results In total, 82.6% of the students used quizzes. The greatest usage was observed on the day before the final exam. Students repeated interactive and more challenging quizzes more often. Mean final exam scores on quizzed and unrelated materials were almost equal for both 'quizzers' and 'non-quizzers', but 'quizzers' performed significantly better than 'non-quizzers' on both quizzed (p=0.001) and non-quizzed (p=0.024) topics. In total, 89% of surveyed students thought quizzes improved their learning experience in this course. Conclusions Our new computer-assisted learning tool is popular, and although its use can predict the final exam outcome, it does not provide strong evidence for direct improvement in academic performance. Students who chose to use quizzes did well on all aspects of the final exam and most commonly used quizzes to practice for the final exam. Our efforts to revitalize the course material and promote learning by adding interactive online formative assessments improved students' learning experience overall.
ERIC Educational Resources Information Center
Lasky, Barbara; Tempone, Irene
2004-01-01
Action learning techniques are well suited to the teaching of organisation behaviour students because of their flexibility, inclusiveness, openness, and respect for individuals. They are no less useful as a tool for change for vocational teachers, learning, of necessity, to become researchers. Whereas traditional universities have always had a…
The Journal of the Society for Accelerative Learning and Teaching, Volume 7.
ERIC Educational Resources Information Center
Journal of the Society for Accelerative Learning and Teaching, 1982
1982-01-01
The four 1982 numbers of the Journal of the Society for Accelerative Learning and Teaching (SALT) include articles on: a comparison of the Tomatis Method and Suggestopedia; the CLC system of accelerated learning; Suggestopedia in the English-as-a-second-language classroom; experiments with SALT techniques; accelerative learning techniques for…
Attitudes of Nigerian Secondary School Teachers towards Media-Based Learning.
ERIC Educational Resources Information Center
Ekpo, C. M.
This document presents results of a study assessing the attitudes of secondary school teachers towards media based learning. The study explores knowledge of and exposure to media based learning techniques of a cross section of Nigerian secondary school teachers. Factors that affect the use of media based learning technique are sought. Media based…
Wise, Christopher H.; Schenk, Ronald J.; Lattanzi, Jill Black
2016-01-01
Background Despite emerging evidence to support the use of high velocity thrust manipulation in the management of lumbar spinal conditions, utilization of thrust manipulation among clinicians remains relatively low. One reason for the underutilization of these procedures may be related to disparity in training in the performance of these techniques at the professional and post professional levels. Purpose To assess the effect of using a new model of active learning on participant confidence in the performance of spinal thrust manipulation and the implications for its use in the professional and post-professional training of physical therapists. Methods A cohort of 15 DPT students in their final semester of entry-level professional training participated in an active training session emphasizing a sequential partial task practice (SPTP) strategy in which participants engaged in partial task practice over several repetitions with different partners. Participants’ level of confidence in the performance of these techniques was determined through comparison of pre- and post-training session surveys and a post-session open-ended interview. Results The increase in scores across all items of the individual pre- and post-session surveys suggests that this model was effective in changing overall participant perception regarding the effectiveness and safety of these techniques and in increasing student confidence in their performance. Interviews revealed that participants greatly preferred the SPTP strategy, which enhanced their confidence in technique performance. Conclusion Results indicate that this new model of psychomotor training may be effective at improving confidence in the performance of spinal thrust manipulation and, subsequently, may be useful for encouraging the future use of these techniques in the care of individuals with impairments of the spine. Inasmuch, this method of instruction may be useful for training of physical therapists at both the professional and post-professional levels. PMID:27559284
Engineering Design Education Program for Graduate School
NASA Astrophysics Data System (ADS)
Ohbuchi, Yoshifumi; Iida, Haruhiko
These new educational methods for engineering design attempt to improve mechanical engineering education for graduate students through collaboration between engineers and designers in teaching. The education program is based on lectures and practical exercises concerning product design, and covers both engineering themes and design-process themes, e.g. project management, QFD, TRIZ, robust design (Taguchi method), ergonomics, usability, marketing and conception. In the final exercise, all students were able to design a new product related to their own research theme by applying the knowledge and techniques they had learned. Through this method of engineering design education, we have confirmed that graduate students are able to experience technological and creative interest.
McQueeney, Krista
2016-10-01
This article describes an intersectional approach to teaching about domestic violence (DV), which aims to empower students as critical thinkers and agents of change by merging theory, service learning, self-reflection, and activism. Three intersectional strategies and techniques for teaching about DV are discussed: promoting difference-consciousness, complicating gender-only power frameworks, and organizing for change. The author argues that to empower future generations to end violence, educators should put intersectionality into action through their use of scholarship, teaching methods, and pedagogical authority. Finally, the benefits and challenges of intersectional pedagogy for social justice education are considered. © The Author(s) 2016.
SEGMENTATION OF MITOCHONDRIA IN ELECTRON MICROSCOPY IMAGES USING ALGEBRAIC CURVES.
Seyedhosseini, Mojtaba; Ellisman, Mark H; Tasdizen, Tolga
2013-01-01
High-resolution microscopy techniques have been used to generate large volumes of data with enough details for understanding the complex structure of the nervous system. However, automatic techniques are required to segment cells and intracellular structures in these multi-terabyte datasets and make anatomical analysis possible on a large scale. We propose a fully automated method that exploits both shape information and regional statistics to segment irregularly shaped intracellular structures such as mitochondria in electron microscopy (EM) images. The main idea is to use algebraic curves to extract shape features together with texture features from image patches. Then, these powerful features are used to learn a random forest classifier, which can predict mitochondria locations precisely. Finally, the algebraic curves together with regional information are used to segment the mitochondria at the predicted locations. We demonstrate that our method outperforms the state-of-the-art algorithms in segmentation of mitochondria in EM images.
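A hedged sketch of the learning step described above follows: a random forest trained on per-patch feature vectors to predict mitochondria locations. The algebraic-curve shape features and texture features are abstracted into a generic synthetic feature matrix purely for illustration.

```python
# Hedged sketch of the learning step: a random forest over per-patch feature
# vectors (placeholder for algebraic-curve shape + texture features)
# predicting whether a patch contains a mitochondrion.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(7)
n_patches, n_features = 4000, 60
features = rng.normal(size=(n_patches, n_features))     # toy patch features
labels = (features[:, :5].sum(axis=1) > 0).astype(int)  # 1 = mitochondrion

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```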
Extracting semantics from audio-visual content: the final frontier in multimedia retrieval.
Naphade, M R; Huang, T S
2002-01-01
Multimedia understanding is a fast emerging interdisciplinary research area. There is tremendous potential for effective use of multimedia content through intelligent analysis. Diverse application areas are increasingly relying on multimedia understanding systems. Advances in multimedia understanding are related directly to advances in signal processing, computer vision, pattern recognition, multimedia databases, and smart sensors. We review the state-of-the-art techniques in multimedia retrieval. In particular, we discuss how multimedia retrieval can be viewed as a pattern recognition problem. We discuss how reliance on powerful pattern recognition and machine learning techniques is increasing in the field of multimedia retrieval. We review the state-of-the-art multimedia understanding systems with particular emphasis on a system for semantic video indexing centered around multijects and multinets. We discuss how semantic retrieval is centered around concepts and context and the various mechanisms for modeling concepts and context.
Tamura, Naomi; Terashita, Takayoshi; Ogasawara, Katsuhiko
2013-01-01
Students with a positive impression of their studies can become more motivated. This study measured the learning impact of clinical training by comparing student impressions before and after clinical training. The study included 32 final-year students of radiological technology in the Division of Radiological Science and Technology, Department of Health Sciences, School of Medicine, Hokkaido University. To measure student impressions of x-ray examination training, we developed a questionnaire using the semantic differential technique. The resulting factor analysis identified two factors that accounted for 44.9% of the variance across the 10 bipolar adjective scales. Factor 1 represented a "resistance" impression of x-ray examination training, and factor 2 represented a "responsibility" impression. The differences in factor scores before and after the clinical training suggest that student impressions are affected by clinical training.
Use of data mining to predict significant factors and benefits of bilateral cochlear implantation.
Ramos-Miguel, Angel; Perez-Zaballos, Teresa; Perez, Daniel; Falcon, Juan Carlos; Ramos, Angel
2015-11-01
Data mining (DM) is a technique used to discover patterns and knowledge from large amounts of data. It draws on artificial intelligence, machine learning, statistics, databases, etc. In this study, DM was successfully used as a predictive tool to assess disyllabic speech test performance in bilaterally implanted patients, with a success rate above 90%. Sixty adult patients with sequential bilateral implantation were included in the study. The DM algorithms developed found correlations between unilateral medical records and audiological test results and bilateral performance by establishing relevant variables based on two DM techniques: classification and estimation. The nearest neighbor algorithm was implemented in the first case, and linear regression in the second. The results showed that patients with unilateral disyllabic test results below 70% benefited the most from a bilateral implantation. Finally, it was observed that the benefits decrease as the inter-implant time increases.
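A minimal sketch of the two techniques named above (nearest-neighbour classification and linear regression) applied to toy pre-implant variables is given below; the study's real feature set, preprocessing and outcomes are not reproduced, and the simulated trends are only meant to mirror the direction of the reported findings.

```python
# Minimal sketch of the two DM techniques named above, on toy variables.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 60
unilateral_score = rng.uniform(20, 100, n)     # unilateral disyllabic test (%)
inter_implant_years = rng.uniform(0, 15, n)
X = np.column_stack([unilateral_score, inter_implant_years])

# toy outcome: benefit shrinks with inter-implant time and grows when the
# unilateral score is low (mirroring the reported trends, not the real data)
bilateral_gain = (40 - 0.3 * unilateral_score - 1.5 * inter_implant_years
                  + rng.normal(0, 3, n))
benefits_much = (bilateral_gain > 15).astype(int)

clf = KNeighborsClassifier(n_neighbors=5).fit(X, benefits_much)  # classifier
reg = LinearRegression().fit(X, bilateral_gain)                  # estimator

new_patient = [[65.0, 2.0]]      # unilateral score 65%, 2 years between implants
print(clf.predict(new_patient), reg.predict(new_patient))
```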
Takeda, Sen; Yoshimura, Kentaro; Tanabe, Kunio
2015-09-01
Conventionally, a definitive diagnosis of cancer is derived from histopathological diagnostics based on morphological criteria that are difficult to standardize on a quantifiable basis. On the other hand, while molecular tumor markers and blood biochemical profiles give quantitative values evaluated by objective criteria, these parameters are usually generated by deductive methods such as peak extraction. Therefore, some of the data that may contain useful information on specimens are discarded. To overcome the disadvantages of these methods, we have developed a new approach for cancer diagnosis that employs both mass spectrometry and machine learning. Probe electrospray ionization (PESI) is a derivative of electrospray ionization that uses a fine acupuncture needle as a sample picker as well as an ion emitter for mass spectrometry. This method enables us to ionize very small tissue samples, as small as a few picoliters, in the presence of physiological concentrations of inorganic salts, without the need for any sample pretreatment. Moreover, as this technique makes it possible to ionize all components with minimal suppression effects, we can retrieve much more molecular information from specimens. To make the most of data enriched with lipid compounds and substances with lower molecular weights such as carbohydrates, we employed a machine-learning method named the dual penalized logistic regression machine (dPLRM). This method is completely different from pattern matching in that it discriminates categories by projecting the spectral data into a mathematical space with very high dimensions, where the final judgment is made. We are now approaching the final clinical trial to validate the usefulness of our system.
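The dPLRM itself is not specified in this abstract; as an assumption-laden stand-in for the general idea (classifying whole spectra without peak extraction using a penalized logistic model), a generic L2-penalized logistic regression on full intensity vectors is sketched below with synthetic data.

```python
# Stand-in sketch for the idea above: L2-penalized logistic regression on
# whole-spectrum intensity vectors, i.e. no peak extraction. Synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_spectra, n_bins = 120, 2000                   # m/z bins of each spectrum
X = rng.lognormal(size=(n_spectra, n_bins))     # toy intensity vectors
y = rng.integers(0, 2, n_spectra)               # 1 = tumour, 0 = non-tumour

model = make_pipeline(StandardScaler(),
                      LogisticRegression(penalty="l2", C=0.1, max_iter=2000))
print(f"cross-validated accuracy (toy): {cross_val_score(model, X, y, cv=5).mean():.3f}")
```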
ERIC Educational Resources Information Center
Cechinel, Cristian
2014-01-01
This work presents a quantitative study of the use of a Learning Management System (LMS) by the professors of a distance learning course, focused on the guidance given for the students' Final Undergraduate Project. Data taken from the logs of 34 professors in two distinct virtual rooms were collected. After pre-processing the data, a series of…
A novel method for predicting kidney stone type using ensemble learning.
Kazemi, Yassaman; Mirroshandel, Seyed Abolghasem
2018-01-01
The high morbidity rate associated with kidney stone disease, which is a silent killer, is one of the main concerns in healthcare systems all over the world. Advanced data mining techniques such as classification can help in the early prediction of this disease and reduce its incidence and associated costs. The objective of the present study is to derive a model for the early detection of the type of kidney stone and the most influential parameters with the aim of providing a decision-support system. Information was collected from 936 patients with nephrolithiasis at the kidney center of the Razi Hospital in Rasht from 2012 through 2016. The prepared dataset included 42 features. Data pre-processing was the first step toward extracting the relevant features. The collected data was analyzed with Weka software, and various data mining models were used to prepare a predictive model. Various data mining algorithms such as the Bayesian model, different types of Decision Trees, Artificial Neural Networks, and Rule-based classifiers were used in these models. We also proposed four models based on ensemble learning to improve the accuracy of each learning algorithm. In addition, a novel technique for combining individual classifiers in ensemble learning was proposed. In this technique, for each individual classifier, a weight is assigned based on our proposed genetic-algorithm-based method. The generated knowledge was evaluated using a 10-fold cross-validation technique based on standard measures. However, the assessment of each feature for building a predictive model was another significant challenge. The predictive strength of each feature for creating a reproducible outcome was also investigated. Regarding the applied models, parameters such as sex, uric acid condition, calcium level, hypertension, diabetes, nausea and vomiting, flank pain, and urinary tract infection (UTI) were the most vital parameters for predicting the chance of nephrolithiasis. The final ensemble-based model (with an accuracy of 97.1%) was a robust one and could be safely applied to future studies to predict the chances of developing nephrolithiasis. This model provides a novel way to study stone disease by deciphering the complex interaction among different biological variables, thus helping in an early identification and reduction in diagnosis time. Copyright © 2017 Elsevier B.V. All rights reserved.
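The combination step described above assigns each base classifier a weight tuned by a genetic algorithm; a simplified sketch of weighted soft voting is shown below. The weights here are fixed by hand purely for illustration (a stand-in for the GA-tuned weights), and the base learners and data are generic placeholders.

```python
# Simplified sketch of weighted soft voting across base classifiers; the
# weights stand in for the paper's genetic-algorithm-tuned weights.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=900, n_features=42, n_informative=10,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

learners = [GaussianNB(), DecisionTreeClassifier(random_state=0),
            MLPClassifier(max_iter=1000, random_state=0)]
weights = np.array([0.2, 0.3, 0.5])             # stand-in for GA-tuned weights

proba = sum(w * clf.fit(X_tr, y_tr).predict_proba(X_te)
            for w, clf in zip(weights, learners))
y_pred = proba.argmax(axis=1)                   # weighted soft vote
print(f"weighted-ensemble accuracy: {accuracy_score(y_te, y_pred):.3f}")
```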
Active learning for semi-supervised clustering based on locally linear propagation reconstruction.
Chang, Chin-Chun; Lin, Po-Yi
2015-03-01
The success of semi-supervised clustering relies on the effectiveness of side information. To get effective side information, a new active learner learning pairwise constraints known as must-link and cannot-link constraints is proposed in this paper. Three novel techniques are developed for learning effective pairwise constraints. The first technique is used to identify samples less important to cluster structures. This technique makes use of a kernel version of locally linear embedding for manifold learning. Samples neither important to locally linear propagation reconstructions of other samples nor on flat patches in the learned manifold are regarded as unimportant samples. The second is a novel criterion for query selection. This criterion considers not only the importance of a sample to expanding the space coverage of the learned samples but also the expected number of queries needed to learn the sample. To facilitate semi-supervised clustering, the third technique yields inferred must-links for passing information about flat patches in the learned manifold to semi-supervised clustering algorithms. Experimental results have shown that the learned pairwise constraints can capture the underlying cluster structures and proven the feasibility of the proposed approach. Copyright © 2014 Elsevier Ltd. All rights reserved.
Undergraduates Achieve Learning Gains in Plant Genetics through Peer Teaching of Secondary Students
Chrispeels, H. E.; Klosterman, M. L.; Martin, J. B.; Lundy, S. R.; Watkins, J. M.; Gibson, C. L.
2014-01-01
This study tests the hypothesis that undergraduates who peer teach genetics will have greater understanding of genetic and molecular biology concepts as a result of their teaching experiences. Undergraduates enrolled in a non–majors biology course participated in a service-learning program in which they led middle school (MS) or high school (HS) students through a case study curriculum to discover the cause of a green tomato variant. The curriculum explored plant reproduction and genetic principles, highlighting variation in heirloom tomato fruits to reinforce the concept of the genetic basis of phenotypic variation. HS students were taught additional activities related to molecular biology techniques not included in the MS curriculum. We measured undergraduates’ learning outcomes using pre/postteaching content assessments and the course final exam. Undergraduates showed significant gains in understanding of topics related to the curriculum they taught, compared with other course content, on both types of assessments. Undergraduates who taught HS students scored higher on questions specific to the HS curriculum compared with undergraduates who taught MS students, despite identical lecture content, on both types of assessments. These results indicate the positive effect of service-learning peer-teaching experiences on undergraduates’ content knowledge, even for non–science major students. PMID:25452487
Support Vector Machine Model for Automatic Detection and Classification of Seismic Events
NASA Astrophysics Data System (ADS)
Barros, Vesna; Barros, Lucas
2016-04-01
The automated processing of multiple seismic signals to detect, localize and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue and the events are often unclassified or poorly classified. Thus, machine learning techniques can be used in automatic processing for classifying the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique called the Support Vector Machine (SVM). According to Kortström et al., 2015, the advantages of using SVM are handleability of large number of features and effectiveness in high dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and classify them as earthquakes or quarry blasts. It is expected to create a flexible and easily adjustable SVM method that can be applied in different regions and datasets. Taken a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions such as infrasound and hydroacoustic waveforms. As an authorized user, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g. earthquake, quarry blasts) and noise is being analysed to train the model and learn the typical pattern of the signal from these events. Moreover, comparing the performance of the support-vector network to various classical learning algorithms used before in seismic detection and classification is an essential final step to analyze the advantages and disadvantages of the model.
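A schematic sketch of the SVM classification stage only is given below, using hypothetical waveform-derived features (spectral ratio, peak amplitude, origin hour); the real IMS feature extraction, training bulletins and tuning are not reproduced.

```python
# Schematic sketch of SVM-based earthquake vs. quarry-blast classification
# on invented waveform-derived features; not the actual IMS pipeline.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
n = 400
spectral_ratio = np.concatenate([rng.normal(1.0, 0.3, n // 2),   # earthquakes
                                 rng.normal(2.0, 0.3, n // 2)])  # quarry blasts
peak_amplitude = rng.lognormal(0, 1, n)
origin_hour = np.concatenate([rng.uniform(0, 24, n // 2),
                              rng.normal(13, 2, n // 2)])   # blasts in daytime
X = np.column_stack([spectral_ratio, peak_amplitude, origin_hour])
y = np.array([0] * (n // 2) + [1] * (n // 2))    # 0 = earthquake, 1 = blast

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(f"cross-validated accuracy: {cross_val_score(svm, X, y, cv=5).mean():.3f}")
```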
Dollman, James
2005-01-01
The 'Learning Trail' is an innovative application of peer-mediated instruction designed to enhance student learning in large practical classes. The strategy specifically seeks to improve participants' attention to details of protocol that are often difficult to observe during teacher-centered demonstrations to large groups. Students (n=68) at the University of South Australia trialed this strategy, in which instruction in anthropometric techniques is initiated by an instructor to a group of 3-4 students and then sent in 'waves' from one student group to the next. The final group in the sequence demonstrates the techniques to the instructor, who notes any departures from technical accuracy. As each technical module is flowing from group to group, the instructor initiates the next 'wave' with the first group, and the process is repeated until all of the relevant skills are processed. The final stage is a full class discussion during which sources of technical error are identified and resolved. In this trial, students taught skinfold measurement by the peer instructed method (PI; n=33) were compared with a traditionally instructed group (TI; n=35), in which the instructor was responsible for all information transfer. For each participant, technical errors of measurement (TEM) were calculated; the intra-tester TEM as a measure of reliability, and the inter-tester TEM, in which the student's measures are compared with those of a criterion anthropometrist to give an indication of validity. There were no differences between TI and PI groups on intra-tester TEM (p=0.24), but the PI group had a lower inter-tester TEM for pooled skinfold sites (p=0.006) and for one individual site (triceps; p=0.007), but not the other three sites. The time taken to complete the whole set of instructions did not differ between delivery modes. The results of this trial suggest that the peer-mediated strategy may be more effective than teacher-centered instruction in terms of technical accuracy in anthropometry.
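The intra- and inter-tester TEMs reported above follow the standard anthropometric definition for paired measurements, TEM = sqrt(sum(d^2) / (2n)); a small sketch of that calculation on made-up skinfold values is shown below.

```python
# Sketch of the standard technical error of measurement (TEM),
# TEM = sqrt(sum(d^2) / (2n)) for paired measurements. Values are made up.
import numpy as np

def tem(measure_1, measure_2):
    d = np.asarray(measure_1, float) - np.asarray(measure_2, float)
    return np.sqrt(np.sum(d ** 2) / (2 * len(d)))

# intra-tester: the same student measures the same sites twice (mm)
trial_1 = np.array([10.2, 12.5, 8.9, 15.1, 9.8])
trial_2 = np.array([10.6, 12.1, 9.3, 14.6, 10.1])
# inter-tester: the student's measures vs. a criterion anthropometrist
criterion = np.array([10.0, 12.8, 8.6, 15.4, 9.5])

intra, inter = tem(trial_1, trial_2), tem(trial_1, criterion)
rel = 100 * intra / np.mean((trial_1 + trial_2) / 2)   # relative TEM, %
print(f"intra-tester TEM {intra:.2f} mm, inter-tester TEM {inter:.2f} mm, "
      f"relative TEM {rel:.1f}%")
```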
ERIC Educational Resources Information Center
Chen, Chia-Chen; Lin, Pei-Hsuan
2016-01-01
In recent years information technology has been integrated into education to produce a series of trends, beginning with "electronic learning" (e-learning), through "mobile learning" (m-learning) and finally to "ubiquitous learning" (u-learning), which aims to improve learner motivation through overcoming the…
Diagnosis of students' ability in a statistical course based on Rasch probabilistic outcome
NASA Astrophysics Data System (ADS)
Mahmud, Zamalia; Ramli, Wan Syahira Wan; Sapri, Shamsiah; Ahmad, Sanizah
2017-06-01
Measuring students' ability and performance is important in assessing how well students have learned and mastered statistical courses. Any improvement in learning will depend on the students' approaches to learning, which are shaped by several factors, notably the assessment methods used to carry out tasks: quizzes, tests, assignments and the final examination. This study attempted an alternative approach to measuring students' ability in an undergraduate statistical course based on the Rasch probabilistic model. Firstly, this study aims to explore the learning outcome patterns of students in a statistics course (Applied Probability and Statistics) based on an Entrance-Exit survey. This is followed by investigating students' perceived learning ability based on four Course Learning Outcomes (CLOs) and students' actual learning ability based on their final examination scores. Rasch analysis revealed that students perceived themselves as lacking the ability to understand about 95% of the statistics concepts at the beginning of the class but eventually had a good understanding by the end of the 14-week class. In terms of students' performance in the final examination, their ability to understand the topics varied, with different success probabilities given the ability of the students and the difficulty of the questions. The majority found the probability and counting rules topic to be the most difficult to learn.
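The Rasch model underlying this analysis gives the probability of a correct response as a logistic function of the difference between person ability theta and item difficulty b; a small sketch computing these probabilities and the resulting expected test score is shown below, with illustrative parameter values only.

```python
# Sketch of the Rasch probabilistic model: P(correct) = 1 / (1 + exp(-(theta - b)))
# for person ability theta and item difficulty b (values are illustrative).
import numpy as np

def rasch_p(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

item_difficulty = np.array([-1.5, -0.5, 0.0, 0.8, 2.1])  # logits; last item hardest
abilities = np.array([-1.0, 0.0, 1.5])                   # three example students

for theta in abilities:
    p = rasch_p(theta, item_difficulty)
    print(f"ability {theta:+.1f}: item success probabilities {np.round(p, 2)}, "
          f"expected score {p.sum():.2f} / {len(p)}")
```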
Farzandipour, Mehrdad; Meidani, Zahra; Riazi, Hossein; Sadeqi Jabali, Monireh
2018-09-01
There are various approaches to evaluating the usability of electronic medical record (EMR) systems. User perspectives are an integral part of evaluation. Usability evaluations contribute efficiently and effectively to user-centered design, support user tasks, and increase user satisfaction. This study determined the main usability requirements for EMRs by means of an end-user survey. A mixed-method strategy was conducted in three phases. A qualitative approach was employed to collect and formulate EMR usability requirements using the focus group method and the modified Delphi technique. The classic Delphi technique was used to evaluate the proposed requirements among 380 end-users in Iran. The final list of EMR usability requirements was verified and included 163 requirements divided into nine groups. The highest rates of end-user agreement related to EMR visual clarity (3.65 ± 0.61), fault tolerance (3.58 ± 0.56), and suitability for learning (3.55 ± 0.54). The lowest end-user agreement was for auditory presentation (3.18 ± 0.69). The highest and lowest agreement among end-users was thus for visual clarity and auditory presentation by EMRs, respectively. This suggests that end-users differ in their priorities when determining EMR usability and in their understanding of the importance of individual task types and context characteristics.
Selecting a restoration technique to minimize OCR error.
Cannon, M; Fugate, M; Hush, D R; Scovel, C
2003-01-01
This paper introduces a learning problem related to the task of converting printed documents to ASCII text files. The goal of the learning procedure is to produce a function that maps documents to restoration techniques in such a way that on average the restored documents have minimum optical character recognition error. We derive a general form for the optimal function and use it to motivate the development of a nonparametric method based on nearest neighbors. We also develop a direct method of solution based on empirical error minimization for which we prove a finite sample bound on estimation error that is independent of distribution. We show that this empirical error minimization problem is an extension of the empirical optimization problem for traditional M-class classification with general loss function and prove computational hardness for this problem. We then derive a simple iterative algorithm called generalized multiclass ratchet (GMR) and prove that it produces an optimal function asymptotically (with probability 1). To obtain the GMR algorithm we introduce a new data map that extends Kesler's construction for the multiclass problem and then apply an algorithm called Ratchet to this mapped data, where Ratchet is a modification of the Pocket algorithm. Finally, we apply these methods to a collection of documents and report on the experimental results.
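The core nearest-neighbour idea described above (map a new document's image features to the restoration technique with the lowest expected OCR error among similar training documents) can be sketched as follows; the features, candidate techniques and error values are all invented for illustration.

```python
# Sketch of nearest-neighbour selection of a restoration technique: pick the
# technique with the lowest average OCR error among the k nearest training
# documents. Features, techniques and errors are invented placeholders.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(9)
n_docs, n_techniques = 200, 4        # e.g. none, median filter, Wiener, binarize
doc_features = rng.normal(size=(n_docs, 6))           # noise level, contrast, ...
ocr_error = rng.uniform(0.0, 0.3, size=(n_docs, n_techniques))  # per technique

nn = NearestNeighbors(n_neighbors=7).fit(doc_features)

def pick_restoration(new_doc_features):
    _, idx = nn.kneighbors(np.atleast_2d(new_doc_features))
    mean_err = ocr_error[idx[0]].mean(axis=0)   # expected error per technique
    return int(mean_err.argmin()), mean_err

best, errs = pick_restoration(rng.normal(size=6))
print(f"choose technique {best}; expected OCR errors {np.round(errs, 3)}")
```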
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yee, Shannon
BETTER Capstone supported 29 student project teams consisting of 155 students over two years in developing transformative building energy efficiency technologies through a capstone design experience. Capstone is the culmination of an undergraduate student’s engineering education. Interdisciplinary teams of students spent a semester designing and prototyping a technological solution for a variety of building energy efficiency problems. During this experience students utilized the full design process, including the manufacturing and testing of a prototype solution, as well as publicly demonstrating the solution at the Capstone Design Expo. As part of this project, students explored modern manufacturing techniques and gained hands-on experience with these techniques to produce their prototype technologies. This research added to the understanding of the challenges within building technology education and engagement with industry. One goal of the project was to help break the chicken-and-egg problem of getting students to engage more deeply with the building technology industry. It was learned, however, that this industry is less interested in trying innovative new concepts than in hiring graduates for existing conventional building efforts. While none of the projects yielded commercial success, much individual student growth and learning was accomplished, which is a long-term benefit to the public at large.
Computational Fluid Dynamics Analysis Success Stories of X-Plane Design to Flight Test
NASA Technical Reports Server (NTRS)
Cosentino, Gary B.
2008-01-01
Examples of the design and flight test of three true X-planes are described, particularly X-plane design techniques that relied heavily on computational fluid dynamics (CFD) analysis. Three examples are presented: the X-36 Tailless Fighter Agility Research Aircraft, the X-45A Unmanned Combat Air Vehicle, and the X-48B Blended Wing Body Demonstrator Aircraft. An overview is presented of the uses of CFD analysis, comparison and contrast with wind tunnel testing, and information derived from CFD analysis that directly related to successful flight test. Lessons learned on the proper and improper application of CFD analysis are presented. Highlights of the flight-test results of the three example X-planes are presented. This report discusses developing an aircraft shape from early concept and three-dimensional modeling through CFD analysis, wind tunnel testing, further refined CFD analysis, and, finally, flight. An overview of the areas in which CFD analysis does and does not perform well during this process is presented. How wind tunnel testing complements, calibrates, and verifies CFD analysis is discussed. Lessons learned revealing circumstances under which CFD analysis results can be misleading are given. Strengths and weaknesses of the various flow solvers, including panel methods, Euler, and Navier-Stokes techniques, are discussed.
Ott, Laura E; Carson, Susan
2014-01-01
Flow cytometry and enzyme-linked immunosorbent assay (ELISA) are commonly used techniques associated with clinical and research applications within the immunology and medical fields. The use of these techniques is becoming increasingly valuable in many life science and engineering disciplines as well. Herein, we report the development and evaluation of a novel half-semester course that focused on introducing undergraduate and graduate students to advanced conceptual and technical skills associated with flow cytometry and ELISA, with emphasis on applications, experimental design, and data analysis. This course was offered in the North Carolina State University Biotechnology Program over three semesters and consisted of weekly lectures and laboratories. Students performed and/or analyzed flow cytometry and ELISA in three separate laboratory exercises: (1) identification of transgenic zebrafish hematopoietic cells, (2) analysis of transfection efficiency, and (3) analysis of cytokine production upon lipopolysaccharide stimulation. Student learning outcomes were achieved as demonstrated by multiple means of assessment, including three laboratory reports, a data analysis laboratory practicum, and a cumulative final exam. Further, anonymous student self-assessment revealed increased student confidence in the knowledge and skill sets defined in the learning outcomes. Copyright © 2014 The International Union of Biochemistry and Molecular Biology.
Collaborative and Cooperative Learning Techniques. Learning Package No. 6.
ERIC Educational Resources Information Center
Compton, Joe; Smith, Carl, Comp.
Originally developed for the Department of Defense Schools (DoDDS) system, this learning package on collaborative and cooperative learning techniques is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes a comprehensive search of the ERIC database; a lecture giving an overview on the topic;…
Interpreting Medical Information Using Machine Learning and Individual Conditional Expectation.
Nohara, Yasunobu; Wakata, Yoshifumi; Nakashima, Naoki
2015-01-01
Recently, machine-learning techniques have spread to many fields. However, machine learning is still not popular in the medical research field because its models are difficult to interpret. In this paper, we introduce a method for interpreting medical information using machine-learning techniques. The method gives a new explanation of the partial dependence plot and the individual conditional expectation plot from the perspective of medical research.
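To make the interpretation approach concrete, the sketch below shows how partial dependence and individual conditional expectation (ICE) curves can be drawn with scikit-learn's inspection tools; the synthetic dataset, the gradient-boosting model, and the chosen feature indices are illustrative assumptions, not the setup used in the paper.

```python
# Minimal sketch: partial dependence and ICE curves with scikit-learn.
# The synthetic data, model choice, and feature indices are illustrative only.
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import PartialDependenceDisplay

# Stand-in for tabular medical data: 500 patients, 6 numeric features.
X, y = make_classification(n_samples=500, n_features=6, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# kind="both" overlays individual conditional expectation (ICE) curves
# on top of the averaged partial dependence curve for features 0 and 3.
PartialDependenceDisplay.from_estimator(model, X, features=[0, 3], kind="both")
plt.show()
```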
ERIC Educational Resources Information Center
Santicola, Craig F.
2015-01-01
The literature indicates that there is a lack of learning outcomes in economics that can be attributed to the reliance on traditional lecture and the failure to adopt innovative instructional techniques. This study sought to investigate the student learning effects of academic controversy, a cooperative learning technique that shows promise in the…
Incorporating Active Learning Techniques into a Genetics Class
ERIC Educational Resources Information Center
Lee, W. Theodore; Jabot, Michael E.
2011-01-01
We revised a sophomore-level genetics class to more actively engage the students in their learning. The students worked in groups on quizzes using the Immediate Feedback Assessment Technique (IF-AT) and active-learning projects. The IF-AT quizzes allowed students to discuss key concepts in small groups and learn the correct answers in class. The…
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. National Center for Research in Vocational Education.
One of a series of performance-based teacher education learning packages focusing upon specific professional competencies of vocational teachers, this learning module deals with employing simulation techniques. It consists of an introduction and four learning experiences. Covered in the first learning experience are various types of simulation…
Guidelines for application of learning/cost improvement curves
NASA Technical Reports Server (NTRS)
Delionback, L. M.
1975-01-01
The differences between the terms learning curve and improvement curve are noted, as well as the differences between the Wright system and the Crawford system. Learning curve computational techniques are reviewed, along with a method to arrive at a composite learning curve for a system, given detail curves classified either by functional technique or simply categorized by subsystem. Techniques are discussed for determining the theoretical first unit (TFU) cost using several of the currently accepted methods; TFU cost is sometimes referred to simply as number-one cost. A tabular presentation of the various learning curve slope values is given. A discussion of the various trends in the application of learning/improvement curves and an outlook for the future are presented.
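As a reference point, the following sketch implements the textbook unit-cost (Crawford) and cumulative-average (Wright) relations with exponent b = ln(slope)/ln(2); the numbers and function names are illustrative and not taken from the report.

```python
# Sketch of the textbook learning-curve relations (not specific to this report).
# With slope s (e.g. 0.85 for an 85% curve), the exponent is b = ln(s) / ln(2).
# Crawford (unit) system:   cost of the n-th unit            = TFU * n**b
# Wright (cum. average):    average cost of the first n units = TFU * n**b
import math

def learning_exponent(slope: float) -> float:
    """Exponent b for a given learning-curve slope (0 < slope <= 1)."""
    return math.log(slope) / math.log(2)

def crawford_unit_cost(tfu: float, n: int, slope: float) -> float:
    """Cost of unit n under the Crawford (unit-cost) system."""
    return tfu * n ** learning_exponent(slope)

def wright_total_cost(tfu: float, n: int, slope: float) -> float:
    """Total cost of the first n units under the Wright (cumulative-average) system."""
    return tfu * n ** learning_exponent(slope) * n  # average cost times n units

if __name__ == "__main__":
    tfu, slope = 1000.0, 0.85  # theoretical first unit cost, 85% curve
    print(crawford_unit_cost(tfu, 8, slope))  # unit 8 costs roughly 614
    print(wright_total_cost(tfu, 8, slope))   # first 8 units total roughly 4914
```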
Not another boring lecture: engaging learners with active learning techniques.
Wolff, Margaret; Wagner, Mary Jo; Poznanski, Stacey; Schiller, Jocelyn; Santen, Sally
2015-01-01
Core content in Emergency Medicine Residency Programs is traditionally covered in didactic sessions, despite evidence suggesting that learners do not retain a significant portion of what is taught during lectures. We describe techniques that medical educators can use when leading teaching sessions to foster engagement and encourage self-directed learning, based on current literature and evidence about learning. When these techniques are incorporated, sessions can be effective in delivering core knowledge, contextualizing content, and explaining difficult concepts, leading to increased learning. Copyright © 2015 Elsevier Inc. All rights reserved.
Imbalanced Learning for Functional State Assessment
NASA Technical Reports Server (NTRS)
Li, Feng; McKenzie, Frederick; Li, Jiang; Zhang, Guangfan; Xu, Roger; Richey, Carl; Schnell, Tom
2011-01-01
This paper presents results of several imbalanced learning techniques applied to operator functional state assessment, where the data are highly imbalanced; i.e., some functional states (majority classes) have many more training samples than other states (minority classes). Conventional machine learning techniques usually tend to classify all data samples into the majority classes and perform poorly on the minority classes. In this study, we implemented five imbalanced learning techniques, including random under-sampling, random over-sampling, the synthetic minority over-sampling technique (SMOTE), borderline-SMOTE, and adaptive synthetic sampling (ADASYN), to address this problem. Experimental results on a benchmark driving test dataset show that accuracies for minority classes could be improved dramatically at the cost of slight performance degradation for the majority classes.
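For readers unfamiliar with these resampling methods, here is a minimal sketch of one of them (SMOTE) using the imbalanced-learn package; the synthetic class imbalance stands in for the operator functional state data, which are not reproduced here.

```python
# Minimal sketch of SMOTE oversampling with imbalanced-learn; the synthetic
# data stand in for operator functional state features and are illustrative.
from collections import Counter
from sklearn.datasets import make_classification
from imblearn.over_sampling import SMOTE

# Roughly 5% minority class, mimicking a highly imbalanced functional-state label.
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.95, 0.05], random_state=0)
print("before:", Counter(y))

# SMOTE synthesizes new minority samples by interpolating between neighbours.
X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
print("after: ", Counter(y_res))
```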
Learned-Helplessness Theory: Implications for Research in Learning Disabilities.
ERIC Educational Resources Information Center
Canino, Frank J.
1981-01-01
The application of learned helplessness theory to achievement is discussed within the context of implications for research in learning disabilities. Finally, the similarities between helpless children and learning disabled students in terms of problem solving and attention are discussed. (Author)
Theorists and Techniques: Connecting Education Theories to Lamaze Teaching Techniques
Podgurski, Mary Jo
2016-01-01
ABSTRACT Should childbirth educators connect education theory to technique? Is there more to learning about theorists than memorizing facts for an assessment? Are childbirth educators uniquely poised to glean wisdom from theorists and enhance their classes with interactive techniques inspiring participant knowledge and empowerment? Yes, yes, and yes. This article will explore how an awareness of education theory can enhance retention of material through interactive learning techniques. Lamaze International childbirth classes already prepare participants for the childbearing year by using positive group dynamics; theory will empower childbirth educators to address education through well-studied avenues. Childbirth educators can provide evidence-based learning techniques in their classes and create true behavioral change. PMID:26848246
Classroom Research in Accounting: Assessing for Learning.
ERIC Educational Resources Information Center
Cottell, Philip G., Jr.
1991-01-01
The use of several college classroom assessment techniques to evaluate the processes and products of accounting instruction through cooperative learning is described. The discussion looks at considerations in planning classroom assessment, choosing initial assessment techniques and adapting them, and blending cooperative learning structures with…
Integrating Research, Teaching and Learning: Preparing the Future National STEM Faculty
NASA Astrophysics Data System (ADS)
Hooper, E. J.; Pfund, C.; Mathieu, R.
2010-08-01
A network of universities (Howard, Michigan State, Texas A&M, University of Colorado at Boulder, University of Wisconsin-Madison, Vanderbilt) have created a National Science Foundation-funded network to prepare a future national STEM (science, technology, engineering, mathematics) faculty committed to learning, implementing, and advancing teaching techniques that are effective for the wide range of students enrolled in higher education. The Center for the Integration of Research, Teaching and Learning (CIRTL; http://www.cirtl.net) develops, implements and evaluates professional development programs for future and current faculty. The programs comprise graduate courses, internships, and workshops, all integrated within campus learning communities. These elements are unified and guided by adherence to three core principles, or pillars: "Teaching as Research," whereby research skills are applied to evaluating and advancing undergraduate learning; "Learning through Diversity," in which the diversity of students' backgrounds and experiences are used as a rich resource to enhance teaching and learning; and "Learning Communities" that foster shared learning and discovery among students, and between future and current faculty within a department or institution. CIRTL established a laboratory for testing its ideas and practices at the University of Wisconsin-Madison, known as the Delta Program in Research, Teaching and Learning (http://www.delta.wisc.edu). The program offers project-based graduate courses, research mentor training, and workshops for post-docs, staff, and faculty. In addition, graduate students and post-docs can partner with a faculty member in a teaching-as-research internship to define and tackle a specific teaching and learning problem. Finally, students can obtain a Delta Certificate as testimony to their engagement in and commitment to teaching and learning. Delta has proved very successful, having served over 1500 UW-Madison instructors from graduate students to full professors. UW-Madison values the program to the point of now funding it internally.
DiBartolomeis, Susan M.
2011-01-01
Several reports on science education suggest that students at all levels learn better if they are immersed in a project that is long term, yielding results that require analysis and interpretation. I describe a 12-wk laboratory project suitable for upper-level undergraduates and first-year graduate students, in which the students molecularly locate and map a gene from Drosophila melanogaster called dusky and one of dusky's mutant alleles. The mapping strategy uses restriction fragment length polymorphism analysis; hence, students perform most of the basic techniques of molecular biology (DNA isolation, restriction enzyme digestion and mapping, plasmid vector subcloning, agarose and polyacrylamide gel electrophoresis, DNA labeling, and Southern hybridization) toward the single goal of characterizing dusky and the mutant allele dusky73. Students work as individuals, pairs, or in groups of up to four students. Some exercises require multitasking and collaboration between groups. Finally, results from everyone in the class are required for the final analysis. Results of pre- and postquizzes and surveys indicate that student knowledge of appropriate topics and skills increased significantly, students felt more confident in the laboratory, and students found the laboratory project interesting and challenging. Former students report that the lab was useful in their careers. PMID:21364104
Locality constrained joint dynamic sparse representation for local matching based face recognition.
Wang, Jianzhong; Yi, Yugen; Zhou, Wei; Shi, Yanjiao; Qi, Miao; Zhang, Ming; Zhang, Baoxue; Kong, Jun
2014-01-01
Recently, Sparse Representation-based Classification (SRC) has attracted a lot of attention for its applications to various tasks, especially in biometric techniques such as face recognition. However, factors such as lighting, expression, pose and disguise variations in face images will decrease the performances of SRC and most other face recognition techniques. In order to overcome these limitations, we propose a robust face recognition method named Locality Constrained Joint Dynamic Sparse Representation-based Classification (LCJDSRC) in this paper. In our method, a face image is first partitioned into several smaller sub-images. Then, these sub-images are sparsely represented using the proposed locality constrained joint dynamic sparse representation algorithm. Finally, the representation results for all sub-images are aggregated to obtain the final recognition result. Compared with other algorithms which process each sub-image of a face image independently, the proposed algorithm regards the local matching-based face recognition as a multi-task learning problem. Thus, the latent relationships among the sub-images from the same face image are taken into account. Meanwhile, the locality information of the data is also considered in our algorithm. We evaluate our algorithm by comparing it with other state-of-the-art approaches. Extensive experiments on four benchmark face databases (ORL, Extended YaleB, AR and LFW) demonstrate the effectiveness of LCJDSRC.
"Celebration of the Neurons": The Application of Brain Based Learning in Classroom Environment
ERIC Educational Resources Information Center
Duman, Bilal
2007-01-01
The purpose of this study is to investigate approaches and techniques related to how brain-based learning is used in the classroom atmosphere. This general purpose was addressed through the following questions: (1) What is the aim of brain-based learning? (2) What are the general approaches and techniques that brain-based learning uses? and (3) How should be used…
ERIC Educational Resources Information Center
Shea, Mary Lou; And Others
This learning module, which is part of a staff development program for health occupations clinical instructors, discusses various creative teaching techniques that can be used in teaching students to find information, use opportunities to learn, assume responsibility for self-learning, solve problems, apply skills learned to new situations,…
Precision Learning Assessment: An Alternative to Traditional Assessment Techniques.
ERIC Educational Resources Information Center
Caltagirone, Paul J.; Glover, Christopher E.
1985-01-01
A continuous and curriculum-based assessment method, Precision Learning Assessment (PLA), which integrates precision teaching and norm-referenced techniques, was applied to a math computation curriculum for 214 third graders. The resulting districtwide learning curves defining average annual progress through the computation curriculum provided…
Towards a Quality Assessment Method for Learning Preference Profiles in Negotiation
NASA Astrophysics Data System (ADS)
Hindriks, Koen V.; Tykhonov, Dmytro
In automated negotiation, information gained about an opponent's preference profile by means of learning techniques may significantly improve an agent's negotiation performance. It is therefore useful to gain a better understanding of how various negotiation factors influence the quality of learning. The quality of learning techniques in negotiation is typically assessed indirectly, by comparing the utility levels of agreed outcomes and other more global negotiation parameters. An evaluation of learning based on such general criteria, however, does not provide any insight into the influence of various aspects of negotiation on the quality of the learned model itself. The quality may depend on such aspects as the domain of negotiation, the structure of the preference profiles, the negotiation strategies used by the parties, and others. To gain a better understanding of the performance of proposed learning techniques in the context of negotiation, and to be able to assess the potential to improve such techniques, a more systematic assessment method is needed. In this paper we propose such a systematic method to analyse the quality of the information gained about opponent preferences by learning in single-instance negotiations. The method includes measures to assess the quality of a learned preference profile and proposes an experimental setup to analyse the influence of various negotiation aspects on the quality of learning. We apply the method to a Bayesian learning approach for learning an opponent's preference profile and discuss our findings.
Deep-Learning Convolutional Neural Networks Accurately Classify Genetic Mutations in Gliomas.
Chang, P; Grinband, J; Weinberg, B D; Bardis, M; Khy, M; Cadena, G; Su, M-Y; Cha, S; Filippi, C G; Bota, D; Baldi, P; Poisson, L M; Jain, R; Chow, D
2018-05-10
The World Health Organization has recently placed new emphasis on the integration of genetic information for gliomas. While tissue sampling remains the criterion standard, noninvasive imaging techniques may provide complementary insight into clinically relevant genetic mutations. Our aim was to train a convolutional neural network to independently predict underlying molecular genetic mutation status in gliomas with high accuracy and identify the most predictive imaging features for each mutation. MR imaging data and molecular information were retrospectively obtained from The Cancer Imaging Archives for 259 patients with either low- or high-grade gliomas. A convolutional neural network was trained to classify isocitrate dehydrogenase 1 (IDH1) mutation status, 1p/19q codeletion, and O6-methylguanine-DNA methyltransferase (MGMT) promoter methylation status. Principal component analysis of the final convolutional neural network layer was used to extract the key imaging features critical for successful classification. Classification had high accuracy: IDH1 mutation status, 94%; 1p/19q codeletion, 92%; and MGMT promoter methylation status, 83%. Each genetic category was also associated with distinctive imaging features such as definition of tumor margins, T1 and FLAIR suppression, extent of edema, extent of necrosis, and textural features. Our results indicate that for The Cancer Imaging Archives dataset, machine-learning approaches allow classification of individual genetic mutations of both low- and high-grade gliomas. We show that relevant MR imaging features acquired from an added dimensionality-reduction technique demonstrate that neural networks are capable of learning key imaging components without prior feature selection or human-directed training. © 2018 by American Journal of Neuroradiology.
Application of machine learning for the evaluation of turfgrass plots using aerial images
NASA Astrophysics Data System (ADS)
Ding, Ke; Raheja, Amar; Bhandari, Subodh; Green, Robert L.
2016-05-01
Historically, investigation of turfgrass characteristics has been limited to visual ratings. Although relevant information may result from such evaluations, final inferences may be questionable because of the subjective nature in which the data are collected. Recent advances in computer vision techniques allow researchers to objectively measure turfgrass characteristics such as percent ground cover, turf color, and turf quality from digital images. This paper focuses on developing a methodology for automated assessment of turfgrass quality from aerial images. Images of several turfgrass plots of varying quality were gathered using a camera mounted on an unmanned aerial vehicle. The quality of these plots was also evaluated based on visual ratings. The goal was to use the aerial images to generate quality evaluations on a regular basis for the optimization of water treatment. The aerial images are used to train a neural network, a nonlinear classifier commonly used in machine learning, so that appropriate features such as intensity, color, and texture of the turfgrass are extracted from these images. The output of the trained neural network model is the rating of the grass, which is compared to the visual ratings. Currently, the quality and the color of turfgrass, measured as the greenness of the grass, are evaluated. The textures are calculated using the Gabor filter and the co-occurrence matrix. Other classifiers such as support vector machines, and simpler linear regression models such as Ridge regression and LARS regression, are also used. The performance of each model is compared. The results show encouraging potential for using machine learning techniques for the evaluation of turfgrass quality and color.
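A simplified sketch of this kind of pipeline is shown below: gray-level co-occurrence texture features and mean greenness computed with scikit-image feed a Ridge regression of quality ratings. The images and ratings are random placeholders, the feature set is a reduced stand-in for the Gabor and co-occurrence features described, and the function names assume a recent scikit-image release.

```python
# Simplified sketch: GLCM texture features plus mean greenness feeding a
# Ridge regression of turfgrass quality ratings. Images and ratings are
# random placeholders, not data from the study.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

def plot_features(rgb):
    """Mean greenness plus GLCM contrast/homogeneity from the green channel."""
    green = rgb[..., 1]
    glcm = graycomatrix(green, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [green.mean(),
            graycoprops(glcm, "contrast")[0, 0],
            graycoprops(glcm, "homogeneity")[0, 0]]

# 40 fake 64x64 aerial plot images with fake visual ratings (1-9 scale).
images = rng.integers(0, 256, size=(40, 64, 64, 3), dtype=np.uint8)
ratings = rng.uniform(1, 9, size=40)

X = np.array([plot_features(img) for img in images])
model = Ridge(alpha=1.0).fit(X, ratings)
print("predicted rating of first plot:", model.predict(X[:1])[0])
```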
PSP Measurement of Stator Vane Surface Pressures in a High Speed Fan
NASA Technical Reports Server (NTRS)
Lepicovsky, Jan
1998-01-01
This paper presents measurements of static pressures on the stator vane suction side of a high-speed single stage fan using the technique of pressure sensitive paint (PSP). The paper illustrates development in application of the relatively new experimental technique to the complex environment of internal flows in turbomachines. First, there is a short explanation of the physics of the PSP technique and a discussion of calibration methods for pressure sensitive paint in the turbomachinery environment. A description of the image conversion process follows. The recorded image of the stator vane pressure field is skewed due to the limited optical access and must be converted to the meridional plane projection for comparison with analytical predictions. The experimental results for seven operating conditions along an off-design rotational speed line are shown in a concise form, including performance map points, midspan static tap pressure distributions, and vane suction side pressure fields. Then, a comparison between static tap and pressure sensitive paint data is discussed. Finally, the paper lists shortcomings of the pressure sensitive paint technology and lessons learned in this high-speed fan application.
Camargo, Lucila Basto; Raggio, Daniela Prócida; Bonacina, Carlos Felipe; Wen, Chao Lung; Mendes, Fausto Medeiros; Bönecker, Marcelo José Strazzeri; Haddad, Ana Estela
2014-07-17
The aim of this study was to evaluate an e-learning strategy in teaching Atraumatic Restorative Treatment (ART) to undergraduate and graduate students. The sample comprised 76 participants: 38 dental students and 38 pediatric dentistry students in a specialization course. To evaluate knowledge improvement, participants were subjected to a test performed before and after the course. A single researcher corrected the tests and intraexaminer reproducibility was calculated (ICC = 0.991; 95% CI = 0.975-0.996). All students improved their performances after the e-learning course (paired t-tests, p < 0.001). The means of undergraduate students were 4.7 (initial) and 6.4 (final) and those of graduate students were 6.8 (initial) and 8.2 (final). The comparison of the final evaluation means showed a statistically significant difference (t-test, p < 0.0001). The e-learning strategy has the potential of improving students' knowledge of ART. Mature students perform better in this teaching modality when it is applied exclusively via distance learning.
Prediction of drug synergy in cancer using ensemble-based machine learning techniques
NASA Astrophysics Data System (ADS)
Singh, Harpreet; Rana, Prashant Singh; Singh, Urvinder
2018-04-01
Drug synergy prediction plays a significant role in the medical field for inhibiting specific cancer agents. It can be developed as a pre-processing tool for therapeutic successes. Examination of different drug-drug interactions can be done using the drug synergy score. This requires efficient regression-based machine learning approaches to minimize prediction errors. Numerous machine learning techniques such as neural networks, support vector machines, random forests, LASSO, Elastic Nets, etc., have been used in the past to meet this requirement. However, these techniques individually do not provide significant accuracy in drug synergy score prediction. Therefore, the primary objective of this paper is to design a neuro-fuzzy-based ensembling approach. To achieve this, nine well-known machine learning techniques have been implemented on the drug synergy data. Based on the accuracy of each model, the four techniques with the highest accuracy are selected to develop the ensemble-based machine learning model. These models are Random Forest, Fuzzy Rules Using Genetic Cooperative-Competitive Learning (GFS.GCCL), the Adaptive-Network-Based Fuzzy Inference System (ANFIS), and the Dynamic Evolving Neural-Fuzzy Inference System (DENFIS). Ensembling is achieved by evaluating the biased weighted aggregation (i.e. adding more weight to models with higher prediction scores) of the data predicted by the selected models. The proposed and existing machine learning techniques have been evaluated on drug synergy score data. The comparative analysis reveals that the proposed method outperforms the others in terms of accuracy, root mean square error, and coefficient of correlation.
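The biased weighted aggregation idea can be illustrated with a short sketch in which each base regressor's prediction is weighted by its validation score; generic scikit-learn regressors stand in for the fuzzy and neuro-fuzzy models named above, and the synthetic data are not drug synergy scores.

```python
# Minimal sketch of biased weighted aggregation: each base regressor's
# prediction is weighted by its validation R^2 score. Generic regressors
# stand in for the fuzzy/neuro-fuzzy models named in the abstract; the
# synthetic data are placeholders for drug synergy scores.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

X, y = make_regression(n_samples=600, n_features=12, noise=10.0, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

models = [RandomForestRegressor(random_state=0),
          MLPRegressor(max_iter=2000, random_state=0),
          SVR()]

# Weight each model by its validation score (higher score -> more weight).
scores = np.array([m.fit(X_tr, y_tr).score(X_val, y_val) for m in models])
weights = np.clip(scores, 0, None)
weights = weights / weights.sum()

preds = np.column_stack([m.predict(X_val) for m in models])
ensemble_pred = preds @ weights
print("ensemble RMSE:", np.sqrt(np.mean((ensemble_pred - y_val) ** 2)))
```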
Accurate Identification of Cancerlectins through Hybrid Machine Learning Technology.
Zhang, Jieru; Ju, Ying; Lu, Huijuan; Xuan, Ping; Zou, Quan
2016-01-01
Cancerlectins are cancer-related proteins that function as lectins. They have been identified through computational identification techniques, but these techniques have sometimes failed to identify proteins because of sequence diversity among the cancerlectins. Advanced machine learning identification methods, such as support vector machine and basic sequence features (n-gram), have also been used to identify cancerlectins. In this study, various protein fingerprint features and advanced classifiers, including ensemble learning techniques, were utilized to identify this group of proteins. We improved the prediction accuracy of the original feature extraction methods and classification algorithms by more than 10% on average. Our work provides a basis for the computational identification of cancerlectins and reveals the power of hybrid machine learning techniques in computational proteomics.
Machine learning modelling for predicting soil liquefaction susceptibility
NASA Astrophysics Data System (ADS)
Samui, P.; Sitharam, T. G.
2011-01-01
This study describes two machine learning techniques applied to predict the liquefaction susceptibility of soil based on standard penetration test (SPT) data from the 1999 Chi-Chi, Taiwan earthquake. The first technique uses an Artificial Neural Network (ANN) based on multi-layer perceptrons (MLP) trained with the Levenberg-Marquardt backpropagation algorithm. The second technique uses the Support Vector Machine (SVM), a classification technique firmly grounded in statistical learning theory. The ANN and SVM models have been developed to predict liquefaction susceptibility using the corrected SPT blow count [(N1)60] and the cyclic stress ratio (CSR). Further, an attempt has been made to simplify the models, requiring only two parameters [(N1)60 and peak ground acceleration (amax/g)], for the prediction of liquefaction susceptibility. The developed ANN and SVM models have also been applied to different case histories available globally. The paper also highlights the capability of the SVM over the ANN models.
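A compact sketch of the simplified two-input models is given below, with an MLP and an SVM fit to invented (N1)60 and amax/g records; note that scikit-learn's MLP does not offer Levenberg-Marquardt training, so this is an approximation of the described setup rather than a reproduction of it.

```python
# Compact sketch: MLP and SVM classifiers fit to the two simplified inputs
# mentioned in the abstract, (N1)60 and amax/g. The handful of training
# records below are invented for illustration, not the Chi-Chi SPT data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# columns: corrected SPT blow count (N1)60, peak ground acceleration amax/g
X = np.array([[5, 0.40], [8, 0.35], [12, 0.30], [15, 0.25],
              [20, 0.20], [25, 0.35], [30, 0.15], [10, 0.45]])
y = np.array([1, 1, 1, 0, 0, 0, 0, 1])  # 1 = liquefied, 0 = not liquefied

ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=5000,
                    random_state=0).fit(X, y)
svm = SVC(kernel="rbf", probability=True, random_state=0).fit(X, y)

site = np.array([[11, 0.38]])  # a new site to assess
print("ANN liquefaction probability:", ann.predict_proba(site)[0, 1])
print("SVM liquefaction probability:", svm.predict_proba(site)[0, 1])
```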
Cavallini, Gian Maria; Verdina, Tommaso; De Maria, Michele; Fornasari, Elisa; Volpini, Elisa; Campi, Luca
2017-11-29
To describe the intraoperative complications and the learning curve of microincision cataract surgery assisted by femtosecond laser (FLACS) with bimanual technique performed by an experienced surgeon. It is a prospective, observational, comparative case series. A total of 120 eyes which underwent bimanual FLACS by the same experienced surgeon during his first experience were included in the study; we considered the first 60 cases as Group A and the second 60 cases as Group B. In both groups, only nuclear sclerosis of grade 2 or 3 was included; an intraocular lens was implanted through a 1.4-mm incision. Best-corrected visual acuity (BCVA), surgically induced astigmatism (SIA), central corneal thickness and endothelial cell loss (ECL) were evaluated before and at 1 and 3 months after surgery. Intraoperative parameters, and intra- and post-operative complications were recorded. In Group A, we had femtosecond laser-related minor complications in 11 cases (18.3%) and post-operative complications in 2 cases (3.3%); in Group B, we recorded 2 cases (3.3%) of femtosecond laser-related minor complications with no post-operative complications. Mean effective phaco time (EPT) was 5.32 ± 3.68 s in Group A and 4.34 ± 2.39 s in Group B with a significant difference (p = 0.046). We recorded a significant mean BCVA improvement at 3 months in both groups (p < 0.05) and no significant SIA nor corneal pachymetry changes in the two groups during the follow-up (p > 0.05). Finally, we found significant ECL in both groups with a significant difference between the two groups (p = 0.042). FLACS with bimanual technique and low-energy LDV Z8 is associated with a necessary initial learning curve. After the first adjustments in the surgical technique, this technology seems to be safe and effective with rapid visual recovery and it helps surgeons to standardize the crucial steps of cataract surgery.
ERIC Educational Resources Information Center
Pranoto, Hadi; Atieka, Nurul; Wihardjo, Sihadi Darmo; Wibowo, Agus; Nurlaila, Siti; Sudarmaji
2016-01-01
This study aims at: determining students' motivation before being given group guidance with the self-regulation technique, determining students' motivation after being given group counseling with the self-regulation technique, generating a model of group counseling with the self-regulation technique to improve motivation of learning, determining the…
NASA Astrophysics Data System (ADS)
Kriswintari, D.; Yuanita, L.; Widodo, W.
2018-04-01
The aim of this study was to develop a chemistry learning package using the Student Teams Achievement Division (STAD) cooperative learning technique to foster students’ thinking skills and social attitudes. The chemistry learning package, consisting of a lesson plan, handout, students’ worksheet, thinking skill test, and observation sheet of social attitude, was developed using the Dick and Carey model. The research subject of this study was a chemistry learning package using STAD, which was tried out on tenth-grade students of SMA Trimurti Surabaya. The tryout was conducted using a one-group pre-test post-test design. Data were collected through observation, tests, and questionnaires. The obtained data were analyzed using descriptive qualitative analysis. The findings of this study revealed that the developed chemistry learning package using the STAD cooperative learning technique was valid, practical, and effective to be implemented in the classroom to foster students’ thinking skills and social attitudes.
Chen, Zhiru; Hong, Wenxue
2016-02-01
Considering the low accuracy of prediction for positive samples and the poor overall classification caused by the unbalanced sample data of MicroRNA (miRNA) targets, we propose in this paper a support vector machine-integration of under-sampling and weight (SVM-IUSW) algorithm, an under-sampling method based on ensemble learning. The algorithm adopts SVM as the learning algorithm and AdaBoost as the integration framework, and embeds clustering-based under-sampling into the iterative process, aiming at reducing the degree of unbalanced distribution of positive and negative samples. Meanwhile, in the process of adaptive weight adjustment of the samples, the SVM-IUSW algorithm eliminates abnormal negative samples with a robust sample-weight smoothing mechanism so as to avoid over-learning. Finally, the prediction of the integrated miRNA target classifier is achieved by combining multiple weak classifiers through a voting mechanism. The experiments revealed that SVM-IUSW, compared with other algorithms on unbalanced dataset collections, could not only improve the accuracy for positive targets and the overall classification effect, but also enhance the generalization ability of the miRNA target classifier.
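A rough sketch of two of these ingredients, clustering-based under-sampling of the majority class and an AdaBoost ensemble with SVM base learners, is given below; it omits the sample-weight smoothing mechanism, uses synthetic data in place of miRNA target features, and assumes a recent scikit-learn release for the estimator parameter name.

```python
# Rough sketch of two ingredients described in the abstract: clustering-based
# under-sampling of the majority (negative) class and an AdaBoost ensemble of
# SVM base learners. The sample-weight smoothing step of SVM-IUSW is omitted,
# and the data below are synthetic placeholders for miRNA target features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=1500, n_features=20,
                           weights=[0.9, 0.1], random_state=0)

# Cluster the majority class and keep the sample nearest each centroid,
# shrinking the majority class to roughly the size of the minority class.
X_maj, X_min = X[y == 0], X[y == 1]
k = len(X_min)
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X_maj)
keep = [np.argmin(np.linalg.norm(X_maj - c, axis=1)) for c in km.cluster_centers_]
X_bal = np.vstack([X_maj[keep], X_min])
y_bal = np.array([0] * k + [1] * len(X_min))

clf = AdaBoostClassifier(estimator=SVC(probability=True),
                         n_estimators=10, random_state=0).fit(X_bal, y_bal)
print("training accuracy:", clf.score(X_bal, y_bal))
```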
Effects of Visual Feedback-Induced Variability on Motor Learning of Handrim Wheelchair Propulsion
Leving, Marika T.; Vegter, Riemer J. K.; Hartog, Johanneke; Lamoth, Claudine J. C.; de Groot, Sonja; van der Woude, Lucas H. V.
2015-01-01
Background It has been suggested that a higher intra-individual variability benefits the motor learning of wheelchair propulsion. The present study evaluated whether feedback-induced variability on wheelchair propulsion technique variables would also enhance the motor learning process. Learning was operationalized as an improvement in mechanical efficiency and propulsion technique, which are thought to be closely related during the learning process. Methods 17 Participants received visual feedback-based practice (feedback group) and 15 participants received regular practice (natural learning group). Both groups received equal practice dose of 80 min, over 3 weeks, at 0.24 W/kg at a treadmill speed of 1.11 m/s. To compare both groups the pre- and post-test were performed without feedback. The feedback group received real-time visual feedback on seven propulsion variables with instruction to manipulate the presented variable to achieve the highest possible variability (1st 4-min block) and optimize it in the prescribed direction (2nd 4-min block). To increase motor exploration the participants were unaware of the exact variable they received feedback on. Energy consumption and the propulsion technique variables with their respective coefficient of variation were calculated to evaluate the amount of intra-individual variability. Results The feedback group, which practiced with higher intra-individual variability, improved the propulsion technique between pre- and post-test to the same extent as the natural learning group. Mechanical efficiency improved between pre- and post-test in the natural learning group but remained unchanged in the feedback group. Conclusion These results suggest that feedback-induced variability inhibited the improvement in mechanical efficiency. Moreover, since both groups improved propulsion technique but only the natural learning group improved mechanical efficiency, it can be concluded that the improvement in mechanical efficiency and propulsion technique do not always appear simultaneously during the motor learning process. Their relationship is most likely modified by other factors such as the amount of the intra-individual variability. PMID:25992626
ERIC Educational Resources Information Center
Wei, Zheng
2015-01-01
The present research tested the effectiveness of the word part technique in comparison with the keyword method and self-strategy learning. One hundred and twenty-one Chinese year-one university students were randomly assigned to one of the three learning conditions: word part, keyword or self-strategy learning condition. Half of the target words…
Prediction of preterm deliveries from EHG signals using machine learning.
Fergus, Paul; Cheung, Pauline; Hussain, Abir; Al-Jumeily, Dhiya; Dobbins, Chelsea; Iram, Shamaila
2013-01-01
There has been some improvement in the treatment of preterm infants, which has helped to increase their chance of survival. However, the rate of premature births is still increasing globally. As a result, this group of infants is most at risk of developing severe medical conditions that can affect the respiratory, gastrointestinal, immune, central nervous, auditory and visual systems. In extreme cases, this can also lead to long-term conditions, such as cerebral palsy, mental retardation, learning difficulties, including poor health and growth. In the US alone, the societal and economic cost of preterm births, in 2005, was estimated to be $26.2 billion per annum. In the UK, this value was close to £2.95 billion in 2009. Many believe that a better understanding of why preterm births occur, and a strategic focus on prevention, will help to improve the health of children and reduce healthcare costs. At present, most methods of preterm birth prediction are subjective. However, a strong body of evidence suggests the analysis of uterine electrical signals (Electrohysterography) could provide a viable way of diagnosing true labour and predicting preterm deliveries. Most Electrohysterography studies focus on true labour detection during the final seven days before labour. The challenge is to utilise Electrohysterography techniques to predict preterm delivery earlier in the pregnancy. This paper explores this idea further and presents a supervised machine learning approach that classifies term and preterm records, using an open source dataset containing 300 records (38 preterm and 262 term). The synthetic minority oversampling technique is used to oversample the minority preterm class, and cross-validation techniques are used to evaluate the dataset against other similar studies. Our approach shows an improvement on existing studies with 96% sensitivity, 90% specificity, and a 95% area under the curve value with 8% global error using the polynomial classifier.
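A condensed sketch of the evaluation idea follows: SMOTE oversampling applied inside each cross-validation fold, with a polynomial-kernel SVM standing in for the polynomial classifier; the feature matrix is a random placeholder for EHG-derived features, so the reported AUC is meaningless except as a demonstration of the pipeline.

```python
# Condensed sketch: SMOTE oversampling of the preterm minority class inside
# each cross-validation fold, with a polynomial-kernel SVM standing in for
# "the polynomial classifier". The feature matrix is a random placeholder
# for EHG-derived features (38 preterm vs 262 term records).
import numpy as np
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))            # placeholder EHG features
y = np.array([1] * 38 + [0] * 262)       # 1 = preterm, 0 = term

pipe = Pipeline([("scale", StandardScaler()),
                 ("smote", SMOTE(random_state=0)),
                 ("clf", SVC(kernel="poly", degree=3, probability=True))])

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc = cross_val_score(pipe, X, y, cv=cv, scoring="roc_auc")
print("mean cross-validated AUC:", auc.mean())
```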
Campbell-Voytal, Kimberly; Daly, Jeanette M; Nagykaldi, Zsolt J; Aspy, Cheryl B; Dolor, Rowena J; Fagnan, Lyle J; Levy, Barcey T; Palac, Hannah L; Michaels, LeAnn; Patterson, V Beth; Kano, Miria; Smith, Paul D; Sussman, Andrew L; Williams, Robert; Sterling, Pamela; O'Beirne, Maeve; Neale, Anne Victoria
2015-12-01
Using peer learning strategies, seven experienced PBRNs working in collaborative teams articulated procedures for PBRN Research Good Practices (PRGPs). The PRGPs is a PBRN-specific resource to facilitate PBRN management and staff training, to promote adherence to study protocols, and to increase validity and generalizability of study findings. This paper describes the team science processes which culminated in the PRGPs. Skilled facilitators used team science strategies and methods from the Technology of Participation (ToP®), and the Consensus Workshop Method to support teams to codify diverse research expertise in practice-based research. The participatory nature of "sense-making" moved through identifiable stages. Lessons learned include (1) team input into the scope of the final outcome proved vital to project relevance; (2) PBRNs with diverse domains of research expertise contributed broad knowledge on each topic; and (3) ToP® structured facilitation techniques were critical for establishing trust and clarifying the "sense-making" process. © 2015 Wiley Periodicals, Inc.
He, Tung-Hsien; Chang, Shan-Mao; Chen, Shu-Hui Eileen; Gou, Wen Johnny
2012-02-01
This study applied structural equation modeling (SEM) techniques to define the relations among trichotomous goals (mastery goals, performance-approach goals, and performance-avoidance goals), self-efficacy, use of metacognitive self-regulation strategies, positive belief in seeking help, and help-avoidance behavior. Elementary school students (N = 105), who were learning English as a foreign language, were surveyed using five self-report scales. The structural equation model showed that self-efficacy led to the adoption of mastery goals but discouraged the adoption of performance-approach goals and performance-avoidance goals. Furthermore, mastery goals increased the use of metacognitive self-regulation strategies, whereas performance-approach goals and performance-avoidance goals reduced their use. Mastery goals encouraged positive belief in help-seeking, but performance-avoidance goals decreased such belief. Finally, performance-avoidance goals directly led to help-avoidance behavior, whereas positive belief assumed a critical role in reducing help-avoidance. The established structural equation model illuminated the potential causal relations among these variables for the young learners in this study.
Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization
Zhang, Chunyuan; Zhu, Qingxin; Niu, Xinzheng
2016-01-01
By combining with sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and obtain a better generalization ability. However, the previous kernel-based LSTD algorithms do not consider regularization and their sparsification processes are batch or offline, which hinder their widespread applications in online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning, (ii) L2 and L1 regularization, which can avoid overfitting and eliminate the influence of noise, (iii) recursive least squares, which can eliminate matrix-inversion operations and reduce computational complexity, (iv) a sliding-window approach, which can avoid caching all history samples and reduce the computational cost, and (v) the fixed-point subiteration and online pruning, which can make L1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms. PMID:27436996
Discriminative least squares regression for multiclass classification and feature selection.
Xiang, Shiming; Nie, Feiping; Meng, Gaofeng; Pan, Chunhong; Zhang, Changshui
2012-11-01
This paper presents a framework of discriminative least squares regression (LSR) for multiclass classification and feature selection. The core idea is to enlarge the distance between different classes under the conceptual framework of LSR. First, a technique called ε-dragging is introduced to force the regression targets of different classes moving along opposite directions such that the distances between classes can be enlarged. Then, the ε-draggings are integrated into the LSR model for multiclass classification. Our learning framework, referred to as discriminative LSR, has a compact model form, where there is no need to train two-class machines that are independent of each other. With its compact form, this model can be naturally extended for feature selection. This goal is achieved in terms of L2,1 norm of matrix, generating a sparse learning model for feature selection. The model for multiclass classification and its extension for feature selection are finally solved elegantly and efficiently. Experimental evaluation over a range of benchmark datasets indicates the validity of our method.
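The ε-dragging step can be sketched in a few lines of NumPy, alternating a ridge-regularized least-squares solve with an update of the nonnegative dragging matrix; initialization, the stopping rule, and the L2,1 feature-selection extension are simplified here, so this is an illustration of the idea rather than the authors' algorithm.

```python
# NumPy sketch of the epsilon-dragging idea described in the abstract:
# regression targets of different classes are dragged in opposite directions
# by a learned nonnegative matrix M, alternating with a ridge-regularized
# least-squares solve. Initialization, stopping rule, and the L2,1 feature-
# selection extension are simplified; this is not the authors' exact method.
import numpy as np

def dlsr_sketch(X, labels, lam=0.1, n_iter=20):
    n, d = X.shape
    classes = np.unique(labels)
    Y = np.zeros((n, len(classes)))
    Y[np.arange(n), np.searchsorted(classes, labels)] = 1.0
    B = np.where(Y > 0, 1.0, -1.0)          # dragging direction per entry
    M = np.zeros_like(Y)                     # nonnegative dragging amounts
    Xb = np.hstack([X, np.ones((n, 1))])     # absorb the bias into the design
    for _ in range(n_iter):
        T = Y + B * M                        # relaxed (dragged) targets
        W = np.linalg.solve(Xb.T @ Xb + lam * np.eye(d + 1), Xb.T @ T)
        M = np.maximum(B * (Xb @ W - Y), 0)  # update dragging amounts
    return W, classes

def dlsr_predict(W, classes, X):
    scores = np.hstack([X, np.ones((len(X), 1))]) @ W
    return classes[np.argmax(scores, axis=1)]

# Tiny illustration on random, well-separated blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(i * 3, 1, size=(30, 5)) for i in range(3)])
y = np.repeat([0, 1, 2], 30)
W, cls = dlsr_sketch(X, y)
print("training accuracy:", (dlsr_predict(W, cls, X) == y).mean())
```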
A Simple Deep Learning Method for Neuronal Spike Sorting
NASA Astrophysics Data System (ADS)
Yang, Kai; Wu, Haifeng; Zeng, Yu
2017-10-01
Spike sorting is one of the key techniques for understanding brain activity. With the development of modern electrophysiology technology, recent multi-electrode technologies have been able to record the activity of thousands of neuronal spikes simultaneously. Spike sorting in this case increases the computational complexity of conventional sorting algorithms. In this paper, we focus on how to reduce this complexity and introduce a deep learning algorithm, the principal component analysis network (PCANet), for spike sorting. The introduced method starts from a conventional model and establishes a Toeplitz matrix. From the column vectors in the matrix, we train a PCANet, from which eigenvalue vectors of the spikes can be extracted. Finally, a support vector machine (SVM) is used to sort the spikes. In experiments, we choose two groups of simulated data from publicly available databases and compare the introduced method with conventional methods. The results indicate that the introduced method indeed has lower complexity with the same sorting errors as the conventional methods.
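A reduced sketch of the final stages of such a pipeline is shown below, with plain PCA feature extraction over spike waveforms followed by an SVM sorter; the Toeplitz construction and the cascaded PCANet stages are omitted, and the waveforms are synthetic.

```python
# Reduced sketch of the pipeline's last stages: PCA feature extraction over
# spike waveforms followed by an SVM sorter. The Toeplitz construction and
# cascaded PCANet stages are omitted, and the waveforms are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 48)  # 48 samples per spike window

# Two synthetic spike templates plus noise stand in for recorded units.
templates = [np.exp(-((t - 0.3) ** 2) / 0.005),
             -np.exp(-((t - 0.5) ** 2) / 0.01)]
waveforms, labels = [], []
for unit, tpl in enumerate(templates):
    for _ in range(200):
        waveforms.append(tpl + rng.normal(0, 0.15, size=t.size))
        labels.append(unit)
X, y = np.array(waveforms), np.array(labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pca = PCA(n_components=3).fit(X_tr)  # eigen-features of the spike windows
clf = SVC(kernel="rbf").fit(pca.transform(X_tr), y_tr)
print("sorting accuracy:", clf.score(pca.transform(X_te), y_te))
```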
Imaging and machine learning techniques for diagnosis of Alzheimer's disease.
Mirzaei, Golrokh; Adeli, Anahita; Adeli, Hojjat
2016-12-01
Alzheimer's disease (AD) is a common health problem in elderly people. There has been considerable research toward the diagnosis and early detection of this disease in the past decade. The sensitivity of biomarkers and the accuracy of the detection techniques have been defined to be the key to an accurate diagnosis. This paper presents a state-of-the-art review of the research performed on the diagnosis of AD based on imaging and machine learning techniques. Different segmentation and machine learning techniques used for the diagnosis of AD are reviewed including thresholding, supervised and unsupervised learning, probabilistic techniques, Atlas-based approaches, and fusion of different image modalities. More recent and powerful classification techniques such as the enhanced probabilistic neural network of Ahmadlou and Adeli should be investigated with the goal of improving the diagnosis accuracy. A combination of different image modalities can help improve the diagnosis accuracy rate. Research is needed on the combination of modalities to discover multi-modal biomarkers.
Analyzing Engineering Design through the Lens of Computation
ERIC Educational Resources Information Center
Worsley, Marcelo; Blikstein, Paulo
2014-01-01
Learning analytics and educational data mining are introducing a number of new techniques and frameworks for studying learning. The scalability and complexity of these novel techniques has afforded new ways for enacting education research and has helped scholars gain new insights into human cognition and learning. Nonetheless, there remain some…
Flipping the Classroom: An Empirical Study Examining Student Learning
ERIC Educational Resources Information Center
Sparks, Roland J.
2013-01-01
Flipping the classroom is the latest reported teaching technique to improve student learning at all levels. Prior studies showed significant increases in learning by employing this technique. However, an examination of the previous studies indicates significant flaws in the testing procedure controls. Moreover, most studies were based on anecdotal…
Robert's Rules for Optimal Learning: Model Development, Field Testing, Implications!
ERIC Educational Resources Information Center
McGinty, Robert L.
The value of accelerated learning techniques developed by the national organization for Suggestive Accelerated Learning Techniques (SALT) was tested in a study using Administrative Policy students taking the capstone course in the Eastern Washington University School of Business. Educators have linked the brain and how it functions to various…
The GenTechnique Project: Developing an Open Environment for Learning Molecular Genetics.
ERIC Educational Resources Information Center
Calza, R. E.; Meade, J. T.
1998-01-01
The GenTechnique project at Washington State University uses a networked learning environment for molecular genetics learning. The project is developing courseware featuring animation, hyper-link controls, and interactive self-assessment exercises focusing on fundamental concepts. The first pilot course featured a Web-based module on DNA…
Adaptive Educational Software by Applying Reinforcement Learning
ERIC Educational Resources Information Center
Bennane, Abdellah
2013-01-01
The introduction of intelligence into teaching software is the object of this paper. In the software elaboration process, learning techniques are used in order to adapt the teaching software to the characteristics of the student. Generally, artificial intelligence techniques such as reinforcement learning and Bayesian networks are used in order to adapt…
Title III: Curricular Development for Secondary Learning Disabilities. Final Report.
ERIC Educational Resources Information Center
Goodman, Libby
Presented is the final report of a 2-year project to develop an examplary model classroom program and curriculum for the secondary learning disabled student. Section I consists of completed forms entitled Project Completion Report, Termination Report, and Equipment Inventory. Outlined in Section II is information on the following project…
ERIC Educational Resources Information Center
Murphy, Harry; Higgins, Eleanor
This final report describes the activities and accomplishments of a 3-year study on the compensatory effectiveness of three assistive technologies, optical character recognition, speech synthesis, and speech recognition, on postsecondary students (N=140) with learning disabilities. These technologies were investigated relative to: (1) immediate…
Learning with Portable Digital Devices in Australian Schools: 20 Years On!
ERIC Educational Resources Information Center
Newhouse, C. Paul
2014-01-01
Portable computing technologies such as laptops, tablets, smartphones, wireless networking, voice/stylus input, and plug and play peripheral devices, appear to offer the means of finally realising much of the long heralded vision for computers to support learning in schools. There is the possibility for the technology to finally become a…
Temperature, Pulse, and Respiration. Instructor's Packet. Learning Activity Package.
ERIC Educational Resources Information Center
Runge, Lillian
This instructor's packet accompanies the learning activity package (LAP) on temperature, pulse, and respiration. Contents included in the packet are a time sheet, suggested uses for the LAP, an instruction sheet, final LAP reviews, a final LAP review answer key, suggested activities, an additional resources list, and student completion cards to…
ERIC Educational Resources Information Center
Anderson-Inman, Lynne; Ditson, Mary
This final report describes activities and accomplishments of the four-year Computer-Based Study Strategies (CBSS) Outreach Project at the University of Oregon. This project disseminated information about using computer-based study strategies as an intervention for students with learning disabilities and provided teachers in participating outreach…
Evaluation of Learning Money Matters (LMM). Final Report
ERIC Educational Resources Information Center
Spielhofer, Thomas; Kerr, David; Gardiner, Clare
2009-01-01
This report presents the final findings of research carried out by the National Foundation for Educational Research (NFER), as part of an independent evaluation on behalf of pfeg, of the Learning Money Matters (LMM) initiative. LMM provides help, support and advice for secondary schools in delivering personal finance education (PFE) to their…
A hybrid integrated services digital network-internet protocol solution for resident education.
Erickson, Delnora; Greer, Lester; Belard, Arnaud; Tinnel, Brent; O'Connell, John
2010-05-01
The purpose of this study was to explore the effectiveness of incorporating Web-based application sharing of virtual medical simulation software within a multipoint video teleconference (VTC) as a training tool in graduate medical education. National Capital Consortium Radiation Oncology Residency Program resident and attending physicians participated in dosimetry teaching sessions held via VTC using Acrobat Connect application sharing. Residents at remote locations could take turns designing radiation treatments using standard three-dimensional planning software, whereas instructors gave immediate feedback and demonstrated proper techniques. Immediately after each dosimetry lesson, residents were asked to complete a survey that evaluated the effectiveness of the session. At the end of a 3-month trial of using Adobe Connect, residents completed a final survey that compared this teaching technology to the prior VTC-alone method. The mean difference from equality across all quality measures from the weekly survey was 0.8, where 0 indicated neither enhanced nor detracted from the learning experience and 1 indicated a minor enhancement in the learning experience. The mean difference from equality across all measures from the final survey comparing use of application sharing with VTC to VTC alone was 1.5, where 1 indicated slightly better and 2 indicated a somewhat better experience. The teaching efficacy of multipoint VTC is perceived by medical residents to be more effective when complemented by application-sharing software such as Adobe Acrobat Connect.
Optimal Sensor Management and Signal Processing for New EMI Systems
2010-09-01
adaptive techniques that would improve the speed of data collection and increase the mobility of a TEMTADS system. Although an active learning technique...data, SIG has simulated the active selection based on the data already collected at Camp SLO. In this setup, the active learning approach was constrained...to work only on a 5x5 grid (corresponding to twenty five transmitters and co-located receivers). The first technique assumes that active learning will
NASA Astrophysics Data System (ADS)
Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie
2017-12-01
In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats which compromise the security level of provisioned services. In this paper, the issue of control plane security is studied and two machine-learning-based control plane intrusion detection techniques are proposed for SDON with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based control plane intrusion detection techniques.
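As a rough illustration of this kind of feature-based intrusion detection, the sketch below trains a random forest on synthetic connection-request features. The feature set, the labelling rule, and the choice of classifier are assumptions for demonstration; the paper does not specify its learning algorithms or dataset, and the reported 83% accuracy is not reproduced here.

```python
# A minimal sketch (not the authors' implementation) of a machine-learning
# intrusion detector for connection requests in an SDON control plane.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 5000
# Synthetic connection-request features: bandwidth (Gb/s), route length (hops),
# holding time (s), request inter-arrival time (s). All names are assumptions.
X = np.column_stack([
    rng.uniform(1, 100, n),      # bandwidth
    rng.integers(1, 10, n),      # route length
    rng.exponential(300, n),     # holding time
    rng.exponential(5, n),       # inter-arrival time
])
# Toy labelling rule: flag unusually large, long-lived requests as intrusions.
y = ((X[:, 0] > 80) & (X[:, 2] > 400)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("detection accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```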
NASA Astrophysics Data System (ADS)
Utanto, Yuli; Widhanarto, Ghanis Putra; Maretta, Yoris Adi
2017-03-01
This study aims to develop a web-based portfolio model. The model was tested in experiments with respondents in the Department of Curriculum and Educational Technology, FIP Unnes, to reveal its effectiveness. Specifically, the objectives of this development research were: (1) to describe the process of implementing a web-based portfolio model; and (2) to assess the effectiveness of the web-based portfolio model for the final task, particularly in Web-Based Learning courses. This is development research in the sense of Borg and Gall (2008: 589), who state that "educational research and development (R & D) is a process used to develop and validate educational products". The research and development sequence began with exploration and conceptual studies, followed by testing and evaluation, and finally implementation. Data were analysed using simple descriptive analysis and an analysis of learning mastery, followed by prerequisite tests of normality and homogeneity before conducting a t-test. Based on the data analysis, it was concluded that: (1) a web-based portfolio model can be applied to the learning process in higher education; and (2) the model is effective: in the large-group (field) trial, 24 respondents (92.3%) reached learning mastery (a score of 60 or above). The overall conclusion of this study is that a web-based portfolio model is effective. As an implication of this development research, future researchers are expected to use the development guidelines from this study when developing the model for other subjects.
Kanthan, Rani; Senger, Jenna-Lynn
2011-01-01
The rapid advances of computer technologies have created a new e-learner generation of "Homo-zappien" students that think and learn differently. Digital gaming is an effective, fun, active, and encouraging way of learning, providing immediate feedback and measurable progress. Within the context of ongoing reforms in medical education, specially designed digital games, a form of active learning, are effective, complementary e-teaching/learning resources. To examine the effectiveness of the use of specially designed digital games for student satisfaction and for measurable academic improvement. One hundred fourteen students registered in first-year pathology Medicine 102 had 8 of 16 lecture sessions reviewed in specially designed content-relevant digital games. Performance scores on relevant content sessions were analyzed at midterm and final examinations. Seventy-one students who registered in second-year pathology Medicine 202 were exposed to the games only during the final examination, with the midterm examination serving as an internal matched-control group. Outcome measures included performance at midterm and final examinations. Paired 2-tailed t test statistics compared means. A satisfaction survey questionnaire of yes or no responses analyzed student engagement and their perceptions of digital game-based learning. Questions relevant to the game-play sessions had the highest success rate in both examinations among 114 first-year students. In the 71 second-year students, the examination scores at the end of the final examination were significantly higher than the scores on the midterm examination. Positive satisfaction survey responses noted increased student engagement, enhanced personal learning, and reduced student stress. Specially constructed digital game-based learning in undergraduate pathology courses showed improved academic performance as measured by examination test scores, with increased student satisfaction and engagement.
Green, Rodney A; Farchione, Davide; Hughes, Diane L; Chan, Siew-Pang
2014-01-01
Asynchronous online discussion forums are common in blended learning models and are popular with students. A previous report has suggested that participation in these forums may assist student learning in a gross anatomy subject but it was unclear as to whether more academically able students post more often or whether participation led to improved learning outcomes. This study used a path model to analyze the contribution of forum participation, previous academic ability, and student campus of enrolment to final marks in a multicampus gross anatomy course for physiotherapy students. The course has a substantial online learning management system (LMS) that incorporates asynchronous forums as a learning tool, particularly to answer learning objectives. Students were encouraged to post new threads and answer queries in threads started by others. The forums were moderated weekly by staff. Discussion forums were the most used feature of the LMS site with 31,920 hits. Forty-eight percent of the students posted at least once with 186 threads initiated by students and a total of 608 posts. The total number of posts made a significant direct contribution to final mark (P = 0.008) as did previous academic ability (P = 0.002). Although campus did not contribute to final mark, there was a trend for students at the campus where the course coordinator was situated to post more often than those at the other campus (P = 0.073). These results indicate that asynchronous online discussion forums can be an effective tool for improving student learning outcomes as evidenced by final marks in gross anatomy teaching. Copyright © 2013 American Association of Anatomists.
Use of a machine learning framework to predict substance use disorder treatment success
Acion, Laura; Kelmansky, Diana; van der Laan, Mark; Sahker, Ethan; Jones, DeShauna; Arndt, Stephan
2017-01-01
There are several methods for building prediction models. The wealth of currently available modeling techniques usually forces the researcher to judge, a priori, what will likely be the best method. Super learning (SL) is a methodology that facilitates this decision by combining all identified prediction algorithms pertinent for a particular prediction problem. SL generates a final model that is at least as good as any of the other models considered for predicting the outcome. The overarching aim of this work is to introduce SL to analysts and practitioners. This work compares the performance of logistic regression, penalized regression, random forests, deep learning neural networks, and SL to predict successful substance use disorders (SUD) treatment. A nationwide database including 99,013 SUD treatment patients was used. All algorithms were evaluated using the area under the receiver operating characteristic curve (AUC) in a test sample that was not included in the training sample used to fit the prediction models. AUC for the models ranged between 0.793 and 0.820. SL was superior to all but one of the algorithms compared. An explanation of SL steps is provided. SL is the first step in targeted learning, an analytic framework that yields double robust effect estimation and inference with fewer assumptions than the usual parametric methods. Different aspects of SL depending on the context, its function within the targeted learning framework, and the benefits of this methodology in the addiction field are discussed. PMID:28394905
Vicarious extinction learning during reconsolidation neutralizes fear memory.
Golkar, Armita; Tjaden, Cathelijn; Kindt, Merel
2017-05-01
Previous studies have suggested that fear memories can be updated when recalled, a process referred to as reconsolidation. Given the beneficial effects of model-based safety learning (i.e. vicarious extinction) in preventing the recovery of short-term fear memory, we examined whether consolidated long-term fear memories could be updated with safety learning accomplished through vicarious extinction learning initiated within the reconsolidation time-window. We assessed this in a final sample of 19 participants that underwent a three-day within-subject fear-conditioning design, using fear-potentiated startle as our primary index of fear learning. On day 1, two fear-relevant stimuli (reinforced CSs) were paired with shock (US) and a third stimulus served as a control (CS). On day 2, one of the two previously reinforced stimuli (the reminded CS) was presented once in order to reactivate the fear memory 10 min before vicarious extinction training was initiated for all CSs. The recovery of the fear memory was tested 24 h later. Vicarious extinction training conducted within the reconsolidation time window specifically prevented the recovery of the reactivated fear memory (p = 0.03), while leaving fear-potentiated startle responses to the non-reactivated cue intact (p = 0.62). These findings are relevant to both basic and clinical research, suggesting that a safe, non-invasive model-based exposure technique has the potential to enhance the efficiency and durability of anxiolytic therapies. Copyright © 2017 Elsevier Ltd. All rights reserved.
Introducing Seismic Tomography with Computational Modeling
NASA Astrophysics Data System (ADS)
Neves, R.; Neves, M. L.; Teodoro, V.
2011-12-01
Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students the possibility to improve their mathematical or programming knowledge and simultaneously focus on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain general environment for explorative and expressive modelling with the following main advantages: 1) an easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students the possibility to overcome their lack of advanced mathematical or programming knowledge and focus on the learning of seismological concepts and processes, taking advantage of basic scientific computation methods and tools.
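To give a feel for the kind of exercise such activities can build towards, the following toy script poses and solves a straight-ray travel-time tomography problem on a small grid. The grid size, ray geometry, and least-squares solution are illustrative assumptions and not the authors' Modellus material; with only horizontal and vertical rays the recovered anomaly is smeared along its row and column, which is itself a useful discussion point for students.

```python
# Toy straight-ray travel-time tomography: forward model t = L @ s, then a
# minimum-norm least-squares inversion. Entirely synthetic and schematic.
import numpy as np

rng = np.random.default_rng(1)
nx = ny = 8                                  # 8 x 8 grid of unit cells
true_slowness = np.full(nx * ny, 0.5)        # background slowness
true_slowness[3 * nx + 3] = 0.8              # a slow anomaly at row 3, column 3

# Path-length matrix L: one ray per row and one per column, crossing unit cells.
rays = []
for i in range(ny):                          # horizontal rays
    r = np.zeros(nx * ny); r[i * nx:(i + 1) * nx] = 1.0; rays.append(r)
for j in range(nx):                          # vertical rays
    c = np.zeros(nx * ny); c[j::nx] = 1.0; rays.append(c)
L = np.array(rays)

t = L @ true_slowness + rng.normal(0, 0.01, L.shape[0])   # noisy travel times
est, *_ = np.linalg.lstsq(L, t, rcond=None)               # minimum-norm solution
print(np.round(est.reshape(ny, nx), 2))
# With only two ray directions the inversion is under-determined: the anomaly
# cell shows the largest value but is smeared along its row and column.
```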
Use of a machine learning framework to predict substance use disorder treatment success.
Acion, Laura; Kelmansky, Diana; van der Laan, Mark; Sahker, Ethan; Jones, DeShauna; Arndt, Stephan
2017-01-01
There are several methods for building prediction models. The wealth of currently available modeling techniques usually forces the researcher to judge, a priori, what will likely be the best method. Super learning (SL) is a methodology that facilitates this decision by combining all identified prediction algorithms pertinent for a particular prediction problem. SL generates a final model that is at least as good as any of the other models considered for predicting the outcome. The overarching aim of this work is to introduce SL to analysts and practitioners. This work compares the performance of logistic regression, penalized regression, random forests, deep learning neural networks, and SL to predict successful substance use disorders (SUD) treatment. A nationwide database including 99,013 SUD treatment patients was used. All algorithms were evaluated using the area under the receiver operating characteristic curve (AUC) in a test sample that was not included in the training sample used to fit the prediction models. AUC for the models ranged between 0.793 and 0.820. SL was superior to all but one of the algorithms compared. An explanation of SL steps is provided. SL is the first step in targeted learning, an analytic framework that yields double robust effect estimation and inference with fewer assumptions than the usual parametric methods. Different aspects of SL depending on the context, its function within the targeted learning framework, and the benefits of this methodology in the addiction field are discussed.
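The cross-validated stacking sketch below captures the spirit of super learning: several candidate learners are combined by a meta-learner trained on out-of-fold predictions. scikit-learn's StackingClassifier is used here as a stand-in for the SuperLearner machinery, the candidate library is abbreviated, and the data are synthetic rather than the SUD treatment records, so the AUC values are purely illustrative.

```python
# A minimal stacking sketch in the spirit of super learning (not the authors' setup).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression, LogisticRegressionCV
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

candidates = [
    ("logit", LogisticRegression(max_iter=1000)),
    ("penalized", LogisticRegressionCV(penalty="l2", max_iter=1000)),
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("mlp", MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)),
]
# Out-of-fold predictions from each candidate feed a logistic meta-learner.
ensemble = StackingClassifier(estimators=candidates,
                              final_estimator=LogisticRegression(max_iter=1000),
                              cv=5, stack_method="predict_proba")
ensemble.fit(X_tr, y_tr)
print("stacked AUC:", roc_auc_score(y_te, ensemble.predict_proba(X_te)[:, 1]))
for name, est in candidates:
    auc = roc_auc_score(y_te, est.fit(X_tr, y_tr).predict_proba(X_te)[:, 1])
    print(f"{name} AUC: {auc:.3f}")
```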
A framework to enhance security of physically unclonable functions using chaotic circuits
NASA Astrophysics Data System (ADS)
Chen, Lanxiang
2018-05-01
As a new technique for authentication and key generation, the physically unclonable function (PUF) has attracted considerable attention, and extensive research results have already been achieved. To resist the popular machine learning modeling attacks, a framework to enhance the security of PUFs is proposed. The basic idea is to combine PUFs with a chaotic system whose response is highly sensitive to initial conditions. For this framework, a specific construction which combines the common arbiter PUF circuit, a converter, and Chua's circuit is given to implement a more secure PUF. Simulation experiments are presented to further validate the framework. Finally, some practical suggestions for the framework and the specific construction are also discussed.
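The following conceptual sketch, which is not the paper's construction, illustrates the general idea of coupling a PUF response to a chaotic system: a simulated arbiter-PUF bit perturbs the initial condition of Chua's circuit, and the late-time trajectory is quantized into a hardened response. The arbiter model, the "converter" mapping, and all parameter values are assumptions.

```python
# Conceptual sketch: harden a toy PUF response with a chaotic Chua trajectory.
import numpy as np
from scipy.integrate import solve_ivp

def chua(t, s, alpha=15.6, beta=28.0, m0=-8/7, m1=-5/7):
    # Classic double-scroll parameters for Chua's circuit.
    x, y, z = s
    fx = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))
    return [alpha * (y - x - fx), x - y + z, -beta * y]

rng = np.random.default_rng(42)
challenge = rng.integers(0, 2, 64)                        # 64-bit challenge
weights = rng.normal(size=64)                             # toy arbiter delay model
raw_bit = int(np.sum(weights * (2 * challenge - 1)) > 0)  # raw PUF response bit

# Assumed "converter": raw bit plus a few challenge bits perturb the initial state.
x0 = 0.1 + 0.01 * raw_bit + 1e-4 * int("".join(str(int(b)) for b in challenge[:8]), 2)
sol = solve_ivp(chua, (0, 50), [x0, 0.0, 0.0], dense_output=True, max_step=0.01)
samples = sol.sol(np.linspace(40, 50, 128))[0]            # late-time x(t) samples
hardened = (samples > samples.mean()).astype(int)
print("hardened 128-bit response:", "".join(str(b) for b in hardened.tolist()))
```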
Cascade process modeling with mechanism-based hierarchical neural networks.
Cong, Qiumei; Yu, Wen; Chai, Tianyou
2010-02-01
Cascade processes, such as wastewater treatment plants, include many nonlinear sub-systems and many variables. When the number of sub-systems is large, the input-output relation between the first block and the last block cannot represent the whole process. In this paper we use two techniques to overcome this problem. First, we propose a new neural model, hierarchical neural networks, to identify the cascade process; then we use a serial structural mechanism model based on the physical equations to connect with the neural model. A stable learning algorithm and theoretical analysis are given. Finally, this method is used to model a wastewater treatment plant. Real operational data from a wastewater treatment plant are used to illustrate the modeling approach.
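A minimal sketch of the block-wise idea is given below: each sub-system is modelled by its own small neural network, and the predicted output of one block feeds the next together with that block's local inputs. The mechanism-based serial model, the stable learning law, and the wastewater treatment data are not reproduced; everything is synthetic and the network choices are assumptions.

```python
# Sketch of a two-block cascade modelled by chained neural networks.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 2000
u1, u2 = rng.uniform(-1, 1, (2, n))                          # local inputs of blocks 1, 2
y1 = np.tanh(2 * u1) + 0.05 * rng.normal(size=n)             # block-1 output
y2 = 0.5 * y1 ** 2 + np.sin(u2) + 0.05 * rng.normal(size=n)  # block-2 output

net1 = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net1.fit(u1.reshape(-1, 1), y1)
# Block 2 is trained on block 1's *predicted* output so the chain runs end-to-end.
y1_hat = net1.predict(u1.reshape(-1, 1))
net2 = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net2.fit(np.column_stack([y1_hat, u2]), y2)

y2_hat = net2.predict(np.column_stack([net1.predict(u1.reshape(-1, 1)), u2]))
print("block-2 RMSE:", np.sqrt(np.mean((y2 - y2_hat) ** 2)))
```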
Johnson, James F; Bagdasarov, Zhanna; MacDougall, Alexandra E; Steele, Logan; Connelly, Shane; Devenport, Lynn D; Mumford, Michael D
2014-01-01
The case-based approach to learning is popular among many applied fields. However, results of case-based education vary widely depending on case content and case presentation. This study examined two aspects of case-based education, outcome valence and case elaboration methods, in a two-day case-based Responsible Conduct of Research (RCR) ethics education program. Results suggest that outcome information is an integral part of a quality case. Furthermore, valence-consistent outcomes may have certain advantages over mixed-valence outcome information. Finally, students enjoy and excel at working with case material, and the use of elaborative interrogation techniques can significantly improve internally focused ethical sensemaking strategies associated with personal biases, constraints, and emotions.
East-West Perspectives on Elder Learning
ERIC Educational Resources Information Center
Tam, Maureen
2012-01-01
This paper describes and conceptualizes the meaning of lifelong learning from two cultural perspectives--East and West. It examines the different principles underpinning lifelong learning that explain why and how elders in the two cultures engage differently in continued learning. Finally, it discusses the cultural impact on elder learning by…
Figure analysis: A teaching technique to promote visual literacy and active Learning.
Wiles, Amy M
2016-07-08
Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based biology courses. An additional challenge is that visual literacy is often overlooked in undergraduate science education. To address both of these challenges, a technique called figure analysis was developed and implemented in three different levels of undergraduate biology courses. Here, students learn content while gaining practice in interpreting visual information by discussing figures with their peers. Student groups also make connections between new and previously learned concepts on their own while in class. The instructor summarizes the material for the class only after students grapple with it in small groups. Students reported a preference for learning by figure analysis over traditional lecture, and female students in particular reported increased confidence in their analytical abilities. There is not a technology requirement for this technique; therefore, it may be utilized both in classrooms and in nontraditional spaces. Additionally, the amount of preparation required is comparable to that of a traditional lecture. © 2016 by The International Union of Biochemistry and Molecular Biology, 44(4):336-344, 2016. © 2016 The International Union of Biochemistry and Molecular Biology.
Wang, Deyun; Wei, Shuai; Luo, Hongyuan; Yue, Chenqiang; Grunder, Olivier
2017-02-15
The randomness, non-stationarity and irregularity of air quality index (AQI) series make AQI forecasting difficult. To enhance forecast accuracy, a novel hybrid forecasting model combining a two-phase decomposition technique and an extreme learning machine (ELM) optimized by the differential evolution (DE) algorithm is developed for AQI forecasting in this paper. In phase I, the complementary ensemble empirical mode decomposition (CEEMD) is utilized to decompose the AQI series into a set of intrinsic mode functions (IMFs) with different frequencies; in phase II, in order to further handle the high frequency IMFs which will increase the forecast difficulty, variational mode decomposition (VMD) is employed to decompose the high frequency IMFs into a number of variational modes (VMs). Then, the ELM model optimized by the DE algorithm is applied to forecast all the IMFs and VMs. Finally, the forecast value of each high frequency IMF is obtained by adding up the forecast results of all corresponding VMs, and the forecast series of AQI is obtained by aggregating the forecast results of all IMFs. To verify and validate the proposed model, two daily AQI series from July 1, 2014 to June 30, 2016 collected from Beijing and Shanghai, China are taken as the test cases for the empirical study. The experimental results show that the proposed hybrid model based on the two-phase decomposition technique is remarkably superior to all other considered models in forecast accuracy. Copyright © 2016 Elsevier B.V. All rights reserved.
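The forecasting core of this pipeline, an extreme learning machine, is simple enough to sketch directly: random hidden-layer weights are fixed and only the output weights are solved in closed form. The sketch below omits the CEEMD/VMD decomposition stages and the differential-evolution tuning of the hidden layer, and it runs on a synthetic series rather than the Beijing or Shanghai AQI data.

```python
# A bare-bones extreme learning machine (ELM) regressor on a synthetic series.
import numpy as np

rng = np.random.default_rng(0)

class ELM:
    def __init__(self, n_hidden=60, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        # Random, fixed hidden-layer weights; output weights by least squares.
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Synthetic daily "AQI" series and a lagged-value forecasting setup (lag = 7 days).
t = np.arange(900)
aqi = 80 + 30 * np.sin(2 * np.pi * t / 365) + 10 * rng.normal(size=t.size)
lag = 7
X = np.array([aqi[i:i + lag] for i in range(len(aqi) - lag)])
y = aqi[lag:]
split = 700
model = ELM().fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("test RMSE:", np.sqrt(np.mean((y[split:] - pred) ** 2)))
```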
Lempp, H; Seabrook, M; Cochrane, M; Rees, J
2005-03-01
In this prospective qualitative study over 12 months, we evaluated the educational and clinical effectiveness of a new final year undergraduate programme in a London medical school (Guy's, King's and St Thomas'). A stratified sample of 17/360 final year students were interviewed four times, and the content was assessed against 32 amalgamated learning outcomes identified in 1997 in The New Doctor. At the beginning of the preregistration year, eight of the learning outcomes were already met, 10 partly, eight remained to be attained and for six, insufficient evidence existed. Preregistration house officers who have been through the final year student house officer programme expressed competence in many of the outcomes of the General Medical Council's New Doctor. The study identified areas such as prescribing where further developments are needed and will help in planning the new foundation programme.
Performance analysis and prediction in triathlon.
Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B
2016-01-01
Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.
Hard exudates segmentation based on learned initial seeds and iterative graph cut.
Kusakunniran, Worapan; Wu, Qiang; Ritthipravat, Panrasee; Zhang, Jian
2018-05-01
(Background and Objective): The occurrence of hard exudates is one of the early signs of diabetic retinopathy, which is one of the leading causes of blindness. Many patients with diabetic retinopathy lose their vision because of the late detection of the disease. Thus, this paper proposes a novel method for automatic hard exudate segmentation in retinal images. (Methods): The existing methods are based on either supervised or unsupervised learning techniques. In addition, the learned segmentation models may often cause missed detections and/or false detections of hard exudates, due to the lack of rich characteristics, the intra-variations, and the similarity with other components in the retinal image. Thus, in this paper, supervised learning based on the multilayer perceptron (MLP) is used only to identify initial seeds with high confidence of being hard exudates. Then, the segmentation is finalized by unsupervised learning based on the iterative graph cut (GC) using clusters of initial seeds. Also, in order to reduce the color intra-variations of hard exudates in different retinal images, the color transfer (CT) is applied to normalize their color information in the pre-processing step. (Results): The experiments and comparisons with the other existing methods are based on the two well-known datasets, e_ophtha EX and DIARETDB1. It can be seen that the proposed method outperforms the other existing methods in the literature, with a pixel-level sensitivity of 0.891 for the DIARETDB1 dataset and 0.564 for the e_ophtha EX dataset. The cross-dataset validation, where the training process is performed on one dataset and the testing process is performed on another dataset, is also evaluated in this paper, in order to illustrate the robustness of the proposed method. (Conclusions): This newly proposed method integrates supervised learning and unsupervised learning based techniques. It achieves improved performance when compared with the existing methods in the literature. The robustness of the proposed method in the cross-dataset scenario could enhance its practical usage. That is, the trained model could be more practical for unseen data in real-world situations, especially when the capturing environments of the training and testing images are not the same. Copyright © 2018 Elsevier B.V. All rights reserved.
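Only the first stage of the pipeline, the selection of high-confidence seeds by a supervised classifier, is sketched below. The colour-transfer normalisation and the iterative graph cut are omitted, the pixel features are synthetic, and the MLP configuration and confidence thresholds are assumptions rather than the values used in the paper.

```python
# Sketch of the seed-selection stage: keep only very confident MLP predictions.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 4000
# Synthetic RGB features: exudate-like pixels tend to be bright and yellowish.
exudate = np.column_stack([rng.normal(0.9, 0.05, n // 4),
                           rng.normal(0.8, 0.05, n // 4),
                           rng.normal(0.3, 0.05, n // 4)])
background = rng.uniform(0, 0.7, (3 * n // 4, 3))
X = np.vstack([exudate, background])
y = np.r_[np.ones(n // 4), np.zeros(3 * n // 4)]

mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
mlp.fit(X, y)
proba = mlp.predict_proba(X)[:, 1]
seeds_fg = np.where(proba > 0.95)[0]     # confident exudate seeds
seeds_bg = np.where(proba < 0.05)[0]     # confident background seeds
print(len(seeds_fg), "foreground seeds and", len(seeds_bg),
      "background seeds to initialise the graph cut")
```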
NCS-1 dependent learning bonus and behavior outputs of self-directed exploration
NASA Astrophysics Data System (ADS)
Mun, Ho-Suk
Animals explore a new environment and learn about their surroundings. "Exploration" refers to all activities that increase the information obtained from an animal. For this study, I determined a molecule that mediates self-directed exploration, with a particular focus on rearing behavior and vocalization. Rearing can be either self-directed exploration or escape-oriented exploration. Self-directed exploration can be driven by the desire to gather information about environments while escape-oriented exploration can be driven by fear or anxiety. To differentiate between these two concepts, I compared rearing and other behaviors in three different conditions 1) novel dim (safe environment), which induces exploration based rearing; 2) novel bright (fearful environment), which elicits fear driven rearing; and 3) familiar environment as a control. First, I characterized the effects on two distinct types of environment in exploratory behavior and its effect on learning. From this, I determined that self-directed exploration enhances spatial learning while escape-oriented exploration does not produce a learning bonus. Second, I found that NCS-1 is involved in exploration, as well as learning and memory, by testing mice with reduced levels of Ncs-1 by point mutation and also siRNA injection. Finally, I illustrated other behavior outputs and neural substrate activities, which co-occurred during either self-directed or escape-oriented exploration. I found that high-frequency ultrasonic vocalizations occurred during self-directed exploration while low-frequency calls were emitted during escape-oriented exploration. Also, with immediate early gene imaging techniques, I found hippocampus and nucleus accumbens activation in self-directed exploration. This study is the first comprehensive molecular analysis of learning bonus in self-directed exploration. These results may be beneficial for studying underlying mechanisms of neuropsychiatric disease, and also reveal therapeutic targets for them.
New Media Learning: Student Podcasting and Blogging in an Intro to Meteorology Course
NASA Astrophysics Data System (ADS)
Small, J. D.
2013-12-01
Current weather events and climate change are hot media topics discussed on television, the internet, and through social media. In this world of 'Tweets', 'Texts' and constant multi-media bombardment it is becoming increasingly difficult to engage students in the learning process by simply standing at a podium and lecturing in a darkened classroom. Educational research has found that lectures place students in a passive role, preventing them from actively engaging in the learning process. Through the innovative use of multi-media platforms this study assesses the potential to create active learning opportunities (podcasting and blogging) that connect theoretical 'textbook' atmospheric science with the 'real world.' This work focuses on students enrolled in the Introduction to Meteorology course (MET 101) at the University of Hawaii at Manoa. This study summarizes the impact of the 'course-casting' technique which utilizes podcasts of lectures and supplemental material. Lecture Podcasts are used mainly as a revision tool for students by providing on-demand portable (MP3) course content that supports independent student learning. Students also produced their own podcasts (research projects) to share with classmates throughout the course relating atmospheric science content to personal 'real world' experiences. Along with podcasting, students blogged about designated topics related to weather and climate, making their knowledge and understanding accessible to other students in the course and the general internet community. Student surveys, journals, and final exit interviews are used to assess the impact of the blogging and podcasting exercises on the student learning experience. The number of times each lecture podcast was downloaded is recorded to determine the interest level in using audio lectures as a review tool. Student blogs and podcasts are evaluated based on science content accuracy and student survey evaluations of the learning experience.
[Training program in endourological surgery. Future perspectives].
Soria, Federico; Villacampa, Felipe; Serrano, Alvaro; Moreno, Jesús; Rioja, Jorge; Sánchez, Francisco Miguel
2018-01-01
Current training in urological endoscopy lacks a specific training program. However, there is a clear need for a specific and uniform program that ensures adequate training regardless of the unit where it is carried out. The goal, therefore, is first to evaluate the current model and then to propose improvements for its update. Hospital training accreditation programmes are merely adjustments of the official urology specialty program to the specific circumstances of each center, which causes variability in resident training. We reviewed 19 training programs belonging to 12 Spanish regions. The current outlook shows that scarcely 10% of hospitals quantify the number of procedures per year, although the Spanish program emphasizes that residents' achievement should be quantified. Urology residents perceive their training as inadequate, and their level of satisfaction is therefore moderate. The three main problems residents identify as obstacles to their training are the lack of supervision, tutors who are still completing their own learning, and the lack of quantification of surgical activity, which is described as a threat. The latter has no easy solution, since the learning curve of the most common techniques in endourology has not been correctly established. Regarding aspects that could improve the current model, residents highlight the need to design a specific program, the need to customize training, the unavoidable accreditation of tutors and, obviously, the dignifying of the tutor's teaching activity. Other basic aspects are the inclusion of new technologies, such as e-learning, as training tools, the implementation of an adequate competency assessment plan, and the possibility of relying on simulation systems. Finally, residents highlight the need to attend monographic meetings and external clinical rotations to promote critical training.
Collaborative Learning in the Dance Technique Class
ERIC Educational Resources Information Center
Raman, Tanja
2009-01-01
This research was designed to enhance dance technique learning by promoting critical thinking amongst students studying on a degree programme at the University of Wales Institute, Cardiff. Students were taught Cunningham-based dance technique using pair work together with the traditional demonstration/copying method. To evaluate the study,…
Accelerated Learning Techniques for the Foreign Language Class: A Personal View.
ERIC Educational Resources Information Center
Bancroft, W. Jane
Foreign language instructors cope with problems of learner anxiety in the classroom, fossilization of language use and language skill loss. Relaxation and concentration techniques can alleviate stress and fatigue and improve students' capabilities. Three categories of accelerated learning techniques are: (1) those that serve as a preliminary to…
Jaspers, Arne; De Beéck, Tim Op; Brink, Michel S; Frencken, Wouter G P; Staes, Filip; Davis, Jesse J; Helsen, Werner F
2018-05-01
Machine learning may contribute to understanding the relationship between the external load and internal load in professional soccer. Therefore, the relationship between external load indicators (ELIs) and the rating of perceived exertion (RPE) was examined using machine learning techniques on a group and individual level. Training data were collected from 38 professional soccer players over 2 seasons. The external load was measured using global positioning system technology and accelerometry. The internal load was obtained using the RPE. Predictive models were constructed using 2 machine learning techniques, artificial neural networks and least absolute shrinkage and selection operator (LASSO) models, and 1 naive baseline method. The predictions were based on a large set of ELIs. Using each technique, 1 group model involving all players and 1 individual model for each player were constructed. These models' performance on predicting the reported RPE values for future training sessions was compared with the naive baseline's performance. Both the artificial neural network and LASSO models outperformed the baseline. In addition, the LASSO model made more accurate predictions for the RPE than did the artificial neural network model. Furthermore, decelerations were identified as important ELIs. Regardless of the applied machine learning technique, the group models resulted in equivalent or better predictions for the reported RPE values than the individual models. Machine learning techniques may have added value in predicting RPE for future sessions to optimize training design and evaluation. These techniques may also be used in conjunction with expert knowledge to select key ELIs for load monitoring.
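A group-level version of the comparison could look like the sketch below, in which a LASSO model and a small neural network predict session RPE from a handful of external load indicators. The ELI names, the data-generating rule, and the model settings are assumptions; the study's GPS/accelerometer data and per-player models are not reproduced.

```python
# Sketch: LASSO vs. a small neural network for predicting session RPE from ELIs.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 1500
# Assumed ELIs: total distance (km), high-speed running (m), accelerations,
# decelerations, player load (arbitrary units).
X = np.column_stack([rng.uniform(3, 12, n), rng.uniform(0, 900, n),
                     rng.integers(10, 80, n), rng.integers(10, 80, n),
                     rng.uniform(200, 900, n)])
# Toy ground truth in which decelerations matter, echoing the study's finding.
rpe = np.clip(1 + 0.4 * X[:, 0] + 0.003 * X[:, 1] + 0.03 * X[:, 3]
              + rng.normal(0, 0.8, n), 1, 10)

X_tr, X_te, y_tr, y_te = train_test_split(X, rpe, test_size=0.3, random_state=0)
lasso = LassoCV(cv=5).fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)
print("LASSO MAE:", mean_absolute_error(y_te, lasso.predict(X_te)))
print("ANN   MAE:", mean_absolute_error(y_te, ann.predict(X_te)))
print("LASSO coefficients (one per ELI):", lasso.coef_.round(3))
```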
ERIC Educational Resources Information Center
Trevathan, Jarrod; Myers, Trina
2013-01-01
Process-Oriented Guided Inquiry Learning (POGIL) is a technique used to teach in large lectures and tutorials. It invokes interaction, team building, learning and interest through highly structured group work. Currently, POGIL has only been implemented in traditional classroom settings where all participants are physically present. However,…
ERIC Educational Resources Information Center
WIENTGE, KING M., ED.; AND OTHERS
Papers were presented at a conference on classroom learning on such topics as program design, testing, and other evaluation techniques, computer assisted instruction, programed instruction, simulation, pacing, and retention. Several treated military training, adult learning, and adult-centered classroom techniques. In one paper, the systems…
Embedding Mixed-Reality Laboratories into E-Learning Systems for Engineering Education
ERIC Educational Resources Information Center
Al-Tikriti, Munther N.; Al-Aubidy, Kasim M.
2013-01-01
E-learning, virtual learning, and mixed reality techniques are now an integral part of academic and educational systems worldwide. They provide a very wide spectrum of individuals with easier access to educational opportunities for pursuing their educational and qualification objectives. These modern techniques have the potential to improve the quality…
E-Learning System Using Segmentation-Based MR Technique for Learning Circuit Construction
ERIC Educational Resources Information Center
Takemura, Atsushi
2016-01-01
This paper proposes a novel e-Learning system using the mixed reality (MR) technique for technical experiments involving the construction of electronic circuits. The proposed system comprises experimenters' mobile computers and a remote analysis system. When constructing circuits, each learner uses a mobile computer to transmit image data from the…
Semantics of User Interface for Image Retrieval: Possibility Theory and Learning Techniques.
ERIC Educational Resources Information Center
Crehange, M.; And Others
1989-01-01
Discusses the need for a rich semantics for the user interface in interactive image retrieval and presents two methods for building such interfaces: possibility theory applied to fuzzy data retrieval, and a machine learning technique applied to learning the user's deep need. Prototypes developed using videodisks and knowledge-based software are…
ERIC Educational Resources Information Center
Ploetzner, Rolf; Lowe, Richard; Schlag, Sabine
2013-01-01
Pictorial representations can play a pivotal role in both printed and digital learning material. Although there has been extensive research on cognitive techniques and strategies for learning from text, the same cannot be said for static and dynamic pictorial representations. In this paper we propose a systematic characterization of cognitive…
The relation between learning mathematics and students' competencies in understanding texts
NASA Astrophysics Data System (ADS)
Hapipi, Azmi, Syahrul; Sripatmi, Amrullah
2017-08-01
This descriptive study aimed to gain an overview of the relation between learning mathematics and students' competencies in understanding texts. The research was classified as an ex post facto study, in part because the variables studied had already occurred. The sample was taken using a stratified proportional sampling technique, which was chosen because the population, in the context of learning mathematics, is diverse and also tiered. The results of this study indicate that there is a relationship between learning mathematics and students' competencies in understanding texts.
The educational impact of assessment: A comparison of DOPS and MCQs
Cobb, Kate A.; Brown, George; Jaarsma, Debbie A. D. C.; Hammond, Richard A.
2013-01-01
Aim To evaluate the impact of two different assessment formats on the approaches to learning of final year veterinary students. The relationship between approach to learning and examination performance was also investigated. Method An 18-item version of the Study Process Questionnaire (SPQ) was sent to 87 final year students. Each student responded to the questionnaire with regard to DOPS (Direct Observation of Procedural Skills) and a Multiple Choice Examination (MCQ). Semi-structured interviews were conducted with 16 of the respondents to gain a deeper insight into the students’ perception of assessment. Results Students adopted a deeper approach to learning for DOPS and a more surface approach with MCQs. There was a positive correlation between an achieving approach to learning and examination performance. Analysis of the qualitative data revealed that deep, surface and achieving approaches were reported by the students, and seven major influences on their approaches to learning were identified: motivation, purpose, consequence, acceptability, feedback, time pressure and the individual differences of the students. Conclusions The format of DOPS has a positive influence on approaches to learning. There is a conflict for students between preparing for final examinations and preparing for clinical practice. PMID:23808609
NASA Astrophysics Data System (ADS)
Poole, Barbara Ann Matherly
1997-11-01
This study explored the relationship between the grades students earned in introductory college microbiology and American College Testing scores, sex, race, age, GED or high school diploma, full-time or part-time student status, developmental reasoning levels, memory tactics, and expected achievement. The study also explored student perceptions at the beginning and the end of the microbiology courses for science preparation, expected achievement, relevancy of microbiology, and expectations for the course. Archival records for 121 freshman level and 119 sophomore level microbiology students were accessed to obtain final grades, ACT scores, sex, race, age, GED or high school diploma and full-time or part-time status. The same information was obtained for the 113 freshman level and the 85 sophomore level students who participated in the study. The study groups were given the Group Assessment of Logical Thinking to assess their level of formal reasoning ability, the Inventory of Learning Processes-Revised to assess three memory techniques, an initial perception survey, and an exit perception survey. Academic achievement in microbiology could not be predicted using composites of the predictor variables. There were significant relationships between the GALT scores and the predicted grades with both the freshman and the sophomore final grades. The Self-Efficacy Fact Retention scores and the Literal Memorization scores had significant relationships to the final grades of the freshmen but not the sophomores. There was not a significant relationship between the Deep Semantic scores and the final grades in either group. Students indicated that high school science had given them only a medium to low level of preparation for college microbiology. The sophomores felt that previous college science classes had given them a much better preparation for microbiology than did the freshmen students. Both groups expressed the importance of the laboratory experience to the understanding of science and also the relevancy of microbiology both to their chosen professions and to their own personal lives.
ERIC Educational Resources Information Center
van Gelderen, Ingrid; Matthew, Susan M.; Hendry, Graham D.; Taylor, Rosanne
2018-01-01
Good teaching that supports final year students' learning in clinical placements is critical for students' successful transition from an academic environment to professional practice. Final year internship programmes are designed to encourage student-centred approaches to teaching and deep approaches to learning, but the extent to which clinical…
ERIC Educational Resources Information Center
Macro Systems, Inc., Silver Spring, MD.
This final report describes the development of eight computer based science simulations designed for use with middle school mainstreamed students having learning disabilities or mild mental retardation. The total program includes software, a teacher's manual, 3 videos, and a set of 30 activity worksheets. Special features of the software for…
ERIC Educational Resources Information Center
Hudson, Suzanne; Hudson, Peter; Adie, Lenore
2015-01-01
Universities and teacher employment bodies seek new, cost-effective ways for graduating classroom-ready teachers. This study involved 32 final-year preservice teachers in an innovative school--university partnership teacher education programme titled, the School-Community Integrated Learning (SCIL) pathway. Data were collected using a five-part…
Neural Correlates of Temporal Credit Assignment in the Parietal Lobe
Eisenberg, Ian; Gottlieb, Jacqueline
2014-01-01
Empirical studies of decision making have typically assumed that value learning is governed by time, such that a reward prediction error arising at a specific time triggers temporally-discounted learning for all preceding actions. However, in natural behavior, goals must be acquired through multiple actions, and each action can have different significance for the final outcome. As is recognized in computational research, carrying out multi-step actions requires the use of credit assignment mechanisms that focus learning on specific steps, but little is known about the neural correlates of these mechanisms. To investigate this question we recorded neurons in the monkey lateral intraparietal area (LIP) during a serial decision task where two consecutive eye movement decisions led to a final reward. The underlying decision trees were structured such that the two decisions had different relationships with the final reward, and the optimal strategy was to learn based on the final reward at one of the steps (the “F” step) but ignore changes in this reward at the remaining step (the “I” step). In two distinct contexts, the F step was either the first or the second in the sequence, controlling for effects of temporal discounting. We show that LIP neurons had the strongest value learning and strongest post-decision responses during the transition after the F step regardless of the serial position of this step. Thus, the neurons encode correlates of temporal credit assignment mechanisms that allocate learning to specific steps independently of temporal discounting. PMID:24523935
Information Theory, Inference and Learning Algorithms
NASA Astrophysics Data System (ADS)
Mackay, David J. C.
2003-10-01
Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
Jönsson, Bo-Anders
2005-09-01
Learning activities and course design in the new context of e-learning, such as in web-based courses, involve a change for both teachers and students. The paper discusses factors important for e-learning to be successful. The development of an online course in medical physics and technology for high school teachers of physics, details of the course, and experience gained in connection with it are described. The course syllabus includes basics of radiation physics, imaging techniques using ionizing or non-ionizing radiation, and external and internal radiation therapy. The course has a highly didactic approach. The final task is for participants to design a course of their own centered on some topic of medical physics on the basis of the knowledge they have acquired. The aim of the course is to help the teachers integrate medical physics into their own teaching. This is seen as enhancing the interest of high school students in later studying physics, medical physics or some other branch of science at the university level, and as increasing the knowledge that they and people generally have of science. It is suggested that the basic approach taken can also have applicability to the training of medical, nursing or engineering students, and be used for continuing professional development in various areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Wenjian; Singh, Rajiv R. P.; Scalettar, Richard T.
Here, we apply unsupervised machine learning techniques, mainly principal component analysis (PCA), to compare and contrast the phase behavior and phase transitions in several classical spin models - the square and triangular-lattice Ising models, the Blume-Capel model, a highly degenerate biquadratic-exchange spin-one Ising (BSI) model, and the 2D XY model, and examine critically what machine learning is teaching us. We find that quantified principal components from PCA not only allow exploration of different phases and symmetry-breaking, but can distinguish phase transition types and locate critical points. We show that the corresponding weight vectors have a clear physical interpretation, which is particularly interesting in the frustrated models such as the triangular antiferromagnet, where they can point to incipient orders. Unlike the other well-studied models, the properties of the BSI model are less well known. Using both PCA and conventional Monte Carlo analysis, we demonstrate that the BSI model shows an absence of phase transition and macroscopic ground-state degeneracy. The failure to capture the 'charge' correlations (vorticity) in the BSI model (XY model) from raw spin configurations points to some of the limitations of PCA. Finally, we employ a nonlinear unsupervised machine learning procedure, the 'autoencoder method', and demonstrate that it too can be trained to capture phase transitions and critical points.
Hu, Wenjian; Singh, Rajiv R. P.; Scalettar, Richard T.
2017-06-19
Here, we apply unsupervised machine learning techniques, mainly principal component analysis (PCA), to compare and contrast the phase behavior and phase transitions in several classical spin models - the square and triangular-lattice Ising models, the Blume-Capel model, a highly degenerate biquadratic-exchange spin-one Ising (BSI) model, and the 2D XY model, and examine critically what machine learning is teaching us. We find that quantified principal components from PCA not only allow exploration of different phases and symmetry-breaking, but can distinguish phase transition types and locate critical points. We show that the corresponding weight vectors have a clear physical interpretation, which is particularly interesting in the frustrated models such as the triangular antiferromagnet, where they can point to incipient orders. Unlike the other well-studied models, the properties of the BSI model are less well known. Using both PCA and conventional Monte Carlo analysis, we demonstrate that the BSI model shows an absence of phase transition and macroscopic ground-state degeneracy. The failure to capture the 'charge' correlations (vorticity) in the BSI model (XY model) from raw spin configurations points to some of the limitations of PCA. Finally, we employ a nonlinear unsupervised machine learning procedure, the 'autoencoder method', and demonstrate that it too can be trained to capture phase transitions and critical points.
NASA Astrophysics Data System (ADS)
Hu, Wenjian; Singh, Rajiv R. P.; Scalettar, Richard T.
2017-06-01
We apply unsupervised machine learning techniques, mainly principal component analysis (PCA), to compare and contrast the phase behavior and phase transitions in several classical spin models—the square- and triangular-lattice Ising models, the Blume-Capel model, a highly degenerate biquadratic-exchange spin-1 Ising (BSI) model, and the two-dimensional X Y model—and we examine critically what machine learning is teaching us. We find that quantified principal components from PCA not only allow the exploration of different phases and symmetry-breaking, but they can distinguish phase-transition types and locate critical points. We show that the corresponding weight vectors have a clear physical interpretation, which is particularly interesting in the frustrated models such as the triangular antiferromagnet, where they can point to incipient orders. Unlike the other well-studied models, the properties of the BSI model are less well known. Using both PCA and conventional Monte Carlo analysis, we demonstrate that the BSI model shows an absence of phase transition and macroscopic ground-state degeneracy. The failure to capture the "charge" correlations (vorticity) in the BSI model (X Y model) from raw spin configurations points to some of the limitations of PCA. Finally, we employ a nonlinear unsupervised machine learning procedure, the "autoencoder method," and we demonstrate that it too can be trained to capture phase transitions and critical points.
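The central PCA experiment is easy to reproduce in miniature. The sketch below samples small 2D Ising configurations with a basic Metropolis algorithm at several temperatures and shows that the magnitude of the leading principal component separates the ordered and disordered phases around the critical temperature. The lattice size, sweep counts, and temperature grid are kept deliberately small and are not the values used in the paper.

```python
# Miniature version of "PCA on raw spin configurations" for the 2D Ising model.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
L = 16                                        # linear lattice size

def metropolis_sample(T, n_sweeps=200):
    """Return one flattened spin configuration equilibrated at temperature T."""
    s = rng.choice([-1, 1], size=(L, L))
    for _ in range(n_sweeps):
        for _ in range(L * L):
            i, j = rng.integers(0, L, 2)
            nb = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
            dE = 2 * s[i, j] * nb
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i, j] *= -1
    return s.ravel().astype(float)

temps = np.repeat([1.5, 2.0, 2.27, 2.6, 3.2], 10)   # critical temperature ~ 2.269
configs = np.array([metropolis_sample(T) for T in temps])

pca = PCA(n_components=2).fit(configs)
p1 = pca.transform(configs)[:, 0]
for T in np.unique(temps):
    # |p1| is large in the ordered phase (T < Tc) and small in the disordered phase.
    print(f"T = {T:4.2f}   mean |p1| = {np.abs(p1[temps == T]).mean():6.1f}")
```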
McGovern, Amy; Gagne, David J; Williams, John K; Brown, Rodger A; Basara, Jeffrey B
Severe weather, including tornadoes, thunderstorms, wind, and hail, annually causes significant loss of life and property. We are developing spatiotemporal machine learning techniques that will enable meteorologists to improve the prediction of these events by improving their understanding of the fundamental causes of the phenomena and by building skillful empirical predictive models. In this paper, we present significant enhancements of our Spatiotemporal Relational Probability Trees that enable autonomous discovery of spatiotemporal relationships as well as learning with arbitrary shapes. We focus our evaluation on two real-world case studies using our technique: predicting tornadoes in Oklahoma and predicting aircraft turbulence in the United States. We also discuss how to evaluate success for a machine learning algorithm in the severe weather domain, which will enable new methods such as ours to transfer from research to operations; we provide a set of lessons learned for embedded machine learning applications and discuss how to field our technique.
ERIC Educational Resources Information Center
Agyei, Douglas D.; Voogt, Joke
2014-01-01
This study examined 100 beginning teachers' transfer of learning when utilising Information Communication Technology-enhanced activity-based learning activities. The beginning teachers had participated in a professional development program that was characterised by "learning technology by collaborative design" in their final year of…
Project-Based Learning Involving Sensory Panelists Improves Student Learning Outcomes
ERIC Educational Resources Information Center
Lee, Yee Ming
2015-01-01
Project-based, collaborative learning is an effective teaching method when compared to traditional cognitive learning. The purpose of this study was to assess student learning after the completion of a final meal project that involved a group of sensory panelists. A paper survey was conducted among 73 senior nutrition and dietetics students…
The impact of machine learning techniques in the study of bipolar disorder: A systematic review.
Librenza-Garcia, Diego; Kotzian, Bruno Jaskulski; Yang, Jessica; Mwangi, Benson; Cao, Bo; Pereira Lima, Luiza Nunes; Bermudez, Mariane Bagatin; Boeira, Manuela Vianna; Kapczinski, Flávio; Passos, Ives Cavalcante
2017-09-01
Machine learning techniques provide new methods to predict diagnosis and clinical outcomes at an individual level. We aim to review the existing literature on the use of machine learning techniques in the assessment of subjects with bipolar disorder. We systematically searched PubMed, Embase and Web of Science for articles published in any language up to January 2017. We found 757 abstracts and included 51 studies in our review. Most of the included studies used multiple levels of biological data to distinguish the diagnosis of bipolar disorder from other psychiatric disorders or healthy controls. We also found studies that assessed the prediction of clinical outcomes and studies using unsupervised machine learning to build more consistent clinical phenotypes of bipolar disorder. We concluded that given the clinical heterogeneity of samples of patients with BD, machine learning techniques may provide clinicians and researchers with important insights in fields such as diagnosis, personalized treatment and prognosis orientation. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hancher, M.
2017-12-01
Recent years have seen promising results from many research teams applying deep learning techniques to geospatial data processing. In that same timeframe, TensorFlow has emerged as the most popular framework for deep learning in general, and Google has assembled petabytes of Earth observation data from a wide variety of sources and made them available in analysis-ready form in the cloud through Google Earth Engine. Nevertheless, developing and applying deep learning to geospatial data at scale has been somewhat cumbersome to date. We present a new set of tools and techniques that simplify this process. Our approach combines the strengths of several underlying tools: TensorFlow for its expressive deep learning framework; Earth Engine for data management, preprocessing, postprocessing, and visualization; and other tools in Google Cloud Platform to train TensorFlow models at scale, perform additional custom parallel data processing, and drive the entire process from a single familiar Python development environment. These tools can be used to easily apply standard deep neural networks, convolutional neural networks, and other custom model architectures to a variety of geospatial data structures. We discuss our experiences applying these and related tools to a range of machine learning problems, including classic problems like cloud detection, building detection, land cover classification, as well as more novel problems like illegal fishing detection. Our improved tools will make it easier for geospatial data scientists to apply modern deep learning techniques to their own problems, and will also make it easier for machine learning researchers to advance the state of the art of those techniques.
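A minimal TensorFlow/Keras sketch of the kind of patch-based classifier such a pipeline might train (for example, cloud versus clear chips) is shown below. The Earth Engine export, the cloud-scale training infrastructure, and real imagery are all omitted; the patch shape, band count, and toy labels are assumptions.

```python
# Minimal patch-based CNN sketch with TensorFlow/Keras; data are random stand-ins
# for multispectral image chips exported from a geospatial catalog.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
n, size, bands = 512, 32, 4                          # 32x32 chips with 4 bands
X = rng.random((n, size, size, bands)).astype("float32")
y = (X.mean(axis=(1, 2, 3)) > 0.5).astype("int32")   # toy "cloudiness" label

model = tf.keras.Sequential([
    tf.keras.Input(shape=(size, size, bands)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, validation_split=0.2, verbose=2)
```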
Gohar, Manoochehr Jafari; Rahmanian, Mahboubeh; Soleimani, Hassan
2018-02-05
Vocabulary learning has long been a major concern and has attracted the attention of many researchers. Among the vocabulary learning hypotheses, the involvement load hypothesis and technique feature analysis have been proposed, which attempt to bring concepts such as noticing, motivation, and generation into focus. In the current study, 90 high-proficiency EFL students were assigned to three vocabulary tasks (sentence making, composition, and reading comprehension) in order to examine the power of the involvement load hypothesis and technique feature analysis frameworks in predicting vocabulary learning. The results revealed that the involvement load hypothesis was not a good predictor, whereas technique feature analysis was a good predictor of pretest-to-posttest score change but not of during-task activity. The implications of the results are discussed in light of preparing vocabulary tasks.
2018-01-01
On-chip LiDAR sensors for vehicle collision avoidance are a rapidly expanding area of research and development. The assessment of reliable obstacle detection using data collected by LiDAR sensors has become a key issue that the scientific community is actively exploring. This paper presents the design and implementation of a self-tuning methodology to maximize the reliability of a LiDAR sensor network for obstacle detection in 'Internet of Things' (IoT) mobility scenarios. The Webots Automobile 3D simulation tool for emulating sensor interaction in complex driving environments is selected to achieve that objective. Furthermore, a model-based framework is defined that employs a point-cloud clustering technique and an error-based prediction model library composed of a multilayer perceptron neural network, k-nearest neighbors, and linear regression models. Finally, a reinforcement learning technique, specifically a Q-learning method, is implemented to determine the number of LiDAR sensors required to increase sensor reliability for obstacle localization tasks. In addition, an IoT driving-assistance user scenario connecting a five-LiDAR sensor network is designed and implemented to validate the accuracy of the computational intelligence-based framework. The results demonstrate that the self-tuning method is an appropriate strategy to increase the reliability of the sensor network while minimizing detection thresholds. PMID:29748521
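As a loose illustration of the reinforcement-learning step described above, the following sketch runs tabular Q-learning over the number of enabled LiDAR sensors. The state, action, and reward definitions (reliability gain versus a per-sensor cost) are assumptions made for the example and do not reproduce the paper's formulation.

```python
# Minimal sketch of tabular Q-learning for choosing how many LiDAR sensors to enable.
# The reward trade-off below is an illustrative assumption, not the paper's model.
import random

N_MAX = 5                      # up to five sensors, as in the described scenario
actions = [-1, 0, +1]          # remove one, keep, or add one sensor
alpha, gamma, eps = 0.1, 0.9, 0.2
Q = {(s, a): 0.0 for s in range(1, N_MAX + 1) for a in actions}

def reward(n_sensors):
    # Assumed reward: detection reliability saturates with more sensors,
    # while each extra sensor carries a fixed cost.
    reliability = 1.0 - 0.5 ** n_sensors
    return reliability - 0.05 * n_sensors

state = 1
for step in range(5000):
    valid = [a for a in actions if 1 <= state + a <= N_MAX]
    a = random.choice(valid) if random.random() < eps else max(valid, key=lambda act: Q[(state, act)])
    nxt = state + a
    best_next = max(Q[(nxt, act)] for act in actions if 1 <= nxt + act <= N_MAX)
    Q[(state, a)] += alpha * (reward(nxt) + gamma * best_next - Q[(state, a)])
    state = nxt

# Greedy readout: follow the learned policy from a single sensor upward.
state = 1
for _ in range(10):
    valid = [a for a in actions if 1 <= state + a <= N_MAX]
    state += max(valid, key=lambda act: Q[(state, act)])
print("sensor count chosen by the learned policy:", state)
```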
Muhlestein, Whitney E; Akagi, Dallin S; Kallos, Justiss A; Morone, Peter J; Weaver, Kyle D; Thompson, Reid C; Chambless, Lola B
2018-04-01
Objective Machine learning (ML) algorithms are powerful tools for predicting patient outcomes. This study pilots a novel approach to algorithm selection and model creation, using prediction of discharge disposition following meningioma resection as a proof of concept. Materials and Methods A diverse set of ML algorithms was trained on a single-institution database of meningioma patients to predict discharge disposition. Algorithms were ranked by predictive power, and the top performers were combined to create an ensemble model. The final ensemble was internally validated on never-before-seen data to demonstrate generalizability. The predictive power of the ensemble was compared with that of a logistic regression. Further analyses were performed to identify how important variables impact the ensemble. Results Our ensemble model predicted disposition significantly better than a logistic regression (area under the curve of 0.78 and 0.71, respectively, p = 0.01). Tumor size, presentation at the emergency department, body mass index, convexity location, and preoperative motor deficit most strongly influence the model, though the independent impact of individual variables is nuanced. Conclusion Using a novel ML technique, we built a guided ML ensemble model that predicts discharge destination following meningioma resection with greater predictive power than a logistic regression, and that provides greater clinical insight than a univariate analysis. These techniques can be extended to predict many other patient outcomes of interest.
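A minimal sketch of the general ensemble-versus-logistic-regression comparison is given below, using scikit-learn on synthetic data. It is not the authors' guided ML ensemble; the candidate models, ranking criterion, and soft-voting scheme are illustrative choices.

```python
# Sketch: train several classifiers, rank them by cross-validated AUC, combine the top
# performers in a soft-voting ensemble, and compare against plain logistic regression.
# The data are synthetic stand-ins, not patient records.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_score, train_test_split

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

candidates = {
    "rf": RandomForestClassifier(n_estimators=200, random_state=0),
    "gb": GradientBoostingClassifier(random_state=0),
    "lr": LogisticRegression(max_iter=1000),
}
scores = {name: cross_val_score(m, X_tr, y_tr, cv=5, scoring="roc_auc").mean()
          for name, m in candidates.items()}
top = sorted(scores, key=scores.get, reverse=True)[:2]   # keep the two best performers

ensemble = VotingClassifier([(n, candidates[n]) for n in top], voting="soft").fit(X_tr, y_tr)
baseline = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

print("ensemble AUC:", roc_auc_score(y_te, ensemble.predict_proba(X_te)[:, 1]))
print("logistic AUC:", roc_auc_score(y_te, baseline.predict_proba(X_te)[:, 1]))
```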
Cilla, M; Pérez-Rey, I; Martínez, M A; Peña, Estefania; Martínez, Javier
2018-06-23
Motivated by the search for new strategies for fitting a material model, a new approach is explored in the present work. The use of numerical algorithms based on machine learning techniques, such as support vector machines for regression, bagged decision trees, and artificial neural networks, is proposed for solving the parameter identification of constitutive laws for soft biological tissues. First, the mathematical tools were trained with analytical uniaxial data (circumferential and longitudinal directions) as inputs and the corresponding material parameters of the Gasser, Ogden and Holzapfel strain energy function (SEF) as outputs. The training and test errors show that the training process efficiently finds correlations between inputs and outputs; moreover, the correlation coefficients were very close to 1. Second, the tool was validated with unseen observations of analytical circumferential and longitudinal uniaxial data. The results show excellent agreement between the predicted material parameters of the SEF and the analytical curves. Finally, data from real circumferential and longitudinal uniaxial tests on different cardiovascular tissues were fitted, and thus the material model of these tissues was predicted. We found that the method was able to consistently identify model parameters, and we believe that the use of these numerical tools could lead to an improvement in the characterization of soft biological tissues. This article is protected by copyright. All rights reserved.
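The sketch below illustrates the inverse-fitting idea on a deliberately simplified constitutive law: synthetic stress-strain curves are generated from a two-parameter exponential model, and a random forest (essentially bagged decision trees with feature subsampling) learns to map curves back to their parameters. The toy law, parameter ranges, and data are assumptions for illustration; the study itself identifies parameters of the Gasser-Ogden-Holzapfel SEF.

```python
# Sketch of inverse parameter identification: generate curves from a toy constitutive law
# sigma = a * (exp(b * eps) - 1), then train a tree ensemble to recover (a, b) from a curve.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
strain = np.linspace(0.0, 0.3, 30)

def stress(a, b):
    # Assumed toy law standing in for the real strain energy function.
    return a * (np.exp(b * strain) - 1.0)

# Synthetic training set of (curve -> parameters) pairs.
params = rng.uniform([10.0, 5.0], [100.0, 20.0], size=(2000, 2))
curves = np.array([stress(a, b) for a, b in params])

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(curves, params)

# Identify parameters for an unseen curve.
true_a, true_b = 55.0, 12.0
pred_a, pred_b = model.predict(stress(true_a, true_b).reshape(1, -1))[0]
print(f"true (a, b) = ({true_a}, {true_b}), predicted = ({pred_a:.1f}, {pred_b:.1f})")
```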
Castaño, Fernando; Beruvides, Gerardo; Villalonga, Alberto; Haber, Rodolfo E
2018-05-10
On-chip LiDAR sensors for vehicle collision avoidance are a rapidly expanding area of research and development. The assessment of reliable obstacle detection using data collected by LiDAR sensors has become a key issue that the scientific community is actively exploring. This paper presents the design and implementation of a self-tuning methodology to maximize the reliability of a LiDAR sensor network for obstacle detection in 'Internet of Things' (IoT) mobility scenarios. The Webots Automobile 3D simulation tool for emulating sensor interaction in complex driving environments is selected to achieve that objective. Furthermore, a model-based framework is defined that employs a point-cloud clustering technique and an error-based prediction model library composed of a multilayer perceptron neural network, k-nearest neighbors, and linear regression models. Finally, a reinforcement learning technique, specifically a Q-learning method, is implemented to determine the number of LiDAR sensors required to increase sensor reliability for obstacle localization tasks. In addition, an IoT driving-assistance user scenario connecting a five-LiDAR sensor network is designed and implemented to validate the accuracy of the computational intelligence-based framework. The results demonstrate that the self-tuning method is an appropriate strategy to increase the reliability of the sensor network while minimizing detection thresholds.
A Delphi Investigation into Future Trends in E-Learning in Israel
ERIC Educational Resources Information Center
Aharony, Noa; Bronstein, Jenny
2014-01-01
The purpose of this study is to investigate the views and opinions of e-learning experts regarding future trends in the e-learning arena. The Delphi technique was chosen as a method of study. This technique is an efficient and effective group communication process designed to systematically elicit judgments from experts in their selected area of…
The Ticket to Retention: A Classroom Assessment Technique Designed to Improve Student Learning
ERIC Educational Resources Information Center
Divoll, Kent A.; Browning, Sandra T.; Vesey, Winona M.
2012-01-01
Classroom assessment techniques (CATs) or other closure activities are widely promoted for use in college classrooms. However, research on whether CATs improve student learning is mixed. The authors posit that the results are mixed because CATs were designed to "help teachers find out what students are learning in the classroom and how well…
COMPOSER: A Probabilistic Solution to the Utility Problem in Speed-up Learning.
ERIC Educational Resources Information Center
Gratch, Jonathan; DeJong, Gerald
In machine learning there is considerable interest in techniques which improve planning ability. Initial investigations have identified a wide variety of techniques to address this issue. Progress has been hampered by the utility problem, a basic tradeoff between the benefit of learned knowledge and the cost to locate and apply relevant knowledge.…
2014-09-30
This ONR grant promotes the development and application of advanced machine learning techniques for the detection and classification of marine mammal sounds. The objective is to engage a broad community of data scientists in this effort.
ERIC Educational Resources Information Center
Servetti, Sara
2010-01-01
This paper focuses on cooperative learning (CL) used as a correction and grammar revision technique and considers the data collected in six Italian parallel classes, three of which (sample classes) corrected mistakes and revised grammar through cooperative learning, while the other three (control classes) did so in a traditional way. All the classes…
Just-in-Time Teaching in Statistics Classrooms
ERIC Educational Resources Information Center
McGee, Monnie; Stokes, Lynne; Nadolsky, Pavel
2016-01-01
Much has been made of the flipped classroom as an approach to teaching, and its effect on student learning. The volume of material showing that the flipped classroom technique helps students better learn and better retain material is increasing at a rapid pace. Coupled with this technique is active learning in the classroom. There are many ways of…
ERIC Educational Resources Information Center
Jacobs, George M.; Power, Michael A.; Inn, Loh Wan
This book demonstrates how classroom teachers can use cooperative learning techniques for lesson planning and classroom management. It emphasizes that cooperation among students is powerful, and it notes that just because students are in a group does not mean that they are cooperating. Part 1, "Getting Started with Cooperative Learning," includes…
Swarm Intelligence: New Techniques for Adaptive Systems to Provide Learning Support
ERIC Educational Resources Information Center
Wong, Lung-Hsiang; Looi, Chee-Kit
2012-01-01
The notion of a system adapting itself to provide support for learning has always been an important issue of research for technology-enabled learning. One approach to provide adaptivity is to use social navigation approaches and techniques which involve analysing data of what was previously selected by a cluster of users or what worked for…
ERIC Educational Resources Information Center
Martin, Larry L.; Hershey, Myrliss
Studied was the effectiveness of biofeedback techniques in reducing the hyperactive behavior of five hyperactive and four nonhyperactive children (all in elementary level learning disability classes). After 10 15-minute biofeedback training sessions over an 8-week period, Ss learned to raise their finger temperatures an average of 12.92 degrees…
Learning Physics through Project-Based Learning Game Techniques
ERIC Educational Resources Information Center
Baran, Medine; Maskan, Abdulkadir; Yasar, Seyma
2018-01-01
The aim of the present study, in which project and game techniques are used together, is to examine the impact of project-based learning games on students' physics achievement. The participants of the study consist of 34 ninth-grade students (N = 34). The data were collected using achievement tests and a questionnaire. Throughout the applications, the…
Learning Strategies: Secondary LD Students in the Mainstream.
ERIC Educational Resources Information Center
D'Antoni, Alice; And Others
The paper presents four learning strategy techniques--the SQ3R method of study, the Multipass Strategy, the Advanced Study Guide Technique, and Cognitive Mapping--for use with secondary level learning disabled students. The SQ3R method involves the five steps of survey, question, read, recite, and review. An adaptation of the SQ3R method, the…
2013-01-01
Background Protein-protein interactions (PPIs) play crucial roles in the execution of various cellular processes and form the basis of biological mechanisms. Although a large amount of PPI data for different species has been generated by high-throughput experimental techniques, the PPI pairs obtained with experimental methods cover only a fraction of the complete PPI networks; furthermore, the experimental methods for identifying PPIs are both time-consuming and expensive. Hence, it is urgent and challenging to develop automated computational methods to efficiently and accurately predict PPIs. Results We present here a novel hierarchical PCA-EELM (principal component analysis-ensemble extreme learning machine) model to predict protein-protein interactions using only protein sequence information. In the proposed method, 11188 protein pairs retrieved from the DIP database were encoded into feature vectors by using four kinds of protein sequence information. For dimension reduction, an effective feature extraction method, PCA, was then employed to construct the most discriminative new feature set. Finally, multiple extreme learning machines were trained and then aggregated into a consensus classifier by majority voting. Ensembling the extreme learning machines removes the dependence of the results on the initial random weights and improves the prediction performance. Conclusions When applied to the PPI data of Saccharomyces cerevisiae, the proposed method achieved 87.00% prediction accuracy with 86.15% sensitivity at a precision of 87.59%. Extensive experiments were performed to compare our method with the state-of-the-art Support Vector Machine (SVM) technique. Experimental results demonstrate that the proposed PCA-EELM outperforms the SVM method under 5-fold cross-validation. Moreover, PCA-EELM is faster than the PCA-SVM-based method. Consequently, the proposed approach can be considered a promising and powerful tool for predicting PPIs with excellent performance in less time. PMID:23815620
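A compact sketch of the PCA-plus-ensemble-ELM idea follows, with a minimal NumPy extreme learning machine (random hidden layer, least-squares readout) and majority voting over several ELMs. The synthetic feature vectors stand in for the protein-sequence encodings used in the paper; dimensions and hyperparameters are illustrative.

```python
# Sketch: PCA for dimension reduction, then an ensemble of extreme learning machines
# combined by majority voting. Features are synthetic stand-ins for sequence encodings.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split

class ELM:
    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                # random hidden layer
        self.beta = np.linalg.pinv(H) @ y               # least-squares output weights
        return self

    def predict(self, X):
        return (np.tanh(X @ self.W + self.b) @ self.beta > 0.5).astype(int)

X, y = make_classification(n_samples=1000, n_features=400, n_informative=40, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=50).fit(X_tr)                    # discriminative low-dimensional features
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)

ensemble = [ELM(n_hidden=200, seed=s).fit(Z_tr, y_tr) for s in range(9)]
votes = np.mean([m.predict(Z_te) for m in ensemble], axis=0)    # fraction of ELMs voting "1"
print(f"ensemble accuracy: {np.mean((votes > 0.5).astype(int) == y_te):.3f}")
```

Averaging the individual ELM votes and thresholding at 0.5 implements majority voting, which is what reduces the sensitivity to each ELM's random hidden weights.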
ERIC Educational Resources Information Center
Gallagher, Rosina Mena
This study evaluates the counseling-learning approach to foreign language instruction as compared with traditional methods in terms of language achievement and change in personal orientation and in attitude toward learning. Twelve students volunteered to learn Spanish or German under simultaneous exposure to both languages using the…
Henry, Teague; Campbell, Ashley
2015-01-01
Objective. To examine factors that determine the interindividual variability of learning within a team-based learning environment. Methods. Students in a pharmacokinetics course were given 4 interim, low-stakes cumulative assessments throughout the semester and a cumulative final examination. Students’ Myers-Briggs personality type was assessed, as well as their study skills, motivations, and attitudes towards team-learning. A latent curve model (LCM) was applied and various covariates were assessed to improve the regression model. Results. A quadratic LCM was applied for the first 4 assessments to predict final examination performance. None of the covariates examined significantly impacted the regression model fit except metacognitive self-regulation, which explained some of the variability in the rate of learning. There were some correlations between personality type and attitudes towards team learning, with introverts having a lower opinion of team-learning than extroverts. Conclusion. The LCM could readily describe the learning curve. Extroverted and introverted personality types had the same learning performance even though preference for team-learning was lower in introverts. Other personality traits, study skills, or practice did not significantly contribute to the learning variability in this course. PMID:25861101
Persky, Adam M; Henry, Teague; Campbell, Ashley
2015-03-25
To examine factors that determine the interindividual variability of learning within a team-based learning environment. Students in a pharmacokinetics course were given 4 interim, low-stakes cumulative assessments throughout the semester and a cumulative final examination. Students' Myers-Briggs personality type was assessed, as well as their study skills, motivations, and attitudes towards team-learning. A latent curve model (LCM) was applied and various covariates were assessed to improve the regression model. A quadratic LCM was applied for the first 4 assessments to predict final examination performance. None of the covariates examined significantly impacted the regression model fit except metacognitive self-regulation, which explained some of the variability in the rate of learning. There were some correlations between personality type and attitudes towards team learning, with introverts having a lower opinion of team-learning than extroverts. The LCM could readily describe the learning curve. Extroverted and introverted personality types had the same learning performance even though preference for team-learning was lower in introverts. Other personality traits, study skills, or practice did not significantly contribute to the learning variability in this course.
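As a crude stand-in for the latent curve model described above, the sketch below fits a quadratic trajectory to each student's four interim assessments and regresses the final-exam score on the fitted growth parameters. The simulated scores and the two-step fit are assumptions for illustration; a proper LCM estimates the growth factors and covariate effects jointly.

```python
# Two-step stand-in for a quadratic latent curve model: per-student quadratic fits,
# then regression of the final exam on intercept, slope, and curvature. Data are simulated.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_students, times = 120, np.array([1.0, 2.0, 3.0, 4.0])

# Simulated interim scores with student-specific growth parameters plus noise.
intercept = rng.normal(60, 8, n_students)
slope = rng.normal(6, 2, n_students)
curv = rng.normal(-0.5, 0.3, n_students)
interim = (intercept[:, None] + slope[:, None] * times + curv[:, None] * times**2
           + rng.normal(0, 3, (n_students, 4)))
final_exam = 0.8 * intercept + 4.0 * slope + rng.normal(0, 5, n_students)

# Per-student quadratic fits (polyfit returns curvature, slope, intercept).
growth = np.array([np.polyfit(times, row, deg=2) for row in interim])

model = LinearRegression().fit(growth, final_exam)
print("R^2 of final exam on growth parameters:", round(model.score(growth, final_exam), 3))
```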
A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...
2016-01-01
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely, in addition, on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1,000 ft2 for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
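The sketch below illustrates the Bayesian-fusion idea at its simplest: an expert-judgment prior on ambient occupancy for one building type is updated with survey counts through a conjugate gamma-Poisson model. All priors, counts, and areas are invented for the example; PDT's actual model structure and data sources are far richer.

```python
# Sketch of Bayesian fusion for ambient occupancy (people per 1,000 ft^2) of one building type:
# a gamma prior from expert judgment is updated with Poisson-distributed survey counts.
import numpy as np

# Prior: experts believe roughly 2 people per 1,000 ft^2, held with moderate confidence.
prior_shape, prior_rate = 4.0, 2.0          # gamma prior, mean = shape / rate = 2.0

# Survey data (invented): head counts in surveyed buildings and their areas in 1,000 ft^2.
counts = np.array([12, 7, 22, 15])
areas = np.array([5.0, 3.0, 10.0, 6.0])

# Conjugate update for a Poisson rate with exposure (floor area).
post_shape = prior_shape + counts.sum()
post_rate = prior_rate + areas.sum()

mean = post_shape / post_rate
samples = np.random.default_rng(0).gamma(post_shape, 1.0 / post_rate, 20000)
low, high = np.percentile(samples, [2.5, 97.5])
print(f"posterior occupancy: {mean:.2f} people/1,000 ft^2 (95% interval {low:.2f}-{high:.2f})")
```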
Lom, Barbara
2012-01-01
The traditional science lecture, where an instructor delivers a carefully crafted monolog to a large audience of students who passively receive the information, has been a popular mode of instruction for centuries. Recent evidence on the science of teaching and learning indicates that learner-centered, active teaching strategies can be more effective learning tools than traditional lectures. Yet most colleges and universities retain lectures as their central instructional method. This article highlights several simple collaborative teaching techniques that can be readily deployed within traditional lecture frameworks to promote active learning. Specifically, this article briefly introduces the techniques of: reader’s theatre, think-pair-share, roundtable, jigsaw, in-class quizzes, and minute papers. Each technique is broadly applicable well beyond neuroscience courses and easily modifiable to serve an instructor’s specific pedagogical goals. The benefits of each technique are described along with specific examples of how each technique might be deployed within a traditional lecture to create more active learning experiences. PMID:23494568
Machine learning in heart failure: ready for prime time.
Awan, Saqib Ejaz; Sohel, Ferdous; Sanfilippo, Frank Mario; Bennamoun, Mohammed; Dwivedi, Girish
2018-03-01
The aim of this review is to present an up-to-date overview of the application of machine learning methods in heart failure including diagnosis, classification, readmissions and medication adherence. Recent studies have shown that the application of machine learning techniques may have the potential to improve heart failure outcomes and management, including cost savings by improving existing diagnostic and treatment support systems. Recently developed deep learning methods are expected to yield even better performance than traditional machine learning techniques in performing complex tasks by learning the intricate patterns hidden in big medical data. The review summarizes the recent developments in the application of machine and deep learning methods in heart failure management.
Storytelling: a teaching-learning technique.
Geanellos, R
1996-03-01
Nurses' stories, arising from the practice world, reconstruct the essence of experience as lived and provide vehicles for learning about nursing. The learning process is forwarded by combining storytelling and reflection. Reflection represents an active, purposive, contemplative and deliberative approach to learning through which learners create meaning from the learning experience. The combination of storytelling and reflection allows the creation of links between the materials at hand and prior and future learning. As a teaching-learning technique storytelling engages learners; organizes information; allows exploration of shared lived experiences without the demands, responsibilities and consequences of practice; facilitates remembering; enhances discussion, problem posing and problem solving; and aids understanding of what it is to nurse and to be a nurse.
Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis
NASA Astrophysics Data System (ADS)
Chernoded, Andrey; Dudko, Lev; Myagkov, Igor; Volkov, Petr
2017-10-01
Most of the modern analyses in high energy physics use signal-versus-background classification techniques based on machine learning methods, and neural networks in particular. Deep learning neural networks are the most promising modern technique for separating signal from background and can nowadays be widely and successfully implemented as part of a physics analysis. In this article we compare deep learning and Bayesian neural networks as classifiers in an instance of top quark analysis.
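As a small illustration of signal-versus-background classification on tabular event features, the sketch below trains a dense network in TensorFlow/Keras and keeps dropout active at prediction time (Monte Carlo dropout) as a rough proxy for the predictive spread a Bayesian neural network would provide. The features, architecture, and data are illustrative assumptions, not those of the top-quark analysis.

```python
# Sketch: dense signal/background classifier on synthetic event features, with MC dropout
# used as a crude stand-in for Bayesian predictive uncertainty.
import numpy as np
import tensorflow as tf
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=4000, n_features=12, n_informative=8, random_state=0)
X = X.astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(12,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC()])
model.fit(X, y, epochs=5, batch_size=128, validation_split=0.2, verbose=0)

# Monte Carlo dropout: repeated stochastic forward passes give a spread per event.
passes = np.stack([model(X[:5], training=True).numpy().ravel() for _ in range(50)])
print("mean signal probability:", passes.mean(axis=0).round(3))
print("per-event spread:       ", passes.std(axis=0).round(3))
```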
B-tree search reinforcement learning for model based intelligent agent
NASA Astrophysics Data System (ADS)
Bhuvaneswari, S.; Vignashwaran, R.
2013-03-01
Agents trained by learning techniques provide a powerful approximation of active solutions for naive approaches. In this study, B-trees combined with reinforcement learning are used to moderate the data search for information retrieval, achieving accuracy with minimum search time. The impact of the variables and tactics applied in training is determined using reinforcement learning. Agents based on these techniques perform at a satisfactory baseline and act as finite agents, based on the predetermined model, against competitors from the course.
Computational materials design of crystalline solids.
Butler, Keith T; Frost, Jarvist M; Skelton, Jonathan M; Svane, Katrine L; Walsh, Aron
2016-11-07
The modelling of materials properties and processes from first principles is becoming sufficiently accurate as to facilitate the design and testing of new systems in silico. Computational materials science is both valuable and increasingly necessary for developing novel functional materials and composites that meet the requirements of next-generation technology. A range of simulation techniques are being developed and applied to problems related to materials for energy generation, storage and conversion including solar cells, nuclear reactors, batteries, fuel cells, and catalytic systems. Such techniques may combine crystal-structure prediction (global optimisation), data mining (materials informatics) and high-throughput screening with elements of machine learning. We explore the development process associated with computational materials design, from setting the requirements and descriptors to the development and testing of new materials. As a case study, we critically review progress in the fields of thermoelectrics and photovoltaics, including the simulation of lattice thermal conductivity and the search for Pb-free hybrid halide perovskites. Finally, a number of universal chemical-design principles are advanced.
Petascale supercomputing to accelerate the design of high-temperature alloys
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; ...
2017-10-25
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
Petascale supercomputing to accelerate the design of high-temperature alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sternberg, B.K.; Thomas, S.J.
1992-12-01
The overall objective of the project was to apply a new high-resolution imaging system to water resource investigations. This imaging system measures the ellipticity of received magnetic-field components. The source of the magnetic field is a long-line transmitter emitting frequencies from 30 Hz to 30 kHz. A new high-accuracy calibration method was used to enhance the resolution of the measurements. The specific objectives included: (1) refine the system hardware and software based on these investigations, (2) learn the limitations of this technology in practical water resource investigations, and (3) improve interpretation techniques to extract the highest possible resolution. Successful field surveys were run at: (1) San Xavier Mine, Arizona - flow of injected fluid was monitored with the system. (2) Avra Valley, Arizona - subsurface stratigraphy was imaged. A survey at a third site was less successful; the interpreted resistivity section does not agree with nearby well logs. Surveys are continuing at this site.
The Impact of Machine Translation and Computer-aided Translation on Translators
NASA Astrophysics Data System (ADS)
Peng, Hao
2018-03-01
Under the context of globalization, communication between countries and cultures is becoming increasingly frequent, which makes it imperative to use techniques that assist translation. This paper explores the influence of machine translation (MT) and computer-aided translation (CAT) on translators. Following an introduction to the development of machine and computer-aided translation, it describes the technologies available to translators, analyzes the demand for computer-aided translation in translation practice, considers how the design of computer-aided translation techniques can be optimized, and examines their operability in translation. The findings underline the advantages and disadvantages of MT and CAT tools, and the serviceability and future development of MT and CAT technologies. Finally, this thesis probes into the impact of these new technologies on translators, in the hope that more translators and translation researchers can learn to use such tools to improve their productivity.
Petascale supercomputing to accelerate the design of high-temperature alloys
NASA Astrophysics Data System (ADS)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; Haynes, J. Allen
2017-12-01
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ‧-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. The approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
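The final step the abstract mentions, predicting segregation energies without new first-principles calculations, can be illustrated with a simple surrogate regression. The sketch below fits gradient-boosted trees to invented elemental descriptors and synthetic energies; the study's actual features and targets come from its DFT workflow, so every number here is an assumption.

```python
# Sketch: surrogate regression of interface segregation energies on simple elemental
# descriptors, as a cheap stand-in for repeating DFT supercell calculations.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_elements = 34
# Assumed descriptors: atomic radius, electronegativity, valence-electron count, bulk modulus.
X = np.column_stack([
    rng.uniform(1.1, 2.2, n_elements),     # radius (angstrom)
    rng.uniform(0.8, 2.6, n_elements),     # electronegativity
    rng.integers(1, 12, n_elements),       # valence electrons
    rng.uniform(10, 300, n_elements),      # bulk modulus (GPa)
])
# Synthetic segregation energies loosely tied to the descriptors (eV).
y = (0.6 * X[:, 0] - 0.3 * X[:, 1] + 0.02 * X[:, 2] - 0.001 * X[:, 3]
     + rng.normal(0, 0.05, n_elements))

model = GradientBoostingRegressor(random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print("cross-validated MAE (eV):", round(-scores.mean(), 3))
```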
ERIC Educational Resources Information Center
Huet, Michael; Jacobs, David M.; Camachon, Cyril; Missenard, Olivier; Gray, Rob; Montagne, Gilles
2011-01-01
The present study reports two experiments in which a total of 20 participants without prior flight experience practiced the final approach phase in a fixed-base simulator. All participants received self-controlled concurrent feedback during 180 practice trials. Experiment 1 shows that participants learn more quickly under variable practice…