Science.gov

Sample records for large classroom setting

  1. The Utility of Concept Maps to Facilitate Higher-Level Learning in a Large Classroom Setting

    PubMed Central

    Carr-Lopez, Sian M.; Vyas, Deepti; Patel, Rajul A.; Gnesa, Eric H.

    2014-01-01

    Objective. To describe the utility of concept mapping in a cardiovascular therapeutics course within a large classroom setting. Design. Students enrolled in a cardiovascular care therapeutics course completed concept maps for each major chronic cardiovascular condition. A grading rubric was used to facilitate peer-assessment of the concept maps. Assessment. Students were administered a survey at the end of the course assessing their perceptions of the usefulness of the concept maps during the course and also during APPEs to assess utility beyond the course. Question item analyses were conducted on cumulative final examinations comparing student performance on concept-mapped topics with non-concept-mapped topics. Conclusion. Concept maps help to facilitate meaningful learning within the course, and the majority of students utilized them beyond the course. PMID:26056408

  2. Calibrated Peer Review: A New Tool for Integrating Information Literacy Skills in Writing-Intensive Large Classroom Settings

    ERIC Educational Resources Information Center

    Fosmire, Michael

    2010-01-01

    Calibrated Peer Review[TM] (CPR) is a program that can significantly enhance the ability to integrate intensive information literacy exercises into large classroom settings. CPR is founded on a solid pedagogic base for learning, and it is formulated in such a way that information skills can easily be inserted. However, there is no mention of its…

  3. Impact of problem-based learning in a large classroom setting: student perception and problem-solving skills.

    PubMed

    Klegeris, Andis; Hurren, Heather

    2011-12-01

    Problem-based learning (PBL) can be described as a learning environment where the problem drives the learning. This technique usually involves learning in small groups, which are supervised by tutors. It is becoming evident that PBL in a small-group setting has a robust positive effect on student learning and skills, including better problem-solving skills and an increase in overall motivation. However, very little research has been done on the educational benefits of PBL in a large classroom setting. Here, we describe a PBL approach (using tutorless groups) that was introduced as a supplement to standard didactic lectures in University of British Columbia Okanagan undergraduate biochemistry classes consisting of 45-85 students. PBL was chosen as an effective method to assist students in learning biochemical and physiological processes. By monitoring student attendance and using informal and formal surveys, we demonstrated that PBL has a significant positive impact on student motivation to attend and participate in the course work. Student responses indicated that PBL is superior to traditional lecture format with regard to the understanding of course content and retention of information. We also demonstrated that student problem-solving skills are significantly improved, but additional controlled studies are needed to determine how much PBL exercises contribute to this improvement. These preliminary data indicated several positive outcomes of using PBL in a large classroom setting, although further studies aimed at assessing student learning are needed to further justify implementation of this technique in courses delivered to large undergraduate classes. PMID:22139779

  4. Active Learning in a Large Medical Classroom Setting for Teaching Renal Physiology

    ERIC Educational Resources Information Center

    Dietz, John R.; Stevenson, Frazier T.

    2011-01-01

    In this article, the authors describe an active learning exercise which has been used to replace some lecture hours in the renal portion of an integrated, organ system-based curriculum for first-year medical students. The exercise takes place in a large auditorium with ~150 students. The authors, who are faculty members, lead the discussions,…

  5. Setting Up a Classroom Business

    ERIC Educational Resources Information Center

    Morgan, Madeline L.

    1977-01-01

    Junior high school home economics students plan and operate a holiday boutique in their school. The organization, operation, and evaluation involved in setting up a simulated business in the classroom are described. (BM)

  6. A Classroom Tariff-Setting Game

    ERIC Educational Resources Information Center

    Winchester, Niven

    2006-01-01

    The author outlines a classroom tariff-setting game that allows students to explore the consequences of import tariffs imposed by large countries (countries able to influence world prices). Groups of students represent countries, which are organized into trading pairs. Each group's objective is to maximize welfare by choosing an appropriate ad…

  8. Controlling Setting Events in the Classroom

    ERIC Educational Resources Information Center

    Chan, Paula E.

    2016-01-01

    Teachers face the challenging job of differentiating instruction for the diverse needs of their students. This task is difficult enough with happy students who are eager to learn; unfortunately students often enter the classroom in a bad mood because of events that happened outside the classroom walls. These events--called setting events--can…

  9. Individualizing in Traditional Classroom Settings.

    ERIC Educational Resources Information Center

    Thornell, John G.

    1980-01-01

    Effective individualized instruction depends primarily on the teacher possessing the skills to implement it. Individualization is therefore quite compatible with the traditional self-contained elementary classroom model, but not with its alternative, departmentalization, which allows teachers neither the time flexibility nor the familiarity with…

  10. Improvement in Generic Problem-Solving Abilities of Students by Use of Tutor-Less Problem-Based Learning in a Large Classroom Setting

    ERIC Educational Resources Information Center

    Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather

    2013-01-01

    Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such…

  12. Improvement in Generic Problem-Solving Abilities of Students by Use of Tutor-less Problem-Based Learning in a Large Classroom Setting

    PubMed Central

    Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather

    2013-01-01

    Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such techniques over classical teaching styles. Previously, we demonstrated that introduction of tutor-less PBL in a large third-year biochemistry undergraduate class increased student satisfaction and attendance. The current study assessed the generic problem-solving abilities of students from the same class at the beginning and end of the term, and compared student scores with similar data obtained in three classes not using PBL. Two generic problem-solving tests of equal difficulty were administered such that students took different tests at the beginning and the end of the term. Blinded marking showed a statistically significant 13% increase in the test scores of the biochemistry students exposed to PBL, while no trend toward significant change in scores was observed in any of the control groups not using PBL. Our study is among the first to demonstrate that use of tutor-less PBL in a large classroom leads to statistically significant improvement in generic problem-solving skills of students. PMID:23463230

  13. Implementing iPads in the Inclusive Classroom Setting

    ERIC Educational Resources Information Center

    Maich, Kimberly; Hall, Carmen

    2016-01-01

    This column provides practical suggestions to help guide teachers in utilizing classroom sets of iPads. Following a brief introduction to tablet technology in inclusive classrooms and the origin of these recommendations from a case study focus group, important elements of setting up classroom iPad use, from finding funding to teaching apps, are…

  15. Collaboration within Large Groups in the Classroom

    ERIC Educational Resources Information Center

    Szewkis, Eyal; Nussbaum, Miguel; Rosen, Tal; Abalos, Jose; Denardin, Fernanda; Caballero, Daniela; Tagle, Arturo; Alcoholado, Cristian

    2011-01-01

    The purpose of this paper is to show how a large group of students can work collaboratively in a synchronous way within the classroom using the cheapest possible technological support. Making use of the features of Single Display Groupware and of Multiple Mice we propose a computer-supported collaborative learning approach for big groups within…

  16. Using Microcomputers Interactively in Large Classrooms.

    ERIC Educational Resources Information Center

    Bowman, Barbara E.; Ellsworth, Randy

    In 1980, Wichita State University received a grant to introduce microcomputers as interactive teaching tools in large science classrooms. Through this grant, 18 faculty in 11 departments developed software modules illustrating concepts that are often difficult to teach by usual lecture methods. To determine whether the use of microcomputers in…

  17. Analyzing Multimodal Interaction within a Classroom Setting

    ERIC Educational Resources Information Center

    Moura, Heloisa

    2006-01-01

    Human interactions are multimodal in nature. From simple to complex forms of transferal of information, human beings draw on a multiplicity of communicative modes, such as intonation and gaze, to make sense of everyday experiences. Likewise, the learning process, either within traditional classrooms or Virtual Learning Environments, is shaped by…

  18. Tangential Floor in a Classroom Setting

    ERIC Educational Resources Information Center

    Marti, Leyla

    2012-01-01

    This article examines floor management in two classroom sessions: a task-oriented computer lesson and a literature lesson. Recordings made in the computer lesson show the organization of floor when a task is given to students. Temporary or "incipient" side floors (Jones and Thornborrow, 2004) emerge beside the main floor. In the literature lesson,…

  19. A Practical Setting of Distance Learning Classroom.

    ERIC Educational Resources Information Center

    Wang, Shousan; Buck, Lawrence

    1996-01-01

    Describes a distance-learning classroom developed and used by Central Connecticut State University for nurse training, educational statistics, mathematics, and technology courses. Discusses initial engineering, video cameras, video source switching, lighting, audio, and other technical and related aspects. Block diagrams and lists of equipment for…

  20. Student Engagement and Success in the Large Astronomy 101 Classroom

    NASA Astrophysics Data System (ADS)

    Jensen, J. B.

    2014-07-01

    The large auditorium classroom presents unique challenges to maintaining student engagement. During the fall 2012 semester, I adopted several specific strategies for increasing student engagement and reducing anonymity, with the goal of maximizing student success in the large class. I measured attendance and student success in two classes, one with 300 students and one with 42, otherwise taught as similarly as possible. While the students in the large class probably did better than they would have in a traditional lecture setting, attendance was still significantly lower in the large class, resulting in lower student success than in the small control class by all measures. I will discuss these results and compare them to classes in previous semesters, including other small classes and large Distance Education classes conducted live over a remote television link.

  1. Social Studies Instruction in a Non-Classroom Setting.

    ERIC Educational Resources Information Center

    Murphy, Margaret M.

    Certain areas in the social studies can be effectively taught in a non-classroom setting. This experiment determined if, in a supermarket situation, consumer preferences (as measured in sales figures and augmented by questionnaire data) could be altered by the addition of nutritional information to the labels of sixteen items which had moderate…

  2. Observation Instrument of Play Behaviour in a Classroom Setting

    ERIC Educational Resources Information Center

    Berkhout, Louise; Hoekman, Joop; Goorhuis-Brouwer, Sieneke M.

    2012-01-01

    The objective of this study was to develop an instrument to observe the play behaviour of a whole group of children from four to six years of age in a classroom setting on the basis of video recording. The instrument was developed in collaboration with experienced teachers and experts on play. Categories of play were derived from the literature…

  3. Enhancing Feedback via Peer Learning in Large Classrooms

    ERIC Educational Resources Information Center

    Zher, Ng Huey; Hussein, Raja Maznah Raja; Saat, Rohaida Mohd

    2016-01-01

    Feedback has been lauded as a key pedagogical tool in higher education. Unfortunately, the value of feedback falls short when being carried out in large classrooms. In this study, strategies for sustaining feedback in large classroom based on peer learning are explored. All the characteristics identified within the concept of peer learning were…

  4. Teacher and Student Research Using Large Data Sets

    NASA Astrophysics Data System (ADS)

    Croft, S. K.; Pompea, S. M.; Sparks, R. T.

    2005-12-01

    One of the objectives of teacher research experiences is to immerse the teacher in an authentic research situation to help the teacher understand what real research is all about: "to do science as scientists do." Experiences include doing experiments in laboratories, gathering data out in the field, and observing at professional observatories. However, a rapidly growing area of scientific research is in "data mining" increasingly large public data archives. In the earth and space sciences, such large archives are built around data from Landsat 7, the Sloan Digital Sky Survey, and in about seven years, the Large Synoptic Survey Telescope. The LSST will re-photograph the entire night sky every three days, resulting in a data flow of about 20 terabytes per night. The resulting LSST archive will represent a huge storage and retrieval challenge even for professional scientists. It will be a much greater challenge to help K-12 teachers use such gargantuan files and collections of data effectively in the classroom and to understand and begin to practice the new research procedures involved in data mining. At NOAO we are exploring ways of using large data sets in formal educational settings like classrooms, and public settings like planetariums and museums. In our existing professional development programs, such as our Teacher Leaders in Research Based Science Education, we have introduced teachers to research via on-site observing experiences and partnerships with active astronomers. To successfully initiate research in the classroom, we have found that teachers need training in specific science content, use of specialized software to work with the data, development of research questions and objectives, and explicit pedagogical strategies for classroom use. Our research projects are well defined, though not "canned," and incorporate specific types of data, such as solar images. These data can be replaced with new data from an archive for the classroom research experience. This is already a form of data mining that can be applied to large data sets. We are looking at ways to apply our experience with hands-on observation experiences to the relatively abstract world of data mining. We are also looking at ways to move beyond the well-defined application to training teachers to develop their own more open-ended research activities. NOAO is operated by the Association of Universities for Research in Astronomy (AURA), Inc., under cooperative agreement with the National Science Foundation.
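    A back-of-envelope sketch of the archive growth implied by the quoted data rate may make the scale concrete. The ~20 TB/night figure comes from the abstract; the number of observing nights per year and the survey length are assumptions for illustration only:

```python
# Archive-growth estimate from the quoted ~20 TB/night data rate.
TB_PER_NIGHT = 20
OBSERVING_NIGHTS_PER_YEAR = 300  # assumed; weather and maintenance reduce 365
SURVEY_YEARS = 10                # assumed survey length

tb_per_year = TB_PER_NIGHT * OBSERVING_NIGHTS_PER_YEAR  # 6000 TB
pb_per_year = tb_per_year / 1000                        # petabytes per year
pb_total = pb_per_year * SURVEY_YEARS                   # over the whole survey

print(f"{pb_per_year:.1f} PB/year, {pb_total:.0f} PB over the survey")
# → 6.0 PB/year, 60 PB over the survey
```

    Even under these conservative assumptions, the archive lands in the tens of petabytes, which is why the abstract frames storage and retrieval alone as a challenge before any data mining begins.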

  5. Examining the Effectiveness of Team-Based Learning (TBL) in Different Classroom Settings

    ERIC Educational Resources Information Center

    Yuretich, Richard F.; Kanner, Lisa C.

    2015-01-01

    The problem of effective learning in college classrooms, especially in a large lecture setting, has been a topic of discussion for a considerable span of time. Most efforts to improve learning incorporate various forms of student-active learning, such as in-class investigations or problems, group discussions, collaborative examinations and…

  7. Observations of Children’s Interactions with Teachers, Peers, and Tasks across Preschool Classroom Activity Settings

    PubMed Central

    Booren, Leslie M.; Downer, Jason T.; Vitiello, Virginia E.

    2014-01-01

    This descriptive study examined classroom activity settings in relation to children’s observed behavior during classroom interactions, child gender, and basic teacher behavior within the preschool classroom. A total of 145 children were observed for an average of 80 minutes during 8 occasions across 2 days using the inCLASS, an observational measure that conceptualizes behavior into teacher, peer, task, and conflict interactions. Findings indicated that on average children’s interactions with teachers were higher in teacher-structured settings, such as large group. On average, children’s interactions with peers and tasks were more positive in child-directed settings, such as free choice. Children experienced more conflict during recess and routines/transitions. Finally, gender differences were observed within small group and meals. The implications of these findings might encourage teachers to be thoughtful and intentional about what types of support and resources are provided so children can successfully navigate the demands of particular settings. These findings are not meant to discourage certain teacher behaviors or imply value of certain classroom settings; instead, by providing an evidence-based picture of the conditions under which children display the most positive interactions, teachers can be more aware of choices within these settings and have a powerful way to assist in professional development and interventions. PMID:25717282

  8. A Student Response System in an Electronic Classroom: Technology Aids for Large Classroom Instruction

    NASA Astrophysics Data System (ADS)

    Ober, D.; Errington, P.; Islam, S.; Robertson, T.; Watson, J.

    1997-10-01

    In the fall of 1996, thirteen (13) classrooms on the Ball State campus were equipped with technological aids to enhance learning in large classrooms (typically 100 students or larger). Each classroom was equipped with the following built-in equipment: computer, zip drive, laser disc player, VCR, LAN and Internet connection, TV monitors, and Elmo overhead camera with large-screen projection system. This past fall semester a student response system was added to a 108-seat classroom in the Physics and Astronomy department for use with large General Education courses. Each student seat was equipped with a hardwired hand-held unit possessing input capabilities and LCD feedback for the student. The student response system was added in order to encourage more active learning by students in the large classroom environment. Attendance, quizzes, hour exams, and in-class surveys are early uses for the system; initial reactions by student and faculty users will be given.
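    As a rough illustration of what such a system collects, the instructor-side aggregation amounts to a per-question tally of submitted answers. The data layout and function below are hypothetical, not the Ball State system's actual software:

```python
# Hypothetical tally for a hardwired classroom response system:
# each seat submits one answer per question, and the instructor
# sees the answer distribution immediately.
from collections import Counter

def tally(responses):
    """Map each question id to a Counter of submitted answers."""
    by_question = {}
    for question_id, seat, answer in responses:
        by_question.setdefault(question_id, Counter())[answer] += 1
    return by_question

submissions = [  # (question id, seat number, answer) - made-up sample data
    ("Q1", 1, "A"), ("Q1", 2, "C"), ("Q1", 3, "A"),
    ("Q2", 1, "B"), ("Q2", 2, "B"), ("Q2", 3, "D"),
]
result = tally(submissions)
print(result["Q1"].most_common(1))  # → [('A', 2)]
```

    The same per-question distribution supports the uses the abstract lists: attendance (any submission from a seat), quizzes (compare against a key), and in-class surveys (show the raw distribution).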

  9. Using Flipped Classroom Approach to Explore Deep Learning in Large Classrooms

    ERIC Educational Resources Information Center

    Danker, Brenda

    2015-01-01

    This project used two Flipped Classroom approaches to stimulate deep learning in large classrooms during the teaching of a film module as part of a Diploma in Performing Arts course at Sunway University, Malaysia. The flipped classes utilized either a blended learning approach where students first watched online lectures as homework, and then…

  10. Using problem-based learning in a large classroom.

    PubMed

    Pastirik, Pamela J

    2006-09-01

    Although PBL (problem-based learning) has gained increasing acceptance as an alternative to teacher-centered methods in nursing education, there are challenges to implementing this method in conventional course-based curricula due to the lack of additional faculty tutors to facilitate and monitor small-group process. Little is known in nursing education regarding the effectiveness of teaching PBL in large group settings. Woods [1996. Problem-based Learning for Large Classes in Chemical Engineering. In: Wilkerson, L., Gijselaers, W. (Eds.), Bringing Problem-based Learning to Higher Education: Theory and Practice. Jossey-Bass, San Francisco, pp. 91-99] suggests that there are significant challenges related to student acceptance of the method, monitoring small-group process, and evaluating the quality of students' work. This paper will provide a description of the process and outcome of using PBL in a second-year Baccalaureate nursing course using both classroom and on-line learning technology. Findings from a student survey will be included to highlight the strengths and challenges of using PBL in a large group setting with one faculty tutor. Implications for using PBL in this format will be provided. PMID:19040887

  11. Patterns of classroom discourse in an integrated, interpreted elementary school setting.

    PubMed

    Shaw, J; Jamieson, J

    1997-03-01

    The purpose of this study was to describe the patterns of classroom discourse experienced by an integrated deaf child with full-time interpreting services in an elementary setting. The child was videotaped for 3 hours during classroom instructional time. The videotapes were analyzed to determine patterns of discourse between the child and the teacher and the child and the interpreter, as well as to gauge the deaf student's accessibility to teacher-class discourse. It was found that the deaf student interacted predominantly with the interpreter; in fact, this student received more direct instruction from the interpreter than from the teacher. The discourse to which the student was exposed was largely academic, rather than cultural or social. Findings are discussed in terms of the extent to which implicit classroom discourse and cultural knowledge were inaccessible to the deaf student. PMID:9127500

  12. Using News Articles to Build a Critical Literacy Classroom in an EFL Setting

    ERIC Educational Resources Information Center

    Park, Yujong

    2011-01-01

    This article examines an effort to support critical literacy in an English as a foreign language (EFL) setting by analyzing one college EFL reading classroom in which students read and responded to articles from "The New Yorker". Data include transcribed audiotapes of classroom interaction and interviews with students, classroom materials, and…

  13. Observations of Children's Interactions with Teachers, Peers, and Tasks across Preschool Classroom Activity Settings

    ERIC Educational Resources Information Center

    Booren, Leslie M.; Downer, Jason T.; Vitiello, Virginia E.

    2012-01-01

    Research Findings: This descriptive study examined classroom activity settings in relation to children's observed behavior during classroom interactions, child gender, and basic teacher behavior within the preschool classroom. A total of 145 children were observed for an average of 80 min during 8 occasions across 2 days using the Individualized…

  14. Content Specific Classroom Libraries in a Middle School Setting

    ERIC Educational Resources Information Center

    Ray, Stacy T.

    2011-01-01

    In September of 2008 Scholastic Book Company donated content specific classroom libraries to one core team of sixth grade classrooms at Wentzville Middle School in the Wentzville School District. This was the first time that Scholastic had been involved in the concept of "team" libraries. The classrooms involved consisted of mathematics,…

  15. Lessons Learned from a Multiculturally, Economically Diverse Classroom Setting.

    ERIC Educational Resources Information Center

    Lyman, Lawrence

    For her sabbatical a professor of teacher education at Emporia State University returned to the elementary classroom after a 20-year absence to teach in a third/fourth combination classroom in the Emporia, Kansas Public Schools. The return to elementary classroom teaching provided the professor with the opportunity to utilize some of the social…

  16. On Flipping the Classroom in Large First Year Calculus Courses

    ERIC Educational Resources Information Center

    Jungic, Veselin; Kaur, Harpreet; Mulholland, Jamie; Xin, Cindy

    2015-01-01

    Over the course of two years, 2012-2014, we have implemented a "flipping" the classroom approach in three of our large enrolment first year calculus courses: differential and integral calculus for scientists and engineers. In this article we describe the details of our particular approach and share with the reader some experiences of…

  18. Designing an Electronic Classroom for Large College Courses.

    ERIC Educational Resources Information Center

    Aiken, Milam W.; Hawley, Delvin D.

    1995-01-01

    Describes a state-of-the-art electronic classroom at the University of Mississippi School of Business designed for large numbers of students and regularly scheduled classes. Highlights include: architecture of the room, hardware components, software utilized in the room, and group decision support system software and its uses. (JKP)

  19. Parallel visualization of large data sets

    NASA Astrophysics Data System (ADS)

    Rosenberg, Robert O.; Lanzagorta, Marco O.; Chtchelkanova, Almadena; Khokhlov, Alexei

    2000-02-01

    In this paper we describe our efforts towards the parallel visualization of large data sets. We describe the fully threaded tree (FTT) structure developed at NRL to tackle the problem of massive parallel calculations using adaptive mesh refinement methods. All operations with FTT are performed in parallel and require only a small memory overhead. The FTT can be viewed as a data compression scheme that dramatically improves the performance of standard finite difference algorithms by performing calculations on the compressed data in situ. Because of the tremendous benefits of this type of data structure, it is of great interest to develop visualization algorithms that are native to the FTT. Using the FTT library, we convert the FTT data structure to an unstructured data set and discuss applications to both scatter dot visualization and parallel ray-tracing. The latter technique gives a good indication of the performance and scalability of the FTT algorithm for ray-tracing. We then discuss conversion of the FTT data structure for virtual reality visualization in an immersive room. Our results are presented using an example of a numerical calculation of a detonation in a rectangular cavity using from 1 to 2 million cells.
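    The "compression" benefit of tree-based adaptive refinement can be illustrated with a generic quadtree sketch. This is not the NRL fully threaded tree layout, and the refinement criterion (refining only near a diagonal feature) is a made-up stand-in for a real error estimator:

```python
# Generic quadtree refinement sketch: refine only where a hypothetical
# feature lies (here, near the diagonal x == y of the unit square), then
# compare cell counts against a uniform grid at the same finest resolution.

def needs_refinement(x0, y0, size, depth, max_depth):
    # Made-up criterion: refine cells that straddle the diagonal.
    return depth < max_depth and abs(x0 - y0) < size

def count_cells(x0=0.0, y0=0.0, size=1.0, depth=0, max_depth=6):
    """Count all tree nodes (internal cells plus leaves)."""
    if not needs_refinement(x0, y0, size, depth, max_depth):
        return 1  # leaf cell
    half = size / 2
    return 1 + sum(
        count_cells(x0 + dx * half, y0 + dy * half, half, depth + 1, max_depth)
        for dx in (0, 1) for dy in (0, 1)
    )

adaptive = count_cells()
uniform = 4 ** 6  # uniform grid with every cell at the finest level
print(adaptive, uniform)  # the adaptive mesh stores far fewer cells
```

    Because only cells near the feature are subdivided, the tree holds a few hundred cells where a uniform grid at the same finest resolution needs 4096, which is the sense in which such a structure acts as a compression scheme for mesh data.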

  20. Peer Educators in Classroom Settings: Effective Academic Partners

    ERIC Educational Resources Information Center

    Owen, Julie E.

    2011-01-01

    Involving undergraduates in the design, delivery, and evaluation of classroom-based learning enhances student ownership of the learning environment and stimulates peer interest in the transformative possibilities of education. As bell hooks (1994) eloquently describes, the process of honoring student voices in the classroom enhances "the…

  2. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    ERIC Educational Resources Information Center

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  3. The Emergence of Student Creativity in Classroom Settings: A Case Study of Elementary Schools in Korea

    ERIC Educational Resources Information Center

    Cho, Younsoon; Chung, Hye Young; Choi, Kyoulee; Seo, Choyoung; Baek, Eunjoo

    2013-01-01

    This research explores the emergence of student creativity in classroom settings, specifically within two content areas: science and social studies. Fourteen classrooms in three elementary schools in Korea were observed, and the teachers and students were interviewed. The three types of student creativity emerging in the teaching and learning…

  6. Clickers in the large classroom: current research and best-practice tips.

    PubMed

    Caldwell, Jane E

    2007-01-01

    Audience response systems (ARS) or clickers, as they are commonly called, offer a management tool for engaging students in the large classroom. Basic elements of the technology are discussed. These systems have been used in a variety of fields and at all levels of education. Typical goals of ARS questions are discussed, as well as methods of compensating for the reduction in lecture time that typically results from their use. Examples of ARS use occur throughout the literature and often detail positive attitudes from both students and instructors, although exceptions do exist. When used in classes, ARS clickers typically have either a benign or positive effect on student performance on exams, depending on the method and extent of their use, and create a more positive and active atmosphere in the large classroom. These systems are especially valuable as a means of introducing and monitoring peer learning methods in the large lecture classroom. So that the reader may use clickers effectively in his or her own classroom, a set of guidelines for writing good questions and a list of best-practice tips have been culled from the literature and experienced users. PMID:17339389

  7. Twelve Practical Strategies To Prevent Behavioral Escalation in Classroom Settings.

    ERIC Educational Resources Information Center

    Shukla-Mehta, Smita; Albin, Richard W.

    2003-01-01

    Twelve practical strategies that can be used by classroom teachers to prevent behavioral escalation are discussed, including reinforce calm, know the triggers, pay attention to anything unusual, do not escalate, intervene early, know the function of problem behavior, use extinction wisely, teach prosocial behavior, and teach academic survival…

  8. Setting of Classroom Environments for Hearing Impaired Children

    ERIC Educational Resources Information Center

    Turan, Zerrin

    2007-01-01

    This paper aims to explain effects of acoustical environments in sound perception of hearing impaired people. Important aspects of sound and hearing impairment are explained. Detrimental factors in acoustic conditions for speech perception are mentioned. Necessary acoustic treatment in classrooms and use of FM systems to eliminate these factors…

  10. Understanding Bystander Perceptions of Cyberbullying in Inclusive Classroom Settings

    ERIC Educational Resources Information Center

    Guckert, Mary

    2013-01-01

    Cyberbullying is a pervasive problem that puts students at risk of successful academic outcomes and the ability to feel safe in school. As most students with disabilities are served in inclusive classrooms, there is a growing concern that students with special needs are at an increased risk of online bullying harassment. Enhancing responsible

  11. Thinking Routines: Replicating Classroom Practices within Museum Settings

    ERIC Educational Resources Information Center

    Wolberg, Rochelle Ibanez; Goff, Allison

    2012-01-01

    This article describes thinking routines as tools to guide and support young children's thinking. These learning strategies, developed by Harvard University's Project Zero Classroom, actively engage students in constructing meaning while also understanding their own thinking process. The authors discuss how thinking routines can be used in both…

  13. Knowledge Discovery in Large Data Sets

    SciTech Connect

    Simas, Tiago; Silva, Gabriel; Miranda, Bruno; Ribeiro, Rita

    2008-12-05

    In this work we briefly address the problem of unsupervised classification on large datasets, magnitude around 100,000,000 objects. The objects are variable objects, which are around 10% of the 1,000,000,000 astronomical objects that will be collected by GAIA/ESA mission. We tested unsupervised classification algorithms on known datasets such as OGLE and Hipparcos catalogs. Moreover, we are building several templates to represent the main classes of variable objects as well as new classes to build a synthetic dataset of this dimension. In the future we will run the GAIA satellite scanning law on these templates to obtain a testable large dataset.
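    A minimal sketch of the unsupervised-classification step, assuming a simple k-means on one made-up 1-D feature per object (e.g. a variability index); the actual algorithms and features used for the OGLE and Hipparcos tests are not given in the abstract.

```python
# Tiny 1-D k-means; data and initial centers are invented for illustration.
def kmeans_1d(data, centers, iters=10):
    """Return final cluster centers and member groups."""
    groups = [[] for _ in centers]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for x in data:
            # assign each object to its nearest center
            nearest = min(range(len(centers)), key=lambda i: abs(x - centers[i]))
            groups[nearest].append(x)
        # move each center to the mean of its members (keep it if empty)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers, groups

data = [1.0, 1.2, 0.8, 10.0, 9.8, 10.2]   # two obvious "classes"
centers, groups = kmeans_1d(data, [0.0, 5.0])
```

    Real pipelines at the 10^8-object scale would use out-of-core or mini-batch variants, but the assignment/update loop is the same.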

  14. Classroom Climate in Regular Primary School Settings with Children with Special Needs

    ERIC Educational Resources Information Center

    Schmidt, Majda; Cagran, Branka

    2006-01-01

    This study investigates the classroom climate in two settings of the 6th-grade class (a setting of children with special needs and a setting without children with special needs), focusing on aspects of satisfaction and cohesiveness on one side and friction, competitiveness and difficulties on the other. The study results indicate the existence of…

  15. Reliability of the 5-min psychomotor vigilance task in a primary school classroom setting.

    PubMed

    Wilson, Andrew; Dollman, James; Lushington, Kurt; Olds, Timothy

    2010-08-01

    This study evaluated the reliability of the 5-min psychomotor vigilance task (PVT) in a single-sex Australian primary school. Seventy-five male students (mean age = 11.82 years, SD = 1.12) completed two 5-min PVTs using a Palm personal digital assistant (PDA) in (1) an isolated setting and (2) a classroom setting. Of this group of students, a subsample of 37 students completed a test-retest reliability trial within the classroom setting. Using a mixed-model analysis, there was no significant difference in the mean response time (RT) or number of lapses (RTs ≥ 500 msec) between the isolated and the classroom setting. There was, however, an order effect for the number of lapses in the isolated setting, with the number of lapses being greater if the isolated test was conducted second. Test-retest intraclass correlation coefficients (ICCs) in the classroom setting indicated moderate to high reliability (mean RT = .84, lapses = .59). Bland-Altman analysis showed no systematic difference between the two settings. Findings suggest that the 5-min PDA PVT is a reliable measure of sustained attention in the classroom setting in this sample of primary-aged schoolchildren. The results provide further evidence for the versatility of this measuring device for larger interventions outside the laboratory. PMID:20805597
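    For readers unfamiliar with the reliability statistic reported above, a one-way random-effects ICC(1,1) can be computed as below. This is a generic sketch with invented scores, not the study's mixed-model analysis.

```python
# One-way random-effects ICC(1,1) from an ANOVA decomposition; the
# test-retest scores here are made up for illustration.
def icc_oneway(pairs):
    """pairs: list of (test, retest) scores, one pair per subject."""
    n, k = len(pairs), 2
    grand = sum(sum(p) for p in pairs) / (n * k)
    subj_means = [sum(p) / k for p in pairs]
    # between-subject and within-subject mean squares
    msb = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    msw = sum((x - m) ** 2
              for p, m in zip(pairs, subj_means) for x in p) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Invented PVT mean RTs (msec) on two occasions:
scores = [(310, 315), (290, 288), (355, 349), (400, 410), (330, 327)]
r = icc_oneway(scores)  # near 1.0 when retests track first tests closely
```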

  16. Large-N in Volcano Settings: Volcanosri

    NASA Astrophysics Data System (ADS)

    Lees, J. M.; Song, W.; Xing, G.; Vick, S.; Phillips, D.

    2014-12-01

    We seek a paradigm shift in the approach we take to volcano monitoring, in which high fidelity is traded for large numbers of sensors to increase coverage and resolution. Accessibility, danger and the risk of equipment loss require that we develop systems that are independent and inexpensive. Furthermore, rather than simply record data on hard disk for later analysis, we desire a system that will work autonomously, capitalizing on wireless technology and in-field network analysis. To this end we are currently producing a low-cost seismic array which will incorporate, at the very basic level, seismological tools for first-cut analysis of a volcano in crisis mode. At the advanced end we expect to perform tomographic inversions in the network in near real time. Geophone (4 Hz) sensors connected to a low-cost recording system will be installed on an active volcano, where triggering, earthquake location and velocity analysis will take place independent of human interaction. Stations are designed to be inexpensive and possibly disposable. In one of the first implementations the seismic nodes consist of an Arduino Due processor board with an attached Seismic Shield. The Arduino Due processor board contains an Atmel SAM3X8E ARM Cortex-M3 CPU. This 32-bit 84 MHz processor can filter and perform coarse seismic event detection on a 1600-sample signal in fewer than 200 milliseconds. The Seismic Shield contains a GPS module, 900 MHz high-power mesh network radio, SD card, seismic amplifier, and 24-bit ADC. External sensors can be attached either to this 24-bit ADC or to the internal multichannel 12-bit ADC on the Arduino Due processor board, allowing the node to support multiple sensors. By utilizing a high-speed 32-bit processor, complex signal processing tasks can be performed simultaneously on multiple sensors. Using a 10 W solar panel, a second system under development can run autonomously and collect data on 3 channels at 100 Hz for 6 months with the installed 16 GB SD card. Initial designs and test results will be presented and discussed.
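    Coarse on-node event detection of the kind described can be sketched with a classic STA/LTA (short-term over long-term average) trigger; the abstract does not give the node's actual detection algorithm, so this is an assumed stand-in shown in Python rather than firmware C.

```python
# STA/LTA trigger: the ratio of a short-window mean amplitude to a
# long-window mean amplitude spikes when an event arrives.
def sta_lta(signal, short=5, long_=50):
    """Return STA/LTA ratios; a large ratio flags a candidate event."""
    ratios = []
    for i in range(long_, len(signal)):
        sta = sum(abs(s) for s in signal[i - short:i]) / short
        lta = sum(abs(s) for s in signal[i - long_:i]) / long_
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Quiet background with a burst at the end:
trace = [0.01] * 100 + [1.0] * 10
peak = max(sta_lta(trace))  # well above a typical trigger threshold
```

    On a microcontroller the same loop would run over a ring buffer of ADC samples, with the division replaced by a threshold comparison to avoid floating point.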

  17. Activity Settings and Daily Routines in Preschool Classrooms: Diverse Experiences in Early Learning Settings for Low-Income Children

    PubMed Central

    Fuligni, Allison Sidle; Howes, Carollee; Huang, Yiching; Hong, Sandra Soliday; Lara-Cinisomo, Sandraluz

    2011-01-01

    This paper examines activity settings and daily classroom routines experienced by 3- and 4-year-old low-income children in public center-based preschool programs, private center-based programs, and family child care homes. Two daily routine profiles were identified using a time-sampling coding procedure: a High Free-Choice pattern in which children spent a majority of their day engaged in child-directed free-choice activity settings combined with relatively low amounts of teacher-directed activity, and a Structured-Balanced pattern in which children spent relatively equal proportions of their day engaged in child-directed free-choice activity settings and teacher-directed small- and whole-group activities. Daily routine profiles were associated with program type and curriculum use but not with measures of process quality. Children in Structured-Balanced classrooms had more opportunities to engage in language and literacy and math activities, whereas children in High Free-Choice classrooms had more opportunities for gross motor and fantasy play. Being in a Structured-Balanced classroom was associated with children’s language scores but profiles were not associated with measures of children’s math reasoning or socio-emotional behavior. Consideration of teachers’ structuring of daily routines represents a valuable way to understand nuances in the provision of learning experiences for young children in the context of current views about developmentally appropriate practice and school readiness. PMID:22665945

  18. Interactive Television vs. a Traditional Classroom Setting: A Comparison of Student Math Achievement.

    ERIC Educational Resources Information Center

    Hodge-Hardin, Sherri

    The purpose of this study was to determine if there were differences in math achievement of students taught in an interactive television (ITV) class setting with the instructor present (host site), students receiving instruction via television at an off-campus location (remote site), and students taught in a traditional classroom setting. The…

  19. Content-Based Instruction for English Language Learners: An Exploration across Multiple Classroom Settings

    ERIC Educational Resources Information Center

    Park, Seo Jung

    2009-01-01

    This study explored the content-based literacy instruction of English language learners (ELLs) across multiple classroom settings in U.S. elementary schools. The following research questions guided the study: (a) How are ELLs taught English in two types of instructional settings: regular content-area literacy instruction in the all-English…

  20. Corrective Feedback and Learner Uptake in Communicative Classrooms across Instructional Settings

    ERIC Educational Resources Information Center

    Sheen, YoungHee

    2004-01-01

    This paper reports similarities and differences in teachers' corrective feedback and learners' uptake across instructional settings. Four communicative classroom settings--French Immersion, Canada ESL, New Zealand ESL and Korean EFL--were examined using Lyster and Ranta's taxonomy of teachers' corrective feedback moves and learner uptake. The…

  1. The Transition of Women from the Classroom Setting to the Educational Administration Setting

    ERIC Educational Resources Information Center

    Zachreson, Sarah A.

    2011-01-01

    This qualitative case study examined the research exploring how female teachers had perceived their potential challenges in becoming a principal, and how those perceptions actually changed as they made the move from the classroom to the principal's office. The purpose of the study is to investigate how female administrative candidates assessed and…

  2. Large Data at Small Universities: Astronomical processing using a computer classroom

    NASA Astrophysics Data System (ADS)

    Fuller, Nathaniel James; Clarkson, William I.; Fluharty, Bill; Belanger, Zach; Dage, Kristen

    2016-06-01

    The use of large computing clusters for astronomy research is becoming more commonplace as datasets expand, but access to these required resources is sometimes difficult for research groups working at smaller universities. As an alternative to purchasing processing time on an off-site computing cluster, or purchasing dedicated hardware, we show how one can easily build a crude on-site cluster by utilizing idle cycles on instructional computers in computer-lab classrooms. Since these computers are maintained as part of the educational mission of the university, the resource impact on the investigator is generally low. By using open source Python routines, it is possible to have a large number of desktop computers working together via a local network to sort through large data sets. By running traditional analysis routines in an “embarrassingly parallel” manner, gains in speed are accomplished without requiring the investigator to learn how to write routines using highly specialized methodology. We demonstrate this concept here applied to (1) photometry of large-format images and (2) statistical significance tests for X-ray lightcurve analysis. In these scenarios, we see a speed-up factor which scales almost linearly with the number of cores in the cluster. Additionally, we show that the usage of the cluster does not severely limit performance for a local user, and indeed the processing can be performed while the computers are in use for classroom purposes.
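    The "embarrassingly parallel" pattern described above can be sketched with Python's standard library alone. For portability this sketch uses the thread-backed `multiprocessing.dummy.Pool`, which shares the process `Pool` API; the cluster networking layer the authors built is not shown, and `reduce_image` is a hypothetical stand-in for a per-image photometry routine.

```python
# Thread-backed Pool (same API as multiprocessing.Pool) keeps this
# sketch portable; swap in multiprocessing.Pool for real processes.
from multiprocessing.dummy import Pool

def reduce_image(frame_id):
    """Hypothetical stand-in for an independent per-image reduction."""
    return frame_id, frame_id ** 2  # pretend this is a measured quantity

# Tasks are independent, so they can be farmed out with no coordination
# beyond collecting the results:
with Pool(4) as pool:
    results = dict(pool.map(reduce_image, range(8)))
```

    Because each task touches only its own inputs, the speed-up scales with worker count until I/O saturates, which matches the near-linear scaling reported in the abstract.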

  3. The Impact of Physical Settings on Pre-Schoolers Classroom Organization

    ERIC Educational Resources Information Center

    Tadjic, Mirko; Martinec, Miroslav; Farago, Amalija

    2015-01-01

    The physical setting plays an important role in the lives of pre-schoolers and can be an important component of children's experience and development when it is wisely and meaningfully designed. The classroom organization enhances and supports the pre-schooler's capability to perform activities independently, initiate and finish tasks, creates the…

  4. Improving Preschool Classroom Processes: Preliminary Findings from a Randomized Trial Implemented in Head Start Settings

    ERIC Educational Resources Information Center

    Raver, C. Cybele; Jones, Stephanie M.; Li-Grining, Christine P.; Metzger, Molly; Champion, Kina M.; Sardin, Latriese

    2008-01-01

    A primary aim of the Chicago School Readiness Project was to improve teachers' emotionally supportive classroom practices in Head Start-funded preschool settings. Using a clustered randomized controlled trial (RCT) design, the Chicago School Readiness Project randomly assigned a treatment versus control condition to 18 Head Start sites, which…

  5. Civility in the University Classroom: An Opportunity for Faculty to Set Expectations

    ERIC Educational Resources Information Center

    Ward, Chris; Yates, Dan

    2014-01-01

    This research examines the types of uncivil behaviors frequently encountered in university classrooms. These behaviors range from walking in late to class, texting in class, and/or unprofessional emails. These behaviors can often undermine a professor's teaching. Setting reasonable and consistent expectations is a combination of university policy,…

  6. Generalizability and Decision Studies to Inform Observational and Experimental Research in Classroom Settings

    ERIC Educational Resources Information Center

    Bottema-Beutel, Kristen; Lloyd, Blair; Carter, Erik W.; Asmus, Jennifer M.

    2014-01-01

    Attaining reliable estimates of observational measures can be challenging in school and classroom settings, as behavior can be influenced by multiple contextual factors. Generalizability (G) studies can enable researchers to estimate the reliability of observational data, and decision (D) studies can inform how many observation sessions are…

  7. Reliability Issues and Solutions for Coding Social Communication Performance in Classroom Settings

    ERIC Educational Resources Information Center

    Olswang, Lesley B.; Svensson, Liselotte; Coggins, Truman E.; Beilinson, Jill S.; Donaldson, Amy L.

    2006-01-01

    Purpose: To explore the utility of time-interval analysis for documenting the reliability of coding social communication performance of children in classroom settings. Of particular interest was finding a method for determining whether independent observers could reliably judge both occurrence and duration of ongoing behavioral dimensions for…

  8. Descriptive Analysis of Classroom Setting Events on the Social Behaviors of Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Boyd, Brian A.; Conroy, Maureen A.; Asmus, Jennifer M.; McKenney, Elizabeth L. W.; Mancil, G. Richmond

    2008-01-01

    Children with Autism Spectrum Disorder (ASD) are characterized by extreme deficits in social relatedness with same-age peers. The purpose of this descriptive study was to identify naturally occurring antecedent variables (i.e., setting events) in the classroom environments of children with ASD that promoted their engagement in peer-related social…

  9. Analysis of Two Early Childhood Education Settings: Classroom Variables and Peer Verbal Interaction

    ERIC Educational Resources Information Center

    Hojnoski, Robin L.; Margulies, Allison S.; Barry, Amberly; Bose-Deakins, Jillaynne; Sumara, Kimberly M.; Harman, Jennifer L.

    2008-01-01

    Descriptive and ecobehavioral analyses were used to explore the daily activity contexts in classroom settings reflecting two distinct models of early childhood education. Activity context, social configurations, teacher behavior, and child behavior were explored, with specific consideration given to peer verbal behavior as an indicator of social

  10. Mobile-IT Education (MIT.EDU): M-Learning Applications for Classroom Settings

    ERIC Educational Resources Information Center

    Sung, M.; Gips, J.; Eagle, N.; Madan, A.; Caneel, R.; DeVaul, R.; Bonsen, J.; Pentland, A.

    2005-01-01

    In this paper, we describe the Mobile-IT Education (MIT.EDU) system, which demonstrates the potential of using a distributed mobile device architecture for rapid prototyping of wireless mobile multi-user applications for use in classroom settings. MIT.EDU is a stable, accessible system that combines inexpensive, commodity hardware, a flexible…

  11. Turkish Special Education Teachers' Implementation of Functional Analysis in Classroom Settings

    ERIC Educational Resources Information Center

    Erbas, Dilek; Yucesoy, Serife; Turan, Yasemin; Ostrosky, Michaelene M.

    2006-01-01

    Three Turkish special education teachers conducted a functional analysis to identify variables that might initiate or maintain the problem behaviors of three children with developmental disabilities. The analysis procedures were conducted in natural classroom settings. In Phase 1, following initial training in functional analysis procedures, the

  12. Developing a Positive Mind-Set toward the Use of Technology for Classroom Instruction

    ERIC Educational Resources Information Center

    Okojie, Mabel C. P. O.; Olinzock, Anthony

    2006-01-01

    The aim of this paper is to examine various indicators associated with the development of a positive mind-set toward the use of technology for instruction. The paper also examines the resources available to help teachers keep pace with technological innovation. Electronic classrooms have some complexities associated with them; therefore, support…

  14. A Collaborative Model for Developing Classroom Management Skills in Urban Professional Development School Settings

    ERIC Educational Resources Information Center

    Dobler, Elizabeth; Kesner, Cathy; Kramer, Rebecca; Resnik, Marilyn; Devin, Libby

    2009-01-01

    This article describes a school-university partnership that focuses on the development of classroom management skills for preservice teachers in an urban setting, through collaboration between mentors, principals, and a university supervisor. To prepare preservice teachers for the unique challenges of urban schools, three key elements were

  15. INTERIOR VIEW, SETTING LARGE CORE WITH ASSISTANCE FROM THE OVERHEAD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW, SETTING LARGE CORE WITH ASSISTANCE FROM THE OVERHEAD RAIL CRANE IN BOX FLOOR MOLD AREA (WORKERS: DAN T. WELLS AND TRUMAN CARLISLE). - Stockham Pipe & Fittings Company, Ductile Iron Foundry, 4000 Tenth Avenue North, Birmingham, Jefferson County, AL

  16. Use of Big-Screen Films in Multiple Childbirth Education Classroom Settings

    PubMed Central

    Kaufman, Tamara

    2010-01-01

    Although two recent films, Orgasmic Birth and Pregnant in America, were intended for the big screen, they can also serve as valuable teaching resources in multiple childbirth education settings. Each film conveys powerful messages about birth and today's birthing culture. Depending on a childbirth educator's classroom setting (hospital, birthing center, or home birth environment), particular portions in each film, along with extra clips featured on the films' DVDs, can enhance an educator's curriculum and spark compelling discussions with class participants. PMID:21358831

  17. Strategies for Engaging FCS Learners in a Large-Format Classroom: Embedded Videos

    ERIC Educational Resources Information Center

    Leslie, Catherine Amoroso

    2014-01-01

    This article presents a method for utilizing technology to increase student engagement in large classroom formats. In their lives outside the classroom, students spend considerable time interfacing with media, and they are receptive to information conveyed in electronic formats. Research has shown that multimedia is an effective learning resource;…

  18. Teaching Methodology in a "Large Power Distance" Classroom: A South Korean Context

    ERIC Educational Resources Information Center

    Jambor, Paul Z.

    2005-01-01

    This paper looks at South Korea as an example of a collectivist society having a rather large power distance dimension value. In a traditional Korean classroom the teacher is at the top of the classroom hierarchy, while the students are the passive participants. Gender and age play a role in the hierarchy between students themselves. Teaching…

  19. [The BASYS observation system for the analysis of aggressive behavior in classroom-settings].

    PubMed

    Wettstein, Alexander

    2012-01-01

    Educational or therapeutic measures of aggressive student behavior are often based on the judgments of teachers. However, empirical studies show that the objectivity of these judgments is generally low. In order to assess aggressive behavior in classroom settings, we developed a context-sensitive observational system. The observation system exists in a version for teachers in action as well as a version for the uninvolved observer. The teacher version allows categorizing aggressive behavior while teaching. The aim is to differentiate the perception and the judgments of teachers, so that the judgments can serve as trustworthy diagnostic information. The version for an independent observer additionally contains categories for collecting information about the context in which aggression takes place. The behavior observation system was tested in four field studies in regular and special classes. The empirical results show that, after training, teachers were able to make objective observations, and that aggressive behavior depends to a large extent on situational factors. The system allows identification of problematic person-environment relationships and the derivation of intervention measures. PMID:22748725

  20. Adaptive, multiresolution visualization of large data sets using parallel octrees.

    SciTech Connect

    Freitag, L. A.; Loy, R. M.

    1999-06-10

    The interactive visualization and exploration of large scientific data sets is a challenging and difficult task; their size often far exceeds the performance and memory capacity of even the most powerful graphics workstations. To address this problem, we have created a technique that combines hierarchical data reduction methods with parallel computing to allow interactive exploration of large data sets while retaining full-resolution capability. The hierarchical representation is built in parallel by strategically inserting field data into an octree data structure. We provide functionality that allows the user to interactively adapt the resolution of the reduced data sets so that resolution is increased in regions of interest without sacrificing local graphics performance. We describe the creation of the reduced data sets using a parallel octree, the software architecture of the system, and the performance of this system on data from a Rayleigh-Taylor instability simulation.
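    The adaptive-resolution idea can be sketched as a depth-limited octree traversal: descent stops at a chosen depth, yielding coarse cells everywhere the user has not asked for refinement. The `(children, value)` node layout below is an illustrative assumption, not the authors' software.

```python
# Each node is a (children, value) pair; children is an empty tuple for
# a leaf. Truncating the traversal depth yields the reduced data set.
def reduced(node, depth_limit, depth=0):
    """Yield (value, depth) for each cell visible at this resolution."""
    children, value = node
    if not children or depth >= depth_limit:
        yield value, depth
    else:
        for child in children:
            yield from reduced(child, depth_limit, depth + 1)

# A root whose first child is refined one level further:
leaf = ((), 3.0)
tree = ([([leaf, leaf], 2.0), ((), 1.0)], 0.0)
coarse = list(reduced(tree, depth_limit=1))  # stop refining below depth 1
fine = list(reduced(tree, depth_limit=2))    # full resolution
```

    Raising `depth_limit` only in a region of interest gives the locally increased resolution described above without paying the full-resolution rendering cost everywhere.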

  1. Automated pharmacophore identification for large chemical data sets.

    PubMed

    Chen, X; Rusinko, A; Tropsha, A; Young, S S

    1999-01-01

    The identification of three-dimensional pharmacophores from large, heterogeneous data sets is still an unsolved problem. We developed a novel program, SCAMPI (statistical classification of activities of molecules for pharmacophore identification), for this purpose by combining a fast conformation search with recursive partitioning, a data-mining technique, which can easily handle large data sets. The pharmacophore identification process is designed to run recursively, and the conformation spaces are resampled under the constraints of the evolving pharmacophore model. This program is capable of deriving pharmacophores from a data set of 1000-2000 compounds, with thousands of conformations generated for each compound and in less than 1 day of computational time. For two test data sets, the identified pharmacophores are consistent with the known results from the literature. PMID:10529987
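    Recursive partitioning, the data-mining component SCAMPI combines with conformational search, repeatedly splits a compound set on the descriptor threshold that best separates actives from inactives. The single-split sketch below uses invented descriptor values; recursing on each side of the split yields the full partition tree.

```python
# One step of recursive partitioning: find the descriptor threshold with
# the fewest misclassifications when each side predicts its majority class.
def best_split(rows):
    """rows: (descriptor_value, active) pairs with active in {0, 1}."""
    best = None
    values = sorted({v for v, _ in rows})
    for t in values[1:]:
        left = [a for v, a in rows if v < t]
        right = [a for v, a in rows if v >= t]
        err = (min(sum(left), len(left) - sum(left))
               + min(sum(right), len(right) - sum(right)))
        if best is None or err < best[0]:
            best = (err, t)
    return best

# Invented data: low descriptor values inactive, high values active.
rows = [(0.1, 0), (0.2, 0), (0.3, 0), (0.8, 1), (0.9, 1)]
err, threshold = best_split(rows)
```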

  2. Looking at large data sets using binned data plots

    SciTech Connect

    Carr, D.B.

    1990-04-01

    This report addresses the monumental challenge of developing exploratory analysis methods for large data sets. The goals of the report are to increase awareness of large data set problems and to contribute simple graphical methods that address some of the problems. The graphical methods focus on two- and three-dimensional data and common tasks such as finding outliers and tail structure, assessing central structure, and comparing central structures. The methods handle large sample size problems through binning, incorporate information from statistical models, and adapt image processing algorithms. Examples demonstrate the application of the methods to a variety of publicly available large data sets. The most novel application addresses the "too many plots to examine" problem by using cognostics, computer guiding diagnostics, to prioritize plots. The particular application prioritizes views of computational fluid dynamics solution sets on the fly. That is, as each time step of a solution set is generated on a parallel processor, the cognostics algorithms assess virtual plots based on the previous time step. Work in such areas is in its infancy and the examples suggest numerous challenges that remain. 35 refs., 15 figs.
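    The core binning idea above can be sketched in a few lines: reduce a large 2-D scatter to counts on a regular grid, so density structure survives where a raw scatter plot would saturate into solid ink. (Carr's report also covers hexagonal bins and cognostics; this shows square binning only, with made-up points.)

```python
# Square 2-D binning: map each point to a grid cell and count.
def bin2d(xs, ys, nx, ny, xlim, ylim):
    """Return an nx-by-ny grid of point counts over the given extents."""
    (x0, x1), (y0, y1) = xlim, ylim
    grid = [[0] * ny for _ in range(nx)]
    for x, y in zip(xs, ys):
        # clamp so points on the upper edge land in the last bin
        i = min(int((x - x0) / (x1 - x0) * nx), nx - 1)
        j = min(int((y - y0) / (y1 - y0) * ny), ny - 1)
        grid[i][j] += 1
    return grid

xs = [0.1, 0.2, 0.9, 0.95]
ys = [0.1, 0.15, 0.9, 0.85]
counts = bin2d(xs, ys, 2, 2, (0.0, 1.0), (0.0, 1.0))
```

    Plotting `counts` as shaded cells (or symbols sized by count) is what makes million-point data sets legible on a single screen.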

  3. Reducing Information Overload in Large Seismic Data Sets

    SciTech Connect

    HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.; CARR,DORTHE B.; AGUILAR-CHANG,JULIO

    2000-08-02

    Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g., single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important for bridging gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site.
As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can provide a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics are identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. The content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools also give researchers a quick way to find interesting and useful events within the research datasets, and could serve as a means to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize, and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. All of these software tools would give the researcher unprecedented power without having to learn the intricacies and complexities of relational database systems.
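The dendrogram tool itself is a MatSeis (MATLAB) GUI, but the underlying idea of grouping waveforms by similarity can be sketched briefly. The following is a hypothetical Python analogue, not the tool's code: toy traces are compared by correlation distance and grouped by single-linkage hierarchical clustering (other linkage methods can be swapped in, as in the tool):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# Toy "waveforms": two groups of similar traces, distinguished by frequency
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
waves = np.array(
    [np.sin(2 * np.pi * 5 * t) + 0.05 * rng.normal(size=t.size) for _ in range(4)]
    + [np.sin(2 * np.pi * 9 * t) + 0.05 * rng.normal(size=t.size) for _ in range(4)]
)

# Correlation distance between traces, then single-linkage clustering;
# the linkage matrix is what a dendrogram diagram would be drawn from
dist = pdist(waves, metric="correlation")
tree = linkage(dist, method="single")
labels = fcluster(tree, t=2, criterion="maxclust")
```

Cutting the tree at two clusters recovers the two waveform families; on real data the analyst would inspect the dendrogram and choose the cut interactively.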

  4. Spatial compounding of large sets of 3D echocardiography images

    NASA Astrophysics Data System (ADS)

    Yao, Cheng; Simpson, John M.; Jansen, Christian H. P.; King, Andrew P.; Penney, Graeme P.

    2009-02-01

    We present novel methodologies for compounding large numbers of 3D echocardiography volumes. Our aim is to investigate the effect of using an increased number of images and to compare the performance of different compounding methods on image quality. Three sets of 3D echocardiography images were acquired from three volunteers. Each set of data (containing 10+ images) was registered using external tracking followed by state-of-the-art image registration. Four compounding methods were investigated: mean, maximum, and two methods derived from phase-based compounding. The compounded images were compared by calculating signal-to-noise ratios and contrast at manually identified anatomical positions within the images, and by visual inspection by experienced echocardiographers. Our results indicate that signal-to-noise ratio and contrast can be improved using an increased number of images, and that a coherent compounded image can be produced using large (10+) numbers of 3D volumes.
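Mean and maximum compounding of co-registered volumes are the two simplest of the four methods compared. A minimal sketch on synthetic volumes (the phase-based variants are omitted; the data here are invented to show the noise-averaging effect):

```python
import numpy as np

def compound(volumes, method="mean"):
    """Combine co-registered volumes voxel-wise: 'mean' suppresses
    uncorrelated speckle noise, 'max' preserves the brightest echoes."""
    stack = np.stack(volumes)
    return stack.mean(axis=0) if method == "mean" else stack.max(axis=0)

# Synthetic "anatomy" plus independent noise in each acquired volume
rng = np.random.default_rng(2)
truth = np.zeros((8, 8, 8))
truth[2:6, 2:6, 2:6] = 1.0
vols = [truth + 0.3 * rng.normal(size=truth.shape) for _ in range(10)]

avg = compound(vols)          # averaging 10 volumes cuts noise ~sqrt(10)x
```

The sqrt(N) noise reduction of the mean is why signal-to-noise ratio improves with the number of registered volumes, as the study reports.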

  5. Treatment of psychotic children in a classroom environment: I. Learning in a large group.

    PubMed

    Koegel, R L; Rincover, A

    1974-01-01

    The purpose of this study was to investigate systematically the feasibility of modifying the behavior of autistic children in a classroom environment. In the first experiment, eight autistic children were taught certain basic classroom behaviors (including attending to the teacher upon command, imitation, and an elementary speaking and recognition vocabulary) that were assumed to be necessary for subsequent learning to take place in the classroom. Based on research documenting the effectiveness of one-to-one (teacher-child ratio) procedures for modifying such behaviors, these behaviors were taught in one-to-one sessions. It was, however, found that behaviors taught in a one-to-one setting were not performed consistently in a classroom-sized group, or even in a group as small as two children with one teacher. Further, the children evidenced no acquisition of new behaviors in a classroom environment over a four-week period. Therefore, Experiment II introduced a treatment procedure based upon "fading in" the classroom stimulus situation from the one-to-one stimulus situation. Such treatment was highly effective in producing both a transfer in stimulus control and the acquisition of new behaviors in a kindergarten/first-grade classroom environment. PMID:4465373

  6. Computing Information-Theoretic Quantities in Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Knuth, K. H.; Castle, J. P.; Curry, C. T.; Gotera, A.; Huyser, K. A.; Wheeler, K. R.; Rossow, W. B.

    2005-12-01

    Information-theoretic quantities, such as mutual information, allow one to quantify the amount of information shared by two variables. In large data sets, the mutual information can be used to identify sets of co-informative variables, and thus to identify variables that can act as predictors of a phenomenon of interest. While mutual information alone does not distinguish a causal interaction between two variables, another information-theoretic quantity called the transfer entropy can indicate such possible causal interactions. Together, these quantities can be used to identify causal interactions among sets of variables in large distributed data sets. We are currently developing a suite of computational tools that will allow researchers to calculate, from data, these useful information-theoretic quantities. Our software tools estimate these quantities along with their associated error bars, the latter of which are critical for describing the degree of uncertainty in the estimates. In this presentation we demonstrate how mutual information and transfer entropy can be applied so as to allow researchers not only to identify relations among climate variables, but also to characterize and quantify their possible causal interactions.
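As a sketch of how such an estimate works, mutual information can be computed from a joint histogram of two variables; transfer entropy follows the same pattern with time-lagged conditioning, which is omitted here. The function below is an illustrative plug-in estimator, not the authors' software:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in estimate of I(X;Y) in bits from a joint histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                       # joint distribution
    px = pxy.sum(axis=1, keepdims=True)         # marginal of X
    py = pxy.sum(axis=0, keepdims=True)         # marginal of Y
    nz = pxy > 0                                # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(3)
x = rng.normal(size=50_000)
mi_dep = mutual_information(x, x + 0.1 * rng.normal(size=x.size))
mi_indep = mutual_information(x, rng.normal(size=x.size))
```

A strongly coupled pair shares several bits of information, while an independent pair's estimate stays near zero (plug-in estimators carry a small positive bias, which is one reason the error bars mentioned in the abstract matter).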

  7. Quantitative prediction of reduction in large pipe setting round process

    NASA Astrophysics Data System (ADS)

    Zhao, Jun; Zhan, Peipei; Ma, Rui; Zhai, Ruixue

    2013-07-01

    The manner of control during the setting round process, which ensures the quality of pipe products, relies mainly on the operator's experience, so it is necessary to study the process and obtain its spring-back law. The setting round process shapes an oval-section pipe into a circular section; a quantitative analysis of its spring-back is difficult because the curvature of the pipe-section neutral layer is not uniform. However, the spring-back law of the circle-oval process can be predicted readily. An experimental method is first used to establish an equivalence between the setting round process and the circle-oval process, so that the former can be converted into the latter. The theoretical analysis of the circle-oval process presents two difficulties: the elastic-plastic bending of a curved beam, and a statically indeterminate problem. A quantitative analytic method for the circle-oval process is presented, combining the spring-back law of a plane curved beam with the element-dividing idea of the finite element method. The ovality after unloading is plotted against the relative reduction for both analytical and experimental results, which show fair agreement. Finally, a method for quantitatively predicting the reduction in large pipe setting round is given, based on the established equivalence and the analytical results. Five pipes requiring setting round were used in a verification experiment. The results indicate that, within the experimental range, the residual ovality is below 0.35% after a single setting-round operation using the theoretically predicted reductions, well within the 1% requirement of the pipe standard. The established theoretical analysis can thus correct pipe ovality with sufficient accuracy, providing theoretical guidance for plant use.

  8. Interactive Web-Based Map: Applications to Large Data Sets in the Geosciences.

    NASA Astrophysics Data System (ADS)

    Garbow, Z. A.; Olson, N. R.; Yuen, D. A.; Boggs, J. M.

    2001-12-01

    Current advances in computer hardware, information technology, and data collection techniques have produced very large data sets, sometimes amounting to terabytes, in a wide variety of scientific and engineering disciplines. We must harness this opportunity to visualize and extract useful information from geophysical and geological data. We have taken on the task of data mining by using a map-like approach over the web for interrogating these huge data sets, using a client-server paradigm. The spatial data are mapped onto a two-dimensional grid from which the user (client) can quiz the data through the map interface. The data are stored on a high-end compute server. The computational gateway separating the client and the server can be the front end of an electronic publication, an electronic classroom, a Grid system device, or e-business. We have used a combination of Java, Java 3D, and Perl for processing the data and communicating between the client and the server. The user can interrogate the geospatial data over any particular region with arbitrary length scales and pose relevant statistical questions, such as histogram plots and local statistics. We have applied this method to the following data sets: (1) the distribution of prime numbers, (2) two-dimensional mantle convection, (3) three-dimensional mantle convection, (4) high-resolution satellite reflectance data over the Upper Midwest at multiple wavelengths, and (5) molecular dynamics describing the flow of blood in narrow vessels. Using this map-interface concept, the user can interrogate these data directly over the web. This strategy for dissecting large data sets can easily be applied to other areas, such as satellite geodesy and earthquake data. This mode of data query may function in an adequately covered wireless web environment with a transfer rate of around 10 Mbit/s.

  9. Robust Coordination for Large Sets of Simple Rovers

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Agogino, Adrian

    2006-01-01

    The ability to coordinate sets of rovers in an unknown environment is critical to the long-term success of many of NASA's exploration missions. Such coordination policies must have the ability to adapt in unmodeled or partially modeled domains and must be robust against environmental noise and rover failures. In addition, such coordination policies must accommodate a large number of rovers without excessive and burdensome hand-tuning. In this paper we present a distributed coordination method that addresses these issues in the domain of controlling a set of simple rovers. The application of these methods allows reliable and efficient robotic exploration in dangerous, dynamic, and previously unexplored domains. Most control policies for space missions are directly programmed by engineers or created through the use of planning tools, and are appropriate for single-rover missions or missions requiring the coordination of a small number of rovers. Such methods typically require significant amounts of domain knowledge and are difficult to scale to large numbers of rovers. The method described in this article aims to address cases where a large number of rovers need to coordinate to solve a complex time-dependent problem in a noisy environment. In this approach, each rover decomposes a global utility, representing the overall goal of the system, into rover-specific utilities that properly assign credit to the rover's actions. Each rover then has the responsibility to create a control policy that maximizes its own rover-specific utility. We show a method of creating rover-utilities that are "aligned" with the global utility, such that when the rovers maximize their own utility, they also maximize the global utility. In addition, we show that our method creates rover-utilities that allow the rovers to create their control policies quickly and reliably.
Our distributed learning method allows large sets of rovers to be used in unmodeled domains, while providing robustness against rover failures and changing environments. In experimental simulations we show that our method scales well with large numbers of rovers, in addition to being robust against noisy sensor inputs and noisy servo control. The results show that our method achieves up to 400% performance improvement over standard machine learning methods.

  10. Optimizing distance-based methods for large data sets

    NASA Astrophysics Data System (ADS)

    Scholl, Tobias; Brenner, Thomas

    2015-10-01

    Distance-based methods for measuring the spatial concentration of industries have gained increasing popularity in the spatial econometrics community. However, a limiting factor for using these methods is their computational complexity, since both their memory requirements and running times are in O(n^2). In this paper, we present an algorithm with constant memory requirements and shorter running time, enabling distance-based methods to deal with large data sets. We discuss three recent distance-based methods in spatial econometrics: the D&O-Index by Duranton and Overman (Rev Econ Stud 72(4):1077-1106, 2005), the M-function by Marcon and Puech (J Econ Geogr 10(5):745-762, 2010) and the Cluster-Index by Scholl and Brenner (Reg Stud (ahead-of-print):1-15, 2014). Finally, we present an alternative calculation for the latter index that allows the use of data sets with millions of firms.
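The general trick behind bounded-memory distance-based methods can be sketched as follows: rather than materializing the full n×n distance matrix, pairwise distances are streamed chunk by chunk into a fixed set of distance bins, so persistent memory is O(bins) regardless of n. This is an illustrative sketch of the idea, not the authors' algorithm:

```python
import numpy as np

def distance_histogram(points, edges, chunk=256):
    """Histogram of all pairwise distances without the n*n matrix.

    Only one chunk-by-n distance block exists at a time; the persistent
    state is just the fixed-size array of bin counts."""
    pts = np.asarray(points, dtype=float)
    counts = np.zeros(len(edges) - 1, dtype=np.int64)
    n = len(pts)
    for i in range(0, n, chunk):
        block = pts[i:i + chunk]
        # distances from this chunk of points to all points
        d = np.linalg.norm(block[:, None, :] - pts[None, :, :], axis=-1)
        for k in range(len(block)):
            # count each unordered pair once: partners with a larger index
            counts += np.histogram(d[k, i + k + 1:], bins=edges)[0]
    return counts

rng = np.random.default_rng(5)
pts = rng.uniform(size=(100, 2))       # e.g. firm locations
edges = np.linspace(0.0, 2.0, 11)      # fixed distance bins
counts = distance_histogram(pts, edges)
```

Indices such as the D&O-Index are then computed from the binned distance distribution rather than from individual pairs.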

  11. A large-scale crop protection bioassay data set

    PubMed Central

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J. P.; Bellis, Louisa J.; Bento, A. Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P.

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database. PMID:26175909

  12. Support vector machine classifiers for large data sets.

    SciTech Connect

    Gertz, E. M.; Griffin, J. D.

    2006-01-31

    This report concerns the generation of support vector machine classifiers for solving the pattern recognition problem in machine learning. Several methods are proposed based on interior point methods for convex quadratic programming. Software implementations are developed by adapting the object-oriented package OOQP to the problem structure and by using the software package PETSc to perform time-intensive computations in a distributed setting. Linear systems arising from classification problems with moderately large numbers of features are solved by using two techniques--one a parallel direct solver, the other a Krylov-subspace method incorporating novel preconditioning strategies. Numerical results are provided, and computational experience is discussed.
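For orientation, the optimization problem being solved is maximum-margin classification. The sketch below trains a linear SVM by primal sub-gradient descent on the hinge loss, a far simpler stand-in than the interior-point quadratic-programming solvers the report develops, but it produces the same kind of classifier:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Minimize lam/2*||w||^2 + mean(max(0, 1 - y*(Xw + b)))
    by sub-gradient descent (labels y are +1/-1)."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        viol = y * (X @ w + b) < 1                     # margin violators
        grad_w = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / n
        grad_b = -y[viol].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Two linearly separable point clouds
X = np.array([[2.0, 2.0], [3.0, 2.0], [2.0, 3.0],
              [-2.0, -2.0], [-3.0, -2.0], [-2.0, -3.0]])
y = np.array([1, 1, 1, -1, -1, -1])
w, b = train_linear_svm(X, y)
```

Interior-point methods solve the same objective through its dual quadratic program, which is what makes the structured linear algebra (direct or Krylov-subspace solvers) the computational bottleneck the report addresses.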

  13. Science Teacher Beliefs and Classroom Practice Related to Constructivism in Different School Settings

    ERIC Educational Resources Information Center

    Savasci, Funda; Berlin, Donna F.

    2012-01-01

    Science teacher beliefs and classroom practice related to constructivism and factors that may influence classroom practice were examined in this cross-case study. Data from four science teachers in two schools included interviews, demographic questionnaire, Classroom Learning Environment Survey (preferred/perceived), and classroom observations and…

  14. Techniques for Efficiently Managing Large Geosciences Data Sets

    NASA Astrophysics Data System (ADS)

    Kruger, A.; Krajewski, W. F.; Bradley, A. A.; Smith, J. A.; Baeck, M. L.; Steiner, M.; Lawrence, R. E.; Ramamurthy, M. K.; Weber, J.; Delgreco, S. A.; Domaszczynski, P.; Seo, B.; Gunyon, C. A.

    2007-12-01

    We have developed techniques and software tools for efficiently managing large geosciences data sets. While the techniques were developed as part of an NSF-Funded ITR project that focuses on making NEXRAD weather data and rainfall products available to hydrologists and other scientists, they are relevant to other geosciences disciplines that deal with large data sets. Metadata, relational databases, data compression, and networking are central to our methodology. Data and derived products are stored on file servers in a compressed format. URLs to, and metadata about the data and derived products are managed in a PostgreSQL database. Virtually all access to the data and products is through this database. Geosciences data normally require a number of processing steps to transform the raw data into useful products: data quality assurance, coordinate transformations and georeferencing, applying calibration information, and many more. We have developed the concept of crawlers that manage this scientific workflow. Crawlers are unattended processes that run indefinitely, and at set intervals query the database for their next assignment. A database table functions as a roster for the crawlers. Crawlers perform well-defined tasks that are, except for perhaps sequencing, largely independent from other crawlers. Once a crawler is done with its current assignment, it updates the database roster table, and gets its next assignment by querying the database. We have developed a library that enables one to quickly add crawlers. The library provides hooks to external (i.e., C-language) compiled codes, so that developers can work and contribute independently. Processes called ingesters inject data into the system. The bulk of the data are from a real-time feed using UCAR/Unidata's IDD/LDM software. An exciting recent development is the establishment of a Unidata HYDRO feed that feeds value-added metadata over the IDD/LDM. 
Ingesters grab the metadata and populate the PostgreSQL tables. These and other concepts we have developed have enabled us to efficiently manage a 70 TB (and growing) weather radar data set.
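The crawler/roster pattern described above is straightforward to sketch: a database table holds pending tasks, and each crawler polls it for its next assignment, marking tasks done as it goes. A minimal single-process illustration with SQLite (the table and task names are invented for the example):

```python
import sqlite3

# Roster table: one row per pending processing task
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE roster (id INTEGER PRIMARY KEY, task TEXT, "
           "status TEXT DEFAULT 'pending')")
db.executemany("INSERT INTO roster (task) VALUES (?)",
               [("qc",), ("georeference",), ("calibrate",)])

def crawl_once(conn):
    """One polling cycle: claim the oldest pending task and 'process' it."""
    row = conn.execute("SELECT id, task FROM roster WHERE status='pending' "
                       "ORDER BY id LIMIT 1").fetchone()
    if row is None:
        return None                     # nothing to do this cycle
    # ... real work on the data product would happen here ...
    conn.execute("UPDATE roster SET status='done' WHERE id=?", (row[0],))
    return row[1]

done = []
while (task := crawl_once(db)) is not None:
    done.append(task)
```

A real deployment would run many such loops unattended at set intervals, with the database serializing who claims which task; that is what lets crawlers stay largely independent of one another.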

  15. Observations of Teacher-Child Interactions in Classrooms Serving Latinos and Dual Language Learners: Applicability of the Classroom Assessment Scoring System in Diverse Settings

    ERIC Educational Resources Information Center

    Downer, Jason T.; Lopez, Michael L.; Grimm, Kevin J.; Hamagami, Aki; Pianta, Robert C.; Howes, Carollee

    2012-01-01

    With the rising number of Latino and dual language learner (DLL) children attending pre-k and the importance of assessing the quality of their experiences in those settings, this study examined the extent to which a commonly used assessment of teacher-child interactions, the Classroom Assessment Scoring System (CLASS), demonstrated similar…

  16. Manifold sequencing: efficient processing of large sets of sequencing reactions.

    PubMed Central

    Lagerkvist, A; Stewart, J; Lagerström-Fermér, M; Landegren, U

    1994-01-01

    Automated instruments for DNA sequencing greatly simplify data collection in the Sanger sequencing procedure. By contrast, the so-called front-end problems of preparing sequencing templates, performing sequencing reactions, and loading these on the instruments remain major obstacles to extensive sequencing projects. We describe here the use of a manifold support to prepare and perform sequencing reactions on large sets of templates in parallel, as well as to load the reaction products on a sequencing instrument. In this manner, all reaction steps are performed without pipetting the samples. The strategy is applied to sequencing PCR-amplified clones of the human mitochondrial D-loop and for detection of heterozygous positions in the human major histocompatibility complex class II gene HLA-DQB, amplified from genomic DNA samples. This technique will promote sequencing in a clinical context and could form the basis of more efficient genomic sequencing strategies. PMID:8134382

  17. Towards effective analysis of large grain boundary data sets

    NASA Astrophysics Data System (ADS)

    Glowinski, K.; Morawiec, A.

    2015-04-01

    Grain boundaries affect properties of polycrystals. Novel experimental techniques for three-dimensional orientation mapping give new opportunities for studies of this influence. Large networks of boundaries can be analyzed based on all five ’macroscopic’ boundary parameters. We demonstrate benefits of applying two methods for improving these analyses. The fractions of geometrically special boundaries in ferrite are estimated based on ’approximate’ distances to the nearest special boundaries; by using these parameters, the times needed for processing boundary data sets are shortened. Moreover, grain-boundary distributions for nickel are obtained using kernel density estimation; this approach leads to distribution functions more accurate than those obtained based on partition of the space into bins.
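Kernel density estimation replaces the hard bin boundaries of a histogram with smooth kernels centered on the observations. The grain-boundary distributions in the paper live in a five-parameter space, but the idea is the same as in this one-dimensional sketch (all names and values here are illustrative):

```python
import numpy as np

def kde_1d(samples, grid, bandwidth):
    """Gaussian kernel density estimate on a grid: smoother than a binned
    histogram and free of bin-edge artifacts."""
    s = np.asarray(samples)[None, :]          # (1, n_samples)
    g = np.asarray(grid)[:, None]             # (n_grid, 1)
    k = np.exp(-0.5 * ((g - s) / bandwidth) ** 2)
    return k.sum(axis=1) / (s.size * bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(4)
samples = rng.normal(size=5000)               # stand-in for boundary parameters
grid = np.linspace(-4.0, 4.0, 161)
dens = kde_1d(samples, grid, bandwidth=0.3)
```

With a bin-based estimate, accuracy is limited by the partition of the space; the kernel estimate spreads each observation's weight smoothly, which is the advantage claimed for the grain-boundary distributions.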

  18. Manifold sequencing: Efficient processing of large sets of sequencing reactions

    SciTech Connect

    Lagerkvist, A.; Stewart, J.; Lagerstroem-Fermer, M.; Landegren, U.

    1994-03-15

    Automated instruments for DNA sequencing greatly simplify data collection in the Sanger sequencing procedure. By contrast, the so-called front-end problems of preparing sequencing templates, performing sequencing reactions, and loading these on the instruments remain major obstacles to extensive sequencing projects. The authors describe here the use of a manifold support to prepare and perform sequencing reactions on large sets of templates in parallel, as well as to load the reaction products on a sequencing instrument. In this manner, all reaction steps are performed without pipetting the samples. The strategy is applied to sequencing PCR-amplified clones of the human mitochondrial D-loop and for detection of heterozygous positions in the human major histocompatibility complex class II gene HLA-DQB, amplified from genomic DNA samples. This technique will promote sequencing in a clinical context and could form the basis of more efficient genomic sequencing strategies. 24 refs., 5 figs.

  19. Comparing Outcomes from Field and Classroom Based Settings for Undergraduate Geoscience Courses

    NASA Astrophysics Data System (ADS)

    Skinner, M. R.; Harris, R. A.; Flores, J.

    2011-12-01

    Field-based learning can be found in nearly every course offered in Geology at Brigham Young University. For example, in our Structural Geology course field studies substitute for labs. Students collect their own data from several different structural settings of the Wasatch Range. Our curriculum also includes a two-week, sophomore-level field course that introduces students to interpreting field relations themselves and sets the stage for much of what they learn in their upper-division courses. Our senior-level six-week field geology course includes classical field mapping with exercises in petroleum and mineral exploration, environmental geology, and geological hazards. Experiments with substituting field-based general education courses for those in traditional classroom settings indicate that student cognition, course enjoyment, and recruitment of majors increase significantly in a field-based course. We offer a field-based introductory geology course (Geo 102) that is taught in seven six-hour field trips during which students travel to localities of geologic interest to investigate a variety of fundamental geological problems. We compare the outcomes of Geo 102 with a traditional classroom-based geology course (Geo 101). For the comparison, both courses are taught by the same instructor, use the same text and supplementary materials, and take the same exams. The results of 7 years of reporting indicate that test scores and final grades are one-half grade point higher for Geo 102 students versus those in traditional introductory courses. Student evaluations of the course are also 0.8-1.4 points higher on a scale of 1-8 and are consistently the highest in the Department and College. Other observations include increased attendance, attention, and curiosity.
The latter two are measured by the number of students asking questions of other students as well as of the instructors, and by the total number of questions asked during class time in the field versus the classroom. Normal classroom involvement includes two or three students asking nearly all of the questions, while in Geo 102 closer to half the class participates, and not the same students each time. Not only do more individuals participate in asking questions in Geo 102, but each participant asks more questions as well. Questions asked in class are generally specific to the discussion, while field questions are commonly multidisciplinary in nature. Field-based courses also encourage more students to collaborate with each other and to integrate shared observations, owing to the many different aspects of the geosciences present at each site. One of the most important payoffs is the 50% increase in the number of students changing their major to geology in the field-based versus classroom-based courses. Field-based learning increases the depth of student understanding of the subjects they investigate as well as student involvement and enthusiasm in the class. The tradeoff we make for realizing significant individual and group discovery in the field is that more responsibility is placed on the student to understand the broad-based geologic concepts found in the text. The field-based approach allows students to immediately apply their learning in real-world applications.

  20. Visualizing large data sets in the earth sciences

    NASA Technical Reports Server (NTRS)

    Hibbard, William; Santek, David

    1989-01-01

    The authors describe the capabilities of McIDAS, an interactive visualization system that is vastly increasing the ability of earth scientists to manage and analyze data from remote sensing instruments and numerical simulation models. McIDAS provides animated three-dimensional images and highly interactive displays. The software can manage, analyze, and visualize large data sets that span many physical variables (such as temperature, pressure, humidity, and wind speed), as well as time and three spatial dimensions. The McIDAS system manages data from at least 100 different sources. The data management tools consist of data structures for storing different data types in files, libraries of routines for accessing these data structures, system commands for performing housekeeping functions on the data files, and reformatting programs for converting external data to the system's data structures. The McIDAS tools for three-dimensional visualization of meteorological data run on an IBM mainframe and can load up to 128-frame animation sequences into the workstations. A highly interactive version of the system can provide an interactive window into data sets containing tens of millions of points produced by numerical models and remote sensing instruments. The visualizations are being used for teaching as well as by scientists.

  1. Introducing Recent NASA Discoveries into the Astro 101 Classroom with Modular Slide Sets

    NASA Astrophysics Data System (ADS)

    Meinke, B.; Schultz, G.; Bianchi, L.; Blair, W. P.; Len, P. M.

    2014-07-01

    This paper summarizes the special interest group discussion about slides sets for use by Astronomy 101 instructors. The NASA Science Mission Directorate Astrophysics Education and Public Outreach Forum is coordinating the development of a pilot series of slide sets to help Astronomy 101 instructors incorporate new discoveries in their classrooms. The “Astro 101 slide sets” are presentations of 5-7 slides on a new development or discovery from a NASA Astrophysics mission relevant to topics in introductory astronomy courses. We intend for these slide sets to help Astronomy 101 instructors include new developments (discoveries not yet in their textbooks) into the broader context of the course. With their modular design and non-technical language, the slide sets may also serve audiences beyond Astronomy 101 instruction and are adaptable to different needs. An example on exoplanets was highlighted in this session. In this paper, we outline the community feedback, which falls into the broad categories of content, format, uses, relevant topics, and future adaptations.

  2. Implementing Concept-Based Learning in a Large Undergraduate Classroom

    ERIC Educational Resources Information Center

    Morse, David; Jutras, France

    2008-01-01

    An experiment explicitly introducing learning strategies to a large, first-year undergraduate cell biology course was undertaken to see whether awareness and use of strategies had a measurable impact on student performance. The construction of concept maps was selected as the strategy to be introduced because of an inherent coherence with a course…

  3. Interaction and Uptake in Large Foreign Language Classrooms

    ERIC Educational Resources Information Center

    Ekembe, Eric Enongene

    2014-01-01

    Interaction determines and affects the conditions of language acquisition, especially in contexts where exposure to the target language is limited. This is believed to be successful only within the context of small classes (Chavez, 2009). This paper examines learners' progress resulting from interaction in large classes. Using pre-, post-, and…

  4. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
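The NCCV abstract credits much of its responsiveness to a map caching technique. A minimal sketch of the general idea, assuming a simple key-value cache in front of an expensive rendering step; `render_map`, the key fields, and the timings are hypothetical, not the NCCV code:

```python
# Hedged sketch of map caching: rendered maps are stored by request key,
# so repeated requests for the same (model, variable, period) skip the
# expensive rendering step entirely.
import hashlib
import time

_cache = {}

def render_map(model, variable, period):
    # Stand-in for an expensive map-generation step
    time.sleep(0.01)
    return f"map:{model}/{variable}/{period}"

def get_map(model, variable, period):
    key = hashlib.sha1(f"{model}|{variable}|{period}".encode()).hexdigest()
    if key not in _cache:                 # miss: render once, store result
        _cache[key] = render_map(model, variable, period)
    return _cache[key]                    # hit: served without re-rendering

t0 = time.perf_counter()
first = get_map("CCSM4", "tasmax", "2050s")
cold = time.perf_counter() - t0

t0 = time.perf_counter()
second = get_map("CCSM4", "tasmax", "2050s")
warm = time.perf_counter() - t0
print(first == second, warm < cold)
```

Because the projection data are static between releases, cached maps never go stale, which is what lets a scheme like this serve many concurrent users from precomputed results.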

  5. The Incredible Years Teacher Classroom Management Program: Using Coaching to Support Generalization to Real-World Classroom Settings

    ERIC Educational Resources Information Center

    Reinke, Wendy M.; Stormont, Melissa; Webster-Stratton, Carolyn; Newcomer, Lori L.; Herman, Keith C.

    2012-01-01

    This article focuses on the Incredible Years Teacher Classroom Management Training (IY TCM) intervention as an example of an evidence-based program that embeds coaching within its design. First, the core features of the IY TCM program are described. Second, the IY TCM coaching model and processes utilized to facilitate high fidelity of…

  6. Challenges Associated With Using Large Data Sets for Quality Assessment and Research in Clinical Settings

    PubMed Central

    Cohen, Bevin; Vawdrey, David K.; Liu, Jianfang; Caplan, David; Furuya, E. Yoko; Mis, Frederick W.; Larson, Elaine

    2015-01-01

    The rapidly expanding use of electronic records in health-care settings is generating unprecedented quantities of data available for clinical, epidemiological, and cost-effectiveness research. Several challenges are associated with using these data for clinical research, including issues surrounding access and information security, poor data quality, inconsistency of data within and across institutions, and a paucity of staff with expertise to manage and manipulate large clinical data sets. In this article, we describe our experience with assembling a data-mart and conducting clinical research using electronic data from four facilities within a single hospital network in New York City. We culled data from several electronic sources, including the institution’s admission-discharge-transfer system, cost accounting system, electronic health record, clinical data warehouse, and departmental records. The final data-mart contained information for more than 760,000 discharges occurring from 2006 through 2012. Using categories identified by the National Institutes of Health Big Data to Knowledge initiative as a framework, we outlined challenges encountered during the development and use of a domain-specific data-mart and recommend approaches to overcome these challenges. PMID:26351216
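The data-mart assembly described above can be illustrated with a toy example using an in-memory SQLite database; the schema, table names, and values are invented for illustration and are not the authors' actual design:

```python
# Illustrative sketch of assembling a data-mart: discharge records from an
# admission-discharge-transfer (ADT) system are joined with cost records
# from a separate accounting system into one analysis table.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE adt  (discharge_id INTEGER, facility TEXT, year INTEGER);
    CREATE TABLE cost (discharge_id INTEGER, total_cost REAL);
""")
con.executemany("INSERT INTO adt VALUES (?,?,?)",
                [(1, "A", 2006), (2, "B", 2012), (3, "A", 2009)])
con.executemany("INSERT INTO cost VALUES (?,?)",
                [(1, 1200.0), (2, 5400.0)])  # discharge 3 has no cost record

# A LEFT JOIN keeps discharges even when one source system is missing data,
# surfacing the cross-system inconsistencies the article highlights.
rows = con.execute("""
    SELECT a.discharge_id, a.facility, c.total_cost
    FROM adt a LEFT JOIN cost c USING (discharge_id)
    ORDER BY a.discharge_id
""").fetchall()
print(rows)  # discharge 3 appears with a NULL cost, flagging it for review
```

Making the join preserve unmatched records, rather than silently dropping them, is one concrete way to turn the article's data-quality concerns into an auditable step.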

  7. Challenges Associated With Using Large Data Sets for Quality Assessment and Research in Clinical Settings.

    PubMed

    Cohen, Bevin; Vawdrey, David K; Liu, Jianfang; Caplan, David; Furuya, E Yoko; Mis, Frederick W; Larson, Elaine

    2015-08-01

    The rapidly expanding use of electronic records in health-care settings is generating unprecedented quantities of data available for clinical, epidemiological, and cost-effectiveness research. Several challenges are associated with using these data for clinical research, including issues surrounding access and information security, poor data quality, inconsistency of data within and across institutions, and a paucity of staff with expertise to manage and manipulate large clinical data sets. In this article, we describe our experience with assembling a data-mart and conducting clinical research using electronic data from four facilities within a single hospital network in New York City. We culled data from several electronic sources, including the institution's admission-discharge-transfer system, cost accounting system, electronic health record, clinical data warehouse, and departmental records. The final data-mart contained information for more than 760,000 discharges occurring from 2006 through 2012. Using categories identified by the National Institutes of Health Big Data to Knowledge initiative as a framework, we outlined challenges encountered during the development and use of a domain-specific data-mart and recommend approaches to overcome these challenges. PMID:26351216

  8. An Exploration Tool for Very Large Spectrum Data Sets

    NASA Astrophysics Data System (ADS)

    Carbon, Duane F.; Henze, Christopher

    2015-01-01

    We present an exploration tool for very large spectrum data sets such as the SDSS, LAMOST, and 4MOST data sets. The tool works in two stages: the first uses batch processing and the second runs interactively. The latter employs the NASA hyperwall, a configuration of 128 workstation displays (8x16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. The stellar subset of the Sloan Digital Sky Survey DR10 was chosen to show how the tool may be used. In stage one, SDSS files for 569,738 stars are processed through our data pipeline. The pipeline fits each spectrum using an iterative continuum algorithm, distinguishing emission from absorption and handling molecular absorption bands correctly. It then measures 1659 discrete atomic and molecular spectral features that were carefully preselected based on their likelihood of being visible at some spectral type. The depths relative to the local continuum at each feature wavelength are determined for each spectrum: these depths, the local S/N level, and DR10-supplied variables such as magnitudes, colors, positions, and radial velocities are the basic measured quantities used on the hyperwall. In stage two, each hyperwall panel is used to display a 2-D scatter plot showing the depth of feature A vs the depth of feature B for all of the stars. A and B change from panel to panel. The relationships between the various (A,B) strengths and any distinctive clustering are immediately apparent when examining and inter-comparing the different panels on the hyperwall. The interactive software allows the user to select the stars in any interesting region of any 2-D plot on the hyperwall, immediately rendering the same stars on all the other 2-D plots in a unique color. The process may be repeated multiple times, each selection displaying a distinctive color on all the plots. At any time, the spectra of the selected stars may be examined in detail on a connected workstation display. We illustrate how our approach allows us to quickly isolate and examine such interesting stellar subsets as EMP stars, CV stars and C-rich stars.
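The linked-selection behavior described above (brushing a region on one feature-vs-feature panel and recoloring the same stars on every other panel) can be sketched as follows; the feature names, data, and `select` helper are made up for illustration:

```python
# Hedged sketch of linked brushing across scatter-plot panels: selecting
# stars inside a rectangle on one (feature A, feature B) panel yields an
# index set that flags the same stars on every other panel.
import random

random.seed(42)
n = 1000
# Synthetic feature depths per star; the feature names are illustrative.
depths = {f: [random.random() for _ in range(n)]
          for f in ("CaII", "Hbeta", "CH")}

def select(feat_x, feat_y, xlo, xhi, ylo, yhi):
    # Return the indices of stars falling inside the brushed rectangle
    return {i for i in range(n)
            if xlo <= depths[feat_x][i] <= xhi
            and ylo <= depths[feat_y][i] <= yhi}

picked = select("CaII", "Hbeta", 0.8, 1.0, 0.8, 1.0)
# The same index set recolors the corresponding points on any other panel,
# e.g. the ("CH", "CaII") scatter plot, without recomputing anything there.
print(len(picked), all(depths["CaII"][i] >= 0.8 for i in picked))
```

Keying every panel off one shared index set is what makes the selection "immediate": each display only has to recolor points, not re-query the data.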

  9. Implementing Concept-based Learning in a Large Undergraduate Classroom

    PubMed Central

    Jutras, France

    2008-01-01

    An experiment explicitly introducing learning strategies to a large, first-year undergraduate cell biology course was undertaken to see whether awareness and use of strategies had a measurable impact on student performance. The construction of concept maps was selected as the strategy to be introduced because of an inherent coherence with a course structured by concepts. Data were collected over three different semesters of an introductory cell biology course, all teaching similar course material with the same professor and all evaluated using similar examinations. The first group, used as a control, did not construct concept maps, the second group constructed individual concept maps, and the third group first constructed individual maps then validated their maps in small teams to provide peer feedback about the individual maps. Assessment of the experiment involved student performance on the final exam, anonymous polls of student perceptions, failure rate, and retention of information at the start of the following year. The main conclusion drawn is that concept maps without feedback have no significant effect on student performance, whereas concept maps with feedback produced a measurable increase in student problem-solving performance and a decrease in failure rates. PMID:18519616

  10. The Design and Synthesis of a Large Interactive Classroom

    NASA Astrophysics Data System (ADS)

    Clouston, Laurel L.; Kleinman, Mark H.

    1999-01-01

    Group learning techniques have been used effectively in large classes to convey the central concepts of SN1 and SN2 reactions in an introductory organic chemistry class. The activities described are best used as an introduction to these mechanisms. The class begins with the instructor relaying the key points of the reaction pathways. Following this synopsis, the class is divided through the use of assignment sheets distributed to the students upon arrival. The use of markers and poster boards, model kits, and role playing helps to explain the intricacies of the mechanisms to learners, thereby accommodating a variety of different learning styles. After a guided discussion, each group presents its results to another collection of students who used a different learning technique to understand the alternate reaction. In this manner, each student encounters two learning styles and benefits from the repetitious nature of the exercise. After the groups break into even smaller groups, higher-order questions are posed for further discussion. The class ends with the presentation of a summary slide containing all the important facts covered during the lecture.

  11. Science Teacher Beliefs and Classroom Practice Related to Constructivism in Different School Settings

    NASA Astrophysics Data System (ADS)

    Savasci, Funda; Berlin, Donna F.

    2012-02-01

    Science teacher beliefs and classroom practice related to constructivism and factors that may influence classroom practice were examined in this cross-case study. Data from four science teachers in two schools included interviews, demographic questionnaire, Classroom Learning Environment Survey (preferred/perceived), and classroom observations and documents. Using an inductive analytic approach, results suggested that the teachers embraced constructivism, but classroom observations did not confirm implementation of these beliefs for three of the four teachers. The most preferred constructivist components were personal relevance and student negotiation; the most perceived component was critical voice. Shared control was the least preferred, least perceived, and least observed constructivist component. School type, grade, student behavior/ability, curriculum/standardized testing, and parental involvement may influence classroom practice.

  12. A Case Based Analysis Preparation Strategy for Use in a Classroom Management for Inclusive Settings Course: Preliminary Observations

    ERIC Educational Resources Information Center

    Niles, William J.; Cohen, Alan

    2012-01-01

    Case based instruction (CBI) is a pedagogical option in teacher preparation growing in application but short on practical means to implement the method. This paper presents an analysis strategy and questions developed to help teacher trainees focus on classroom management issues embedded in a set of "real" cases. An analysis of teacher candidate…

  13. Student-Centred Anti-Smoking Education: Comparing a Classroom-Based and an Out-of-School Setting

    ERIC Educational Resources Information Center

    Geier, Christine S.; Bogner, Franz X.

    2010-01-01

    The present study monitored a student-centred educational anti-smoking intervention with fifth graders by focusing on their cognitive achievement and intrinsic motivation. In order to assess the potential influence of the setting on self-directed learning, the intervention was conducted in two different learning environments: a classroom-based…

  14. The Effects of a Teacher-Child Play Intervention on Classroom Compliance in Young Children in Child Care Settings

    ERIC Educational Resources Information Center

    Levine, Darren G.; Ducharme, Joseph M.

    2013-01-01

    The current study evaluated the effects of a teacher-conducted play intervention on preschool-aged children's compliance in child care settings. Study participants included 8 children ranging in age from 3 to 5 years and 5 early childhood education teachers within 5 classrooms across 5 child care centers. A combination ABAB and multiple baseline…

  15. Using Reading Classroom Explorer's Interactive Notebook: Student-Initiated Inquiries in a Collaborative Setting.

    ERIC Educational Resources Information Center

    Hughes, Joan E.; Packard, Becky Wai-Ling; Reischl, Catherine H.; Pearson, P. David

    The Reading Classroom Explorer (RCE), a hypermedia learning environment for teacher education, was developed in 1996. The environment contains searchable video clips of six exemplary teachers teaching reading, transcripts of classroom clips, questions to spur thinking, reference citations, and an interactive notebook. A study explored: what sorts…

  16. Classrooms that Work: Teaching Generic Skills in Academic and Vocational Settings.

    ERIC Educational Resources Information Center

    Stasz, Cathleen; And Others

    This report documents the second of two studies on teaching and learning generic skills in high schools. It extends the earlier work by providing a model for designing classroom instruction in both academic and vocational classrooms where teaching generic skills is an instructional goal. Ethnographic field methods were used to observe, record, and…

  17. Examining Play among Young Children in Single-Age and Multi-Age Preschool Classroom Settings

    ERIC Educational Resources Information Center

    Youhne, Mia Song

    2009-01-01

    Advocates for multi-age classrooms claim multi-age groupings benefit children (Brynes, Shuster, & Jones, 1994). Currently, there is a lack of research examining play among students in multi-age classrooms. If indeed there is a positive benefit of play among children, research is needed to examine these behaviors among and between young children in…

  18. Self-Recording with Goal Setting: A Self-Management Programme for the Classroom.

    ERIC Educational Resources Information Center

    Moore, Dennis W.; Prebble, Sherrell; Robertson, Jenny; Waetford, Rona; Anderson, Angelika

    2001-01-01

    Presents a study focusing on the effect of a self-management intervention. Included self-recording and goal setting for the behavior of eight-year-old boys attending a large suburban primary school. Suggests that the intervention was socially valid and cost effective. Includes references. (CMK)

  19. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…

  20. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…

  1. Processing large remote sensing image data sets on Beowulf clusters

    USGS Publications Warehouse

    Steinwand, Daniel R.; Maddox, Brian; Beckmann, Tim; Schmidt, Gail

    2003-01-01

    High-performance computing is often concerned with the speed at which floating-point calculations can be performed. The architectures of many parallel computers and/or their network topologies are based on these investigations. Often, benchmarks resulting from these investigations are compiled with little regard to how a large dataset would move about in these systems. This part of the Beowulf study addresses that concern by looking at specific applications software and system-level modifications. Applications include an implementation of a smoothing filter for time-series data, a parallel implementation of the decision tree algorithm used in the Landcover Characterization project, a parallel Kriging algorithm used to fit point data collected in the field on invasive species to a regular grid, and modifications to the Beowulf project's resampling algorithm to handle larger, higher resolution datasets at a national scale. Systems-level investigations include a feasibility study on Flat Neighborhood Networks and modifications of that concept with Parallel File Systems.
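The first application listed, a smoothing filter for time-series data, lends itself to the chunked parallelism a Beowulf cluster provides. A sketch under stated assumptions (this is not the USGS code): a moving average whose input is split into chunks, here dispatched to worker processes where a cluster would dispatch to nodes:

```python
# Sketch of a parallel moving-average smoother. Each worker receives the
# full series plus its [lo, hi) output range, so windows can cross chunk
# boundaries; a real cluster version would ship halo regions instead.
from concurrent.futures import ProcessPoolExecutor

def smooth_chunk(args):
    series, lo, hi, width = args
    half = width // 2
    out = []
    for i in range(lo, hi):             # window is clipped at series edges
        window = series[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def smooth(series, width=3, workers=2):
    n = len(series)
    step = (n + workers - 1) // workers
    chunks = [(series, lo, min(lo + step, n), width)
              for lo in range(0, n, step)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(smooth_chunk, chunks)  # one chunk per worker
    return [v for part in parts for v in part]

if __name__ == "__main__":
    print(smooth([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], width=3))
    # → [1.5, 2.0, 3.0, 4.0, 5.0, 5.5]
```

Because each output sample depends only on a small window, the decomposition is embarrassingly parallel; as the study notes, the real cost question on such systems is how the large dataset moves, not the arithmetic.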

  2. Evaluation of Data Visualization Software for Large Astronomical Data Sets

    NASA Astrophysics Data System (ADS)

    Doyle, Matthew; Taylor, Roger S.; Kanbur, Shashi; Schofield, Damian; Donalek, Ciro; Djorgovski, Stanislav G.; Davidoff, Scott

    2016-01-01

    This study investigates the efficacy of a 3D visualization application used to classify various types of stars using data derived from large synoptic sky surveys. Evaluation methodology included a cognitive walkthrough that prompted participants to identify a specific star type (Supernovae, RR Lyrae or Eclipsing Binary) and retrieve variable information (MAD, magratio, amplitude, frequency) from the star. This study also implemented a heuristic evaluation that applied usability standards such as the Shneiderman Visual Information Seeking Mantra to the initial iteration of the application. Findings from the evaluation indicated that improvements could be made to the application by developing effective spatial organization and implementing data reduction techniques such as linking, brushing, and small multiples.

  3. Value-based customer grouping from large retail data sets

    NASA Astrophysics Data System (ADS)

    Strehl, Alexander; Ghosh, Joydeep

    2000-04-01

    In this paper, we propose OPOSSUM, a novel similarity-based clustering algorithm using constrained, weighted graph- partitioning. Instead of binary presence or absence of products in a market-basket, we use an extended 'revenue per product' measure to better account for management objectives. Typically the number of clusters desired in a database marketing application is only in the teens or less. OPOSSUM proceeds top-down, which is more efficient and takes a small number of steps to attain the desired number of clusters as compared to bottom-up agglomerative clustering approaches. OPOSSUM delivers clusters that are balanced in terms of either customers (samples) or revenue (value). To facilitate data exploration and validation of results we introduce CLUSION, a visualization toolkit for high-dimensional clustering problems. To enable closed loop deployment of the algorithm, OPOSSUM has no user-specified parameters. Thresholding heuristics are avoided and the optimal number of clusters is automatically determined by a search for maximum performance. Results are presented on a real retail industry data-set of several thousand customers and products, to demonstrate the power of the proposed technique.
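One OPOSSUM objective, clusters balanced by revenue (value), can be illustrated in isolation. This toy sketch shows only the balancing idea with a greedy heuristic, not OPOSSUM's constrained, weighted graph partitioning; the function and customer names are invented:

```python
# Toy illustration of revenue-balanced grouping: customers are assigned,
# largest account first, to whichever cluster currently has the least
# revenue (a longest-processing-time-style heuristic).
import heapq

def revenue_balanced_groups(revenues, k=3):
    heap = [(0.0, g, []) for g in range(k)]   # (cluster revenue, id, members)
    heapq.heapify(heap)
    for cust, rev in sorted(revenues.items(), key=lambda kv: -kv[1]):
        total, g, members = heapq.heappop(heap)  # lightest cluster so far
        members.append(cust)
        heapq.heappush(heap, (total + rev, g, members))
    return {g: (total, members) for total, g, members in heap}

revenues = {"c1": 900.0, "c2": 850.0, "c3": 500.0,
            "c4": 450.0, "c5": 400.0, "c6": 100.0}
groups = revenue_balanced_groups(revenues, k=2)
print({g: total for g, (total, _) in groups.items()})
# → {0: 1750.0, 1: 1450.0}
```

OPOSSUM itself balances value while also maximizing within-cluster similarity via graph partitioning; this sketch isolates why a 'revenue per product' measure changes the grouping relative to plain customer counts.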

  4. The Single and Combined Effects of Multiple Intensities of Behavior Modification and Methylphenidate for Children with Attention Deficit Hyperactivity Disorder in a Classroom Setting

    ERIC Educational Resources Information Center

    Fabiano, Gregory A.; Pelham, William E., Jr.; Gnagy, Elizabeth M.; Burrows-MacLean, Lisa; Coles, Erika K.; Chacko, Anil; Wymbs, Brian T.; Walker, Kathryn S.; Arnold, Fran; Garefino, Allison; Keenan, Jenna K.; Onyango, Adia N.; Hoffman, Martin T.; Massetti, Greta M.; Robb, Jessica A.

    2007-01-01

    Currently behavior modification, stimulant medication, and combined treatments are supported as evidence-based interventions for attention deficit hyperactivity disorder in classroom settings. However, there has been little study of the relative effects of these two modalities and their combination in classrooms. Using a within-subject design, the…

  5. Increasing the Writing Performance of Urban Seniors Placed At-Risk through Goal-Setting in a Culturally Responsive and Creativity-Centered Classroom

    ERIC Educational Resources Information Center

    Estrada, Brittany; Warren, Susan

    2014-01-01

    Efforts to support marginalized students require not only identifying systemic inequities, but providing a classroom infrastructure that supports the academic achievement of all students. This action research study examined the effects of implementing goal-setting strategies and emphasizing creativity in a culturally responsive classroom (CRC) on…

  6. Effectiveness of Wellness-Based Classroom Guidance in Elementary School Settings: A Pilot Study

    ERIC Educational Resources Information Center

    Villalba, Jose A.; Myers, Jane E.

    2008-01-01

    A three-session, wellness-based classroom guidance unit was developed based on the Indivisible Self wellness model and presented to 55 students in 5th grade. Participants completed the Five Factor Wellness Inventory, Elementary School Version, before and after the unit. Wellness scores were significantly and positively higher at post-testing for…

  7. Classroom Management Strategies for Young Children with Challenging Behavior within Early Childhood Settings

    ERIC Educational Resources Information Center

    Jolivette, Kristine; Steed, Elizabeth A.

    2010-01-01

    Many preschool, Head Start, and kindergarten educators of young children express concern about the number of children who exhibit frequent challenging behaviors and report that managing these behaviors is difficult within these classrooms. This article describes research-based strategies with practical applications that can be used as part of…

  8. The Timing of Feedback on Mathematics Problem Solving in a Classroom Setting

    ERIC Educational Resources Information Center

    Fyfe, Emily R.; Rittle-Johnson, Bethany

    2015-01-01

    Feedback is a ubiquitous learning tool that is theorized to help learners detect and correct their errors. The goal of this study was to examine the effects of feedback in a classroom context for children solving math equivalence problems (problems with operations on both sides of the equal sign). The authors worked with children in 7 second-grade…

  9. By What Token Economy? A Classroom Learning Tool for Inclusive Settings.

    ERIC Educational Resources Information Center

    Anderson, Carol; Katsiyannis, Antonis

    1997-01-01

    Describes a token economy that used tokens styled as license plates to elicit appropriate behavior in an inclusive fifth-grade class in which four students with behavior disorders were enrolled. Student involvement in establishing the "driving rules" of the classroom is explained, the components of a token economy are outlined, and steps for group…

  10. Classroom Management Strategies for Young Children with Challenging Behavior within Early Childhood Settings

    ERIC Educational Resources Information Center

    Jolivette, Kristine; Steed, Elizabeth A.

    2010-01-01

    Many preschool, Head Start, and kindergarten educators of young children express concern about the number of children who exhibit frequent challenging behaviors and report that managing these behaviors is difficult within these classrooms. This article describes research-based strategies with practical applications that can be used as part of…

  11. An Interaction Analysis of Teacher-Inspired Classroom Language Behaviour in Alternative Language Media Settings.

    ERIC Educational Resources Information Center

    Awoniyi, Timothy A.; Ala, Florence B.O.

    1984-01-01

    Evaluated impact of using English, the "Mother Tongue" (Yoruba), or a structural bilingual mix of English and Yoruba in classroom communication. Found little difference in frequency of teaching behavior patterns, although in some instances teachers speaking Yoruba were more active. (CJM)

  12. A Comparative Study on Second Language Vocabulary Development: Study Abroad vs Classroom Settings

    ERIC Educational Resources Information Center

    Jimenez-Jimenez, Antonio F.

    2010-01-01

    The present paper aims to achieve a better understanding of the process of vocabulary acquisition by examining the development of lexical knowledge in both classroom and study abroad contexts. Taking Ife, Vives Boix, and Meara's (2000) study as a starting point, this study attempts to determine whether development in both levels of vocabulary…

  13. Making Room for Group Work I: Teaching Engineering in a Modern Classroom Setting

    ERIC Educational Resources Information Center

    Wilkens, Robert J.; Ciric, Amy R.

    2005-01-01

    This paper describes the results of several teaching experiments in the teaching Studio of The University of Dayton's Learning-Teaching Center. The Studio is a state-of-the-art classroom with a flexible seating arrangements and movable whiteboards and corkboards for small group discussions. The Studio has a communications system with a TV/VCR…

  14. Mobile Learning in a Large Blended Computer Science Classroom: System Function, Pedagogies, and Their Impact on Learning

    ERIC Educational Resources Information Center

    Shen, Ruimin; Wang, Minjuan; Gao, Wanping; Novak, D.; Tang, Lin

    2009-01-01

    The computer science classes in China's institutions of higher education often have large numbers of students. In addition, many institutions offer "blended" classes that include both on-campus and online students. These large blended classrooms have long suffered from a lack of interactivity. Many online classes simply provide recorded instructor…

  15. The Classroom Observation Schedule to Measure Intentional Communication (COSMIC): an observational measure of the intentional communication of children with autism in an unstructured classroom setting.

    PubMed

    Pasco, Greg; Gordon, Rosanna K; Howlin, Patricia; Charman, Tony

    2008-11-01

    The Classroom Observation Schedule to Measure Intentional Communication (COSMIC) was devised to provide ecologically valid outcome measures for a communication-focused intervention trial. Ninety-one children with autism spectrum disorder aged 6 years 10 months (SD 16 months) were videoed during their everyday snack, teaching and free play activities. Inter-rater reliability was high and relevant items showed significant associations with comparable items from concurrent Autism Diagnostic Observation Schedule-Generic (Lord et al. 2000, J Autism Dev Disord 30(3):205-223) assessments. In a subsample of 28 children initial differences in rates of initiations, initiated speech/vocalisation and commenting were predictive of language and communication competence 15 months later. Results suggest that the use of observational measures of intentional communication in natural settings is a valuable assessment strategy for research and clinical practice. PMID:18401692

  16. Response Grids: Practical Ways to Display Large Data Sets with High Visual Impact

    ERIC Educational Resources Information Center

    Gates, Simon

    2013-01-01

    Spreadsheets are useful for large data sets but they may be too wide or too long to print as conventional tables. Response grids offer solutions to the challenges posed by any large data set. They have wide application throughout science and for every subject and context where visual data displays are designed, within education and elsewhere.

  17. Response Grids: Practical Ways to Display Large Data Sets with High Visual Impact

    ERIC Educational Resources Information Center

    Gates, Simon

    2013-01-01

    Spreadsheets are useful for large data sets but they may be too wide or too long to print as conventional tables. Response grids offer solutions to the challenges posed by any large data set. They have wide application throughout science and for every subject and context where visual data displays are designed, within education and elsewhere.…

  18. Feasibility and Acceptability of Adapting the Eating in the Absence of Hunger Assessment for Preschoolers in the Classroom Setting.

    PubMed

    Soltero, Erica G; Ledoux, Tracey; Lee, Rebecca E

    2015-12-01

    Eating in the Absence of Hunger (EAH) represents a failure to self-regulate intake leading to overconsumption. Existing research on EAH has come from the clinical setting, limiting our understanding of this behavior. The purpose of this study was to describe the adaptation of the clinical EAH paradigm for preschoolers to the classroom setting and evaluate the feasibility and acceptability of measuring EAH in the classroom. The adapted protocol was implemented in childcare centers in Houston, Texas (N=4) and Phoenix, Arizona (N=2). The protocol was feasible, economical, and time efficient, eliminating previously identified barriers to administering the EAH assessment such as limited resources and the time constraint of delivering the assessment to participants individually. Implementation challenges included difficulty in choosing palatable test snacks that were in compliance with childcare center food regulations and the limited control over the meal that was administered prior to the assessment. The adapted protocol will allow for broader use of the EAH assessment and encourage researchers to incorporate the assessment into longitudinal studies in order to further our understanding of the causes and emergence of EAH. PMID:26172567

  19. Demands Upon Children Regarding Quality of Achievement: Standard Setting in Preschool Classrooms.

    ERIC Educational Resources Information Center

    Potter, Ellen F.

    Focusing particularly on messages transmitted by socializing agents in preschool settings, this exploratory study investigates (1) the incidence of communication events in which standards for achievement are expressed, (2) the nature of the standards, and (3) variations across settings in the nature of standard-setting events. The relationship of…

  20. Silent and Vocal Students in a Large Active Learning Chemistry Classroom: Comparison of Performance and Motivational Factors

    ERIC Educational Resources Information Center

    Obenland, Carrie A.; Munson, Ashlyn H.; Hutchinson, John S.

    2013-01-01

    Active learning is becoming more prevalent in large science classrooms, and this study shows the impact on performance of being vocal during Socratic questioning in a General Chemistry course. 800 college students over a two year period were given a pre and post-test using the Chemistry Concept Reasoning Test. The pre-test results showed that…

  1. Peer Instruction versus Class-Wide Discussion in Large Classes: A Comparison of Two Interaction Methods in the Wired Classroom.

    ERIC Educational Resources Information Center

    Nicol, David J.; Boyle, James T.

    2003-01-01

    Classroom communication system (CCS) technology makes it easier to give students immediate feedback on concept tests and to manage peer and class discussions in large classes. This study compared the effects of two different CCS discussion sequences on students' experiences of learning engineering. The results demonstrated that the type of…

  2. Mobile-Phone-Based Classroom Response Systems: Students' Perceptions of Engagement and Learning in a Large Undergraduate Course

    ERIC Educational Resources Information Center

    Dunn, Peter K.; Richardson, Alice; Oprescu, Florin; McDonald, Christine

    2013-01-01

    Using a Classroom Response System (CRS) has been associated with positive educational outcomes, by fostering student engagement and by allowing immediate feedback to both students and instructors. This study examined a low-cost CRS (VotApedia) in a large first-year class, where students responded to questions using their mobile phones. This study…

  3. Safety and science at sea: connecting science research settings to the classroom through live video

    NASA Astrophysics Data System (ADS)

    Cohen, E.; Peart, L. W.

    2011-12-01

Many science teachers start the year off with classroom safety topics. Annual repetition helps with mastery of this important and basic knowledge, while helping schools to meet their legal obligations for safe lab science. Although these lessons are necessary, they are often topical, rarely authentic, and relatively dull. Interesting connections can, however, be drawn between the importance of safety in science classrooms and the importance of safety in academic laboratories, fieldwork, shipboard research, and commercial research. Teachers can leverage these connections through live video interactions with scientists in the field, thereby creating an authentic learning environment. During the School of Rock 2009, a professional teacher research experience aboard the Integrated Ocean Drilling Program's research vessel JOIDES Resolution, safety and nature-of-science curricula were created to help address this need. By experimenting with various topics and locations on the ship that were accessible and applicable to middle school learning, 43 highly visual "safety signs" and activities were identified and presented "live" by graduate students, teachers, and scientists, as well as the ship's mates, doctor, and technical staff. Students were exposed to realistic science process skills along with safety content from the world's only riserless, deep-sea drilling research vessel. The once-in-a-lifetime experience caused the students' eyes to brighten behind their safety glasses, especially as they recognized the same eyewash station and safety gear they have to wear and attended a ship's fire and safety drill alongside scientists in hard hats and personal flotation devices. This collaborative and replicable live video approach will connect basic safety content and nature-of-science process skills for a memorable and authentic learning experience for students.

  4. Initial validation of the prekindergarten Classroom Observation Tool and goal setting system for data-based coaching.

    PubMed

    Crawford, April D; Zucker, Tricia A; Williams, Jeffrey M; Bhavsar, Vibhuti; Landry, Susan H

    2013-12-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based coaching model. We examined psychometric characteristics of the COT and explored how coaches and teachers used the COT goal-setting system. The study included 193 coaches working with 3,909 pre-k teachers in a statewide professional development program. Classrooms served 3 and 4 year olds (n = 56,390) enrolled mostly in Title I, Head Start, and other need-based pre-k programs. Coaches used the COT during a 2-hr observation at the beginning of the academic year. Teachers collected progress-monitoring data on children's language, literacy, and math outcomes three times during the year. Results indicated a theoretically supported eight-factor structure of the COT across language, literacy, and math instructional domains. Overall interrater reliability among coaches was good (.75). Although correlations with an established teacher observation measure were small, significant positive relations between COT scores and children's literacy outcomes indicate promising predictive validity. Patterns of goal-setting behaviors indicate teachers and coaches set an average of 43.17 goals during the academic year, and coaches reported that 80.62% of goals were met. Both coaches and teachers reported the COT was a helpful measure for enhancing quality of Tier 1 instruction. Limitations of the current study and implications for research and data-based coaching efforts are discussed. PMID:24059812
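The abstract reports overall interrater reliability of .75 without specifying the statistic. One common choice for paired categorical observation scores is Cohen's kappa; the sketch below is illustrative only (the ratings are invented, and the COT study's actual reliability index may differ):

```python
# Cohen's kappa for two observers rating the same classrooms.
# All ratings below are hypothetical, for illustration only.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    # observed proportion of exact agreements
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance-expected agreement from each rater's marginal distribution
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum(pa[c] * pb[c] for c in pa) / n**2
    return (observed - expected) / (1 - expected)

# hypothetical 1-3 quality ratings from two coaches on ten classrooms
a = [3, 2, 3, 1, 2, 3, 2, 1, 3, 2]
b = [3, 2, 3, 2, 2, 3, 2, 1, 3, 3]
print(round(cohens_kappa(a, b), 2))  # prints 0.68
```

Kappa corrects raw percent agreement for the agreement two raters would reach by chance, which is why it is often preferred over simple agreement rates.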

  5. Effective Extensive Reading outside the Classroom: A Large-Scale Experiment

    ERIC Educational Resources Information Center

    Robb, Thomas; Kano, Makimi

    2013-01-01

    We report on a large-scale implementation of extensive reading (ER) in a university setting in Japan where all students were required to read outside class time as part of their course requirement. A pre/posttest comparison between the 2009 cohort of students who read outside of class and the 2008 cohort who did no outside reading shows that the…

  6. E-Classroom. Electronic Card Catalog, Real-Life Sets, and Estimating.

    ERIC Educational Resources Information Center

    Cavanaugh, Betty; McArdle, Donna

    2000-01-01

    Presents standards-based activities that include learning about an electronic card catalog by playing card games; learning to count and to recognize numerical sets by identifying real-life sets (e.g., items in the building or on the playground); and learning about estimation and spatial sense using students and a digital camera. (SM)

  7. Selective amplification of protein-coding regions of large sets of genes using statistically designed primer sets

    SciTech Connect

    Lopez-Nieto, C.E.; Nigam, S.K. |

    1996-07-01

We describe a novel approach to designing a set of primers selective for large groups of genes. This method is based on the distribution frequency of all nucleotide combinations (octa- to decanucleotides) and the combined ability of primer pairs, based on these oligonucleotides, to detect genes. By analyzing 1000 human mRNAs, we found that a surprisingly small subset of octanucleotides is shared by a high proportion of human protein-coding region sense strands. By computer simulation of polymerase chain reactions, a set based on only 30 primers was able to detect approximately 75% of known (and presumably unknown) human protein-coding regions. To validate the method and provide experimental support for the feasibility of the more ambitious goal of targeting human protein-coding regions, we sought to apply the technique to a large protein family: G-protein coupled receptors (GPCRs). Our results indicate that there is sufficient low-level homology among human coding regions to allow design of a limited set of primer pairs that can selectively target coding regions in general, as well as genomic subsets (e.g., GPCRs). The approach should be generally applicable to human coding regions, and thus provide an efficient method for analyzing much of the transcriptionally active human genome. 23 refs., 5 figs., 2 tabs.
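The core idea, counting octamers shared across many sequences and choosing a small primer set that covers most of them, can be sketched as a greedy set cover. This is an illustrative reconstruction, not the authors' exact procedure, and the sequences below are toy data:

```python
# Greedy selection of octamers that "detect" (occur in) as many
# sequences as possible; a simplified stand-in for the paper's method.
from collections import defaultdict

def octamers(seq):
    return {seq[i:i+8] for i in range(len(seq) - 7)}

def greedy_primer_set(sequences, n_primers):
    # map each octamer to the set of sequence indices containing it
    hits = defaultdict(set)
    for idx, seq in enumerate(sequences):
        for mer in octamers(seq):
            hits[mer].add(idx)
    covered, chosen = set(), []
    for _ in range(n_primers):
        best = max(hits, key=lambda m: len(hits[m] - covered), default=None)
        if best is None or not (hits[best] - covered):
            break  # no octamer adds new coverage
        chosen.append(best)
        covered |= hits.pop(best)
    return chosen, covered

seqs = ["ATGGCTAGCTAGGCTA", "ATGGCTAGCAAATTTC", "GGGCCCATGGCTAGCA"]
primers, detected = greedy_primer_set(seqs, 2)
print(len(detected))  # 3: all toy sequences share the octamer ATGGCTAG
```

The real design problem also has to respect primer-pair orientation, spacing, and melting temperature, which this sketch ignores.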

  8. Large Atomic Natural Orbital Basis Sets for the First Transition Row Atoms

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R. (Technical Monitor)

    1994-01-01

Large atomic natural orbital (ANO) basis sets are tabulated for Sc to Cu. The primitive sets are taken from the large sets optimized by Partridge, namely (21s 13p 8d) for Sc and Ti and (20s 12p 9d) for V to Cu. These primitive sets are supplemented with three p, one d, six f, and four g functions. The ANO sets are derived from configuration interaction density matrices constructed as the average of the lowest states derived from the 3d(sup n)4s(sup 2) and 3d(sup n+1)4s(sup 1) occupations. For Ni, the 1S(3d(sup 10)) state is included in the averaging. The choice of basis sets for molecular calculations is discussed.

  9. The Distracting Effects of a Ringing Cell Phone: An Investigation of the Laboratory and the Classroom Setting

    PubMed Central

    Shelton, Jill T.; Elliott, Emily M.; Lynn, Sharon D.; Exner, Amanda L.

    2010-01-01

    The detrimental effects of a ringing phone on cognitive performance were investigated in four experiments. In Experiments 1 and 2, the effects of different types of sounds (a standard cell phone ring, irrelevant tones and an instrumental song commonly encountered by participants) on performance were examined. In Experiment 1, slower responses were observed in all auditory groups relative to a silence condition, but participants in the ring and song conditions recovered more slowly. In Experiment 2, participants who were warned about the potential for distraction recovered more quickly, suggesting a benefit of this prior knowledge. This investigation continued in a college classroom setting (Experiments 3a and 3b); students were exposed to a ringing cell phone during the lecture. Performance on a surprise quiz revealed low accuracy rates on material presented while the phone was ringing. These findings offer insight into top-down cognitive processes that moderate involuntary orienting responses associated with a common stimulus encountered in the environment. PMID:21234286

  10. Assessing the Effectiveness of Inquiry-based Learning Techniques Implemented in Large Classroom Settings

    NASA Astrophysics Data System (ADS)

    Steer, D. N.; McConnell, D. A.; Owens, K.

    2001-12-01

    Geoscience and education faculty at The University of Akron jointly developed a series of inquiry-based learning modules aimed at both non-major and major student populations enrolled in introductory geology courses. These courses typically serve 2500 students per year in four to six classes of 40-160 students each per section. Twelve modules were developed that contained common topics and assessments appropriate to Earth Science, Environmental Geology and Physical Geology classes. All modules were designed to meet four primary learning objectives agreed upon by Department of Geology faculty. These major objectives include: 1) Improvement of student understanding of the scientific method; 2) Incorporation of problem solving strategies involving analysis, synthesis, and interpretation; 3) Development of the ability to distinguish between inferences, data and observations; and 4) Obtaining an understanding of basic processes that operate on Earth. Additional objectives that may be addressed by selected modules include: 1) The societal relevance of science; 2) Use and interpretation of quantitative data to better understand the Earth; 3) Development of the students' ability to communicate scientific results; 4) Distinguishing differences between science, religion and pseudo-science; 5) Evaluation of scientific information found in the mass media; and 6) Building interpersonal relationships through in-class group work. Student pre- and post-instruction progress was evaluated by administering a test of logical thinking, an attitude toward science survey, and formative evaluations. Scores from the logical thinking instrument were used to form balanced four-person working groups based on the students' incoming cognitive level. Groups were required to complete a series of activities and/or exercises that targeted different cognitive domains based upon Bloom's taxonomy (knowledge, comprehension, application, analysis, synthesis and evaluation of information). 
Daily assessments of knowledge-level learning included evaluations of student responses to pre- and post-instruction conceptual test questions, short group exercises and content-oriented exam questions. Higher level thinking skills were assessed when students completed exercises that required the completion of Venn diagrams, concept maps and/or evaluation rubrics both during class periods and on exams. Initial results indicate that these techniques improved student attendance significantly and improved overall retention in the course by 8-14% over traditional lecture formats. Student scores on multiple choice exam questions were slightly higher (1-3%) for students taught in the active learning environment and short answer questions showed larger gains (7%) over students' scores in a more traditional class structure.

  11. A Day in Third Grade: A Large-Scale Study of Classroom Quality and Teacher and Student Behavior

    ERIC Educational Resources Information Center

    Elementary School Journal, 2005

    2005-01-01

    Observations of 780 third-grade classrooms described classroom activities, child-teacher interactions, and dimensions of the global classroom environment, which were examined in relation to structural aspects of the classroom and child behavior. 1 child per classroom was targeted for observation in relation to classroom quality and teacher and…

  12. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1993-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X DataSlice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.
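Of the proposed components, the regridding library is the most concrete to illustrate. A minimal bilinear regridder of the kind such an environment might provide (function names and conventions are assumptions, not taken from the proposal):

```python
# Bilinear regridding of a field from one regular lat/lon grid to another.
# Illustrative sketch only; real Earth-science regridders also handle
# missing data, map projections, and conservative remapping.
import numpy as np

def regrid_bilinear(field, src_lat, src_lon, dst_lat, dst_lon):
    """Interpolate field[src_lat, src_lon] onto the destination grid."""
    out = np.empty((dst_lat.size, dst_lon.size))
    for i, lat in enumerate(dst_lat):
        for j, lon in enumerate(dst_lon):
            # index of the source cell's lower-left corner
            yi = np.clip(np.searchsorted(src_lat, lat) - 1, 0, src_lat.size - 2)
            xi = np.clip(np.searchsorted(src_lon, lon) - 1, 0, src_lon.size - 2)
            ty = (lat - src_lat[yi]) / (src_lat[yi + 1] - src_lat[yi])
            tx = (lon - src_lon[xi]) / (src_lon[xi + 1] - src_lon[xi])
            f = field[yi:yi + 2, xi:xi + 2]
            out[i, j] = ((1 - ty) * (1 - tx) * f[0, 0] + (1 - ty) * tx * f[0, 1]
                         + ty * (1 - tx) * f[1, 0] + ty * tx * f[1, 1])
    return out

src_lat = np.array([0.0, 10.0])
src_lon = np.array([0.0, 10.0])
field = np.array([[0.0, 10.0], [10.0, 20.0]])  # a plane: value = lat + lon
dst = regrid_bilinear(field, src_lat, src_lon,
                      np.array([5.0]), np.array([5.0]))
print(dst[0, 0])  # 10.0: bilinear interpolation reproduces a plane exactly
```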

  13. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1992-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X Data Slice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.

  14. A Classroom Exercise in Spatial Analysis Using an Imaginary Data Set.

    ERIC Educational Resources Information Center

    Kopaska-Merkel, David C.

    One skill that elementary students need to acquire is the ability to analyze spatially distributed data. In this activity students are asked to complete the following tasks: (1) plot a set of data (related to "mud-sharks"--an imaginary fish) on a map of the state of Alabama, (2) identify trends in the data, (3) make graphs using the data…
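The same three tasks can be sketched in code for an instructor preparing the exercise: plot the stations, then quantify the trend students are asked to spot with a least-squares slope. All mud-shark counts and coordinates below are invented, in the spirit of the activity's imaginary data set:

```python
# Hypothetical station longitudes (degrees) and mud-shark counts;
# the slope of a linear fit quantifies the spatial trend students
# would identify by eye from their plotted map.
import numpy as np

lon = np.array([-88.0, -87.5, -87.0, -86.5, -86.0])
counts = np.array([2, 5, 9, 14, 18])

slope, intercept = np.polyfit(lon, counts, 1)
print(round(slope, 1))  # 8.2: counts rise by roughly 8 per degree eastward
```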

  15. Students with ASD in Mainstream Primary Education Settings: Teachers' Experiences in Western Australian Classrooms

    ERIC Educational Resources Information Center

    Soto-Chodiman, Rebecca; Pooley, Julie Ann; Cohen, Lynne; Taylor, Myra Frances

    2012-01-01

    The shift to inclusive education within Australia has resulted in increasing numbers of students with autism spectrum disorders (ASD) being placed in mainstream educational settings. This move has created new demands on teachers who are not necessarily trained to meet the challenge. Therefore, the present study aimed to develop an understanding of…

  16. Ready, Set, SCIENCE!: Putting Research to Work in K-8 Science Classrooms

    ERIC Educational Resources Information Center

    Michaels, Sarah; Shouse, Andrew W.; Schweingruber, Heidi A.

    2007-01-01

What types of instructional experiences help K-8 students learn science with understanding? What do science educators, teachers, teacher leaders, science specialists, professional development staff, curriculum designers, and school administrators need to know to create and support such experiences? "Ready, Set, Science!" guides the way with an…

  17. Intercultural Education Set Forward: Operational Strategies and Procedures in Cypriot Classrooms

    ERIC Educational Resources Information Center

    Hajisoteriou, Christina

    2012-01-01

Teachers in Cyprus are being called upon for the first time to teach within culturally diverse educational settings. Given the substantial role teachers play in the implementation of intercultural education, this paper explores the intercultural strategies and procedures adopted by primary school teachers in Cyprus. Interviews were carried out…

  18. A Classroom Exercise in Spatial Analysis Using an Imaginary Data Set.

    ERIC Educational Resources Information Center

    Kopaska-Merkel, David C.

One skill that elementary students need to acquire is the ability to analyze spatially distributed data. In this activity students are asked to complete the following tasks: (1) plot a set of data (related to "mud-sharks"--an imaginary fish) on a map of the state of Alabama, (2) identify trends in the data, (3) make graphs using the data…

  19. BEST in CLASS: A Classroom-Based Model for Ameliorating Problem Behavior in Early Childhood Settings

    ERIC Educational Resources Information Center

    Vo, Abigail; Sutherland, Kevin S.; Conroy, Maureen A.

    2012-01-01

    As more young children enter school settings to attend early childhood programs, early childhood teachers and school psychologists have been charged with supporting a growing number of young children with chronic problem behaviors that put them at risk for the development of emotional/behavioral disorders (EBDs). There is a need for effective,…

  20. Best in Class: A Classroom-Based Model for Ameliorating Problem Behavior in Early Childhood Settings

    ERIC Educational Resources Information Center

    Vo, Abigail K.; Sutherland, Kevin S.; Conroy, Maureen A.

    2012-01-01

    As more young children enter school settings to attend early childhood programs, early childhood teachers and school psychologists have been charged with supporting a growing number of young children with chronic problem behaviors that put them at risk for the development of emotional/behavioral disorders (EBDs). There is a need for effective,…

  1. Intercultural Education Set Forward: Operational Strategies and Procedures in Cypriot Classrooms

    ERIC Educational Resources Information Center

    Hajisoteriou, Christina

    2012-01-01

Teachers in Cyprus are being called upon for the first time to teach within culturally diverse educational settings. Given the substantial role teachers play in the implementation of intercultural education, this paper explores the intercultural strategies and procedures adopted by primary school teachers in Cyprus. Interviews were carried out…

  2. Teleconsultation in School Settings: Linking Classroom Teachers and Behavior Analysts Through Web-Based Technology

    PubMed Central

    Frieder, Jessica E; Peterson, Stephanie M; Woodward, Judy; Crane, JaeLee; Garner, Marlane

    2009-01-01

    This paper describes a technically driven, collaborative approach to assessing the function of problem behavior using web-based technology. A case example is provided to illustrate the process used in this pilot project. A school team conducted a functional analysis with a child who demonstrated challenging behaviors in a preschool setting. Behavior analysts at a university setting provided the school team with initial workshop trainings, on-site visits, e-mail and phone communication, as well as live web-based feedback on functional analysis sessions. The school personnel implemented the functional analysis with high fidelity and scored the data reliably. Outcomes of the project suggest that there is great potential for collaboration via the use of web-based technologies for ongoing assessment and development of effective interventions. However, an empirical evaluation of this model should be conducted before wide-scale adoption is recommended. PMID:22477705

  3. Interdependent Learning in an Open Classroom Setting: Dean Rusk Elementary School, 1972-73. Research and Development Report, Volume 7, Number 7, August 1973.

    ERIC Educational Resources Information Center

    Goettee, Margaret

    All special programs at Dean Rusk Elementary School, funded in part under Title I of the 1965 Elementary Secondary Education Act, combined to facilitate individualized instruction in the nongraded, open classroom setting of the school. To better meet the needs of the pupils during the 1972-73 school year, the Follow Through Program included, for…

  4. Using Mobile Phones to Increase Classroom Interaction

    ERIC Educational Resources Information Center

    Cobb, Stephanie; Heaney, Rose; Corcoran, Olivia; Henderson-Begg, Stephanie

    2010-01-01

    This study examines the possible benefits of using mobile phones to increase interaction and promote active learning in large classroom settings. First year undergraduate students studying Cellular Processes at the University of East London took part in a trial of a new text-based classroom interaction system and evaluated their experience by…

  5. Impact of Abbreviated Lecture with Interactive Mini-cases vs Traditional Lecture on Student Performance in the Large Classroom

    PubMed Central

    Nykamp, Diane L.; Momary, Kathryn M.

    2014-01-01

Objective. To compare the impact of 2 different teaching and learning methods on student mastery of learning objectives in a pharmacotherapy module in the large classroom setting. Design. Two teaching and learning methods were implemented and compared in a required pharmacotherapy module for 2 years. The first year, multiple interactive mini-cases with in-class individual assessment and an abbreviated lecture were used to teach osteoarthritis; a traditional lecture with 1 in-class case discussion was used to teach gout. In the second year, the same topics were used but the methods were flipped. Student performance on pre/post individual readiness assessment tests (iRATs), case questions, and subsequent examinations were compared each year by the teaching and learning method and then between years by topic for each method. Students also voluntarily completed a 20-item evaluation of the teaching and learning methods. Assessment. Postpresentation iRATs were significantly higher than prepresentation iRATs for each topic each year with the interactive mini-cases; there was no significant difference in iRATs before and after traditional lecture. For osteoarthritis, postpresentation iRATs after interactive mini-cases in year 1 were significantly higher than postpresentation iRATs after traditional lecture in year 2; the difference in iRATs for gout per learning method was not significant. The difference between examination performance for osteoarthritis and gout was not significant when the teaching and learning methods were compared. On the student evaluations, 2 items were significant both years when answers were compared by teaching and learning method. Each year, students ranked their class participation higher with interactive cases than with traditional lecture, but both years they reported enjoying the traditional lecture format more. Conclusion.
Multiple interactive mini-cases with an abbreviated lecture improved immediate mastery of learning objectives compared to a traditional lecture format, regardless of therapeutic topic, but did not improve student performance on subsequent examinations. PMID:25657376

  6. Small Group Collaboration in the Large Lecture Setting: Collaborative Process, Pedagogical Paradigms, and Institutional Constraints.

    ERIC Educational Resources Information Center

    Michalchik, Vera; Schaeffer, Evonne; Tovar, Lawrence; Steinbeck, Reinhold; Bhargava, Tina; Kerns, Charles; Engel, Claudia; Levtov, Ruti

    This paper focuses on some of the key issues involved in implementing a collaborative design project in the setting of the large undergraduate lecture course at a major research university, offering a preliminary analysis of the assignment mainly as a function of how students managed and interpreted it. The collaborative design project was…

  7. DocCube: Multi-Dimensional Visualization and Exploration of Large Document Sets.

    ERIC Educational Resources Information Center

    Mothe, Josiane; Chrisment, Claude; Dousset, Bernard; Alaux, Joel

    2003-01-01

    Describes a user interface that provides global visualizations of large document sets to help users formulate the query that corresponds to their information needs. Highlights include concept hierarchies that users can browse to specify and refine information needs; knowledge discovery in databases and texts; and multidimensional modeling.

  8. Preschoolers' Nonsymbolic Arithmetic with Large Sets: Is Addition More Accurate than Subtraction?

    ERIC Educational Resources Information Center

    Shinskey, Jeanne L.; Chan, Cindy Ho-man; Coleman, Rhea; Moxom, Lauren; Yamamoto, Eri

    2009-01-01

    Adult and developing humans share with other animals analog magnitude representations of number that support nonsymbolic arithmetic with large sets. This experiment tested the hypothesis that such representations may be more accurate for addition than for subtraction in children as young as 3 1/2 years of age. In these tasks, the experimenter hid…

  9. Influences of large sets of environmental exposures on immune responses in healthy adult men

    PubMed Central

    Yi, Buqing; Rykova, Marina; Jäger, Gundula; Feuerecker, Matthias; Hörl, Marion; Matzel, Sandra; Ponomarev, Sergey; Vassilieva, Galina; Nichiporuk, Igor; Choukèr, Alexander

    2015-01-01

Environmental factors have long been known to influence immune responses. In particular, clinical studies of the association between migration and increased risk of atopy/asthma have provided important information on the role of migration-associated large sets of environmental exposures in the development of allergic diseases. However, investigations of environmental effects on immune responses have mostly been limited to candidate environmental exposures, such as air pollution. The influences of large sets of environmental exposures on immune responses are still largely unknown. A simulated 520-d Mars mission provided an opportunity to investigate this topic. Six healthy males lived in a closed habitat simulating a spacecraft for 520 days. When they exited their “spacecraft” after the mission, the scenario was similar to that of migration, involving exposure to a new set of environmental pollutants and allergens. We measured multiple immune parameters with blood samples at chosen time points after the mission. At the early adaptation stage, highly enhanced cytokine responses were observed upon ex vivo antigen stimulations. For cell population frequencies, we found the subjects displayed increased neutrophils. These results may represent the immune changes that occur in healthy humans when migrating, indicating that large sets of environmental exposures may trigger aberrant immune activity. PMID:26306804

  10. Influences of large sets of environmental exposures on immune responses in healthy adult men.

    PubMed

    Yi, Buqing; Rykova, Marina; Jäger, Gundula; Feuerecker, Matthias; Hörl, Marion; Matzel, Sandra; Ponomarev, Sergey; Vassilieva, Galina; Nichiporuk, Igor; Choukèr, Alexander

    2015-01-01

Environmental factors have long been known to influence immune responses. In particular, clinical studies of the association between migration and increased risk of atopy/asthma have provided important information on the role of migration-associated large sets of environmental exposures in the development of allergic diseases. However, investigations of environmental effects on immune responses have mostly been limited to candidate environmental exposures, such as air pollution. The influences of large sets of environmental exposures on immune responses are still largely unknown. A simulated 520-d Mars mission provided an opportunity to investigate this topic. Six healthy males lived in a closed habitat simulating a spacecraft for 520 days. When they exited their "spacecraft" after the mission, the scenario was similar to that of migration, involving exposure to a new set of environmental pollutants and allergens. We measured multiple immune parameters with blood samples at chosen time points after the mission. At the early adaptation stage, highly enhanced cytokine responses were observed upon ex vivo antigen stimulations. For cell population frequencies, we found the subjects displayed increased neutrophils. These results may represent the immune changes that occur in healthy humans when migrating, indicating that large sets of environmental exposures may trigger aberrant immune activity. PMID:26306804

  11. DocCube: Multi-Dimensional Visualization and Exploration of Large Document Sets.

    ERIC Educational Resources Information Center

    Mothe, Josiane; Chrisment, Claude; Dousset, Bernard; Alaux, Joel

    2003-01-01

    Describes a user interface that provides global visualizations of large document sets to help users formulate the query that corresponds to their information needs. Highlights include concept hierarchies that users can browse to specify and refine information needs; knowledge discovery in databases and texts; and multidimensional modeling.…

  12. Experiments and other methods for developing expertise with design of experiments in a classroom setting

    NASA Technical Reports Server (NTRS)

    Patterson, John W.

    1990-01-01

The only way to gain genuine expertise in Statistical Process Control (SPC) and the design of experiments (DOX) is with repeated practice, but not on canned problems with dead data sets. Rather, one must negotiate a wide variety of problems, each with its own peculiarities and its own constantly changing data. The problems should not be of the type for which there is a single, well-defined answer that can be looked up in a fraternity file or in some text. The problems should match as closely as possible the open-ended types for which there is always an abundance of uncertainty. These are the only kinds that arise in real research, whether that be basic research in academe or engineering research in industry. To gain this kind of experience, either as a professional consultant or as an industrial employee, takes years. Vast amounts of money, not to mention careers, must be put at risk. The purpose here is to outline some realistic simulation-type lab exercises that are so simple and inexpensive to run that the students can repeat them as often as desired at virtually no cost. Simulations also allow the instructor to design problems whose outcomes are as noisy as desired but still predictable within limits. Also, the instructor and the students can learn a great deal more from the postmortem conducted after the exercise is completed. One never knows for sure what the true data should have been when dealing only with real-life experiments. To add a bit more realism to the exercises, it is sometimes desirable to make the students pay for each experimental result from a make-believe budget allocation for the problem.
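Such a simulator can be very small. A sketch of an instructor-side harness in this spirit (factor names, effect sizes, noise level, and run cost are all invented): students choose coded factor settings, pay for each run from a budget, and receive a noisy response whose true model only the instructor knows.

```python
# Minimal DOX practice simulator: hidden linear model plus Gaussian noise,
# with a per-run charge against the student's budget. All numbers invented.
import random

TRUE_EFFECTS = {"temp": 2.0, "pressure": -1.5}  # hidden from students
NOISE_SD = 1.0
COST_PER_RUN = 50

def run_experiment(settings, budget, rng=random):
    """Return (noisy response, remaining budget); settings are coded -1/+1."""
    if budget < COST_PER_RUN:
        raise RuntimeError("budget exhausted")
    response = 10.0 + sum(TRUE_EFFECTS[f] * x for f, x in settings.items())
    response += rng.gauss(0, NOISE_SD)
    return response, budget - COST_PER_RUN

rng = random.Random(0)  # seeded, so the postmortem can replay the "truth"
y, budget = run_experiment({"temp": +1, "pressure": -1}, budget=500, rng=rng)
print(budget)  # 450 left after one run
```

Because the instructor controls the seed and the hidden effects, the postmortem can compare each team's fitted model against the true one, which is impossible with real laboratory data.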

  13. Large-scale detection of metals with a small set of fluorescent DNA-like chemosensors.

    PubMed

    Yuen, Lik Hang; Franzini, Raphael M; Tan, Samuel S; Kool, Eric T

    2014-10-15

    An important advantage of pattern-based chemosensor sets is their potential to detect and differentiate a large number of analytes with only a few sensors. Here we test this principle at a conceptual limit by analyzing a large set of metal ion analytes covering essentially the entire periodic table, employing fluorescent DNA-like chemosensors on solid support. A tetrameric "oligodeoxyfluoroside" (ODF) library of 6561 members containing metal-binding monomers was screened for strong responders to 57 metal ions in solution. Our results show that a set of 9 chemosensors could successfully discriminate the 57 species, including alkali, alkaline earth, post-transition, transition, and lanthanide metals. As few as 6 ODF chemosensors could detect and differentiate 50 metals at 100 μM; sensitivity for some metals was achieved at mid-nanomolar ranges. A blind test with 50 metals further confirmed the discriminating power of the ODFs. PMID:25255102
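
    The pattern-discrimination principle (identifying an analyte by matching its response pattern against a library of known patterns) can be sketched as nearest-pattern matching on response vectors. The ions and response values below are invented for illustration and are not the paper's data.

```python
import math

# Hypothetical fluorescence response patterns: each metal ion is
# characterized by the responses of a small chemosensor set.
# All values are illustrative, not measured data.
TRAINING = {
    "Cu2+": [0.9, 0.1, 0.4],
    "Hg2+": [0.2, 0.8, 0.1],
    "Eu3+": [0.1, 0.3, 0.9],
}

def euclidean(a, b):
    """Distance between two response patterns."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(pattern):
    """Assign an unknown response pattern to the closest known ion."""
    return min(TRAINING, key=lambda ion: euclidean(TRAINING[ion], pattern))

print(identify([0.85, 0.15, 0.35]))  # closest to the Cu2+ pattern
```

    A real pattern-based sensor set would use many more training patterns per ion and a statistical classifier, but the bucketing-by-similarity idea is the same.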

  14. Out in the Classroom: Transgender Student Experiences at a Large Public University

    ERIC Educational Resources Information Center

    Pryor, Jonathan T.

    2015-01-01

    Faculty and peer interactions are 2 of the most important relationships for college students to foster (Astin, 1993). Transgender college students have an increasing visible presence on college campuses (Pusch, 2005), yet limited research exists on their experiences and struggles in the classroom environment (Garvey & Rankin, 2015; Renn,…

  15. Classroom Response Systems for Implementing "Interactive Inquiry" in Large Organic Chemistry Classes

    ERIC Educational Resources Information Center

    Morrison, Richard W.; Caughran, Joel A.; Sauers, Angela L.

    2014-01-01

    The authors have developed "sequence response applications" for classroom response systems (CRSs) that allow instructors to engage and actively involve students in the learning process, probe for common misconceptions regarding lecture material, and increase interaction between instructors and students. "Guided inquiry" and…

  16. Coaching as a Key Component in Teachers' Professional Development: Improving Classroom Practices in Head Start Settings. OPRE Report 2012-4

    ERIC Educational Resources Information Center

    Lloyd, Chrishana M.; Modlin, Emily L.

    2012-01-01

    Head Start CARES (Classroom-based Approaches and Resources for Emotion and Social Skill Promotion) is a large-scale, national research demonstration that was designed to test the effects of a one-year program aimed at improving pre-kindergarteners' social and emotional readiness for school. To facilitate the delivery of the program, teachers…

  17. Classroom Discourse and Reading Comprehension in Bilingual Settings: A Case Study of Collaborative Reasoning in a Chinese Heritage Language Learners' Classroom

    ERIC Educational Resources Information Center

    Tsai, Hsiao-Feng

    2012-01-01

    This dissertation examines the participation of one Chinese teacher and five 13 to 15 year-old Chinese heritage students in a classroom in a Chinese community school during group discussions about narrative texts. In this study, the teacher used Collaborative Reasoning (CR) (Anderson, et al., 2001) to help the Chinese heritage students extend…

  19. Accelerated similarity searching and clustering of large compound sets by geometric embedding and locality sensitive hashing

    PubMed Central

    Cao, Yiqun; Jiang, Tao; Girke, Thomas

    2010-01-01

    Motivation: Similarity searching and clustering of chemical compounds by structural similarities are important computational approaches for identifying drug-like small molecules. Most algorithms available for these tasks are limited by their speed and scalability, and cannot handle today's large compound databases with several million entries. Results: In this article, we introduce a new algorithm for accelerated similarity searching and clustering of very large compound sets using embedding and indexing (EI) techniques. First, we present EI-Search as a general purpose similarity search method for finding objects with similar features in large databases and apply it here to searching and clustering of large compound sets. The method embeds the compounds in a high-dimensional Euclidean space and searches this space using an efficient index-aware nearest neighbor search method based on locality sensitive hashing (LSH). Second, to cluster large compound sets, we introduce the EI-Clustering algorithm that combines the EI-Search method with Jarvis–Patrick clustering. Both methods were tested on three large datasets with sizes ranging from about 260 000 to over 19 million compounds. In comparison to sequential search methods, the EI-Search method was 40–200 times faster, while maintaining comparable recall rates. The EI-Clustering method allowed us to significantly reduce the CPU time required to cluster these large compound libraries from several months to only a few days. Availability: Software implementations and online services have been developed based on the methods introduced in this study. The online services provide access to the generated clustering results and ultra-fast similarity searching of the PubChem Compound database with subsecond response time. Contact: thomas.girke@ucr.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20179075
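
    The core idea of EI-Search, bucketing embedded objects with locality sensitive hashing so that a query is compared only against a small candidate set rather than all N entries, can be sketched with random-hyperplane (cosine) LSH. The vectors, dimensions, and single hash table below are illustrative simplifications, not the paper's compound embedding.

```python
import random

random.seed(0)
DIM, N_PLANES = 8, 6

# Random hyperplanes define an LSH signature: each bit records which
# side of a hyperplane a vector falls on (cosine-similarity LSH).
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N_PLANES)]

def signature(vec):
    return tuple(sum(p * v for p, v in zip(plane, vec)) >= 0 for plane in planes)

def build_index(vectors):
    """Bucket vectors by signature so a query touches only one bucket."""
    index = {}
    for i, v in enumerate(vectors):
        index.setdefault(signature(v), []).append(i)
    return index

def query(index, vectors, q):
    """Return the nearest candidate sharing the query's bucket, if any."""
    candidates = index.get(signature(q), [])
    return min(candidates, default=None,
               key=lambda i: sum((a - b) ** 2 for a, b in zip(vectors[i], q)))

vectors = [[1.0] * DIM, [-1.0] * DIM, [0.9] * DIM]
index = build_index(vectors)
print(query(index, vectors, [1.0] * DIM))  # nearest neighbour is vector 0
```

    A production LSH search uses several hash tables to raise recall; one table is enough to show why the candidate set, and hence the query time, stays small.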

  20. Strategies for MCR image analysis of large hyperspectral data-sets.

    PubMed

    Scurr, David J; Hook, Andrew L; Burley, Jonathan A; Williams, Philip M; Anderson, Daniel G; Langer, Robert C; Davies, Martyn C; Alexander, Morgan R

    2013-01-01

    Polymer microarrays are a key enabling technology for high throughput materials discovery. In this study, multivariate image analysis, specifically multivariate curve resolution (MCR), is applied to the hyperspectral time of flight secondary ion mass spectrometry (ToF-SIMS) data from eight individual microarray spots. Rather than analysing the data individually, the data-sets are collated and analysed as a single large data-set. Desktop computing is not a practical method for undertaking MCR analysis of such large data-sets due to the constraints of memory and computational overhead. Here, a distributed memory High-Performance Computing facility (HPC) is used. Similar to what is achieved using MCR analysis of individual samples, the results from this consolidated data-set allow clear identification of the substrate material; furthermore, specific chemistries common to different spots are also identified. The application of the HPC facility to the MCR analysis of ToF-SIMS hyperspectral data-sets demonstrates a potential methodology for the analysis of macro-scale data without compromising spatial resolution (data 'binning'). Copyright © 2012 John Wiley & Sons, Ltd. PMID:23450109

  2. Fast graph-based relaxed clustering for large data sets using minimal enclosing ball.

    PubMed

    Qian, Pengjiang; Chung, Fu-Lai; Wang, Shitong; Deng, Zhaohong

    2012-06-01

    Although graph-based relaxed clustering (GRC) is one of the spectral clustering algorithms with straightforwardness and self-adaptability, it is sensitive to the parameters of the adopted similarity measure and also has high time complexity O(N^3) which severely weakens its usefulness for large data sets. In order to overcome these shortcomings, after introducing certain constraints for GRC, an enhanced version of GRC [constrained GRC (CGRC)] is proposed to increase the robustness of GRC to the parameters of the adopted similarity measure, and accordingly, a novel algorithm called fast GRC (FGRC) based on CGRC is developed in this paper by using the core-set-based minimal enclosing ball approximation. A distinctive advantage of FGRC is that its asymptotic time complexity is linear with the data set size N. At the same time, FGRC also inherits the straightforwardness and self-adaptability from GRC, making the proposed FGRC a fast and effective clustering algorithm for large data sets. The advantages of FGRC are validated by various benchmarking and real data sets. PMID:22318491

  3. An Analysis Framework Addressing the Scale and Legibility of Large Scientific Data Sets

    SciTech Connect

    Childs, H R

    2006-11-20

    Much of the previous work in the large data visualization area has solely focused on handling the scale of the data. This task is clearly a great challenge and necessary, but it is not sufficient. Applying standard visualization techniques to large scale data sets often creates complicated pictures where meaningful trends are lost. A second challenge, then, is to also provide algorithms that simplify what an analyst must understand, using either visual or quantitative means. This challenge can be summarized as improving the legibility or reducing the complexity of massive data sets. Fully meeting both of these challenges is the work of many, many PhD dissertations. In this dissertation, we describe some new techniques to address both the scale and legibility challenges, in hope of contributing to the larger solution. In addition to our assumption of simultaneously addressing both scale and legibility, we add an additional requirement that the solutions considered fit well within an interoperable framework for diverse algorithms, because a large suite of algorithms is often necessary to fully understand complex data sets. For scale, we present a general architecture for handling large data, as well as details of a contract-based system for integrating advanced optimizations into a data flow network design. We also describe techniques for volume rendering and performing comparisons at the extreme scale. For legibility, we present several techniques. Most noteworthy are equivalence class functions, a technique to drive visualizations using statistical methods, and line-scan based techniques for characterizing shape.

  4. Validating a large geophysical data set: Experiences with satellite-derived cloud parameters

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie

    1992-01-01

    We are validating the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and are using the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The data handling needed for this work is possible only with the help of a suite of interactive graphical and numerical analysis tools. Level 3 (gridded) data is the common form in which large data sets of this type are distributed for scientific analysis. We find that Level 3 data is inadequate for the data comparisons required for validation. Level 2 data (individual measurements in geophysical units) is needed. A sampling problem arises when individual measurements, which are not uniformly distributed in space or time, are used for the comparisons. Standard 'interpolation' methods involve fitting the measurements for each data set to surfaces, which are then compared. We are experimenting with formal criteria for selecting geographical regions, based upon the spatial frequency and variability of measurements, that allow us to quantify the uncertainty due to sampling. As part of this project, we are also dealing with ways to keep track of constraints placed on the output by assumptions made in the computer code. 
The need to work with Level 2 data introduces a number of other data handling issues, such as accessing data files across machine types, meeting large data storage requirements, accessing other validated data sets, processing speed and throughput for interactive graphical work, and problems relating to graphical interfaces.

  5. A Scalable Approach for Protein False Discovery Rate Estimation in Large Proteomic Data Sets.

    PubMed

    Savitski, Mikhail M; Wilhelm, Mathias; Hahne, Hannes; Kuster, Bernhard; Bantscheff, Marcus

    2015-09-01

    Calculating the number of confidently identified proteins and estimating false discovery rate (FDR) is a challenge when analyzing very large proteomic data sets such as entire human proteomes. Biological and technical heterogeneity in proteomic experiments further add to the challenge and there are strong differences in opinion regarding the conceptual validity of a protein FDR and no consensus regarding the methodology for protein FDR determination. There are also limitations inherent to the widely used classic target-decoy strategy that particularly show when analyzing very large data sets and that lead to a strong over-representation of decoy identifications. In this study, we investigated the merits of the classic, as well as a novel target-decoy-based protein FDR estimation approach, taking advantage of a heterogeneous data collection comprised of ∼19,000 LC-MS/MS runs deposited in ProteomicsDB (https://www.proteomicsdb.org). The "picked" protein FDR approach treats target and decoy sequences of the same protein as a pair rather than as individual entities and chooses either the target or the decoy sequence depending on which receives the highest score. We investigated the performance of this approach in combination with q-value based peptide scoring to normalize sample-, instrument-, and search engine-specific differences. The "picked" target-decoy strategy performed best when protein scoring was based on the best peptide q-value for each protein yielding a stable number of true positive protein identifications over a wide range of q-value thresholds. We show that this simple and unbiased strategy eliminates a conceptual issue in the commonly used "classic" protein FDR approach that causes overprediction of false-positive protein identification in large data sets. 
The approach scales from small to very large data sets without losing performance, consistently increases the number of true-positive protein identifications and is readily implemented in proteomics analysis software. PMID:25987413
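
    The "picked" pairing can be sketched in a few lines: the target and decoy sequence of each protein compete, only the higher-scoring member survives, and FDR is then estimated from the surviving decoy and target counts. Accessions and scores below are hypothetical, and a single score per sequence stands in for the paper's q-value-based peptide scoring.

```python
# Minimal sketch of "picked" target-decoy protein FDR estimation.
# scored: (accession, is_decoy, score) triples, one target and one
# decoy entry per accession. Accessions and scores are hypothetical.
def picked_fdr(scored):
    # Step 1: target and decoy of the same protein compete; keep the winner.
    best = {}
    for acc, is_decoy, score in scored:
        if acc not in best or score > best[acc][1]:
            best[acc] = (is_decoy, score)
    # Step 2: walk down the ranked survivors, estimating FDR = decoys/targets.
    ranked = sorted(best.values(), key=lambda t: -t[1])
    results, decoys, targets = [], 0, 0
    for is_decoy, score in ranked:
        decoys += is_decoy
        targets += not is_decoy
        results.append((score, decoys / max(targets, 1)))
    return results

scored = [("P1", False, 0.9), ("P1", True, 0.3),
          ("P2", True, 0.8), ("P2", False, 0.5),
          ("P3", False, 0.7), ("P3", True, 0.1)]
print(picked_fdr(scored))
```

    Because each decoy can only survive by beating its own target, decoys no longer accumulate unchecked as the data set grows, which is the over-representation problem the classic strategy suffers from.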

  6. Moving Large Data Sets Over High-Performance Long Distance Networks

    SciTech Connect

    Hodson, Stephen W; Poole, Stephen W; Ruwart, Thomas; Settlemyer, Bradley W

    2011-04-01

    In this project we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing large data sets to a destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes. We describe the device information required to achieve high levels of I/O performance and discuss how this data is applicable in use cases beyond data movement performance.

  7. Implementing Child-focused Activity Meter Utilization into the Elementary School Classroom Setting Using a Collaborative Community-based Approach

    PubMed Central

    Lynch, BA; Jones, A; Biggs, BK; Kaufman, T; Cristiani, V; Kumar, S; Quigg, S; Maxson, J; Swenson, L; Jacobson, N

    2016-01-01

    Introduction The prevalence of pediatric obesity has increased over the past 3 decades and is a pressing public health problem. New technology advancements that can encourage more physical activity in children are needed. The Zamzee program is an activity meter linked to a motivational website designed for children 8–14 years of age. The objective of the study was to use a collaborative approach between a medical center, the private sector and local school staff to assess the feasibility of using the Zamzee Program in the school-based setting to improve physical activity levels in children. Methods This was a pilot 8-week observational study offered to all children in one fifth grade classroom. Body mass index (BMI), the amount of physical activity by 3-day recall survey, and satisfaction with usability of the Zamzee Program were measured pre- and post-study. Results Out of 11 children who enrolled in the study, 7 completed all study activities. In those who completed the study, the median (interquartile range) total activity time by survey increased by 17 (1042) minutes and the BMI percentile change was 0 (8). Both children and their caregivers found the Zamzee Activity Meter (6/7) and website (6/7) "very easy" or "easy" to use. Conclusion The Zamzee Program was found to be usable but did not significantly improve physical activity levels or BMI. Collaborative obesity intervention projects involving medical centers, the private sector and local schools are feasible but the effectiveness needs to be evaluated in larger-scale studies. PMID:27042382

  8. Non-rigid Registration for Large Sets of Microscopic Images on Graphics Processors

    PubMed Central

    Ruiz, Antonio; Ujaldon, Manuel; Cooper, Lee

    2014-01-01

    Microscopic imaging is an important tool for characterizing tissue morphology and pathology. 3D reconstruction and visualization of large sample tissue structure requires registration of large sets of high-resolution images. However, the scale of this problem presents a challenge for automatic registration methods. In this paper we present a novel method for efficient automatic registration using graphics processing units (GPUs) and parallel programming. Comparing a C++ CPU implementation with Compute Unified Device Architecture (CUDA) libraries and pthreads running on GPU we achieve a speed-up factor of up to 4.11× with a single GPU and 6.68× with a GPU pair. We present execution times for a benchmark composed of two sets of large-scale images: mouse placenta (16K × 16K pixels) and breast cancer tumors (23K × 62K pixels). It takes more than 12 hours for the C++ CPU implementation to register a typical sample composed of 500 consecutive slides, which was reduced to less than 2 hours using two GPUs, in addition to a very promising scalability for extending those gains easily on a large number of GPUs in a distributed system. PMID:25328635

  9. COLLABORATIVE RESEARCH: Parallel Analysis Tools and New Visualization Techniques for Ultra-Large Climate Data Set

    SciTech Connect

    middleton, Don; Haley, Mary

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  10. Simultaneous identification of long similar substrings in large sets of sequences

    PubMed Central

    Kleffe, Jürgen; Möller, Friedrich; Wittig, Burghardt

    2007-01-01

    Background Sequence comparison faces new challenges today, with many complete genomes and large libraries of transcripts known. Gene annotation pipelines match these sequences in order to identify genes and their alternative splice forms. However, the software currently available cannot simultaneously compare sets of sequences as large as necessary especially if errors must be considered. Results We therefore present a new algorithm for the identification of almost perfectly matching substrings in very large sets of sequences. Its implementation, called ClustDB, is considerably faster and can handle 16 times more data than VMATCH, the most memory efficient exact program known today. ClustDB simultaneously generates large sets of exactly matching substrings of a given minimum length as seeds for a novel method of match extension with errors. It generates alignments of maximum length with a considered maximum number of errors within each overlapping window of a given size. Such alignments are not optimal in the usual sense but faster to calculate and often more appropriate than traditional alignments for genomic sequence comparisons, EST and full-length cDNA matching, and genomic sequence assembly. The method is used to check the overlaps and to reveal possible assembly errors for 1377 Medicago truncatula BAC-size sequences published at . Conclusion The program ClustDB proves that window alignment is an efficient way to find long sequence sections of homogenous alignment quality, as expected in case of random errors, and to detect systematic errors resulting from sequence contaminations. Such inserts are systematically overlooked in long alignments controlled by only tuning penalties for mismatches and gaps. ClustDB is freely available for academic use. PMID:17570866
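
    The seed generation step, finding exactly matching substrings of a minimum length shared between sequences, can be sketched with a k-mer index; the window alignment with errors that extends these seeds is omitted, and the sequences are toy examples.

```python
from collections import defaultdict

# Index every k-mer by (sequence id, position), then keep only k-mers
# that occur in more than one sequence: these exact matches are the
# seeds from which extension with errors would start.
def shared_seeds(sequences, k):
    index = defaultdict(list)
    for seq_id, seq in enumerate(sequences):
        for pos in range(len(seq) - k + 1):
            index[seq[pos:pos + k]].append((seq_id, pos))
    return {kmer: hits for kmer, hits in index.items()
            if len({sid for sid, _ in hits}) > 1}

seeds = shared_seeds(["ACGTACGT", "TTACGTAA"], 5)
print(sorted(seeds))  # the 5-mers common to both toy sequences
```

    The dictionary approach is memory-hungry at genome scale, which is exactly why specialized data structures such as those in ClustDB and VMATCH are needed.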

  11. Breeding and Genetics Symposium: really big data: processing and analysis of very large data sets.

    PubMed

    Cole, J B; Newman, S; Foertter, F; Aguilar, I; Coffey, M

    2012-03-01

    Modern animal breeding data sets are large and getting larger, due in part to recent availability of high-density SNP arrays and cheap sequencing technology. High-performance computing methods for efficient data warehousing and analysis are under development. Financial and security considerations are important when using shared clusters. Sound software engineering practices are needed, and it is better to use existing solutions when possible. Storage requirements for genotypes are modest, although full-sequence data will require greater storage capacity. Storage requirements for intermediate and results files for genetic evaluations are much greater, particularly when multiple runs must be stored for research and validation studies. The greatest gains in accuracy from genomic selection have been realized for traits of low heritability, and there is increasing interest in new health and management traits. The collection of sufficient phenotypes to produce accurate evaluations may take many years, and high-reliability proofs for older bulls are needed to estimate marker effects. Data mining algorithms applied to large data sets may help identify unexpected relationships in the data, and improved visualization tools will provide insights. Genomic selection using large data requires a lot of computing power, particularly when large fractions of the population are genotyped. Theoretical improvements have made possible the inversion of large numerator relationship matrices, permitted the solving of large systems of equations, and produced fast algorithms for variance component estimation. Recent work shows that single-step approaches combining BLUP with a genomic relationship (G) matrix have similar computational requirements to traditional BLUP, and the limiting factor is the construction and inversion of G for many genotypes. 
    A naïve algorithm for creating G for 14,000 individuals required almost 24 h to run, but custom libraries and parallel computing reduced that to 15 min. Large data sets also create challenges for the delivery of genetic evaluations that must be overcome in a way that does not disrupt the transition from conventional to genomic evaluations. Processing time is important, especially as real-time systems for on-farm decisions are developed. The ultimate value of these systems is to decrease time-to-results in research, increase accuracy in genomic evaluations, and accelerate rates of genetic improvement. PMID:22100598
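
    As a concrete illustration of the G-matrix construction discussed above, the sketch below builds a genomic relationship matrix from 0/1/2-coded SNP genotypes. The formula (VanRaden's first method) is an assumption, since the abstract does not specify one, and the triple loop mirrors the naive algorithm that custom libraries replace.

```python
# Naive construction of a genomic relationship matrix G from SNP
# genotypes coded 0/1/2. Assumed formula (VanRaden method 1):
# G = ZZ' / (2 * sum p(1-p)), with Z the frequency-centered genotypes.
def genomic_relationship(genotypes):
    n, m = len(genotypes), len(genotypes[0])
    # Allele frequencies per marker, estimated from the genotypes themselves.
    freqs = [sum(row[j] for row in genotypes) / (2.0 * n) for j in range(m)]
    # Center each genotype by twice the allele frequency.
    Z = [[row[j] - 2.0 * freqs[j] for j in range(m)] for row in genotypes]
    scale = 2.0 * sum(p * (1.0 - p) for p in freqs)
    return [[sum(Z[i][k] * Z[j][k] for k in range(m)) / scale
             for j in range(n)] for i in range(n)]

G = genomic_relationship([[0, 1, 2], [2, 1, 0]])
```

    The O(n²m) loop nest is where optimized linear-algebra libraries and parallel computing earn the speed-ups the abstract reports.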

  12. Coffee Shops, Classrooms and Conversations: public engagement and outreach in a large interdisciplinary research Hub

    NASA Astrophysics Data System (ADS)

    Holden, Jennifer A.

    2014-05-01

    Public engagement and outreach activities increasingly rely on specialist staff for co-ordination, training and support for researchers, and they are becoming an expected part of large research investments. Here, the experience of public engagement and outreach at a large, interdisciplinary Research Hub is described. dot.rural, based at the University of Aberdeen UK, is an £11.8 million Research Councils UK Rural Digital Economy Hub, funded as part of the RCUK Digital Economy Theme (2009-2015). Digital Economy research aims to realise the transformational impact of digital technologies on aspects of the environment, community life, cultural experiences, future society, and the economy. The dot.rural Hub involves 92 researchers from 12 different disciplines, including Geography, Hydrology and Ecology. Public Engagement and Outreach is embedded in the dot.rural Digital Economy Hub via an Outreach Officer. Alongside this position, public engagement and outreach activities are a compulsory part of PhD student contracts. Public Engagement and Outreach activities at the dot.rural Hub involve individuals and groups in both formal and informal settings organised by dot.rural and other organisations. Activities in the realms of Education, Public Engagement, and Traditional and Social Media are determined by a set of Underlying Principles designed for the Hub by the Outreach Officer. These principles match funding agency requirements and expectations alongside researcher demands and the user-led nature of Digital Economy research. All activities include researchers alongside the Outreach Officer, are research-informed, and are embedded into the specific projects that form the Hub. Successful public engagement activities have included participation in Café Scientifique series, workshops in primary and secondary schools, and online activities such as I'm a Scientist Get Me Out of Here. Topics have ranged from making hydrographs understandable to 8 year olds and members of the public, to blogging birds, to engaging with remote, rural communities through Spiegeltents. This presentation will share successful public engagement and outreach events alongside some less successful experiences and lessons learnt along the way.

  13. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1994-01-01

    Envision is an interactive environment that provides researchers in the earth sciences convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy to use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ and ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.

  14. A Method for the Detection of Planetary Transits in Large Time Series Data Sets

    NASA Astrophysics Data System (ADS)

    Weldrake, David T. F.; Sackett, Penny D.

    2005-02-01

    We present a fast, efficient, and easy-to-apply computational method for the detection of planetary transits in large photometric data sets. The code was specifically produced to analyze an ensemble of 21,950 stars in the globular cluster 47 Tuc for the photometric signature indicative of a transiting hot Jupiter planet, the results of which are the subject of a separate paper. Using cross-correlation techniques and Monte Carlo-tested detection criteria, each photometric time series is compared with a database of transit models of appropriate depth and duration. The algorithm recovers transit signatures with high efficiency while maintaining a low false-detection probability, even in rather noisy data. The code has been optimized, and with a 3 GHz machine is capable of analyzing and producing candidate transits for 10,000 stars in 24 hr. We illustrate our algorithm by describing its application to our large 47 Tuc data set, for which the algorithm produced a weighted-mean transit recoverability spanning 85%-25% for orbital periods of 1-16 days (half the temporal span of the data set), despite gaps in the time series caused by weather and the observing duty cycle. The code is easily adaptable and is currently designed to accept time series data produced using difference imaging analysis.
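
    The cross-correlation step can be illustrated by sliding a zero-mean box template along a light curve; the template shape, padding, and injected transit below are assumptions for the sketch, not the authors' model database.

```python
# Slide a zero-mean box-shaped transit template along a light curve and
# return the start index of the best-matching dip. Because the template
# has zero mean, the flat baseline contributes nothing to the score.
def best_transit_start(flux, duration, pad, depth):
    length = duration + 2 * pad
    model = [0.0] * pad + [-depth] * duration + [0.0] * pad
    mean = sum(model) / length
    zm = [m - mean for m in model]  # zero-mean template
    scores = [sum(f * z for f, z in zip(flux[s:s + length], zm))
              for s in range(len(flux) - length + 1)]
    best = max(range(len(scores)), key=scores.__getitem__)
    return best + pad  # start of the in-transit section

flux = [1.0] * 20
for i in range(8, 11):        # inject a 1% transit at indices 8-10
    flux[i] = 0.99
print(best_transit_start(flux, 3, 2, 0.01))  # recovers the injected start, 8
```

    A real search repeats this correlation over a grid of depths, durations, and trial periods, then applies Monte Carlo-calibrated thresholds to the scores.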

  15. Main large data set features detection by a linear predictor model

    NASA Astrophysics Data System (ADS)

    Gutierrez, Carlos Enrique; Alsharif, Mohamad Reza, Prof.; Khosravy, Mahdi; Yamashita, Katsumi, Prof.; Miyagi, Hayao, Prof.; Villa, Rafael

    2014-10-01

The aim of the present paper is to explore and obtain a simple method capable of detecting the most important variables (features) in a large set of variables. To verify the performance of the approach described in the following sections, we used a set of news articles. Text sources are considered high-dimensional data, where each word is treated as a single variable. In our work, a linear predictor model has been used to uncover the most influential variables, strongly reducing the dimensionality of the data set. Input data are classified into two categories and arranged as a collection of plain-text documents, pre-processed and transformed into a numerical matrix containing around 10,000 different variables. We adjust the linear model's parameters based on its prediction results; the variables with the strongest effect on the output survive, while those with negligible effect are removed. In order to collect a summarized set of features automatically, we sacrifice some detail and accuracy in the prediction model, although we try to balance the squared error against the size of the subset obtained.
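The core mechanism described here (fit a linear predictor, then keep only the variables whose weights are non-negligible) can be sketched on synthetic data. The matrix sizes, threshold, and informative indices below are invented stand-ins, not the paper's news corpus:

```python
import numpy as np

rng = np.random.default_rng(42)

# toy stand-in for a document-term matrix: 200 samples, 50 "word" features,
# only 3 of which actually drive the output
n, p = 200, 50
X = rng.normal(size=(n, p))
true_w = np.zeros(p)
true_w[[3, 17, 29]] = [2.0, -1.5, 1.0]
y = X @ true_w + rng.normal(scale=0.1, size=n)

# fit a linear predictor by least squares
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# keep only the variables whose fitted weight is non-negligible
threshold = 0.5
selected = np.flatnonzero(np.abs(w) > threshold)
print(selected)  # the three informative "words" survive
```

With 10,000 word features, as in the paper, the same thresholding step shrinks the model to a small, interpretable feature subset at the cost of some squared error.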

  16. Web-Queryable Large-Scale Data Sets for Hypothesis Generation in Plant Biology

    PubMed Central

    Brady, Siobhan M.; Provart, Nicholas J.

    2009-01-01

    The approaching end of the 21st century's first decade marks an exciting time for plant biology. Several National Science Foundation Arabidopsis 2010 Projects will conclude, and whether or not the stated goal of the National Science Foundation 2010 Program—to determine the function of 25,000 Arabidopsis genes by 2010—is reached, these projects and others in a similar vein, such as those performed by the AtGenExpress Consortium and various plant genome sequencing initiatives, have generated important and unprecedented large-scale data sets. While providing significant biological insights for the individual laboratories that generated them, these data sets, in conjunction with the appropriate tools, are also permitting plant biologists worldwide to gain new insights into their own biological systems of interest, often at a mouse click through a Web browser. This review provides an overview of several such genomic, epigenomic, transcriptomic, proteomic, and metabolomic data sets and describes Web-based tools for querying them in the context of hypothesis generation for plant biology. We provide five biological examples of how such tools and data sets have been used to provide biological insight. PMID:19401381

  17. Information Visualization, Nonlinear Dimensionality Reduction and Sampling for Large and Complex Data Sets

    NASA Astrophysics Data System (ADS)

    Pesenson, Meyer Z.; Pesenson, I. Z.; McCollum, B.

    2010-01-01

    Recent and forthcoming increases in the amount and complexity of astronomy data are creating data sets that are not amenable to the methods of analysis with which astronomers are familiar. Traditional methods are often inadequate not merely because the data sets are too large and too complex to fully be analyzed "manually", but because many conventional algorithms and techniques cannot be scaled up enough to work effectively on the new data sets. It is essential to develop new methods for organization, scientific visualization (as opposed to illustrative visualization) and analysis of heterogeneous, multiresolution data across application domains. Scientific utilization of highly complex and massive data sets poses significant challenges, and calls for some mathematical approaches more advanced than are now generally available. In this paper, we both give an overview of several innovative developments that address these challenges, and describe a few specific examples of algorithms we have developed, as well as the ones we are developing in the course of this ongoing work. These approaches will enhance scientific visualization and data analysis capabilities, thus facilitating astronomical research and enabling discoveries. This work was carried out with partial funding from the National Geospatial-Intelligence Agency University Research Initiative (NURI), grant HM1582-08-1-0019.

  18. Probabilistic Analysis of a Large-Scale Urban Traffic Sensor Data Set

    NASA Astrophysics Data System (ADS)

    Hutchins, Jon; Ihler, Alexander; Smyth, Padhraic

    Real-world sensor time series are often significantly noisier and more difficult to work with than the relatively clean data sets that tend to be used as the basis for experiments in many research papers. In this paper we report on a large case-study involving statistical data mining of over 100 million measurements from 1700 freeway traffic sensors over a period of seven months in Southern California. We discuss the challenges posed by the wide variety of different sensor failures and anomalies present in the data. The volume and complexity of the data precludes the use of manual visualization or simple thresholding techniques to identify these anomalies. We describe the application of probabilistic modeling and unsupervised learning techniques to this data set and illustrate how these approaches can successfully detect underlying systematic patterns even in the presence of substantial noise and missing data.
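As a hedged illustration of the kind of periodic-profile modeling this abstract describes (the paper itself uses richer probabilistic models and unsupervised learning, not this simple z-score rule), one can fit a mean/spread profile per hour of day and flag observations that deviate strongly from it. All numbers below are synthetic:

```python
import numpy as np

def flag_anomalies(counts, period, z=4.0):
    """Fit a mean/std profile per time-of-period bin (e.g. hour of day), then
    flag observations deviating from their bin's profile by more than z sigma."""
    bins = np.arange(len(counts)) % period
    means = np.array([counts[bins == b].mean() for b in range(period)])
    stds = np.array([counts[bins == b].std() + 1e-9 for b in range(period)])
    zscores = (counts - means[bins]) / stds[bins]
    return np.flatnonzero(np.abs(zscores) > z)

# synthetic hourly vehicle counts: 30 days of a sinusoidal daily cycle + Poisson noise
rng = np.random.default_rng(1)
period, days = 24, 30
profile = 200 + 100 * np.sin(np.linspace(0, 2 * np.pi, period, endpoint=False))
counts = rng.poisson(np.tile(profile, days)).astype(float)
counts[500:505] = 0.0  # simulated sensor failure: five hours reporting zero flow

print(flag_anomalies(counts, period))  # the five failed hours stand out
```

Simple thresholding like this breaks down exactly where the paper says it does: with 1700 sensors and many failure modes, the profiles themselves must be learned robustly in the presence of the anomalies being detected.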

  19. Phase Unwrapping for Large InSAR Data Sets Through Statistical-Cost Tiling

    NASA Astrophysics Data System (ADS)

    Chen, C. W.; Zebker, H. A.

    2001-12-01

    Two-dimensional phase unwrapping is a key step in the analysis of InSAR data, and many algorithms for this task have been proposed in recent years. Some of these algorithms have shown promise in handling the problem's intrinsic difficulties, but new difficulties arise when the dimensions of the interferometric input data exceed the limits imposed by computer memory constraints. Similarly, new phase unwrapping strategies may be required when sheer data volumes necessitate greater computational throughput. These issues are especially important in the contexts of large-scale topographic mapping projects such as SRTM and the Alaska DEM Project. We propose a technique for applying the statistical-cost, network-flow phase unwrapping algorithm (SNAPHU) of Chen and Zebker (2001) to large data sets. That is, we introduce a methodology whereby a large interferogram is unwrapped as a set of several smaller tiles. The tiles are unwrapped individually and then further divided into independent, irregularly shaped reliable regions. The phase offsets of these reliable regions are then computed in a secondary optimization problem that seeks to maximize the probability of the full unwrapped solution, using the same statistical models as employed in the primary phase unwrapping stage. The technique therefore approximates a maximum a posteriori probability (MAP) unwrapped solution over the full-sized interferogram. The secondary optimization problem is solved through the use of a nonlinear network-flow solver. We examine the performance of this technique on a real interferometric data set, and we find that the technique is less prone to unwrapping artifacts than more simple tiling approaches.
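SNAPHU itself solves a statistical-cost network-flow problem in two dimensions, and the tiling scheme above stitches its regional solutions together; the basic wrap/unwrap relation underlying all of this can, however, be shown in one dimension with numpy's stock unwrapper:

```python
import numpy as np

# a smooth "true" phase ramp, as might arise from topography in an interferogram
true_phase = np.linspace(0, 12 * np.pi, 400)

# the interferometer only measures phase modulo 2*pi, wrapped into (-pi, pi]
wrapped = np.angle(np.exp(1j * true_phase))

# 1-D unwrapping: add +/-2*pi whenever neighbouring samples jump by more than pi
unwrapped = np.unwrap(wrapped)

print(np.allclose(unwrapped, true_phase))  # → True
```

The 2-D problem is hard precisely because this local rule becomes ambiguous around noise and discontinuities, which is what the statistical-cost formulation addresses.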

  20. Experimental set-up for three PHOEBUS type large-area heliostats at the PSA

    SciTech Connect

    Haeger, M.; Schiel, W.; Romero, M.; Schmitz-Goeb, M.

    1995-11-01

Three large-area heliostat prototypes are being erected at the Plataforma Solar de Almeria by Spanish and German industry. The objective is to demonstrate their technical and economic suitability for a PHOEBUS power tower plant. The two different heliostat designs, comprising two 100 m{sup 2} glass/metal faceted heliostats and one 150 m{sup 2} stressed-membrane heliostat, are tested at a representative distance of 485 m to the PSA's CESA tower. The paper introduces the heliostat designs and the test set-up, including location, targets, flux measurement, data acquisition, and control.

  1. Envision: An interactive system for the management and visualization of large geophysical data sets

    NASA Technical Reports Server (NTRS)

    Searight, K. R.; Wojtowicz, D. P.; Walsh, J. E.; Pathi, S.; Bowman, K. P.; Wilhelmson, R. B.

    1995-01-01

    Envision is a software project at the University of Illinois and Texas A&M, funded by NASA's Applied Information Systems Research Project. It provides researchers in the geophysical sciences convenient ways to manage, browse, and visualize large observed or model data sets. Envision integrates data management, analysis, and visualization of geophysical data in an interactive environment. It employs commonly used standards in data formats, operating systems, networking, and graphics. It also attempts, wherever possible, to integrate with existing scientific visualization and analysis software. Envision has an easy-to-use graphical interface, distributed process components, and an extensible design. It is a public domain package, freely available to the scientific community.

  2. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    SciTech Connect

Patchett, John M; Ahrens, James P; Lo, Li-Ta; Brownlee, Carson S; Mitchell, Christopher J; Hansen, Chuck

    2010-10-15

Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization, and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.

  3. Litho-kinematic facies model for large landslide deposits in arid settings

    SciTech Connect

    Yarnold, J.C.; Lombard, J.P.

    1989-04-01

Reconnaissance field studies of six large landslide deposits in the southern Basin and Range suggest that a set of characteristic features is common to the deposits of large landslides in an arid setting. These include a coarse boulder cap, an upper massive zone, a lower disrupted zone, and a mixed zone overlying disturbed substrate. The upper massive zone is dominated by crackle breccia. This grades downward into a lower disrupted zone composed of a more matrix-rich breccia that is internally sheared, intruded by clastic dikes, and often contains a cataclasite layer at its base. An underlying discontinuous mixed zone is composed of material from the overlying breccia mixed with material entrained from the underlying substrate. Bedding in the substrate sometimes displays folding and contortion that die out downward. The authors' work suggests a spatial zonation of these characteristic features within many landslide deposits. In general, clastic dikes, the basal cataclasite, and folding in the substrate are observed mainly in distal parts of landslides. In most cases, total thickness, thickness of the basal disturbed and mixed zones, and the degree of internal shearing increase distally, whereas maximum clast size commonly decreases distally. Zonation of these features is interpreted to result from kinematics of emplacement that cause generally increased deformation in the distal regions of the landslide.

  4. Large-scale similarity search profiling of ChEMBL compound data sets.

    PubMed

    Heikamp, Kathrin; Bajorath, Jürgen

    2011-08-22

    A large-scale similarity search investigation has been carried out on 266 well-defined compound activity classes extracted from the ChEMBL database. The analysis was performed using two widely applied two-dimensional (2D) fingerprints that mark opposite ends of the current performance spectrum of these types of fingerprints, i.e., MACCS structural keys and the extended connectivity fingerprint with bond diameter four (ECFP4). For each fingerprint, three nearest neighbor search strategies were applied. On the basis of these search calculations, a similarity search profile of the ChEMBL database was generated. Overall, the fingerprint search campaign was surprisingly successful. In 203 of 266 test cases (∼76%), a compound recovery rate of at least 50% was observed with at least the better performing fingerprint and one search strategy. The similarity search profile also revealed several general trends. For example, fingerprint searching was often characterized by an early enrichment of active compounds in database selection sets. In addition, compound activity classes have been categorized according to different similarity search performance levels, which helps to put the results of benchmark calculations into perspective. Therefore, a compendium of activity classes falling into different search performance categories is provided. On the basis of our large-scale investigation, the performance range of state-of-the-art 2D fingerprinting has been delineated for compound data sets directed against a wide spectrum of pharmaceutical targets. PMID:21728295
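The nearest-neighbor fingerprint searching benchmarked in this record can be sketched in miniature. The toy fingerprints, compound names, and the particular 1-NN scoring rule below are illustrative stand-ins; real MACCS and ECFP4 fingerprints are bit vectors computed by cheminformatics toolkits such as RDKit:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two fingerprints given as sets of on-bit indices."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def nearest_neighbor_search(query_fps, database):
    """1-NN strategy: score each database compound by its best similarity
    to any of the known active (query) fingerprints, then rank."""
    return sorted(
        database.items(),
        key=lambda item: max(tanimoto(item[1], q) for q in query_fps),
        reverse=True,
    )

# toy fingerprints (sets of on bits); real MACCS keys have 166 bits
actives = [{1, 5, 9, 12}, {1, 5, 9, 14}]
database = {
    "cmpd_A": {1, 5, 9, 12, 20},  # close analogue of an active
    "cmpd_B": {2, 6, 10},         # unrelated scaffold
    "cmpd_C": {1, 5, 30},         # partial overlap
}

ranked = nearest_neighbor_search(actives, database)
print([name for name, _ in ranked])  # → ['cmpd_A', 'cmpd_C', 'cmpd_B']
```

The recovery rates in the abstract come from running exactly this kind of ranking over whole activity classes and asking how many known actives surface near the top.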

  5. Using an ensemble of statistical metrics to quantify large sets of plant transcription factor binding sites

    PubMed Central

    2013-01-01

Background From initial seed germination through reproduction, plants continuously reprogram their transcriptional repertoire to facilitate growth and development. This dynamic is mediated by a diverse but inextricably linked catalog of regulatory proteins called transcription factors (TFs). Statistically quantifying TF binding site (TFBS) abundance in the promoters of differentially expressed genes can be used to identify binding-site patterns closely related to stress response. Output from today's transcriptomic assays necessitates statistically oriented software to handle large promoter-sequence sets in a computationally tractable fashion. Results We present Marina, an open-source software package for identifying over-represented TFBSs in large sets of promoter sequences, using an ensemble of 7 statistical metrics and binding-site profiles. Through software comparison, we show that Marina can identify considerably more over-represented plant TFBSs than a popular software alternative. Conclusions Marina was used to identify over-represented TFBSs in a two time-point RNA-Seq study exploring the transcriptomic interplay between soybean (Glycine max) and soybean rust (Phakopsora pachyrhizi). Marina identified numerous abundant TFBSs recognized by transcription factors associated with defense response, such as WRKY, HY5 and MYB2. Comparing results from Marina to those of a popular software alternative suggests that regardless of the number of promoter sequences, Marina is able to identify significantly more over-represented TFBSs. PMID:23578135
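Over-representation of a binding site in a promoter set is commonly scored with statistics like the one-sided binomial test sketched below, one plausible member of an ensemble such as Marina's (the source does not enumerate its 7 metrics, and all counts here are invented):

```python
from math import comb

def binomial_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): one-sided over-representation p-value."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# hypothetical counts: a motif occurs in 40 of 100 stress-response promoters,
# but in only 200 of 2000 background promoters (background rate 0.10)
hits, promoters = 40, 100
background_rate = 200 / 2000

p_value = binomial_sf(hits, promoters, background_rate)
print(p_value < 1e-10)  # strongly over-represented → True
```

Combining several such metrics, as Marina does, guards against any single test's assumptions (e.g. promoter-length or composition biases) dominating the ranking.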

  6. A Technique for Moving Large Data Sets over High-Performance Long Distance Networks

    SciTech Connect

    Settlemyer, Bradley W; Dobson, Jonathan D; Hodson, Stephen W; Kuehn, Jeffery A; Poole, Stephen W; Ruwart, Thomas

    2011-01-01

    In this paper we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing the data to a remote destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes.

  7. Science in the Classroom: Finding a Balance between Autonomous Exploration and Teacher-Led Instruction in Preschool Settings

    ERIC Educational Resources Information Center

    Nayfeld, Irena; Brenneman, Kimberly; Gelman, Rochel

    2011-01-01

    Research Findings: This paper reports on children's use of science materials in preschool classrooms during their free choice time. Baseline observations showed that children and teachers rarely spend time in the designated science area. An intervention was designed to "market" the science center by introducing children to 1 science tool, the…

  8. Analogies as Tools for Meaning Making in Elementary Science Education: How Do They Work in Classroom Settings?

    ERIC Educational Resources Information Center

    Guerra-Ramos, Maria Teresa

    2011-01-01

In this paper there is a critical overview of the role of analogies as tools for meaning making in science education, their advantages and disadvantages. Two empirical studies on the use of analogies in primary classrooms are discussed and analysed. In the first study, the "string circuit" analogy was used in the teaching of electric circuits with…

  10. Classroom-Based Interventions and Teachers' Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings

    ERIC Educational Resources Information Center

    Zhai, Fuhua; Raver, C. Cybele; Li-Grining, Christine

    2011-01-01

Preschool teachers' job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers' perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job…

  11. Child and Setting Characteristics Affecting the Adult Talk Directed at Preschoolers with Autism Spectrum Disorder in the Inclusive Classroom

    ERIC Educational Resources Information Center

    Irvin, Dwight W.; Boyd, Brian A.; Odom, Samuel L.

    2015-01-01

    Difficulty with social competence is a core deficit of autism spectrum disorder. Research on typically developing children and children with disabilities, in general, suggests the adult talk received in the classroom is related to their social development. The aims of this study were to examine (1) the types and amounts of adult talk children with…

  14. An Analogous Study of Children's Attitudes Toward School in an Open Classroom Environment as Opposed to a Conventional Setting.

    ERIC Educational Resources Information Center

    Zeli, Doris Conti

    A study sought to determine whether intermediate age children exposed to open classroom teaching strategy have a more positive attitude toward school than intermediate age children exposed to conventional teaching strategy. The hypothesis was that there would be no significant difference in attitude between the two groups. The study was limited to…

  15. Initial Validation of the Prekindergarten Classroom Observation Tool and Goal Setting System for Data-Based Coaching

    ERIC Educational Resources Information Center

    Crawford, April D.; Zucker, Tricia A.; Williams, Jeffrey M.; Bhavsar, Vibhuti; Landry, Susan H.

    2013-01-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based…

  17. Developing consistent Landsat data sets for large area applications: the MRLC 2001 protocol

    USGS Publications Warehouse

    Chander, G.; Huang, C.; Yang, L.; Homer, C.; Larson, C.

    2009-01-01

    One of the major efforts in large area land cover mapping over the last two decades was the completion of two U.S. National Land Cover Data sets (NLCD), developed with nominal 1992 and 2001 Landsat imagery under the auspices of the MultiResolution Land Characteristics (MRLC) Consortium. Following the successful generation of NLCD 1992, a second generation MRLC initiative was launched with two primary goals: (1) to develop a consistent Landsat imagery data set for the U.S. and (2) to develop a second generation National Land Cover Database (NLCD 2001). One of the key enhancements was the formulation of an image preprocessing protocol and implementation of a consistent image processing method. The core data set of the NLCD 2001 database consists of Landsat 7 Enhanced Thematic Mapper Plus (ETM+) images. This letter details the procedures for processing the original ETM+ images and more recent scenes added to the database. NLCD 2001 products include Anderson Level II land cover classes, percent tree canopy, and percent urban imperviousness at 30-m resolution derived from Landsat imagery. The products are freely available for download to the general public from the MRLC Consortium Web site at http://www.mrlc.gov.

  18. Contextual settings, science stories, and large context problems: Toward a more humanistic science education

    NASA Astrophysics Data System (ADS)

    Stinner, Arthur

    This article addresses the need for and the problem of organizing a science curriculum around contextual settings and science stories that serve to involve and motivate students to develop an understanding of the world that is rooted in the scientific and the humanistic traditions. A program of activities placed around contextual settings, science stories, and contemporary issues of interest is recommended in an attempt to move toward a slow and secure abolition of the gulf between scientific knowledge and common sense beliefs. A conceptual development model is described to guide the connection between theory and evidence on a level appropriate for children, from early years to senior years. For the senior years it is also important to connect the activity of teaching to a sound theoretical structure. The theoretical structure must illuminate the status of theory in science, establish what counts as evidence, clarify the relationship between experiment and explanation, and make connections to the history of science. The article concludes with a proposed program of activities in terms of a sequence of theoretical and empirical experiences that involve contextual settings, science stories, large context problems, thematic teaching, and popular science literature teaching.

  19. Hierarchical unbiased graph shrinkage (HUGS): a novel groupwise registration for large data set.

    PubMed

    Ying, Shihui; Wu, Guorong; Wang, Qian; Shen, Dinggang

    2014-01-01

    Normalizing all images in a large data set into a common space is a key step in many clinical and research studies, e.g., for brain development, maturation, and aging. Recently, groupwise registration has been developed for simultaneous alignment of all images without selecting a particular image as template, thus potentially avoiding bias in the registration. However, most conventional groupwise registration methods do not explore the data distribution during the image registration. Thus, their performance could be affected by large inter-subject variations in the data set under registration. To solve this potential issue, we propose to use a graph to model the distribution of all image data sitting on the image manifold, with each node representing an image and each edge representing the geodesic pathway between two nodes (or images). Then, the procedure of warping all images to their population center turns to the dynamic shrinking of the graph nodes along their graph edges until all graph nodes become close to each other. Thus, the topology of image distribution on the image manifold is always preserved during the groupwise registration. More importantly, by modeling the distribution of all images via a graph, we can potentially reduce registration error since every time each image is warped only according to its nearby images with similar structures in the graph. We have evaluated our proposed groupwise registration method on both infant and adult data sets, by also comparing with the conventional group-mean based registration and the ABSORB methods. All experimental results show that our proposed method can achieve better performance in terms of registration accuracy and robustness. PMID:24055505
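A toy analogue of the graph-shrinking idea can be written with 2-D points standing in for images and Euclidean distance standing in for the registration metric. Nothing here is the authors' algorithm (which warps images with geodesic pathways on the image manifold); it only shows the dynamic of nodes contracting along a nearest-neighbour graph:

```python
import numpy as np

def mean_pairwise(pts):
    """Average distance between all pairs of points."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return d.sum() / (len(pts) * (len(pts) - 1))

def shrink_on_graph(points, k=5, step=0.5, iters=50):
    """Each node repeatedly moves toward the mean of its k nearest graph
    neighbours, so the population contracts while near neighbours stay near."""
    pts = points.copy()
    for _ in range(iters):
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)          # a node is not its own neighbour
        nbrs = np.argsort(d, axis=1)[:, :k]  # k nearest neighbours per node
        pts = (1 - step) * pts + step * pts[nbrs].mean(axis=1)
    return pts

rng = np.random.default_rng(7)
cloud = rng.normal(size=(40, 2))
shrunk = shrink_on_graph(cloud)

print(mean_pairwise(shrunk) < mean_pairwise(cloud))  # the set has contracted → True
```

The key property the paper exploits is visible even in this sketch: each node moves only relative to its nearby, similar neighbours, never directly toward a distant template.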

  20. 21-inch common large-area display set for multiple military command and control workstation applications

    NASA Astrophysics Data System (ADS)

    Gorenflo, Ronald L.; Hermann, David J.

    1996-05-01

Battelle is under contract with Warner Robins Air Logistics Center to design a common large-area display set (CLADS) for use in multiple airborne C4I applications that currently use unique 19 inch CRTs. Engineers at Battelle have determined that by taking advantage of the latest flat panel display technology and the commonality between C4I applications, one display head (21 inch diagonal, 1280 by 1024) can be used in multiple applications. In addition, common modules are being designed by Battelle to reduce the number of installation-specific circuit card assemblies required for a particular application. Initial USAF applications include replacements for the E-3 AWACS color monitor assembly, E-8 Joint STARS graphics display unit, and ABCCC airborne color display. Initial U.S. Navy applications include the E-2C ACIS display. For these applications reliability and maintainability are key objectives. The common design reduces the number of unique subassemblies in the USAF inventory by 56 to 66%. In addition to total module reductions, CLADS module/subassembly re-use across nine potential applications is estimated to be 73%. As more platforms implement CLADS, the percentage of module re-use increases. The new design is also expected to have an MTBF of at least 3350 hours, an order of magnitude better than one of the current systems. In the Joint STARS installation, more than 1400 pounds can be eliminated from the aircraft. In the E-3 installation, CLADS is estimated to provide a power reduction of approximately 1750 watts per aircraft. This paper discusses the common large-area display set design and its use in a variety of C4I applications that require a large-area, high-resolution, full-color display.

  1. Suffix tree searcher: exploration of common substrings in large DNA sequence sets

    PubMed Central

    2014-01-01

    Background Large DNA sequence data sets require special bioinformatics tools to search and compare them. Such tools should be easy to use so that the data can be easily accessed by a wide array of researchers. In the past, the use of suffix trees for searching DNA sequences has been limited by a practical need to keep the trees in RAM. Newer algorithms solve this problem by using disk-based approaches. However, none of the fastest suffix tree algorithms have been implemented with a graphical user interface, preventing their incorporation into a feasible laboratory workflow. Results Suffix Tree Searcher (STS) is designed as an easy-to-use tool to index, search, and analyze very large DNA sequence datasets. The program accommodates very large numbers of very large sequences, with aggregate size reaching tens of billions of nucleotides. The program makes use of pre-sorted persistent "building blocks" to reduce the time required to construct new trees. STS is comprised of a graphical user interface written in Java, and four C modules. All components are automatically downloaded when a web link is clicked. The underlying suffix tree data structure permits extremely fast searching for specific nucleotide strings, with wild cards or mismatches allowed. Complete tree traversals for detecting common substrings are also very fast. The graphical user interface allows the user to transition seamlessly between building, traversing, and searching the dataset. Conclusions Thus, STS provides a new resource for the detection of substrings common to multiple DNA sequences or within a single sequence, for truly huge data sets. The re-searching of sequence hits, allowing wild card positions or mismatched nucleotides, together with the ability to rapidly retrieve large numbers of sequence hits from the DNA sequence files, provides the user with an efficient method of evaluating the similarity between nucleotide sequences by multiple alignment or use of Logos. 
The ability to re-use existing suffix tree pieces considerably shortens index generation time. The graphical user interface enables quick mastery of the analysis functions, easy access to the generated data, and seamless workflow integration. PMID:25053142
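The kind of exact-substring lookup STS performs can be illustrated with a much simpler in-memory structure: a suffix array over the sequence, binary-searched for a query string. This toy sketch is not STS code and ignores the disk-based trees, wild cards, and mismatch tolerance the actual program supports.

```python
import bisect

def build_suffix_array(text):
    """Sorted start positions of all suffixes (O(n^2 log n) toy version)."""
    return sorted(range(len(text)), key=lambda i: text[i:])

def find_occurrences(text, sa, pattern):
    """All start positions of pattern, via binary search on sorted suffixes."""
    suffixes = [text[i:] for i in sa]        # materialized only for this demo
    lo = bisect.bisect_left(suffixes, pattern)
    hi = bisect.bisect_right(suffixes, pattern + "\uffff")  # sentinel > A,C,G,T
    return sorted(sa[lo:hi])

genome = "ACGTACGTGACG"
sa = build_suffix_array(genome)
hits = find_occurrences(genome, sa, "ACG")   # -> [0, 4, 9]
```

A production index avoids materializing the suffixes and builds the array in linear time; the point here is only that a sorted suffix structure turns substring search into binary search.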

  2. Twelve- to 14-Month-Old Infants Can Predict Single-Event Probability with Large Set Sizes

    ERIC Educational Resources Information Center

    Denison, Stephanie; Xu, Fei

    2010-01-01

    Previous research has revealed that infants can reason correctly about single-event probabilities with small but not large set sizes (Bonatti, 2008; Teglas et al., 2007). The current study asks whether infants can make predictions regarding single-event probability with large set sizes using a novel procedure. Infants completed two trials: A…

  3. Generating extreme weather event sets from very large ensembles of regional climate models

    NASA Astrophysics Data System (ADS)

    Massey, Neil; Guillod, Benoit; Otto, Friederike; Allen, Myles; Jones, Richard; Hall, Jim

    2015-04-01

    Extreme events can have large impacts on societies and are therefore being increasingly studied. In particular, climate change is expected to impact the frequency and intensity of these events. However, a major limitation when investigating extreme weather events is that, by definition, only a few events are present in observations. A way to overcome this issue is to use large ensembles of model simulations. Using the volunteer distributed computing (VDC) infrastructure of weather@home [1], we run a very large number (10,000s) of RCM simulations over the European domain at a resolution of 25 km, with an improved land-surface scheme, nested within a free-running GCM. Using observations for the GCM boundary forcings, we can run historical "hindcast" simulations over the past 100 to 150 years. This allows us, due to the chaotic variability of the atmosphere, to ascertain how likely an extreme event was, given the boundary forcings, and to derive synthetic event sets. The events in these sets did not actually occur in the observed record but could have occurred given the boundary forcings, with an associated probability. The event sets contain time series of fields of meteorological variables that allow impact modellers to assess the loss each event would incur. Projections of events into the future are achieved by modelling projections of the sea-surface temperature (SST) and sea-ice boundary forcings, combining the variability of the SST in the observed record with a range of warming signals derived from the varying responses of SSTs in the CMIP5 ensemble to elevated greenhouse gas (GHG) emissions in three RCP scenarios. 
Simulating the future with a range of SST responses, as well as a range of RCP scenarios, allows us to assess the uncertainty in the response to elevated GHG emissions that occurs in the CMIP5 ensemble. Numerous extreme weather events can be studied. Firstly, we analyse droughts in Europe with a focus on the UK in the context of the project MaRIUS (Managing the Risks, Impacts and Uncertainties of droughts and water Scarcity). We analyse the characteristics of the simulated droughts, the underlying physical mechanisms, and assess droughts observed in the recent past. Secondly, we analyse windstorms by applying an objective storm-identification and tracking algorithm to the ensemble output, isolating those storms that cause high loss and building a probabilistic storm catalogue, which can be used by impact modellers, insurance loss modellers, etc. Finally, we combine the model output with a heat-stress index to determine the detrimental effect on health of heat waves in Europe. [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.

  4. Innovation from within the Box: Evaluation of Online Problem Sets in a Series of Large Lecture Undergraduate Science Courses.

    ERIC Educational Resources Information Center

    Schaeffer, Evonne; Bhargava, Tina; Nash, John; Kerns, Charles; Stocker, Scott

    A technology-mediated solution to enhance the learning experience for students in a large lecture setting was evaluated. Online problem sets were developed to engage students in the content of a human biology course and implemented in the classes of eight faculty coordinators. The weekly problem sets contained several multiple choice problems,…

  5. Calculations of safe collimator settings and β* at the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Bruce, R.; Assmann, R. W.; Redaelli, S.

    2015-06-01

    The first run of the Large Hadron Collider (LHC) at CERN was very successful and resulted in important physics discoveries. One way of increasing the luminosity in a collider, which gave a very significant contribution to the LHC performance in the first run and can be used even if the beam intensity cannot be increased, is to decrease the transverse beam size at the interaction points by reducing the optical function β*. However, when doing so, the beam becomes larger in the final focusing system, which could expose its aperture to beam losses. For the LHC, which is designed to store beams with a total energy of 362 MJ, this is critical, since the loss of even a small fraction of the beam could cause a magnet quench or even damage. Therefore, the machine aperture has to be protected by the collimation system. The settings of the collimators constrain the maximum beam size that can be tolerated and therefore impose a lower limit on β*. In this paper, we present calculations to determine safe collimator settings and the resulting limit on β*, based on available aperture and operational stability of the machine. Our model was used to determine the LHC configurations in 2011 and 2012 and it was found that β* could be decreased significantly compared to the conservative model used in 2010. The gain in luminosity resulting from the decreased margins between collimators was more than a factor 2, and a further contribution from the use of realistic aperture estimates based on measurements was almost as large. This has played an essential role in the rapid and successful accumulation of experimental data in the LHC.

  6. Ghost transmission: How large basis sets can make electron transport calculations worse

    SciTech Connect

    Herrmann, Carmen; Solomon, Gemma C.; Subotnik, Joseph E.; Mujica, Vladimiro; Ratner, Mark A.

    2010-01-01

    The Landauer approach has proven to be an invaluable tool for calculating the electron transport properties of single molecules, especially when combined with a nonequilibrium Green’s function approach and Kohn–Sham density functional theory. However, when using large nonorthogonal atom-centered basis sets, such as those common in quantum chemistry, one can find erroneous results if the Landauer approach is applied blindly. In fact, basis sets of triple-zeta quality or higher sometimes result in an artificially high transmission and possibly even qualitatively wrong conclusions regarding chemical trends. In these cases, transport persists when molecular atoms are replaced by basis functions alone (“ghost atoms”). The occurrence of such ghost transmission is correlated with low-energy virtual molecular orbitals of the central subsystem and may be interpreted as a biased and thus inaccurate description of vacuum transmission. An approximate practical correction scheme is to calculate the ghost transmission and subtract it from the full transmission. As a further consequence of this study, it is recommended that sensitive molecules be used for parameter studies, in particular those whose transmission functions show antiresonance features such as benzene-based systems connected to the electrodes in meta positions and other low-conducting systems such as alkanes and silanes.

  7. fastSTRUCTURE: variational inference of population structure in large SNP data sets.

    PubMed

    Raj, Anil; Stephens, Matthew; Pritchard, Jonathan K

    2014-06-01

    Tools for estimating population structure from genetic data are now used in a wide variety of applications in population genetics. However, inferring population structure in large modern data sets imposes severe computational challenges. Here, we develop efficient algorithms for approximate inference of the model underlying the STRUCTURE program using a variational Bayesian framework. Variational methods pose the problem of computing relevant posterior distributions as an optimization problem, allowing us to build on recent advances in optimization theory to develop fast inference tools. In addition, we propose useful heuristic scores to identify the number of populations represented in a data set and a new hierarchical prior to detect weak population structure in the data. We test the variational algorithms on simulated data and illustrate using genotype data from the CEPH-Human Genome Diversity Panel. The variational algorithms are almost two orders of magnitude faster than STRUCTURE and achieve accuracies comparable to those of ADMIXTURE. Furthermore, our results show that the heuristic scores for choosing model complexity provide a reasonable range of values for the number of populations represented in the data, with minimal bias toward detecting structure when it is very weak. Our algorithm, fastSTRUCTURE, is freely available online at http://pritchardlab.stanford.edu/structure.html. PMID:24700103

  8. Numerical methods for accelerating the PCA of large data sets applied to hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Vogt, Frank; Mizaikoff, Boris; Tacke, Maurus

    2002-02-01

    Principal component analysis and regression (PCA, PCR) are widespread algorithms for the calibration of spectrometers and the evaluation of spectra. In many applications, however, there are huge amounts of calibration data, as is common in hyperspectral imaging, for instance. Such data sets often consist of several tens of thousands of spectra measured at several hundred wavelength positions. Hence, a PCA of calibration sets that large is computationally very time-consuming - in particular the included singular value decomposition (SVD). Since this procedure takes several hours of computation time on conventional personal computers, its calculation is often not feasible. In this paper a straightforward acceleration of the PCA is presented, which is achieved by data preprocessing consisting of three steps: data compression based on a wavelet transformation, exclusion of redundant data, and taking advantage of the matrix dimensions. Since the size of the calibration matrix determines the calculation time of the PCA, a reduction of its size enables the acceleration. Due to an appropriate data preprocessing, the PCA of the discussed examples could be accelerated by more than one order of magnitude. It is demonstrated by means of synthetically generated spectra as well as by experimental data that, after preprocessing, the PCA results in calibration models which are comparable to the ones obtained by the conventional approach.
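The "taking advantage of the matrix dimensions" step can be sketched as follows: when there are far more spectra than wavelengths, the loadings can be obtained from the small wavelength-by-wavelength covariance matrix instead of an SVD of the full data matrix. The sizes below are illustrative, and the wavelet-compression and redundancy-exclusion steps from the paper are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative calibration set: 20,000 spectra at 300 wavelength positions.
X = rng.normal(size=(20000, 300))
Xc = X - X.mean(axis=0)                        # mean-center each wavelength

# A full SVD of the 20,000 x 300 matrix is the expensive route. Because the
# number of wavelengths is much smaller than the number of spectra, the
# loadings are instead taken as eigenvectors of the 300 x 300 covariance.
cov = (Xc.T @ Xc) / (Xc.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(cov)         # ascending eigenvalue order
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order[:10]]              # first 10 principal components
scores = Xc @ loadings                         # spectra projected into PC space
```

The eigendecomposition cost now depends only on the number of wavelengths, which is the same idea as working with the smaller of the two Gram matrices in an SVD.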

  9. Classroom-based Interventions and Teachers’ Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings

    PubMed Central

    Zhai, Fuhua; Raver, C. Cybele; Li-Grining, Christine

    2011-01-01

    Preschool teachers’ job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers’ perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job demands, and confidence in behavior management. Using a clustered randomized controlled trial (RCT) design, the CSRP provided multifaceted services to the treatment group, including teacher training and mental health consultation, which were accompanied by stress-reduction services and workshops. Overall, 90 teachers in 35 classrooms at 18 Head Start sites participated in the study. After adjusting for teacher and classroom factors and site fixed effects, we found that the CSRP had significant effects on the improvement of teachers’ perceived job control and work-related resources. We also found that the CSRP decreased teachers’ confidence in behavior management and had no statistically significant effects on job demands. Overall, we did not find significant moderation effects of teacher race/ethnicity, education, teaching experience, or teacher type. The implications for research and policy are discussed. PMID:21927538

  10. Child and setting characteristics affecting the adult talk directed at preschoolers with autism spectrum disorder in the inclusive classroom.

    PubMed

    Irvin, Dwight W; Boyd, Brian A; Odom, Samuel L

    2015-02-01

    Difficulty with social competence is a core deficit of autism spectrum disorder. Research on typically developing children and children with disabilities, in general, suggests the adult talk received in the classroom is related to their social development. The aims of this study were to examine (1) the types and amounts of adult talk children with autism spectrum disorder are exposed to in the preschool classroom and (2) the associations between child characteristics (e.g. language), activity area, and adult talk. Kontos' Teacher Talk classification was used to code videos approximately 30 min in length of 73 children with autism spectrum disorder (ages 3-5) in inclusive classrooms (n = 33) during center time. The results indicated practical/personal assistance was the most common type of adult talk coded, and behavior management talk least often coded. Child characteristics (i.e. age and autism severity) and activity area were found to be related to specific types of adult talk. Given the findings, implications for future research are discussed. PMID:24463432

  12. A multivariate approach to filling gaps in large ecological data sets using probabilistic matrix factorization techniques

    NASA Astrophysics Data System (ADS)

    Schrodt, F. I.; Shan, H.; Kattge, J.; Reich, P.; Banerjee, A.; Reichstein, M.

    2012-12-01

    With the advent of remotely sensed data and coordinated efforts to create global databases, the ecological community has progressively become more data-intensive. However, in contrast to other disciplines, statistical ways of handling these large data sets, especially the gaps which are inherent to them, are lacking. Widely used theoretical approaches, for example model averaging based on Akaike's information criterion (AIC), are sensitive to missing values. Yet, the most common way of handling sparse matrices - the deletion of cases with missing data (complete case analysis) - is known to severely reduce statistical power as well as inducing biased parameter estimates. In order to address these issues, we present novel approaches to gap filling in large ecological data sets using matrix factorization techniques. Factorization based matrix completion was developed in a recommender system context and has since been widely used to impute missing data in fields outside the ecological community. Here, we evaluate the effectiveness of probabilistic matrix factorization techniques for imputing missing data in ecological matrices using two imputation techniques. Hierarchical Probabilistic Matrix Factorization (HPMF) effectively incorporates hierarchical phylogenetic information (phylogenetic group, family, genus, species and individual plant) into the trait imputation. Kernelized Probabilistic Matrix Factorization (KPMF) on the other hand includes environmental information (climate and soils) into the matrix factorization through kernel matrices over rows and columns. We test the accuracy and effectiveness of HPMF and KPMF in filling sparse matrices, using the TRY database of plant functional traits (http://www.try-db.org). TRY is one of the largest global compilations of plant trait databases (750 traits of 1 million plants), encompassing data on morphological, anatomical, biochemical, phenological and physiological features of plants. 
However, despite its unprecedented coverage, the TRY database is still very sparse, severely limiting joint trait analyses. Plant traits are the key to understanding how plants as primary producers adjust to changes in environmental conditions and in turn influence them. Forming the basis for Dynamic Global Vegetation Models (DGVMs), plant traits are also fundamental in global change studies for predicting future ecosystem changes. It is thus imperative that missing data are imputed as accurately and precisely as possible. In this study, we show the advantage of applying probabilistic matrix factorization techniques that incorporate hierarchical and environmental information for the prediction of missing plant traits, as compared to conventional imputation techniques such as the complete-case and mean approaches. We will discuss advantages of the proposed imputation techniques over other widely used methods such as multiple imputation (MI), as well as possible applications to other data sets.
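As a toy illustration of the matrix-factorization idea (not the HPMF/KPMF models themselves, which add hierarchical and kernel side information), the sketch below imputes missing entries of a synthetic low-rank trait matrix by gradient descent on the observed cells only. All sizes, the rank, and the hyperparameters are made-up values.

```python
import numpy as np

rng = np.random.default_rng(42)
n, m, k = 200, 30, 5                       # plants x traits, latent rank
U_true = rng.normal(size=(n, k))
V_true = rng.normal(size=(m, k))
X = U_true @ V_true.T                      # complete synthetic trait matrix
mask = rng.random((n, m)) < 0.7            # ~70% of entries observed

# Factorize X ~ U V^T by gradient descent on observed entries only, with
# L2 regularization (the MAP objective of probabilistic matrix factorization).
U = 0.1 * rng.normal(size=(n, k))
V = 0.1 * rng.normal(size=(m, k))
lam, lr = 0.1, 0.01
for _ in range(2000):
    E = np.where(mask, X - U @ V.T, 0.0)   # residuals on observed cells only
    U += lr * (E @ V - lam * U)
    V += lr * (E.T @ U - lam * V)

X_hat = U @ V.T                            # every cell now has a prediction
rmse_missing = np.sqrt(np.mean((X_hat - X)[~mask] ** 2))
```

Because the factors are fit only where data exist, the product U V^T fills the gaps, which is the property that complete-case deletion and mean imputation lack.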

  13. Evaluation of flow resistance in gravel-bed rivers through a large field data set

    NASA Astrophysics Data System (ADS)

    Rickenmann, Dieter; Recking, Alain

    2011-07-01

    A data set of 2890 field measurements was used to test the ability of several conventional flow resistance equations to predict mean flow velocity in gravel bed rivers when used with no calibration. The tests were performed using both flow depth and discharge as input since discharge may be a more reliable measure of flow conditions in shallow flows. Generally better predictions are obtained when using flow discharge as input. The results indicate that the Manning-Strickler and the Keulegan equations show considerable disagreement with observed flow velocities for flow depths smaller than 10 times the characteristic grain diameter. Most equations show some systematic deviation for small relative flow depth. The use of new definitions for dimensionless variables in terms of nondimensional hydraulic geometry equations allows the development of a new flow resistance equation. The best overall performance is obtained by the Ferguson approach, which combines two power law flow resistance equations that are different for deep and shallow flows. To use this approach with flow discharge as input, a logarithmic matching equation in terms of the new dimensionless variables is proposed. For the domains of intermediate and large-scale roughness, the field data indicate a considerable increase in flow resistance as compared with the domain of small-scale roughness. The Ferguson approach is used to discuss the importance of flow resistance partitioning for bed load transport calculations at flow conditions with intermediate- and large-scale roughness in natural gravel, cobble, and boulder bed streams.
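Ferguson's variable-power approach, which the abstract identifies as the best performer, can be sketched as follows when flow depth is the input. The coefficients a1 = 6.5 and a2 = 2.5 are the commonly quoted values from Ferguson (2007), not numbers taken from this study, and the sketch omits the logarithmic matching equation used when discharge is the input.

```python
import numpy as np

def ferguson_velocity(depth, D84, slope, g=9.81):
    """Mean velocity (m/s) blending deep- and shallow-flow power laws."""
    a1, a2 = 6.5, 2.5                      # commonly quoted coefficients
    ustar = np.sqrt(g * depth * slope)     # shear velocity
    rel = depth / D84                      # relative flow depth d/D84
    return ustar * a1 * a2 * rel / np.sqrt(a1**2 + a2**2 * rel**(5.0 / 3.0))

# Flow resistance rises sharply at small relative depth:
v_shallow = ferguson_velocity(0.3, 0.2, 0.02)   # d/D84 = 1.5
v_deep = ferguson_velocity(2.0, 0.2, 0.02)      # d/D84 = 10
```

At large d/D84 the expression tends toward a Manning-Strickler-like power law, while at small d/D84 the steeper shallow-flow law dominates, reproducing the increased resistance the field data show for intermediate- and large-scale roughness.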

  14. Galaxy Evolution Insights from Spectral Modeling of Large Data Sets from the Sloan Digital Sky Survey

    SciTech Connect

    Hoversten, Erik A.; /Johns Hopkins U.

    2007-10-01

    This thesis centers on the use of spectral modeling techniques on data from the Sloan Digital Sky Survey (SDSS) to gain new insights into current questions in galaxy evolution. The SDSS provides a large, uniform, high-quality data set which can be exploited in a number of ways. One avenue pursued here is to use the large sample size to measure precisely the mean properties of galaxies within increasingly narrow parameter ranges. The other route taken is to look for rare objects which open up for exploration new areas in galaxy parameter space. The crux of this thesis is revisiting the classical Kennicutt method for inferring the stellar initial mass function (IMF) from the integrated light properties of galaxies. A large data set (~10^5 galaxies) from the SDSS DR4 is combined with more in-depth modeling and quantitative statistical analysis to search for systematic IMF variations as a function of galaxy luminosity. Galaxy Hα equivalent widths are compared to a broadband color index to constrain the IMF. It is found that for the sample as a whole the best-fitting IMF power-law slope above 0.5 M☉ is Λ = 1.5 ± 0.1, with the error dominated by systematics. Galaxies brighter than around M_r,0.1 = -20 (including galaxies like the Milky Way, which has M_r,0.1 ≈ -21) are well fit by a universal Λ ≈ 1.4 IMF, similar to the classical Salpeter slope, and smooth, exponential star formation histories (SFH). Fainter galaxies prefer steeper IMFs, and the quality of the fits reveals that for these galaxies a universal IMF with smooth SFHs is actually a poor assumption. Related projects are also pursued. A targeted photometric search is conducted for strongly lensed Lyman break galaxies (LBG) similar to MS1512-cB58. The evolution of the photometric selection technique is described, as are the results of spectroscopic follow-up of the best targets. 
The serendipitous discovery of two interesting blue compact dwarf galaxies is reported. These galaxies were identified by their extremely weak (< 150) [N II] λ6584 to Hα emission line ratios. Abundance analysis from emission line fluxes reveals that these galaxies have gas-phase oxygen abundances 12 + log(O/H) ≈ 7.7 to 7.9, not remarkably low, and near-infrared imaging detects an old stellar population. However, the measured nitrogen-to-oxygen ratios log(N/O) < -1.7 are anomalously low for blue compact dwarf galaxies. These objects may be useful for understanding the chemical evolution of nitrogen.

  15. Measurement, visualization and analysis of extremely large data sets with a nanopositioning and nanomeasuring machine

    NASA Astrophysics Data System (ADS)

    Birli, O.; Franke, K.-H.; Linß, G.; Machleidt, T.; Manske, E.; Schale, F.; Schwannecke, H.-C.; Sparrer, E.; Weiß, M.

    2013-04-01

    Nanopositioning and nanomeasuring machines (NPM machines) developed at the Ilmenau University of Technology allow the measurement of micro- and nanostructures with nanometer precision in a measurement volume of 25 mm × 25 mm × 5 mm (NMM-1) or 200 mm × 200 mm × 25 mm (NPMM-200). Various visual, tactile or atomic force sensors can all be used to measure specimens. Atomic force sensors have emerged as a powerful tool in nanotechnology. Large-scale AFM measurements are very time-consuming and in fact in a practical sense they are impossible over millimeter ranges due to low scanning speeds. A cascaded multi-sensor system can be used to implement a multi-scale measurement and testing strategy for nanopositioning and nanomeasuring machines. This approach involves capturing an overview image at the limit of optical resolution and automatically scanning the measured data for interesting test areas that are suitable for a higher-resolution measurement. These "fields of interest" can subsequently be measured in the same NPM machine using individual AFM sensor scans. The results involve extremely large data sets that cannot be handled by off-the-shelf software. Quickly navigating within terabyte-sized data files requires preprocessing to be done on the measured data to calculate intermediate images based on the principle of a visualization pyramid. This pyramid includes the measured data of the entire volume, prepared in the form of discrete measurement volumes (spatial tiles or cubes) with certain edge lengths at specific zoom levels. The functionality of the closed process chain is demonstrated using a blob analysis for automatically selecting regions of interest on the specimen. As expected, processing large amounts of data places particularly high demands on both computing power and the software architecture.
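The visualization-pyramid idea described above amounts to precomputing the data at successively coarser zoom levels, tiled into fixed-size blocks. The sketch below counts tiles per level for a 2-D case; the grid size and the 512-pixel tile edge are illustrative assumptions, not parameters of the NPMM-200 software.

```python
import math

def pyramid_levels(width, height, tile=512):
    """Return (width, height, tile_count) per zoom level, finest first."""
    levels = []
    w, h = width, height
    while True:
        nx, ny = math.ceil(w / tile), math.ceil(h / tile)
        levels.append((w, h, nx * ny))
        if nx == 1 and ny == 1:            # coarsest level fits in one tile
            break
        w, h = max(1, w // 2), max(1, h // 2)  # halve resolution per level
    return levels

levels = pyramid_levels(1_000_000, 1_000_000)  # e.g. a 1-terapixel scan
```

Navigation then only ever touches the tiles intersecting the current view at the current zoom level, which is what makes terabyte-sized measurement files browsable.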

  16. Efficient Implementation of an Optimal Interpolator for Large Spatial Data Sets

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; Mount, David M.

    2007-01-01

    Scattered data interpolation is a problem of interest in numerous areas such as electronic imaging, smooth surface modeling, and computational geometry. Our motivation arises from applications in geology and mining, which often involve large scattered data sets and a demand for high accuracy. The method of choice is ordinary kriging. This is because it is a best unbiased estimator. Unfortunately, this interpolant is computationally very expensive to compute exactly. For n scattered data points, computing the value of a single interpolant involves solving a dense linear system of size roughly n x n. This is infeasible for large n. In practice, kriging is solved approximately by local approaches that are based on considering only a relatively small number of points that lie close to the query point. There are many problems with this local approach, however. The first is that determining the proper neighborhood size is tricky, and is usually solved by ad hoc methods such as selecting a fixed number of nearest neighbors or all the points lying within a fixed radius. Such fixed neighborhood sizes may not work well for all query points, depending on the local density of the point distribution. Local methods also suffer from the problem that the resulting interpolant is not continuous. Meyer showed that while kriging produces smooth continuous surfaces, it has zero-order continuity along its borders. Thus, at interface boundaries where the neighborhood changes, the interpolant behaves discontinuously. Therefore, it is important to consider and solve the global system for each interpolant. However, solving such large dense systems for each query point is impractical. Recently a more principled approach to approximating kriging has been proposed, based on a technique called covariance tapering. The problems arise from the fact that the covariance functions that are used in kriging have global support. 
Our implementations combine, utilize, and enhance a number of different approaches that have been introduced in the literature for solving large linear systems for interpolation of scattered data points. For very large systems, exact methods such as Gaussian elimination are impractical since they require O(n³) time and O(n²) storage. As Billings et al. suggested, we use an iterative approach. In particular, we use the SYMMLQ method for solving the large but sparse ordinary kriging systems that result from tapering. The main technical issue that needs to be overcome in our algorithmic solution is that the points' covariance matrix for kriging should be symmetric positive definite. The goal of tapering is to obtain a sparse approximate representation of the covariance matrix while maintaining its positive definiteness. Furrer et al. used tapering to obtain a sparse linear system of the form Ax = b, where A is the tapered symmetric positive definite covariance matrix. Thus, Cholesky factorization could be used to solve their linear systems. They implemented an efficient sparse Cholesky decomposition method. They also showed that if these tapers are used for a limited class of covariance models, the solution of the system converges to the solution of the original system. Matrix A in the ordinary kriging system, while symmetric, is not positive definite. Thus, their approach is not applicable to the ordinary kriging system. Therefore, we use tapering only to obtain a sparse linear system. Then, we use SYMMLQ to solve the ordinary kriging system. We show that solving large kriging systems becomes practical via tapering and iterative methods, and results in lower estimation errors compared to traditional local approaches, and significant memory savings compared to the original global system. We also developed a more efficient variant of the sparse SYMMLQ method for large ordinary kriging systems. 
This approach adaptively finds the correct local neighborhood for each query point in the interpolation process.
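As a rough illustration of the tapering-plus-iterative-solver idea, the sketch below builds a compactly supported (tapered) covariance, assembles the ordinary kriging saddle-point system, and solves it iteratively. SciPy does not ship SYMMLQ, so MINRES (which targets the same class of symmetric indefinite systems) stands in for it; the exponential covariance, taper range, and nugget value are illustrative assumptions, not the paper's choices.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import minres
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
pts = rng.uniform(0, 10, size=(500, 2))           # scattered sample locations
vals = np.sin(pts[:, 0]) + 0.1 * rng.normal(size=500)

R = 2.0                                           # taper support radius

def tapered_cov(d):
    # Exponential covariance times a compact-support (Askey-type) taper,
    # so entries vanish beyond distance R and the matrix becomes sparse.
    return np.exp(-d) * np.clip(1.0 - d / R, 0.0, None) ** 2

n = len(pts)
C = tapered_cov(cdist(pts, pts)) + 1e-3 * np.eye(n)  # small nugget for stability
A = sparse.csr_matrix(C)

# Ordinary kriging saddle-point system: [[C, 1], [1^T, 0]] [w; mu] = [c0; 1].
# The zero diagonal block makes it symmetric but indefinite.
ones = sparse.csr_matrix(np.ones((n, 1)))
K = sparse.bmat([[A, ones], [ones.T, None]], format="csr")
query = np.array([[5.0, 5.0]])
rhs = np.append(tapered_cov(cdist(pts, query)).ravel(), 1.0)

sol, info = minres(K, rhs)                        # symmetric indefinite solver
w = sol[:n]                                       # kriging weights
estimate = w @ vals                               # interpolated value at query
```

The unbiasedness constraint appears as the weights summing to one, and the iterative solver only ever needs sparse matrix-vector products, which is the source of the memory savings the abstract describes.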

  17. Registering coherent change detection products associated with large image sets and long capture intervals

    SciTech Connect

    Perkins, David Nikolaus; Gonzales, Antonio I

    2014-04-08

    A set of co-registered coherent change detection (CCD) products is produced from a set of temporally separated synthetic aperture radar (SAR) images of a target scene. A plurality of transformations are determined, which transformations are respectively for transforming a plurality of the SAR images to a predetermined image coordinate system. The transformations are used to create, from a set of CCD products produced from the set of SAR images, a corresponding set of co-registered CCD products.

  18. Science Teachers' Decision-Making in Abstinence-Only-Until-Marriage (AOUM) Classrooms: Taboo Subjects and Discourses of Sex and Sexuality in Classroom Settings

    ERIC Educational Resources Information Center

    Gill, Puneet Singh

    2015-01-01

    Sex education, especially in the southeastern USA, remains steeped in an Abstinence-Only-Until-Marriage (AOUM) approach, which sets up barriers to the education of sexually active students. Research confirms that science education has the potential to facilitate discussion of controversial topics, including sex education. Science teachers in the…

  20. Any Questions? An Application of Weick's Model of Organizing to Increase Student Involvement in the Large-Lecture Classroom

    ERIC Educational Resources Information Center

    Ledford, Christy J. W.; Saperstein, Adam K.; Cafferty, Lauren A.; McClintick, Stacey H.; Bernstein, Ethan M.

    2015-01-01

    Microblogs, with their interactive nature, can engage students in community building and sensemaking. Using Weick's model of organizing as a framework, we integrated the use of micromessaging to increase student engagement in the large-lecture classroom. Students asked significantly more questions and asked a greater diversity of questions…

  2. Considerations for observational research using large data sets in radiation oncology.

    PubMed

    Jagsi, Reshma; Bekelman, Justin E; Chen, Aileen; Chen, Ronald C; Hoffman, Karen; Shih, Ya-Chen Tina; Smith, Benjamin D; Yu, James B

    2014-09-01

    The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. 
This review concludes that carefully designed and executed studies using observational data that possess these qualities hold substantial promise for advancing our understanding of many unanswered questions of importance to the field of radiation oncology. PMID:25195986

  3. Considerations for Observational Research Using Large Data Sets in Radiation Oncology

    SciTech Connect

    Jagsi, Reshma; Bekelman, Justin E.; Chen, Aileen; Chen, Ronald C.; Hoffman, Karen; Tina Shih, Ya-Chen; Smith, Benjamin D.; Yu, James B.

    2014-09-01

    The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. 
This review concludes that carefully designed and executed studies using observational data that possess these qualities hold substantial promise for advancing our understanding of many unanswered questions of importance to the field of radiation oncology.

  4. PORTAAL: A Classroom Observation Tool Assessing Evidence-Based Teaching Practices for Active Learning in Large Science, Technology, Engineering, and Mathematics Classes

    PubMed Central

    Eddy, Sarah L.; Converse, Mercedes; Wenderoth, Mary Pat

    2015-01-01

    There is extensive evidence that active learning works better than a completely passive lecture. Despite this evidence, adoption of these evidence-based teaching practices remains low. In this paper, we offer one tool to help faculty members implement active learning. This tool identifies 21 readily implemented elements that have been shown to increase student outcomes related to achievement, logic development, or other relevant learning goals with college-age students. Thus, this tool both clarifies the research-supported elements of best practices for instructor implementation of active learning in the classroom setting and measures instructors’ alignment with these practices. We describe how we reviewed the discipline-based education research literature to identify best practices in active learning for adult learners in the classroom and used these results to develop an observation tool (Practical Observation Rubric To Assess Active Learning, or PORTAAL) that documents the extent to which instructors incorporate these practices into their classrooms. We then use PORTAAL to explore the classroom practices of 25 introductory biology instructors who employ some form of active learning. Overall, PORTAAL documents how well aligned classrooms are with research-supported best practices for active learning and provides specific feedback and guidance to instructors to allow them to identify what they do well and what could be improved. PMID:26033871

  5. PORTAAL: A Classroom Observation Tool Assessing Evidence-Based Teaching Practices for Active Learning in Large Science, Technology, Engineering, and Mathematics Classes.

    PubMed

    Eddy, Sarah L; Converse, Mercedes; Wenderoth, Mary Pat

    2015-01-01

    There is extensive evidence that active learning works better than a completely passive lecture. Despite this evidence, adoption of these evidence-based teaching practices remains low. In this paper, we offer one tool to help faculty members implement active learning. This tool identifies 21 readily implemented elements that have been shown to increase student outcomes related to achievement, logic development, or other relevant learning goals with college-age students. Thus, this tool both clarifies the research-supported elements of best practices for instructor implementation of active learning in the classroom setting and measures instructors' alignment with these practices. We describe how we reviewed the discipline-based education research literature to identify best practices in active learning for adult learners in the classroom and used these results to develop an observation tool (Practical Observation Rubric To Assess Active Learning, or PORTAAL) that documents the extent to which instructors incorporate these practices into their classrooms. We then use PORTAAL to explore the classroom practices of 25 introductory biology instructors who employ some form of active learning. Overall, PORTAAL documents how well aligned classrooms are with research-supported best practices for active learning and provides specific feedback and guidance to instructors to allow them to identify what they do well and what could be improved. PMID:26033871

  6. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
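
As a concrete illustration of the similarity measure the abstract describes, here is a minimal normalized-LCS sketch: the quadratic textbook dynamic program, not the paper's fast hybrid algorithm, and the 2·LCS/(|a|+|b|) normalization is one common convention assumed here rather than taken from the paper.

```python
def lcs_length(a, b):
    """Length of the longest common subsequence, classic O(|a|*|b|) DP."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i-1][j-1] + 1 if x == y else max(dp[i-1][j], dp[i][j-1])
    return dp[len(a)][len(b)]

def normalized_lcs(a, b):
    """Similarity in [0, 1]; identical sequences score 1.0."""
    return 2.0 * lcs_length(a, b) / (len(a) + len(b)) if a or b else 1.0

assert lcs_length("ABCBDAB", "BDCABA") == 4   # e.g. "BCAB"
```

Clustering would then use 1 minus this similarity as a distance between symbol sequences; the DP above is what the faster algorithms discussed in the paper are optimizing.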

  7. Public-private partnerships with large corporations: setting the ground rules for better health.

    PubMed

    Galea, Gauden; McKee, Martin

    2014-04-01

    Public-private partnerships with large corporations offer potential benefits to the health sector but many concerns have been raised, highlighting the need for appropriate safeguards. In this paper we propose five tests that public policy makers may wish to apply when considering engaging in such a public-private partnership. First, are the core products and services provided by the corporation health enhancing or health damaging? In some cases, such as tobacco, the answer is obvious but others, such as food and alcohol, are contested. In such cases, the burden of proof is on the potential partners to show that their activities are health enhancing. Second, do potential partners put their policies into practice in the settings where they can do so, their own workplaces? Third, are the corporate social responsibility activities of potential partners independently audited? Fourth, do potential partners make contributions to the commons rather than to narrow programmes of their choosing? Fifth, is the role of the partner confined to policy implementation rather than policy development, which is ultimately the responsibility of government alone? PMID:24581699

  8. Information Theoretic Approaches to Rapid Discovery of Relationships in Large Climate Data Sets

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.; Rossow, William B.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Mutual information as the asymptotic Bayesian measure of independence is an excellent starting point for investigating the existence of possible relationships among climate-relevant variables in large data sets. As mutual information is a nonlinear function of its arguments, it is not beholden to the assumption of a linear relationship between the variables in question and can reveal features missed in linear correlation analyses. However, as mutual information is symmetric in its arguments, it only has the ability to reveal the probability that two variables are related. It provides no information as to how they are related; specifically, causal interactions or a relation based on a common cause cannot be detected. For this reason we also investigate the utility of a related quantity called the transfer entropy. The transfer entropy can be written as a difference between mutual informations and has the capability to reveal whether and how the variables are causally related. The application of these information theoretic measures is tested on some familiar examples using data from the International Satellite Cloud Climatology Project (ISCCP) to identify relations between global cloud cover and other variables, including equatorial Pacific sea surface temperature (SST), over seasonal and El Nino Southern Oscillation (ENSO) cycles.

  9. Information Theoretic Approaches to Rapid Discovery of Relationships in Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Knuth, K. H.; Rossow, W. B.

    2002-12-01

    Mutual information as the asymptotic Bayesian measure of independence is an excellent starting point for investigating the existence of possible relationships among climate-relevant variables in large data sets. As mutual information is a nonlinear function of its arguments, it is not beholden to the assumption of a linear relationship between the variables in question and can reveal features missed in linear correlation analyses. However, as mutual information is symmetric in its arguments, it only has the ability to reveal the probability that two variables are related. It provides no information as to how they are related; specifically, causal interactions or a relation based on a common cause cannot be detected. For this reason we also investigate the utility of a related quantity called the transfer entropy. The transfer entropy can be written as a difference between mutual informations and has the capability to reveal whether and how the variables are causally related. The application of these information theoretic measures is tested on some familiar examples using data from the International Satellite Cloud Climatology Project (ISCCP) to identify relations between global cloud cover and other variables, including equatorial pacific sea surface temperature (SST), over seasonal and El Nino Southern Oscillation (ENSO) cycles.
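
The quantity both abstracts build on is I(X;Y) = Σ p(x,y) log[p(x,y)/(p(x)p(y))]. A minimal plug-in (histogram) estimator, which is only a sketch of the general idea and not the authors' estimator, might look like this, assuming numpy and an arbitrary bin count:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in estimate of I(X;Y) in nats from a joint histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                           # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)        # marginal of X
    py = pxy.sum(axis=0, keepdims=True)        # marginal of Y
    nz = pxy > 0                               # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
# A variable shares far more information with itself than with noise.
assert mutual_information(x, x) > mutual_information(x, rng.normal(size=5000))
```

Transfer entropy, as the abstract notes, can then be assembled as a difference of such mutual-information terms between lagged versions of the series, which breaks the symmetry and exposes directionality.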

  10. Taking Energy to the Physics Classroom from the Large Hadron Collider at CERN

    ERIC Educational Resources Information Center

    Cid, Xabier; Cid, Ramon

    2009-01-01

    In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…

  12. Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer (7th Annual SFAF Meeting, 2012)

    ScienceCinema

    Copeland, Alex [DOE JGI]

    2013-02-11

    Alex Copeland on "Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  13. Classroom management programs for deaf children in state residential and large public schools.

    PubMed

    Wenkus, M; Rittenhouse, B; Dancer, J

    1999-12-01

    Personnel in 4 randomly selected state residential schools for the deaf and 3 randomly selected large public schools with programs for the deaf were surveyed to assess the types of management or disciplinary programs and strategies currently in use with deaf students and the rated effectiveness of such programs. Several behavioral management programs were identified by respondents, with Assertive Discipline most often listed. Ratings of program effectiveness were generally above average on a number of qualitative criteria. PMID:10710770

  14. Linked Scatter Plots, A Powerful Exploration Tool For Very Large Sets of Spectra

    NASA Astrophysics Data System (ADS)

    Carbon, Duane Francis; Henze, Christopher

    2015-08-01

    We present a new tool, based on linked scatter plots, that is designed to efficiently explore very large spectrum data sets such as the SDSS, APOGEE, LAMOST, GAIA, and RAVE data sets. The tool works in two stages: the first uses batch processing and the second runs interactively. In the batch stage, spectra are processed through our data pipeline which computes the depths relative to the local continuum at preselected feature wavelengths. These depths, and any additional available variables such as local S/N level, magnitudes, colors, positions, and radial velocities, are the basic measured quantities used in the interactive stage. The interactive stage employs the NASA hyperwall, a configuration of 128 workstation displays (8x16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. Each hyperwall panel is used to display a fully linked 2-D scatter plot showing the depth of feature A vs the depth of feature B for all of the spectra. A and B change from panel to panel. The relationships between the various (A,B) strengths and any distinctive clustering, as well as unique outlier groupings, are visually apparent when examining and inter-comparing the different panels on the hyperwall. In addition, the data links between the scatter plots allow the user to apply a logical algebra to the measurements. By graphically selecting the objects in any interesting region of any 2-D plot on the hyperwall, the tool immediately and clearly shows how the selected objects are distributed in all the other 2-D plots. The selection process may be repeated multiple times and, at each step, the selections can represent a sequence of logical constraints on the measurements, revealing those objects which satisfy all the constraints thus far. The spectra of the selected objects may be examined at any time on a connected workstation display. Using over 945,000,000 depth measurements from 569,738 SDSS DR10 stellar spectra, we illustrate how to quickly isolate and examine such interesting stellar subsets as EMP stars, C-rich EMP stars, and CV stars.
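
The "logical algebra" of chained graphical selections described above amounts to AND-ing boolean masks over the measurement table. A toy numpy sketch of the idea, with stand-in random data and hypothetical thresholds in place of real feature depths:

```python
import numpy as np

# depths[i, j] = depth of feature j in spectrum i (toy stand-in data).
rng = np.random.default_rng(1)
depths = rng.random((1000, 4))

# Each graphical selection on a 2-D panel becomes a boolean mask;
# chaining selections ANDs the constraints, as in the linked plots.
sel = np.ones(len(depths), dtype=bool)
sel &= depths[:, 0] > 0.5          # selection made on panel (feature 0 vs 1)
sel &= depths[:, 2] < 0.3          # further selection on panel (2 vs 3)
subset = depths[sel]
assert np.all(subset[:, 0] > 0.5) and np.all(subset[:, 2] < 0.3)
```

Every other panel then simply redraws `subset` highlighted against the full table, which is what makes the linked display immediate.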

  15. Validation and evaluation of common large-area display set (CLADS) performance specification

    NASA Astrophysics Data System (ADS)

    Hermann, David J.; Gorenflo, Ronald L.

    1998-09-01

    Battelle is under contract with Warner Robins Air Logistics Center to design a Common Large Area Display Set (CLADS) for use in multiple Command, Control, Communications, Computers, and Intelligence (C4I) applications that currently use 19-inch Cathode Ray Tubes (CRTs). Battelle engineers have built and fully tested pre-production prototypes of the CLADS design for AWACS, and are completing pre-production prototype displays for three other platforms simultaneously. With the CLADS design, any display technology that can be packaged to meet the form, fit, and function requirements defined by the Common Large Area Display Head Assembly (CLADHA) performance specification is a candidate for CLADS applications. This technology-independent feature reduced the risk of CLADS development, permits lifelong technology insertion upgrades without unnecessary redesign, and addresses many of the obsolescence problems associated with COTS technology-based acquisition. Performance and environmental testing was performed on the AWACS CLADS and continues on other platforms as part of the performance specification validation process. A simulator assessment and flight assessment were successfully completed for the AWACS CLADS, and lessons learned from these assessments are being incorporated into the performance specifications. Draft CLADS specifications were released to potential display integrators and manufacturers for review in 1997, and the final version of the performance specifications is scheduled to be released to display integrators and manufacturers in May 1998. Initial USAF applications include replacements for the E-3 AWACS color monitor assembly, E-8 Joint STARS graphics display unit, and ABCCC airborne color display. Initial U.S. Navy applications include the E-2C ACIS display. For these applications, reliability and maintainability are key objectives.
The common design will reduce the cost of operation and maintenance by an estimated $3.3M per year on E-3 AWACS alone. It is realistic to anticipate savings of over $30M per year as CLADS is implemented widely across DoD applications. As commonality and open systems interfaces begin to surface in DoD applications, the CLADS architecture can easily and cost-effectively absorb the changes, and avoid COTS obsolescence issues.

  16. Getting specific: making taxonomic and ecological sense of large sequencing data sets.

    PubMed

    Massana, Ramon

    2015-06-01

    Eukaryotic microbes comprise a diverse collection of phototrophic and heterotrophic creatures known to play fundamental roles in ecological processes. Some can be identified by light microscopy, generally the largest and with conspicuous shapes, while the smallest can be counted by epifluorescence microscopy or flow cytometry but remain largely unidentified. Microbial diversity studies greatly advanced with the analysis of phylogenetic markers sequenced from natural assemblages. Molecular surveys began in 1990 targeting marine bacterioplankton (Giovannoni et al.) and first approached microbial eukaryotes in three studies published in 2001 (Díez et al.; López-García et al.; Moon-van der Staay et al.). These seminal studies, based on cloning and Sanger sequencing the complete 18S rDNA, were critical for obtaining broad pictures of microbial diversity in contrasted habitats and for describing novel lineages by robust phylogenies, but were limited by the number of sequences obtained. So, inventories of species richness in a given sample and community comparisons through environmental gradients were very incomplete. These limitations have been overcome with the advent of high-throughput sequencing (HTS) methods, initially 454-pyrosequencing, today Illumina and soon others to come. In this issue of Molecular Ecology, Egge et al. show a nice example of the use of HTS to study the biodiversity and seasonal succession of a particularly important group of marine microbial eukaryotes, the haptophytes. Temporal changes were analysed first at the community level, then at the clade level, and finally at the lowest rank comparable to species. Interesting and useful ecological insights were obtained at each taxonomic scale. Haptophyte diversity differed along seasons in a systematic manner, with some species showing seasonal preferences and others being always present. Many of these species had no correspondence with known species, pointing out the high level of novelty in microbial assemblages, only accessible by molecular tools. Moreover, the number of species detected was limited, agreeing with a putative scenario of constrained evolutionary diversification in free-living small eukaryotes. This study illustrates the potential of HTS to address ecologically relevant questions in an accessible way by processing large data sets that, nonetheless, need to be treated with a fair understanding of their limitations. PMID:26095583

  17. BACHSCORE. A tool for evaluating efficiently and reliably the quality of large sets of protein structures

    NASA Astrophysics Data System (ADS)

    Sarti, E.; Zamuner, S.; Cossio, P.; Laio, A.; Seno, F.; Trovato, A.

    2013-12-01

    In protein structure prediction it is of crucial importance, especially at the refinement stage, to score efficiently large sets of models by selecting the ones that are closest to the native state. We here present a new computational tool, BACHSCORE, that allows its users to rank different structural models of the same protein according to their quality, evaluated by using the BACH++ (Bayesian Analysis Conformation Hunt) scoring function. The original BACH statistical potential was already shown to discriminate with very good reliability the protein native state in large sets of misfolded models of the same protein. BACH++ features a novel upgrade in the solvation potential of the scoring function, now computed by adapting the LCPO (Linear Combination of Pairwise Orbitals) algorithm. This change further enhances the already good performance of the scoring function. BACHSCORE can be accessed directly through the web server: bachserver.pd.infn.it.
    Catalogue identifier: AEQD_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQD_v1_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License version 3
    No. of lines in distributed program, including test data, etc.: 130159
    No. of bytes in distributed program, including test data, etc.: 24 687 455
    Distribution format: tar.gz
    Programming language: C++
    Computer: Any computer capable of running an executable produced by a g++ compiler (4.6.3 version)
    Operating system: Linux, Unix OS-es
    RAM: 1 073 741 824 bytes
    Classification: 3
    Nature of problem: Evaluate the quality of a protein structural model, taking into account the possible “a priori” knowledge of a reference primary sequence that may be different from the amino-acid sequence of the model; the native protein structure should be recognized as the best model.
    Solution method: The contact potential scores the occurrence of any given type of residue pair in 5 possible contact classes (α-helical contact, parallel β-sheet contact, anti-parallel β-sheet contact, side-chain contact, no contact). The solvation potential scores the occurrence of any residue type in 2 possible environments: buried and solvent exposed. Residue environment is assigned by adapting the LCPO algorithm. Residues present in the reference primary sequence and not present in the model structure contribute to the model score as solvent exposed and as non-contacting all other residues.
    Restrictions: Input file format according to the Protein Data Bank standard.
    Additional comments: Parameter values used in the scoring function can be found in the file /folder-to-bachscore/BACH/examples/bach_std.par.
    Running time: Roughly one minute to score one hundred structures on a desktop PC, depending on their size.
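
The scoring scheme in the solution method can be caricatured as summing per-class energies over residue pairs and per-environment energies over residues. The sketch below is a generic statistical-potential toy, not BACH++ itself; the class names follow the five contact classes above, but the table layout and numbers are invented for illustration.

```python
# Toy statistical potential in the spirit of the contact + solvation
# scheme described above; real energies would be -log propensities
# learned from known structures, here they are invented stand-ins.
CONTACT_CLASSES = ("alpha", "parallel_beta", "antiparallel_beta",
                   "sidechain", "none")

def score_model(contacts, environments, pair_energy, solv_energy):
    """Sum contact-class energies over residue pairs plus solvation
    energies over residues; lower totals mean more native-like."""
    total = 0.0
    for (res_a, res_b), cls in contacts:
        assert cls in CONTACT_CLASSES
        total += pair_energy[frozenset((res_a, res_b))][cls]
    for res, env in environments:
        total += solv_energy[res][env]
    return total

pair_energy = {frozenset(("ALA", "LEU")): {"sidechain": -0.5, "none": 0.1}}
solv_energy = {"ALA": {"buried": -0.2, "exposed": 0.3}}
s = score_model([(("ALA", "LEU"), "sidechain")], [("ALA", "buried")],
                pair_energy, solv_energy)
assert abs(s - (-0.7)) < 1e-12
```

Ranking a set of models then reduces to evaluating this sum for each and sorting, which is what makes the approach cheap enough for large model sets.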

  18. Setting the Stage for Developing Pre-service Teachers' Conceptions of Good Science Teaching: The role of classroom videos

    NASA Astrophysics Data System (ADS)

    Wong, Siu Ling; Yung, Benny Hin Wai; Cheng, Man Wai; Lam, Kwok Leung; Hodson, Derek

    2006-01-01

    This paper reports findings about a curriculum innovation conducted at The University of Hong Kong. A CD-ROM consisting of videos of two lessons by different teachers demonstrating exemplary science teaching was used to elicit conceptions of good science teaching of student-teachers enrolled for the 1-year Postgraduate Diploma in Education at several stages during the programme. It was found that the videos elicited student-teachers' conceptions and had an impact on those conceptions prior to the commencement of formal instruction. The videos extended student-teachers' awareness of alternative teaching methods and approaches not experienced in their own schooling, broadened their awareness of different classroom situations, provided proof of the existence of good practices, and prompted them to reflect on their current preconceptions of good science teaching. In several ways, the videos acted as a catalyst in socializing the transition of student-teachers from the role of student to the role of teacher.

  19. Classroom Management and the Librarian

    ERIC Educational Resources Information Center

    Blackburn, Heidi; Hays, Lauren

    2014-01-01

    As librarians take on more instructional responsibilities, the need for classroom management skills becomes vital. Unfortunately, classroom management skills are not taught in library school and therefore, many librarians are forced to learn how to manage a classroom on the job. Different classroom settings such as one-shot instruction sessions…

  20. Toward accurate thermochemical models for transition metals: G3Large basis sets for atoms Sc-Zn

    NASA Astrophysics Data System (ADS)

    Mayhall, Nicholas J.; Raghavachari, Krishnan; Redfern, Paul C.; Curtiss, Larry A.; Rassolov, Vitaly

    2008-04-01

    An augmented valence triple-zeta basis set, referred to as G3Large, is reported for the first-row transition metal elements Sc through Zn. The basis set is constructed in a manner similar to the G3Large basis set developed previously for other elements (H-Ar, K, Ca, Ga-Kr) and used as a key component in Gaussian-3 theory. It is based on a contraction of a set of 15s13p5d Gaussian primitives to 8s7p3d, and also includes sets of f and g polarization functions, diffuse spd functions, and core df polarization functions. The basis set is evaluated with triples-augmented coupled cluster [CCSD(T)] and Brueckner orbital [BD(T)] methods for a small test set involving energies of atoms, atomic ions, and diatomic hydrides. It performs well for the low-lying s→d excitation energies of atoms, atomic ionization energies, and the dissociation energies of the diatomic hydrides. The Brueckner orbital-based BD(T) method performs substantially better than Hartree-Fock-based CCSD(T) for molecules such as NiH, where the starting unrestricted Hartree-Fock wavefunction suffers from a high degree of spin contamination. Comparison with available data for geometries of transition metal hydrides also shows good agreement. A smaller basis set without core polarization functions, G3MP2Large, is also defined.
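
For orientation, a contraction pattern such as 8s7p3d fixes the number of contracted functions per atom before the f, g, diffuse, and core polarization sets described above are added. A quick counting sketch, assuming pure spherical components (s=1, p=3, d=5 rather than the 6 Cartesian d components):

```python
import re

# Spherical components per shell type (an assumption: pure spherical
# harmonics, so d contributes 5 functions, f contributes 7, etc.).
COMPONENTS = {"s": 1, "p": 3, "d": 5, "f": 7, "g": 9}

def count_functions(contraction):
    """Number of basis functions in a shell pattern like '8s7p3d'."""
    return sum(int(n) * COMPONENTS[shell]
               for n, shell in re.findall(r"(\d+)([spdfg])", contraction))

assert count_functions("8s7p3d") == 44     # contracted spd set quoted above
assert count_functions("15s13p5d") == 79   # primitive set before contraction
```

This kind of bookkeeping shows why contraction matters: the spd space shrinks from 79 primitive-derived functions to 44 contracted ones per atom before polarization functions are appended.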

  1. Tools for Analysis and Visualization of Large Time-Varying CFD Data Sets

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; VanGelder, Allen

    1997-01-01

    In the second year, we continued to build upon and improve our scanline-based direct volume renderer that we developed in the first year of this grant. This extremely general rendering approach can handle regular or irregular grids, including overlapping multiple grids, and polygon mesh surfaces. It runs in parallel on multi-processors. It can also be used in conjunction with a k-d tree hierarchy, where approximate models and error terms are stored in the nodes of the tree, and approximate fast renderings can be created. We have extended our software to handle time-varying data where the data changes but the grid does not. We are now working on extending it to handle more general time-varying data. We have also developed a new extension of our direct volume renderer that uses automatic decimation of the 3D grid, as opposed to an explicit hierarchy. We explored this alternative approach as being more appropriate for very large data sets, where the extra expense of a tree may be unacceptable. We also describe a new approach to direct volume rendering that uses hardware 3D textures and incorporates lighting effects. Volume rendering using hardware 3D textures is extremely fast, and machines capable of using this technique are becoming more moderately priced. While this technique, at present, is limited to use with regular grids, we are pursuing possible algorithms extending the approach to more general grid types. We have also begun to explore a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH '96. In our initial implementation, we automatically image the volume from 32 equi-distant positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation. We are studying whether this will give a quantitative measure of the effects of approximation.
We have created new tools for exploring the differences between images produced by various rendering methods. Images created by our software can be stored in the SGI RGB format. Our idtools software reads in a pair of images and compares them using various metrics. The differences between the images under the RGB, HSV, and HSL color models can be calculated and displayed. We can also calculate the auto-correlation function and the Fourier transform of the images and image differences. We will explore how these image differences compare in order to find useful metrics for quantifying the success of various visualization approaches. In general, progress was consistent with our research plan for the second year of the grant.
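Image-difference metrics of the kind described can be sketched as follows (a hedged illustration, not the idtools code; the function names are invented, and images are assumed to arrive as H x W x 3 NumPy arrays):

```python
import numpy as np

def rgb_rmse(a, b):
    """Root-mean-square difference between two RGB images (H x W x 3)."""
    return float(np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2)))

def fourier_diff(a, b):
    """Magnitude spectrum of the grayscale difference image."""
    gray = lambda im: im.astype(float).mean(axis=2)
    return np.abs(np.fft.fft2(gray(a) - gray(b)))
```

A full comparison tool along the lines described would add the HSV/HSL conversions and the auto-correlation function as further metrics.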

  2. Gaining A Geological Perspective Through Active Learning in the Large Lecture Classroom

    NASA Astrophysics Data System (ADS)

    Kapp, J. L.; Richardson, R. M.; Slater, S. J.

    2008-12-01

NATS 101, A Geological Perspective, is a general education course taken by non-science majors. We offer 600 seats per semester, with four large lecture sections taught by different faculty members. In the past we have offered optional once-a-week study groups taught by graduate teaching assistants. Students often feel overwhelmed by the science and associated jargon, and many are prone to skipping lectures altogether. Optional study groups are attended by only ~50% of the students. Faculty members find the class to be a lot of work, mainly due to the grading it generates. Activities given in lecture are often short multiple-choice or true/false assignments, limiting the depth of understanding we can evaluate. Our students often lack math and critical thinking skills, and we spend a lot of time in lecture reintroducing ideas students should already have gotten from the text. In summer 2007 we were funded to redesign the course. Our goals were to 1) cut the cost of running the course, and 2) improve student learning. Under our redesign, optional study groups were replaced by mandatory once-a-week breakout sessions where students complete activities that have been introduced in lecture. Breakout sessions substitute for one hour of lecture, and are run by undergraduate preceptors and graduate teaching assistants (GTAs). During the lecture period, lectures themselves are brief, with a large portion of the class devoted to active learning in small groups. Weekly reading quizzes are submitted via the online course management system. Breakout sessions allow students to spend more time interacting with their fellow students, undergraduate preceptors, and GTAs. They get one-on-one help in breakout sessions on assignments designed to enhance the lecture material. The active lecture format means less of their time is devoted to listening passively to a lecture, and more time is spent peer learning and interacting with the instructor.
Completing quizzes online allows students more freedom in when and where they complete their work, and we provide instant feedback on submitted work. The University of Wyoming Cognition in Astronomy, Physics and Earth sciences Research (CAPER) Team, who specialize in project evaluation, are leading the evaluation effort. We are comparing pre-test to post-test gains on the Geoscience Concept Inventory and Attitudes Toward Science surveys before and after the redesign, along with inductive analysis of student interviews and reflective writing that describe student perceptions of the modified learning environment. The redesign has cut the cost of the class per student by more than half. This was achieved primarily in two ways: 1) by greatly reducing the number of hours spent by faculty and graduate teaching assistants on preparation, class time, and grading; and 2) by reducing the number of graduate teaching assistants required for the class and replacing many of them with undergraduate preceptors. Undergraduate preceptors are not paid, but receive academic credit for their teaching service. The savings from the redesign are used to allow faculty more time to work on institutional priorities.

  3. DivergentSet, a tool for picking non-redundant sequences from large sequence collections.

    PubMed

    Widmann, Jeremy; Hamady, Micah; Knight, Rob

    2006-08-01

DivergentSet addresses the important but so far neglected bioinformatics task of choosing a representative set of sequences from a larger collection. We found that using a phylogenetic tree to guide the construction of divergent sets of sequences can be up to two orders of magnitude faster than the naive method of using a full distance matrix. By providing a user-friendly interface (available online) that integrates the tasks of finding additional sequences, building and refining the divergent set, producing random divergent sets from the same sequences, and exporting identifiers, this software facilitates a wide range of bioinformatics analyses, including finding significant motifs and covariations. As an example application of DivergentSet, we demonstrate that the motifs identified by the motif-finding package MEME (Multiple EM for Motif Elicitation) are highly unstable with respect to the specific choice of sequences. This instability suggests that the types of sensitivity analysis enabled by DivergentSet may be widely useful for identifying motifs of biological significance. PMID:16769708
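For intuition, the naive full-distance-matrix approach the abstract benchmarks against can be sketched as a greedy max-min selection (a hedged illustration; DivergentSet's tree-guided method avoids building this O(n²) matrix, and the function name is invented):

```python
def greedy_divergent_set(dist, k):
    """Greedily pick k mutually distant members, given a full distance matrix
    dist (list of lists). Naive baseline, not the tree-guided algorithm."""
    n = len(dist)
    chosen = [0]  # arbitrary starting sequence
    while len(chosen) < k:
        # pick the candidate whose nearest already-chosen member is farthest
        best = max((i for i in range(n) if i not in chosen),
                   key=lambda i: min(dist[i][j] for j in chosen))
        chosen.append(best)
    return chosen
```

Each greedy step scans all pairwise distances, which is what makes the full-matrix method slow for large collections.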

  4. Engaged: Making Large Classes Feel Small through Blended Learning Instructional Strategies that Promote Increased Student Performance

    ERIC Educational Resources Information Center

    Francis, Raymond W.

    2012-01-01

    It is not enough to be great at sharing information in a large classroom setting. To be an effective teacher you must be able to meaningfully engage your students with their peers and with the content. And you must do this regardless of class size or content. The issues of teaching effectively in large classroom settings have presented ongoing…

  5. History Matters: The Large Scale Landscape Setting of the Boulder Creek Critical Zone Observatory (Invited)

    NASA Astrophysics Data System (ADS)

    Anderson, R. S.; Wobus, C. W.; Berlin, M. M.; Duehnforth, M.; Tucker, G. E.; Anderson, S. P.

    2009-12-01

The architecture of the critical zone depends upon the structural geologic and climatic history of a site. We document these controls in the vicinity of the Boulder Creek Critical Zone Observatory (BcCZO), and show that events in its geologic history reaching back to the Laramide have left a strong legacy in the landscape and its critical zone architecture. The Laramide orogeny brought >1 Gyr crystalline rocks to the surface by motion along listric faults both east and west of the range crest. This manifests itself in the juxtaposition of crystalline rocks of the range against younger sedimentary rocks of the Denver Basin. More subtly, the rocks have also been subjected to significant strain upon passing through major bends and ramps in the thrust faults, leading to brittle cracking; they therefore arrive in the near-surface pre-crushed. Post-Laramide evolution of the range resulted in reduction of relief within the crystalline core of the range, and construction of a sedimentary ramp built of weathered debris. Accumulation of sediment ended with deposition of the Ogallala Formation and similar clastic sedimentary units derived from the crystalline core of the Rockies. Subsequent late Cenozoic abandonment of this depositional surface and significant exhumation of the western Great Plains resulted in the cutting out of a large wedge of the easily eroded sedimentary rocks. This exhumation event could result either from a change in the hydrologic system in the headwaters, or, more likely, from a climatically induced reduction in sediment supply from the crystalline range. Simple models show how the latter would generate a wedge-shaped exhumation pattern that is both deepest and widest against the range. The relevance to the BcCZO is two-fold. First, it may be a reduction in regolith production rate and/or grain size that promotes the exhumation of the western Plains.
Second, the rapid exhumation adjacent to the range served as a drop in base level for all rivers draining the range, which in turn induced a wave of fluvial erosion that propagates into the range. All rivers draining the range display convexities reflecting the modern edge of this wave of incision. The knickzones of rivers with the highest drainage areas have bitten most deeply into the range. The landscape immediately adjacent to the river segments outboard of the knickzones has been rejuvenated. Hillslopes are steeper, are dominated by bedrock exposures and display regolith of variable thickness. Roughly simultaneously with the exhumation of the Plains, the upper drainages have been glaciated, repeatedly scouring the weathered carapace from the glacial troughs; 10Be concentrations in glacial polish suggest that >3 m of erosion occurred in the last glacial cycle. The BcCZO exploits this landscape history by addressing the differences in critical zone development in three settings in the landscape: the glacially scoured headwaters, the re-excited outer edge of the range adjacent to the major streams, and the landscape between these zones of excitement, in which slow post-Laramide decay of the landscape reigns.

  6. On the performance of large Gaussian basis sets for the computation of total atomization energies

    NASA Technical Reports Server (NTRS)

    Martin, J. M. L.

    1992-01-01

The total atomization energies of a number of molecules have been computed using an augmented coupled-cluster method and (5s4p3d2f1g) and (4s3p2d1f) atomic natural orbital (ANO) basis sets, as well as the correlation consistent valence triple zeta plus polarization (cc-pVTZ) and correlation consistent valence quadruple zeta plus polarization (cc-pVQZ) basis sets. The performance of the ANO and correlation consistent basis sets is comparable throughout, although the latter can result in significant CPU time savings. Whereas the inclusion of g functions has significant effects on the computed Sigma D(e) values, chemical accuracy is still not reached for molecules involving multiple bonds. A Gaussian-1 (G1) type correction lowers the error, but not much beyond the accuracy of the G1 model itself. Using separate corrections for sigma bonds, pi bonds, and valence pairs brings the mean absolute error down to less than 1 kcal/mol for the spdf basis sets, and to about 0.5 kcal/mol for the spdfg basis sets. Some conclusions on the success of the Gaussian-1 and Gaussian-2 models are drawn.

  7. STRESSOR DATA SETS FOR STUDYING SPECIES DIVERSITY AT LARGE SPATIAL SCALES

    EPA Science Inventory

There is increasing scientific and societal concern over the impact of anthropogenic activities (e.g., habitat destruction, pollution) on biodiversity. The impact of anthropogenic activities on biodiversity is generally recognized as a global phenomenon. At large spatial scales, se…

  8. Improved student engagement, satisfaction, and learning outcomes in a "flipped" large-lecture setting

    NASA Astrophysics Data System (ADS)

    Ward, A. S.; Bettis, E. A., III; Russell, J. E.; Van Horne, S.; Rocheford, M. K.; Sipola, M.; Colombo, M. R.

    2014-12-01

Large lecture courses are a traditional teaching practice at most large institutions of public higher education. They have historically provided an efficient way to deliver content to large numbers of students with the least amount of faculty resources. However, research on student learning indicates that the traditional lecture format does not provide the best learning experience; students learn better in active learning environments, in which they engage in meaningful learning activities rather than just listening. In this study, we compare two offerings of Introduction to Environmental Science, a large-lecture general education course, offered in two formats by the same instructors in subsequent years. In the first offering (Spring 2013) the course was offered as a traditional large-lecture course, with lectures to large audiences and a limited number of exams for assessment. In the second offering (Spring 2014), the course included small-group discussion periods, peer review of writing assignments, guest lectures, and online learning, with limited traditional lecture. Our primary objective was to quantify differences in student engagement and learning outcomes between the two course offerings. Results of our study show that students in the transformed course indicated higher interest, engagement, and satisfaction than students in the traditional lecture course. Furthermore, students in the transformed course reported increased behavioral, emotional, and cognitive engagement over those in the traditional course, and also increased satisfaction with the course.

  9. Addressing Methodological Challenges in Large Communication Data Sets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care.

    PubMed

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2016-07-01

In this article, we present strategies for collecting and coding a large longitudinal communication data set collected across multiple sites, consisting of more than 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication data sets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment; large-scale, multisite secure data management; the development of theoretically based communication coding; and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research. PMID:26580414

  10. A Controlled Trial of Active versus Passive Learning Strategies in a Large Group Setting

    ERIC Educational Resources Information Center

    Haidet, Paul; Morgan, Robert O.; O'Malley, Kimberly; Moran, Betty Jeanne; Richards, Boyd F.

    2004-01-01

    Objective: To compare the effects of active and didactic teaching strategies on learning- and process-oriented outcomes. Design: Controlled trial. Setting: After-hours residents' teaching session. Participants: Family and Community Medicine, Internal Medicine, and Pediatrics residents at two academic medical institutions. Interventions: We…

  11. MDH: A High Speed Multi-phase Dynamic Hash String Matching Algorithm for Large-Scale Pattern Set

    NASA Astrophysics Data System (ADS)

    Zhou, Zongwei; Xue, Yibo; Liu, Junda; Zhang, Wei; Li, Jun

String matching is one of the key technologies in numerous network security applications and systems. Nowadays, increasing network bandwidth and pattern set sizes both call for high-speed string matching algorithms for large-scale pattern sets. This paper proposes a novel algorithm called Multi-phase Dynamic Hash (MDH), which cuts down the memory requirement through multi-phase hashing and exploits pattern set information to speed up the search procedure through dynamic-cut heuristics. The experimental results demonstrate that MDH can improve matching performance by 100% to 300% compared with other popular algorithms, while the memory requirement stays at a comparatively low level.
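As a hedged illustration of the hash-bucket idea behind such matchers (a minimal single-phase sketch, not the MDH algorithm itself; `build_index`, `search`, and the window width `w` are invented, and all patterns are assumed to be at least `w` characters long):

```python
def build_index(patterns, w=2):
    """Group patterns by a hash of their first w characters (one 'phase')."""
    index = {}
    for p in patterns:
        index.setdefault(hash(p[:w]), []).append(p)
    return index

def search(text, index, w=2):
    """Slide a w-character window over text; verify only the patterns whose
    prefix hash matches the window, instead of every pattern at every offset."""
    hits = []
    for i in range(len(text) - w + 1):
        for p in index.get(hash(text[i:i + w]), []):
            if text.startswith(p, i):
                hits.append((i, p))
    return hits
```

MDH layers several such hash phases and dynamic-cut heuristics on top of this basic filtering step to keep memory low for very large pattern sets.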

  12. Training Health Service Technicians as Teacher Assistants in an Inpatient Residential Emotional/Behavior Disorder Classroom Setting

    ERIC Educational Resources Information Center

    Banks, Walter E.

    2012-01-01

    Schools have identified that the use of Teacher Assistants often provides needed additional support in the school setting. In a Health Care Facility that provides inpatient psychiatric services, children ages 5-14 are required to engage in school activities. Currently there are no Teacher Assistants trained in the facility. This study focuses on…

  13. Adaptation of Bharatanatyam Dance Pedagogy for Multicultural Classrooms: Questions and Relevance in a North American University Setting

    ERIC Educational Resources Information Center

    Banerjee, Suparna

    2013-01-01

    This article opens up questions around introducing Bharatanatyam, a form of Indian classical dance, to undergraduate learners within a North American university setting. The aim is to observe how the learners understood and received a particular cultural practice and to explore issues related to learning goals, curriculum content, approaches to…

  14. An Efficient Algorithm for Discovering Motifs in Large DNA Data Sets.

    PubMed

    Yu, Qiang; Huo, Hongwei; Chen, Xiaoyang; Guo, Haitao; Vitter, Jeffrey Scott; Huan, Jun

    2015-07-01

Planted (l,d) motif discovery has been successfully used to locate transcription factor binding sites in dozens of promoter sequences over the past decade. However, not enough work has been done on identifying (l,d) motifs in next-generation sequencing (ChIP-seq) data sets, which contain thousands of input sequences and thereby pose a new challenge: making a good identification in reasonable time. To meet this need, we propose a new planted (l,d) motif discovery algorithm named MCES, which identifies motifs by mining and combining emerging substrings. Specifically, to handle larger data sets, we design a MapReduce-based strategy to mine emerging substrings in a distributed fashion. Experimental results on simulated data show that i) MCES is able to identify (l,d) motifs efficiently and effectively in thousands to millions of input sequences, and runs faster than state-of-the-art (l,d) motif discovery algorithms such as F-motif and TraverStringsR; ii) MCES is able to identify motifs without known lengths, and has better identification accuracy than the competing algorithm CisFinder. The validity of MCES is also tested on real data sets. MCES is freely available at http://sites.google.com/site/feqond/mces. PMID:25872217
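The notion of an "emerging substring" (an l-mer markedly over-represented in the input sequences relative to a background set) can be sketched as follows; this is a hedged toy illustration, not the MCES mining step, and the function name and ratio threshold are invented:

```python
from collections import Counter

def emerging_substrings(pos_seqs, neg_seqs, l, ratio=2.0):
    """Return l-mers whose count in pos_seqs is at least `ratio` times their
    (smoothed) count in neg_seqs -- a toy notion of 'emerging'."""
    def lmers(seqs):
        c = Counter()
        for s in seqs:
            for i in range(len(s) - l + 1):
                c[s[i:i + l]] += 1
        return c
    pos, neg = lmers(pos_seqs), lmers(neg_seqs)
    # +1 smoothing avoids division-by-zero for l-mers absent from background
    return {m for m in pos if pos[m] >= ratio * (neg.get(m, 0) + 1)}
```

In a MapReduce setting of the kind the abstract describes, the l-mer counting would be the map step and the per-substring aggregation the reduce step.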

  15. An “Electronic Fluorescent Pictograph” Browser for Exploring and Analyzing Large-Scale Biological Data Sets

    PubMed Central

    Nahal, Hardeep; Ammar, Ron; Wilson, Greg V.; Provart, Nicholas J.

    2007-01-01

    Background The exploration of microarray data and data from other high-throughput projects for hypothesis generation has become a vital aspect of post-genomic research. For the non-bioinformatics specialist, however, many of the currently available tools provide overwhelming amounts of data that are presented in a non-intuitive way. Methodology/Principal Findings In order to facilitate the interpretation and analysis of microarray data and data from other large-scale data sets, we have developed a tool, which we have dubbed the electronic Fluorescent Pictograph – or eFP – Browser, available at http://www.bar.utoronto.ca/, for exploring microarray and other data for hypothesis generation. This eFP Browser engine paints data from large-scale data sets onto pictographic representations of the experimental samples used to generate the data sets. We give examples of using the tool to present Arabidopsis gene expression data from the AtGenExpress Consortium (Arabidopsis eFP Browser), data for subcellular localization of Arabidopsis proteins (Cell eFP Browser), and mouse tissue atlas microarray data (Mouse eFP Browser). Conclusions/Significance The eFP Browser software is easily adaptable to microarray or other large-scale data sets from any organism and thus should prove useful to a wide community for visualizing and interpreting these data sets for hypothesis generation. PMID:17684564

  16. Learning through Discussions: Comparing the Benefits of Small-Group and Large-Class Settings

    ERIC Educational Resources Information Center

    Pollock, Philip H.; Hamann, Kerstin; Wilson, Bruce M.

    2011-01-01

    The literature on teaching and learning heralds the benefits of discussion for student learner outcomes, especially its ability to improve students' critical thinking skills. Yet, few studies compare the effects of different types of face-to-face discussions on learners. Using student surveys, we analyze the benefits of small-group and large-class…

  18. A posteriori correction of camera characteristics from large image data sets

    PubMed Central

    Afanasyev, Pavel; Ravelli, Raimond B. G.; Matadeen, Rishi; De Carlo, Sacha; van Duinen, Gijs; Alewijnse, Bart; Peters, Peter J.; Abrahams, Jan-Pieter; Portugal, Rodrigo V.; Schatz, Michael; van Heel, Marin

    2015-01-01

Large datasets are emerging in many fields of image processing, including electron microscopy, light microscopy, medical X-ray imaging, and astronomy. Novel computer-controlled instrumentation facilitates the collection of very large datasets containing thousands of individual digital images. In single-particle cryogenic electron microscopy (“cryo-EM”), for example, large datasets are required for achieving quasi-atomic resolution structures of biological complexes. Based on the collected data alone, large datasets allow us to precisely determine the statistical properties of the imaging sensor on a pixel-by-pixel basis, independent of any “a priori” normalization routinely applied to the raw image data during collection (“flat-field correction”). Our straightforward “a posteriori” correction yields clean linear images, as can be verified by Fourier Ring Correlation (FRC), illustrating the statistical independence of the corrected images over all spatial frequencies. The image sensor characteristics can also be measured continuously and used for correcting upcoming images. PMID:26068909
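The pixel-by-pixel statistics described can be sketched as follows (a minimal illustration under the assumption that frames arrive as a NumPy stack; `pixel_stats` and `correct_frame` are invented names, not the authors' software):

```python
import numpy as np

def pixel_stats(stack):
    """Per-pixel mean and standard deviation over a stack of frames (N, H, W)."""
    return stack.mean(axis=0), stack.std(axis=0)

def correct_frame(frame, mean, std):
    """Normalize one frame by the estimated per-pixel sensor response;
    dead pixels (zero variance) are left unscaled."""
    return (frame - mean) / np.where(std > 0, std, 1.0)
```

With thousands of frames, these per-pixel estimates become precise enough to replace the a priori flat-field correction, which is the point the abstract makes.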

  19. The PRRS Host Genomic Consortium (PHGC) Database: Management of large data sets.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In any consortium project where large amounts of phenotypic and genotypic data are collected across several research labs, issues arise with maintenance and analysis of datasets. The PRRS Host Genomic Consortium (PHGC) Database was developed to meet this need for the PRRS research community. The sch...

  20. Design of Availability-Dependent Distributed Services in Large-Scale Uncooperative Settings

    ERIC Educational Resources Information Center

    Morales, Ramses Victor

    2009-01-01

    Thesis Statement: "Availability-dependent global predicates can be efficiently and scalably realized for a class of distributed services, in spite of specific selfish and colluding behaviors, using local and decentralized protocols". Several types of large-scale distributed systems spanning the Internet have to deal with availability variations…

  1. Psychology in an Interdisciplinary Setting: A Large-Scale Project to Improve University Teaching

    ERIC Educational Resources Information Center

    Koch, Franziska D.; Vogt, Joachim

    2015-01-01

    At a German university of technology, a large-scale project was funded as a part of the "Quality Pact for Teaching", a programme launched by the German Federal Ministry of Education and Research to improve the quality of university teaching and study conditions. The project aims at intensifying interdisciplinary networking in teaching,…

  2. Use of Large-Scale Data Sets to Study Educational Pathways of American Indian and Alaska Native Students

    ERIC Educational Resources Information Center

    Faircloth, Susan C.; Alcantar, Cynthia M.; Stage, Frances K.

    2014-01-01

This chapter discusses issues and challenges encountered in using large-scale data sets to study educational experiences and subsequent outcomes for American Indian and Alaska Native (AI/AN) students. In this chapter, we argue that the linguistic and cultural diversity of Native peoples, coupled with the legal and political ways in which education…

  4. Caught you: threats to confidentiality due to the public release of large-scale genetic data sets

    PubMed Central

    2010-01-01

Background Large-scale genetic data sets are frequently shared with other research groups and even released on the Internet to allow for secondary analysis. Study participants are usually not informed about such data sharing because data sets are assumed to be anonymous after stripping off personal identifiers. Discussion The assumption of anonymity of genetic data sets, however, is tenuous because genetic data are intrinsically self-identifying. Two types of re-identification are possible: the "Netflix" type and the "profiling" type. The "Netflix" type needs another small genetic data set, usually with fewer than 100 SNPs but including a personal identifier. This second data set might originate from another clinical examination, a study of leftover samples, or forensic testing. When merged with the primary, unidentified set, it will re-identify all samples of that individual. Even with no second data set at hand, a "profiling" strategy can be developed to extract as much information as possible from a sample collection. Starting with the identification of ethnic subgroups along with predictions of body characteristics and diseases, the asthma kids case is used as a real-life example to illustrate that approach. Summary Depending on the degree of supplemental information, there is a good chance that at least a few individuals can be identified from an anonymized data set. Any re-identification, however, may potentially harm study participants because it will release individual genetic disease risks to the public. PMID:21190545

  5. Prediction of the adsorption capability onto activated carbon of a large data set of chemicals by local lazy regression method

    NASA Astrophysics Data System (ADS)

    Lei, Beilei; Ma, Yimeng; Li, Jiazhong; Liu, Huanxiang; Yao, Xiaojun; Gramatica, Paola

    2010-08-01

Accurate quantitative structure-property relationship (QSPR) models based on a large data set containing a total of 3483 organic compounds were developed to predict chemicals' adsorption capability onto activated carbon in the gas phase. Both a global multiple linear regression (MLR) method and a local lazy regression (LLR) method were used to develop QSPR models. The results proved that LLR has prediction accuracy 10% higher than that of the MLR model. By applying the LLR method we can predict the test set (787 compounds) with a Q2ext of 0.900 and a root mean square error (RMSE) of 0.129. An accurate model based on this large data set could be useful for predicting the adsorption properties of new compounds, since such a model covers a highly diverse structural space.
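Local lazy regression fits a small model around each query compound instead of one global model. A minimal sketch, assuming a k-nearest-neighbour local linear fit (the function and parameter names are invented for illustration, not taken from the paper):

```python
import numpy as np

def llr_predict(X, y, x_query, k=5):
    """Local lazy regression: fit a least-squares linear model on the k
    nearest training neighbours of the query, predict for the query only."""
    d = np.linalg.norm(X - x_query, axis=1)   # distances in descriptor space
    idx = np.argsort(d)[:k]                   # k nearest training compounds
    A = np.hstack([X[idx], np.ones((k, 1))])  # add intercept column
    coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    return float(np.append(x_query, 1.0) @ coef)
```

Because each prediction uses only nearby compounds, the local model can track structure-property trends that a single global MLR fit averages away, which is consistent with the accuracy gain the abstract reports.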

  6. [Fast multi-resolution volume rendering method of large medical data sets].

    PubMed

    Li, Bin; Tian, Lianfang; Chen, Ping; Wang, Lifei; Mao, Zongyuan

    2008-10-01

A Fast Multi-resolution Volume Rendering Method (FMVRM) based on wavelets and Shear-Warp is herein proposed. In this method, the medical volume data is first compressed using a wavelet transform. Then, at the chosen resolution, the data is decompressed, guided by an opacity transfer function (OTF). Finally, the 3D medical image is reconstructed on the basis of Shear-Warp using a block-based run-length-encoded (BRLE) data structure, in which the aliasing artifacts resulting from under-sampling in Shear-Warp are avoided by pre-integrated volume rendering. Experiments demonstrate the good performance of the proposed method. PMID:19024471

  7. Generating mock data sets for large-scale Lyman-α forest correlation measurements

    SciTech Connect

    Font-Ribera, Andreu; McDonald, Patrick; Miralda-Escudé, Jordi E-mail: pvmcdonald@lbl.gov

    2012-01-01

Massive spectroscopic surveys of high-redshift quasars yield large numbers of correlated Lyα absorption spectra that can be used to measure large-scale structure. Simulations of these surveys are required to accurately interpret the measurements of correlations and to correct for systematic errors. An efficient method to generate mock realizations of Lyα forest surveys is presented, which generates a field over the lines of sight to the survey sources only, instead of having to generate it over the entire three-dimensional volume of the survey. The method can be calibrated to reproduce the power spectrum and one-point distribution function of the transmitted flux fraction, as well as the redshift evolution of these quantities, and is easily used for modeling any survey systematic effects. We present an example of how these mock surveys are applied to predict the measurement errors in a survey with parameters similar to those of the BOSS quasar survey in SDSS-III.
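As a hedged toy sketch of generating a correlated field along a single line of sight (a lognormal transform with an assumed Gaussian correlation function; the paper instead calibrates to a target flux power spectrum and PDF, and `mock_skewer` and its parameters are invented):

```python
import numpy as np

def mock_skewer(npix, dv, corr_len, seed=0):
    """Draw one correlated Gaussian 'skewer' and map it to a transmitted-flux
    fraction via a lognormal transform (toy model, not the paper's method)."""
    rng = np.random.default_rng(seed)
    x = np.arange(npix) * dv
    # pixel-pair covariance, then a Cholesky factor to impose the correlations
    C = np.exp(-0.5 * ((x[:, None] - x[None, :]) / corr_len) ** 2)
    L = np.linalg.cholesky(C + 1e-8 * np.eye(npix))
    delta = L @ rng.standard_normal(npix)
    return np.exp(-np.exp(delta))  # optical depth > 0, so flux lies in (0, 1)
```

Working pixel-by-pixel along each line of sight, rather than filling a 3D box, is the efficiency idea the abstract describes; cross-sight correlations require the joint covariance of all skewers.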

  8. High-throughput film-densitometry: An efficient approach to generate large data sets

    SciTech Connect

    Typke, Dieter; Nordmeyer, Robert A.; Jones, Arthur; Lee, Juyoung; Avila-Sakar, Agustin; Downing, Kenneth H.; Glaeser, Robert M.

    2004-07-14

A film-handling machine (robot) has been built which can, in conjunction with a commercially available film densitometer, exchange and digitize over 300 electron micrographs per day. Implementation of robotic film handling effectively eliminates the delay and tedium associated with digitizing images when data are initially recorded on photographic film. The modulation transfer function (MTF) of the commercially available densitometer is significantly worse than that of a high-end, scientific microdensitometer. Nevertheless, its signal-to-noise ratio (S/N) is quite excellent, allowing substantial restoration of the output to "near-to-perfect" performance. Due to the large area of the standard electron microscope film that can be digitized by the commercial densitometer (up to 10,000 x 13,680 pixels with an appropriately coded holder), automated film digitization offers a fast and inexpensive alternative to high-end CCD cameras as a means of acquiring large amounts of image data in electron microscopy.
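The restoration step mentioned here can be sketched as a Wiener-style filter that divides the image spectrum by the measured MTF while damping frequencies where the signal is weak. This is a hedged illustration, not the authors' processing chain; `wiener_restore`, `mtf`, and `snr` are invented names, and the MTF is assumed to be sampled on the same FFT grid as the image:

```python
import numpy as np

def wiener_restore(image, mtf, snr=100.0):
    """Wiener-style restoration: divide the image spectrum by the sensor MTF,
    damped where the MTF (and hence the signal-to-noise ratio) is small."""
    F = np.fft.fft2(image)
    W = np.conj(mtf) / (np.abs(mtf) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(F * W))
```

The high S/N of the densitometer is what makes such a deconvolution stable enough to compensate for its poorer MTF.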

  9. Parallel k-Means Clustering for Quantitative Ecoregion Delineation Using Large Data Sets

    SciTech Connect

    Kumar, Jitendra; Mills, Richard T; Hoffman, Forrest M; HargroveJr., William Walter

    2011-01-01

Identification of geographic ecoregions has long been of interest to environmental scientists and ecologists for identifying regions of similar ecological and environmental conditions. Such classifications are important for predicting suitable species ranges, for stratification of ecological samples, and to help prioritize habitat preservation and remediation efforts. Hargrove and Hoffman (1999, 2009) have developed geographical spatio-temporal clustering algorithms and codes and have successfully applied them to a variety of environmental science domains, including ecological regionalization; environmental monitoring network design; analysis of satellite-, airborne-, and ground-based remote sensing; and climate model-model and model-measurement intercomparison. With the advances in state-of-the-art satellite remote sensing and climate models, observations and model outputs are available at increasingly high spatial and temporal resolutions. Long time series of these high-resolution datasets are extremely large in size and growing. Analysis and knowledge extraction from these large datasets are not just algorithmic and ecological problems, but also pose a complex computational problem. This paper focuses on the development of a massively parallel multivariate geographical spatio-temporal clustering code for analysis of very large datasets using tens of thousands of processors on one of the fastest supercomputers in the world.
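The clustering kernel itself is standard k-means; here is a hedged serial sketch (in NumPy rather than the authors' massively parallel code; `kmeans` and its parameters are invented for illustration), with comments marking which steps the parallel version distributes:

```python
import numpy as np

def kmeans(data, k, iters=20, seed=0):
    """Plain k-means over rows of `data` (n_obs, n_features)."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        # assignment step: nearest centre per observation -- embarrassingly
        # parallel over chunks of rows (one chunk per processor)
        d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # update step: per-cluster means, a global reduction across chunks
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    return labels, centers
```

For ecoregionalization, each row would be a map cell's vector of environmental variables, and the resulting clusters are the candidate ecoregions.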

  10. Large-Eddy Simulation of Premixed and Partially Premixed Turbulent Combustion Using a Level Set Method

    NASA Astrophysics Data System (ADS)

    Duchamp de Lageneste, Laurent; Pitsch, Heinz

    2001-11-01

    Level-set methods (G-equation) have recently been used in the context of RANS to model turbulent premixed (Hermann 2000) or partially premixed (Chen 1999) combustion. By directly taking unsteady effects into account, LES can be expected to improve predictions over RANS. Since the reaction zone thickness of premixed flames in technical devices is usually much smaller than the LES grid spacing, chemical reactions occur entirely on the sub-grid scales and hence must be modeled in full. In the level-set methodology, the flame front is represented by an arbitrary iso-surface G = G0 of a scalar field G whose evolution is described by the so-called G-equation. This equation is valid only at G = G0, and is hence decoupled from the other G levels. Heat release is then modeled using a flamelet approach in which temperature is determined as a function of G and the mixture fraction Z. In the present study, the proposed approach has been formulated for LES and validated using data from a turbulent Bunsen burner experiment (Chen, Peters 1996). Simulations of an experimental Lean Premixed Prevapourised (LPP) dump combustor (Besson, Bruel 1999, 2000) under different premixed or partially premixed conditions will also be presented.
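
    The kinematics behind the G-equation can be seen in a one-dimensional toy problem, where an iso-surface of a signed-distance field G propagates at the laminar burning velocity s_L. The grid, speed and time step below are our illustrative choices, not the paper's LES formulation:

```python
# 1-D caricature of the G-equation, dG/dt = -s_L * |dG/dx|: the flame front
# is the zero crossing of G and should advance at the burning velocity s_L.
n, dx, dt, s_l = 101, 0.01, 0.001, 0.1
x = [i * dx for i in range(n)]
g = [xi - 0.2 for xi in x]              # signed distance; front at x = 0.2

for _ in range(2000):                   # integrate to t = 2.0
    # Central differences inside the domain, one-sided at the boundaries.
    grad = [(g[min(i + 1, n - 1)] - g[max(i - 1, 0)]) /
            ((min(i + 1, n - 1) - max(i - 1, 0)) * dx) for i in range(n)]
    g = [gi - dt * s_l * abs(gr) for gi, gr in zip(g, grad)]

# Linear interpolation of the zero crossing: expect 0.2 + s_L * 2.0 = 0.4.
i = next(j for j in range(n - 1) if g[j] <= 0.0 < g[j + 1])
front = x[i] + dx * (-g[i]) / (g[i + 1] - g[i])
```

    Because G is initialized as a signed distance (|dG/dx| = 1), the front moves at exactly s_L, which makes the scheme easy to verify.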

  11. Latest developments in the display of large-scale ionospheric and thermospheric data sets

    NASA Technical Reports Server (NTRS)

    Sojka, J. J.

    1992-01-01

    Over the past decade, database sizes have continually increased and will continue to do so in the future. This problem of size is further compounded because the trend in present-day studies is to use data from many different locations and different instruments and then compare them with data from global-scale physical models. The latter produce databases of comparable, if not larger, size. Much of the data can be viewed as 'image' time sequences and is most readily viewed on color display terminals. These data sets reside in national or owner-generated databases linked together by computer networks. As sizes increase, just moving the data around, taking 'quick looks' at it, or even storing it locally becomes a severe problem that compromises the scientific return from the data. Is present-day technology being used in the best way with these analysis techniques? What are the prospects for reducing the storage and transmission size of the data sets? Examples of such problems and potential solutions are described in this paper.

  12. Options in Education, Transcript for February 16, 1976: National Commitment to Equal Rights & Equal Educational Opportunity, Racial Conflict in the Classroom, Setting Up a Publishing Business, and Women in Education (Mathematics and Sex).

    ERIC Educational Resources Information Center

    George Washington Univ., Washington, DC. Inst. for Educational Leadership.

    "Options in Education" is a radio news program which focuses on issues and developments in education. This transcript contains discussions of the national commitment to desegregated education, racial conflict in the classroom, learning how to set up a publishing business, women in education (mathematics and sex) and education news highlights.…

  13. Validating hierarchical verbal autopsy expert algorithms in a large data set with known causes of death

    PubMed Central

    Kalter, Henry D; Perin, Jamie; Black, Robert E

    2016-01-01

    Background Physician assessment historically has been the most common method of analyzing verbal autopsy (VA) data. Recently, the World Health Organization endorsed two automated methods, Tariff 2.0 and InterVA-4, which promise greater objectivity and lower cost. A disadvantage of the Tariff method is that it requires a training data set from a prior validation study, while InterVA relies on clinically specified conditional probabilities. We undertook to validate the hierarchical expert algorithm analysis of VA data, an automated, intuitive, deterministic method that does not require a training data set. Methods Using Population Health Metrics Research Consortium study hospital source data, we compared the primary causes of 1629 neonatal and 1456 1–59-month-old child deaths from VA expert algorithms arranged in a hierarchy to their reference standard causes. The expert algorithms were held constant, while five prior and one new “compromise” neonatal hierarchy, and three former child hierarchies, were tested. For each comparison, the reference standard data were resampled 1000 times within the range of cause-specific mortality fractions (CSMF) for one of three approximated community scenarios in the 2013 WHO global causes of death, plus one random mortality cause proportions scenario. We utilized CSMF accuracy to assess overall population-level validity, and the absolute difference between VA and reference standard CSMFs to examine particular causes. Chance-corrected concordance (CCC) and Cohen’s kappa were used to evaluate individual-level cause assignment. Results Overall CSMF accuracy for the best-performing expert algorithm hierarchy was 0.80 (range 0.57–0.96) for neonatal deaths and 0.76 (0.50–0.97) for child deaths. Performance for particular causes of death varied, with fairly flat estimated CSMF over a range of reference values for several causes. Performance at the individual diagnosis level was also less favorable than that for overall CSMF (neonatal: best CCC = 0.23, range 0.16–0.33; best kappa = 0.29, 0.23–0.35; child: best CCC = 0.40, 0.19–0.45; best kappa = 0.29, 0.07–0.35). Conclusions Expert algorithms in a hierarchy offer an accessible, automated method for assigning VA causes of death. Overall population-level accuracy is similar to that of more complex machine learning methods, but without the need for a training data set from a prior validation study.
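
    The population-level metric quoted above, CSMF accuracy, is conventionally defined as one minus the total absolute CSMF error scaled by its maximum possible value. A small sketch with invented cause fractions:

```python
def csmf_accuracy(true, pred):
    """CSMF accuracy = 1 - sum_j |pred_j - true_j| / (2 * (1 - min_j true_j)).
    Equals 1 for a perfect match and 0 for the worst possible estimate."""
    err = sum(abs(pred[c] - true[c]) for c in true)
    return 1.0 - err / (2.0 * (1.0 - min(true.values())))

# Hypothetical three-cause example; fractions in each set sum to 1.
true = {"sepsis": 0.5, "asphyxia": 0.3, "prematurity": 0.2}
pred = {"sepsis": 0.4, "asphyxia": 0.4, "prematurity": 0.2}
acc = csmf_accuracy(true, pred)  # err = 0.2, min = 0.2, so 1 - 0.2/1.6 = 0.875
```

    The denominator is the largest total error any estimate could make against the same true fractions, which is what makes values from different resampled scenarios comparable.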

  14. Using large clinical data sets to infer pathogenicity for rare copy number variants in autism cohorts

    PubMed Central

    Moreno-De-Luca, D; Sanders, S J; Willsey, A J; Mulle, J G; Lowe, J K; Geschwind, D H; State, M W; Martin, C L; Ledbetter, D H

    2013-01-01

    Copy number variants (CNVs) have a major role in the etiology of autism spectrum disorders (ASD), and several of these have reached statistical significance in case–control analyses. Nevertheless, current ASD cohorts are not large enough to detect very rare CNVs that may be causative or contributory (that is, risk alleles). Here, we use a tiered approach, in which clinically significant CNVs are first identified in large clinical cohorts of neurodevelopmental disorders (including but not specific to ASD), after which these CNVs are then systematically identified within well-characterized ASD cohorts. We focused our initial analysis on 48 recurrent CNVs (segmental duplication-mediated ‘hotspots') from 24 loci in 31 516 published clinical cases with neurodevelopmental disorders and 13 696 published controls, which yielded a total of 19 deletion CNVs and 11 duplication CNVs that reached statistical significance. We then investigated the overlap of these 30 CNVs in a combined sample of 3955 well-characterized ASD cases from three published studies. We identified 73 deleterious recurrent CNVs, including 36 deletions from 11 loci and 37 duplications from seven loci, for a frequency of 1 in 54; had we considered the ASD cohorts alone, only 58 CNVs from eight loci (24 deletions from three loci and 34 duplications from five loci) would have reached statistical significance. In conclusion, until there are sufficiently large ASD research cohorts with enough power to detect very rare causative or contributory CNVs, data from larger clinical cohorts can be used to infer the likely clinical significance of CNVs in ASD. PMID:23044707

  15. Using Classroom-Based Assessment on a Large Scale: Supporting and Reporting on Student Learning with the Early Literacy Profile.

    ERIC Educational Resources Information Center

    Falk, Beverly; Ort, Suzanna Wichterle; Moirs, Katie

    This paper describes the development work and research findings of an initiative to create a statewide literacy assessment in New York to inform teaching and learning and report on group performance trends. The Early Literacy Profile (ELP) is a classroom-based, standards-referenced performance assessment for students in the primary grades.

  16. Using a Classroom Response System for Promoting Interaction to Teaching Mathematics to Large Groups of Undergraduate Students

    ERIC Educational Resources Information Center

    Morais, Adolfo; Barragués, José Ignacio; Guisasola, Jenaro

    2015-01-01

    This work describes the design and evaluation of a proposal to use Classroom Response Systems (CRS), intended to promote participative Mathematics classes at university. The proposal is based on Problem-Based Learning (PBL) and uses Robert's six hypotheses for mathematical teaching-learning. The results show that PBL is a relevant strategy to…

  17. The Impact of Mobile Learning on Students' Learning Behaviours and Performance: Report from a Large Blended Classroom

    ERIC Educational Resources Information Center

    Wang, Minjuan; Shen, Ruimin; Novak, Daniel; Pan, Xiaoyan

    2009-01-01

    Chinese classrooms, whether on school grounds or online, have long suffered from a lack of interactivity. Many online classes simply provide recorded instructor lectures, which only reinforces the negative effects of passive nonparticipatory learning. At Shanghai Jiaotong University, researchers and developers actively seek technologic…

  18. Independent Principal Component Analysis for biologically meaningful dimension reduction of large biological data sets

    PubMed Central

    2012-01-01

    Background A key question when analyzing high-throughput data is whether the information provided by the measured biological entities (gene or metabolite expression, for example) is related to the experimental conditions or, rather, to some interfering signals, such as experimental bias or artefacts. Visualization tools are therefore useful for better understanding the underlying structure of the data in a 'blind' (unsupervised) way. A well-established technique to do so is Principal Component Analysis (PCA). PCA is particularly powerful if the biological question is related to the highest variance. Independent Component Analysis (ICA) has been proposed as an alternative to PCA, as it optimizes an independence condition to give more meaningful components. However, neither PCA nor ICA can overcome both the high dimensionality and the noisy characteristics of biological data. Results We propose Independent Principal Component Analysis (IPCA), which combines the advantages of both PCA and ICA. It uses ICA as a denoising process on the loading vectors produced by PCA to better highlight the important biological entities and reveal insightful patterns in the data. The result is a better clustering of the biological samples on graphical representations. In addition, a sparse version is proposed that performs an internal variable selection to identify biologically relevant features (sIPCA). Conclusions On simulation studies and real data sets, we show that IPCA offers a better visualization of the data than ICA and with a smaller number of components than PCA. Furthermore, a preliminary investigation of the list of genes selected with sIPCA demonstrates that the approach highlights relevant genes in the data with respect to the biological experiment. IPCA and sIPCA are both implemented in the R package mixOmics, dedicated to the analysis and exploration of high-dimensional biological data sets, and on the mixOmics web interface. PMID:22305354

  19. A new tool called DISSECT for analysing large genomic data sets using a Big Data approach

    PubMed Central

    Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert

    2015-01-01

    Large-scale genetic and genomic data are increasingly available and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex traits analysis, we present DISSECT. DISSECT is a new and freely available software that is able to exploit the distributed-memory parallel computational architectures of compute clusters, to perform a wide range of genomic and epidemiologic analyses, which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010

  1. Spatial and temporal data integration in large scale hydrology: a gridded hydrometeorological data set for Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Wong, C. L.; Jamil, A. B. M.; Venneker, R.; Uhlenbrook, S.

    2009-04-01

    The integration of spatially and temporally varying data is an important step in formulating and generalizing the large-scale relationships and feedbacks between atmospheric and hydrological processes. In this contribution, we present a moderate-resolution surface hydrometeorological data set for Peninsular Malaysia. The data set is gridded from daily observation data on a 0.05-degree (~5.5 km) grid for 1975-2005. The parameters include rainfall, temperature, pressure, humidity, wind speed and downward radiation. An overview of the integration and processing of the various data sources, together with a data assessment, is also presented.
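
    Gridding scattered daily station records onto a regular mesh of this kind is often done with inverse-distance weighting. A minimal sketch, with hypothetical station coordinates and values (the paper's actual interpolation scheme may differ):

```python
def idw(stations, grid_point, power=2.0):
    """Inverse-distance-weighted estimate at grid_point from
    (lon, lat, value) station tuples; an exact hit returns the station value."""
    num = den = 0.0
    for lon, lat, val in stations:
        d2 = (lon - grid_point[0]) ** 2 + (lat - grid_point[1]) ** 2
        if d2 == 0.0:
            return val
        w = d2 ** (-power / 2.0)    # weight = 1 / distance**power
        num += w * val
        den += w
    return num / den

# Daily rainfall (mm) at three hypothetical stations around one grid node.
obs = [(101.00, 3.00, 10.0), (101.10, 3.00, 20.0), (101.05, 3.10, 30.0)]
value = idw(obs, (101.05, 3.05))
```

    Repeating this estimate for every node of the 0.05-degree mesh, for each day and variable, yields a gridded field of the kind described above.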

  2. A hybrid structure for the storage and manipulation of very large spatial data sets

    USGS Publications Warehouse

    Peuquet, Donna J.

    1982-01-01

    The map data input and output problem for geographic information systems is rapidly diminishing with the increasing availability of mass digitizing, direct spatial data capture and graphics hardware based on raster technology. Although a large number of efficient raster-based algorithms exist for performing a wide variety of common tasks on these data, there are a number of procedures which are more efficiently performed in vector mode or for which raster mode equivalents of current vector-based techniques have not yet been developed. This paper presents a hybrid spatial data structure, named the 'vaster' structure, which can utilize the advantages of both raster and vector structures while potentially eliminating, or greatly reducing, the need for raster-to-vector and vector-to-raster conversion. Other advantages of the vaster structure are also discussed.

  3. Estimation of melting points of large set of persistent organic pollutants utilizing QSPR approach.

    PubMed

    Watkins, Marquita; Sizochenko, Natalia; Rasulev, Bakhtiyor; Leszczynski, Jerzy

    2016-03-01

    The presence of polyhalogenated persistent organic pollutants (POPs), such as Cl/Br-substituted benzenes, biphenyls, diphenyl ethers, and naphthalenes, has been identified in all environmental compartments. Exposure to these compounds can pose potential risk not only to ecological systems but also to human health. Therefore, efficient tools for comprehensive environmental risk assessment of POPs are required. Among the factors vital to environmental transport and fate processes is the melting point of a compound. In this study, we estimated the melting points of a large group (1419 compounds) of chloro- and bromo-derivatives of dibenzo-p-dioxins, dibenzofurans, biphenyls, naphthalenes, diphenyl ethers, and benzenes by utilizing quantitative structure-property relationship (QSPR) techniques. The compounds were classified by applying structure-based clustering methods followed by GA-PLS modeling. In addition, the random forest method was applied to develop more general models. Factors responsible for melting point behavior and the predictive ability of each method are discussed. PMID:26874948

  4. Processing large sensor data sets for safeguards : the knowledge generation system.

    SciTech Connect

    Thomas, Maikel A.; Smartt, Heidi Anne; Matthews, Robert F.

    2012-04-01

    Modern nuclear facilities, such as reprocessing plants, present inspectors with significant challenges due in part to the sheer amount of equipment that must be safeguarded. The Sandia-developed and patented Knowledge Generation system was designed to automatically analyze large amounts of safeguards data to identify anomalous events of interest by comparing sensor readings with those expected from a process of interest and operator declarations. This paper describes a demonstration of the Knowledge Generation system using simulated accountability tank sensor data to represent part of a reprocessing plant. The demonstration indicated that Knowledge Generation has the potential to address several problems critical to the future of safeguards. It could be extended to facilitate remote inspections and trigger random inspections. Knowledge Generation could analyze data to establish trust hierarchies, to facilitate safeguards use of operator-owned sensors.
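
    The core comparison the system automates (sensor readings versus values expected from operator declarations) can be caricatured in a few lines; the tolerance and tank-level trace below are invented for illustration:

```python
def flag_anomalies(readings, expected, tolerance):
    """Return the time indices at which a sensor reading deviates from the
    operator-declared expectation by more than the allowed tolerance."""
    return [t for t, (obs, exp) in enumerate(zip(readings, expected))
            if abs(obs - exp) > tolerance]

# Declared accountability-tank level (litres) vs. a simulated sensor trace,
# with one diversion-like dip at time step 3.
declared = [100, 100, 95, 90, 85, 85]
measured = [100, 101, 94, 80, 84, 85]
events = flag_anomalies(measured, declared, tolerance=3.0)
```

    The real system layers process models and trust hierarchies on top of this basic declaration-versus-measurement check, but the flagged indices are the "anomalous events of interest" that an inspector would review.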

  5. The coming of age of phosphoproteomics--from large data sets to inference of protein functions.

    PubMed

    Roux, Philippe P; Thibault, Pierre

    2013-12-01

    Protein phosphorylation is one of the most common post-translational modifications used in signal transduction to control cell growth, proliferation, and survival in response to both intracellular and extracellular stimuli. This modification is finely coordinated by a network of kinases and phosphatases that recognize unique sequence motifs and/or mediate their functions through scaffold and adaptor proteins. Detailed information on the nature of kinase substrates and site-specific phosphoregulation is required to better understand their pathophysiological roles. Recent advances in affinity chromatography and mass spectrometry (MS) sensitivity have enabled the large-scale identification and profiling of protein phosphorylation, but appropriate follow-up experiments are required to ascertain the functional significance of identified phosphorylation sites. In this review, we present meaningful technical details for MS-based phosphoproteomic analyses and describe important considerations for the selection of model systems and the functional characterization of identified phosphorylation sites. PMID:24037665

  6. Setting up a Rayleigh Scattering Based Flow Measuring System in a Large Nozzle Testing Facility

    NASA Technical Reports Server (NTRS)

    Panda, Jayanta; Gomez, Carlos R.

    2002-01-01

    A molecular Rayleigh scattering based air density measurement system has been built in a large nozzle testing facility at NASA Glenn Research Center. The technique depends on the light scattering by gas molecules present in air; no artificial seeding is required. Light from a single mode, continuous wave laser was transmitted to the nozzle facility by optical fiber, and light scattered by gas molecules, at various points along the laser beam, is collected and measured by photon-counting electronics. By placing the laser beam and collection optics on synchronized traversing units, the point measurement technique is made effective for surveying density variation over a cross-section of the nozzle plume. Various difficulties associated with dust particles, stray light, high noise level and vibration are discussed. Finally, a limited amount of data from an underexpanded jet are presented and compared with expected variations to validate the technique.
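
    Because the Rayleigh-scattered photon count grows linearly with gas density on top of a stray-light background, converting counts to density reduces to a two-point linear calibration. A toy sketch with invented constants:

```python
def calibrate(counts_ref, density_ref, counts_dark):
    """Fit count = a * density + b from one known reference condition and a
    dark (stray-light-only) measurement; returns the calibration (a, b)."""
    b = counts_dark
    a = (counts_ref - b) / density_ref
    return a, b

def density_from_counts(counts, a, b):
    """Invert the calibration to get density at a survey point."""
    return (counts - b) / a

# Hypothetical numbers: reference air at 1.2 kg/m^3 gives 120,000 counts,
# and an evacuated (stray-light) measurement gives 6,000 counts.
a, b = calibrate(counts_ref=120_000.0, density_ref=1.2, counts_dark=6_000.0)
rho = density_from_counts(63_000.0, a, b)   # density at one survey point
```

    In practice the stray-light term varies from point to point, which is one reason the paper devotes attention to suppressing it; the linear model above is the idealized case.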

  7. An Examination of Classroom Social Environment on Motivation and Engagement of College Early Entrant Honors Students

    ERIC Educational Resources Information Center

    Maddox, Richard S.

    2010-01-01

    This study set out to examine the relationships between the classroom social environment, motivation, engagement and achievement of a group of early entrant Honors students at a large urban university. Prior research on the classroom environment, motivation, engagement and high ability students was examined, leading to the assumption that the…

  9. Mining pinyin-to-character conversion rules from large-scale corpus: a rough set approach.

    PubMed

    Wang, Xiaolong; Chen, Qingcai; Yeung, Daniel S

    2004-04-01

    This paper introduces a rough set technique for solving the problem of mining pinyin-to-character (PTC) conversion rules. It first presents a text-structuring method that constructs a language information table for each pinyin from a free-form textual corpus. Data generalization and rule extraction algorithms can then be used to eliminate redundant information and extract consistent PTC conversion rules. The design of the model also addresses a number of important issues, such as the long-distance dependency problem, the storage requirements of the rule base, and the consistency of the extracted rules, while the performance of the extracted rules and the effects of different model parameters are evaluated experimentally. The results show that, with smoothing, high conversion precision (0.947) and recall (0.84) can be achieved even for rules represented directly by pinyin rather than words. A comparison with the baseline tri-gram model also shows that our method and the tri-gram language model complement each other well. PMID:15376833
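
    The end product of such rule mining is, in effect, a context-conditioned mapping from pinyin syllables to characters. A toy rule base with a default fallback shows the flavor; the rules here are fabricated, and real extracted rules condition on far richer context:

```python
# Each rule maps (pinyin, previous output character or None) to a character.
rules = {
    ("shi", "老"): "师",   # after 老 (lao), "shi" is 师, as in 老师 (teacher)
    ("shi", None): "是",   # default: most frequent character for "shi"
    ("ma", None): "吗",    # default for "ma"
}

def convert(pinyin_seq):
    """Convert a pinyin sequence using context rules, falling back to the
    context-free default for each syllable, then to '?' if nothing matches."""
    out = []
    for p in pinyin_seq:
        prev = out[-1] if out else None
        out.append(rules.get((p, prev), rules.get((p, None), "?")))
    return "".join(out)

text = convert(["shi", "ma"])
```

    The paper's contribution is mining and pruning such rule tables automatically with rough set generalization, rather than the lookup itself.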

  10. Improved Species-Specific Lysine Acetylation Site Prediction Based on a Large Variety of Features Set

    PubMed Central

    Wuyun, Qiqige; Zheng, Wei; Zhang, Yanping; Ruan, Jishou; Hu, Gang

    2016-01-01

    Lysine acetylation is a major post-translational modification. It plays a vital role in numerous essential biological processes, such as gene expression and metabolism, and is related to some human diseases. To fully understand the regulatory mechanism of acetylation, identification of acetylation sites is the first and most important step. However, experimental identification of protein acetylation sites is often time-consuming and expensive. Therefore, alternative computational methods are necessary. Here, we developed a novel tool, KA-predictor, to predict species-specific lysine acetylation sites based on a support vector machine (SVM) classifier. We incorporated different types of features and employed an efficient feature selection on each type to form the final optimal feature set for model learning. Our predictor was highly competitive for the majority of species when compared with other methods. Feature contribution analysis indicated that HSE features, introduced here for the first time for lysine acetylation prediction, significantly improved the predictive performance. In particular, we constructed a high-accuracy structure dataset of H. sapiens from the PDB to analyze the structural properties around lysine acetylation sites. Our datasets and a user-friendly local tool of KA-predictor are freely available at http://sourceforge.net/p/ka-predictor. PMID:27183223

  11. New large solar photocatalytic plant: set-up and preliminary results.

    PubMed

    Malato, S; Blanco, J; Vidal, A; Fernández, P; Cáceres, J; Trincado, P; Oliveira, J C; Vincent, M

    2002-04-01

    A European industrial consortium called SOLARDETOX has been created as the result of an EC-DGXII BRITE-EURAM-III-financed project on solar photocatalytic detoxification of water. The project objective was to develop a simple, efficient and commercially competitive water-treatment technology, based on compound parabolic collectors (CPCs) and TiO2 photocatalysis, enabling easy design and installation. The design, set-up and preliminary results of the main project deliverable, the first European industrial solar detoxification treatment plant, are presented. This plant has been designed for the batch treatment of 2 m3 of water with a 100 m2 collector-aperture area and aqueous aerated suspensions of polycrystalline TiO2 irradiated by sunlight. Fully automatic control reduces operation and maintenance manpower. Plant behaviour has been compared (using dichloroacetic acid and cyanide at 50 mg l(-1) initial concentration as model compounds) with the small CPC pilot plants installed at the Plataforma Solar de Almería several years ago. The first results with high-content cyanide (1 g l(-1)) waste water are presented and the plant treatment capacity is calculated. PMID:11996143

  12. Data Mining on Large Data Set for Predicting Salmon Spawning Habitat

    SciTech Connect

    Xie, YuLong; Murray, Christopher J.; Hanrahan, Timothy P.; Geist, David R.

    2008-07-01

    Hydraulic properties related to river flow affect salmon spawning habitat. Accurate prediction of salmon spawning habitat and understanding of the properties that influence spawning behavior are of great interest for hydroelectric dam management. Previous research predicted salmon spawning habitat with some success by deriving river-specific spawning suitability indices and applying a function-estimation method such as logistic regression to several static river-flow-related properties. The objective of this study was two-fold. First, dynamic river-flow properties associated with upstream dam operation were derived from a huge set of time series of both water velocity and water depth for about one fifth of a million habitat cells through principal component analysis (PCA) using nonlinear iterative partial least squares (NIPALS). The inclusion of dynamic variables greatly improved model prediction. Second, nine machine learning methods were applied to the data, and it was found that decision tree and rule induction methods generally outperformed the commonly used logistic regression. In particular, random forest, an advanced decision tree algorithm, provided uniformly better results. The over-prediction problem of previous studies was greatly alleviated.
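
    The NIPALS algorithm used for the PCA step computes principal components one at a time by alternating between score and loading updates. A minimal pure-Python sketch of the first component on toy data (not the study's flow data):

```python
def nipals_first_pc(X, iters=100, tol=1e-12):
    """First principal component of a centered data matrix via NIPALS:
    alternate p = X't / t't (normalized) and t = X p until t stabilizes."""
    n, m = len(X), len(X[0])
    t = [row[0] for row in X]                 # initialize scores from column 0
    p = [0.0] * m
    for _ in range(iters):
        tt = sum(v * v for v in t)
        p = [sum(X[i][j] * t[i] for i in range(n)) / tt for j in range(m)]
        norm = sum(v * v for v in p) ** 0.5
        p = [v / norm for v in p]             # unit-norm loading vector
        t_new = [sum(X[i][j] * p[j] for j in range(m)) for i in range(n)]
        converged = sum((a - b) ** 2 for a, b in zip(t_new, t)) < tol
        t = t_new
        if converged:
            break
    return t, p                               # scores and loadings of PC1

# Toy centered data lying almost along the direction (1, 1).
X = [[-2.0, -1.9], [-1.0, -1.1], [1.0, 0.9], [2.0, 2.1]]
scores, loadings = nipals_first_pc(X)
```

    Deflating X by the rank-one product of scores and loadings and repeating yields further components, which is how the study compressed long velocity and depth time series per habitat cell into a few dynamic variables.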

  13. Cytotoxicity evaluation of large cyanobacterial strain set using selected human and murine in vitro cell models.

    PubMed

    Hrouzek, Pavel; Kapuścik, Aleksandra; Vacek, Jan; Voráčová, Kateřina; Paichlová, Jindřiška; Kosina, Pavel; Voloshko, Ludmila; Ventura, Stefano; Kopecký, Jiří

    2016-02-01

    The production of cytotoxic molecules interfering with mammalian cells is extensively reported in cyanobacteria. These compounds may have a use in pharmacological applications; however, their potential toxicity needs to be considered. We performed cytotoxicity tests of crude cyanobacterial extracts in six cell models in order to address the frequency of cyanobacterial cytotoxicity to human cells and the level of specificity to a particular cell line. A set of more than 100 cyanobacterial crude extracts isolated from soil habitats (mainly genera Nostoc and Tolypothrix) was tested by MTT test for in vitro toxicity on the hepatic and non-hepatic human cell lines HepG2 and HeLa, and three cell systems of rodent origin: Yac-1, Sp-2 and Balb/c 3T3 fibroblasts. Furthermore, a subset of the extracts was assessed for cytotoxicity against primary cultures of human hepatocytes as a model for evaluating potential hepatotoxicity. Roughly one third of cyanobacterial extracts caused cytotoxic effects (i.e. viability<75%) on human cell lines. Despite the sensitivity differences, high correlation coefficients among the inhibition values were obtained for particular cell systems. This suggests a prevailing general cytotoxic effect of extracts and their constituents. The non-transformed immortalized fibroblasts (Balb/c 3T3) and hepatic cancer line HepG2 exhibited good correlations with primary cultures of human hepatocytes. The presence of cytotoxic fractions in strongly cytotoxic extracts was confirmed by an activity-guided HPLC fractionation, and it was demonstrated that cyanobacterial cytotoxicity is caused by a mixture of components with similar hydrophobic/hydrophilic properties. The data presented here could be used in further research into in vitro testing based on human models for the toxicological monitoring of complex cyanobacterial samples. PMID:26519817

  14. Inference of higher-order relationships in the cycads from a large chloroplast data set.

    PubMed

    Rai, Hardeep S; O'Brien, Heath E; Reeves, Patrick A; Olmstead, Richard G; Graham, Sean W

    2003-11-01

    We investigated higher-order relationships in the cycads, an ancient group of seed-bearing plants, by examining a large portion of the chloroplast genome from seven species chosen to exemplify our current understanding of taxonomic diversity in the order. The regions considered span approximately 13.5 kb of unaligned data per taxon, and comprise a diverse range of coding sequences, introns and intergenic spacers dispersed throughout the plastid genome. Our results provide substantial support for most of the inferred backbone of cycad phylogeny, and weak evidence that the sister-group of the cycads among living seed plants is Ginkgo biloba. Cycas (representing Cycadaceae) is the sister-group of the remaining cycads; Dioon is part of the next most basal split. Two of the three commonly recognized families of cycads (Zamiaceae and Stangeriaceae) are not monophyletic; Stangeria is embedded within Zamiaceae, close to Zamia and Ceratozamia, and not closely allied to the other genus of Stangeriaceae, Bowenia. In contrast to the other seed plants, cycad chloroplast genomes share two features with Ginkgo: a reduced rate of evolution and an elevated transition:transversion ratio. We demonstrate that the latter aspect of their molecular evolution is unlikely to have affected inference of cycad relationships in the context of seed-plant wide analyses. PMID:13678689

  15. Value-cell bar charts for visualizing large transaction data sets.

    PubMed

    Keim, Daniel A; Hao, Ming C; Dayal, Umeshwar; Lyons, Martha

    2007-01-01

    One of the common problems businesses need to solve is how to use large volumes of sales histories, Web transactions, and other data to understand the behavior of their customers and increase their revenues. Bar charts are widely used for daily analysis, but only show highly aggregated data. Users often need to visualize detailed multidimensional information reflecting the health of their businesses. In this paper, we propose an innovative visualization solution based on the use of value cells within bar charts to represent business metrics. The value of a transaction can be discretized into one or multiple cells: high-value transactions are mapped to multiple value cells, whereas many small-value transactions are combined into one cell. With value-cell bar charts, users can 1) visualize transaction value distributions and correlations, 2) identify high-value transactions and outliers at a glance, and 3) instantly display values at the transaction record level. Value-cell bar charts have been applied successfully to different sales and IT service usage applications, demonstrating the benefits of the technique over traditional charting techniques. A comparison with two variants of the well-known Treemap technique and our earlier work on Pixel Bar Charts is also included. PMID:17495340
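    The cell-discretization idea can be sketched in a few lines of Python (hypothetical transactions and cell size; the paper's actual mapping and rendering are more elaborate):

```python
import math

def to_value_cells(transactions, cell_value):
    """Map (txn_id, value) pairs to value cells: a high-value transaction
    occupies ceil(value / cell_value) cells of its own, while small
    transactions are pooled until they fill one shared cell."""
    cells, pool, pool_ids = [], 0.0, []
    for txn_id, value in transactions:
        if value >= cell_value:
            cells.extend([txn_id] for _ in range(math.ceil(value / cell_value)))
        else:
            pool += value
            pool_ids.append(txn_id)
            if pool >= cell_value:
                cells.append(pool_ids)    # many small transactions, one cell
                pool, pool_ids = 0.0, []
    if pool_ids:
        cells.append(pool_ids)            # leftover partially filled cell
    return cells

cells = to_value_cells([("t1", 950), ("t2", 40), ("t3", 70), ("t4", 30)],
                       cell_value=100)
```

    Each returned entry is one rendered cell with the transactions behind it, which is what lets the chart drill down to the record level.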

  16. A large set of newly created interspecific Saccharomyces hybrids increases aromatic diversity in lager beers.

    PubMed

    Mertens, Stijn; Steensels, Jan; Saels, Veerle; De Rouck, Gert; Aerts, Guido; Verstrepen, Kevin J

    2015-12-01

    Lager beer is the most consumed alcoholic beverage in the world. Its production process is marked by a fermentation conducted at low (8 to 15°C) temperatures and by the use of Saccharomyces pastorianus, an interspecific hybrid between Saccharomyces cerevisiae and the cold-tolerant Saccharomyces eubayanus. Recent whole-genome-sequencing efforts revealed that the currently available lager yeasts belong to one of only two archetypes, "Saaz" and "Frohberg." This limited genetic variation likely reflects that all lager yeasts descend from only two separate interspecific hybridization events, which may also explain the relatively limited aromatic diversity between the available lager beer yeasts compared to, for example, wine and ale beer yeasts. In this study, 31 novel interspecific yeast hybrids were developed, resulting from large-scale robot-assisted selection and breeding between carefully selected strains of S. cerevisiae (six strains) and S. eubayanus (two strains). Interestingly, many of the resulting hybrids showed a broader temperature tolerance than their parental strains and reference S. pastorianus yeasts. Moreover, they combined a high fermentation capacity with a desirable aroma profile in laboratory-scale lager beer fermentations, thereby successfully enriching the currently available lager yeast biodiversity. Pilot-scale trials further confirmed the industrial potential of these hybrids and identified one strain, hybrid H29, which combines a fast fermentation, high attenuation, and the production of a complex, desirable fruity aroma. PMID:26407881

  17. Large-scale assessment of missed opportunity risks in a complex hospital setting.

    PubMed

    Peng, Yidong; Erdem, Ergin; Shi, Jing; Masek, Christopher; Woodbridge, Peter

    2016-03-01

    In this research, we apply a large-scale logistic regression analysis to assess patient missed-opportunity risks at a complex VA (US Department of Veterans Affairs) hospital in three categories, namely, no-show alone, no-show combined with late patient cancellation, and no-show combined with late patient and clinic cancellations. The analysis includes unique explanatory variables related to VA patients for predicting missed-opportunity risks. Furthermore, we develop two aggregated weather indices by combining many weather measures and include them as explanatory variables. The results indicate that most of the explanatory variables considered are significant predictors of missed-opportunity risk. Afternoon appointments, a higher service-connected percentage, insurance coverage, married status, shorter lead times, and longer appointment lengths are consistently associated with lower risk of missed opportunity. Furthermore, the VA patient-related factors and the two proposed weather indices are useful predictors of the risks of no-show and patient cancellation. More importantly, this research presents an effective procedure for VA hospitals and clinics to analyze missed-opportunity risks within the complex VA information technology system and helps them develop proper interventions to mitigate the adverse effects caused by missed opportunities. PMID:25325215
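    A minimal logistic-regression sketch in the spirit of this analysis, on synthetic data with made-up effect sizes (longer lead time raising risk, an afternoon slot lowering it; these are illustrative assumptions, not the VA data or coefficients):

```python
import numpy as np

# Synthetic appointment data with assumed effect directions.
rng = np.random.default_rng(0)
n = 2000
lead_time = rng.uniform(0, 1, n)        # appointment lead time, scaled to [0, 1]
afternoon = rng.integers(0, 2, n)       # 1 = afternoon appointment
logit = -1.5 + 2.0 * lead_time - 1.0 * afternoon
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)  # 1 = missed

# Fit the logistic model by plain gradient ascent on the log-likelihood.
X = np.column_stack([np.ones(n), lead_time, afternoon])
beta = np.zeros(3)
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.1 * X.T @ (y - p) / n
# beta[1] should come out positive (lead time), beta[2] negative (afternoon)
```

    The fitted signs reproduce the qualitative findings reported above; a production analysis would of course use many more covariates and a proper solver.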

  18. "Tools For Analysis and Visualization of Large Time- Varying CFD Data Sets"

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; vanGelder, Allen

    1999-01-01

    During the four years of this grant (including the one-year extension), we explored many aspects of the visualization of large CFD (Computational Fluid Dynamics) datasets. These included new direct volume rendering approaches, hierarchical methods, volume decimation, error metrics, parallelization, hardware texture mapping, and methods for analyzing and comparing images. First, we implemented an extremely general direct volume rendering approach that can be used to render rectilinear, curvilinear, or tetrahedral grids, including overlapping multiple-zone grids and time-varying grids. Next, we developed techniques for associating the sample data with a k-d tree: a simple hierarchical data model that approximates the samples in the regions covered by each node of the tree, together with an error metric for the accuracy of the model. We also explored a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH (Association for Computing Machinery Special Interest Group on Computer Graphics) '96. In our initial implementation, we automatically image the volume from 32 approximately evenly distributed positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation.

  19. Can Wide Consultation Help with Setting Priorities for Large-Scale Biodiversity Monitoring Programs?

    PubMed Central

    Boivin, Frédéric; Simard, Anouk; Peres-Neto, Pedro

    2014-01-01

    Climate and other global change phenomena affecting biodiversity require monitoring to track ecosystem changes and guide policy and management actions. Designing a biodiversity monitoring program is a difficult task that requires making decisions that often lack consensus due to budgetary constraints. As monitoring programs require long-term investment, they also require strong and continuing support from all interested parties. As such, stakeholder consultation is key to identifying priorities and making sound design decisions that have as much support as possible. Here, we present the results of a consultation conducted to serve as an aid for designing a large-scale biodiversity monitoring program for the province of Québec (Canada). The consultation took the form of a survey with 13 discrete choices involving tradeoffs with respect to design priorities and 10 demographic questions (e.g., age, profession). The survey was sent to thousands of individuals with expected interest and knowledge in biodiversity and was completed by 621 participants. Overall, consensuses were few and it appeared difficult to create a design fulfilling the priorities of the majority. Most participants wanted 1) a monitoring design covering the entire territory and focusing on natural habitats, and 2) a focus on species related to ecosystem services, threatened species, and invasive species. The only demographic characteristic related to the type of prioritization was the declared level of knowledge in biodiversity (null to high), but even then the influence was quite small. PMID:25525798

  20. The Same or Separate? An Exploration of Teachers' Perceptions of the Classroom Assignment of Twins in Prior to School and Kindergarten to Year Two School Settings

    ERIC Educational Resources Information Center

    Jones, Laura; De Gioia, Katey

    2010-01-01

    This article investigates the perceptions of 12 teachers from New South Wales, Australia, regarding the classroom assignment of twins. Analysis of semi-structured interviews with each of the teachers revealed four key findings: 1) teachers' perceptions about the classroom assignment of twins vary according to their previous experience and…

  1. A heuristic algorithm for pattern identification in large multivariate analysis of geophysical data sets

    NASA Astrophysics Data System (ADS)

    da Silva Pereira, João Eduardo; Strieder, Adelir José; Amador, Janete Pereira; da Silva, José Luiz Silvério; Volcato Descovi Filho, Leônidas Luiz

    2010-01-01

    This paper presents a heuristic algorithm combining factor analysis with a local search optimization system for pattern identification problems in large, multivariate aero-geophysical data. The algorithm was developed in MATLAB code using both multivariate and univariate methodologies. Two main analysis steps are detailed in the MATLAB code: the first deals with multivariate factor analysis to reduce the dimension of the problem and to orient the variables in an independent and orthogonal structure; the second applies a novel local search optimization system based on the univariate structure. The process of local search is simple and consistent because it solves a multivariate problem as a sum of univariate and independent problems. Thus, it can reduce computational time and render the efficiency of estimates independent of the data bank. The aero-geophysical data include the results of the magnetometric and gamma-spectrometric (TC, K, Th, and U channels) surveys for the Santa Maria region (RS, Brazil). After the classification, when the observations are superimposed on the regional map, one can see that data belonging to the same subspace appear closer to each other, revealing some physical law governing area pattern distribution. The analysis of variance for the original variables as functions of the obtained subspaces results in different mean behaviors for all the variables. This result shows that the factor transformation captures the discriminative capacity of the original variables. The proposed algorithm for multivariate factor analysis and the local search system open up new challenges in aero-geophysical data handling and processing techniques.
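    The two-step structure (orthogonal factor decomposition, then independent univariate treatment of each factor) can be illustrated with PCA standing in for factor analysis, on synthetic data rather than the Santa Maria survey; the zero-threshold labeling is a placeholder for the paper's univariate local search:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))        # stand-in for TC, K, Th and U channels
X[:250] += [2.0, 2.0, 0.0, 0.0]      # two synthetic spatial "patterns"

# Step 1: orthogonal decomposition (PCA via SVD standing in for factor
# analysis): the rotated scores are mutually independent and orthogonal.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T

# Step 2: because the factors are orthogonal, the multivariate search
# decomposes into independent univariate subproblems; here each factor is
# simply thresholded at zero and the per-factor labels combined into one
# subspace code per observation.
labels = (scores > 0).astype(int)
subspace = labels @ (2 ** np.arange(labels.shape[1]))

gram = scores.T @ scores             # diagonal up to rounding: orthogonality
```

    The diagonal Gram matrix is what justifies summing independent univariate solutions instead of searching the joint space.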

  2. Designing Websites for Displaying Large Data Sets and Images on Multiple Platforms

    NASA Astrophysics Data System (ADS)

    Anderson, A.; Wolf, V. G.; Garron, J.; Kirschner, M.

    2012-12-01

    The desire to build websites that analyze and display ever-increasing amounts of scientific data and images pushes toward designs that utilize large displays and use the display area as efficiently as possible. Yet scientists and users of their data increasingly wish to access these websites in the field and on mobile devices. This results in the need to develop websites that can support a wide range of devices and screen sizes and optimally use whatever display area is available. Historically, designers have addressed this issue by building two websites: one for mobile devices and one for desktop environments, resulting in increased cost, duplicated work, and longer development times. Recent advancements in web design technology and techniques allow for the development of a single website that dynamically adjusts to the type of device being used to browse it (smartphone, tablet, desktop) and provide the opportunity to truly optimize whatever display area is available. HTML5 and CSS3 give web designers media query statements, which allow style sheets to be aware of the size of the display being used and to format web content differently based upon the queried response. Web elements can be rendered in a different size or position, or even removed from the display entirely, based upon the size of the display area. Using HTML5/CSS3 media queries in this manner is referred to as "Responsive Web Design" (RWD). RWD in combination with technologies such as LESS and Twitter Bootstrap allows the web designer to build sites that not only dynamically respond to the browser display size being used, but do so in very controlled and intelligent ways, ensuring that good layout and graphic design principles are followed.
At the University of Alaska Fairbanks, the Alaska Satellite Facility SAR Data Center (ASF) recently redesigned their popular Vertex application and converted it from a traditional, fixed-layout website into a RWD site built on HTML5, LESS and Twitter Bootstrap. Vertex is a data portal for remotely sensed imagery of the earth, offering Synthetic Aperture Radar (SAR) data products from the global ASF archive. By using Responsive Web Design, ASF is able to provide access to a massive collection of SAR imagery and allow the user to use mobile devices and desktops to maximum advantage. ASF's Vertex web site demonstrates that with increased interface flexibility, scientists, managers and users can increase their personal effectiveness by accessing data portals from their preferred device as their science dictates.

  3. Actual Versus Estimated Utility Factor of a Large Set of Privately Owned Chevrolet Volts

    SciTech Connect

    John Smart; Thomas Bradley; Stephen Schey

    2014-04-01

    In order to determine the overall fuel economy of a plug-in hybrid electric vehicle (PHEV), the amount of operation in charge depleting (CD) versus charge sustaining modes must be determined. Mode of operation is predominantly dependent on customer usage of the vehicle and is therefore highly variable. The utility factor (UF) concept was developed to quantify the distance a group of vehicles has traveled or may travel in CD mode. SAE J2841 presents a UF calculation method based on data collected from travel surveys of conventional vehicles. UF estimates have been used in a variety of areas, including the calculation of window-sticker fuel economy, policy decisions, and vehicle design determination. The EV Project, a plug-in electric vehicle charging infrastructure demonstration being conducted across the United States, provides the opportunity to determine the real-world UF of a large group of privately owned Chevrolet Volt extended-range electric vehicles. Using data collected from Volts enrolled in The EV Project, this paper compares the real-world UF of two groups of Chevrolet Volts to estimated UFs based on J2841. The actual observed fleet utility factors (FUFs) for the MY2011/2012 and MY2013 Volt groups studied were 72% and 74%, respectively. Using the EPA CD ranges, the method prescribed by J2841 estimates a FUF of 65% and 68% for the MY2011/2012 and MY2013 Volt groups, respectively. Volt drivers achieved higher percentages of distance traveled in EV mode for two reasons. First, they had fewer long-distance travel days than drivers in the national travel survey referenced by J2841. Second, they charged more frequently than the J2841 assumption of once per day: drivers of Volts in this study averaged over 1.4 charging events per day. Although actual CD range varied widely as driving conditions varied, the average CD ranges for the two Volt groups studied matched the EPA CD range estimates, so CD range variation did not affect FUF results.
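    The core of the utility-factor aggregation can be sketched as follows, with hypothetical travel days and the once-per-day charging premise; SAE J2841's full machinery (survey weighting, fleet aggregation) involves considerably more detail:

```python
def fleet_utility_factor(daily_distances, cd_range):
    """Fraction of total distance drivable in charge-depleting mode, assuming
    a full charge at the start of each travel day (J2841-style premise)."""
    cd = sum(min(d, cd_range) for d in daily_distances)
    return cd / sum(daily_distances)

# Hypothetical travel days (miles) for one vehicle with a 38-mile CD range.
uf = fleet_utility_factor([20, 35, 38, 60, 120], cd_range=38)
```

    Long-distance days cap out at the CD range, which is why fleets with fewer such days (and more frequent charging) beat the J2841 estimate, as observed here.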

  4. Mining unusual and rare stellar spectra from large spectroscopic survey data sets using the outlier-detection method

    NASA Astrophysics Data System (ADS)

    Wei, Peng; Luo, Ali; Li, Yinbi; Pan, Jingchang; Tu, Liangping; Jiang, Bin; Kong, Xiao; Shi, Zhixin; Yi, Zhenping; Wang, Fengfei; Liu, Jie; Zhao, Yongheng

    2013-05-01

    The large number of spectra obtained from sky surveys such as the Sloan Digital Sky Survey (SDSS) and the survey executed by the Large sky Area Multi-Object fibre Spectroscopic Telescope (LAMOST, also called GuoShouJing Telescope) provide us with opportunities to search for peculiar or even unknown types of spectra. In response to the limitations of existing methods, a novel outlier-mining method, the Monte Carlo Local Outlier Factor (MCLOF), is proposed in this paper, which can be used to highlight unusual and rare spectra from large spectroscopic survey data sets. The MCLOF method exposes outliers automatically and efficiently by marking each spectrum with a number, i.e. using the outlier index as a flag for an unusual and rare spectrum. The Local Outlier Factor (LOF) represents how unusual and rare a spectrum is compared with other spectra, and the Monte Carlo method is used to compute the global LOF for each spectrum by randomly selecting samples in each independent iteration. Our MCLOF method is applied to over half a million stellar spectra (classified as STAR by the SDSS Pipeline) from the SDSS data release 8 (DR8), and a total of 37 033 spectra are selected as outliers with signal-to-noise ratio (S/N) ≥ 3 and outlier index ≥ 0.85. Some of these outliers are shown to be binary stars, emission-line stars, carbon stars and stars with unusual continua. The results show that our proposed method can efficiently highlight these unusual spectra from the survey data sets. In addition, some relatively rare and interesting spectra are selected, indicating that the proposed method can also be used to mine rare, even unknown, spectra. The method is applicable not only to spectral survey data sets but also to other types of survey data sets. The spectra of all peculiar objects selected by our MCLOF method are available from a user-friendly website: http://sciwiki.lamost.org/Miningdr8/.
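    A minimal re-creation of the MCLOF idea: a plain LOF implementation averaged over random subsamples, run on synthetic 2-D data with one planted outlier (parameter choices and normalization here are illustrative, not the paper's):

```python
import numpy as np

def lof(X, k=5):
    """Plain Local Outlier Factor scores for a small data set."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    knn = np.argsort(d, axis=1)[:, :k]         # k nearest neighbours
    dk = np.take_along_axis(d, knn, axis=1)    # their distances, sorted
    kdist = dk[:, -1]                          # k-distance of each point
    reach = np.maximum(kdist[knn], dk)         # reachability distances
    lrd = k / reach.sum(axis=1)                # local reachability density
    return lrd[knn].mean(axis=1) / lrd         # LOF score per point

def mclof(X, k=5, n_iter=50, frac=0.6, seed=0):
    """Average each point's LOF over random subsamples (Monte Carlo step)."""
    rng = np.random.default_rng(seed)
    scores, counts = np.zeros(len(X)), np.zeros(len(X))
    for _ in range(n_iter):
        idx = rng.choice(len(X), size=int(frac * len(X)), replace=False)
        scores[idx] += lof(X[idx], k)
        counts[idx] += 1
    return scores / counts

rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)), [[8.0, 8.0]]])  # planted outlier
index = mclof(X)   # the planted outlier should receive the largest index
```

    The Monte Carlo averaging is what turns the locally defined LOF into a stable, global outlier index that scales to survey-sized data via subsampling.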

  5. Classroom Assessment in Action

    ERIC Educational Resources Information Center

    Shermis, Mark D.; DiVesta, Francis J.

    2011-01-01

    "Classroom Assessment in Action" clarifies the multi-faceted roles of measurement and assessment and their applications in a classroom setting. Comprehensive in scope, Shermis and Di Vesta explain basic measurement concepts and show students how to interpret the results of standardized tests. From these basic concepts, the authors then provide…

  6. Evaluation in the Classroom.

    ERIC Educational Resources Information Center

    Becnel, Shirley

    Six classroom research-based instructional projects funded under Chapter 2 are described, and their outcomes are summarized. The projects each used computer hardware and software in the classroom setting. The projects and their salient points include: (1) the Science Technology Project, in which 48 teachers and 2,847 students in 18 schools used…

  7. The Viking viewer for connectomics: scalable multi-user annotation and summarization of large volume data sets

    PubMed Central

    ANDERSON, JR; MOHAMMED, S; GRIMM, B; JONES, BW; KOSHEVOY, P; TASDIZEN, T; WHITAKER, R; MARC, RE

    2011-01-01

    Modern microscope automation permits the collection of vast amounts of continuous anatomical imagery in both two and three dimensions. These large data sets present significant challenges for data storage, access, viewing, annotation and analysis. The cost and overhead of collecting and storing the data can be extremely high. Large data sets quickly exceed an individual's capability for timely analysis and present challenges in efficiently applying transforms, if needed. Finally annotated anatomical data sets can represent a significant investment of resources and should be easily accessible to the scientific community. The Viking application was our solution created to view and annotate a 16.5 TB ultrastructural retinal connectome volume and we demonstrate its utility in reconstructing neural networks for a distinctive retinal amacrine cell class. Viking has several key features. (1) It works over the internet using HTTP and supports many concurrent users limited only by hardware. (2) It supports a multi-user, collaborative annotation strategy. (3) It cleanly demarcates viewing and analysis from data collection and hosting. (4) It is capable of applying transformations in real-time. (5) It has an easily extensible user interface, allowing addition of specialized modules without rewriting the viewer. PMID:21118201

  8. Knowledge and theme discovery across very large biological data sets using distributed queries: a prototype combining unstructured and structured data.

    PubMed

    Mudunuri, Uma S; Khouja, Mohamad; Repetski, Stephen; Venkataraman, Girish; Che, Anney; Luke, Brian T; Girard, F Pascal; Stephens, Robert M

    2013-01-01

    As the discipline of biomedical science continues to apply new technologies capable of producing unprecedented volumes of noisy and complex biological data, it has become evident that available methods for deriving meaningful information from such data are simply not keeping pace. In order to achieve useful results, researchers require methods that consolidate, store and query combinations of structured and unstructured data sets efficiently and effectively. As we move towards personalized medicine, the need to combine unstructured data, such as medical literature, with large amounts of highly structured and high-throughput data such as human variation or expression data from very large cohorts, is especially urgent. For our study, we investigated a likely biomedical query using the Hadoop framework. We ran queries using native MapReduce tools we developed as well as other open source and proprietary tools. Our results suggest that the available technologies within the Big Data domain can reduce the time and effort needed to utilize and apply distributed queries over large datasets in practical clinical applications in the life sciences domain. The methodologies and technologies discussed in this paper set the stage for a more detailed evaluation that investigates how various data structures and data models are best mapped to the proper computational framework. PMID:24312478
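    The distributed-query pattern can be sketched as a single-process map/shuffle/reduce over a toy corpus (hypothetical term list and documents; Hadoop distributes the same three phases across nodes):

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc_id, text, terms):
    """Mapper: emit (term, doc_id) for each query term found in a document."""
    return [(t, doc_id) for t in terms if t in text.lower()]

def reduce_phase(term, doc_ids):
    """Reducer: collapse a term's values into a sorted posting list."""
    return term, sorted(set(doc_ids))

terms = ["brca1", "tp53"]           # hypothetical biomedical query terms
corpus = {                          # unstructured text keyed by document id
    "pmid1": "BRCA1 variants and breast cancer risk",
    "pmid2": "TP53 mutations and their interaction with BRCA1",
    "pmid3": "An unrelated abstract",
}

# The shuffle step groups mapper output by key, as the framework would.
grouped = defaultdict(list)
for key, value in chain.from_iterable(
        map_phase(d, text, terms) for d, text in corpus.items()):
    grouped[key].append(value)

index = dict(reduce_phase(k, v) for k, v in grouped.items())
```

    The result links structured identifiers (here, document ids) to terms mined from unstructured text, which is the combination the prototype targets.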

  9. Development and Validation of Decision Forest Model for Estrogen Receptor Binding Prediction of Chemicals Using Large Data Sets.

    PubMed

    Ng, Hui Wen; Doughty, Stephen W; Luo, Heng; Ye, Hao; Ge, Weigong; Tong, Weida; Hong, Huixiao

    2015-12-21

    Some chemicals in the environment possess the potential to interact with the endocrine system in the human body. Multiple receptors are involved in the endocrine system; estrogen receptor α (ERα) plays very important roles in endocrine activity and is the most studied receptor. Understanding and predicting the estrogenic activity of chemicals facilitates the evaluation of their endocrine activity. Hence, we have developed a decision forest classification model to predict chemical binding to ERα using a large training data set of 3308 chemicals obtained from the U.S. Food and Drug Administration's Estrogenic Activity Database. We tested the model using cross validations and external data sets of 1641 chemicals obtained from the U.S. Environmental Protection Agency's ToxCast project. The model showed good performance in both internal (92% accuracy) and external validations (~70-89% relative balanced accuracies), where the latter involved the validations of the model across different ER pathway-related assays in ToxCast. The important features that contribute to the prediction ability of the model were identified through informative descriptor analysis and were related to current knowledge of ER binding. Prediction confidence analysis revealed that the model had both high prediction confidence and accuracy for most predicted chemicals. The results demonstrated that the model constructed based on the large training data set is more accurate and robust for predicting ER binding of chemicals than published models developed using much smaller data sets. The model could be useful for the evaluation of the ERα-mediated endocrine activity potential of environmental chemicals. PMID:26524122
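    A decision forest is an ensemble of classifiers voting on each chemical; a toy bagged ensemble of decision stumps on synthetic descriptors illustrates the shape of the approach (this is not the published model, its descriptors, or its tree-construction method):

```python
import numpy as np

def fit_stump(X, y):
    """Best single-feature threshold classifier over a quantile grid."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], np.linspace(0.05, 0.95, 19)):
            for sign in (1, -1):
                err = np.mean(np.where(sign * (X[:, j] - t) > 0, 1, 0) != y)
                if err < best_err:
                    best, best_err = (j, t, sign), err
    return best

def forest_predict(stumps, X):
    """Majority vote over the ensemble."""
    votes = sum(np.where(s * (X[:, j] - t) > 0, 1, 0) for j, t, s in stumps)
    return (votes > len(stumps) / 2).astype(int)

rng = np.random.default_rng(7)
X = rng.normal(size=(300, 5))                   # stand-in molecular descriptors
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # toy "binder" label

stumps = [fit_stump(X[idx], y[idx])             # bootstrap-sampled members
          for idx in (rng.integers(0, len(X), len(X)) for _ in range(25))]
acc = np.mean(forest_predict(stumps, X) == y)   # ensemble training accuracy
```

    The fraction of agreeing votes also gives a natural per-chemical prediction confidence, analogous to the confidence analysis described above.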

  10. Second Language Classroom Research. ERIC Digest.

    ERIC Educational Resources Information Center

    Nunan, David

    The purpose of second (or foreign) language classroom research is to answer important questions about the learning and teaching of foreign languages. This kind of research collects data from genuine language classrooms or from experimental settings sometimes established to replicate what takes place in the classroom. Classroom research can focus…

  11. Supporting Classroom Activities with the BSUL System

    ERIC Educational Resources Information Center

    Ogata, Hiroaki; Saito, Nobuji A.; Paredes J., Rosa G.; San Martin, Gerardo Ayala; Yano, Yoneo

    2008-01-01

    This paper presents the integration of ubiquitous computing systems into classroom settings, in order to provide basic support for classrooms and field activities. We have developed web application components using Java technology and configured a classroom with wireless network access and a web camera for our purposes. In this classroom, the…

  12. msIQuant - Quantitation Software for Mass Spectrometry Imaging Enabling Fast Access, Visualization, and Analysis of Large Data Sets.

    PubMed

    Källback, Patrik; Nilsson, Anna; Shariatgorji, Mohammadreza; Andrén, Per E

    2016-04-19

    This paper presents msIQuant, a novel instrument- and manufacturer-independent quantitative mass spectrometry imaging software suite that uses the standardized open access data format imzML. Its data processing structure enables rapid image display and the analysis of very large data sets (>50 GB) without any data reduction. In addition, msIQuant provides many tools for image visualization including multiple interpolation methods, low intensity transparency display, and image fusion. It also has a quantitation function that automatically generates calibration standard curves from series of standards that can be used to determine the concentrations of specific analytes. Regions-of-interest in a tissue section can be analyzed based on a number of quantities including the number of pixels, average intensity, standard deviation of intensity, and median and quartile intensities. Moreover, the suite's export functions enable simplified postprocessing of data and report creation. We demonstrate its potential through several applications including the quantitation of small molecules such as drugs and neurotransmitters. The msIQuant suite is a powerful tool for accessing and evaluating very large data sets, quantifying drugs and endogenous compounds in tissue areas of interest, and for processing mass spectra and images. PMID:27014927
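    The ROI statistics listed above are straightforward to compute; here is a NumPy sketch on a toy ion image with a hypothetical mask (msIQuant's own implementation is not shown in the abstract):

```python
import numpy as np

def roi_statistics(image, mask):
    """Per-ROI summary in the spirit of msIQuant: pixel count, mean, SD,
    median and quartile intensities over the masked region."""
    px = image[mask]
    q1, med, q3 = np.percentile(px, [25, 50, 75])
    return {"n_pixels": int(px.size), "mean": float(px.mean()),
            "std": float(px.std()), "median": float(med),
            "q1": float(q1), "q3": float(q3)}

image = np.arange(16, dtype=float).reshape(4, 4)  # toy single-ion image
mask = image >= 8                                 # hypothetical tissue region
stats = roi_statistics(image, mask)
```

    In practice the intensities would first be converted to concentrations via the calibration curves generated from the standard series.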

  13. The plateau in mnemonic resolution across large set sizes indicates discrete resource limits in visual working memory

    PubMed Central

    Anderson, David E.

    2015-01-01

    The precision of visual working memory (WM) representations declines monotonically with increasing storage load. Two distinct models of WM capacity predict different shapes for this precision-by-set-size function. Flexible-resource models, which assert a continuous allocation of resources across an unlimited number of items, predict a monotonic decline in precision across a large range of set sizes. Conversely, discrete-resource models, which assert a relatively small item limit for WM storage, predict that precision will plateau once this item limit is exceeded. Recent work has demonstrated such a plateau in mnemonic precision. Moreover, the set size at which mnemonic precision reached asymptote has been strongly predicted by estimated item limits in WM. In the present work, we extend this evidence in three ways. First, we show that this empirical pattern generalizes beyond orientation memory to color memory. Second, we rule out encoding limits as the source of discrete limits by demonstrating equivalent performance across simultaneous and sequential presentations of the memoranda. Finally, we demonstrate that the analytic approach commonly used to estimate precision yields flawed parameter estimates when the range of stimulus space is narrowed (e.g., a 180° rather than a 360° orientation space) and typical numbers of observations are collected. Such errors in parameter estimation reconcile an apparent conflict between our findings and others based on different stimuli. These findings provide further support for discrete-resource models of WM capacity. PMID:22477058
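    The discrete-resource prediction, that response variability stops growing once the item limit K is exceeded, can be written as a toy parameterization (the constants and the square-root form are illustrative assumptions, not fitted values from the paper):

```python
import math

def mnemonic_sd(set_size, capacity_k=3, sd_at_one=10.0):
    """Discrete-resource prediction: response SD grows as items are added
    only until the item limit K is reached, then plateaus."""
    return sd_at_one * math.sqrt(min(set_size, capacity_k))

sds = [mnemonic_sd(n) for n in (1, 2, 3, 4, 8)]  # plateau from set size K on
```

    A flexible-resource model would instead use set_size directly in place of min(set_size, capacity_k), predicting a monotonic decline in precision with no plateau.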

  14. A new wavelet-based approach for the automated treatment of large sets of lunar occultation data

    NASA Astrophysics Data System (ADS)

    Fors, O.; Richichi, A.; Otazu, X.; Núñez, J.

    2008-03-01

    Context: The introduction of infrared arrays for lunar occultations (LO) work and the improvement of predictions based on new deep IR catalogues have resulted in a large increase in sensitivity and in the number of observable occultations. Aims: We provide the means for an automated reduction of large sets of LO data. This frees the user from the tedious task of estimating first-guess parameters for the fit of each LO lightcurve. At the end of the process, ready-made plots and statistics enable the user to identify sources that appear to be resolved or binary, and to initiate their detailed interactive analysis. Methods: The pipeline is tailored to array data, including the extraction of the lightcurves from FITS cubes. Because of its robustness and efficiency, the wavelet transform has been chosen to compute the initial guess of the parameters of the lightcurve fit. Results: We illustrate and discuss our automatic reduction pipeline by analyzing a large volume of novel occultation data recorded at Calar Alto Observatory. The automated pipeline package is available from the authors. Algorithm tested with observations collected at Calar Alto Observatory (Spain). Calar Alto is operated by the German-Spanish Astronomical Center (CAHA).
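    The wavelet-based first guess can be illustrated with a one-level Haar transform locating the occultation step in a synthetic lightcurve (made-up flux values and noise; the pipeline's actual fit has many more parameters than the epoch):

```python
import numpy as np

def haar_detail(signal):
    """One level of the Haar wavelet transform (detail coefficients only)."""
    pairs = signal[: len(signal) // 2 * 2].reshape(-1, 2)
    return (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)

def guess_occultation_index(lightcurve):
    """First-guess epoch: the abrupt intensity drop at the lunar limb gives
    the largest-magnitude Haar detail coefficient."""
    detail = haar_detail(lightcurve)
    return 2 * int(np.argmax(np.abs(detail)))

rng = np.random.default_rng(3)
t = np.arange(512)
flux = 100.0 * (t < 301) + rng.normal(0.0, 2.0, t.size)  # star vanishes at 301
idx = guess_occultation_index(flux)  # index of the sample pair spanning the drop
```

    Robustness to noise is the point: the step dominates the detail coefficients even when individual samples are noisy, so no hand-tuned first guess is needed.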

  15. Do networking activities outside of the classroom protect students against being bullied? A field study with students in secondary school settings in Germany.

    PubMed

    Blickle, Gerhard; Meurs, James A; Schoepe, Christine

    2013-01-01

    Research has shown that having close relationships with fellow classmates can provide a buffer for students against bullying and the negative outcomes associated with it. But research has not explicitly examined the potential benefits of social networking behaviors outside of the classroom for those who could be bullied. This study addresses this gap and finds that, although a bullying climate in the classroom increases overall bullying, students high on external networking activities did not experience an increase in the bullying they received when in a classroom with a high bullying climate. However, the same group of students reported the largest degree of received bullying under conditions of a low bullying climate. We discuss the implications of our results and provide directions for future research. PMID:24364126
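    (No code example is warranted for this behavioral study.)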

  16. Possible calcium centers for hydrogen storage applications: An accurate many-body study by AFQMC calculations with large basis sets

    NASA Astrophysics Data System (ADS)

    Purwanto, Wirawan; Krakauer, Henry; Zhang, Shiwei; Virgus, Yudistira

    2011-03-01

    Weak H2 physisorption energies present a significant challenge to first-principles theoretical modeling and prediction of materials for H storage. There has been controversy regarding the accuracy of DFT on systems involving Ca cations. We use the auxiliary-field quantum Monte Carlo (AFQMC) method to accurately predict the binding energy of the Ca+·4H2 complex. AFQMC scales as the third power of the basis size and has demonstrated accuracy similar to or better than the gold-standard coupled cluster CCSD(T) method. We apply a modified Cholesky decomposition to achieve an efficient Hubbard-Stratonovich transformation in AFQMC at large basis sizes. We employ the largest correlation-consistent basis sets available, up to Ca/cc-pCV5Z, to extrapolate to the complete basis limit. The calculated potential energy curve exhibits binding with a double-well structure. Supported by DOE and NSF. Calculations were performed at OLCF Jaguar and CPD.
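    The complete-basis-set extrapolation step can be sketched with the common two-parameter form E(X) = E_CBS + A·X**-3 over basis cardinal numbers; the abstract does not state which extrapolation formula was actually used, and the energies below are synthetic:

```python
import numpy as np

def extrapolate_cbs(cardinals, energies):
    """Least-squares fit of E(X) = E_CBS + A * X**-3 over basis cardinal
    numbers X (T, Q, 5 -> 3, 4, 5), returning (E_CBS, A)."""
    X = np.asarray(cardinals, dtype=float)
    design = np.column_stack([np.ones_like(X), X ** -3])
    coeffs, *_ = np.linalg.lstsq(design, np.asarray(energies, dtype=float),
                                 rcond=None)
    return float(coeffs[0]), float(coeffs[1])

# Synthetic energies generated from E_CBS = -1.0, A = 0.5 (arbitrary units),
# so the fit should recover both parameters essentially exactly.
cards = [3, 4, 5]
E = [-1.0 + 0.5 * x ** -3 for x in cards]
e_cbs, a = extrapolate_cbs(cards, E)
```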

  17. Pre-Service Teachers and Classroom Authority

    ERIC Educational Resources Information Center

    Pellegrino, Anthony M.

    2010-01-01

    This study examined the classroom practices of five pre-service teachers from three secondary schools in a large southeastern state. Through classroom observations, survey responses, reviews of reflection logs, and focus-group interview responses, we centered on the issue of developing classroom authority as a means to effective classroom…

  18. Petascale Global Kinetic Simulations of The Magnetosphere and Visualization Strategies for Analysis of Very Large Multi-Variate Data Sets

    NASA Astrophysics Data System (ADS)

    Karimabadi, H.; Loring, B.; Vu, H. X.; Omelchenko, Y.; Tatineni, M.; Majumdar, A.; Ayachit, U.; Geveci, B.

    2011-10-01

    3D global electromagnetic hybrid (fluid electrons, kinetic ions) simulations have long been considered the holy grail in kinetic modeling of the magnetosphere but high computational requirements have kept them out of reach. Petascale computers provide the computational power to make such simulations possible but petascale computing poses two technical challenges. One is related to the development of efficient and scalable algorithms that can take advantage of the large number of cores. The second is related to knowledge extraction from the resulting simulation output. The challenge of science discovery from the extremely large data sets (~200 TB from a single run) generated from global kinetic simulations is compounded by the multi-variate and "noisy" nature of the data. Here, we review our innovations to overcome both challenges. We have developed a highly scalable hybrid simulation code (H3D) that we used to perform the first petascale global kinetic simulation of the magnetosphere using 98,304 cores on the NSF Kraken supercomputer. To facilitate analysis of data from such runs, we have developed a complex visualization pipeline, including physics-based algorithms to detect and track events of interest in the data. The effectiveness of this approach is illustrated through examples.

  19. The Impact of Brief Teacher Training on Classroom Management and Child Behavior in At-Risk Preschool Settings: Mediators and Treatment Utility

    ERIC Educational Resources Information Center

    Snyder, James; Low, Sabina; Schultz, Tara; Barner, Stacy; Moreno, Desirae; Garst, Meladee; Leiker, Ryan; Swink, Nathan; Schrepferman, Lynn

    2011-01-01

    Teachers from fourteen classrooms were randomly assigned to an adaptation of Incredible Years (IY) teacher training or to teacher training-as-usual. Observations were made of the behavior of 136 target preschool boys and girls nominated by teachers as having many or few conduct problems. Peer and teacher behavior were observed at baseline and post…

  20. "Designing Instrument for Science Classroom Learning Environment in Francophone Minority Settings: Accounting for Voiced Concerns among Teachers and Immigrant/Refugee Students"

    ERIC Educational Resources Information Center

    Bolivar, Bathélemy

    2015-01-01

    The three-phase process "Instrument for Minority Immigrant Science Learning Environment" (I_MISLE), an 8-scale, 32-item instrument (see Appendix I), when completed by teachers provides an accurate description of existing conditions in classrooms in which immigrant and refugee students are situated. Through the completion of the instrument…

  2. Navigating the Problem Space of Academia: Exploring Processes of Course Design and Classroom Teaching in Postsecondary Settings. WCER Working Paper No. 2014-1

    ERIC Educational Resources Information Center

    Hora, Matthew T.

    2014-01-01

    Policymakers and educators alike increasingly focus on faculty adoption of interactive teaching techniques as a way to improve undergraduate education. Yet, little empirical research exists that examines the processes whereby faculty make decisions about curriculum design and classroom teaching in real-world situations. In this study, I use the idea…

  3. Culture in the Classroom

    ERIC Educational Resources Information Center

    Medin, Douglas L.; Bang, Megan

    2014-01-01

    Culture plays a large but often unnoticeable role in what we teach and how we teach children. We are a country of immense diversity, but in classrooms the dominant European-American culture has become the language of learning.

  4. The Role of Personal Narrative in Constructing Classroom Curriculum

    ERIC Educational Resources Information Center

    Kouri, Donald A.

    2005-01-01

    This article explores the nature of the author's classroom practice and provides insights about how he constructed and delivered classroom curriculum for the electrical apprenticeship classroom. In this article, the author defines classroom curriculum as the planned and guided learning experiences that take place in a classroom setting with…

  5. WebViz:A Web-based Collaborative Interactive Visualization System for large-Scale Data Sets

    NASA Astrophysics Data System (ADS)

    Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.

    2010-12-01

    WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota’s Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built upon over the last 3 1/2 years. The motivation behind WebViz lies primarily with the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to the market, including the use of general purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data ‘on the fly’, wherever he or she may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote, web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE’s custom hierarchical volume rendering software provides high resolution visualizations on the order of 15 million pixels and has been employed for visualizing data from simulations ranging from astrophysics to geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web and javascript-enabled cell phones.
Features in the current version include the ability for users to (1) securely login (2) launch multiple visualizations (3) conduct collaborative visualization sessions (4) delegate control aspects of a visualization to others and (5) engage in collaborative chats with other users within the user interface of the web application. These features are all in addition to a full range of essential visualization functions including 3-D camera and object orientation, position manipulation, time-stepping control, and custom color/alpha mapping.

  6. Hydraulic behavior of two areas of the Floridan aquifer system characterized by complex hydrogeologic settings and large groundwater withdrawals

    SciTech Connect

    Maslia, M.L.

    1993-03-01

    Two areas of the Floridan aquifer system (FAS) that are characterized by complex hydrogeologic settings and exceedingly large ground-water withdrawals are the Dougherty Plain area of southwest GA and the Glynn County area of southeast GA. In southwest GA, large-scale withdrawals of ground water for agricultural and livestock irrigation amounted to about 148 million gallons per day (Mgal/d) during 1990. Large-scale pumping in Glynn County, primarily used for industrial purposes and centered in the City of Brunswick, amounted to about 88 Mgal/d during 1990. In southwest GA, the FAS consists primarily of the Ocala Limestone (OL) of late Eocene age. Confining the aquifer from above is a residual layer (50 ft thick) of sand and clay containing silicified boulders which is derived from the chemical weathering of the OL. This area is characterized by karst topography marked by numerous depressions and sinkholes, high transmissivity (generally greater than 50,000 feet squared per day), and significant hydraulic connections to overlying streams and lakes. These characteristics, along with the seasonal nature of pumping and mean annual recharge of about 10 inches per year, have prevented permanent, long-term water-level declines. In the Glynn County area, the FAS can be more than 2,600 ft thick, consisting of a sequence of calcareous and dolomitic rocks that are Late Cretaceous to early Miocene in age. The aquifer system is confined above by clastic rocks of Middle Miocene age, having an average thickness of 400 ft. This area is characterized by post-depositional tectonic modification of the subsurface as opposed to simple karst development, thick confinement of the aquifer system, and significant amounts of vertical leakage of water from below. These characteristics and heavy, long-term pumping from the Upper Floridan aquifer (UFA) have caused a broad, shallow cone of depression to develop and the upward migration of saltwater to contaminate the freshwater zones of the UFA.

  7. My Classroom Physical Activity Pyramid: A Tool for Integrating Movement into the Classroom

    ERIC Educational Resources Information Center

    Orlowski, Marietta; Lorson, Kevin; Lyon, Anna; Minoughan, Susan

    2013-01-01

    The classroom teacher is a critical team member of a comprehensive school physical activity program and an activity-friendly school environment. Students spend more time in the classroom than in any other school setting or environment. Classrooms are busy places, and classroom teachers must make decisions about how to make the best use of their…

  9. Comparison of large-scale global land precipitation from multisatellite and reanalysis products with gauge-based GPCC data sets

    NASA Astrophysics Data System (ADS)

    Prakash, Satya; Gairola, R. M.; Mitra, A. K.

    2015-07-01

    Reliable information on land precipitation along with other atmospheric variables is crucial for monsoon studies, ecosystem modelling, crop modelling and numerous other applications. In this paper, three multisatellite and three reanalysis precipitation products, namely the Global Precipitation Climatology Project (GPCP), the Climate Prediction Center Merged Analysis of Precipitation (CMAP1 and CMAP2), the European Centre for Medium-Range Weather Forecasts Interim Reanalysis (ERA-I) and the National Centers for Environmental Prediction reanalyses (NCEP1 and NCEP2), are compared with the recent version of the gauge-based gridded Global Precipitation Climatology Centre (GPCC) data sets over the global land region. The analysis is done at monthly scale and at 2.5° latitude × 2.5° longitude resolution for a 25-year (1986-2010) period. Large-scale prominent features of precipitation and its variability are qualitatively represented by all the precipitation products; however, their magnitudes differ considerably. Among the six precipitation products, GPCP performs better than the others when compared to the gridded GPCC data sets. Among the three reanalysis precipitation products, ERA-I is generally better than NCEP1 and NCEP2. Even though NCEP2 is improved over NCEP1 over the mid-latitudes, NCEP2 has a more serious problem over orographic regions than NCEP1. Moreover, all the precipitation estimates exhibit a similar kind of interannual variability over the global and tropical land regions. Additionally, the comparison is carried out for the six global monsoon regions; this regional analysis shows that all the precipitation estimates exhibit a similar kind of interannual variability in the seasonal monsoon precipitation. However, there are some regional differences among these precipitation products in the representation of monsoon variability.
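
    The kind of gridwise comparison described above can be illustrated with a short numpy sketch. The grids below are synthetic stand-ins for the satellite/reanalysis and gauge-based fields (not the actual GPCP/GPCC data), and the imposed +5 mm/month bias is purely hypothetical:

```python
# Gridwise evaluation of a precipitation product against a gauge reference:
# overall mean bias plus a per-cell temporal correlation over the record.
import numpy as np

rng = np.random.default_rng(0)
months, nlat, nlon = 300, 72, 144          # 25 years of monthly 2.5-degree cells
gauge = rng.gamma(2.0, 40.0, (months, nlat, nlon))    # "gauge" reference (mm/month)
prod = gauge + rng.normal(5.0, 20.0, gauge.shape)     # biased, noisy product

bias = float((prod - gauge).mean())        # overall mean bias (mm/month)

# per-cell temporal correlation with the reference
ga = gauge - gauge.mean(axis=0)
pa = prod - prod.mean(axis=0)
corr = (ga * pa).sum(axis=0) / np.sqrt((ga**2).sum(axis=0) * (pa**2).sum(axis=0))

print(round(bias, 1))                      # near the imposed +5 mm/month bias
print(corr.shape)                          # one correlation per grid cell
```

    Area weighting by cell latitude, which a real 2.5° comparison would need, is omitted here for brevity.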

  10. Simulation Studies as Designed Experiments: The Comparison of Penalized Regression Models in the “Large p, Small n” Setting

    PubMed Central

    Chaibub Neto, Elias; Bare, J. Christopher; Margolin, Adam A.

    2014-01-01

    New algorithms are continuously proposed in computational biology. Performance evaluation of novel methods is important in practice. Nonetheless, the field experiences a lack of rigorous methodology aimed at systematically and objectively evaluating competing approaches. Simulation studies are frequently used to show that a particular method outperforms another. Oftentimes, however, simulation studies are not well designed, and it is hard to characterize the particular conditions under which different methods perform better. In this paper we propose the adoption of well-established techniques in the design of computer and physical experiments for developing effective simulation studies. By following best practices in the planning of experiments we are better able to understand the strengths and weaknesses of competing algorithms, leading to more informed decisions about which method to use for a particular task. We illustrate the application of our proposed simulation framework with a detailed comparison of the ridge-regression, lasso and elastic-net algorithms in a large-scale study investigating the effects on predictive performance of sample size, number of features, true model sparsity, signal-to-noise ratio, and feature correlation, in situations where the number of covariates is usually much larger than sample size. Analysis of data sets containing tens of thousands of features but only a few hundred samples is nowadays routine in computational biology, where “omics” features such as gene expression, copy number variation and sequence data are frequently used in the predictive modeling of complex phenotypes such as anticancer drug response. The penalized regression approaches investigated in this study are popular choices in this setting and our simulations corroborate well-established results concerning the conditions under which each one of these methods is expected to perform best while providing several novel insights. PMID:25289666
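
    A minimal, self-contained sketch of the "large p, small n" contrast is shown below. It uses a closed-form ridge solve and plain ISTA for the lasso; the study itself also covers the elastic net and uses far richer simulation designs, so this is only illustrative:

```python
# Ridge vs. lasso in a synthetic "large p, small n" setting: 60 samples,
# 500 features, only 5 of which carry signal. Ridge uses its closed form;
# lasso is fit with plain ISTA (iterative soft-thresholding).
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 60, 500, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 3.0                               # sparse true model
y = X @ beta + 0.5 * rng.standard_normal(n)

def ridge(X, y, lam):
    # closed form: (X'X + lam*I)^(-1) X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def lasso_ista(X, y, lam, iters=500):
    b = np.zeros(X.shape[1])
    L = np.linalg.norm(X, 2) ** 2            # Lipschitz constant of the gradient
    for _ in range(iters):
        z = b - X.T @ (X @ b - y) / L        # gradient step
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return b

b_r = ridge(X, y, lam=10.0)
b_l = lasso_ista(X, y, lam=20.0)
print("ridge nonzeros:", int(np.sum(np.abs(b_r) > 1e-8)))  # dense estimate
print("lasso nonzeros:", int(np.sum(np.abs(b_l) > 1e-8)))  # sparse estimate
```

    Under these assumptions the lasso returns a sparse coefficient vector while ridge shrinks but keeps every coefficient nonzero, mirroring the qualitative behavior the study quantifies across its simulation conditions.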

  11. Automatic detection of rate change in large data sets with an unsupervised approach: the case of influenza viruses.

    PubMed

    Labonté, Kasandra; Aris-Brosou, Stéphane

    2016-04-01

    Influenza viruses evolve at such a high rate that vaccine recommendations need to be changed, but not quite on a regular basis. This observation suggests that the rate of evolution of these viruses is not constant through time, which begs the question as to when such rate changes occur, if they do so independently of the host in which they circulate and (or) independently of their subtype. To address these outstanding questions, we introduce a novel heuristic, Mclust*, that is based on a two-tier clustering approach in a phylogenetic context to estimate (i) absolute rates of evolution and (ii) when rate change occurs. We employ the novel approach to compare the two influenza surface proteins, hemagglutinin and neuraminidase, that circulated in avian, human, and swine hosts between 1960 and 2014 in two subtypes: H3N2 and H1N1. We show that the algorithm performs well in most conditions, accounting for phylogenetic uncertainty by means of bootstrapping, and scales up to analyze very large data sets. Our results show that our approach is robust to the time-dependent artifact of rate estimation, and confirm pervasive punctuated evolution across hosts and subtypes. As such, the novel approach can potentially detect when vaccine composition needs to be updated. PMID:26966881

  12. Spatial Fingerprints of Community Structure in Human Interaction Network for an Extensive Set of Large-Scale Regions

    PubMed Central

    Kallus, Zsófia; Barankai, Norbert; Szüle, János; Vattay, Gábor

    2015-01-01

    Human interaction networks inferred from country-wide telephone activity recordings were recently used to redraw political maps by projecting their topological partitions into geographical space. The results showed remarkable spatial cohesiveness of the network communities and a significant overlap between the redrawn and the administrative borders. Here we present a similar analysis based on one of the most popular online social networks represented by the ties between more than 5.8 million of its geo-located users. The worldwide coverage of their measured activity allowed us to analyze the large-scale regional subgraphs of entire continents and an extensive set of examples for single countries. We present results for North and South America, Europe and Asia. In our analysis we used the well-established method of modularity clustering after an aggregation of the individual links into a weighted graph connecting equal-area geographical pixels. Our results show fingerprints of both of the opposing forces of dividing local conflicts and of uniting cross-cultural trends of globalization. PMID:25993329
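
    The aggregation step described above (binning individual geo-located ties into a weighted pixel-to-pixel graph before modularity clustering) can be sketched as follows. The coordinates and the simple 1-degree lat/lon grid are hypothetical simplifications of the paper's equal-area pixels:

```python
# Aggregate individual user-to-user ties into a weighted graph whose nodes
# are geographical pixels, ready for community (modularity) clustering.
from collections import Counter

def pixel(lat, lon, size_deg=1.0):
    """Map a coordinate to a pixel id on a simple lat/lon grid."""
    return (int(lat // size_deg), int(lon // size_deg))

# each tie: ((lat1, lon1), (lat2, lon2)) for the two connected users
ties = [((48.8, 2.3), (52.5, 13.4)),      # Paris - Berlin
        ((48.9, 2.4), (52.4, 13.5)),      # Paris - Berlin again
        ((40.7, -74.0), (34.1, -118.2))]  # New York - Los Angeles

weights = Counter()
for a, b in ties:
    pa, pb = pixel(*a), pixel(*b)
    edge = tuple(sorted((pa, pb)))        # undirected pixel-to-pixel edge
    weights[edge] += 1

for edge, w in weights.items():
    print(edge, w)
```

    The resulting weighted edges are exactly the input that modularity clustering algorithms consume; a production pipeline would use equal-area pixels rather than this naive grid.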

  13. Spatial fingerprints of community structure in human interaction network for an extensive set of large-scale regions.

    PubMed

    Kallus, Zsófia; Barankai, Norbert; Szüle, János; Vattay, Gábor

    2015-01-01

    Human interaction networks inferred from country-wide telephone activity recordings were recently used to redraw political maps by projecting their topological partitions into geographical space. The results showed remarkable spatial cohesiveness of the network communities and a significant overlap between the redrawn and the administrative borders. Here we present a similar analysis based on one of the most popular online social networks represented by the ties between more than 5.8 million of its geo-located users. The worldwide coverage of their measured activity allowed us to analyze the large-scale regional subgraphs of entire continents and an extensive set of examples for single countries. We present results for North and South America, Europe and Asia. In our analysis we used the well-established method of modularity clustering after an aggregation of the individual links into a weighted graph connecting equal-area geographical pixels. Our results show fingerprints of both of the opposing forces of dividing local conflicts and of uniting cross-cultural trends of globalization. PMID:25993329

  14. Identifying Cognate Binding Pairs among a Large Set of Paralogs: The Case of PE/PPE Proteins of Mycobacterium tuberculosis

    PubMed Central

    Riley, Robert; Pellegrini, Matteo; Eisenberg, David

    2008-01-01

    We consider the problem of how to detect cognate pairs of proteins that bind when each belongs to a large family of paralogs. To illustrate the problem, we have undertaken a genomewide analysis of interactions of members of the PE and PPE protein families of Mycobacterium tuberculosis. Our computational method uses structural information, operon organization, and protein coevolution to infer the interaction of PE and PPE proteins. Some 289 PE/PPE complexes were predicted out of a possible 5,590 PE/PPE pairs genomewide. Thirty-five of these predicted complexes were also found to have correlated mRNA expression, providing additional evidence for these interactions. We show that our method is applicable to other protein families, by analyzing interactions of the Esx family of proteins. Our resulting set of predictions is a starting point for genomewide experimental interaction screens of the PE and PPE families, and our method may be generally useful for detecting interactions of proteins within families having many paralogs. PMID:18787688

  15. Global nonlinear kernel prediction for large data set with a particle swarm-optimized interval support vector regression.

    PubMed

    Ding, Yongsheng; Cheng, Lijun; Pedrycz, Witold; Hao, Kuangrong

    2015-10-01

    A new global nonlinear predictor with a particle swarm-optimized interval support vector regression (PSO-ISVR) is proposed to address three issues (viz., kernel selection, model optimization, kernel method speed) encountered when applying SVR in the presence of large data sets. The novel prediction model can reduce the SVR computing overhead by dividing the input space and adaptively selecting the optimized kernel functions to obtain optimal SVR parameters by PSO. To quantify the quality of the predictor, its generalization performance and execution speed are investigated based on statistical learning theory. In addition, experiments using synthetic data as well as the stock volume weighted average price are reported to demonstrate the effectiveness of the developed models. The experimental results show that the proposed PSO-ISVR predictor can improve the computational efficiency and the overall prediction accuracy compared with the results produced by the SVR and other regression methods. The proposed PSO-ISVR provides an important tool for nonlinear regression analysis of big data. PMID:25974954
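
    The particle swarm component can be illustrated in isolation. The sketch below applies a textbook PSO to a toy one-dimensional objective, standing in for the SVR hyperparameter search; it is not the paper's PSO-ISVR implementation, and the swarm parameters are conventional defaults:

```python
# Textbook particle swarm optimization over a bounded 1-D search space,
# of the kind used to tune SVR kernel parameters.
import random

def pso(objective, lo, hi, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    random.seed(0)
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                           # each particle's best position
    pbest_val = [objective(x) for x in pos]
    gbest = min(zip(pbest_val, pbest))[1]    # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vel[i] = (w * vel[i] + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], lo), hi)   # keep in bounds
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest_val[i], pbest[i] = val, pos[i]
        gbest = min(zip(pbest_val, pbest))[1]
    return gbest

# toy objective with its minimum at x = 2
best = pso(lambda x: (x - 2.0) ** 2, lo=-10.0, hi=10.0)
print(round(best, 2))   # best position found, near the minimum
```

    In PSO-ISVR the objective would instead be a cross-validated SVR error, so each particle evaluation is expensive; that cost motivates the paper's input-space partitioning.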

  16. Early Miocene Kirka-Phrigian caldera, western Anatolia - an example of large volume silicic magma generation in extensional setting

    NASA Astrophysics Data System (ADS)

    Seghedi, Ioan; Helvacı, Cahit

    2014-05-01

    Large rhyolitic ignimbrite occurrences are closely connected to the Early Miocene initiation of extensional processes in central-west Anatolia along the Tavşanlı-Afyon zones. Field correlations, petrographical, geochemical and geochronological data lead to a substantial reinterpretation of the ignimbrites surrounding the Kırka area, known for its world-class borate deposits, as representing the climactic event of a caldera collapse, unknown up to now and newly named the "Kırka-Phrigian caldera". The caldera, which is roughly oval (24 km x 15 km) in shape and one of the largest in Turkey, is supposed to have been formed in a single-stage collapse event at ~19 Ma that generated a huge volume of extracaldera outflow ignimbrites. Transtensive/distensive tectonic stresses since ~25 Ma resulted in the NNW-SSE elongation of the magma chamber and influenced the roughly elliptical shape of the subsided block (caldera floor) belonging to the apex of the Eskişehir-Afyon-Isparta volcanic area. Intracaldera post-collapse sedimentation and volcanism (at ~18 Ma) was controlled through subsidence-related faults with the generation of a series of volcanic structures (mainly domes) showing a large compositional range from saturated silicic rhyolites and crystal-rich trachytes to undersaturated lamproites. Such a volcanic rock association is typical for lithospheric extension. In this scenario, enriched mantle components within the subcontinental lithospheric mantle will begin to melt via decompression melting during the initiation of extension. Interaction of these melts with crustal rocks, fractionation processes and crustal anatexis driven by the heat contained in the ascending mantle melts produced the silicic compositions in a large crustal reservoir. Such silicic melts generated the initial eruptions of the Kırka-Phrigian caldera ignimbrites.
The rock volume and geochemical evidence suggest that the silicic volcanic rocks come from a long-lived magma chamber that evolved episodically; after caldera generation there is a shift to small-volume episodic rhyolitic, trachytic and lamproitic volcanism, the latter indicating a more primitive magma input with an evident origin in an enriched mantle lithosphere. The volcanic rock succession provides a direct picture of the state of the magmatic system at the time of the eruptions that generated the caldera and post-caldera structures, and offers an excellent example of silicic magma generation and associated potassic and ultrapotassic intermediate-mafic rocks in a post-collisional extensional setting.

  17. Using Technology To Implement Active Learning in Large Classes. Technical Report.

    ERIC Educational Resources Information Center

    Gerace, William J.; Dufresne, Robert J.; Leonard, William J.

    An emerging technology, classroom communication systems (CCSs), has the potential to transform the way we teach science in large-lecture settings. CCSs can serve as catalysts for creating a more interactive, student-centered classroom in the lecture hall, thereby allowing students to become more actively involved in constructing and using…

  18. Clickenomics: Using a Classroom Response System to Increase Student Engagement in a Large-Enrollment Principles of Economics Course

    ERIC Educational Resources Information Center

    Salemi, Michael K.

    2009-01-01

    One of the most important challenges facing college instructors of economics is helping students engage. Engagement is particularly important in a large-enrollment Principles of Economics course, where it can help students achieve a long-lived understanding of how economists use basic economic ideas to look at the world. The author reports how…

  19. Question Driven Instruction with Classroom Response Technology

    NASA Astrophysics Data System (ADS)

    Gerace, William; Beatty, Ian

    2007-10-01

    Essentially, a classroom response system is technology that: 1) allows an instructor to present a question or problem to the class; 2) allows students to enter their answers into some kind of device; and 3) instantly aggregates and summarizes students' answers for the instructor, usually as a histogram. Most response systems provide additional functionality. Some additional names for this class of system (or for subsets of the class) are classroom communication system (CCS), audience response system (ARS), voting machine system, audience feedback system, and--most ambitiously--CATAALYST system (for ``Classroom Aggregation Technology for Activating and Assessing Learning and Your Students' Thinking''). UMPERG has been teaching with and researching classroom response systems since 1993. We find that the technology has the potential to transform the way we teach science in large lecture settings. CRSs can serve as catalysts for creating a more interactive, student-centered classroom in the lecture hall, thereby allowing students to become more actively involved in constructing and using knowledge. CRSs not only make it easier to engage students in learning activities during lecture but also enhance the communication among students, and between the students and the instructor. This enhanced communication assists the students and the instructor in assessing understanding during class time, and affords the instructor the opportunity to devise instructional interventions that target students' needs as they arise.

  20. Empirical Mining of Large Data Sets Already Helps to Solve Practical Ecological Problems; A Panoply of Working Examples (Invited)

    NASA Astrophysics Data System (ADS)

    Hargrove, W. W.; Hoffman, F. M.; Kumar, J.; Spruce, J.; Norman, S. P.

    2013-12-01

    Here we present diverse examples where empirical mining and statistical analysis of large data sets have already been shown to be useful for a wide variety of practical decision-making problems within the realm of large-scale ecology. Because a full understanding and appreciation of particular ecological phenomena are possible only after hypothesis-directed research regarding the existence and nature of that process, some ecologists may feel that purely empirical data harvesting may represent a less-than-satisfactory approach. Restricting ourselves exclusively to process-driven approaches, however, may actually slow progress, particularly for more complex or subtle ecological processes. We may not be able to afford the delays caused by such directed approaches. Rather than attempting to formulate and ask every relevant question correctly, empirical methods allow trends, relationships and associations to emerge freely from the data themselves, unencumbered by a priori theories, ideas and prejudices that have been imposed upon them. Although they cannot directly demonstrate causality, empirical methods can be extremely efficient at uncovering strong correlations with intermediate "linking" variables. In practice, these correlative structures and linking variables, once identified, may provide sufficient predictive power to be useful themselves. Such correlation "shadows" of causation can be harnessed by, e.g., Bayesian Belief Nets, which bias ecological management decisions, made with incomplete information, toward favorable outcomes. Empirical data-harvesting also generates a myriad of testable hypotheses regarding processes, some of which may even be correct. 
Quantitative statistical regionalizations based on multivariate similarity have lent insights into carbon eddy-flux direction and magnitude, wildfire biophysical conditions, phenological ecoregions useful for vegetation type mapping and monitoring, forest disease risk maps (e.g., sudden oak death), global aquatic ecoregion risk maps for aquatic invasives, and forest vertical structure ecoregions (e.g., using extensive LiDAR data sets). Multivariate Spatio-Temporal Clustering, which quantitatively places alternative future conditions on a common footing with present conditions, allows prediction of present and future shifts in tree species ranges, given alternative climatic change forecasts. ForWarn, a forest disturbance detection and monitoring system mining 12 years of national 8-day MODIS phenology data, has been operating since 2010, producing national maps every 8 days showing many kinds of potential forest disturbances. Forest resource managers can view disturbance maps via a web-based viewer, and alerts are issued when particular forest disturbances are seen. Regression-based decadal trend analysis showing long-term forest thrive and decline areas, and individual-based, brute-force supercomputing to map potential movement corridors and migration routes across landscapes will also be discussed. As significant ecological changes occur with increasing rapidity, such empirical data-mining approaches may be the most efficient means to help land managers find the best, most-actionable policies and decision strategies.
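
A bare-bones version of the multivariate clustering that underlies such regionalizations might look like the following. This is plain k-means on hypothetical two-variable pixels, not the authors' Multivariate Spatio-Temporal Clustering code, and the "temperature/precipitation" values are invented for illustration:

```python
# Regionalization sketch: cluster map pixels on multivariate environmental
# variables so that each cluster plays the role of a quantitative ecoregion.
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each pixel to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned pixels
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# hypothetical pixels: (temperature, precipitation) with two obvious regions
X = np.vstack([np.random.default_rng(1).normal((10, 500), 1, (50, 2)),
               np.random.default_rng(2).normal((25, 1500), 1, (50, 2))])
labels, _ = kmeans(X, k=2)
print(len(set(labels.tolist())))   # number of recovered regions
```

A real regionalization would standardize each variable first and carry many more layers (soils, topography, phenology), but the cluster-assignment logic is the same.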

  1. Rates and Mechanisms of Solidification in Large Magma Bodies: Implications for Melt Extraction in all Tectonic Settings

    NASA Astrophysics Data System (ADS)

    VanTongeren, J. A.

    2013-12-01

    As is observed in both experiment and theory, in the absence of hydrothermal convection, the majority of magma chamber heat loss occurs via conduction through the roof of the intrusion and into the cold country rock above. The formation of an upper solidification front (or Upper Border Series, UBS), recorded in the rocks both geochemically and texturally, is a natural outcome of the progression of the solidification front from the cold roof to the hot center of the magma chamber. There are, however, a few unique layered mafic intrusions for which little or no UBS exists. In this study, I examine the thermal evolution and crystallization rates of several classic layered intrusions as they are recorded in the extent of the preserved UBS. For those intrusions that have experienced crystallization at the roof, such as the Skaergaard Intrusion, the development of a UBS reduces the temperature gradient at the roof and effectively slows the rate of heat loss from the main magma body. However, for those intrusions that do not have a UBS, such as the Bushveld Complex, the cooling rate is controlled only by the maximum rate of conductive heat loss through the overlying roof rocks, which decreases with time. The implications are two-fold: (1) The relative thickness of the UBS in large intrusions may be the key to quantifying their cooling and solidification rates; and (2) The nature of the magma mush zone near the roof of an intrusion may depend principally on the long-term thermal evolution of the magma body. Particularly at the end stages of crystallization, when the liquids are likely to be highly evolved and high viscosities may inhibit convection, intrusions lacking a well-defined UBS may provide important insights into the mechanics of crystal-liquid separation, melt extraction, and compaction in felsic plutons as well as mafic intrusions. These results are important for long-lived (>500 kyr) or repeatedly replenished magma chambers in all tectonic settings.

  2. Classroom Network Technology as a Support for Systemic Mathematics Reform: The Effects of TI MathForward on Student Achievement in a Large, Diverse District

    ERIC Educational Resources Information Center

    Penuel, William; Singleton, Corinne; Roschelle, Jeremy

    2011-01-01

    Low-cost, portable classroom network technologies have shown great promise in recent years for improving teaching and learning in mathematics. This paper explores the impacts on student learning in mathematics when a program to introduce network technologies into mathematics classrooms is integrated into a systemic reform initiative at the…

  3. A Study of Classroom Response System Clickers: Increasing Student Engagement and Performance in a Large Undergraduate Lecture Class on Architectural Research

    ERIC Educational Resources Information Center

    Bachman, Leonard; Bachman, Christine

    2011-01-01

    This study examines the effectiveness of a classroom response system (CRS) and architecture students' perceptions of real-time feedback. CRS is designed to increase active engagement of students by their responses to a question or prompt via wireless keypads. Feedback is immediately portrayed on a classroom projector for discussion. The authors…

  4. Classroom Planetarium.

    ERIC Educational Resources Information Center

    Ankney, Paul

    1981-01-01

    Provides instructions for the construction of a paper mache classroom planetarium and suggests several student activities using this planetarium model. Lists reasons why students have difficulties in transferring classroom instruction in astronomy to the night sky. (DS)

  5. Teaching Cell Biology in the Large-Enrollment Classroom: Methods to Promote Analytical Thinking and Assessment of Their Effectiveness

    PubMed Central

    Kitchen, Elizabeth; Bell, John D.; Reeve, Suzanne; Sudweeks, Richard R.; Bradshaw, William S.

    2003-01-01

    A large-enrollment, undergraduate cellular biology lecture course is described whose primary goal is to help students acquire skill in the interpretation of experimental data. The premise is that this kind of analytical reasoning is not intuitive for most people and, in the absence of hands-on laboratory experience, will not readily develop unless instructional methods and examinations specifically designed to foster it are employed. Promoting scientific thinking forces changes in the roles of both teacher and student. We describe didactic strategies that include directed practice of data analysis in a workshop format, active learning through verbal and written communication, visualization of abstractions diagrammatically, and the use of ancillary small-group mentoring sessions with faculty. The implications for a teacher in reducing the breadth and depth of coverage, becoming coach instead of lecturer, and helping students to diagnose cognitive weaknesses are discussed. In order to determine the efficacy of these strategies, we have carefully monitored student performance and have demonstrated a large gain in a pre- and posttest comparison of scores on identical problems, improved test scores on several successive midterm examinations when the statistical analysis accounts for the relative difficulty of the problems, and higher scores in comparison to students in a control course whose objective was information transfer, not acquisition of reasoning skills. A novel analytical index (student mobility profile) is described that demonstrates that this improvement was not random, but a systematic outcome of the teaching/learning strategies employed. An assessment of attitudes showed that, in spite of finding it difficult, students endorse this approach to learning, but also favor curricular changes that would introduce an analytical emphasis earlier in their training. PMID:14506506

  6. A review of sea-spray aerosol source functions using a large global set of sea salt aerosol concentration measurements

    NASA Astrophysics Data System (ADS)

    Grythe, H.; Ström, J.; Krejci, R.; Quinn, P.; Stohl, A.

    2014-02-01

    Sea-spray aerosols (SSA) are an important part of the climate system because of their effects on the global radiative budget - both directly as scatterers and absorbers of solar and terrestrial radiation, and indirectly as cloud condensation nuclei (CCN) influencing cloud formation, lifetime, and precipitation. In terms of their global mass, SSA have the largest uncertainty of all aerosols. In this study we review 21 SSA source functions from the literature, several of which are used in current climate models. In addition, we propose a new function. Even excluding outliers, the global annual SSA mass produced spans roughly 3-70 Pg yr-1 for the different source functions, for particles with dry diameter Dp < 10 μm, with relatively little interannual variability for a given function. The FLEXPART Lagrangian particle dispersion model was run in backward mode for a large global set of observed SSA concentrations, comprised of several station networks and ship cruise measurement campaigns. FLEXPART backward calculations produce gridded emission sensitivity fields, which can subsequently be multiplied with gridded SSA production fluxes in order to obtain modeled SSA concentrations. This allowed us to efficiently and simultaneously evaluate all 21 source functions against the measurements. Another advantage of this method is that source-region information on wind speed and sea surface temperatures (SSTs) could be stored and used for improving the SSA source function parameterizations. The best source functions reproduced as much as 70% of the observed SSA concentration variability at several stations, which is comparable with "state of the art" aerosol models. The main driver of SSA production is wind, and we found that the best fit to the observation data could be obtained when the SSA production is proportional to U10^3.5, where U10 is the source region averaged 10 m wind speed.
A strong influence of SST on SSA production, with higher temperatures leading to higher production, could be detected as well, although the underlying physical mechanisms of the SST influence remain unclear. Our new source function with wind speed and temperature dependence gives a global SSA production of 9 Pg yr-1 for particles with dry diameter Dp < 10 μm, and is the best fit to the observed concentrations.
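
    The evaluation strategy in this abstract - multiplying backward-run emission-sensitivity fields by gridded production fluxes to get modeled concentrations - can be sketched in a few lines. This is a simplified illustration, not the authors' code; the proportionality coefficient and the toy one-dimensional grids are invented.

```python
def ssa_flux(u10, a=1.0e-15, wind_exp=3.5):
    """Simplified SSA mass flux for one grid cell, proportional to the
    10 m wind speed raised to the power 3.5. The coefficient `a` is an
    arbitrary placeholder, not a fitted value from the study."""
    return a * u10 ** wind_exp

def modeled_concentration(sensitivity, u10_grid):
    """Combine a gridded emission-sensitivity field (from a backward
    Lagrangian run) with gridded production fluxes, cell by cell, to
    obtain a modeled receptor concentration."""
    return sum(s * ssa_flux(u) for s, u in zip(sensitivity, u10_grid))

# Toy 1-D 'grids': the high-wind cell dominates the modeled signal
# because the flux scales with wind to the 3.5 power.
sens = [0.2, 0.5, 0.3]
winds = [5.0, 12.0, 8.0]
conc = modeled_concentration(sens, winds)
```

    The advantage noted in the abstract follows directly from this structure: one backward run yields the sensitivity field, and any number of candidate source functions can then be evaluated by swapping the flux term.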

  7. A review of sea spray aerosol source functions using a large global set of sea salt aerosol concentration measurements

    NASA Astrophysics Data System (ADS)

    Grythe, H.; Ström, J.; Krejci, R.; Quinn, P.; Stohl, A.

    2013-08-01

    Sea spray aerosols (SSA) are an important part of the climate system through their effects on the global radiative budget both directly as scatterers and absorbers of solar and terrestrial radiation, and indirectly as cloud condensation nuclei (CCN) influencing cloud formation, lifetime and precipitation. In terms of their global mass, SSA have the largest uncertainty of all aerosols. In this study we review 21 SSA source functions from the literature, several of which are used in current climate models, and we also propose a new function. Even excluding outliers, the global annual SSA mass produced by these source functions spans roughly 3-70 Pg yr-1 for the different source functions, with relatively little interannual variability for a given function. The FLEXPART Lagrangian model was run in backward mode for a large global set of observed SSA concentrations, comprised of several station networks and ship cruise measurement campaigns. FLEXPART backward calculations produce gridded emission sensitivity fields, which can subsequently be multiplied with gridded SSA production fluxes to obtain modeled SSA concentrations. This allowed us to evaluate all 21 source functions simultaneously against the measurements. Another advantage of this method is that source-region information on wind speed and sea surface temperatures (SSTs) could be stored and used for improving the SSA source function parameterizations. The best source functions reproduced as much as 70% of the observed SSA concentration variability at several stations, which is comparable with "state of the art" aerosol models. The main driver of SSA production is wind, and we found that the best fit to the observation data could be obtained when the SSA production is proportional to U10^3.5, where U10 is the source region averaged 10 m wind speed.
A strong influence of SST on SSA production could be detected as well, although the underlying physical mechanisms of the SST influence remain unclear. Our new source function gives a global SSA production of 9 Pg yr-1 for particles smaller than 10 μm and is the best fit to the observed concentrations.

  8. Spectral analysis and cross-correlation of very large seismic data-sets at the persistently restless Telica Volcano, Nicaragua.

    NASA Astrophysics Data System (ADS)

    Rodgers, Mel; Roman, Diana; Geirsson, Halldor; LaFemina, Peter; Munoz, Angelica; Tenorio, Virginia

    2014-05-01

    Telica Volcano, Nicaragua, is a persistently restless volcano (PRV) with daily seismicity rates that can vary from less than ten events per day to over a thousand events per day. Seismicity rates show little clear correlation with eruptive episodes. This presents a challenge for volcano monitoring and highlights the need for a greater understanding of the patterns of seismicity surrounding eruptive activity at Telica and other PRVs. Multi-parameter seismic investigations, including spectral and multiplet analysis, may provide important precursory information, but are challenging given such high rates of seismicity. We present a program 'peakmatch' that can effectively handle the cross-correlation of hundreds of thousands of events and identify multiplets. In addition, frequency ratios, basic spectral information, and amplitudes can be rapidly calculated for very large seismic data sets. An investigation of the seismic characteristics surrounding the 2011 phreatic eruption at Telica shows an unusual pattern of seismicity. Rather than a precursory increase in seismicity, as is observed prior to many volcanic eruptions, we observe a decrease in seismicity many months before the eruption. Spectral analysis indicates that during periods with high seismicity there are events with a broad range of frequencies, and that during periods of low seismicity there is a progressive loss of events with lower frequency energy (< 3 Hz). Multiplet analysis indicates that during periods with high seismicity there is a high degree of waveform correlation, and that during periods with low seismicity there is a low degree of waveform correlation. We suggest that these patterns of seismicity relate to a cyclic transition between open-system and closed-system degassing. Open-system degassing is observed seismically as periods with high event rates, a broad range of frequency content and high waveform correlation. 
A transition to closed-system degassing could be via sealing of fluid pathways in the magmatic and/or hydrothermal system, and periods of closed-system degassing are observed seismically as low event rates, higher frequency content and low waveform correlation. The eruption may then represent a transition back from closed-system degassing to open-system degassing.
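
    The kind of waveform cross-correlation and multiplet grouping that the 'peakmatch' program performs can be sketched as follows. This is a brute-force illustration of the general technique, not the authors' implementation; the correlation threshold and the synthetic waveforms are invented.

```python
import math

def norm_xcorr_max(a, b):
    """Maximum normalized cross-correlation between two equal-length
    waveforms over all integer lags (brute force, for illustration)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    a = [x - ma for x in a]
    b = [x - mb for x in b]
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    best = 0.0
    for lag in range(-n + 1, n):
        s = sum(a[i] * b[i - lag]
                for i in range(max(0, lag), min(n, n + lag)))
        best = max(best, s / (na * nb))
    return best

def group_multiplets(events, threshold=0.8):
    """Greedy grouping: an event joins the first group whose reference
    waveform correlates above `threshold`, else it starts a new group."""
    groups = []  # list of (reference_waveform, member_indices)
    for idx, w in enumerate(events):
        for ref, members in groups:
            if norm_xcorr_max(ref, w) >= threshold:
                members.append(idx)
                break
        else:
            groups.append((w, [idx]))
    return [members for _, members in groups]
```

    Production tools replace the brute-force lag loop with FFT-based correlation so that hundreds of thousands of event pairs remain tractable; the grouping logic is otherwise the same.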

  9. A Large-Scale Inquiry-Based Astronomy Intervention Project: Impact on Students' Content Knowledge Performance and Views of their High School Science Classroom

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena; Deehan, James

    2015-08-01

    In this paper, we present the results from a study of the impact on students involved in a large-scale inquiry-based astronomical high school education intervention in Australia. Students in this intervention were led through an educational design allowing them to undertake an investigative approach to understanding the lifecycle of stars more aligned with the `ideal' picture of school science. Through the use of two instruments, one focused on content knowledge gains and the other on student views of school science, we explore the impact of this design. Overall, students made moderate content knowledge gains, although these gains depended heavily on the individual teacher, the number of times a teacher implemented the intervention, and the depth to which an individual teacher went with the provided materials. In terms of students' views, there were significant global changes in their views of their experience of the science classroom. However, some areas showed no change or slightly negative changes, some of which were expected and some of which were not. From these results, we comment on the necessity of sustained long-period implementations rather than single interventions, the requirement for similarly sustained professional development, and the importance of monitoring the impact of inquiry-based implementations. This is especially important as inquiry-based approaches to science are required by many new curriculum reforms, most notably in this context, the new Australian curriculum currently being rolled out.

  10. Classroom Management in Diverse Classrooms

    ERIC Educational Resources Information Center

    Milner, H. Richard, IV; Tenore, F. Blake

    2010-01-01

    Classroom management continues to be a serious concern for teachers and especially in urban and diverse learning environments. The authors present the culturally responsive classroom management practices of two teachers from an urban and diverse middle school to extend the construct, culturally responsive classroom management. The principles that…

  11. 1999 Exemplary Classroom Award Recipients.

    ERIC Educational Resources Information Center

    Starnes, Bobby Ann

    1999-01-01

    Presents the 1999 winners of the Foxfire Exemplary Classrooms Awards. The winners were diverse in grade level, urban and rural settings, years of experience with the Foxfire core practices, and ideas about how to implement the Foxfire approach. Activities and experiences from each winning classroom are highlighted. Criteria for winning an award…

  12. Classroom Ecological Inventory: A Process for Mainstreaming.

    ERIC Educational Resources Information Center

    Fuchs, Douglas; And Others

    1994-01-01

    Teachers in Tennessee are using the Classroom Ecological Inventory (CEI) to prepare students with mild disabilities for moves into mainstream settings. The CEI was field tested as part of the Peabody Reintegration Project and involves observation of the regular classroom, regular teacher interview, comparison of the special and regular classrooms,…

  13. Is Our Classroom an Ecological Place?

    ERIC Educational Resources Information Center

    Xia, Wang

    2006-01-01

    The essence of ecology is life and its diversity, integrity, openness and coexistence. When one contemplates and analyzes classroom from the perspective of ecology, classroom should contain open-ended and multiple goals instead of a single and pre-set goal; classroom is more flexible, allowing great diversity instead of being narrow-minded,…

  14. Photometric selection of quasars in large astronomical data sets with a fast and accurate machine learning algorithm

    NASA Astrophysics Data System (ADS)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2014-03-01

    Future astronomical surveys will produce data on ~10^8 objects per night. In order to characterize and classify these sources, we will require algorithms that scale linearly with the size of the data, that can be easily parallelized and where the speedup of the parallel algorithm will be linear in the number of processing cores. In this paper, we present such an algorithm and apply it to the question of colour selection of quasars. We use non-parametric Bayesian classification and a binning algorithm implemented with hash tables (BASH tables). We show that this algorithm's run time scales linearly with the number of test set objects and is independent of the number of training set objects. We also show that it has the same classification accuracy as other algorithms. For current data set sizes, it is up to three orders of magnitude faster than commonly used naive kernel-density-estimation techniques and it is estimated to be about eight times faster than the current fastest algorithm using dual kd-trees for kernel density estimation. The BASH table algorithm scales linearly with the size of the test set data only, and so for future larger data sets, it will be even faster compared to other algorithms which all depend on the size of the test set and the size of the training set. Since it uses linear data structures, it is easier to parallelize compared to tree-based algorithms and its speedup is linear in the number of cores unlike tree-based algorithms whose speedup plateaus after a certain number of cores. Moreover, due to the use of hash tables to implement the binning, the memory usage is very small. While our analysis is for the specific problem of selection of quasars, the ideas are general and the BASH table algorithm can be applied to any density-estimation problem involving sparse high-dimensional data sets.
Since sparse high-dimensional data sets are a common type of scientific data set, this method has the potential to be useful in a broad range of machine-learning applications in astrophysics.
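
    The core idea behind hash-table binning - storing only occupied bins so that lookups are O(1) and memory tracks the data rather than the binned volume - can be sketched as follows. This is a toy histogram-density version for illustration, not the authors' BASH table implementation.

```python
from collections import defaultdict

def bin_key(point, width):
    """Map a point to the integer index tuple of its hypercube bin."""
    return tuple(int(x // width) for x in point)

def build_bins(training_points, width):
    """Hash-table binning: only occupied bins consume memory, which is
    what makes the approach viable for sparse high-dimensional data."""
    bins = defaultdict(int)
    for p in training_points:
        bins[bin_key(p, width)] += 1
    return bins

def density_estimate(bins, n_train, width, point, dim):
    """Histogram-style density: count in the query point's bin divided
    by (N * bin volume). The lookup is a single O(1) hash access,
    independent of the training-set size."""
    return bins.get(bin_key(point, width), 0) / (n_train * width ** dim)
```

    Parallelization is straightforward because test points can be looked up independently, which is the property the abstract credits for linear speedup in the number of cores.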

  15. Novel method to construct large-scale design space in lubrication process utilizing Bayesian estimation based on a small-scale design-of-experiment and small sets of large-scale manufacturing data.

    PubMed

    Maeda, Jin; Suzuki, Tatsuya; Takayama, Kozo

    2012-12-01

    A large-scale design space was constructed using a Bayesian estimation method with a small-scale design of experiments (DoE) and small sets of large-scale manufacturing data without enforcing a large-scale DoE. The small-scale DoE was conducted using various Froude numbers (X(1)) and blending times (X(2)) in the lubricant blending process for theophylline tablets. The response surfaces, design space, and their reliability of the compression rate of the powder mixture (Y(1)), tablet hardness (Y(2)), and dissolution rate (Y(3)) on a small scale were calculated using multivariate spline interpolation, a bootstrap resampling technique, and self-organizing map clustering. The constant Froude number was applied as a scale-up rule. Three experiments under an optimal condition and two experiments under other conditions were performed on a large scale. The response surfaces on the small scale were corrected to those on a large scale by Bayesian estimation using the large-scale results. Large-scale experiments under three additional sets of conditions showed that the corrected design space was more reliable than that on the small scale, even if there was some discrepancy in the pharmaceutical quality between the manufacturing scales. This approach is useful for setting up a design space in pharmaceutical development when a DoE cannot be performed at a commercial large manufacturing scale. PMID:22356256
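
    The correction step described here - using a few large-scale runs to update a small-scale prediction - can be sketched with a conjugate normal update. This is a deliberately simplified stand-in for the authors' multivariate-spline and bootstrap procedure; all numbers are hypothetical.

```python
def bayes_correct(prior_mean, prior_var, observations, obs_var):
    """Conjugate normal update: the small-scale (DoE) prediction serves
    as the prior, and a handful of large-scale manufacturing runs pull
    the posterior toward the commercial-scale response, weighted by
    their precision (inverse variance)."""
    n = len(observations)
    obs_mean = sum(observations) / n
    post_prec = 1.0 / prior_var + n / obs_var
    post_mean = (prior_mean / prior_var + n * obs_mean / obs_var) / post_prec
    return post_mean, 1.0 / post_prec

# Hypothetical example: the small-scale model predicts tablet hardness
# 60 N (variance 25); three large-scale batches measure 70, 72, 68 N
# (variance 9 each). The corrected estimate sits between the two.
mean, var = bayes_correct(60.0, 25.0, [70.0, 72.0, 68.0], 9.0)
```

    The posterior variance is smaller than either source alone, which mirrors the paper's finding that the corrected design space is more reliable than the small-scale one even when the scales disagree.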

  16. Children's Interactions in Follow Through Classrooms: The DCB Observational System.

    ERIC Educational Resources Information Center

    Ross, Sylvia; Zimiles, Herbert

    Observational findings regarding differentiated child behavior in Follow Through classrooms for the year 1971 are presented. The interactional behavior of three groups of elementary school children from different classroom settings (Bank Street School for Children, an open classroom approach; Bank Street Follow Through, an open classroom approach;…

  17. Impacts of Flipped Classroom in High School Health Education

    ERIC Educational Resources Information Center

    Chen, Li-Ling

    2016-01-01

    As advanced technology increasingly infiltrated into classroom, the flipped classroom has come to light in secondary educational settings. The flipped classroom is a new instructional approach that intends to flip the traditional teacher-centered classroom into student centered. The purpose of this research is to investigate the impact of the…

  18. A Zebra in the Classroom.

    ERIC Educational Resources Information Center

    Leake, Devin; Morvillo, Nancy

    1998-01-01

    Describes the care and breeding of zebra fish, suggests various experiments and observations easily performed in a classroom setting, and provides some ideas to further student interest and exploration of these organisms. (DDR)

  19. New technique for real-time interface pressure analysis: getting more out of large image data sets.

    PubMed

    Bogie, Kath; Wang, Xiaofeng; Fei, Baowei; Sun, Jiayang

    2008-01-01

    Recent technological improvements have led to increasing clinical use of interface pressure mapping for seating pressure evaluation, which often requires repeated assessments. However, clinical conditions cannot be controlled as closely as research settings, thereby creating challenges to statistical analysis of data. A multistage longitudinal analysis and self-registration (LASR) technique is introduced that emphasizes real-time interface pressure image analysis in three dimensions. Suitable for use in clinical settings, LASR is composed of several modern statistical components, including a segmentation method. The robustness of our segmentation method is also shown. Application of LASR to analysis of data from neuromuscular electrical stimulation (NMES) experiments confirms that NMES improves static seating pressure distributions in the sacral-ischial region over time. Dynamic NMES also improves weight-shifting over time. These changes may reduce the risk of pressure ulcer development. PMID:18712638

  20. New technique for real-time interface pressure analysis: Getting more out of large image data sets

    PubMed Central

    Bogie, Kath; Wang, Xiaofeng; Fei, Baowei; Sun, Jiayang

    2009-01-01

    Recent technological improvements have led to increasing clinical use of interface pressure mapping for seating pressure evaluation, which often requires repeated assessments. However, clinical conditions cannot be controlled as closely as research settings, thereby creating challenges to statistical analysis of data. A multistage longitudinal analysis and self-registration (LASR) technique is introduced that emphasizes real-time interface pressure image analysis in three dimensions. Suitable for use in clinical settings, LASR is composed of several modern statistical components, including a segmentation method. The robustness of our segmentation method is also shown. Application of LASR to analysis of data from neuromuscular electrical stimulation (NMES) experiments confirms that NMES improves static seating pressure distributions in the sacral-ischial region over time. Dynamic NMES also improves weight-shifting over time. These changes may reduce the risk of pressure ulcer development. PMID:18712638

  1. Influence of Behavior Settings on Role of Inappropriate and Appropriate Behavior.

    ERIC Educational Resources Information Center

    Grimmett, Sadie; And Others

    Behavior settings as control systems of behavior were investigated in two first grade classes in the Tucson Early Education Program (TEEM) experimental school. The two classrooms which served as the experimental units were observed during two behavior settings, individual choice and large group. The observation consisted of recording every 10…

  2. Pre-Service Teachers and Classroom Authority

    ERIC Educational Resources Information Center

    Pellegrino, Anthony M.

    2010-01-01

    This study examined the classroom practices of five pre-service teachers from three secondary schools in a large southeastern state. Through classroom observations, survey responses, reviews of reflection logs, and focus-group interview responses, we centered on the issue of developing classroom authority as a means to effective classroom…

  3. Free vascularised fibular grafting with OsteoSet®2 demineralised bone matrix versus autograft for large osteonecrotic lesions of the femoral head.

    PubMed

    Feng, Yong; Wang, Shanzhi; Jin, Dongxu; Sheng, Jiagen; Chen, Shengbao; Cheng, Xiangguo; Zhang, Changqing

    2011-04-01

    The aim of this study was to compare the safety and efficacy of OsteoSet®2 DBM with autologous cancellous bone in free vascularised fibular grafting for the treatment of large osteonecrotic lesions of the femoral head. Twenty-four patients (30 hips) with large osteonecrotic lesions of the femoral head (stage IIC in six hips, stage IIIC in 14, and stage IVC in ten, according to the classification system of Steinberg et al.) underwent free vascularised fibular grafting with OsteoSet®2 DBM. This group was retrospectively matched to a group of 24 patients (30 hips) who underwent free vascularised fibular grafting with autologous cancellous bone during the same time period according to the aetiology, stage, and size of the lesion and the mean preoperative Harris hip score. A prospective case-controlled study was then performed with a mean follow-up duration of 26 months. The results show no statistically significant differences between the two groups in overall clinical outcome or the radiographic assessment. Furthermore, no adverse events related to the use of the OsteoSet®2 DBM were observed. The results demonstrate that OsteoSet®2 DBM combined with autograft bone performs equally as well as that of autologous bone alone. Therefore, OsteoSet®2 DBM can be used as a safe and effective graft extender in free vascularised fibular grafting for large osteonecrotic lesions of the femoral head. PMID:20012040

  4. Creating a Family-Like Atmosphere in Child Care Settings: All the More Difficult in Large Child Care Centers.

    ERIC Educational Resources Information Center

    Whitehead, Linda C.; Ginsberg, Stacey I.

    1999-01-01

    Presents suggestions for creating family-like programs in large child-care centers in three areas: (1) physical environment, incorporating cozy spaces, beauty, and space for family interaction; (2) caregiving climate, such as sharing home photographs, and serving meals family style; and (3) family involvement, including regular conversations with…

  5. The Effect of Repeated Reading with Pairs of Students in a Large-Group Setting on Fluency and Comprehension for Students at Risk for Reading Failure

    ERIC Educational Resources Information Center

    Frame, John N.

    2011-01-01

    Problem: Some students are failing to develop acceptable reading skills; however, instructional time allocated to reading fluency can increase reading comprehension. The purpose of this study was to compare students who received repeated reading with pairs of students in a large-group setting with those who did not in terms of reading fluency,…

  6. Developing a "Semi-Systematic" Approach to Using Large-Scale Data-Sets for Small-Scale Interventions: The "Baby Matterz" Initiative as a Case Study

    ERIC Educational Resources Information Center

    O'Brien, Mark

    2011-01-01

    The appropriateness of using statistical data to inform the design of any given service development or initiative often depends upon judgements regarding scale. Large-scale data sets, perhaps national in scope, whilst potentially important in informing the design, implementation and roll-out of experimental initiatives, will often remain unused…

  7. Developing a "Semi-Systematic" Approach to Using Large-Scale Data-Sets for Small-Scale Interventions: The "Baby Matterz" Initiative as a Case Study

    ERIC Educational Resources Information Center

    O'Brien, Mark

    2011-01-01

    The appropriateness of using statistical data to inform the design of any given service development or initiative often depends upon judgements regarding scale. Large-scale data sets, perhaps national in scope, whilst potentially important in informing the design, implementation and roll-out of experimental initiatives, will often remain unused

  8. Key Issues and Strategies for Recruitment and Implementation in Large-Scale Randomized Controlled Trial Studies in Afterschool Settings. Afterschool Research Brief. Issue No. 2

    ERIC Educational Resources Information Center

    Jones, Debra Hughes; Vaden-Kiernan, Michael; Rudo, Zena; Fitzgerald, Robert; Hartry, Ardice; Chambers, Bette; Smith, Dewi; Muller, Patricia; Moss, Marcey A.

    2008-01-01

    Under the larger scope of the National Partnership for Quality Afterschool Learning, SEDL funded three awardees to carry out large-scale randomized controlled trials (RCT) assessing the efficacy of promising literacy curricula in afterschool settings on student academic achievement. SEDL provided analytic and technical support to the RCT studies…

  9. pXRF quantitative analysis of the Otowi Member of the Bandelier Tuff: Generating large, robust data sets to decipher trace element zonation in large silicic magma chambers

    NASA Astrophysics Data System (ADS)

    Van Hoose, A. E.; Wolff, J.; Conrey, R.

    2013-12-01

    Advances in portable X-Ray fluorescence (pXRF) analytical technology have made it possible for high-quality, quantitative data to be collected in a fraction of the time required by standard, non-portable analytical techniques. Not only do these advances reduce analysis time, but data may also be collected in the field in conjunction with sampling. Rhyolitic pumice, being primarily glass, is an excellent material to be analyzed with this technology. High-quality, quantitative data for elements that are tracers of magmatic differentiation (e.g. Rb, Sr, Y, Nb) can be collected for whole, individual pumices and subsamples of larger pumices in 4 minutes. We have developed a calibration for powdered rhyolite pumice from the Otowi Member of the Bandelier Tuff analyzed with the Bruker Tracer IV pXRF using Bruker software and influence coefficients for pumice, which measures the following 19 oxides and elements: SiO2, TiO2, Al2O3, FeO*, MnO, CaO, K2O, P2O5, Zn, Ga, Rb, Sr, Y, Zr, Nb, Ba, Ce, Pb, and Th. With this calibration for the pXRF and thousands of individual powdered pumice samples, we have generated an unparalleled data set for any single eruptive unit with known trace element zonation. The Bandelier Tuff of the Valles-Toledo Caldera Complex, Jemez Mountains, New Mexico, is divided into three main eruptive events. For this study, we have chosen the 1.61 Ma, 450 km3 Otowi Member as it is primarily unwelded and pumice samples are easily accessible. The eruption began with a plinian phase from a single source located near the center of the current caldera and deposited the Guaje Pumice Bed. The initial Unit A of the Guaje is geochemically monotonous, but Units B through E, co-deposited with ignimbrite, show very strong chemical zonation in trace elements, progressing upwards through the deposits from highly differentiated compositions (Rb ~350 ppm, Nb ~200 ppm) to less differentiated (Rb ~100 ppm, Nb ~50 ppm).
Co-erupted ignimbrites emplaced during column collapse show similar trace element zonation. The eruption culminated in caldera collapse after transitioning from a single central vent to ring fracture vents. Ignimbrites deposited at this time have lithic breccias and chaotic geochemical profiles. The geochemical discrepancy between early and late deposits warrants detailed, high-resolution sampling and analysis in order to fully understand the dynamics behind zonation processes. Samples were collected from locations that circumvent the caldera and prepared and analyzed in the field and the laboratory with the pXRF. Approximately 2,000 pumice samples will complete this unprecedented data set, allowing detailed reconstruction of trace element zonation around all sides of the Valles Caldera. These data are then used to constrain models of magma chamber processes that produce trace element zonation and how it is preserved in the deposits after a catastrophic, caldera-forming eruption.
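
    A calibration of the kind described - mapping raw pXRF count rates to concentrations against reference standards - can be sketched, in its simplest single-element form, as an ordinary least-squares line. The influence-coefficient approach used by the Bruker software corrects for inter-element matrix effects and is more involved; the standards below are hypothetical.

```python
def fit_line(xs, ys):
    """Ordinary least squares fit y = a*x + b, used here as a toy
    stand-in for a full influence-coefficient calibration: it maps raw
    count rates to concentrations against reference standards."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical standards: (counts per second, Rb concentration in ppm).
counts = [120.0, 260.0, 410.0, 560.0]
rb_ppm = [50.0, 120.0, 190.0, 270.0]
a, b = fit_line(counts, rb_ppm)

def predict_ppm(c):
    """Apply the calibration to a raw count rate from an unknown."""
    return a * c + b
```

    Once such a calibration is in hand, each 4-minute field measurement returns a concentration directly, which is what makes a multi-thousand-sample data set practical.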

  10. Learning in Tomorrow's Classrooms

    ERIC Educational Resources Information Center

    Bowman, Richard F.

    2015-01-01

    Teaching today remains the most individualistic of all the professions, with educators characteristically operating in a highly fragmented world of "their" courses, "their" skills, and "their" students. Learning will occur in the classrooms of the future through a sustainable set of complementary capabilities:…

  11. Classroom Management That Works

    ERIC Educational Resources Information Center

    Cleve, Lauren

    2012-01-01

    The purpose of this study was to find the best classroom management strategies to use when teaching in an elementary school setting. I wanted to identify the best possible management tools for a variety of age groups while also meeting educational standards. Through my research I found that using different approaches at different grade levels is an important…

  12. The Paperless Music Classroom

    ERIC Educational Resources Information Center

    Giebelhausen, Robin

    2016-01-01

    In an age where the world is becoming ever more aware of paper consumption, educators are turning toward technology to cut back on paper waste. Besides the environmental reasons, a paperless music classroom helps students develop their musicianship in new and exciting ways. This article will look at the considerations for setting up a paperless…

  13. Statistical Analysis of a Large Sample Size Pyroshock Test Data Set Including Post Flight Data Assessment. Revision 1

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Anne M.

    2010-01-01

    The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. The analysis methods used to evaluate this unusually large amount of shock data will be discussed in this paper, with particular emphasis on population distributions and finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed a derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level thus contributing to the overall success of the EOS Terra mission. The effectiveness of the statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample size of shock data is demonstrated in this paper.
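One common way to derive an enveloping level from repeated shock measurements is a normal tolerance bound on the dB levels at each frequency: mean plus k standard deviations, with k chosen for the desired probability and confidence. This is a generic illustration, not necessarily the statistical-families methodology of the paper; the data and k below are invented.

```python
import statistics

def tolerance_bound_db(levels_db, k=2.0):
    """Upper statistical bound on shock levels (dB): mean + k * sigma.
    k is a normal tolerance factor set by the desired probability and
    confidence (e.g. roughly 2 for a P95/50-type bound at moderate n)."""
    mu = statistics.mean(levels_db)
    sigma = statistics.stdev(levels_db)
    return mu + k * sigma

# 13 ground-test levels at one frequency (illustrative dB values only)
tests = [128.1, 130.4, 127.2, 131.0, 129.5, 128.8, 130.1,
         127.9, 129.0, 130.7, 128.4, 129.8, 131.3]
print(round(tolerance_bound_db(tests), 1))
```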

  14. The large karstic holes at the top of the Syrian coastal Mountain Range. Importance of structural setting for the karstogenesis.

    NASA Astrophysics Data System (ADS)

    Mocochain, Ludovic; Blanpied, Christian; Bigot, Jean-Yves; Peyronel, Olivier; Gorini, Christian; Abdalla, Abdelkarim Al; Azki, Fawaz

    2015-04-01

    Along the Eastern Mediterranean Sea, the Syrian Coastal Mountain Range extends from north to south over 150 km. This range is a monocline structure bounded by a major escarpment that dominates the Al-Ghab Graben to the east. The Coastal Mountain Range is mainly formed of Mesozoic limestone that shows a major unconformity between the Upper Jurassic and Aptian deposits, and important erosion surfaces in the Upper Cretaceous deposits. Locally, the Jurassic-Cretaceous unconformity is marked by a layer of continental basalts with fossil wood that reveals a long emersion of the platform. The most recent carbonate deposits at the top of the Coastal Mountain Range are Turonian in age. In the central part of the Coastal Mountain Range, within a small area, the Cretaceous carbonates are affected by large karstic dolines. These dolines are curiously located at the top of the mountain range, a position that is not favorable for the development of large karstic holes.

  15. Sorting a large set of heavily used LiF:Mg,Ti thermoluminescent detectors into repeatable subsets of similar response.

    PubMed

    Kearfott, Kimberlee J; Newton, Jill P; Rafique, Muhammad

    2014-10-30

    A set of 920 heavily used LiF:Mg,Ti thermoluminescent dosimeters (TLDs) was placed into a polymethyl methacrylate (PMMA) plate attached to a 40 × 40 × 15 cm PMMA phantom and irradiated to 4.52 mGy using a 137Cs source. This was repeated three times to determine the mean and standard deviation of each TLD's sensitivity. Reader drift was tracked over time with 10 control dosimeters. Two test sets of 100 TLDs were divided into subsets with sensitivities within ±1% of their subset means. All dosimeters were re-irradiated four times to test the TLDs' response repeatability and determine the sensitivity uniformity within the subsets. Coefficients of variation revealed that, within a given subset, the dosimeters responded within ±2.5% of their subset mean in all calibrations. The coefficient of variation in any of the 200 TLDs' calibrations was below 6% across the four calibrations. The work validates the approach of performing three calibrations to separate heavily used and aged TLDs with overall sensitivity variations of ±25% into subsets that reproducibly respond within ±2.5%. PMID:25464196
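The ±1% subset grouping can be sketched as a simple greedy pass over measured relative sensitivities. This is an illustrative stand-in; the paper's actual binning procedure may differ.

```python
import statistics

def sort_into_subsets(sensitivities, tol=0.01):
    """Greedily group TLD relative sensitivities into subsets whose members
    all lie within +/- tol of the running subset mean (a sketch of the
    +/-1% grouping; IDs map to measured sensitivities)."""
    subsets = []
    for tld_id, s in sorted(sensitivities.items(), key=lambda kv: kv[1]):
        placed = False
        for sub in subsets:
            mean = statistics.mean(v for _, v in sub)
            if abs(s - mean) <= tol * mean:
                sub.append((tld_id, s))
                placed = True
                break
        if not placed:
            subsets.append([(tld_id, s)])
    return subsets
```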

  16. Dissecting the genetic make-up of North-East Sardinia using a large set of haploid and autosomal markers

    PubMed Central

    Pardo, Luba M; Piras, Giovanna; Asproni, Rosanna; van der Gaag, Kristiaan J; Gabbas, Attilio; Ruiz-Linares, Andres; de Knijff, Peter; Monne, Maria; Rizzu, Patrizia; Heutink, Peter

    2012-01-01

    Sardinia has been used for genetic studies because of its historical isolation, genetic homogeneity and increased prevalence of certain rare diseases. Controversy remains concerning the genetic substructure and the extent of genetic homogeneity, which has implications for the design of genome-wide association studies (GWAS). We revisited this issue by examining the genetic make-up of a sample from North-East Sardinia using a dense set of autosomal, Y chromosome and mitochondrial markers to assess the potential of the sample for GWAS and fine mapping studies. We genotyped individuals for 500K single-nucleotide polymorphisms, Y chromosome markers and sequenced the mitochondrial hypervariable (HVI–HVII) regions. We identified major haplogroups and compared these with other populations. We estimated linkage disequilibrium (LD) and haplotype diversity across autosomal markers, and compared these with other populations. Our results show that within Sardinia there is no major population substructure and thus it can be considered a genetically homogenous population. We did not find substantial differences in the extent of LD in Sardinians compared with other populations. However, we showed that at least 9% of genomic regions in Sardinians differed in LD structure, which is helpful for identifying functional variants using fine mapping. We concluded that Sardinia is a powerful setting for genetic studies including GWAS and other mapping approaches. PMID:22378280
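Linkage disequilibrium between a pair of biallelic SNPs, of the kind compared across populations above, is conventionally summarized by r². A minimal sketch from phased haplotype counts (a textbook formula, not the authors' pipeline):

```python
def r_squared(hap_counts):
    """LD r^2 for two biallelic SNPs from phased haplotype counts.
    hap_counts maps two-letter haplotypes, e.g. {'AB': n, 'Ab': n, ...},
    where upper/lower case denote the two alleles at each locus."""
    n = sum(hap_counts.values())
    pAB = hap_counts.get("AB", 0) / n
    pA = (hap_counts.get("AB", 0) + hap_counts.get("Ab", 0)) / n
    pB = (hap_counts.get("AB", 0) + hap_counts.get("aB", 0)) / n
    d = pAB - pA * pB                      # disequilibrium coefficient D
    return d * d / (pA * (1 - pA) * pB * (1 - pB))
```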

  17. Accompanying Readings & Tools for Enhancing Classroom Approaches for Addressing Barriers to Learning: Classroom-Focused Enabling.

    ERIC Educational Resources Information Center

    California Univ., Los Angeles. Center for Mental Health in Schools.

    This publication presents a set of readings and tools that accompany the education modules "Enhancing Classroom Approaches to Addressing Barriers to Learning: Classroom-Focused Enabling." Together, they delineate a preservice/inservice teacher preparation curriculum covering how regular classrooms and schools should be designed to ensure all…

  18. Classroom interactions and science inquiry: A comparative study examining differential implementation of a science program in two middle school classrooms

    NASA Astrophysics Data System (ADS)

    Goldberg, Jennifer Sarah

    This dissertation explores two classroom communities during the implementation of a new environmental science curriculum. The classrooms are similar in that both are located in the same middle school and led by experienced classroom teachers. Despite these similarities, differences among learning outcomes are found in analyses of student pre- and post-science tests in the two rooms. Through videotape analysis of classroom interaction within parallel curricular activities, learning opportunities are contrasted in terms of the social and cognitive organization of science activities and the roles played by teachers, students, and scientists as manifested in their discourse. In one classroom, tasks flow between whole class discussions and small group work. Curricular activities are interwoven, with transitions eased as goals are shared with students. Scientific concepts are connected through various activities and related to ideas outside of the classroom. Furthermore, the classroom community is united, established largely through the teacher's discourse patterns, such as deictics (specifically, inclusive personal pronouns). Moreover, the teacher emphasizes that she is learning alongside the students. In the other classroom, the focus of the science period is typically centered on whole class instruction or small group work, depending on the particular lesson. This organization, accompanied by heavy use of directives, leads to an implicit goal of completing the assigned task. Curricular activities are isolated, with an emphasis on following protocol instructions. Through discursive patterns, such as endearing address terms and exclusive pronouns, a dichotomy is created between teacher and students. As the designated expert, this teacher imparts her knowledge of science to the students. Several implications emerge from this study. Although pre-packaged curricular lessons appear identical on paper, the enacted curriculum differs, even in similar settings. Without doubt, science curricula can be useful in providing suggested guidelines and much needed materials for the classroom, but such curricula do not necessarily translate into student inquiry. As researchers and educators, we need to look beyond the curricula into the classrooms themselves. Indeed, this research has convinced me that a better understanding of classroom communities can be gleaned through the study of lesson organization and of classroom roles.

  19. Outdoor Classrooms

    ERIC Educational Resources Information Center

    Mayes, Valynda

    2010-01-01

    An outdoor classroom is the ideal vehicle for community involvement: Parents, native plant societies, 4-H, garden clubs, and master naturalists are all resources waiting to be tapped, as are local businesses offering support. If you enlist your community in the development and maintenance of your outdoor classroom, the entire community will…

  20. Classroom Connect.

    ERIC Educational Resources Information Center

    Richardson, Sandra

    1997-01-01

    Describes the World Wide Web site called Classroom Connect. Notes that it gives easy access to Global Resources and Directory of Educational Sites (GRADES), which lists only "high quality" sites. Briefly discusses 17 sites listed by GRADES, and seven sections of the Classroom Connect site. (RS)

  1. Classroom Screening.

    ERIC Educational Resources Information Center

    Alpha Plus Corp., Piedmont, CA.

    This classroom screening device was developed by the Circle Preschool First Chance Project, a government-funded program to integrate handicapped children into regular classroom activities, for use in preschools, nursery schools, Head Start centers and other agencies working with young children. It is designed to give a gross measure of a child's…

  2. Classroom Management.

    ERIC Educational Resources Information Center

    Dinsmore, Terri Sue

    This paper is a report of a middle-school teacher's study of classroom management. The teacher/researcher was interested in how some of the techniques in the Kovalik Integrated Thematic Instruction model of training would influence the teacher/researcher's classroom management; the effects of direct instruction within a community circle; the…

  3. Classroom Activities.

    ERIC Educational Resources Information Center

    Stuart, Frances R.

    This pamphlet suggests activities that may be used in the elementary school classroom. Chapter I lists various short plays that children can easily perform which encourage their imagination. Chapter II details a few quiet classroom games such as "I Saw," "Corral the Wild Horse," "Who Has Gone from the Room," and "Six-Man-Football Checkers." A number…

  4. Classroom Organization

    ERIC Educational Resources Information Center

    Technology & Learning, 2005

    2005-01-01

    Good organization skills are key to running an efficient classroom, and having the right tools makes it easier to manage all of the tasks, save time, and be more productive. Having the power of information when and where anyone needs it makes a difference in how well any teacher runs the classroom and knows his or her students. A Palm handheld…

  5. Deep sequencing of large library selections allows computational discovery of diverse sets of zinc fingers that bind common targets.

    PubMed

    Persikov, Anton V; Rowland, Elizabeth F; Oakes, Benjamin L; Singh, Mona; Noyes, Marcus B

    2014-02-01

    The Cys2His2 zinc finger (ZF) is the most frequently found sequence-specific DNA-binding domain in eukaryotic proteins. The ZF's modular protein-DNA interface has also served as a platform for genome engineering applications. Despite decades of intense study, a predictive understanding of the DNA-binding specificities of either natural or engineered ZF domains remains elusive. To help fill this gap, we developed an integrated experimental-computational approach to enrich and recover distinct groups of ZFs that bind common targets. To showcase the power of our approach, we built several large ZF libraries and demonstrated their excellent diversity. As proof of principle, we used one of these ZF libraries to select and recover thousands of ZFs that bind several 3-nt targets of interest. We were then able to computationally cluster these recovered ZFs to reveal several distinct classes of proteins, all recovered from a single selection, to bind the same target. Finally, for each target studied, we confirmed that one or more representative ZFs yield the desired specificity. In sum, the described approach enables comprehensive large-scale selection and characterization of ZF specificities and should be a great aid in furthering our understanding of the ZF domain. PMID:24214968
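The clustering step — grouping recovered ZFs that bind a common target into distinct classes — can be illustrated with a toy greedy clustering of equal-length recognition-helix sequences by Hamming distance. The paper's actual computational clustering is more sophisticated, and the sequences below are invented.

```python
def hamming(a, b):
    """Number of mismatched positions between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def cluster_helices(seqs, max_dist=2):
    """Greedy single-pass clustering: a sequence joins the first cluster
    whose founding member is within max_dist mismatches, otherwise it
    founds a new cluster (illustrative stand-in for the paper's method)."""
    clusters = []
    for s in seqs:
        for c in clusters:
            if hamming(s, c[0]) <= max_dist:
                c.append(s)
                break
        else:
            clusters.append([s])
    return clusters
```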

  7. The Effects of Positive Verbal Reinforcement on the Time Spent outside the Classroom for Students with Emotional and Behavioral Disorders in a Residential Setting

    ERIC Educational Resources Information Center

    Kennedy, Christina; Jolivette, Kristine

    2008-01-01

    To more effectively instruct the entire class, teachers of students with emotional behavioral disorders (EBD) often choose to send students who display inappropriate behavior out of the room. A multiple baseline across settings was used to evaluate the effects of increasing teacher positive verbal reinforcement on the amount of time 2 students…

  9. Moving toward an Empowering Setting in a First Grade Classroom Serving Primarily Working Class and Working Poor Latina/o Children: An Exploratory Analysis

    ERIC Educational Resources Information Center

    Silva, Janelle M.; Langhout, Regina Day

    2016-01-01

    Empowering settings are important places for people to develop leadership skills in order to enact social change. Yet, due to socio-cultural constructions of childhood in the US, especially constructions around working class and working poor children of Color, they are often not seen as capable or competent change agents, or in need of being in…

  11. Efficient computation of k-Nearest Neighbour Graphs for large high-dimensional data sets on GPU clusters.

    PubMed

    Dashti, Ali; Komarov, Ivan; D'Souza, Roshan M

    2013-01-01

    This paper presents an implementation of brute-force exact k-Nearest Neighbor Graph (k-NNG) construction for ultra-large high-dimensional data clouds. The proposed method uses Graphics Processing Units (GPUs) and is scalable with multiple levels of parallelism (between nodes of a cluster, between different GPUs on a single node, and within a GPU). The method is applicable to homogeneous computing clusters with a varying number of nodes and GPUs per node. We achieve a 6-fold speedup in data processing as compared with an optimized method running on a cluster of CPUs and bring a hitherto impossible k-NNG generation for a dataset of twenty million images with 15k dimensionality into the realm of practical possibility. PMID:24086314
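The computation the GPUs parallelize is the brute-force all-pairs distance evaluation followed by a per-point selection of the k nearest. A minimal CPU reference in pure Python (illustrative only; the paper's implementation is CUDA-based and multi-level parallel):

```python
import math

def knn_graph(points, k):
    """Brute-force exact k-NN graph: for each point, the indices of its k
    nearest neighbours by Euclidean distance, self excluded. O(n^2 d)
    work overall, which is exactly what the GPU version parallelizes."""
    graph = []
    for i, p in enumerate(points):
        dists = [(math.dist(p, q), j) for j, q in enumerate(points) if j != i]
        graph.append([j for _, j in sorted(dists)[:k]])
    return graph
```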

  12. The cosmogenic 21Ne production rate in quartz evaluated on a large set of existing 21Ne-10Be data

    NASA Astrophysics Data System (ADS)

    Kober, F.; Alfimov, V.; Ivy-Ochs, S.; Kubik, P. W.; Wieler, R.

    2011-02-01

    Based on a compilation of published combined 10Be and 21Ne cosmogenic nuclide data sets from quartz samples obtained at ETH Zürich we assess the 21Ne/10Be (P21/P10) production rate ratio with the goal to determine the 21Ne production rate (P21) in quartz. A variety of sliding “erosion islands” in a 21Ne/10Be versus 10Be diagram were evaluated to find the one that fits the data best, which in turn yields the most probable P21 if the 10Be production rate is known. The approach minimizes the influence of samples with a complex exposure history. A best-fit value for P21/P10sp (sp - the 10Be fraction being produced by spallation, as opposed to production by muons) of 4.23 ± 0.17 is obtained for a 10Be half-life of 1.39 Ma. Adopting a P10sp value in quartz of 4.41 ± 0.52 at g-1 yr-1 this yields a P21 of 18.7 ± 2.3 at g-1 yr-1. It is possible that 2% of the 21Ne is produced by fast muons.
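The headline production rate follows from simple error propagation on the product of the best-fit ratio and the adopted 10Be spallation production rate, with relative uncertainties added in quadrature:

```python
import math

# Values quoted in the abstract
ratio, d_ratio = 4.23, 0.17    # best-fit P21/P10sp
p10, d_p10 = 4.41, 0.52        # P10sp in quartz, at g^-1 yr^-1

p21 = ratio * p10
# for a product, relative errors add in quadrature
d_p21 = p21 * math.hypot(d_ratio / ratio, d_p10 / p10)
print(f"P21 = {p21:.1f} +/- {d_p21:.1f} at g^-1 yr^-1")
```

Running this reproduces the abstract's P21 of 18.7 ± 2.3 at g⁻¹ yr⁻¹.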

  13. The Learning Environment in Clicker Classrooms: Student Processes of Learning and Involvement in Large University-Level Courses Using Student Response Systems

    ERIC Educational Resources Information Center

    Trees, April R.; Jackson, Michele H.

    2007-01-01

    To explore what social and educational infrastructure is needed to support classroom use of student response systems (Roschelle et al., 2004), this study investigated the ways in which student characteristics and course design choices were related to students' assessments of the contribution of clicker use to their learning and involvement in the…

  14. "Did Ronald McDonald also Tend to Scare You as a Child?": Working to Emplace Consumption, Commodities and Citizen-Students in a Large Classroom Setting

    ERIC Educational Resources Information Center

    Goodman, Michael K.

    2008-01-01

    So-called "radical" and "critical" pedagogy seems to be everywhere these days on the landscapes of geographical teaching praxis and theory. Part of the remit of radical/critical pedagogy involves a de-centring of the traditional "banking" method of pedagogical praxis. Yet, how do we challenge this "banking" model of knowledge transmission in both a…

  16. Approaching the complete basis set limit of CCSD(T) for large systems by the third-order incremental dual-basis set zero-buffer F12 method

    SciTech Connect

    Zhang, Jun; Dolg, Michael

    2014-01-28

    The third-order incremental dual-basis set zero-buffer approach was combined with CCSD(T)-F12x (x = a, b) theory to develop a new approach, i.e., the inc3-db-B0-CCSD(T)-F12 method, which can be applied as a black-box procedure to efficiently obtain the near-complete basis set (CBS) limit of CCSD(T) energies, even for large systems. We tested this method for several cases of different chemical nature: four complexes taken from the standard benchmark sets S66 and X40, the energy difference between isomers of the water hexamer, and the rotation barrier of biphenyl. The results show that our method has an error relative to the best estimate of the CBS energy of only 0.2 kcal/mol or less. By parallelization, our method can accomplish CCSD(T)-F12 calculations of about 60 correlated electrons and 800 basis functions in only several days, which are impossible with standard implementations on ordinary hardware. We conclude that the inc3-db-B0-CCSD(T)-F12a/AVTZ method, which is of CCSD(T)/AV5Z quality, is close to the limit of accuracy that one can achieve for large systems currently.
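The incremental scheme rests on a truncated many-body expansion of the correlation energy over orbital domains: one-body energies, plus two-body corrections, plus three-body corrections at third order. A toy sketch of the bookkeeping (the real method evaluates CCSD(T)-F12 energies of domain subsets, not an arbitrary Python callable):

```python
from itertools import combinations

def incremental_energy(domains, energy, order=3):
    """Toy many-body (incremental) expansion: approximate a total energy
    from energies of small domain subsets. energy(frozenset) returns the
    energy of a subset; truncating at `order` gives the nth-order estimate
    (order=3 mirrors the third-order incremental scheme)."""
    inc = {}        # increment of each subset
    total = 0.0
    for n in range(1, order + 1):
        for combo in combinations(domains, n):
            key = frozenset(combo)
            # increment = subset energy minus all lower-order increments
            lower = sum(inc[frozenset(sub)]
                        for m in range(1, n)
                        for sub in combinations(combo, m))
            inc[key] = energy(key) - lower
            total += inc[key]
    return total
```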

  17. Assessment of amyloid β-protein precursor gene mutations in a large set of familial and sporadic Alzheimer disease cases

    PubMed Central

    Tanzi, Rudolph E.; Vaula, Giovanna; Romano, Donna M.; Mortilla, Marzia; Huang, Tricia L.; Tupler, Rossella G.; Wasco, Wilma; Hyman, Bradley T.; Haines, Jonathan L.; Jenkins, Barbara J.; Kalaitsidaki, Marianna; Warren, Andrew C.; McInnis, Melvin C.; Antonarakis, Stylianos E.; Karlinsky, Harry; Percy, Maire E.; Connor, Linda; Growdon, John; Crapper-McLachlan, Donald R.; Gusella, James F.; St George-Hyslop, Peter H.

    1992-01-01

    A genetic locus associated with familial Alzheimer disease (FAD) and a candidate gene, APP, encoding the amyloid protein precursor have both been assigned previously to chromosome 21, and, in a few FAD families, mutations of APP have been detected. However, obligate crossovers between APP and FAD have also been reported in several FAD pedigrees, including FAD4, a large kindred showing highly suggestive evidence for linkage of the disorder to chromosome 21. In case the apparent APP crossover in FAD4 actually represented an intragenic recombination event or segregation of different mutations in different family branches, we have performed a more detailed assessment of APP as a candidate gene in this family. The entire coding region of the APP gene was sequenced for FAD4 and for FAD1, a second large kindred. No mutations were found, indicating that, in at least one chromosome 21–linked FAD pedigree, the gene defect is not accounted for by a mutation in the known coding region of the APP gene. A total of 25 well-characterized early- and late-onset FAD pedigrees were typed for genetic linkage to APP, to assess the percentage of FAD families predicted to carry mutations in the APP gene. None of the FAD families yielded positive lod scores at a recombination fraction of 0.0. To estimate the overall prevalence of FAD-associated mutations in the βA4 domain of APP, we sequenced exons 16 and 17 in 30 (20 early- and 10 late-onset) FAD kindreds and in 11 sporadic AD cases, and we screened 56 FAD kindreds and 81 cases of sporadic AD for the presence of the originally reported FAD-associated mutation, APP717 Val→Ile (by BclI digestion). No APP gene mutations were found in any of the FAD families or sporadic-AD samples examined in this study, suggesting that the mutations in exons 16 and 17 are a rare cause of FAD. Overall, these data suggest that APP gene mutations account for a very small portion of FAD. PMID:1642228

  18. Organizational development trajectory of a large academic radiotherapy department set up similarly to a prospective clinical trial: the MAASTRO experience

    PubMed Central

    Boersma, L; Dekker, A; Hermanns, E; Houben, R; Govers, M; van Merode, F; Lambin, P

    2015-01-01

    Objective: To simultaneously improve patient care processes and clinical research activities by starting a hypothesis-driven reorganization trajectory mimicking the rigorous methodology of a prospective clinical trial. Methods: The design of this reorganization trajectory was based on the model of a prospective trial. It consisted of (1) listing problems and analysing their potential causes, (2) defining interventions, (3) defining end points and (4) measuring the effect of the interventions (i.e. at baseline and after 1 and 2 years). The primary end point for patient care was the number of organizational root causes of incidents/near incidents; for clinical research, it was the number of patients in trials. There were several secondary end points. We analysed the data using two sample z-tests, χ2 test, a Mann–Whitney U test and the one-way analysis of variance with Bonferroni correction. Results: The number of organizational root causes was reduced by 27% (p < 0.001). There was no effect on the percentage of patients included in trials. Conclusion: The reorganizational trajectory was successful for the primary end point of patient care and had no effect on clinical research. Some confounding events hampered our ability to draw strong conclusions. Nevertheless, the transparency of this approach can give medical professionals more confidence in moving forward with other organizational changes in the same way. Advances in knowledge: This article is novel because managerial interventions were set up similarly to a prospective clinical trial. This study is the first of its kind in radiotherapy, and this approach can contribute to discussions about the effectiveness of managerial interventions. PMID:25679320
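The primary-end-point comparison (a reduction in root-cause counts between measurement rounds) can be tested with a two-proportion z-test of the kind the authors list among their methods. A minimal sketch with invented counts, not the study's data:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sample z statistic for a difference in proportions, e.g. the
    rate of incidents with an organizational root cause at baseline
    (x1 of n1) versus after the interventions (x2 of n2)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                        # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se
```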

  19. Flexible Classroom Furniture

    ERIC Educational Resources Information Center

    Kim Hassell

    2011-01-01

    Classroom design for the 21st-century learning environment should accommodate a variety of learning skills and needs. The space should be large enough so it can be configured to accommodate a number of learning activities. This also includes furniture that provides flexibility and accommodates collaboration and interactive work among students and…

  1. Tips from the Classroom.

    ERIC Educational Resources Information Center

    TESOL Journal, 1993

    1993-01-01

    Four short articles are combined: "Adding Discourse-Level Practice to Sentence-Level Exercises" (Eric S. Nelson); "Presenting Picture Books in the ESL Classroom" (Lijun Shen); "Role Playing in a Large Class" (Ellen Rosen); and "Calvin and Hobbes and Other Icons of Americana" (Daniel J. Conrad). (Contains seven references.) (LB)

  2. LINC-NIRVANA for the large binocular telescope: setting up the world's largest near infrared binoculars for astronomy

    NASA Astrophysics Data System (ADS)

    Hofferbert, Ralph; Baumeister, Harald; Bertram, Thomas; Berwein, Jürgen; Bizenberger, Peter; Böhm, Armin; Böhm, Michael; Borelli, José Luis; Brangier, Matthieu; Briegel, Florian; Conrad, Albert; De Bonis, Fulvio; Follert, Roman; Herbst, Tom; Huber, Armin; Kittmann, Frank; Kürster, Martin; Laun, Werner; Mall, Ulrich; Meschke, Daniel; Mohr, Lars; Naranjo, Vianak; Pavlov, Aleksei; Pott, Jörg-Uwe; Rix, Hans-Walter; Rohloff, Ralf-Rainer; Schinnerer, Eva; Storz, Clemens; Trowitzsch, Jan; Yan, Zhaojun; Zhang, Xianyu; Eckart, Andreas; Horrobin, Matthew; Rost, Steffen; Straubmeier, Christian; Wank, Imke; Zuther, Jens; Beckmann, Udo; Connot, Claus; Heininger, Matthias; Hofmann, Karl-Heinz; Kröner, Tim; Nussbaum, Eddy; Schertl, Dieter; Weigelt, Gerd; Bergomi, Maria; Brunelli, Alessandro; Dima, Marco; Farinato, Jacopo; Magrin, Demetrio; Marafatto, Luca; Ragazzoni, Roberto; Viotto, Valentina; Arcidiacono, Carmelo; Bregoli, Giovanni; Ciliegi, Paolo; Cosentino, Guiseppe; Diolaiti, Emiliano; Foppiani, Italo; Lombini, Matteo; Schreiber, Laura; D'Alessio, Francesco; Li Causi, Gianluca; Lorenzetti, Dario; Vitali, Fabrizio; Bertero, Mario; Boccacci, Patrizia; La Camera, Andrea

    2013-08-01

    LINC-NIRVANA (LN) is the near-infrared, Fizeau-type imaging interferometer for the large binocular telescope (LBT) on Mt. Graham, Arizona (elevation of 3267 m). The instrument is currently being built by a consortium of German and Italian institutes under the leadership of the Max Planck Institute for Astronomy in Heidelberg, Germany. It will combine the radiation from both 8.4 m primary mirrors of LBT in such a way that the sensitivity of an 11.9 m telescope and the spatial resolution of a 22.8 m telescope will be obtained within a 10.5×10.5 arcsec scientific field of view. Interferometric fringes of the combined beams are tracked in an oval field with diameters of 1 and 1.5 arcmin. In addition, both incoming beams are individually corrected by LN's multiconjugate adaptive optics system to reduce atmospheric image distortion over a circular field of up to 6 arcmin in diameter. A comprehensive technical overview of the instrument is presented, comprising the detailed design of LN's four major systems for interferometric imaging and fringe tracking, both in the near infrared range of 1 to 2.4 μm, as well as atmospheric turbulence correction at two altitudes, both in the visible range of 0.6 to 0.9 μm. The resulting performance capabilities and a short outlook on some of the major science goals are presented. In addition, the roadmap for the related assembly, integration, and verification process is discussed. To avoid late interface-related risks, strategies for early hardware as well as software interactions with the telescope have been elaborated. The goal is to ship LN to the LBT in 2014.
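The quoted equivalent spatial resolution follows from the usual λ/B diffraction estimate (ignoring the 1.22 Airy factor); for K band on the 22.8 m interferometric baseline versus a single 8.4 m mirror:

```python
import math

RAD_TO_ARCSEC = 180 / math.pi * 3600  # radians to arcseconds

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    """Rayleigh-type angular resolution ~ wavelength / aperture (or baseline)."""
    return wavelength_m / aperture_m * RAD_TO_ARCSEC

print(diffraction_limit_arcsec(2.2e-6, 22.8))  # full interferometric baseline
print(diffraction_limit_arcsec(2.2e-6, 8.4))   # single LBT primary mirror
```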

  3. Strategy Training in a Task-Based Language Classroom

    ERIC Educational Resources Information Center

    Lai, Chun; Lin, Xiaolin

    2015-01-01

    Recent literature that examines the implementation of task-based language teaching (TBLT) in classroom settings has reported various challenges related to educational cultures, classroom management, teacher cognition and learner perceptions. To facilitate the smooth transition of TBLT from laboratory settings to classroom contexts, measures need…

  4. Full-color high-resolution 21-inch common large-area display set to replace 19-inch CRTs in several USAF C4I applications

    NASA Astrophysics Data System (ADS)

    Orkis, Randall E.; Gorenflo, Ronald L.; Hermann, David J.

    1995-06-01

    Battelle is under contract with Warner Robins Air Logistics Center to design a Common Large Area Display Set (CLADS) for use in multiple airborne C4I applications that currently use unique 19" CRTs. Battelle engineers have determined that by taking advantage of the latest flat panel display technology and the commonality between C4I applications, one display set (21" diagonal, 1280 × 1024) can be designed for use in multiple applications. In addition, common modular driver and processing electronics are being designed by Battelle to reduce the number of installation-specific circuit card assemblies required for a particular application. Three initial applications include the E-3 (AWACS) color monitor assembly, E-8 (JSTARS) graphics display unit, and ABCCC airborne color display. For these three applications, reliability and maintainability are key drivers. The common design approach reduces the number of unique subassemblies in the USAF inventory by approximately 56 to 66 percent. The new design is also expected to have an MTBF of at least 3350 hours, an order of magnitude better than one of the current systems. In the JSTARS installation, more than 1400 lbs can be eliminated from the aircraft. In the E-3 installation, the CLADS is estimated to provide a power reduction of approximately 1750 watts per aircraft. This paper will discuss the common large area display set design and its use in a variety of C4I applications that require a large area, high resolution, full color display.

  5. Eruptive history and tectonic setting of Medicine Lake Volcano, a large rear-arc volcano in the southern Cascades

    USGS Publications Warehouse

    Donnelly-Nolan, J. M.; Grove, T.L.; Lanphere, M.A.; Champion, D.E.; Ramsey, D.W.

    2008-01-01

    Medicine Lake Volcano (MLV), located in the southern Cascades ~55 km east-northeast of contemporaneous Mount Shasta, has been found by exploratory geothermal drilling to have a surprisingly silicic core mantled by mafic lavas. This unexpected result is very different from the long-held view, derived from previous mapping of exposed geology, that MLV is a dominantly basaltic shield volcano. Detailed mapping shows that < 6% of the ~2000 km² of mapped MLV lavas on this southern Cascade Range shield-shaped edifice are rhyolitic and dacitic, but drill holes on the edifice penetrated more than 30% silicic lava. Argon dating yields ages in the range ~475 to 300 ka for early rhyolites. Dates on the stratigraphically lowest mafic lavas at MLV fall into this time frame as well, indicating that volcanism at MLV began about half a million years ago. Mafic compositions apparently did not dominate until ~300 ka. Rhyolite eruptions were scarce post-300 ka until late Holocene time. However, a dacite episode at ~200 to ~180 ka included the volcano's only ash-flow tuff, which was erupted from within the summit caldera. At ~100 ka, compositionally distinctive high-Na andesite and minor dacite built most of the present caldera rim. Eruption of these lavas was followed soon after by several large basalt flows, such that the combined area covered by eruptions between 100 ka and postglacial time amounts to nearly two-thirds of the volcano's area. Postglacial eruptive activity was strongly episodic and also covered a disproportionate amount of area. The volcano has erupted 9 times in the past 5200 years, one of the highest rates of late Holocene eruptive activity in the Cascades. Estimated volume of MLV is ~600 km³, giving an overall effusion rate of ~1.2 km³ per thousand years, although the rate for the past 100 kyr may be only half that. During much of the volcano's history, both dry HAOT (high-alumina olivine tholeiite) and hydrous calcalkaline basalts erupted together in close temporal and spatial proximity. Petrologic studies indicate that the HAOT magmas were derived by dry melting of spinel peridotite mantle near the crust–mantle boundary. Subduction-derived H2O-rich fluids played an important role in the generation of calcalkaline magmas. Petrology, geochemistry, and proximity indicate that MLV is part of the Cascades magmatic arc and not a Basin and Range volcano, although Basin and Range extension impinges on the volcano and strongly influences its eruptive style. MLV may be analogous to Mount Adams in southern Washington, but not, as sometimes proposed, to the older distributed back-arc Simcoe Mountains volcanic field.

  6. News Teaching: The epiSTEMe project: KS3 maths and science improvement Field trip: Pupils learn physics in a stately home Conference: ShowPhysics welcomes fun in Europe Student numbers: Physics numbers increase in UK Tournament: Physics tournament travels to Singapore Particle physics: Hadron Collider sets new record Astronomy: Take your classroom into space Forthcoming Events

    NASA Astrophysics Data System (ADS)

    2010-05-01

    Teaching: The epiSTEMe project: KS3 maths and science improvement Field trip: Pupils learn physics in a stately home Conference: ShowPhysics welcomes fun in Europe Student numbers: Physics numbers increase in UK Tournament: Physics tournament travels to Singapore Particle physics: Hadron Collider sets new record Astronomy: Take your classroom into space Forthcoming Events

  7. Antithrombotic Utilization Trends after Noncardioembolic Ischemic Stroke or TIA in the Setting of Large Antithrombotic Trials (2002–2009)

    PubMed Central

    Khan, Amir S.; Qureshi, Adnan I.

    2015-01-01

    Background and Purpose Several large trials published over the last decade have significantly altered recommended guidelines for therapy following a noncardioembolic ischemic stroke or transient ischemic attack (TIA). The impact of these studies on patient usage of alternative antithrombotic agents has hitherto not been evaluated. We examined the usage of these agents in the United States over the last decade, with regard to the publication of the Management of Atherothrombosis with Clopidogrel in High-Risk Patients (MATCH), European/Australasian Stroke Prevention in Reversible Ischaemia Trial (ESPRIT), and Prevention Regimen for Effectively Avoiding Second Strokes (PRoFESS) clinical trials, in order to test the hypothesis that resulting recommendations are reflected in usage trends. Methods Antithrombotic utilization was prospectively collected as part of the National Ambulatory Medical Care Survey (NAMCS) on a total of 53,608,351 patients in the United States between 2002 and 2009. Patients with a history of ischemic stroke or TIA were included. Patients were excluded if there was a prior history of subarachnoid or intracerebral hemorrhage, or if other indications for antithrombotic treatment were present, including deep venous thrombosis, pulmonary embolism, atrial fibrillation or flutter, mechanical cardiac valve replacement, congestive heart failure, coronary artery disease, peripheral arterial disease, and rheumatoid arthritis. Annual utilization of the following antithrombotic strategies was compared in 53,608,351 patients: 1) aspirin monotherapy, 2) clopidogrel monotherapy, 3) combined clopidogrel and aspirin, 4) combined extended-release dipyridamole (ERDP) and aspirin, and 5) warfarin. Annual utilization was compared before and after publication of MATCH, ESPRIT, and PRoFESS in 2004, 2006, and 2008, respectively. Trend analysis was performed with the Mantel–Haenszel test for trends. 
Sensitivity analysis of demographic and clinical characteristics stratified by antithrombotic-usage group was performed using the Wald Chi-square test. Results Utilization of combined clopidogrel and aspirin increased from 3.3% to 6.7% after the MATCH trial (p<0.0001). Following the results of the ESPRIT trial, utilization of combination ERDP and aspirin decreased from 4% to 3% (p<0.0001), utilization of clopidogrel declined from 6.8% to 6% (p<0.0001), and utilization of aspirin remained essentially unchanged. After the PRoFESS trial, utilization of clopidogrel increased from 5% to 9% (p<0.0001), utilization of ERDP-aspirin increased from 3% to 4.6% (p<0.0001), and utilization of aspirin increased from 15.6% to 17.8% (p<0.0001). The proportion of patients on none of the five antithrombotic secondary prevention strategies steadily declined from a peak of 74% in 2003 to 57% by 2009. Conclusions The impact of the MATCH, ESPRIT, and PRoFESS trials on antithrombotic utilization has been variable. These findings highlight the importance of addressing factors that affect the implementation of findings from major clinical trials. PMID:25825628

  8. Electronic Classroom.

    ERIC Educational Resources Information Center

    Price, Harry A.

    Describes the conversion of a limited-use, "white elephant" auditorium into an electronic classroom for use as a flexible instructional space with numerous possibilities for enriching learning through electromechanical aids. (FS)

  9. Rethinking the Christian Studies Classroom: Reflections on the Dynamics of Teaching Religion in Southern Public Universities

    ERIC Educational Resources Information Center

    Gravett, Sandie; Hulsether, Mark; Medine, Carolyn

    2011-01-01

    An extended set of conversations conducted by three religious studies faculty teaching at large public universities in the Southern United States spurred these reflections on how their institutional locations inflected issues such as the cultural expectations students bring to the classroom, how these expectations interact with the evolving…

  10. The Impact of Course Delivery Systems on Student Achievement and Sense of Community: A Comparison of Learning Community versus Stand-Alone Classroom Settings in an Open-Enrollment Inner City Public Community College

    ERIC Educational Resources Information Center

    Bandyopadhyay, Pamela

    2010-01-01

    This study examined the effects of two types of course delivery systems (learning community classroom environments versus stand-alone classroom environments) on the achievement of students who were simultaneously enrolled in remedial and college-level social science courses at an inner city open-enrollment public community college. This study was…

  11. The effect of between-set rest intervals on the oxygen uptake during and after resistance exercise sessions performed with large- and small-muscle mass.

    PubMed

    Farinatti, Paulo T V; Castinheiras Neto, Antonio G

    2011-11-01

    Between-set rest intervals (RIs) may influence accumulated fatigue, work volume, and therefore oxygen uptake (VO2) and energy expenditure (EE) during resistance training. The study investigated the effects of different RIs on VO2 and EE in resistance exercises performed with multiple sets and recruiting large and small-muscle mass. Ten healthy men performed 4 randomized protocols (5 sets of 10 repetitions with 15 repetition maximum workloads in either horizontal leg press [LP] or chest fly [CF] with an RI of either 1 or 3 minutes). The VO2 was measured at rest, within sets, and during 90-minute postexercise recovery (excess postexercise oxygen consumption [EPOC]). The EE was estimated from VO2net (total VO2 - rest VO2). The VO2 increased in all protocols, being higher within the exercises and during EPOC in the LP than in the CF regardless of the RI. The 1-minute RI induced higher accumulated VO2 during LP (p < 0.05) but not during CF. The EPOC lasted approximately 40 minutes after LP1, LP3, and CF1, being longer than after CF3 (20 minutes, p < 0.05). Total EE was mainly influenced by muscle mass (p < 0.001) (LP3 = 91.1 ± 13.5 kcal ∼ LP1 = 88.7 ± 18.4 kcal > CF1 = 50.3 ± 14.4 kcal ∼ CF3 = 54.1 ± 12.0 kcal). In conclusion, total VO2 was always higher in LP than in CF. Shortening RI enhanced the accumulated fatigue throughout sets only in LP and increased VO2 in the initial few minutes of EPOC, whereas it did not influence total VO2 and EE in both exercises. Therefore, (a) the role of RI in preventing early fatigue seems to be more important when large-muscle groups are recruited; (b) resistance exercises recruiting large-muscle mass induce higher EE because of a greater EPOC magnitude. PMID:21993043
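The EE calculation described above (VO2net = total VO2 - rest VO2, converted to kilocalories) can be sketched as follows; the 5 kcal per litre of O2 equivalent is a common textbook approximation and an assumption here, not necessarily the study's exact conversion factor:

```python
# Energy expenditure from net oxygen uptake.
# Assumption: ~5 kcal released per litre of O2 consumed (a standard
# approximation; the study's exact calorific equivalent may differ).

KCAL_PER_L_O2 = 5.0

def session_ee(total_vo2_l, rest_vo2_l):
    """EE (kcal) for a session, given total and resting VO2 in litres."""
    vo2_net = total_vo2_l - rest_vo2_l   # VO2net = total VO2 - rest VO2
    return vo2_net * KCAL_PER_L_O2

# e.g. a session totalling 25 L O2, of which 7 L is attributable to rest
print(session_ee(25.0, 7.0))  # 90.0
```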

  12. Tectonic stress inversion of large multi-phase fracture data sets: application of Win-Tensor to reveal the brittle tectonic history of the Lufilan Arc, DRC

    NASA Astrophysics Data System (ADS)

    Delvaux, Damien; Kipata, Louis; Sintubin, Manuel

    2013-04-01

    Large fault-slip data sets from multiphase orogenic regions present a particular challenge for paleostress reconstructions. The Lufilian Arc is an arcuate fold-and-thrust belt that formed during late Pan-African times as the result of combined N-S and E-W amalgamation of Gondwana in SE-DRCongo and N-Zambia. We studied more than 22 sites in the Lufilian Arc and its foreland and correlated the results with existing results from the Ubende belt of W-Tanzania. Most studied sites are characterized by multiphase brittle deformation in which the observed brittle structures are the result of progressive saturation of the host rock by neoformed fractures and the reactivation of earlier fractures. The sites correspond to large mining exploitations with multiple large and continuous outcrops, which allow data sets to be obtained that are large enough to be statistically significant and that often correspond to several successive brittle events. In this context, the reconstruction of tectonic stress requires an initial field-based separation of the data, completed by a dynamic separation of the original data set into subsets. In the largest sites, several parts of the deposits were measured independently and are considered sub-sites that are processed separately in an initial stage. The procedure used for interactive fault-slip data separation and stress inversion is illustrated by field examples (the Luiswishi and Manono mining sites). Applying this principle to all sites resulted in the reconstruction of the brittle tectonic history of the region, starting with two major phases of orogenic compression, followed by late-orogenic extension and extensional collapse. A regional tectonic inversion during the early Mesozoic, resulting from far-field stresses, marks the transition towards rift-related extension. More details in Kipata, Delvaux et al. (2013), Geologica Belgica 16/1-2: 001-017. Win-Tensor can be downloaded at: http://users.skynet.be/damien.delvaux/Tensor/tensor-index.html

  13. Tools to achieve the analysis of large data-set and handling intensity variations of sources with INTEGRAL/SPI : mapping of the sky and study of large-scale structures

    NASA Astrophysics Data System (ADS)

    Bouchet, Laurent

    Nowadays, the analysis and reduction of ever-larger data sets is becoming a crucial issue, especially when long periods of observation are combined. The INTEGRAL/SPI X/gamma-ray spectrometer (20 keV-8 MeV) is an instrument for which it is essential to process many exposures at the same time to increase the low signal-to-noise ratio of the weakest sources and/or of low-surface-brightness extended emission. Processing several years of data simultaneously (currently 10 years) requires computing not only the solution of a large system of equations (linear or non-linear), but also the associated uncertainties. In this context, traditional methods of data reduction are ineffective and sometimes not feasible at all. Thanks to newly developed tools, processing large data sets from SPI is now possible with both a reasonable turnaround time and low memory usage. We also propose techniques that help overcome difficulties related to the intensity variations of sources and background between consecutive exposures; they allow the construction of pseudo light-curves in a more rational way. We have developed a specific algorithm that involves the SPI transfer function. Based on these advanced tools, we have developed imaging algorithms. Finally, we show some applications to point-source studies and to imaging and morphological studies of the large-scale structures of the Galaxy (the 511 keV electron-positron annihilation line, the 26Al line, and the diffuse continuum).
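The core numerical task described above, solving a large linear system and propagating the associated uncertainties, can be sketched on a toy problem (the matrix sizes and noise model here are illustrative; real SPI processing involves far larger, sparser systems):

```python
import numpy as np

# Toy least-squares solve of A x = b with parameter uncertainties.
# A maps sky-model parameters to per-exposure counts; b is the
# (noisy) observed data. Sizes and noise level are made up.

rng = np.random.default_rng(0)
A = rng.normal(size=(100, 3))            # exposures x model parameters
x_true = np.array([2.0, -1.0, 0.5])
noise_sigma = 0.1
b = A @ x_true + rng.normal(scale=noise_sigma, size=100)

x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
# parameter covariance for known, uniform Gaussian noise
cov = noise_sigma**2 * np.linalg.inv(A.T @ A)
sigma = np.sqrt(np.diag(cov))
print(np.round(x_hat, 2), np.round(sigma, 3))
```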

  14. How Teachers Know Their Classrooms: A Crosscultural Study of Teachers' Understanding of Classroom Situations.

    ERIC Educational Resources Information Center

    Ben-Peretz, Miriam; Halkes, Rob

    Teachers' knowledge and understanding of classrooms is perceived in this study in terms of interpretation of non-verbal and situational cues. Two sets of videotapes of classroom episodes were prepared, one in Hebrew and one in Dutch. These tapes were viewed by Israeli and Dutch teachers in cross-cultural settings. Teachers responded in writing and…

  15. How Teachers Know Their Classrooms: A Cross-Cultural Study of Teachers' Understanding of Classroom Situations.

    ERIC Educational Resources Information Center

    Ben-Peretz, Miriam; Halkes, Rob

    1987-01-01

    Examines teachers' knowledge and understanding of classrooms in terms of interpretation of nonverbal and situational clues. Two sets of videotapes of classroom episodes, one in Hebrew and one in Dutch, were viewed by both Dutch and Israeli teachers in cross-cultural settings. Analysis of responses yielded insights into the two cultures.…

  16. A geometrical correction for the inter- and intra-molecular basis set superposition error in Hartree-Fock and density functional theory calculations for large systems.

    PubMed

    Kruse, Holger; Grimme, Stefan

    2012-04-21

    A semi-empirical counterpoise-type correction for basis set superposition error (BSSE) in molecular systems is presented. An atom pair-wise potential corrects for the inter- and intra-molecular BSSE in supermolecular Hartree-Fock (HF) or density functional theory (DFT) calculations. This scheme, denoted geometrical counterpoise (gCP), depends only on the molecular geometry, i.e., no input from the electronic wave-function is required, and it is hence applicable to molecules with tens of thousands of atoms. The four necessary parameters have been determined by a fit to standard Boys and Bernardi counterpoise corrections for Hobza's S66×8 set of non-covalently bound complexes (528 data points). The method's targets are small basis sets (e.g., minimal, split-valence, 6-31G*), but reliable results are also obtained for larger triple-ζ sets. The intermolecular BSSE is calculated by gCP within a typical error of 10%-30%, which proves sufficient in many practical applications. The approach is suggested as a quantitative correction in production work and can also be routinely applied to estimate the magnitude of the BSSE beforehand. The applicability for biomolecules as the primary target is tested for the crambin protein, where gCP removes intramolecular BSSE effectively and yields conformational energies comparable to def2-TZVP basis results. Good mutual agreement is also found with Jensen's ACP(4) scheme, estimating the intramolecular BSSE in the phenylalanine-glycine-phenylalanine tripeptide, for which a relaxed rotational energy profile is also presented. A variety of minimal and double-ζ basis sets combined with gCP and the dispersion corrections DFT-D3 and DFT-NL are successfully benchmarked on the S22 and S66 sets of non-covalent interactions. Outstanding performance with a mean absolute deviation (MAD) of 0.51 kcal/mol (0.38 kcal/mol after D3-refit) is obtained at the gCP-corrected HF-D3/(minimal basis) level for the S66 benchmark. The gCP-corrected B3LYP-D3/6-31G* model chemistry yields MAD=0.68 kcal/mol, which represents a huge improvement over plain B3LYP/6-31G* (MAD=2.3 kcal/mol). Application of gCP-corrected B97-D3 and HF-D3 to a set of large protein-ligand complexes proves the robustness of the method. Analytical gCP gradients make optimizations of large systems feasible with small basis sets, as demonstrated for the inter-ring distances of 9-helicene and most of the complexes in Hobza's S22 test set. The method is implemented in a freely available FORTRAN program obtainable from the author's website. PMID:22519309
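The geometry-only, atom pair-wise idea behind such a correction can be illustrated with a schematic sketch; the functional form, element energies, and decay parameter below are made-up placeholders, not the fitted gCP expression or parameters from the paper:

```python
import math

# Schematic geometry-only BSSE-style correction: each atom carries a
# "missing basis" energy that is switched on by nearby partner atoms
# via a decaying pair function. All numbers are illustrative.

E_MISS = {"H": 0.008, "C": 0.035, "N": 0.045, "O": 0.055}  # hartree (made up)
ALPHA = 0.7  # decay parameter (made up)

def pairwise_correction(atoms):
    """atoms: list of (element, (x, y, z)) with coordinates in Angstrom."""
    e = 0.0
    for i, (el_i, ri) in enumerate(atoms):
        for j, (el_j, rj) in enumerate(atoms):
            if i == j:
                continue
            r = math.dist(ri, rj)
            # closer partner atoms "lend" more basis functions
            e += E_MISS[el_i] * math.exp(-ALPHA * r * r)
    return e

water_dimer = [
    ("O", (0.00, 0.00, 0.0)), ("H", (0.96, 0.00, 0.0)), ("H", (-0.24, 0.93, 0.0)),
    ("O", (2.90, 0.00, 0.0)), ("H", (3.40, 0.80, 0.0)), ("H", (3.40, -0.80, 0.0)),
]
print(round(pairwise_correction(water_dimer), 4))
```

Note that, as in the scheme described above, no wave-function input appears anywhere: the correction is a pure function of element types and coordinates.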

  17. Methodologic implications of social inequalities for analyzing health disparities in large spatiotemporal data sets: An example using breast cancer incidence data (Northern and Southern California, 1988–2002)

    PubMed Central

    Chen, Jarvis T; Coull, Brent A.; Waterman, Pamela D.; Schwartz, Joel; Krieger, Nancy

    2008-01-01

    SUMMARY Efforts to monitor, investigate, and ultimately eliminate health disparities across racial/ethnic and socioeconomic groups can benefit greatly from spatiotemporal models that enable exploration of spatial and temporal variation in health. Hierarchical Bayes methods are well-established tools in the statistical literature for fitting such models, as they permit smoothing of unstable small-area rates. However, issues presented by ‘real-life’ surveillance data can be a barrier to routine use of these models by epidemiologists. These include (1) shifting of regional boundaries over time, (2) social inequalities in racial/ethnic residential segregation, which imply differential spatial structuring across different racial/ethnic groups, and (3) heavy computational burdens for large spatiotemporal data sets. Using data from a study of changing socioeconomic gradients in female breast cancer incidence in two population-based cancer registries covering the San Francisco Bay Area and Los Angeles County, CA (1988–2002), we illustrate a two-stage approach to modeling health disparities and census tract (CT) variation in incidence over time. In the first stage, we fit race- and year-specific spatial models using CT boundaries normalized to the U.S. Census 2000. In stage 2, temporal patterns in the race- and year-specific estimates of racial/ethnic and socioeconomic effects are explored using a variety of methods. Our approach provides a straightforward means of fitting spatiotemporal models in large data sets, while highlighting differences in spatial patterning across racial/ethnic population and across time. PMID:18551507

  18. Methodologic implications of social inequalities for analyzing health disparities in large spatiotemporal data sets: an example using breast cancer incidence data (Northern and Southern California, 1988--2002).

    PubMed

    Chen, Jarvis T; Coull, Brent A; Waterman, Pamela D; Schwartz, Joel; Krieger, Nancy

    2008-09-10

    Efforts to monitor, investigate, and ultimately eliminate health disparities across racial/ethnic and socioeconomic groups can benefit greatly from spatiotemporal models that enable exploration of spatial and temporal variation in health. Hierarchical Bayes methods are well-established tools in the statistical literature for fitting such models, as they permit smoothing of unstable small-area rates. However, issues presented by 'real-life' surveillance data can be a barrier to routine use of these models by epidemiologists. These include (1) shifting of regional boundaries over time, (2) social inequalities in racial/ethnic residential segregation, which imply differential spatial structuring across different racial/ethnic groups, and (3) heavy computational burdens for large spatiotemporal data sets. Using data from a study of changing socioeconomic gradients in female breast cancer incidence in two population-based cancer registries covering the San Francisco Bay Area and Los Angeles County, CA (1988--2002), we illustrate a two-stage approach to modeling health disparities and census tract (CT) variation in incidence over time. In the first stage, we fit race- and year-specific spatial models using CT boundaries normalized to the U.S. Census 2000. In stage 2, temporal patterns in the race- and year-specific estimates of racial/ethnic and socioeconomic effects are explored using a variety of methods. Our approach provides a straightforward means of fitting spatiotemporal models in large data sets, while highlighting differences in spatial patterning across racial/ethnic population and across time. PMID:18551507
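The small-area smoothing that motivates the hierarchical approach above can be illustrated with a toy population-weighted shrinkage; this is only a stand-in for the paper's hierarchical Bayes models, which additionally encode spatial structure and full posterior uncertainty:

```python
# Toy shrinkage of small-area incidence rates toward the overall rate.
# Sparse areas (unstable raw rates) are pulled strongly toward the
# global rate; well-populated areas keep nearly their raw rate.

def eb_smooth(cases, pops):
    """Return smoothed rates for each area."""
    overall = sum(cases) / sum(pops)   # global incidence rate
    m = sum(pops) / len(pops)          # prior "pseudo-population" (a choice)
    return [(c + overall * m) / (p + m) for c, p in zip(cases, pops)]

# three census tracts: a zero-count tract, a small tract, a large one
smoothed = eb_smooth(cases=[0, 2, 50], pops=[100, 500, 10000])
print([round(r, 5) for r in smoothed])
```

The zero-count tract receives a nonzero smoothed rate near the global mean instead of an unstable raw rate of exactly zero, which is the practical point of the smoothing.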

  19. River Modeling in Large and Ungauged Basins: Experience of Setting up the HEC RAS Model over the Ganges-Brahmaputra-Meghna Basins

    NASA Astrophysics Data System (ADS)

    Hossain, F.; Maswood, M.

    2014-12-01

    River modeling is the process of setting up a physically based hydrodynamic model that can simulate the water flow dynamics of a stream network against time-varying boundary conditions. Such river models are an important component of any flood forecasting system that forecasts river levels in flood-prone regions. However, many large river basins in the developing world, such as the Ganges, Brahmaputra, Meghna (GBM), Indus, Irrawaddy, Salween, Mekong, and Niger, are mostly ungauged. Such large basins lack the necessary in-situ measurements of river bed depth/slope, bathymetry (river cross sections), floodplain mapping, and boundary-condition flows for forcing a river model. For such basins, proxy approaches relying mostly on remote sensing data from space platforms are the only alternative. In this study, we share our experience of setting up the widely used 1-D HEC-RAS river model over the entire GBM basin and its stream network. Good-quality in-situ measurements of river hydraulics (cross section, slope, flow) were available only for the downstream and flood-prone region of the basin, which comprises only 7% of the basin area. For the remaining 93% of the basin area, we resorted to data from the following satellite sensors to build a workable river model: a) Shuttle Radar Topography Mission (SRTM) for deriving bed slope; b) LANDSAT/MODIS for updating the river network and flow directions generated from elevation data; c) radar altimetry data to build depth-versus-width relationships at river locations; d) satellite-precipitation-based hydrologic modeling of lateral flows into main-stem rivers. In addition, we referred to an extensive body of literature to estimate the prevailing baseline hydraulics of rivers in the ungauged region. We measured the success of our approach by systematically testing how well the basin-wide river model could simulate river level dynamics at two measured locations inside Bangladesh. Our experience of river modeling was replete with hurdles that we did not anticipate, which often required a change of plan. In this study we summarize these key hurdles and offer a step-by-step approach to setting up river models for large ungauged river basins. Such a guide can be useful for the community wishing to set up RAS-type models in basins such as the Niger, Mekong, Irrawaddy, and Indus.
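The first proxy step listed above, deriving river bed slope from SRTM-like elevation data, can be sketched with synthetic values (the function name and profile are illustrative, not from the study):

```python
# Estimate mean river bed slope from a longitudinal elevation profile
# sampled along the channel (e.g. from an SRTM-like DEM).
# All values below are synthetic.

def reach_slope(elevations_m, spacing_m):
    """Mean downstream slope (m/m) over the whole profile."""
    drop = elevations_m[0] - elevations_m[-1]      # total elevation drop
    length = spacing_m * (len(elevations_m) - 1)   # along-channel distance
    return drop / length

profile = [52.0, 51.2, 50.1, 49.5, 48.0]  # metres, sampled every 1 km
print(reach_slope(profile, 1000.0))  # prints 0.001
```

In practice one would fit a trend rather than difference the endpoints, since DEM elevations over water are noisy; this sketch only shows the bookkeeping.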

  20. Collaborative Classroom Management. Video to Accompany "A Biological Brain in a Cultural Classroom: Applying Biological Research to Classroom Management." [Videotape].

    ERIC Educational Resources Information Center

    2001

    This 43-minute VHS videotape is designed to be used in course and workshop settings with "A Biological Brain in a Cultural Classroom: Applying Biological Research to Classroom Management." The videotape's principal values are as an introduction to the issues explored in the book and as a catalyst for group discussions and activities related to…

  1. Appropriate and inappropriate uses of classroom amplification

    NASA Astrophysics Data System (ADS)

    Lubman, David; Sutherland, Louis C.

    2005-09-01

    Currently, classroom amplifiers are being aggressively advocated as substitutes for good acoustics in small mainstream classrooms. Amplifiers are routinely installed without regard to unoccupied classroom noise levels and reverberation times. Amplifiers are being specified by some school districts as a money-saving alternative to mandating compliance with the ANSI standard on classroom acoustics, S12.60-2002. Manufacturers of portable classrooms and noisy wall mounted HVAC systems have joined in supporting the use of classroom amplifiers, claiming that low (35 dBA) classroom noise levels specified by the ANSI standard are unaffordable and unnecessary given amplifiers. The authors believe that the routine use of classroom amplification is appropriate in very large lecture rooms, in special education classrooms for hearing impaired students, for voice-impaired occupants, and perhaps in certain other limited circumstances. The authors explain why they believe the routine use of amplifiers in small mainstream classrooms is an inappropriate substitute for the good classroom acoustics specified in the ANSI standard.

  2. mzDB: A File Format Using Multiple Indexing Strategies for the Efficient Analysis of Large LC-MS/MS and SWATH-MS Data Sets*

    PubMed Central

    Bouyssié, David; Dubois, Marc; Nasso, Sara; Gonzalez de Peredo, Anne; Burlet-Schiltz, Odile; Aebersold, Ruedi; Monsarrat, Bernard

    2015-01-01

    The analysis and management of MS data, especially those generated by data-independent MS acquisition, exemplified by SWATH-MS, pose significant challenges for proteomics bioinformatics. The large size and vast amount of information inherent to these data sets need to be properly structured to enable efficient and straightforward extraction of the signals used to identify specific target peptides. Standard XML-based formats are not well suited to large MS data files, for example, those generated by SWATH-MS, and compromise high-throughput data processing and storing. We developed mzDB, an efficient file format for large MS data sets. It relies on the SQLite software library and consists of a standardized and portable server-less single-file database. An optimized 3D indexing approach is adopted, where the LC-MS coordinates (retention time and m/z), along with the precursor m/z for SWATH-MS data, are used to query the database for data extraction. In comparison with XML formats, mzDB saves ∼25% of storage space and improves access times by factors from twofold up to 2000-fold, depending on the particular data access. Similarly, mzDB also shows slightly to significantly lower access times in comparison with other formats such as mz5. Both C++ and Java implementations, converting raw or XML formats to mzDB and providing access methods, will be released under a permissive license. mzDB can be easily accessed by the SQLite C library and its drivers for all major languages, and browsed with existing dedicated GUIs. The mzDB format described here can boost existing mass spectrometry data analysis pipelines, offering unprecedented performance in terms of efficiency, portability, compactness, and flexibility. PMID:25505153

  3. mzDB: a file format using multiple indexing strategies for the efficient analysis of large LC-MS/MS and SWATH-MS data sets.

    PubMed

    Bouyssié, David; Dubois, Marc; Nasso, Sara; Gonzalez de Peredo, Anne; Burlet-Schiltz, Odile; Aebersold, Ruedi; Monsarrat, Bernard

    2015-03-01

    The analysis and management of MS data, especially those generated by data-independent MS acquisition, exemplified by SWATH-MS, pose significant challenges for proteomics bioinformatics. The large size and vast amount of information inherent to these data sets need to be properly structured to enable efficient and straightforward extraction of the signals used to identify specific target peptides. Standard XML-based formats are not well suited to large MS data files, for example, those generated by SWATH-MS, and compromise high-throughput data processing and storing. We developed mzDB, an efficient file format for large MS data sets. It relies on the SQLite software library and consists of a standardized and portable server-less single-file database. An optimized 3D indexing approach is adopted, where the LC-MS coordinates (retention time and m/z), along with the precursor m/z for SWATH-MS data, are used to query the database for data extraction. In comparison with XML formats, mzDB saves ∼25% of storage space and improves access times by factors from twofold up to 2000-fold, depending on the particular data access. Similarly, mzDB also shows slightly to significantly lower access times in comparison with other formats such as mz5. Both C++ and Java implementations, converting raw or XML formats to mzDB and providing access methods, will be released under a permissive license. mzDB can be easily accessed by the SQLite C library and its drivers for all major languages, and browsed with existing dedicated GUIs. The mzDB format described here can boost existing mass spectrometry data analysis pipelines, offering unprecedented performance in terms of efficiency, portability, compactness, and flexibility. PMID:25505153
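The single-file, coordinate-indexed design can be illustrated with a minimal SQLite sketch; the schema, table, and column names here are assumptions for illustration, not the actual mzDB schema:

```python
import sqlite3

# Minimal server-less, single-file store in the spirit of mzDB:
# peaks indexed by LC-MS coordinates (retention time, m/z) so that a
# target window is extracted with one indexed range query.

con = sqlite3.connect(":memory:")  # a file path would give a single-file DB
con.execute("CREATE TABLE peak (rt REAL, mz REAL, intensity REAL)")
con.execute("CREATE INDEX idx_rt_mz ON peak (rt, mz)")
peaks = [(10.0, 500.2, 1e4), (10.1, 500.3, 2e4), (55.0, 980.1, 5e3)]
con.executemany("INSERT INTO peak VALUES (?, ?, ?)", peaks)

def extract_window(rt_lo, rt_hi, mz_lo, mz_hi):
    """Fetch all peaks inside a (retention time, m/z) bounding box."""
    cur = con.execute(
        "SELECT rt, mz, intensity FROM peak "
        "WHERE rt BETWEEN ? AND ? AND mz BETWEEN ? AND ?",
        (rt_lo, rt_hi, mz_lo, mz_hi))
    return cur.fetchall()

print(extract_window(9.5, 10.5, 500.0, 501.0))  # the two early peaks
```

Because the store is a plain SQLite database, the same file could indeed be opened from C, Java, or any language with a SQLite driver, which is the portability argument made above.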

  4. Consolidating the set of known human protein-protein interactions in preparation for large-scale mapping of the human interactome

    PubMed Central

    Ramani, Arun K; Bunescu, Razvan C; Mooney, Raymond J; Marcotte, Edward M

    2005-01-01

    Background Extensive protein interaction maps are being constructed for yeast, worm, and fly to ask how the proteins organize into pathways and systems, but no such genome-wide interaction map yet exists for the set of human proteins. To prepare for studies in humans, we wished to establish tests for the accuracy of future interaction assays and to consolidate the known interactions among human proteins. Results We established two tests of the accuracy of human protein interaction datasets and measured the relative accuracy of the available data. We then developed and applied natural language processing and literature-mining algorithms to recover from Medline abstracts 6,580 interactions among 3,737 human proteins. A three-part algorithm was used: first, human protein names were identified in Medline abstracts using a discriminator based on conditional random fields, then interactions were identified by the co-occurrence of protein names across the set of Medline abstracts, filtering the interactions with a Bayesian classifier to enrich for legitimate physical interactions. These mined interactions were combined with existing interaction data to obtain a network of 31,609 interactions among 7,748 human proteins, accurate to the same degree as the existing datasets. Conclusion These interactions and the accuracy benchmarks will aid interpretation of current functional genomics data and provide a basis for determining the quality of future large-scale human protein interaction assays. Projecting from the approximately 15 interactions per protein in the best-sampled interaction set to the estimated 25,000 human genes implies more than 375,000 interactions in the complete human protein interaction network. This set therefore represents no more than 10% of the complete network. PMID:15892868
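The co-occurrence step of the three-part algorithm can be sketched as follows; the real pipeline precedes this with CRF-based name tagging and follows it with a Bayesian classifier, and the protein names and abstracts here are invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Toy sketch of the co-occurrence step: count how often pairs of
# (already-recognized) protein names appear in the same Medline abstract.
# Names and abstracts are invented; tagging and Bayesian filtering omitted.
abstracts = [
    {"TP53", "MDM2"},
    {"TP53", "MDM2", "EGFR"},
    {"EGFR", "GRB2"},
]

pair_counts = Counter()
for names in abstracts:
    for pair in combinations(sorted(names), 2):
        pair_counts[pair] += 1

# Pairs co-occurring in more than one abstract become candidate interactions.
candidates = [p for p, n in pair_counts.items() if n > 1]
print(candidates)  # [('MDM2', 'TP53')]
```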

  5. Resistance to Disruption in a Classroom Setting

    ERIC Educational Resources Information Center

    Parry-Cruwys, Diana E.; Neal, Carrie M.; Ahearn, William H.; Wheeler, Emily E.; Premchander, Raseeka; Loeb, Melissa B.; Dube, William V.

    2011-01-01

Substantial experimental evidence indicates that behavior reinforced on a denser schedule is more resistant to disruption than is behavior reinforced on a thinner schedule. The present experiment studied resistance to disruption in a natural educational environment. Responding during familiar activities was reinforced on a multiple…

  6. RESISTANCE TO DISRUPTION IN A CLASSROOM SETTING

    PubMed Central

    Parry-Cruwys, Diana E; Neal, Carrie M; Ahearn, William H; Wheeler, Emily E; Premchander, Raseeka; Loeb, Melissa B; Dube, William V

    2011-01-01

    Substantial experimental evidence indicates that behavior reinforced on a denser schedule is more resistant to disruption than is behavior reinforced on a thinner schedule. The present experiment studied resistance to disruption in a natural educational environment. Responding during familiar activities was reinforced on a multiple variable-interval (VI) 7-s VI 30-s schedule for 6 participants with developmental disabilities. Resistance to disruption was measured by presenting a distracting item. Response rates in the disruption components were compared to within-session response rates in prior baseline components. Results were consistent with the predictions of behavioral momentum theory for 5 of 6 participants. PMID:21709794

  7. Enhancing Vocabulary Development in Multiple Classroom Contexts.

    ERIC Educational Resources Information Center

    Harmon, Janis M.; Staton, Denise G.

    1999-01-01

    Describes ways teachers can enhance students' vocabulary development through multiple contexts available in typical middle school classroom settings. Addresses questions about vocabulary learning and offers suggestions for enhancing vocabulary with narrative and expository texts that involve multiple classroom contexts. Considers the Vocab-o-gram…

  8. Concomitant Learnings: Hidden Influences in the Classroom.

    ERIC Educational Resources Information Center

    Wallace, Sharon A., Ed.

    This bulletin contains sets of learning experiences and emphasizes the multiple effects of classroom experiences. These effects are referred to as concomitant learning or the student's affective incidental learning. In an article preceding the area materials, the teacher is identified as the classroom's most important hidden persuader, and the…

  9. Expanding Knowledge: From the Classroom into Cyberspace

    ERIC Educational Resources Information Center

    Barbas, Maria Potes Santa-Clara

    2006-01-01

    This paper is part of a larger project in the area of research. The main purpose of this mediated discourse was to implement, observe and analyse experiences of teachers in a training project developed for two different settings in the classroom. The first was between international classrooms through cyberspace and the second was a cyberspace…

  10. Application of Transcultural Themes in International Classrooms

    ERIC Educational Resources Information Center

    Van Hook, Steven R.

    2007-01-01

    The effective use of transcultural themes and images may help promote positive resonance in international settings, such as found in the traditional and online classrooms of globalizing higher education. Findings of transculturally resonant themes and images may be applied to international classroom pedagogy through such means as multimedia…

  11. Should Supervisors Intervene during Classroom Visits?

    ERIC Educational Resources Information Center

    Marshall, Kim

    2015-01-01

    Real-time coaching has become the go-to supervisory model in some schools (especially charters), with supervisors routinely jumping in during teacher observations and sometimes taking over the class to model a more effective approach. The author sets out goals and guidelines for impromptu classroom visits that include visiting each classroom at…

  12. Classroom Notes

    ERIC Educational Resources Information Center

    International Journal of Mathematical Education in Science and Technology, 2007

    2007-01-01

In this issue's "Classroom Notes" section, the following papers are discussed: (1) "Constructing a line segment whose length is equal to the measure of a given angle" (W. Jacob and T. J. Osler); (2) "Generating functions for the powers of Fibonacci sequences" (D. Terrana and H. Chen); (3) "Evaluation of mean and variance integrals without…

  13. Smart Classroom

    ERIC Educational Resources Information Center

    Kelly, Rhea, Ed.

    2006-01-01

    What makes a classroom "smart"? Presentation technologies such as projectors, document cameras, and LCD panels clearly fit the bill, but when considering other technologies for teaching, learning, and developing content, the possibilities become limited only by the boundaries of an institution's innovation. This article presents 32 best practices…

  14. Supplementary Classroom.

    ERIC Educational Resources Information Center

    Douglas Fir Plywood Association, Tacoma, WA.

Three prototype portable classrooms were developed for both conventional and component construction. One of these economical units was built for $7.50 per square foot. Construction of each type is explained through use of photographs and text. Included in the presentation are--(1) cluster grouping suggestions, (2) interior and exterior…

  16. Classroom Tech

    ERIC Educational Resources Information Center

    Instructor, 2006

    2006-01-01

    This article features the latest classroom technologies namely the FLY Pentop, WriteToLearn, and a new iris scan identification system. The FLY Pentop is a computerized pen from Leapster that "magically" understands what kids write and draw on special FLY paper. WriteToLearn is an automatic grading software from Pearson Knowledge Technologies and…

  17. The Social Context of Urban Classrooms: Measuring Student Psychological Climate

    ERIC Educational Resources Information Center

    Frazier, Stacy L.; Mehta, Tara G.; Atkins, Marc S.; Glisson, Charles; Green, Philip D.; Gibbons, Robert D.; Kim, Jong Bae; Chapman, Jason E.; Schoenwald, Sonja K.; Cua, Grace; Ogle, Robert R.

    2015-01-01

    Classrooms are unique and complex work settings in which teachers and students both participate in and contribute to classroom processes. This article describes the measurement phase of a study that examined the social ecology of urban classrooms. Informed by the dimensions and items of an established measure of organizational climate, we designed…

  18. Practical Classroom Applications of Language Experience: Looking Back, Looking Forward.

    ERIC Educational Resources Information Center

    Nelson, Olga G., Ed.; Linek, Wayne M., Ed.

    The 38 essays in this book look back at language experience as an educational approach, provide practical classroom applications, and reconceptualize language experience as an overarching education process. Classroom teachers and reading specialists describe strategies in use in a variety of classroom settings and describe ways to integrate…

  19. Environmentally Enriched Classrooms and the Development of Disadvantaged Preschool Children.

    ERIC Educational Resources Information Center

    Busse, Thomas V.; And Others

    This study evaluates the effects of placement of additional equipment in preschool classrooms on the cognitive, perceptual, and social development of urban Negro four-year-old children. Two Get Set classrooms in each of six areas of Philadelphia were paired for teachers, subjects, physical facilities and equipment. One classroom in each pair was…

  2. Systemize Classroom Management to Enhance Teaching and Learning

    ERIC Educational Resources Information Center

    Delman, Douglas J.

    2011-01-01

    Good classroom management is one of the most important goals teachers strive to establish from the first day of class. The rules, procedures, activities, and behaviors set the classroom tone throughout the school year. By revising, updating, and systemizing classroom management activities, teachers can eliminate many problems created by students…

  3. Improving the Teacher's Awareness of Nonverbal Communication in the Classroom.

    ERIC Educational Resources Information Center

    Kachur, Donald; And Others

    The emphasis in this paper is on developing teacher awareness of how nonverbal communication fits into the classroom setting. Various positive and negative aspects of this phase of communication in the classroom are explored. A classroom teacher is observed closely by students every day, and her/his attitude, feelings, mood or state of mind,…

  4. A large proportion of asymptomatic Plasmodium infections with low and sub-microscopic parasite densities in the low transmission setting of Temotu Province, Solomon Islands: challenges for malaria diagnostics in an elimination setting

    PubMed Central

    2010-01-01

Background Many countries are scaling up malaria interventions towards elimination. This transition changes demands on malaria diagnostics from diagnosing ill patients to detecting parasites in all carriers, including asymptomatic infections and infections with low parasite densities. Detection methods suitable to local malaria epidemiology must be selected prior to transitioning a malaria control programme to elimination. A baseline malaria survey conducted in Temotu Province, Solomon Islands in late 2008, as the first step in a provincial malaria elimination programme, provided malaria epidemiology data and an opportunity to assess how well different diagnostic methods performed in this setting. Methods During the survey, 9,491 blood samples were collected and examined by microscopy for Plasmodium species and density, with a subset also examined by polymerase chain reaction (PCR) and rapid diagnostic tests (RDTs). The performances of these diagnostic methods were compared. Results A total of 256 samples were positive by microscopy, giving a point prevalence of 2.7%. The species distribution was 17.5% Plasmodium falciparum and 82.4% Plasmodium vivax. In this low transmission setting, only 17.8% of the P. falciparum and 2.9% of P. vivax infected subjects were febrile (≥38°C) at the time of the survey. A significant proportion of infections detected by microscopy, 40% and 65.6% for P. falciparum and P. vivax respectively, had parasite densities below 100/μL. There was an age correlation for the proportion of parasite densities below 100/μL for P. vivax infections, but not for P. falciparum infections. PCR detected substantially more infections than microscopy (point prevalence of 8.71%), indicating a large number of subjects had sub-microscopic parasitemia. The concordance between PCR and microscopy in detecting single species was greater for P. vivax (135/162) compared to P. falciparum (36/118). The malaria RDT detected the 12 microscopy- and PCR-positive P. falciparum infections, but failed to detect 12/13 microscopy- and PCR-positive P. vivax infections. Conclusion Asymptomatic malaria infections and infections with low and sub-microscopic parasite densities are highly prevalent in Temotu Province, where malaria transmission is low. This presents a challenge for elimination since a large proportion of the parasite reservoir will not be detected by standard active and passive case detection. Therefore effective mass screening and treatment campaigns will most likely need more sensitive assays, such as a field-deployable molecular-based assay. PMID:20822506

  5. Promoting Active Involvement in Classrooms

    ERIC Educational Resources Information Center

    Conderman, Greg; Bresnahan, Val; Hedin, Laura

    2012-01-01

This article presents a rationale for using active involvement techniques, describes large- and small-group methods based on their documented effectiveness and applicability to K-12 classrooms, and illustrates their use. These approaches include ways of engaging students in large groups (e.g., unison responses, response cards, dry-erase boards,…

  7. Treatment outcomes in AIDS-related diffuse large B-cell lymphoma in the setting of roll-out of combination antiretroviral therapy in South Africa

    PubMed Central

    de Witt, Pieter; Maartens, Deborah J; Uldrick, Thomas S; Sissolak, Gerhard

    2013-01-01

Background Long-term survival for patients with AIDS-related diffuse large B-cell lymphoma (DLBCL) is feasible in settings with available combination antiretroviral therapy (cART). However, given limited oncology resources, outcomes for AIDS-associated DLBCL in South Africa are unknown. Methods We performed a retrospective analysis of survival in patients with newly diagnosed AIDS-related DLBCL treated at a tertiary teaching hospital in Cape Town, South Africa, with CHOP or CHOP-like chemotherapy (January 2004 to December 2010). HIV- and lymphoma-related prognostic factors were evaluated. Results 36 patients were evaluated; median age 37.3 years, 52.8% men, and 61.1% black South Africans. Median CD4 count was 184 cells/μl (< 100 cells/μl in 27.8%), and 80% were high-risk according to the age-adjusted International Prognostic Index. Concurrent Mycobacterium tuberculosis infection was present in 25%. Two-year overall survival (OS) was 40.5% (median OS 10.5 months, 95% CI 6.5–31.8). ECOG performance status of 2 or more (25.4% versus 50.0%, p = 0.01) and poor response to cART (18.0% versus 53.9%, p = 0.03) predicted inferior 2-year OS. No difference in 2-year OS was demonstrated in patients co-infected with Mycobacterium tuberculosis (p = 0.87). Conclusions Two-year OS for patients with AIDS-related DLBCL treated with CHOP-like regimens and cART is comparable to that seen in the US and Europe. Important factors affecting OS in AIDS-related DLBCL in South Africa include performance status at presentation and response to cART. Patients with co-morbid Mycobacterium tuberculosis or hepatitis B seropositivity appear to tolerate CHOP in our setting. Additional improvements in outcomes are likely possible. PMID:23797692
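Figures like the 2-year OS above come from standard survival analysis. A minimal Kaplan-Meier sketch shows how such an estimate is computed; the (time-in-months, death-observed) pairs below are invented, not the study's data.

```python
# Minimal Kaplan-Meier sketch of how a 2-year overall-survival figure is
# estimated; the (time-in-months, event) pairs below are invented.
def km_survival(data, t):
    """Survival probability at time t from (time, event) pairs (event=1: death)."""
    s = 1.0
    for time in sorted({d[0] for d in data if d[1] == 1 and d[0] <= t}):
        at_risk = sum(1 for d in data if d[0] >= time)  # still under follow-up
        deaths = sum(1 for d in data if d[0] == time and d[1] == 1)
        s *= 1 - deaths / at_risk
    return s

# Times in months; event=0 means censored (alive at last follow-up).
cohort = [(3, 1), (8, 1), (10, 0), (14, 1), (20, 0), (30, 0)]
print(round(km_survival(cohort, 24), 3))  # 0.444
```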

  8. Integrated QSPR models to predict the soil sorption coefficient for a large diverse set of compounds by using different modeling methods

    NASA Astrophysics Data System (ADS)

    Shao, Yonghua; Liu, Jining; Wang, Meixia; Shi, Lili; Yao, Xiaojun; Gramatica, Paola

    2014-05-01

The soil sorption coefficient (Koc) is a key physicochemical parameter for assessing the environmental risk of organic compounds. To predict the soil sorption coefficient in a more effective and economical way, quantitative structure-property relationship (QSPR) models were developed here based on a large, diverse dataset including 964 non-ionic organic compounds. Multiple linear regression (MLR), local lazy regression (LLR) and least-squares support vector machine (LS-SVM) were utilized to develop QSPR models based on the four most relevant theoretical molecular descriptors selected by the genetic algorithms-variable subset selection (GA-VSS) procedure. The QSPR development strictly followed the OECD principles for QSPR model validation; thus, great attention was paid to internal and external validation, applicability domain and mechanistic interpretation. The obtained results indicate that the LS-SVM model performed better than the MLR and LLR models. For the best LS-SVM model, the correlation coefficient (R2) for the training set was 0.913 and the concordance correlation coefficient (CCC) for the prediction set was 0.917. The root-mean-square errors (RMSE) were 0.330 and 0.426, respectively. The results of internal and external validation, together with applicability domain analysis, indicate that the QSPR models proposed in this work are predictive and could provide a useful tool for predicting the soil sorption coefficient of new compounds.
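The concordance correlation coefficient (CCC) used for external validation above is Lin's statistic, which penalizes both scatter and systematic offset between observed and predicted values. A minimal sketch on toy data (not the paper's):

```python
# Lin's concordance correlation coefficient (CCC), the external-validation
# statistic reported above; the data below are toy values, not the paper's.
def ccc(y_true, y_pred):
    n = len(y_true)
    mx, my = sum(y_true) / n, sum(y_pred) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(y_true, y_pred)) / n
    vx = sum((x - mx) ** 2 for x in y_true) / n
    vy = sum((y - my) ** 2 for y in y_pred) / n
    # Agreement drops with scatter (low cov) and with mean offset (mx - my).
    return 2 * cov / (vx + vy + (mx - my) ** 2)

obs = [1.0, 2.0, 3.0, 4.0]
print(ccc(obs, obs))                              # 1.0 for perfect agreement
print(round(ccc(obs, [1.1, 2.1, 3.1, 4.1]), 3))   # 0.996: a constant offset costs a little
```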

  9. Multilevel and Diverse Classrooms

    ERIC Educational Resources Information Center

    Baurain, Bradley, Ed.; Ha, Phan Le, Ed.

    2010-01-01

    The benefits and advantages of classroom practices incorporating unity-in-diversity and diversity-in-unity are what "Multilevel and Diverse Classrooms" is all about. Multilevel classrooms--also known as mixed-ability or heterogeneous classrooms--are a fact of life in ESOL programs around the world. These classrooms are often not only multilevel…

  11. Early Childhood/CDA Learning Modules: A Competency-Based Training Program for Classroom Personnel in Preschool Programs.

    ERIC Educational Resources Information Center

    Beaty, Janice J.

    This training manual for early childhood CDA candidates provides 21 learning modules on the following subjects: field trips, books, parent involvement, setting up the classroom, the role of play, creative expression (art), preschool science, cooking, self-image, large motor development, speaking and listening skills, daily program planning, small…

  12. Behavior Problems in Learning Activities and Social Interactions in Head Start Classrooms and Early Reading, Mathematics, and Approaches to Learning

    ERIC Educational Resources Information Center

    Bulotsky-Shearer, Rebecca J.; Fernandez, Veronica; Dominguez, Ximena; Rouse, Heather L.

    2011-01-01

    Relations between early problem behavior in preschool classrooms and a comprehensive set of school readiness outcomes were examined for a stratified random sample (N = 256) of 4-year-old children enrolled in a large, urban school district Head Start program. A series of multilevel models examined the unique contribution of early problem behavior…

  13. New Ways of Classroom Assessment. Revised

    ERIC Educational Resources Information Center

    Brown, J. D., Ed.

    2013-01-01

In this revised edition in the popular New Ways Series, teachers have once again been given an opportunity to show how they do assessment in their classrooms on an everyday basis. Often feeling helpless when confronted with large-scale standardized testing practices, teachers here offer classroom testing created with the direct aim of helping…

  14. Learning the Three C's: Classroom Communication Climate.

    ERIC Educational Resources Information Center

    Myers, Scott A.

    A study examined the communication climate of a graduate teaching assistant's (GTA) college classroom. Because the teaching role is often new to the GTA, establishing a communication climate may be a significant factor in classroom management. One section of a public speaking class taught by a new graduate teaching assistant at a large midwestern…

  15. Creating Learning Communities in the Classroom

    ERIC Educational Resources Information Center

    Saville, Bryan K.; Lawrence, Natalie Kerr; Jakobsen, Krisztina V.

    2012-01-01

    There are many ways to construct classroom-based learning communities. Nevertheless, the emphasis is always on cooperative learning. In this article, the authors focus on three teaching methods--interteaching, team-based learning, and cooperative learning in large, lecture-based courses--that they have used successfully to create classroom-based…

  17. Classroom Connectivity: Increasing Participation and Understanding Inside the Classroom

    ERIC Educational Resources Information Center

    Hegedus, Stephen

    2007-01-01

    This article shows how highly mobile computing, when used with new forms of network connectivity, can allow new forms of activities in the mathematics classroom. Examples are provided, such as the ability to share, harvest, and aggregate mathematical objects, and the ability for teachers and students to analyze the entire set of classroom…

  18. Bag-Tanks for Your Classroom.

    ERIC Educational Resources Information Center

    Wulfson, Stephen E.

    1981-01-01

    Suggests using plastic bags as aquaria and terraria. Describes techniques for converting plastic sheets into aquaria, how to set them up for classroom use, and other uses for plastic bag aquaria. (DS)

  19. Maximizing Classroom Participation.

    ERIC Educational Resources Information Center

    Englander, Karen

    2001-01-01

    Discusses how to maximize classroom participation in the English-as-a-Second-or-Foreign-Language classroom, and provides a classroom discussion method that is based on real-life problem solving. (Author/VWL)

  20. Getting Started in Classroom Computing.

    ERIC Educational Resources Information Center

    Ahl, David H.

    Written for secondary students, this booklet provides an introduction to several computer-related concepts through a set of six classroom games, most of which can be played with little more than a sheet of paper and a pencil. The games are: 1) SECRET CODES--introduction to binary coding, punched cards, and paper tape; 2) GUESS--efficient methods…

  1. Structural Analysis in the Classroom

    ERIC Educational Resources Information Center

    Gage, Nicholas A.; Lewis, Timothy J.

    2010-01-01

    The purpose of this article is to describe an applied method of assessing and manipulating environmental factors influencing student behavior. The assessment procedure is called structural analysis (SA) and can be a part of a functional behavioral assessment (FBA) process or a stand-alone set of procedures for teachers to use in their classrooms.…

  2. Classroom Meetings: A Program Model.

    ERIC Educational Resources Information Center

    Frey, Andy; Doyle, Hallie Davis

    2001-01-01

    Describes a model for classroom meetings in an elementary school setting. Focuses on enhancing children's communication and problem-solving skills for typical students and those identified through special education. The purpose of the meetings is to provide a nurturing climate for the learning of social skills that the children can use in the…

  3. Creating an A+++ Classroom Library.

    ERIC Educational Resources Information Center

    Lowe, Jeff

    1998-01-01

    Describes how teachers can set up high-quality classroom libraries that motivate students to read. Suggestions include assessing student needs; acquiring reading materials (through book clubs and donations from students and families); and being actively innovative so the library remains fresh and dynamic (e.g., by adding shelves, offering a wide…

  4. Classroom Culture Promotes Academic Resiliency

    ERIC Educational Resources Information Center

    DiTullio, Gina

    2014-01-01

    Resiliency is what propels many students to continue moving forward under difficult learning and life conditions. We intuitively think that such resilience is a character quality that cannot be taught. On the contrary, when a teacher sets the right conditions and culture for it in the classroom by teaching collaboration and communication skills,…

  5. The Classroom Animal: Crickets.

    ERIC Educational Resources Information Center

    Kramer, David C.

    1985-01-01

    Suggests using crickets for classroom activities, providing background information on their anatomy and reproduction and tips on keeping individual organisms or a breeding colony in the classroom. (JN)

  6. Flipped Classroom Modules for Large Enrollment General Chemistry Courses: A Low Barrier Approach to Increase Active Learning and Improve Student Grades

    ERIC Educational Resources Information Center

    Eichler, Jack F.; Peeples, Junelyn

    2016-01-01

    In the face of mounting evidence revealing active learning approaches result in improved student learning outcomes compared to traditional passive lecturing, there is a growing need to change the way instructors teach large introductory science courses. However, a large proportion of STEM faculty continues to use traditional instructor-centered…

  8. Comparison of Two Methods for Estimating the Sampling-Related Uncertainty of Satellite Rainfall Averages Based on a Large Radar Data Set

    NASA Technical Reports Server (NTRS)

    Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.

    2002-01-01

    The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
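The non-parametric side of the approach above can be sketched by comparing the full time-domain mean of a rain-rate series with means built from regular samples every k hours, over all k sampling phases. The intermittent rain-rate series below is invented, not the radar data set.

```python
import random

# Sketch: estimate sampling-related uncertainty by subsampling an (invented)
# intermittent hourly rain-rate series at regular k-hour intervals, as a
# satellite overpass would, and measuring the spread across sampling phases.
random.seed(0)
hours = 30 * 24  # a 30-day record at hourly resolution
rain = [random.expovariate(1.0) if random.random() < 0.1 else 0.0
        for _ in range(hours)]
true_mean = sum(rain) / hours

results = {}
for k in (1, 3, 6, 12):
    errs = []
    for phase in range(k):
        sub = rain[phase::k]  # one regular sampling schedule
        errs.append(abs(sum(sub) / len(sub) - true_mean))
    results[k] = (sum(e * e for e in errs) / k) ** 0.5  # RMS sampling error
    print(k, round(results[k], 4))  # sparser sampling -> typically larger error
```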

  9. Classroom Management and Teachers' Coping Strategies: Inside Classrooms in Australia, China and Israel

    ERIC Educational Resources Information Center

    Romi, Shlomo; Lewis, Ramon; Roache, Joel

    2013-01-01

    This paper discusses the degree to which recently reported relationships between the classroom management techniques and coping styles of Australian teachers apply in two other national settings: China and Israel. Little is known about which teacher characteristics relate to their approach to classroom management, although researchers in Australia…

  10. Global Internet Video Classroom: A Technology Supported Learner-Centered Classroom

    ERIC Educational Resources Information Center

    Lawrence, Oliver

    2010-01-01

    The Global Internet Video Classroom (GIVC) Project connected Chicago Civil Rights activists of the 1960s with Cape Town Anti-Apartheid activists of the 1960s in a classroom setting where learners from Cape Town and Chicago engaged activists in conversations about their motivation, principles, and strategies. The project was launched in order to…

  12. All Together Now: Measuring Staff Cohesion in Special Education Classrooms

    PubMed Central

    Kratz, Hilary E.; Locke, Jill; Piotrowski, Zinnia; Ouellette, Rachel R.; Xie, Ming; Stahmer, Aubyn C.; Mandell, David S.

    2015-01-01

This study sought to validate a new measure, the Classroom Cohesion Survey (CCS), designed to examine the relationship between teachers and classroom assistants in autism support classrooms. Teachers, classroom assistants, and external observers showed good inter-rater agreement on the CCS, and all scales showed good internal consistency. Simple factor structures were found for both the teacher- and classroom assistant-rated scales, with a one-factor solution for each. Paired t tests revealed that, on average, classroom assistants rated classroom cohesion higher than teachers did. The CCS may be an effective tool for measuring cohesion between classroom staff and may have an important impact on various clinical and implementation outcomes in school settings. PMID:26213443
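The paired t test the study reports compares two ratings of the same classrooms. A minimal sketch of the paired t statistic follows; the cohesion scores are hypothetical illustrations, not CCS data.

```python
import math

def paired_t(x, y):
    """Paired t statistic and degrees of freedom for matched samples x, y."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    t = mean_d / math.sqrt(var_d / n)
    return t, n - 1

# Hypothetical cohesion ratings (1-5 scale) for the same eight classrooms.
assistant = [4.2, 3.8, 4.5, 4.0, 3.9, 4.4, 4.1, 3.7]
teacher   = [3.9, 3.5, 4.1, 3.8, 3.6, 4.0, 3.9, 3.4]

t, df = paired_t(assistant, teacher)
print(f"t({df}) = {t:.2f}")  # positive t: assistants rate cohesion higher
```

Because each difference is taken within a classroom, the test controls for between-classroom variation, which is why a paired design suits matched teacher/assistant ratings.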

  13. Consistency of Toddler Engagement across Two Settings

    ERIC Educational Resources Information Center

    Aguiar, Cecilia; McWilliam, R. A.

    2013-01-01

    This study documented the consistency of child engagement across two settings, toddler child care classrooms and mother-child dyadic play. One hundred twelve children, aged 14-36 months (M = 25.17, SD = 6.06), randomly selected from 30 toddler child care classrooms from the district of Porto, Portugal, participated. Levels of engagement were…

  14. Classroom Management. Brief

    ERIC Educational Resources Information Center

    National Education Association Research Department, 2006

    2006-01-01

    In learning-centered classrooms, the emphasis of classroom management shifts from maintaining behavioral control to fostering student engagement and self-regulation as well as community responsibility. This brief describes classroom management in "learning centered" classrooms, where practices are consistent with recent research knowledge about…

  15. Classroom Discipline. Research Roundup.

    ERIC Educational Resources Information Center

    Bielefeldt, Talbot

    1989-01-01

    Recent research in classroom discipline tends to show that discipline is a by-product of effective instruction and classroom management. The five publications reviewed in this annotated bibliography explore aspects of the complex classroom environment that relate to student discipline. Walter Doyle's chapter on "Classroom Organization and…

  16. The Machine in the Classroom.

    ERIC Educational Resources Information Center

    Snider, Robert C.

    1992-01-01

Since the 1960s, the difficulty of developing a technology of instruction in public schools has proved insurmountable; results have been spotty, machines have come and gone, and classroom practices remain largely unchanged. Public clamor for reform has provided neither direction nor purpose. Technology will ultimately prevail; the problem is educating…

  17. Classroom Games: A Prisoner's Dilemma.

    ERIC Educational Resources Information Center

    Holt, Charles A.; Capra, Monica

    2000-01-01

    Describes a classroom game called the prisoner's dilemma that illustrates the conflict between social incentives to cooperate and private incentives to defect. Explains that it is a simple card game involving a large number of students. States that the students should be introduced to the real-world applications of the game. (CMK)
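The conflict the game illustrates can be made concrete with a standard payoff matrix (the numbers below are the conventional textbook values, not taken from the article): defecting is each player's best response to either move, yet mutual cooperation pays both players more than mutual defection.

```python
# Payoffs (row player, column player); C = cooperate, D = defect.
# The classic ordering T > R > P > S makes defection dominant.
PAYOFFS = {
    ("C", "C"): (3, 3),   # R: reward for mutual cooperation
    ("C", "D"): (0, 5),   # S, T: sucker's payoff vs. temptation
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),   # P: punishment for mutual defection
}

def best_response(opponent_move):
    """Move that maximizes the row player's payoff against a fixed opponent."""
    return max("CD", key=lambda m: PAYOFFS[(m, opponent_move)][0])

# Defection is the best response to either move, so (D, D) is the
# equilibrium -- even though both players would prefer (C, C).
print(best_response("C"), best_response("D"))
```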

  18. Photonics Explorer: revolutionizing photonics in the classroom

    NASA Astrophysics Data System (ADS)

    Prasad, Amrita; Debaes, Nathalie; Cords, Nina; Fischer, Robert; Vlekken, Johan; Euler, Manfred; Thienpont, Hugo

    2012-10-01

The 'Photonics Explorer' is a unique intra-curricular optics kit designed to engage, excite and educate secondary school students about the fascination of working with light - hands-on, in their own classrooms. Developed by a pan-European collaboration of experts, the kit equips teachers with class sets of experimental material provided within a supporting didactic framework and distributed in conjunction with teacher training courses. The material has been specifically designed to integrate into European science curricula. Each kit contains robust and versatile components sufficient for a class of 25-30 students working in groups of 2-3. The didactic content is based on guided inquiry-based learning (IBL) techniques, with a strong emphasis on hands-on experiments, teamwork and relating abstract concepts to real-world applications. The content has been developed in conjunction with over 30 teachers and experts in pedagogy to ensure high quality and ease of integration, and it is currently available in 7 European languages. The Photonics Explorer allows students not only to hone essential scientific skills but also to work as real scientists and engineers in the classroom. It thus aims to encourage more young people to pursue scientific careers and to avert the looming shortage of scientific workforce in Europe. 50 Photonics Explorer kits have been successfully tested in 7 European countries with over 1500 secondary school students, and the kit's positive impact in the classroom has been evaluated both qualitatively and quantitatively. A non-profit organisation, EYESTvzw [Excite Youth for Engineering Science and Technology], is responsible for the large-scale distribution of the Photonics Explorer.

  19. A classroom demonstration of reciprocal space

    NASA Astrophysics Data System (ADS)

Hannibal Madsen, Morten; Høpfner, Louise; Rasmussen, Nina; Stolborg, Mikkel; Nygård, Jesper; Feidenhans'l, Robert; Thomsen, Jan W.

    2013-04-01

    An array of nanowires and a laser pointer are used for a simple visualization of two-dimensional reciprocal space. The experiment can be performed without any preparation and in any classroom. It aids the teaching of scattering experiments, and illustrates the underlying principles of electron, x-ray, and neutron scattering. A detailed study of the diffraction pattern was performed by mounting the sample with nanowires on a stage designed for x-ray scattering. The setup is well suited for undergraduate students, who get training in sample alignment in a small lab instead of at a large-scale facility. The exact positions of the diffraction spots are calculated and monitored experimentally for a 360° rotation of the sample. By fitting to this set of images, it is possible to determine the lattice vectors of the artificial crystal with an uncertainty of less than 1%.
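The spot positions in such a demonstration follow from the 2D reciprocal lattice, defined by b_i · a_j = 2πδ_ij, together with the grating equation. The sketch below computes reciprocal basis vectors for a real-space lattice and the first-order diffraction angle; the square 2 µm pitch and green laser pointer are illustrative assumptions, not the sample described in the paper.

```python
import math

def reciprocal_2d(a1, a2):
    """Reciprocal basis of a 2D lattice, satisfying b_i . a_j = 2*pi*delta_ij."""
    det = a1[0] * a2[1] - a1[1] * a2[0]  # signed cell area
    b1 = (2 * math.pi * a2[1] / det, -2 * math.pi * a2[0] / det)
    b2 = (-2 * math.pi * a1[1] / det, 2 * math.pi * a1[0] / det)
    return b1, b2

# Hypothetical square nanowire array with 2 micrometre pitch.
a = 2e-6
b1, b2 = reciprocal_2d((a, 0.0), (0.0, a))

# First-order diffraction angle for a green laser pointer (532 nm),
# from the grating equation sin(theta) = m * lambda / a with m = 1.
wavelength = 532e-9
theta = math.degrees(math.asin(wavelength / a))
print(f"|b1| = {math.hypot(*b1):.3e} rad/m, first-order angle ~ {theta:.1f} deg")
```

Rotating the sample rotates the reciprocal lattice with it, which is why tracking the spots over a 360° rotation lets the lattice vectors be fitted, as the abstract describes.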

  20. Mendel in the Modern Classroom

    NASA Astrophysics Data System (ADS)

    Smith, Mike U.; Gericke, Niklas M.

    2015-01-01

Mendel is an icon in the history of genetics and part of our common culture and modern biology instruction. The aim of this paper is to summarize Mendel's place in the modern biology classroom and to identify the key issues that make him relevant today. First, we recount some of the historical controversies that bear on modern curricular design, such as Fisher's (Ann Sci 1:115-137, 1936/2008) claim that Mendel's data were too good to be true. We also address questions about Mendel's status as the father of genetics, as well as questions about where Mendel's work should be sequenced in genetics instruction relative to modern molecular genetics and evolution. Next, we present a systematic set of examples of research-based approaches to using Mendel in the modern classroom, along with criticisms of these designs and questions about the historical accuracy of the Mendel story as typically presented. Finally, we identify gaps in our understanding that need further study and present a selected set of resources that, along with the references cited, should be valuable to science educators interested in further study of the story of Mendel.