Science.gov

Sample records for large classroom setting

  1. Calibrated Peer Review: A New Tool for Integrating Information Literacy Skills in Writing-Intensive Large Classroom Settings

    ERIC Educational Resources Information Center

    Fosmire, Michael

    2010-01-01

    Calibrated Peer Review[TM] (CPR) is a program that can significantly enhance the ability to integrate intensive information literacy exercises into large classroom settings. CPR is founded on a solid pedagogic base for learning, and it is formulated in such a way that information skills can easily be inserted. However, there is no mention of its…

  2. Active Learning in a Large Medical Classroom Setting for Teaching Renal Physiology

    ERIC Educational Resources Information Center

    Dietz, John R.; Stevenson, Frazier T.

    2011-01-01

    In this article, the authors describe an active learning exercise which has been used to replace some lecture hours in the renal portion of an integrated, organ system-based curriculum for first-year medical students. The exercise takes place in a large auditorium with ~150 students. The authors, who are faculty members, lead the discussions,…

  3. A Classroom Tariff-Setting Game

    ERIC Educational Resources Information Center

    Winchester, Niven

    2006-01-01

    The author outlines a classroom tariff-setting game that allows students to explore the consequences of import tariffs imposed by large countries (countries able to influence world prices). Groups of students represent countries, which are organized into trading pairs. Each group's objective is to maximize welfare by choosing an appropriate ad…

  4. Impact of Problem-Based Learning in a Large Classroom Setting: Student Perception and Problem-Solving Skills

    ERIC Educational Resources Information Center

    Klegeris, Andis; Hurren, Heather

    2011-01-01

    Problem-based learning (PBL) can be described as a learning environment where the problem drives the learning. This technique usually involves learning in small groups, which are supervised by tutors. It is becoming evident that PBL in a small-group setting has a robust positive effect on student learning and skills, including better…

  5. Controlling Setting Events in the Classroom

    ERIC Educational Resources Information Center

    Chan, Paula E.

    2016-01-01

    Teachers face the challenging job of differentiating instruction for the diverse needs of their students. This task is difficult enough with happy students who are eager to learn; unfortunately students often enter the classroom in a bad mood because of events that happened outside the classroom walls. These events--called setting events--can…

  6. Classroom Management in Inclusive Settings

    ERIC Educational Resources Information Center

    Soodak, Leslie C.

    2003-01-01

    The inclusion of children with disabilities in general education classes provides an opportunity for teachers to identify classroom management policies and practices that promote diversity and community. Community-building management strategies that facilitate friendships, collaboration, parent involvement, and address challenging behaviors in a…

  7. Individualizing in Traditional Classroom Settings.

    ERIC Educational Resources Information Center

    Thornell, John G.

    1980-01-01

    Effective individualized instruction depends primarily on the teacher possessing the skills to implement it. Individualization is therefore quite compatible with the traditional self-contained elementary classroom model, but not with its alternative, departmentalization, which allows teachers neither the time flexibility nor the familiarity with…

  8. Children's Fears in the Classroom Setting.

    ERIC Educational Resources Information Center

    Johnson, Suzanne Bennett

    1979-01-01

    Fears common to the classroom setting are discussed, including school phobia, social withdrawal, and test anxiety. Incidence data, theoretical explanations, and treatment research are reviewed, and directions for future research are suggested. (Author/MH)

  9. Improvement in Generic Problem-Solving Abilities of Students by Use of Tutor-Less Problem-Based Learning in a Large Classroom Setting

    ERIC Educational Resources Information Center

    Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather

    2013-01-01

    Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such…

  10. Improvement in Generic Problem-Solving Abilities of Students by Use of Tutor-less Problem-Based Learning in a Large Classroom Setting

    PubMed Central

    Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather

    2013-01-01

    Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such techniques over classical teaching styles. Previously, we demonstrated that introduction of tutor-less PBL in a large third-year biochemistry undergraduate class increased student satisfaction and attendance. The current study assessed the generic problem-solving abilities of students from the same class at the beginning and end of the term, and compared student scores with similar data obtained in three classes not using PBL. Two generic problem-solving tests of equal difficulty were administered such that students took different tests at the beginning and the end of the term. Blinded marking showed a statistically significant 13% increase in the test scores of the biochemistry students exposed to PBL, while no trend toward significant change in scores was observed in any of the control groups not using PBL. Our study is among the first to demonstrate that use of tutor-less PBL in a large classroom leads to statistically significant improvement in generic problem-solving skills of students. PMID:23463230

  11. Implementing iPads in the Inclusive Classroom Setting

    ERIC Educational Resources Information Center

    Maich, Kimberly; Hall, Carmen

    2016-01-01

    This column provides practical suggestions to help guide teachers in utilizing classroom sets of iPads. Following a brief introduction to tablet technology in inclusive classrooms and the origin of these recommendations from a case study focus group, important elements of setting up classroom iPad use, from finding funding to teaching apps, are…

  12. Climate Setting in Second-Language Classrooms.

    ERIC Educational Resources Information Center

    Evans-Harvey, Cher

    1993-01-01

    Discusses the creation of a positive classroom climate, examines four dimensions of classroom climate (physical, academic, organizational, and social-emotional), and reviews techniques that teachers can use to promote a positive classroom climate. Teachers need to get to know their students, discuss the course objectives with their students, and…

  13. Collaboration within Large Groups in the Classroom

    ERIC Educational Resources Information Center

    Szewkis, Eyal; Nussbaum, Miguel; Rosen, Tal; Abalos, Jose; Denardin, Fernanda; Caballero, Daniela; Tagle, Arturo; Alcoholado, Cristian

    2011-01-01

    The purpose of this paper is to show how a large group of students can work collaboratively in a synchronous way within the classroom using the cheapest possible technological support. Making use of the features of Single Display Groupware and of Multiple Mice we propose a computer-supported collaborative learning approach for big groups within…

  14. Classrooms and Computers as Instructional Settings.

    ERIC Educational Resources Information Center

    Amarel, Marianne

    1983-01-01

    The influx of computers into the classroom is discussed from a teacher's point of view. The author notes teachers' reactions to the PLATO Elementary Mathematics and Reading Project (a computer-aided instructional model). (JMK)

  15. Tangential Floor in a Classroom Setting

    ERIC Educational Resources Information Center

    Marti, Leyla

    2012-01-01

    This article examines floor management in two classroom sessions: a task-oriented computer lesson and a literature lesson. Recordings made in the computer lesson show the organization of floor when a task is given to students. Temporary or "incipient" side floors (Jones and Thornborrow, 2004) emerge beside the main floor. In the literature lesson,…

  16. Analyzing Multimodal Interaction within a Classroom Setting

    ERIC Educational Resources Information Center

    Moura, Heloisa

    2006-01-01

    Human interactions are multimodal in nature. From simple to complex forms of transferal of information, human beings draw on a multiplicity of communicative modes, such as intonation and gaze, to make sense of everyday experiences. Likewise, the learning process, either within traditional classrooms or Virtual Learning Environments, is shaped by…

  17. A Practical Setting of Distance Learning Classroom.

    ERIC Educational Resources Information Center

    Wang, Shousan; Buck, Lawrence

    1996-01-01

    Describes a distance-learning classroom developed and used by Central Connecticut State University for nurse training, educational statistics, mathematics, and technology courses. Discusses initial engineering, video cameras, video source switching, lighting, audio, and other technical and related aspects. Block diagrams and lists of equipment for…

  18. Teaching Music in the Urban Classroom Set

    ERIC Educational Resources Information Center

    Frierson-Campbell, Carol Ed.

    2006-01-01

    The change needed in urban music education not only relates to the idea that music should be at the center of the curriculum; rather, it is that culturally relevant music should be a creative force at the center of reform in urban education. This set is the start of a national-level conversation aimed at making that goal a reality. In both…

  19. Student Engagement and Success in the Large Astronomy 101 Classroom

    NASA Astrophysics Data System (ADS)

    Jensen, J. B.

    2014-07-01

    The large auditorium classroom presents unique challenges to maintaining student engagement. During the fall 2012 semester, I adopted several specific strategies for increasing student engagement and reducing anonymity with the goal of maximizing student success in the large class. I measured attendance and student success in two classes, one with 300 students and one with 42, but otherwise taught as similarly as possible. While the students in the large class probably did better than they would have in a traditional lecture setting, attendance was still significantly lower in the large class, resulting in lower student success than in the small control class by all measures. I will discuss these results and compare them to classes in previous semesters, including other small classes and large Distance Education classes conducted live over a remote television link.

  20. Observation Instrument of Play Behaviour in a Classroom Setting

    ERIC Educational Resources Information Center

    Berkhout, Louise; Hoekman, Joop; Goorhuis-Brouwer, Sieneke M.

    2012-01-01

    The objective of this study was to develop an instrument to observe the play behaviour of a whole group of children from four to six years of age in a classroom setting on the basis of video recording. The instrument was developed in collaboration with experienced teachers and experts on play. Categories of play were derived from the literature…

  1. Enhancing Feedback via Peer Learning in Large Classrooms

    ERIC Educational Resources Information Center

    Zher, Ng Huey; Hussein, Raja Maznah Raja; Saat, Rohaida Mohd

    2016-01-01

    Feedback has been lauded as a key pedagogical tool in higher education. Unfortunately, the value of feedback falls short when it is delivered in large classrooms. In this study, strategies based on peer learning for sustaining feedback in large classrooms are explored. All the characteristics identified within the concept of peer learning were…

  2. Examining the Effectiveness of Team-Based Learning (TBL) in Different Classroom Settings

    ERIC Educational Resources Information Center

    Yuretich, Richard F.; Kanner, Lisa C.

    2015-01-01

    The problem of effective learning in college classrooms, especially in a large lecture setting, has been a topic of discussion for a considerable span of time. Most efforts to improve learning incorporate various forms of student-active learning, such as in-class investigations or problems, group discussions, collaborative examinations and…

  3. Radial sets: interactive visual analysis of large overlapping sets.

    PubMed

    Alsallakh, Bilal; Aigner, Wolfgang; Miksch, Silvia; Hauser, Helwig

    2013-12-01

    In many applications, data tables contain multi-valued attributes that often store the memberships of the table entities to multiple sets such as which languages a person masters, which skills an applicant documents, or which features a product comes with. With a growing number of entities, the resulting element-set membership matrix becomes very rich in information about how these sets overlap. Many analysis tasks targeted at set-typed data are concerned with these overlaps as salient features of such data. This paper presents Radial Sets, a novel visual technique to analyze set memberships for a large number of elements. Our technique uses frequency-based representations to enable quickly finding and analyzing different kinds of overlaps between the sets, and relating these overlaps to other attributes of the table entities. Furthermore, it enables various interactions to select elements of interest, find out if they are over-represented in specific sets or overlaps, and if they exhibit a different distribution for a specific attribute compared to the rest of the elements. These interactions allow formulating highly expressive visual queries on the elements in terms of their set memberships and attribute values. As we demonstrate via two usage scenarios, Radial Sets enable revealing and analyzing a multitude of overlapping patterns between large sets, beyond the limits of state-of-the-art techniques. PMID:24051816
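    The frequency counts that drive such a view can be computed directly from the element-set membership matrix the abstract describes. A minimal Python sketch (the sets, data, and variable names are illustrative assumptions, not taken from the paper):

      import numpy as np

      # Toy element-set membership matrix: rows are elements, columns are sets;
      # a True entry means "element i belongs to set j".
      set_names = ["English", "French", "Python", "SQL"]
      membership = np.array([
          [1, 0, 1, 1],
          [1, 1, 0, 0],
          [0, 1, 1, 0],
          [1, 0, 0, 0],
          [1, 1, 1, 1],
      ], dtype=bool)

      set_sizes = membership.sum(axis=0)                      # elements per set
      degrees = membership.sum(axis=1)                        # sets per element (overlap degree)
      pairwise = membership.T.astype(int) @ membership.astype(int)  # pairwise overlap counts

      for name, size in zip(set_names, set_sizes):
          print(name, int(size))
      print("degree distribution:", np.bincount(degrees))
      print("pairwise overlaps:\n", pairwise)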

  4. Teacher and Student Research Using Large Data Sets

    NASA Astrophysics Data System (ADS)

    Croft, S. K.; Pompea, S. M.; Sparks, R. T.

    2005-12-01

    One of the objectives of teacher research experiences is to immerse the teacher in an authentic research situation to help the teacher understand what real research is all about: "to do science as scientists do." Experiences include doing experiments in laboratories, gathering data out in the field, and observing at professional observatories. However, a rapidly growing area of scientific research is in "data mining" increasingly large public data archives. In the earth and space sciences, such large archives are built around data from Landsat 7, the Sloan Digital Sky Survey, and in about seven years, the Large Synoptic Survey Telescope. The LSST will re-photograph the entire night sky every three days, resulting in a data flow of about 20 terabytes per night. The resulting LSST archive will represent a huge challenge simply in storage and retrieval for professional scientists. It will be a much greater challenge to help K-12 teachers use such gargantuan files and collections of data effectively in the classroom and to understand and begin to practice the new research procedures involved in data mining. At NOAO we are exploring ways of using large data sets in formal educational settings like classrooms, and public settings like planetariums and museums. In our existing professional development programs, such as our Teacher Leaders in Research Based Science Education, we have introduced teachers to research via on-site observing experiences and partnerships with active astronomers. To successfully initiate research in the classroom, we have found that teachers need training in specific science content, use of specialized software to work with the data, development of research questions and objectives, and explicit pedagogical strategies for classroom use. Our research projects are well defined, though not "canned," and incorporate specific types of data, such as solar images. These data can be replaced with new data from an archive for the classroom research

  5. A Student Response System in an Electronic Classroom: Technology Aids for Large Classroom Instruction

    NASA Astrophysics Data System (ADS)

    Ober, D.; Errington, P.; Islam, S.; Robertson, T.; Watson, J.

    1997-10-01

    In the fall of 1996, thirteen (13) classrooms on the Ball State campus were equipped with technological aids to enhance learning in large classrooms (for typically 100 students or larger). Each classroom was equipped with the following built-in equipment: computer, zip drive, laser disc player, VCR, LAN and Internet connection, TV monitors, and Elmo overhead camera with large-screen projection system. This past fall semester a student response system was added to a 108-seat classroom in the Physics and Astronomy department for use with large General Education courses. Each student seat was equipped with a hardwired hand-held unit possessing input capabilities and LCD feedback for the student. The student response system was introduced to encourage more active learning by students in the large classroom environment. Attendance, quizzes, hour exams, and in-class surveys are early uses for the system; initial reactions by student and faculty users will be given.

  6. Observations of Children’s Interactions with Teachers, Peers, and Tasks across Preschool Classroom Activity Settings

    PubMed Central

    Booren, Leslie M.; Downer, Jason T.; Vitiello, Virginia E.

    2014-01-01

    This descriptive study examined classroom activity settings in relation to children’s observed behavior during classroom interactions, child gender, and basic teacher behavior within the preschool classroom. 145 children were observed for an average of 80 minutes during 8 occasions across 2 days using the inCLASS, an observational measure that conceptualizes behavior into teacher, peer, task, and conflict interactions. Findings indicated that on average children’s interactions with teachers were higher in teacher-structured settings, such as large group. On average, children’s interactions with peers and tasks were more positive in child-directed settings, such as free choice. Children experienced more conflict during recess and routines/transitions. Finally, gender differences were observed within small group and meals. The implications of these findings might encourage teachers to be thoughtful and intentional about what types of support and resources are provided so children can successfully navigate the demands of particular settings. These findings are not meant to discourage certain teacher behaviors or imply value of certain classroom settings; instead, by providing an evidenced-based picture of the conditions under which children display the most positive interactions, teachers can be more aware of choices within these settings and have a powerful way to assist in professional development and interventions. PMID:25717282

  7. Using Flipped Classroom Approach to Explore Deep Learning in Large Classrooms

    ERIC Educational Resources Information Center

    Danker, Brenda

    2015-01-01

    This project used two Flipped Classroom approaches to stimulate deep learning in large classrooms during the teaching of a film module as part of a Diploma in Performing Arts course at Sunway University, Malaysia. The flipped classes utilized either a blended learning approach where students first watched online lectures as homework, and then…

  8. Silent Students' Participation in a Large Active Learning Science Classroom

    ERIC Educational Resources Information Center

    Obenland, Carrie A.; Munson, Ashlyn H.; Hutchinson, John S.

    2012-01-01

    Active learning in large science classrooms furthers opportunities for students to engage in the content and in meaningful learning, yet students can still remain anonymously silent. This study aims to understand the impact of active learning on these silent students in a large General Chemistry course taught via Socratic questioning and…

  9. Teaching the Assessment of Normality Using Large Easily-Generated Real Data Sets

    ERIC Educational Resources Information Center

    Kulp, Christopher W.; Sprechini, Gene D.

    2016-01-01

    A classroom activity is presented, which can be used in teaching students statistics with an easily generated, large, real world data set. The activity consists of analyzing a video recording of an object. The colour data of the recorded object can then be used as a data set to explore variation in the data using graphs including histograms,…

  10. Lessons Learned from a Multiculturally, Economically Diverse Classroom Setting.

    ERIC Educational Resources Information Center

    Lyman, Lawrence

    For her sabbatical a professor of teacher education at Emporia State University returned to the elementary classroom after a 20-year absence to teach in a third/fourth combination classroom in the Emporia, Kansas Public Schools. The return to elementary classroom teaching provided the professor with the opportunity to utilize some of the social…

  11. Observations of Children's Interactions with Teachers, Peers, and Tasks across Preschool Classroom Activity Settings

    ERIC Educational Resources Information Center

    Booren, Leslie M.; Downer, Jason T.; Vitiello, Virginia E.

    2012-01-01

    Research Findings: This descriptive study examined classroom activity settings in relation to children's observed behavior during classroom interactions, child gender, and basic teacher behavior within the preschool classroom. A total of 145 children were observed for an average of 80 min during 8 occasions across 2 days using the Individualized…

  12. Designing an Electronic Classroom for Large College Courses.

    ERIC Educational Resources Information Center

    Aiken, Milam W.; Hawley, Delvin D.

    1995-01-01

    Describes a state-of-the-art electronic classroom at the University of Mississippi School of Business designed for large numbers of students and regularly scheduled classes. Highlights include: architecture of the room, hardware components, software utilized in the room, and group decision support system software and its uses. (JKP)

  13. On flipping the classroom in large first year calculus courses

    NASA Astrophysics Data System (ADS)

    Jungić, Veselin; Kaur, Harpreet; Mulholland, Jamie; Xin, Cindy

    2015-05-01

    Over the course of two years, 2012--2014, we have implemented a 'flipping' the classroom approach in three of our large enrolment first year calculus courses: differential and integral calculus for scientists and engineers. In this article we describe the details of our particular approach and share with the reader some experiences of both instructors and students.

  14. On Flipping the Classroom in Large First Year Calculus Courses

    ERIC Educational Resources Information Center

    Jungic, Veselin; Kaur, Harpreet; Mulholland, Jamie; Xin, Cindy

    2015-01-01

    Over the course of two years, 2012-2014, we have implemented a "flipping" the classroom approach in three of our large enrolment first year calculus courses: differential and integral calculus for scientists and engineers. In this article we describe the details of our particular approach and share with the reader some experiences of…

  15. Treatment of Encopresis in a Classroom Setting: A Case Study

    ERIC Educational Resources Information Center

    Scott, E.

    1977-01-01

    This study describes the procedure and results of a behavior modification program carried out in the classroom and aimed at eliminating encopresis (involuntary defecation) in an 8-year-old boy. (Editor/RK)

  16. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    ERIC Educational Resources Information Center

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  17. The Emergence of Student Creativity in Classroom Settings: A Case Study of Elementary Schools in Korea

    ERIC Educational Resources Information Center

    Cho, Younsoon; Chung, Hye Young; Choi, Kyoulee; Seo, Choyoung; Baek, Eunjoo

    2013-01-01

    This research explores the emergence of student creativity in classroom settings, specifically within two content areas: science and social studies. Fourteen classrooms in three elementary schools in Korea were observed, and the teachers and students were interviewed. The three types of student creativity emerging in the teaching and learning…

  18. The Categorical Facilitation Effects on L2 Vocabulary Learning in a Classroom Setting

    ERIC Educational Resources Information Center

    Hoshino, Yuko

    2010-01-01

    In the field of vocabulary acquisition, there have been many studies on the efficacy of word lists. However, very few of these were based on research in a classroom setting, and therefore, their results may not be applicable to standard classroom situations. This study investigated which of the five types of word lists (synonyms, antonyms,…

  19. Clickers in the large classroom: current research and best-practice tips.

    PubMed

    Caldwell, Jane E

    2007-01-01

    Audience response systems (ARS) or clickers, as they are commonly called, offer a management tool for engaging students in the large classroom. Basic elements of the technology are discussed. These systems have been used in a variety of fields and at all levels of education. Typical goals of ARS questions are discussed, as well as methods of compensating for the reduction in lecture time that typically results from their use. Examples of ARS use occur throughout the literature and often detail positive attitudes from both students and instructors, although exceptions do exist. When used in classes, ARS clickers typically have either a benign or positive effect on student performance on exams, depending on the method and extent of their use, and create a more positive and active atmosphere in the large classroom. These systems are especially valuable as a means of introducing and monitoring peer learning methods in the large lecture classroom. So that the reader may use clickers effectively in his or her own classroom, a set of guidelines for writing good questions and a list of best-practice tips have been culled from the literature and experienced users. PMID:17339389

  20. Clickers in the Large Classroom: Current Research and Best-Practice Tips

    PubMed Central

    2007-01-01

    Audience response systems (ARS) or clickers, as they are commonly called, offer a management tool for engaging students in the large classroom. Basic elements of the technology are discussed. These systems have been used in a variety of fields and at all levels of education. Typical goals of ARS questions are discussed, as well as methods of compensating for the reduction in lecture time that typically results from their use. Examples of ARS use occur throughout the literature and often detail positive attitudes from both students and instructors, although exceptions do exist. When used in classes, ARS clickers typically have either a benign or positive effect on student performance on exams, depending on the method and extent of their use, and create a more positive and active atmosphere in the large classroom. These systems are especially valuable as a means of introducing and monitoring peer learning methods in the large lecture classroom. So that the reader may use clickers effectively in his or her own classroom, a set of guidelines for writing good questions and a list of best-practice tips have been culled from the literature and experienced users. PMID:17339389

  1. Setting of Classroom Environments for Hearing Impaired Children

    ERIC Educational Resources Information Center

    Turan, Zerrin

    2007-01-01

    This paper aims to explain the effects of acoustical environments on the sound perception of hearing-impaired people. Important aspects of sound and hearing impairment are explained. Detrimental factors in acoustic conditions for speech perception are mentioned. Necessary acoustic treatment in classrooms and use of FM systems to eliminate these factors…

  2. Researching Pupil Attending Behavior within Naturalistic Classroom Settings.

    ERIC Educational Resources Information Center

    Brooks, Douglas M.; Rogers, Constance J.

    1981-01-01

    Examines the relationship between teacher attitudes toward students and visual attending behavior in the classroom. Thirty-two students were identified in four categories, subsequently labeled accepted, indifferent, concerned and rejected. Results indicated significant differences in visual attending behavior and a two-way interaction with pupil…

  3. Thinking Routines: Replicating Classroom Practices within Museum Settings

    ERIC Educational Resources Information Center

    Wolberg, Rochelle Ibanez; Goff, Allison

    2012-01-01

    This article describes thinking routines as tools to guide and support young children's thinking. These learning strategies, developed by Harvard University's Project Zero Classroom, actively engage students in constructing meaning while also understanding their own thinking process. The authors discuss how thinking routines can be used in both…

  4. Understanding Bystander Perceptions of Cyberbullying in Inclusive Classroom Settings

    ERIC Educational Resources Information Center

    Guckert, Mary

    2013-01-01

    Cyberbullying is a pervasive problem that puts at risk students' academic success and their ability to feel safe in school. As most students with disabilities are served in inclusive classrooms, there is a growing concern that students with special needs are at an increased risk of online bullying harassment. Enhancing responsible…

  5. Twelve Practical Strategies To Prevent Behavioral Escalation in Classroom Settings.

    ERIC Educational Resources Information Center

    Shukla-Mehta, Smita; Albin, Richard W.

    2003-01-01

    Twelve practical strategies that can be used by classroom teachers to prevent behavioral escalation are discussed, including reinforce calm, know the triggers, pay attention to anything unusual, do not escalate, intervene early, know the function of problem behavior, use extinction wisely, teach prosocial behavior, and teach academic survival…

  6. Knowledge Discovery in Large Data Sets

    SciTech Connect

    Simas, Tiago; Silva, Gabriel; Miranda, Bruno; Ribeiro, Rita

    2008-12-05

    In this work we briefly address the problem of unsupervised classification on large datasets, on the order of 100,000,000 objects. The objects are variable objects, which make up around 10% of the 1,000,000,000 astronomical objects that will be collected by the GAIA/ESA mission. We tested unsupervised classification algorithms on known datasets such as the OGLE and Hipparcos catalogs. Moreover, we are building several templates to represent the main classes of variable objects, as well as new classes, to build a synthetic dataset of this dimension. In the future we will run the GAIA satellite scanning law on these templates to obtain a testable large dataset.

  7. Introduction to comparing large sequence sets.

    PubMed

    Page, Roderic D M

    2003-02-01

    Comparisons of whole genomes can yield important insights into the evolution of genome structure, such as the role of inversions in bacterial evolution and the identification of large-scale duplications in the human genome. This unit briefly compares two tools for aligning whole genome sequences: MUMmer and PipMaker. These tools differ both in the underlying algorithms used and in the interfaces they present to the user. PMID:18428691

  8. Activity Settings and Daily Routines in Preschool Classrooms: Diverse Experiences in Early Learning Settings for Low-Income Children

    ERIC Educational Resources Information Center

    Fuligni, Allison Sidle; Howes, Carollee; Huang, Yiching; Hong, Sandra Soliday; Lara-Cinisomo, Sandraluz

    2012-01-01

    This paper examines activity settings and daily classroom routines experienced by 3- and 4-year-old low-income children in public center-based preschool programs, private center-based programs, and family child care homes. Two daily routine profiles were identified using a time-sampling coding procedure: a High Free-Choice pattern in which…

  9. Activity Settings and Daily Routines in Preschool Classrooms: Diverse Experiences in Early Learning Settings for Low-Income Children

    PubMed Central

    Fuligni, Allison Sidle; Howes, Carollee; Huang, Yiching; Hong, Sandra Soliday; Lara-Cinisomo, Sandraluz

    2011-01-01

    This paper examines activity settings and daily classroom routines experienced by 3- and 4-year-old low-income children in public center-based preschool programs, private center-based programs, and family child care homes. Two daily routine profiles were identified using a time-sampling coding procedure: a High Free-Choice pattern in which children spent a majority of their day engaged in child-directed free-choice activity settings combined with relatively low amounts of teacher-directed activity, and a Structured-Balanced pattern in which children spent relatively equal proportions of their day engaged in child-directed free-choice activity settings and teacher-directed small- and whole-group activities. Daily routine profiles were associated with program type and curriculum use but not with measures of process quality. Children in Structured-Balanced classrooms had more opportunities to engage in language and literacy and math activities, whereas children in High Free-Choice classrooms had more opportunities for gross motor and fantasy play. Being in a Structured-Balanced classroom was associated with children’s language scores but profiles were not associated with measures of children’s math reasoning or socio-emotional behavior. Consideration of teachers’ structuring of daily routines represents a valuable way to understand nuances in the provision of learning experiences for young children in the context of current views about developmentally appropriate practice and school readiness. PMID:22665945

  10. Activity Settings and Daily Routines in Preschool Classrooms: Diverse Experiences in Early Learning Settings for Low-Income Children.

    PubMed

    Fuligni, Allison Sidle; Howes, Carollee; Huang, Yiching; Hong, Sandra Soliday; Lara-Cinisomo, Sandraluz

    2012-06-01

    This paper examines activity settings and daily classroom routines experienced by 3- and 4-year-old low-income children in public center-based preschool programs, private center-based programs, and family child care homes. Two daily routine profiles were identified using a time-sampling coding procedure: a High Free-Choice pattern in which children spent a majority of their day engaged in child-directed free-choice activity settings combined with relatively low amounts of teacher-directed activity, and a Structured-Balanced pattern in which children spent relatively equal proportions of their day engaged in child-directed free-choice activity settings and teacher-directed small- and whole-group activities. Daily routine profiles were associated with program type and curriculum use but not with measures of process quality. Children in Structured-Balanced classrooms had more opportunities to engage in language and literacy and math activities, whereas children in High Free-Choice classrooms had more opportunities for gross motor and fantasy play. Being in a Structured-Balanced classroom was associated with children's language scores but profiles were not associated with measures of children's math reasoning or socio-emotional behavior. Consideration of teachers' structuring of daily routines represents a valuable way to understand nuances in the provision of learning experiences for young children in the context of current views about developmentally appropriate practice and school readiness. PMID:22665945

  11. Large-N in Volcano Settings: Volcanosri

    NASA Astrophysics Data System (ADS)

    Lees, J. M.; Song, W.; Xing, G.; Vick, S.; Phillips, D.

    2014-12-01

    We seek a paradigm shift in the approach we take to volcano monitoring, where the compromise from high fidelity to large numbers of sensors is used to increase coverage and resolution. Accessibility, danger, and the risk of equipment loss require that we develop systems that are independent and inexpensive. Furthermore, rather than simply recording data to disk for later analysis, we desire a system that will work autonomously, capitalizing on wireless technology and in-field network analysis. To this end we are currently producing a low-cost seismic array which will incorporate, at the very basic level, seismological tools for first-cut analysis of a volcano in crisis mode. At the advanced end we expect to perform tomographic inversions in the network in near real time. Geophone (4 Hz) sensors connected to a low-cost recording system will be installed on an active volcano, where triggering, earthquake location, and velocity analysis will take place independently of human interaction. Stations are designed to be inexpensive and possibly disposable. In one of the first implementations the seismic nodes consist of an Arduino Due processor board with an attached Seismic Shield. The Arduino Due processor board contains an Atmel SAM3X8E ARM Cortex-M3 CPU. This 32-bit, 84 MHz processor can filter and perform coarse seismic event detection on a 1600-sample signal in fewer than 200 milliseconds. The Seismic Shield contains a GPS module, a 900 MHz high-power mesh network radio, an SD card, a seismic amplifier, and a 24-bit ADC. External sensors can be attached either to this 24-bit ADC or to the internal multichannel 12-bit ADC contained on the Arduino Due processor board. This allows the node to support attachment of multiple sensors. By utilizing a high-speed 32-bit processor, complex signal-processing tasks can be performed simultaneously on multiple sensors. Using a 10 W solar panel, a second system being developed can run autonomously and collect data on 3 channels at 100 Hz for 6 months
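    The abstract does not name the on-node detection algorithm; a common choice for coarse triggering of this kind is a short-term-average / long-term-average (STA/LTA) ratio. A rough sketch of that idea on a 1600-sample window, written in Python for readability (the window lengths and threshold are illustrative assumptions, not values from the paper):

      import numpy as np

      def sta_lta_trigger(signal, sta_len=40, lta_len=400, threshold=3.0):
          """Flag a window as an event if the short-term signal energy exceeds
          the long-term background energy by a fixed ratio."""
          energy = signal.astype(float) ** 2
          csum = np.concatenate(([0.0], np.cumsum(energy)))
          sta = (csum[sta_len:] - csum[:-sta_len]) / sta_len    # short-term mean
          lta = (csum[lta_len:] - csum[:-lta_len]) / lta_len    # long-term mean
          n = min(sta.size, lta.size)                           # align at window end
          ratio = sta[-n:] / np.maximum(lta[-n:], 1e-12)
          return bool(np.any(ratio > threshold))

      # Example: 1600 samples of noise with a burst in the middle.
      rng = np.random.default_rng(0)
      window = rng.normal(0.0, 1.0, 1600)
      window[800:900] += rng.normal(0.0, 8.0, 100)
      print("event detected:", sta_lta_trigger(window))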

  12. Content-Based Instruction for English Language Learners: An Exploration across Multiple Classroom Settings

    ERIC Educational Resources Information Center

    Park, Seo Jung

    2009-01-01

    This study explored the content-based literacy instruction of English language learners (ELLs) across multiple classroom settings in U.S. elementary schools. The following research questions guided the study: (a) How are ELLs taught English in two types of instructional settings: regular content-area literacy instruction in the all-English…

  13. The Transition of Women from the Classroom Setting to the Educational Administration Setting

    ERIC Educational Resources Information Center

    Zachreson, Sarah A.

    2011-01-01

    This qualitative case study examined how female teachers perceived the potential challenges of becoming a principal, and how those perceptions actually changed as they made the move from the classroom to the principal's office. The purpose of the study is to investigate how female administrative candidates assessed and…

  14. Large Data at Small Universities: Astronomical processing using a computer classroom

    NASA Astrophysics Data System (ADS)

    Fuller, Nathaniel James; Clarkson, William I.; Fluharty, Bill; Belanger, Zach; Dage, Kristen

    2016-06-01

    The use of large computing clusters for astronomy research is becoming more commonplace as datasets expand, but access to these required resources is sometimes difficult for research groups working at smaller universities. As an alternative to purchasing processing time on an off-site computing cluster, or purchasing dedicated hardware, we show how one can easily build a crude on-site cluster by utilizing idle cycles on instructional computers in computer-lab classrooms. Since these computers are maintained as part of the educational mission of the university, the resource impact on the investigator is generally low. By using open source Python routines, it is possible to have a large number of desktop computers working together via a local network to sort through large data sets. By running traditional analysis routines in an “embarrassingly parallel” manner, gains in speed are accomplished without requiring the investigator to learn how to write routines using highly specialized methodology. We demonstrate this concept here applied to (1) photometry of large-format images and (2) statistical significance tests for X-ray light-curve analysis. In these scenarios, we see a speed-up factor which scales almost linearly with the number of cores in the cluster. Additionally, we show that the usage of the cluster does not severely limit performance for a local user, and indeed the processing can be performed while the computers are in use for classroom purposes.
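    A rough sketch of the “embarrassingly parallel” pattern described above, reduced to a single machine and the Python standard library; dispatching the same work list across networked lab computers, as the authors do, would need an additional job-distribution layer that is not shown. The file pattern and the analysis routine are placeholders:

      from multiprocessing import Pool
      import glob

      def analyze(frame_path):
          """Placeholder per-frame task (e.g., photometry on one image).
          A real routine would open the file and measure sources in it."""
          return frame_path, 0.0

      if __name__ == "__main__":
          frames = sorted(glob.glob("data/*.fits"))   # hypothetical file layout
          # Frames are independent of one another, so a process pool scales
          # roughly linearly with the number of available cores.
          with Pool() as pool:
              results = pool.map(analyze, frames)
          print("processed", len(results), "frames")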

  15. Performance in an Online Introductory Course in a Hybrid Classroom Setting

    ERIC Educational Resources Information Center

    Aly, Ibrahim

    2013-01-01

    This study compared the academic achievement between undergraduate students taking an introductory managerial accounting course online (N = 104) and students who took the same course in a hybrid classroom setting (N = 203). Student achievement was measured using scores from twelve weekly online assignments, two major online assignments, a final…

  16. Comparing Asynchronous Online Discussions and Face-to-Face Discussions in a Classroom Setting

    ERIC Educational Resources Information Center

    Wang, Qiyun; Woo, Huay Lit

    2007-01-01

    The purpose of this study is to investigate the perceived differences between asynchronous online discussions and face-to-face discussions in a classroom setting. The students' reflections were analysed by following a qualitative research approach. The results showed that atmosphere, response, efficiency, interactivity and communication were the…

  17. Enhancing Knowledge Transfer in Classroom versus Online Settings: The Interplay among Instructor, Student, Content, and Context

    ERIC Educational Resources Information Center

    Nemanich, Louise; Banks, Michael; Vera, Dusya

    2009-01-01

    This article integrates management education and organizational learning theories to identify the factors that drive the differences in student outcomes between the online and classroom settings. We draw upon theory on knowledge transfer barriers in organizations to understand the interlinking relationships among presage conditions, deep learning…

  18. Generalizability and Decision Studies to Inform Observational and Experimental Research in Classroom Settings

    ERIC Educational Resources Information Center

    Bottema-Beutel, Kristen; Lloyd, Blair; Carter, Erik W.; Asmus, Jennifer M.

    2014-01-01

    Attaining reliable estimates of observational measures can be challenging in school and classroom settings, as behavior can be influenced by multiple contextual factors. Generalizability (G) studies can enable researchers to estimate the reliability of observational data, and decision (D) studies can inform how many observation sessions are…

  19. How Passive-Aggressive Behavior in Emotionally Disturbed Children Affects Peer Interactions in a Classroom Setting.

    ERIC Educational Resources Information Center

    Hardt, Janet

    Passive-aggressive behavior in an emotionally disturbed child affects the child's progress and affects peer interactions in classroom settings. Passive-aggressive personalities are typically helpless, dependent, impulsive, overly anxious, poorly oriented to reality, and procrastinating. The characteristics of passive-aggressive children need to be…

  20. Descriptive Analysis of Classroom Setting Events on the Social Behaviors of Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Boyd, Brian A.; Conroy, Maureen A.; Asmus, Jennifer M.; McKenney, Elizabeth L. W.; Mancil, G. Richmond

    2008-01-01

    Children with Autism Spectrum Disorder (ASD) are characterized by extreme deficits in social relatedness with same-age peers. The purpose of this descriptive study was to identify naturally occurring antecedent variables (i.e., setting events) in the classroom environments of children with ASD that promoted their engagement in peer-related social…

  1. Mobile-IT Education (MIT.EDU): M-Learning Applications for Classroom Settings

    ERIC Educational Resources Information Center

    Sung, M.; Gips, J.; Eagle, N.; Madan, A.; Caneel, R.; DeVaul, R.; Bonsen, J.; Pentland, A.

    2005-01-01

    In this paper, we describe the Mobile-IT Education (MIT.EDU) system, which demonstrates the potential of using a distributed mobile device architecture for rapid prototyping of wireless mobile multi-user applications for use in classroom settings. MIT.EDU is a stable, accessible system that combines inexpensive, commodity hardware, a flexible…

  2. The Impact of Physical Settings on Pre-Schoolers Classroom Organization

    ERIC Educational Resources Information Center

    Tadjic, Mirko; Martinec, Miroslav; Farago, Amalija

    2015-01-01

    The physical setting plays an important role in the lives of pre-schoolers and can be an important component of children's experience and development when it is wisely and meaningfully designed. The classroom organization enhances and supports pre-schoolers' capability to perform activities themselves, to initiate and finish tasks, and creates the…

  3. Civility in the University Classroom: An Opportunity for Faculty to Set Expectations

    ERIC Educational Resources Information Center

    Ward, Chris; Yates, Dan

    2014-01-01

    This research examines the types of uncivil behaviors frequently encountered in university classrooms. These behaviors range from walking into class late to texting during class to sending unprofessional emails. Such behaviors can often undermine a professor's teaching. Setting reasonable and consistent expectations is a combination of university policy,…

  4. Conceptualizing the Classroom of Target Students: A Qualitative Investigation of Panelists' Experiences during Standard Setting

    ERIC Educational Resources Information Center

    Hein, Serge F.; Skaggs, Gary

    2010-01-01

    Increasingly, research has focused on the cognitive processes associated with various standard-setting activities. This qualitative study involved an examination of 16 third-grade reading teachers' experiences with the cognitive task of conceptualizing an entire classroom of hypothetical target students when the single-passage bookmark method or…

  5. Use of Big-Screen Films in Multiple Childbirth Education Classroom Settings

    PubMed Central

    Kaufman, Tamara

    2010-01-01

    Although two recent films, Orgasmic Birth and Pregnant in America, were intended for the big screen, they can also serve as valuable teaching resources in multiple childbirth education settings. Each film conveys powerful messages about birth and today's birthing culture. Depending on a childbirth educator's classroom setting (hospital, birthing center, or home birth environment), particular portions in each film, along with extra clips featured on the films' DVDs, can enhance an educator's curriculum and spark compelling discussions with class participants. PMID:21358831

  6. Comparing Functional Analysis and Paired-choice Assessment Results in Classroom Settings

    PubMed Central

    Berg, Wendy K; Wacker, David P; Cigrand, Karla; Merkle, Steve; Wade, Jeanie; Henry, Kim; Wang, Yu-Chia

    2007-01-01

    The results of a functional analysis of problem behavior and a paired-choice assessment were compared to determine whether the same social reinforcers were identified for problem behavior and an appropriate response (time allocation). The two assessments were conducted in classroom settings with 4 adolescents with mental retardation who engaged in severe problem behavior. Each student's classroom teacher served as the therapist for all phases of assessment. The two assessment procedures identified the same social reinforcers for problem and appropriate behavior for 3 of 4 participants. PMID:17970268

  7. The Relation between High School Teacher Sense of Teaching Efficacy and Self-Reported Attitudes toward the Inclusive Classroom Settings

    ERIC Educational Resources Information Center

    Wright, Heather Dillehay

    2013-01-01

    The purpose of this study was to investigate if collective sense of teaching efficacy, general sense of teaching efficacy, or personal sense of teacher efficacy influenced teacher attitude toward inclusive classroom settings. Additionally, the study sought to determine if teacher attitude toward inclusive classroom settings differed when taking…

  8. Teaching Methodology in a "Large Power Distance" Classroom: A South Korean Context

    ERIC Educational Resources Information Center

    Jambor, Paul Z.

    2005-01-01

    This paper looks at South Korea as an example of a collectivist society having a rather large power distance dimension value. In a traditional Korean classroom the teacher is at the top of the classroom hierarchy, while the students are the passive participants. Gender and age play a role in the hierarchy between students themselves. Teaching…

  9. Strategies for Engaging FCS Learners in a Large-Format Classroom: Embedded Videos

    ERIC Educational Resources Information Center

    Leslie, Catherine Amoroso

    2014-01-01

    This article presents a method for utilizing technology to increase student engagement in large classroom formats. In their lives outside the classroom, students spend considerable time interfacing with media, and they are receptive to information conveyed in electronic formats. Research has shown that multimedia is an effective learning resource;…

  10. INTERIOR VIEW, SETTING LARGE CORE WITH ASSISTANCE FROM THE OVERHEAD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW, SETTING LARGE CORE WITH ASSISTANCE FROM THE OVERHEAD RAIL CRANE IN BOX FLOOR MOLD AREA (WORKERS: DAN T. WELLS AND TRUMAN CARLISLE). - Stockham Pipe & Fittings Company, Ductile Iron Foundry, 4000 Tenth Avenue North, Birmingham, Jefferson County, AL

  11. [The BASYS observation system for the analysis of aggressive behavior in classroom-settings].

    PubMed

    Wettstein, Alexander

    2012-01-01

    Educational or therapeutic interventions for aggressive student behavior are often based on the judgments of teachers. However, empirical studies show that the objectivity of these judgments is generally low. In order to assess aggressive behavior in classroom settings, we developed a context-sensitive observational system. The observation system exists in a version for teachers in action as well as a version for the uninvolved observer. The teacher version allows aggressive behavior to be categorized while teaching. The aim is to differentiate the perceptions and judgments of teachers, so that the judgments can serve as trustworthy diagnostic information. The version for an independent observer additionally contains categories to collect information about the context in which aggression takes place. The behavior observation system was tested in four field studies in regular and special classes. The empirical results show that, after training, teachers were able to make objective observations, and that aggressive behavior depends to a large extent on situational factors. The system allows identification of problematic person-environment relationships and the derivation of intervention measures. PMID:22748725

  12. Generalizability and decision studies to inform observational and experimental research in classroom settings.

    PubMed

    Bottema-Beutel, Kristen; Lloyd, Blair; Carter, Erik W; Asmus, Jennifer M

    2014-11-01

    Attaining reliable estimates of observational measures can be challenging in school and classroom settings, as behavior can be influenced by multiple contextual factors. Generalizability (G) studies can enable researchers to estimate the reliability of observational data, and decision (D) studies can inform how many observation sessions are necessary to achieve a criterion level of reliability. We conducted G and D studies using observational data from a randomized control trial focusing on social and academic participation of students with severe disabilities in inclusive secondary classrooms. Results highlight the importance of anchoring observational decisions to reliability estimates from existing or pilot data sets. We outline steps for conducting G and D studies and address options when reliability estimates are lower than desired. PMID:25354126
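    For readers unfamiliar with the mechanics: in the simplest one-facet design (persons crossed with observation sessions), the G study estimates variance components and the D study projects reliability for a planned number of sessions. A sketch of the generic textbook formula (the study's actual design may involve more facets):

      $E\rho^2(n_o) \;=\; \dfrac{\sigma^2_p}{\sigma^2_p + \sigma^2_{po,e}/n_o}$

    where $\sigma^2_p$ is the variance attributable to the objects of measurement, $\sigma^2_{po,e}$ is the person-by-occasion interaction confounded with residual error, and $n_o$ is the number of observation sessions averaged over. The D-study question of how many sessions are necessary amounts to finding the smallest $n_o$ that pushes $E\rho^2$ past a chosen criterion (e.g., 0.80).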

  13. Adaptive, multiresolution visualization of large data sets using parallel octrees.

    SciTech Connect

    Freitag, L. A.; Loy, R. M.

    1999-06-10

    The interactive visualization and exploration of large scientific data sets is a challenging task; their size often far exceeds the performance and memory capacity of even the most powerful graphics workstations. To address this problem, we have created a technique that combines hierarchical data reduction methods with parallel computing to allow interactive exploration of large data sets while retaining full-resolution capability. The hierarchical representation is built in parallel by strategically inserting field data into an octree data structure. We provide functionality that allows the user to interactively adapt the resolution of the reduced data sets so that resolution is increased in regions of interest without sacrificing local graphics performance. We describe the creation of the reduced data sets using a parallel octree, the software architecture of the system, and the performance of this system on data from a Rayleigh-Taylor instability simulation.
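    A minimal, serial sketch of the kind of hierarchical reduction described above: field samples are inserted into an octree and every node keeps a running mean, so coarser levels can stand in for full-resolution data outside the regions of interest (the parallel construction and the rendering path in the paper are not reproduced; the names and depth limit are illustrative):

      class OctreeNode:
          """Octree node storing a running mean of the field values inserted
          beneath it, giving a reduced-resolution representation of the data."""

          def __init__(self, center, half_size, depth=0, max_depth=5):
              self.center, self.half_size = center, half_size
              self.depth, self.max_depth = depth, max_depth
              self.children = {}              # octant index (0-7) -> OctreeNode
              self.value_sum, self.count = 0.0, 0

          def _octant(self, point):
              return sum(1 << i for i in range(3) if point[i] >= self.center[i])

          def insert(self, point, value):
              self.value_sum += value
              self.count += 1
              if self.depth == self.max_depth:
                  return
              idx = self._octant(point)
              if idx not in self.children:
                  h = self.half_size / 2.0
                  child_center = tuple(
                      self.center[i] + (h if point[i] >= self.center[i] else -h)
                      for i in range(3))
                  self.children[idx] = OctreeNode(child_center, h,
                                                  self.depth + 1, self.max_depth)
              self.children[idx].insert(point, value)

          def mean(self):
              return self.value_sum / self.count if self.count else 0.0

      # Usage: insert samples, then read node means down to a chosen depth.
      root = OctreeNode(center=(0.5, 0.5, 0.5), half_size=0.5)
      root.insert((0.1, 0.2, 0.9), 3.7)
      root.insert((0.8, 0.8, 0.1), 1.2)
      print(root.mean())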

  14. Looking at large data sets using binned data plots

    SciTech Connect

    Carr, D.B.

    1990-04-01

    This report addresses the monumental challenge of developing exploratory analysis methods for large data sets. The goals of the report are to increase awareness of large-data-set problems and to contribute simple graphical methods that address some of those problems. The graphical methods focus on two- and three-dimensional data and common tasks such as finding outliers and tail structure, assessing central structure, and comparing central structures. The methods handle large sample size problems through binning, incorporate information from statistical models, and adapt image processing algorithms. Examples demonstrate the application of the methods to a variety of publicly available large data sets. The most novel application addresses the "too many plots to examine" problem by using cognostics, computer-guided diagnostics, to prioritize plots. The particular application prioritizes views of computational fluid dynamics solution sets on the fly. That is, as each time step of a solution set is generated on a parallel processor, the cognostics algorithms assess virtual plots based on the previous time step. Work in such areas is in its infancy and the examples suggest numerous challenges that remain. 35 refs., 15 figs.
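    A minimal sketch of the binning idea for two-dimensional data, using numpy and matplotlib rather than the report's own software (the synthetic data and bin counts are illustrative):

      import numpy as np
      import matplotlib.pyplot as plt

      # A large synthetic 2-D sample; binning keeps the plotting cost roughly
      # fixed no matter how many points there are.
      rng = np.random.default_rng(1)
      x = rng.normal(size=1_000_000)
      y = 0.5 * x + rng.normal(scale=0.8, size=x.size)

      counts, xedges, yedges = np.histogram2d(x, y, bins=100)

      # Log-scaled counts keep both the dense center and the sparse tails
      # visible, supporting tasks like spotting outliers and assessing tails.
      plt.imshow(np.log1p(counts).T, origin="lower", aspect="auto",
                 extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]])
      plt.colorbar(label="log(1 + count)")
      plt.xlabel("x")
      plt.ylabel("y")
      plt.savefig("binned_plot.png")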

  15. Treatment of psychotic children in a classroom environment: I. Learning in a large group

    PubMed Central

    Koegel, Robert L.; Rincover, Arnold

    1974-01-01

    The purpose of this study was to investigate systematically the feasibility of modifying the behavior of autistic children in a classroom environment. In the first experiment, eight autistic children were taught certain basic classroom behaviors (including attending to the teacher upon command, imitation, and an elementary speaking and recognition vocabulary) that were assumed to be necessary for subsequent learning to take place in the classroom. Based on research documenting the effectiveness of one-to-one (teacher-child ratio) procedures for modifying such behaviors, these behaviors were taught in one-to-one sessions. It was, however, found that behaviors taught in a one-to-one setting were not performed consistently in a classroom-sized group, or even in a group as small as two children with one teacher. Further, the children evidenced no acquisition of new behaviors in a classroom environment over a four-week period. Therefore, Experiment II introduced a treatment procedure based upon “fading in” the classroom stimulus situation from the one-to-one stimulus situation. Such treatment was highly effective in producing both a transfer in stimulus control and the acquisition of new behaviors in a kindergarten/first-grade classroom environment. PMID:4465373

  16. Reducing Information Overload in Large Seismic Data Sets

    SciTech Connect

    HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.; CARR,DORTHE B.; AGUILAR-CHANG,JULIO

    2000-08-02

    Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g. single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research
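
    A small sketch of the kind of waveform clustering the dendrogram tool performs, assuming nothing beyond standard SciPy hierarchical clustering; the synthetic waveforms, the correlation-distance choice, and the two-cluster cut are illustrative stand-ins for the MatSeis workflow, not its actual implementation:

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster  # dendrogram() also available for plotting
      from scipy.spatial.distance import pdist

      # Synthetic event waveforms: each row is one event, columns are time samples.
      rng = np.random.default_rng(2)
      template_a = np.sin(np.linspace(0, 20, 500))
      template_b = np.sign(np.sin(np.linspace(0, 12, 500)))
      waveforms = np.vstack([template_a + 0.3 * rng.standard_normal(500) for _ in range(10)] +
                            [template_b + 0.3 * rng.standard_normal(500) for _ in range(10)])

      # Correlation distance (1 - correlation) between all waveform pairs, then
      # hierarchical clustering; 'single', 'complete', etc. mirror the linkage
      # choices mentioned in the abstract.
      dist = pdist(waveforms, metric="correlation")
      tree = linkage(dist, method="complete")

      # Cut the tree into a small number of event groups.
      labels = fcluster(tree, t=2, criterion="maxclust")
      print(labels)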

  17. Gaussian predictive process models for large spatial data sets

    PubMed Central

    Banerjee, Sudipto; Gelfand, Alan E.; Finley, Andrew O.; Sang, Huiyan

    2009-01-01

    Summary With scientific data available at geocoded locations, investigators are increasingly turning to spatial process models for carrying out statistical inference. Over the last decade, hierarchical models implemented through Markov chain Monte Carlo methods have become especially popular for spatial modelling, given their flexibility and power to fit models that would be infeasible with classical methods as well as their avoidance of possibly inappropriate asymptotics. However, fitting hierarchical spatial models often involves expensive matrix decompositions whose computational complexity increases in cubic order with the number of spatial locations, rendering such models infeasible for large spatial data sets. This computational burden is exacerbated in multivariate settings with several spatially dependent response variables. It is also aggravated when data are collected at frequent time points and spatiotemporal process models are used. With regard to this challenge, our contribution is to work with what we call predictive process models for spatial and spatiotemporal data. Every spatial (or spatiotemporal) process induces a predictive process model (in fact, arbitrarily many of them). The latter models project process realizations of the former to a lower dimensional subspace, thereby reducing the computational burden. Hence, we achieve the flexibility to accommodate non-stationary, non-Gaussian, possibly multivariate, possibly spatiotemporal processes in the context of large data sets. We discuss attractive theoretical properties of these predictive processes. We also provide a computational template encompassing these diverse settings. Finally, we illustrate the approach with simulated and real data sets. PMID:19750209
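
    A minimal numerical sketch of the predictive process idea, under an assumed exponential covariance and an arbitrary knot count (the function names and parameter values are invented for illustration): process realizations at n observation locations are projected onto m knots, so the expensive factorizations involve only an m x m matrix.

      import numpy as np

      def exp_cov(a, b, sigma2=1.0, phi=2.0):
          """Exponential covariance between two sets of 2-D locations (assumed model)."""
          d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
          return sigma2 * np.exp(-phi * d)

      rng = np.random.default_rng(3)
      locs = rng.uniform(0.0, 1.0, size=(5000, 2))   # n observation locations
      knots = rng.uniform(0.0, 1.0, size=(64, 2))    # m knot locations S*

      # Predictive process covariance: cov(w~(s), w~(s')) = c(s, S*) C*(S*, S*)^{-1} c(S*, s'),
      # so every matrix factorization involves only the small m x m knot covariance.
      C_star = exp_cov(knots, knots)                 # m x m
      c_obs = exp_cov(locs, knots)                   # n x m
      A = np.linalg.solve(C_star, c_obs.T).T         # c(s, S*) C*^{-1}, still n x m

      # The implied n x n low-rank covariance never needs to be formed explicitly;
      # multiplying it by a vector costs O(n m) rather than O(n^2):
      v = rng.standard_normal(5000)
      cov_times_v = A @ (C_star @ (A.T @ v))
      print(cov_times_v.shape)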

  18. Intelligent Archiving and Physics Mining of Large Data Sets (Invited)

    NASA Astrophysics Data System (ADS)

    Karimabadi, H.

    2009-12-01

    There are unique challenges in all aspects related to large data sets, from storage, search and access, to analysis and file sharing. With few exceptions, the adoption of the latest technologies to deal with the management and mining of large data sets has been slow in heliosciences. Web services such as CDAweb have been very successful and have been widely adopted by the community. There are also significant efforts going towards Virtual Observatories (VxOs). The main thrust of VxOs has so far been on data discovery, aggregation and uniform presentation. While work remains, many VxOs can now be used to access data. However, data is not knowledge, and the challenge of extracting physics from the large data sets remains. Here we review our efforts on (i) implementing advanced data mining techniques as part of the data-to-knowledge discovery pipeline, and (ii) the use of a social networking paradigm in the development of a science collaboratory environment that enables sharing of large files and creation of projects, among other features. We will present new data mining software that works on a variety of data formats and demonstrate its capability through several examples of analysis of spacecraft data. The use of such techniques in intelligent archiving will be discussed. Finally, the use of our science collaboratory service and its unique sharing features, such as universal accessibility of staged files, will be illustrated.

  19. The attributes of an effective teacher differ between the classroom and the clinical setting.

    PubMed

    Haws, Jolene; Rannelli, Luke; Schaefer, Jeffrey P; Zarnke, Kelly; Coderre, Sylvain; Ravani, Pietro; McLaughlin, Kevin

    2016-10-01

    Most training programs use learners' subjective ratings of their teachers as the primary measure of teaching effectiveness. In a recent study we found that preclinical medical students' ratings of classroom teachers were associated with perceived charisma and physical attractiveness of the teacher, but not intellect. Here we explored whether the relationship between these variables and teaching effectiveness ratings holds in the clinical setting. We asked 27 Internal Medicine residents to rate teaching effectiveness of ten teachers with whom they had worked on a clinical rotation, in addition to rating each teacher's clinical skills, physical attractiveness, and charisma. We used linear regression to study the association between these explanatory variables and teaching effectiveness ratings. We found no association between rating of physical attractiveness and teaching effectiveness. Clinical skill and charisma were independently associated with rating of teaching effectiveness (regression coefficients [95 % confidence interval] 0.73 [0.60, 0.85], p < 0.001 and 0.12 [0.01, 0.23], p = 0.03, respectively). The variables associated with effectiveness of classroom and clinical teachers differ, suggesting context specificity in teaching effectiveness ratings. Context specificity may be explained by differences in the exposure that learners have to teachers in the classroom versus the clinical setting, so that raters in the clinical setting may base ratings upon observed behaviours rather than stereotype data. Alternatively, since subjective ratings of teaching effectiveness inevitably incorporate learners' context-specific needs, the attributes that make a teacher effective in one context may not meet the needs of learners in a different context. PMID:26891679

  20. Computing Information-Theoretic Quantities in Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Knuth, K. H.; Castle, J. P.; Curry, C. T.; Gotera, A.; Huyser, K. A.; Wheeler, K. R.; Rossow, W. B.

    2005-12-01

    Information-theoretic quantities, such as mutual information, allow one to quantify the amount of information shared by two variables. In large data sets, the mutual information can be used to identify sets of co-informative variables and thus to identify variables that can act as predictors of a phenomenon of interest. While mutual information alone does not distinguish a causal interaction between two variables, another information-theoretic quantity called the transfer entropy can indicate such possible causal interactions. Together, these quantities can be used to identify causal interactions among sets of variables in large distributed data sets. We are currently developing a suite of computational tools that will allow researchers to calculate, from data, these useful information-theoretic quantities. Our software tools estimate these quantities along with their associated error bars, the latter of which are critical for describing the degree of uncertainty in the estimates. In this presentation we demonstrate how mutual information and transfer entropy can be applied so as to allow researchers not only to identify relations among climate variables, but also to characterize and quantify their possible causal interactions.
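
    A simple histogram-based mutual information estimate, offered only as an illustration of the quantity discussed above (the record's own tools, including error bars and transfer entropy, are not reproduced; the bin count and synthetic variables are arbitrary):

      import numpy as np

      def mutual_information(x, y, bins=32):
          """Histogram-based estimate of I(X;Y) in nats (no bias correction)."""
          pxy, _, _ = np.histogram2d(x, y, bins=bins)
          pxy /= pxy.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      # y is a noisy nonlinear function of x; z is unrelated.
      rng = np.random.default_rng(4)
      x = rng.standard_normal(100_000)
      y = np.tanh(x) + 0.5 * rng.standard_normal(100_000)
      z = rng.standard_normal(100_000)

      print("I(x; y) =", mutual_information(x, y))   # clearly positive
      print("I(x; z) =", mutual_information(x, z))   # close to zero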

  1. Science Teacher Beliefs and Classroom Practice Related to Constructivism in Different School Settings

    ERIC Educational Resources Information Center

    Savasci, Funda; Berlin, Donna F.

    2012-01-01

    Science teacher beliefs and classroom practice related to constructivism and factors that may influence classroom practice were examined in this cross-case study. Data from four science teachers in two schools included interviews, demographic questionnaire, Classroom Learning Environment Survey (preferred/perceived), and classroom observations and…

  2. Interactive Web-Based Map: Applications to Large Data Sets in the Geosciences

    NASA Astrophysics Data System (ADS)

    Garbow, Z. A.; Olson, N. R.; Yuen, D. A.; Boggs, J. M.

    2001-12-01

    Current advances in computer hardware, information technology, and data collection techniques have produced very large data sets, sometimes more than terabytes, in a wide variety of scientific and engineering disciplines. We must harness this opportunity to visualize and extract useful information from geophysical and geological data. We have taken on the task of data mining by using a map-like approach over the web for interrogating these huge data sets, using a client-server paradigm. The spatial data is mapped onto a two-dimensional grid from which the user (client) can quiz the data with the map interface as a user extension. The data is stored on a high-end compute server. The computational gateway separating the client and the server can be the front-end of an electronic publication, an electronic classroom, a Grid system device, or e-business. We have used a combination of JAVA, JAVA-3D, and Perl for processing the data and communicating them between the client and the server. The user can interrogate the geospatial data over any particular region with arbitrary length scales and pose relevant statistical questions, such as histogram plots and local statistics. We have applied this method to the following data sets: (1) distribution of prime numbers, (2) two-dimensional mantle convection, (3) three-dimensional mantle convection, (4) high-resolution satellite reflectance data over the Upper Midwest for multiple wavelengths, and (5) molecular dynamics describing the flow of blood in narrow vessels. Using this map-interface concept, the user can interrogate these data over the web. This strategy for dissecting large data sets can be easily applied to other areas, such as satellite geodesy and earthquake data. This mode of data query may function in an adequately covered wireless web environment with a transfer rate of around 10 Mbit/sec.
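
    A toy sketch of the kind of region query such a map interface would serve (purely illustrative; the gridded field, window, and statistics chosen here are assumptions, and the actual JAVA/Perl client-server machinery is not reproduced):

      import numpy as np

      # Gridded field standing in for, e.g., satellite reflectance mapped to 2-D.
      rng = np.random.default_rng(5)
      grid = rng.gamma(shape=2.0, scale=1.0, size=(2048, 2048))

      def region_stats(data, row_range, col_range, bins=20):
          """Histogram and local statistics for a user-selected map window."""
          window = data[row_range[0]:row_range[1], col_range[0]:col_range[1]]
          counts, edges = np.histogram(window, bins=bins)
          stats = {"mean": float(window.mean()), "std": float(window.std()),
                   "min": float(window.min()), "max": float(window.max())}
          return counts, edges, stats

      # "Quiz" the data over an arbitrary rectangle, as a map client might request.
      counts, edges, stats = region_stats(grid, row_range=(100, 400), col_range=(900, 1300))
      print(stats)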

  3. Robust Coordination for Large Sets of Simple Rovers

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Agogino, Adrian

    2006-01-01

    The ability to coordinate sets of rovers in an unknown environment is critical to the long-term success of many of NASA's exploration missions. Such coordination policies must have the ability to adapt in unmodeled or partially modeled domains and must be robust against environmental noise and rover failures. In addition, such coordination policies must accommodate a large number of rovers without excessive and burdensome hand-tuning. In this paper we present a distributed coordination method that addresses these issues in the domain of controlling a set of simple rovers. The application of these methods allows reliable and efficient robotic exploration in dangerous, dynamic, and previously unexplored domains. Most control policies for space missions are directly programmed by engineers or created through the use of planning tools, and are appropriate for single rover missions or missions requiring the coordination of a small number of rovers. Such methods typically require significant amounts of domain knowledge, and are difficult to scale to large numbers of rovers. The method described in this article aims to address cases where a large number of rovers need to coordinate to solve a complex time-dependent problem in a noisy environment. In this approach, each rover decomposes a global utility, representing the overall goal of the system, into rover-specific utilities that properly assign credit to the rover's actions. Each rover then has the responsibility to create a control policy that maximizes its own rover-specific utility. We show a method of creating rover utilities that are "aligned" with the global utility, such that when the rovers maximize their own utility, they also maximize the global utility. In addition, we show that our method creates rover utilities that allow the rovers to create their control policies quickly and reliably. Our distributed learning method allows large sets of rovers to be used in unmodeled domains, while providing robustness against
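
    One common way to build rover-specific utilities that stay aligned with a global utility is a difference utility: the global utility minus the global utility recomputed without that rover's contribution. The sketch below is only a toy illustration of that alignment property; the observation model, the utility function, and all names are invented rather than taken from the paper.

      import numpy as np

      def global_utility(observations):
          """Toy system-level utility: value of the points of interest observed,
          with diminishing returns when several rovers observe the same point."""
          _, counts = np.unique(observations, return_counts=True)
          return float(np.sum(1.0 - 0.5 ** counts))

      def difference_utility(observations, i):
          """Rover i's utility: global utility minus the global utility with rover i's
          observation removed. The removed term does not depend on rover i's own
          choice, so improving this difference also improves the global utility."""
          return global_utility(observations) - global_utility(np.delete(observations, i))

      # Five rovers each choose one of three points of interest to observe.
      observations = np.array([0, 0, 1, 2, 2])
      print("G =", global_utility(observations))
      print("difference utilities:",
            [round(difference_utility(observations, i), 3) for i in range(len(observations))])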

  4. Form Invariance Symmetry Generates a Large Set of FRW Cosmologies

    NASA Astrophysics Data System (ADS)

    Chimento, Luis P.; Richarte, Martín G.; Sánchez, Iván E.

    2013-02-01

    We show that Einstein's field equations for spatially flat Friedmann-Robertson-Walker (FRW) spacetimes have a form invariance symmetry (FIS) realized by form invariance transformations (FIT), which are generated by an invertible function of the source energy density. These transformations act on the Hubble expansion rate and on the energy density and pressure of the cosmic fluid; likewise, such transformations are endowed with a Lie group structure. Each representation of this group is associated with a particular fluid and consequently a determined cosmology, so that the FIS defines a set of equivalent cosmological models. We focus our search on the FIT generated by a linear function because it provides a natural framework to express the duality and also produces large sets of cosmologies, starting from a seed one, in several contexts, for instance in the cases of a perfect fluid source and a scalar field driven by a potential depending linearly on the scalar field kinetic energy density.
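
    For concreteness, a hedged sketch of the transformation structure being described, written for the flat FRW equations in units 8\pi G = c = 1 (the notation and the linear example are reconstructed from standard form-invariance relations, not copied from the paper):

      % Flat FRW equations (assumed units 8\pi G = c = 1):
      %   3H^2 = \rho, \qquad \dot{\rho} + 3H(\rho + p) = 0.
      % A form invariance transformation generated by an invertible function
      % \bar{\rho} = \bar{\rho}(\rho) carries one solution into another via
      \begin{align}
        \bar{H} &= \left(\frac{\bar{\rho}}{\rho}\right)^{1/2} H, &
        \bar{p} &= -\bar{\rho}
                  + \left(\frac{\rho}{\bar{\rho}}\right)^{1/2}
                    \frac{d\bar{\rho}}{d\rho}\,(\rho + p).
      \end{align}
      % For the linear generating function \bar{\rho} = n\rho this reduces to
      % \bar{H} = \sqrt{n}\,H and \bar{p} = -n\rho + \sqrt{n}\,(\rho + p),
      % turning a single seed cosmology into a one-parameter family of models.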

  5. A large-scale crop protection bioassay data set.

    PubMed

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J P; Bellis, Louisa J; Bento, A Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database. PMID:26175909

  6. Optimizing distance-based methods for large data sets

    NASA Astrophysics Data System (ADS)

    Scholl, Tobias; Brenner, Thomas

    2015-10-01

    Distance-based methods for measuring the spatial concentration of industries have received increasing popularity in the spatial econometrics community. However, a limiting factor for using these methods is their computational complexity, since both their memory requirements and running times are in O(n^2). In this paper, we present an algorithm with constant memory requirements and shorter running time, enabling distance-based methods to deal with large data sets. We discuss three recent distance-based methods in spatial econometrics: the D&O-Index by Duranton and Overman (Rev Econ Stud 72(4):1077-1106, 2005), the M-function by Marcon and Puech (J Econ Geogr 10(5):745-762, 2010) and the Cluster-Index by Scholl and Brenner (Reg Stud (ahead-of-print):1-15, 2014). Finally, we present an alternative calculation for the latter index that allows the use of data sets with millions of firms.
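
    The core of many distance-based indices is a summary over all pairwise distances. A simple way to keep memory from growing with n^2, shown below with an invented coordinate set and bin grid (this is the general chunking idea only, not the authors' algorithm), is to accumulate a distance histogram block by block:

      import numpy as np

      def distance_histogram(coords, bin_edges, chunk=500):
          """Histogram of all pairwise distances, accumulated block by block so that
          memory stays proportional to chunk * n instead of n^2."""
          counts = np.zeros(len(bin_edges) - 1, dtype=np.int64)
          n = len(coords)
          for start in range(0, n, chunk):
              block = coords[start:start + chunk]
              # Distances from this block to all points with index >= start; keeping
              # only the upper triangle counts every unordered pair exactly once.
              d = np.linalg.norm(block[:, None, :] - coords[None, start:, :], axis=-1)
              iu = np.triu_indices(d.shape[0], k=1, m=d.shape[1])
              counts += np.histogram(d[iu], bins=bin_edges)[0]
          return counts

      # Firm locations in projected kilometres and a fixed distance-bin grid.
      rng = np.random.default_rng(6)
      firms = rng.uniform(0.0, 100.0, size=(5_000, 2))
      edges = np.linspace(0.0, 150.0, 61)
      hist = distance_histogram(firms, edges)
      print(hist.sum(), "pairs =", 5_000 * 4_999 // 2)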

  7. Observations of Teacher-Child Interactions in Classrooms Serving Latinos and Dual Language Learners: Applicability of the Classroom Assessment Scoring System in Diverse Settings

    ERIC Educational Resources Information Center

    Downer, Jason T.; Lopez, Michael L.; Grimm, Kevin J.; Hamagami, Aki; Pianta, Robert C.; Howes, Carollee

    2012-01-01

    With the rising number of Latino and dual language learner (DLL) children attending pre-k and the importance of assessing the quality of their experiences in those settings, this study examined the extent to which a commonly used assessment of teacher-child interactions, the Classroom Assessment Scoring System (CLASS), demonstrated similar…

  8. Support vector machine classifiers for large data sets.

    SciTech Connect

    Gertz, E. M.; Griffin, J. D.

    2006-01-31

    This report concerns the generation of support vector machine classifiers for solving the pattern recognition problem in machine learning. Several methods are proposed based on interior point methods for convex quadratic programming. Software implementations are developed by adapting the object-oriented package OOQP to the problem structure and by using the software package PETSc to perform time-intensive computations in a distributed setting. Linear systems arising from classification problems with moderately large numbers of features are solved by using two techniques: one a parallel direct solver, the other a Krylov-subspace method incorporating novel preconditioning strategies. Numerical results are provided, and computational experience is discussed.
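
    For reference, the convex quadratic program that such interior-point SVM solvers target is the standard soft-margin dual shown below; this is textbook material rather than anything specific to the OOQP/PETSc implementations described in the report.

      % Soft-margin SVM training as a convex quadratic program
      % (labels y_i \in \{\pm 1\}, kernel K, penalty parameter C):
      \begin{align}
        \min_{\alpha}\; & \tfrac{1}{2}\,\alpha^{\top} Q\,\alpha \;-\; e^{\top}\alpha,
            \qquad Q_{ij} = y_i\, y_j\, K(x_i, x_j), \\
        \text{s.t.}\;   & y^{\top}\alpha = 0, \qquad 0 \le \alpha_i \le C .
      \end{align}
      % Each interior-point iteration solves a structured KKT linear system; for a
      % linear kernel, Q = (\mathrm{diag}(y) X)(\mathrm{diag}(y) X)^{\top} has rank at
      % most the number of features, which is what makes direct or Krylov-subspace
      % solves tractable for large training sets.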

  9. Comparing Outcomes from Field and Classroom Based Settings for Undergraduate Geoscience Courses

    NASA Astrophysics Data System (ADS)

    Skinner, M. R.; Harris, R. A.; Flores, J.

    2011-12-01

    Field based learning can be found in nearly every course offered in Geology at Brigham Young University. For example, in our Structural Geology course field studies substitute for labs. Students collect their own data from several different structural settings of the Wasatch Range. Our curriculum also includes a two-week, sophomore-level field course that introduces students to interpreting field relations themselves and sets the stage for much of what they learn in their upper-division courses. Our senior-level six-week field geology course includes classical field mapping with exercises in petroleum and mineral exploration, environmental geology and geological hazards. Experiments with substituting field-based general education courses for those in traditional classroom settings indicate that student cognition, course enjoyment and recruiting of majors significantly increase in a field-based course. We offer a field-based introductory geology course (Geo 102) that is taught in seven, six-hour field trips during which students travel to localities of geologic interest to investigate a variety of fundamental geological problems. We compare the outcomes of Geo 102 with a traditional classroom-based geology course (Geo 101). For the comparison both courses are taught by the same instructor, use the same text and supplementary materials and take the same exams. The results of 7 years of reporting indicate that test scores and final grades are one-half grade point higher for Geo 102 students versus those in traditional introductory courses. Student evaluations of the course are also 0.8-1.4 points higher on a scale of 1-8, and are consistently the highest in the Department and College. Other observations include increased attendance, attention and curiosity. The latter two are measured by the number of students asking questions of other students as well as the instructors, and the total number of questions asked during class time in the field versus the classroom

  10. Reliability and Validity of Information about Student Achievement: Comparing Large-Scale and Classroom Testing Contexts

    ERIC Educational Resources Information Center

    Cizek, Gregory J.

    2009-01-01

    Reliability and validity are two characteristics that must be considered whenever information about student achievement is collected. However, those characteristics--and the methods for evaluating them--differ in large-scale testing and classroom testing contexts. This article presents the distinctions between reliability and validity in the two…

  11. Implementing Concept-Based Learning in a Large Undergraduate Classroom

    ERIC Educational Resources Information Center

    Morse, David; Jutras, France

    2008-01-01

    An experiment explicitly introducing learning strategies to a large, first-year undergraduate cell biology course was undertaken to see whether awareness and use of strategies had a measurable impact on student performance. The construction of concept maps was selected as the strategy to be introduced because of an inherent coherence with a course…

  12. Interaction and Uptake in Large Foreign Language Classrooms

    ERIC Educational Resources Information Center

    Ekembe, Eric Enongene

    2014-01-01

    Interaction determines and affects the conditions of language acquisition, especially in contexts where exposure to the target language is limited. This is believed to be successful only within the context of small classes (Chavez, 2009). This paper examines learners' progress resulting from interaction in large classes. Using pre-, post-, and…

  13. Visualizing large data sets in the earth sciences

    NASA Technical Reports Server (NTRS)

    Hibbard, William; Santek, David

    1989-01-01

    The authors describe the capabilities of McIDAS, an interactive visualization system that is vastly increasing the ability of earth scientists to manage and analyze data from remote sensing instruments and numerical simulation models. McIDAS provides animated three-dimensional images and highly interactive displays. The software can manage, analyze, and visualize large data sets that span many physical variables (such as temperature, pressure, humidity, and wind speed), as well as time and three spatial dimensions. The McIDAS system manages data from at least 100 different sources. The data management tools consist of data structures for storing different data types in files, libraries of routines for accessing these data structures, system commands for performing housekeeping functions on the data files, and reformatting programs for converting external data to the system's data structures. The McIDAS tools for three-dimensional visualization of meteorological data run on an IBM mainframe and can load up to 128-frame animation sequences into the workstations. A highly interactive version of the system can provide an interactive window into data sets containing tens of millions of points produced by numerical models and remote sensing instruments. The visualizations are being used for teaching as well as by scientists.

  14. Ready, Set, Science! Putting Research To Work In K-8 Classrooms

    NASA Astrophysics Data System (ADS)

    van der Veen, Wil E.; Moody, T.

    2008-05-01

    What types of instructional experiences help students learn and understand science? What do professional development providers and curriculum designers need to know to create and support such experiences? Ready, Set, Science! is a book that provides a practical and accessible account of current research about teaching and learning science. Based on the groundbreaking National Research Council report “Taking Science to School: Learning and Teaching Science in Grades K-8” (2006), the book reviews principles derived from the latest educational research and applies them to effective teaching practice. Ready, Set, Science! is a MUST READ for everyone involved in K-12 education, or creating products intended for K-12 use. We will review Ready, Set, Science!'s new vision of science in education, its most important recommendations, and its implications for the place of astronomy in K-12 classrooms. We will review some useful suggestions on how to make student thinking visible and report on how we have put this into practice with teachers. We will engage the audience in a brief interactive demonstration of specific questioning techniques described in the book that help to make student thinking visible.

  15. Parallel Analysis Tools for Ultra-Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Jacob, Robert; Krishna, Jayesh; Xu, Xiabing; Mickelson, Sheri; Wilde, Mike; Peterson, Kara; Bochev, Pavel; Latham, Robert; Tautges, Tim; Brown, David; Brownrigg, Richard; Haley, Mary; Shea, Dennis; Huang, Wei; Middleton, Don; Schuchardt, Karen; Yin, Jian

    2013-04-01

    While climate models have used parallelism for several years, the post-processing tools are still mostly single-threaded applications and many are closed source. These tools are becoming a bottleneck in the production of new climate knowledge when they confront terabyte-sized output from high-resolution climate models. The ParVis project is using and creating Free and Open Source tools that bring data and task parallelism to climate model analysis to enable analysis of large climate data sets. ParVis is using the Swift task-parallel language to implement a diagnostic suite that generates over 600 plots of atmospheric quantities. ParVis has also created a Parallel Gridded Analysis Library (ParGAL) which implements many common climate analysis operations in a data-parallel fashion using the Message Passing Interface. ParGAL has in turn been built on sophisticated packages for describing grids in parallel (the Mesh Oriented database, MOAB), performing vector operations on arbitrary grids (Intrepid), and reading data in parallel (PnetCDF). ParGAL is being used to implement a parallel version of the NCAR Command Language (NCL) called ParNCL. ParNCL/ParGAL not only speeds up analysis of large datasets but also allows operations to be performed on native grids, eliminating the need to transform data to latitude-longitude grids. All of the tools ParVis is creating are available as free and open source software.
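
    A minimal sketch of the data-parallel pattern such libraries build on, using mpi4py with an invented random "temperature" slab per rank (this is not ParGAL code; the slab sizes, variable names, and script name are assumptions):

      # Run with:  mpiexec -n 4 python parallel_mean.py
      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      # Each rank would normally read its own slab (e.g. a latitude band) with a
      # parallel reader; random temperatures stand in for that here.
      rng = np.random.default_rng(rank)
      local_slab = rng.normal(loc=288.0, scale=5.0, size=(180 // size, 360))

      # Combine partial sums across ranks; every rank receives the global result.
      global_sum = comm.allreduce(local_slab.sum(), op=MPI.SUM)
      global_count = comm.allreduce(local_slab.size, op=MPI.SUM)

      if rank == 0:
          print("global mean =", global_sum / global_count)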

  16. The Incredible Years Teacher Classroom Management Program: Using Coaching to Support Generalization to Real-World Classroom Settings

    ERIC Educational Resources Information Center

    Reinke, Wendy M.; Stormont, Melissa; Webster-Stratton, Carolyn; Newcomer, Lori L.; Herman, Keith C.

    2012-01-01

    This article focuses on the Incredible Years Teacher Classroom Management Training (IY TCM) intervention as an example of an evidence-based program that embeds coaching within its design. First, the core features of the IY TCM program are described. Second, the IY TCM coaching model and processes utilized to facilitate high fidelity of…

  17. The Design and Synthesis of a Large Interactive Classroom

    NASA Astrophysics Data System (ADS)

    Clouston, Laurel L.; Kleinman, Mark H.

    1999-01-01

    Group learning techniques have been used in large classes to effectively convey the central concepts of SN1 and SN2 reactions in an introductory organic chemistry class. The activities described are best used as an introduction to these mechanisms. The class begins with the instructor relaying the key points of the reaction pathways. Following this synopsis, the class is divided through the use of assignment sheets that are distributed to the students upon arrival. The use of markers and poster boards, model kits, and role playing help to explain the intricacies of the mechanisms to learners, thereby accommodating a variety of different learning styles. After a guided discussion, each group presents their results to another collection of students who used a different learning technique to understand the alternate reaction. In this manner, each student encounters two learning styles and benefits from the repetitious nature of the exercise. After the groups break up into even smaller groups, higher-order questions are posed for further discussion. The class is terminated by the presentation of a summary slide that contains all the important facts covered during the lecture.

  18. Implementing Concept-based Learning in a Large Undergraduate Classroom

    PubMed Central

    Jutras, France

    2008-01-01

    An experiment explicitly introducing learning strategies to a large, first-year undergraduate cell biology course was undertaken to see whether awareness and use of strategies had a measurable impact on student performance. The construction of concept maps was selected as the strategy to be introduced because of an inherent coherence with a course structured by concepts. Data were collected over three different semesters of an introductory cell biology course, all teaching similar course material with the same professor and all evaluated using similar examinations. The first group, used as a control, did not construct concept maps, the second group constructed individual concept maps, and the third group first constructed individual maps then validated their maps in small teams to provide peer feedback about the individual maps. Assessment of the experiment involved student performance on the final exam, anonymous polls of student perceptions, failure rate, and retention of information at the start of the following year. The main conclusion drawn is that concept maps without feedback have no significant effect on student performance, whereas concept maps with feedback produced a measurable increase in student problem-solving performance and a decrease in failure rates. PMID:18519616

  19. Science Teacher Beliefs and Classroom Practice Related to Constructivism in Different School Settings

    NASA Astrophysics Data System (ADS)

    Savasci, Funda; Berlin, Donna F.

    2012-02-01

    Science teacher beliefs and classroom practice related to constructivism and factors that may influence classroom practice were examined in this cross-case study. Data from four science teachers in two schools included interviews, demographic questionnaire, Classroom Learning Environment Survey (preferred/perceived), and classroom observations and documents. Using an inductive analytic approach, results suggested that the teachers embraced constructivism, but classroom observations did not confirm implementation of these beliefs for three of the four teachers. The most preferred constructivist components were personal relevance and student negotiation; the most perceived component was critical voice. Shared control was the least preferred, least perceived, and least observed constructivist component. School type, grade, student behavior/ability, curriculum/standardized testing, and parental involvement may influence classroom practice.

  20. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
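
    A toy sketch of the caching idea (the key layout, tile size, and rendering step here are invented; the NCCV's own published code examples should be consulted for the real implementation): memoize rendered tiles on the full request key so repeated map views are served without regeneration.

      from functools import lru_cache
      import numpy as np

      # Pre-distilled summary array for one (model, variable, year) combination,
      # standing in for the compact binary summary files described above.
      SUMMARY = {("CCSM4", "tasmax", 2050):
                 np.random.default_rng(7).normal(30.0, 5.0, size=(600, 1400))}

      @lru_cache(maxsize=4096)
      def render_tile(model, variable, year, zoom, tx, ty, tile=256):
          """'Render' one map tile (here: slice and quantize); identical requests
          are served from the cache instead of being regenerated."""
          field = SUMMARY[(model, variable, year)]
          step = max(1, 2 ** (4 - zoom))            # coarser sampling when zoomed out
          window = field[ty * tile * step:(ty + 1) * tile * step:step,
                         tx * tile * step:(tx + 1) * tile * step:step]
          return np.round(window, 1).tobytes()      # payload that would go to the browser

      render_tile("CCSM4", "tasmax", 2050, zoom=2, tx=0, ty=0)   # rendered
      render_tile("CCSM4", "tasmax", 2050, zoom=2, tx=0, ty=0)   # cache hit
      print(render_tile.cache_info())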

  1. Student-Centred Anti-Smoking Education: Comparing a Classroom-Based and an Out-of-School Setting

    ERIC Educational Resources Information Center

    Geier, Christine S.; Bogner, Franz X.

    2010-01-01

    The present study monitored a student-centred educational anti-smoking intervention with fifth graders by focusing on their cognitive achievement and intrinsic motivation. In order to assess the potential influence of the setting on self-directed learning, the intervention was conducted in two different learning environments: a classroom-based…

  2. A Case Based Analysis Preparation Strategy for Use in a Classroom Management for Inclusive Settings Course: Preliminary Observations

    ERIC Educational Resources Information Center

    Niles, William J.; Cohen, Alan

    2012-01-01

    Case based instruction (CBI) is a pedagogical option in teacher preparation growing in application but short on practical means to implement the method. This paper presents an analysis strategy and questions developed to help teacher trainees focus on classroom management issues embedded in a set of "real" cases. An analysis of teacher candidate…

  3. The Effects of a Teacher-Child Play Intervention on Classroom Compliance in Young Children in Child Care Settings

    ERIC Educational Resources Information Center

    Levine, Darren G.; Ducharme, Joseph M.

    2013-01-01

    The current study evaluated the effects of a teacher-conducted play intervention on preschool-aged children's compliance in child care settings. Study participants included 8 children ranging in age from 3 to 5 years and 5 early childhood education teachers within 5 classrooms across 5 child care centers. A combination ABAB and multiple baseline…

  4. Challenges Associated With Using Large Data Sets for Quality Assessment and Research in Clinical Settings

    PubMed Central

    Cohen, Bevin; Vawdrey, David K.; Liu, Jianfang; Caplan, David; Furuya, E. Yoko; Mis, Frederick W.; Larson, Elaine

    2015-01-01

    The rapidly expanding use of electronic records in health-care settings is generating unprecedented quantities of data available for clinical, epidemiological, and cost-effectiveness research. Several challenges are associated with using these data for clinical research, including issues surrounding access and information security, poor data quality, inconsistency of data within and across institutions, and a paucity of staff with expertise to manage and manipulate large clinical data sets. In this article, we describe our experience with assembling a data-mart and conducting clinical research using electronic data from four facilities within a single hospital network in New York City. We culled data from several electronic sources, including the institution’s admission-discharge-transfer system, cost accounting system, electronic health record, clinical data warehouse, and departmental records. The final data-mart contained information for more than 760,000 discharges occurring from 2006 through 2012. Using categories identified by the National Institutes of Health Big Data to Knowledge initiative as a framework, we outlined challenges encountered during the development and use of a domain-specific data-mart and recommend approaches to overcome these challenges. PMID:26351216
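
    As a purely illustrative sketch of the record-linkage step in assembling such a data-mart (the table names, columns, and values below are invented and far simpler than the systems described), a left join keyed on a shared discharge identifier keeps every discharge visible and surfaces rows that fail to reconcile:

      import pandas as pd

      # Toy extracts from two of the source systems feeding the data-mart.
      adt = pd.DataFrame({"discharge_id": [1, 2, 3],
                          "patient_id": ["A", "B", "C"],
                          "unit": ["ICU", "Med", "Surg"]})
      labs = pd.DataFrame({"discharge_id": [1, 1, 3, 4],
                           "test": ["WBC", "HGB", "WBC", "WBC"],
                           "value": [11.2, 9.8, 7.4, 6.1]})

      # A left join keeps every discharge even when a source system has no record,
      # so gaps show up as missing values rather than silently dropped rows.
      mart = adt.merge(labs, on="discharge_id", how="left", validate="one_to_many")

      # Simple cross-system consistency check: lab rows whose discharge_id is not
      # known to the ADT system are flagged for review instead of being merged.
      orphans = labs[~labs["discharge_id"].isin(adt["discharge_id"])]
      print(mart)
      print("unreconciled lab rows:", len(orphans))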

  5. Challenges Associated With Using Large Data Sets for Quality Assessment and Research in Clinical Settings.

    PubMed

    Cohen, Bevin; Vawdrey, David K; Liu, Jianfang; Caplan, David; Furuya, E Yoko; Mis, Frederick W; Larson, Elaine

    2015-08-01

    The rapidly expanding use of electronic records in health-care settings is generating unprecedented quantities of data available for clinical, epidemiological, and cost-effectiveness research. Several challenges are associated with using these data for clinical research, including issues surrounding access and information security, poor data quality, inconsistency of data within and across institutions, and a paucity of staff with expertise to manage and manipulate large clinical data sets. In this article, we describe our experience with assembling a data-mart and conducting clinical research using electronic data from four facilities within a single hospital network in New York City. We culled data from several electronic sources, including the institution's admission-discharge-transfer system, cost accounting system, electronic health record, clinical data warehouse, and departmental records. The final data-mart contained information for more than 760,000 discharges occurring from 2006 through 2012. Using categories identified by the National Institutes of Health Big Data to Knowledge initiative as a framework, we outlined challenges encountered during the development and use of a domain-specific data-mart and recommend approaches to overcome these challenges. PMID:26351216

  6. An Exploration Tool for Very Large Spectrum Data Sets

    NASA Astrophysics Data System (ADS)

    Carbon, Duane F.; Henze, Christopher

    2015-01-01

    We present an exploration tool for very large spectrum data sets such as the SDSS, LAMOST, and 4MOST data sets. The tool works in two stages: the first uses batch processing and the second runs interactively. The latter employs the NASA hyperwall, a configuration of 128 workstation displays (8x16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. The stellar subset of the Sloan Digital Sky Survey DR10 was chosen to show how the tool may be used. In stage one, SDSS files for 569,738 stars are processed through our data pipeline. The pipeline fits each spectrum using an iterative continuum algorithm, distinguishing emission from absorption and handling molecular absorption bands correctly. It then measures 1659 discrete atomic and molecular spectral features that were carefully preselected based on their likelihood of being visible at some spectral type. The depths relative to the local continuum at each feature wavelength are determined for each spectrum: these depths, the local S/N level, and DR10-supplied variables such as magnitudes, colors, positions, and radial velocities are the basic measured quantities used on the hyperwall. In stage two, each hyperwall panel is used to display a 2-D scatter plot showing the depth of feature A vs the depth of feature B for all of the stars. A and B change from panel to panel. The relationships between the various (A,B) strengths and any distinctive clustering are immediately apparent when examining and inter-comparing the different panels on the hyperwall. The interactive software allows the user to select the stars in any interesting region of any 2-D plot on the hyperwall, immediately rendering the same stars on all the other 2-D plots in a unique color. The process may be repeated multiple times, each selection displaying a distinctive color on all the plots. At any time, the spectra of the selected stars may be examined in detail on a connected workstation display. We illustrate
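
    A small sketch of the linked-selection idea behind the hyperwall panels, using invented feature depths and static matplotlib plots instead of the interactive system (the feature labels and selection thresholds are placeholders):

      import numpy as np
      import matplotlib.pyplot as plt

      # Feature-depth matrix: one row per star, one column per spectral feature.
      rng = np.random.default_rng(8)
      depths = rng.uniform(0.0, 1.0, size=(5000, 4))
      names = ["feature A", "feature B", "feature C", "feature D"]   # placeholder labels

      # "Brush" a rectangle in the (A, B) panel ...
      selected = (depths[:, 0] > 0.6) & (depths[:, 1] < 0.3)

      # ... and highlight the same stars in every other panel.
      fig, axes = plt.subplots(1, 3, figsize=(12, 4))
      for ax, (a, b) in zip(axes, [(0, 1), (0, 2), (2, 3)]):
          ax.scatter(depths[:, a], depths[:, b], s=2, color="lightgray")
          ax.scatter(depths[selected, a], depths[selected, b], s=4, color="crimson")
          ax.set_xlabel(names[a])
          ax.set_ylabel(names[b])
      plt.tight_layout()
      plt.show()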

  7. Trans-dimensional Bayesian inference for large sequential data sets

    NASA Astrophysics Data System (ADS)

    Mandolesi, E.; Dettmer, J.; Dosso, S. E.; Holland, C. W.

    2015-12-01

    This work develops a sequential Monte Carlo method to infer seismic parameters of layered seabeds from large sequential reflection-coefficient data sets. The approach provides parameter estimates and uncertainties along survey tracks with the goal to aid in the detection of unexploded ordnance in shallow water. The sequential data are acquired by a moving platform with source and receiver array towed close to the seabed. This geometry requires consideration of spherical reflection coefficients, computed efficiently by massively parallel implementation of the Sommerfeld integral via Levin integration on a graphics processing unit. The seabed is parametrized with a trans-dimensional model to account for changes in the environment (i.e. changes in layering) along the track. The method combines advanced Markov chain Monte Carlo methods (annealing) with particle filtering (resampling). Since data from closely-spaced source transmissions (pings) often sample similar environments, the solution from one ping can be utilized to efficiently estimate the posterior for data from subsequent pings. Since reflection-coefficient data are highly informative, the likelihood function can be extremely peaked, resulting in little overlap between posteriors of adjacent pings. This is addressed by adding bridging distributions (via annealed importance sampling) between pings for more efficient transitions. The approach assumes the environment to be changing slowly enough to justify the local 1D parametrization. However, bridging allows rapid changes between pings to be addressed and we demonstrate the method to be stable in such situations. Results are in terms of trans-D parameter estimates and uncertainties along the track. The algorithm is examined for realistic simulated data along a track and applied to a dataset collected by an autonomous underwater vehicle on the Malta Plateau, Mediterranean Sea. [Work supported by the SERDP, DoD.]
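
    The following is a deliberately simplified, fixed-dimension sketch of sequential inference across pings (a bootstrap-style particle filter with resampling and jitter); the trans-dimensional moves, annealed bridging distributions, and physical reflection-coefficient model of the actual method are not reproduced, and the toy forward model and parameter values are invented.

      import numpy as np

      rng = np.random.default_rng(9)

      def forward_model(sound_speed):
          """Invented smooth mapping from one seabed parameter to a reflection coefficient."""
          return 0.1 + 0.4 * np.tanh(sound_speed - 1.6)

      # Simulated survey: one noisy reflection coefficient per ping, from a seabed
      # whose sound speed drifts slowly along the track.
      true_speed = 1.55 + 0.002 * np.arange(200)
      data = forward_model(true_speed) + 0.02 * rng.standard_normal(200)

      # Particles start from a broad prior; each ping reweights, resamples, and
      # jitters them, so the posterior from one ping seeds the next.
      particles = rng.uniform(1.4, 1.8, size=2000)
      estimates = []
      for d in data:
          logw = -0.5 * ((d - forward_model(particles)) / 0.02) ** 2
          w = np.exp(logw - logw.max())
          w /= w.sum()
          particles = particles[rng.choice(particles.size, size=particles.size, p=w)]
          particles += 0.005 * rng.standard_normal(particles.size)   # jitter / regularization
          estimates.append(particles.mean())

      print("final estimate:", round(estimates[-1], 3), "true value:", round(true_speed[-1], 3))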

  8. The Interests of Full Disclosure: Agenda-Setting and the Practical Initiation of the Feminist Classroom

    ERIC Educational Resources Information Center

    Seymour, Nicole

    2007-01-01

    Several theoretical and pragmatic questions arise when one attempts to employ feminist pedagogy in the classroom (or to study it), such as how to strike a balance between classroom order and instructor de-centering and how to productively address student resistance. In this article, the author describes how she took on her final project for a…

  9. Classrooms that Work: Teaching Generic Skills in Academic and Vocational Settings.

    ERIC Educational Resources Information Center

    Stasz, Cathleen; And Others

    This report documents the second of two studies on teaching and learning generic skills in high schools. It extends the earlier work by providing a model for designing classroom instruction in both academic and vocational classrooms where teaching generic skills is an instructional goal. Ethnographic field methods were used to observe, record, and…

  10. Teaching and learning in an integrated curriculum setting: A case study of classroom practices

    NASA Astrophysics Data System (ADS)

    MacMath, Sheryl Lynn

    Curriculum integration, while a commonly used educational term, remains a challenging concept to define and examine both in research and in classroom practice. Numerous types and definitions of curriculum integration exist in educational research, while, in comparison, teachers tend to focus on curriculum integration simply as a mixing of subject areas. To better understand curriculum integration in practice, this thesis details a case study that examines both teacher and student perspectives regarding a grade nine integrated unit on energy. Set in a public secondary school in Ontario, Canada, I comprehensively describe and analyze teacher understandings of, and challenges with, the implementation of an integrated unit, while also examining student perspectives and academic learning. My participants consisted of two high school teachers, a geography teacher and a science teacher, and their twenty-three students. Using data gathered from interviews before, during, and after the implementation of a 16-lesson unit, as well as observations throughout, I completed a case description and thematic analysis. My results illustrate the importance of examining why teachers choose to implement an integrated unit and the planning and scheduling challenges that exist. In addition, while the students in this study were academically successful, clarification is needed regarding whether student success can be linked to the integration of these two subjects or the types of activities these two teachers utilized.

  11. Connecting scientific research and classroom instruction: Developing authentic problem sets for the undergraduate organic chemistry curriculum

    NASA Astrophysics Data System (ADS)

    Raker, Jeffrey R.

    Reform efforts in science education have called for instructional methods and resources that mirror the practice of science. Few research findings or design methods for creating such materials have been documented in the literature. The purpose of this study was to develop problem sets for sophomore-level organic chemistry instruction. This research adapted an instructional design methodology from the science education literature for the creation of new curricular problem sets. The first phase of this study was to establish an understanding of current curricular problems in sophomore-level organic chemistry instruction. A sample of 792 problems was collected from four organic chemistry courses. These problems were assessed using three literature-reported problem typologies. Two of these problem typologies have previously been used to understand general chemistry problems; comparisons between general and organic chemistry problems were thus made. Data from this phase were used to develop a set of five problems for practicing organic chemists. The second phase of this study was to explore practicing organic chemists' experiences solving problems in the context of organic synthesis research. Eight practicing organic chemists were interviewed and asked to solve two to three of the problems developed in phase one of this research. These participants spoke of three problem types: project level, synthetic planning, and day-to-day. Three knowledge types (internal knowledge, knowledgeable others, and literature) were used in solving these problems in research practice and in the developed problems. A set of guiding factors and implications were derived from this data and the chemistry education literature for the conversion of the problems for practicing chemists to problems for undergraduate students. A subsequent conversion process for the five problems occurred. The third, and last phase, of this study was to explore undergraduate students' experiences solving problems in

  12. Recruiting Participants for Large-Scale Random Assignment Experiments in School Settings

    ERIC Educational Resources Information Center

    Roschelle, Jeremy; Feng, Mingyu; Gallagher, H. Alix; Murphy, Robert; Harris, Christopher; Kamdar, Danae; Trinidad, Gucci

    2014-01-01

    Recruitment is a key challenge for researchers conducting any large school-based study. Control is needed not only over the condition participants receive, but also over how the intervention is implemented, and may include restrictions in other areas of school and classroom functioning. We report here on our experiences in recruiting participants…

  13. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…

  14. The Single and Combined Effects of Multiple Intensities of Behavior Modification and Methylphenidate for Children with Attention Deficit Hyperactivity Disorder in a Classroom Setting

    ERIC Educational Resources Information Center

    Fabiano, Gregory A.; Pelham, William E., Jr.; Gnagy, Elizabeth M.; Burrows-MacLean, Lisa; Coles, Erika K.; Chacko, Anil; Wymbs, Brian T.; Walker, Kathryn S.; Arnold, Fran; Garefino, Allison; Keenan, Jenna K.; Onyango, Adia N.; Hoffman, Martin T.; Massetti, Greta M.; Robb, Jessica A.

    2007-01-01

    Currently behavior modification, stimulant medication, and combined treatments are supported as evidence-based interventions for attention deficit hyperactivity disorder in classroom settings. However, there has been little study of the relative effects of these two modalities and their combination in classrooms. Using a within-subject design, the…

  15. Increasing the Writing Performance of Urban Seniors Placed At-Risk through Goal-Setting in a Culturally Responsive and Creativity-Centered Classroom

    ERIC Educational Resources Information Center

    Estrada, Brittany; Warren, Susan

    2014-01-01

    Efforts to support marginalized students require not only identifying systemic inequities, but providing a classroom infrastructure that supports the academic achievement of all students. This action research study examined the effects of implementing goal-setting strategies and emphasizing creativity in a culturally responsive classroom (CRC) on…

  16. Evaluation of Data Visualization Software for Large Astronomical Data Sets

    NASA Astrophysics Data System (ADS)

    Doyle, Matthew; Taylor, Roger S.; Kanbur, Shashi; Schofield, Damian; Donalek, Ciro; Djorgovski, Stanislav G.; Davidoff, Scott

    2016-01-01

    This study investigates the efficacy of a 3D visualization application used to classify various types of stars using data derived from large synoptic sky surveys. Evaluation methodology included a cognitive walkthrough that prompted participants to identify a specific star type (Supernovae, RR Lyrae or Eclipsing Binary) and retrieve variable information (MAD, magratio, amplitude, frequency) from the star. This study also implemented a heuristic evaluation that applied usability standards such as the Shneiderman Visual Information Seeking Mantra to the initial iteration of the application. Findings from the evaluation indicated that improvements could be made to the application by developing effective spatial organization and implementing data reduction techniques such as linking, brushing, and small multiples.

  17. Processing large remote sensing image data sets on Beowulf clusters

    USGS Publications Warehouse

    Steinwand, Daniel R.; Maddox, Brian; Beckmann, Tim; Schmidt, Gail

    2003-01-01

    High-performance computing is often concerned with the speed at which floating-point calculations can be performed. The architectures of many parallel computers and/or their network topologies are based on these investigations. Often, benchmarks resulting from these investigations are compiled with little regard to how a large dataset would move about in these systems. This part of the Beowulf study addresses that concern by looking at specific applications software and system-level modifications. Applications include an implementation of a smoothing filter for time-series data, a parallel implementation of the decision tree algorithm used in the Landcover Characterization project, a parallel Kriging algorithm used to fit point data collected in the field on invasive species to a regular grid, and modifications to the Beowulf project's resampling algorithm to handle larger, higher resolution datasets at a national scale. Systems-level investigations include a feasibility study on Flat Neighborhood Networks and modifications of that concept with Parallel File Systems.
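
    As an illustration of the first application listed (a smoothing filter for time-series data), here is a hedged sketch of chunked parallel smoothing with halo samples so chunk boundaries match the serial result; it uses Python multiprocessing on one node rather than a Beowulf cluster, and the window length and chunking are arbitrary choices.

      import numpy as np
      from multiprocessing import Pool

      WIDTH = 11                 # moving-average window (arbitrary choice)
      HALO = WIDTH // 2

      def smooth_chunk(padded_chunk):
          """Moving average over a chunk carrying HALO extra samples on each side,
          so the trimmed result matches a serial filter at chunk boundaries."""
          kernel = np.ones(WIDTH) / WIDTH
          return np.convolve(padded_chunk, kernel, mode="same")[HALO:-HALO]

      def parallel_smooth(series, n_workers=4):
          padded = np.pad(series, HALO, mode="edge")
          bounds = np.linspace(0, len(series), n_workers + 1, dtype=int)
          chunks = [padded[lo:hi + 2 * HALO] for lo, hi in zip(bounds[:-1], bounds[1:])]
          with Pool(n_workers) as pool:
              pieces = pool.map(smooth_chunk, chunks)
          return np.concatenate(pieces)

      if __name__ == "__main__":
          t = np.linspace(0.0, 20.0, 1_000_000)
          series = np.sin(t) + 0.3 * np.random.default_rng(10).standard_normal(t.size)
          result = parallel_smooth(series)
          serial = np.convolve(np.pad(series, HALO, mode="edge"),
                               np.ones(WIDTH) / WIDTH, mode="same")[HALO:-HALO]
          print("matches serial filter:", bool(np.allclose(result, serial)))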

  18. Value-based customer grouping from large retail data sets

    NASA Astrophysics Data System (ADS)

    Strehl, Alexander; Ghosh, Joydeep

    2000-04-01

    In this paper, we propose OPOSSUM, a novel similarity-based clustering algorithm using constrained, weighted graph-partitioning. Instead of binary presence or absence of products in a market-basket, we use an extended 'revenue per product' measure to better account for management objectives. Typically the number of clusters desired in a database marketing application is only in the teens or less. OPOSSUM proceeds top-down, which is more efficient and takes a small number of steps to attain the desired number of clusters as compared to bottom-up agglomerative clustering approaches. OPOSSUM delivers clusters that are balanced in terms of either customers (samples) or revenue (value). To facilitate data exploration and validation of results we introduce CLUSION, a visualization toolkit for high-dimensional clustering problems. To enable closed loop deployment of the algorithm, OPOSSUM has no user-specified parameters. Thresholding heuristics are avoided and the optimal number of clusters is automatically determined by a search for maximum performance. Results are presented on a real retail industry data-set of several thousand customers and products, to demonstrate the power of the proposed technique.
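
    A rough sketch of the value-based representation feeding such a clustering (not OPOSSUM itself): customers are described by revenue per product rather than 0/1 baskets, and a pairwise similarity matrix, here an extended Jaccard measure as one possible choice, is what a constrained, balanced graph partitioner would then split. The data and parameters below are invented.

      import numpy as np

      # Revenue-per-product matrix: rows are customers, columns are products, and
      # entries are revenue rather than 0/1 purchase flags.
      rng = np.random.default_rng(11)
      revenue = rng.gamma(shape=0.3, scale=20.0, size=(1000, 50))
      revenue[rng.uniform(size=revenue.shape) < 0.8] = 0.0     # most baskets are sparse

      def extended_jaccard(X):
          """Pairwise similarity s(x, y) = x.y / (|x|^2 + |y|^2 - x.y); for binary
          vectors this reduces to the ordinary Jaccard coefficient."""
          dots = X @ X.T
          norms = np.diag(dots)
          return dots / (norms[:, None] + norms[None, :] - dots + 1e-12)

      S = extended_jaccard(revenue)
      print(S.shape, float(S.min()), float(S.max()))
      # A constrained, balanced graph partitioner (as in OPOSSUM) would now split the
      # similarity graph defined by S into a handful of clusters with roughly equal
      # customer counts or revenue.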

  19. Making Room for Group Work I: Teaching Engineering in a Modern Classroom Setting

    ERIC Educational Resources Information Center

    Wilkens, Robert J.; Ciric, Amy R.

    2005-01-01

    This paper describes the results of several teaching experiments in the teaching Studio of The University of Dayton's Learning-Teaching Center. The Studio is a state-of-the-art classroom with flexible seating arrangements and movable whiteboards and corkboards for small group discussions. The Studio has a communications system with a TV/VCR…

  20. Classroom Management Strategies for Young Children with Challenging Behavior within Early Childhood Settings

    ERIC Educational Resources Information Center

    Jolivette, Kristine; Steed, Elizabeth A.

    2010-01-01

    Many preschool, Head Start, and kindergarten educators of young children express concern about the number of children who exhibit frequent challenging behaviors and report that managing these behaviors is difficult within these classrooms. This article describes research-based strategies with practical applications that can be used as part of…

  1. Explaining Helping Behavior in a Cooperative Learning Classroom Setting Using Attribution Theory

    ERIC Educational Resources Information Center

    Ahles, Paula M.; Contento, Jann M.

    2006-01-01

    This recently completed study examined whether attribution theory can explain helping behavior in an interdependent classroom environment that utilized a cooperative-learning model. The study focused on student participants enrolled in 6 community college communication classes taught by the same instructor. Three levels of cooperative-learning…

  2. The Timing of Feedback on Mathematics Problem Solving in a Classroom Setting

    ERIC Educational Resources Information Center

    Fyfe, Emily R.; Rittle-Johnson, Bethany

    2015-01-01

    Feedback is a ubiquitous learning tool that is theorized to help learners detect and correct their errors. The goal of this study was to examine the effects of feedback in a classroom context for children solving math equivalence problems (problems with operations on both sides of the equal sign). The authors worked with children in 7 second-grade…

  3. Observing Students in Classroom Settings: A Review of Seven Coding Schemes

    ERIC Educational Resources Information Center

    Volpe, Robert J.; DiPerna, James C.; Hintze, John M.; Shapiro, Edward S.

    2005-01-01

    A variety of coding schemes are available for direct observational assessment of student classroom behavior. These instruments have been used for a number of assessment tasks including screening children in need of further evaluation for emotional and behavior problems, diagnostic assessment of emotional and behavior problems, assessment of…

  4. By What Token Economy? A Classroom Learning Tool for Inclusive Settings.

    ERIC Educational Resources Information Center

    Anderson, Carol; Katsiyannis, Antonis

    1997-01-01

    Describes a token economy that used tokens styled as license plates to elicit appropriate behavior in an inclusive fifth-grade class in which four students with behavior disorders were enrolled. Student involvement in establishing the "driving rules" of the classroom is explained, the components of a token economy are outlined, and steps for group…

  5. An Approach to the Problem of Student Passivity in Classroom Settings.

    ERIC Educational Resources Information Center

    MacDonald, Judith B.

    The verbal interaction between a laboratory school class of sixth/seventh grade students and their teacher during 18 social studies discussions was analyzed in order to identify teacher techniques relevant to student discourse and student passivity. Classroom discussions were taped, transcribed, and analyzed according to an adaptation of the…

  6. Teachers on Television. Observing Teachers and Students in Diverse Classroom Settings through the Technology of Television.

    ERIC Educational Resources Information Center

    Iowa State Univ. of Science and Technology, Ames. Coll. of Education.

    Because most teacher preparation institutions have extensive needs for numerous field experiences, the Teachers on Television (TOT) project was first conceived to meet demands for observation opportunities through the use of television technology. The TOT project provided live direct observations via television broadcasts from classrooms located…

  7. Mobile Learning in a Large Blended Computer Science Classroom: System Function, Pedagogies, and Their Impact on Learning

    ERIC Educational Resources Information Center

    Shen, Ruimin; Wang, Minjuan; Gao, Wanping; Novak, D.; Tang, Lin

    2009-01-01

    The computer science classes in China's institutions of higher education often have large numbers of students. In addition, many institutions offer "blended" classes that include both on-campus and online students. These large blended classrooms have long suffered from a lack of interactivity. Many online classes simply provide recorded instructor…

  8. Service user involvement in pre-registration mental health nurse education classroom settings: a review of the literature.

    PubMed

    Terry, J

    2012-11-01

    Service user involvement in pre-registration nurse education is now a requirement, yet little is known about how students engage with users in the classroom, how such initiatives are being evaluated, how service users are prepared themselves to teach students, or the potential influence on clinical practice. The aim of this literature review was to bring together published articles on service user involvement in classroom settings in pre-registration mental health nurse education programmes, including their evaluations. A comprehensive review of the literature was carried out via computer search engines and the Internet, as well as a hand search of pertinent journals and references. This produced eight papers that fitted the inclusion criteria, comprising four empirical studies and four review articles, which were then reviewed using a seven-item checklist. The articles revealed a range of teaching and learning strategies had been employed, ranging from exposure to users' personal stories, to students being required to demonstrate awareness of user perspectives in case study presentations, with others involving eLearning and assessment skills initiatives. This review concludes that further longitudinal research is needed to establish the influence of user involvement in the classroom over time. PMID:22296494

  9. Engaging millennial learners: Effectiveness of personal response system technology with nursing students in small and large classrooms.

    PubMed

    Revell, Susan M Hunter; McCurry, Mary K

    2010-05-01

    Nurse educators must explore innovative technologies that make the most of the characteristics and learning styles of millennial learners. These students are comfortable with technology and prefer interactive classrooms with individual feedback and peer collaboration. This study evaluated the perceived effectiveness of personal response system (PRS) technology in enhancing student learning in small and large classrooms. PRS technology was integrated into two undergraduate courses, nursing research (n = 33) and junior medical-surgical nursing (n = 116). Multiple-choice, true-false, NCLEX-RN alternate format, and reading quiz questions were incorporated within didactic PowerPoint presentations. Data analysis of Likert-type and open-response questions supported the use of PRS technology as an effective strategy for educating millennial learners in both small and large classrooms. PRS technology promotes active learning, increases participation, and provides students and faculty with immediate feedback that reflects comprehension of content and increases faculty-student interaction. PMID:20055325

  10. Feasibility and Acceptability of Adapting the Eating in the Absence of Hunger Assessment for Preschoolers in the Classroom Setting.

    PubMed

    Soltero, Erica G; Ledoux, Tracey; Lee, Rebecca E

    2015-12-01

    Eating in the Absence of Hunger (EAH) represents a failure to self-regulate intake leading to overconsumption. Existing research on EAH has come from the clinical setting, limiting our understanding of this behavior. The purpose of this study was to describe the adaptation of the clinical EAH paradigm for preschoolers to the classroom setting and evaluate the feasibility and acceptability of measuring EAH in the classroom. The adapted protocol was implemented in childcare centers in Houston, Texas (N=4) and Phoenix, Arizona (N=2). The protocol was feasible, economical, and time efficient, eliminating previously identified barriers to administering the EAH assessment such as limited resources and the time constraint of delivering the assessment to participants individually. Implementation challenges included difficulty in choosing palatable test snacks that were in compliance with childcare center food regulations and the limited control over the meal that was administered prior to the assessment. The adapted protocol will allow for broader use of the EAH assessment and encourage researchers to incorporate the assessment into longitudinal studies in order to further our understanding of the causes and emergence of EAH. PMID:26172567

  11. Response Grids: Practical Ways to Display Large Data Sets with High Visual Impact

    ERIC Educational Resources Information Center

    Gates, Simon

    2013-01-01

    Spreadsheets are useful for large data sets but they may be too wide or too long to print as conventional tables. Response grids offer solutions to the challenges posed by any large data set. They have wide application throughout science and for every subject and context where visual data displays are designed, within education and elsewhere.…

  12. Mobile-Phone-Based Classroom Response Systems: Students' Perceptions of Engagement and Learning in a Large Undergraduate Course

    ERIC Educational Resources Information Center

    Dunn, Peter K.; Richardson, Alice; Oprescu, Florin; McDonald, Christine

    2013-01-01

    Using a Classroom Response System (CRS) has been associated with positive educational outcomes, by fostering student engagement and by allowing immediate feedback to both students and instructors. This study examined a low-cost CRS (VotApedia) in a large first-year class, where students responded to questions using their mobile phones. This study…

  13. Silent and Vocal Students in a Large Active Learning Chemistry Classroom: Comparison of Performance and Motivational Factors

    ERIC Educational Resources Information Center

    Obenland, Carrie A.; Munson, Ashlyn H.; Hutchinson, John S.

    2013-01-01

    Active learning is becoming more prevalent in large science classrooms, and this study shows the impact on performance of being vocal during Socratic questioning in a General Chemistry course. Over a two-year period, 800 college students were given a pre- and post-test using the Chemistry Concept Reasoning Test. The pre-test results showed that…

  14. Safety and science at sea: connecting science research settings to the classroom through live video

    NASA Astrophysics Data System (ADS)

    Cohen, E.; Peart, L. W.

    2011-12-01

    Many science teachers start the year off with classroom safety topics. Annual repetition helps with mastery of this important and basic knowledge, while helping schools to meet their legal obligations for safe lab science. Although these lessons are necessary, they are often topical, rarely authentic and relatively dull. Interesting connections can, however, be drawn between the importance of safety in science classrooms and the importance of safety in academic laboratories, fieldwork, shipboard research, and commercial research. Teachers can leverage these connections through live video interactions with scientists in the field, thereby creating an authentic learning environment. During the School of Rock 2009, a professional teacher research experience aboard the Integrated Ocean Drilling Program's research vessel JOIDES Resolution, safety and nature-of-science curricula were created to help address this need. By experimenting with various topics and locations on the ship that were accessible and applicable to middle school learning, 43 highly visual "safety signs" and activities were identified and presented "live" by graduate students, teachers, scientists, and the ship's mates, doctor and technical staff. Students were exposed to realistic science process skills along with safety content from the world's only riserless, deep-sea drilling research vessel. The once-in-a-lifetime experience caused the students' eyes to brighten behind their safety glasses, especially as they recognized the same eye wash station and safety gear they have to wear and attended a ship's fire and safety drill alongside scientists in hard hats and personal flotation devices. This collaborative and replicable live video approach will connect basic safety content and nature-of-science process skills for a memorable and authentic learning experience for students.

  15. Initial validation of the prekindergarten Classroom Observation Tool and goal setting system for data-based coaching.

    PubMed

    Crawford, April D; Zucker, Tricia A; Williams, Jeffrey M; Bhavsar, Vibhuti; Landry, Susan H

    2013-12-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based coaching model. We examined psychometric characteristics of the COT and explored how coaches and teachers used the COT goal-setting system. The study included 193 coaches working with 3,909 pre-k teachers in a statewide professional development program. Classrooms served 3 and 4 year olds (n = 56,390) enrolled mostly in Title I, Head Start, and other need-based pre-k programs. Coaches used the COT during a 2-hr observation at the beginning of the academic year. Teachers collected progress-monitoring data on children's language, literacy, and math outcomes three times during the year. Results indicated a theoretically supported eight-factor structure of the COT across language, literacy, and math instructional domains. Overall interrater reliability among coaches was good (.75). Although correlations with an established teacher observation measure were small, significant positive relations between COT scores and children's literacy outcomes indicate promising predictive validity. Patterns of goal-setting behaviors indicate teachers and coaches set an average of 43.17 goals during the academic year, and coaches reported that 80.62% of goals were met. Both coaches and teachers reported the COT was a helpful measure for enhancing quality of Tier 1 instruction. Limitations of the current study and implications for research and data-based coaching efforts are discussed. PMID:24059812

  16. A Day in Third Grade: A Large-Scale Study of Classroom Quality and Teacher and Student Behavior

    ERIC Educational Resources Information Center

    Elementary School Journal, 2005

    2005-01-01

    Observations of 780 third-grade classrooms described classroom activities, child-teacher interactions, and dimensions of the global classroom environment, which were examined in relation to structural aspects of the classroom and child behavior. 1 child per classroom was targeted for observation in relation to classroom quality and teacher and…

  17. Assessing the Effectiveness of Inquiry-based Learning Techniques Implemented in Large Classroom Settings

    NASA Astrophysics Data System (ADS)

    Steer, D. N.; McConnell, D. A.; Owens, K.

    2001-12-01

    Geoscience and education faculty at The University of Akron jointly developed a series of inquiry-based learning modules aimed at both non-major and major student populations enrolled in introductory geology courses. These courses typically serve 2500 students per year in four to six sections of 40-160 students each. Twelve modules were developed that contained common topics and assessments appropriate to Earth Science, Environmental Geology and Physical Geology classes. All modules were designed to meet four primary learning objectives agreed upon by Department of Geology faculty. These major objectives include: 1) Improvement of student understanding of the scientific method; 2) Incorporation of problem solving strategies involving analysis, synthesis, and interpretation; 3) Development of the ability to distinguish between inferences, data and observations; and 4) Obtaining an understanding of basic processes that operate on Earth. Additional objectives that may be addressed by selected modules include: 1) The societal relevance of science; 2) Use and interpretation of quantitative data to better understand the Earth; 3) Development of the students' ability to communicate scientific results; 4) Distinguishing differences between science, religion and pseudo-science; 5) Evaluation of scientific information found in the mass media; and 6) Building interpersonal relationships through in-class group work. Student pre- and post-instruction progress was evaluated by administering a test of logical thinking, an attitude toward science survey, and formative evaluations. Scores from the logical thinking instrument were used to form balanced four-person working groups based on the students' incoming cognitive level. Groups were required to complete a series of activities and/or exercises that targeted different cognitive domains based upon Bloom's taxonomy (knowledge, comprehension, application, analysis, synthesis and evaluation of information). Daily assessments of knowledge-level learning included evaluations of student responses to pre- and post-instruction conceptual test questions, short group exercises and content-oriented exam questions. Higher-level thinking skills were assessed when students completed exercises that required the completion of Venn diagrams, concept maps and/or evaluation rubrics both during class periods and on exams. Initial results indicate that these techniques improved student attendance significantly and improved overall retention in the course by 8-14% over traditional lecture formats. Student scores on multiple-choice exam questions were slightly higher (1-3%) for students taught in the active learning environment, and short-answer questions showed larger gains (7%) over students' scores in a more traditional class structure.

  18. When the globe is your classroom: teaching and learning about large-scale environmental change online

    NASA Astrophysics Data System (ADS)

    Howard, E. A.; Coleman, K. J.; Barford, C. L.; Kucharik, C.; Foley, J. A.

    2005-12-01

    Understanding environmental problems that cross physical and disciplinary boundaries requires a more holistic view of the world - a "systems" approach. Yet it is a challenge for many learners to start thinking this way, particularly when the problems are large in scale and not easily visible. We will describe our online university course, "Humans and the Changing Biosphere," which takes a whole-systems perspective for teaching regional to global-scale environmental science concepts, including climate, hydrology, ecology, and human demographics. We will share our syllabus and learning objectives and summarize our efforts to incorporate "best" practices for online teaching. We will describe challenges we have faced, and our efforts to reach different learner types. Our goals for this presentation are: (1) to communicate how a systems approach ties together environmental sciences (including climate, hydrology, ecology, biogeochemistry, and demography) that are often taught as separate disciplines; (2) to generate discussion about challenges of teaching large-scale environmental processes; (3) to share our experiences in teaching these topics online; (4) to receive ideas and feedback on future teaching strategies. We will explain why we developed this course online, and share our experiences about benefits and challenges of teaching over the web - including some suggestions about how to use technology to supplement face-to-face learning experiences (and vice versa). We will summarize assessment data about what students learned during the course, and discuss key misconceptions and barriers to learning. We will highlight the role of an online discussion board in creating classroom community, identifying misconceptions, and engaging different types of learners.

  19. BEST in CLASS: A Classroom-Based Model for Ameliorating Problem Behavior in Early Childhood Settings

    ERIC Educational Resources Information Center

    Vo, Abigail; Sutherland, Kevin S.; Conroy, Maureen A.

    2012-01-01

    As more young children enter school settings to attend early childhood programs, early childhood teachers and school psychologists have been charged with supporting a growing number of young children with chronic problem behaviors that put them at risk for the development of emotional/behavioral disorders (EBDs). There is a need for effective,…

  20. Intercultural Education Set Forward: Operational Strategies and Procedures in Cypriot Classrooms

    ERIC Educational Resources Information Center

    Hajisoteriou, Christina

    2012-01-01

    Teachers in Cyprus are being called upon for the first time to teach within culturally diverse educational settings. Given the substantial role teachers play in the implementation of intercultural education, this paper explores the intercultural strategies and procedures adopted by primary school teachers in Cyprus. Interviews were carried out…

  1. Best in Class: A Classroom-Based Model for Ameliorating Problem Behavior in Early Childhood Settings

    ERIC Educational Resources Information Center

    Vo, Abigail K.; Sutherland, Kevin S.; Conroy, Maureen A.

    2012-01-01

    As more young children enter school settings to attend early childhood programs, early childhood teachers and school psychologists have been charged with supporting a growing number of young children with chronic problem behaviors that put them at risk for the development of emotional/behavioral disorders (EBDs). There is a need for effective,…

  2. A Classroom Exercise in Spatial Analysis Using an Imaginary Data Set.

    ERIC Educational Resources Information Center

    Kopaska-Merkel, David C.

    One skill that elementary students need to acquire is the ability to analyze spatially distributed data. In this activity students are asked to complete the following tasks: (1) plot a set of data (related to "mud-sharks"--an imaginary fish) on a map of the state of Alabama, (2) identify trends in the data, (3) make graphs using the data…

  3. An Academic Approach to Stress Management for College Students in a Conventional Classroom Setting.

    ERIC Educational Resources Information Center

    Carnahan, Robert E.; And Others

    Since the identification of stress and the relationship of individual stress responses to physical and mental health, medical and behavioral professionals have been training individuals in coping strategies. To investigate the possibility of teaching cognitive coping skills to a nonclinical population in an academic setting, 41 college students…

  4. Toddler Subtraction with Large Sets: Further Evidence for an Analog-Magnitude Representation of Number

    ERIC Educational Resources Information Center

    Slaughter, Virginia; Kamppi, Dorian; Paynter, Jessica

    2006-01-01

    Two experiments were conducted to test the hypothesis that toddlers have access to an analog-magnitude number representation that supports numerical reasoning about relatively large numbers. Three-year-olds were presented with subtraction problems in which initial set size and proportions subtracted were systematically varied. Two sets of cookies…

  5. Using Mobile Phones to Increase Classroom Interaction

    ERIC Educational Resources Information Center

    Cobb, Stephanie; Heaney, Rose; Corcoran, Olivia; Henderson-Begg, Stephanie

    2010-01-01

    This study examines the possible benefits of using mobile phones to increase interaction and promote active learning in large classroom settings. First year undergraduate students studying Cellular Processes at the University of East London took part in a trial of a new text-based classroom interaction system and evaluated their experience by…

  6. Impact of Abbreviated Lecture with Interactive Mini-cases vs Traditional Lecture on Student Performance in the Large Classroom

    PubMed Central

    Nykamp, Diane L.; Momary, Kathryn M.

    2014-01-01

    Objective. To compare the impact of 2 different teaching and learning methods on student mastery of learning objectives in a pharmacotherapy module in the large classroom setting. Design. Two teaching and learning methods were implemented and compared in a required pharmacotherapy module for 2 years. The first year, multiple interactive mini-cases with in-class individual assessment and an abbreviated lecture were used to teach osteoarthritis; a traditional lecture with 1 in-class case discussion was used to teach gout. In the second year, the same topics were used but the methods were flipped. Student performance on pre/post individual readiness assessment tests (iRATs), case questions, and subsequent examinations was compared each year by the teaching and learning method and then between years by topic for each method. Students also voluntarily completed a 20-item evaluation of the teaching and learning methods. Assessment. Postpresentation iRATs were significantly higher than prepresentation iRATs for each topic each year with the interactive mini-cases; there was no significant difference in iRATs before and after traditional lecture. For osteoarthritis, postpresentation iRATs after interactive mini-cases in year 1 were significantly higher than postpresentation iRATs after traditional lecture in year 2; the difference in iRATs for gout per learning method was not significant. The difference between examination performance for osteoarthritis and gout was not significant when the teaching and learning methods were compared. On the student evaluations, 2 items were significant both years when answers were compared by teaching and learning method. Each year, students ranked their class participation higher with interactive cases than with traditional lecture, but both years they reported enjoying the traditional lecture format more. Conclusion. Multiple interactive mini-cases with an abbreviated lecture improved immediate mastery of learning objectives compared to

  7. Confirming the Phylogeny of Mammals by Use of Large Comparative Sequence Data Sets

    PubMed Central

    Prasad, Arjun B.; Allard, Marc W.

    2008-01-01

    The ongoing generation of prodigious amounts of genomic sequence data from myriad vertebrates is providing unparalleled opportunities for establishing definitive phylogenetic relationships among species. The size and complexities of such comparative sequence data sets not only allow smaller and more difficult branches to be resolved but also present unique challenges, including large computational requirements and the negative consequences of systematic biases. To explore these issues and to clarify the phylogenetic relationships among mammals, we have analyzed a large data set of over 60 megabase pairs (Mb) of high-quality genomic sequence, which we generated from 41 mammals and 3 other vertebrates. All sequences are orthologous to a 1.9-Mb region of the human genome that encompasses the cystic fibrosis transmembrane conductance regulator gene (CFTR). To understand the characteristics and challenges associated with phylogenetic analyses of such a large data set, we partitioned the sequence data in several ways and utilized maximum likelihood, maximum parsimony, and Neighbor-Joining algorithms, implemented in parallel on Linux clusters. These studies yielded well-supported phylogenetic trees, largely confirming other recent molecular phylogenetic analyses. Our results provide support for rooting the placental mammal tree between Atlantogenata (Xenarthra and Afrotheria) and Boreoeutheria (Euarchontoglires and Laurasiatheria), illustrate the difficulty in resolving some branches even with large amounts of data (e.g., in the case of Laurasiatheria), and demonstrate the valuable role that very large comparative sequence data sets can play in refining our understanding of the evolutionary relationships of vertebrates. PMID:18453548

  8. Experiments and other methods for developing expertise with design of experiments in a classroom setting

    NASA Technical Reports Server (NTRS)

    Patterson, John W.

    1990-01-01

    The only way to gain genuine expertise in Statistical Process Control (SPC) and the design of experiments (DOX) is with repeated practice, but not on canned problems with dead data sets. Rather, one must negotiate a wide variety of problems each with its own peculiarities and its own constantly changing data. The problems should not be of the type for which there is a single, well-defined answer that can be looked up in a fraternity file or in some text. The problems should match as closely as possible the open-ended types for which there is always an abundance of uncertainty. These are the only kinds that arise in real research, whether that be basic research in academe or engineering research in industry. To gain this kind of experience, either as a professional consultant or as an industrial employee, takes years. Vast amounts of money, not to mention careers, must be put at risk. The purpose here is to outline some realistic simulation-type lab exercises that are so simple and inexpensive to run that the students can repeat them as often as desired at virtually no cost. Simulations also allow the instructor to design problems whose outcomes are as noisy as desired but still predictable within limits. Also, the instructor and the students can learn a great deal more from the postmortem conducted after the exercise is completed. One never knows for sure what the true data should have been when dealing only with real-life experiments. To add a bit more realism to the exercises, it is sometimes desirable to make the students pay for each experimental result from a make-believe budget allocation for the problem.
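
    As a concrete illustration of the kind of exercise the author advocates, the following sketch simulates a noisy process with a hidden "true" model that students probe with a replicated two-level factorial design; the factor names, effect sizes, and noise level are invented for illustration and are not taken from the paper.

```python
# Hedged sketch of a simulation-type DOX exercise: a hidden response surface
# plus noise, probed with a replicated 2^2 factorial design in coded units.
import numpy as np

rng = np.random.default_rng(5)

def run_experiment(temp, time, noise=2.0):
    """Simulated process yield; the 'true' model below is hidden from students."""
    true_yield = 50 + 4.0 * temp + 1.5 * time - 2.0 * temp * time
    return true_yield + rng.normal(scale=noise)

# Replicated 2^2 factorial design in coded units (-1 / +1).
design = np.array([(t, m) for t in (-1, 1) for m in (-1, 1)] * 3, dtype=float)
y = np.array([run_experiment(t, m) for t, m in design])

# Classroom-style effect estimates: difference between high and low averages.
temp_effect = y[design[:, 0] == 1].mean() - y[design[:, 0] == -1].mean()
time_effect = y[design[:, 1] == 1].mean() - y[design[:, 1] == -1].mean()
print("estimated main effects:", round(temp_effect, 2), round(time_effect, 2))
```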

  9. Out in the Classroom: Transgender Student Experiences at a Large Public University

    ERIC Educational Resources Information Center

    Pryor, Jonathan T.

    2015-01-01

    Faculty and peer interactions are 2 of the most important relationships for college students to foster (Astin, 1993). Transgender college students have an increasing visible presence on college campuses (Pusch, 2005), yet limited research exists on their experiences and struggles in the classroom environment (Garvey & Rankin, 2015; Renn,…

  10. Classroom Response Systems for Implementing "Interactive Inquiry" in Large Organic Chemistry Classes

    ERIC Educational Resources Information Center

    Morrison, Richard W.; Caughran, Joel A.; Sauers, Angela L.

    2014-01-01

    The authors have developed "sequence response applications" for classroom response systems (CRSs) that allow instructors to engage and actively involve students in the learning process, probe for common misconceptions regarding lecture material, and increase interaction between instructors and students. "Guided inquiry" and…

  11. Prekindergarten Teachers' Verbal References to Print during Classroom-Based, Large-Group Shared Reading

    ERIC Educational Resources Information Center

    Zucker, Tricia A.; Justice, Laura M.; Piasta, Shayne B.

    2009-01-01

    Purpose: The frequency with which adults reference print when reading with preschool-age children is associated with growth in children's print knowledge (e.g., L.M. Justice & H.K. Ezell, 2000, 2002). This study examined whether prekindergarten (pre-K) teachers naturally reference print during classroom shared reading and if verbal print…

  12. Coaching as a Key Component in Teachers' Professional Development: Improving Classroom Practices in Head Start Settings. OPRE Report 2012-4

    ERIC Educational Resources Information Center

    Lloyd, Chrrishana M.; Modlin, Emmily L.

    2012-01-01

    Head Start CARES (Classroom-based Approaches and Resources for Emotion and Social Skill Promotion) is a large-scale, national research demonstration that was designed to test the effects of a one-year program aimed at improving pre-kindergarteners' social and emotional readiness for school. To facilitate the delivery of the program, teachers…

  13. The Classroom Observation Schedule to Measure Intentional Communication (COSMIC): An Observational Measure of the Intentional Communication of Children with Autism in an Unstructured Classroom Setting

    ERIC Educational Resources Information Center

    Pasco, Greg; Gordon, Rosanna K.; Howlin, Patricia; Charman, Tony

    2008-01-01

    The Classroom Observation Schedule to Measure Intentional Communication (COSMIC) was devised to provide ecologically valid outcome measures for a communication-focused intervention trial. Ninety-one children with autism spectrum disorder aged 6 years 10 months (SD 16 months) were videoed during their everyday snack, teaching and free play…

  14. Preschoolers' Nonsymbolic Arithmetic with Large Sets: Is Addition More Accurate than Subtraction?

    ERIC Educational Resources Information Center

    Shinskey, Jeanne L.; Chan, Cindy Ho-man; Coleman, Rhea; Moxom, Lauren; Yamamoto, Eri

    2009-01-01

    Adult and developing humans share with other animals analog magnitude representations of number that support nonsymbolic arithmetic with large sets. This experiment tested the hypothesis that such representations may be more accurate for addition than for subtraction in children as young as 3 1/2 years of age. In these tasks, the experimenter hid…

  15. Using Content-Specific Lyrics to Familiar Tunes in a Large Lecture Setting

    ERIC Educational Resources Information Center

    McLachlin, Derek T.

    2009-01-01

    Music can be used in lectures to increase student engagement and help students retain information. In this paper, I describe my use of biochemistry-related lyrics written to the tune of the theme to the television show, The Flintstones, in a large class setting (400-800 students). To determine student perceptions, the class was surveyed several…

  16. Influences of large sets of environmental exposures on immune responses in healthy adult men

    PubMed Central

    Yi, Buqing; Rykova, Marina; Jäger, Gundula; Feuerecker, Matthias; Hörl, Marion; Matzel, Sandra; Ponomarev, Sergey; Vassilieva, Galina; Nichiporuk, Igor; Choukèr, Alexander

    2015-01-01

    Environmental factors have long been known to influence immune responses. In particular, clinical studies about the association between migration and increased risk of atopy/asthma have provided important information on the role of migration-associated large sets of environmental exposures in the development of allergic diseases. However, investigations about environmental effects on immune responses are mostly limited to candidate environmental exposures, such as air pollution. The influences of large sets of environmental exposures on immune responses are still largely unknown. A simulated 520-d Mars mission provided an opportunity to investigate this topic. Six healthy males lived in a closed habitat simulating a spacecraft for 520 days. When they exited their “spacecraft” after the mission, the scenario was similar to that of migration, involving exposure to a new set of environmental pollutants and allergens. We measured multiple immune parameters with blood samples at chosen time points after the mission. At the early adaptation stage, highly enhanced cytokine responses were observed upon ex vivo antigen stimulations. For cell population frequencies, we found the subjects displayed increased neutrophils. These results may presumably represent the immune changes that occur in healthy humans when migrating, indicating that large sets of environmental exposures may trigger aberrant immune activity. PMID:26306804

  17. DocCube: Multi-Dimensional Visualization and Exploration of Large Document Sets.

    ERIC Educational Resources Information Center

    Mothe, Josiane; Chrisment, Claude; Dousset, Bernard; Alaux, Joel

    2003-01-01

    Describes a user interface that provides global visualizations of large document sets to help users formulate the query that corresponds to their information needs. Highlights include concept hierarchies that users can browse to specify and refine information needs; knowledge discovery in databases and texts; and multidimensional modeling.…

  18. Influences of large sets of environmental exposures on immune responses in healthy adult men.

    PubMed

    Yi, Buqing; Rykova, Marina; Jäger, Gundula; Feuerecker, Matthias; Hörl, Marion; Matzel, Sandra; Ponomarev, Sergey; Vassilieva, Galina; Nichiporuk, Igor; Choukèr, Alexander

    2015-01-01

    Environmental factors have long been known to influence immune responses. In particular, clinical studies about the association between migration and increased risk of atopy/asthma have provided important information on the role of migration-associated large sets of environmental exposures in the development of allergic diseases. However, investigations about environmental effects on immune responses are mostly limited to candidate environmental exposures, such as air pollution. The influences of large sets of environmental exposures on immune responses are still largely unknown. A simulated 520-d Mars mission provided an opportunity to investigate this topic. Six healthy males lived in a closed habitat simulating a spacecraft for 520 days. When they exited their "spacecraft" after the mission, the scenario was similar to that of migration, involving exposure to a new set of environmental pollutants and allergens. We measured multiple immune parameters with blood samples at chosen time points after the mission. At the early adaptation stage, highly enhanced cytokine responses were observed upon ex vivo antigen stimulations. For cell population frequencies, we found the subjects displayed increased neutrophils. These results may presumably represent the immune changes that occur in healthy humans when migrating, indicating that large sets of environmental exposures may trigger aberrant immune activity. PMID:26306804

  19. Large-scale detection of metals with a small set of fluorescent DNA-like chemosensors.

    PubMed

    Yuen, Lik Hang; Franzini, Raphael M; Tan, Samuel S; Kool, Eric T

    2014-10-15

    An important advantage of pattern-based chemosensor sets is their potential to detect and differentiate a large number of analytes with only a few sensors. Here we test this principle at a conceptual limit by analyzing a large set of metal ion analytes covering essentially the entire periodic table, employing fluorescent DNA-like chemosensors on solid support. A tetrameric "oligodeoxyfluoroside" (ODF) library of 6561 members containing metal-binding monomers was screened for strong responders to 57 metal ions in solution. Our results show that a set of 9 chemosensors could successfully discriminate the 57 species, including alkali, alkaline earth, post-transition, transition, and lanthanide metals. As few as 6 ODF chemosensors could detect and differentiate 50 metals at 100 μM; sensitivity for some metals was achieved at midnanomolar ranges. A blind test with 50 metals further confirmed the discriminating power of the ODFs. PMID:25255102
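
    To make the pattern-based idea concrete, here is a minimal sketch of fingerprint-style discrimination: each analyte is represented by its response pattern across a small sensor set, and an unknown is assigned to the nearest stored fingerprint. The response values are synthetic stand-ins; none of the paper's ODF data or screening procedure is reproduced.

```python
# Illustrative sketch of pattern-based discrimination with a small sensor set:
# each analyte is identified by its response "fingerprint" across the sensors.
# The response matrix here is synthetic placeholder data.
import numpy as np

rng = np.random.default_rng(1)
n_metals, n_sensors = 57, 9
library = rng.normal(size=(n_metals, n_sensors))       # reference fingerprints

unknown = library[23] + rng.normal(scale=0.1, size=n_sensors)  # noisy observation

# Nearest-fingerprint classification (Euclidean distance).
distances = np.linalg.norm(library - unknown, axis=1)
print("identified as metal index:", int(np.argmin(distances)))
```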

  20. On basis set superposition error corrected stabilization energies for large n-body clusters.

    PubMed

    Walczak, Katarzyna; Friedrich, Joachim; Dolg, Michael

    2011-10-01

    In this contribution, we propose an approximate basis set superposition error (BSSE) correction scheme for the site-site function counterpoise and for the Valiron-Mayer function counterpoise correction of second order to account for the basis set superposition error in clusters with a large number of subunits. The accuracy of the proposed scheme has been investigated for a water cluster series at the CCSD(T), CCSD, MP2, and self-consistent field levels of theory using Dunning's correlation consistent basis sets. The BSSE corrected stabilization energies for a series of water clusters are presented. A study regarding the possible savings with respect to computational resources has been carried out as well as a monitoring of the basis set dependence of the approximate BSSE corrections. PMID:21992293
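
    For readers unfamiliar with counterpoise corrections, the standard two-body (Boys-Bernardi) scheme that the paper's approximate site-site and Valiron-Mayer variants generalize to many subunits can be written as follows; the notation E_X(Y), meaning fragment X evaluated in basis Y, is adopted here purely for illustration.

```latex
% Standard two-body counterpoise correction, shown as background; the paper's
% approximate schemes address the analogous corrections for large n-body clusters.
\begin{align}
  \Delta E^{\mathrm{CP}}_{\mathrm{int}} &= E_{AB}(AB) - E_{A}(AB) - E_{B}(AB),\\
  \delta^{\mathrm{BSSE}} &= \bigl[E_{A}(A) - E_{A}(AB)\bigr] + \bigl[E_{B}(B) - E_{B}(AB)\bigr],
\end{align}
```

    Here the counterpoise-corrected stabilization energy is obtained by evaluating both monomers in the full dimer basis; the schemes discussed in the abstract approximate such corrections when the number of subunits makes the exact many-body counterpoise treatment impractical.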

  1. Large Code Set for Double User Capacity and Low PAPR Level in Multicarrier Systems

    NASA Astrophysics Data System (ADS)

    Anwar, Khoirul; Saito, Masato; Hara, Takao; Okada, Minoru

    In this paper, a new large spreading code set with a uniform low cross-correlation is proposed. The proposed code set is capable of (1) increasing the number of assigned users (capacity) in a multicarrier code division multiple access (MC-CDMA) system and (2) reducing the peak-to-average power ratio (PAPR) of an orthogonal frequency division multiplexing (OFDM) system. We derive a new code set and present an example to demonstrate performance improvements of OFDM and MC-CDMA systems. Our proposed code set of code length N contains K = 2N+1 codes, supporting up to 2N+1 users, and exhibits lower cross-correlation properties than existing spreading code sets. Our results with N=16 subcarriers confirm that the proposed code set outperforms the current pseudo-orthogonal carrier interferometry (POCI) code set with a gain of 5 dB at a bit-error-rate (BER) level of 10^-4 in the additive white Gaussian noise (AWGN) channel and a gain of more than 3.6 dB in a multipath fading channel.
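
    As a worked illustration of the two figures of merit the abstract targets, the sketch below screens a toy code set for its maximum pairwise cross-correlation and for the PAPR of one spread OFDM symbol; carrier-interferometry-style exponential codes are used purely as a stand-in, since the paper's proposed construction is not reproduced here.

```python
# Sketch of screening a candidate spreading-code set for low cross-correlation
# and low PAPR of the resulting multicarrier signal. The toy codes below are a
# stand-in, not the code set proposed in the paper.
import numpy as np

N = 16                                  # code length / number of subcarriers
K = N                                   # number of codes in this toy set
codes = np.array([[np.exp(1j * 2 * np.pi * n * k / N) for n in range(N)]
                  for k in range(K)])   # carrier-interferometry-type codes

# Maximum normalized cross-correlation between distinct codes.
gram = np.abs(codes.conj() @ codes.T) / N
np.fill_diagonal(gram, 0.0)
print("max cross-correlation:", gram.max().round(3))

# PAPR of the time-domain OFDM symbol carrying one code (oversampled IFFT).
x = np.fft.ifft(codes[1], n=8 * N)
papr = np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2)
print("PAPR [dB]:", round(10 * np.log10(papr), 2))
```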

  2. Implementing Child-focused Activity Meter Utilization into the Elementary School Classroom Setting Using a Collaborative Community-based Approach

    PubMed Central

    Lynch, BA; Jones, A; Biggs, BK; Kaufman, T; Cristiani, V; Kumar, S; Quigg, S; Maxson, J; Swenson, L; Jacobson, N

    2016-01-01

    Introduction The prevalence of pediatric obesity has increased over the past 3 decades and is a pressing public health problem. New technology advancements that can encourage more physical activity in children are needed. The Zamzee program is an activity meter linked to a motivational website designed for children 8–14 years of age. The objective of the study was to use a collaborative approach between a medical center, the private sector and local school staff to assess the feasibility of using the Zamzee Program in the school-based setting to improve physical activity levels in children. Methods This was a pilot 8-week observational study offered to all children in one fifth-grade classroom. Body mass index (BMI), the amount of physical activity by 3-day recall survey, and satisfaction with usability of the Zamzee Program were measured pre- and post-study. Results Out of 11 children who enrolled in the study, 7 completed all study activities. In those who completed the study, the median (interquartile range) total activity time by survey increased by 17 (1042) minutes and the BMI percentile change was 0 (8). Both children and their caregivers found the Zamzee Activity Meter (6/7) and website (6/7) "very easy" or "easy" to use. Conclusion The Zamzee Program was found to be usable but did not significantly improve physical activity levels or BMI. Collaborative obesity intervention projects involving medical centers, the private sector and local schools are feasible but the effectiveness needs to be evaluated in larger-scale studies. PMID:27042382

  3. An Analysis Framework Addressing the Scale and Legibility of Large Scientific Data Sets

    SciTech Connect

    Childs, H R

    2006-11-20

    Much of the previous work in the large data visualization area has solely focused on handling the scale of the data. This task is clearly a great challenge and necessary, but it is not sufficient. Applying standard visualization techniques to large scale data sets often creates complicated pictures where meaningful trends are lost. A second challenge, then, is to also provide algorithms that simplify what an analyst must understand, using either visual or quantitative means. This challenge can be summarized as improving the legibility or reducing the complexity of massive data sets. Fully meeting both of these challenges is the work of many, many PhD dissertations. In this dissertation, we describe some new techniques to address both the scale and legibility challenges, in hope of contributing to the larger solution. In addition to our assumption of simultaneously addressing both scale and legibility, we add an additional requirement that the solutions considered fit well within an interoperable framework for diverse algorithms, because a large suite of algorithms is often necessary to fully understand complex data sets. For scale, we present a general architecture for handling large data, as well as details of a contract-based system for integrating advanced optimizations into a data flow network design. We also describe techniques for volume rendering and performing comparisons at the extreme scale. For legibility, we present several techniques. Most noteworthy are equivalence class functions, a technique to drive visualizations using statistical methods, and line-scan based techniques for characterizing shape.

  4. PrestoPronto: a code devoted to handling large data sets

    NASA Astrophysics Data System (ADS)

    Figueroa, S. J. A.; Prestipino, C.

    2016-05-01

    The software PrestoPronto consists of a full graphical user interface (GUI) program aimed at the analysis of large X-ray Absorption Spectroscopy data sets. Written in Python, it is free and open source. The code is able to read large datasets, apply calibration and alignment corrections, and perform classical data analysis, from the extraction of the signal to the EXAFS fit. The package also includes programs with GUIs to perform Principal Component Analysis and Linear Combination Fits. The main benefit of this program is that it allows the user to quickly follow the evolution of time-resolved experiments coming from Quick-EXAFS (QEXAFS) and dispersive EXAFS beamlines.
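
    As an illustration of one analysis step mentioned above, the sketch below runs an SVD-based Principal Component Analysis over a stack of time-resolved spectra using generic NumPy calls; the array shapes and random placeholder data are assumptions, and this is not the PrestoPronto API itself.

```python
# Minimal sketch of PCA on a time-resolved stack of absorption spectra,
# using generic scientific-Python tools rather than the PrestoPronto GUI.
import numpy as np

n_spectra, n_energy = 500, 1024
rng = np.random.default_rng(2)
spectra = rng.normal(size=(n_spectra, n_energy))       # placeholder QEXAFS stack

centered = spectra - spectra.mean(axis=0)
# SVD-based PCA: rows of vt are the principal spectral components,
# s**2 is proportional to the variance each component explains.
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / np.sum(s ** 2)
print("variance explained by first 3 components:", explained[:3].round(3))
```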

  5. Validating a large geophysical data set: Experiences with satellite-derived cloud parameters

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie

    1992-01-01

    We are validating the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and are using the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The data handling needed for this work is possible only with the help of a suite of interactive graphical and numerical analysis tools. Level 3 (gridded) data is the common form in which large data sets of this type are distributed for scientific analysis. We find that Level 3 data is inadequate for the data comparisons required for validation. Level 2 data (individual measurements in geophysical units) is needed. A sampling problem arises when individual measurements, which are not uniformly distributed in space or time, are used for the comparisons. Standard 'interpolation' methods involve fitting the measurements for each data set to surfaces, which are then compared. We are experimenting with formal criteria for selecting geographical regions, based upon the spatial frequency and variability of measurements, that allow us to quantify the uncertainty due to sampling. As part of this project, we are also dealing with ways to keep track of constraints placed on the output by assumptions made in the computer code. The need to work with Level 2 data introduces a number of other data handling issues, such as accessing data files across machine types, meeting large data storage requirements, accessing other validated data sets, processing speed

  6. Faculty and student experiences with Web-based discussion groups in a large lecture setting.

    PubMed

    Harden, Janet Kula

    2003-01-01

    The exchange of ideas in a discussion format is a more effective way of developing critical thinking in students than a traditional lecture format. Although research has shown that discussion groups are more effective for developing skills in application, analysis, and synthesis of content, it is difficult to implement in a large lecture setting. The author discusses how computer discussion groups were incorporated into a class of 117 nursing students. PMID:12544613

  7. A Scalable Approach for Protein False Discovery Rate Estimation in Large Proteomic Data Sets.

    PubMed

    Savitski, Mikhail M; Wilhelm, Mathias; Hahne, Hannes; Kuster, Bernhard; Bantscheff, Marcus

    2015-09-01

    Calculating the number of confidently identified proteins and estimating false discovery rate (FDR) is a challenge when analyzing very large proteomic data sets such as entire human proteomes. Biological and technical heterogeneity in proteomic experiments further add to the challenge and there are strong differences in opinion regarding the conceptual validity of a protein FDR and no consensus regarding the methodology for protein FDR determination. There are also limitations inherent to the widely used classic target-decoy strategy that particularly show when analyzing very large data sets and that lead to a strong over-representation of decoy identifications. In this study, we investigated the merits of the classic, as well as a novel target-decoy-based protein FDR estimation approach, taking advantage of a heterogeneous data collection comprised of ∼19,000 LC-MS/MS runs deposited in ProteomicsDB (https://www.proteomicsdb.org). The "picked" protein FDR approach treats target and decoy sequences of the same protein as a pair rather than as individual entities and chooses either the target or the decoy sequence depending on which receives the highest score. We investigated the performance of this approach in combination with q-value based peptide scoring to normalize sample-, instrument-, and search engine-specific differences. The "picked" target-decoy strategy performed best when protein scoring was based on the best peptide q-value for each protein yielding a stable number of true positive protein identifications over a wide range of q-value thresholds. We show that this simple and unbiased strategy eliminates a conceptual issue in the commonly used "classic" protein FDR approach that causes overprediction of false-positive protein identification in large data sets. The approach scales from small to very large data sets without losing performance, consistently increases the number of true-positive protein identifications and is readily implemented in
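
    The following minimal sketch illustrates the "picked" target-decoy idea as described in the abstract: the target and decoy sequence of each protein are treated as a pair, only the higher-scoring member is retained, and the running fraction of surviving decoys serves as the FDR estimate. The protein names and scores are invented, and a real pipeline would derive protein scores from peptide q-values as the study does.

```python
# Hedged sketch of "picked" target-decoy protein FDR estimation.
from collections import defaultdict

# (protein, is_decoy, score) -- scores here are illustrative placeholders.
hits = [("P1", False, 9.1), ("P1", True, 2.3),
        ("P2", False, 1.2), ("P2", True, 4.7),
        ("P3", False, 7.8), ("P3", True, 0.9)]

# Keep the best score seen for each protein's target and decoy form.
best = defaultdict(lambda: {False: None, True: None})
for protein, is_decoy, score in hits:
    current = best[protein][is_decoy]
    if current is None or score > current:
        best[protein][is_decoy] = score

# "Pick" whichever member of each target/decoy pair scores higher.
picked = []
for protein, scores in best.items():
    target, decoy = scores[False], scores[True]
    if target is not None and (decoy is None or target >= decoy):
        picked.append((protein, "target", target))
    elif decoy is not None:
        picked.append((protein, "decoy", decoy))

# Walk down the ranked list and report a running decoy-based FDR estimate.
picked.sort(key=lambda item: item[2], reverse=True)
decoys = 0
for i, (protein, kind, score) in enumerate(picked, start=1):
    decoys += kind == "decoy"
    print(protein, kind, round(decoys / i, 3))
```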

  8. The search for structure - Object classification in large data sets. [for astronomers

    NASA Technical Reports Server (NTRS)

    Kurtz, Michael J.

    1988-01-01

    Research concerning object classification schemes is reviewed, focusing on large data sets. Classification techniques are discussed, including syntactic and decision-theoretic methods, fuzzy techniques, and stochastic and fuzzy grammars. Consideration is given to the automation of MK classification (Morgan and Keenan, 1973) and other problems associated with the classification of spectra. In addition, the classification of galaxies is examined, including the problems of systematic errors, blended objects, galaxy types, and galaxy clusters.

  9. Coffee Shops, Classrooms and Conversations: public engagement and outreach in a large interdisciplinary research Hub

    NASA Astrophysics Data System (ADS)

    Holden, Jennifer A.

    2014-05-01

    Public engagement and outreach activities are increasingly using specialist staff for co-ordination, training and support for researchers, and they are also becoming expected for large investments. Here, the experience of public engagement and outreach at a large, interdisciplinary Research Hub is described. dot.rural, based at the University of Aberdeen UK, is an £11.8 million Research Councils UK Rural Digital Economy Hub, funded as part of the RCUK Digital Economy Theme (2009-2015). Digital Economy research aims to realise the transformational impact of digital technologies on aspects of the environment, community life, cultural experiences, future society, and the economy. The dot.rural Hub involves 92 researchers from 12 different disciplines, including Geography, Hydrology and Ecology. Public Engagement and Outreach is embedded in the dot.rural Digital Economy Hub via an Outreach Officer. Alongside this position, public engagement and outreach activities are a compulsory part of PhD student contracts. Public Engagement and Outreach activities at the dot.rural Hub involve individuals and groups in both formal and informal settings organised by dot.rural and other organisations. Activities in the realms of Education, Public Engagement, Traditional and Social Media are determined by a set of Underlying Principles designed for the Hub by the Outreach Officer. The underlying Engagement and Outreach principles match funding agency requirements and expectations alongside researcher demands and the user-led nature of Digital Economy Research. All activities include researchers alongside the Outreach Officer, are research-informed, and are embedded into specific projects that form the Hub. Successful public engagement activities have included participation in Café Scientifique series, workshops in primary and secondary schools, and online activities such as I'm a Scientist Get Me Out of Here. From how to engage 8-year-olds with making hydrographs more understandable to members of

  10. Automatic alignment of individual peaks in large high-resolution spectral data sets

    NASA Astrophysics Data System (ADS)

    Stoyanova, Radka; Nicholls, Andrew W.; Nicholson, Jeremy K.; Lindon, John C.; Brown, Truman R.

    2004-10-01

    Pattern recognition techniques are effective tools for reducing the information contained in large spectral data sets to a much smaller number of significant features which can then be used to make interpretations about the chemical or biochemical system under study. Often the effectiveness of such approaches is impeded by experimental and instrument-induced variations in the position, phase, and line width of the spectral peaks. Although characterizing the cause and magnitude of these fluctuations could be important in its own right (pH-induced NMR chemical shift changes, for example), in general they obscure the process of pattern discovery. One major area of application is the use of large databases of 1H NMR spectra of biofluids such as urine for investigating perturbations in metabolic profiles caused by drugs or disease, a process now termed metabonomics. Frequency shifts of individual peaks are the dominant source of such unwanted variations in this type of data. In this paper, an automatic procedure for aligning the individual peaks in the data set is described and evaluated. The proposed method will be vital for the efficient and automatic analysis of large metabonomic data sets and should also be applicable to other types of data.
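
    A minimal sketch of the core idea, peak-wise alignment by maximizing correlation with a reference window, is given below; the window limits, shift range, and synthetic Gaussian peaks are illustrative assumptions and do not reproduce the paper's automatic procedure.

```python
# Sketch of aligning one peak across spectra by searching for the integer shift
# that best correlates each spectrum's window with a reference window.
import numpy as np

def align_to_reference(spectra, reference, lo, hi, max_shift=20):
    """Shift each spectrum's lo:hi window into register with the reference window."""
    aligned = spectra.copy()
    ref_win = reference[lo:hi]
    for i, spec in enumerate(spectra):
        best_shift, best_corr = 0, -np.inf
        for shift in range(-max_shift, max_shift + 1):
            win = spec[lo + shift: hi + shift]
            corr = float(np.dot(win, ref_win))
            if corr > best_corr:
                best_corr, best_shift = corr, shift
        aligned[i, lo:hi] = spec[lo + best_shift: hi + best_shift]
    return aligned

# Toy example: identical Gaussian peaks jittered in position across spectra.
rng = np.random.default_rng(3)
x = np.arange(512)
shifts = rng.integers(-10, 11, size=8)
spectra = np.array([np.exp(-0.5 * ((x - 256 - s) / 4.0) ** 2) for s in shifts])
aligned = align_to_reference(spectra, spectra[0], lo=200, hi=312)
print(np.abs(aligned - aligned[0]).max() < 1e-6)  # peaks coincide after alignment
```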

  11. Moving Large Data Sets Over High-Performance Long Distance Networks

    SciTech Connect

    Hodson, Stephen W; Poole, Stephen W; Ruwart, Thomas; Settlemyer, Bradley W

    2011-04-01

    In this project we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing large data sets to a destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes. We describe the device information required to achieve high levels of I/O performance and discuss how this data is applicable in use cases beyond data movement performance.

  12. Non-rigid Registration for Large Sets of Microscopic Images on Graphics Processors

    PubMed Central

    Ruiz, Antonio; Ujaldon, Manuel; Cooper, Lee

    2014-01-01

    Microscopic imaging is an important tool for characterizing tissue morphology and pathology. 3D reconstruction and visualization of large sample tissue structure requires registration of large sets of high-resolution images. However, the scale of this problem presents a challenge for automatic registration methods. In this paper we present a novel method for efficient automatic registration using graphics processing units (GPUs) and parallel programming. Comparing a C++ CPU implementation with Compute Unified Device Architecture (CUDA) libraries and pthreads running on GPU we achieve a speed-up factor of up to 4.11× with a single GPU and 6.68× with a GPU pair. We present execution times for a benchmark composed of two sets of large-scale images: mouse placenta (16K × 16K pixels) and breast cancer tumors (23K × 62K pixels). It takes more than 12 hours for the genetic case in C++ to register a typical sample composed of 500 consecutive slides, which was reduced to less than 2 hours using two GPUs, with very promising scalability for extending those gains to a large number of GPUs in a distributed system. PMID:25328635

  13. COLLABORATIVE RESEARCH: Parallel Analysis Tools and New Visualization Techniques for Ultra-Large Climate Data Set

    SciTech Connect

    Middleton, Don; Haley, Mary

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  14. Breeding and Genetics Symposium: really big data: processing and analysis of very large data sets.

    PubMed

    Cole, J B; Newman, S; Foertter, F; Aguilar, I; Coffey, M

    2012-03-01

    Modern animal breeding data sets are large and getting larger, due in part to recent availability of high-density SNP arrays and cheap sequencing technology. High-performance computing methods for efficient data warehousing and analysis are under development. Financial and security considerations are important when using shared clusters. Sound software engineering practices are needed, and it is better to use existing solutions when possible. Storage requirements for genotypes are modest, although full-sequence data will require greater storage capacity. Storage requirements for intermediate and results files for genetic evaluations are much greater, particularly when multiple runs must be stored for research and validation studies. The greatest gains in accuracy from genomic selection have been realized for traits of low heritability, and there is increasing interest in new health and management traits. The collection of sufficient phenotypes to produce accurate evaluations may take many years, and high-reliability proofs for older bulls are needed to estimate marker effects. Data mining algorithms applied to large data sets may help identify unexpected relationships in the data, and improved visualization tools will provide insights. Genomic selection using large data requires a lot of computing power, particularly when large fractions of the population are genotyped. Theoretical improvements have made possible the inversion of large numerator relationship matrices, permitted the solving of large systems of equations, and produced fast algorithms for variance component estimation. Recent work shows that single-step approaches combining BLUP with a genomic relationship (G) matrix have similar computational requirements to traditional BLUP, and the limiting factor is the construction and inversion of G for many genotypes. A naïve algorithm for creating G for 14,000 individuals required almost 24 h to run, but custom libraries and parallel computing reduced that to
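
    The "construction and inversion of G" that the abstract identifies as the limiting factor can be made concrete with the widely used VanRaden (2008) construction of the genomic relationship matrix from a 0/1/2 genotype matrix. The sketch below uses NumPy with toy dimensions; it illustrates why the cost scales with the number of genotyped individuals and is not the authors' software.

        import numpy as np

        def genomic_relationship_matrix(M):
            """VanRaden (2008): G = Z Z' / (2 * sum p(1-p)), where M is an
            (individuals x markers) matrix of 0/1/2 allele counts."""
            p = M.mean(axis=0) / 2.0            # observed allele frequency per marker
            Z = M - 2.0 * p                     # centre each marker column
            return Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))

        rng = np.random.default_rng(1)
        M = rng.integers(0, 3, size=(500, 5000)).astype(float)   # toy genotypes
        G = genomic_relationship_matrix(M)
        # Direct inversion costs O(n^3) in the number of genotyped individuals,
        # which is why G becomes the bottleneck as genotyping expands.
        G_inv = np.linalg.inv(G + 1e-6 * np.eye(G.shape[0]))     # small ridge for stability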

  15. The gradient boosting algorithm and random boosting for genome-assisted evaluation in large data sets.

    PubMed

    González-Recio, O; Jiménez-Montero, J A; Alenda, R

    2013-01-01

    In the next few years, with the advent of high-density single nucleotide polymorphism (SNP) arrays and genome sequencing, genomic evaluation methods will need to deal with a large number of genetic variants and an increasing sample size. The boosting algorithm is a machine-learning technique that may alleviate the drawbacks of dealing with such large data sets. This algorithm combines different predictors in a sequential manner with some shrinkage on them; each predictor is applied consecutively to the residuals from the committee formed by the previous ones to form a final prediction based on a subset of covariates. Here, a detailed description is provided and examples using a toy data set are included. A modification of the algorithm called "random boosting" was proposed to increase predictive ability and decrease computation time of genome-assisted evaluation in large data sets. Random boosting uses a random selection of markers to add a subsequent weak learner to the predictive model. These modifications were applied to a real data set composed of 1,797 bulls genotyped for 39,714 SNP. Deregressed proofs of 4 yield traits and 1 type trait from January 2009 routine evaluations were used as dependent variables. A 2-fold cross-validation scenario was implemented. Sires born before 2005 were used as a training sample (1,576 and 1,562 for production and type traits, respectively), whereas younger sires were used as a testing sample to evaluate predictive ability of the algorithm on yet-to-be-observed phenotypes. Comparison with the original algorithm was provided. The predictive ability of the algorithm was measured as Pearson correlations between observed and predicted responses. Further, estimated bias was computed as the average difference between observed and predicted phenotypes. The results showed that the modification of the original boosting algorithm could be run in 1% of the time used with the original algorithm and with negligible differences in accuracy
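
    The scheme described, weak learners fitted sequentially to the residuals of the current committee with shrinkage, and in the "random boosting" variant a random subset of markers per learner, can be sketched as below. The choice of a ridge regression on the sampled markers as the weak learner, and all parameter values, are assumptions for illustration rather than the authors' implementation.

        import numpy as np

        def random_boosting(X, y, n_rounds=100, shrinkage=0.1, n_markers=200, seed=0):
            """Fit ridge learners on random marker subsets to the running
            residuals; return the intercept plus (columns, coefficients) terms."""
            rng = np.random.default_rng(seed)
            residual = y - y.mean()
            model = [(None, y.mean())]
            for _ in range(n_rounds):
                cols = rng.choice(X.shape[1], size=n_markers, replace=False)
                Xs = X[:, cols]
                beta = np.linalg.solve(Xs.T @ Xs + 10.0 * np.eye(n_markers), Xs.T @ residual)
                residual = residual - shrinkage * (Xs @ beta)
                model.append((cols, shrinkage * beta))
            return model

        def predict(model, X):
            yhat = np.full(X.shape[0], model[0][1])
            for cols, beta in model[1:]:
                yhat += X[:, cols] @ beta
            return yhat

        # Toy stand-in for a SNP data set (much smaller than 39,714 markers).
        rng = np.random.default_rng(2)
        X = rng.integers(0, 3, size=(400, 2000)).astype(float)
        y = X[:, :10] @ rng.normal(size=10) + rng.normal(size=400)
        model = random_boosting(X, y)
        print("training correlation:", round(float(np.corrcoef(predict(model, X), y)[0, 1]), 3))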

  16. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1994-01-01

    Envision is an interactive environment that provides researchers in the earth sciences convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy-to-use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ or ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.
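
    Envision itself is an X/Motif application, but the kind of netCDF browsing it supports can be approximated from Python with the netCDF4 library (not part of Envision). The file and variable names below are hypothetical.

        from netCDF4 import Dataset

        def browse(path):
            """Print the dimensions and variables of a netCDF file and read one slice."""
            with Dataset(path, "r") as nc:
                print("dimensions:")
                for name, dim in nc.dimensions.items():
                    print(f"  {name}: {len(dim)}")
                print("variables:")
                for name, var in nc.variables.items():
                    print(f"  {name} {var.dimensions} {var.shape}")
                if "temperature" in nc.variables:          # hypothetical variable name
                    first_step = nc.variables["temperature"][0, ...]
                    print("first time step mean:", float(first_step.mean()))

        # browse("model_output.nc")                        # hypothetical file name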

  17. Neuron-synapse IC chip-set for large-scale chaotic neural networks.

    PubMed

    Horio, Y; Aihara, K; Yamamoto, O

    2003-01-01

    We propose a neuron-synapse integrated circuit (IC) chip-set for large-scale chaotic neural networks. We use switched-capacitor (SC) circuit techniques to implement a three-internal-state transiently-chaotic neural network model. The SC chaotic neuron chip faithfully reproduces complex chaotic dynamics in real numbers through continuous state variables of the analog circuitry. We can digitally control most of the model parameters by means of programmable capacitive arrays embedded in the SC chaotic neuron chip. Since the output of the neuron is transformed into a digital pulse according to the all-or-nothing property of an axon, we design a synapse chip with digital circuits. We propose a memory-based synapse circuit architecture to achieve a rapid calculation of a vast number of weighted summations. Both the SC neuron and the digital synapse circuits have been fabricated in IC form. We have tested these IC chips extensively, and confirmed the functions and performance of the chip-set. The proposed neuron-synapse IC chip-set makes it possible to construct a scalable and reconfigurable large-scale chaotic neural network with 10000 neurons and 10000² synaptic connections. PMID:18244585

  18. Tiny videos: a large data set for nonparametric video retrieval and frame classification.

    PubMed

    Karpenko, Alexandre; Aarabi, Parham

    2011-03-01

    In this paper, we present a large database of over 50,000 user-labeled videos collected from YouTube. We develop a compact representation called "tiny videos" that achieves high video compression rates while retaining the overall visual appearance of the video as it varies over time. We show that frame sampling using affinity propagation-an exemplar-based clustering algorithm-achieves the best trade-off between compression and video recall. We use this large collection of user-labeled videos in conjunction with simple data mining techniques to perform related video retrieval, as well as classification of images and video frames. The classification results achieved by tiny videos are compared with the tiny images framework [24] for a variety of recognition tasks. The tiny images data set consists of 80 million images collected from the Internet. These are the largest labeled research data sets of videos and images available to date. We show that tiny videos are better suited for classifying scenery and sports activities, while tiny images perform better at recognizing objects. Furthermore, we demonstrate that combining the tiny images and tiny videos data sets improves classification precision in a wider range of categories. PMID:21252400
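
    The frame-sampling step described, affinity propagation as an exemplar-based clustering, can be sketched with scikit-learn; because the algorithm returns exemplars rather than synthetic centroids, the selected frames can be stored directly as key frames. The random feature vectors below are stand-ins for the paper's tiny-video descriptors.

        import numpy as np
        from sklearn.cluster import AffinityPropagation

        rng = np.random.default_rng(3)
        frames = rng.random((300, 64))        # stand-in: 300 frames, 64-d features

        ap = AffinityPropagation(damping=0.9, random_state=0).fit(frames)
        exemplar_idx = ap.cluster_centers_indices_
        key_frames = frames[exemplar_idx]
        print(f"kept {len(exemplar_idx)} exemplar frames out of {len(frames)}")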

  19. Classroom-Based Interventions and Teachers' Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings

    ERIC Educational Resources Information Center

    Zhai, Fuhua; Raver, C. Cybele; Li-Grining, Christine

    2011-01-01

    Preschool teachers' job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers' perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job…

  20. An Analogous Study of Children's Attitudes Toward School in an Open Classroom Environment as Opposed to a Conventional Setting.

    ERIC Educational Resources Information Center

    Zeli, Doris Conti

    A study sought to determine whether intermediate age children exposed to open classroom teaching strategy have a more positive attitude toward school than intermediate age children exposed to conventional teaching strategy. The hypothesis was that there would be no significant difference in attitude between the two groups. The study was limited to…

  1. Science in the Classroom: Finding a Balance between Autonomous Exploration and Teacher-Led Instruction in Preschool Settings

    ERIC Educational Resources Information Center

    Nayfeld, Irena; Brenneman, Kimberly; Gelman, Rochel

    2011-01-01

    Research Findings: This paper reports on children's use of science materials in preschool classrooms during their free choice time. Baseline observations showed that children and teachers rarely spend time in the designated science area. An intervention was designed to "market" the science center by introducing children to 1 science tool, the…

  2. Child and Setting Characteristics Affecting the Adult Talk Directed at Preschoolers with Autism Spectrum Disorder in the Inclusive Classroom

    ERIC Educational Resources Information Center

    Irvin, Dwight W.; Boyd, Brian A.; Odom, Samuel L.

    2015-01-01

    Difficulty with social competence is a core deficit of autism spectrum disorder. Research on typically developing children and children with disabilities, in general, suggests the adult talk received in the classroom is related to their social development. The aims of this study were to examine (1) the types and amounts of adult talk children with…

  3. Initial Validation of the Prekindergarten Classroom Observation Tool and Goal Setting System for Data-Based Coaching

    ERIC Educational Resources Information Center

    Crawford, April D.; Zucker, Tricia A.; Williams, Jeffrey M.; Bhavsar, Vibhuti; Landry, Susan H.

    2013-01-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based…

  4. Analogies as Tools for Meaning Making in Elementary Science Education: How Do They Work in Classroom Settings?

    ERIC Educational Resources Information Center

    Guerra-Ramos, Maria Teresa

    2011-01-01

    In this paper there is a critical overview of the role of analogies as tools for meaning making in science education, their advantages and disadvantages. Two empirical studies on the use of analogies in primary classrooms are discussed and analysed. In the first study, the "string circuit" analogy was used in the teaching of electric circuits with…

  5. Multimodal Literacy Practices in the Indigenous Sámi Classroom: Children Navigating in a Complex Multilingual Setting

    ERIC Educational Resources Information Center

    Pietikäinen, Sari; Pitkänen-Huhta, Anne

    2013-01-01

    This article explores multimodal literacy practices in a transforming multilingual context of an indigenous and endangered Sámi language classroom. Looking at literacy practices as embedded in a complex and shifting terrain of language ideologies, language norms, and individual experiences and attitudes, we examined how multilingual Sámi children…

  6. Coresets vs clustering: comparison of methods for redundancy reduction in very large white matter fiber sets

    NASA Astrophysics Data System (ADS)

    Alexandroni, Guy; Zimmerman Moreno, Gali; Sochen, Nir; Greenspan, Hayit

    2016-03-01

    Recent advances in Diffusion Weighted Magnetic Resonance Imaging (DW-MRI) of white matter in conjunction with improved tractography produce impressive reconstructions of White Matter (WM) pathways. These pathways (fiber sets) often contain hundreds of thousands of fibers, or more. In order to make fiber-based analysis more practical, the fiber set needs to be preprocessed to eliminate redundancies and to keep only essential representative fibers. In this paper we demonstrate and compare two distinct frameworks for selecting this reduced set of fibers. The first framework entails pre-clustering the fibers using k-means, followed by Hierarchical Clustering and replacing each cluster with one representative. For the second clustering stage, seven distance metrics were evaluated. The second framework is based on an efficient geometric approximation paradigm named coresets. Coresets present a new approach to optimization and have had huge success, especially in tasks requiring large computation time and/or memory. We propose a modified version of the coresets algorithm, Density Coreset. It is used for extracting the main fibers from dense datasets, leaving a small set that represents the main structures and connectivity of the brain. A novel approach, based on a 3D indicator structure, is used for comparing the frameworks. This comparison was applied to High Angular Resolution Diffusion Imaging (HARDI) scans of 4 healthy individuals. We show that, among the clustering-based methods, cosine distance gives the best performance. In comparing the clustering schemes with coresets, the Density Coreset method achieves the best performance.
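
    The first framework, k-means pre-clustering followed by hierarchical clustering with one representative kept per cluster, can be sketched as follows for generic feature vectors. Real fiber sets would first need a fixed-length fiber representation, and the cosine metric is used here only because the abstract reports it performed best; everything else is an illustrative assumption.

        import numpy as np
        from sklearn.cluster import KMeans
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(4)
        fibers = rng.random((5000, 30))       # stand-in for resampled fiber features

        # Stage 1: coarse k-means to cut the problem size.
        km = KMeans(n_clusters=100, n_init=10, random_state=0).fit(fibers)
        centers = km.cluster_centers_

        # Stage 2: hierarchical clustering of the k-means centres (cosine distance),
        # then keep the fiber closest to each final cluster's mean as representative.
        Z = linkage(pdist(centers, metric="cosine"), method="average")
        labels = fcluster(Z, t=20, criterion="maxclust")   # 20 final clusters

        representatives = []
        for c in np.unique(labels):
            mean = centers[labels == c].mean(axis=0)
            representatives.append(int(np.argmin(np.linalg.norm(fibers - mean, axis=1))))
        print("reduced", len(fibers), "fibers to", len(representatives), "representatives")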

  7. Web-Queryable Large-Scale Data Sets for Hypothesis Generation in Plant Biology

    PubMed Central

    Brady, Siobhan M.; Provart, Nicholas J.

    2009-01-01

    The approaching end of the 21st century's first decade marks an exciting time for plant biology. Several National Science Foundation Arabidopsis 2010 Projects will conclude, and whether or not the stated goal of the National Science Foundation 2010 Program—to determine the function of 25,000 Arabidopsis genes by 2010—is reached, these projects and others in a similar vein, such as those performed by the AtGenExpress Consortium and various plant genome sequencing initiatives, have generated important and unprecedented large-scale data sets. While providing significant biological insights for the individual laboratories that generated them, these data sets, in conjunction with the appropriate tools, are also permitting plant biologists worldwide to gain new insights into their own biological systems of interest, often at a mouse click through a Web browser. This review provides an overview of several such genomic, epigenomic, transcriptomic, proteomic, and metabolomic data sets and describes Web-based tools for querying them in the context of hypothesis generation for plant biology. We provide five biological examples of how such tools and data sets have been used to provide biological insight. PMID:19401381

  8. Heuristic method for searches on large data-sets organised using network models

    NASA Astrophysics Data System (ADS)

    Ruiz-Fernández, D.; Quintana-Pacheco, Y.

    2016-05-01

    Searches on large data-sets have become an important issue in recent years. An alternative, which has achieved good results, is the use of methods relying on data mining techniques, such as cluster-based retrieval. This paper proposes a heuristic search that is based on an organisational model that reflects similarity relationships among data elements. The search is guided by using quality estimators of model nodes, which are obtained by the progressive evaluation of the given target function for the elements associated with each node. The results of the experiments confirm the effectiveness of the proposed algorithm. High-quality solutions are obtained evaluating a relatively small percentage of elements in the data-sets.

  9. Distributed Computation of the knn Graph for Large High-Dimensional Point Sets.

    PubMed

    Plaku, Erion; Kavraki, Lydia E

    2007-03-01

    High-dimensional problems arising from robot motion planning, biology, data mining, and geographic information systems often require the computation of k nearest neighbor (knn) graphs. The knn graph of a data set is obtained by connecting each point to its k closest points. As the research in the above-mentioned fields progressively addresses problems of unprecedented complexity, the demand for computing knn graphs based on arbitrary distance metrics and large high-dimensional data sets increases, exceeding resources available to a single machine. In this work we efficiently distribute the computation of knn graphs for clusters of processors with message passing. Extensions to our distributed framework include the computation of graphs based on other proximity queries, such as approximate knn or range queries. Our experiments show nearly linear speedup with over one hundred processors and indicate that similar speedup can be obtained with several hundred processors. PMID:19847318
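
    The quantity being distributed, a knn graph that connects each point to its k closest points under some metric, is easy to state in code. The sketch below partitions the query points across local worker processes with brute-force Euclidean distances; it is a small-scale stand-in for the message-passing decomposition, in which the point set itself would also be partitioned across machines.

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def knn_for_block(args):
            """k nearest neighbours (Euclidean) of one query block against all points."""
            block, points, k = args
            d = np.linalg.norm(block[:, None, :] - points[None, :, :], axis=2)
            # the nearest "neighbour" of a point is itself (distance 0), so skip column 0
            return np.argsort(d, axis=1)[:, 1:k + 1]

        def knn_graph(points, k=5, n_workers=4):
            blocks = np.array_split(points, n_workers)
            with ProcessPoolExecutor(max_workers=n_workers) as pool:
                parts = list(pool.map(knn_for_block, [(b, points, k) for b in blocks]))
            return np.vstack(parts)           # row i holds the k neighbours of point i

        if __name__ == "__main__":
            rng = np.random.default_rng(5)
            pts = rng.random((2000, 16))      # stand-in for a high-dimensional point set
            print(knn_graph(pts, k=5).shape)  # (2000, 5)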

  10. Distributed Computation of the knn Graph for Large High-Dimensional Point Sets

    PubMed Central

    Plaku, Erion; Kavraki, Lydia E.

    2009-01-01

    High-dimensional problems arising from robot motion planning, biology, data mining, and geographic information systems often require the computation of k nearest neighbor (knn) graphs. The knn graph of a data set is obtained by connecting each point to its k closest points. As the research in the above-mentioned fields progressively addresses problems of unprecedented complexity, the demand for computing knn graphs based on arbitrary distance metrics and large high-dimensional data sets increases, exceeding resources available to a single machine. In this work we efficiently distribute the computation of knn graphs for clusters of processors with message passing. Extensions to our distributed framework include the computation of graphs based on other proximity queries, such as approximate knn or range queries. Our experiments show nearly linear speedup with over one hundred processors and indicate that similar speedup can be obtained with several hundred processors. PMID:19847318

  11. Envision: An interactive system for the management and visualization of large geophysical data sets

    NASA Technical Reports Server (NTRS)

    Searight, K. R.; Wojtowicz, D. P.; Walsh, J. E.; Pathi, S.; Bowman, K. P.; Wilhelmson, R. B.

    1995-01-01

    Envision is a software project at the University of Illinois and Texas A&M, funded by NASA's Applied Information Systems Research Project. It provides researchers in the geophysical sciences convenient ways to manage, browse, and visualize large observed or model data sets. Envision integrates data management, analysis, and visualization of geophysical data in an interactive environment. It employs commonly used standards in data formats, operating systems, networking, and graphics. It also attempts, wherever possible, to integrate with existing scientific visualization and analysis software. Envision has an easy-to-use graphical interface, distributed process components, and an extensible design. It is a public domain package, freely available to the scientific community.

  12. Multifractal analysis of the irregular set for almost-additive sequences via large deviations

    NASA Astrophysics Data System (ADS)

    Bomfim, Thiago; Varandas, Paulo

    2015-10-01

    In this paper we introduce a notion of free energy and large deviations rate function for asymptotically additive sequences of potentials via an approximation method by families of continuous potentials. We provide estimates for the topological pressure of the set of points whose non-additive sequences are far from the limit described through Kingman’s sub-additive ergodic theorem and give some applications in the context of Lyapunov exponents for diffeomorphisms and cocycles, and the Shannon-McMillan-Breiman theorem for Gibbs measures.

  13. Unusually large shear wave anisotropy for chlorite in subduction zone settings

    NASA Astrophysics Data System (ADS)

    Mookherjee, Mainak; Mainprice, David

    2014-03-01

    Using first-principles simulations we calculated the elasticity of chlorite. At a density ρ ~ 2.60 g cm-3, the elastic constant tensor reveals significant elastic anisotropy: VP ~27%, VS1 ~56%, and VS2 ~43%. The shear anisotropy is exceptionally large for chlorite and increases upon compression. Upon compression, the shear elastic constant components C44 and C55 decrease, whereas the C66 shear component stiffens. The softening in C44 and C55 is reflected in the shear modulus, G, and the shear wave velocity, VS. Our results on elastic anisotropy at conditions relevant to the mantle wedge indicate that a 10-20 km layer of hydrated peridotite with serpentine and chlorite could account for the observed shear polarization anisotropy and associated large delay times of 1-2 s observed in some subduction zone settings. In addition, chlorite could also explain the low VP/VS ratios that have been observed in recent high-resolution seismological studies.

  14. A practical, bioinformatic workflow system for large data sets generated by next-generation sequencing

    PubMed Central

    Cantacessi, Cinzia; Jex, Aaron R.; Hall, Ross S.; Young, Neil D.; Campbell, Bronwyn E.; Joachim, Anja; Nolan, Matthew J.; Abubucker, Sahar; Sternberg, Paul W.; Ranganathan, Shoba; Mitreva, Makedonka; Gasser, Robin B.

    2010-01-01

    Transcriptomics (at the level of single cells, tissues and/or whole organisms) underpins many fields of biomedical science, from understanding the basic cellular function in model organisms, to the elucidation of the biological events that govern the development and progression of human diseases, and the exploration of the mechanisms of survival, drug-resistance and virulence of pathogens. Next-generation sequencing (NGS) technologies are contributing to a massive expansion of transcriptomics in all fields and are reducing the cost, time and performance barriers presented by conventional approaches. However, bioinformatic tools for the analysis of the sequence data sets produced by these technologies can be daunting to researchers with limited or no expertise in bioinformatics. Here, we constructed a semi-automated, bioinformatic workflow system, and critically evaluated it for the analysis and annotation of large-scale sequence data sets generated by NGS. We demonstrated its utility for the exploration of differences in the transcriptomes among various stages and both sexes of an economically important parasitic worm (Oesophagostomum dentatum) as well as the prediction and prioritization of essential molecules (including GTPases, protein kinases and phosphatases) as novel drug target candidates. This workflow system provides a practical tool for the assembly, annotation and analysis of NGS data sets, including for researchers with limited bioinformatics expertise. The custom-written Perl, Python and Unix shell computer scripts used can be readily modified or adapted to suit many different applications. This system is now utilized routinely for the analysis of data sets from pathogens of major socio-economic importance and can, in principle, be applied to transcriptomics data sets from any organism. PMID:20682560

  15. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    SciTech Connect

    Patchett, John M; Ahrens, James P; Lo, Li-Ta; Brownlee, Carson S; Mitchell, Christopher J; Hansen, Chuck

    2010-10-15

    Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization, and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.

  16. A flexible method for estimating the fraction of fitness influencing mutations from large sequencing data sets.

    PubMed

    Moon, Sunjin; Akey, Joshua M

    2016-06-01

    A continuing challenge in the analysis of massively large sequencing data sets is quantifying and interpreting non-neutrally evolving mutations. Here, we describe a flexible and robust approach based on the site frequency spectrum to estimate the fraction of deleterious and adaptive variants from large-scale sequencing data sets. We applied our method to approximately 1 million single nucleotide variants (SNVs) identified in high-coverage exome sequences of 6515 individuals. We estimate that the fraction of deleterious nonsynonymous SNVs is higher than previously reported; quantify the effects of genomic context, codon bias, chromatin accessibility, and number of protein-protein interactions on deleterious protein-coding SNVs; and identify pathways and networks that have likely been influenced by positive selection. Furthermore, we show that the fraction of deleterious nonsynonymous SNVs is significantly higher for Mendelian versus complex disease loci and in exons harboring dominant versus recessive Mendelian mutations. In summary, as genome-scale sequencing data accumulate in progressively larger sample sizes, our method will enable increasingly high-resolution inferences into the characteristics and determinants of non-neutral variation. PMID:27197222
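
    The site frequency spectrum that the method is "based on" is simply the count of variant sites at each derived-allele frequency. A sketch of computing it from a 0/1/2 genotype matrix is shown below with simulated data; it is only the input summary, not the authors' estimator of the deleterious and adaptive fractions.

        import numpy as np

        def site_frequency_spectrum(genotypes):
            """genotypes: (individuals x sites) matrix of 0/1/2 derived-allele counts.
            Returns the number of sites at each derived-allele count 1 .. 2n-1."""
            n_chrom = 2 * genotypes.shape[0]
            derived = genotypes.sum(axis=0).astype(int)
            seg = derived[(derived > 0) & (derived < n_chrom)]   # segregating sites only
            return np.bincount(seg, minlength=n_chrom)[1:n_chrom]

        rng = np.random.default_rng(6)
        freqs = rng.beta(0.2, 2.0, size=10_000)      # skewed toward rare alleles
        geno = rng.binomial(2, freqs, size=(100, 10_000))
        sfs = site_frequency_spectrum(geno)
        print("singletons:", int(sfs[0]), "doubletons:", int(sfs[1]))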

  17. Litho-kinematic facies model for large landslide deposits in arid settings

    SciTech Connect

    Yarnold, J.C.; Lombard, J.P.

    1989-04-01

    Reconnaissance field studies of six large landslide deposits in the S. Basin and Range suggest that a set of characteristic features is common to the deposits of large landslides in an arid setting. These include a coarse boulder cap, an upper massive zone, a lower disrupted zone, and a mixed zone overlying disturbed substrate. The upper massive zone is dominated by crackle breccia. This grades downward into a lower disrupted zone composed of a more matrix-rich breccia that is internally sheared, intruded by clastic dikes, and often contains a cataclasite layer at its base. An underlying discontinuous mixed zone is composed of material from the overlying breccia mixed with material entrained from the underlying substrate. Bedding in the substrate sometimes displays folding and contortion that die out downward. The authors' work suggests a spatial zonation of these characteristic features within many landslide deposits. In general, clastic dikes, the basal cataclasite, and folding in the substrate are observed mainly in distal parts of landslides. In most cases, total thickness, thickness of the basal disturbed and mixed zones, and the degree of internal shearing increase distally, whereas maximum clast size commonly decreases distally. Zonation of these features is interpreted to result from kinematics of emplacement that cause generally increased deformation in the distal regions of the landslide.

  18. Using an ensemble of statistical metrics to quantify large sets of plant transcription factor binding sites

    PubMed Central

    2013-01-01

    Background From initial seed germination through reproduction, plants continuously reprogram their transcriptional repertoire to facilitate growth and development. This dynamic is mediated by a diverse but inextricably-linked catalog of regulatory proteins called transcription factors (TFs). Statistically quantifying TF binding site (TFBS) abundance in promoters of differentially expressed genes can be used to identify binding site patterns in promoters that are closely related to stress-response. Output from today’s transcriptomic assays necessitates statistically-oriented software to handle large promoter-sequence sets in a computationally tractable fashion. Results We present Marina, an open-source software for identifying over-represented TFBSs from amongst large sets of promoter sequences, using an ensemble of 7 statistical metrics and binding-site profiles. Through software comparison, we show that Marina can identify considerably more over-represented plant TFBSs compared to a popular software alternative. Conclusions Marina was used to identify over-represented TFBSs in a two time-point RNA-Seq study exploring the transcriptomic interplay between soybean (Glycine max) and soybean rust (Phakopsora pachyrhizi). Marina identified numerous abundant TFBSs recognized by transcription factors that are associated with defense-response such as WRKY, HY5 and MYB2. Comparing results from Marina to that of a popular software alternative suggests that regardless of the number of promoter-sequences, Marina is able to identify significantly more over-represented TFBSs. PMID:23578135
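
    The seven statistical metrics in Marina's ensemble are not listed in this record; a representative example of the general idea, scoring whether a TFBS occurs in a promoter set more often than expected by chance, is a hypergeometric enrichment test against a background promoter set, sketched below with made-up counts.

        from scipy.stats import hypergeom

        def motif_enrichment_pvalue(hits_target, n_target, hits_background, n_background):
            """P(X >= hits_target) when n_target promoters are drawn from a pool of
            n_background promoters, hits_background of which contain the motif."""
            return hypergeom.sf(hits_target - 1, n_background, hits_background, n_target)

        # Hypothetical counts: a WRKY-like motif in 40 of 300 differentially expressed
        # promoters versus 250 of 20,000 background promoters.
        p = motif_enrichment_pvalue(40, 300, 250, 20_000)
        print(f"enrichment p-value: {p:.3g}")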

  19. A Technique for Moving Large Data Sets over High-Performance Long Distance Networks

    SciTech Connect

    Settlemyer, Bradley W; Dobson, Jonathan D; Hodson, Stephen W; Kuehn, Jeffery A; Poole, Stephen W; Ruwart, Thomas

    2011-01-01

    In this paper we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing the data to a remote destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes.

  20. Contextual settings, science stories, and large context problems: Toward a more humanistic science education

    NASA Astrophysics Data System (ADS)

    Stinner, Arthur

    This article addresses the need for and the problem of organizing a science curriculum around contextual settings and science stories that serve to involve and motivate students to develop an understanding of the world that is rooted in the scientific and the humanistic traditions. A program of activities placed around contextual settings, science stories, and contemporary issues of interest is recommended in an attempt to move toward a slow and secure abolition of the gulf between scientific knowledge and common sense beliefs. A conceptual development model is described to guide the connection between theory and evidence on a level appropriate for children, from early years to senior years. For the senior years it is also important to connect the activity of teaching to a sound theoretical structure. The theoretical structure must illuminate the status of theory in science, establish what counts as evidence, clarify the relationship between experiment and explanation, and make connections to the history of science. The article concludes with a proposed program of activities in terms of a sequence of theoretical and empirical experiences that involve contextual settings, science stories, large context problems, thematic teaching, and popular science literature teaching.

  1. Developing consistent Landsat data sets for large area applications: the MRLC 2001 protocol

    USGS Publications Warehouse

    Chander, G.; Huang, C.; Yang, L.; Homer, C.; Larson, C.

    2009-01-01

    One of the major efforts in large area land cover mapping over the last two decades was the completion of two U.S. National Land Cover Data sets (NLCD), developed with nominal 1992 and 2001 Landsat imagery under the auspices of the MultiResolution Land Characteristics (MRLC) Consortium. Following the successful generation of NLCD 1992, a second generation MRLC initiative was launched with two primary goals: (1) to develop a consistent Landsat imagery data set for the U.S. and (2) to develop a second generation National Land Cover Database (NLCD 2001). One of the key enhancements was the formulation of an image preprocessing protocol and implementation of a consistent image processing method. The core data set of the NLCD 2001 database consists of Landsat 7 Enhanced Thematic Mapper Plus (ETM+) images. This letter details the procedures for processing the original ETM+ images and more recent scenes added to the database. NLCD 2001 products include Anderson Level II land cover classes, percent tree canopy, and percent urban imperviousness at 30-m resolution derived from Landsat imagery. The products are freely available for download to the general public from the MRLC Consortium Web site at http://www.mrlc.gov.

  2. Hierarchical Unbiased Graph Shrinkage (HUGS): A Novel Groupwise Registration for Large Data Set

    PubMed Central

    Ying, Shihui; Wu, Guorong; Wang, Qian; Shen, Dinggang

    2014-01-01

    Normalizing all images in a large data set into a common space is a key step in many clinical and research studies, e.g., for brain development, maturation, and aging. Recently, groupwise registration has been developed for simultaneous alignment of all images without selecting a particular image as template, thus potentially avoiding bias in the registration. However, most conventional groupwise registration methods do not explore the data distribution during the image registration. Thus, their performance could be affected by large inter-subject variations in the data set under registration. To solve this potential issue, we propose to use a graph to model the distribution of all image data sitting on the image manifold, with each node representing an image and each edge representing the geodesic pathway between two nodes (or images). Then, the procedure of warping all images to their population center turns to the dynamic shrinking of the graph nodes along their graph edges until all graph nodes become close to each other. Thus, the topology of image distribution on the image manifold is always preserved during the groupwise registration. More importantly, by modeling the distribution of all images via a graph, we can potentially reduce registration error since every time each image is warped only according to its nearby images with similar structures in the graph. We have evaluated our proposed groupwise registration method on both infant and adult data sets, by also comparing with the conventional group-mean based registration and the ABSORB methods. All experimental results show that our proposed method can achieve better performance in terms of registration accuracy and robustness. PMID:24055505

  3. Classroom-based Interventions and Teachers’ Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings

    PubMed Central

    Zhai, Fuhua; Raver, C. Cybele; Li-Grining, Christine

    2011-01-01

    Preschool teachers’ job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers’ perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job demands, and confidence in behavior management. Using a clustered randomized controlled trial (RCT) design, the CSRP provided multifaceted services to the treatment group, including teacher training and mental health consultation, which were accompanied by stress-reduction services and workshops. Overall, 90 teachers in 35 classrooms at 18 Head Start sites participated in the study. After adjusting for teacher and classroom factors and site fixed effects, we found that the CSRP had significant effects on the improvement of teachers’ perceived job control and work-related resources. We also found that the CSRP decreased teachers’ confidence in behavior management and had no statistically significant effects on job demands. Overall, we did not find significant moderation effects of teacher race/ethnicity, education, teaching experience, or teacher type. The implications for research and policy are discussed. PMID:21927538

  4. Child and setting characteristics affecting the adult talk directed at preschoolers with autism spectrum disorder in the inclusive classroom.

    PubMed

    Irvin, Dwight W; Boyd, Brian A; Odom, Samuel L

    2015-02-01

    Difficulty with social competence is a core deficit of autism spectrum disorder. Research on typically developing children and children with disabilities, in general, suggests the adult talk received in the classroom is related to their social development. The aims of this study were to examine (1) the types and amounts of adult talk children with autism spectrum disorder are exposed to in the preschool classroom and (2) the associations between child characteristics (e.g. language), activity area, and adult talk. Kontos' Teacher Talk classification was used to code videos approximately 30 min in length of 73 children with autism spectrum disorder (ages 3-5) in inclusive classrooms (n = 33) during center time. The results indicated practical/personal assistance was the most common type of adult talk coded, and behavior management talk least often coded. Child characteristics (i.e. age and autism severity) and activity area were found to be related to specific types of adult talk. Given the findings, implications for future research are discussed. PMID:24463432

  5. Contamination in the MACHO data set and the puzzle of Large Magellanic Cloud microlensing

    NASA Astrophysics Data System (ADS)

    Griest, Kim; Thomas, Christian L.

    2005-05-01

    In a recent series of three papers, Belokurov, Evans & Le Du and Evans & Belokurov reanalysed the MACHO collaboration data and gave alternative sets of microlensing events and an alternative optical depth to microlensing towards the Large Magellanic Cloud (LMC). Although these authors examined less than 0.2 per cent of the data, they reported that by using a neural net program they had reliably selected a better (and smaller) set of microlensing candidates. Estimating the optical depth from this smaller set, they claimed that the MACHO collaboration overestimated the optical depth by a significant factor and that the MACHO microlensing experiment is consistent with lensing by known stars in the Milky Way and LMC. As we show below, the analysis by these authors contains several errors, and as a result their conclusions are incorrect. Their efficiency analysis is in error, and since they did not search through the entire MACHO data set, they do not know how many microlensing events their neural net would find in the data nor what optical depth their method would give. Examination of their selected events suggests that their method misses low signal-to-noise ratio events and thus would have lower efficiency than the MACHO selection criteria. In addition, their method is likely to give many more false positives (non-lensing events identified as lensing). Both effects would increase their estimated optical depth. Finally, we note that the EROS discovery that LMC event 23 is a variable star reduces the MACHO collaboration estimates of optical depth and the Macho halo fraction by around 8 per cent, and does open the question of additional contamination.

  6. The Large Synoptic Survey Telescope and Foundations for Data Exploitation of Petabyte Data Sets

    SciTech Connect

    Cook, K H; Nikolaev, S; Huber, M E

    2007-02-26

    The next generation of imaging surveys in astronomy, such as the Large Synoptic Survey Telescope (LSST), will require multigigapixel cameras that can process enormous amounts of data read out every few seconds. This huge increase in data throughput (compared to megapixel cameras and minute- to hour-long integrations of today's instruments) calls for a new paradigm for extracting the knowledge content. We have developed foundations for this new approach. In this project, we have studied the necessary processes for extracting information from large time-domain databases. In the process, we have produced significant scientific breakthroughs by developing new methods to probe both the elusive time and spatial variations in astrophysics data sets from the SuperMACHO (Massive Compact Halo Objects) survey, the Lowell Observatory Near-Earth Object Search (LONEOS), and the Taiwanese American Occultation Survey (TAOS). This project continues to contribute to the development of the scientific foundations for future wide-field, time-domain surveys. Our algorithm and pipeline development has provided the building blocks for the development of the LSST science software system. Our database design and performance measures have helped to size and constrain LSST database design. LLNL made significant contributions to the foundations of the LSST, which has applications for large-scale imaging and data-mining activities at LLNL. These developments are being actively applied to the previously mentioned surveys, producing important scientific results that have been released to the scientific community, and more continue to be published and referenced, enhancing LLNL's scientific stature.

  7. Twelve- to 14-Month-Old Infants Can Predict Single-Event Probability with Large Set Sizes

    ERIC Educational Resources Information Center

    Denison, Stephanie; Xu, Fei

    2010-01-01

    Previous research has revealed that infants can reason correctly about single-event probabilities with small but not large set sizes (Bonatti, 2008; Teglas et al., 2007). The current study asks whether infants can make predictions regarding single-event probability with large set sizes using a novel procedure. Infants completed two trials: A…

  8. Radiometric Normalization of Large Airborne Image Data Sets Acquired by Different Sensor Types

    NASA Astrophysics Data System (ADS)

    Gehrke, S.; Beshah, B. T.

    2016-06-01

    successfully applied to large sets of heterogeneous imagery, including the adjustment of original sensor images prior to quality control and further processing as well as radiometric adjustment for ortho-image mosaic generation.

  9. Generating extreme weather event sets from very large ensembles of regional climate models

    NASA Astrophysics Data System (ADS)

    Massey, Neil; Guillod, Benoit; Otto, Friederike; Allen, Myles; Jones, Richard; Hall, Jim

    2015-04-01

    Extreme events can have large impacts on societies and are therefore being increasingly studied. In particular, climate change is expected to impact the frequency and intensity of these events. However, a major limitation when investigating extreme weather events is that, by definition, only a few events are present in observations. A way to overcome this issue is to use large ensembles of model simulations. Using the volunteer distributed computing (VDC) infrastructure of weather@home [1], we run a very large number (10'000s) of RCM simulations over the European domain at a resolution of 25km, with an improved land-surface scheme, nested within a free-running GCM. Using VDC allows many thousands of climate model runs to be computed. Using observations for the GCM boundary forcings we can run historical "hindcast" simulations over the past 100 to 150 years. This allows us, due to the chaotic variability of the atmosphere, to ascertain how likely an extreme event was, given the boundary forcings, and to derive synthetic event sets. The events in these sets did not actually occur in the observed record but could have occurred given the boundary forcings, with an associated probability. The event sets contain time-series of fields of meteorological variables that allow impact modellers to assess the loss the event would incur. Projections of events into the future are achieved by modelling projections of the sea-surface temperature (SST) and sea-ice boundary forcings, by combining the variability of the SST in the observed record with a range of warming signals derived from the varying responses of SSTs in the CMIP5 ensemble to elevated greenhouse gas (GHG) emissions in three RCP scenarios. Simulating the future with a

  10. Perl One-Liners: Bridging the Gap Between Large Data Sets and Analysis Tools.

    PubMed

    Hokamp, Karsten

    2015-01-01

    Computational analyses of biological data are becoming increasingly powerful, and researchers intending on carrying out their own analyses can often choose from a wide array of tools and resources. However, their application might be obstructed by the wide variety of different data formats that are in use, from standard, commonly used formats to output files from high-throughput analysis platforms. The latter are often too large to be opened, viewed, or edited by standard programs, potentially leading to a bottleneck in the analysis. Perl one-liners provide a simple solution to quickly reformat, filter, and merge data sets in preparation for downstream analyses. This chapter presents example code that can be easily adjusted to meet individual requirements. An online version is available at http://bioinf.gen.tcd.ie/pol. PMID:26498621

  11. Calculations of safe collimator settings and β* at the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Bruce, R.; Assmann, R. W.; Redaelli, S.

    2015-06-01

    The first run of the Large Hadron Collider (LHC) at CERN was very successful and resulted in important physics discoveries. One way of increasing the luminosity in a collider, which gave a very significant contribution to the LHC performance in the first run and can be used even if the beam intensity cannot be increased, is to decrease the transverse beam size at the interaction points by reducing the optical function β*. However, when doing so, the beam becomes larger in the final focusing system, which could expose its aperture to beam losses. For the LHC, which is designed to store beams with a total energy of 362 MJ, this is critical, since the loss of even a small fraction of the beam could cause a magnet quench or even damage. Therefore, the machine aperture has to be protected by the collimation system. The settings of the collimators constrain the maximum beam size that can be tolerated and therefore impose a lower limit on β*. In this paper, we present calculations to determine safe collimator settings and the resulting limit on β*, based on available aperture and operational stability of the machine. Our model was used to determine the LHC configurations in 2011 and 2012 and it was found that β* could be decreased significantly compared to the conservative model used in 2010. The gain in luminosity resulting from the decreased margins between collimators was more than a factor 2, and a further contribution from the use of realistic aperture estimates based on measurements was almost as large. This has played an essential role in the rapid and successful accumulation of experimental data in the LHC.
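
    The trade-off described, a smaller β* shrinks the beam at the interaction point but enlarges it in the final-focus magnets where aperture and collimator margins matter, follows from the standard relation σ = sqrt(β ε), with the geometric emittance ε obtained from the normalized emittance divided by the relativistic γ. The numbers below are illustrative, not the settings derived in the paper.

        import math

        def beam_sigma(beta_m, norm_emittance_m, energy_gev, m_p_gev=0.938272):
            """RMS beam size sigma = sqrt(beta * eps_N / gamma_rel), assuming an
            ultra-relativistic proton beam (beta_rel ~ 1)."""
            gamma_rel = energy_gev / m_p_gev
            return math.sqrt(beta_m * norm_emittance_m / gamma_rel)

        # Illustrative 2012-like parameters: 4 TeV protons, 3.75 um normalized emittance.
        for beta_star in (1.0, 0.6, 0.4):                  # metres
            sigma_ip = beam_sigma(beta_star, 3.75e-6, 4000.0)
            print(f"beta* = {beta_star:3.1f} m  ->  sigma at the IP = {sigma_ip * 1e6:5.1f} um")
        # The beta function in the final-focus quadrupoles grows roughly as 1/beta*,
        # so the same squeeze that shrinks the beam at the IP inflates it there.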

  12. Motif-based analysis of large nucleotide data sets using MEME-ChIP

    PubMed Central

    Ma, Wenxiu; Noble, William S; Bailey, Timothy L

    2014-01-01

    MEME-ChIP is a web-based tool for analyzing motifs in large DNA or RNA data sets. It can analyze peak regions identified by ChIP-seq, cross-linking sites identified by CLIP-seq and related assays, as well as sets of genomic regions selected using other criteria. MEME-ChIP performs de novo motif discovery, motif enrichment analysis, motif location analysis and motif clustering, providing a comprehensive picture of the DNA or RNA motifs that are enriched in the input sequences. MEME-ChIP performs two complementary types of de novo motif discovery: weight matrix–based discovery for high accuracy and word-based discovery for high sensitivity. Motif enrichment analysis using DNA or RNA motifs from human, mouse, worm, fly and other model organisms provides even greater sensitivity. MEME-ChIP’s interactive HTML output groups and aligns significant motifs to ease interpretation. This protocol takes less than 3 h, and it provides motif discovery approaches that are distinct and complementary to other online methods. PMID:24853928

  13. Ghost transmission: How large basis sets can make electron transport calculations worse

    SciTech Connect

    Herrmann, Carmen; Solomon, Gemma C.; Subotnik, Joseph E.; Mujica, Vladimiro; Ratner, Mark A.

    2010-01-01

    The Landauer approach has proven to be an invaluable tool for calculating the electron transport properties of single molecules, especially when combined with a nonequilibrium Green’s function approach and Kohn–Sham density functional theory. However, when using large nonorthogonal atom-centered basis sets, such as those common in quantum chemistry, one can find erroneous results if the Landauer approach is applied blindly. In fact, basis sets of triple-zeta quality or higher sometimes result in an artificially high transmission and possibly even qualitatively wrong conclusions regarding chemical trends. In these cases, transport persists when molecular atoms are replaced by basis functions alone (“ghost atoms”). The occurrence of such ghost transmission is correlated with low-energy virtual molecular orbitals of the central subsystem and may be interpreted as a biased and thus inaccurate description of vacuum transmission. An approximate practical correction scheme is to calculate the ghost transmission and subtract it from the full transmission. As a further consequence of this study, it is recommended that sensitive molecules be used for parameter studies, in particular those whose transmission functions show antiresonance features such as benzene-based systems connected to the electrodes in meta positions and other low-conducting systems such as alkanes and silanes.
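
    The correction scheme proposed at the end, computing the transmission of the ghost-atom system alone and subtracting it from the full transmission, reduces to a pointwise subtraction over the energy grid once both curves are in hand. The arrays below are placeholders; in practice both transmissions would come from separate Landauer/NEGF calculations.

        import numpy as np

        energies = np.linspace(-2.0, 2.0, 401)                 # eV, illustrative grid
        T_full = 10.0 ** (-4 + 2 * np.exp(-energies ** 2))     # placeholder data
        T_ghost = np.full_like(energies, 1e-4)                 # placeholder data

        T_corrected = np.clip(T_full - T_ghost, 0.0, None)     # keep T(E) >= 0
        print("largest correction applied:", float((T_full - T_corrected).max()))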

  14. Science Teachers' Decision-Making in Abstinence-Only-Until-Marriage (AOUM) Classrooms: Taboo Subjects and Discourses of Sex and Sexuality in Classroom Settings

    ERIC Educational Resources Information Center

    Gill, Puneet Singh

    2015-01-01

    Sex education, especially in the southeastern USA, remains steeped in an Abstinence-Only-Until-Marriage (AOUM) approach, which sets up barriers to the education of sexually active students. Research confirms that science education has the potential to facilitate discussion of controversial topics, including sex education. Science teachers in the…

  15. A multivariate approach to filling gaps in large ecological data sets using probabilistic matrix factorization techniques

    NASA Astrophysics Data System (ADS)

    Schrodt, F. I.; Shan, H.; Kattge, J.; Reich, P.; Banerjee, A.; Reichstein, M.

    2012-12-01

    With the advent of remotely sensed data and coordinated efforts to create global databases, the ecological community has progressively become more data-intensive. However, in contrast to other disciplines, statistical ways of handling these large data sets, especially the gaps which are inherent to them, are lacking. Widely used theoretical approaches, for example model averaging based on Akaike's information criterion (AIC), are sensitive to missing values. Yet, the most common way of handling sparse matrices - the deletion of cases with missing data (complete case analysis) - is known to severely reduce statistical power as well as inducing biased parameter estimates. In order to address these issues, we present novel approaches to gap filling in large ecological data sets using matrix factorization techniques. Factorization based matrix completion was developed in a recommender system context and has since been widely used to impute missing data in fields outside the ecological community. Here, we evaluate the effectiveness of probabilistic matrix factorization techniques for imputing missing data in ecological matrices using two imputation techniques. Hierarchical Probabilistic Matrix Factorization (HPMF) effectively incorporates hierarchical phylogenetic information (phylogenetic group, family, genus, species and individual plant) into the trait imputation. Kernelized Probabilistic Matrix Factorization (KPMF) on the other hand includes environmental information (climate and soils) into the matrix factorization through kernel matrices over rows and columns. We test the accuracy and effectiveness of HPMF and KPMF in filling sparse matrices, using the TRY database of plant functional traits (http://www.try-db.org). TRY is one of the largest global compilations of plant trait databases (750 traits of 1 million plants), encompassing data on morphological, anatomical, biochemical, phenological and physiological features of plants. However, despite of unprecedented
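
    The core of matrix-factorization imputation, fitting low-rank factors to the observed entries only and reconstructing the gaps from their product, can be sketched with plain stochastic gradient descent. This generic version deliberately omits the hierarchical phylogenetic (HPMF) and kernel (KPMF) extensions the study evaluates, and the trait matrix is simulated.

        import numpy as np

        def pmf_impute(X, rank=5, n_epochs=200, lr=0.01, reg=0.05, seed=0):
            """Fill NaNs in X by fitting U (rows x rank) and V (cols x rank) to the
            observed entries with SGD, then predicting the missing cells as U V'."""
            rng = np.random.default_rng(seed)
            rows, cols = np.where(~np.isnan(X))
            vals = X[rows, cols]
            U = 0.1 * rng.normal(size=(X.shape[0], rank))
            V = 0.1 * rng.normal(size=(X.shape[1], rank))
            for _ in range(n_epochs):
                for i, j, x in zip(rows, cols, vals):
                    err = x - U[i] @ V[j]
                    U[i] += lr * (err * V[j] - reg * U[i])
                    V[j] += lr * (err * U[i] - reg * V[j])
            filled = X.copy()
            missing = np.isnan(X)
            filled[missing] = (U @ V.T)[missing]
            return filled

        # Simulated trait matrix: 200 "species" x 12 "traits", 40% of entries missing.
        rng = np.random.default_rng(7)
        true = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 12))
        X = true.copy()
        X[rng.random(X.shape) < 0.4] = np.nan
        est = pmf_impute(X)
        rmse = float(np.sqrt(np.mean((est - true)[np.isnan(X)] ** 2)))
        print("RMSE on the imputed entries:", round(rmse, 3))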

  16. Listserv Lemmings and Fly-brarians on the Wall: A Librarian-Instructor Team Taming the Cyberbeast in the Large Classroom.

    ERIC Educational Resources Information Center

    Dickstein, Ruth; McBride, Kari Boyd

    1998-01-01

    Computer technology can empower students if they have the tools to find their way through print and online sources. This article describes how a reference librarian and a faculty instructor collaborated to teach research strategies and critical thinking skills (including analysis and evaluation of resources) in a large university classroom using a…

  17. An Evaluation of the Developmental Designs Approach and Professional Development Model on Classroom Management in 22 Middle Schools in a Large, Midwestern School District

    ERIC Educational Resources Information Center

    Hough, David L.

    2011-01-01

    This study presents findings from an evaluation of the Developmental Designs classroom management approach and professional development model during its first year of implementation across 22 middle schools in a large, Midwestern school district. The impact of this professional development model on teaching and learning as related to participants'…

  18. Any Questions? An Application of Weick's Model of Organizing to Increase Student Involvement in the Large-Lecture Classroom

    ERIC Educational Resources Information Center

    Ledford, Christy J. W.; Saperstein, Adam K.; Cafferty, Lauren A.; McClintick, Stacey H.; Bernstein, Ethan M.

    2015-01-01

    Microblogs, with their interactive nature, can engage students in community building and sensemaking. Using Weick's model of organizing as a framework, we integrated the use of micromessaging to increase student engagement in the large-lecture classroom. Students asked significantly more questions and asked a greater diversity of questions…

  19. Galaxy Evolution Insights from Spectral Modeling of Large Data Sets from the Sloan Digital Sky Survey

    SciTech Connect

    Hoversten, Erik A.

    2007-10-01

    This thesis centers on the use of spectral modeling techniques on data from the Sloan Digital Sky Survey (SDSS) to gain new insights into current questions in galaxy evolution. The SDSS provides a large, uniform, high quality data set which can be exploited in a number of ways. One avenue pursued here is to use the large sample size to measure precisely the mean properties of galaxies in increasingly narrow parameter ranges. The other route taken is to look for rare objects which open up new areas of galaxy parameter space for exploration. The crux of this thesis is revisiting the classical Kennicutt method for inferring the stellar initial mass function (IMF) from the integrated light properties of galaxies. A large data set (~10^5 galaxies) from the SDSS DR4 is combined with more in-depth modeling and quantitative statistical analysis to search for systematic IMF variations as a function of galaxy luminosity. Galaxy Hα equivalent widths are compared to a broadband color index to constrain the IMF. It is found that for the sample as a whole the best-fitting IMF power-law slope above 0.5 M⊙ is Γ = 1.5 ± 0.1 with the error dominated by systematics. Galaxies brighter than around M_r,0.1 = -20 (including galaxies like the Milky Way, which has M_r,0.1 ~ -21) are well fit by a universal Γ ~ 1.4 IMF, similar to the classical Salpeter slope, and smooth, exponential star formation histories (SFH). Fainter galaxies prefer steeper IMFs, and the quality of the fits reveals that for these galaxies a universal IMF with smooth SFHs is actually a poor assumption. Related projects are also pursued. A targeted photometric search is conducted for strongly lensed Lyman break galaxies (LBG) similar to MS1512-cB58. The evolution of the photometric selection technique is described, as are the results of spectroscopic follow-up of the best targets. The serendipitous discovery of two interesting blue compact dwarf galaxies is reported. These

  20. PORTAAL: A Classroom Observation Tool Assessing Evidence-Based Teaching Practices for Active Learning in Large Science, Technology, Engineering, and Mathematics Classes.

    PubMed

    Eddy, Sarah L; Converse, Mercedes; Wenderoth, Mary Pat

    2015-01-01

    There is extensive evidence that active learning works better than a completely passive lecture. Despite this evidence, adoption of these evidence-based teaching practices remains low. In this paper, we offer one tool to help faculty members implement active learning. This tool identifies 21 readily implemented elements that have been shown to increase student outcomes related to achievement, logic development, or other relevant learning goals with college-age students. Thus, this tool both clarifies the research-supported elements of best practices for instructor implementation of active learning in the classroom setting and measures instructors' alignment with these practices. We describe how we reviewed the discipline-based education research literature to identify best practices in active learning for adult learners in the classroom and used these results to develop an observation tool (Practical Observation Rubric To Assess Active Learning, or PORTAAL) that documents the extent to which instructors incorporate these practices into their classrooms. We then use PORTAAL to explore the classroom practices of 25 introductory biology instructors who employ some form of active learning. Overall, PORTAAL documents how well aligned classrooms are with research-supported best practices for active learning and provides specific feedback and guidance to instructors to allow them to identify what they do well and what could be improved. PMID:26033871

  1. PORTAAL: A Classroom Observation Tool Assessing Evidence-Based Teaching Practices for Active Learning in Large Science, Technology, Engineering, and Mathematics Classes

    PubMed Central

    Eddy, Sarah L.; Converse, Mercedes; Wenderoth, Mary Pat

    2015-01-01

    There is extensive evidence that active learning works better than a completely passive lecture. Despite this evidence, adoption of these evidence-based teaching practices remains low. In this paper, we offer one tool to help faculty members implement active learning. This tool identifies 21 readily implemented elements that have been shown to increase student outcomes related to achievement, logic development, or other relevant learning goals with college-age students. Thus, this tool both clarifies the research-supported elements of best practices for instructor implementation of active learning in the classroom setting and measures instructors’ alignment with these practices. We describe how we reviewed the discipline-based education research literature to identify best practices in active learning for adult learners in the classroom and used these results to develop an observation tool (Practical Observation Rubric To Assess Active Learning, or PORTAAL) that documents the extent to which instructors incorporate these practices into their classrooms. We then use PORTAAL to explore the classroom practices of 25 introductory biology instructors who employ some form of active learning. Overall, PORTAAL documents how well aligned classrooms are with research-supported best practices for active learning and provides specific feedback and guidance to instructors to allow them to identify what they do well and what could be improved. PMID:26033871

  2. Efficient Implementation of an Optimal Interpolator for Large Spatial Data Sets

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; Mount, David M.

    2007-01-01

    Scattered data interpolation is a problem of interest in numerous areas such as electronic imaging, smooth surface modeling, and computational geometry. Our motivation arises from applications in geology and mining, which often involve large scattered data sets and a demand for high accuracy. The method of choice is ordinary kriging. This is because it is the best unbiased estimator. Unfortunately, this interpolant is computationally very expensive to compute exactly. For n scattered data points, computing the value of a single interpolant involves solving a dense linear system of size roughly n x n. This is infeasible for large n. In practice, kriging is solved approximately by local approaches that are based on considering only a relatively small number of points that lie close to the query point. There are many problems with this local approach, however. The first is that determining the proper neighborhood size is tricky, and is usually solved by ad hoc methods such as selecting a fixed number of nearest neighbors or all the points lying within a fixed radius. Such fixed neighborhood sizes may not work well for all query points, depending on the local density of the point distribution. Local methods also suffer from the problem that the resulting interpolant is not continuous. Meyer showed that while kriging produces smooth continuous surfaces, it has zero-order continuity along its borders. Thus, at interface boundaries where the neighborhood changes, the interpolant behaves discontinuously. Therefore, it is important to consider and solve the global system for each interpolant. However, solving such large dense systems for each query point is impractical. Recently a more principled approach to approximating kriging has been proposed based on a technique called covariance tapering. The problems arise from the fact that the covariance functions that are used in kriging have global support. Our implementations combine, utilize, and enhance a number of different
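
    For readers unfamiliar with the local approach the abstract contrasts with global solves, a minimal ordinary-kriging sketch follows; the exponential covariance model, neighbourhood size, and all names are assumptions for illustration, not the authors' implementation (Python):

      import numpy as np

      def exp_cov(h, sill=1.0, corr_range=50.0):
          """Simple exponential covariance model (an assumption, not the paper's)."""
          return sill * np.exp(-h / corr_range)

      def ordinary_krige(points, values, query, k=16):
          """Estimate the value at `query` from the k nearest scattered points."""
          d_to_query = np.linalg.norm(points - query, axis=1)
          idx = np.argsort(d_to_query)[:k]                 # local neighbourhood
          P, z = points[idx], values[idx]
          # Build the (k+1) x (k+1) ordinary kriging system with a Lagrange row
          # enforcing that the weights sum to one (unbiasedness).
          D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)
          A = np.ones((k + 1, k + 1))
          A[:k, :k] = exp_cov(D)
          A[k, k] = 0.0
          b = np.ones(k + 1)
          b[:k] = exp_cov(d_to_query[idx])
          w = np.linalg.solve(A, b)[:k]
          return float(w @ z)

      # Toy data: 500 scattered samples of a smooth surface.
      rng = np.random.default_rng(0)
      pts = rng.uniform(0, 100, size=(500, 2))
      vals = np.sin(pts[:, 0] / 20.0) + np.cos(pts[:, 1] / 30.0)
      print(ordinary_krige(pts, vals, np.array([50.0, 50.0])))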

  3. Redefining the Ojibwe Classroom: Indigenous Language Programs within Large Research Universities

    ERIC Educational Resources Information Center

    Morgan, Mindy J.

    2005-01-01

    Indigenous languages are powerful symbols of self-determination and sovereignty for tribal communities in the United States, and many community-based programs have been developed to support and maintain them. The successes of these programs, however, have been difficult to replicate at large research institutions. This article examines the issues…

  4. Taking Energy to the Physics Classroom from the Large Hadron Collider at CERN

    ERIC Educational Resources Information Center

    Cid, Xabier; Cid, Ramon

    2009-01-01

    In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…

  5. Visualization of large medical data sets using memory-optimized CPU and GPU algorithms

    NASA Astrophysics Data System (ADS)

    Kiefer, Gundolf; Lehmann, Helko; Weese, Juergen

    2005-04-01

    With the evolution of medical scanners towards higher spatial resolutions, the sizes of image data sets are increasing rapidly. To profit from the higher resolution in medical applications such as 3D-angiography for a more efficient and precise diagnosis, high-performance visualization is essential. However, to make sure that the performance of a volume rendering algorithm scales with the performance of future computer architectures, technology trends need to be considered. The design of such scalable volume rendering algorithms remains challenging. One of the major trends in the development of computer architectures is the wider use of cache memory hierarchies to bridge the growing gap between the faster evolving processing power and the slower evolving memory access speed. In this paper we propose ways to exploit the standard PC's cache memories supporting the main processors (CPUs) and the graphics hardware (graphics processing unit, GPU), respectively, for computing Maximum Intensity Projections (MIPs). To this end, we describe a generic and flexible way to improve the cache efficiency of software ray casting algorithms and show by means of cache simulations that it enables cache miss rates close to the theoretical optimum. For GPU-based rendering we propose a similar, brick-based technique to optimize the utilization of on-board caches and the transfer of data to the GPU on-board memory. All algorithms produce images of identical quality, which enables us to compare the performance of their implementations fairly without trading quality for speed. Our comparison indicates that the proposed methods are superior, in particular for large data sets.
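
    A minimal sketch of a maximum intensity projection computed brick by brick, which mimics the spirit (not the implementation) of the cache-aware traversal described above; brick size and data are illustrative (Python):

      import numpy as np

      def mip_bricked(volume, axis=2, brick=32):
          """Maximum intensity projection (MIP) along one axis, computed brick
          by brick so that the working set stays small."""
          vol = np.moveaxis(volume, axis, -1)     # projection axis goes last
          out = np.empty(vol.shape[:2], dtype=vol.dtype)
          for i in range(0, vol.shape[0], brick):
              for j in range(0, vol.shape[1], brick):
                  block = vol[i:i + brick, j:j + brick, :]
                  out[i:i + brick, j:j + brick] = block.max(axis=-1)
          return out

      # Toy 16-bit volume standing in for a CT/MR data set.
      vol = np.random.default_rng(0).integers(0, 4096, size=(128, 128, 128),
                                              dtype=np.int16)
      mip = mip_bricked(vol)
      print(mip.shape, mip.max())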

  6. Registering coherent change detection products associated with large image sets and long capture intervals

    DOEpatents

    Perkins, David Nikolaus; Gonzales, Antonio I

    2014-04-08

    A set of co-registered coherent change detection (CCD) products is produced from a set of temporally separated synthetic aperture radar (SAR) images of a target scene. A plurality of transformations are determined, which transformations are respectively for transforming a plurality of the SAR images to a predetermined image coordinate system. The transformations are used to create, from a set of CCD products produced from the set of SAR images, a corresponding set of co-registered CCD products.
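
    A minimal sketch of the general idea of resampling a product into a common image coordinate system, assuming 2-D arrays and a per-product affine transform; the use of scipy's affine_transform is an assumption about tooling, not the patent's method (Python):

      import numpy as np
      from scipy.ndimage import affine_transform

      def to_common_frame(product, matrix, offset, out_shape):
          """Resample one CCD product into a shared image coordinate system.
          Following scipy's convention, `matrix` and `offset` map coordinates
          of the output (common) frame back into the product's own frame."""
          return affine_transform(product, matrix, offset=offset,
                                  output_shape=out_shape, order=1, cval=0.0)

      # Toy example: bring one product into the common frame with a pure shift.
      rng = np.random.default_rng(0)
      product = rng.random((256, 256))
      registered = to_common_frame(product, np.eye(2), offset=(5.0, -3.0),
                                   out_shape=(256, 256))
      print(registered.shape)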

  7. Evaluating hydrological ensemble predictions using a large and varied set of catchments (Invited)

    NASA Astrophysics Data System (ADS)

    Ramos, M.; Andreassian, V.; Perrin, C.; Loumagne, C.

    2010-12-01

    It is widely accepted that local and national operational early warning systems can play a key role in mitigating flood damage and losses to society while improving risk awareness and flood preparedness. In recent years, special attention has been paid to efficiently coupling meteorological and hydrological warning systems to track uncertainty and achieve longer lead times in hydrological forecasting. Several national and international scientific programs have focused on the pre-operational testing and development of ensemble hydrological forecasting. Based on the lumped soil-moisture-accounting type rainfall-runoff model GRP, developed at Cemagref, we have set up a research tool for ensemble forecasting and conducted several studies to evaluate the quality of streamflow forecasts. The model has been driven by available archives of weather ensemble prediction systems from different sources (Météo-France, ECMWF, TIGGE archive). Our approach has sought to combine overall validation under varied geographical and climate conditions (to assess model robustness and generality) and site-specific validation (to locally accept or reject the hydrologic forecast system and contribute to defining its limits of applicability). The general aim is to contribute to methodological developments concerning a wide range of key aspects in hydrological forecasting, including: the links between predictability skill and catchment characteristics, the magnitude and the distribution of forecasting errors, the analysis of nested or neighbouring catchments for prediction in ungauged basins, as well as the reliability of model predictions when forecasting under conditions not previously encountered during the period of setup and calibration of the system. This presentation will cover the aforementioned topics and present examples from studies carried out to evaluate and inter-compare ensemble forecasting systems using a large and varied set of catchments in France. The specific need to

  8. Considerations for observational research using large data sets in radiation oncology.

    PubMed

    Jagsi, Reshma; Bekelman, Justin E; Chen, Aileen; Chen, Ronald C; Hoffman, Karen; Shih, Ya-Chen Tina; Smith, Benjamin D; Yu, James B

    2014-09-01

    The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. This review concludes that carefully designed and executed studies using observational data that possess these qualities hold

  9. Considerations for Observational Research Using Large Data Sets in Radiation Oncology

    SciTech Connect

    Jagsi, Reshma; Bekelman, Justin E.; Chen, Aileen; Chen, Ronald C.; Hoffman, Karen; Tina Shih, Ya-Chen; Smith, Benjamin D.; Yu, James B.

    2014-09-01

    The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. This review concludes that carefully designed and executed studies using observational data that possess these qualities hold

  10. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences, with the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide the analyst with a coherent description of how an anomalous sequence differs from more typical sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
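
    A minimal sketch of a normalized LCS similarity of the kind used for clustering above, via the classic dynamic program rather than the paper's faster hybrid algorithm; the normalization choice is an assumption (Python):

      def lcs_length(a, b):
          """Classic O(len(a) * len(b)) dynamic-programming LCS length."""
          prev = [0] * (len(b) + 1)
          for x in a:
              curr = [0] * (len(b) + 1)
              for j, y in enumerate(b, start=1):
                  curr[j] = prev[j - 1] + 1 if x == y else max(prev[j], curr[j - 1])
              prev = curr
          return prev[-1]

      def normalized_lcs(a, b):
          """Similarity in [0, 1]; dividing by the geometric mean of the lengths
          is one common normalization (the paper may normalize differently)."""
          if not a or not b:
              return 0.0
          return lcs_length(a, b) / (len(a) * len(b)) ** 0.5

      print(normalized_lcs("ABCBDAB", "BDCABA"))   # toy symbol sequences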

  11. Public-private partnerships with large corporations: setting the ground rules for better health.

    PubMed

    Galea, Gauden; McKee, Martin

    2014-04-01

    Public-private partnerships with large corporations offer potential benefits to the health sector but many concerns have been raised, highlighting the need for appropriate safeguards. In this paper we propose five tests that public policy makers may wish to apply when considering engaging in such a public-private partnership. First, are the core products and services provided by the corporation health enhancing or health damaging? In some cases, such as tobacco, the answer is obvious but others, such as food and alcohol, are contested. In such cases, the burden of proof is on the potential partners to show that their activities are health enhancing. Second, do potential partners put their policies into practice in the settings where they can do so, their own workplaces? Third, are the corporate social responsibility activities of potential partners independently audited? Fourth, do potential partners make contributions to the commons rather than to narrow programmes of their choosing? Fifth, is the role of the partner confined to policy implementation rather than policy development, which is ultimately the responsibility of government alone? PMID:24581699

  12. Information Theoretic Approaches to Rapid Discovery of Relationships in Large Climate Data Sets

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.; Rossow, William B.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Mutual information, as the asymptotic Bayesian measure of independence, is an excellent starting point for investigating the existence of possible relationships among climate-relevant variables in large data sets. As mutual information is a nonlinear function of its arguments, it is not beholden to the assumption of a linear relationship between the variables in question and can reveal features missed in linear correlation analyses. However, as mutual information is symmetric in its arguments, it can only reveal the probability that two variables are related. It provides no information as to how they are related; specifically, causal interactions or a relation based on a common cause cannot be detected. For this reason we also investigate the utility of a related quantity called the transfer entropy. The transfer entropy can be written as a difference between mutual informations and has the capability to reveal whether and how the variables are causally related. The application of these information-theoretic measures is tested on some familiar examples using data from the International Satellite Cloud Climatology Project (ISCCP) to identify relations between global cloud cover and other variables, including equatorial Pacific sea surface temperature (SST), over seasonal and El Niño-Southern Oscillation (ENSO) cycles.
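
    A minimal histogram-based sketch of the two quantities named above; the binning, variable names, and toy data are illustrative, and such plug-in estimators are biased for small samples (Python):

      import numpy as np

      def entropy_from_counts(counts):
          p = counts[counts > 0] / counts.sum()
          return -np.sum(p * np.log(p))

      def mutual_information(x, y, bins=16):
          """Plug-in estimate I(X;Y) = H(X) + H(Y) - H(X,Y) from a 2-D histogram."""
          joint, _, _ = np.histogram2d(x, y, bins=bins)
          hx = entropy_from_counts(joint.sum(axis=1))
          hy = entropy_from_counts(joint.sum(axis=0))
          hxy = entropy_from_counts(joint.ravel())
          return hx + hy - hxy

      def transfer_entropy(x, y, bins=8):
          """T(X->Y) = H(Y_t,Y_{t-1}) + H(Y_{t-1},X_{t-1}) - H(all) - H(Y_{t-1}),
          estimated from a 3-D histogram over (Y_t, Y_{t-1}, X_{t-1})."""
          yt, ytm1, xtm1 = y[1:], y[:-1], x[:-1]
          joint, _ = np.histogramdd(np.column_stack([yt, ytm1, xtm1]), bins=bins)
          h_ytm1 = entropy_from_counts(joint.sum(axis=(0, 2)))
          h_yt_ytm1 = entropy_from_counts(joint.sum(axis=2).ravel())
          h_ytm1_xtm1 = entropy_from_counts(joint.sum(axis=0).ravel())
          h_all = entropy_from_counts(joint.ravel())
          return h_yt_ytm1 + h_ytm1_xtm1 - h_all - h_ytm1

      rng = np.random.default_rng(0)
      x = rng.normal(size=5000)
      y = 0.8 * np.roll(x, 1) + 0.2 * rng.normal(size=5000)   # y lags x by one step
      print(mutual_information(x, y), transfer_entropy(x, y), transfer_entropy(y, x))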

  13. Spatially-aware Processing of Large Raw LiDAR Data Sets

    NASA Astrophysics Data System (ADS)

    Strane, M. D.; Oskin, M.

    2004-12-01

    An ultimate goal of LiDAR (LIght Detection And Ranging) data acquisition is to produce a regularly sampled, accurate topographic view of the surface of the Earth. Last-return and inverse-distance-weighted sampling of raw LiDAR data do not take into account the non-random distribution of raw data points. While elevation data produced by these methods are of high accuracy, gradients are not well resolved and aliasing artifacts are produced, especially on low-gradient surfaces. Because of the volume of data involved, resampling schemes that take into account the spatial distribution of raw data have been cumbersome to implement. We have developed a resampling method that uses the free, open-source PostgreSQL database to store the raw LiDAR data indexed both spatially and as its original time series. This database permits rapid access to raw data points via spatial queries. A robust and expedient algorithm has been implemented to produce regularly gridded resampled data with a least-squares plane-fit regression. This algorithm reduces aliasing artifacts on low-gradient surfaces. The algorithm is also a proof of concept showing that complex spatially-aware processing of large LiDAR data sets is feasible on a reasonable time scale, and it will be the basis for further improvements such as vegetation removal.
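
    A minimal sketch of the per-node least-squares plane fit, assuming the raw returns near a grid node have already been fetched (for example by a spatial query against the database mentioned above); the names, neighbourhood, and toy data are illustrative (Python):

      import numpy as np

      def plane_fit_elevation(points, node_xy):
          """Fit z = a*x + b*y + c to the neighbourhood points by least squares
          and evaluate the plane at the grid node."""
          x, y, z = points[:, 0], points[:, 1], points[:, 2]
          A = np.column_stack([x, y, np.ones_like(x)])
          coef, *_ = np.linalg.lstsq(A, z, rcond=None)
          return coef[0] * node_xy[0] + coef[1] * node_xy[1] + coef[2]

      # Toy neighbourhood: raw returns within ~1 m of a grid node at (10, 20).
      rng = np.random.default_rng(0)
      pts = np.column_stack([10 + rng.uniform(-1, 1, 40),
                             20 + rng.uniform(-1, 1, 40),
                             rng.normal(100.0, 0.05, 40)])
      print(plane_fit_elevation(pts, (10.0, 20.0)))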

  14. Long DNA sequences and large data sets: investigating the Quaternary via ancient DNA

    NASA Astrophysics Data System (ADS)

    Hofreiter, Michael

    2008-12-01

    Technical progress has made it possible to piece together increasingly long DNA sequences from subfossil remains of both extinct and extant species. At the same time, more and more species are analyzed at the population level, leading to a better understanding of population dynamics over time. Finally, new sequencing techniques have allowed targeting the complete nuclear genomes of extinct species. The sequences obtained yield insights into a variety of research fields. First, phylogenetic relationships can be resolved with much greater accuracy and it becomes possible to date divergence events of species during and before the Quaternary. Second, large data sets in population genetics facilitate the assessment of changes in genetic diversity over time, an approach that has substantially revised our views about phylogeographic patterns and population dynamics. In the future, the combination of population genetics with long DNA sequences, e.g. complete mitochondrial (mt) DNA genomes, should allow much more precise estimates of population size changes to be made. This will enable us to make inferences about - and hopefully understand - the causes of faunal turnover and extinctions during the Quaternary. Third, with regard to the nuclear genome, complete genes and genomes can now be sequenced and studied with regard to their function, revealing insights about the numerous traits of extinct species that are not preserved in the fossil record.

  15. Analog and digital interface solutions for the common large-area display set (CLADS)

    NASA Astrophysics Data System (ADS)

    Hermann, David J.; Gorenflo, Ronald L.

    1997-07-01

    Battelle is under contract with Warner Robins Air Logistics Center to design a common large-area display set (CLADS) for use in multiple airborne command, control, communications, computers and intelligence applications that currently use unique 19-inch cathode ray tubes (CRTs). The CLADS is a modular design, with common modules used wherever possible. Each CLADS includes an application-specific integration kit, which incorporates all of the unique interface components. Since there is no existing digital video interface standard for high-resolution workstations, a standard interface was developed for CLADS and documented as an interface specification. One of the application-specific modules, the application video interface module (AVIM), readily incorporates most of the required application electrical interfaces for a given system into a single module. The analog AVIM, however, poses unique design problems when folding multiple application interface requirements into a single common AVIM for the most prevalent workstation display interface: analog RGB video. Future workstation display interfaces will incorporate fully digital video between the graphics hardware and the digital display device. A digital AVIM is described which utilizes a fiber channel interface to deliver high-speed 1280 by 1024, 24-bit, 60 Hz digital video from a PCI graphics card to the CLADS. A video recording and playback device is described, as well as other common CLADS modules, including the display controller and power supply. This paper will discuss both the analog and digital AVIM interfaces, application BIT and power interfaces, as well as CLADS internal interfaces.

  16. Classroom Management and the Librarian

    ERIC Educational Resources Information Center

    Blackburn, Heidi; Hays, Lauren

    2014-01-01

    As librarians take on more instructional responsibilities, the need for classroom management skills becomes vital. Unfortunately, classroom management skills are not taught in library school and therefore, many librarians are forced to learn how to manage a classroom on the job. Different classroom settings such as one-shot instruction sessions…

  17. An efficient out-of-core volume ray casting method for the visualization of large medical data sets

    NASA Astrophysics Data System (ADS)

    Xue, Jian; Tian, Jie; Chen, Jian; Dai, Yakang

    2007-03-01

    The volume ray casting algorithm is widely recognized for high-quality volume visualization. However, when rendering very large volume data sets, the original ray casting algorithm leads to very inefficient random disk accesses and makes it very slow to render the whole volume data set. To solve this problem, this paper proposes an efficient out-of-core volume ray casting method together with a new out-of-core framework for processing large volume data sets on consumer PC hardware. The new framework gives transparent and efficient access to the volume data set cached on disk, while the new volume ray casting method minimizes the data exchange between hard disk and physical memory and performs comparatively fast, high-quality volume rendering. The experimental results indicate that the new method and framework are effective and efficient for the visualization of very large medical data sets.
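
    A minimal sketch of an out-of-core brick cache in the spirit of the framework described above, not the authors' code: bricks are read on demand from a memory-mapped file and a bounded least-recently-used cache limits resident memory; sizes and names are illustrative (Python):

      import numpy as np
      from collections import OrderedDict

      class BrickCache:
          """Serve fixed-size bricks of a volume stored on disk, keeping at most
          `max_bricks` of them in memory (least-recently-used eviction)."""

          def __init__(self, path, shape, dtype, brick=64, max_bricks=32):
              self.vol = np.memmap(path, dtype=dtype, mode="r", shape=shape)
              self.brick, self.max_bricks = brick, max_bricks
              self.cache = OrderedDict()

          def get(self, bi, bj, bk):
              key = (bi, bj, bk)
              if key in self.cache:
                  self.cache.move_to_end(key)        # mark as recently used
                  return self.cache[key]
              b = self.brick
              data = np.array(self.vol[bi*b:(bi+1)*b, bj*b:(bj+1)*b, bk*b:(bk+1)*b])
              self.cache[key] = data
              if len(self.cache) > self.max_bricks:
                  self.cache.popitem(last=False)     # evict least recently used
              return data

      # Tiny demo: write a toy 128^3 volume to disk and fetch one brick; a ray
      # caster would request bricks along each ray in the same way.
      import os, tempfile
      tmp = os.path.join(tempfile.mkdtemp(), "vol.raw")
      np.arange(128**3, dtype=np.float32).reshape(128, 128, 128).tofile(tmp)
      cache = BrickCache(tmp, shape=(128, 128, 128), dtype=np.float32)
      print(cache.get(0, 1, 0).shape)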

  18. Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer (7th Annual SFAF Meeting, 2012)

    ScienceCinema

    Copeland, Alex [DOE JGI]

    2013-02-11

    Alex Copeland on "Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  19. Mobile-phone-based classroom response systems: Students' perceptions of engagement and learning in a large undergraduate course

    NASA Astrophysics Data System (ADS)

    Dunn, Peter K.; Richardson, Alice; Oprescu, Florin; McDonald, Christine

    2013-12-01

    Using a Classroom Response System (CRS) has been associated with positive educational outcomes, by fostering student engagement and by allowing immediate feedback to both students and instructors. This study examined a low-cost CRS (VotApedia) in a large first-year class, where students responded to questions using their mobile phones. This study explored whether the use of VotApedia retained the advantages of other CRS, overcame some of the challenges of other CRS, and whether new challenges were introduced by using VotApedia. These issues were studied within three themes: students' perceptions of using VotApedia; the impact of VotApedia on their engagement; and the impact of VotApedia on their learning. Data were collected from an online survey, focus groups and student feedback on teaching and course content. The results indicated that using VotApedia retains the pedagogical advantages of other CRS, while overcoming some of the challenges presented by using other CRS, without introducing any new challenges.

  20. Linked Scatter Plots, A Powerful Exploration Tool For Very Large Sets of Spectra

    NASA Astrophysics Data System (ADS)

    Carbon, Duane Francis; Henze, Christopher

    2015-08-01

    We present a new tool, based on linked scatter plots, that is designed to efficiently explore very large spectrum data sets such as SDSS, APOGEE, LAMOST, GAIA, and RAVE. The tool works in two stages: the first uses batch processing and the second runs interactively. In the batch stage, spectra are processed through our data pipeline, which computes the depths relative to the local continuum at preselected feature wavelengths. These depths, and any additional available variables such as local S/N level, magnitudes, colors, positions, and radial velocities, are the basic measured quantities used in the interactive stage. The interactive stage employs the NASA hyperwall, a configuration of 128 workstation displays (8x16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. Each hyperwall panel is used to display a fully linked 2-D scatter plot showing the depth of feature A vs the depth of feature B for all of the spectra. A and B change from panel to panel. The relationships between the various (A,B) strengths and any distinctive clustering, as well as unique outlier groupings, are visually apparent when examining and inter-comparing the different panels on the hyperwall. In addition, the data links between the scatter plots allow the user to apply a logical algebra to the measurements. By graphically selecting the objects in any interesting region of any 2-D plot on the hyperwall, the tool immediately and clearly shows how the selected objects are distributed in all the other 2-D plots. The selection process may be repeated multiple times and, at each step, the selections can represent a sequence of logical constraints on the measurements, revealing those objects which satisfy all the constraints thus far. The spectra of the selected objects may be examined at any time on a connected workstation display. Using over 945,000,000 depth measurements from 569,738 SDSS DR10 stellar spectra, we illustrate how to quickly
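
    A minimal sketch of the linked-selection logic described above: each graphical selection becomes a boolean mask over all spectra, and successive selections are combined with logical AND so that every panel shows only the objects satisfying all constraints so far; feature names and data are invented (Python):

      import numpy as np

      rng = np.random.default_rng(0)
      n_spectra = 100_000
      # Toy "feature depth" table: one row per spectrum, one column per feature.
      depths = {"CaK": rng.random(n_spectra),
                "Hbeta": rng.random(n_spectra),
                "MgB": rng.random(n_spectra)}

      def select(mask, feature, lo, hi):
          """AND a new rectangular selection on one scatter-plot axis into the
          running mask shared by every linked panel."""
          return mask & (depths[feature] >= lo) & (depths[feature] <= hi)

      selection = np.ones(n_spectra, dtype=bool)        # start with everything
      selection = select(selection, "CaK", 0.6, 1.0)    # pick strong Ca K objects
      selection = select(selection, "Hbeta", 0.0, 0.2)  # ...with weak H-beta
      print(selection.sum(), "spectra satisfy all constraints so far")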

  1. The Sheffield experiment: the effects of centralising accident and emergency services in a large urban setting

    PubMed Central

    Simpson, A; Wardrope, J; Burke, D

    2001-01-01

    Objectives—To assess the effects of centralisation of accident and emergency (A&E) services in a large urban setting. The end points were the quality of patient care judged by time to see a doctor or nurse practitioner, time to admission and the cost of the A&E service as a whole. Methods—Sheffield is a large industrial city with a population of 471 000. In 1994 Sheffield health authority took a decision to centralise a number of services including the A&E services. This study presents data collected over a three year period before, during and after the centralisation of adult A&E services from two sites to one site and the centralisation of children's A&E services to a separate site. A minor injury unit was also established along with an emergency admissions unit. The study used information from the A&E departments' computer system and routinely available financial data. Results—There has been a small decrease in the number of new patient attendances using the Sheffield A&E system. Most patients go to the correct department. The numbers of acute admissions through the adult A&E have doubled. Measures of process efficiency show some improvement in times to admission. There has been measurable deterioration in the time to be seen for minor injuries in the A&E departments. This is partly offset by the very good waiting time to be seen in the minor injuries unit. The costs of providing the service within Sheffield have increased. Conclusion—Centralisation of A&E services in Sheffield has led to concentration of the most ill patients in a single adult department and separate paediatric A&E department. Despite a greatly increased number of admissions at the adult site this change has not resulted in increased waiting times for admission because of the transfer of adequate beds to support the changes. There has however been a deterioration in the time to see a clinician, especially in the A&E departments. The waiting times at the minor injury unit are very short

  2. BACHSCORE. A tool for evaluating efficiently and reliably the quality of large sets of protein structures

    NASA Astrophysics Data System (ADS)

    Sarti, E.; Zamuner, S.; Cossio, P.; Laio, A.; Seno, F.; Trovato, A.

    2013-12-01

    In protein structure prediction it is of crucial importance, especially at the refinement stage, to score efficiently large sets of models by selecting the ones that are closest to the native state. We here present a new computational tool, BACHSCORE, that allows its users to rank different structural models of the same protein according to their quality, evaluated by using the BACH++ (Bayesian Analysis Conformation Hunt) scoring function. The original BACH statistical potential was already shown to discriminate with very good reliability the protein native state in large sets of misfolded models of the same protein. BACH++ features a novel upgrade in the solvation potential of the scoring function, now computed by adapting the LCPO (Linear Combination of Pairwise Orbitals) algorithm. This change further enhances the already good performance of the scoring function. BACHSCORE can be accessed directly through the web server: bachserver.pd.infn.it.
    Catalogue identifier: AEQD_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQD_v1_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License version 3
    No. of lines in distributed program, including test data, etc.: 130159
    No. of bytes in distributed program, including test data, etc.: 24 687 455
    Distribution format: tar.gz
    Programming language: C++
    Computer: Any computer capable of running an executable produced by a g++ compiler (4.6.3 version)
    Operating system: Linux, Unix OS-es
    RAM: 1 073 741 824 bytes
    Classification: 3
    Nature of problem: Evaluate the quality of a protein structural model, taking into account the possible “a priori” knowledge of a reference primary sequence that may be different from the amino-acid sequence of the model; the native protein structure should be recognized as the best model.
    Solution method: The contact potential scores the occurrence of any given type of residue pair in 5 possible

  3. Gaining A Geological Perspective Through Active Learning in the Large Lecture Classroom

    NASA Astrophysics Data System (ADS)

    Kapp, J. L.; Richardson, R. M.; Slater, S. J.

    2008-12-01

    NATS 101 A Geological Perspective is a general education course taken by non-science majors. We offer 600 seats per semester, with four large lecture sections taught by different faculty members. In the past we have offered optional once-a-week study groups taught by graduate teaching assistants. Students often feel overwhelmed by the science and associated jargon, and many are prone to skipping lectures altogether. Optional study groups are only attended by ~50% of the students. Faculty members find the class to be a lot of work, mainly due to the grading it generates. Activities given in lecture are often short multiple-choice or true/false assignments, limiting the depth of understanding we can evaluate. Our students often lack math and critical thinking skills, and we spend a lot of time in lecture reintroducing ideas students should have already gotten from the text. In summer 2007 we were funded to redesign the course. Our goals were to 1) cut the cost of running the course, and 2) improve student learning. Under our redesign, optional study groups were replaced by once-a-week mandatory breakout sessions where students complete activities that have been introduced in lecture. Breakout sessions substitute for one hour of lecture and are run by undergraduate preceptors and graduate teaching assistants (GTAs). During the lecture period, lectures themselves are brief, with a large portion of the class devoted to active learning in small groups. Weekly reading quizzes are submitted via the online course management system. Breakout sessions allow students to spend more time interacting with their fellow students, undergraduate preceptors, and GTAs. They get one-on-one help in breakout sessions on assignments designed to enhance the lecture material. The active lecture format means less of their time is devoted to listening passively to a lecture, and more time is spent on peer learning and interacting with the instructor. Completing quizzes online allows students

  4. Repulsive parallel MCMC algorithm for discovering diverse motifs from large sequence sets

    PubMed Central

    Ikebata, Hisaki; Yoshida, Ryo

    2015-01-01

    Motivation: The motif discovery problem consists of finding recurring patterns of short strings in a set of nucleotide sequences. This classical problem is receiving renewed attention as most early motif discovery methods lack the ability to handle the large data sets of recent genome-wide ChIP studies. New ChIP-tailored methods focus on reducing computation time and pay little regard to the accuracy of motif detection. Unlike such methods, our method focuses on increasing the detection accuracy while maintaining the computational cost at an acceptable level. The major advantage of our method is that it can mine diverse multiple motifs undetectable by current methods. Results: The repulsive parallel Markov chain Monte Carlo (RPMCMC) algorithm that we propose is a parallel version of the widely used Gibbs motif sampler. RPMCMC is run as a set of parallel interacting motif samplers. A repulsive force is generated when motifs produced by different samplers come close to each other, so different samplers explore different motifs. In this way, we can detect much more diverse motifs than conventional methods can. Through application to 228 transcription factor ChIP-seq datasets of the ENCODE project, we show that the RPMCMC algorithm can find many reliable cofactor interacting motifs that existing methods are unable to discover. Availability and implementation: A C++ implementation of RPMCMC and the discovered cofactor motifs for the 228 ENCODE ChIP-seq datasets are available from http://daweb.ism.ac.jp/yoshidalab/motif. Contact: ikebata.hisaki@ism.ac.jp, yoshidar@ism.ac.jp Supplementary information: Supplementary data are available from Bioinformatics online. PMID:25583120
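
    A minimal sketch of what a repulsion term between samplers could look like, not the RPMCMC algorithm itself: the similarity between a proposed motif's position weight matrix and the motifs currently held by other samplers is turned into a penalty that would lower the proposal's acceptance probability; the similarity measure and strength are assumptions (Python):

      import numpy as np

      def pwm_similarity(p, q):
          """Mean column-wise Bhattacharyya coefficient between two position
          weight matrices of equal width (1 = identical, 0 = disjoint)."""
          return float(np.mean(np.sum(np.sqrt(p * q), axis=1)))

      def repulsion_penalty(proposed_pwm, other_pwms, strength=5.0):
          """Log-scale penalty that grows as the proposed motif approaches any
          motif currently held by another sampler, discouraging duplicates."""
          if not other_pwms:
              return 0.0
          return -strength * max(pwm_similarity(proposed_pwm, q) for q in other_pwms)

      # Toy 4-letter (A, C, G, T) PWMs of width 6.
      rng = np.random.default_rng(0)
      def random_pwm(width=6):
          m = rng.random((width, 4))
          return m / m.sum(axis=1, keepdims=True)

      proposal, others = random_pwm(), [random_pwm(), random_pwm()]
      # In a Metropolis-style step this penalty would be added to the proposal's
      # log-score before computing the acceptance probability.
      print(repulsion_penalty(proposal, others))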

  5. Toward accurate thermochemical models for transition metals: G3Large basis sets for atoms Sc-Zn.

    SciTech Connect

    Mayhall, N. J.; Raghavachari, K.; Redfern, P. C.; Curtiss, L. A.; Rassolov, V.; Indiana Univ.; Univ. of South Carolina

    2008-04-01

    An augmented valence triple-zeta basis set, referred to as G3Large, is reported for the first-row transition metal elements Sc through Zn. The basis set is constructed in a manner similar to the G3Large basis set developed previously for other elements (H-Ar, K, Ca, Ga-Kr) and used as a key component in Gaussian-3 theory. It is based on a contraction of a set of 15s13p5d Gaussian primitives to 8s7p3d, and also includes sets of f and g polarization functions, diffuse spd functions, and core df polarization functions. The basis set is evaluated with triples-augmented coupled cluster [CCSD(T)] and Brueckner orbital [BD(T)] methods for a small test set involving energies of atoms, atomic ions, and diatomic hydrides. It performs well for the low-lying s→d excitation energies of atoms, atomic ionization energies, and the dissociation energies of the diatomic hydrides. The Brueckner orbital-based BD(T) method performs substantially better than Hartree-Fock-based CCSD(T) for molecules such as NiH, where the starting unrestricted Hartree-Fock wavefunction suffers from a high degree of spin contamination. Comparison with available data for geometries of transition metal hydrides also shows good agreement. A smaller basis set without core polarization functions, G3MP2Large, is also defined.

  6. Quality in Inclusive Preschool Classrooms

    ERIC Educational Resources Information Center

    Hestenes, Linda L.; Cassidy, Deborah J.; Shim, Jonghee; Hegde, Archana V.

    2008-01-01

    Research Findings: Quality of care for preschool children in inclusive and noninclusive classrooms was examined in two studies. In Study 1, comparisons across a large sample of classrooms (N = 1,313) showed that inclusive classrooms were higher than noninclusive classrooms in global quality as well as on two dimensions of quality…

  7. Improving Library Effectiveness: A Proposal for Applying Fuzzy Set Concepts in the Management of Large Collections.

    ERIC Educational Resources Information Center

    Robinson, Earl J.; Turner, Stephen J.

    1981-01-01

    Fuzzy set theory, a mathematical modeling technique that allows for the consideration of such factors as "professional expertise" in decision making, is discussed as a tool for use in libraries--specifically in collection management. The fundamentals of fuzzy set theory are introduced and a reference list is included. (JL)
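
    A minimal sketch of how fuzzy membership grades could enter a collection-management decision of the kind the article proposes; the membership functions, weights, and their interpretation are invented for illustration (Python):

      def usefulness(circulations_per_year):
          """Fuzzy membership in the set 'heavily used' (0 to 1)."""
          return max(0.0, min(1.0, circulations_per_year / 10.0))

      def expert_value(rating_0_to_5):
          """Fuzzy grade encoding a subject specialist's judgement."""
          return rating_0_to_5 / 5.0

      def retain_score(circ, rating, w_use=0.6, w_expert=0.4):
          """Weighted combination of fuzzy grades; titles scoring low would be
          candidates for storage or withdrawal."""
          return w_use * usefulness(circ) + w_expert * expert_value(rating)

      print(retain_score(circ=2, rating=4))   # little-used but valued title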

  8. Problems in the Cataloging of Large Microform Sets or, Learning to Expect the Unexpected.

    ERIC Educational Resources Information Center

    Joachim, Martin D.

    1989-01-01

    Describes problems encountered during the cataloging of three major microform sets at the Indiana University Libraries. Areas discussed include size and contents of the sets, staffing for the project, equipment, authority work, rare book cataloging rules, serials, language of materials, musical scores, and manuscripts. (CLB)

  9. Tools for Analysis and Visualization of Large Time-Varying CFD Data Sets

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; VanGelder, Allen

    1997-01-01

    In the second year, we continued to build upon and improve the scanline-based direct volume renderer that we developed in the first year of this grant. This extremely general rendering approach can handle regular or irregular grids, including overlapping multiple grids, and polygon mesh surfaces. It runs in parallel on multi-processors. It can also be used in conjunction with a k-d tree hierarchy, where approximate models and error terms are stored in the nodes of the tree, and approximate fast renderings can be created. We have extended our software to handle time-varying data where the data changes but the grid does not. We are now working on extending it to handle more general time-varying data. We have also developed a new extension of our direct volume renderer that uses automatic decimation of the 3D grid, as opposed to an explicit hierarchy. We explored this alternative approach as being more appropriate for very large data sets, where the extra expense of a tree may be unacceptable. We also describe a new approach to direct volume rendering that uses hardware 3D textures and incorporates lighting effects. Volume rendering using hardware 3D textures is extremely fast, and machines capable of using this technique are becoming more moderately priced. While this technique, at present, is limited to use with regular grids, we are pursuing possible algorithms extending the approach to more general grid types. We have also begun to explore a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH '96. In our initial implementation, we automatically image the volume from 32 equidistant positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation. We are studying whether this will give a quantitative measure of the effects of approximation. We have created new tools for exploring the

  10. Engaged: Making Large Classes Feel Small through Blended Learning Instructional Strategies that Promote Increased Student Performance

    ERIC Educational Resources Information Center

    Francis, Raymond W.

    2012-01-01

    It is not enough to be great at sharing information in a large classroom setting. To be an effective teacher you must be able to meaningfully engage your students with their peers and with the content. And you must do this regardless of class size or content. The issues of teaching effectively in large classroom settings have presented ongoing…