Science.gov

Sample records for large classroom setting

  1. Calibrated Peer Review: A New Tool for Integrating Information Literacy Skills in Writing-Intensive Large Classroom Settings

    ERIC Educational Resources Information Center

    Fosmire, Michael

    2010-01-01

    Calibrated Peer Review[TM] (CPR) is a program that can significantly enhance the ability to integrate intensive information literacy exercises into large classroom settings. CPR is founded on a solid pedagogic base for learning, and it is formulated in such a way that information skills can easily be inserted. However, there is no mention of its…

  2. Active Learning in a Large Medical Classroom Setting for Teaching Renal Physiology

    ERIC Educational Resources Information Center

    Dietz, John R.; Stevenson, Frazier T.

    2011-01-01

    In this article, the authors describe an active learning exercise which has been used to replace some lecture hours in the renal portion of an integrated, organ system-based curriculum for first-year medical students. The exercise takes place in a large auditorium with ~150 students. The authors, who are faculty members, lead the discussions,…

  3. A Classroom Tariff-Setting Game

    ERIC Educational Resources Information Center

    Winchester, Niven

    2006-01-01

    The author outlines a classroom tariff-setting game that allows students to explore the consequences of import tariffs imposed by large countries (countries able to influence world prices). Groups of students represent countries, which are organized into trading pairs. Each group's objective is to maximize welfare by choosing an appropriate ad…

  4. Impact of Problem-Based Learning in a Large Classroom Setting: Student Perception and Problem-Solving Skills

    ERIC Educational Resources Information Center

    Klegeris, Andis; Hurren, Heather

    2011-01-01

    Problem-based learning (PBL) can be described as a learning environment where the problem drives the learning. This technique usually involves learning in small groups, which are supervised by tutors. It is becoming evident that PBL in a small-group setting has a robust positive effect on student learning and skills, including better…

  5. Controlling Setting Events in the Classroom

    ERIC Educational Resources Information Center

    Chan, Paula E.

    2016-01-01

    Teachers face the challenging job of differentiating instruction for the diverse needs of their students. This task is difficult enough with happy students who are eager to learn; unfortunately students often enter the classroom in a bad mood because of events that happened outside the classroom walls. These events--called setting events--can…

  6. Classroom Management in Inclusive Settings

    ERIC Educational Resources Information Center

    Soodak, Leslie C.

    2003-01-01

    The inclusion of children with disabilities in general education classes provides an opportunity for teachers to identify classroom management policies and practices that promote diversity and community. Community-building management strategies that facilitate friendships, collaboration, parent involvement, and address challenging behaviors in a…

  7. Individualizing in Traditional Classroom Settings.

    ERIC Educational Resources Information Center

    Thornell, John G.

    1980-01-01

    Effective individualized instruction depends primarily on the teacher possessing the skills to implement it. Individualization is therefore quite compatible with the traditional self-contained elementary classroom model, but not with its alternative, departmentalization, which allows teachers neither the time flexibility nor the familiarity with…

  8. Children's Fears in the Classroom Setting.

    ERIC Educational Resources Information Center

    Johnson, Suzanne Bennett

    1979-01-01

    Fears common to the classroom setting are discussed, including school phobia, social withdrawal, and test anxiety. Incidence data, theoretical explanations, and treatment research are reviewed, and directions for future research are suggested. (Author/MH)

  9. Improvement in Generic Problem-Solving Abilities of Students by Use of Tutor-Less Problem-Based Learning in a Large Classroom Setting

    ERIC Educational Resources Information Center

    Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather

    2013-01-01

    Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such…

  10. Improvement in Generic Problem-Solving Abilities of Students by Use of Tutor-less Problem-Based Learning in a Large Classroom Setting

    PubMed Central

    Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather

    2013-01-01

    Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such techniques over classical teaching styles. Previously, we demonstrated that introduction of tutor-less PBL in a large third-year biochemistry undergraduate class increased student satisfaction and attendance. The current study assessed the generic problem-solving abilities of students from the same class at the beginning and end of the term, and compared student scores with similar data obtained in three classes not using PBL. Two generic problem-solving tests of equal difficulty were administered such that students took different tests at the beginning and the end of the term. Blinded marking showed a statistically significant 13% increase in the test scores of the biochemistry students exposed to PBL, while no trend toward significant change in scores was observed in any of the control groups not using PBL. Our study is among the first to demonstrate that use of tutor-less PBL in a large classroom leads to statistically significant improvement in generic problem-solving skills of students. PMID:23463230

  11. Implementing iPads in the Inclusive Classroom Setting

    ERIC Educational Resources Information Center

    Maich, Kimberly; Hall, Carmen

    2016-01-01

    This column provides practical suggestions to help guide teachers in utilizing classroom sets of iPads. Following a brief introduction to tablet technology in inclusive classrooms and the origin of these recommendations from a case study focus group, important elements of setting up classroom iPad use, from finding funding to teaching apps, are…

  12. Climate Setting in Second-Language Classrooms.

    ERIC Educational Resources Information Center

    Evans-Harvey, Cher

    1993-01-01

    Discusses the creation of a positive classroom climate, examines four dimensions of classroom climate (physical, academic, organizational, and social-emotional), and reviews techniques that teachers can use to promote a positive classroom climate. Teachers need to get to know their students, discuss the course objectives with their students, and…

  13. Collaboration within Large Groups in the Classroom

    ERIC Educational Resources Information Center

    Szewkis, Eyal; Nussbaum, Miguel; Rosen, Tal; Abalos, Jose; Denardin, Fernanda; Caballero, Daniela; Tagle, Arturo; Alcoholado, Cristian

    2011-01-01

    The purpose of this paper is to show how a large group of students can work collaboratively in a synchronous way within the classroom using the cheapest possible technological support. Making use of the features of Single Display Groupware and of Multiple Mice we propose a computer-supported collaborative learning approach for big groups within…

  14. Classrooms and Computers as Instructional Settings.

    ERIC Educational Resources Information Center

    Amarel, Marianne

    1983-01-01

    The influx of computers into the classroom is discussed from a teacher's point of view. Teachers' reactions to the PLATO Elementary Mathematics and Reading Project (a computer aided instructional model) from the author's point of view are noted. (JMK)

  15. Analyzing Multimodal Interaction within a Classroom Setting

    ERIC Educational Resources Information Center

    Moura, Heloisa

    2006-01-01

    Human interactions are multimodal in nature. From simple to complex forms of transferal of information, human beings draw on a multiplicity of communicative modes, such as intonation and gaze, to make sense of everyday experiences. Likewise, the learning process, either within traditional classrooms or Virtual Learning Environments, is shaped by…

  16. A Practical Setting of Distance Learning Classroom.

    ERIC Educational Resources Information Center

    Wang, Shousan; Buck, Lawrence

    1996-01-01

    Describes a distance-learning classroom developed and used by Central Connecticut State University for nurse training, educational statistics, mathematics, and technology courses. Discusses initial engineering, video cameras, video source switching, lighting, audio, and other technical and related aspects. Block diagrams and lists of equipment for…

  17. Tangential Floor in a Classroom Setting

    ERIC Educational Resources Information Center

    Marti, Leyla

    2012-01-01

    This article examines floor management in two classroom sessions: a task-oriented computer lesson and a literature lesson. Recordings made in the computer lesson show the organization of floor when a task is given to students. Temporary or "incipient" side floors (Jones and Thornborrow, 2004) emerge beside the main floor. In the literature lesson,…

  18. Teaching Music in the Urban Classroom Set

    ERIC Educational Resources Information Center

    Frierson-Campbell, Carol Ed.

    2006-01-01

    The change needed in urban music education not only relates to the idea that music should be at the center of the curriculum; rather, it is that culturally relevant music should be a creative force at the center of reform in urban education. This set is the start of a national-level conversation aimed at making that goal a reality. In both…

  19. Student Engagement and Success in the Large Astronomy 101 Classroom

    NASA Astrophysics Data System (ADS)

    Jensen, J. B.

    2014-07-01

The large auditorium classroom presents unique challenges to maintaining student engagement. During the fall 2012 semester, I adopted several specific strategies for increasing student engagement and reducing anonymity with the goal of maximizing student success in the large class. I measured attendance and student success in two classes, one with 300 students and one with 42, but otherwise taught as similarly as possible. While the students in the large class probably did better than they would have in a traditional lecture setting, attendance was still significantly lower in the large class, resulting in lower student success than in the small control class by all measures. I will discuss these results and compare them to classes from previous semesters, including other small classes and large Distance Education classes conducted live over a remote television link.

  20. Observation Instrument of Play Behaviour in a Classroom Setting

    ERIC Educational Resources Information Center

    Berkhout, Louise; Hoekman, Joop; Goorhuis-Brouwer, Sieneke M.

    2012-01-01

    The objective of this study was to develop an instrument to observe the play behaviour of a whole group of children from four to six years of age in a classroom setting on the basis of video recording. The instrument was developed in collaboration with experienced teachers and experts on play. Categories of play were derived from the literature…

  1. Enhancing Feedback via Peer Learning in Large Classrooms

    ERIC Educational Resources Information Center

    Zher, Ng Huey; Hussein, Raja Maznah Raja; Saat, Rohaida Mohd

    2016-01-01

    Feedback has been lauded as a key pedagogical tool in higher education. Unfortunately, the value of feedback falls short when being carried out in large classrooms. In this study, strategies for sustaining feedback in large classroom based on peer learning are explored. All the characteristics identified within the concept of peer learning were…

  2. Examining the Effectiveness of Team-Based Learning (TBL) in Different Classroom Settings

    ERIC Educational Resources Information Center

    Yuretich, Richard F.; Kanner, Lisa C.

    2015-01-01

    The problem of effective learning in college classrooms, especially in a large lecture setting, has been a topic of discussion for a considerable span of time. Most efforts to improve learning incorporate various forms of student-active learning, such as in-class investigations or problems, group discussions, collaborative examinations and…

  3. Radial sets: interactive visual analysis of large overlapping sets.

    PubMed

    Alsallakh, Bilal; Aigner, Wolfgang; Miksch, Silvia; Hauser, Helwig

    2013-12-01

In many applications, data tables contain multi-valued attributes that often store the memberships of the table entities to multiple sets such as which languages a person masters, which skills an applicant documents, or which features a product comes with. With a growing number of entities, the resulting element-set membership matrix becomes very rich in information about how these sets overlap. Many analysis tasks targeted at set-typed data are concerned with these overlaps as salient features of such data. This paper presents Radial Sets, a novel visual technique to analyze set memberships for a large number of elements. Our technique uses frequency-based representations to enable quickly finding and analyzing different kinds of overlaps between the sets, and relating these overlaps to other attributes of the table entities. Furthermore, it enables various interactions to select elements of interest, find out if they are over-represented in specific sets or overlaps, and if they exhibit a different distribution for a specific attribute compared to the rest of the elements. These interactions allow formulating highly-expressive visual queries on the elements in terms of their set memberships and attribute values. As we demonstrate via two usage scenarios, Radial Sets enable revealing and analyzing a multitude of overlapping patterns between large sets, beyond the limits of state-of-the-art techniques. PMID:24051816
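The frequency computation that underlies the overlaps described in this abstract can be sketched in a few lines. This is an illustrative sketch only, not code from the paper: the toy membership matrix and all variable names are assumptions, standing in for a real element-set table.

```python
# Sketch of the frequency-based core of a Radial-Sets-style analysis:
# from a binary element-set membership matrix, derive per-set sizes,
# pairwise overlap counts, and exclusive memberships.
import numpy as np

# rows = elements (e.g. people), columns = sets (e.g. languages mastered)
membership = np.array([
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
    [1, 1, 1],
    [1, 0, 0],
], dtype=bool)

# elements per set (column sums)
set_sizes = membership.sum(axis=0)

# pairwise overlap counts: entry (i, j) = elements belonging to both set i and set j
overlaps = membership.astype(int).T @ membership.astype(int)

# elements belonging to exactly one set (rendered "exclusively" in such views)
exclusive = membership.sum(axis=1) == 1
```

With larger tables the same matrix products scale directly, which is what makes frequency summaries feasible for many elements where drawing each element individually would not be.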

  4. Teacher and Student Research Using Large Data Sets

    NASA Astrophysics Data System (ADS)

    Croft, S. K.; Pompea, S. M.; Sparks, R. T.

    2005-12-01

One of the objectives of teacher research experiences is to immerse the teacher in an authentic research situation to help the teacher understand what real research is all about: "to do science as scientists do." Experiences include doing experiments in laboratories, gathering data out in the field, and observing at professional observatories. However, a rapidly growing area of scientific research is "data mining" increasingly large public data archives. In the earth and space sciences, such large archives are built around data from Landsat 7, the Sloan Digital Sky Survey, and in about seven years, the Large Synoptic Survey Telescope. The LSST will re-photograph the entire night sky every three days, resulting in a data flow of about 20 terabytes per night. The resulting LSST archive will represent a huge challenge of storage and retrieval for professional scientists. It will be a much greater challenge to help K-12 teachers use such gargantuan files and collections of data effectively in the classroom and to understand and begin to practice the new research procedures involved in data mining. At NOAO we are exploring ways of using large data sets in formal educational settings like classrooms, and public settings like planetariums and museums. In our existing professional development programs, such as our Teacher leaders in Research Based Science Education, we have introduced teachers to research via on-site observing experiences and partnerships with active astronomers. To successfully initiate research in the classroom, we have found that teachers need training in specific science content, use of specialized software to work with the data, development of research questions and objectives, and explicit pedagogical strategies for classroom use. Our research projects are well defined, though not "canned," and incorporate specific types of data, such as solar images. These data can be replaced with new data from an archive for the classroom research

  5. A Student Response System in an Electronic Classroom: Technology Aids for Large Classroom Instruction

    NASA Astrophysics Data System (ADS)

    Ober, D.; Errington, P.; Islam, S.; Robertson, T.; Watson, J.

    1997-10-01

In the fall of 1996, thirteen (13) classrooms on the Ball State campus were equipped with technological aids to enhance learning in large classrooms (for typically 100 students or larger). Each classroom was equipped with the following built-in equipment: computer, zip drive, laser disc player, VCR, LAN and Internet connection, TV monitors, and Elmo overhead camera with large-screen projection system. This past fall semester a student response system was added to a 108-seat classroom in the Physics and Astronomy department for use with large General Education courses. Each student seat was equipped with a hardwired hand-held unit possessing input capabilities and LCD feedback for the student. The student response system was added in order to encourage more active learning by students in the large classroom environment. Attendance, quizzes, hour exams, and in-class surveys are early uses for the system; initial reactions by student and faculty users will be given.

  6. Observations of Children’s Interactions with Teachers, Peers, and Tasks across Preschool Classroom Activity Settings

    PubMed Central

    Booren, Leslie M.; Downer, Jason T.; Vitiello, Virginia E.

    2014-01-01

    This descriptive study examined classroom activity settings in relation to children’s observed behavior during classroom interactions, child gender, and basic teacher behavior within the preschool classroom. 145 children were observed for an average of 80 minutes during 8 occasions across 2 days using the inCLASS, an observational measure that conceptualizes behavior into teacher, peer, task, and conflict interactions. Findings indicated that on average children’s interactions with teachers were higher in teacher-structured settings, such as large group. On average, children’s interactions with peers and tasks were more positive in child-directed settings, such as free choice. Children experienced more conflict during recess and routines/transitions. Finally, gender differences were observed within small group and meals. The implications of these findings might encourage teachers to be thoughtful and intentional about what types of support and resources are provided so children can successfully navigate the demands of particular settings. These findings are not meant to discourage certain teacher behaviors or imply value of certain classroom settings; instead, by providing an evidenced-based picture of the conditions under which children display the most positive interactions, teachers can be more aware of choices within these settings and have a powerful way to assist in professional development and interventions. PMID:25717282

  7. Using Flipped Classroom Approach to Explore Deep Learning in Large Classrooms

    ERIC Educational Resources Information Center

    Danker, Brenda

    2015-01-01

    This project used two Flipped Classroom approaches to stimulate deep learning in large classrooms during the teaching of a film module as part of a Diploma in Performing Arts course at Sunway University, Malaysia. The flipped classes utilized either a blended learning approach where students first watched online lectures as homework, and then…

  8. Silent Students' Participation in a Large Active Learning Science Classroom

    ERIC Educational Resources Information Center

    Obenland, Carrie A.; Munson, Ashlyn H.; Hutchinson, John S.

    2012-01-01

    Active learning in large science classrooms furthers opportunities for students to engage in the content and in meaningful learning, yet students can still remain anonymously silent. This study aims to understand the impact of active learning on these silent students in a large General Chemistry course taught via Socratic questioning and…

  9. Lessons Learned from a Multiculturally, Economically Diverse Classroom Setting.

    ERIC Educational Resources Information Center

    Lyman, Lawrence

    For her sabbatical a professor of teacher education at Emporia State University returned to the elementary classroom after a 20-year absence to teach in a third/fourth combination classroom in the Emporia, Kansas Public Schools. The return to elementary classroom teaching provided the professor with the opportunity to utilize some of the social…

  10. Teaching the Assessment of Normality Using Large Easily-Generated Real Data Sets

    ERIC Educational Resources Information Center

    Kulp, Christopher W.; Sprechini, Gene D.

    2016-01-01

    A classroom activity is presented, which can be used in teaching students statistics with an easily generated, large, real world data set. The activity consists of analyzing a video recording of an object. The colour data of the recorded object can then be used as a data set to explore variation in the data using graphs including histograms,…
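The analysis step of an activity like this one can be sketched briefly. The article's data come from a video recording of an object; since that recording is not available here, a synthetic sample stands in for the colour channel (an assumption), and normality is assessed with a histogram plus moment-based checks rather than any specific method the authors may have used.

```python
# Hedged sketch: assess whether a large "colour" data set looks normal,
# using histogram counts plus sample skewness and excess kurtosis.
import numpy as np

rng = np.random.default_rng(0)
# stand-in for one colour channel extracted from video frames (assumption)
colour_values = rng.normal(loc=128.0, scale=12.0, size=5000)

# graphical check: histogram counts (students would plot these)
counts, bin_edges = np.histogram(colour_values, bins=30)

# moment-based checks: for normal data, skewness ~ 0 and excess kurtosis ~ 0
z = (colour_values - colour_values.mean()) / colour_values.std()
skewness = np.mean(z ** 3)
excess_kurtosis = np.mean(z ** 4) - 3.0
```

A classroom version would swap the synthetic sample for the per-pixel or per-frame colour values from the recording; the variation the abstract mentions then shows up directly in the histogram's spread.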

  11. Observations of Children's Interactions with Teachers, Peers, and Tasks across Preschool Classroom Activity Settings

    ERIC Educational Resources Information Center

    Booren, Leslie M.; Downer, Jason T.; Vitiello, Virginia E.

    2012-01-01

    Research Findings: This descriptive study examined classroom activity settings in relation to children's observed behavior during classroom interactions, child gender, and basic teacher behavior within the preschool classroom. A total of 145 children were observed for an average of 80 min during 8 occasions across 2 days using the Individualized…

  12. Designing an Electronic Classroom for Large College Courses.

    ERIC Educational Resources Information Center

    Aiken, Milam W.; Hawley, Delvin D.

    1995-01-01

    Describes a state-of-the-art electronic classroom at the University of Mississippi School of Business designed for large numbers of students and regularly scheduled classes. Highlights include: architecture of the room, hardware components, software utilized in the room, and group decision support system software and its uses. (JKP)

  13. On Flipping the Classroom in Large First Year Calculus Courses

    ERIC Educational Resources Information Center

    Jungic, Veselin; Kaur, Harpreet; Mulholland, Jamie; Xin, Cindy

    2015-01-01

    Over the course of two years, 2012-2014, we have implemented a "flipping" the classroom approach in three of our large enrolment first year calculus courses: differential and integral calculus for scientists and engineers. In this article we describe the details of our particular approach and share with the reader some experiences of…

  14. On flipping the classroom in large first year calculus courses

    NASA Astrophysics Data System (ADS)

    Jungić, Veselin; Kaur, Harpreet; Mulholland, Jamie; Xin, Cindy

    2015-05-01

    Over the course of two years, 2012--2014, we have implemented a 'flipping' the classroom approach in three of our large enrolment first year calculus courses: differential and integral calculus for scientists and engineers. In this article we describe the details of our particular approach and share with the reader some experiences of both instructors and students.

  15. Treatment of Encopresis in a Classroom Setting: A Case Study

    ERIC Educational Resources Information Center

    Scott, E.

    1977-01-01

    This study describes the procedure and results of a behavior modification program carried out in the classroom and aimed at eliminating encopresis (involuntary defecation) in an 8-year-old boy. (Editor/RK)

  16. The Emergence of Student Creativity in Classroom Settings: A Case Study of Elementary Schools in Korea

    ERIC Educational Resources Information Center

    Cho, Younsoon; Chung, Hye Young; Choi, Kyoulee; Seo, Choyoung; Baek, Eunjoo

    2013-01-01

    This research explores the emergence of student creativity in classroom settings, specifically within two content areas: science and social studies. Fourteen classrooms in three elementary schools in Korea were observed, and the teachers and students were interviewed. The three types of student creativity emerging in the teaching and learning…

  17. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    ERIC Educational Resources Information Center

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  18. The Categorical Facilitation Effects on L2 Vocabulary Learning in a Classroom Setting

    ERIC Educational Resources Information Center

    Hoshino, Yuko

    2010-01-01

    In the field of vocabulary acquisition, there have been many studies on the efficacy of word lists. However, very few of these were based on research in a classroom setting, and therefore, their results may not be applicable to standard classroom situations. This study investigated which of the five types of word lists (synonyms, antonyms,…

  19. Clickers in the large classroom: current research and best-practice tips.

    PubMed

    Caldwell, Jane E

    2007-01-01

    Audience response systems (ARS) or clickers, as they are commonly called, offer a management tool for engaging students in the large classroom. Basic elements of the technology are discussed. These systems have been used in a variety of fields and at all levels of education. Typical goals of ARS questions are discussed, as well as methods of compensating for the reduction in lecture time that typically results from their use. Examples of ARS use occur throughout the literature and often detail positive attitudes from both students and instructors, although exceptions do exist. When used in classes, ARS clickers typically have either a benign or positive effect on student performance on exams, depending on the method and extent of their use, and create a more positive and active atmosphere in the large classroom. These systems are especially valuable as a means of introducing and monitoring peer learning methods in the large lecture classroom. So that the reader may use clickers effectively in his or her own classroom, a set of guidelines for writing good questions and a list of best-practice tips have been culled from the literature and experienced users. PMID:17339389

  20. Clickers in the Large Classroom: Current Research and Best-Practice Tips

    PubMed Central

    2007-01-01

    Audience response systems (ARS) or clickers, as they are commonly called, offer a management tool for engaging students in the large classroom. Basic elements of the technology are discussed. These systems have been used in a variety of fields and at all levels of education. Typical goals of ARS questions are discussed, as well as methods of compensating for the reduction in lecture time that typically results from their use. Examples of ARS use occur throughout the literature and often detail positive attitudes from both students and instructors, although exceptions do exist. When used in classes, ARS clickers typically have either a benign or positive effect on student performance on exams, depending on the method and extent of their use, and create a more positive and active atmosphere in the large classroom. These systems are especially valuable as a means of introducing and monitoring peer learning methods in the large lecture classroom. So that the reader may use clickers effectively in his or her own classroom, a set of guidelines for writing good questions and a list of best-practice tips have been culled from the literature and experienced users. PMID:17339389

  1. Setting of Classroom Environments for Hearing Impaired Children

    ERIC Educational Resources Information Center

    Turan, Zerrin

    2007-01-01

    This paper aims to explain effects of acoustical environments in sound perception of hearing impaired people. Important aspects of sound and hearing impairment are explained. Detrimental factors in acoustic conditions for speech perception are mentioned. Necessary acoustic treatment in classrooms and use of FM systems to eliminate these factors…

  2. Researching Pupil Attending Behavior within Naturalistic Classroom Settings.

    ERIC Educational Resources Information Center

    Brooks, Douglas M.; Rogers, Constance J.

    1981-01-01

    Examines the relationship between teacher attitudes toward students and visual attending behavior in the classroom. Thirty-two students were identified in four categories, subsequently labeled accepted, indifferent, concerned and rejected. Results indicated significant differences in visual attending behavior and a two-way interaction with pupil…

  3. Thinking Routines: Replicating Classroom Practices within Museum Settings

    ERIC Educational Resources Information Center

    Wolberg, Rochelle Ibanez; Goff, Allison

    2012-01-01

    This article describes thinking routines as tools to guide and support young children's thinking. These learning strategies, developed by Harvard University's Project Zero Classroom, actively engage students in constructing meaning while also understanding their own thinking process. The authors discuss how thinking routines can be used in both…

  4. Twelve Practical Strategies To Prevent Behavioral Escalation in Classroom Settings.

    ERIC Educational Resources Information Center

    Shukla-Mehta, Smita; Albin, Richard W.

    2003-01-01

    Twelve practical strategies that can be used by classroom teachers to prevent behavioral escalation are discussed, including reinforce calm, know the triggers, pay attention to anything unusual, do not escalate, intervene early, know the function of problem behavior, use extinction wisely, teach prosocial behavior, and teach academic survival…

  5. Understanding Bystander Perceptions of Cyberbullying in Inclusive Classroom Settings

    ERIC Educational Resources Information Center

    Guckert, Mary

    2013-01-01

    Cyberbullying is a pervasive problem that puts students at risk of successful academic outcomes and the ability to feel safe in school. As most students with disabilities are served in inclusive classrooms, there is a growing concern that students with special needs are at an increased risk of online bullying harassment. Enhancing responsible…

  6. Knowledge Discovery in Large Data Sets

    SciTech Connect

    Simas, Tiago; Silva, Gabriel; Miranda, Bruno; Ribeiro, Rita

    2008-12-05

In this work we briefly address the problem of unsupervised classification on large datasets of magnitude around 100,000,000 objects. The objects are variable objects, which are around 10% of the 1,000,000,000 astronomical objects that will be collected by the GAIA/ESA mission. We tested unsupervised classification algorithms on known datasets such as the OGLE and Hipparcos catalogs. Moreover, we are building several templates to represent the main classes of variable objects as well as new classes to build a synthetic dataset of this dimension. In the future we will run the GAIA satellite scanning law on these templates to obtain a testable large dataset.
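The kind of unsupervised classification this record tests can be illustrated with a minimal k-means sketch. Everything here is an assumption for illustration: the two synthetic "variable object" classes in a (period, amplitude) feature space stand in for real catalog data, and plain k-means stands in for whichever algorithms the authors evaluated.

```python
# Minimal k-means sketch for unsupervised classification of synthetic
# "variable object" features (illustrative stand-in, not the paper's method).
import numpy as np

rng = np.random.default_rng(1)
# two tight synthetic classes in (period, amplitude) space
class_a = rng.normal([0.5, 1.0], 0.05, size=(100, 2))
class_b = rng.normal([2.0, 0.2], 0.05, size=(100, 2))
features = np.vstack([class_a, class_b])

def kmeans(x, k, iters=20):
    """Plain k-means with deterministic farthest-first initialization."""
    centroids = [x[0]]
    for _ in range(k - 1):
        # next centroid: the point farthest from all current centroids
        d = np.min([((x - c) ** 2).sum(-1) for c in centroids], axis=0)
        centroids.append(x[np.argmax(d)])
    centroids = np.array(centroids)
    for _ in range(iters):
        # assign each point to its nearest centroid, then update centroids
        labels = np.argmin(((x[:, None] - centroids) ** 2).sum(-1), axis=1)
        centroids = np.array([x[labels == j].mean(axis=0) for j in range(k)])
    return labels, centroids

labels, centroids = kmeans(features, k=2)
```

At catalog scale (10^8 objects) the same assignment/update structure survives, but each step is typically mini-batched or distributed rather than computed over the full matrix as here.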

  7. Introduction to comparing large sequence sets.

    PubMed

    Page, Roderic D M

    2003-02-01

    Comparisons of whole genomes can yield important insights into the evolution of genome structure, such as the role of inversions in bacterial evolution and the identification of large-scale duplications in the human genome. This unit briefly compares two tools for aligning whole genome sequences: MUMmer and PipMaker. These tools differ both in the underlying algorithms used and in the interface they present to the user. PMID:18428691

  8. Activity Settings and Daily Routines in Preschool Classrooms: Diverse Experiences in Early Learning Settings for Low-Income Children

    ERIC Educational Resources Information Center

    Fuligni, Allison Sidle; Howes, Carollee; Huang, Yiching; Hong, Sandra Soliday; Lara-Cinisomo, Sandraluz

    2012-01-01

    This paper examines activity settings and daily classroom routines experienced by 3- and 4-year-old low-income children in public center-based preschool programs, private center-based programs, and family child care homes. Two daily routine profiles were identified using a time-sampling coding procedure: a High Free-Choice pattern in which…

  9. Activity Settings and Daily Routines in Preschool Classrooms: Diverse Experiences in Early Learning Settings for Low-Income Children

    PubMed Central

    Fuligni, Allison Sidle; Howes, Carollee; Huang, Yiching; Hong, Sandra Soliday; Lara-Cinisomo, Sandraluz

    2011-01-01

    This paper examines activity settings and daily classroom routines experienced by 3- and 4-year-old low-income children in public center-based preschool programs, private center-based programs, and family child care homes. Two daily routine profiles were identified using a time-sampling coding procedure: a High Free-Choice pattern in which children spent a majority of their day engaged in child-directed free-choice activity settings combined with relatively low amounts of teacher-directed activity, and a Structured-Balanced pattern in which children spent relatively equal proportions of their day engaged in child-directed free-choice activity settings and teacher-directed small- and whole-group activities. Daily routine profiles were associated with program type and curriculum use but not with measures of process quality. Children in Structured-Balanced classrooms had more opportunities to engage in language and literacy and math activities, whereas children in High Free-Choice classrooms had more opportunities for gross motor and fantasy play. Being in a Structured-Balanced classroom was associated with children’s language scores but profiles were not associated with measures of children’s math reasoning or socio-emotional behavior. Consideration of teachers’ structuring of daily routines represents a valuable way to understand nuances in the provision of learning experiences for young children in the context of current views about developmentally appropriate practice and school readiness. PMID:22665945

  10. Activity Settings and Daily Routines in Preschool Classrooms: Diverse Experiences in Early Learning Settings for Low-Income Children.

    PubMed

    Fuligni, Allison Sidle; Howes, Carollee; Huang, Yiching; Hong, Sandra Soliday; Lara-Cinisomo, Sandraluz

    2012-06-01

    This paper examines activity settings and daily classroom routines experienced by 3- and 4-year-old low-income children in public center-based preschool programs, private center-based programs, and family child care homes. Two daily routine profiles were identified using a time-sampling coding procedure: a High Free-Choice pattern in which children spent a majority of their day engaged in child-directed free-choice activity settings combined with relatively low amounts of teacher-directed activity, and a Structured-Balanced pattern in which children spent relatively equal proportions of their day engaged in child-directed free-choice activity settings and teacher-directed small- and whole-group activities. Daily routine profiles were associated with program type and curriculum use but not with measures of process quality. Children in Structured-Balanced classrooms had more opportunities to engage in language and literacy and math activities, whereas children in High Free-Choice classrooms had more opportunities for gross motor and fantasy play. Being in a Structured-Balanced classroom was associated with children's language scores but profiles were not associated with measures of children's math reasoning or socio-emotional behavior. Consideration of teachers' structuring of daily routines represents a valuable way to understand nuances in the provision of learning experiences for young children in the context of current views about developmentally appropriate practice and school readiness. PMID:22665945

  11. Large-N in Volcano Settings: Volcanosri

    NASA Astrophysics Data System (ADS)

    Lees, J. M.; Song, W.; Xing, G.; Vick, S.; Phillips, D.

    2014-12-01

    We seek a paradigm shift in the approach we take to volcano monitoring, where high fidelity is traded for large numbers of sensors to increase coverage and resolution. Accessibility, danger, and the risk of equipment loss require that we develop systems that are independent and inexpensive. Furthermore, rather than simply recording data on hard disk for later analysis, we desire a system that will work autonomously, capitalizing on wireless technology and in-field network analysis. To this end we are currently producing a low-cost seismic array which will incorporate, at the most basic level, seismological tools for first-cut analysis of a volcano in crisis mode. At the advanced end we expect to perform tomographic inversions in the network in near real time. Geophone (4 Hz) sensors connected to a low-cost recording system will be installed on an active volcano, where triggering, earthquake location, and velocity analysis will take place independent of human interaction. Stations are designed to be inexpensive and possibly disposable. In one of the first implementations, the seismic nodes consist of an Arduino Due processor board with an attached Seismic Shield. The Arduino Due processor board contains an Atmel SAM3X8E ARM Cortex-M3 CPU. This 32-bit 84 MHz processor can filter and perform coarse seismic event detection on a 1600-sample signal in fewer than 200 milliseconds. The Seismic Shield contains a GPS module, a 900 MHz high-power mesh-network radio, an SD card, a seismic amplifier, and a 24-bit ADC. External sensors can be attached either to this 24-bit ADC or to the internal multichannel 12-bit ADC on the Arduino Due processor board, allowing the node to support multiple sensors. By utilizing a high-speed 32-bit processor, complex signal-processing tasks can be performed simultaneously on multiple sensors. Using a 10 W solar panel, a second system being developed can run autonomously and collect data on 3 channels at 100 Hz for 6 months.
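
    The coarse on-node event detection described above is commonly done with a short-term-average/long-term-average (STA/LTA) trigger, a standard first-cut seismic detection scheme. The sketch below is a hypothetical pure-Python illustration of that idea, not the authors' firmware; the window lengths and threshold are invented for the example.

    ```python
    # STA/LTA trigger sketch: an event is declared when the short-term
    # average amplitude rises sharply relative to the long-term average.

    def sta_lta(signal, sta_len=5, lta_len=50):
        """Return the STA/LTA ratio at each sample (0 until the LTA window fills)."""
        ratios = []
        for i in range(len(signal)):
            if i + 1 < lta_len:
                ratios.append(0.0)
                continue
            sta = sum(abs(x) for x in signal[i + 1 - sta_len:i + 1]) / sta_len
            lta = sum(abs(x) for x in signal[i + 1 - lta_len:i + 1]) / lta_len
            ratios.append(sta / lta if lta > 0 else 0.0)
        return ratios

    def detect(signal, threshold=3.0, **kw):
        """Sample indices where the STA/LTA ratio first crosses the threshold."""
        r = sta_lta(signal, **kw)
        return [i for i in range(1, len(r)) if r[i] >= threshold > r[i - 1]]

    # Quiet background with a sudden burst starting at sample 80.
    trace = [0.01] * 80 + [1.0] * 20
    print(detect(trace))  # triggers at the burst onset
    ```

    Real deployments (e.g. ObsPy's trigger routines) add tapering, band-pass filtering, and de-trigger logic, but the core ratio test fits comfortably in the sub-200-ms budget quoted for the Cortex-M3.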

  12. Content-Based Instruction for English Language Learners: An Exploration across Multiple Classroom Settings

    ERIC Educational Resources Information Center

    Park, Seo Jung

    2009-01-01

    This study explored the content-based literacy instruction of English language learners (ELLs) across multiple classroom settings in U.S. elementary schools. The following research questions guided the study: (a) How are ELLs taught English in two types of instructional settings: regular content-area literacy instruction in the all-English…

  13. The Transition of Women from the Classroom Setting to the Educational Administration Setting

    ERIC Educational Resources Information Center

    Zachreson, Sarah A.

    2011-01-01

    This qualitative case study examined the research exploring how female teachers had perceived their potential challenges in becoming a principal, and how those perceptions actually changed as they made the move from the classroom to the principal's office. The purpose of the study is to investigate how female administrative candidates assessed and…

  14. Large Data at Small Universities: Astronomical processing using a computer classroom

    NASA Astrophysics Data System (ADS)

    Fuller, Nathaniel James; Clarkson, William I.; Fluharty, Bill; Belanger, Zach; Dage, Kristen

    2016-06-01

    The use of large computing clusters for astronomy research is becoming more commonplace as datasets expand, but access to these resources can be difficult for research groups working at smaller universities. As an alternative to purchasing processing time on an off-site computing cluster, or purchasing dedicated hardware, we show how one can easily build a crude on-site cluster by utilizing idle cycles on instructional computers in computer-lab classrooms. Since these computers are maintained as part of the educational mission of the university, the resource impact on the investigator is generally low. By using open-source Python routines, it is possible to have a large number of desktop computers working together via a local network to sort through large data sets. By running traditional analysis routines in an “embarrassingly parallel” manner, gains in speed are accomplished without requiring the investigator to learn highly specialized programming methods. We demonstrate this concept applied to (1) photometry of large-format images and (2) statistical significance tests for X-ray lightcurve analysis. In these scenarios, we see a speed-up factor which scales almost linearly with the number of cores in the cluster. Additionally, we show that usage of the cluster does not severely limit performance for a local user, and indeed the processing can be performed while the computers are in use for classroom purposes.
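
    The “embarrassingly parallel” pattern described above can be sketched with Python's standard `multiprocessing` module: each worker processes one independent chunk, so speed-up scales with core count. The `reduce_image` routine below is a stand-in invented for illustration, not the authors' photometry code.

    ```python
    # Embarrassingly parallel sketch: map an independent per-image
    # analysis routine over a pool of worker processes. No worker
    # communicates with another, so adding cores scales almost linearly.
    from multiprocessing import Pool

    def reduce_image(pixels):
        """Stand-in analysis routine: a background-subtracted flux sum."""
        background = min(pixels)
        return sum(p - background for p in pixels)

    if __name__ == "__main__":
        # One list per "image"; in practice these would be full image frames.
        images = [[10, 12, 50, 11], [5, 5, 9, 5], [0, 1, 2, 3]]
        with Pool(processes=3) as pool:
            results = pool.map(reduce_image, images)
        print(results)
    ```

    The same `pool.map` call distributes work whether the pool spans one machine or, with a cluster library, many lab computers; the `if __name__ == "__main__"` guard is required so worker processes can re-import the module safely.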

  15. How Passive-Aggressive Behavior in Emotionally Disturbed Children Affects Peer Interactions in a Classroom Setting.

    ERIC Educational Resources Information Center

    Hardt, Janet

    Passive-aggressive behavior in an emotionally disturbed child affects the child's progress and affects peer interactions in classroom settings. Passive-aggressive personalities are typically helpless, dependent, impulsive, overly anxious, poorly oriented to reality, and procrastinating. The characteristics of passive-aggressive children need to be…

  16. Descriptive Analysis of Classroom Setting Events on the Social Behaviors of Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Boyd, Brian A.; Conroy, Maureen A.; Asmus, Jennifer M.; McKenney, Elizabeth L. W.; Mancil, G. Richmond

    2008-01-01

    Children with Autism Spectrum Disorder (ASD) are characterized by extreme deficits in social relatedness with same-age peers. The purpose of this descriptive study was to identify naturally occurring antecedent variables (i.e., setting events) in the classroom environments of children with ASD that promoted their engagement in peer-related social…

  17. Enhancing Knowledge Transfer in Classroom versus Online Settings: The Interplay among Instructor, Student, Content, and Context

    ERIC Educational Resources Information Center

    Nemanich, Louise; Banks, Michael; Vera, Dusya

    2009-01-01

    This article integrates management education and organizational learning theories to identify the factors that drive the differences in student outcomes between the online and classroom settings. We draw upon theory on knowledge transfer barriers in organizations to understand the interlinking relationships among presage conditions, deep learning…

  18. Generalizability and Decision Studies to Inform Observational and Experimental Research in Classroom Settings

    ERIC Educational Resources Information Center

    Bottema-Beutel, Kristen; Lloyd, Blair; Carter, Erik W.; Asmus, Jennifer M.

    2014-01-01

    Attaining reliable estimates of observational measures can be challenging in school and classroom settings, as behavior can be influenced by multiple contextual factors. Generalizability (G) studies can enable researchers to estimate the reliability of observational data, and decision (D) studies can inform how many observation sessions are…

  19. The Impact of Physical Settings on Pre-Schoolers' Classroom Organization

    ERIC Educational Resources Information Center

    Tadjic, Mirko; Martinec, Miroslav; Farago, Amalija

    2015-01-01

    The physical setting plays an important role in the lives of pre-schoolers and can be an important component of children's experience and development when it is wisely and meaningfully designed. The classroom organization enhances and supports the pre-schooler's capability to perform activities himself, initiate and finish tasks, creates the…

  20. Performance in an Online Introductory Course in a Hybrid Classroom Setting

    ERIC Educational Resources Information Center

    Aly, Ibrahim

    2013-01-01

    This study compared the academic achievement between undergraduate students taking an introductory managerial accounting course online (N = 104) and students who took the same course in a hybrid classroom setting (N = 203). Student achievement was measured using scores from twelve weekly online assignments, two major online assignments, a final…

  1. Mobile-IT Education (MIT.EDU): M-Learning Applications for Classroom Settings

    ERIC Educational Resources Information Center

    Sung, M.; Gips, J.; Eagle, N.; Madan, A.; Caneel, R.; DeVaul, R.; Bonsen, J.; Pentland, A.

    2005-01-01

    In this paper, we describe the Mobile-IT Education (MIT.EDU) system, which demonstrates the potential of using a distributed mobile device architecture for rapid prototyping of wireless mobile multi-user applications for use in classroom settings. MIT.EDU is a stable, accessible system that combines inexpensive, commodity hardware, a flexible…

  2. Comparing Asynchronous Online Discussions and Face-to-Face Discussions in a Classroom Setting

    ERIC Educational Resources Information Center

    Wang, Qiyun; Woo, Huay Lit

    2007-01-01

    The purpose of this study is to investigate the perceived differences between asynchronous online discussions and face-to-face discussions in a classroom setting. The students' reflections were analysed by following a qualitative research approach. The results showed that atmosphere, response, efficiency, interactivity and communication were the…

  3. Conceptualizing the Classroom of Target Students: A Qualitative Investigation of Panelists' Experiences during Standard Setting

    ERIC Educational Resources Information Center

    Hein, Serge F.; Skaggs, Gary

    2010-01-01

    Increasingly, research has focused on the cognitive processes associated with various standard-setting activities. This qualitative study involved an examination of 16 third-grade reading teachers' experiences with the cognitive task of conceptualizing an entire classroom of hypothetical target students when the single-passage bookmark method or…

  4. Civility in the University Classroom: An Opportunity for Faculty to Set Expectations

    ERIC Educational Resources Information Center

    Ward, Chris; Yates, Dan

    2014-01-01

    This research examines the types of uncivil behaviors frequently encountered in university classrooms. These behaviors range from walking in late to class, texting in class, and/or unprofessional emails. These behaviors can often undermine a professor's teaching. Setting reasonable and consistent expectations is a combination of university policy,…

  5. Use of Big-Screen Films in Multiple Childbirth Education Classroom Settings

    PubMed Central

    Kaufman, Tamara

    2010-01-01

    Although two recent films, Orgasmic Birth and Pregnant in America, were intended for the big screen, they can also serve as valuable teaching resources in multiple childbirth education settings. Each film conveys powerful messages about birth and today's birthing culture. Depending on a childbirth educator's classroom setting (hospital, birthing center, or home birth environment), particular portions in each film, along with extra clips featured on the films' DVDs, can enhance an educator's curriculum and spark compelling discussions with class participants. PMID:21358831

  6. Comparing Functional Analysis and Paired-choice Assessment Results in Classroom Settings

    PubMed Central

    Berg, Wendy K; Wacker, David P; Cigrand, Karla; Merkle, Steve; Wade, Jeanie; Henry, Kim; Wang, Yu-Chia

    2007-01-01

    The results of a functional analysis of problem behavior and a paired-choice assessment were compared to determine whether the same social reinforcers were identified for problem behavior and an appropriate response (time allocation). The two assessments were conducted in classroom settings with 4 adolescents with mental retardation who engaged in severe problem behavior. Each student's classroom teacher served as the therapist for all phases of assessment. The two assessment procedures identified the same social reinforcers for problem and appropriate behavior for 3 of 4 participants. PMID:17970268

  7. The Relation between High School Teacher Sense of Teaching Efficacy and Self-Reported Attitudes toward the Inclusive Classroom Settings

    ERIC Educational Resources Information Center

    Wright, Heather Dillehay

    2013-01-01

    The purpose of this study was to investigate if collective sense of teaching efficacy, general sense of teaching efficacy, or personal sense of teacher efficacy influenced teacher attitude toward inclusive classroom settings. Additionally, the study sought to determine if teacher attitude toward inclusive classroom settings differed when taking…

  8. Teaching Methodology in a "Large Power Distance" Classroom: A South Korean Context

    ERIC Educational Resources Information Center

    Jambor, Paul Z.

    2005-01-01

    This paper looks at South Korea as an example of a collectivist society having a rather large power distance dimension value. In a traditional Korean classroom the teacher is at the top of the classroom hierarchy, while the students are the passive participants. Gender and age play a role in the hierarchy between students themselves. Teaching…

  9. Strategies for Engaging FCS Learners in a Large-Format Classroom: Embedded Videos

    ERIC Educational Resources Information Center

    Leslie, Catherine Amoroso

    2014-01-01

    This article presents a method for utilizing technology to increase student engagement in large classroom formats. In their lives outside the classroom, students spend considerable time interfacing with media, and they are receptive to information conveyed in electronic formats. Research has shown that multimedia is an effective learning resource;…

  10. INTERIOR VIEW, SETTING LARGE CORE WITH ASSISTANCE FROM THE OVERHEAD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW, SETTING LARGE CORE WITH ASSISTANCE FROM THE OVERHEAD RAIL CRANE IN BOX FLOOR MOLD AREA (WORKERS: DAN T. WELLS AND TRUMAN CARLISLE). - Stockham Pipe & Fittings Company, Ductile Iron Foundry, 4000 Tenth Avenue North, Birmingham, Jefferson County, AL

  11. [The BASYS observation system for the analysis of aggressive behavior in classroom-settings].

    PubMed

    Wettstein, Alexander

    2012-01-01

    Educational or therapeutic measures of aggressive student behavior are often based on the judgments of teachers. However, empirical studies show that the objectivity of these judgments is generally low. In order to assess aggressive behavior in classroom settings, we developed a context-sensitive observational system. The observation system exists in a version for teachers in action as well as a version for the uninvolved observer. The teacher version allows categorizing aggressive behavior while teaching. The aim is to differentiate the perception and the judgments of teachers, so that the judgments can serve as trustworthy diagnostic information. The version for an independent observer, in addition, contains categories to collect information about the context in which aggression takes place. The behavior observation system was tested in four field studies in regular and special classes. The empirical results show that, after training, teachers were able to make objective observations, and that aggressive behavior depends to a large extent on situational factors. The system allows identification of problematic people-environment relationships and the derivation of intervention measures. PMID:22748725

  12. Generalizability and decision studies to inform observational and experimental research in classroom settings.

    PubMed

    Bottema-Beutel, Kristen; Lloyd, Blair; Carter, Erik W; Asmus, Jennifer M

    2014-11-01

    Attaining reliable estimates of observational measures can be challenging in school and classroom settings, as behavior can be influenced by multiple contextual factors. Generalizability (G) studies can enable researchers to estimate the reliability of observational data, and decision (D) studies can inform how many observation sessions are necessary to achieve a criterion level of reliability. We conducted G and D studies using observational data from a randomized control trial focusing on social and academic participation of students with severe disabilities in inclusive secondary classrooms. Results highlight the importance of anchoring observational decisions to reliability estimates from existing or pilot data sets. We outline steps for conducting G and D studies and address options when reliability estimates are lower than desired. PMID:25354126
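
    The D-study logic above can be sketched with the one-facet generalizability coefficient, which shows how reliability grows as error variance is averaged over more observation sessions. The variance components below are made-up numbers for illustration, not estimates from this study.

    ```python
    # One-facet D-study sketch: the generalizability coefficient
    #   E(rho^2) = var_persons / (var_persons + var_error / n_sessions)
    # rises with the number of sessions, so we can solve for how many
    # sessions reach a criterion reliability. Variance components here
    # are hypothetical.

    def g_coefficient(var_person, var_error, n_sessions):
        """Reliability of a mean score over n_sessions observations."""
        return var_person / (var_person + var_error / n_sessions)

    def sessions_needed(var_person, var_error, target=0.80):
        """Smallest session count whose G coefficient meets the target."""
        n = 1
        while g_coefficient(var_person, var_error, n) < target:
            n += 1
        return n

    var_person, var_error = 4.0, 6.0  # hypothetical variance components
    for n in (1, 2, 6):
        print(n, round(g_coefficient(var_person, var_error, n), 3))
    print(sessions_needed(var_person, var_error, target=0.80))
    ```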

  13. Adaptive, multiresolution visualization of large data sets using parallel octrees.

    SciTech Connect

    Freitag, L. A.; Loy, R. M.

    1999-06-10

    The interactive visualization and exploration of large scientific data sets is a challenging and difficult task; their size often far exceeds the performance and memory capacity of even the most powerful graphics work-stations. To address this problem, we have created a technique that combines hierarchical data reduction methods with parallel computing to allow interactive exploration of large data sets while retaining full-resolution capability. The hierarchical representation is built in parallel by strategically inserting field data into an octree data structure. We provide functionality that allows the user to interactively adapt the resolution of the reduced data sets so that resolution is increased in regions of interest without sacrificing local graphics performance. We describe the creation of the reduced data sets using a parallel octree, the software architecture of the system, and the performance of this system on the data from a Rayleigh-Taylor instability simulation.
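
    The hierarchical octree representation described above can be sketched in a few lines. This is a minimal serial sketch of point insertion with node splitting, invented for illustration; the abstract's system builds the structure in parallel and stores field data rather than bare points.

    ```python
    # Minimal octree sketch: each node covers a cube and splits into 8
    # child octants once it holds more than `capacity` points, giving
    # the adaptive, multiresolution hierarchy described above.

    class Octree:
        def __init__(self, center, half, capacity=4):
            self.center, self.half, self.capacity = center, half, capacity
            self.points = []
            self.children = None  # 8 sub-octants once split

        def _octant(self, p):
            # Index 0..7 built from the sign of each coordinate offset.
            return sum((1 << i) for i in range(3) if p[i] >= self.center[i])

        def _split(self):
            self.children = []
            for idx in range(8):
                offset = [(self.half / 2) if (idx >> i) & 1 else -(self.half / 2)
                          for i in range(3)]
                child_center = tuple(c + o for c, o in zip(self.center, offset))
                self.children.append(Octree(child_center, self.half / 2, self.capacity))
            for p in self.points:  # push existing points down one level
                self.children[self._octant(p)].insert(p)
            self.points = []

        def insert(self, p):
            if self.children is not None:
                self.children[self._octant(p)].insert(p)
            elif len(self.points) < self.capacity:
                self.points.append(p)
            else:
                self._split()
                self.insert(p)

        def depth(self):
            if self.children is None:
                return 1
            return 1 + max(c.depth() for c in self.children)

    tree = Octree(center=(0.0, 0.0, 0.0), half=1.0, capacity=2)
    for p in [(0.1, 0.1, 0.1), (0.2, 0.2, 0.2), (0.3, 0.3, 0.3), (-0.5, -0.5, -0.5)]:
        tree.insert(p)
    print(tree.depth())  # clustered points force deeper subdivision
    ```

    Rendering at reduced resolution then amounts to truncating traversal at a chosen depth, which is what lets the user trade detail for interactivity in a region of interest.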

  14. Looking at large data sets using binned data plots

    SciTech Connect

    Carr, D.B.

    1990-04-01

    This report addresses the monumental challenge of developing exploratory analysis methods for large data sets. The goals of the report are to increase awareness of large data set problems and to contribute simple graphical methods that address some of the problems. The graphical methods focus on two- and three-dimensional data and common tasks such as finding outliers and tail structure, assessing central structure, and comparing central structures. The methods handle large-sample-size problems through binning, incorporate information from statistical models, and adapt image-processing algorithms. Examples demonstrate the application of the methods to a variety of publicly available large data sets. The most novel application addresses the "too many plots to examine" problem by using cognostics, computer-guided diagnostics, to prioritize plots. The particular application prioritizes views of computational fluid dynamics solution sets on the fly. That is, as each time step of a solution set is generated on a parallel processor, the cognostics algorithms assess virtual plots based on the previous time step. Work in such areas is in its infancy and the examples suggest numerous challenges that remain. 35 refs., 15 figs.
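
    The binning idea at the heart of binned data plots can be sketched simply: instead of drawing millions of overplotted points, count how many fall in each two-dimensional cell and display the counts (as shaded rectangles or hexagons). The helper below is an illustrative sketch, not the report's implementation.

    ```python
    # 2-D binning sketch: reduce n points to an nx-by-ny grid of counts,
    # which can be rendered regardless of how large n grows.

    def bin2d(xs, ys, nx, ny, xlim, ylim):
        """Return an nx-by-ny grid of counts for points (xs[i], ys[i])."""
        (x0, x1), (y0, y1) = xlim, ylim
        grid = [[0] * ny for _ in range(nx)]
        for x, y in zip(xs, ys):
            if not (x0 <= x < x1 and y0 <= y < y1):
                continue  # out-of-range points (outliers) can be listed separately
            i = int((x - x0) / (x1 - x0) * nx)
            j = int((y - y0) / (y1 - y0) * ny)
            grid[i][j] += 1
        return grid

    xs = [0.1, 0.2, 0.9, 0.9]
    ys = [0.1, 0.2, 0.9, 0.8]
    print(bin2d(xs, ys, nx=2, ny=2, xlim=(0.0, 1.0), ylim=(0.0, 1.0)))
    ```

    Hexagonal rather than rectangular cells (as in Carr's hexbin plots) reduce visual artifacts, but the counting step is the same.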

  15. Treatment of psychotic children in a classroom environment: I. Learning in a large group1

    PubMed Central

    Koegel, Robert L.; Rincover, Arnold

    1974-01-01

    The purpose of this study was to investigate systematically the feasibility of modifying the behavior of autistic children in a classroom environment. In the first experiment, eight autistic children were taught certain basic classroom behaviors (including attending to the teacher upon command, imitation, and an elementary speaking and recognition vocabulary) that were assumed to be necessary for subsequent learning to take place in the classroom. Based on research documenting the effectiveness of one-to-one (teacher-child ratio) procedures for modifying such behaviors, these behaviors were taught in one-to-one sessions. It was, however, found that behaviors taught in a one-to-one setting were not performed consistently in a classroom-sized group, or even in a group as small as two children with one teacher. Further, the children evidenced no acquisition of new behaviors in a classroom environment over a four-week period. Therefore, Experiment II introduced a treatment procedure based upon “fading in” the classroom stimulus situation from the one-to-one stimulus situation. Such treatment was highly effective in producing both a transfer in stimulus control and the acquisition of new behaviors in a kindergarten/first-grade classroom environment. PMID:4465373

  16. Reducing Information Overload in Large Seismic Data Sets

    SciTech Connect

    HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.; CARR,DORTHE B.; AGUILAR-CHANG,JULIO

    2000-08-02

    Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g., single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research

  17. Gaussian predictive process models for large spatial data sets

    PubMed Central

    Banerjee, Sudipto; Gelfand, Alan E.; Finley, Andrew O.; Sang, Huiyan

    2009-01-01

    Summary With scientific data available at geocoded locations, investigators are increasingly turning to spatial process models for carrying out statistical inference. Over the last decade, hierarchical models implemented through Markov chain Monte Carlo methods have become especially popular for spatial modelling, given their flexibility and power to fit models that would be infeasible with classical methods as well as their avoidance of possibly inappropriate asymptotics. However, fitting hierarchical spatial models often involves expensive matrix decompositions whose computational complexity increases in cubic order with the number of spatial locations, rendering such models infeasible for large spatial data sets. This computational burden is exacerbated in multivariate settings with several spatially dependent response variables. It is also aggravated when data are collected at frequent time points and spatiotemporal process models are used. With regard to this challenge, our contribution is to work with what we call predictive process models for spatial and spatiotemporal data. Every spatial (or spatiotemporal) process induces a predictive process model (in fact, arbitrarily many of them). The latter models project process realizations of the former to a lower dimensional subspace, thereby reducing the computational burden. Hence, we achieve the flexibility to accommodate non-stationary, non-Gaussian, possibly multivariate, possibly spatiotemporal processes in the context of large data sets. We discuss attractive theoretical properties of these predictive processes. We also provide a computational template encompassing these diverse settings. Finally, we illustrate the approach with simulated and real data sets. PMID:19750209

  18. Intelligent Archiving and Physics Mining of Large Data Sets (Invited)

    NASA Astrophysics Data System (ADS)

    Karimabadi, H.

    2009-12-01

    There are unique challenges in all aspects related to large data sets, from storage, search and access, to analysis and file sharing. With few exceptions, the adoption of the latest technologies to deal with the management and mining of large data sets has been slow in heliosciences. Web services such as CDAweb have been very successful and have been widely adopted by the community. There are also significant efforts going towards Virtual Observatories (VxOs). The main thrust of VxOs has so far been on data discovery, aggregation and uniform presentation. While work remains, many VxOs can now be used to access data. However data is not knowledge and the challenge of extracting physics from the large data sets remains. Here we review our efforts on (i) implementing advanced data mining techniques as part of the data-to-knowledge discovery pipeline, and (ii) use of social networking paradigm in the development of a science collaboratory environment that enables sharing of large files, creation of projects, among others. We will present new data mining software that works on a variety of data formats and demonstrate its capability through several examples of analysis of spacecraft data. The use of such techniques in intelligent archiving will be discussed. Finally, the use of our science collaboratory service and its unique sharing features such as universal accessibility of staged files will be illustrated.

  19. The attributes of an effective teacher differ between the classroom and the clinical setting.

    PubMed

    Haws, Jolene; Rannelli, Luke; Schaefer, Jeffrey P; Zarnke, Kelly; Coderre, Sylvain; Ravani, Pietro; McLaughlin, Kevin

    2016-10-01

    Most training programs use learners' subjective ratings of their teachers as the primary measure of teaching effectiveness. In a recent study we found that preclinical medical students' ratings of classroom teachers were associated with perceived charisma and physical attractiveness of the teacher, but not intellect. Here we explored whether the relationship between these variables and teaching effectiveness ratings holds in the clinical setting. We asked 27 Internal Medicine residents to rate the teaching effectiveness of ten teachers with whom they had worked on a clinical rotation, in addition to rating each teacher's clinical skills, physical attractiveness, and charisma. We used linear regression to study the association between these explanatory variables and teaching effectiveness ratings. We found no association between rating of physical attractiveness and teaching effectiveness. Clinical skill and charisma were independently associated with rating of teaching effectiveness (regression coefficients [95 % confidence interval] 0.73 [0.60, 0.85], p < 0.001 and 0.12 [0.01, 0.23], p = 0.03, respectively). The variables associated with effectiveness of classroom and clinical teachers differ, suggesting context specificity in teaching effectiveness ratings. Context specificity may be explained by differences in the exposure that learners have to teachers in the classroom versus the clinical setting, so that raters in the clinical setting may base ratings upon observed behaviours rather than stereotype data. Alternatively, since subjective ratings of teaching effectiveness inevitably incorporate learners' context-specific needs, the attributes that make a teacher effective in one context may not meet the needs of learners in a different context. PMID:26891679

  20. Computing Information-Theoretic Quantities in Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Knuth, K. H.; Castle, J. P.; Curry, C. T.; Gotera, A.; Huyser, K. A.; Wheeler, K. R.; Rossow, W. B.

    2005-12-01

    Information-theoretic quantities, such as mutual information, allow one to quantify the amount of information shared by two variables. In large data sets, mutual information can be used to identify sets of co-informative variables and thus to identify variables that can act as predictors of a phenomenon of interest. While mutual information alone does not distinguish a causal interaction between two variables, another information-theoretic quantity called the transfer entropy can indicate such possible causal interactions. Together, these quantities can be used to identify causal interactions among sets of variables in large distributed data sets. We are currently developing a suite of computational tools that will allow researchers to calculate these useful information-theoretic quantities from data. Our software tools estimate these quantities along with their associated error bars, the latter of which are critical for describing the degree of uncertainty in the estimates. In this presentation we demonstrate how mutual information and transfer entropy can be applied so as to allow researchers not only to identify relations among climate variables, but also to characterize and quantify their possible causal interactions.
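    The abstract does not describe the estimators used in the suite; as a hedged illustration, plug-in (histogram) estimators of both quantities for discretized variables can be sketched in a few lines. The function names are ours, and real climate work would add binning choices, bias corrections, and the error bars mentioned above:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Estimate I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log2(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

def transfer_entropy(src, dst):
    """Estimate T(src -> dst) in bits: the information src[t] adds about
    dst[t+1] beyond what dst[t] already provides (lag-1, discrete series)."""
    n = len(dst) - 1
    triples = Counter(zip(dst[1:], dst[:-1], src[:-1]))   # (d1, d0, s0)
    pairs_dd = Counter(zip(dst[1:], dst[:-1]))            # (d1, d0)
    pairs_ds = Counter(zip(dst[:-1], src[:-1]))           # (d0, s0)
    singles = Counter(dst[:-1])                           # d0
    te = 0.0
    for (d1, d0, s0), c in triples.items():
        # p(d1|d0,s0) / p(d1|d0) expressed directly in counts
        te += (c / n) * math.log2(
            (c * singles[d0]) / (pairs_dd[(d1, d0)] * pairs_ds[(d0, s0)]))
    return te
```

    A perfectly coupled pair of series gives 1 bit of mutual information, while a source that merely copies the destination's own past contributes zero transfer entropy.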

  1. Science Teacher Beliefs and Classroom Practice Related to Constructivism in Different School Settings

    ERIC Educational Resources Information Center

    Savasci, Funda; Berlin, Donna F.

    2012-01-01

    Science teacher beliefs and classroom practice related to constructivism and factors that may influence classroom practice were examined in this cross-case study. Data from four science teachers in two schools included interviews, demographic questionnaire, Classroom Learning Environment Survey (preferred/perceived), and classroom observations and…

  2. Interactive Web-Based Map: Applications to Large Data Sets in the Geosciences

    NASA Astrophysics Data System (ADS)

    Garbow, Z. A.; Olson, N. R.; Yuen, D. A.; Boggs, J. M.

    2001-12-01

    Current advances in computer hardware, information technology, and data collection techniques have produced very large data sets, sometimes exceeding terabytes, in a wide variety of scientific and engineering disciplines. We must harness this opportunity to visualize and extract useful information from geophysical and geological data. We have approached this data-mining task by using a map-like interface over the web for interrogating these humongous data sets, using a client-server paradigm. The spatial data are mapped onto a two-dimensional grid from which the user (client) can query the data with the map interface. The data are stored on a high-end compute server. The computational gateway separating the client and the server can be the front-end of an electronic publication, an electronic classroom, a Grid system device, or an e-business. We have used a combination of JAVA, JAVA-3D, and Perl for processing the data and communicating them between the client and the server. The user can interrogate the geospatial data over any particular region with arbitrary length scales and pose relevant statistical questions, such as histogram plots and local statistics. We have applied this method to the following data sets: (1) distribution of prime numbers, (2) two-dimensional mantle convection, (3) three-dimensional mantle convection, (4) high-resolution satellite reflectance data over the Upper Midwest for multiple wavelengths, and (5) molecular dynamics describing the flow of blood in narrow vessels. Using this map-interface concept, the user can interrogate these data directly over the web. This strategy for dissecting large data sets can be easily applied to other areas, such as satellite geodesy and earthquake data. This mode of data query may function in an adequately covered wireless web environment with a transfer rate of around 10 Mbit/sec.
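    As a hedged sketch of the server-side part of such a query (the function name and the list-of-rows grid are our own simplifications; the actual system used JAVA/Perl behind a client-server gateway), the histogram and local statistics for a user-selected rectangular window might be computed like this:

```python
import statistics

def region_stats(grid, r0, r1, c0, c1, bins=10):
    """Summary statistics and a histogram for a rectangular window of a
    2-D grid (list of rows), as a map-interface query might request."""
    values = [grid[r][c] for r in range(r0, r1) for c in range(c0, c1)]
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0   # guard against a constant region
    hist = [0] * bins
    for v in values:
        hist[min(int((v - lo) / width), bins - 1)] += 1
    return {"mean": statistics.fmean(values),
            "stdev": statistics.pstdev(values),
            "min": lo, "max": hi, "histogram": hist}
```

    The client would send only the window corners and receive this compact summary rather than the raw data, which is what keeps the interaction feasible for terabyte-scale sets.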

  3. Robust Coordination for Large Sets of Simple Rovers

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Agogino, Adrian

    2006-01-01

    The ability to coordinate sets of rovers in an unknown environment is critical to the long-term success of many of NASA's exploration missions. Such coordination policies must have the ability to adapt in unmodeled or partially modeled domains and must be robust against environmental noise and rover failures. In addition, such coordination policies must accommodate a large number of rovers without excessive and burdensome hand-tuning. In this paper we present a distributed coordination method that addresses these issues in the domain of controlling a set of simple rovers. The application of these methods allows reliable and efficient robotic exploration in dangerous, dynamic, and previously unexplored domains. Most control policies for space missions are directly programmed by engineers or created through the use of planning tools, and are appropriate for single rover missions or missions requiring the coordination of a small number of rovers. Such methods typically require significant amounts of domain knowledge, and are difficult to scale to large numbers of rovers. The method described in this article aims to address cases where a large number of rovers need to coordinate to solve a complex time dependent problem in a noisy environment. In this approach, each rover decomposes a global utility, representing the overall goal of the system, into rover-specific utilities that properly assign credit to the rover's actions. Each rover then has the responsibility to create a control policy that maximizes its own rover-specific utility. We show a method of creating rover-utilities that are "aligned" with the global utility, such that when the rovers maximize their own utility, they also maximize the global utility. In addition we show that our method creates rover-utilities that allow the rovers to create their control policies quickly and reliably. Our distributed learning method allows large sets of rovers to be used in unmodeled domains, while providing robustness against environmental noise and rover failures.
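    The abstract does not give the form of the rover-specific utilities; one standard way to make a utility "aligned" with the global one (a hedged sketch with toy names, not necessarily the paper's definition) is a difference utility: the global score minus the score with the rover's contribution removed, so actions that only duplicate other rovers' work earn nothing:

```python
def global_utility(observations):
    """Toy system-level objective: each point of interest is credited at
    most once, so redundant observations add nothing."""
    return len(set(observations))

def difference_utility(observations, i):
    """Rover i's share of the global utility: the global score minus the
    score computed with rover i's observation removed. Maximizing this
    also maximizes the global utility, which is the alignment property."""
    without_i = observations[:i] + observations[i + 1:]
    return global_utility(observations) - global_utility(without_i)
```

    With observations ["A", "A", "B"], rover 0 earns 0 (rover 1 already covers "A") while rover 2 earns 1 for its unique contribution, so credit is assigned to exactly the actions that matter globally.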

  4. Optimizing distance-based methods for large data sets

    NASA Astrophysics Data System (ADS)

    Scholl, Tobias; Brenner, Thomas

    2015-10-01

    Distance-based methods for measuring the spatial concentration of industries have become increasingly popular in the spatial econometrics community. However, a limiting factor for using these methods is their computational complexity, since both their memory requirements and running times are in O(n^2). In this paper, we present an algorithm with constant memory requirements and shorter running time, enabling distance-based methods to deal with large data sets. We discuss three recent distance-based methods in spatial econometrics: the D&O-Index by Duranton and Overman (Rev Econ Stud 72(4):1077-1106, 2005), the M-function by Marcon and Puech (J Econ Geogr 10(5):745-762, 2010) and the Cluster-Index by Scholl and Brenner (Reg Stud (ahead-of-print):1-15, 2014). Finally, we present an alternative calculation for the latter index that allows the use of data sets with millions of firms.
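    The paper's algorithm is not reproduced in the abstract, but the memory side of the idea can be sketched: distance-based indices need the distribution of pairwise distances, which can be accumulated into a fixed set of bins on the fly instead of materializing the O(n^2) distance matrix. This toy version keeps memory constant in n (its running time is still quadratic; the paper also reduces running time):

```python
import math

def distance_histogram(points, bin_width, max_dist):
    """Histogram of all pairwise distances of 2-D points, binned on the
    fly so memory is O(max_dist / bin_width) regardless of n."""
    nbins = int(max_dist / bin_width) + 1
    hist = [0] * nbins
    for i, (xi, yi) in enumerate(points):
        for xj, yj in points[i + 1:]:
            d = math.hypot(xi - xj, yi - yj)
            if d <= max_dist:
                hist[int(d / bin_width)] += 1
    return hist
```

    A kernel-smoothed density over these bins is the basic ingredient of D&O-style indices, so nothing larger than the histogram ever has to be held in memory.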

  5. Form Invariance Symmetry Generates a Large Set of FRW Cosmologies

    NASA Astrophysics Data System (ADS)

    Chimento, Luis P.; Richarte, Martín G.; Sánchez, Iván E.

    2013-02-01

    We show that Einstein's field equations for spatially flat Friedmann-Robertson-Walker (FRW) spacetimes have a form invariance symmetry (FIS) realized by form invariance transformations (FIT), which are generated by an invertible function of the source energy density. These transformations act on the Hubble expansion rate and on the energy density and pressure of the cosmic fluid; likewise, such transformations are endowed with a Lie group structure. Each representation of this group is associated with a particular fluid and consequently a determined cosmology, so that the FIS defines a set of equivalent cosmological models. We focus our search on the FIT generated by a linear function, because it provides a natural framework to express the duality and produces large sets of cosmologies, starting from a seed one, in several contexts, for instance a perfect fluid source or a scalar field driven by a potential depending linearly on the scalar field kinetic energy density.
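    The abstract does not spell the transformations out; the linear case can be reconstructed from the standard flat-FRW equations as follows (the generating function \bar{\rho} = n^2 \rho with constant n is an illustrative assumption on our part):

```latex
% Flat FRW equations (units with 8\pi G = 3):
3H^2 = \rho, \qquad \dot{\rho} + 3H(\rho + p) = 0.
% Linear generating function \bar{\rho} = n^2 \rho induces
\bar{H} = nH, \qquad \bar{\rho} + \bar{p} = n(\rho + p).
% Friedmann equation keeps its form: 3\bar{H}^2 = 3n^2H^2 = n^2\rho = \bar{\rho}.
% Conservation equation keeps its form:
\dot{\bar{\rho}} = n^2\dot{\rho} = -3n^2 H(\rho + p) = -3\bar{H}(\bar{\rho} + \bar{p}).
```

    Each choice of n maps a seed cosmology (H, \rho, p) to an equivalent one, which is the sense in which the symmetry generates a large set of FRW cosmologies.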

  6. A large-scale crop protection bioassay data set.

    PubMed

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J P; Bellis, Louisa J; Bento, A Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database. PMID:26175909

  7. Observations of Teacher-Child Interactions in Classrooms Serving Latinos and Dual Language Learners: Applicability of the Classroom Assessment Scoring System in Diverse Settings

    ERIC Educational Resources Information Center

    Downer, Jason T.; Lopez, Michael L.; Grimm, Kevin J.; Hamagami, Aki; Pianta, Robert C.; Howes, Carollee

    2012-01-01

    With the rising number of Latino and dual language learner (DLL) children attending pre-k and the importance of assessing the quality of their experiences in those settings, this study examined the extent to which a commonly used assessment of teacher-child interactions, the Classroom Assessment Scoring System (CLASS), demonstrated similar…

  8. Support vector machine classifiers for large data sets.

    SciTech Connect

    Gertz, E. M.; Griffin, J. D.

    2006-01-31

    This report concerns the generation of support vector machine classifiers for solving the pattern recognition problem in machine learning. Several methods are proposed based on interior point methods for convex quadratic programming. Software implementations are developed by adapting the object-oriented package OOQP to the problem structure and by using the software package PETSc to perform time-intensive computations in a distributed setting. Linear systems arising from classification problems with moderately large numbers of features are solved by using two techniques: one a parallel direct solver, the other a Krylov-subspace method incorporating novel preconditioning strategies. Numerical results are provided, and computational experience is discussed.
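    The report's interior-point/QP machinery (OOQP, PETSc) is not something a short snippet can reproduce; as a contrasting minimal sketch of large-data SVM training, a Pegasos-style stochastic subgradient method on the primal hinge loss processes one example at a time and so also scales to large sets (function names are ours):

```python
import random

def train_linear_svm(data, labels, lam=0.01, epochs=20, seed=0):
    """Stochastic subgradient descent on the primal SVM objective
    lam/2 ||w||^2 + mean(max(0, 1 - y <w, x>)).  Labels must be +1/-1;
    append a constant feature to each example to learn a bias."""
    rng = random.Random(seed)
    w = [0.0] * len(data[0])
    t = 0
    for _ in range(epochs):
        order = list(range(len(data)))
        rng.shuffle(order)
        for i in order:
            t += 1
            eta = 1.0 / (lam * t)
            margin = labels[i] * sum(wk * xk for wk, xk in zip(w, data[i]))
            # shrink weights, then step along the example if it violates the margin
            w = [(1 - eta * lam) * wk for wk in w]
            if margin < 1:
                w = [wk + eta * labels[i] * xk for wk, xk in zip(w, data[i])]
    return w

def predict(w, x):
    return 1 if sum(wk * xk for wk, xk in zip(w, x)) >= 0 else -1
```

    The interior-point approach of the report solves the dual QP to high accuracy instead; the trade-off is accuracy per pass versus cost per pass on very large data.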

  9. Comparing Outcomes from Field and Classroom Based Settings for Undergraduate Geoscience Courses

    NASA Astrophysics Data System (ADS)

    Skinner, M. R.; Harris, R. A.; Flores, J.

    2011-12-01

    Field based learning can be found in nearly every course offered in Geology at Brigham Young University. For example, in our Structural Geology course, field studies substitute for labs. Students collect their own data from several different structural settings of the Wasatch Range. Our curriculum also includes a two-week, sophomore-level field course that introduces students to interpreting field relations themselves and sets the stage for much of what they learn in their upper-division courses. Our senior-level six-week field geology course includes classical field mapping with exercises in petroleum and mineral exploration, environmental geology, and geological hazards. Experiments with substituting field-based general education courses for those in traditional classroom settings indicate that student cognition, course enjoyment, and recruiting of majors significantly increase in a field-based course. We offer a field-based introductory geology course (Geo 102) that is taught in seven six-hour field trips during which students travel to localities of geologic interest to investigate a variety of fundamental geological problems. We compare the outcomes of Geo 102 with a traditional classroom-based geology course (Geo 101). For the comparison, both courses are taught by the same instructor, use the same text and supplementary materials, and take the same exams. The results of 7 years of reporting indicate that test scores and final grades are one-half grade point higher for Geo 102 students versus those in traditional introductory courses. Student evaluations of the course are also 0.8-1.4 points higher on a scale of 1-8, and are consistently the highest in the Department and College. Other observations include increased attendance, attention, and curiosity. The latter two are measured by the number of students asking questions of other students as well as of the instructors, and by the total number of questions asked during class time in the field versus the classroom.

  10. Reliability and Validity of Information about Student Achievement: Comparing Large-Scale and Classroom Testing Contexts

    ERIC Educational Resources Information Center

    Cizek, Gregory J.

    2009-01-01

    Reliability and validity are two characteristics that must be considered whenever information about student achievement is collected. However, those characteristics--and the methods for evaluating them--differ in large-scale testing and classroom testing contexts. This article presents the distinctions between reliability and validity in the two…

  11. Implementing Concept-Based Learning in a Large Undergraduate Classroom

    ERIC Educational Resources Information Center

    Morse, David; Jutras, France

    2008-01-01

    An experiment explicitly introducing learning strategies to a large, first-year undergraduate cell biology course was undertaken to see whether awareness and use of strategies had a measurable impact on student performance. The construction of concept maps was selected as the strategy to be introduced because of an inherent coherence with a course…

  12. Interaction and Uptake in Large Foreign Language Classrooms

    ERIC Educational Resources Information Center

    Ekembe, Eric Enongene

    2014-01-01

    Interaction determines and affects the conditions of language acquisition, especially in contexts where exposure to the target language is limited. This is believed to be successful only within the context of small classes (Chavez, 2009). This paper examines learners' progress resulting from interaction in large classes. Using pre-, post-, and…

  13. Visualizing large data sets in the earth sciences

    NASA Technical Reports Server (NTRS)

    Hibbard, William; Santek, David

    1989-01-01

    The authors describe the capabilities of McIDAS, an interactive visualization system that is vastly increasing the ability of earth scientists to manage and analyze data from remote sensing instruments and numerical simulation models. McIDAS provides animated three-dimensional images and highly interactive displays. The software can manage, analyze, and visualize large data sets that span many physical variables (such as temperature, pressure, humidity, and wind speed), as well as time and three spatial dimensions. The McIDAS system manages data from at least 100 different sources. The data management tools consist of data structures for storing different data types in files, libraries of routines for accessing these data structures, system commands for performing housekeeping functions on the data files, and reformatting programs for converting external data to the system's data structures. The McIDAS tools for three-dimensional visualization of meteorological data run on an IBM mainframe and can load up to 128-frame animation sequences into the workstations. A highly interactive version of the system can provide an interactive window into data sets containing tens of millions of points produced by numerical models and remote sensing instruments. The visualizations are being used for teaching as well as by scientists.

  14. Ready, Set, Science! Putting Research To Work In K-8 Classrooms

    NASA Astrophysics Data System (ADS)

    van der Veen, Wil E.; Moody, T.

    2008-05-01

    What types of instructional experiences help students learn and understand science? What do professional development providers and curriculum designers need to know to create and support such experiences? Ready, Set, Science! is a book that provides a practical and accessible account of current research about teaching and learning science. Based on the groundbreaking National Research Council report "Taking Science to School: Learning and Teaching Science in Grades K-8" (2006), the book reviews principles derived from the latest educational research and applies them to effective teaching practice. Ready, Set, Science! is a MUST READ for everyone involved in K-12 education or in creating products intended for K-12 use. We will review Ready, Set, Science!'s new vision of science in education, its most important recommendations, and its implications for the place of astronomy in K-12 classrooms. We will review some useful suggestions on how to make student thinking visible and report on how we have put this into practice with teachers. We will engage the audience in a brief interactive demonstration of specific questioning techniques described in the book that help to make student thinking visible.

  15. Parallel Analysis Tools for Ultra-Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Jacob, Robert; Krishna, Jayesh; Xu, Xiabing; Mickelson, Sheri; Wilde, Mike; Peterson, Kara; Bochev, Pavel; Latham, Robert; Tautges, Tim; Brown, David; Brownrigg, Richard; Haley, Mary; Shea, Dennis; Huang, Wei; Middleton, Don; Schuchardt, Karen; Yin, Jian

    2013-04-01

    While climate models have used parallelism for several years, the post-processing tools are still mostly single-threaded applications and many are closed source. These tools are becoming a bottleneck in the production of new climate knowledge when they confront terabyte-sized output from high-resolution climate models. The ParVis project is using and creating Free and Open Source tools that bring data and task parallelism to climate model analysis to enable analysis of large climate data sets. ParVis is using the Swift task-parallel language to implement a diagnostic suite that generates over 600 plots of atmospheric quantities. ParVis has also created a Parallel Gridded Analysis Library (ParGAL) which implements many common climate analysis operations in a data-parallel fashion using the Message Passing Interface. ParGAL has in turn been built on sophisticated packages for describing grids in parallel (the Mesh-Oriented datABase, MOAB), performing vector operations on arbitrary grids (Intrepid), and reading data in parallel (PnetCDF). ParGAL is being used to implement a parallel version of the NCAR Command Language (NCL) called ParNCL. ParNCL/ParGAL not only speeds up analysis of large datasets but also allows operations to be performed on native grids, eliminating the need to transform data to latitude-longitude grids. All of the tools ParVis is creating are available as free and open source software.
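    ParVis itself uses Swift and MPI; as a language-agnostic sketch of the map/reduce shape such a diagnostic suite relies on (with a thread pool standing in for distributed tasks and toy lists standing in for model output), each timestep's field can be reduced independently and in parallel:

```python
from concurrent.futures import ThreadPoolExecutor

def timestep_mean(field):
    """Reduce one timestep's field (a flat list of grid-point values) to its mean."""
    return sum(field) / len(field)

def parallel_means(field_by_timestep, workers=4):
    """Map the per-timestep reduction over all timesteps concurrently;
    the tasks are independent, the same shape ParGAL exploits with MPI."""
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(timestep_mean, field_by_timestep))
```

    In a real suite each task would read its slab of a PnetCDF file and run a full diagnostic, but the independence of the per-timestep work is what makes the parallelization straightforward.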

  16. The Incredible Years Teacher Classroom Management Program: Using Coaching to Support Generalization to Real-World Classroom Settings

    ERIC Educational Resources Information Center

    Reinke, Wendy M.; Stormont, Melissa; Webster-Stratton, Carolyn; Newcomer, Lori L.; Herman, Keith C.

    2012-01-01

    This article focuses on the Incredible Years Teacher Classroom Management Training (IY TCM) intervention as an example of an evidence-based program that embeds coaching within its design. First, the core features of the IY TCM program are described. Second, the IY TCM coaching model and processes utilized to facilitate high fidelity of…

  17. The Design and Synthesis of a Large Interactive Classroom

    NASA Astrophysics Data System (ADS)

    Clouston, Laurel L.; Kleinman, Mark H.

    1999-01-01

    The use of group learning techniques in large classes has been used to effectively convey the central concepts of SN1 and SN2 reactions in an introductory organic chemistry class. The activities described are best used as an introduction to these mechanisms. The class begins with the instructor relaying the key points of the reaction pathways. Following this synopsis, the class is divided through the use of assignment sheets that are distributed to the students upon arrival. The use of markers and poster boards, model kits, and role playing help to explain the intricacies of the mechanisms to learners, thereby accommodating a variety of different learning styles. After a guided discussion, each group presents their results to another collection of students who used a different learning technique to understand the alternate reaction. In this manner, each student encounters two learning styles and benefits from the repetitious nature of the exercise. After the groups break up into even smaller groups, higher-order questions are posed for further discussion. The class is terminated by the presentation of a summary slide that contains all the important facts covered during the lecture.

  18. Implementing Concept-based Learning in a Large Undergraduate Classroom

    PubMed Central

    Jutras, France

    2008-01-01

    An experiment explicitly introducing learning strategies to a large, first-year undergraduate cell biology course was undertaken to see whether awareness and use of strategies had a measurable impact on student performance. The construction of concept maps was selected as the strategy to be introduced because of an inherent coherence with a course structured by concepts. Data were collected over three different semesters of an introductory cell biology course, all teaching similar course material with the same professor and all evaluated using similar examinations. The first group, used as a control, did not construct concept maps, the second group constructed individual concept maps, and the third group first constructed individual maps then validated their maps in small teams to provide peer feedback about the individual maps. Assessment of the experiment involved student performance on the final exam, anonymous polls of student perceptions, failure rate, and retention of information at the start of the following year. The main conclusion drawn is that concept maps without feedback have no significant effect on student performance, whereas concept maps with feedback produced a measurable increase in student problem-solving performance and a decrease in failure rates. PMID:18519616

  19. Science Teacher Beliefs and Classroom Practice Related to Constructivism in Different School Settings

    NASA Astrophysics Data System (ADS)

    Savasci, Funda; Berlin, Donna F.

    2012-02-01

    Science teacher beliefs and classroom practice related to constructivism and factors that may influence classroom practice were examined in this cross-case study. Data from four science teachers in two schools included interviews, demographic questionnaire, Classroom Learning Environment Survey (preferred/perceived), and classroom observations and documents. Using an inductive analytic approach, results suggested that the teachers embraced constructivism, but classroom observations did not confirm implementation of these beliefs for three of the four teachers. The most preferred constructivist components were personal relevance and student negotiation; the most perceived component was critical voice. Shared control was the least preferred, least perceived, and least observed constructivist component. School type, grade, student behavior/ability, curriculum/standardized testing, and parental involvement may influence classroom practice.

  20. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
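    The NCCV's actual caching code is not shown in the abstract; the idea of keying rendered maps by their request parameters can be sketched as follows (the render signature and parameter names here are invented for illustration):

```python
def make_cached_renderer(render, cache=None):
    """Wrap an expensive map-rendering function with a dictionary cache
    keyed by the request parameters, so repeated requests for the same
    model/variable/period are served without re-rendering."""
    cache = {} if cache is None else cache
    def cached(model, variable, period):
        key = (model, variable, period)
        if key not in cache:
            cache[key] = render(model, variable, period)
        return cache[key]
    return cached
```

    Under load, most requests repeat a small set of popular keys, which is why a cache like this can cut map-generation time by orders of magnitude once it is warm.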

  1. Student-Centred Anti-Smoking Education: Comparing a Classroom-Based and an Out-of-School Setting

    ERIC Educational Resources Information Center

    Geier, Christine S.; Bogner, Franz X.

    2010-01-01

    The present study monitored a student-centred educational anti-smoking intervention with fifth graders by focusing on their cognitive achievement and intrinsic motivation. In order to assess the potential influence of the setting on self-directed learning, the intervention was conducted in two different learning environments: a classroom-based…

  2. The Effects of a Teacher-Child Play Intervention on Classroom Compliance in Young Children in Child Care Settings

    ERIC Educational Resources Information Center

    Levine, Darren G.; Ducharme, Joseph M.

    2013-01-01

    The current study evaluated the effects of a teacher-conducted play intervention on preschool-aged children's compliance in child care settings. Study participants included 8 children ranging in age from 3 to 5 years and 5 early childhood education teachers within 5 classrooms across 5 child care centers. A combination ABAB and multiple baseline…

  3. A Case Based Analysis Preparation Strategy for Use in a Classroom Management for Inclusive Settings Course: Preliminary Observations

    ERIC Educational Resources Information Center

    Niles, William J.; Cohen, Alan

    2012-01-01

    Case based instruction (CBI) is a pedagogical option in teacher preparation growing in application but short on practical means to implement the method. This paper presents an analysis strategy and questions developed to help teacher trainees focus on classroom management issues embedded in a set of "real" cases. An analysis of teacher candidate…

  4. Challenges Associated With Using Large Data Sets for Quality Assessment and Research in Clinical Settings

    PubMed Central

    Cohen, Bevin; Vawdrey, David K.; Liu, Jianfang; Caplan, David; Furuya, E. Yoko; Mis, Frederick W.; Larson, Elaine

    2015-01-01

    The rapidly expanding use of electronic records in health-care settings is generating unprecedented quantities of data available for clinical, epidemiological, and cost-effectiveness research. Several challenges are associated with using these data for clinical research, including issues surrounding access and information security, poor data quality, inconsistency of data within and across institutions, and a paucity of staff with expertise to manage and manipulate large clinical data sets. In this article, we describe our experience with assembling a data-mart and conducting clinical research using electronic data from four facilities within a single hospital network in New York City. We culled data from several electronic sources, including the institution’s admission-discharge-transfer system, cost accounting system, electronic health record, clinical data warehouse, and departmental records. The final data-mart contained information for more than 760,000 discharges occurring from 2006 through 2012. Using categories identified by the National Institutes of Health Big Data to Knowledge initiative as a framework, we outline challenges encountered during the development and use of a domain-specific data-mart and recommend approaches to overcome these challenges. PMID:26351216

  5. Challenges Associated With Using Large Data Sets for Quality Assessment and Research in Clinical Settings.

    PubMed

    Cohen, Bevin; Vawdrey, David K; Liu, Jianfang; Caplan, David; Furuya, E Yoko; Mis, Frederick W; Larson, Elaine

    2015-08-01

    The rapidly expanding use of electronic records in health-care settings is generating unprecedented quantities of data available for clinical, epidemiological, and cost-effectiveness research. Several challenges are associated with using these data for clinical research, including issues surrounding access and information security, poor data quality, inconsistency of data within and across institutions, and a paucity of staff with expertise to manage and manipulate large clinical data sets. In this article, we describe our experience with assembling a data-mart and conducting clinical research using electronic data from four facilities within a single hospital network in New York City. We culled data from several electronic sources, including the institution's admission-discharge-transfer system, cost accounting system, electronic health record, clinical data warehouse, and departmental records. The final data-mart contained information for more than 760,000 discharges occurring from 2006 through 2012. Using categories identified by the National Institutes of Health Big Data to Knowledge initiative as a framework, we outline challenges encountered during the development and use of a domain-specific data-mart and recommend approaches to overcome these challenges. PMID:26351216

  6. An Exploration Tool for Very Large Spectrum Data Sets

    NASA Astrophysics Data System (ADS)

    Carbon, Duane F.; Henze, Christopher

    2015-01-01

    We present an exploration tool for very large spectrum data sets such as the SDSS, LAMOST, and 4MOST data sets. The tool works in two stages: the first uses batch processing and the second runs interactively. The latter employs the NASA hyperwall, a configuration of 128 workstation displays (8x16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. The stellar subset of the Sloan Digital Sky Survey DR10 was chosen to show how the tool may be used. In stage one, SDSS files for 569,738 stars are processed through our data pipeline. The pipeline fits each spectrum using an iterative continuum algorithm, distinguishing emission from absorption and handling molecular absorption bands correctly. It then measures 1659 discrete atomic and molecular spectral features that were carefully preselected based on their likelihood of being visible at some spectral type. The depths relative to the local continuum at each feature wavelength are determined for each spectrum: these depths, the local S/N level, and DR10-supplied variables such as magnitudes, colors, positions, and radial velocities are the basic measured quantities used on the hyperwall. In stage two, each hyperwall panel is used to display a 2-D scatter plot showing the depth of feature A vs the depth of feature B for all of the stars. A and B change from panel to panel. The relationships between the various (A,B) strengths and any distinctive clustering are immediately apparent when examining and inter-comparing the different panels on the hyperwall. The interactive software allows the user to select the stars in any interesting region of any 2-D plot on the hyperwall, immediately rendering the same stars on all the other 2-D plots in a unique color. The process may be repeated multiple times, each selection displaying a distinctive color on all the plots. At any time, the spectra of the selected stars may be examined in detail on a connected workstation display. 
We illustrate…
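The pipeline's basic measurement, the depth of a spectral feature relative to the locally fitted continuum, can be sketched in a few lines. This is a minimal illustration, not the SDSS pipeline itself; the function name, the flat toy continuum, and the Gaussian absorption line are all invented for the example.

```python
import numpy as np

def feature_depth(wavelength, flux, continuum, line_wavelength):
    """Depth of a spectral feature relative to the local continuum.

    Returns (continuum - flux) / continuum at the pixel nearest the
    feature wavelength: positive for absorption, negative for emission.
    """
    i = np.argmin(np.abs(wavelength - line_wavelength))
    return (continuum[i] - flux[i]) / continuum[i]

# Toy spectrum: flat continuum at 1.0 with a Gaussian absorption
# line of depth 0.4 centered at 6563 Angstroms
wl = np.linspace(6500.0, 6600.0, 1000)
cont = np.ones_like(wl)
flux = cont - 0.4 * np.exp(-0.5 * ((wl - 6563.0) / 1.5) ** 2)

depth = feature_depth(wl, flux, cont, 6563.0)  # ~0.4 for this line
```

Repeating such a measurement for each of the 1659 preselected features across all 569,738 spectra yields the depth vectors that feed the hyperwall's pairwise scatter plots.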

  7. Trans-dimensional Bayesian inference for large sequential data sets

    NASA Astrophysics Data System (ADS)

    Mandolesi, E.; Dettmer, J.; Dosso, S. E.; Holland, C. W.

    2015-12-01

    This work develops a sequential Monte Carlo method to infer seismic parameters of layered seabeds from large sequential reflection-coefficient data sets. The approach provides parameter estimates and uncertainties along survey tracks with the goal of aiding the detection of unexploded ordnance in shallow water. The sequential data are acquired by a moving platform with source and receiver array towed close to the seabed. This geometry requires consideration of spherical reflection coefficients, computed efficiently by massively parallel implementation of the Sommerfeld integral via Levin integration on a graphics processing unit. The seabed is parametrized with a trans-dimensional model to account for changes in the environment (i.e. changes in layering) along the track. The method combines advanced Markov chain Monte Carlo methods (annealing) with particle filtering (resampling). Since data from closely spaced source transmissions (pings) often sample similar environments, the solution from one ping can be utilized to efficiently estimate the posterior for data from subsequent pings. Since reflection-coefficient data are highly informative, the likelihood function can be extremely peaked, resulting in little overlap between posteriors of adjacent pings. This is addressed by adding bridging distributions (via annealed importance sampling) between pings for more efficient transitions. The approach assumes the environment to be changing slowly enough to justify the local 1D parametrization. However, bridging allows rapid changes between pings to be addressed, and we demonstrate the method to be stable in such situations. Results are in terms of trans-D parameter estimates and uncertainties along the track. The algorithm is examined for realistic simulated data along a track and applied to a dataset collected by an autonomous underwater vehicle on the Malta Plateau, Mediterranean Sea. [Work supported by the SERDP, DoD.]
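The ping-to-ping bridging idea can be sketched with a toy one-dimensional example. This is only an illustration of tempered bridging with resampling, not the authors' GPU implementation: the Gaussian stand-in likelihood, the peak locations, the particle count, and the step sizes are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_like(theta, peak):
    # Sharply peaked Gaussian log-likelihood: an illustrative stand-in
    # for the highly informative reflection-coefficient likelihood
    return -0.5 * ((theta - peak) / 0.05) ** 2

def bridge(particles, old_peak, new_peak, n_steps=10):
    """Move particles from one ping's posterior toward the next through
    a sequence of tempered bridging distributions (annealed importance
    sampling with resampling and a Metropolis refreshment move)."""
    n = len(particles)
    for k in range(1, n_steps + 1):
        b, b_prev = k / n_steps, (k - 1) / n_steps
        # incremental importance weights between adjacent bridges
        logw = (b - b_prev) * (log_like(particles, new_peak)
                               - log_like(particles, old_peak))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        particles = particles[rng.choice(n, size=n, p=w)]  # resample
        # one Metropolis move per particle, targeting the current bridge
        logp = lambda x: (b * log_like(x, new_peak)
                          + (1 - b) * log_like(x, old_peak))
        prop = particles + rng.normal(0.0, 0.05, size=n)
        accept = np.log(rng.uniform(size=n)) < logp(prop) - logp(particles)
        particles = np.where(accept, prop, particles)
    return particles

# Posteriors of adjacent pings barely overlap: peaks at 1.50 and 1.60,
# each with standard deviation 0.05
particles = rng.normal(1.50, 0.05, size=2000)
particles = bridge(particles, old_peak=1.50, new_peak=1.60)
```

Without the intermediate bridges, nearly all particles from the first ping would receive negligible weight under the second ping's peaked likelihood; the tempering sequence keeps the effective sample size high at each transition.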

  8. The Interests of Full Disclosure: Agenda-Setting and the Practical Initiation of the Feminist Classroom

    ERIC Educational Resources Information Center

    Seymour, Nicole

    2007-01-01

    Several theoretical and pragmatic questions arise when one attempts to employ feminist pedagogy in the classroom (or to study it), such as how to strike a balance between classroom order and instructor de-centering and how to productively address student resistance. In this article, the author describes how she took on her final project for a…

  9. Classrooms that Work: Teaching Generic Skills in Academic and Vocational Settings.

    ERIC Educational Resources Information Center

    Stasz, Cathleen; And Others

    This report documents the second of two studies on teaching and learning generic skills in high schools. It extends the earlier work by providing a model for designing classroom instruction in both academic and vocational classrooms where teaching generic skills is an instructional goal. Ethnographic field methods were used to observe, record, and…

  10. Teaching and learning in an integrated curriculum setting: A case study of classroom practices

    NASA Astrophysics Data System (ADS)

    MacMath, Sheryl Lynn

    Curriculum integration, while a commonly used educational term, remains a challenging concept to define and examine both in research and in classroom practice. Numerous types and definitions of curriculum integration exist in educational research, while, in comparison, teachers tend to focus on curriculum integration simply as a mixing of subject areas. To better understand curriculum integration in practice, this thesis details a case study that examines both teacher and student perspectives regarding a grade nine integrated unit on energy. The case is set in a public secondary school in Ontario, Canada. I comprehensively describe and analyze teacher understandings of, and challenges with, the implementation of an integrated unit, while also examining student perspectives and academic learning. My participants consisted of two high school teachers, a geography teacher and a science teacher, and their twenty-three students. Using data gathered from interviews before, during, and after the implementation of a 16-lesson unit, as well as observations throughout, I completed a case description and thematic analysis. My results illustrate the importance of examining why teachers choose to implement an integrated unit and the planning and scheduling challenges that exist. In addition, while the students in this study were academically successful, clarification is needed regarding whether student success can be linked to the integration of these two subjects or the types of activities these two teachers utilized.

  11. Connecting scientific research and classroom instruction: Developing authentic problem sets for the undergraduate organic chemistry curriculum

    NASA Astrophysics Data System (ADS)

    Raker, Jeffrey R.

    Reform efforts in science education have called for instructional methods and resources that mirror the practice of science. Few research and design methods for creating such materials have been documented in the literature. The purpose of this study was to develop problem sets for sophomore-level organic chemistry instruction. This research adapted an instructional design methodology from the science education literature for the creation of new curricular problem sets. The first phase of this study was to establish an understanding of current curricular problems in sophomore-level organic chemistry instruction. A sample of 792 problems was collected from four organic chemistry courses. These problems were assessed using three literature-reported problem typologies. Two of these problem typologies had previously been used to understand general chemistry problems; comparisons between general and organic chemistry problems were thus made. Data from this phase were used to develop a set of five problems for practicing organic chemists. The second phase of this study was to explore practicing organic chemists' experiences solving problems in the context of organic synthesis research. Eight practicing organic chemists were interviewed and asked to solve two to three of the problems developed in phase one of this research. These participants spoke of three problem types: project level, synthetic planning, and day-to-day. Three knowledge types (internal knowledge, knowledgeable others, and literature) were used in solving these problems, both in research practice and in the developed problems. A set of guiding factors and implications was derived from these data and the chemistry education literature for converting the problems for practicing chemists into problems for undergraduate students. The five problems were subsequently converted. The third and final phase of this study was to explore undergraduate students' experiences solving problems in

  12. Recruiting Participants for Large-Scale Random Assignment Experiments in School Settings

    ERIC Educational Resources Information Center

    Roschelle, Jeremy; Feng, Mingyu; Gallagher, H. Alix; Murphy, Robert; Harris, Christopher; Kamdar, Danae; Trinidad, Gucci

    2014-01-01

    Recruitment is a key challenge for researchers conducting any large school-based study. Control is needed not only over the condition participants receive, but also over how the intervention is implemented, and may include restrictions in other areas of school and classroom functioning. We report here on our experiences in recruiting participants…

  13. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…

  14. Increasing the Writing Performance of Urban Seniors Placed At-Risk through Goal-Setting in a Culturally Responsive and Creativity-Centered Classroom

    ERIC Educational Resources Information Center

    Estrada, Brittany; Warren, Susan

    2014-01-01

    Efforts to support marginalized students require not only identifying systemic inequities, but providing a classroom infrastructure that supports the academic achievement of all students. This action research study examined the effects of implementing goal-setting strategies and emphasizing creativity in a culturally responsive classroom (CRC) on…

  15. The Single and Combined Effects of Multiple Intensities of Behavior Modification and Methylphenidate for Children with Attention Deficit Hyperactivity Disorder in a Classroom Setting

    ERIC Educational Resources Information Center

    Fabiano, Gregory A.; Pelham, William E., Jr.; Gnagy, Elizabeth M.; Burrows-MacLean, Lisa; Coles, Erika K.; Chacko, Anil; Wymbs, Brian T.; Walker, Kathryn S.; Arnold, Fran; Garefino, Allison; Keenan, Jenna K.; Onyango, Adia N.; Hoffman, Martin T.; Massetti, Greta M.; Robb, Jessica A.

    2007-01-01

    Currently behavior modification, stimulant medication, and combined treatments are supported as evidence-based interventions for attention deficit hyperactivity disorder in classroom settings. However, there has been little study of the relative effects of these two modalities and their combination in classrooms. Using a within-subject design, the…

  16. Evaluation of Data Visualization Software for Large Astronomical Data Sets

    NASA Astrophysics Data System (ADS)

    Doyle, Matthew; Taylor, Roger S.; Kanbur, Shashi; Schofield, Damian; Donalek, Ciro; Djorgovski, Stanislav G.; Davidoff, Scott

    2016-01-01

    This study investigates the efficacy of a 3D visualization application used to classify various types of stars using data derived from large synoptic sky surveys. Evaluation methodology included a cognitive walkthrough that prompted participants to identify a specific star type (Supernovae, RR Lyrae or Eclipsing Binary) and retrieve variable information (MAD, magratio, amplitude, frequency) from the star. This study also implemented a heuristic evaluation that applied usability standards such as the Shneiderman Visual Information Seeking Mantra to the initial iteration of the application. Findings from the evaluation indicated that improvements could be made to the application by developing effective spatial organization and implementing data reduction techniques such as linking, brushing, and small multiples.

  17. Processing large remote sensing image data sets on Beowulf clusters

    USGS Publications Warehouse

    Steinwand, Daniel R.; Maddox, Brian; Beckmann, Tim; Schmidt, Gail

    2003-01-01

    High-performance computing is often concerned with the speed at which floating-point calculations can be performed. The architectures of many parallel computers and/or their network topologies are based on these investigations. Often, benchmarks resulting from these investigations are compiled with little regard to how a large dataset would move about in these systems. This part of the Beowulf study addresses that concern by looking at specific applications software and system-level modifications. Applications include an implementation of a smoothing filter for time-series data, a parallel implementation of the decision tree algorithm used in the Landcover Characterization project, a parallel Kriging algorithm used to fit point data collected in the field on invasive species to a regular grid, and modifications to the Beowulf project's resampling algorithm to handle larger, higher resolution datasets at a national scale. Systems-level investigations include a feasibility study on Flat Neighborhood Networks and modifications of that concept with Parallel File Systems.
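The data-partitioning pattern behind an application like the time-series smoothing filter can be sketched as follows. This is a thread-based toy standing in for cluster nodes, not the study's Beowulf code; the 5-point moving average, chunk count, and function names are illustrative.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def smooth_chunk(chunk):
    # 5-point moving average: a simple stand-in for the study's
    # time-series smoothing filter
    kernel = np.ones(5) / 5.0
    return np.convolve(chunk, kernel, mode="same")

def parallel_smooth(series, n_workers=4):
    """Partition a long time series across workers and smooth the pieces
    concurrently, mimicking how work is divided across cluster nodes.
    (A real implementation would overlap chunks by the kernel half-width
    to avoid edge effects at the partition boundaries.)"""
    chunks = np.array_split(series, n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return np.concatenate(list(pool.map(smooth_chunk, chunks)))

# Noisy sine wave: smoothing should pull the series back toward the signal
t = np.linspace(0.0, 2.0 * np.pi, 4000)
clean = np.sin(t)
noisy = clean + np.random.default_rng(1).normal(0.0, 0.5, size=t.size)
smoothed = parallel_smooth(noisy)
```

The point the study makes is precisely that the cost of moving each chunk to its worker, ignored in most floating-point benchmarks, can dominate for large datasets.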

  18. Value-based customer grouping from large retail data sets

    NASA Astrophysics Data System (ADS)

    Strehl, Alexander; Ghosh, Joydeep

    2000-04-01

    In this paper, we propose OPOSSUM, a novel similarity-based clustering algorithm using constrained, weighted graph partitioning. Instead of binary presence or absence of products in a market-basket, we use an extended 'revenue per product' measure to better account for management objectives. Typically the number of clusters desired in a database marketing application is only in the teens or less. OPOSSUM proceeds top-down, which is more efficient and takes a small number of steps to attain the desired number of clusters as compared to bottom-up agglomerative clustering approaches. OPOSSUM delivers clusters that are balanced in terms of either customers (samples) or revenue (value). To facilitate data exploration and validation of results we introduce CLUSION, a visualization toolkit for high-dimensional clustering problems. To enable closed-loop deployment of the algorithm, OPOSSUM has no user-specified parameters. Thresholding heuristics are avoided and the optimal number of clusters is automatically determined by a search for maximum performance. Results are presented on a real retail-industry data set of several thousand customers and products, to demonstrate the power of the proposed technique.
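The revenue-weighted similarity underlying this approach can be sketched in a few lines. This is a minimal illustration of replacing binary presence/absence with revenue per product; OPOSSUM itself then partitions the resulting similarity graph under balance constraints, which is not shown here, and the toy baskets are invented.

```python
import numpy as np

def revenue_similarity(baskets):
    """Cosine similarity between customers computed from revenue per
    product rather than binary presence/absence of products."""
    norms = np.linalg.norm(baskets, axis=1, keepdims=True)
    unit = baskets / np.where(norms == 0.0, 1.0, norms)  # guard empty rows
    return unit @ unit.T

# Rows are customers, columns are revenue spent per product (toy data)
baskets = np.array([
    [120.0,  0.0, 15.0],   # customer A
    [100.0,  0.0, 10.0],   # customer B: spending profile similar to A
    [  0.0, 80.0,  0.0],   # customer C: disjoint spending profile
])
sim = revenue_similarity(baskets)
```

Under a binary encoding, customers A and B would be indistinguishable from any other pair sharing the same products; the revenue weighting lets the clustering reflect how much value each product carries for each customer.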

  19. Explaining Helping Behavior in a Cooperative Learning Classroom Setting Using Attribution Theory

    ERIC Educational Resources Information Center

    Ahles, Paula M.; Contento, Jann M.

    2006-01-01

    This recently completed study examined whether attribution theory can explain helping behavior in an interdependent classroom environment that utilized a cooperative-learning model. The study focused on student participants enrolled in 6 community college communication classes taught by the same instructor. Three levels of cooperative-learning…

  20. The Timing of Feedback on Mathematics Problem Solving in a Classroom Setting

    ERIC Educational Resources Information Center

    Fyfe, Emily R.; Rittle-Johnson, Bethany

    2015-01-01

    Feedback is a ubiquitous learning tool that is theorized to help learners detect and correct their errors. The goal of this study was to examine the effects of feedback in a classroom context for children solving math equivalence problems (problems with operations on both sides of the equal sign). The authors worked with children in 7 second-grade…

  1. Observing Students in Classroom Settings: A Review of Seven Coding Schemes

    ERIC Educational Resources Information Center

    Volpe, Robert J.; DiPerna, James C.; Hintze, John M.; Shapiro, Edward S.

    2005-01-01

    A variety of coding schemes are available for direct observational assessment of student classroom behavior. These instruments have been used for a number of assessment tasks including screening children in need of further evaluation for emotional and behavior problems, diagnostic assessment of emotional and behavior problems, assessment of…

  2. Making Room for Group Work I: Teaching Engineering in a Modern Classroom Setting

    ERIC Educational Resources Information Center

    Wilkens, Robert J.; Ciric, Amy R.

    2005-01-01

    This paper describes the results of several teaching experiments in the teaching Studio of The University of Dayton's Learning-Teaching Center. The Studio is a state-of-the-art classroom with flexible seating arrangements and movable whiteboards and corkboards for small group discussions. The Studio has a communications system with a TV/VCR…

  3. Classroom Management Strategies for Young Children with Challenging Behavior within Early Childhood Settings

    ERIC Educational Resources Information Center

    Jolivette, Kristine; Steed, Elizabeth A.

    2010-01-01

    Many preschool, Head Start, and kindergarten educators of young children express concern about the number of children who exhibit frequent challenging behaviors and report that managing these behaviors is difficult within these classrooms. This article describes research-based strategies with practical applications that can be used as part of…

  4. An Approach to the Problem of Student Passivity in Classroom Settings.

    ERIC Educational Resources Information Center

    MacDonald, Judith B.

    The verbal interaction between a laboratory school class of sixth/seventh grade students and their teacher during 18 social studies discussions was analyzed in order to identify teacher techniques relevant to student discourse and student passivity. Classroom discussions were taped, transcribed, and analyzed according to an adaptation of the…

  5. By What Token Economy? A Classroom Learning Tool for Inclusive Settings.

    ERIC Educational Resources Information Center

    Anderson, Carol; Katsiyannis, Antonis

    1997-01-01

    Describes a token economy that used tokens styled as license plates to elicit appropriate behavior in an inclusive fifth-grade class in which four students with behavior disorders were enrolled. Student involvement in establishing the "driving rules" of the classroom is explained, the components of a token economy are outlined, and steps for group…

  6. Teachers on Television. Observing Teachers and Students in Diverse Classroom Settings through the Technology of Television.

    ERIC Educational Resources Information Center

    Iowa State Univ. of Science and Technology, Ames. Coll. of Education.

    Because most teacher preparation institutions have extensive needs for numerous field experiences, the Teachers on Television (TOT) project was first conceived to meet demands for observation opportunities through the use of television technology. The TOT project provided live direct observations via television broadcasts from classrooms located…

  7. Mobile Learning in a Large Blended Computer Science Classroom: System Function, Pedagogies, and Their Impact on Learning

    ERIC Educational Resources Information Center

    Shen, Ruimin; Wang, Minjuan; Gao, Wanping; Novak, D.; Tang, Lin

    2009-01-01

    The computer science classes in China's institutions of higher education often have large numbers of students. In addition, many institutions offer "blended" classes that include both on-campus and online students. These large blended classrooms have long suffered from a lack of interactivity. Many online classes simply provide recorded instructor…

  8. Service user involvement in pre-registration mental health nurse education classroom settings: a review of the literature.

    PubMed

    Terry, J

    2012-11-01

    Service user involvement in pre-registration nurse education is now a requirement, yet little is known about how students engage with users in the classroom, how such initiatives are being evaluated, how service users themselves are prepared to teach students, or the potential influence on clinical practice. The aim of this literature review was to bring together published articles on service user involvement in classroom settings in pre-registration mental health nurse education programmes, including their evaluations. A comprehensive review of the literature was carried out via computer search engines and the Internet, as well as a hand search of pertinent journals and references. This produced eight papers that fitted the inclusion criteria, comprising four empirical studies and four review articles, which were then reviewed using a seven-item checklist. The articles revealed that a variety of teaching and learning strategies had been employed, ranging from exposure to users' personal stories, to students being required to demonstrate awareness of user perspectives in case study presentations, with others involving eLearning and assessment skills initiatives. This review concludes that further longitudinal research is needed to establish the influence of user involvement in the classroom over time. PMID:22296494

  9. Engaging millennial learners: Effectiveness of personal response system technology with nursing students in small and large classrooms.

    PubMed

    Revell, Susan M Hunter; McCurry, Mary K

    2010-05-01

    Nurse educators must explore innovative technologies that make the most of the characteristics and learning styles of millennial learners. These students are comfortable with technology and prefer interactive classrooms with individual feedback and peer collaboration. This study evaluated the perceived effectiveness of personal response system (PRS) technology in enhancing student learning in small and large classrooms. PRS technology was integrated into two undergraduate courses, nursing research (n = 33) and junior medical-surgical nursing (n = 116). Multiple-choice, true-false, NCLEX-RN alternate format, and reading quiz questions were incorporated within didactic PowerPoint presentations. Data analysis of Likert-type and open-response questions supported the use of PRS technology as an effective strategy for educating millennial learners in both small and large classrooms. PRS technology promotes active learning, increases participation, and provides students and faculty with immediate feedback that reflects comprehension of content and increases faculty-student interaction. PMID:20055325

  10. Feasibility and Acceptability of Adapting the Eating in the Absence of Hunger Assessment for Preschoolers in the Classroom Setting.

    PubMed

    Soltero, Erica G; Ledoux, Tracey; Lee, Rebecca E

    2015-12-01

    Eating in the Absence of Hunger (EAH) represents a failure to self-regulate intake leading to overconsumption. Existing research on EAH has come from the clinical setting, limiting our understanding of this behavior. The purpose of this study was to describe the adaptation of the clinical EAH paradigm for preschoolers to the classroom setting and evaluate the feasibility and acceptability of measuring EAH in the classroom. The adapted protocol was implemented in childcare centers in Houston, Texas (N=4) and Phoenix, Arizona (N=2). The protocol was feasible, economical, and time efficient, eliminating previously identified barriers to administering the EAH assessment such as limited resources and the time constraint of delivering the assessment to participants individually. Implementation challenges included difficulty in choosing palatable test snacks that were in compliance with childcare center food regulations and the limited control over the meal that was administered prior to the assessment. The adapted protocol will allow for broader use of the EAH assessment and encourage researchers to incorporate the assessment into longitudinal studies in order to further our understanding of the causes and emergence of EAH. PMID:26172567

  11. Response Grids: Practical Ways to Display Large Data Sets with High Visual Impact

    ERIC Educational Resources Information Center

    Gates, Simon

    2013-01-01

    Spreadsheets are useful for large data sets but they may be too wide or too long to print as conventional tables. Response grids offer solutions to the challenges posed by any large data set. They have wide application throughout science and for every subject and context where visual data displays are designed, within education and elsewhere.…

  12. Silent and Vocal Students in a Large Active Learning Chemistry Classroom: Comparison of Performance and Motivational Factors

    ERIC Educational Resources Information Center

    Obenland, Carrie A.; Munson, Ashlyn H.; Hutchinson, John S.

    2013-01-01

    Active learning is becoming more prevalent in large science classrooms, and this study shows the impact on performance of being vocal during Socratic questioning in a General Chemistry course. Over a two-year period, 800 college students were given a pre- and post-test using the Chemistry Concept Reasoning Test. The pre-test results showed that…

  13. Mobile-Phone-Based Classroom Response Systems: Students' Perceptions of Engagement and Learning in a Large Undergraduate Course

    ERIC Educational Resources Information Center

    Dunn, Peter K.; Richardson, Alice; Oprescu, Florin; McDonald, Christine

    2013-01-01

    Using a Classroom Response System (CRS) has been associated with positive educational outcomes, by fostering student engagement and by allowing immediate feedback to both students and instructors. This study examined a low-cost CRS (VotApedia) in a large first-year class, where students responded to questions using their mobile phones. This study…

  14. Safety and science at sea: connecting science research settings to the classroom through live video

    NASA Astrophysics Data System (ADS)

    Cohen, E.; Peart, L. W.

    2011-12-01

    Many science teachers start the year off with classroom safety topics. Annual repetition helps with mastery of this important and basic knowledge, while helping schools to meet their legal obligations for safe lab science. Although these lessons are necessary, they are often topical, rarely authentic, and relatively dull. Interesting connections can, however, be drawn between the importance of safety in science classrooms and the importance of safety in academic laboratories, fieldwork, shipboard research, and commercial research. Teachers can leverage these connections through live video interactions with scientists in the field, thereby creating an authentic learning environment. During the School of Rock 2009, a professional teacher research experience aboard the Integrated Ocean Drilling Program's research vessel JOIDES Resolution, safety and nature-of-science curricula were created to help address this need. By experimenting with various topics and locations on the ship that were accessible and applicable to middle school learning, 43 highly visual "safety signs" and activities were identified and presented "live" by graduate students, teachers, scientists, and the ship's mates, doctor, and technical staff. Students were exposed to realistic science process skills along with safety content from the world's only riserless, deep-sea drilling research vessel. The once-in-a-lifetime experience caused the students' eyes to brighten behind their safety glasses, especially as they recognized the same eye wash station and safety gear they have to wear and attended a ship's fire and safety drill alongside scientists in hard hats and personal flotation devices. This collaborative and replicable live video approach will connect basic safety content and nature-of-science process skills for a memorable and authentic learning experience for students.

  15. Initial validation of the prekindergarten Classroom Observation Tool and goal setting system for data-based coaching.

    PubMed

    Crawford, April D; Zucker, Tricia A; Williams, Jeffrey M; Bhavsar, Vibhuti; Landry, Susan H

    2013-12-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based coaching model. We examined psychometric characteristics of the COT and explored how coaches and teachers used the COT goal-setting system. The study included 193 coaches working with 3,909 pre-k teachers in a statewide professional development program. Classrooms served 3- and 4-year-olds (n = 56,390) enrolled mostly in Title I, Head Start, and other need-based pre-k programs. Coaches used the COT during a 2-hr observation at the beginning of the academic year. Teachers collected progress-monitoring data on children's language, literacy, and math outcomes three times during the year. Results indicated a theoretically supported eight-factor structure of the COT across language, literacy, and math instructional domains. Overall interrater reliability among coaches was good (.75). Although correlations with an established teacher observation measure were small, significant positive relations between COT scores and children's literacy outcomes indicate promising predictive validity. Patterns of goal-setting behaviors indicate teachers and coaches set an average of 43.17 goals during the academic year, and coaches reported that 80.62% of goals were met. Both coaches and teachers reported the COT was a helpful measure for enhancing quality of Tier 1 instruction. Limitations of the current study and implications for research and data-based coaching efforts are discussed. PMID:24059812

  16. A Day in Third Grade: A Large-Scale Study of Classroom Quality and Teacher and Student Behavior

    ERIC Educational Resources Information Center

    Elementary School Journal, 2005

    2005-01-01

    Observations of 780 third-grade classrooms described classroom activities, child-teacher interactions, and dimensions of the global classroom environment, which were examined in relation to structural aspects of the classroom and child behavior. One child per classroom was targeted for observation in relation to classroom quality and teacher and…

  17. Assessing the Effectiveness of Inquiry-based Learning Techniques Implemented in Large Classroom Settings

    NASA Astrophysics Data System (ADS)

    Steer, D. N.; McConnell, D. A.; Owens, K.

    2001-12-01

    Geoscience and education faculty at The University of Akron jointly developed a series of inquiry-based learning modules aimed at both non-major and major student populations enrolled in introductory geology courses. These courses typically serve 2500 students per year in four to six sections of 40-160 students each. Twelve modules were developed that contained common topics and assessments appropriate to Earth Science, Environmental Geology and Physical Geology classes. All modules were designed to meet four primary learning objectives agreed upon by Department of Geology faculty. These major objectives include: 1) Improvement of student understanding of the scientific method; 2) Incorporation of problem solving strategies involving analysis, synthesis, and interpretation; 3) Development of the ability to distinguish between inferences, data and observations; and 4) Obtaining an understanding of basic processes that operate on Earth. Additional objectives that may be addressed by selected modules include: 1) The societal relevance of science; 2) Use and interpretation of quantitative data to better understand the Earth; 3) Development of the students' ability to communicate scientific results; 4) Distinguishing differences between science, religion and pseudo-science; 5) Evaluation of scientific information found in the mass media; and 6) Building interpersonal relationships through in-class group work. Student pre- and post-instruction progress was evaluated by administering a test of logical thinking, an attitude toward science survey, and formative evaluations. Scores from the logical thinking instrument were used to form balanced four-person working groups based on the students' incoming cognitive level. Groups were required to complete a series of activities and/or exercises that targeted different cognitive domains based upon Bloom's taxonomy (knowledge, comprehension, application, analysis, synthesis and evaluation of information). 
Daily assessments of knowledge-level learning included evaluations of student responses to pre- and post-instruction conceptual test questions, short group exercises and content-oriented exam questions. Higher level thinking skills were assessed when students completed exercises that required the completion of Venn diagrams, concept maps and/or evaluation rubrics both during class periods and on exams. Initial results indicate that these techniques improved student attendance significantly and improved overall retention in the course by 8-14% over traditional lecture formats. Student scores on multiple choice exam questions were slightly higher (1-3%) for students taught in the active learning environment and short answer questions showed larger gains (7%) over students' scores in a more traditional class structure.

  18. When the globe is your classroom: teaching and learning about large-scale environmental change online

    NASA Astrophysics Data System (ADS)

    Howard, E. A.; Coleman, K. J.; Barford, C. L.; Kucharik, C.; Foley, J. A.

    2005-12-01

    Understanding environmental problems that cross physical and disciplinary boundaries requires a more holistic view of the world - a "systems" approach. Yet it is a challenge for many learners to start thinking this way, particularly when the problems are large in scale and not easily visible. We will describe our online university course, "Humans and the Changing Biosphere," which takes a whole-systems perspective for teaching regional to global-scale environmental science concepts, including climate, hydrology, ecology, and human demographics. We will share our syllabus and learning objectives and summarize our efforts to incorporate "best" practices for online teaching. We will describe challenges we have faced, and our efforts to reach different learner types. Our goals for this presentation are: (1) to communicate how a systems approach ties together environmental sciences (including climate, hydrology, ecology, biogeochemistry, and demography) that are often taught as separate disciplines; (2) to generate discussion about challenges of teaching large-scale environmental processes; (3) to share our experiences in teaching these topics online; (4) to receive ideas and feedback on future teaching strategies. We will explain why we developed this course online, and share our experiences about benefits and challenges of teaching over the web - including some suggestions about how to use technology to supplement face-to-face learning experiences (and vice versa). We will summarize assessment data about what students learned during the course, and discuss key misconceptions and barriers to learning. We will highlight the role of an online discussion board in creating classroom community, identifying misconceptions, and engaging different types of learners.

  19. BEST in CLASS: A Classroom-Based Model for Ameliorating Problem Behavior in Early Childhood Settings

    ERIC Educational Resources Information Center

    Vo, Abigail; Sutherland, Kevin S.; Conroy, Maureen A.

    2012-01-01

    As more young children enter school settings to attend early childhood programs, early childhood teachers and school psychologists have been charged with supporting a growing number of young children with chronic problem behaviors that put them at risk for the development of emotional/behavioral disorders (EBDs). There is a need for effective,…

  20. Intercultural Education Set Forward: Operational Strategies and Procedures in Cypriot Classrooms

    ERIC Educational Resources Information Center

    Hajisoteriou, Christina

    2012-01-01

    Teachers in Cyprus are being called upon for the first time to teach within culturally diverse educational settings. Given the substantial role teachers play in the implementation of intercultural education, this paper explores the intercultural strategies and procedures adopted by primary school teachers in Cyprus. Interviews were carried out…

  2. An Academic Approach to Stress Management for College Students in a Conventional Classroom Setting.

    ERIC Educational Resources Information Center

    Carnahan, Robert E.; And Others

    Since the identification of stress and the relationship of individual stress responses to physical and mental health, medical and behavioral professionals have been training individuals in coping strategies. To investigate the possibility of teaching cognitive coping skills to a nonclinical population in an academic setting, 41 college students…

  3. A Classroom Exercise in Spatial Analysis Using an Imaginary Data Set.

    ERIC Educational Resources Information Center

    Kopaska-Merkel, David C.

    One skill that elementary students need to acquire is the ability to analyze spatially distributed data. In this activity students are asked to complete the following tasks: (1) plot a set of data (related to "mud-sharks"--an imaginary fish) on a map of the state of Alabama, (2) identify trends in the data, (3) make graphs using the data…

  4. Toddler Subtraction with Large Sets: Further Evidence for an Analog-Magnitude Representation of Number

    ERIC Educational Resources Information Center

    Slaughter, Virginia; Kamppi, Dorian; Paynter, Jessica

    2006-01-01

    Two experiments were conducted to test the hypothesis that toddlers have access to an analog-magnitude number representation that supports numerical reasoning about relatively large numbers. Three-year-olds were presented with subtraction problems in which initial set size and proportions subtracted were systematically varied. Two sets of cookies…

  5. Using Mobile Phones to Increase Classroom Interaction

    ERIC Educational Resources Information Center

    Cobb, Stephanie; Heaney, Rose; Corcoran, Olivia; Henderson-Begg, Stephanie

    2010-01-01

    This study examines the possible benefits of using mobile phones to increase interaction and promote active learning in large classroom settings. First year undergraduate students studying Cellular Processes at the University of East London took part in a trial of a new text-based classroom interaction system and evaluated their experience by…

  6. Impact of Abbreviated Lecture with Interactive Mini-cases vs Traditional Lecture on Student Performance in the Large Classroom

    PubMed Central

    Nykamp, Diane L.; Momary, Kathryn M.

    2014-01-01

    Objective. To compare the impact of 2 different teaching and learning methods on student mastery of learning objectives in a pharmacotherapy module in the large classroom setting. Design. Two teaching and learning methods were implemented and compared in a required pharmacotherapy module for 2 years. The first year, multiple interactive mini-cases with in-class individual assessment and an abbreviated lecture were used to teach osteoarthritis; a traditional lecture with 1 in-class case discussion was used to teach gout. In the second year, the same topics were used but the methods were flipped. Student performance on pre/post individual readiness assessment tests (iRATs), case questions, and subsequent examinations were compared each year by the teaching and learning method and then between years by topic for each method. Students also voluntarily completed a 20-item evaluation of the teaching and learning methods. Assessment. Postpresentation iRATs were significantly higher than prepresentation iRATs for each topic each year with the interactive mini-cases; there was no significant difference in iRATs before and after traditional lecture. For osteoarthritis, postpresentation iRATs after interactive mini-cases in year 1 were significantly higher than postpresentation iRATs after traditional lecture in year 2; the difference in iRATs for gout per learning method was not significant. The difference between examination performance for osteoarthritis and gout was not significant when the teaching and learning methods were compared. On the student evaluations, 2 items were significant both years when answers were compared by teaching and learning method. Each year, students ranked their class participation higher with interactive cases than with traditional lecture, but both years they reported enjoying the traditional lecture format more. Conclusion.
Multiple interactive mini-cases with an abbreviated lecture improved immediate mastery of learning objectives compared to

  7. Confirming the Phylogeny of Mammals by Use of Large Comparative Sequence Data Sets

    PubMed Central

    Prasad, Arjun B.; Allard, Marc W.

    2008-01-01

    The ongoing generation of prodigious amounts of genomic sequence data from myriad vertebrates is providing unparalleled opportunities for establishing definitive phylogenetic relationships among species. The size and complexities of such comparative sequence data sets not only allow smaller and more difficult branches to be resolved but also present unique challenges, including large computational requirements and the negative consequences of systematic biases. To explore these issues and to clarify the phylogenetic relationships among mammals, we have analyzed a large data set of over 60 megabase pairs (Mb) of high-quality genomic sequence, which we generated from 41 mammals and 3 other vertebrates. All sequences are orthologous to a 1.9-Mb region of the human genome that encompasses the cystic fibrosis transmembrane conductance regulator gene (CFTR). To understand the characteristics and challenges associated with phylogenetic analyses of such a large data set, we partitioned the sequence data in several ways and utilized maximum likelihood, maximum parsimony, and Neighbor-Joining algorithms, implemented in parallel on Linux clusters. These studies yielded well-supported phylogenetic trees, largely confirming other recent molecular phylogenetic analyses. Our results provide support for rooting the placental mammal tree between Atlantogenata (Xenarthra and Afrotheria) and Boreoeutheria (Euarchontoglires and Laurasiatheria), illustrate the difficulty in resolving some branches even with large amounts of data (e.g., in the case of Laurasiatheria), and demonstrate the valuable role that very large comparative sequence data sets can play in refining our understanding of the evolutionary relationships of vertebrates. PMID:18453548

  8. Experiments and other methods for developing expertise with design of experiments in a classroom setting

    NASA Technical Reports Server (NTRS)

    Patterson, John W.

    1990-01-01

    The only way to gain genuine expertise in Statistical Process Control (SPC) and the design of experiments (DOX) is with repeated practice, but not on canned problems with dead data sets. Rather, one must negotiate a wide variety of problems each with its own peculiarities and its own constantly changing data. The problems should not be of the type for which there is a single, well-defined answer that can be looked up in a fraternity file or in some text. The problems should match as closely as possible the open-ended types for which there is always an abundance of uncertainty. These are the only kinds that arise in real research, whether that be basic research in academe or engineering research in industry. To gain this kind of experience, either as a professional consultant or as an industrial employee, takes years. Vast amounts of money, not to mention careers, must be put at risk. The purpose here is to outline some realistic simulation-type lab exercises that are so simple and inexpensive to run that the students can repeat them as often as desired at virtually no cost. Simulations also allow the instructor to design problems whose outcomes are as noisy as desired but still predictable within limits. Also, the instructor and the students can learn a great deal more from the postmortem conducted after the exercise is completed. One never knows for sure what the true data should have been when dealing only with real life experiments. To add a bit more realism to the exercises, it is sometimes desirable to make the students pay for each experimental result from a make-believe budget allocation for the problem.
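One way to realize such simulation-type exercises is a hidden response-surface generator: the instructor encodes a "true" model known only to them, adds tunable noise, and students spend their budget on noisy runs. The sketch below is an illustration only; the response surface, noise level, and 2^2 factorial design are invented, not taken from the paper.

```python
import random

rng = random.Random(42)  # fixed seed so a grading "postmortem" is reproducible

def simulated_experiment(x1, x2, noise_sd=1.0):
    # Hidden "true" response surface, known only to the instructor:
    # main effects for x1 and x2 plus an interaction term.
    true_response = 10.0 + 2.0 * x1 - 1.5 * x2 + 0.5 * x1 * x2
    # Each run returns fresh Gaussian noise, so repeated runs differ
    # but stay predictable within limits set by noise_sd.
    return true_response + rng.gauss(0.0, noise_sd)

# Students run a 2^2 factorial design; in class, each call would
# deduct from a make-believe budget allocation.
runs = [(x1, x2, simulated_experiment(x1, x2))
        for x1, x2 in [(-1, -1), (-1, 1), (1, -1), (1, 1)]]
for x1, x2, y in runs:
    print(x1, x2, round(y, 2))
```

Because the instructor knows the true surface, the postmortem can compare the students' fitted effects against the exact coefficients, which is impossible with real-life experiments.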

  9. Coaching as a Key Component in Teachers' Professional Development: Improving Classroom Practices in Head Start Settings. OPRE Report 2012-4

    ERIC Educational Resources Information Center

    Lloyd, Chrrishana M.; Modlin, Emmily L.

    2012-01-01

    Head Start CARES (Classroom-based Approaches and Resources for Emotion and Social Skill Promotion) is a large-scale, national research demonstration that was designed to test the effects of a one-year program aimed at improving pre-kindergarteners' social and emotional readiness for school. To facilitate the delivery of the program, teachers…

  10. Classroom Response Systems for Implementing "Interactive Inquiry" in Large Organic Chemistry Classes

    ERIC Educational Resources Information Center

    Morrison, Richard W.; Caughran, Joel A.; Sauers, Angela L.

    2014-01-01

    The authors have developed "sequence response applications" for classroom response systems (CRSs) that allow instructors to engage and actively involve students in the learning process, probe for common misconceptions regarding lecture material, and increase interaction between instructors and students. "Guided inquiry" and…

  11. Prekindergarten Teachers' Verbal References to Print during Classroom-Based, Large-Group Shared Reading

    ERIC Educational Resources Information Center

    Zucker, Tricia A.; Justice, Laura M.; Piasta, Shayne B.

    2009-01-01

    Purpose: The frequency with which adults reference print when reading with preschool-age children is associated with growth in children's print knowledge (e.g., L.M. Justice & H.K. Ezell, 2000, 2002). This study examined whether prekindergarten (pre-K) teachers naturally reference print during classroom shared reading and if verbal print…

  12. Out in the Classroom: Transgender Student Experiences at a Large Public University

    ERIC Educational Resources Information Center

    Pryor, Jonathan T.

    2015-01-01

    Faculty and peer interactions are 2 of the most important relationships for college students to foster (Astin, 1993). Transgender college students have an increasing visible presence on college campuses (Pusch, 2005), yet limited research exists on their experiences and struggles in the classroom environment (Garvey & Rankin, 2015; Renn,…

  13. The Classroom Observation Schedule to Measure Intentional Communication (COSMIC): An Observational Measure of the Intentional Communication of Children with Autism in an Unstructured Classroom Setting

    ERIC Educational Resources Information Center

    Pasco, Greg; Gordon, Rosanna K.; Howlin, Patricia; Charman, Tony

    2008-01-01

    The Classroom Observation Schedule to Measure Intentional Communication (COSMIC) was devised to provide ecologically valid outcome measures for a communication-focused intervention trial. Ninety-one children with autism spectrum disorder aged 6 years 10 months (SD 16 months) were videoed during their everyday snack, teaching and free play…

  14. Using Content-Specific Lyrics to Familiar Tunes in a Large Lecture Setting

    ERIC Educational Resources Information Center

    McLachlin, Derek T.

    2009-01-01

    Music can be used in lectures to increase student engagement and help students retain information. In this paper, I describe my use of biochemistry-related lyrics written to the tune of the theme to the television show, The Flintstones, in a large class setting (400-800 students). To determine student perceptions, the class was surveyed several…

  15. Preschoolers' Nonsymbolic Arithmetic with Large Sets: Is Addition More Accurate than Subtraction?

    ERIC Educational Resources Information Center

    Shinskey, Jeanne L.; Chan, Cindy Ho-man; Coleman, Rhea; Moxom, Lauren; Yamamoto, Eri

    2009-01-01

    Adult and developing humans share with other animals analog magnitude representations of number that support nonsymbolic arithmetic with large sets. This experiment tested the hypothesis that such representations may be more accurate for addition than for subtraction in children as young as 3 1/2 years of age. In these tasks, the experimenter hid…

  16. Influences of large sets of environmental exposures on immune responses in healthy adult men.

    PubMed

    Yi, Buqing; Rykova, Marina; Jäger, Gundula; Feuerecker, Matthias; Hörl, Marion; Matzel, Sandra; Ponomarev, Sergey; Vassilieva, Galina; Nichiporuk, Igor; Choukèr, Alexander

    2015-01-01

    Environmental factors have long been known to influence immune responses. In particular, clinical studies about the association between migration and increased risk of atopy/asthma have provided important information on the role of migration-associated large sets of environmental exposures in the development of allergic diseases. However, investigations of environmental effects on immune responses have mostly been limited to candidate environmental exposures, such as air pollution. The influences of large sets of environmental exposures on immune responses are still largely unknown. A simulated 520-d Mars mission provided an opportunity to investigate this topic. Six healthy males lived in a closed habitat simulating a spacecraft for 520 days. When they exited their "spacecraft" after the mission, the scenario was similar to that of migration, involving exposure to a new set of environmental pollutants and allergens. We measured multiple immune parameters in blood samples at chosen time points after the mission. At the early adaptation stage, highly enhanced cytokine responses were observed upon ex vivo antigen stimulation. For cell population frequencies, we found the subjects displayed increased neutrophils. These results may represent the immune changes that occur in healthy humans when migrating, indicating that large sets of environmental exposures may trigger aberrant immune activity. PMID:26306804

  18. DocCube: Multi-Dimensional Visualization and Exploration of Large Document Sets.

    ERIC Educational Resources Information Center

    Mothe, Josiane; Chrisment, Claude; Dousset, Bernard; Alaux, Joel

    2003-01-01

    Describes a user interface that provides global visualizations of large document sets to help users formulate the query that corresponds to their information needs. Highlights include concept hierarchies that users can browse to specify and refine information needs; knowledge discovery in databases and texts; and multidimensional modeling.…

  19. Large-scale detection of metals with a small set of fluorescent DNA-like chemosensors.

    PubMed

    Yuen, Lik Hang; Franzini, Raphael M; Tan, Samuel S; Kool, Eric T

    2014-10-15

    An important advantage of pattern-based chemosensor sets is their potential to detect and differentiate a large number of analytes with only few sensors. Here we test this principle at a conceptual limit by analyzing a large set of metal ion analytes covering essentially the entire periodic table, employing fluorescent DNA-like chemosensors on solid support. A tetrameric "oligodeoxyfluoroside" (ODF) library of 6561 members containing metal-binding monomers was screened for strong responders to 57 metal ions in solution. Our results show that a set of 9 chemosensors could successfully discriminate the 57 species, including alkali, alkaline earth, post-transition, transition, and lanthanide metals. As few as 6 ODF chemosensors could detect and differentiate 50 metals at 100 μM; sensitivity for some metals was achieved at midnanomolar ranges. A blind test with 50 metals further confirmed the discriminating power of the ODFs. PMID:25255102
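The abstract does not spell out the discrimination analysis, but the pattern-based principle it relies on can be sketched as nearest-pattern matching: each analyte produces a fluorescence response vector across the sensor set, and an unknown is assigned to the library pattern it most resembles. All sensor names and response values below are hypothetical.

```python
import math

# Hypothetical response library: metal ion -> fluorescence responses
# of a 3-sensor set (values invented for illustration).
library = {
    "Cu2+": [0.9, 0.1, 0.4],
    "Hg2+": [0.2, 0.8, 0.7],
    "Pb2+": [0.5, 0.5, 0.1],
}

def identify(pattern):
    # Nearest-pattern matching: the library entry with the smallest
    # Euclidean distance to the measured response vector wins.
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(library, key=lambda metal: dist(library[metal], pattern))

print(identify([0.85, 0.15, 0.35]))  # closest to the Cu2+ pattern
```

A few sensors suffice in principle because each added sensor multiplies the number of distinguishable response patterns, which is why 6-9 chemosensors can separate dozens of metals.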

  20. On basis set superposition error corrected stabilization energies for large n-body clusters.

    PubMed

    Walczak, Katarzyna; Friedrich, Joachim; Dolg, Michael

    2011-10-01

    In this contribution, we propose an approximate basis set superposition error (BSSE) correction scheme for the site-site function counterpoise and for the Valiron-Mayer function counterpoise correction of second order to account for the basis set superposition error in clusters with a large number of subunits. The accuracy of the proposed scheme has been investigated for a water cluster series at the CCSD(T), CCSD, MP2, and self-consistent field levels of theory using Dunning's correlation consistent basis sets. The BSSE corrected stabilization energies for a series of water clusters are presented. A study regarding the possible savings with respect to computational resources has been carried out as well as a monitoring of the basis set dependence of the approximate BSSE corrections. PMID:21992293
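For context, the site-site and Valiron-Mayer function counterpoise schemes mentioned above generalize the standard two-body Boys-Bernardi counterpoise correction, which for a dimer AB can be written as follows (superscripts denote the basis set used, subscripts the system evaluated; this is the textbook form, not the paper's n-body approximation):

```latex
% Counterpoise-corrected interaction energy of dimer AB:
E_{\mathrm{int}}^{\mathrm{CP}} = E_{AB}^{AB} - E_{A}^{AB} - E_{B}^{AB}
% BSSE estimate added to the uncorrected interaction energy
% (each monomer is lower in energy in the full dimer basis, so
% \delta_{\mathrm{BSSE}} \ge 0):
\delta_{\mathrm{BSSE}} = \left( E_{A}^{A} - E_{A}^{AB} \right)
                       + \left( E_{B}^{B} - E_{B}^{AB} \right)
```

For an n-body cluster the number of such monomer-in-cluster-basis calculations grows rapidly, which is the cost the approximate scheme in this paper is designed to avoid.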

  1. Large Code Set for Double User Capacity and Low PAPR Level in Multicarrier Systems

    NASA Astrophysics Data System (ADS)

    Anwar, Khoirul; Saito, Masato; Hara, Takao; Okada, Minoru

    In this paper, a new large spreading code set with a uniform low cross-correlation is proposed. The proposed code set is capable of (1) increasing the number of assigned users (capacity) in a multicarrier code division multiple access (MC-CDMA) system and (2) reducing the peak-to-average power ratio (PAPR) of an orthogonal frequency division multiplexing (OFDM) system. We derive the new code set and present an example to demonstrate performance improvements of OFDM and MC-CDMA systems. Our proposed code set with code length N provides K=2N+1 codes, supporting up to (2N+1) users, and exhibits lower cross-correlation compared to the existing spreading code sets. Our results with N=16 subcarriers confirm that the proposed code set outperforms the current pseudo-orthogonal carrier interferometry (POCI) code set with a gain of 5 dB at a bit-error-rate (BER) level of 10^-4 in the additive white Gaussian noise (AWGN) channel and a gain of more than 3.6 dB in a multipath fading channel.
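The paper's proposed code set and the POCI construction are not reproduced here, but the figure of merit it optimizes, peak cross-correlation across a code set, is easy to state in code. The sketch below measures it for orthogonal Walsh-Hadamard codes (the classical baseline: only N codes of length N, but zero cross-correlation); the paper's contribution is trading a small uniform cross-correlation for K=2N+1 codes.

```python
import itertools

def hadamard(n):
    # Sylvester construction of Walsh-Hadamard codes; n must be a power of 2.
    H = [[1]]
    while len(H) < n:
        H = [row + row for row in H] + \
            [row + [-x for x in row] for row in H]
    return H

def max_cross_correlation(codes):
    # Peak absolute normalized inner product over all distinct code pairs.
    n = len(codes[0])
    peak = 0.0
    for a, b in itertools.combinations(codes, 2):
        r = abs(sum(x * y for x, y in zip(a, b))) / n
        peak = max(peak, r)
    return peak

codes = hadamard(16)
print(len(codes), max_cross_correlation(codes))  # 16 0.0
```

A larger code set with a guaranteed low (rather than zero) peak cross-correlation accepts a controlled amount of multiple-access interference in exchange for more than doubling the user capacity.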

  2. Implementing Child-focused Activity Meter Utilization into the Elementary School Classroom Setting Using a Collaborative Community-based Approach

    PubMed Central

    Lynch, BA; Jones, A; Biggs, BK; Kaufman, T; Cristiani, V; Kumar, S; Quigg, S; Maxson, J; Swenson, L; Jacobson, N

    2016-01-01

    Introduction The prevalence of pediatric obesity has increased over the past 3 decades and is a pressing public health problem. New technology advancements that can encourage more physical activity in children are needed. The Zamzee Program is an activity meter linked to a motivational website designed for children 8–14 years of age. The objective of the study was to use a collaborative approach between a medical center, the private sector, and local school staff to assess the feasibility of using the Zamzee Program in the school-based setting to improve physical activity levels in children. Methods This was a pilot 8-week observational study offered to all children in one fifth-grade classroom. Body mass index (BMI), the amount of physical activity by 3-day recall survey, and satisfaction with usability of the Zamzee Program were measured pre- and post-study. Results Out of 11 children who enrolled in the study, 7 completed all study activities. In those who completed the study, the median (interquartile range) total activity time by survey increased by 17 (1042) minutes and the BMI percentile change was 0 (8). Both children and their caregivers found the Zamzee Activity Meter (6/7) and website (6/7) "very easy" or "easy" to use. Conclusion The Zamzee Program was found to be usable but did not significantly improve physical activity levels or BMI. Collaborative obesity intervention projects involving medical centers, the private sector, and local schools are feasible, but their effectiveness needs to be evaluated in larger-scale studies. PMID:27042382

  3. An Analysis Framework Addressing the Scale and Legibility of Large Scientific Data Sets

    SciTech Connect

    Childs, H R

    2006-11-20

    Much of the previous work in the large data visualization area has solely focused on handling the scale of the data. This task is clearly a great challenge and necessary, but it is not sufficient. Applying standard visualization techniques to large scale data sets often creates complicated pictures where meaningful trends are lost. A second challenge, then, is to also provide algorithms that simplify what an analyst must understand, using either visual or quantitative means. This challenge can be summarized as improving the legibility or reducing the complexity of massive data sets. Fully meeting both of these challenges is the work of many, many PhD dissertations. In this dissertation, we describe some new techniques to address both the scale and legibility challenges, in hope of contributing to the larger solution. In addition to our assumption of simultaneously addressing both scale and legibility, we add an additional requirement that the solutions considered fit well within an interoperable framework for diverse algorithms, because a large suite of algorithms is often necessary to fully understand complex data sets. For scale, we present a general architecture for handling large data, as well as details of a contract-based system for integrating advanced optimizations into a data flow network design. We also describe techniques for volume rendering and performing comparisons at the extreme scale. For legibility, we present several techniques. Most noteworthy are equivalence class functions, a technique to drive visualizations using statistical methods, and line-scan based techniques for characterizing shape.

  4. PrestoPronto: a code devoted to handling large data sets

    NASA Astrophysics Data System (ADS)

    Figueroa, S. J. A.; Prestipino, C.

    2016-05-01

    The software PrestoPronto consists of a full graphical user interface (GUI) program aimed at the analysis of large X-ray Absorption Spectroscopy data sets. Written in Python, it is free and open source. The code is able to read large datasets, apply calibration and alignment corrections, and perform classical data analysis, from extraction of the signal to EXAFS fit. The package also includes GUI programs to perform Principal Component Analysis and Linear Combination Fits. The main benefit of this program is that it allows users to quickly follow the evolution of time-resolved experiments coming from Quick-EXAFS (QEXAFS) and dispersive EXAFS beamlines.
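The core of a linear combination fit is ordinary least squares: each measured spectrum is modeled as a weighted sum of reference spectra, and the weights track, for example, how a species' fraction evolves over a time-resolved series. The sketch below is not PrestoPronto's implementation; it solves the two-reference case directly via the 2x2 normal equations, with invented spectra.

```python
def lcf_two_refs(sample, ref1, ref2):
    # Least-squares weights (a, b) minimizing ||sample - a*ref1 - b*ref2||^2,
    # obtained by solving the 2x2 normal equations analytically.
    s11 = sum(x * x for x in ref1)
    s22 = sum(x * x for x in ref2)
    s12 = sum(x * y for x, y in zip(ref1, ref2))
    t1 = sum(x * y for x, y in zip(ref1, sample))
    t2 = sum(x * y for x, y in zip(ref2, sample))
    det = s11 * s22 - s12 * s12  # nonzero when the references are independent
    a = (t1 * s22 - t2 * s12) / det
    b = (t2 * s11 - t1 * s12) / det
    return a, b

# Invented reference "spectra" and a sample built as 30% ref1 + 70% ref2.
ref1 = [0.0, 1.0, 2.0, 1.0, 0.0]
ref2 = [1.0, 0.0, 1.0, 0.0, 1.0]
sample = [0.3 * x + 0.7 * y for x, y in zip(ref1, ref2)]
a, b = lcf_two_refs(sample, ref1, ref2)
print(round(a, 6), round(b, 6))  # 0.3 0.7
```

Real LCF tools additionally constrain the weights to be non-negative and often to sum to one, since they represent species fractions.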

  5. Validating a large geophysical data set: Experiences with satellite-derived cloud parameters

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie

    1992-01-01

    We are validating the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and are using the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The data handling needed for this work is possible only with the help of a suite of interactive graphical and numerical analysis tools. Level 3 (gridded) data is the common form in which large data sets of this type are distributed for scientific analysis. We find that Level 3 data is inadequate for the data comparisons required for validation. Level 2 data (individual measurements in geophysical units) is needed. A sampling problem arises when individual measurements, which are not uniformly distributed in space or time, are used for the comparisons. Standard 'interpolation' methods involve fitting the measurements for each data set to surfaces, which are then compared. We are experimenting with formal criteria for selecting geographical regions, based upon the spatial frequency and variability of measurements, that allow us to quantify the uncertainty due to sampling. As part of this project, we are also dealing with ways to keep track of constraints placed on the output by assumptions made in the computer code. The need to work with Level 2 data introduces a number of other data handling issues, such as accessing data files across machine types, meeting large data storage requirements, accessing other validated data sets, processing speed
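The distinction between Level 2 and Level 3 data can be made concrete: Level 3 gridding averages the individual Level 2 measurements falling in each cell, and that average discards exactly the nonuniform sampling information the validation needs. A minimal sketch (grid cell size, coordinates, and values are invented):

```python
from collections import defaultdict

def grid_level2(measurements, cell_deg=5.0):
    # measurements: iterable of (lat, lon, value) Level 2 records.
    # Returns Level 3-style output: per-cell (mean, count). The count is
    # the sampling information that a gridded mean alone throws away.
    acc = defaultdict(lambda: [0.0, 0])
    for lat, lon, val in measurements:
        key = (int(lat // cell_deg), int(lon // cell_deg))
        acc[key][0] += val
        acc[key][1] += 1
    return {k: (s / n, n) for k, (s, n) in acc.items()}

obs = [(10.2, 20.1, 1.0), (11.9, 21.5, 3.0), (40.0, -75.0, 5.0)]
grid = grid_level2(obs)
# The 10-15N / 20-25E cell averages two very different measurements,
# while the 40-45N / 75-80W cell holds a single one.
```

Comparing two data sets on such grids treats a one-sample cell and a densely sampled cell as equally reliable, which is why the abstract argues the individual Level 2 measurements must be used for validation.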

  6. A Scalable Approach for Protein False Discovery Rate Estimation in Large Proteomic Data Sets.

    PubMed

    Savitski, Mikhail M; Wilhelm, Mathias; Hahne, Hannes; Kuster, Bernhard; Bantscheff, Marcus

    2015-09-01

    Calculating the number of confidently identified proteins and estimating false discovery rate (FDR) is a challenge when analyzing very large proteomic data sets such as entire human proteomes. Biological and technical heterogeneity in proteomic experiments further add to the challenge and there are strong differences in opinion regarding the conceptual validity of a protein FDR and no consensus regarding the methodology for protein FDR determination. There are also limitations inherent to the widely used classic target-decoy strategy that particularly show when analyzing very large data sets and that lead to a strong over-representation of decoy identifications. In this study, we investigated the merits of the classic, as well as a novel target-decoy-based protein FDR estimation approach, taking advantage of a heterogeneous data collection comprised of ∼19,000 LC-MS/MS runs deposited in ProteomicsDB (https://www.proteomicsdb.org). The "picked" protein FDR approach treats target and decoy sequences of the same protein as a pair rather than as individual entities and chooses either the target or the decoy sequence depending on which receives the highest score. We investigated the performance of this approach in combination with q-value based peptide scoring to normalize sample-, instrument-, and search engine-specific differences. The "picked" target-decoy strategy performed best when protein scoring was based on the best peptide q-value for each protein yielding a stable number of true positive protein identifications over a wide range of q-value thresholds. We show that this simple and unbiased strategy eliminates a conceptual issue in the commonly used "classic" protein FDR approach that causes overprediction of false-positive protein identification in large data sets. The approach scales from small to very large data sets without losing performance, consistently increases the number of true-positive protein identifications and is readily implemented in
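The "picked" strategy described above is straightforward to sketch: each protein's target and decoy sequences are treated as a pair, only the higher-scoring member is kept, and the FDR at any score cutoff is the running fraction of surviving decoys. This is a minimal illustration with invented scores, not the ProteomicsDB implementation, which additionally scores proteins by best peptide q-value.

```python
def picked_protein_fdr(scores):
    # scores: protein id -> (target_score, decoy_score).
    # Keep whichever member of each target/decoy pair scores higher,
    # then estimate FDR as the cumulative decoy fraction down the ranking.
    picked = []
    for protein, (t, d) in scores.items():
        if t >= d:
            picked.append((t, False))  # target sequence wins the pair
        else:
            picked.append((d, True))   # decoy sequence wins the pair
    picked.sort(key=lambda x: x[0], reverse=True)
    results, decoys = [], 0
    for rank, (score, is_decoy) in enumerate(picked, start=1):
        decoys += is_decoy
        results.append((score, decoys / rank))  # FDR at this cutoff
    return results

scores = {"P1": (9.0, 1.0), "P2": (8.0, 2.0),
          "P3": (3.0, 7.0), "P4": (6.0, 0.5)}
for score, fdr in picked_protein_fdr(scores):
    print(score, round(fdr, 3))
```

Because a low-scoring target can no longer accumulate alongside its own high-scoring decoy (only one of the pair survives), decoys stop being over-represented as the data set grows, which is the conceptual fix the abstract describes.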

  7. The search for structure - Object classification in large data sets. [for astronomers]

    NASA Technical Reports Server (NTRS)

    Kurtz, Michael J.

    1988-01-01

    Research concerning object classification schemes is reviewed, focusing on large data sets. Classification techniques are discussed, including syntactic and decision-theoretic methods, fuzzy techniques, and stochastic and fuzzy grammars. Consideration is given to the automation of MK classification (Morgan and Keenan, 1973) and to other problems associated with the classification of spectra. In addition, the classification of galaxies is examined, including the problems of systematic errors, blended objects, galaxy types, and galaxy clusters.

  8. Faculty and student experiences with Web-based discussion groups in a large lecture setting.

    PubMed

    Harden, Janet Kula

    2003-01-01

    The exchange of ideas in a discussion format is a more effective way of developing critical thinking in students than a traditional lecture format. Although research has shown that discussion groups are more effective for developing skills in application, analysis, and synthesis of content, they are difficult to implement in a large lecture setting. The author discusses how computer discussion groups were incorporated into a class of 117 nursing students. PMID:12544613

  9. Coffee Shops, Classrooms and Conversations: public engagement and outreach in a large interdisciplinary research Hub

    NASA Astrophysics Data System (ADS)

    Holden, Jennifer A.

    2014-05-01

    Public engagement and outreach activities increasingly rely on specialist staff for co-ordination, training, and support for researchers; they are also becoming an expected component of large research investments. Here, the experience of public engagement and outreach at a large, interdisciplinary Research Hub is described. dot.rural, based at the University of Aberdeen UK, is a £11.8 million Research Councils UK Rural Digital Economy Hub, funded as part of the RCUK Digital Economy Theme (2009-2015). Digital Economy research aims to realise the transformational impact of digital technologies on aspects of the environment, community life, cultural experiences, future society, and the economy. The dot.rural Hub involves 92 researchers from 12 different disciplines, including Geography, Hydrology and Ecology. Public engagement and outreach is embedded in the dot.rural Digital Economy Hub via an Outreach Officer. Alongside this position, public engagement and outreach activities are a compulsory part of PhD student contracts. Public engagement and outreach activities at the dot.rural Hub involve individuals and groups in both formal and informal settings organised by dot.rural and other organisations. Activities in the realms of education, public engagement, and traditional and social media are determined by a set of Underlying Principles designed for the Hub by the Outreach Officer. The underlying engagement and outreach principles match funding agency requirements and expectations alongside researcher demands and the user-led nature of Digital Economy research. All activities include researchers alongside the Outreach Officer, are research-informed, and are embedded into the specific projects that form the Hub. Successful public engagement activities have included participation in a Café Scientifique series, workshops in primary and secondary schools, and online activities such as I'm a Scientist, Get Me Out of Here. From how to engage 8 year olds with making hydrographs more understandable to members of

  10. Automatic alignment of individual peaks in large high-resolution spectral data sets

    NASA Astrophysics Data System (ADS)

    Stoyanova, Radka; Nicholls, Andrew W.; Nicholson, Jeremy K.; Lindon, John C.; Brown, Truman R.

    2004-10-01

    Pattern recognition techniques are effective tools for reducing the information contained in large spectral data sets to a much smaller number of significant features which can then be used to make interpretations about the chemical or biochemical system under study. Often the effectiveness of such approaches is impeded by experimental and instrument induced variations in the position, phase, and line width of the spectral peaks. Although characterizing the cause and magnitude of these fluctuations could be important in its own right (pH-induced NMR chemical shift changes, for example) in general they obscure the process of pattern discovery. One major area of application is the use of large databases of 1H NMR spectra of biofluids such as urine for investigating perturbations in metabolic profiles caused by drugs or disease, a process now termed metabonomics. Frequency shifts of individual peaks are the dominant source of such unwanted variations in this type of data. In this paper, an automatic procedure for aligning the individual peaks in the data set is described and evaluated. The proposed method will be vital for the efficient and automatic analysis of large metabonomic data sets and should also be applicable to other types of data.
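
    The core alignment step can be illustrated with a much-simplified stand-in: shift each spectrum's peak segment by the integer offset that maximises its overlap (dot product) with a reference segment. The data below are synthetic, and the paper's actual procedure differs in detail:

```python
# Simplified sketch of frequency-shift correction for a single peak region.
# The segment is slid over the reference within +/- max_shift points and
# the best-matching integer offset is applied.

def best_shift(reference, segment, max_shift=5):
    """Return the integer shift of `segment` that best matches `reference`."""
    def score(shift):
        s = 0.0
        for i, r in enumerate(reference):
            j = i + shift
            if 0 <= j < len(segment):
                s += r * segment[j]
        return s
    return max(range(-max_shift, max_shift + 1), key=score)

def align(reference, segment, max_shift=5):
    """Apply the best shift, zero-padding points that fall off the ends."""
    shift = best_shift(reference, segment, max_shift)
    return [segment[i + shift] if 0 <= i + shift < len(segment) else 0.0
            for i in range(len(segment))]

reference = [0, 0, 1, 4, 1, 0, 0]
shifted   = [0, 0, 0, 0, 1, 4, 1]   # same peak, displaced two points right
print(best_shift(reference, shifted))
print(align(reference, shifted))
```

    Applied peak by peak rather than to the whole spectrum, this kind of local shift search is what allows pH-induced chemical shift changes of individual resonances to be corrected independently.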

  11. Moving Large Data Sets Over High-Performance Long Distance Networks

    SciTech Connect

    Hodson, Stephen W; Poole, Stephen W; Ruwart, Thomas; Settlemyer, Bradley W

    2011-04-01

    In this project we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing large data sets to a destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes. We describe the device information required to achieve high levels of I/O performance and discuss how this data is applicable in use cases beyond data movement performance.

  12. Non-rigid Registration for Large Sets of Microscopic Images on Graphics Processors

    PubMed Central

    Ruiz, Antonio; Ujaldon, Manuel; Cooper, Lee

    2014-01-01

    Microscopic imaging is an important tool for characterizing tissue morphology and pathology. 3D reconstruction and visualization of large sample tissue structure requires registration of large sets of high-resolution images. However, the scale of this problem presents a challenge for automatic registration methods. In this paper we present a novel method for efficient automatic registration using graphics processing units (GPUs) and parallel programming. Comparing a C++ CPU implementation with Compute Unified Device Architecture (CUDA) libraries and pthreads running on the GPU, we achieve a speed-up factor of up to 4.11× with a single GPU and 6.68× with a GPU pair. We present execution times for a benchmark composed of two sets of large-scale images: mouse placenta (16K × 16K pixels) and breast cancer tumors (23K × 62K pixels). Registering a typical sample composed of 500 consecutive slides takes more than 12 hours in C++; this was reduced to less than 2 hours using two GPUs, with very promising scalability for extending those gains on a larger number of GPUs in a distributed system. PMID:25328635

  13. COLLABORATIVE RESEARCH: Parallel Analysis Tools and New Visualization Techniques for Ultra-Large Climate Data Set

    SciTech Connect

    Middleton, Don; Haley, Mary

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  14. Breeding and Genetics Symposium: really big data: processing and analysis of very large data sets.

    PubMed

    Cole, J B; Newman, S; Foertter, F; Aguilar, I; Coffey, M

    2012-03-01

    Modern animal breeding data sets are large and getting larger, due in part to recent availability of high-density SNP arrays and cheap sequencing technology. High-performance computing methods for efficient data warehousing and analysis are under development. Financial and security considerations are important when using shared clusters. Sound software engineering practices are needed, and it is better to use existing solutions when possible. Storage requirements for genotypes are modest, although full-sequence data will require greater storage capacity. Storage requirements for intermediate and results files for genetic evaluations are much greater, particularly when multiple runs must be stored for research and validation studies. The greatest gains in accuracy from genomic selection have been realized for traits of low heritability, and there is increasing interest in new health and management traits. The collection of sufficient phenotypes to produce accurate evaluations may take many years, and high-reliability proofs for older bulls are needed to estimate marker effects. Data mining algorithms applied to large data sets may help identify unexpected relationships in the data, and improved visualization tools will provide insights. Genomic selection using large data requires a lot of computing power, particularly when large fractions of the population are genotyped. Theoretical improvements have made possible the inversion of large numerator relationship matrices, permitted the solving of large systems of equations, and produced fast algorithms for variance component estimation. Recent work shows that single-step approaches combining BLUP with a genomic relationship (G) matrix have similar computational requirements to traditional BLUP, and the limiting factor is the construction and inversion of G for many genotypes. 
A naïve algorithm for creating G for 14,000 individuals required almost 24 h to run, but custom libraries and parallel computing reduced that to
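
    For illustration, one common construction of the genomic relationship matrix G (VanRaden's first method, assumed here as a representative choice; the abstract does not specify the algorithm) can be sketched on a toy 0/1/2-coded genotype matrix far smaller than the 14,000-individual case mentioned:

```python
# Toy sketch of building a genomic relationship matrix G:
# Z = M - 2p (genotypes centred by allele frequency), G = ZZ' / (2 * sum p(1-p)).

def g_matrix(genotypes):
    """genotypes: list of individuals, each a list of 0/1/2 SNP codes."""
    n = len(genotypes)
    m = len(genotypes[0])
    # Allele frequency per marker.
    p = [sum(ind[j] for ind in genotypes) / (2.0 * n) for j in range(m)]
    # Centre the genotype codes.
    z = [[ind[j] - 2.0 * p[j] for j in range(m)] for ind in genotypes]
    denom = 2.0 * sum(pj * (1.0 - pj) for pj in p)
    return [[sum(z[i][k] * z[j][k] for k in range(m)) / denom
             for j in range(n)] for i in range(n)]

G = g_matrix([[0, 1, 2], [1, 1, 1], [2, 1, 0]])
for row in G:
    print([round(v, 2) for v in row])
```

    The naïve construction above is O(n²m), which is why custom libraries and parallel computing matter at the scale the abstract describes.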

  15. The gradient boosting algorithm and random boosting for genome-assisted evaluation in large data sets.

    PubMed

    González-Recio, O; Jiménez-Montero, J A; Alenda, R

    2013-01-01

    In the next few years, with the advent of high-density single nucleotide polymorphism (SNP) arrays and genome sequencing, genomic evaluation methods will need to deal with a large number of genetic variants and an increasing sample size. The boosting algorithm is a machine-learning technique that may alleviate the drawbacks of dealing with such large data sets. This algorithm combines different predictors in a sequential manner with some shrinkage on them; each predictor is applied consecutively to the residuals from the committee formed by the previous ones to form a final prediction based on a subset of covariates. Here, a detailed description is provided and examples using a toy data set are included. A modification of the algorithm called "random boosting" was proposed to increase predictive ability and decrease computation time of genome-assisted evaluation in large data sets. Random boosting uses a random selection of markers to add a subsequent weak learner to the predictive model. These modifications were applied to a real data set composed of 1,797 bulls genotyped for 39,714 SNP. Deregressed proofs of 4 yield traits and 1 type trait from January 2009 routine evaluations were used as dependent variables. A 2-fold cross-validation scenario was implemented. Sires born before 2005 were used as a training sample (1,576 and 1,562 for production and type traits, respectively), whereas younger sires were used as a testing sample to evaluate predictive ability of the algorithm on yet-to-be-observed phenotypes. Comparison with the original algorithm was provided. The predictive ability of the algorithm was measured as Pearson correlations between observed and predicted responses. Further, estimated bias was computed as the average difference between observed and predicted phenotypes. 
The results showed that the modification of the original boosting algorithm could be run in 1% of the time used with the original algorithm and with negligible differences in accuracy
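
    The "random boosting" idea above can be sketched schematically: each round fits a weak learner to the current residuals using only a random subset of markers, then adds a shrunken copy of it to the model. The one-marker mean predictor used as the weak learner, and the data, are illustrative assumptions, not the paper's exact learner:

```python
import random

# Schematic sketch of random boosting: residual fitting on a random marker
# subset, with shrinkage. X rows are individuals, columns are 0/1/2 markers.

def fit_stump(X, residuals, features):
    """From `features`, pick the marker whose per-genotype residual means
    give the smallest squared error; return (feature, {genotype: mean})."""
    best = None
    for f in features:
        groups = {}
        for xi, r in zip(X, residuals):
            groups.setdefault(xi[f], []).append(r)
        means = {g: sum(v) / len(v) for g, v in groups.items()}
        sse = sum((r - means[xi[f]]) ** 2 for xi, r in zip(X, residuals))
        if best is None or sse < best[0]:
            best = (sse, f, means)
    return best[1], best[2]

def random_boost(X, y, rounds=50, shrinkage=0.1, subset=2, seed=0):
    rng = random.Random(seed)
    m = len(X[0])
    pred = [0.0] * len(y)
    for _ in range(rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        features = rng.sample(range(m), subset)  # random marker subset
        f, means = fit_stump(X, residuals, features)
        pred = [pi + shrinkage * means.get(xi[f], 0.0)
                for pi, xi in zip(pred, X)]
    return pred

X = [[0, 1, 2], [2, 1, 0], [1, 0, 2], [2, 2, 1]]
y = [1.0, 3.0, 1.5, 3.5]
print([round(p, 2) for p in random_boost(X, y)])
```

    Restricting each round to a marker subset is what gives the reported speed-up: the per-round search cost scales with the subset size rather than with all 39,714 SNP.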

  16. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1994-01-01

    Envision is an interactive environment that provides researchers in the earth sciences convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy to use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ and ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.

  17. Neuron-synapse IC chip-set for large-scale chaotic neural networks.

    PubMed

    Horio, Y; Aihara, K; Yamamoto, O

    2003-01-01

    We propose a neuron-synapse integrated circuit (IC) chip-set for large-scale chaotic neural networks. We use switched-capacitor (SC) circuit techniques to implement a three-internal-state transiently chaotic neural network model. The SC chaotic neuron chip faithfully reproduces complex chaotic dynamics in real numbers through the continuous state variables of the analog circuitry. We can digitally control most of the model parameters by means of programmable capacitive arrays embedded in the SC chaotic neuron chip. Since the output of the neuron is converted into a digital pulse according to the all-or-nothing property of an axon, we design the synapse chip with digital circuits. We propose a memory-based synapse circuit architecture to achieve rapid calculation of a vast number of weighted summations. Both the SC neuron and the digital synapse circuits have been fabricated as ICs. We have tested these IC chips extensively and confirmed the functions and performance of the chip-set. The proposed neuron-synapse IC chip-set makes it possible to construct a scalable and reconfigurable large-scale chaotic neural network with 10000 neurons and 10000² synaptic connections. PMID:18244585

  18. Tiny videos: a large data set for nonparametric video retrieval and frame classification.

    PubMed

    Karpenko, Alexandre; Aarabi, Parham

    2011-03-01

    In this paper, we present a large database of over 50,000 user-labeled videos collected from YouTube. We develop a compact representation called "tiny videos" that achieves high video compression rates while retaining the overall visual appearance of the video as it varies over time. We show that frame sampling using affinity propagation-an exemplar-based clustering algorithm-achieves the best trade-off between compression and video recall. We use this large collection of user-labeled videos in conjunction with simple data mining techniques to perform related video retrieval, as well as classification of images and video frames. The classification results achieved by tiny videos are compared with the tiny images framework [24] for a variety of recognition tasks. The tiny images data set consists of 80 million images collected from the Internet. These are the largest labeled research data sets of videos and images available to date. We show that tiny videos are better suited for classifying scenery and sports activities, while tiny images perform better at recognizing objects. Furthermore, we demonstrate that combining the tiny images and tiny videos data sets improves classification precision in a wider range of categories. PMID:21252400

  19. Classroom-Based Interventions and Teachers' Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings

    ERIC Educational Resources Information Center

    Zhai, Fuhua; Raver, C. Cybele; Li-Grining, Christine

    2011-01-01

    Preschool teachers' job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers' perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job…

  20. An Analogous Study of Children's Attitudes Toward School in an Open Classroom Environment as Opposed to a Conventional Setting.

    ERIC Educational Resources Information Center

    Zeli, Doris Conti

    A study sought to determine whether intermediate age children exposed to open classroom teaching strategy have a more positive attitude toward school than intermediate age children exposed to conventional teaching strategy. The hypothesis was that there would be no significant difference in attitude between the two groups. The study was limited to…

  1. Science in the Classroom: Finding a Balance between Autonomous Exploration and Teacher-Led Instruction in Preschool Settings

    ERIC Educational Resources Information Center

    Nayfeld, Irena; Brenneman, Kimberly; Gelman, Rochel

    2011-01-01

    Research Findings: This paper reports on children's use of science materials in preschool classrooms during their free choice time. Baseline observations showed that children and teachers rarely spend time in the designated science area. An intervention was designed to "market" the science center by introducing children to 1 science tool, the…

  2. Initial Validation of the Prekindergarten Classroom Observation Tool and Goal Setting System for Data-Based Coaching

    ERIC Educational Resources Information Center

    Crawford, April D.; Zucker, Tricia A.; Williams, Jeffrey M.; Bhavsar, Vibhuti; Landry, Susan H.

    2013-01-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based…

  3. Child and Setting Characteristics Affecting the Adult Talk Directed at Preschoolers with Autism Spectrum Disorder in the Inclusive Classroom

    ERIC Educational Resources Information Center

    Irvin, Dwight W.; Boyd, Brian A.; Odom, Samuel L.

    2015-01-01

    Difficulty with social competence is a core deficit of autism spectrum disorder. Research on typically developing children and children with disabilities, in general, suggests the adult talk received in the classroom is related to their social development. The aims of this study were to examine (1) the types and amounts of adult talk children with…

  4. Analogies as Tools for Meaning Making in Elementary Science Education: How Do They Work in Classroom Settings?

    ERIC Educational Resources Information Center

    Guerra-Ramos, Maria Teresa

    2011-01-01

    In this paper there is a critical overview of the role of analogies as tools for meaning making in science education, their advantages and disadvantages. Two empirical studies on the use of analogies in primary classrooms are discussed and analysed. In the first study, the "string circuit" analogy was used in the teaching of electric circuits with…

  5. Multimodal Literacy Practices in the Indigenous Sámi Classroom: Children Navigating in a Complex Multilingual Setting

    ERIC Educational Resources Information Center

    Pietikäinen, Sari; Pitkänen-Huhta, Anne

    2013-01-01

    This article explores multimodal literacy practices in a transforming multilingual context of an indigenous and endangered Sámi language classroom. Looking at literacy practices as embedded in a complex and shifting terrain of language ideologies, language norms, and individual experiences and attitudes, we examined how multilingual Sámi children…

  6. Web-Queryable Large-Scale Data Sets for Hypothesis Generation in Plant Biology

    PubMed Central

    Brady, Siobhan M.; Provart, Nicholas J.

    2009-01-01

    The approaching end of the 21st century's first decade marks an exciting time for plant biology. Several National Science Foundation Arabidopsis 2010 Projects will conclude, and whether or not the stated goal of the National Science Foundation 2010 Program—to determine the function of 25,000 Arabidopsis genes by 2010—is reached, these projects and others in a similar vein, such as those performed by the AtGenExpress Consortium and various plant genome sequencing initiatives, have generated important and unprecedented large-scale data sets. While providing significant biological insights for the individual laboratories that generated them, these data sets, in conjunction with the appropriate tools, are also permitting plant biologists worldwide to gain new insights into their own biological systems of interest, often at a mouse click through a Web browser. This review provides an overview of several such genomic, epigenomic, transcriptomic, proteomic, and metabolomic data sets and describes Web-based tools for querying them in the context of hypothesis generation for plant biology. We provide five biological examples of how such tools and data sets have been used to provide biological insight. PMID:19401381

  7. Coresets vs clustering: comparison of methods for redundancy reduction in very large white matter fiber sets

    NASA Astrophysics Data System (ADS)

    Alexandroni, Guy; Zimmerman Moreno, Gali; Sochen, Nir; Greenspan, Hayit

    2016-03-01

    Recent advances in Diffusion Weighted Magnetic Resonance Imaging (DW-MRI) of white matter, in conjunction with improved tractography, produce impressive reconstructions of White Matter (WM) pathways. These pathways (fiber sets) often contain hundreds of thousands of fibers or more. In order to make fiber-based analysis more practical, the fiber set needs to be preprocessed to eliminate redundancies and to keep only essential representative fibers. In this paper we demonstrate and compare two distinctive frameworks for selecting this reduced set of fibers. The first framework entails pre-clustering the fibers using k-means, followed by hierarchical clustering and replacing each cluster with one representative; for the second clustering stage, seven distance metrics were evaluated. The second framework is based on an efficient geometric approximation paradigm named coresets. Coresets present a new approach to optimization and have had considerable success, especially in tasks requiring large computation time and/or memory. We propose a modified version of the coresets algorithm, Density Coreset, which is used for extracting the main fibers from dense datasets, leaving a small set that represents the main structures and connectivity of the brain. A novel approach, based on a 3D indicator structure, is used for comparing the frameworks. This comparison was applied to High Angular Resolution Diffusion Imaging (HARDI) scans of 4 healthy individuals. We show that, among the clustering-based methods, the cosine distance gives the best performance, and that in comparing the clustering schemes with coresets, the Density Coreset method achieves the best performance.
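
    The general "keep a small representative set" paradigm can be illustrated with a much simpler geometric stand-in: farthest-first traversal, which greedily picks k representatives so that every remaining point is close to one of them. This is only a sketch of the redundancy-reduction idea, not the Density Coreset algorithm from the paper, and the 2D points stand in for full fiber trajectories:

```python
# Farthest-first traversal: a greedy representative-selection sketch.
# Starting from the first point, repeatedly add the point farthest from
# its nearest already-chosen representative.

def farthest_first(points, k):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    reps = [points[0]]
    while len(reps) < k:
        nxt = max(points, key=lambda p: min(dist(p, r) for r in reps))
        reps.append(nxt)
    return reps

points = [(0, 0), (0.1, 0), (5, 5), (5.1, 5), (10, 0)]
print(farthest_first(points, 3))
```

    Note how near-duplicates such as (0, 0)/(0.1, 0) contribute only one representative, which is precisely the redundancy being eliminated.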

  8. Heuristic method for searches on large data-sets organised using network models

    NASA Astrophysics Data System (ADS)

    Ruiz-Fernández, D.; Quintana-Pacheco, Y.

    2016-05-01

    Searches on large data-sets have become an important issue in recent years. An alternative, which has achieved good results, is the use of methods relying on data mining techniques, such as cluster-based retrieval. This paper proposes a heuristic search that is based on an organisational model that reflects similarity relationships among data elements. The search is guided by using quality estimators of model nodes, which are obtained by the progressive evaluation of the given target function for the elements associated with each node. The results of the experiments confirm the effectiveness of the proposed algorithm. High-quality solutions are obtained evaluating a relatively small percentage of elements in the data-sets.

  9. Distributed Computation of the knn Graph for Large High-Dimensional Point Sets.

    PubMed

    Plaku, Erion; Kavraki, Lydia E

    2007-03-01

    High-dimensional problems arising from robot motion planning, biology, data mining, and geographic information systems often require the computation of k nearest neighbor (knn) graphs. The knn graph of a data set is obtained by connecting each point to its k closest points. As the research in the above-mentioned fields progressively addresses problems of unprecedented complexity, the demand for computing knn graphs based on arbitrary distance metrics and large high-dimensional data sets increases, exceeding resources available to a single machine. In this work we efficiently distribute the computation of knn graphs for clusters of processors with message passing. Extensions to our distributed framework include the computation of graphs based on other proximity queries, such as approximate knn or range queries. Our experiments show nearly linear speedup with over one hundred processors and indicate that similar speedup can be obtained with several hundred processors. PMID:19847318
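
    The object being distributed above is easy to state on a single machine: the knn graph connects each point to its k closest points under an arbitrary metric. A brute-force sketch (illustrative data and metric, no message passing):

```python
# Brute-force knn graph: for each point, sort all other points by distance
# and keep the k nearest. The distributed version partitions this O(n^2)
# work across processors.

def knn_graph(points, k, dist):
    graph = {}
    for i, p in enumerate(points):
        neighbours = sorted(
            (j for j in range(len(points)) if j != i),
            key=lambda j: dist(p, points[j]))
        graph[i] = neighbours[:k]
    return graph

euclid = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
pts = [(0, 0), (1, 0), (0, 1), (10, 10)]
print(knn_graph(pts, 2, euclid))
```

    Because `dist` is a parameter, the same skeleton accommodates the arbitrary distance metrics the abstract mentions; only the partitioning of the outer loop changes in the distributed setting.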

  10. Distributed Computation of the knn Graph for Large High-Dimensional Point Sets

    PubMed Central

    Plaku, Erion; Kavraki, Lydia E.

    2009-01-01

    High-dimensional problems arising from robot motion planning, biology, data mining, and geographic information systems often require the computation of k nearest neighbor (knn) graphs. The knn graph of a data set is obtained by connecting each point to its k closest points. As the research in the above-mentioned fields progressively addresses problems of unprecedented complexity, the demand for computing knn graphs based on arbitrary distance metrics and large high-dimensional data sets increases, exceeding resources available to a single machine. In this work we efficiently distribute the computation of knn graphs for clusters of processors with message passing. Extensions to our distributed framework include the computation of graphs based on other proximity queries, such as approximate knn or range queries. Our experiments show nearly linear speedup with over one hundred processors and indicate that similar speedup can be obtained with several hundred processors. PMID:19847318

  11. Envision: An interactive system for the management and visualization of large geophysical data sets

    NASA Technical Reports Server (NTRS)

    Searight, K. R.; Wojtowicz, D. P.; Walsh, J. E.; Pathi, S.; Bowman, K. P.; Wilhelmson, R. B.

    1995-01-01

    Envision is a software project at the University of Illinois and Texas A&M, funded by NASA's Applied Information Systems Research Project. It provides researchers in the geophysical sciences convenient ways to manage, browse, and visualize large observed or model data sets. Envision integrates data management, analysis, and visualization of geophysical data in an interactive environment. It employs commonly used standards in data formats, operating systems, networking, and graphics. It also attempts, wherever possible, to integrate with existing scientific visualization and analysis software. Envision has an easy-to-use graphical interface, distributed process components, and an extensible design. It is a public domain package, freely available to the scientific community.

  12. Multifractal analysis of the irregular set for almost-additive sequences via large deviations

    NASA Astrophysics Data System (ADS)

    Bomfim, Thiago; Varandas, Paulo

    2015-10-01

    In this paper we introduce a notion of free energy and large deviations rate function for asymptotically additive sequences of potentials via an approximation method by families of continuous potentials. We provide estimates for the topological pressure of the set of points whose non-additive sequences are far from the limit described through Kingman’s sub-additive ergodic theorem and give some applications in the context of Lyapunov exponents for diffeomorphisms and cocycles, and the Shannon-McMillan-Breiman theorem for Gibbs measures.

  13. Unusually large shear wave anisotropy for chlorite in subduction zone settings

    NASA Astrophysics Data System (ADS)

    Mookherjee, Mainak; Mainprice, David

    2014-03-01

    Using first-principles simulations, we calculated the elasticity of chlorite. At a density ρ ~2.60 g cm-3, the elastic constant tensor reveals significant elastic anisotropy: VP ~27%, VS1 ~56%, and VS2 ~43%. The shear anisotropy is exceptionally large for chlorite and is enhanced upon compression. Upon compression, the shear elastic constant components C44 and C55 decrease, whereas the C66 shear component stiffens. The softening in C44 and C55 is reflected in the shear modulus, G, and the shear wave velocity, VS. Our results on elastic anisotropy at conditions relevant to the mantle wedge indicate that a 10-20 km layer of hydrated peridotite with serpentine and chlorite could account for the observed shear polarization anisotropy and associated large delay times of 1-2 s observed in some subduction zone settings. In addition, chlorite could also explain the low VP/VS ratios that have been observed in recent high-resolution seismological studies.
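
    Anisotropy percentages of the kind quoted above are conventionally defined as 200·(v_max − v_min)/(v_max + v_min). The helper below shows the arithmetic with invented velocities (not values from the paper):

```python
# Standard seismic anisotropy percentage for a velocity pair.

def anisotropy_percent(v_max, v_min):
    return 200.0 * (v_max - v_min) / (v_max + v_min)

# e.g. hypothetical fast and slow shear velocities of 5.0 and 2.8 km/s:
print(round(anisotropy_percent(5.0, 2.8), 1))
```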

  14. A practical, bioinformatic workflow system for large data sets generated by next-generation sequencing

    PubMed Central

    Cantacessi, Cinzia; Jex, Aaron R.; Hall, Ross S.; Young, Neil D.; Campbell, Bronwyn E.; Joachim, Anja; Nolan, Matthew J.; Abubucker, Sahar; Sternberg, Paul W.; Ranganathan, Shoba; Mitreva, Makedonka; Gasser, Robin B.

    2010-01-01

    Transcriptomics (at the level of single cells, tissues and/or whole organisms) underpins many fields of biomedical science, from understanding the basic cellular function in model organisms, to the elucidation of the biological events that govern the development and progression of human diseases, and the exploration of the mechanisms of survival, drug-resistance and virulence of pathogens. Next-generation sequencing (NGS) technologies are contributing to a massive expansion of transcriptomics in all fields and are reducing the cost, time and performance barriers presented by conventional approaches. However, bioinformatic tools for the analysis of the sequence data sets produced by these technologies can be daunting to researchers with limited or no expertise in bioinformatics. Here, we constructed a semi-automated, bioinformatic workflow system, and critically evaluated it for the analysis and annotation of large-scale sequence data sets generated by NGS. We demonstrated its utility for the exploration of differences in the transcriptomes among various stages and both sexes of an economically important parasitic worm (Oesophagostomum dentatum) as well as the prediction and prioritization of essential molecules (including GTPases, protein kinases and phosphatases) as novel drug target candidates. This workflow system provides a practical tool for the assembly, annotation and analysis of NGS data sets, including for researchers with limited bioinformatic expertise. The custom-written Perl, Python and Unix shell computer scripts used can be readily modified or adapted to suit many different applications. This system is now utilized routinely for the analysis of data sets from pathogens of major socio-economic importance and can, in principle, be applied to transcriptomics data sets from any organism. PMID:20682560

  15. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    SciTech Connect

    Patchett, John M; Ahrens, James P; Lo, Li - Ta; Brownlee, Carson S; Mitchell, Christopher J; Hansen, Chuck

    2010-10-15

    Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software based ray tracing, software based rasterization and hardware accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU and CPU based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software based ray-tracing offers a viable approach for scalable rendering of the projected future massive data sizes.

  16. Litho-kinematic facies model for large landslide deposits in arid settings

    SciTech Connect

    Yarnold, J.C.; Lombard, J.P.

    1989-04-01

    Reconnaissance field studies of six large landslide deposits in the S. Basin and Range suggest that a set of characteristic features is common to the deposits of large landslides in an arid setting. These include a coarse boulder cap, an upper massive zone, a lower disrupted zone, and a mixed zone overlying disturbed substrate. The upper massive zone is dominated by crackle breccia. This grades downward into a lower disrupted zone composed of a more matrix-rich breccia that is internally sheared, intruded by clastic dikes, and often contains a cataclasite layer at its base. An underlying discontinuous mixed zone is composed of material from the overlying breccia mixed with material entrained from the underlying substrate. Bedding in the substrate sometimes displays folding and contortion that die out downward. The authors' work suggests a spatial zonation of these characteristic features within many landslide deposits. In general, clastic dikes, the basal cataclasite, and folding in the substrate are observed mainly in distal parts of landslides. In most cases, total thickness, thickness of the basal disturbed and mixed zones, and the degree of internal shearing increase distally, whereas maximum clast size commonly decreases distally. Zonation of these features is interpreted to result from kinematics of emplacement that cause generally increased deformation in the distal regions of the landslide.

  17. A Technique for Moving Large Data Sets over High-Performance Long Distance Networks

    SciTech Connect

    Settlemyer, Bradley W; Dobson, Jonathan D; Hodson, Stephen W; Kuehn, Jeffery A; Poole, Stephen W; Ruwart, Thomas

    2011-01-01

    In this paper we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing the data to a remote destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes.

  18. Using an ensemble of statistical metrics to quantify large sets of plant transcription factor binding sites

    PubMed Central

    2013-01-01

    Background From initial seed germination through reproduction, plants continuously reprogram their transcriptional repertoire to facilitate growth and development. This dynamic is mediated by a diverse but inextricably-linked catalog of regulatory proteins called transcription factors (TFs). Statistically quantifying TF binding site (TFBS) abundance in promoters of differentially expressed genes can be used to identify binding site patterns in promoters that are closely related to stress-response. Output from today’s transcriptomic assays necessitates statistically-oriented software to handle large promoter-sequence sets in a computationally tractable fashion. Results We present Marina, an open-source software for identifying over-represented TFBSs from amongst large sets of promoter sequences, using an ensemble of 7 statistical metrics and binding-site profiles. Through software comparison, we show that Marina can identify considerably more over-represented plant TFBSs compared to a popular software alternative. Conclusions Marina was used to identify over-represented TFBSs in a two time-point RNA-Seq study exploring the transcriptomic interplay between soybean (Glycine max) and soybean rust (Phakopsora pachyrhizi). Marina identified numerous abundant TFBSs recognized by transcription factors that are associated with defense-response such as WRKY, HY5 and MYB2. Comparing results from Marina to that of a popular software alternative suggests that regardless of the number of promoter-sequences, Marina is able to identify significantly more over-represented TFBSs. PMID:23578135
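    Over-representation of a binding site among a promoter set, the quantity Marina's ensemble of metrics scores, is classically assessed with a hypergeometric tail test. A minimal stdlib-only sketch (the function name and the counts are illustrative, not Marina's API):

    ```python
    from math import comb

    def enrichment_pvalue(k, n, K, N):
        """Hypergeometric tail P(X >= k): probability of drawing at least k
        motif-bearing promoters in a sample of n, when K of the N promoters
        in the background set carry the motif."""
        upper = min(n, K)
        return sum(comb(K, i) * comb(N - K, n - i)
                   for i in range(k, upper + 1)) / comb(N, n)

    # e.g. 8 of 40 differentially expressed promoters carry a WRKY-like site,
    # versus 50 of 1000 promoters in the background
    p = enrichment_pvalue(8, 40, 50, 1000)
    ```

    Combining several such metrics, as Marina does, guards against the known sensitivity of any single test to promoter-set size.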

  19. A flexible method for estimating the fraction of fitness influencing mutations from large sequencing data sets.

    PubMed

    Moon, Sunjin; Akey, Joshua M

    2016-06-01

    A continuing challenge in the analysis of massively large sequencing data sets is quantifying and interpreting non-neutrally evolving mutations. Here, we describe a flexible and robust approach based on the site frequency spectrum to estimate the fraction of deleterious and adaptive variants from large-scale sequencing data sets. We applied our method to approximately 1 million single nucleotide variants (SNVs) identified in high-coverage exome sequences of 6515 individuals. We estimate that the fraction of deleterious nonsynonymous SNVs is higher than previously reported; quantify the effects of genomic context, codon bias, chromatin accessibility, and number of protein-protein interactions on deleterious protein-coding SNVs; and identify pathways and networks that have likely been influenced by positive selection. Furthermore, we show that the fraction of deleterious nonsynonymous SNVs is significantly higher for Mendelian versus complex disease loci and in exons harboring dominant versus recessive Mendelian mutations. In summary, as genome-scale sequencing data accumulate in progressively larger sample sizes, our method will enable increasingly high-resolution inferences into the characteristics and determinants of non-neutral variation. PMID:27197222
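    The site frequency spectrum on which the estimator operates is straightforward to tabulate from per-site derived-allele counts; a toy illustration (not the authors' inference method itself):

    ```python
    from collections import Counter

    def site_frequency_spectrum(derived_counts, n_chrom):
        """Unfolded SFS: entry i-1 is the number of segregating sites whose
        derived allele appears exactly i times among n_chrom chromosomes."""
        tally = Counter(c for c in derived_counts if 0 < c < n_chrom)
        return [tally.get(i, 0) for i in range(1, n_chrom)]

    # 7 segregating sites observed across 6 chromosomes
    sfs = site_frequency_spectrum([1, 1, 1, 2, 2, 3, 5], 6)  # -> [3, 2, 1, 0, 1]
    ```

    Deleterious variation skews this spectrum toward rare alleles, which is the signal the authors' method exploits.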

  20. Contextual settings, science stories, and large context problems: Toward a more humanistic science education

    NASA Astrophysics Data System (ADS)

    Stinner, Arthur

    This article addresses the need for and the problem of organizing a science curriculum around contextual settings and science stories that serve to involve and motivate students to develop an understanding of the world that is rooted in the scientific and the humanistic traditions. A program of activities placed around contextual settings, science stories, and contemporary issues of interest is recommended in an attempt to move toward a slow and secure abolition of the gulf between scientific knowledge and common sense beliefs. A conceptual development model is described to guide the connection between theory and evidence on a level appropriate for children, from early years to senior years. For the senior years it is also important to connect the activity of teaching to a sound theoretical structure. The theoretical structure must illuminate the status of theory in science, establish what counts as evidence, clarify the relationship between experiment and explanation, and make connections to the history of science. The article concludes with a proposed program of activities in terms of a sequence of theoretical and empirical experiences that involve contextual settings, science stories, large context problems, thematic teaching, and popular science literature teaching.

  1. Developing consistent Landsat data sets for large area applications: the MRLC 2001 protocol

    USGS Publications Warehouse

    Chander, G.; Huang, C.; Yang, L.; Homer, C.; Larson, C.

    2009-01-01

    One of the major efforts in large area land cover mapping over the last two decades was the completion of two U.S. National Land Cover Data sets (NLCD), developed with nominal 1992 and 2001 Landsat imagery under the auspices of the MultiResolution Land Characteristics (MRLC) Consortium. Following the successful generation of NLCD 1992, a second generation MRLC initiative was launched with two primary goals: (1) to develop a consistent Landsat imagery data set for the U.S. and (2) to develop a second generation National Land Cover Database (NLCD 2001). One of the key enhancements was the formulation of an image preprocessing protocol and implementation of a consistent image processing method. The core data set of the NLCD 2001 database consists of Landsat 7 Enhanced Thematic Mapper Plus (ETM+) images. This letter details the procedures for processing the original ETM+ images and more recent scenes added to the database. NLCD 2001 products include Anderson Level II land cover classes, percent tree canopy, and percent urban imperviousness at 30-m resolution derived from Landsat imagery. The products are freely available for download to the general public from the MRLC Consortium Web site at http://www.mrlc.gov.

  2. Hierarchical Unbiased Graph Shrinkage (HUGS): A Novel Groupwise Registration for Large Data Set

    PubMed Central

    Ying, Shihui; Wu, Guorong; Wang, Qian; Shen, Dinggang

    2014-01-01

    Normalizing all images in a large data set into a common space is a key step in many clinical and research studies, e.g., for brain development, maturation, and aging. Recently, groupwise registration has been developed for simultaneous alignment of all images without selecting a particular image as template, thus potentially avoiding bias in the registration. However, most conventional groupwise registration methods do not explore the data distribution during the image registration. Thus, their performance could be affected by large inter-subject variations in the data set under registration. To solve this potential issue, we propose to use a graph to model the distribution of all image data sitting on the image manifold, with each node representing an image and each edge representing the geodesic pathway between two nodes (or images). Then, the procedure of warping all images to their population center turns to the dynamic shrinking of the graph nodes along their graph edges until all graph nodes become close to each other. Thus, the topology of image distribution on the image manifold is always preserved during the groupwise registration. More importantly, by modeling the distribution of all images via a graph, we can potentially reduce registration error since every time each image is warped only according to its nearby images with similar structures in the graph. We have evaluated our proposed groupwise registration method on both infant and adult data sets, by also comparing with the conventional group-mean based registration and the ABSORB methods. All experimental results show that our proposed method can achieve better performance in terms of registration accuracy and robustness. PMID:24055505

  3. Classroom-based Interventions and Teachers’ Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings

    PubMed Central

    Zhai, Fuhua; Raver, C. Cybele; Li-Grining, Christine

    2011-01-01

    Preschool teachers’ job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers’ perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job demands, and confidence in behavior management. Using a clustered randomized controlled trial (RCT) design, the CSRP provided multifaceted services to the treatment group, including teacher training and mental health consultation, which were accompanied by stress-reduction services and workshops. Overall, 90 teachers in 35 classrooms at 18 Head Start sites participated in the study. After adjusting for teacher and classroom factors and site fixed effects, we found that the CSRP had significant effects on the improvement of teachers’ perceived job control and work-related resources. We also found that the CSRP decreased teachers’ confidence in behavior management and had no statistically significant effects on job demands. Overall, we did not find significant moderation effects of teacher race/ethnicity, education, teaching experience, or teacher type. The implications for research and policy are discussed. PMID:21927538

  4. Child and setting characteristics affecting the adult talk directed at preschoolers with autism spectrum disorder in the inclusive classroom.

    PubMed

    Irvin, Dwight W; Boyd, Brian A; Odom, Samuel L

    2015-02-01

    Difficulty with social competence is a core deficit of autism spectrum disorder. Research on typically developing children and children with disabilities, in general, suggests the adult talk received in the classroom is related to their social development. The aims of this study were to examine (1) the types and amounts of adult talk children with autism spectrum disorder are exposed to in the preschool classroom and (2) the associations between child characteristics (e.g. language), activity area, and adult talk. Kontos' Teacher Talk classification was used to code videos approximately 30 min in length of 73 children with autism spectrum disorder (ages 3-5) in inclusive classrooms (n = 33) during center time. The results indicated practical/personal assistance was the most common type of adult talk coded, and behavior management talk least often coded. Child characteristics (i.e. age and autism severity) and activity area were found to be related to specific types of adult talk. Given the findings, implications for future research are discussed. PMID:24463432

  5. Contamination in the MACHO data set and the puzzle of Large Magellanic Cloud microlensing

    NASA Astrophysics Data System (ADS)

    Griest, Kim; Thomas, Christian L.

    2005-05-01

    In a recent series of three papers, Belokurov, Evans & Le Du and Evans & Belokurov reanalysed the MACHO collaboration data and gave alternative sets of microlensing events and an alternative optical depth to microlensing towards the Large Magellanic Cloud (LMC). Although these authors examined less than 0.2 per cent of the data, they reported that by using a neural net program they had reliably selected a better (and smaller) set of microlensing candidates. Estimating the optical depth from this smaller set, they claimed that the MACHO collaboration overestimated the optical depth by a significant factor and that the MACHO microlensing experiment is consistent with lensing by known stars in the Milky Way and LMC. As we show below, the analysis by these authors contains several errors, and as a result their conclusions are incorrect. Their efficiency analysis is in error, and since they did not search through the entire MACHO data set, they do not know how many microlensing events their neural net would find in the data nor what optical depth their method would give. Examination of their selected events suggests that their method misses low signal-to-noise ratio events and thus would have lower efficiency than the MACHO selection criteria. In addition, their method is likely to give many more false positives (non-lensing events identified as lensing). Both effects would increase their estimated optical depth. Finally, we note that the EROS discovery that LMC event 23 is a variable star reduces the MACHO collaboration estimates of optical depth and the Macho halo fraction by around 8 per cent, and does open the question of additional contamination.
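    The quantity at issue, the optical depth, is computed from the surviving events weighted by their detection efficiencies, which is why an erroneous efficiency analysis propagates directly into the estimate. A sketch of the standard estimator, with invented numbers for illustration:

    ```python
    from math import pi

    def optical_depth(durations_days, efficiencies, n_stars, t_obs_days):
        """Efficiency-corrected microlensing optical depth:
        tau = pi / (4 * N_star * T_obs) * sum_i t_E,i / eps(t_E,i)."""
        weighted = sum(t_e / eff for t_e, eff in zip(durations_days, efficiencies))
        return pi * weighted / (4.0 * n_stars * t_obs_days)

    # invented inputs: 13 events of ~40 days each at 30% detection efficiency,
    # 11.9 million stars monitored for 5.7 years
    tau = optical_depth([40.0] * 13, [0.3] * 13, 11.9e6, 5.7 * 365.25)
    ```

    Dropping genuine low signal-to-noise events (shrinking the sum) while overstating efficiency both bias tau, in opposite directions, which is the crux of the disagreement described above.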

  6. The Large Synoptic Survey Telescope and Foundations for Data Exploitation of Petabyte Data Sets

    SciTech Connect

    Cook, K H; Nikolaev, S; Huber, M E

    2007-02-26

    The next generation of imaging surveys in astronomy, such as the Large Synoptic Survey Telescope (LSST), will require multigigapixel cameras that can process enormous amounts of data read out every few seconds. This huge increase in data throughput (compared to megapixel cameras and minute- to hour-long integrations of today's instruments) calls for a new paradigm for extracting the knowledge content. We have developed foundations for this new approach. In this project, we have studied the necessary processes for systematically extracting information from large time-domain databases. In the process, we have produced significant scientific breakthroughs by developing new methods to probe both the elusive temporal and spatial variations in astrophysics data sets from the SuperMACHO (Massive Compact Halo Objects) survey, the Lowell Observatory Near-Earth Object Search (LONEOS), and the Taiwanese American Occultation Survey (TAOS). This project continues to contribute to the development of the scientific foundations for future wide-field, time-domain surveys. Our algorithm and pipeline development has provided the building blocks for the development of the LSST science software system. Our database design and performance measures have helped to size and constrain LSST database design. LLNL made significant contributions to the foundations of the LSST, which has applications for large-scale imaging and data-mining activities at LLNL. These developments are being actively applied to the previously mentioned surveys producing important scientific results that have been released to the scientific community and more continue to be published and referenced, enhancing LLNL's scientific stature.

  7. Twelve- to 14-Month-Old Infants Can Predict Single-Event Probability with Large Set Sizes

    ERIC Educational Resources Information Center

    Denison, Stephanie; Xu, Fei

    2010-01-01

    Previous research has revealed that infants can reason correctly about single-event probabilities with small but not large set sizes (Bonatti, 2008; Teglas "et al.", 2007). The current study asks whether infants can make predictions regarding single-event probability with large set sizes using a novel procedure. Infants completed two trials: A…

  8. Radiometric Normalization of Large Airborne Image Data Sets Acquired by Different Sensor Types

    NASA Astrophysics Data System (ADS)

    Gehrke, S.; Beshah, B. T.

    2016-06-01

    successfully applied to large sets of heterogeneous imagery, including the adjustment of original sensor images prior to quality control and further processing as well as radiometric adjustment for ortho-image mosaic generation.

  9. Generating extreme weather event sets from very large ensembles of regional climate models

    NASA Astrophysics Data System (ADS)

    Massey, Neil; Guillod, Benoit; Otto, Friederike; Allen, Myles; Jones, Richard; Hall, Jim

    2015-04-01

    Extreme events can have large impacts on societies and are therefore being increasingly studied. In particular, climate change is expected to impact the frequency and intensity of these events. However, a major limitation when investigating extreme weather events is that, by definition, only a few events are present in observations. A way to overcome this issue is to use large ensembles of model simulations. Using the volunteer distributed computing (VDC) infrastructure of weather@home [1], we run a very large number (10'000s) of RCM simulations over the European domain at a resolution of 25km, with an improved land-surface scheme, nested within a free-running GCM. Using VDC allows many thousands of climate model runs to be computed. Using observations for the GCM boundary forcings we can run historical "hindcast" simulations over the past 100 to 150 years. This allows us, due to the chaotic variability of the atmosphere, to ascertain how likely an extreme event was, given the boundary forcings, and to derive synthetic event sets. The events in these sets did not actually occur in the observed record but could have occurred given the boundary forcings, with an associated probability. The event sets contain time-series of fields of meteorological variables that allow impact modellers to assess the loss the event would incur. Projections of events into the future are achieved by modelling projections of the sea-surface temperature (SST) and sea-ice boundary forcings, by combining the variability of the SST in the observed record with a range of warming signals derived from the varying responses of SSTs in the CMIP5 ensemble to elevated greenhouse gas (GHG) emissions in three RCP scenarios. 
Simulating the future with a
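    Deriving event probabilities from such a synthetic event set reduces to empirical exceedance counting over the ensemble; a hedged sketch (the counts and threshold are invented):

    ```python
    def exceedance_probability(values, threshold):
        """Fraction of simulated seasons whose index exceeds `threshold`."""
        return sum(1 for v in values if v > threshold) / len(values)

    def return_period_years(values, threshold):
        """Return period implied by the ensemble (one value per season-year)."""
        p = exceedance_probability(values, threshold)
        return float("inf") if p == 0 else 1.0 / p

    # invented: 10,000 simulated seasons, 25 exceed the observed record -> ~400-year event
    ensemble = [0.0] * 9975 + [1.0] * 25
    rp = return_period_years(ensemble, 0.5)
    ```

    With only observations, an event of this rarity would never be sampled; the 10,000-member ensemble is what makes the frequency estimable at all.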

  10. Perl One-Liners: Bridging the Gap Between Large Data Sets and Analysis Tools.

    PubMed

    Hokamp, Karsten

    2015-01-01

    Computational analyses of biological data are becoming increasingly powerful, and researchers intending on carrying out their own analyses can often choose from a wide array of tools and resources. However, their application might be obstructed by the wide variety of different data formats that are in use, from standard, commonly used formats to output files from high-throughput analysis platforms. The latter are often too large to be opened, viewed, or edited by standard programs, potentially leading to a bottleneck in the analysis. Perl one-liners provide a simple solution to quickly reformat, filter, and merge data sets in preparation for downstream analyses. This chapter presents example code that can be easily adjusted to meet individual requirements. An online version is available at http://bioinf.gen.tcd.ie/pol. PMID:26498621
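    The reformat-and-filter steps the chapter performs with Perl one-liners can be mimicked in any scripting language; here is a hedged Python equivalent that keeps tab-separated rows whose numeric third column exceeds a threshold (the column index and threshold are illustrative):

    ```python
    import io

    def filter_tsv(stream, column=2, threshold=30.0):
        """Yield tab-separated lines whose numeric `column` (0-based) exceeds
        `threshold`, skipping '#' comment lines -- the job of a one-liner
        such as: perl -F'\t' -ane 'print if $F[2] > 30'."""
        for line in stream:
            if line.startswith("#"):
                continue
            fields = line.rstrip("\n").split("\t")
            try:
                if float(fields[column]) > threshold:
                    yield line
            except (IndexError, ValueError):
                continue  # malformed row: skip rather than crash mid-stream

    rows = "#id\tname\tscore\nr1\ta\t45.0\nr2\tb\t12.5\nr3\tc\t31\n"
    kept = list(filter_tsv(io.StringIO(rows)))  # keeps rows r1 and r3
    ```

    Because the input is streamed line by line, files too large to open in an editor pose no problem, which is the bottleneck the chapter addresses.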

  11. Calculations of safe collimator settings and β* at the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Bruce, R.; Assmann, R. W.; Redaelli, S.

    2015-06-01

    The first run of the Large Hadron Collider (LHC) at CERN was very successful and resulted in important physics discoveries. One way of increasing the luminosity in a collider, which gave a very significant contribution to the LHC performance in the first run and can be used even if the beam intensity cannot be increased, is to decrease the transverse beam size at the interaction points by reducing the optical function β*. However, when doing so, the beam becomes larger in the final focusing system, which could expose its aperture to beam losses. For the LHC, which is designed to store beams with a total energy of 362 MJ, this is critical, since the loss of even a small fraction of the beam could cause a magnet quench or even damage. Therefore, the machine aperture has to be protected by the collimation system. The settings of the collimators constrain the maximum beam size that can be tolerated and therefore impose a lower limit on β*. In this paper, we present calculations to determine safe collimator settings and the resulting limit on β*, based on available aperture and operational stability of the machine. Our model was used to determine the LHC configurations in 2011 and 2012 and it was found that β* could be decreased significantly compared to the conservative model used in 2010. The gain in luminosity resulting from the decreased margins between collimators was more than a factor 2, and a further contribution from the use of realistic aperture estimates based on measurements was almost as large. This has played an essential role in the rapid and successful accumulation of experimental data in the LHC.
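    The aperture constraint described above follows from how the transverse beam size scales: sigma = sqrt(beta * eps_N / gamma_rel) (taking the relativistic beta ~ 1), so shrinking β* at the interaction point inflates beta, and hence sigma, in the final-focus magnets. A back-of-envelope check with illustrative 2012-era numbers, not the paper's actual aperture model:

    ```python
    from math import sqrt

    def beam_sigma(beta_m, eps_norm_m, gamma_rel):
        """RMS transverse beam size: sigma = sqrt(beta * eps_N / gamma_rel)."""
        return sqrt(beta_m * eps_norm_m / gamma_rel)

    # illustrative values: 4 TeV protons, normalized emittance eps_N = 3.75 um
    gamma_rel = 4000e9 / 938.272e6                     # E / (m_p c^2)
    sigma_ip = beam_sigma(0.6, 3.75e-6, gamma_rel)     # beta* = 0.6 m at the IP
    sigma_ff = beam_sigma(4000.0, 3.75e-6, gamma_rel)  # beta ~ 4 km in the final focus
    # the beam is sqrt(4000 / 0.6) ~ 82 times larger in the final-focus magnets
    ```

    The collimator settings must therefore shadow the final-focus aperture at this inflated beam size, which is what sets the lower limit on β*.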

  12. Ghost transmission: How large basis sets can make electron transport calculations worse

    SciTech Connect

    Herrmann, Carmen; Solomon, Gemma C.; Subotnik, Joseph E.; Mujica, Vladimiro; Ratner, Mark A.

    2010-01-01

    The Landauer approach has proven to be an invaluable tool for calculating the electron transport properties of single molecules, especially when combined with a nonequilibrium Green’s function approach and Kohn–Sham density functional theory. However, when using large nonorthogonal atom-centered basis sets, such as those common in quantum chemistry, one can find erroneous results if the Landauer approach is applied blindly. In fact, basis sets of triple-zeta quality or higher sometimes result in an artificially high transmission and possibly even qualitatively wrong conclusions regarding chemical trends. In these cases, transport persists when molecular atoms are replaced by basis functions alone (“ghost atoms”). The occurrence of such ghost transmission is correlated with low-energy virtual molecular orbitals of the central subsystem and may be interpreted as a biased and thus inaccurate description of vacuum transmission. An approximate practical correction scheme is to calculate the ghost transmission and subtract it from the full transmission. As a further consequence of this study, it is recommended that sensitive molecules be used for parameter studies, in particular those whose transmission functions show antiresonance features such as benzene-based systems connected to the electrodes in meta positions and other low-conducting systems such as alkanes and silanes.

  13. Motif-based analysis of large nucleotide data sets using MEME-ChIP

    PubMed Central

    Ma, Wenxiu; Noble, William S; Bailey, Timothy L

    2014-01-01

    MEME-ChIP is a web-based tool for analyzing motifs in large DNA or RNA data sets. It can analyze peak regions identified by ChIP-seq, cross-linking sites identified by cLIP-seq and related assays, as well as sets of genomic regions selected using other criteria. MEME-ChIP performs de novo motif discovery, motif enrichment analysis, motif location analysis and motif clustering, providing a comprehensive picture of the DNA or RNA motifs that are enriched in the input sequences. MEME-ChIP performs two complementary types of de novo motif discovery: weight matrix–based discovery for high accuracy; and word-based discovery for high sensitivity. Motif enrichment analysis using DNA or RNA motifs from human, mouse, worm, fly and other model organisms provides even greater sensitivity. MEME-ChIP’s interactive HTML output groups and aligns significant motifs to ease interpretation. this protocol takes less than 3 h, and it provides motif discovery approaches that are distinct and complementary to other online methods. PMID:24853928

  14. Science Teachers' Decision-Making in Abstinence-Only-Until-Marriage (AOUM) Classrooms: Taboo Subjects and Discourses of Sex and Sexuality in Classroom Settings

    ERIC Educational Resources Information Center

    Gill, Puneet Singh

    2015-01-01

    Sex education, especially in the southeastern USA, remains steeped in an Abstinence-Only-Until-Marriage (AOUM) approach, which sets up barriers to the education of sexually active students. Research confirms that science education has the potential to facilitate discussion of controversial topics, including sex education. Science teachers in the…

  15. A multivariate approach to filling gaps in large ecological data sets using probabilistic matrix factorization techniques

    NASA Astrophysics Data System (ADS)

    Schrodt, F. I.; Shan, H.; Kattge, J.; Reich, P.; Banerjee, A.; Reichstein, M.

    2012-12-01

    With the advent of remotely sensed data and coordinated efforts to create global databases, the ecological community has progressively become more data-intensive. However, in contrast to other disciplines, statistical ways of handling these large data sets, especially the gaps which are inherent to them, are lacking. Widely used theoretical approaches, for example model averaging based on Akaike's information criterion (AIC), are sensitive to missing values. Yet, the most common way of handling sparse matrices - the deletion of cases with missing data (complete case analysis) - is known to severely reduce statistical power and to induce biased parameter estimates. In order to address these issues, we present novel approaches to gap filling in large ecological data sets using matrix factorization techniques. Factorization-based matrix completion was developed in a recommender system context and has since been widely used to impute missing data in fields outside the ecological community. Here, we evaluate the effectiveness of probabilistic matrix factorization techniques for imputing missing data in ecological matrices using two imputation techniques. Hierarchical Probabilistic Matrix Factorization (HPMF) effectively incorporates hierarchical phylogenetic information (phylogenetic group, family, genus, species and individual plant) into the trait imputation. Kernelized Probabilistic Matrix Factorization (KPMF), on the other hand, includes environmental information (climate and soils) into the matrix factorization through kernel matrices over rows and columns. We test the accuracy and effectiveness of HPMF and KPMF in filling sparse matrices, using the TRY database of plant functional traits (http://www.try-db.org). TRY is one of the largest global compilations of plant trait databases (750 traits of 1 million plants), encompassing data on morphological, anatomical, biochemical, phenological and physiological features of plants. However, despite unprecedented
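
    The general idea behind factorization-based imputation can be illustrated with a much simpler sketch than HPMF or KPMF: a plain low-rank factorization fitted by gradient descent on the observed entries only. This is a generic toy version, not the hierarchical or kernelized models from the abstract; the function name and all parameter values are illustrative assumptions.

```python
import numpy as np

def factorize_impute(X, rank=2, lr=0.02, reg=0.01, epochs=5000, seed=0):
    """Fill missing entries (NaN) of X by fitting a low-rank model U @ V.T
    to the observed entries only (plain matrix factorization, not HPMF/KPMF).
    All hyperparameter defaults are arbitrary illustrative choices."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    mask = ~np.isnan(X)
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    for _ in range(epochs):
        # Residuals on observed cells only; missing cells contribute nothing.
        E = np.where(mask, X - U @ V.T, 0.0)
        U += lr * (E @ V - reg * U)
        V += lr * (E.T @ U - reg * V)
    X_hat = U @ V.T
    # Keep observed values exactly; use the low-rank model to fill the gaps.
    return np.where(mask, X, X_hat)
```

    On a trait matrix this would be called with species as rows and traits as columns; the hierarchical and kernelized variants replace the plain priors on U and V with phylogenetic and environmental structure.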

  16. Any Questions? An Application of Weick's Model of Organizing to Increase Student Involvement in the Large-Lecture Classroom

    ERIC Educational Resources Information Center

    Ledford, Christy J. W.; Saperstein, Adam K.; Cafferty, Lauren A.; McClintick, Stacey H.; Bernstein, Ethan M.

    2015-01-01

    Microblogs, with their interactive nature, can engage students in community building and sensemaking. Using Weick's model of organizing as a framework, we integrated the use of micromessaging to increase student engagement in the large-lecture classroom. Students asked significantly more questions and asked a greater diversity of questions…

  17. Listserv Lemmings and Fly-brarians on the Wall: A Librarian-Instructor Team Taming the Cyberbeast in the Large Classroom.

    ERIC Educational Resources Information Center

    Dickstein, Ruth; McBride, Kari Boyd

    1998-01-01

    Computer technology can empower students if they have the tools to find their way through print and online sources. This article describes how a reference librarian and a faculty instructor collaborated to teach research strategies and critical thinking skills (including analysis and evaluation of resources) in a large university classroom using a…

  18. An Evaluation of the Developmental Designs Approach and Professional Development Model on Classroom Management in 22 Middle Schools in a Large, Midwestern School District

    ERIC Educational Resources Information Center

    Hough, David L.

    2011-01-01

    This study presents findings from an evaluation of the Developmental Designs classroom management approach and professional development model during its first year of implementation across 22 middle schools in a large, Midwestern school district. The impact of this professional development model on teaching and learning as related to participants'…

  19. Galaxy Evolution Insights from Spectral Modeling of Large Data Sets from the Sloan Digital Sky Survey

    SciTech Connect

    Hoversten, Erik A.

    2007-10-01

    This thesis centers on the use of spectral modeling techniques on data from the Sloan Digital Sky Survey (SDSS) to gain new insights into current questions in galaxy evolution. The SDSS provides a large, uniform, high quality data set which can be exploited in a number of ways. One avenue pursued here is to use the large sample size to measure precisely the mean properties of galaxies of increasingly narrow parameter ranges. The other route taken is to look for rare objects which open up for exploration new areas in galaxy parameter space. The crux of this thesis is revisiting the classical Kennicutt method for inferring the stellar initial mass function (IMF) from the integrated light properties of galaxies. A large data set (~10^5 galaxies) from the SDSS DR4 is combined with more in-depth modeling and quantitative statistical analysis to search for systematic IMF variations as a function of galaxy luminosity. Galaxy Hα equivalent widths are compared to a broadband color index to constrain the IMF. It is found that for the sample as a whole the best-fitting IMF power law slope above 0.5 M☉ is Γ = 1.5 ± 0.1, with the error dominated by systematics. Galaxies brighter than around M_r,0.1 = -20 (including galaxies like the Milky Way, which has M_r,0.1 ~ -21) are well fit by a universal Γ ~ 1.4 IMF, similar to the classical Salpeter slope, and smooth, exponential star formation histories (SFH). Fainter galaxies prefer steeper IMFs, and the quality of the fits reveals that for these galaxies a universal IMF with smooth SFHs is actually a poor assumption. Related projects are also pursued. A targeted photometric search is conducted for strongly lensed Lyman break galaxies (LBG) similar to MS1512-cB58. The evolution of the photometric selection technique is described, as are the results of spectroscopic follow-up of the best targets. The serendipitous discovery of two interesting blue compact dwarf galaxies is reported. These

  20. PORTAAL: A Classroom Observation Tool Assessing Evidence-Based Teaching Practices for Active Learning in Large Science, Technology, Engineering, and Mathematics Classes

    PubMed Central

    Eddy, Sarah L.; Converse, Mercedes; Wenderoth, Mary Pat

    2015-01-01

    There is extensive evidence that active learning works better than a completely passive lecture. Despite this evidence, adoption of these evidence-based teaching practices remains low. In this paper, we offer one tool to help faculty members implement active learning. This tool identifies 21 readily implemented elements that have been shown to increase student outcomes related to achievement, logic development, or other relevant learning goals with college-age students. Thus, this tool both clarifies the research-supported elements of best practices for instructor implementation of active learning in the classroom setting and measures instructors’ alignment with these practices. We describe how we reviewed the discipline-based education research literature to identify best practices in active learning for adult learners in the classroom and used these results to develop an observation tool (Practical Observation Rubric To Assess Active Learning, or PORTAAL) that documents the extent to which instructors incorporate these practices into their classrooms. We then use PORTAAL to explore the classroom practices of 25 introductory biology instructors who employ some form of active learning. Overall, PORTAAL documents how well aligned classrooms are with research-supported best practices for active learning and provides specific feedback and guidance to instructors to allow them to identify what they do well and what could be improved. PMID:26033871

  1. PORTAAL: A Classroom Observation Tool Assessing Evidence-Based Teaching Practices for Active Learning in Large Science, Technology, Engineering, and Mathematics Classes.

    PubMed

    Eddy, Sarah L; Converse, Mercedes; Wenderoth, Mary Pat

    2015-01-01

    There is extensive evidence that active learning works better than a completely passive lecture. Despite this evidence, adoption of these evidence-based teaching practices remains low. In this paper, we offer one tool to help faculty members implement active learning. This tool identifies 21 readily implemented elements that have been shown to increase student outcomes related to achievement, logic development, or other relevant learning goals with college-age students. Thus, this tool both clarifies the research-supported elements of best practices for instructor implementation of active learning in the classroom setting and measures instructors' alignment with these practices. We describe how we reviewed the discipline-based education research literature to identify best practices in active learning for adult learners in the classroom and used these results to develop an observation tool (Practical Observation Rubric To Assess Active Learning, or PORTAAL) that documents the extent to which instructors incorporate these practices into their classrooms. We then use PORTAAL to explore the classroom practices of 25 introductory biology instructors who employ some form of active learning. Overall, PORTAAL documents how well aligned classrooms are with research-supported best practices for active learning and provides specific feedback and guidance to instructors to allow them to identify what they do well and what could be improved. PMID:26033871

  2. Efficient Implementation of an Optimal Interpolator for Large Spatial Data Sets

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; Mount, David M.

    2007-01-01

    Scattered data interpolation is a problem of interest in numerous areas such as electronic imaging, smooth surface modeling, and computational geometry. Our motivation arises from applications in geology and mining, which often involve large scattered data sets and a demand for high accuracy. The method of choice is ordinary kriging. This is because it is the best linear unbiased estimator. Unfortunately, this interpolant is computationally very expensive to compute exactly. For n scattered data points, computing the value of a single interpolant involves solving a dense linear system of size roughly n x n. This is infeasible for large n. In practice, kriging is solved approximately by local approaches that are based on considering only a relatively small number of points that lie close to the query point. There are many problems with this local approach, however. The first is that determining the proper neighborhood size is tricky, and is usually solved by ad hoc methods such as selecting a fixed number of nearest neighbors or all the points lying within a fixed radius. Such fixed neighborhood sizes may not work well for all query points, depending on local density of the point distribution. Local methods also suffer from the problem that the resulting interpolant is not continuous. Meyer showed that while kriging produces smooth, continuous surfaces, it has zero-order continuity along its borders. Thus, at interface boundaries where the neighborhood changes, the interpolant behaves discontinuously. Therefore, it is important to consider and solve the global system for each interpolant. However, solving such large dense systems for each query point is impractical. Recently a more principled approach to approximating kriging has been proposed based on a technique called covariance tapering. The problems arise from the fact that the covariance functions that are used in kriging have global support. Our implementations combine, utilize, and enhance a number of different
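
    The dense system at the heart of exact ordinary kriging can be sketched directly. This toy version makes the per-query dense solve, and hence the O(n^3) cost that motivates the approximations discussed above, concrete; the Gaussian covariance and length-scale parameter are arbitrary illustrative choices, not from the paper.

```python
import numpy as np

def ordinary_kriging(points, values, query, length_scale=1.0):
    """Exact ordinary kriging for a single query point: solve the dense
    (n+1)x(n+1) system, with a Lagrange multiplier row enforcing that the
    weights sum to one (unbiasedness). Gaussian covariance is an arbitrary
    choice for illustration."""
    def cov(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return np.exp(-(d / length_scale) ** 2)

    n = len(points)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(points, points)
    A[n, n] = 0.0                      # Lagrange-multiplier corner
    b = np.ones(n + 1)
    b[:n] = cov(points, query[None, :])[:, 0]
    w = np.linalg.solve(A, b)          # dense solve: O(n^3) per query
    return w[:n] @ values
```

    With no nugget term, the interpolator is exact: querying at a data location returns that datum, which is a handy sanity check.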

  3. Taking Energy to the Physics Classroom from the Large Hadron Collider at CERN

    ERIC Educational Resources Information Center

    Cid, Xabier; Cid, Ramon

    2009-01-01

    In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…

  4. Redefining the Ojibwe Classroom: Indigenous Language Programs within Large Research Universities

    ERIC Educational Resources Information Center

    Morgan, Mindy J.

    2005-01-01

    Indigenous languages are powerful symbols of self-determination and sovereignty for tribal communities in the United States, and many community-based programs have been developed to support and maintain them. The successes of these programs, however, have been difficult to replicate at large research institutions. This article examines the issues…

  5. Visualization of large medical data sets using memory-optimized CPU and GPU algorithms

    NASA Astrophysics Data System (ADS)

    Kiefer, Gundolf; Lehmann, Helko; Weese, Juergen

    2005-04-01

    With the evolution of medical scanners towards higher spatial resolutions, the sizes of image data sets are increasing rapidly. To profit from the higher resolution in medical applications such as 3D-angiography for a more efficient and precise diagnosis, high-performance visualization is essential. However, to make sure that the performance of a volume rendering algorithm scales with the performance of future computer architectures, technology trends need to be considered. The design of such scalable volume rendering algorithms remains challenging. One of the major trends in the development of computer architectures is the wider use of cache memory hierarchies to bridge the growing gap between the faster evolving processing power and the slower evolving memory access speed. In this paper we propose ways to exploit the standard PC's cache memories supporting the main processors (CPUs) and the graphics hardware (graphics processing unit, GPU), respectively, for computing Maximum Intensity Projections (MIPs). To this end, we describe a generic and flexible way to improve the cache efficiency of software ray casting algorithms and show, by means of cache simulations, that it enables cache miss rates close to the theoretical optimum. For GPU-based rendering we propose a similar, brick-based technique to optimize the utilization of onboard caches and the transfer of data to the GPU on-board memory. All algorithms produce images of identical quality, which enables us to compare the performance of their implementations in a fair way without trading quality for speed. Our comparison indicates that the proposed methods perform better, in particular for large data sets.
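
    The projection itself, as opposed to the cache-aware traversal that is the paper's contribution, is simple to state: for each ray through the volume, keep the brightest sample. A minimal axis-aligned version in NumPy (the cited work handles arbitrary ray directions and brick-based caching):

```python
import numpy as np

def mip(volume, axis=0):
    """Axis-aligned Maximum Intensity Projection: for each ray parallel
    to `axis`, keep the maximum voxel value along that ray. This shows
    only what a MIP computes, not the cache-optimized traversal."""
    return volume.max(axis=axis)
```

    For a 3-D array, the result is a 2-D image; the cache-efficiency work above is about the memory access order in which those per-ray maxima are gathered.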

  6. Registering coherent change detection products associated with large image sets and long capture intervals

    DOEpatents

    Perkins, David Nikolaus; Gonzales, Antonio I

    2014-04-08

    A set of co-registered coherent change detection (CCD) products is produced from a set of temporally separated synthetic aperture radar (SAR) images of a target scene. A plurality of transformations are determined, which transformations are respectively for transforming a plurality of the SAR images to a predetermined image coordinate system. The transformations are used to create, from a set of CCD products produced from the set of SAR images, a corresponding set of co-registered CCD products.

  7. Evaluating hydrological ensemble predictions using a large and varied set of catchments (Invited)

    NASA Astrophysics Data System (ADS)

    Ramos, M.; Andreassian, V.; Perrin, C.; Loumagne, C.

    2010-12-01

    It is widely accepted that local and national operational early warning systems can play a key role in mitigating flood damage and losses to society while improving risk awareness and flood preparedness. Over the last years, special attention has been paid to efficiently couple meteorological and hydrological warning systems to track uncertainty and achieve longer lead times in hydrological forecasting. Several national and international scientific programs have focused on the pre-operational test and development of ensemble hydrological forecasting. Based on the lumped soil-moisture-accounting type rainfall-runoff model GRP, developed at Cemagref, we have set up a research tool for ensemble forecasting and conducted several studies to evaluate the quality of streamflow forecasts. The model has been driven by available archives of weather ensemble prediction systems from different sources (Météo-France, ECMWF, TIGGE archive). Our approach has sought to combine overall validation under varied geographical and climate conditions (to assess model robustness and generality) and site-specific validation (to locally accept or reject the hydrologic forecast system and contribute to defining its limits of applicability). The general aim is to contribute to methodological developments concerning a wide range of key aspects in hydrological forecasting, including: the links between predictability skill and catchment characteristics, the magnitude and the distribution of forecasting errors, the analysis of nested or neighbouring catchments for prediction in ungauged basins, as well as the reliability of model predictions when forecasting under conditions not previously encountered during the period of setup and calibration of the system. This presentation will cover the aforementioned topics and present examples from studies carried out to evaluate and inter-compare ensemble forecasting systems using a large and varied set of catchments in France. The specific need to

  8. Considerations for Observational Research Using Large Data Sets in Radiation Oncology

    SciTech Connect

    Jagsi, Reshma; Bekelman, Justin E.; Chen, Aileen; Chen, Ronald C.; Hoffman, Karen; Tina Shih, Ya-Chen; Smith, Benjamin D.; Yu, James B.

    2014-09-01

    The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. This review concludes that carefully designed and executed studies using observational data that possess these qualities hold

  9. Considerations for observational research using large data sets in radiation oncology.

    PubMed

    Jagsi, Reshma; Bekelman, Justin E; Chen, Aileen; Chen, Ronald C; Hoffman, Karen; Shih, Ya-Chen Tina; Smith, Benjamin D; Yu, James B

    2014-09-01

    The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. This review concludes that carefully designed and executed studies using observational data that possess these qualities hold

  10. Long DNA sequences and large data sets: investigating the Quaternary via ancient DNA

    NASA Astrophysics Data System (ADS)

    Hofreiter, Michael

    2008-12-01

    Progress in technical development has allowed piecing together increasingly long DNA sequences from subfossil remains of both extinct and extant species. At the same time, more and more species are analyzed on the population level, leading to a better understanding of population dynamics over time. Finally, new sequencing techniques have allowed targeting complete nuclear genomes of extinct species. The sequences obtained yield insights into a variety of research fields. First, phylogenetic relationships can be resolved with much greater accuracy and it becomes possible to date divergence events of species during and before the Quaternary. Second, large data sets in population genetics facilitate the assessment of changes in genetic diversity over time, an approach that has substantially revised our views about phylogeographic patterns and population dynamics. In the future, the combination of population genetics with long DNA sequences, e.g. complete mitochondrial (mt) DNA genomes, should lead to much more precise estimates of population size changes to be made. This will enable us to make inferences about - and hopefully understand - the causes for faunal turnover and extinctions during the Quaternary. Third, with regard to the nuclear genome, complete genes and genomes can now be sequenced and studied with regard to their function, revealing insights about the numerous traits of extinct species that are not preserved in the fossil record.

  11. Public-private partnerships with large corporations: setting the ground rules for better health.

    PubMed

    Galea, Gauden; McKee, Martin

    2014-04-01

    Public-private partnerships with large corporations offer potential benefits to the health sector but many concerns have been raised, highlighting the need for appropriate safeguards. In this paper we propose five tests that public policy makers may wish to apply when considering engaging in such a public-private partnership. First, are the core products and services provided by the corporation health enhancing or health damaging? In some cases, such as tobacco, the answer is obvious but others, such as food and alcohol, are contested. In such cases, the burden of proof is on the potential partners to show that their activities are health enhancing. Second, do potential partners put their policies into practice in the settings where they can do so, their own workplaces? Third, are the corporate social responsibility activities of potential partners independently audited? Fourth, do potential partners make contributions to the commons rather than to narrow programmes of their choosing? Fifth, is the role of the partner confined to policy implementation rather than policy development, which is ultimately the responsibility of government alone? PMID:24581699

  12. Information Theoretic Approaches to Rapid Discovery of Relationships in Large Climate Data Sets

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.; Rossow, William B.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Mutual information as the asymptotic Bayesian measure of independence is an excellent starting point for investigating the existence of possible relationships among climate-relevant variables in large data sets. As mutual information is a nonlinear function of its arguments, it is not beholden to the assumption of a linear relationship between the variables in question and can reveal features missed in linear correlation analyses. However, as mutual information is symmetric in its arguments, it only has the ability to reveal the probability that two variables are related. It provides no information as to how they are related; specifically, causal interactions or a relation based on a common cause cannot be detected. For this reason we also investigate the utility of a related quantity called the transfer entropy. The transfer entropy can be written as a difference between mutual informations and has the capability to reveal whether and how the variables are causally related. The application of these information theoretic measures is tested on some familiar examples using data from the International Satellite Cloud Climatology Project (ISCCP) to identify relations between global cloud cover and other variables, including equatorial Pacific sea surface temperature (SST), over seasonal and El Nino Southern Oscillation (ENSO) cycles.
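
    A crude plug-in estimate of mutual information from a joint histogram can be sketched as follows. The bin count and the simple plug-in estimator are illustrative assumptions, not the paper's method; transfer entropy would then be assembled as a difference of such mutual-information terms over lagged variables.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in histogram estimate of I(X;Y) = sum p(x,y) log[p(x,y)/(p(x)p(y))],
    in nats. Being a KL divergence, the estimate is nonnegative; it is biased
    upward for finite samples, so it is for illustration only."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                           # joint distribution estimate
    px = pxy.sum(axis=1, keepdims=True)        # marginal of X
    py = pxy.sum(axis=0, keepdims=True)        # marginal of Y
    nz = pxy > 0                               # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

    Unlike a linear correlation coefficient, this estimate also responds to nonlinear dependence, which is the property the abstract emphasizes.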

  13. Spatially-aware Processing of Large Raw LiDAR Data Sets

    NASA Astrophysics Data System (ADS)

    Strane, M. D.; Oskin, M.

    2004-12-01

    An ultimate goal of LiDAR (LIght Detection And Ranging) data acquisition is to produce a regularly sampled accurate topographic view of the surface of the Earth. Last-return and inverse-distance weighted sampling of raw LiDAR data do not take into account the non-random distribution of raw data points. While elevation data produced by these methods is of high accuracy, gradients are not well-resolved and aliasing artifacts are produced, especially on low gradient surfaces. Because of the volume of data involved, resampling schemes that take into account the spatial distribution of raw data have been cumbersome to implement. We have developed a resampling method that uses the free open-source PostgreSQL database to store the raw LiDAR data indexed spatially and as its original time series. This database permits rapid access to raw data points via spatial queries. A robust and expedient algorithm has been implemented to produce regularly gridded resampled data with a least squares plane fit regression. This algorithm reduces aliasing artifacts on low gradient surfaces. The algorithm is also a proof-of-concept to show that complex spatially-aware processing of large LiDAR data sets is feasible on a reasonable time scale, and will be the basis for further improvements such as vegetation removal.
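
    The least-squares plane-fit resampling step can be sketched for a single grid cell as follows. The function name and interface are hypothetical; the production workflow additionally fetches the cell's raw points via spatial queries against the database.

```python
import numpy as np

def plane_fit_elevation(points, x0, y0):
    """Fit a least-squares plane z = a*x + b*y + c through raw (x, y, z)
    returns and evaluate it at the cell center (x0, y0). Fitting a local
    plane resolves gradients better than picking a single last return or
    an inverse-distance weighted average."""
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    a, b, c = coeffs
    return a * x0 + b * y0 + c
```

    Repeating this per grid cell over the spatially indexed point store yields the regularly gridded, gradient-preserving surface described above.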

  14. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
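
    The normalized-LCS similarity measure can be sketched with the classic quadratic dynamic program, i.e. the slow baseline that the paper's hybrid algorithm improves on. The normalization by the geometric mean of the lengths is one common choice and may differ from the paper's exact definition.

```python
def lcs_length(a, b):
    """Classic O(len(a) * len(b)) dynamic program for the longest common
    subsequence length (the quadratic baseline, not the fast hybrid)."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, ca in enumerate(a):
        for j, cb in enumerate(b):
            if ca == cb:
                dp[i + 1][j + 1] = dp[i][j] + 1
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j])
    return dp[len(a)][len(b)]

def normalized_lcs(a, b):
    """Similarity in [0, 1]: LCS length divided by the geometric mean of
    the sequence lengths (an assumed normalization for illustration)."""
    if not a or not b:
        return 0.0
    return lcs_length(a, b) / (len(a) * len(b)) ** 0.5
```

    Clustering then proceeds on pairwise normalized-LCS similarities, with outlying sequences flagged for the detailed anomaly analysis described above.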

  15. Analog and digital interface solutions for the common large-area display set (CLADS)

    NASA Astrophysics Data System (ADS)

    Hermann, David J.; Gorenflo, Ronald L.

    1997-07-01

    Battelle is under contract with Warner Robins Air Logistics Center to design a common large area display set (CLADS) for use in multiple airborne command, control, communications, computers and intelligence applications that currently use unique 19 inch cathode ray tubes (CRTs). The CLADS is a modular design, with common modules used wherever possible. Each CLADS includes an application-specific integration kit, which incorporates all of the unique interface components. Since there is no existing digital video interface standard for high resolution workstations, a standard interface was developed for CLADS and documented as an interface specification. One of the application-specific modules, the application video interface module (AVIM), readily incorporates most of the required application electrical interfaces for a given system into a single module. The analog AVIM, however, poses unique design problems when folding multiple application interface requirements into a single common AVIM for the most prevalent workstation display interface: analog RGB video. Future workstation display interfaces will incorporate fully digital video between the graphics hardware and the digital display device. A digital AVIM is described which utilizes a fiber channel interface to deliver high speed 1280 by 1024, 24-bit, 60 Hz digital video from a PCI graphics card to the CLADS. A video recording and playback device is described, as well as other common CLADS modules, including the display controller and power supply. This paper will discuss both the analog and digital AVIM interfaces, application BIT and power interfaces, as well as CLADS internal interfaces.

  16. Classroom Management and the Librarian

    ERIC Educational Resources Information Center

    Blackburn, Heidi; Hays, Lauren

    2014-01-01

    As librarians take on more instructional responsibilities, the need for classroom management skills becomes vital. Unfortunately, classroom management skills are not taught in library school and therefore, many librarians are forced to learn how to manage a classroom on the job. Different classroom settings such as one-shot instruction sessions…

  17. An efficient out-of-core volume ray casting method for the visualization of large medical data sets

    NASA Astrophysics Data System (ADS)

    Xue, Jian; Tian, Jie; Chen, Jian; Dai, Yakang

    2007-03-01

    The volume ray casting algorithm is widely recognized for high quality volume visualization. However, when rendering very large volume data sets, the original ray casting algorithm leads to very inefficient random accesses on disk and makes it very slow to render the whole volume data set. In order to solve this problem, an efficient out-of-core volume ray casting method with a new out-of-core framework for processing large volume data sets based on consumer PC hardware is proposed in this paper. The new framework gives transparent and efficient access to the volume data set cached on disk, while the new volume ray casting method minimizes the data exchange between hard disk and physical memory and performs comparatively fast, high quality volume rendering. The experimental results indicate that the new method and framework are effective and efficient for the visualization of very large medical data sets.

  18. Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer (7th Annual SFAF Meeting, 2012)

    ScienceCinema

    Copeland, Alex [DOE JGI]

    2013-02-11

    Alex Copeland on "Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  19. Mobile-phone-based classroom response systems: Students' perceptions of engagement and learning in a large undergraduate course

    NASA Astrophysics Data System (ADS)

    Dunn, Peter K.; Richardson, Alice; Oprescu, Florin; McDonald, Christine

    2013-12-01

    Using a Classroom Response System (CRS) has been associated with positive educational outcomes, by fostering student engagement and by allowing immediate feedback to both students and instructors. This study examined a low-cost CRS (VotApedia) in a large first-year class, where students responded to questions using their mobile phones. This study explored whether the use of VotApedia retained the advantages of other CRS, overcame some of the challenges of other CRS, and whether new challenges were introduced by using VotApedia. These issues were studied within three themes: students' perceptions of using VotApedia; the impact of VotApedia on their engagement; and the impact of VotApedia on their learning. Data were collected from an online survey, focus groups and student feedback on teaching and course content. The results indicated that using VotApedia retains the pedagogical advantages of other CRS, while overcoming some of the challenges presented by using other CRS, without introducing any new challenges.

  20. Linked Scatter Plots, A Powerful Exploration Tool For Very Large Sets of Spectra

    NASA Astrophysics Data System (ADS)

    Carbon, Duane Francis; Henze, Christopher

    2015-08-01

    We present a new tool, based on linked scatter plots, that is designed to efficiently explore very large spectrum data sets such as the SDSS, APOGEE, LAMOST, GAIA, and RAVE data sets. The tool works in two stages: the first uses batch processing and the second runs interactively. In the batch stage, spectra are processed through our data pipeline which computes the depths relative to the local continuum at preselected feature wavelengths. These depths, and any additional available variables such as local S/N level, magnitudes, colors, positions, and radial velocities, are the basic measured quantities used in the interactive stage. The interactive stage employs the NASA hyperwall, a configuration of 128 workstation displays (8x16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. Each hyperwall panel is used to display a fully linked 2-D scatter plot showing the depth of feature A vs the depth of feature B for all of the spectra. A and B change from panel to panel. The relationships between the various (A,B) strengths and any distinctive clustering, as well as unique outlier groupings, are visually apparent when examining and inter-comparing the different panels on the hyperwall. In addition, the data links between the scatter plots allow the user to apply a logical algebra to the measurements. By graphically selecting the objects in any interesting region of any 2-D plot on the hyperwall, the tool immediately and clearly shows how the selected objects are distributed in all the other 2-D plots. The selection process may be repeated multiple times and, at each step, the selections can represent a sequence of logical constraints on the measurements, revealing those objects which satisfy all the constraints thus far. The spectra of the selected objects may be examined at any time on a connected workstation display. Using over 945,000,000 depth measurements from 569,738 SDSS DR10 stellar spectra, we illustrate how to quickly
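
The "logical algebra" of selections described above can be mimicked with ordinary boolean predicates: each brushing step intersects a new constraint with the indices that survived earlier steps. A minimal sketch with invented feature depths (not the hyperwall software itself):

```python
def select(rows, predicate, keep=None):
    """One brushing step: intersect a region predicate with the set of
    indices surviving earlier selections (a logical AND of constraints)."""
    indices = range(len(rows)) if keep is None else keep
    return [i for i in indices if predicate(rows[i])]

# Each row holds the depths of features (A, B, C) for one spectrum.
spectra = [(0.9, 0.1, 0.5), (0.8, 0.7, 0.2), (0.2, 0.9, 0.8), (0.7, 0.6, 0.9)]

deep_A = select(spectra, lambda r: r[0] > 0.5)                 # brush on panel A
deep_A_and_B = select(spectra, lambda r: r[1] > 0.5, deep_A)   # refine on panel B
```

At each step only the spectra satisfying every constraint so far remain selected, which is what the linked panels display simultaneously.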

  1. The Sheffield experiment: the effects of centralising accident and emergency services in a large urban setting

    PubMed Central

    Simpson, A; Wardrope, J; Burke, D

    2001-01-01

    Objectives: To assess the effects of centralisation of accident and emergency (A&E) services in a large urban setting. The end points were the quality of patient care judged by time to see a doctor or nurse practitioner, time to admission and the cost of the A&E service as a whole. Methods: Sheffield is a large industrial city with a population of 471 000. In 1994 Sheffield health authority took a decision to centralise a number of services including the A&E services. This study presents data collected over a three year period before, during and after the centralisation of adult A&E services from two sites to one site and the centralisation of children's A&E services to a separate site. A minor injury unit was also established along with an emergency admissions unit. The study used information from the A&E departments' computer system and routinely available financial data. Results: There has been a small decrease in the number of new patient attendances using the Sheffield A&E system. Most patients go to the correct department. The numbers of acute admissions through the adult A&E have doubled. Measures of process efficiency show some improvement in times to admission. There has been measurable deterioration in the time to be seen for minor injuries in the A&E departments. This is partly offset by the very good waiting time to be seen in the minor injuries unit. The costs of providing the service within Sheffield have increased. Conclusion: Centralisation of A&E services in Sheffield has led to concentration of the most ill patients in a single adult department and separate paediatric A&E department. Despite a greatly increased number of admissions at the adult site this change has not resulted in increased waiting times for admission because of the transfer of adequate beds to support the changes. There has however been a deterioration in the time to see a clinician, especially in the A&E departments. The waiting times at the minor injury unit are very short

  2. BACHSCORE. A tool for evaluating efficiently and reliably the quality of large sets of protein structures

    NASA Astrophysics Data System (ADS)

    Sarti, E.; Zamuner, S.; Cossio, P.; Laio, A.; Seno, F.; Trovato, A.

    2013-12-01

    In protein structure prediction it is of crucial importance, especially at the refinement stage, to score efficiently large sets of models by selecting the ones that are closest to the native state. We here present a new computational tool, BACHSCORE, that allows its users to rank different structural models of the same protein according to their quality, evaluated by using the BACH++ (Bayesian Analysis Conformation Hunt) scoring function. The original BACH statistical potential was already shown to discriminate with very good reliability the protein native state in large sets of misfolded models of the same protein. BACH++ features a novel upgrade in the solvation potential of the scoring function, now computed by adapting the LCPO (Linear Combination of Pairwise Orbitals) algorithm. This change further enhances the already good performance of the scoring function. BACHSCORE can be accessed directly through the web server: bachserver.pd.infn.it.
    Catalogue identifier: AEQD_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQD_v1_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License version 3
    No. of lines in distributed program, including test data, etc.: 130159
    No. of bytes in distributed program, including test data, etc.: 24 687 455
    Distribution format: tar.gz
    Programming language: C++
    Computer: Any computer capable of running an executable produced by a g++ compiler (4.6.3 version)
    Operating system: Linux, Unix OS-es
    RAM: 1 073 741 824 bytes
    Classification: 3
    Nature of problem: Evaluate the quality of a protein structural model, taking into account the possible “a priori” knowledge of a reference primary sequence that may be different from the amino-acid sequence of the model; the native protein structure should be recognized as the best model.
    Solution method: The contact potential scores the occurrence of any given type of residue pair in 5 possible

  3. Gaining A Geological Perspective Through Active Learning in the Large Lecture Classroom

    NASA Astrophysics Data System (ADS)

    Kapp, J. L.; Richardson, R. M.; Slater, S. J.

    2008-12-01

    NATS 101, A Geological Perspective, is a general education course taken by non-science majors. We offer 600 seats per semester, with four large lecture sections taught by different faculty members. In the past we have offered optional once-a-week study groups taught by graduate teaching assistants. Students often feel overwhelmed by the science and associated jargon, and many are prone to skipping lectures altogether. Optional study groups are attended by only ~50% of the students. Faculty members find the class to be a lot of work, mainly due to the grading it generates. Activities given in lecture are often short multiple-choice or true/false assignments, limiting the depth of understanding we can evaluate. Our students often lack math and critical thinking skills, and we spend a lot of time in lecture reintroducing ideas students should have already gotten from the text. In summer 2007 we were funded to redesign the course. Our goals were to 1) cut the cost of running the course, and 2) improve student learning. Under our redesign, optional study groups were replaced by mandatory once-a-week break-out sessions where students complete activities that have been introduced in lecture. Break-out sessions substitute for one hour of lecture, and are run by undergraduate preceptors and graduate teaching assistants (GTAs). During the lecture period, lectures themselves are brief, with a large portion of the class devoted to active learning in small groups. Weekly reading quizzes are submitted via the online course management system. Break-out sessions allow students to spend more time interacting with their fellow students, undergraduate preceptors, and GTAs. They get one-on-one help in break-out sessions on assignments designed to enhance the lecture material. The active lecture format means less of their time is devoted to listening passively to a lecture, and more time is spent peer learning and interacting with the instructor. Completing quizzes online allows students

  4. Repulsive parallel MCMC algorithm for discovering diverse motifs from large sequence sets

    PubMed Central

    Ikebata, Hisaki; Yoshida, Ryo

    2015-01-01

    Motivation: The motif discovery problem consists of finding recurring patterns of short strings in a set of nucleotide sequences. This classical problem is receiving renewed attention as most early motif discovery methods lack the ability to handle the large data of recent genome-wide ChIP studies. New ChIP-tailored methods focus on reducing computation time and pay little regard to the accuracy of motif detection. Unlike such methods, our method focuses on increasing the detection accuracy while maintaining the computation efficiency at an acceptable level. The major advantage of our method is that it can mine diverse multiple motifs undetectable by current methods. Results: The repulsive parallel Markov chain Monte Carlo (RPMCMC) algorithm that we propose is a parallel version of the widely used Gibbs motif sampler. RPMCMC is run on parallel interacting motif samplers. A repulsive force is generated when different motifs produced by different samplers come near each other. Thus, different samplers explore different motifs. In this way, we can detect much more diverse motifs than conventional methods can. Through application to 228 transcription factor ChIP-seq datasets of the ENCODE project, we show that the RPMCMC algorithm can find many reliable cofactor interacting motifs that existing methods are unable to discover. Availability and implementation: A C++ implementation of RPMCMC and discovered cofactor motifs for the 228 ENCODE ChIP-seq datasets are available from http://daweb.ism.ac.jp/yoshidalab/motif. Contact: ikebata.hisaki@ism.ac.jp, yoshidar@ism.ac.jp Supplementary information: Supplementary data are available from Bioinformatics online. PMID:25583120
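
The repulsive force between samplers can be illustrated with a toy weighting: a proposed motif is down-weighted in proportion to its similarity to the motifs currently held by the other samplers, so the ensemble spreads over diverse motifs. This is a simplification of the RPMCMC acceptance step, and the per-position identity similarity below is an invented stand-in for the paper's actual measure:

```python
import math

def repulsion_weight(motif, others, strength=1.0):
    """Down-weight a proposed motif by its total similarity to the motifs
    the other parallel samplers currently hold."""
    def similarity(a, b):
        # Fraction of positions with identical bases (toy measure).
        return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))
    penalty = sum(similarity(motif, m) for m in others)
    return math.exp(-strength * penalty)
```

A sampler proposing a motif already occupied by a neighbour receives a weight below 1, making it more likely to move on to an unexplored motif.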

  5. Toward accurate thermochemical models for transition metals : G3large basis sets for atoms Sc-Zn.

    SciTech Connect

    Mayhall, N. J.; Raghavachari, K.; Redfern, P. C.; Curtiss, L. A.; Rassolov, V.; Indiana Univ.; Univ. of South Carolina

    2008-04-01

    An augmented valence triple-zeta basis set, referred to as G3Large, is reported for the first-row transition metal elements Sc through Zn. The basis set is constructed in a manner similar to the G3Large basis set developed previously for other elements (H-Ar, K, Ca, Ga-Kr) and used as a key component in Gaussian-3 theory. It is based on a contraction of a set of 15s13p5d Gaussian primitives to 8s7p3d, and also includes sets of f and g polarization functions, diffuse spd functions, and core df polarization functions. The basis set is evaluated with triples-augmented coupled cluster [CCSD(T)] and Brueckner orbital [BD(T)] methods for a small test set involving energies of atoms, atomic ions, and diatomic hydrides. It performs well for the low-lying s{yields}d excitation energies of atoms, atomic ionization energies, and the dissociation energies of the diatomic hydrides. The Brueckner orbital-based BD(T) method performs substantially better than Hartree-Fock-based CCSD(T) for molecules such as NiH, where the starting unrestricted Hartree-Fock wavefunction suffers from a high degree of spin contamination. Comparison with available data for geometries of transition metal hydrides also shows good agreement. A smaller basis set without core polarization functions, G3MP2Large, is also defined.

  6. Quality in Inclusive Preschool Classrooms

    ERIC Educational Resources Information Center

    Hestenes, Linda L.; Cassidy, Deborah J.; Shim, Jonghee; Hegde, Archana V.

    2008-01-01

    Research Findings: Quality of care for preschool children in inclusive and noninclusive classrooms was examined in two studies. In Study 1, comparisons across a large sample of classrooms (N = 1, 313) showed that inclusive classrooms were higher than noninclusive classrooms in global quality as well as on two dimensions of quality…

  7. Problems in the Cataloging of Large Microform Sets or, Learning to Expect the Unexpected.

    ERIC Educational Resources Information Center

    Joachim, Martin D.

    1989-01-01

    Describes problems encountered during the cataloging of three major microform sets at the Indiana University Libraries. Areas discussed include size and contents of the sets, staffing for the project, equipment, authority work, rare book cataloging rules, serials, language of materials, musical scores, and manuscripts. (CLB)

  8. Improving Library Effectiveness: A Proposal for Applying Fuzzy Set Concepts in the Management of Large Collections.

    ERIC Educational Resources Information Center

    Robinson, Earl J.; Turner, Stephen J.

    1981-01-01

    Fuzzy set theory, a mathematical modeling technique that allows for the consideration of such factors as "professional expertise" in decision making, is discussed as a tool for use in libraries--specifically in collection management. The fundamentals of fuzzy set theory are introduced and a reference list is included. (JL)
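
The core idea is easy to sketch: each title holds a membership grade in [0, 1] for fuzzy sets such as "heavily used" or "recommended by experts", and the standard fuzzy intersection (the minimum of the grades) combines them into a retention score. The titles and grades below are invented for illustration:

```python
def fuzzy_and(*grades):
    """Standard fuzzy intersection: the minimum of the membership grades."""
    return min(grades)

# Hypothetical membership grades in the fuzzy sets "heavily used" and
# "recommended by subject experts" for three titles in a collection.
usage = {"title_a": 0.9, "title_b": 0.4, "title_c": 0.7}
expert = {"title_a": 0.6, "title_b": 0.8, "title_c": 0.7}

# Grade of membership in "worth retaining" = usage AND expert support.
keep_grade = {t: fuzzy_and(usage[t], expert[t]) for t in usage}
ranked = sorted(keep_grade, key=keep_grade.get, reverse=True)
```

Unlike a crisp cutoff, the fuzzy grades let "professional expertise" enter the decision as a matter of degree, which is the point the article makes.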

  9. Tools for Analysis and Visualization of Large Time-Varying CFD Data Sets

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; VanGelder, Allen

    1997-01-01

    In the second year, we continued to build upon and improve the scanline-based direct volume renderer that we developed in the first year of this grant. This extremely general rendering approach can handle regular or irregular grids, including overlapping multiple grids, and polygon mesh surfaces. It runs in parallel on multi-processors. It can also be used in conjunction with a k-d tree hierarchy, where approximate models and error terms are stored in the nodes of the tree, and approximate fast renderings can be created. We have extended our software to handle time-varying data where the data changes but the grid does not. We are now working on extending it to handle more general time-varying data. We have also developed a new extension of our direct volume renderer that uses automatic decimation of the 3D grid, as opposed to an explicit hierarchy. We explored this alternative approach as being more appropriate for very large data sets, where the extra expense of a tree may be unacceptable. We also describe a new approach to direct volume rendering that uses hardware 3D textures and incorporates lighting effects. Volume rendering using hardware 3D textures is extremely fast, and machines capable of using this technique are becoming more moderately priced. While this technique, at present, is limited to use with regular grids, we are pursuing possible algorithms extending the approach to more general grid types. We have also begun to explore a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH '96. In our initial implementation, we automatically image the volume from 32 equidistant positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation. We are studying whether this will give a quantitative measure of the effects of approximation. We have created new tools for exploring the

  10. Engaged: Making Large Classes Feel Small through Blended Learning Instructional Strategies that Promote Increased Student Performance

    ERIC Educational Resources Information Center

    Francis, Raymond W.

    2012-01-01

    It is not enough to be great at sharing information in a large classroom setting. To be an effective teacher you must be able to meaningfully engage your students with their peers and with the content. And you must do this regardless of class size or content. The issues of teaching effectively in large classroom settings have presented ongoing…

  11. Adaptation of Bharatanatyam Dance Pedagogy for Multicultural Classrooms: Questions and Relevance in a North American University Setting

    ERIC Educational Resources Information Center

    Banerjee, Suparna

    2013-01-01

    This article opens up questions around introducing Bharatanatyam, a form of Indian classical dance, to undergraduate learners within a North American university setting. The aim is to observe how the learners understood and received a particular cultural practice and to explore issues related to learning goals, curriculum content, approaches to…

  12. Training Health Service Technicians as Teacher Assistants in an Inpatient Residential Emotional/Behavior Disorder Classroom Setting

    ERIC Educational Resources Information Center

    Banks, Walter E.

    2012-01-01

    Schools have identified that the use of Teacher Assistants often provides needed additional support in the school setting. In a Health Care Facility that provides inpatient psychiatric services, children ages 5-14 are required to engage in school activities. Currently there are no Teacher Assistants trained in the facility. This study focuses on…

  13. On the performance of large Gaussian basis sets for the computation of total atomization energies

    NASA Technical Reports Server (NTRS)

    Martin, J. M. L.

    1992-01-01

    The total atomization energies of a number of molecules have been computed using an augmented coupled-cluster method and (5s4p3d2f1g) and (4s3p2d1f) atomic natural orbital (ANO) basis sets, as well as the correlation-consistent valence triple zeta plus polarization (cc-pVTZ) and correlation-consistent valence quadruple zeta plus polarization (cc-pVQZ) basis sets. The performance of ANO and correlation-consistent basis sets is comparable throughout, although the latter can result in significant CPU time savings. Whereas the inclusion of g functions has significant effects on the computed Sigma D(e) values, chemical accuracy is still not reached for molecules involving multiple bonds. A Gaussian-1 (G1) type correction lowers the error, but not much beyond the accuracy of the G1 model itself. Using separate corrections for sigma bonds, pi bonds, and valence pairs brings down the mean absolute error to less than 1 kcal/mol for the spdf basis sets, and about 0.5 kcal/mol for the spdfg basis sets. Some conclusions on the success of the Gaussian-1 and Gaussian-2 models are drawn.

  14. Scalable Algorithms for Unsupervised Classification and Anomaly Detection in Large Geospatiotemporal Data Sets

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Hoffman, F. M.; Kumar, J.

    2015-12-01

    The increasing availability of high-resolution geospatiotemporal datasets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery and mining of ecological data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe some unsupervised knowledge discovery and anomaly detection approaches based on highly scalable parallel algorithms for k-means clustering and singular value decomposition, consider a few practical applications thereof to the analysis of climatic and remotely-sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
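
As a toy illustration of the clustering-based anomaly detection described above (not the authors' parallel implementation), the sketch below runs a minimal Lloyd's k-means and then scores an observation by its distance to the nearest cluster centre, so points that fit none of the learned regimes stand out:

```python
import math

def kmeans(points, k, iters=20):
    """Minimal Lloyd's algorithm with naive first-k initialization; the
    scalable parallel variants distribute these same assign/update steps."""
    centroids = list(points[:k])
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: math.dist(p, centroids[j]))
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return centroids

def anomaly_score(p, centroids):
    """Distance to the nearest cluster centre; large values flag anomalies."""
    return min(math.dist(p, c) for c in centroids)
```

In practice k-means++ initialization and multiple restarts would be used; the naive first-k initialization here keeps the sketch deterministic.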

  15. Manufacturing physics: using large(r) data sets and physical insight to develop great products

    NASA Astrophysics Data System (ADS)

    Rosenblum, Steven

    2011-03-01

    Early stage research does a fantastic job providing knowledge and proof-of-feasibility for new product concepts. However, the handful of data points required to validate a concept is typically insufficient to provide insight on the whole range of effects relevant to manufacturing the product. Moving to manufacturing brings larger data sets and variability; opportunistic analysis of these larger sets can yield better product design rules. In the early 2000s Corning developed an optical transmission fiber optimized to suppress stimulated Brillouin scattering (SBS). Analyzing the larger data set provided by the manufacturing environment using the same theoretical framework developed by the original researchers refined our understanding of how to improve SBS in optical fibers beyond what was known from our early efforts. This greater understanding allowed us to design better performing products.

  16. A posteriori correction of camera characteristics from large image data sets.

    PubMed

    Afanasyev, Pavel; Ravelli, Raimond B G; Matadeen, Rishi; De Carlo, Sacha; van Duinen, Gijs; Alewijnse, Bart; Peters, Peter J; Abrahams, Jan-Pieter; Portugal, Rodrigo V; Schatz, Michael; van Heel, Marin

    2015-01-01

    Large datasets are emerging in many fields of image processing including: electron microscopy, light microscopy, medical X-ray imaging, astronomy, etc. Novel computer-controlled instrumentation facilitates the collection of very large datasets containing thousands of individual digital images. In single-particle cryogenic electron microscopy ("cryo-EM"), for example, large datasets are required for achieving quasi-atomic resolution structures of biological complexes. Based on the collected data alone, large datasets allow us to precisely determine the statistical properties of the imaging sensor on a pixel-by-pixel basis, independent of any "a priori" normalization routinely applied to the raw image data during collection ("flat field correction"). Our straightforward "a posteriori" correction yields clean linear images as can be verified by Fourier Ring Correlation (FRC), illustrating the statistical independence of the corrected images over all spatial frequencies. The image sensor characteristics can also be measured continuously and used for correcting upcoming images. PMID:26068909
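
The pixel-by-pixel statistics described above can be sketched in a few lines: estimate each pixel's mean (fixed-pattern offset) and spread (a gain proxy) over a large stack of recorded images, then normalize new frames against them. This is a toy stand-in for the paper's procedure, which additionally verifies the statistical independence of corrected images with Fourier Ring Correlation:

```python
from statistics import mean, stdev

def estimate_sensor(stack):
    """Per-pixel mean and spread, estimated purely from a stack of
    recorded images (each image is a flat list of pixel values)."""
    pixels = list(zip(*stack))              # one tuple per pixel position
    return [mean(p) for p in pixels], [stdev(p) for p in pixels]

def correct(image, offsets, spreads):
    """Normalise each pixel to zero mean / unit spread, an 'a posteriori'
    flat-field-style correction derived from the data alone."""
    return [(v - m) / s if s else 0.0 for v, m, s in zip(image, offsets, spreads)]
```

Because the statistics come from the collected data themselves, the correction needs no "a priori" calibration frames, which is the point of the method.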

  17. Addressing Methodological Challenges in Large Communication Data Sets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care.

    PubMed

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2016-07-01

    In this article, we present strategies for collecting and coding a large longitudinal communication data set collected across multiple sites, consisting of more than 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication data sets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multisite secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research. PMID:26580414

  18. Improved student engagement, satisfaction, and learning outcomes in a "flipped" large-lecture setting

    NASA Astrophysics Data System (ADS)

    Ward, A. S.; Bettis, E. A., III; Russell, J. E.; Van Horne, S.; Rocheford, M. K.; Sipola, M.; Colombo, M. R.

    2014-12-01

    Large lecture courses are a traditional teaching practice at most large public institutions of higher education. They have historically provided an efficient way to deliver content to large numbers of students with the least faculty resources. However, research on student learning indicates that the traditional lecture format does not provide the best learning experience: students learn better in active learning environments, where they engage in meaningful learning activities rather than just listening. In this study, we compare two offerings of Introduction to Environmental Science, a large-lecture general education course, offered in two formats by the same instructors in subsequent years. In the first offering (Spring 2013) the course was offered as a traditional large-lecture course, with lectures to large audiences and a limited number of exams for assessment. In the second offering (Spring 2014), the course included small-group discussion periods, peer review of writing assignments, guest lectures, and online learning, with limited traditional lecture. Our primary objective was to quantify differences in student engagement and learning outcomes between the two course offerings. Results of our study show that students in the transformed course indicated higher interest, engagement, and satisfaction than students in the traditional lecture course. Furthermore, students in the transformed course reported increased behavioral, emotional, and cognitive engagement over those in the traditional course, as well as increased satisfaction with the course.

  19. Large-Scale Disturbance Events in Terrestrial Ecosystems Detected using Global Satellite Data Sets

    NASA Astrophysics Data System (ADS)

    Potter, C.; Tan, P.; Kumar, V.; Klooster, S.

    2004-12-01

    Studies are being conducted to evaluate patterns in a 19-year record of global satellite observations of vegetation phenology from the Advanced Very High Resolution Radiometer (AVHRR), as a means to characterize large-scale ecosystem disturbance events and regimes. The fraction of photosynthetically active radiation absorbed by vegetation canopies worldwide (FPAR) has been computed at a monthly time interval from 1982 to 2000 and gridded at a spatial resolution of 8 km globally. Potential disturbance events were identified in the FPAR time series by locating anomalously low values (FPAR-LO) that lasted longer than 12 consecutive months at any 8-km pixel. We find verifiable evidence of numerous disturbance types across North America, including major regional patterns of cold and heat waves, forest fires, tropical storms, and large-scale forest logging. Based on this analysis, an historical picture is emerging of periodic droughts and heat waves, possibly coupled with herbivorous insect outbreaks, as among the most important causes of ecosystem disturbance in North America. In South America, large areas of northeastern Brazil appear to have been impacted in the early 1990s by severe drought. Amazon tropical forest disturbance can be detected at large scales particularly in the mid 1990s. In Asia, large-scale disturbance events appear in the mid 1980s and the late 1990s across boreal and temperate forest zones, as well as in cropland areas of western India. In northern Europe and central Africa, large-scale forest disturbance appears in the mid 1990s.
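
The FPAR-LO screening rule (anomalously low values persisting longer than 12 consecutive months at a pixel) amounts to run-length detection on a time series. A minimal sketch, with the anomaly threshold left as a caller-supplied parameter rather than the study's actual anomaly definition:

```python
def flag_disturbance(series, threshold, min_run=12):
    """Return (start, length) for every run where the series stays below
    `threshold` for more than `min_run` consecutive time steps."""
    runs, start = [], None
    for t, v in enumerate(series + [threshold]):   # sentinel closes a final run
        if v < threshold and start is None:
            start = t                              # a low run begins
        elif v >= threshold and start is not None:
            if t - start > min_run:
                runs.append((start, t - start))    # long enough: flag it
            start = None
    return runs
```

Applied per pixel over the 1982-2000 monthly record, this kind of screen separates persistent disturbances from short seasonal dips.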

  20. A Controlled Trial of Active versus Passive Learning Strategies in a Large Group Setting

    ERIC Educational Resources Information Center

    Haidet, Paul; Morgan, Robert O.; O'Malley, Kimberly; Moran, Betty Jeanne; Richards, Boyd F.

    2004-01-01

    Objective: To compare the effects of active and didactic teaching strategies on learning- and process-oriented outcomes. Design: Controlled trial. Setting: After-hours residents' teaching session. Participants: Family and Community Medicine, Internal Medicine, and Pediatrics residents at two academic medical institutions. Interventions: We…

  1. Options in Education, Transcript for February 16, 1976: National Commitment to Equal Rights & Equal Educational Opportunity, Racial Conflict in the Classroom, Setting Up a Publishing Business, and Women in Education (Mathematics and Sex).

    ERIC Educational Resources Information Center

    George Washington Univ., Washington, DC. Inst. for Educational Leadership.

    "Options in Education" is a radio news program which focuses on issues and developments in education. This transcript contains discussions of the national commitment to desegregated education, racial conflict in the classroom, learning how to set up a publishing business, women in education (mathematics and sex) and education news highlights.…

  2. Using a Classroom Response System for Promoting Interaction to Teaching Mathematics to Large Groups of Undergraduate Students

    ERIC Educational Resources Information Center

    Morais, Adolfo; Barragués, José Ignacio; Guisasola, Jenaro

    2015-01-01

    This work describes the design and evaluation of a proposal to use Classroom Response Systems (CRS), intended to promote participative mathematics classes at university. The proposal is based on Problem-Based Learning (PBL) and uses Robert's six hypotheses for mathematical teaching-learning. The results show that PBL is a relevant strategy to…

  3. The Impact of Mobile Learning on Students' Learning Behaviours and Performance: Report from a Large Blended Classroom

    ERIC Educational Resources Information Center

    Wang, Minjuan; Shen, Ruimin; Novak, Daniel; Pan, Xiaoyan

    2009-01-01

    Chinese classrooms, whether on school grounds or online, have long suffered from a lack of interactivity. Many online classes simply provide recorded instructor lectures, which only reinforces the negative effects of passive nonparticipatory learning. At Shanghai Jiaotong University, researchers and developers actively seek technologic…

  4. Use of Large-Scale Data Sets to Study Educational Pathways of American Indian and Alaska Native Students

    ERIC Educational Resources Information Center

    Faircloth, Susan C.; Alcantar, Cynthia M.; Stage, Frances K.

    2014-01-01

    This chapter discusses issues and challenges encountered in using large-scale data sets to study educational experiences and subsequent outcomes for American Indian and Alaska Native (AI/AN) students. In this chapter, we argue that the linguistic and cultural diversity of Native peoples, coupled with the legal and political ways in which education…

  5. Caught you: threats to confidentiality due to the public release of large-scale genetic data sets

    PubMed Central

    2010-01-01

    Background: Large-scale genetic data sets are frequently shared with other research groups and even released on the Internet to allow for secondary analysis. Study participants are usually not informed about such data sharing because data sets are assumed to be anonymous after personal identifiers are stripped off. Discussion: The assumption of anonymity of genetic data sets, however, is tenuous because genetic data are intrinsically self-identifying. Two types of re-identification are possible: the "Netflix" type and the "profiling" type. The "Netflix" type needs another small genetic data set, usually with fewer than 100 SNPs but including a personal identifier. This second data set might originate from another clinical examination, a study of leftover samples or forensic testing. When merged with the primary, unidentified set, it will re-identify all samples from that individual. Even with no second data set at hand, a "profiling" strategy can be developed to extract as much information as possible from a sample collection. Starting with the identification of ethnic subgroups along with predictions of body characteristics and diseases, the asthma kids case is used as a real-life example to illustrate that approach. Summary: Depending on the degree of supplemental information, there is a good chance that at least a few individuals can be identified from an anonymized data set. Any re-identification, however, may potentially harm study participants because it will release individual genetic disease risks to the public. PMID:21190545
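
The "Netflix-type" linkage described above reduces to matching a small identified SNP fingerprint against the anonymized genotypes. A toy sketch with invented data; real attacks tolerate genotyping error and use likelihood models rather than exact matching:

```python
def reidentify(anonymous_records, fingerprint):
    """Return the IDs of anonymized records whose genotypes agree with a
    small identified SNP fingerprint at every overlapping marker."""
    return [
        rid for rid, genotypes in anonymous_records.items()
        if all(genotypes.get(snp) == call for snp, call in fingerprint.items())
    ]

# Hypothetical anonymized study data, keyed by sample ID.
study = {
    "S1": {"rs1": "AA", "rs2": "AG", "rs3": "GG"},
    "S2": {"rs1": "AG", "rs2": "AA", "rs3": "GG"},
    "S3": {"rs1": "AA", "rs2": "AG", "rs3": "AG"},
}
# Small identified data set, e.g. from forensic testing.
alice = {"rs1": "AA", "rs2": "AG", "rs3": "GG"}
```

With even a few dozen independent SNPs, a unique match is overwhelmingly likely, which is why stripping personal identifiers alone does not anonymize genetic data.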

  6. Psychology in an Interdisciplinary Setting: A Large-Scale Project to Improve University Teaching

    ERIC Educational Resources Information Center

    Koch, Franziska D.; Vogt, Joachim

    2015-01-01

    At a German university of technology, a large-scale project was funded as a part of the "Quality Pact for Teaching", a programme launched by the German Federal Ministry of Education and Research to improve the quality of university teaching and study conditions. The project aims at intensifying interdisciplinary networking in teaching,…

  7. Learning through Discussions: Comparing the Benefits of Small-Group and Large-Class Settings

    ERIC Educational Resources Information Center

    Pollock, Philip H.; Hamann, Kerstin; Wilson, Bruce M.

    2011-01-01

    The literature on teaching and learning heralds the benefits of discussion for student learner outcomes, especially its ability to improve students' critical thinking skills. Yet, few studies compare the effects of different types of face-to-face discussions on learners. Using student surveys, we analyze the benefits of small-group and large-class…

  8. The PRRS Host Genomic Consortium (PHGC) Database: Management of large data sets.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In any consortium project where large amounts of phenotypic and genotypic data are collected across several research labs, issues arise with maintenance and analysis of datasets. The PRRS Host Genomic Consortium (PHGC) Database was developed to meet this need for the PRRS research community. The sch...

  9. A posteriori correction of camera characteristics from large image data sets

    PubMed Central

    Afanasyev, Pavel; Ravelli, Raimond B. G.; Matadeen, Rishi; De Carlo, Sacha; van Duinen, Gijs; Alewijnse, Bart; Peters, Peter J.; Abrahams, Jan-Pieter; Portugal, Rodrigo V.; Schatz, Michael; van Heel, Marin

    2015-01-01

    Large datasets are emerging in many fields of image processing, including electron microscopy, light microscopy, medical X-ray imaging and astronomy. Novel computer-controlled instrumentation facilitates the collection of very large datasets containing thousands of individual digital images. In single-particle cryogenic electron microscopy (“cryo-EM”), for example, large datasets are required for achieving quasi-atomic resolution structures of biological complexes. Based on the collected data alone, large datasets allow us to precisely determine the statistical properties of the imaging sensor on a pixel-by-pixel basis, independent of any “a priori” normalization routinely applied to the raw image data during collection (“flat field correction”). Our straightforward “a posteriori” correction yields clean linear images, as can be verified by Fourier Ring Correlation (FRC), illustrating the statistical independence of the corrected images over all spatial frequencies. The image sensor characteristics can also be measured continuously and used for correcting upcoming images. PMID:26068909

  10. Design of Availability-Dependent Distributed Services in Large-Scale Uncooperative Settings

    ERIC Educational Resources Information Center

    Morales, Ramses Victor

    2009-01-01

    Thesis Statement: "Availability-dependent global predicates can be efficiently and scalably realized for a class of distributed services, in spite of specific selfish and colluding behaviors, using local and decentralized protocols". Several types of large-scale distributed systems spanning the Internet have to deal with availability variations…

  11. An Examination of Classroom Social Environment on Motivation and Engagement of College Early Entrant Honors Students

    ERIC Educational Resources Information Center

    Maddox, Richard S.

    2010-01-01

    This study set out to examine the relationships between the classroom social environment, motivation, engagement and achievement of a group of early entrant Honors students at a large urban university. Prior research on the classroom environment, motivation, engagement and high ability students was examined, leading to the assumption that the…

  12. Generating mock data sets for large-scale Lyman-α forest correlation measurements

    SciTech Connect

    Font-Ribera, Andreu; McDonald, Patrick; Miralda-Escudé, Jordi

    2012-01-01

    Massive spectroscopic surveys of high-redshift quasars yield large numbers of correlated Lyα absorption spectra that can be used to measure large-scale structure. Simulations of these surveys are required to accurately interpret the measurements of correlations and correct for systematic errors. An efficient method to generate mock realizations of Lyα forest surveys is presented which generates a field over the lines of sight to the survey sources only, instead of having to generate it over the entire three-dimensional volume of the survey. The method can be calibrated to reproduce the power spectrum and one-point distribution function of the transmitted flux fraction, as well as the redshift evolution of these quantities, and is easily used for modeling any survey systematic effects. We present an example of how these mock surveys are applied to predict the measurement errors in a survey with similar parameters as the BOSS quasar survey in SDSS-III.

  13. High-throughput film-densitometry: An efficient approach to generate large data sets

    SciTech Connect

    Typke, Dieter; Nordmeyer, Robert A.; Jones, Arthur; Lee, Juyoung; Avila-Sakar, Agustin; Downing, Kenneth H.; Glaeser, Robert M.

    2004-07-14

    A film-handling machine (robot) has been built which can, in conjunction with a commercially available film densitometer, exchange and digitize over 300 electron micrographs per day. Implementation of robotic film handling effectively eliminates the delay and tedium associated with digitizing images when data are initially recorded on photographic film. The modulation transfer function (MTF) of the commercially available densitometer is significantly worse than that of a high-end, scientific microdensitometer. Nevertheless, its signal-to-noise ratio (S/N) is quite excellent, allowing substantial restoration of the output to "near-to-perfect" performance. Due to the large area of the standard electron microscope film that can be digitized by the commercial densitometer (up to 10,000 x 13,680 pixels with an appropriately coded holder), automated film digitization offers a fast and inexpensive alternative to high-end CCD cameras as a means of acquiring large amounts of image data in electron microscopy.

  14. Parallel k-Means Clustering for Quantitative Ecoregion Delineation Using Large Data Sets

    SciTech Connect

    Kumar, Jitendra; Mills, Richard T; Hoffman, Forrest M; Hargrove Jr., William Walter

    2011-01-01

    Identification of geographic ecoregions has long been of interest to environmental scientists and ecologists for identifying regions of similar ecological and environmental conditions. Such classifications are important for predicting suitable species ranges, for stratification of ecological samples, and to help prioritize habitat preservation and remediation efforts. Hargrove and Hoffman (1999, 2009) have developed geographical spatio-temporal clustering algorithms and codes and have successfully applied them to a variety of environmental science domains, including ecological regionalization; environmental monitoring network design; analysis of satellite-, airborne-, and ground-based remote sensing; and climate model-model and model-measurement intercomparison. With the advances in state-of-the-art satellite remote sensing and climate models, observations and model outputs are available at increasingly high spatial and temporal resolutions. Long time series of these high resolution datasets are extremely large in size and growing. Analysis and knowledge extraction from these large datasets are not just algorithmic and ecological problems, but also pose a complex computational problem. This paper focuses on the development of a massively parallel multivariate geographical spatio-temporal clustering code for analysis of very large datasets using tens of thousands of processors on one of the fastest supercomputers in the world.
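
    A serial sketch of the multivariate k-means clustering at the core of the approach described above; the parallel version distributes the assignment step (each processor handles a block of observations) and then reduces the per-cluster sums globally. The points and parameters here are invented for illustration.

```python
# Minimal serial k-means sketch in pure Python. The massively parallel code
# scales this same algorithm up; data and parameters are invented.
import random

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)          # initialize from the data
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        # (In the parallel version, this loop is split across processors.)
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Update step: move each centroid to its cluster mean
        # (a global reduction in the parallel version).
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = tuple(sum(xs) / len(cl) for xs in zip(*cl))
    return centroids, clusters

points = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9)]
centroids, clusters = kmeans(points, 2)
```

    On this toy input the two tight groups of points separate into clusters of two, with centroids near (0.1, 0.05) and (5.1, 5.0).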

  15. Latest developments in the display of large-scale ionospheric and thermospheric data sets

    NASA Technical Reports Server (NTRS)

    Sojka, J. J.

    1992-01-01

    Over the past decade, data base sizes have continually increased and will continue to do so in the future. This problem of size is further compounded because the trend in present-day studies is to use data from many different locations and different instruments and then compare it with data from global scale physical models. The latter produce data bases of comparable if not even larger size. Much of the data can be viewed as 'image' time sequences and is most readily viewed on color display terminals. These data sets reside in national or owner-generated data bases linked together by computer networks. As the size increases, just moving this data around, taking 'quick-looks' at the data, or even storing it locally become severe problems compromising the scientific return from the data. Is the present-day technology with these analysis techniques being used in the best way? What are the prospects for reducing the storage and transmission size of the data sets? Examples of such problems and potential solutions are described in this paper.

  16. The Same or Separate? An Exploration of Teachers' Perceptions of the Classroom Assignment of Twins in Prior to School and Kindergarten to Year Two School Settings

    ERIC Educational Resources Information Center

    Jones, Laura; De Gioia, Katey

    2010-01-01

    This article investigates the perceptions of 12 teachers from New South Wales, Australia, regarding the classroom assignment of twins. Analysis of semi-structured interviews with each of the teachers revealed four key findings: 1) teachers' perceptions about the classroom assignment of twins vary according to their previous experience and…

  17. Validating hierarchical verbal autopsy expert algorithms in a large data set with known causes of death

    PubMed Central

    Kalter, Henry D; Perin, Jamie; Black, Robert E

    2016-01-01

    Background Physician assessment historically has been the most common method of analyzing verbal autopsy (VA) data. Recently, the World Health Organization endorsed two automated methods, Tariff 2.0 and InterVA–4, which promise greater objectivity and lower cost. A disadvantage of the Tariff method is that it requires a training data set from a prior validation study, while InterVA relies on clinically specified conditional probabilities. We undertook to validate the hierarchical expert algorithm analysis of VA data, an automated, intuitive, deterministic method that does not require a training data set. Methods Using Population Health Metrics Research Consortium study hospital source data, we compared the primary causes of 1629 neonatal and 1456 1–59 month–old child deaths from VA expert algorithms arranged in a hierarchy to their reference standard causes. The expert algorithms were held constant, while five prior and one new “compromise” neonatal hierarchy, and three former child hierarchies were tested. For each comparison, the reference standard data were resampled 1000 times within the range of cause–specific mortality fractions (CSMF) for one of three approximated community scenarios in the 2013 WHO global causes of death, plus one random mortality cause proportions scenario. We utilized CSMF accuracy to assess overall population–level validity, and the absolute difference between VA and reference standard CSMFs to examine particular causes. Chance–corrected concordance (CCC) and Cohen’s kappa were used to evaluate individual–level cause assignment. Results Overall CSMF accuracy for the best–performing expert algorithm hierarchy was 0.80 (range 0.57–0.96) for neonatal deaths and 0.76 (0.50–0.97) for child deaths. Performance for particular causes of death varied, with fairly flat estimated CSMF over a range of reference values for several causes. Performance at the individual diagnosis level was also less favorable than that for
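
    The population-level metric used above, CSMF accuracy, can be sketched directly from its standard definition in the verbal autopsy validation literature: one minus the summed absolute error in cause-specific mortality fractions, normalized by the maximum possible error. The cause names and fractions below are invented for illustration.

```python
# Sketch of the CSMF-accuracy metric used to assess population-level
# validity of VA cause assignment; cause fractions are invented.

def csmf_accuracy(true, pred):
    """1 - sum|pred - true| / (2 * (1 - min(true))); 1.0 = perfect recovery."""
    abs_err = sum(abs(pred.get(c, 0.0) - true[c]) for c in true)
    return 1.0 - abs_err / (2.0 * (1.0 - min(true.values())))

true = {"sepsis": 0.5, "pneumonia": 0.3, "other": 0.2}
pred = {"sepsis": 0.4, "pneumonia": 0.4, "other": 0.2}
print(round(csmf_accuracy(true, pred), 3))  # 0.875
```

    The denominator rescales so that the worst possible prediction (all deaths assigned away from the rarest true cause) scores 0, letting accuracies be compared across mortality scenarios with different cause compositions.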

  18. Using large clinical data sets to infer pathogenicity for rare copy number variants in autism cohorts

    PubMed Central

    Moreno-De-Luca, D; Sanders, S J; Willsey, A J; Mulle, J G; Lowe, J K; Geschwind, D H; State, M W; Martin, C L; Ledbetter, D H

    2013-01-01

    Copy number variants (CNVs) have a major role in the etiology of autism spectrum disorders (ASD), and several of these have reached statistical significance in case–control analyses. Nevertheless, current ASD cohorts are not large enough to detect very rare CNVs that may be causative or contributory (that is, risk alleles). Here, we use a tiered approach, in which clinically significant CNVs are first identified in large clinical cohorts of neurodevelopmental disorders (including but not specific to ASD), after which these CNVs are then systematically identified within well-characterized ASD cohorts. We focused our initial analysis on 48 recurrent CNVs (segmental duplication-mediated ‘hotspots') from 24 loci in 31 516 published clinical cases with neurodevelopmental disorders and 13 696 published controls, which yielded a total of 19 deletion CNVs and 11 duplication CNVs that reached statistical significance. We then investigated the overlap of these 30 CNVs in a combined sample of 3955 well-characterized ASD cases from three published studies. We identified 73 deleterious recurrent CNVs, including 36 deletions from 11 loci and 37 duplications from seven loci, for a frequency of 1 in 54; had we considered the ASD cohorts alone, only 58 CNVs from eight loci (24 deletions from three loci and 34 duplications from five loci) would have reached statistical significance. In conclusion, until there are sufficiently large ASD research cohorts with enough power to detect very rare causative or contributory CNVs, data from larger clinical cohorts can be used to infer the likely clinical significance of CNVs in ASD. PMID:23044707

  19. Plastic set of smooth large radii of curvature thermal conductance specimens at light loads.

    NASA Technical Reports Server (NTRS)

    Mckinzie, D. J., Jr.

    1972-01-01

    Thermal contact conductance test data at high vacuum were obtained from two Armco iron specimens having smooth, large radii of curvature, convex, one-half wave length surfaces. The data are compared with calculations based on two macroscopic elastic deformation theories and an empirical expression. Major disagreement with the theories and fair agreement with the empirical expression resulted. Plastic deformation of all the contacting surfaces was verified from surface analyzer statistics. These results indicate that the theoretical assumption of macroscopic elastic deformation is inadequate for accurate prediction of heat transfer with light loads for Armco iron specimens similar to those used in this investigation.

  1. A new tool called DISSECT for analysing large genomic data sets using a Big Data approach

    PubMed Central

    Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert

    2015-01-01

    Large-scale genetic and genomic data are increasingly available and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex traits analysis, we present DISSECT. DISSECT is a new and freely available software that is able to exploit the distributed-memory parallel computational architectures of compute clusters, to perform a wide range of genomic and epidemiologic analyses, which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010

  3. Distilling Artificial Recombinants from Large Sets of Complete mtDNA Genomes

    PubMed Central

    Kong, Qing-Peng; Salas, Antonio; Sun, Chang; Fuku, Noriyuki; Tanaka, Masashi; Zhong, Li; Wang, Cheng-Ye; Yao, Yong-Gang; Bandelt, Hans-Jürgen

    2008-01-01

    Background Large-scale genome sequencing poses enormous problems to the logistics of laboratory work and data handling. When numerous fragments of different genomes are PCR amplified and sequenced in a laboratory, there is a high immanent risk of sample confusion. For genetic markers, such as mitochondrial DNA (mtDNA), which are free of natural recombination, single instances of sample mix-up involving different branches of the mtDNA phylogeny would give rise to reticulate patterns and should therefore be detectable. Methodology/Principal Findings We have developed a strategy for comparing new complete mtDNA genomes, one by one, to a current skeleton of the worldwide mtDNA phylogeny. The mutations distinguishing the reference sequence from a putative recombinant sequence can then be allocated to two or more different branches of this phylogenetic skeleton. Thus, one would search for two (or three) near-matches in the total mtDNA database that together best explain the variation seen in the recombinants. The evolutionary pathway from the mtDNA tree connecting this pair together with the recombinant then generates a grid-like median network, from which one can read off the exchanged segments. Conclusions We have applied this procedure to a large collection of complete human mtDNA sequences, where several recombinants could be distilled by our method. All these recombinant sequences were subsequently corrected by de novo experiments, fully concordant with the predictions from our data-analytical approach. PMID:18714389

  4. External beam IBA set-up with large-area thin Si3N4 window

    NASA Astrophysics Data System (ADS)

    Palonen, V.; Mizohata, K.; Nissinen, T.; Räisänen, J.

    2016-08-01

    A compact external beam setup has been constructed for Particle Induced X-ray Emission (PIXE) and Nuclear Reaction Analysis (NRA). The key issue in the design has been to obtain a wide beam spot size with maximized beam current utilizing a thin Si3N4 exit window. The employed specific exit window support enables use of foils with thickness of 100 nm for a beam spot size of 4 mm in diameter. The durable thin foil and the large beam spot size will be especially important for the complementary external beam NRA measurements. The path between the exit foil and sample is filled with flowing helium to minimize radiation hazard as well as energy loss and straggling, and to cool the samples. For sample-independent beam current monitoring and irradiation fluence measurement, indirect charge integration, based on secondary electron current measurement from a beam profilometer, is utilized.

  5. SeqPig: simple and scalable scripting for large sequencing data sets in Hadoop

    PubMed Central

    Schumacher, André; Pireddu, Luca; Niemenmaa, Matti; Kallio, Aleksi; Korpelainen, Eija; Zanetti, Gianluigi; Heljanko, Keijo

    2014-01-01

    Summary: Hadoop MapReduce-based approaches have become increasingly popular due to their scalability in processing large sequencing datasets. However, as these methods typically require in-depth expertise in Hadoop and Java, they are still out of reach of many bioinformaticians. To solve this problem, we have created SeqPig, a library and a collection of tools to manipulate, analyze and query sequencing datasets in a scalable and simple manner. SeqPig scripts use the Hadoop-based distributed scripting engine Apache Pig, which automatically parallelizes and distributes data processing tasks. We demonstrate SeqPig’s scalability over many computing nodes and illustrate its use with example scripts. Availability and Implementation: Available under the open source MIT license at http://sourceforge.net/projects/seqpig/ Contact: andre.schumacher@yahoo.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24149054

  6. A hybrid structure for the storage and manipulation of very large spatial data sets

    USGS Publications Warehouse

    Peuquet, Donna J.

    1982-01-01

    The map data input and output problem for geographic information systems is rapidly diminishing with the increasing availability of mass digitizing, direct spatial data capture and graphics hardware based on raster technology. Although a large number of efficient raster-based algorithms exist for performing a wide variety of common tasks on these data, there are a number of procedures which are more efficiently performed in vector mode or for which raster mode equivalents of current vector-based techniques have not yet been developed. This paper presents a hybrid spatial data structure, named the 'vaster' structure, which can utilize the advantages of both raster and vector structures while potentially eliminating, or greatly reducing, the need for raster-to-vector and vector-to-raster conversion. Other advantages of the vaster structure are also discussed.

  7. Processing large sensor data sets for safeguards : the knowledge generation system.

    SciTech Connect

    Thomas, Maikel A.; Smartt, Heidi Anne; Matthews, Robert F.

    2012-04-01

    Modern nuclear facilities, such as reprocessing plants, present inspectors with significant challenges due in part to the sheer amount of equipment that must be safeguarded. The Sandia-developed and patented Knowledge Generation system was designed to automatically analyze large amounts of safeguards data to identify anomalous events of interest by comparing sensor readings with those expected from a process of interest and operator declarations. This paper describes a demonstration of the Knowledge Generation system using simulated accountability tank sensor data to represent part of a reprocessing plant. The demonstration indicated that Knowledge Generation has the potential to address several problems critical to the future of safeguards. It could be extended to facilitate remote inspections and trigger random inspections. Knowledge Generation could analyze data to establish trust hierarchies, to facilitate safeguards use of operator-owned sensors.
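
    A hedged sketch of the core comparison the system automates: observed sensor readings are checked against the levels an operator declaration predicts, and out-of-tolerance readings are flagged as anomalous events of interest. Tank levels, timestamps and the tolerance below are invented; the real system is far richer.

```python
# Invented illustration of declaration-vs-sensor anomaly flagging, the
# basic idea behind automated safeguards data analysis described above.

def find_anomalies(readings, expected, tolerance):
    """Return (timestamp, observed, predicted) triples outside tolerance."""
    return [(t, obs, expected[t])
            for t, obs in readings.items()
            if abs(obs - expected[t]) > tolerance]

# Declared transfer schedule implies these tank levels (litres) per hour:
expected = {0: 1000.0, 1: 800.0, 2: 600.0, 3: 400.0}
readings = {0: 1001.0, 1: 799.0, 2: 540.0, 3: 401.0}  # hour 2 is off

print(find_anomalies(readings, expected, tolerance=25.0))
# [(2, 540.0, 600.0)]
```

    Only the hour-2 reading deviates by more than the tolerance, so only that event would be surfaced to an inspector.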

  8. Setting up a Rayleigh Scattering Based Flow Measuring System in a Large Nozzle Testing Facility

    NASA Technical Reports Server (NTRS)

    Panda, Jayanta; Gomez, Carlos R.

    2002-01-01

    A molecular Rayleigh scattering based air density measurement system has been built in a large nozzle testing facility at NASA Glenn Research Center. The technique depends on the light scattering by gas molecules present in air; no artificial seeding is required. Light from a single mode, continuous wave laser was transmitted to the nozzle facility by optical fiber, and light scattered by gas molecules, at various points along the laser beam, is collected and measured by photon-counting electronics. By placing the laser beam and collection optics on synchronized traversing units, the point measurement technique is made effective for surveying density variation over a cross-section of the nozzle plume. Various difficulties associated with dust particles, stray light, high noise level and vibration are discussed. Finally, a limited amount of data from an underexpanded jet are presented and compared with expected variations to validate the technique.

  9. Improved Species-Specific Lysine Acetylation Site Prediction Based on a Large Variety of Features Set

    PubMed Central

    Wuyun, Qiqige; Zheng, Wei; Zhang, Yanping; Ruan, Jishou; Hu, Gang

    2016-01-01

    Lysine acetylation is a major post-translational modification. It plays a vital role in numerous essential biological processes, such as gene expression and metabolism, and is related to some human diseases. To fully understand the regulatory mechanism of acetylation, identification of acetylation sites is the first and most important step. However, experimental identification of protein acetylation sites is often time consuming and expensive. Therefore, alternative computational methods are necessary. Here, we developed a novel tool, KA-predictor, to predict species-specific lysine acetylation sites based on a support vector machine (SVM) classifier. We incorporated different types of features and employed an efficient feature selection on each type to form the final optimal feature set for model learning. Our predictor was highly competitive for the majority of species when compared with other methods. Feature contribution analysis indicated that HSE features, which were introduced here for the first time for lysine acetylation prediction, significantly improved the predictive performance. In particular, we constructed a high-accuracy structural dataset of H. sapiens from the PDB to analyze the structural properties around lysine acetylation sites. Our datasets and a user-friendly local tool, KA-predictor, are freely available at http://sourceforge.net/p/ka-predictor. PMID:27183223

  10. Data Mining on Large Data Set for Predicting Salmon Spawning Habitat

    SciTech Connect

    Xie, YuLong; Murray, Christopher J.; Hanrahan, Timothy P.; Geist, David R.

    2008-07-01

    Hydraulic properties related to river flow affect salmon spawning habitat. Accurately predicting salmon spawning habitat and understanding the properties that influence spawning behavior are of great interest for hydroelectric dam management. Previous research predicted salmon spawning habitat, with some success, by deriving river-specific spawning suitability indices and applying a function-estimation method such as logistic regression to several static flow-related properties. The objective of this study was two-fold. First, dynamic river flow properties associated with upstream dam operation were successfully derived from a huge set of time series of both water velocity and water depth for about one fifth of a million habitat cells through principal component analysis (PCA) using nonlinear iterative partial least squares (NIPALS). The inclusion of these dynamic variables greatly improved model prediction. Second, nine machine learning methods were applied to the data, and decision tree and rule induction methods generally outperformed the commonly used logistic regression. In particular, random forest, an advanced decision tree algorithm, consistently provided better results. The over-prediction problem seen in previous studies was greatly alleviated.
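
    A pure-Python sketch of NIPALS (nonlinear iterative partial least squares) extracting the first principal component, the kind of reduction applied above to the large velocity and depth time series; the small matrix here is invented for illustration.

```python
# Minimal NIPALS sketch: alternately regress loadings and scores until the
# first principal component converges. Data are invented; real inputs are
# huge time-series blocks per habitat cell.

def nipals_pc1(X, iters=100, tol=1e-12):
    n, m = len(X), len(X[0])
    # Center each column.
    means = [sum(row[j] for row in X) / n for j in range(m)]
    Xc = [[row[j] - means[j] for j in range(m)] for row in X]
    t = [row[0] for row in Xc]                       # initial score vector
    for _ in range(iters):
        tt = sum(v * v for v in t)
        p = [sum(Xc[i][j] * t[i] for i in range(n)) / tt for j in range(m)]
        norm = sum(v * v for v in p) ** 0.5
        p = [v / norm for v in p]                    # unit-length loadings
        t_new = [sum(Xc[i][j] * p[j] for j in range(m)) for i in range(n)]
        if sum((a - b) ** 2 for a, b in zip(t, t_new)) < tol:
            t = t_new
            break
        t = t_new
    return p, t                                      # loadings, scores

X = [[2.0, 2.1], [0.0, -0.1], [-2.0, -2.0]]          # two correlated columns
p, t = nipals_pc1(X)
```

    For these strongly correlated columns the loading vector converges toward the diagonal direction (both components near 1/sqrt(2)), i.e. one component captures nearly all the variance.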

  12. Cytotoxicity evaluation of large cyanobacterial strain set using selected human and murine in vitro cell models.

    PubMed

    Hrouzek, Pavel; Kapuścik, Aleksandra; Vacek, Jan; Voráčová, Kateřina; Paichlová, Jindřiška; Kosina, Pavel; Voloshko, Ludmila; Ventura, Stefano; Kopecký, Jiří

    2016-02-01

    The production of cytotoxic molecules interfering with mammalian cells is extensively reported in cyanobacteria. These compounds may have a use in pharmacological applications; however, their potential toxicity needs to be considered. We performed cytotoxicity tests of crude cyanobacterial extracts in six cell models in order to address the frequency of cyanobacterial cytotoxicity to human cells and the level of specificity to a particular cell line. A set of more than 100 cyanobacterial crude extracts isolated from soil habitats (mainly genera Nostoc and Tolypothrix) was tested by MTT test for in vitro toxicity on the hepatic and non-hepatic human cell lines HepG2 and HeLa, and three cell systems of rodent origin: Yac-1, Sp-2 and Balb/c 3T3 fibroblasts. Furthermore, a subset of the extracts was assessed for cytotoxicity against primary cultures of human hepatocytes as a model for evaluating potential hepatotoxicity. Roughly one third of cyanobacterial extracts caused cytotoxic effects (i.e. viability<75%) on human cell lines. Despite the sensitivity differences, high correlation coefficients among the inhibition values were obtained for particular cell systems. This suggests a prevailing general cytotoxic effect of extracts and their constituents. The non-transformed immortalized fibroblasts (Balb/c 3T3) and hepatic cancer line HepG2 exhibited good correlations with primary cultures of human hepatocytes. The presence of cytotoxic fractions in strongly cytotoxic extracts was confirmed by an activity-guided HPLC fractionation, and it was demonstrated that cyanobacterial cytotoxicity is caused by a mixture of components with similar hydrophobic/hydrophilic properties. The data presented here could be used in further research into in vitro testing based on human models for the toxicological monitoring of complex cyanobacterial samples. PMID:26519817

  13. Scoring Methods for Building Genotypic Scores: An Application to Didanosine Resistance in a Large Derivation Set

    PubMed Central

    Houssaini, Allal; Assoumou, Lambert; Miller, Veronica; Calvez, Vincent; Marcelin, Anne-Geneviève; Flandre, Philippe

    2013-01-01

    Background: Several attempts have been made to determine HIV-1 resistance from genotype resistance testing. We compare scoring methods for building weighted genotyping scores and commonly used systems to determine whether the virus of an HIV-infected patient is resistant. Methods and Principal Findings: Three statistical methods (linear discriminant analysis, support vector machine and logistic regression) are used to determine the weight of mutations involved in HIV resistance. We compared these weighted scores with known interpretation systems (ANRS, REGA and Stanford HIV-db) to classify patients as resistant or not. Our methodology is illustrated on the Forum for Collaborative HIV Research didanosine database (N = 1453). The database was divided into four samples according to the country of enrolment (France, USA/Canada, Italy and Spain/UK/Switzerland). The total sample and the four country-based samples allow external validation (one sample is used to estimate a score and the other samples are used to validate it). We used the observed precision to compare the performance of newly derived scores with other interpretation systems. Our results show that newly derived scores performed better than or similar to existing interpretation systems, even with external validation sets. No difference was found between the three methods investigated. Our analysis identified four new mutations associated with didanosine resistance: D123S, Q207K, H208Y and K223Q. Conclusions: We explored the potential of three statistical methods to construct weighted scores for didanosine resistance. Our proposed scores performed at least as well as already existing interpretation systems and previously unrecognized didanosine-resistance associated mutations were identified. This approach could be used for building scores of genotypic resistance to other antiretroviral drugs. PMID:23555613
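    The weighted-score idea above can be sketched in a few lines. The mutation names below are the four identified in the study; the weights and the cutoff are hypothetical stand-ins, not the values fitted by any of the three methods.

```python
# Sketch of a weighted genotypic resistance score. A sample's score is the
# sum of the weights of the resistance-associated mutations it carries; the
# virus is called resistant when the score crosses a chosen cutoff.

# Hypothetical per-mutation weights (e.g. logistic-regression coefficients).
WEIGHTS = {"D123S": 1.0, "Q207K": 0.5, "H208Y": 1.5, "K223Q": 0.5}
CUTOFF = 1.5

def genotypic_score(mutations):
    """Sum the weights of the observed resistance-associated mutations."""
    return sum(WEIGHTS.get(m, 0.0) for m in mutations)

def is_resistant(mutations, cutoff=CUTOFF):
    return genotypic_score(mutations) >= cutoff

print(genotypic_score(["D123S", "K223Q"]))  # 1.5
print(is_resistant(["D123S", "K223Q"]))     # True
print(is_resistant(["Q207K"]))              # False
```

    In practice the weights would come from the model fitted on the derivation set, and the cutoff would be chosen to maximize the observed precision.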

  14. Evaluation in the Classroom.

    ERIC Educational Resources Information Center

    Becnel, Shirley

    Six classroom research-based instructional projects funded under Chapter 2 are described, and their outcomes are summarized. The projects each used computer hardware and software in the classroom setting. The projects and their salient points include: (1) the Science Technology Project, in which 48 teachers and 2,847 students in 18 schools used…

  15. Solving the Dirac equation, using the large component only, in a Dirac-type Slater orbital basis set

    NASA Astrophysics Data System (ADS)

    van Lenthe, E.; Baerends, E. J.; Snijders, J. G.

    1995-04-01

    We solve the Dirac equation by solving the two-component energy-dependent equation for the large component that results from the elimination of the small component. This requires for every occupied orbital the diagonalization of a Hamiltonian. Advantages are, however, that these Hamiltonians are all bounded from below, unlike the Dirac Hamiltonian, and that only a basis set for the large component is needed. We use Dirac-type Slater orbitals, adapted from solutions to the hydrogen-like atom. This offers the perspective of performing relativistic calculations to the same accuracy as non-relativistic ones, with a comparable number of basis functions.
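    The elimination step described above can be written out explicitly. In an illustrative notation (phi the large and chi the small component, energy measured relative to the rest-mass energy), the standard two-component reduction reads:

```latex
% Two-component (split) form of the Dirac equation:
%   V\phi + c\,\sigma\cdot p\,\chi = E\phi
%   c\,\sigma\cdot p\,\phi + (V - 2mc^2)\chi = E\chi
% Solving the second equation for the small component,
\chi = \bigl(2mc^2 + E - V\bigr)^{-1} c\,\sigma\cdot p\,\phi ,
% and substituting into the first gives the energy-dependent equation
% for the large component alone:
\Bigl[\,V + c\,\sigma\cdot p\,\frac{1}{2mc^2 + E - V}\,c\,\sigma\cdot p\,\Bigr]\phi = E\phi .
```

    The operator in brackets is the energy-dependent Hamiltonian that must be diagonalized for every occupied orbital; unlike the Dirac Hamiltonian, it is bounded from below.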

  16. Can Wide Consultation Help with Setting Priorities for Large-Scale Biodiversity Monitoring Programs?

    PubMed Central

    Boivin, Frédéric; Simard, Anouk; Peres-Neto, Pedro

    2014-01-01

    Climate and other global change phenomena affecting biodiversity require monitoring to track ecosystem changes and guide policy and management actions. Designing a biodiversity monitoring program is a difficult task that requires making decisions that often lack consensus due to budgetary constraints. As monitoring programs require long-term investment, they also require strong and continuing support from all interested parties. As such, stakeholder consultation is key to identifying priorities and making sound design decisions that have as much support as possible. Here, we present the results of a consultation conducted to serve as an aid for designing a large-scale biodiversity monitoring program for the province of Québec (Canada). The consultation took the form of a survey with 13 discrete choices involving tradeoffs with respect to design priorities and 10 demographic questions (e.g., age, profession). The survey was sent to thousands of individuals having expected interests and knowledge about biodiversity and was completed by 621 participants. Overall, consensus was rare and it appeared difficult to create a design fulfilling the priorities of the majority. Most participants wanted 1) a monitoring design covering the entire territory and focusing on natural habitats; and 2) a focus on species related to ecosystem services, on threatened species and on invasive species. The only demographic characteristic that was related to the type of prioritization was the declared level of knowledge in biodiversity (null to high), but even then the influence was quite small. PMID:25525798

  17. "Tools For Analysis and Visualization of Large Time- Varying CFD Data Sets"

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; vanGelder, Allen

    1999-01-01

    During the four years of this grant (including the one year extension), we have explored many aspects of the visualization of large CFD (Computational Fluid Dynamics) datasets. These have included new direct volume rendering approaches, hierarchical methods, volume decimation, error metrics, parallelization, hardware texture mapping, and methods for analyzing and comparing images. First, we implemented an extremely general direct volume rendering approach that can be used to render rectilinear, curvilinear, or tetrahedral grids, including overlapping multiple zone grids, and time-varying grids. Next, we developed techniques for associating the sample data with a k-d tree, a simple hierarchical data model that approximates samples in the regions covered by each node of the tree, and an error metric for the accuracy of the model. We also explored a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH (Association for Computing Machinery Special Interest Group on Computer Graphics) '96. In our initial implementation, we automatically image the volume from 32 approximately evenly distributed positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation.
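    The k-d tree approximation described above can be illustrated with a minimal sketch: each node stores the mean of the samples it covers, plus an RMS error metric for how well that mean represents them. The class names, split rule, and data are illustrative, not the authors' implementation.

```python
# A k-d tree over sample points: every node approximates its region by the
# mean of the contained values and records the RMS error of that
# approximation, so a renderer can stop descending once the error is small.
import math

class KDNode:
    def __init__(self, points, values, depth=0, leaf_size=2):
        self.mean = sum(values) / len(values)  # approximation for this region
        self.error = math.sqrt(sum((v - self.mean) ** 2 for v in values) / len(values))
        self.left = self.right = None
        if len(points) > leaf_size:
            axis = depth % len(points[0])      # cycle through x, y, z, ...
            order = sorted(range(len(points)), key=lambda i: points[i][axis])
            mid = len(order) // 2
            lo, hi = order[:mid], order[mid:]
            self.left = KDNode([points[i] for i in lo], [values[i] for i in lo],
                               depth + 1, leaf_size)
            self.right = KDNode([points[i] for i in hi], [values[i] for i in hi],
                                depth + 1, leaf_size)

pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
vals = [1.0, 1.0, 3.0, 3.0]
root = KDNode(pts, vals)
print(root.mean)   # 2.0  (coarse approximation of the whole region)
print(root.error)  # 1.0  (RMS error of that approximation)
```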

  18. A Large Set of Newly Created Interspecific Saccharomyces Hybrids Increases Aromatic Diversity in Lager Beers

    PubMed Central

    Mertens, Stijn; Steensels, Jan; Saels, Veerle; De Rouck, Gert; Aerts, Guido

    2015-01-01

    Lager beer is the most consumed alcoholic beverage in the world. Its production process is marked by a fermentation conducted at low (8 to 15°C) temperatures and by the use of Saccharomyces pastorianus, an interspecific hybrid between Saccharomyces cerevisiae and the cold-tolerant Saccharomyces eubayanus. Recent whole-genome-sequencing efforts revealed that the currently available lager yeasts belong to one of only two archetypes, “Saaz” and “Frohberg.” This limited genetic variation likely reflects that all lager yeasts descend from only two separate interspecific hybridization events, which may also explain the relatively limited aromatic diversity between the available lager beer yeasts compared to, for example, wine and ale beer yeasts. In this study, 31 novel interspecific yeast hybrids were developed, resulting from large-scale robot-assisted selection and breeding between carefully selected strains of S. cerevisiae (six strains) and S. eubayanus (two strains). Interestingly, many of the resulting hybrids showed a broader temperature tolerance than their parental strains and reference S. pastorianus yeasts. Moreover, they combined a high fermentation capacity with a desirable aroma profile in laboratory-scale lager beer fermentations, thereby successfully enriching the currently available lager yeast biodiversity. Pilot-scale trials further confirmed the industrial potential of these hybrids and identified one strain, hybrid H29, which combines a fast fermentation, high attenuation, and the production of a complex, desirable fruity aroma. PMID:26407881

  19. A large set of newly created interspecific Saccharomyces hybrids increases aromatic diversity in lager beers.

    PubMed

    Mertens, Stijn; Steensels, Jan; Saels, Veerle; De Rouck, Gert; Aerts, Guido; Verstrepen, Kevin J

    2015-12-01

    Lager beer is the most consumed alcoholic beverage in the world. Its production process is marked by a fermentation conducted at low (8 to 15°C) temperatures and by the use of Saccharomyces pastorianus, an interspecific hybrid between Saccharomyces cerevisiae and the cold-tolerant Saccharomyces eubayanus. Recent whole-genome-sequencing efforts revealed that the currently available lager yeasts belong to one of only two archetypes, "Saaz" and "Frohberg." This limited genetic variation likely reflects that all lager yeasts descend from only two separate interspecific hybridization events, which may also explain the relatively limited aromatic diversity between the available lager beer yeasts compared to, for example, wine and ale beer yeasts. In this study, 31 novel interspecific yeast hybrids were developed, resulting from large-scale robot-assisted selection and breeding between carefully selected strains of S. cerevisiae (six strains) and S. eubayanus (two strains). Interestingly, many of the resulting hybrids showed a broader temperature tolerance than their parental strains and reference S. pastorianus yeasts. Moreover, they combined a high fermentation capacity with a desirable aroma profile in laboratory-scale lager beer fermentations, thereby successfully enriching the currently available lager yeast biodiversity. Pilot-scale trials further confirmed the industrial potential of these hybrids and identified one strain, hybrid H29, which combines a fast fermentation, high attenuation, and the production of a complex, desirable fruity aroma. PMID:26407881

  20. MUSI: an integrated system for identifying multiple specificity from very large peptide or nucleic acid data sets

    PubMed Central

    Kim, TaeHyung; Tyndel, Marc S.; Huang, Haiming; Sidhu, Sachdev S.; Bader, Gary D.; Gfeller, David; Kim, Philip M.

    2012-01-01

    Peptide recognition domains and transcription factors play crucial roles in cellular signaling. They bind linear stretches of amino acids or nucleotides, respectively, with high specificity. Experimental techniques that assess the binding specificity of these domains, such as microarrays or phage display, can retrieve thousands of distinct ligands, providing detailed insight into binding specificity. In particular, the advent of next-generation sequencing has recently increased the throughput of such methods by several orders of magnitude. These advances have helped reveal the presence of distinct binding specificity classes that co-exist within a set of ligands interacting with the same target. Here, we introduce a software system called MUSI that can rapidly analyze very large data sets of binding sequences to determine the relevant binding specificity patterns. Our pipeline provides two major advances. First, it can detect previously unrecognized multiple specificity patterns in any data set. Second, it offers integrated processing of very large data sets from next-generation sequencing machines. The results are visualized as multiple sequence logos describing the different binding preferences of the protein under investigation. We demonstrate the performance of MUSI by analyzing recent phage display data for human SH3 domains as well as microarray data for mouse transcription factors. PMID:22210894

  1. Twelve- to 14-month-old infants can predict single-event probability with large set sizes.

    PubMed

    Denison, Stephanie; Xu, Fei

    2010-09-01

    Previous research has revealed that infants can reason correctly about single-event probabilities with small but not large set sizes (Bonatti, 2008; Teglas et al., 2007). The current study asks whether infants can make predictions regarding single-event probability with large set sizes using a novel procedure. Infants completed two trials: a preference trial to determine whether they preferred pink or black lollipops, and a test trial where infants saw two jars, one containing mostly pink lollipops and another containing mostly black lollipops. The experimenter removed one occluded lollipop from each jar and placed them in two separate opaque cups. Seventy-eight percent of infants searched in the cup that contained a lollipop from the jar with a higher proportion of their preferred color object, significantly better than chance. Thus, infants can reason about single-event probabilities with large set sizes in a choice paradigm, and, contrary to most findings in the infant literature, the prediction task used here appears to be a more sensitive measure than the standard looking-time task. PMID:20712746

  2. Designing Websites for Displaying Large Data Sets and Images on Multiple Platforms

    NASA Astrophysics Data System (ADS)

    Anderson, A.; Wolf, V. G.; Garron, J.; Kirschner, M.

    2012-12-01

    The desire to build websites to analyze and display ever increasing amounts of scientific data and images pushes for website designs that utilize large displays and use the display area as efficiently as possible. Yet scientists and users of their data increasingly wish to access these websites in the field and on mobile devices. This results in the need to develop websites that can support a wide range of devices and screen sizes, and to optimally use whatever display area is available. Historically, designers have addressed this issue by building two websites: one for mobile devices and one for desktop environments, resulting in increased cost, duplicity of work, and longer development times. Recent advances in web design technology and techniques allow for the development of a single website that dynamically adjusts to the type of device being used to browse the website (smartphone, tablet, desktop). In addition, they provide the opportunity to truly optimize whatever display area is available. HTML5 and CSS3 give web designers media query statements that allow design style sheets to be aware of the size of the display being used, and to format web content differently based upon the queried response. Web elements can be rendered in a different size, position, or even removed from the display entirely, based upon the size of the display area. Using HTML5/CSS3 media queries in this manner is referred to as "Responsive Web Design" (RWD). RWD in combination with technologies such as LESS and Twitter Bootstrap allows the web designer to build web sites which not only dynamically respond to the browser display size being used, but to do so in very controlled and intelligent ways, ensuring that good layout and graphic design principles are followed while doing so. At the University of Alaska Fairbanks, the Alaska Satellite Facility SAR Data Center (ASF) recently redesigned their popular Vertex application and converted it from a

  3. Actual Versus Estimated Utility Factor of a Large Set of Privately Owned Chevrolet Volts

    SciTech Connect

    John Smart; Thomas Bradley; Stephen Schey

    2014-04-01

    In order to determine the overall fuel economy of a plug-in hybrid electric vehicle (PHEV), the amount of operation in charge depleting (CD) versus charge sustaining modes must be determined. Mode of operation is predominantly dependent on customer usage of the vehicle and is therefore highly variable. The utility factor (UF) concept was developed to quantify the distance a group of vehicles has traveled or may travel in CD mode. SAE J2841 presents a UF calculation method based on data collected from travel surveys of conventional vehicles. UF estimates have been used in a variety of areas, including the calculation of window sticker fuel economy, policy decisions, and vehicle design determination. The EV Project, a plug-in electric vehicle charging infrastructure demonstration being conducted across the United States, provides the opportunity to determine the real-world UF of a large group of privately owned Chevrolet Volt extended range electric vehicles. Using data collected from Volts enrolled in The EV Project, this paper compares the real-world UF of two groups of Chevrolet Volts to estimated UFs based on J2841. The actual fleet utility factors (FUFs) for the MY2011/2012 and MY2013 Volt groups studied were observed to be 72% and 74%, respectively. Using the EPA CD ranges, the method prescribed by J2841 estimates a FUF of 65% and 68% for the MY2011/2012 and MY2013 Volt groups, respectively. Volt drivers achieved higher percentages of distance traveled in EV mode for two reasons. First, they had fewer long-distance travel days than drivers in the national travel survey referenced by J2841. Second, they charged more frequently than the J2841 assumption of once per day: drivers of Volts in this study averaged over 1.4 charging events per day. Although actual CD range varied widely as driving conditions varied, the average CD ranges for the two Volt groups studied matched the EPA CD range estimates, so CD range variation did not affect FUF results.
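    The utility factor computation can be sketched in the spirit of SAE J2841: assuming one full charge per travel day, the fraction of total distance drivable in CD mode. The per-day distances and CD range below are made-up numbers, not EV Project data.

```python
# Fleet utility factor (FUF) sketch: for each travel day, at most the CD
# range can be driven electrically; the FUF is electric distance divided
# by total distance, summed over all days.

def fleet_utility_factor(daily_distances_km, cd_range_km):
    cd_distance = sum(min(d, cd_range_km) for d in daily_distances_km)
    return cd_distance / sum(daily_distances_km)

days = [20, 35, 60, 15, 120]   # hypothetical per-day travel distances (km)
print(round(fleet_utility_factor(days, cd_range_km=56), 2))  # 0.73
```

    A few long-distance days pull the FUF down sharply, which is why the Volt groups, with fewer such days and more than one charge per day, beat the J2841 estimate.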

  4. A Rainbow for the Classroom.

    ERIC Educational Resources Information Center

    Russell, R. D.

    1989-01-01

    Describes an experiment producing a visible spectrum with inexpensive equipment available in the physics classroom. Discusses some related equations, apparatus settings, and instructional methods. (YP)

  5. The Viking viewer for connectomics: scalable multi-user annotation and summarization of large volume data sets.

    PubMed

    Anderson, J R; Mohammed, S; Grimm, B; Jones, B W; Koshevoy, P; Tasdizen, T; Whitaker, R; Marc, R E

    2011-01-01

    Modern microscope automation permits the collection of vast amounts of continuous anatomical imagery in both two and three dimensions. These large data sets present significant challenges for data storage, access, viewing, annotation and analysis. The cost and overhead of collecting and storing the data can be extremely high. Large data sets quickly exceed an individual's capability for timely analysis and present challenges in efficiently applying transforms, if needed. Finally, annotated anatomical data sets can represent a significant investment of resources and should be easily accessible to the scientific community. The Viking application was our solution created to view and annotate a 16.5 TB ultrastructural retinal connectome volume and we demonstrate its utility in reconstructing neural networks for a distinctive retinal amacrine cell class. Viking has several key features. (1) It works over the internet using HTTP and supports many concurrent users limited only by hardware. (2) It supports a multi-user, collaborative annotation strategy. (3) It cleanly demarcates viewing and analysis from data collection and hosting. (4) It is capable of applying transformations in real-time. (5) It has an easily extensible user interface, allowing addition of specialized modules without rewriting the viewer. PMID:21118201

  6. Supporting Classroom Activities with the BSUL System

    ERIC Educational Resources Information Center

    Ogata, Hiroaki; Saito, Nobuji A.; Paredes J., Rosa G.; San Martin, Gerardo Ayala; Yano, Yoneo

    2008-01-01

    This paper presents the integration of ubiquitous computing systems into classroom settings, in order to provide basic support for classrooms and field activities. We have developed web application components using Java technology and configured a classroom with wireless network access and a web camera for our purposes. In this classroom, the…

  7. Knowledge and theme discovery across very large biological data sets using distributed queries: a prototype combining unstructured and structured data.

    PubMed

    Mudunuri, Uma S; Khouja, Mohamad; Repetski, Stephen; Venkataraman, Girish; Che, Anney; Luke, Brian T; Girard, F Pascal; Stephens, Robert M

    2013-01-01

    As the discipline of biomedical science continues to apply new technologies capable of producing unprecedented volumes of noisy and complex biological data, it has become evident that available methods for deriving meaningful information from such data are simply not keeping pace. In order to achieve useful results, researchers require methods that consolidate, store and query combinations of structured and unstructured data sets efficiently and effectively. As we move towards personalized medicine, the need to combine unstructured data, such as medical literature, with large amounts of highly structured and high-throughput data such as human variation or expression data from very large cohorts, is especially urgent. For our study, we investigated a likely biomedical query using the Hadoop framework. We ran queries using native MapReduce tools we developed as well as other open source and proprietary tools. Our results suggest that the available technologies within the Big Data domain can reduce the time and effort needed to utilize and apply distributed queries over large datasets in practical clinical applications in the life sciences domain. The methodologies and technologies discussed in this paper set the stage for a more detailed evaluation that investigates how various data structures and data models are best mapped to the proper computational framework. PMID:24312478
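    The MapReduce pattern behind the distributed queries above can be shown in a toy, single-process form: map emits (term, 1) pairs from unstructured text, the shuffle groups pairs by key, and reduce sums the counts. The corpus and terms of interest are invented for illustration, not the study's query.

```python
# Toy MapReduce over a tiny "unstructured" corpus.
from collections import defaultdict

TERMS = {"variation", "expression"}   # hypothetical terms of interest

def map_phase(doc):
    """Emit (term, 1) for every occurrence of a term of interest."""
    return [(w, 1) for w in doc.lower().split() if w in TERMS]

def shuffle(pairs):
    """Group emitted values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, val in pairs:
        groups[key].append(val)
    return groups

def reduce_phase(groups):
    """Aggregate each key's values; here, a simple count."""
    return {key: sum(vals) for key, vals in groups.items()}

corpus = ["Human variation data", "Expression data and variation studies"]
pairs = [p for doc in corpus for p in map_phase(doc)]
print(reduce_phase(shuffle(pairs)))  # {'variation': 2, 'expression': 1}
```

    In Hadoop the map and reduce functions run on many nodes with the shuffle handled by the framework; the single-process version only illustrates the data flow.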

  8. Knowledge and Theme Discovery across Very Large Biological Data Sets Using Distributed Queries: A Prototype Combining Unstructured and Structured Data

    PubMed Central

    Repetski, Stephen; Venkataraman, Girish; Che, Anney; Luke, Brian T.; Girard, F. Pascal; Stephens, Robert M.

    2013-01-01

    As the discipline of biomedical science continues to apply new technologies capable of producing unprecedented volumes of noisy and complex biological data, it has become evident that available methods for deriving meaningful information from such data are simply not keeping pace. In order to achieve useful results, researchers require methods that consolidate, store and query combinations of structured and unstructured data sets efficiently and effectively. As we move towards personalized medicine, the need to combine unstructured data, such as medical literature, with large amounts of highly structured and high-throughput data such as human variation or expression data from very large cohorts, is especially urgent. For our study, we investigated a likely biomedical query using the Hadoop framework. We ran queries using native MapReduce tools we developed as well as other open source and proprietary tools. Our results suggest that the available technologies within the Big Data domain can reduce the time and effort needed to utilize and apply distributed queries over large datasets in practical clinical applications in the life sciences domain. The methodologies and technologies discussed in this paper set the stage for a more detailed evaluation that investigates how various data structures and data models are best mapped to the proper computational framework. PMID:24312478

  9. A parallelized surface extraction algorithm for large binary image data sets based on an adaptive 3D delaunay subdivision strategy.

    PubMed

    Ma, Yingliang; Saetzler, Kurt

    2008-01-01

    In this paper we describe a novel 3D subdivision strategy to extract the surface of binary image data. This iterative approach generates a series of surface meshes that capture different levels of detail of the underlying structure. At the highest level of detail, the resulting surface mesh generated by our approach uses only about 10% of the triangles in comparison to the marching cube algorithm (MC) even in settings where almost no image noise is present. Our approach also eliminates the so-called "staircase effect" which voxel-based algorithms like the MC are likely to show, particularly if non-uniformly sampled images are processed. Finally, we show how the presented algorithm can be parallelized by subdividing 3D image space into rectilinear blocks of subimages. As the algorithm scales very well with an increasing number of processors in a multi-threaded setting, this approach is suited to process large image data sets of several gigabytes. Although the presented work is still computationally more expensive than simple voxel-based algorithms, it produces fewer surface triangles while capturing the same level of detail, is more robust towards image noise and eliminates the above-mentioned "staircase" effect in anisotropic settings. These properties make it particularly useful for biomedical applications, where these conditions are often encountered. PMID:17993710

  10. Performance-friendly rule extraction in large water data-sets with AOC posets and relational concept analysis

    NASA Astrophysics Data System (ADS)

    Dolques, Xavier; Le Ber, Florence; Huchard, Marianne; Grac, Corinne

    2016-02-01

    In this paper, we consider data analysis methods for knowledge extraction from large water data-sets. More specifically, we try to connect physico-chemical parameters and the characteristics of taxons living in sample sites. Among these data analysis methods, we consider formal concept analysis (FCA), which is a recognized tool for classification and rule discovery on object-attribute data. Relational concept analysis (RCA) relies on FCA and deals with sets of object-attribute data provided with relations. RCA produces more informative results but at the expense of an increase in complexity. Besides, in numerous applications of FCA, the partially ordered set of concepts introducing attributes or objects (AOC poset, for Attribute-Object-Concept poset) is used rather than the concept lattice in order to reduce combinatorial problems. AOC posets are much smaller and easier to compute than concept lattices and still contain the information needed to rebuild the initial data. This paper introduces a variant of the RCA process based on AOC posets rather than concept lattices. This approach is compared with RCA based on iceberg lattices. Experiments are performed with various scaling operators, and a specific operator is introduced to deal with noisy data. We show that using AOC poset on water data-sets provides a reasonable concept number and allows us to extract meaningful implication rules (association rules whose confidence is 1), whose semantics depends on the chosen scaling operator.
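    The FCA machinery underlying the AOC poset can be shown on a toy context. The derivation operators and the attribute concepts (m', m'') below follow the standard definitions; the object-attribute context itself is invented, not the water data-set from the paper.

```python
# Minimal formal concept analysis sketch: derivation operators on a small
# object-attribute context, and the concept introducing a given attribute.
# Attribute concepts and object concepts together form the AOC poset.

CONTEXT = {                      # object -> set of attributes
    "site1": {"high_pH", "taxonA"},
    "site2": {"high_pH", "taxonA", "taxonB"},
    "site3": {"taxonB"},
}

def extent(attrs):
    """Objects having every attribute in attrs (the ' operator on attributes)."""
    return {g for g, a in CONTEXT.items() if attrs <= a}

def intent(objs):
    """Attributes shared by every object in objs (the ' operator on objects)."""
    return set.intersection(*(CONTEXT[g] for g in objs)) if objs else set()

def attribute_concept(m):
    """The concept introducing attribute m: (m', m'')."""
    ext = extent({m})
    return (frozenset(ext), frozenset(intent(ext)))

ext, inten = attribute_concept("taxonA")
print(sorted(ext))    # ['site1', 'site2']
print(sorted(inten))  # ['high_pH', 'taxonA']
```

    Here taxonA's concept also carries high_pH, i.e. the implication rule "taxonA implies high_pH" holds in this toy context; this is the kind of rule the AOC-poset-based process extracts at scale.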

  11. Development and Validation of Decision Forest Model for Estrogen Receptor Binding Prediction of Chemicals Using Large Data Sets.

    PubMed

    Ng, Hui Wen; Doughty, Stephen W; Luo, Heng; Ye, Hao; Ge, Weigong; Tong, Weida; Hong, Huixiao

    2015-12-21

    Some chemicals in the environment possess the potential to interact with the endocrine system in the human body. Multiple receptors are involved in the endocrine system; estrogen receptor α (ERα) plays very important roles in endocrine activity and is the most studied receptor. Understanding and predicting estrogenic activity of chemicals facilitates the evaluation of their endocrine activity. Hence, we have developed a decision forest classification model to predict chemical binding to ERα using a large training data set of 3308 chemicals obtained from the U.S. Food and Drug Administration's Estrogenic Activity Database. We tested the model using cross validations and external data sets of 1641 chemicals obtained from the U.S. Environmental Protection Agency's ToxCast project. The model showed good performance in both internal (92% accuracy) and external validations (∼ 70-89% relative balanced accuracies), where the latter involved the validations of the model across different ER pathway-related assays in ToxCast. The important features that contribute to the prediction ability of the model were identified through informative descriptor analysis and were related to current knowledge of ER binding. Prediction confidence analysis revealed that the model had both high prediction confidence and accuracy for most predicted chemicals. The results demonstrated that the model constructed based on the large training data set is more accurate and robust for predicting ER binding of chemicals than the published models that have been developed using much smaller data sets. The model could be useful for the evaluation of ERα-mediated endocrine activity potential of environmental chemicals. PMID:26524122

  12. Software tools that facilitate kinetic modelling with large data sets: an example using growth modelling in sugarcane.

    PubMed

    Uys, L; Hofmeyr, J H S; Snoep, J L; Rohwer, J M

    2006-09-01

    A solution to manage cumbersome data sets associated with large modelling projects is described. A kinetic model of sucrose accumulation in sugarcane is used to predict changes in sucrose metabolism with sugarcane internode maturity. This results in large amounts of output data to be analysed. Growth is simulated by reassigning maximal activity values, specific to each internode of the sugarcane plant, to parameter attributes of a model object. From a programming perspective, only one model definition file is required for the simulation software used; however, the amount of input data increases with each extra internode that is modelled, and the amount of output data generated likewise increases. To store, manipulate and analyse these data, the modelling was performed from within a spreadsheet. This was made possible by the scripting language Python and the modelling software PySCeS through an embedded Python interpreter available in the Gnumeric spreadsheet program. PMID:16986323
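    The per-internode parameter reassignment described above can be sketched as a simple scan: one model object, a maximal-activity value reassigned before each run, and the outputs collected into one table. The model and all numbers are invented stand-ins, not the published sugarcane kinetics.

```python
# One model definition, many parameterizations: reassign the maximal
# activity per internode, re-solve, and tabulate the results.

class SucroseModel:
    def __init__(self):
        self.vmax = 1.0

    def steady_state_sucrose(self):
        # Stand-in for solving the kinetic model to steady state.
        return 10.0 * self.vmax / (1.0 + self.vmax)

model = SucroseModel()
vmax_per_internode = {3: 0.5, 6: 1.0, 9: 2.0}   # hypothetical maturity series

table = []
for internode, vmax in vmax_per_internode.items():
    model.vmax = vmax                            # reassign the parameter attribute
    table.append((internode, round(model.steady_state_sucrose(), 2)))

print(table)  # [(3, 3.33), (6, 5.0), (9, 6.67)]
```

    In the spreadsheet setting, each row of the table would land in a sheet range, with PySCeS doing the actual kinetic simulation behind the stand-in method.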

  13. The plateau in mnemonic resolution across large set sizes indicates discrete resource limits in visual working memory.

    PubMed

    Anderson, David E; Awh, Edward

    2012-07-01

    The precision of visual working memory (WM) representations declines monotonically with increasing storage load. Two distinct models of WM capacity predict different shapes for this precision-by-set-size function. Flexible-resource models, which assert a continuous allocation of resources across an unlimited number of items, predict a monotonic decline in precision across a large range of set sizes. Conversely, discrete-resource models, which assert a relatively small item limit for WM storage, predict that precision will plateau once this item limit is exceeded. Recent work has demonstrated such a plateau in mnemonic precision. Moreover, the set size at which mnemonic precision reached asymptote has been strongly predicted by estimated item limits in WM. In the present work, we extend this evidence in three ways. First, we show that this empirical pattern generalizes beyond orientation memory to color memory. Second, we rule out encoding limits as the source of discrete limits by demonstrating equivalent performance across simultaneous and sequential presentations of the memoranda. Finally, we demonstrate that the analytic approach commonly used to estimate precision yields flawed parameter estimates when the range of stimulus space is narrowed (e.g., a 180° rather than a 360° orientation space) and typical numbers of observations are collected. Such errors in parameter estimation reconcile an apparent conflict between our findings and others based on different stimuli. These findings provide further support for discrete-resource models of WM capacity. PMID:22477058
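    The two model predictions contrasted above can be put in schematic form: under a flexible-resource model precision keeps falling as set size N grows, while under a discrete-resource model it plateaus once N exceeds the item limit K. The baseline precision J1 and the limit K below are illustrative numbers, not fitted values.

```python
# Schematic precision-by-set-size functions for the two WM capacity models.

J1 = 12.0   # hypothetical precision when a single item is stored
K = 3       # hypothetical discrete item limit

def flexible_precision(n):
    return J1 / n                 # resources spread over all n items

def discrete_precision(n):
    return J1 / min(n, K)         # only up to K items receive resources

for n in (1, 2, 3, 4, 6, 8):
    print(n, flexible_precision(n), discrete_precision(n))
```

    The two curves coincide up to N = K and diverge beyond it, which is exactly the signature the plateau findings test for.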

  14. The plateau in mnemonic resolution across large set sizes indicates discrete resource limits in visual working memory

    PubMed Central

    Anderson, David E.

    2015-01-01

    The precision of visual working memory (WM) representations declines monotonically with increasing storage load. Two distinct models of WM capacity predict different shapes for this precision-by-set-size function. Flexible-resource models, which assert a continuous allocation of resources across an unlimited number of items, predict a monotonic decline in precision across a large range of set sizes. Conversely, discrete-resource models, which assert a relatively small item limit for WM storage, predict that precision will plateau once this item limit is exceeded. Recent work has demonstrated such a plateau in mnemonic precision. Moreover, the set size at which mnemonic precision reached asymptote has been strongly predicted by estimated item limits in WM. In the present work, we extend this evidence in three ways. First, we show that this empirical pattern generalizes beyond orientation memory to color memory. Second, we rule out encoding limits as the source of discrete limits by demonstrating equivalent performance across simultaneous and sequential presentations of the memoranda. Finally, we demonstrate that the analytic approach commonly used to estimate precision yields flawed parameter estimates when the range of stimulus space is narrowed (e.g., a 180° rather than a 360° orientation space) and typical numbers of observations are collected. Such errors in parameter estimation reconcile an apparent conflict between our findings and others based on different stimuli. These findings provide further support for discrete-resource models of WM capacity. PMID:22477058

  15. Goal Setting and Student Achievement: A Longitudinal Study

    ERIC Educational Resources Information Center

    Moeller, Aleidine J.; Theiler, Janine M.; Wu, Chaorong

    2012-01-01

    The connection between goals and student motivation has been widely investigated in the research literature, but the relationship of goal setting and student achievement at the classroom level has remained largely unexplored. This article reports the findings of a 5-year quasi-experimental study examining goal setting and student achievement in…

  16. Culture in the Classroom

    ERIC Educational Resources Information Center

    Medin, Douglas L.; Bang, Megan

    2014-01-01

    Culture plays a large but often unnoticeable role in what we teach and how we teach children. We are a country of immense diversity, but in classrooms the dominant European-American culture has become the language of learning.

  17. Evaluation of Calendar Values of the Climatostratigraphic Borders on the Base of Large Sets of 14C Dates

    NASA Astrophysics Data System (ADS)

    Michczynska, D. J.; Michczynski, A.; Pazdur, A.; Starkel, L.

    2009-04-01

    Two large sets of radiocarbon dates (785 dates for peat samples and 331 dates for fluvial sediments) were used to establish calendar values of the climatostratigraphic borders for the last 16 ka. All samples were collected from the territory of Poland and dated in the Gliwice Radiocarbon Laboratory. For both sets, Probability Density Functions (PDFs) were constructed by summing the probability distributions of the individual 14C dates after calibration. In previous analyses (Michczynska and Pazdur, 2004; Michczynski and Michczynska, 2006) the authors noticed and discussed the presence of high, narrow peaks in the PDFs. Their appearance is caused by two facts: 1. The calibration curve is a record of the environmental changes in the past. The steep-slope sections of the calibration curve work as an amplifier and increase the height of the PDF. 2. Environmental changes are indirectly recorded in the frequency of radiocarbon dates because of preferential sampling - the general rule of taking samples from places of visible sedimentation changes (e.g. from the top and bottom of a peat layer) may be the reason that samples from the borders of the Late Quaternary climatostratigraphic subdivisions are collected especially frequently. The high, narrow peaks of the PDFs are thus produced both by preferential sampling and through the influence of the calibration curve shape. This fact may be useful for establishing the borders of the Late Quaternary subdivision on the calendar scale for the analyzed geographical area. References: Michczyński A., Michczyńska D.J., 2006. The effect of PDF peaks' height increase during calibration of radiocarbon date sets. Geochronometria, 25: 1-4. Michczyńska D.J., Pazdur A., 2004. A shape analysis of cumulative probability density function of radiocarbon dates set in the study of climate change in Late Glacial and Holocene. Radiocarbon, 46(2): 733-744.
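The summation step behind such PDFs can be illustrated in outline. The sketch below idealizes each date as a Gaussian on the calendar axis (real work would first calibrate each 14C date against the calibration curve, which is exactly what produces the amplification discussed above); the example dates, clustered near a hypothetical boundary, are invented:

```python
import math

# Build a summed probability density function from a set of dates,
# each idealized as a Gaussian (mean, sigma) on the calendar axis.

def summed_pdf(dates, grid):
    """dates: list of (mean, sigma); grid: iterable of calendar ages."""
    pdf = []
    for t in grid:
        density = sum(
            math.exp(-0.5 * ((t - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
            for m, s in dates
        )
        pdf.append(density / len(dates))  # normalize to unit total area
    return pdf

# Invented example dates (cal BP); three cluster near one boundary.
dates = [(11500, 80), (11550, 60), (11480, 90), (9000, 120)]
grid = list(range(8000, 13000, 50))
pdf = summed_pdf(dates, grid)
peak_age = max(zip(pdf, grid))[1]
print(peak_age)
```

The clustered dates reinforce each other into a single high peak, mimicking how preferential sampling at sedimentation changes sharpens the summed PDF at subdivision borders.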

  18. Initial Development and Piloting of a Learning-Based, Classroom Assessment and Consultation System: New Perspectives on the Rhetoric of Improving Instruction in Higher Education Settings.

    ERIC Educational Resources Information Center

    Loup, Karen S.; And Others

    Results are reported of three years of research and development, piloting, and extended field testing of a classroom-based assessment and professional consultation system used to assess important teaching and learning variables in higher education contexts. Of particular interest is the focus of the total system on enhancing learning and newer…

  19. "Designing Instrument for Science Classroom Learning Environment in Francophone Minority Settings: Accounting for Voiced Concerns among Teachers and Immigrant/Refugee Students"

    ERIC Educational Resources Information Center

    Bolivar, Bathélemy

    2015-01-01

    The three-phase "Instrument for Minority Immigrant Science Learning Environment" (I_MISLE), an 8-scale, 32-item instrument (see Appendix I), when completed by teachers provides an accurate description of existing conditions in classrooms in which immigrant and refugee students are situated. Through the completion of the instrument…

  20. Navigating the Problem Space of Academia: Exploring Processes of Course Design and Classroom Teaching in Postsecondary Settings. WCER Working Paper No. 2014-1

    ERIC Educational Resources Information Center

    Hora, Matthew T.

    2014-01-01

    Policymakers and educators alike increasingly focus on faculty adoption of interactive teaching techniques as a way to improve undergraduate education. Yet little empirical research exists that examines the processes whereby faculty make decisions about curriculum design and classroom teaching in real-world situations. In this study, I use the idea…

  1. The Impact of Brief Teacher Training on Classroom Management and Child Behavior in At-Risk Preschool Settings: Mediators and Treatment Utility

    ERIC Educational Resources Information Center

    Snyder, James; Low, Sabina; Schultz, Tara; Barner, Stacy; Moreno, Desirae; Garst, Meladee; Leiker, Ryan; Swink, Nathan; Schrepferman, Lynn

    2011-01-01

    Teachers from fourteen classrooms were randomly assigned to an adaptation of Incredible Years (IY) teacher training or to teacher training-as-usual. Observations were made of the behavior of 136 target preschool boys and girls nominated by teachers as having many or few conduct problems. Peer and teacher behavior were observed at baseline and post…

  2. The Role of Personal Narrative in Constructing Classroom Curriculum

    ERIC Educational Resources Information Center

    Kouri, Donald A.

    2005-01-01

    This article explores the nature of the author's classroom practice and provides insights about how he constructed and delivered classroom curriculum for the electrical apprenticeship classroom. In this article, the author defines classroom curriculum as the planned and guided learning experiences that take place in a classroom setting with…

  3. My Classroom Physical Activity Pyramid: A Tool for Integrating Movement into the Classroom

    ERIC Educational Resources Information Center

    Orlowski, Marietta; Lorson, Kevin; Lyon, Anna; Minoughan, Susan

    2013-01-01

    The classroom teacher is a critical team member of a comprehensive school physical activity program and an activity-friendly school environment. Students spend more time in the classroom than in any other school setting or environment. Classrooms are busy places, and classroom teachers must make decisions about how to make the best use of their…

  4. Possible calcium centers for hydrogen storage applications: An accurate many-body study by AFQMC calculations with large basis sets

    NASA Astrophysics Data System (ADS)

    Purwanto, Wirawan; Krakauer, Henry; Zhang, Shiwei; Virgus, Yudistira

    2011-03-01

    Weak H2 physisorption energies present a significant challenge to first-principles theoretical modeling and prediction of materials for H storage. There has been controversy regarding the accuracy of DFT on systems involving Ca cations. We use the auxiliary-field quantum Monte Carlo (AFQMC) method to accurately predict the binding energy of the Ca+-4H2 complex. AFQMC scales as Nbasis^3 and has demonstrated accuracy similar to or better than the gold-standard coupled cluster CCSD(T) method. We apply a modified Cholesky decomposition to achieve efficient Hubbard-Stratonovich transformation in AFQMC at large basis sizes. We employ the largest correlation-consistent basis sets available, up to Ca/cc-pCV5Z, to extrapolate to the complete basis limit. The calculated potential energy curve exhibits binding with a double-well structure. Supported by DOE and NSF. Calculations were performed at OLCF Jaguar and CPD.

  5. WebViz: A Web-based Collaborative Interactive Visualization System for Large-Scale Data Sets

    NASA Astrophysics Data System (ADS)

    Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.

    2010-12-01

    WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota's Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built upon over the last 3 1/2 years. The motivation behind WebViz lies primarily with the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to the market, including the use of general-purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data 'on the fly', wherever he or she may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE's custom hierarchical volume rendering software provides high-resolution visualizations on the order of 15 million pixels and has been employed for visualizing data primarily from simulations in astrophysics to geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web- and JavaScript-enabled cell phones. Features in the current version include the ability for users to (1) securely log in, (2) launch multiple visualizations, (3) conduct collaborative visualization sessions, (4) delegate control aspects of a visualization to others, and (5) engage in collaborative chats with other users within the user interface.

  6. Hydraulic behavior of two areas of the Floridan aquifer system characterized by complex hydrogeologic settings and large groundwater withdrawals

    SciTech Connect

    Maslia, M.L.

    1993-03-01

    Two areas of the Floridan aquifer system (FAS) that are characterized by complex hydrogeologic settings and exceedingly large ground-water withdrawals are the Dougherty Plain area of southwest GA and the Glynn County area of southeast GA. In southwest GA, large-scale withdrawals of ground water for agricultural and livestock irrigation amounted to about 148 million gallons per day (mg/d) during 1990. Large-scale pumping in Glynn County, primarily used for industrial purposes and centered in the City of Brunswick, amounted to about 88 mg/d during 1990. In southwest GA, the FAS consists primarily of the Ocala Limestone (OL) of late Eocene age. Confining the aquifer from above is a residual layer (50 ft thick) of sand and clay containing silicified boulders, which is derived from the chemical weathering of the OL. This area is characterized by karst topography marked by numerous depressions and sinkholes, high transmissivity (generally greater than 50,000 feet squared per day), and significant hydraulic connections to overlying streams and lakes. These characteristics, along with the seasonal nature of pumping and mean annual recharge of about 10 inches per year, have prevented permanent, long-term water-level declines. In the Glynn County area, the FAS can be more than 2,600 ft thick, consisting of a sequence of calcareous and dolomitic rocks that are Late Cretaceous to early Miocene in age. The aquifer system is confined above by clastic rocks of Middle Miocene age having an average thickness of 400 ft. This area is characterized by post-depositional tectonic modification of the subsurface as opposed to simple karst development, thick confinement of the aquifer system, and significant amounts of vertical leakage of water from below. These characteristics and heavy, long-term pumping from the Upper Floridan aquifer (UFA) have caused a broad, shallow cone of depression to develop and the upward migration of saltwater to contaminate the freshwater zones of the UFA.

  7. Question Driven Instruction with Classroom Response Technology

    NASA Astrophysics Data System (ADS)

    Gerace, William; Beatty, Ian

    2007-10-01

    Essentially, a classroom response system is technology that: 1) allows an instructor to present a question or problem to the class; 2) allows students to enter their answers into some kind of device; and 3) instantly aggregates and summarizes students' answers for the instructor, usually as a histogram. Most response systems provide additional functionality. Some additional names for this class of system (or for subsets of the class) are classroom communication system (CCS), audience response system (ARS), voting machine system, audience feedback system, and--most ambitiously--CATAALYST system (for ``Classroom Aggregation Technology for Activating and Assessing Learning and Your Students' Thinking''). UMPERG has been teaching with and researching classroom response systems since 1993. We find that the technology has the potential to transform the way we teach science in large lecture settings. CRSs can serve as catalysts for creating a more interactive, student-centered classroom in the lecture hall, thereby allowing students to become more actively involved in constructing and using knowledge. CRSs not only make it easier to engage students in learning activities during lecture but also enhance the communication among students, and between the students and the instructor. This enhanced communication assists the students and the instructor in assessing understanding during class time, and affords the instructor the opportunity to devise instructional interventions that target students' needs as they arise.
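The aggregation step (3) above is simple enough to sketch directly; the answer letters below are invented sample data:

```python
from collections import Counter

# Aggregate student answers into the histogram an instructor would see
# after posing a multiple-choice question to the class.
answers = ["B", "C", "B", "A", "B", "C", "D", "B", "C", "B"]
histogram = Counter(answers)

# A crude text rendering of the summary histogram.
for choice in sorted(histogram):
    print(choice, "#" * histogram[choice])
```

The pedagogical value lies not in this trivial tally but in showing it to the class instantly, so the distribution of answers can drive the ensuing discussion.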

  8. Teaching Cell Biology in the Large-Enrollment Classroom: Methods to Promote Analytical Thinking and Assessment of Their Effectiveness

    ERIC Educational Resources Information Center

    Kitchen, Elizabeth; Bell, John D.; Reeve, Suzanne; Sudweeks, Richard R.; Bradshaw, William S.

    2003-01-01

    A large-enrollment, undergraduate cellular biology lecture course is described whose primary goal is to help students acquire skill in the interpretation of experimental data. The premise is that this kind of analytical reasoning is not intuitive for most people and, in the absence of hands-on laboratory experience, will not readily develop unless…

  9. Clickenomics: Using a Classroom Response System to Increase Student Engagement in a Large-Enrollment Principles of Economics Course

    ERIC Educational Resources Information Center

    Salemi, Michael K.

    2009-01-01

    One of the most important challenges facing college instructors of economics is helping students engage. Engagement is particularly important in a large-enrollment Principles of Economics course, where it can help students achieve a long-lived understanding of how economists use basic economic ideas to look at the world. The author reports how…

  10. Using Technology To Implement Active Learning in Large Classes. Technical Report.

    ERIC Educational Resources Information Center

    Gerace, William J.; Dufresne, Robert J.; Leonard, William J.

    An emerging technology, classroom communication systems (CCSs), has the potential to transform the way we teach science in large-lecture settings. CCSs can serve as catalysts for creating a more interactive, student-centered classroom in the lecture hall, thereby allowing students to become more actively involved in constructing and using…

  11. Spatial Fingerprints of Community Structure in Human Interaction Network for an Extensive Set of Large-Scale Regions

    PubMed Central

    Kallus, Zsófia; Barankai, Norbert; Szüle, János; Vattay, Gábor

    2015-01-01

    Human interaction networks inferred from country-wide telephone activity recordings were recently used to redraw political maps by projecting their topological partitions into geographical space. The results showed remarkable spatial cohesiveness of the network communities and a significant overlap between the redrawn and the administrative borders. Here we present a similar analysis based on one of the most popular online social networks represented by the ties between more than 5.8 million of its geo-located users. The worldwide coverage of their measured activity allowed us to analyze the large-scale regional subgraphs of entire continents and an extensive set of examples for single countries. We present results for North and South America, Europe and Asia. In our analysis we used the well-established method of modularity clustering after an aggregation of the individual links into a weighted graph connecting equal-area geographical pixels. Our results show fingerprints of both of the opposing forces of dividing local conflicts and of uniting cross-cultural trends of globalization. PMID:25993329

  12. Global nonlinear kernel prediction for large data set with a particle swarm-optimized interval support vector regression.

    PubMed

    Ding, Yongsheng; Cheng, Lijun; Pedrycz, Witold; Hao, Kuangrong

    2015-10-01

    A new global nonlinear predictor with a particle swarm-optimized interval support vector regression (PSO-ISVR) is proposed to address three issues (viz., kernel selection, model optimization, kernel method speed) encountered when applying SVR in the presence of large data sets. The novel prediction model can reduce the SVR computing overhead by dividing the input space and adaptively selecting the optimized kernel functions, with optimal SVR parameters obtained by PSO. To quantify the quality of the predictor, its generalization performance and execution speed are investigated based on statistical learning theory. In addition, experiments using synthetic data as well as the stock volume weighted average price are reported to demonstrate the effectiveness of the developed models. The experimental results show that the proposed PSO-ISVR predictor can improve the computational efficiency and the overall prediction accuracy compared with the results produced by the SVR and other regression methods. The proposed PSO-ISVR provides an important tool for nonlinear regression analysis of big data. PMID:25974954
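The PSO half of this scheme can be sketched in isolation. The snippet below is a minimal one-dimensional particle swarm optimizer of the kind used to tune SVR parameters; the quadratic objective stands in for an actual SVR validation-error surface, and all coefficients are conventional illustrative choices, not values from the paper:

```python
import random

# Minimal 1-D particle swarm optimization: each particle is pulled
# toward its own best position and the swarm-wide best position.

def pso(loss, lo, hi, n_particles=20, iters=60, seed=1):
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    best = list(pos)                 # per-particle best positions
    gbest = min(pos, key=loss)       # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (0.7 * vel[i]                      # inertia
                      + 1.5 * r1 * (best[i] - pos[i])   # cognitive pull
                      + 1.5 * r2 * (gbest - pos[i]))    # social pull
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))  # clamp to bounds
            if loss(pos[i]) < loss(best[i]):
                best[i] = pos[i]
                if loss(pos[i]) < loss(gbest):
                    gbest = pos[i]
    return gbest

# Stand-in objective with a known minimum at x = 2.5 (imagine it is an
# SVR cross-validation error as a function of a kernel parameter).
found = pso(lambda x: (x - 2.5) ** 2, lo=0.0, hi=10.0)
print(round(found, 2))
```

In the predictor described above, the `loss` callback would instead train an interval SVR on a data partition and return its validation error.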

  13. Determination of counterfeit medicines by Raman spectroscopy: Systematic study based on a large set of model tablets.

    PubMed

    Neuberger, Sabine; Neusüß, Christian

    2015-08-10

    In the last decade, counterfeit pharmaceutical products have become a widespread issue for public health. Raman spectroscopy, which is easy, non-destructive and information-rich, is particularly suitable as a screening method for the fast characterization of chemicals and pharmaceuticals. Combined with chemometric techniques, it provides a powerful tool for the analysis and determination of counterfeit medicines. Here, for the first time, a systematic study of the benefits and limitations of Raman spectroscopy for the analysis of pharmaceutical samples was performed on a large set of model tablets varying with respect to chemical and physical properties. To discriminate between the different mixtures, a combination of dispersive Raman spectroscopy performed in backscattering mode and principal component analysis was used. Discrimination between samples with different coatings, varying amounts of active pharmaceutical ingredients and a diversity of excipients was possible. However, it was not possible to distinguish between variations in press power, mixing quality and granulation. As a showcase, the change in Raman signals of commercial acetylsalicylic acid effervescent tablets under five different storage conditions was monitored. It was possible to detect early small chemical changes caused by inappropriate storage conditions. These results demonstrate that Raman spectroscopy combined with multivariate data analysis provides a powerful methodology for the fast and easy characterization of genuine and counterfeit medicines. PMID:25956227

  14. Identifying Cognate Binding Pairs among a Large Set of Paralogs: The Case of PE/PPE Proteins of Mycobacterium tuberculosis

    PubMed Central

    Riley, Robert; Pellegrini, Matteo; Eisenberg, David

    2008-01-01

    We consider the problem of how to detect cognate pairs of proteins that bind when each belongs to a large family of paralogs. To illustrate the problem, we have undertaken a genomewide analysis of interactions of members of the PE and PPE protein families of Mycobacterium tuberculosis. Our computational method uses structural information, operon organization, and protein coevolution to infer the interaction of PE and PPE proteins. Some 289 PE/PPE complexes were predicted out of a possible 5,590 PE/PPE pairs genomewide. Thirty-five of these predicted complexes were also found to have correlated mRNA expression, providing additional evidence for these interactions. We show that our method is applicable to other protein families, by analyzing interactions of the Esx family of proteins. Our resulting set of predictions is a starting point for genomewide experimental interaction screens of the PE and PPE families, and our method may be generally useful for detecting interactions of proteins within families having many paralogs. PMID:18787688

  15. Spatial fingerprints of community structure in human interaction network for an extensive set of large-scale regions.

    PubMed

    Kallus, Zsófia; Barankai, Norbert; Szüle, János; Vattay, Gábor

    2015-01-01

    Human interaction networks inferred from country-wide telephone activity recordings were recently used to redraw political maps by projecting their topological partitions into geographical space. The results showed remarkable spatial cohesiveness of the network communities and a significant overlap between the redrawn and the administrative borders. Here we present a similar analysis based on one of the most popular online social networks represented by the ties between more than 5.8 million of its geo-located users. The worldwide coverage of their measured activity allowed us to analyze the large-scale regional subgraphs of entire continents and an extensive set of examples for single countries. We present results for North and South America, Europe and Asia. In our analysis we used the well-established method of modularity clustering after an aggregation of the individual links into a weighted graph connecting equal-area geographical pixels. Our results show fingerprints of both of the opposing forces of dividing local conflicts and of uniting cross-cultural trends of globalization. PMID:25993329
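The modularity score behind the clustering step above can be sketched on a toy graph: Q compares the edge weight inside communities with its expectation in a random graph having the same weighted degrees. The five-node "pixel" graph and its two candidate partitions below are invented for illustration:

```python
# Modularity Q of a partition of an undirected weighted graph:
# Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] * delta(c_i, c_j)

def modularity(edges, community):
    """edges: iterable of (u, v, weight); community: node -> label."""
    two_m = sum(2 * w for _, _, w in edges)  # total weighted degree
    degree = {}
    for u, v, w in edges:
        degree[u] = degree.get(u, 0) + w
        degree[v] = degree.get(v, 0) + w
    # observed fraction of edge weight falling inside communities
    q = sum(2 * w for u, v, w in edges if community[u] == community[v]) / two_m
    # minus the expected fraction under the configuration null model
    for u in degree:
        for v in degree:
            if community[u] == community[v]:
                q -= degree[u] * degree[v] / two_m ** 2
    return q

edges = [("a", "b", 3), ("b", "c", 2), ("c", "a", 2), ("d", "e", 4), ("c", "d", 1)]
spatially_cohesive = {"a": 0, "b": 0, "c": 0, "d": 1, "e": 1}
scrambled = {"a": 0, "b": 1, "c": 0, "d": 1, "e": 0}
print(modularity(edges, spatially_cohesive) > modularity(edges, scrambled))
```

Modularity clustering searches over partitions for high Q; in the study above the nodes are equal-area geographical pixels and the edge weights are aggregated social ties, so high-Q partitions trace out spatially cohesive regions.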

  16. RelMon: A General Approach to QA, Validation and Physics Analysis through Comparison of large Sets of Histograms

    NASA Astrophysics Data System (ADS)

    Piparo, Danilo

    2012-12-01

    The estimation of the compatibility of large amounts of histogram pairs is a recurrent problem in high energy physics. The issue is common to several different areas, from software quality monitoring to data certification, preservation and analysis. Given two sets of histograms, it is very important to be able to scrutinize the outcome of several goodness of fit tests, obtain a clear answer about the overall compatibility, easily spot the single anomalies and directly access the concerned histogram pairs. This procedure must be automated in order to reduce the human workload, therefore improving the process of identification of differences which is usually carried out by a trained human mind. Some solutions to this problem have been proposed, but they are experiment specific. RelMon depends only on ROOT and offers several goodness of fit tests (e.g. chi-squared or Kolmogorov-Smirnov). It produces highly readable web reports, in which aggregations of the comparisons rankings are available as well as all the plots of the single histogram overlays. The comparison procedure is fully automatic and scales smoothly towards ensembles of millions of histograms. Examples of RelMon utilisation within the regular workflows of the CMS collaboration and the advantages therewith obtained are described. Its interplay with the data quality monitoring infrastructure is illustrated as well as its role in the QA of the event reconstruction code, its integration in the CMS software release cycle process, CMS user data analysis and dataset validation.
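The core of this workflow can be sketched under simplifying assumptions: score every histogram pair with a goodness-of-fit statistic and rank the pairs so the least compatible ones surface first. RelMon itself relies on ROOT's tests (e.g. chi-squared, Kolmogorov-Smirnov); the plain per-bin chi-squared and the histogram names below are invented stand-ins:

```python
# Score histogram pairs and rank them by incompatibility, mimicking
# the aggregation step of an automated comparison report.

def chi2_per_bin(h1, h2):
    """Symmetric chi-squared per bin for two equally binned histograms."""
    terms = [(a - b) ** 2 / (a + b) for a, b in zip(h1, h2) if a + b > 0]
    return sum(terms) / len(terms) if terms else 0.0

pairs = {
    "pt_spectrum": ([100, 80, 40, 10], [98, 83, 38, 11]),
    "eta_dist":    ([50, 60, 55, 52], [20, 90, 30, 77]),
    "n_vertices":  ([30, 45, 25, 8], [31, 44, 26, 7]),
}

# Least compatible pairs first, as a validation report would list them.
ranking = sorted(pairs, key=lambda k: chi2_per_bin(*pairs[k]), reverse=True)
print(ranking)
```

Scaled to millions of pairs, this ranking is what lets a reviewer jump straight to the anomalous overlays instead of eyeballing every plot.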

  17. Automatic detection of rate change in large data sets with an unsupervised approach: the case of influenza viruses.

    PubMed

    Labonté, Kasandra; Aris-Brosou, Stéphane

    2016-04-01

    Influenza viruses evolve at such a high rate that vaccine recommendations need to be changed, but not quite on a regular basis. This observation suggests that the rate of evolution of these viruses is not constant through time, which begs the question as to when such rate changes occur, and whether they do so independently of the host in which the viruses circulate and (or) independently of their subtype. To address these outstanding questions, we introduce a novel heuristic, Mclust*, that is based on a two-tier clustering approach in a phylogenetic context to estimate (i) absolute rates of evolution and (ii) when rate change occurs. We employ the novel approach to compare the two influenza surface proteins, hemagglutinin and neuraminidase, that circulated in avian, human, and swine hosts between 1960 and 2014 in two subtypes: H3N2 and H1N1. We show that the algorithm performs well in most conditions, accounts for phylogenetic uncertainty by means of bootstrapping, and scales up to analyze very large data sets. Our results show that our approach is robust to the time-dependent artifact of rate estimation, and confirm pervasive punctuated evolution across hosts and subtypes. As such, the novel approach can potentially detect when vaccine composition needs to be updated. PMID:26966881

  18. Early Miocene Kirka-Phrigian caldera, western Anatolia - an example of large volume silicic magma generation in extensional setting

    NASA Astrophysics Data System (ADS)

    Seghedi, Ioan; Helvacı, Cahit

    2014-05-01

    Large rhyolitic ignimbrite occurrences are closely connected to the Early Miocene initiation of extensional processes in central-west Anatolia along the Taşvanlı-Afyon zones. Field correlations and petrographical, geochemical and geochronological data lead to a substantial reinterpretation of the ignimbrite surrounding the Kırka area, known for its world-class borate deposits, as representing the climactic event of a caldera collapse, unknown up to now and newly named the "Kırka-Phrigian caldera". The caldera, which is roughly oval (24 km x 15 km) in shape and one of the largest in Turkey, is inferred to have formed in a single-stage collapse event, at ~19 Ma, that generated huge-volume extracaldera outflow ignimbrites. Transtensive/distensive tectonic stresses since 25 Ma resulted in the NNW-SSE elongation of the magma chamber and influenced the roughly elliptical shape of the subsided block (caldera floor) belonging to the apex of the Eskişehir-Afyon-Isparta volcanic area. Intracaldera post-collapse sedimentation and volcanism (at ~18 Ma) were controlled through subsidence-related faults, with the generation of a series of volcanic structures (mainly domes) showing a large compositional range from saturated silicic rhyolites and crystal-rich trachytes to undersaturated lamproites. Such a volcanic rock association is typical for lithospheric extension. In this scenario, enriched mantle components within the subcontinental lithospheric mantle begin to melt via decompression melting during the initiation of extension. Interaction of these melts with crustal rocks, fractionation processes and crustal anatexis driven by the heat contained in the ascending mantle melts produced the silicic compositions in a large crustal reservoir. Such silicic melts generated the initial eruptions of the Kırka-Phrigian caldera ignimbrites. The rock volume and geochemical evidence suggest that the silicic volcanic rocks come from a long-lived magma chamber that evolved episodically; after caldera

  19. Development of a large-sample watershed-scale hydrometeorological data set for the contiguous USA: data set characteristics and assessment of regional variability in hydrologic model performance

    NASA Astrophysics Data System (ADS)

    Newman, A. J.; Clark, M. P.; Sampson, K.; Wood, A.; Hay, L. E.; Bock, A.; Viger, R. J.; Blodgett, D.; Brekke, L.; Arnold, J. R.; Hopson, T.; Duan, Q.

    2015-01-01

    We present a community data set of daily forcing and hydrologic response data for 671 small- to medium-sized basins across the contiguous United States (median basin size of 336 km2) that spans a very wide range of hydroclimatic conditions. Area-averaged forcing data for the period 1980-2010 was generated for three basin spatial configurations - basin mean, hydrologic response units (HRUs) and elevation bands - by mapping daily, gridded meteorological data sets to the subbasin (Daymet) and basin polygons (Daymet, Maurer and NLDAS). Daily streamflow data was compiled from the United States Geological Survey National Water Information System. The focus of this paper is to (1) present the data set for community use and (2) provide a model performance benchmark using the coupled Snow-17 snow model and the Sacramento Soil Moisture Accounting Model, calibrated using the shuffled complex evolution global optimization routine. After optimization minimizing daily root mean squared error, 90% of the basins have Nash-Sutcliffe efficiency scores ≥0.55 for the calibration period and 34% ≥ 0.8. This benchmark provides a reference level of hydrologic model performance for a commonly used model and calibration system, and highlights some regional variations in model performance. For example, basins with a more pronounced seasonal cycle generally have a negative low flow bias, while basins with a smaller seasonal cycle have a positive low flow bias. Finally, we find that data points with extreme error (defined as individual days with a high fraction of total error) are more common in arid basins with limited snow and, for a given aridity, fewer extreme error days are present as the basin snow water equivalent increases.
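The benchmark score used above, the Nash-Sutcliffe efficiency (NSE), is one minus the ratio of model error variance to the variance of the observations; a perfect model scores 1 and a model no better than the observed mean scores 0. A minimal sketch, with invented flow values rather than data from the 671 basins:

```python
# Nash-Sutcliffe efficiency: NSE = 1 - sum((o - s)^2) / sum((o - mean_o)^2)

def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# Invented daily streamflow values (e.g. m^3/s) for illustration.
obs = [12.0, 15.0, 30.0, 22.0, 14.0, 11.0]
sim = [13.0, 14.5, 27.0, 24.0, 15.0, 10.0]
print(round(nse(obs, sim), 3))  # -> 0.939
```

Against this metric, the paper's benchmark thresholds (90% of basins at NSE >= 0.55, 34% at >= 0.8) give a concrete sense of what "reference-level" hydrologic model performance means.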

  20. Classroom Network Technology as a Support for Systemic Mathematics Reform: The Effects of TI MathForward on Student Achievement in a Large, Diverse District

    ERIC Educational Resources Information Center

    Penuel, William; Singleton, Corinne; Roschelle, Jeremy

    2011-01-01

    Low-cost, portable classroom network technologies have shown great promise in recent years for improving teaching and learning in mathematics. This paper explores the impacts on student learning in mathematics when a program to introduce network technologies into mathematics classrooms is integrated into a systemic reform initiative at the…

  1. In silico assessment of adverse effects of a large set of 6-fluoroquinolones obtained from a study of tuberculosis chemotherapy.

    PubMed

    Tusar, Marjan; Minovski, Nikola; Fjodorova, Natalja; Novic, Marjana

    2012-09-01

    Among the different chemotherapeutic classes available today, the 6-fluoroquinolone (6-FQ) antibacterials are still one of the most effective cures in fighting tuberculosis (TB). Nowadays, the development of novel 6-FQs for treatment of TB mainly depends on understanding how the structural modifications of the main quinolone scaffold at specific positions affect the anti-mycobacterial activity. Alongside the structure-activity relationship (SAR) studies of the 6-FQ antibacterials, which can be considered as a golden rule in the development of novel active antitubercular 6-FQs, the structure side effects relationship (SSER) of these drugs must also be taken into account. In the present study we focus on a proficient implementation of the existing knowledge-based expert systems for design of novel 6-FQ antibacterials with possible enhanced biological activity against Mycobacterium tuberculosis as well as lower toxicity. Following the SAR in silico studies of the quinolone antibacterials against M. tuberculosis performed in our laboratory, a large set of 6-FQs was selected. Several new 6-FQ derivatives were proposed as drug candidates for further research and development. The 6-FQs identified as potentially effective against M. tuberculosis were subjected to an additional SSER study for prediction of their toxicological profile. The assessment of structurally-driven adverse effects which might hamper the potential of new drug candidates is mandatory for an effective drug design. We applied publicly available knowledge-based (expert) systems and Quantitative Structure-Activity Relationship (QSAR) models in order to prepare a priority list of active compounds. A preferred order of drug candidates was obtained, so that the less harmful candidates were identified for further testing. The TOXTREE expert system as well as some QSAR models developed in the framework of the EC-funded project CAESAR were used to assess toxicity. CAESAR models were developed according to the OECD

  2. Hyperview-fast, economical access to large data sets: a system for the archiving and distribution of hyperspectral data sets and derived products

    NASA Astrophysics Data System (ADS)

    Lurie, Joan B.

    1996-12-01

    TRW, under a Small Satellite Technology Initiative (SSTI) contract, is building the Lewis satellite. The principal sensor on Lewis is a hyperspectral imaging spectrometer. Part of the SSTI mission is to establish the commercial and educational utility of this data and the hyperspectral data already being acquired on airborne platforms. Essential requirements are rapid availability (after data acquisition) and easy accessibility to a catalog of images and imagery products. Each image is approximately 256 by 512 pixels with 384 bands of data acquired at each pixel. For some applications, some users will want the entire data sets; in other cases partial data sets (e.g. three band images) will be all that a user can handle or need for a given application. In order to make the most effective use of this new imagery and justify the cost of collecting it, we must find ways to make the information it contains more readily accessible to an ever broadening community of potential users. Tools are needed to store, access, and communicate the data more efficiently, to place it in context, and to derive both qualitative and quantitative information from it. A variety of information products which address the specific needs of particular user communities will be derived from the imagery. The data is unique in its ability to provide high spatial and spectral resolution simultaneously, and shows great promise in both military and civilian applications. A data management and analysis system has been built at TRW. This development has been prompted by the business opportunities, by the series of instruments built here and by the availability of data from other instruments. The products of the processing system have been shown to prospective customers in the U.S. and abroad. The system has been used to process data produced by TRW sensors and other instruments. This paper provides an overview of the TRW hyperspectral collection, data handling and exploitation capability.
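
The pixel and band counts in this abstract make the case for partial data sets concrete. A back-of-the-envelope sketch, assuming 2 bytes per sample (the abstract does not state the bit depth, so that constant is an assumption):

```python
# Rough size of one hyperspectral scene versus a three-band extract.
ROWS, COLS, BANDS = 256, 512, 384  # from the abstract
BYTES_PER_SAMPLE = 2  # assumption: 12-16 bit samples stored as 2 bytes

full_cube_bytes = ROWS * COLS * BANDS * BYTES_PER_SAMPLE
three_band_bytes = ROWS * COLS * 3 * BYTES_PER_SAMPLE

print(full_cube_bytes // 2**20, "MiB full cube")    # 96 MiB
print(three_band_bytes // 2**10, "KiB three-band")  # 768 KiB
```

Under that assumption, a three-band product is roughly 1/128th the size of the full cube, which is why derived products suffice for many users.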

  3. Classroom Management in Diverse Classrooms

    ERIC Educational Resources Information Center

    Milner, H. Richard, IV; Tenore, F. Blake

    2010-01-01

    Classroom management continues to be a serious concern for teachers and especially in urban and diverse learning environments. The authors present the culturally responsive classroom management practices of two teachers from an urban and diverse middle school to extend the construct, culturally responsive classroom management. The principles that…

  4. Empirical Mining of Large Data Sets Already Helps to Solve Practical Ecological Problems; A Panoply of Working Examples (Invited)

    NASA Astrophysics Data System (ADS)

    Hargrove, W. W.; Hoffman, F. M.; Kumar, J.; Spruce, J.; Norman, S. P.

    2013-12-01

    Here we present diverse examples where empirical mining and statistical analysis of large data sets have already been shown to be useful for a wide variety of practical decision-making problems within the realm of large-scale ecology. Because a full understanding and appreciation of particular ecological phenomena are possible only after hypothesis-directed research regarding the existence and nature of that process, some ecologists may feel that purely empirical data harvesting may represent a less-than-satisfactory approach. Restricting ourselves exclusively to process-driven approaches, however, may actually slow progress, particularly for more complex or subtle ecological processes. We may not be able to afford the delays caused by such directed approaches. Rather than attempting to formulate and ask every relevant question correctly, empirical methods allow trends, relationships and associations to emerge freely from the data themselves, unencumbered by a priori theories, ideas and prejudices that have been imposed upon them. Although they cannot directly demonstrate causality, empirical methods can be extremely efficient at uncovering strong correlations with intermediate "linking" variables. In practice, these correlative structures and linking variables, once identified, may provide sufficient predictive power to be useful themselves. Such correlation "shadows" of causation can be harnessed by, e.g., Bayesian Belief Nets, which bias ecological management decisions, made with incomplete information, toward favorable outcomes. Empirical data-harvesting also generates a myriad of testable hypotheses regarding processes, some of which may even be correct. Quantitative statistical regionalizations based on quantitative multivariate similarity have lent insights into carbon eddy-flux direction and magnitude, wildfire biophysical conditions, phenological ecoregions useful for vegetation type mapping and monitoring, forest disease risk maps (e.g., sudden oak

  5. Rates and Mechanisms of Solidification in Large Magma Bodies: Implications for Melt Extraction in all Tectonic Settings

    NASA Astrophysics Data System (ADS)

    VanTongeren, J. A.

    2013-12-01

    As is observed in both experiment and theory, in the absence of hydrothermal convection, the majority of magma chamber heat loss occurs via conduction through the roof of the intrusion and into the cold country rock above. The formation of an upper solidification front (or Upper Border Series, UBS), recorded in the rocks both geochemically and texturally, is a natural outcome of the progression of the solidification front from the cold roof to the hot center of the magma chamber. There are, however, a few unique layered mafic intrusions for which little or no UBS exists. In this study, I examine the thermal evolution and crystallization rates of several classic layered intrusions as it is recorded in the extent of the preserved UBS. For those intrusions that have experienced crystallization at the roof, such as the Skaergaard Intrusion, the development of a UBS reduces the temperature gradient at the roof and effectively slows the rate of heat loss from the main magma body. However, for those intrusions that do not have an UBS, such as the Bushveld Complex, the cooling rate is controlled only by the maximum rate of conductive heat loss through the overlying roof rocks, which decreases with time. The implications are two-fold: (1) The relative thickness of the UBS in large intrusions may be the key to quantifying their cooling and solidification rates; and (2) The nature of the magma mush zone near the roof of an intrusion may depend principally on the long-term thermal evolution of the magma body. Particularly at the end stages of crystallization, when the liquids are likely to be highly evolved and high viscosities may inhibit convection, intrusions lacking a well-defined UBS may provide important insights into the mechanics of crystal-liquid separation, melt extraction, and compaction in felsic plutons as well as mafic intrusions. These results are important for long-lived (>500 kyr) or repeatedly replenished magma chambers in all tectonic settings.

  6. Teaching Cell Biology in the Large-Enrollment Classroom: Methods to Promote Analytical Thinking and Assessment of Their Effectiveness

    PubMed Central

    Kitchen, Elizabeth; Bell, John D.; Reeve, Suzanne; Sudweeks, Richard R.; Bradshaw, William S.

    2003-01-01

    A large-enrollment, undergraduate cellular biology lecture course is described whose primary goal is to help students acquire skill in the interpretation of experimental data. The premise is that this kind of analytical reasoning is not intuitive for most people and, in the absence of hands-on laboratory experience, will not readily develop unless instructional methods and examinations specifically designed to foster it are employed. Promoting scientific thinking forces changes in the roles of both teacher and student. We describe didactic strategies that include directed practice of data analysis in a workshop format, active learning through verbal and written communication, visualization of abstractions diagrammatically, and the use of ancillary small-group mentoring sessions with faculty. The implications for a teacher in reducing the breadth and depth of coverage, becoming coach instead of lecturer, and helping students to diagnose cognitive weaknesses are discussed. In order to determine the efficacy of these strategies, we have carefully monitored student performance and have demonstrated a large gain in a pre- and posttest comparison of scores on identical problems, improved test scores on several successive midterm examinations when the statistical analysis accounts for the relative difficulty of the problems, and higher scores in comparison to students in a control course whose objective was information transfer, not acquisition of reasoning skills. A novel analytical index (student mobility profile) is described that demonstrates that this improvement was not random, but a systematic outcome of the teaching/learning strategies employed. An assessment of attitudes showed that, in spite of finding it difficult, students endorse this approach to learning, but also favor curricular changes that would introduce an analytical emphasis earlier in their training. PMID:14506506

  7. The Nonsexist Classroom. Primary Place.

    ERIC Educational Resources Information Center

    Taus, Kay; Spann, Mary Beth

    1992-01-01

    Presents strategies to help teachers keep elementary classrooms free of sex-role stereotyping. The article explains how to set the tone, observe classroom behavior, share nonsexist lessons, provide role models, modify sexist statements, connect with parents, discuss female heroes, reverse traditional roles, use nonsexist photo files, and make role…

  8. A Large-Scale Inquiry-Based Astronomy Intervention Project: Impact on Students' Content Knowledge Performance and Views of their High School Science Classroom

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena; Deehan, James

    2015-08-01

    In this paper, we present the results from a study of the impact on students involved in a large-scale inquiry-based astronomical high school education intervention in Australia. Students in this intervention were led through an educational design allowing them to undertake an investigative approach to understanding the lifecycle of stars more aligned with the `ideal' picture of school science. Through the use of two instruments, one focused on content knowledge gains and the other on student views of school science, we explore the impact of this design. Overall, students made moderate content knowledge gains, although these gains were heavily dependent on the individual teacher, the number of times a teacher implemented the intervention, and the depth to which an individual teacher went with the provided materials. In terms of students' views, there were significant global changes in their views of their experience of the science classroom. However, there were some areas of no change, or of slightly negative change, some of which were expected and some of which were not. From these results, we comment on the necessity of sustained long-period implementations rather than single interventions, the requirement for similarly sustained professional development and the importance of monitoring the impact of inquiry-based implementations. This is especially important as inquiry-based approaches to science are required by many new curriculum reforms, most notably in this context, the new Australian curriculum currently being rolled out.

  9. Is Our Classroom an Ecological Place?

    ERIC Educational Resources Information Center

    Xia, Wang

    2006-01-01

    The essence of ecology is life and its diversity, integrity, openness and coexistence. When one contemplates and analyzes the classroom from the perspective of ecology, the classroom should contain open-ended and multiple goals instead of a single, pre-set goal; the classroom should be more flexible, allowing great diversity instead of being narrow-minded,…

  10. Spectral analysis and cross-correlation of very large seismic data-sets at the persistently restless Telica Volcano, Nicaragua.

    NASA Astrophysics Data System (ADS)

    Rodgers, Mel; Roman, Diana; Geirsson, Halldor; LaFemina, Peter; Munoz, Angelica; Tenorio, Virginia

    2014-05-01

    Telica Volcano, Nicaragua, is a persistently restless volcano (PRV) with daily seismicity rates that can vary from less than ten events per day to over a thousand events per day. Seismicity rates show little clear correlation with eruptive episodes. This presents a challenge for volcano monitoring and highlights the need for a greater understanding of the patterns of seismicity surrounding eruptive activity at Telica and other PRVs. Multi-parameter seismic investigations, including spectral and multiplet analysis, may provide important precursory information, but are challenging given such high rates of seismicity. We present a program 'peakmatch' that can effectively handle the cross-correlation of hundreds of thousands of events and identify multiplets. In addition, frequency ratios, basic spectral information, and amplitudes can be rapidly calculated for very large seismic data sets. An investigation of the seismic characteristics surrounding the 2011 phreatic eruption at Telica shows an unusual pattern of seismicity. Rather than a precursory increase in seismicity, as is observed prior to many volcanic eruptions, we observe a decrease in seismicity many months before the eruption. Spectral analysis indicates that during periods with high seismicity there are events with a broad range of frequencies, and that during periods of low seismicity there is a progressive loss of events with lower frequency energy (< 3 Hz). Multiplet analysis indicates that during periods with high seismicity there is a high degree of waveform correlation, and that during periods with low seismicity there is a low degree of waveform correlation. We suggest that these patterns of seismicity relate to a cyclic transition between open-system and closed-system degassing. Open-system degassing is observed seismically as periods with high event rates, a broad range of frequency content and high waveform correlation. A transition to closed-system degassing could be via sealing of fluid
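
The core operation behind multiplet identification of this kind is normalized waveform cross-correlation followed by clustering on a correlation threshold. A minimal sketch of that general idea (not the 'peakmatch' implementation itself; the threshold and the toy "waveforms" below are invented):

```python
import numpy as np

def xcorr_max(a, b):
    """Peak of the normalized cross-correlation of two equal-length waveforms
    (1.0 at some lag for identical waveforms)."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.max(np.correlate(a, b, mode="full")))

def find_multiplets(events, threshold=0.8):
    """Greedy multiplet grouping: each event joins the first cluster whose
    master (first) event it matches above the correlation threshold,
    otherwise it seeds a new cluster."""
    clusters = []
    for i, event in enumerate(events):
        for cluster in clusters:
            if xcorr_max(events[cluster[0]], event) >= threshold:
                cluster.append(i)
                break
        else:
            clusters.append([i])
    return clusters

# Invented signals: two nearly identical "events" and one burst of noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
ev1 = np.sin(2 * np.pi * 5.0 * t)
ev2 = ev1 + 0.05 * rng.standard_normal(t.size)
ev3 = rng.standard_normal(t.size)
print(find_multiplets([ev1, ev2, ev3]))
```

Scaling this greedy pass to hundreds of thousands of events is exactly the engineering problem the abstract says 'peakmatch' addresses.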

  11. How do you assign persistent identifiers to extracts from large, complex, dynamic data sets that underpin scholarly publications?

    NASA Astrophysics Data System (ADS)

    Wyborn, Lesley; Car, Nicholas; Evans, Benjamin; Klump, Jens

    2016-04-01

    Persistent identifiers in the form of a Digital Object Identifier (DOI) are becoming more mainstream, assigned at both the collection and dataset level. For static datasets, this is a relatively straightforward matter. However, many new data collections are dynamic, with new data being appended, models and derivative products being revised with new data, or the data itself revised as processing methods are improved. Further, because data collections are becoming accessible as services, researchers can log in and dynamically create user-defined subsets for specific research projects: they also can easily mix and match data from multiple collections, each of which can have a complex history. Inevitably extracts from such dynamic data sets underpin scholarly publications, and this presents new challenges. The National Computational Infrastructure (NCI) has been experiencing and making progress towards addressing these issues. The NCI is a large node of the Research Data Services initiative (RDS) of the Australian Government's research infrastructure, which currently makes available over 10 PBytes of priority research collections, ranging from geosciences, geophysics, environment, and climate, through to astronomy, bioinformatics, and social sciences. Data are replicated to, or are produced at, NCI and then processed there to higher-level data products or directly analysed. Individual datasets range from multi-petabyte computational models and large volume raster arrays, down to gigabyte size, ultra-high resolution datasets. To facilitate access, maximise reuse and enable integration across the disciplines, datasets have been organized on a platform called the National Environmental Research Data Interoperability Platform (NERDIP). Combined, the NERDIP data collections form a rich and diverse asset for researchers: their co-location and standardization optimises the value of existing data, and forms a new resource to underpin data-intensive Science. New publication

  12. A Zebra in the Classroom.

    ERIC Educational Resources Information Center

    Leake, Devin; Morvillo, Nancy

    1998-01-01

    Describes the care and breeding of zebra fish, suggests various experiments and observations easily performed in a classroom setting, and provides some ideas to further student interest and exploration of these organisms. (DDR)

  13. Photometric selection of quasars in large astronomical data sets with a fast and accurate machine learning algorithm

    NASA Astrophysics Data System (ADS)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2014-03-01

    Future astronomical surveys will produce data on ~10^8 objects per night. In order to characterize and classify these sources, we will require algorithms that scale linearly with the size of the data, that can be easily parallelized and where the speedup of the parallel algorithm will be linear in the number of processing cores. In this paper, we present such an algorithm and apply it to the question of colour selection of quasars. We use non-parametric Bayesian classification and a binning algorithm implemented with hash tables (BASH tables). We show that this algorithm's run time scales linearly with the number of test set objects and is independent of the number of training set objects. We also show that it has the same classification accuracy as other algorithms. For current data set sizes, it is up to three orders of magnitude faster than commonly used naive kernel-density-estimation techniques and it is estimated to be about eight times faster than the current fastest algorithm using dual kd-trees for kernel density estimation. The BASH table algorithm scales linearly with the size of the test set data only, and so for future larger data sets, it will be even faster compared to other algorithms which all depend on the size of the test set and the size of the training set. Since it uses linear data structures, it is easier to parallelize compared to tree-based algorithms and its speedup is linear in the number of cores unlike tree-based algorithms whose speedup plateaus after a certain number of cores. Moreover, due to the use of hash tables to implement the binning, the memory usage is very small. While our analysis is for the specific problem of selection of quasars, the ideas are general and the BASH table algorithm can be applied to any density-estimation problem involving sparse high-dimensional data sets. Since sparse high-dimensional data sets are a common type of scientific data set, this method has the potential to be useful in a broad range of
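
The key idea behind binning with hash tables is that only the occupied cells of a high-dimensional grid are stored, so lookups are O(1) and memory tracks the data rather than the grid. A minimal sketch of that general idea (not the authors' BASH-table classifier; the bin width and points below are invented):

```python
from collections import defaultdict

def build_bins(points, width):
    """Hash each point into a grid cell keyed by its integer bin coordinates.
    Only occupied cells are stored, so memory scales with the number of
    points, not with the astronomically large number of cells in the grid."""
    table = defaultdict(int)
    for p in points:
        table[tuple(int(x // width) for x in p)] += 1
    return table

def binned_density(table, n, width, query):
    """Density estimate at `query`: count in its cell / (n * cell volume).
    One hash lookup, independent of the training-set size."""
    key = tuple(int(x // width) for x in query)
    return table.get(key, 0) / (n * width ** len(query))

# Invented 2-D points; bin width 0.5 gives a 2x2 grid over the unit square.
pts = [(0.1, 0.2), (0.3, 0.1), (0.15, 0.35), (0.9, 0.9)]
table = build_bins(pts, width=0.5)
print(binned_density(table, len(pts), 0.5, (0.2, 0.2)))  # cell holds 3 of 4 points
```

Because the table is a flat hash map rather than a tree, partitioning it across cores is straightforward, which matches the parallelization argument in the abstract.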

  14. Impacts of Flipped Classroom in High School Health Education

    ERIC Educational Resources Information Center

    Chen, Li-Ling

    2016-01-01

    As advanced technology increasingly infiltrated into classroom, the flipped classroom has come to light in secondary educational settings. The flipped classroom is a new instructional approach that intends to flip the traditional teacher-centered classroom into student centered. The purpose of this research is to investigate the impact of the…

  15. Pre-Service Teachers and Classroom Authority

    ERIC Educational Resources Information Center

    Pellegrino, Anthony M.

    2010-01-01

    This study examined the classroom practices of five pre-service teachers from three secondary schools in a large southeastern state. Through classroom observations, survey responses, reviews of reflection logs, and focus-group interview responses, we centered on the issue of developing classroom authority as a means to effective classroom…

  16. Improving Interactions in the Large Language Class.

    ERIC Educational Resources Information Center

    Raymond, Patricia M.; Raymond, Jacques; Pilon, Daniel

    1998-01-01

    Describes a prototypical microcomputer system that improves the interactions between teacher and large language classes in a traditional language classroom setting. This system achieves dynamic interactions through multiple student/professor interventions, immediate and delayed feedback, and individual teacher/student conferences. The system uses…

  17. The Paperless Music Classroom

    ERIC Educational Resources Information Center

    Giebelhausen, Robin

    2016-01-01

    In an age where the world is becoming ever more aware of paper consumption, educators are turning toward technology to cut back on paper waste. Besides the environmental reasons, a paperless music classroom helps students develop their musicianship in new and exciting ways. This article will look at the considerations for setting up a paperless…

  18. Classroom Management That Works

    ERIC Educational Resources Information Center

    Cleve, Lauren

    2012-01-01

    The purpose of this study was to find the best classroom management strategies to use when teaching in an elementary school setting. I wanted to identify the best possible management tools for a variety of age groups as well as meet educational standards. Through my research I found that different approaches in different grade levels are an important…

  19. 'Flipping' the Classroom.

    PubMed

    Billings, Diane M

    2016-09-01

    This article is one in a series on the roles of adjunct clinical faculty and preceptors, who teach nursing students and new graduates to apply knowledge in clinical settings. This article describes the benefits and challenges of using a "flipped" classroom to promote active engagement among learners and more meaningful interaction between learners and educators. PMID:27560340

  20. Learning in Tomorrow's Classrooms

    ERIC Educational Resources Information Center

    Bowman, Richard F.

    2015-01-01

    Teaching today remains the most individualistic of all the professions, with educators characteristically operating in a highly fragmented world of "their" courses, "their" skills, and "their" students. Learning will occur in the classrooms of the future through a sustainable set of complementary capabilities:…

  1. Novel method to construct large-scale design space in lubrication process utilizing Bayesian estimation based on a small-scale design-of-experiment and small sets of large-scale manufacturing data.

    PubMed

    Maeda, Jin; Suzuki, Tatsuya; Takayama, Kozo

    2012-12-01

    A large-scale design space was constructed using a Bayesian estimation method with a small-scale design of experiments (DoE) and small sets of large-scale manufacturing data without enforcing a large-scale DoE. The small-scale DoE was conducted using various Froude numbers (X(1)) and blending times (X(2)) in the lubricant blending process for theophylline tablets. The response surfaces, the design space, and their reliability for the compression rate of the powder mixture (Y(1)), tablet hardness (Y(2)), and dissolution rate (Y(3)) on a small scale were calculated using multivariate spline interpolation, a bootstrap resampling technique, and self-organizing map clustering. The constant Froude number was applied as a scale-up rule. Three experiments under an optimal condition and two experiments under other conditions were performed on a large scale. The response surfaces on the small scale were corrected to those on a large scale by Bayesian estimation using the large-scale results. Large-scale experiments under three additional sets of conditions showed that the corrected design space was more reliable than that on the small scale, even if there was some discrepancy in the pharmaceutical quality between the manufacturing scales. This approach is useful for setting up a design space in pharmaceutical development when a DoE cannot be performed at a commercial large manufacturing scale. PMID:22356256
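
The Bayesian correction step can be illustrated with the simplest conjugate case: treat the small-scale prediction as a Gaussian prior and let a handful of large-scale runs update it in proportion to their precision. This is a sketch of the principle only, not the authors' spline/bootstrap/self-organizing-map pipeline, and every number below is invented:

```python
def bayes_update(prior_mean, prior_var, observations, obs_var):
    """Conjugate Gaussian update with known observation variance: the
    posterior mean is a precision-weighted blend of the prior mean and the
    data, and the posterior variance shrinks as runs accumulate."""
    n = len(observations)
    post_precision = 1.0 / prior_var + n / obs_var
    post_mean = (prior_mean / prior_var + sum(observations) / obs_var) / post_precision
    return post_mean, 1.0 / post_precision

# Invented numbers: the small-scale model predicts tablet hardness 80 N
# (variance 25); three large-scale runs come in lower, so the corrected
# estimate shifts most of the way toward them.
mean, var = bayes_update(80.0, 25.0, [70.0, 72.0, 71.0], obs_var=4.0)
print(round(mean, 2), round(var, 2))
```

With only three large-scale runs the posterior already sits close to the large-scale data, which is the point of the approach: a small amount of manufacturing-scale evidence corrects a cheaply obtained small-scale surface.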

  2. Classroom Screening.

    ERIC Educational Resources Information Center

    Alpha Plus Corp., Piedmont, CA.

    This classroom screening device was developed by the Circle Preschool First Chance Project, a government-funded program to integrate handicapped children into regular classroom activities, for use in preschools, nursery schools, Head Start centers and other agencies working with young children. It is designed to give a gross measure of a child's…

  3. Classroom Management.

    ERIC Educational Resources Information Center

    Dinsmore, Terri Sue

    This paper is a report of a middle-school teacher's study of classroom management. The teacher/researcher was interested in how some of the techniques in the Kovalik Integrated Thematic Instruction model of training would influence the teacher/researcher's classroom management; the effects of direct instruction within a community circle; the…

  4. Classroom Organization

    ERIC Educational Resources Information Center

    Technology & Learning, 2005

    2005-01-01

    Good organization skills are key to running an efficient classroom, and having the right tools makes it easier to manage all of the tasks, save time, and be more productive. Having the power of information when and where anyone needs it makes a difference in how well any teacher runs the classroom and knows his or her students. A Palm handheld…

  5. Accompanying Readings & Tools for Enhancing Classroom Approaches for Addressing Barriers to Learning: Classroom-Focused Enabling.

    ERIC Educational Resources Information Center

    California Univ., Los Angeles. Center for Mental Health in Schools.

    This publication presents a set of readings and tools that accompany the education modules "Enhancing Classroom Approaches to Addressing Barriers to Learning: Classroom-Focused Enabling." Together, they delineate a preservice/inservice teacher preparation curriculum covering how regular classrooms and schools should be designed to ensure all…

  6. New technique for real-time interface pressure analysis: getting more out of large image data sets.

    PubMed

    Bogie, Kath; Wang, Xiaofeng; Fei, Baowei; Sun, Jiayang

    2008-01-01

    Recent technological improvements have led to increasing clinical use of interface pressure mapping for seating pressure evaluation, which often requires repeated assessments. However, clinical conditions cannot be controlled as closely as research settings, thereby creating challenges to statistical analysis of data. A multistage longitudinal analysis and self-registration (LASR) technique is introduced that emphasizes real-time interface pressure image analysis in three dimensions. Suitable for use in clinical settings, LASR is composed of several modern statistical components, including a segmentation method. The robustness of our segmentation method is also shown. Application of LASR to analysis of data from neuromuscular electrical stimulation (NMES) experiments confirms that NMES improves static seating pressure distributions in the sacral-ischial region over time. Dynamic NMES also improves weight-shifting over time. These changes may reduce the risk of pressure ulcer development. PMID:18712638

  7. New technique for real-time interface pressure analysis: Getting more out of large image data sets

    PubMed Central

    Bogie, Kath; Wang, Xiaofeng; Fei, Baowei; Sun, Jiayang

    2009-01-01

    Recent technological improvements have led to increasing clinical use of interface pressure mapping for seating pressure evaluation, which often requires repeated assessments. However, clinical conditions cannot be controlled as closely as research settings, thereby creating challenges to statistical analysis of data. A multistage longitudinal analysis and self-registration (LASR) technique is introduced that emphasizes real-time interface pressure image analysis in three dimensions. Suitable for use in clinical settings, LASR is composed of several modern statistical components, including a segmentation method. The robustness of our segmentation method is also shown. Application of LASR to analysis of data from neuromuscular electrical stimulation (NMES) experiments confirms that NMES improves static seating pressure distributions in the sacral-ischial region over time. Dynamic NMES also improves weight-shifting over time. These changes may reduce the risk of pressure ulcer development. PMID:18712638

  8. Classroom interactions and science inquiry: A comparative study examining differential implementation of a science program in two middle school classrooms

    NASA Astrophysics Data System (ADS)

    Goldberg, Jennifer Sarah

    This dissertation explores two classroom communities during the implementation of a new environmental science curriculum. The classrooms are similar in that both are located in the same middle school and led by experienced classroom teachers. Despite these similarities, differences in learning outcomes are found in analyses of student pre- and post-science tests in the two rooms. Through videotape analysis of classroom interaction within parallel curricular activities, learning opportunities are contrasted in terms of the social and cognitive organization of science activities and the roles played by teachers, students, and scientists as manifested in their discourse. In one classroom, tasks flow between whole-class discussions and small-group work. Curricular activities are interwoven, with transitions eased as goals are shared with students. Scientific concepts are connected across activities and related to ideas outside the classroom. Furthermore, the classroom community is united, established largely through the teacher's discourse patterns, such as deictics (specifically, inclusive personal pronouns). Moreover, the teacher emphasizes that she is learning alongside the students. In the other classroom, the science period typically centers on whole-class instruction or small-group work, depending on the particular lesson. This organization, accompanied by heavy use of directives, leads to an implicit goal of completing the assigned task. Curricular activities are isolated, with an emphasis on following protocol instructions. Through discursive patterns, such as endearing address terms and exclusive pronouns, a dichotomy is created between teacher and students. As the designated expert, this teacher imparts her knowledge of science to the students. Several implications emerge from this study. Although pre-packaged curricular lessons appear identical on paper, the enacted curriculum differs, even in similar settings. Without…

  9. Free vascularised fibular grafting with OsteoSet®2 demineralised bone matrix versus autograft for large osteonecrotic lesions of the femoral head.

    PubMed

    Feng, Yong; Wang, Shanzhi; Jin, Dongxu; Sheng, Jiagen; Chen, Shengbao; Cheng, Xiangguo; Zhang, Changqing

    2011-04-01

    The aim of this study was to compare the safety and efficacy of OsteoSet®2 DBM with autologous cancellous bone in free vascularised fibular grafting for the treatment of large osteonecrotic lesions of the femoral head. Twenty-four patients (30 hips) with large osteonecrotic lesions of the femoral head (stage IIC in six hips, stage IIIC in 14, and stage IVC in ten, according to the classification system of Steinberg et al.) underwent free vascularised fibular grafting with OsteoSet®2 DBM. This group was retrospectively matched to a group of 24 patients (30 hips) who underwent free vascularised fibular grafting with autologous cancellous bone during the same time period, according to the aetiology, stage, and size of the lesion and the mean preoperative Harris hip score. A prospective case-controlled study was then performed with a mean follow-up duration of 26 months. The results show no statistically significant differences between the two groups in overall clinical outcome or radiographic assessment. Furthermore, no adverse events related to the use of the OsteoSet®2 DBM were observed. The results demonstrate that OsteoSet®2 DBM combined with autograft bone performs as well as autologous bone alone. Therefore, OsteoSet®2 DBM can be used as a safe and effective graft extender in free vascularised fibular grafting for large osteonecrotic lesions of the femoral head. PMID:20012040

  10. Developing a "Semi-Systematic" Approach to Using Large-Scale Data-Sets for Small-Scale Interventions: The "Baby Matterz" Initiative as a Case Study

    ERIC Educational Resources Information Center

    O'Brien, Mark

    2011-01-01

    The appropriateness of using statistical data to inform the design of any given service development or initiative often depends upon judgements regarding scale. Large-scale data sets, perhaps national in scope, whilst potentially important in informing the design, implementation and roll-out of experimental initiatives, will often remain unused…

  11. Moving toward an Empowering Setting in a First Grade Classroom Serving Primarily Working Class and Working Poor Latina/o Children: An Exploratory Analysis

    ERIC Educational Resources Information Center

    Silva, Janelle M.; Langhout, Regina Day

    2016-01-01

    Empowering settings are important places for people to develop leadership skills in order to enact social change. Yet, due to socio-cultural constructions of childhood in the US, especially constructions around working class and working poor children of Color, they are often not seen as capable or competent change agents, or in need of being in…

  12. The Effects of Positive Verbal Reinforcement on the Time Spent outside the Classroom for Students with Emotional and Behavioral Disorders in a Residential Setting

    ERIC Educational Resources Information Center

    Kennedy, Christina; Jolivette, Kristine

    2008-01-01

    To more effectively instruct the entire class, teachers of students with emotional behavioral disorders (EBD) often choose to send students who display inappropriate behavior out of the room. A multiple baseline across settings was used to evaluate the effects of increasing teacher positive verbal reinforcement on the amount of time 2 students…

  13. pXRF quantitative analysis of the Otowi Member of the Bandelier Tuff: Generating large, robust data sets to decipher trace element zonation in large silicic magma chambers

    NASA Astrophysics Data System (ADS)

    Van Hoose, A. E.; Wolff, J.; Conrey, R.

    2013-12-01

    Advances in portable X-ray fluorescence (pXRF) analytical technology have made it possible for high-quality, quantitative data to be collected in a fraction of the time required by standard, non-portable analytical techniques. Not only do these advances reduce analysis time, but data may also be collected in the field in conjunction with sampling. Rhyolitic pumice, being primarily glass, is an excellent material to analyze with this technology. High-quality, quantitative data for elements that are tracers of magmatic differentiation (e.g. Rb, Sr, Y, Nb) can be collected for whole, individual pumices and subsamples of larger pumices in 4 minutes. We have developed a calibration for powdered rhyolite pumice from the Otowi Member of the Bandelier Tuff analyzed with the Bruker Tracer IV pXRF, using Bruker software and influence coefficients for pumice, which measures the following 19 oxides and elements: SiO2, TiO2, Al2O3, FeO*, MnO, CaO, K2O, P2O5, Zn, Ga, Rb, Sr, Y, Zr, Nb, Ba, Ce, Pb, and Th. With this calibration for the pXRF and thousands of individual powdered pumice samples, we have generated a data set unparalleled for any single eruptive unit with known trace element zonation. The Bandelier Tuff of the Valles-Toledo Caldera Complex, Jemez Mountains, New Mexico, is divided into three main eruptive events. For this study, we have chosen the 1.61 Ma, 450 km3 Otowi Member, as it is primarily unwelded and pumice samples are easily accessible. The eruption began with a Plinian phase from a single source located near the center of the current caldera and deposited the Guaje Pumice Bed. The initial Unit A of the Guaje is geochemically monotonous, but Units B through E, co-deposited with ignimbrite, show very strong chemical zonation in trace elements, progressing upwards through the deposits from highly differentiated compositions (Rb ~350 ppm, Nb ~200 ppm) to less differentiated ones (Rb ~100 ppm, Nb ~50 ppm). Co-erupted ignimbrites emplaced during column collapse show…
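    An instrument calibration of the kind described above reduces, at its simplest, to regressing reference concentrations on measured response. The least-squares sketch below uses invented Rb count/concentration pairs for illustration; it is not Bruker's influence-coefficient calibration:

```python
import numpy as np

# Hypothetical calibration pairs: pXRF net Rb counts vs. reference Rb (ppm)
# for powdered pumice standards; values are illustrative, not Bruker data.
counts = np.array([1200.0, 2400.0, 3500.0, 4800.0, 6100.0])
rb_ppm = np.array([100.0, 200.0, 290.0, 400.0, 510.0])

# Fit a straight-line calibration rb = a*counts + b by least squares.
a, b = np.polyfit(counts, rb_ppm, 1)

def counts_to_ppm(c):
    """Convert a net count rate to an Rb concentration via the calibration."""
    return a * c + b

# Predict an unknown pumice from its measured count rate.
unknown_ppm = counts_to_ppm(3000.0)
```

    With enough such single-pumice measurements, trace-element zonation (e.g., Rb falling from ~350 to ~100 ppm upward through a deposit) can be mapped sample by sample.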

  14. Flexible Classroom Furniture

    ERIC Educational Resources Information Center

    Hassell, Kim

    2011-01-01

    Classroom design for the 21st-century learning environment should accommodate a variety of learning skills and needs. The space should be large enough so it can be configured to accommodate a number of learning activities. This also includes furniture that provides flexibility and accommodates collaboration and interactive work among students and…

  15. Statistical Analysis of a Large Sample Size Pyroshock Test Data Set Including Post Flight Data Assessment. Revision 1

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Anne M.

    2010-01-01

    The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. The analysis methods used to evaluate this unusually large amount of shock data will be discussed in this paper, with particular emphasis on population distributions and finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed a derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level thus contributing to the overall success of the EOS Terra mission. The effectiveness of the statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample size of shock data is demonstrated in this paper.
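    Deriving an enveloping interface level from a family of repeated ground-test measurements is commonly done with a normal tolerance bound on dB-scaled data. The P95/50 sketch below illustrates that general idea with invented shock levels; it is not the EOS Terra data or the paper's exact procedure:

```python
import math

# Illustrative peak SRS levels (g) at one frequency from repeated separation
# ground tests; the values are invented, not EOS Terra measurements.
levels_g = [820.0, 900.0, 760.0, 1010.0, 870.0, 940.0, 800.0, 880.0]

# Shock data are commonly treated as lognormal, so work in dB: 20*log10(g).
db = [20.0 * math.log10(g) for g in levels_g]
n = len(db)
mean_db = sum(db) / n
sd_db = math.sqrt(sum((x - mean_db) ** 2 for x in db) / (n - 1))

# P95/50 bound: mean + k*sd with k = 1.645 (95th percentile, 50% confidence).
k = 1.645
limit_db = mean_db + k * sd_db
limit_g = 10.0 ** (limit_db / 20.0)   # back to engineering units
```

    Flight measurements falling below such a bound, as reported for EOS Terra, indicate the ground-test-derived level enveloped the flight environment.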

  16. The large karstic holes at the top of the Syrian coastal Mountain Range. Importance of structural setting for the karstogenesis.

    NASA Astrophysics Data System (ADS)

    Mocochain, Ludovic; Blanpied, Christian; Bigot, Jean-Yves; Peyronel, Olivier; Gorini, Christian; Abdalla, Abdelkarim Al; Azki, Fawaz

    2015-04-01

    Along the Eastern Mediterranean Sea, the Syrian Coastal Mountain Range spreads from north to south over 150 km. This range is a monocline structure bounded by a major escarpment that dominates the Al-Ghab Graben to the east. The Coastal Mountain Range is mainly formed of Mesozoic limestones that show a major unconformity between the Upper Jurassic and Aptian deposits, and important erosions in the Upper Cretaceous deposits. Locally, the Jurassic-Cretaceous unconformity is characterized by a layer of continental basalts with fossil woods that reveal a long emersion of the platform. The most recent carbonate deposits at the top of the Coastal Mountain Range are of Turonian age. In the central part of the Coastal Mountain Range, in a small area, the Cretaceous carbonates are affected by large karstic dolines. These dolines are curiously located at the top of the mountain range, a position that is not favourable for the development of large karstic holes.

  17. Learning to Stand: The Acceptability and Feasibility of Introducing Standing Desks into College Classrooms

    PubMed Central

    Benzo, Roberto M.; Gremaud, Allene L.; Jerome, Matthew; Carr, Lucas J.

    2016-01-01

    Prolonged sedentary behavior is an independent risk factor for multiple negative health outcomes. Evidence supports introducing standing desks into K-12 classrooms and work settings to reduce sitting time, but no studies have been conducted in the college classroom environment. The present study explored the acceptability and feasibility of introducing standing desks in college classrooms. A total of 993 students and 149 instructors completed a single online needs assessment survey. This cross-sectional study was conducted during the fall semester of 2015 at a large Midwestern university. The large majority of students (95%) reported they would prefer the option to stand in class. Most students (82.7%) reported they currently sit during their entire class time. Most students (76.6%) and instructors (86.6%) reported being in favor of introducing standing desks into college classrooms. More than half of students and instructors predicted that having access to standing desks in class would improve students’ “physical health”, “attention”, and “restlessness”. Collectively, these findings support the acceptability of introducing standing desks in college classrooms. Future research is needed to test the feasibility, cost-effectiveness and efficacy of introducing standing desks in college classrooms. Such studies would be useful for informing institutional policies regarding classroom designs. PMID:27537901

  18. Learning to Stand: The Acceptability and Feasibility of Introducing Standing Desks into College Classrooms.

    PubMed

    Benzo, Roberto M; Gremaud, Allene L; Jerome, Matthew; Carr, Lucas J

    2016-01-01

    Prolonged sedentary behavior is an independent risk factor for multiple negative health outcomes. Evidence supports introducing standing desks into K-12 classrooms and work settings to reduce sitting time, but no studies have been conducted in the college classroom environment. The present study explored the acceptability and feasibility of introducing standing desks in college classrooms. A total of 993 students and 149 instructors completed a single online needs assessment survey. This cross-sectional study was conducted during the fall semester of 2015 at a large Midwestern university. The large majority of students (95%) reported they would prefer the option to stand in class. Most students (82.7%) reported they currently sit during their entire class time. Most students (76.6%) and instructors (86.6%) reported being in favor of introducing standing desks into college classrooms. More than half of students and instructors predicted that having access to standing desks in class would improve students' "physical health", "attention", and "restlessness". Collectively, these findings support the acceptability of introducing standing desks in college classrooms. Future research is needed to test the feasibility, cost-effectiveness and efficacy of introducing standing desks in college classrooms. Such studies would be useful for informing institutional policies regarding classroom designs. PMID:27537901

  19. "Did Ronald McDonald also Tend to Scare You as a Child?": Working to Emplace Consumption, Commodities and Citizen-Students in a Large Classroom Setting

    ERIC Educational Resources Information Center

    Goodman, Michael K.

    2008-01-01

    So-called "radical" and "critical"pedagogy seems to be everywhere these days on the landscapes of geographical teaching praxis and theory. Part of the remit of radical/critical pedagogy involves a de-centring of the traditional "banking" method of pedagogical praxis. Yet, how do we challenge this "banking" model of knowledge transmission in both a…

  20. Estimation of comprehensive forest variable sets from multiparameter SAR data over a large area with diverse species

    NASA Technical Reports Server (NTRS)

    Moghaddam, M.

    2001-01-01

    Polarimetric and multifrequency data from the NASA/JPL airborne synthetic aperture radar (AIRSAR) have been used in a multi-tier estimation algorithm to calculate a comprehensive set of forest canopy properties including branch layer moisture and thickness, trunk density, trunk water content and diameter, trunk height, and subcanopy soil moisture. The estimation algorithm takes advantage of species-specific allometric relations, and is applied to a 100 km x 100 km area in the Canadian boreal region containing many different vegetation species types. The results show very good agreement with ground measurements taken at several focused and auxiliary study sites. This paper expands on the results reported in [1] and applies the algorithm on the regional scale.

  1. Strategy Training in a Task-Based Language Classroom

    ERIC Educational Resources Information Center

    Lai, Chun; Lin, Xiaolin

    2015-01-01

    Recent literature that examines the implementation of task-based language teaching (TBLT) in classroom settings has reported various challenges related to educational cultures, classroom management, teacher cognition and learner perceptions. To facilitate the smooth transition of TBLT from laboratory settings to classroom contexts, measures need…

  2. Worsening Hypoxemia in the Face of Increasing PEEP: A Case of Large Pulmonary Embolism in the Setting of Intracardiac Shunt.

    PubMed

    Granati, Glen T; Teressa, Getu

    2016-01-01

    BACKGROUND Patent foramen ovale (PFO) are common, normally resulting in a left-to-right shunt or no net shunting. Pulmonary embolism (PE) can cause sustained increases in pulmonary vascular resistance (PVR) and right atrial pressure. Increasing positive end-expiratory pressure (PEEP) improves oxygenation at the expense of increasing intrathoracic pressures (ITP). Airway pressure release ventilation (APRV) decreases shunt fraction, improves ventilation/perfusion (V/Q) matching, increases cardiac output, and decreases right atrial pressure by facilitating low airway pressure. CASE REPORT A 40-year-old man presented with dyspnea and hemoptysis. Oxygen saturation (SaO2) was 80% on room air, with an A-a gradient of 633 mmHg. Post-intubation SaO2 dropped to 71% on assist control, FiO2 100%, and PEEP of 5 cmH2O. Successive increases in PEEP dropped SaO2 to 60-70% and blood pressure plummeted. APRV was initiated, with improvement in SaO2 to 95% and improvement in blood pressure. Hemiparesis developed and CT head showed infarction. CT pulmonary angiogram found a large pulmonary embolism. Transthoracic echocardiogram detected a right-to-left intracardiac shunt with a large PFO. CONCLUSIONS There should be suspicion for a PFO when severe hypoxemia paradoxically worsens in response to increasing airway pressures. Concomitant venous and arterial thromboemboli should prompt evaluation for intracardiac shunt. Patients with PFO and hypoxemia should be evaluated for causes of a sustained right-to-left pressure gradient, such as PE. Management should aim to decrease PVR and optimize V/Q matching by treating the inciting incident (e.g., thrombolytics in PE) and by minimizing ITP. APRV can minimize PVR and maximize V/Q ratios and should be considered in treating patients similar to the one whose case is presented here. PMID:27377010

  3. News Teaching: The epiSTEMe project: KS3 maths and science improvement Field trip: Pupils learn physics in a stately home Conference: ShowPhysics welcomes fun in Europe Student numbers: Physics numbers increase in UK Tournament: Physics tournament travels to Singapore Particle physics: Hadron Collider sets new record Astronomy: Take your classroom into space Forthcoming Events

    NASA Astrophysics Data System (ADS)

    2010-05-01

    Teaching: The epiSTEMe project: KS3 maths and science improvement Field trip: Pupils learn physics in a stately home Conference: ShowPhysics welcomes fun in Europe Student numbers: Physics numbers increase in UK Tournament: Physics tournament travels to Singapore Particle physics: Hadron Collider sets new record Astronomy: Take your classroom into space Forthcoming Events

  4. Approaching the complete basis set limit of CCSD(T) for large systems by the third-order incremental dual-basis set zero-buffer F12 method

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Dolg, Michael

    2014-01-01

    The third-order incremental dual-basis set zero-buffer approach was combined with CCSD(T)-F12x (x = a, b) theory to develop a new approach, i.e., the inc3-db-B0-CCSD(T)-F12 method, which can be applied as a black-box procedure to efficiently obtain the near complete basis set (CBS) limit of the CCSD(T) energies also for large systems. We tested this method for several cases of different chemical nature: four complexes taken from the standard benchmark sets S66 and X40, the energy difference between isomers of water hexamer and the rotation barrier of biphenyl. The results show that our method has an error relative to the best estimation of CBS energy of only 0.2 kcal/mol or less. By parallelization, our method can accomplish the CCSD(T)-F12 calculations of about 60 correlated electrons and 800 basis functions in only several days, which by standard implementation are impossible for ordinary hardware. We conclude that the inc3-db-B0-CCSD(T)-F12a/AVTZ method, which is of CCSD(T)/AV5Z quality, is close to the limit of accuracy that one can achieve for large systems currently.
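    For context, the CBS limit that F12 methods approach directly is often estimated by the standard two-point 1/X^3 extrapolation of correlation energies. The sketch below applies that textbook formula to made-up energies; it is not the paper's inc3-db-B0 scheme:

```python
# Standard two-point correlation-energy extrapolation, assuming the
# E_X = E_CBS + A / X**3 form (X is the basis-set cardinal number).
def cbs_two_point(e_x, x, e_y, y):
    """Extrapolate correlation energies at cardinal numbers x < y to the CBS limit."""
    return (y**3 * e_y - x**3 * e_x) / (y**3 - x**3)

# Made-up correlation energies (hartree) at triple- and quadruple-zeta level.
e_tz = -0.350
e_qz = -0.365
e_cbs = cbs_two_point(e_tz, 3, e_qz, 4)
# The extrapolated energy lies below both finite-basis values.
```

    F12 methods recover most of this basis-set incompleteness explicitly, which is why a CCSD(T)-F12a/AVTZ calculation can approach CCSD(T)/AV5Z quality.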

  5. Approaching the complete basis set limit of CCSD(T) for large systems by the third-order incremental dual-basis set zero-buffer F12 method

    SciTech Connect

    Zhang, Jun Dolg, Michael

    2014-01-28

    The third-order incremental dual-basis set zero-buffer approach was combined with CCSD(T)-F12x (x = a, b) theory to develop a new approach, i.e., the inc3-db-B0-CCSD(T)-F12 method, which can be applied as a black-box procedure to efficiently obtain the near complete basis set (CBS) limit of the CCSD(T) energies also for large systems. We tested this method for several cases of different chemical nature: four complexes taken from the standard benchmark sets S66 and X40, the energy difference between isomers of water hexamer and the rotation barrier of biphenyl. The results show that our method has an error relative to the best estimation of CBS energy of only 0.2 kcal/mol or less. By parallelization, our method can accomplish the CCSD(T)-F12 calculations of about 60 correlated electrons and 800 basis functions in only several days, which by standard implementation are impossible for ordinary hardware. We conclude that the inc3-db-B0-CCSD(T)-F12a/AVTZ method, which is of CCSD(T)/AV5Z quality, is close to the limit of accuracy that one can achieve for large systems currently.

  6. Worsening Hypoxemia in the Face of Increasing PEEP: A Case of Large Pulmonary Embolism in the Setting of Intracardiac Shunt

    PubMed Central

    Granati, Glen T.; Teressa, Getu

    2016-01-01

    Patient: Male, 40 Final Diagnosis: Patent foramen ovale Symptoms: Dyspnea exertional • hemoptysis • shortness of breath Medication: — Clinical Procedure: Airway pressure release ventilation Specialty: Critical Care Medicine Objective: Rare co-existence of disease or pathology Background: Patent foramen ovale (PFO) are common, normally resulting in a left-to-right shunt or no net shunting. Pulmonary embolism (PE) can cause sustained increases in pulmonary vascular resistance (PVR) and right atrial pressure. Increasing positive end-expiratory pressure (PEEP) improves oxygenation at the expense of increasing intrathoracic pressures (ITP). Airway pressure release ventilation (APRV) decreases shunt fraction, improves ventilation/perfusion (V/Q) matching, increases cardiac output, and decreases right atrial pressure by facilitating low airway pressure. Case Report: A 40-year-old man presented with dyspnea and hemoptysis. Oxygen saturation (SaO2) was 80% on room air, with an A-a gradient of 633 mmHg. Post-intubation SaO2 dropped to 71% on assist control, FiO2 100%, and PEEP of 5 cmH2O. Successive increases in PEEP dropped SaO2 to 60–70% and blood pressure plummeted. APRV was initiated, with improvement in SaO2 to 95% and improvement in blood pressure. Hemiparesis developed and CT head showed infarction. CT pulmonary angiogram found a large pulmonary embolism. Transthoracic echocardiogram detected a right-to-left intracardiac shunt with a large PFO. Conclusions: There should be suspicion for a PFO when severe hypoxemia paradoxically worsens in response to increasing airway pressures. Concomitant venous and arterial thromboemboli should prompt evaluation for intracardiac shunt. Patients with PFO and hypoxemia should be evaluated for causes of a sustained right-to-left pressure gradient, such as PE. Management should aim to decrease PVR and optimize V/Q matching by treating the inciting incident (e.g., thrombolytics in PE) and by minimizing ITP. APRV can minimize PVR and maximize V/Q ratios and

  7. Organizational development trajectory of a large academic radiotherapy department set up similarly to a prospective clinical trial: the MAASTRO experience

    PubMed Central

    Boersma, L; Dekker, A; Hermanns, E; Houben, R; Govers, M; van Merode, F; Lambin, P

    2015-01-01

    Objective: To simultaneously improve patient care processes and clinical research activities by starting a hypothesis-driven reorganization trajectory mimicking the rigorous methodology of a prospective clinical trial. Methods: The design of this reorganization trajectory was based on the model of a prospective trial. It consisted of (1) listing problems and analysing their potential causes, (2) defining interventions, (3) defining end points and (4) measuring the effect of the interventions (i.e. at baseline and after 1 and 2 years). The primary end point for patient care was the number of organizational root causes of incidents/near incidents; for clinical research, it was the number of patients in trials. There were several secondary end points. We analysed the data using two-sample z-tests, the χ2 test, the Mann–Whitney U test and one-way analysis of variance with Bonferroni correction. Results: The number of organizational root causes was reduced by 27% (p < 0.001). There was no effect on the percentage of patients included in trials. Conclusion: The reorganization trajectory was successful for the primary end point of patient care and had no effect on clinical research. Some confounding events hampered our ability to draw strong conclusions. Nevertheless, the transparency of this approach can give medical professionals more confidence in moving forward with other organizational changes in the same way. Advances in knowledge: This article is novel because managerial interventions were set up similarly to a prospective clinical trial. This study is the first of its kind in radiotherapy, and this approach can contribute to discussions about the effectiveness of managerial interventions. PMID:25679320
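    The statistical comparisons this abstract names (a χ2 test on incident counts, a Mann-Whitney U test for non-normal secondary end points) can be sketched with scipy; all of the numbers below are invented for illustration, not the MAASTRO data:

```python
from scipy.stats import chi2_contingency, mannwhitneyu

# Invented counts in the spirit of the study design: incidents/near incidents
# with an organizational root cause vs. without, at baseline and follow-up.
table = [[110, 890],   # baseline:  with root cause, without
         [80, 920]]    # follow-up: with root cause, without
chi2, p, dof, expected = chi2_contingency(table)

# A Mann-Whitney U test suits non-normal secondary end points, e.g.
# per-month incident counts (again invented numbers).
baseline_monthly = [9, 11, 8, 12, 10, 9]
followup_monthly = [6, 7, 5, 8, 6, 7]
u_stat, p_u = mannwhitneyu(baseline_monthly, followup_monthly,
                           alternative="two-sided")
```

    Measuring the same end points at baseline and at fixed follow-ups, as the trajectory did, is what makes such tests interpretable.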

  8. Large increases in spending on postacute care in Medicare point to the potential for cost savings in these settings.

    PubMed

    Chandra, Amitabh; Dalton, Maurice A; Holmes, Jonathan

    2013-05-01

    Identifying policies that will cut or constrain US health care spending and spending growth dominates reform efforts, yet little is known about whether the drivers of spending levels and of spending growth are the same. Policies that produce a one-time reduction in the level of spending, for example by making hospitals more efficient, may do little to reduce subsequent annual spending growth. To identify factors causing health care spending to grow the fastest, we focused on three conditions in the Medicare population: heart attacks, congestive heart failure, and hip fractures. We found that spending on postacute care (long-term hospital care, rehabilitation care, and skilled nursing facility care) was the fastest growing major spending category and accounted for a large portion of spending growth in 1994-2009. During that period average spending for postacute care doubled for patients with hip fractures, more than doubled for those with congestive heart failure, and more than tripled for those with heart attacks. We conclude that policies aimed at controlling acute care spending, such as bundled payments for short-term hospital spending and physician services, are likely to be more effective if they include postacute care, as is currently being tested under Medicare's Bundled Payment for Care Improvement Initiative. PMID:23650319

  9. Large Increases In Spending On Postacute Care In Medicare Point To The Potential For Cost Savings In These Settings

    PubMed Central

    Chandra, Amitabh; Dalton, Maurice A.; Holmes, Jonathan

    2013-01-01

    Identifying policies that will cut or constrain US health care spending and spending growth dominates reform efforts, yet little is known about whether the drivers of spending levels and of spending growth are the same. Policies that produce a one-time reduction in the level of spending, for example by making hospitals more efficient, may do little to reduce subsequent annual spending growth. To identify factors causing health care spending to grow the fastest, we focused on three conditions in the Medicare population: heart attacks, congestive heart failure, and hip fractures. We found that spending on postacute care—long-term hospital care, rehabilitation care, and skilled nursing facility care—was the fastest growing major spending category and accounted for a large portion of spending growth in 1994–2009. During that period average spending for postacute care doubled for patients with hip fractures, more than doubled for those with congestive heart failure, and more than tripled for those with heart attacks. We conclude that policies aimed at controlling acute care spending, such as bundled payments for short-term hospital spending and physician services, are likely to be more effective if they include postacute care, as is currently being tested under Medicare’s Bundled Payment for Care Improvement Initiative. PMID:23650319

  10. Control of tectonic setting and large-scale faults on the basin-scale distribution of deformation bands in porous sandstone (Provence, France)

    NASA Astrophysics Data System (ADS)

    Ballas, G.; Soliva, R.; Benedicto, A.; Sizun, J.

    2013-12-01

    From outcrops located in Provence (southeast France), we describe the distribution, microstructures, and petrophysical properties of deformation band networks related to different tectonic events. In a contractional setting, pervasively distributed networks of reverse-sense compactional-shear bands are observed in all the folded sand units of the foreland, whereas localized networks of clustered reverse-sense shear bands are observed only close to a large-scale thrust. In an extensional setting, networks of clustered normal-sense shear bands are generally observed adjacent to large-scale faults, although a few randomly distributed bands are also observed between these faults. Normal-sense cataclastic faults are also observed restricted to sand units, suggesting that faults can initiate in the sands in extension, which is not observed in contraction. Shear bands and faults show low-permeability cataclastic microstructures, whereas compactional-shear bands show crush-microbreccia or protocataclastic microstructures of moderate permeability. This basin-scale analysis underlines the major role of tectonic setting (thrust-fault versus normal-fault Andersonian stress regime) and the influence of inherited large-scale faults on the formation of low-permeability shear bands. We also provide a geometrical analysis of the band network properties (spacing, thickness, shear/compaction ratio, degree of cataclasis, petrophysical properties) with respect to the host sand granulometry. This analysis suggests that granulometry, although less important than tectonic setting and the presence of large-scale faults, nevertheless has a non-negligible effect on band network geometry.

  11. SU-E-I-58: Experiences in Setting Up An Online Fluoroscopy Tracking System in a Large Healthcare System

    SciTech Connect

    Fisher, R; Wunderle, K; Lingenfelter, M

    2015-06-15

    Purpose: To transition from a paper-based to an online system for tracking the fluoroscopic case information required by state regulation, and to conform to NCRP patient dose tracking suggestions. Methods: State regulations require documentation of operator, equipment, and some metric of tube output for fluoroscopy exams. This information was previously collected in paper logs, which was cumbersome and inefficient for the large number of fluoroscopic units across multiple locations within the system. The “tech notes” feature within Siemens’ Syngo workflow RIS was utilized to create an entry form for technologists to input case information, which was sent to a third-party vendor for archiving and display through an online web-based portal. Results: Over 55k cases were logged in the first year of implementation, with approximately 6,500 cases per month once fully online. A system was built for area managers to oversee and correct data, which has increased the accuracy of inputted values. A high-dose report was built to automatically send notifications when patients exceed trigger levels. In addition to meeting regulatory requirements, the new system allows for larger-scale QC of fluoroscopic cases by allowing comparison of data from specific procedures, locations, equipment, and operators, so that instances that fall outside of reference levels can be identified for further evaluation. The system has also drastically improved identification of operators without documented equipment-specific training. Conclusion: The transition to online fluoroscopy logs has improved efficiency in meeting state regulatory requirements and has allowed identification of particular procedures, equipment, and operators in need of additional attention in order to optimize patient and personnel doses, while high-dose alerts improve patient care and follow-up. Future efforts are focused on incorporating case information from outside of radiology, as well as on automating processes for

  12. The trajectory of the blood DNA methylome ageing rate is largely set before adulthood: evidence from two longitudinal studies.

    PubMed

    Kananen, L; Marttila, S; Nevalainen, T; Kummola, L; Junttila, I; Mononen, N; Kähönen, M; Raitakari, O T; Hervonen, A; Jylhä, M; Lehtimäki, T; Hurme, M; Jylhävä, J

    2016-06-01

    The epigenetic clock, defined as the DNA methylome age (DNAmAge), is a candidate biomarker of ageing. In this study, we aimed to characterize the behaviour of this marker during the human lifespan in more detail using two follow-up cohorts (the Young Finns study, calendar age i.e. cAge range at baseline 15-24 years, 25-year-follow-up, N = 183; The Vitality 90+ study, cAge range at baseline 19-90 years, 4-year-follow-up, N = 48). We also aimed to assess the relationship between DNAmAge estimate and the blood cell distributions, as both of these measures are known to change as a function of age. The subjects' DNAmAges were determined using Horvath's calculator of epigenetic cAge. The estimate of the DNA methylome age acceleration (Δ-cAge-DNAmAge) demonstrated remarkable stability in both cohorts: the individual rank orders of the DNAmAges remained largely unchanged during the follow-ups. The blood cell distributions also demonstrated significant intra-individual correlation between the baseline and follow-up time points. Interestingly, the immunosenescence-associated features (CD8+CD28- and CD4+CD28- cell proportions and the CD4/CD8 cell ratio) were tightly associated with the estimate of the DNA methylome age. In summary, our data demonstrate that the general level of Δ-cAge-DNAmAge is fixed before adulthood and appears to be quite stationary thereafter, even in the oldest-old ages. Moreover, the blood DNAmAge estimate seems to be tightly associated with ageing-associated shifts in blood cell composition, especially with those that are the hallmarks of immunosenescence. Overall, these observations contribute to the understanding of the longitudinal aspects of the DNAmAge estimate. PMID:27300324

  13. LINC-NIRVANA for the large binocular telescope: setting up the world's largest near infrared binoculars for astronomy

    NASA Astrophysics Data System (ADS)

    Hofferbert, Ralph; Baumeister, Harald; Bertram, Thomas; Berwein, Jürgen; Bizenberger, Peter; Böhm, Armin; Böhm, Michael; Borelli, José Luis; Brangier, Matthieu; Briegel, Florian; Conrad, Albert; De Bonis, Fulvio; Follert, Roman; Herbst, Tom; Huber, Armin; Kittmann, Frank; Kürster, Martin; Laun, Werner; Mall, Ulrich; Meschke, Daniel; Mohr, Lars; Naranjo, Vianak; Pavlov, Aleksei; Pott, Jörg-Uwe; Rix, Hans-Walter; Rohloff, Ralf-Rainer; Schinnerer, Eva; Storz, Clemens; Trowitzsch, Jan; Yan, Zhaojun; Zhang, Xianyu; Eckart, Andreas; Horrobin, Matthew; Rost, Steffen; Straubmeier, Christian; Wank, Imke; Zuther, Jens; Beckmann, Udo; Connot, Claus; Heininger, Matthias; Hofmann, Karl-Heinz; Kröner, Tim; Nussbaum, Eddy; Schertl, Dieter; Weigelt, Gerd; Bergomi, Maria; Brunelli, Alessandro; Dima, Marco; Farinato, Jacopo; Magrin, Demetrio; Marafatto, Luca; Ragazzoni, Roberto; Viotto, Valentina; Arcidiacono, Carmelo; Bregoli, Giovanni; Ciliegi, Paolo; Cosentino, Guiseppe; Diolaiti, Emiliano; Foppiani, Italo; Lombini, Matteo; Schreiber, Laura; D'Alessio, Francesco; Li Causi, Gianluca; Lorenzetti, Dario; Vitali, Fabrizio; Bertero, Mario; Boccacci, Patrizia; La Camera, Andrea

    2013-08-01

    LINC-NIRVANA (LN) is the near-infrared, Fizeau-type imaging interferometer for the large binocular telescope (LBT) on Mt. Graham, Arizona (elevation of 3267 m). The instrument is currently being built by a consortium of German and Italian institutes under the leadership of the Max Planck Institute for Astronomy in Heidelberg, Germany. It will combine the radiation from both 8.4 m primary mirrors of LBT in such a way that the sensitivity of an 11.9 m telescope and the spatial resolution of a 22.8 m telescope will be obtained within a 10.5×10.5 arcsec scientific field of view. Interferometric fringes of the combined beams are tracked in an oval field with diameters of 1 and 1.5 arcmin. In addition, both incoming beams are individually corrected by LN's multiconjugate adaptive optics system to reduce atmospheric image distortion over a circular field of up to 6 arcmin in diameter. A comprehensive technical overview of the instrument is presented, comprising the detailed design of LN's four major systems for interferometric imaging and fringe tracking, both in the near-infrared range of 1 to 2.4 μm, as well as atmospheric turbulence correction at two altitudes, both in the visible range of 0.6 to 0.9 μm. The resulting performance capabilities and a short outlook on some of the major science goals are also presented. In addition, the roadmap for the related assembly, integration, and verification process is discussed. To avoid late interface-related risks, strategies for early hardware as well as software interactions with the telescope have been elaborated. The goal is to ship LN to the LBT in 2014.

  14. Rethinking the Christian Studies Classroom: Reflections on the Dynamics of Teaching Religion in Southern Public Universities

    ERIC Educational Resources Information Center

    Gravett, Sandie; Hulsether, Mark; Medine, Carolyn

    2011-01-01

    An extended set of conversations conducted by three religious studies faculty teaching at large public universities in the Southern United States spurred these reflections on how their institutional locations inflected issues such as the cultural expectations students bring to the classroom, how these expectations interact with the evolving…

  15. How Curriculum and Classroom Achievement Predict Teacher Time on Lecture- and Inquiry-Based Mathematics Activities

    ERIC Educational Resources Information Center

    Kaufman, Julia H.; Karam, Rita; Pane, John F.; Junker, Brian W.

    2012-01-01

    This study drew on data from a large, randomized trial of Cognitive Tutor Algebra (CTA) in high-poverty settings to investigate how mathematics curricula and classroom achievement related to teacher reports of time spent on inquiry-based and lecture-based mathematics activities. We found that teachers using the CTA curriculum reported more time on…

  16. Eruptive history and tectonic setting of Medicine Lake Volcano, a large rear-arc volcano in the southern Cascades

    NASA Astrophysics Data System (ADS)

    Donnelly-Nolan, Julie M.; Grove, Timothy L.; Lanphere, Marvin A.; Champion, Duane E.; Ramsey, David W.

    2008-10-01

    Medicine Lake Volcano (MLV), located in the southern Cascades ˜ 55 km east-northeast of contemporaneous Mount Shasta, has been found by exploratory geothermal drilling to have a surprisingly silicic core mantled by mafic lavas. This unexpected result is very different from the long-held view derived from previous mapping of exposed geology that MLV is a dominantly basaltic shield volcano. Detailed mapping shows that < 6% of the ˜ 2000 km² of mapped MLV lavas on this southern Cascade Range shield-shaped edifice are rhyolitic and dacitic, but drill holes on the edifice penetrated more than 30% silicic lava. Argon dating yields ages in the range ˜ 475 to 300 ka for early rhyolites. Dates on the stratigraphically lowest mafic lavas at MLV fall into this time frame as well, indicating that volcanism at MLV began about half a million years ago. Mafic compositions apparently did not dominate until ˜ 300 ka. Rhyolite eruptions were scarce post-300 ka until late Holocene time. However, a dacite episode at ˜ 200 to ˜ 180 ka included the volcano's only ash-flow tuff, which was erupted from within the summit caldera. At ˜ 100 ka, compositionally distinctive high-Na andesite and minor dacite built most of the present caldera rim. Eruption of these lavas was followed soon after by several large basalt flows, such that the combined area covered by eruptions between 100 ka and postglacial time amounts to nearly two-thirds of the volcano's area. Postglacial eruptive activity was strongly episodic and also covered a disproportionate amount of area. The volcano has erupted 9 times in the past 5200 years, one of the highest rates of late Holocene eruptive activity in the Cascades. Estimated volume of MLV is ˜ 600 km³, giving an overall effusion rate of ˜ 1.2 km³ per thousand years, although the rate for the past 100 kyr may be only half that. During much of the volcano's history, both dry HAOT (high-alumina olivine tholeiite) and hydrous calcalkaline basalts erupted
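
    The quoted effusion rate follows directly from the abstract's own figures: roughly 600 km³ erupted since volcanism began about half a million years ago.

```python
# Back-of-envelope check of the effusion rate quoted in the abstract:
# ~600 km^3 erupted over ~500 kyr of activity.
volume_km3 = 600.0
duration_kyr = 500.0  # "about half a million years ago"
rate_km3_per_kyr = volume_km3 / duration_kyr  # ~1.2, as the abstract states
```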

  17. Eruptive history and tectonic setting of Medicine Lake Volcano, a large rear-arc volcano in the southern Cascades

    USGS Publications Warehouse

    Donnelly-Nolan, J. M.; Grove, T.L.; Lanphere, M.A.; Champion, D.E.; Ramsey, D.W.

    2008-01-01

    Medicine Lake Volcano (MLV), located in the southern Cascades ˜ 55 km east-northeast of contemporaneous Mount Shasta, has been found by exploratory geothermal drilling to have a surprisingly silicic core mantled by mafic lavas. This unexpected result is very different from the long-held view derived from previous mapping of exposed geology that MLV is a dominantly basaltic shield volcano. Detailed mapping shows that < 6% of the ˜ 2000 km² of mapped MLV lavas on this southern Cascade Range shield-shaped edifice are rhyolitic and dacitic, but drill holes on the edifice penetrated more than 30% silicic lava. Argon dating yields ages in the range ˜ 475 to 300 ka for early rhyolites. Dates on the stratigraphically lowest mafic lavas at MLV fall into this time frame as well, indicating that volcanism at MLV began about half a million years ago. Mafic compositions apparently did not dominate until ˜ 300 ka. Rhyolite eruptions were scarce post-300 ka until late Holocene time. However, a dacite episode at ˜ 200 to ˜ 180 ka included the volcano's only ash-flow tuff, which was erupted from within the summit caldera. At ˜ 100 ka, compositionally distinctive high-Na andesite and minor dacite built most of the present caldera rim. Eruption of these lavas was followed soon after by several large basalt flows, such that the combined area covered by eruptions between 100 ka and postglacial time amounts to nearly two-thirds of the volcano's area. Postglacial eruptive activity was strongly episodic and also covered a disproportionate amount of area. The volcano has erupted 9 times in the past 5200 years, one of the highest rates of late Holocene eruptive activity in the Cascades. Estimated volume of MLV is ˜ 600 km³, giving an overall effusion rate of ˜ 1.2 km³ per thousand years, although the rate for the past 100 kyr may be only half that. During much of the volcano's history, both dry HAOT (high-alumina olivine tholeiite) and hydrous calcalkaline

  18. The Impact of Course Delivery Systems on Student Achievement and Sense of Community: A Comparison of Learning Community versus Stand-Alone Classroom Settings in an Open-Enrollment Inner City Public Community College

    ERIC Educational Resources Information Center

    Bandyopadhyay, Pamela

    2010-01-01

    This study examined the effects of two types of course delivery systems (learning community classroom environments versus stand-alone classroom environments) on the achievement of students who were simultaneously enrolled in remedial and college-level social science courses at an inner city open-enrollment public community college. This study was…

  19. Grafting computer projected simulations and interactive engagement methods within a traditional classroom setting: The influence on secondary level students' understanding of Newtonian mechanics and on attitudes towards physics

    NASA Astrophysics Data System (ADS)

    Zoubeir, Wassim Fouad

    This research explored the effects of a constructivist approach using computer projected simulations (CPS) and interactive engagement (IE) methods on 12th grade school students. The treatment lasted 18 weeks during the 1999-2000 fall semester and sought to evaluate three variations in students': (1) conceptual understanding of Newtonian mechanics as measured by the Force Concept Inventory (FCI), (2) modification of their views about science as measured by the Views About Science Survey (VASS), and (3) achievement on traditional examinations, as measured by their end-of-semester grades. Analysis of Covariance (ANCOVA) was applied to determine the differences between the mean scores of the experimental group students and students of the control group, who were exposed to traditional teaching methods only. The FCI data analysis showed that, after 18 weeks, conceptual understanding of Newtonian mechanics had markedly improved only in the experimental group (F(1,99) = 44.739, p < .001). By contrast, there was no statistically significant difference in students' performance on the VASS instrument for both groups (F(1,99) = .033, p = .856), confirming previous and comparable findings for studies of short implementation period. The lack of statistically significant difference between the control and experimental groups in graded achievement, while controlling for students' previous achievement, was unexpected (F(1,99) = 1.178, p = .280). It is suggested that in this particular setting, the influence of a technical factor may have been overlooked: the monitored and systematic drill exercises using elaborate math formulae to prepare students for traditional math-loaded exams. Still, despite being intentionally deprived of such preparation throughout the study, students of the experimental group did not achieve less than their counterparts, and in addition, they had gained a satisfactory understanding of Newtonian mechanics. This result points unmistakably at a plausible

  20. Cloning of complete genome sets of six dsRNA viruses using an improved cloning method for large dsRNA genes.

    PubMed

    Potgieter, A C; Steele, A D; van Dijk, A A

    2002-09-01

    Cloning full-length large (>3 kb) dsRNA genome segments from small amounts of dsRNA has thus far remained problematic. Here, a single-primer amplification sequence-independent dsRNA cloning procedure was perfected for large genes and tailored for routine use to clone complete genome sets or individual genes. Nine complete viral genome sets were amplified by PCR, namely those of two human rotaviruses, two African horsesickness viruses (AHSV), two equine encephalosis viruses (EEV), one bluetongue virus (BTV), one reovirus and bacteriophage Phi12. Of these amplified genomes, six complete genome sets were cloned for viruses with genes ranging in size from 0.8 to 6.8 kb. Rotavirus dsRNA was extracted directly from stool samples. Co-expressed EEV VP3 and VP7 assembled into core-like particles that have typical orbivirus capsomeres. This work presents the first EEV sequence data and establishes that EEV genes have the same conserved termini (5' GUU and UAC 3') and coding assignment as AHSV and BTV. To clone complete genome sets, one-tube reactions were developed for oligo-ligation, cDNA synthesis and PCR amplification. The method is simple and efficient compared to other methods. Complete genomes can be cloned from as little as 1 ng dsRNA and a considerably reduced number of PCR cycles (22-30 cycles compared to 30-35 of other methods). This progress with cloning large dsRNA genes is important for recombinant vaccine development and determination of the role of terminal sequences for replication and gene expression. PMID:12185276

  1. Language Interventions in Natural Settings.

    ERIC Educational Resources Information Center

    Cavallaro, Claire C.

    1983-01-01

    Described is a milieu teaching approach to language development of young handicapped children. The method involves prompting and contingent delivery of reinforcers during normal language interactions in such classroom settings as free play, lunch, or instructional periods. (MC)

  2. Small Atomic Orbital Basis Set First-Principles Quantum Chemical Methods for Large Molecular and Periodic Systems: A Critical Analysis of Error Sources.

    PubMed

    Sure, Rebecca; Brandenburg, Jan Gerit; Grimme, Stefan

    2016-04-01

    In quantum chemical computations the combination of Hartree-Fock or a density functional theory (DFT) approximation with relatively small atomic orbital basis sets of double-zeta quality is still widely used, for example, in the popular B3LYP/6-31G* approach. In this Review, we critically analyze the two main sources of error in such computations, that is, the basis set superposition error on the one hand and the missing London dispersion interactions on the other. We review various strategies to correct those errors and present exemplary calculations on mainly noncovalently bound systems of widely varying size. Energies and geometries of small dimers, large supramolecular complexes, and molecular crystals are covered. We conclude that it is not justified to rely on fortunate error compensation, as the main inconsistencies can be cured by modern correction schemes which clearly outperform the plain mean-field methods. PMID:27308221
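
    The basis set superposition error discussed in this review can be illustrated with the standard counterpoise bookkeeping: the corrected interaction energy evaluates each monomer in the full dimer basis (with "ghost" functions on the partner's atoms). The sketch below uses invented toy energies, not results from the paper.

```python
# Counterpoise-style bookkeeping with invented numbers (kcal/mol).
# BSSE makes a small-basis dimer look overbound; monomers recomputed
# in the dimer ("ghost") basis remove that artificial stabilization.

def interaction_energy(e_dimer, e_mono_a, e_mono_b):
    """E_int = E(AB) - E(A) - E(B), all in the same units."""
    return e_dimer - e_mono_a - e_mono_b

# Monomers in their own small basis: apparent binding is too strong...
raw = interaction_energy(-10.0, -4.0, -3.0)   # -3.0 kcal/mol
# ...monomers in the full dimer basis are lower in energy, so the
# counterpoise-corrected interaction energy is weaker.
cp = interaction_energy(-10.0, -4.4, -3.4)    # about -2.2 kcal/mol
bsse = raw - cp                               # about -0.8 kcal/mol
```

    Correction schemes such as those the review surveys aim to recover the counterpoise-quality result without the extra ghost-basis calculations.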

  3. Towards Perceptual Interface for Visualization Navigation of Large Data Sets Using Gesture Recognition with Bezier Curves and Registered 3-D Data

    SciTech Connect

    Shin, M C; Tsap, L V; Goldgof, D B

    2003-03-20

    This paper presents a gesture recognition system for visualization navigation. Scientists are interested in developing interactive settings for exploring large data sets in an intuitive environment. The input consists of registered 3-D data. A geometric method using Bezier curves is used for the trajectory analysis and classification of gestures. The hand gesture speed is incorporated into the algorithm to enable correct recognition from trajectories with variations in hand speed. The method is robust and reliable: correct hand identification rate is 99.9% (from 1641 frames), modes of hand movements are correct 95.6% of the time, recognition rate (given the right mode) is 97.9%. An application to gesture-controlled visualization of 3D bioinformatics data is also presented.
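
    The trajectory-analysis idea in this abstract — summarizing a hand path with a Bezier curve and scoring how well observed points match it — can be sketched as follows. This is a minimal illustration under stated assumptions (cubic control points, samples uniform in t), not the authors' implementation.

```python
# Minimal sketch: evaluate a Bezier curve by de Casteljau's algorithm
# and score a 2-D trajectory against it. Assumes trajectory samples are
# uniformly spaced in the curve parameter t (a simplification).

def bezier_point(ctrl, t):
    """Evaluate a Bezier curve at t in [0, 1] by repeated interpolation."""
    pts = list(ctrl)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

def trajectory_error(ctrl, samples):
    """Mean squared distance from sampled points to the curve."""
    n = len(samples)
    err = 0.0
    for i, (sx, sy) in enumerate(samples):
        bx, by = bezier_point(ctrl, i / (n - 1))
        err += (sx - bx) ** 2 + (sy - by) ** 2
    return err / n

# A straight-line "swipe" template matches a linear trajectory exactly.
line = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0), (3.0, 3.0)]
samples = [(3.0 * i / 9, 3.0 * i / 9) for i in range(10)]
err = trajectory_error(line, samples)  # ~0 for a matching gesture
```

    A classifier along these lines would pick the gesture template with the smallest error, with the paper's speed term additionally normalizing for variations in hand velocity.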

  4. Small Atomic Orbital Basis Set First‐Principles Quantum Chemical Methods for Large Molecular and Periodic Systems: A Critical Analysis of Error Sources

    PubMed Central

    Sure, Rebecca; Brandenburg, Jan Gerit

    2015-01-01

    Abstract In quantum chemical computations the combination of Hartree–Fock or a density functional theory (DFT) approximation with relatively small atomic orbital basis sets of double‐zeta quality is still widely used, for example, in the popular B3LYP/6‐31G* approach. In this Review, we critically analyze the two main sources of error in such computations, that is, the basis set superposition error on the one hand and the missing London dispersion interactions on the other. We review various strategies to correct those errors and present exemplary calculations on mainly noncovalently bound systems of widely varying size. Energies and geometries of small dimers, large supramolecular complexes, and molecular crystals are covered. We conclude that it is not justified to rely on fortunate error compensation, as the main inconsistencies can be cured by modern correction schemes which clearly outperform the plain mean‐field methods. PMID:27308221

  5. Collaborative Classroom Management. Video to Accompany "A Biological Brain in a Cultural Classroom: Applying Biological Research to Classroom Management." [Videotape].

    ERIC Educational Resources Information Center

    2001

    This 43-minute VHS videotape is designed to be used in course and workshop settings with "A Biological Brain in a Cultural Classroom: Applying Biological Research to Classroom Management." The videotape's principal values are as an introduction to the issues explored in the book and as a catalyst for group discussions and activities related to…

  6. Classroom Notes

    ERIC Educational Resources Information Center

    International Journal of Mathematical Education in Science and Technology, 2007

    2007-01-01

    In this issue's "Classroom Notes" section, the following papers are discussed: (1) "Constructing a line segment whose length is equal to the measure of a given angle" (W. Jacob and T. J. Osler); (2) "Generating functions for the powers of Fibonacci sequences" (D. Terrana and H. Chen); (3) "Evaluation of mean and variance integrals without…

  7. Classroom Tech

    ERIC Educational Resources Information Center

    Instructor, 2006

    2006-01-01

    This article features the latest classroom technologies namely the FLY Pentop, WriteToLearn, and a new iris scan identification system. The FLY Pentop is a computerized pen from Leapster that "magically" understands what kids write and draw on special FLY paper. WriteToLearn is an automatic grading software from Pearson Knowledge Technologies and…

  8. Classroom Independence

    ERIC Educational Resources Information Center

    Donlon, Joe

    2007-01-01

    As a technician for the Continuing Education department at Confederation College, the author was approached by an Academic Support Strategist from college's Learning Centre who was looking for a solution for one of her students. She was working with a hard-of-hearing student, and at the time, they were sitting together in the classrooms, sharing a…

  9. Classroom Notes

    ERIC Educational Resources Information Center

    International Journal of Mathematical Education in Science and Technology, 2007

    2007-01-01

    In this issue's "Classroom Notes" section, the following papers are described: (1) "Sequences of Definite Integrals" by T. Dana-Picard; (2) "Structural Analysis of Pythagorean Monoids" by M.-Q Zhan and J. Tong; (3) "A Random Walk Phenomenon under an Interesting Stopping Rule" by S. Chakraborty; (4) "On Some Confidence Intervals for Estimating the…

  10. Supplementary Classroom.

    ERIC Educational Resources Information Center

    Douglas Fir Plywood Association, Tacoma, WA.

    Three prototype portable classrooms were developed for both conventional and component construction. One of these economical units was built for $7.50 per square foot. Construction of each type is explained through use of photographs and text. Included in the presentation are--(1) cluster grouping suggestions, (2) interior and exterior…

  11. Classroom Behavior

    ERIC Educational Resources Information Center

    Segal, Carmit

    2008-01-01

    This paper investigates the determinants and malleability of noncognitive skills. Using data on boys from the National Education Longitudinal Survey, I focus on youth behavior in the classroom as a measure of noncognitive skills. I find that student behavior during adolescence is persistent. The variation in behavior can be attributed to…

  12. Tectonic stress inversion of large multi-phase fracture data sets: application of Win-Tensor to reveal the brittle tectonic history of the Lufilan Arc, DRC

    NASA Astrophysics Data System (ADS)

    Delvaux, Damien; Kipata, Louis; Sintubin, Manuel

    2013-04-01

    Large fault-slip data sets from multiphase orogenic regions present a particular challenge in paleostress reconstructions. The Lufilian Arc is an arcuate fold-and-thrust belt that formed during late Pan-African times as the result of combined N-S and E-W amalgamation of Gondwana in SE-DRCongo and N-Zambia. We studied more than 22 sites in the Lufilian Arc and its foreland and correlated the results obtained with existing results in the Ubende belt of W-Tanzania. Most studied sites are characterized by multiphase brittle deformation in which the observed brittle structures are the result of progressive saturation of the host rock by neoformed fractures and the reactivation of early formed fractures. They correspond to large mining exploitations with multiple large and continuous outcrops that yield data sets large enough to be statistically significant, often covering several successive brittle events. In this context, the reconstruction of tectonic stress necessitates an initial field-based separation of data, completed by a dynamic separation of the original data set into subsets. In the largest sites, several parts of the deposits have been measured independently and are treated as sub-sites that are processed separately in an initial stage. The procedure used for interactive fault-slip data separation and stress inversion will be illustrated by field examples (the Luiswishi and Manono mining sites). Applying this principle to all sites resulted in the reconstruction of the brittle tectonic history of the region, starting with two major phases of orogenic compression, followed by late orogenic extension and extensional collapse. A regional tectonic inversion during the early Mesozoic, as a result of far-field stresses, marks the transition towards rift-related extension.
    More details in Kipata, Delvaux et al. (2013), Geologica Belgica 16/1-2: 001-017. Win-Tensor can be downloaded at: http://users.skynet.be/damien.delvaux/Tensor/tensor-index.html

  13. The Inclusive Classroom: How Inclusive Is Inclusion?

    ERIC Educational Resources Information Center

    Reid, Claudette M.

    2010-01-01

    This paper presents the position that inclusion is limited; inclusion does not go far enough. The inclusive classroom has been assessed to be of benefit both to the teacher and student. There are, however, limits set on inclusion. In most classrooms only children with learning disability are included omitting those with severe disabilities,…

  14. Expanding Knowledge: From the Classroom into Cyberspace

    ERIC Educational Resources Information Center

    Barbas, Maria Potes Santa-Clara

    2006-01-01

    This paper is part of a larger project in the area of research. The main purpose of this mediated discourse was to implement, observe and analyse experiences of teachers in a training project developed for two different settings in the classroom. The first was between international classrooms through cyberspace and the second was a cyberspace…

  15. Should Supervisors Intervene during Classroom Visits?

    ERIC Educational Resources Information Center

    Marshall, Kim

    2015-01-01

    Real-time coaching has become the go-to supervisory model in some schools (especially charters), with supervisors routinely jumping in during teacher observations and sometimes taking over the class to model a more effective approach. The author sets out goals and guidelines for impromptu classroom visits that include visiting each classroom at…

  16. Application of Transcultural Themes in International Classrooms

    ERIC Educational Resources Information Center

    Van Hook, Steven R.

    2007-01-01

    The effective use of transcultural themes and images may help promote positive resonance in international settings, such as found in the traditional and online classrooms of globalizing higher education. Findings of transculturally resonant themes and images may be applied to international classroom pedagogy through such means as multimedia…

  17. Enhancing Vocabulary Development in Multiple Classroom Contexts.

    ERIC Educational Resources Information Center

    Harmon, Janis M.; Staton, Denise G.

    1999-01-01

    Describes ways teachers can enhance students' vocabulary development through multiple contexts available in typical middle school classroom settings. Addresses questions about vocabulary learning and offers suggestions for enhancing vocabulary with narrative and expository texts that involve multiple classroom contexts. Considers the Vocab-o-gram…

  18. RESISTANCE TO DISRUPTION IN A CLASSROOM SETTING

    PubMed Central

    Parry-Cruwys, Diana E; Neal, Carrie M; Ahearn, William H; Wheeler, Emily E; Premchander, Raseeka; Loeb, Melissa B; Dube, William V

    2011-01-01

    Substantial experimental evidence indicates that behavior reinforced on a denser schedule is more resistant to disruption than is behavior reinforced on a thinner schedule. The present experiment studied resistance to disruption in a natural educational environment. Responding during familiar activities was reinforced on a multiple variable-interval (VI) 7-s VI 30-s schedule for 6 participants with developmental disabilities. Resistance to disruption was measured by presenting a distracting item. Response rates in the disruption components were compared to within-session response rates in prior baseline components. Results were consistent with the predictions of behavioral momentum theory for 5 of 6 participants. PMID:21709794
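
    The comparison at the heart of this experiment is a proportion-of-baseline measure: response rate during the disruption, divided by the within-session baseline rate, compared across the rich (VI 7-s) and lean (VI 30-s) components. The numbers below are invented for illustration.

```python
# Sketch of the behavioral-momentum comparison with invented rates
# (responses per minute). Momentum theory predicts the denser VI 7-s
# component retains a larger proportion of baseline responding under
# disruption than the leaner VI 30-s component.

def proportion_of_baseline(disrupt_rate, baseline_rate):
    """Resistance to disruption as a fraction of baseline responding."""
    return disrupt_rate / baseline_rate

rich = proportion_of_baseline(18.0, 24.0)  # VI 7 s component
lean = proportion_of_baseline(9.0, 20.0)   # VI 30 s component
# rich > lean is the pattern 5 of 6 participants showed.
```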

  19. Resistance to Disruption in a Classroom Setting

    ERIC Educational Resources Information Center

    Parry-Cruwys, Diana E.; Neal, Carrie M.; Ahearn, William H.; Wheeler, Emily E.; Premchander, Raseeka; Loeb, Melissa B.; Dube, William V.

    2011-01-01

    Substantial experimental evidence indicates that behavior reinforced on a denser schedule is more resistant to disruption than is behavior reinforced on a thinner schedule. The present experiment studied resistance to disruption in a natural educational environment. Responding during familiar activities was reinforced on a multiple…

  20. Environmentally Enriched Classrooms and the Development of Disadvantaged Preschool Children.

    ERIC Educational Resources Information Center

    Busse, Thomas V.; And Others

    This study evaluates the effects of placement of additional equipment in preschool classrooms on the cognitive, perceptual, and social development of urban Negro four-year-old children. Two Get Set classrooms in each of six areas of Philadelphia were paired for teachers, subjects, physical facilities and equipment. One classroom in each pair was…

  1. Improving the Teacher's Awareness of Nonverbal Communication in the Classroom.

    ERIC Educational Resources Information Center

    Kachur, Donald; And Others

    The emphasis in this paper is on developing teacher awareness of how nonverbal communication fits into the classroom setting. Various positive and negative aspects of this phase of communication in the classroom are explored. A classroom teacher is observed closely by students every day, and her/his attitude, feelings, mood or state of mind,…

  2. The Social Context of Urban Classrooms: Measuring Student Psychological Climate

    ERIC Educational Resources Information Center

    Frazier, Stacy L.; Mehta, Tara G.; Atkins, Marc S.; Glisson, Charles; Green, Philip D.; Gibbons, Robert D.; Kim, Jong Bae; Chapman, Jason E.; Schoenwald, Sonja K.; Cua, Grace; Ogle, Robert R.

    2015-01-01

    Classrooms are unique and complex work settings in which teachers and students both participate in and contribute to classroom processes. This article describes the measurement phase of a study that examined the social ecology of urban classrooms. Informed by the dimensions and items of an established measure of organizational climate, we designed…

  3. Systemize Classroom Management to Enhance Teaching and Learning

    ERIC Educational Resources Information Center

    Delman, Douglas J.

    2011-01-01

    Good classroom management is one of the most important goals teachers strive to establish from the first day of class. The rules, procedures, activities, and behaviors set the classroom tone throughout the school year. By revising, updating, and systemizing classroom management activities, teachers can eliminate many problems created by students…

  4. Multilingual Label Quests: A Practice for the "Asymmetrical" Multilingual Classroom

    ERIC Educational Resources Information Center

    Bonacina-Pugh, Florence

    2013-01-01

    Research on multilingual classrooms usually focuses on contexts where both teachers and pupils share the same linguistic repertoire; what can be called "symmetrical" multilingual classrooms. This paper sets out to investigate whether (and how) pupils' multilingual resources can be used in classrooms where the teacher does not share pupils'…

  5. Practical Classroom Applications of Language Experience: Looking Back, Looking Forward.

    ERIC Educational Resources Information Center

    Nelson, Olga G., Ed.; Linek, Wayne M., Ed.

    The 38 essays in this book look back at language experience as an educational approach, provide practical classroom applications, and reconceptualize language experience as an overarching education process. Classroom teachers and reading specialists describe strategies in use in a variety of classroom settings and describe ways to integrate…

  6. A geometrical correction for the inter- and intra-molecular basis set superposition error in Hartree-Fock and density functional theory calculations for large systems.

    PubMed

    Kruse, Holger; Grimme, Stefan

    2012-04-21

    chemistry yields MAD=0.68 kcal/mol, which represents a huge improvement over plain B3LYP/6-31G* (MAD=2.3 kcal/mol). Application of gCP-corrected B97-D3 and HF-D3 to a set of large protein-ligand complexes proves the robustness of the method. Analytical gCP gradients make optimizations of large systems feasible with small basis sets, as demonstrated for the inter-ring distances of 9-helicene and most of the complexes in Hobza's S22 test set. The method is implemented in a freely available FORTRAN program obtainable from the author's website. PMID:22519309

  7. Application of Two-Dimensional Nuclear Magnetic Resonance for Signal Enhancement by Spectral Integration Using a Large Data Set of Metabolic Mixtures.

    PubMed

    Misawa, Takuma; Wei, Feifei; Kikuchi, Jun

    2016-06-21

    Nuclear magnetic resonance (NMR) spectroscopy has tremendous advantages of minimal sample preparation and interconvertibility of data among different institutions; thus, large data sets are frequently acquired in metabolomics studies. Previously, we used a novel analytical strategy, named signal enhancement by spectral integration (SENSI), to overcome the low signal-to-noise ratio (S/N ratio) problem in (13)C NMR by integration of hundreds of spectra without additional measurements. In this letter, the development of a SENSI 2D method and application to >1000 2D JRES NMR spectra are described. Remarkably, the obtained SENSI 2D spectrum had an approximate 14-fold increase in the S/N ratio and 80-250 additional peaks without any additional measurements. These results suggest that SENSI 2D is a useful method for assigning weak signals and that the use of coefficient of variation values can support the assignment information and extraction of features from the population characteristics among large data sets. PMID:27257670
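    The core of the SENSI idea, co-adding many spectra so that random noise averages out while the signal does not, can be illustrated with a toy sketch. The peak positions, noise level, and spectrum count below are synthetic assumptions for illustration, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n_points, n_spectra = 512, 400

# Synthetic "true" spectrum: two narrow peaks on a flat baseline.
x = np.linspace(0, 10, n_points)
true = np.exp(-((x - 3.0) ** 2) / 0.01) + 0.5 * np.exp(-((x - 7.0) ** 2) / 0.01)

# Each measured spectrum = true signal + independent random noise.
spectra = true + rng.normal(0.0, 0.2, size=(n_spectra, n_points))

def snr(spectrum, signal_mask, noise_mask):
    """Peak height divided by the standard deviation of a signal-free region."""
    return spectrum[signal_mask].max() / spectrum[noise_mask].std()

signal_mask = (x > 2.8) & (x < 3.2)   # window around the first peak
noise_mask = (x > 4.5) & (x < 6.5)    # signal-free baseline region

single_snr = snr(spectra[0], signal_mask, noise_mask)
integrated = spectra.mean(axis=0)     # the co-addition / integration step
integrated_snr = snr(integrated, signal_mask, noise_mask)

print(f"single: {single_snr:.1f}, integrated: {integrated_snr:.1f}")
```

    For perfectly independent noise the gain scales as roughly the square root of the number of spectra; in real data sets, correlated noise and spectrum-to-spectrum variation typically reduce the gain below that ideal.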

  8. River Modeling in Large and Ungauged Basins: Experience of Setting up the HEC RAS Model over the Ganges-Brahmaputra-Meghna Basins

    NASA Astrophysics Data System (ADS)

    Hossain, F.; Maswood, M.

    2014-12-01

    River modeling is the process of setting up a physically-based hydrodynamic model that can simulate the water flow dynamics of a stream network against time-varying boundary conditions. Such river models are an important component of any flood forecasting system that forecasts river levels in flood prone regions. However, many large river basins in the developing world such as the Ganges, Brahmaputra, Meghna (GBM), Indus, Irrawaddy, Salween, Mekong and Niger are mostly ungauged. Such large basins lack the necessary in-situ measurements of river bed depth/slope, bathymetry (river cross section), floodplain mapping and boundary condition flows for forcing a river model. For such basins, proxy approaches relying mostly on remote sensing data from space platforms are the only alternative. In this study, we share our experience of setting up the widely-used 1-D river model over the entire GBM basin and its stream network. Good quality in-situ measurements of river hydraulics (cross section, slope, flow) were available only for the downstream and flood prone region of the basin, which comprises only 7% of the basin area. For the remaining 93% of the basin area, we resorted to the use of data from the following satellite sensors to build a workable river model: a) Shuttle Radar Topography Mission (SRTM) for deriving bed slope; b) LANDSAT/MODIS for updating river network and flow direction generated by elevation data; c) radar altimetry data to build depth versus width relationship at river locations; d) satellite precipitation based hydrologic modeling of lateral flows into main stem rivers. In addition, we referred to an extensive body of literature to estimate the prevailing baseline hydraulics of rivers in the ungauged region. We measured the success of our approach by systematically testing how well the basin-wide river model could simulate river level dynamics at two measured locations inside Bangladesh. Our experience of river modeling was replete with numerous

  9. Multilevel and Diverse Classrooms

    ERIC Educational Resources Information Center

    Baurain, Bradley, Ed.; Ha, Phan Le, Ed.

    2010-01-01

    The benefits and advantages of classroom practices incorporating unity-in-diversity and diversity-in-unity are what "Multilevel and Diverse Classrooms" is all about. Multilevel classrooms--also known as mixed-ability or heterogeneous classrooms--are a fact of life in ESOL programs around the world. These classrooms are often not only multilevel…

  10. Nurturing Mathematical Promise in a Regular Elementary Classroom: Exploring the Role of the Teacher and Classroom Environment

    ERIC Educational Resources Information Center

    Dimitriadis, Christos

    2016-01-01

    This article presents findings from a case study of an in-classroom program based on ability grouping for Year 2 (ages 6-7) primary (elementary) children identified as high ability in mathematics. The study examined the role of classroom setting, classroom environment, and teacher's approach in realizing and developing mathematical promise. The…

  11. Treatment of children with attention-deficit/hyperactivity disorder: results of a randomized, multicenter, double-blind, crossover study of extended-release dexmethylphenidate and D,L-methylphenidate and placebo in a laboratory classroom setting.

    PubMed

    Silva, Raul; Muniz, Rafael; McCague, Kevin; Childress, Ann; Brams, Matthew; Mao, Alice

    2008-01-01

    The purpose of this study was to compare the efficacy and safety of extended-release dexmethylphenidate (d-MPH-ER) to that of d,l-MPH-ER and placebo in children with attention-deficit/hyperactivity disorder (ADHD) in a laboratory classroom setting. This multicenter, double-blind, crossover study randomized 82 children, 6 to 12 years of age, stabilized on a total daily dose to the nearest equivalent of 40 to 60 mg of d,l-MPH or 20 or 30 mg/day of d-MPH. Patients participated in a screening day and practice day, and were randomized to 1 of 10 sequences of all five treatments in five separate periods. Treatments included d-MPH-ER (20 mg/day), d-MPH-ER (30 mg/day), d,l-MPH-ER (36 mg/day), d,l-MPH-ER (54 mg/day), and placebo. Primary efficacy was measured by the change from predose on the Swanson, Kotkin, Agler, M-Flynn, and Pelham (SKAMP) Rating Scale-Combined scores at 2-h postdose during the 12-h laboratory assessment (d-MPH-ER 20 mg/day vs. d,l-MPH-ER 36 mg/day). Adverse events were monitored throughout the study period. d-MPH-ER (20 mg/day) was significantly more effective than d,l-MPH-ER (36 mg/day) in the primary efficacy variable, change from predose to 2-h postdose in SKAMP-combined score. In general, d-MPH-ER had an earlier onset of action than d,l-MPH-ER, while d,l-MPH-ER had a stronger effect at 12-h postdose. No serious adverse events were reported. Treatment with either agent was associated with significant improvements in ADHD symptoms. d-MPH-ER and d,l-MPH-ER can be differentiated by the part of the day during which each is most effective. PMID:18362868

  12. mzDB: A File Format Using Multiple Indexing Strategies for the Efficient Analysis of Large LC-MS/MS and SWATH-MS Data Sets*

    PubMed Central

    Bouyssié, David; Dubois, Marc; Nasso, Sara; Gonzalez de Peredo, Anne; Burlet-Schiltz, Odile; Aebersold, Ruedi; Monsarrat, Bernard

    2015-01-01

    The analysis and management of MS data, especially those generated by data independent MS acquisition, exemplified by SWATH-MS, pose significant challenges for proteomics bioinformatics. The large size and vast amount of information inherent to these data sets need to be properly structured to enable an efficient and straightforward extraction of the signals used to identify specific target peptides. Standard XML based formats are not well suited to large MS data files, for example, those generated by SWATH-MS, and compromise high-throughput data processing and storing. We developed mzDB, an efficient file format for large MS data sets. It relies on the SQLite software library and consists of a standardized and portable server-less single-file database. An optimized 3D indexing approach is adopted, where the LC-MS coordinates (retention time and m/z), along with the precursor m/z for SWATH-MS data, are used to query the database for data extraction. In comparison with XML formats, mzDB saves ∼25% of storage space and improves access times by a factor of twofold up to even 2000-fold, depending on the particular data access. Similarly, mzDB shows also slightly to significantly lower access times in comparison with other formats like mz5. Both C++ and Java implementations, converting raw or XML formats to mzDB and providing access methods, will be released under permissive license. mzDB can be easily accessed by the SQLite C library and its drivers for all major languages, and browsed with existing dedicated GUIs. The mzDB described here can boost existing mass spectrometry data analysis pipelines, offering unprecedented performance in terms of efficiency, portability, compactness, and flexibility. PMID:25505153

  13. mzDB: a file format using multiple indexing strategies for the efficient analysis of large LC-MS/MS and SWATH-MS data sets.

    PubMed

    Bouyssié, David; Dubois, Marc; Nasso, Sara; Gonzalez de Peredo, Anne; Burlet-Schiltz, Odile; Aebersold, Ruedi; Monsarrat, Bernard

    2015-03-01

    The analysis and management of MS data, especially those generated by data independent MS acquisition, exemplified by SWATH-MS, pose significant challenges for proteomics bioinformatics. The large size and vast amount of information inherent to these data sets need to be properly structured to enable an efficient and straightforward extraction of the signals used to identify specific target peptides. Standard XML based formats are not well suited to large MS data files, for example, those generated by SWATH-MS, and compromise high-throughput data processing and storing. We developed mzDB, an efficient file format for large MS data sets. It relies on the SQLite software library and consists of a standardized and portable server-less single-file database. An optimized 3D indexing approach is adopted, where the LC-MS coordinates (retention time and m/z), along with the precursor m/z for SWATH-MS data, are used to query the database for data extraction. In comparison with XML formats, mzDB saves ∼25% of storage space and improves access times by a factor of twofold up to even 2000-fold, depending on the particular data access. Similarly, mzDB shows also slightly to significantly lower access times in comparison with other formats like mz5. Both C++ and Java implementations, converting raw or XML formats to mzDB and providing access methods, will be released under permissive license. mzDB can be easily accessed by the SQLite C library and its drivers for all major languages, and browsed with existing dedicated GUIs. The mzDB described here can boost existing mass spectrometry data analysis pipelines, offering unprecedented performance in terms of efficiency, portability, compactness, and flexibility. PMID:25505153
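    The server-less, single-file, index-driven design described above can be pictured with a toy SQLite sketch. The table, columns, and query below are illustrative stand-ins chosen for this example, not the actual mzDB schema:

```python
import sqlite3

# Toy single-file "spectrum store" illustrating the indexed range-query idea.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE peak (
        rt        REAL,  -- retention time (s)
        mz        REAL,  -- mass-to-charge ratio
        intensity REAL
    )
""")
# Composite index on the LC-MS coordinates, so window queries avoid full scans.
con.execute("CREATE INDEX idx_rt_mz ON peak (rt, mz)")

# Synthetic peak list standing in for converted raw data.
peaks = [(60.0 + i * 0.5, 400.0 + (i % 50) * 2.0, 1000.0 + i) for i in range(1000)]
con.executemany("INSERT INTO peak VALUES (?, ?, ?)", peaks)

# Extraction query: all peaks inside a retention-time x m/z window.
rows = con.execute(
    "SELECT rt, mz, intensity FROM peak "
    "WHERE rt BETWEEN ? AND ? AND mz BETWEEN ? AND ?",
    (100.0, 200.0, 420.0, 440.0),
).fetchall()
print(len(rows), "peaks in window")
```

    Because the store is a single SQLite file, any language with an SQLite driver can run the same window query, which is the portability argument made for the format.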

  14. Examining the large-scale convergence of photosynthesis-weighted tree leaf temperatures through stable oxygen isotope analysis of multiple data sets.

    PubMed

    Song, Xin; Barbour, Margaret M; Saurer, Matthias; Helliker, Brent R

    2011-12-01

    The idea that photosynthesis-weighted tree canopy leaf temperature (T(canδ)) can be resolved through analysis of oxygen isotope composition in tree wood cellulose (δ(18) O(wc)) has led to the observation of boreal-to-subtropical convergence of T(canδ) to c. 20°C. To further assess the validity of the large-scale convergence of T(canδ), we used the isotope approach to calculate T(canδ) for independent δ(18) O(wc) data sets that have broad coverage of climates. For the boreal-to-subtropical data sets, we found that the deviation of T(canδ) from the growing season temperature systematically increases with decreasing mean annual temperature. Across the whole data sets we calculated a mean T(canδ) of 19.48°C and an SD of 2.05°C, while for the tropical data set, the mean T(canδ) was 26.40 ± 1.03°C, significantly higher than the boreal-to-subtropical mean. Our study thus offers independent isotopic support for the concept that boreal-to-subtropical trees display conserved T(canδ) near 20°C. The isotopic analysis cannot distinguish between the possibility that leaf temperatures are generally elevated above ambient air temperatures in cooler environments and the possibility that leaf temperature equals air temperature, whereas the leaf/air temperature at which photosynthesis occurs has a weighted average of near 20°C in cooler environments. Future work will separate these potential explanations. PMID:21899555

  15. Maximizing Classroom Participation.

    ERIC Educational Resources Information Center

    Englander, Karen

    2001-01-01

    Discusses how to maximize classroom participation in the English-as-a-Second-or-Foreign-Language classroom, and provides a classroom discussion method that is based on real-life problem solving. (Author/VWL)

  16. The Changing College Classroom.

    ERIC Educational Resources Information Center

    Paulien, Daniel K.

    1998-01-01

    Describes the ways in which college classrooms are changing as a result of technology, furnishings, and educational needs requiring more space and different classroom-design concepts. Explains why the traditional tablet armchair classroom is becoming unpopular. (GR)

  17. Classroom Connectivity: Increasing Participation and Understanding Inside the Classroom

    ERIC Educational Resources Information Center

    Hegedus, Stephen

    2007-01-01

    This article shows how highly mobile computing, when used with new forms of network connectivity, can allow new forms of activities in the mathematics classroom. Examples are provided, such as the ability to share, harvest, and aggregate mathematical objects, and the ability for teachers and students to analyze the entire set of classroom…

  18. A large proportion of asymptomatic Plasmodium infections with low and sub-microscopic parasite densities in the low transmission setting of Temotu Province, Solomon Islands: challenges for malaria diagnostics in an elimination setting

    PubMed Central

    2010-01-01

    Background Many countries are scaling up malaria interventions towards elimination. This transition changes demands on malaria diagnostics from diagnosing ill patients to detecting parasites in all carriers including asymptomatic infections and infections with low parasite densities. Detection methods suitable to local malaria epidemiology must be selected prior to transitioning a malaria control programme to elimination. A baseline malaria survey conducted in Temotu Province, Solomon Islands in late 2008, as the first step in a provincial malaria elimination programme, provided malaria epidemiology data and an opportunity to assess how well different diagnostic methods performed in this setting. Methods During the survey, 9,491 blood samples were collected and examined by microscopy for Plasmodium species and density, with a subset also examined by polymerase chain reaction (PCR) and rapid diagnostic tests (RDTs). The performances of these diagnostic methods were compared. Results A total of 256 samples were positive by microscopy, giving a point prevalence of 2.7%. The species distribution was 17.5% Plasmodium falciparum and 82.4% Plasmodium vivax. In this low transmission setting, only 17.8% of the P. falciparum and 2.9% of P. vivax infected subjects were febrile (≥38°C) at the time of the survey. A significant proportion of infections detected by microscopy, 40% and 65.6% for P. falciparum and P. vivax respectively, had parasite density below 100/μL. There was an age correlation for the proportion of parasite density below 100/μL for P. vivax infections, but not for P. falciparum infections. PCR detected substantially more infections than microscopy (point prevalence of 8.71%), indicating a large number of subjects had sub-microscopic parasitemia. The concordance between PCR and microscopy in detecting single species was greater for P. vivax (135/162) compared to P. falciparum (36/118). The malaria RDT detected the 12 microscopy and PCR positive P

  19. Creating Learning Communities in the Classroom

    ERIC Educational Resources Information Center

    Saville, Bryan K.; Lawrence, Natalie Kerr; Jakobsen, Krisztina V.

    2012-01-01

    There are many ways to construct classroom-based learning communities. Nevertheless, the emphasis is always on cooperative learning. In this article, the authors focus on three teaching methods--interteaching, team-based learning, and cooperative learning in large, lecture-based courses--that they have used successfully to create classroom-based…

  20. How Tablets Are Utilized in the Classroom

    ERIC Educational Resources Information Center

    Ditzler, Christine; Hong, Eunsook; Strudler, Neal

    2016-01-01

    New technologies are a large part of the educational landscape in the 21st century. Emergent technologies are implemented in the classroom at an exponential rate. The newest technology to be added to the daily classroom is the tablet computer. Understanding students' and teachers' perceptions about the role of tablet computers is important as this…

  1. New Ways of Classroom Assessment. Revised

    ERIC Educational Resources Information Center

    Brown, J. D., Ed.

    2013-01-01

    In this revised edition in the popular New Ways Series, teachers have once again been given an opportunity to show how they do assessment in their classrooms on an everyday basis. Often feeling helpless when confronted with large-scale standardized testing practices, teachers here offer classroom testing created with the direct aim of helping…

  2. Learning the Three C's: Classroom Communication Climate.

    ERIC Educational Resources Information Center

    Myers, Scott A.

    A study examined the communication climate of a graduate teaching assistant's (GTA) college classroom. Because the teaching role is often new to the GTA, establishing a communication climate may be a significant factor in classroom management. One section of a public speaking class taught by a new graduate teaching assistant at a large midwestern…

  3. Behavior Problems in Learning Activities and Social Interactions in Head Start Classrooms and Early Reading, Mathematics, and Approaches to Learning

    ERIC Educational Resources Information Center

    Bulotsky-Shearer, Rebecca J.; Fernandez, Veronica; Dominguez, Ximena; Rouse, Heather L.

    2011-01-01

    Relations between early problem behavior in preschool classrooms and a comprehensive set of school readiness outcomes were examined for a stratified random sample (N = 256) of 4-year-old children enrolled in a large, urban school district Head Start program. A series of multilevel models examined the unique contribution of early problem behavior…

  4. Treatment outcomes in AIDS-related diffuse large B-cell lymphoma in the setting of roll-out of combination antiretroviral therapy in South Africa

    PubMed Central

    de Witt, Pieter; Maartens, Deborah J; Uldrick, Thomas S; Sissolak, Gerhard

    2013-01-01

    Background Long-term survival for patients with AIDS-related diffuse large B-cell lymphoma (DLBCL) is feasible in settings with available combination antiretroviral therapy (cART). However, given limited oncology resources, outcomes for AIDS-associated DLBCL in South Africa are unknown. Methods We performed a retrospective analysis of survival in patients with newly diagnosed AIDS-related diffuse large B-cell lymphoma (DLBCL) treated at a tertiary teaching hospital in Cape Town, South Africa with CHOP or CHOP-like chemotherapy (January 2004 until Dec 2010). HIV and lymphoma related prognostic factors were evaluated. Results 36 patients were evaluated; median age 37.3 years, 52.8% men, and 61.1% black South Africans. Median CD4 count 184 cells/μl (in 27.8% this was < 100 cells/μl), 80% high-risk according to the age-adjusted International Prognostic Index. Concurrent Mycobacterium tuberculosis in 25%. Two-year overall survival (OS) was 40.5% (median OS 10.5 months, 95%CI 6.5 – 31.8). ECOG performance status of 2 or more (25.4% versus 50.0%, p = 0.01) and poor response to cART (18.0% versus 53.9%, p = 0.03) predicted inferior 2-year OS. No difference in 2-year OS was demonstrated in patients co-infected with Mycobacterium tuberculosis (p = 0.87). Conclusions Two-year OS for patients with AIDS-related DLBCL treated with CHOP-like regimens and cART is comparable to that seen in the US and Europe. Important factors affecting OS in AIDS-related DLBCL in South Africa include performance status at presentation and response to cART. Patients with co-morbid Mycobacterium tuberculosis or hepatitis B seropositivity appear to tolerate CHOP in our setting. Additional improvements in outcomes are likely possible. PMID:23797692

  5. The Classroom Animal: Crickets.

    ERIC Educational Resources Information Center

    Kramer, David C.

    1985-01-01

    Suggests using crickets for classroom activities, providing background information on their anatomy and reproduction and tips on keeping individual organisms or a breeding colony in the classroom. (JN)

  6. Integrated QSPR models to predict the soil sorption coefficient for a large diverse set of compounds by using different modeling methods

    NASA Astrophysics Data System (ADS)

    Shao, Yonghua; Liu, Jining; Wang, Meixia; Shi, Lili; Yao, Xiaojun; Gramatica, Paola

    2014-05-01

    The soil sorption coefficient (Koc) is a key physicochemical parameter to assess the environmental risk of organic compounds. To predict the soil sorption coefficient in a more effective and economical way, here, quantitative structure-property relationship (QSPR) models were developed based on a large diverse dataset including 964 non-ionic organic compounds. Multiple linear regression (MLR), local lazy regression (LLR) and least squares support vector machine (LS-SVM) were utilized to develop QSPR models based on the four most relevant theoretical molecular descriptors selected by genetic algorithms-variable subset selection (GA-VSS) procedure. The QSPR development strictly followed the OECD principles for QSPR model validation, thus great attention was paid to internal and external validations, applicability domain and mechanistic interpretation. The obtained results indicate that the LS-SVM model performed better than the MLR and the LLR models. For the best LS-SVM model, the correlation coefficient (R2) for the training set was 0.913 and the concordance correlation coefficient (CCC) for the prediction set was 0.917. The root-mean square errors (RMSE) were 0.330 and 0.426, respectively. The results of internal and external validations together with applicability domain analysis indicate that the QSPR models proposed in our work are predictive and could provide a useful tool for predicting the soil sorption coefficient of new compounds.
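    The MLR step of such a QSPR workflow, fitting descriptors to a property and reporting R2/RMSE on separate training and prediction sets, can be sketched minimally. The descriptors, coefficients, and noise level below are synthetic, not the 964-compound data set:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for a QSPR dataset: 4 molecular descriptors per compound
# and an assumed linear relationship to log Koc (coefficients are invented).
n = 200
X = rng.normal(size=(n, 4))
true_coef = np.array([0.8, -0.5, 0.3, 0.1])
y = X @ true_coef + 1.5 + rng.normal(0.0, 0.3, size=n)

# Training / prediction split, mirroring internal vs. external validation.
X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

# Multiple linear regression via ordinary least squares (with intercept).
A = np.column_stack([X_train, np.ones(len(X_train))])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

def r2_rmse(X_, y_):
    """R2 and RMSE of the fitted model on a given descriptor/property set."""
    pred = np.column_stack([X_, np.ones(len(X_))]) @ coef
    ss_res = np.sum((y_ - pred) ** 2)
    ss_tot = np.sum((y_ - y_.mean()) ** 2)
    return 1 - ss_res / ss_tot, np.sqrt(ss_res / len(y_))

print("train R2/RMSE:", r2_rmse(X_train, y_train))
print("test  R2/RMSE:", r2_rmse(X_test, y_test))
```

    A drop from training R2 to prediction-set R2 is the kind of gap external validation is designed to expose; the LLR and LS-SVM variants replace only the fitting step in this pipeline.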

  7. Building and calibrating a large-extent and high resolution coupled groundwater-land surface model using globally available data-sets

    NASA Astrophysics Data System (ADS)

    Sutanudjaja, E. H.; Van Beek, L. P.; de Jong, S. M.; van Geer, F.; Bierkens, M. F.

    2012-12-01

    The current generation of large-scale hydrological models generally lacks a groundwater model component simulating lateral groundwater flow. Large-scale groundwater models are rare due to a lack of hydro-geological data required for their parameterization and a lack of groundwater head data required for their calibration. In this study, we propose an approach to develop a large-extent fully-coupled land surface-groundwater model by using globally available datasets and calibrate it using a combination of discharge observations and remotely-sensed soil moisture data. The underlying objective is to devise a collection of methods that enables one to build and parameterize large-scale groundwater models in data-poor regions. The model used, PCR-GLOBWB-MOD, has a spatial resolution of 1 km x 1 km and operates on a daily basis. It consists of a single-layer MODFLOW groundwater model that is dynamically coupled to the PCR-GLOBWB land surface model. This fully-coupled model accommodates two-way interactions between surface water levels and groundwater head dynamics, as well as between upper soil moisture states and groundwater levels, including a capillary rise mechanism to sustain upper soil storage and thus to fulfill high evaporation demands (during dry conditions). As a test bed, we used the Rhine-Meuse basin, where more than 4000 groundwater head time series have been collected for validation purposes. The model was parameterized using globally available data-sets on surface elevation, drainage direction, land-cover, soil and lithology. Next, the model was calibrated using a brute force approach and massive parallel computing, i.e. by running the coupled groundwater-land surface model for more than 3000 different parameter sets. Here, we varied minimal soil moisture storage and saturated conductivities of the soil layers as well as aquifer transmissivities. Using different regularization strategies and calibration criteria we compared three calibration scenarios
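    The brute-force calibration idea described above, running the model for many parameter sets and keeping the one that best matches observations, can be sketched with a toy model. The response function, parameter names, and ranges below are invented for illustration; the real coupled PCR-GLOBWB-MOD runs are vastly more complex:

```python
import itertools
import numpy as np

# Hypothetical stand-in "model": a scalar response to two tunable parameters.
def run_model(transmissivity, soil_storage, forcing):
    return forcing * transmissivity / (soil_storage + forcing)

rng = np.random.default_rng(1)
forcing = rng.uniform(1.0, 5.0, size=365)           # one year of daily forcing
observed = run_model(2.0, 0.5, forcing) + rng.normal(0, 0.05, size=365)

# Brute-force calibration: evaluate every combination on a parameter grid
# and keep the pair with the lowest RMSE against the observations.
best = None
for T, S in itertools.product(np.linspace(0.5, 4.0, 36), np.linspace(0.1, 1.0, 19)):
    rmse = np.sqrt(np.mean((run_model(T, S, forcing) - observed) ** 2))
    if best is None or rmse < best[0]:
        best = (rmse, T, S)

print(f"best RMSE={best[0]:.3f} at T={best[1]:.2f}, S={best[2]:.2f}")
```

    The grid here has 684 combinations; at the scale of the study (3000+ parameter sets, each a full coupled-model run) the loop body becomes a job on a parallel cluster, but the selection logic is the same.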

  8. Structural Analysis in the Classroom

    ERIC Educational Resources Information Center

    Gage, Nicholas A.; Lewis, Timothy J.

    2010-01-01

    The purpose of this article is to describe an applied method of assessing and manipulating environmental factors influencing student behavior. The assessment procedure is called structural analysis (SA) and can be a part of a functional behavioral assessment (FBA) process or a stand-alone set of procedures for teachers to use in their classrooms.…

  9. Price Discrimination: A Classroom Experiment

    ERIC Educational Resources Information Center

    Aguiló, Paula; Sard, Maria; Tugores, Maria

    2016-01-01

    In this article, the authors describe a classroom experiment aimed at familiarizing students with different types of price discrimination (first-, second-, and third-degree price discrimination). During the experiment, the students were asked to decide what tariffs to set as monopolists for each of the price discrimination scenarios under…

  10. The IRB and Classroom Research

    ERIC Educational Resources Information Center

    Hecht, Jeffrey B.

    2005-01-01

    Scholars conducting research in classrooms face a myriad of ethical issues somewhat unique to the educational setting. While the Code of Federal Regulations (45 CFR 46) generally provides that educational research be classified as exempt from review by Institutional Review Boards, those same regulations provide a host of special conditions under…

  11. Getting Started in Classroom Computing.

    ERIC Educational Resources Information Center

    Ahl, David H.

    Written for secondary students, this booklet provides an introduction to several computer-related concepts through a set of six classroom games, most of which can be played with little more than a sheet of paper and a pencil. The games are: 1) SECRET CODES--introduction to binary coding, punched cards, and paper tape; 2) GUESS--efficient methods…

  12. Classroom Culture Promotes Academic Resiliency

    ERIC Educational Resources Information Center

    DiTullio, Gina

    2014-01-01

    Resiliency is what propels many students to continue moving forward under difficult learning and life conditions. We intuitively think that such resilience is a character quality that cannot be taught. On the contrary, when a teacher sets the right conditions and culture for it in the classroom by teaching collaboration and communication skills,…

  13. Flipped Classroom Modules for Large Enrollment General Chemistry Courses: A Low Barrier Approach to Increase Active Learning and Improve Student Grades

    ERIC Educational Resources Information Center

    Eichler, Jack F.; Peeples, Junelyn

    2016-01-01

    In the face of mounting evidence revealing active learning approaches result in improved student learning outcomes compared to traditional passive lecturing, there is a growing need to change the way instructors teach large introductory science courses. However, a large proportion of STEM faculty continues to use traditional instructor-centered…

  14. Measuring Quality in Inclusive Preschool Classrooms: Development and Validation of the Inclusive Classroom Profile (ICP)

    ERIC Educational Resources Information Center

    Soukakou, Elena P.

    2012-01-01

    The purpose of this study was to develop and validate an observation measure designed to assess classroom quality in inclusive preschool programs, the Inclusive Classroom Profile (ICP). Developing the rating scale entailed systematic fieldwork in inclusive settings and review of the literature on preschool inclusion. Results from the validation…

  15. Global Internet Video Classroom: A Technology Supported Learner-Centered Classroom

    ERIC Educational Resources Information Center

    Lawrence, Oliver

    2010-01-01

    The Global Internet Video Classroom (GIVC) Project connected Chicago Civil Rights activists of the 1960s with Cape Town Anti-Apartheid activists of the 1960s in a classroom setting where learners from Cape Town and Chicago engaged activists in conversations about their motivation, principles, and strategies. The project was launched in order to…

  16. Classroom Management and Teachers' Coping Strategies: Inside Classrooms in Australia, China and Israel

    ERIC Educational Resources Information Center

    Romi, Shlomo; Lewis, Ramon; Roache, Joel

    2013-01-01

    This paper discusses the degree to which recently reported relationships between the classroom management techniques and coping styles of Australian teachers apply in two other national settings: China and Israel. Little is known about which teacher characteristics relate to their approach to classroom management, although researchers in Australia…

  17. The Development of the Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS): A Large-Scale Data Sharing Initiative

    PubMed Central

    Lutomski, Jennifer E.; Baars, Maria A. E.; Schalk, Bianca W. M.; Boter, Han; Buurman, Bianca M.; den Elzen, Wendy P. J.; Jansen, Aaltje P. D.; Kempen, Gertrudis I. J. M.; Steunenberg, Bas; Steyerberg, Ewout W.; Olde Rikkert, Marcel G. M.; Melis, René J. F.

    2013-01-01

    Introduction In 2008, the Ministry of Health, Welfare and Sport commissioned the National Care for the Elderly Programme. While numerous research projects in older persons’ health care were to be conducted under this national agenda, the Programme further advocated the development of The Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS) which would be integrated into all funded research protocols. In this context, we describe TOPICS data sharing initiative (www.topics-mds.eu). Materials and Methods A working group drafted TOPICS-MDS prototype, which was subsequently approved by a multidisciplinary panel. Using instruments validated for older populations, information was collected on demographics, morbidity, quality of life, functional limitations, mental health, social functioning and health service utilisation. For informal caregivers, information was collected on demographics, hours of informal care and quality of life (including subjective care-related burden). Results Between 2010 and 2013, a total of 41 research projects contributed data to TOPICS-MDS, resulting in preliminary data available for 32,310 older persons and 3,940 informal caregivers. The majority of studies sampled were from primary care settings and inclusion criteria differed across studies. Discussion TOPICS-MDS is a public data repository which contains essential data to better understand health challenges experienced by older persons and informal caregivers. Such findings are relevant for countries where increasing health-related expenditure has necessitated the evaluation of contemporary health care delivery. Although open sharing of data can be difficult to achieve in practice, proactively addressing issues of data protection, conflicting data analysis requests and funding limitations during TOPICS-MDS developmental phase has fostered a data sharing culture. To date, TOPICS-MDS has been successfully incorporated into 41 research projects, thus supporting the

  18. HIV Testing among Patients with Presumptive Tuberculosis: How Do We Implement in a Routine Programmatic Setting? Results of a Large Operational Research from India

    PubMed Central

    Kumar, Ajay MV; Gupta, Devesh; Kumar, Ashok; Gupta, R. S.; Kanchar, Avinash; Rao, Raghuram; Shastri, Suresh; Suryakanth, MD; Rangaraju, Chethana; Naik, Balaji; Guddemane, Deepak K.; Bhat, Prashant; Nair, Achuthan Sreenivas; Harries, Anthony David; Dewan, Puneet

    2016-01-01

    Background In March 2012, the World Health Organization recommended that HIV testing be offered to all patients with presumptive TB (previously called TB suspects). How this is best implemented and monitored in routine health care settings in India was not known. An operational research study was conducted in Karnataka State (South India; population 64 million; accounts for 10% of India’s HIV burden) to test processes and document the results and challenges of screening presumptive TB patients for HIV within routine health care settings. Methods In this cross-sectional study conducted between January-March 2012, all presumptive TB patients attending public sector sputum microscopy centres state-wide were offered HIV testing by the laboratory technician, and referred to the nearest public sector HIV counselling and testing services, usually within the same facility. The HIV status of the patients was recorded in the routine TB laboratory form and TB laboratory register. The laboratory register was compiled to obtain the number of presumptive TB patients whose HIV status was ascertained, and the number found HIV positive. Aggregate data on reasons for non-testing were compiled at district level. Results Overall, 115,308 patients with presumptive TB were examined by sputum smear microscopy at 645 microscopy centres state-wide. Of these, HIV status was ascertained for 62,847 (55%), among whom 7,559 (12%) were HIV-positive, and of these, 3,034 (40%) were newly diagnosed. Reasons for non-testing were reported for 37,700 (72%) of the 52,461 patients without HIV testing; non-availability of testing services at the site of sputum collection was cited by health staff for 54% of these patients. Only 4% of patients opted out of HIV testing. Conclusion Offering HIV testing routinely to presumptive TB patients detected large numbers of previously undetected HIV infections. Several operational challenges were noted which provide useful lessons for improving uptake of HIV testing in this
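    The reported testing cascade can be cross-checked with a few lines of arithmetic; the counts below are copied from the abstract, and the percentages are recomputed from them:

```python
# Recompute the HIV-testing cascade percentages reported in the abstract.
examined = 115_308   # presumptive TB patients examined by sputum microscopy
tested   = 62_847    # HIV status ascertained
positive = 7_559     # HIV-positive among those tested
new_dx   = 3_034     # newly diagnosed among the positives
reasons  = 37_700    # patients with a non-testing reason reported

not_tested = examined - tested  # patients without HIV testing
pct = lambda part, whole: round(100 * part / whole)

print(not_tested)                # 52461
print(pct(tested, examined))     # 55 (% with HIV status ascertained)
print(pct(positive, tested))     # 12 (% HIV-positive)
print(pct(new_dx, positive))     # 40 (% newly diagnosed)
print(pct(reasons, not_tested))  # 72 (% with reasons reported)
```

    Each recomputed figure matches the rounded percentage given in the text.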

  19. Classroom Management. Brief

    ERIC Educational Resources Information Center

    National Education Association Research Department, 2006

    2006-01-01

    In learning-centered classrooms, the emphasis of classroom management shifts from maintaining behavioral control to fostering student engagement and self-regulation as well as community responsibility. This brief describes classroom management in "learning centered" classrooms, where practices are consistent with recent research knowledge about…

  20. Classroom Discipline. Research Roundup.

    ERIC Educational Resources Information Center

    Bielefeldt, Talbot

    1989-01-01

    Recent research in classroom discipline tends to show that discipline is a by-product of effective instruction and classroom management. The five publications reviewed in this annotated bibliography explore aspects of the complex classroom environment that relate to student discipline. Walter Doyle's chapter on "Classroom Organization and…

  1. Film Excerpts Shown to Specifically Elicit Various Affects Lead to Overlapping Activation Foci in a Large Set of Symmetrical Brain Regions in Males

    PubMed Central

    Karama, Sherif; Armony, Jorge; Beauregard, Mario

    2011-01-01

    While the limbic system theory continues to be part of common scientific parlance, its validity has been questioned on multiple grounds. Nonetheless, the issue of whether or not there exists a set of brain areas preferentially dedicated to emotional processing remains central within affective neuroscience. Recently, a widespread neural reference space for emotion which includes limbic as well as other regions was characterized in a large meta-analysis. Because methodologically heterogeneous studies feed into such meta-analyses, demonstrating in a single study, with all parameters kept constant, that overlapping areas are involved across various emotion conditions, in keeping with the neural reference space for emotion, would serve as valuable confirmatory evidence. Here, using fMRI, 20 young adult men were scanned while viewing validated neutral and emotion-eliciting short film excerpts shown to quickly and specifically elicit disgust, amusement, or sexual arousal. Each emotion-specific run included, in random order, multiple neutral and emotion condition blocks. A stringent conjunction analysis revealed a large overlap across emotion conditions that fit remarkably well with the neural reference space for emotion. This overlap included symmetrical bilateral activation of the medial prefrontal cortex, the anterior cingulate, the temporo-occipital junction, the basal ganglia, the brainstem, the amygdala, the hippocampus, the thalamus, the subthalamic nucleus, the posterior hypothalamus, the cerebellum, as well as the frontal operculum extending towards the anterior insula. This study clearly confirms, for the visual modality, that processing emotional stimuli leads to widespread increases in activation that cluster within relatively confined areas, regardless of valence. PMID:21818311
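    A conjunction analysis of the kind mentioned here is commonly implemented as a minimum-statistic test: a voxel counts as shared only if it passes the threshold in every emotion condition. A dependency-light sketch on synthetic one-dimensional "maps" (the condition names echo the abstract, but the data and the threshold are invented):

```python
import numpy as np

# Toy "statistical maps" for three emotion-vs-neutral contrasts (1-D stand-in
# for voxel data); values are random, the threshold is illustrative.
rng = np.random.default_rng(0)
maps = {
    "disgust":        rng.normal(size=1000),
    "amusement":      rng.normal(size=1000),
    "sexual_arousal": rng.normal(size=1000),
}
threshold = 1.0  # illustrative per-map significance cutoff

# Minimum-statistic conjunction: thresholding the voxelwise minimum is
# equivalent to requiring every condition to exceed the threshold.
min_map = np.minimum.reduce(list(maps.values()))
conjunction = min_map > threshold

print(int(conjunction.sum()), "voxels active across all conditions")
```

    By construction, the conjunction can never report more voxels than the sparsest individual map.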

  2. Revoicing Classrooms: A Spatial Manifesto

    ERIC Educational Resources Information Center

    Fisher, Kenn

    2004-01-01

    Why is the physical learning environment in schools largely ignored by teachers within pedagogical practice? The cellular classroom has remained seemingly immutable since the Industrial Revolution, with spatiality playing a silent and subconscious role in schooling other than related to concerns around surveillance. Previous studies have shown…

  3. The Machine in the Classroom.

    ERIC Educational Resources Information Center

    Snider, Robert C.

    1992-01-01

    Since the 1960s, the difficulty of developing a technology of instruction in public schools has proved insurmountable; results have been spotty, machines have come and gone, and classroom practices remain largely unchanged. Public clamor for reform has provided neither direction nor purpose. Technology will ultimately prevail; the problem is educating…

  4. Classroom Games: A Prisoner's Dilemma.

    ERIC Educational Resources Information Center

    Holt, Charles A.; Capra, Monica

    2000-01-01

    Describes a classroom game called the prisoner's dilemma that illustrates the conflict between social incentives to cooperate and private incentives to defect. Explains that it is a simple card game involving a large number of students. States that the students should be introduced to the real-world applications of the game. (CMK)

  5. Comparison of Two Methods for Estimating the Sampling-Related Uncertainty of Satellite Rainfall Averages Based on a Large Radar Data Set

    NASA Technical Reports Server (NTRS)

    Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.

    2002-01-01

    The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.

  6. Consistency of Toddler Engagement across Two Settings

    ERIC Educational Resources Information Center

    Aguiar, Cecilia; McWilliam, R. A.

    2013-01-01

    This study documented the consistency of child engagement across two settings, toddler child care classrooms and mother-child dyadic play. One hundred twelve children, aged 14-36 months (M = 25.17, SD = 6.06), randomly selected from 30 toddler child care classrooms from the district of Porto, Portugal, participated. Levels of engagement were…

  7. Traveling Tags: The Informal Literacies of Mexican Newcomers in and out of the Classroom

    ERIC Educational Resources Information Center

    Bruna, Katherine Richardson

    2007-01-01

    This article documents tagging as one of several informal literacy practices used by newcomer Mexican youth in a Midwest school and classroom setting. Specifically, it details how tagging travels into the classroom. Using the tool of interactional ethnography to analyze videotaped classroom observation data of an English Learner Science setting, I…

  8. The role of large strike-slip faults in a convergent continental setting - first results from the Dzhungarian Fault in Eastern Kazakhstan

    NASA Astrophysics Data System (ADS)

    Grützner, Christoph; Campbell, Grace; Elliott, Austin; Walker, Richard; Abdrakhmatov, Kanatbek

    2016-04-01

    The Tien Shan and the Dzhungarian Ala-tau mountain ranges in Eastern Kazakhstan and China take up a significant portion of the total convergence between India and Eurasia, despite the fact that they are more than 1000 km away from the actual plate boundary. Shortening is accommodated by large thrust faults that strike more or less perpendicular to the convergence vector, and by a set of conjugate strike-slip faults. Some of these strike-slip faults are major features several hundred kilometres in length and have produced great historical earthquakes. In most cases, little is known about their slip rates and earthquake history, and thus about their role in the regional tectonic setting. This study deals with the NW-SE trending Dzhungarian Fault, a right-lateral strike-slip feature more than 350 km long. It borders the Dzhungarian Ala-tau range and forms one edge of the so-called Dzhungarian Gate. The fault curves from a ~305° strike at its NW tip in Kazakhstan to a ~328° strike in China. No historical ruptures are known from the Kazakh part of the fault; a possible rupture in 1944 on the Chinese part remains debated. We used remote sensing, Structure-from-Motion (SfM), differential GPS, field mapping, and Quaternary dating of offset geological markers in order to map the fault-related morphology and to measure the slip rate of the fault at several locations along strike. We also aimed to determine the age of the last surface-rupturing earthquake, as well as earthquake recurrence intervals and magnitudes. We were further interested in the relation between horizontal and vertical motion along the fault and possible fault segmentation. Here we present first results from our 2015 survey. High-resolution digital elevation models of offset river terraces allowed us to determine the slip vector of the most recent earthquake. Preliminary dating results from abandoned fluvial terraces allow us to speculate on a late Holocene surface-rupturing event. Morphological

  9. Mendel in the Modern Classroom

    NASA Astrophysics Data System (ADS)

    Smith, Mike U.; Gericke, Niklas M.

    2015-01-01

    Mendel is an icon in the history of genetics and part of our common culture and modern biology instruction. The aim of this paper is to summarize the place of Mendel in the modern biology classroom. In the present article we will identify key issues that make Mendel relevant in the classroom today. First, we recount some of the historical controversies that have relevance to modern curricular design, such as Fisher's (Ann Sci 1:115-137, 1936/2008) claim that Mendel's data were too good to be true. We also address questions about Mendel's status as the father of genetics as well as questions about the sequencing of Mendel's work in genetics instruction in relation to modern molecular genetics and evolution. Next, we present a systematic set of examples of research-based approaches to the use of Mendel in the modern classroom along with criticisms of these designs and questions about the historical accuracy of the story of Mendel as presented in the typical classroom. Finally, we identify gaps in our understanding in need of further study and present a selected set of resources that, along with the references cited, should be valuable to science educators interested in further study of the story of Mendel.

  10. Photonics Explorer: revolutionizing photonics in the classroom

    NASA Astrophysics Data System (ADS)

    Prasad, Amrita; Debaes, Nathalie; Cords, Nina; Fischer, Robert; Vlekken, Johan; Euler, Manfred; Thienpont, Hugo

    2012-10-01

    The `Photonics Explorer' is a unique intra-curricular optics kit designed to engage, excite and educate secondary school students about the fascination of working with light - hands-on, in their own classrooms. Developed with a pan-European collaboration of experts, the kit equips teachers with class sets of experimental material provided within a supporting didactic framework, distributed in conjunction with teacher training courses. The material has been specifically designed to integrate into European science curricula. Each kit contains robust and versatile components sufficient for a class of 25-30 students to work in groups of 2-3. The didactic content is based on guided inquiry-based learning (IBL) techniques with a strong emphasis on hands-on experiments, team work and relating abstract concepts to real world applications. The content has been developed in conjunction with over 30 teachers and experts in pedagogy to ensure high quality and ease of integration. It is currently available in 7 European languages. The Photonics Explorer allows students not only to hone their essential scientific skills but also to really work as scientists and engineers in the classroom. Thus, it aims to encourage more young people to pursue scientific careers and help avert the looming shortage of scientific workforce in Europe. 50 Photonics Explorer kits have been successfully tested in 7 European countries with over 1500 secondary school students. The positive impact of the kit in the classroom has been qualitatively and quantitatively evaluated. A non-profit organisation, EYESTvzw [Excite Youth for Engineering Science and Technology], is responsible for the large scale distribution of the Photonics Explorer.

  11. The Community of Family Circles (CFC) algorithm: a new inversion approach to obtaining self-consistent 4D thermal histories from large, spatially distributed thermochronological data sets

    NASA Astrophysics Data System (ADS)

    Beucher, R.; Brown, R. W.

    2013-12-01

    One of the most significant advances in interpreting thermochronological data is arguably our ability to extract information about the rate and trajectory of cooling over a range of temperatures, rather than having to rely on the simplifying assumption of a single closure temperature specified by a rate of monotonic cooling. Modern thermochronometry data, such as apatite fission track and (U-Th)/He analysis, are particularly good examples of data amenable to this treatment, as acceptably well calibrated kinetic models now exist for both systems. With ever larger data sets of this type being generated, the prospect of jointly inverting data distributed over large areas offers new possibilities for constraining thermal and erosional histories at length scales approaching whole orogens and sub-continents. The challenge, though, is how to deal with joint inversion of multiple samples in a self-consistent manner while utilising all the information contained in the data. We describe a new approach to this problem, called the Community of Family Circles (CFC) algorithm, which extracts information from spatially distributed apatite fission track ages (AFT) and track length distributions (TLD). The method is based on the rationale that the 3D geothermal field of the crust varies smoothly through space and time because of the efficiency of thermal diffusion. Our approach consists of seeking groups of spatially adjacent samples, or families, within a given circular radius for which a common thermal history is appropriate. The temperature offsets between individual time-temperature paths are determined relative to a low-pass filtered topographic surface, whose shape is assumed to mimic the shape of the isotherms in the partial annealing zone. This enables a single common thermal history to be shared, or interpolated, between the family members while still honouring the
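    The core grouping step described above — collecting spatially adjacent samples into "families" within a circular radius, so each family can share one thermal history — can be sketched as a simple greedy partition. The coordinates, radius, and seed-selection rule below are illustrative, not the authors' exact procedure:

```python
import numpy as np

# Synthetic sample locations (km) standing in for AFT sample sites.
rng = np.random.default_rng(1)
xy = rng.uniform(0, 100, size=(50, 2))
radius = 15.0  # family circle radius (km), illustrative

# Greedy partition: pick an unassigned seed, gather every unassigned
# sample within the circle around it, repeat until all are assigned.
unassigned = set(range(len(xy)))
families = []
while unassigned:
    seed = min(unassigned)  # deterministic seed choice for the sketch
    d = np.linalg.norm(xy - xy[seed], axis=1)
    members = sorted(i for i in unassigned if d[i] <= radius)
    families.append(members)
    unassigned -= set(members)

print(len(families), "families covering", sum(map(len, families)), "samples")
```

    Every sample lands in exactly one family, and every family member lies within the circle around its seed (the first, lowest-index member).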

  12. Electrophysiological characterization of a large set of novel variants in the SCN5A-gene: identification of novel LQTS3 and BrS mutations.

    PubMed

    Ortiz-Bonnin, Beatriz; Rinné, Susanne; Moss, Robin; Streit, Anne K; Scharf, Michael; Richter, Katrin; Stöber, Anika; Pfeufer, Arne; Seemann, Gunnar; Kääb, Stefan; Beckmann, Britt-Maria; Decher, Niels

    2016-08-01

    SCN5A encodes the α-subunit of the cardiac voltage-gated sodium channel Nav1.5. Gain-of-function mutations in SCN5A are related to congenital long QT syndrome (LQTS3), characterized by delayed cardiac repolarization leading to a prolonged QT interval in the ECG. Loss-of-function mutations in SCN5A are related to Brugada syndrome (BrS), characterized by an ST-segment elevation in the right precordial leads (V1-V3). The aim of this study was the characterization of a large set of novel SCN5A variants found in patients with different cardiac phenotypes, mainly LQTS and BrS. SCN5A variants of 13 families were functionally characterized in Xenopus laevis oocytes using the two-electrode voltage-clamp technique. We found in most cases, but not all, that the electrophysiology of the variants correlated with the clinically diagnosed phenotype. A susceptibility to develop LQTS can be suggested in patients carrying the variants S216L, K480N, A572D, F816Y, and G983D. However, taking into account the phenotype, the presence of the variants in genomic databases, and the mutational segregation, combined with our in vitro and in silico experiments, the variants S216L, S262G, K480N, A572D, F816Y, G983D, and T1526P remain variants of unknown significance. In contrast, the SCN5A variants R568H and A993T can be classified as pathogenic LQTS3-causing mutations, while R222stop and R2012H are novel BrS-causing mutations. PMID:27287068

  13. Twelve tips for "flipping" the classroom.

    PubMed

    Moffett, Jennifer

    2015-04-01

    The flipped classroom is a pedagogical model in which the typical lecture and homework elements of a course are reversed. The following tips outline the steps involved in making a successful transition to a flipped classroom approach. The tips are based on the available literature alongside the author's experience of using the approach in a medical education setting. Flipping a classroom has a number of potential benefits, for example increased educator-student interaction, but must be planned and implemented carefully to support effective learning. PMID:25154646

  14. Conformal Prediction Classification of a Large Data Set of Environmental Chemicals from ToxCast and Tox21 Estrogen Receptor Assays.

    PubMed

    Norinder, Ulf; Boyer, Scott

    2016-06-20

    Quantitative structure-activity relationships (QSAR) are critical to exploitation of the chemical information in toxicology databases. Exploitation can mean extracting chemical knowledge from the data, but also making predictions for new chemicals based on quantitative analysis of past findings. In this study, we analyzed the ToxCast and Tox21 estrogen receptor data sets using conformal prediction to enhance the full exploitation of the information in these data sets. We applied aggregated conformal prediction (ACP) to the ToxCast and Tox21 estrogen receptor data sets using support vector machine classifiers to compare overall performance of the models but, more importantly, to explore the performance of ACP on data sets that are significantly enriched in one class without employing sampling strategies for the training set. ACP was also used to investigate the problem of applicability domain using both data sets. Comparison of ACP to previous results obtained on the same data sets using traditional QSAR approaches indicated similar overall balanced performance to methods in which careful training set selections were made, e.g., sensitivity and specificity for the external Tox21 data set of 70-75%, and far superior results to those obtained using traditional methods without training set sampling, where the corresponding results showed a clear imbalance of 50 and 96%, respectively. Application of conformal prediction to imbalanced data sets facilitates an unambiguous analysis of all data, allows accurate predictive models to be built which display similar accuracy in external validation as in internal validation, and, most importantly, allows an unambiguous treatment of the applicability domain. PMID:27152554
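    The aggregated conformal prediction idea referenced here can be sketched in miniature: compute class-conditional (Mondrian) conformal p-values from a calibration set, then average them over several random calibration splits. The paper used SVM classifiers; the distance-to-class-centroid nonconformity score, the synthetic imbalanced data, and the significance level below are illustrative stand-ins that keep the sketch dependency-free:

```python
import numpy as np

rng = np.random.default_rng(2)

# Imbalanced two-class toy data (echoing the class-enrichment issue).
n0, n1 = 400, 40
X = np.vstack([rng.normal(0, 1, (n0, 2)), rng.normal(3, 1, (n1, 2))])
y = np.array([0] * n0 + [1] * n1)

def mondrian_p(Xtr, ytr, Xcal, ycal, x):
    """Class-conditional (Mondrian) conformal p-values for one test point."""
    cents = {c: Xtr[ytr == c].mean(axis=0) for c in (0, 1)}
    p = {}
    for c in (0, 1):
        # Nonconformity: distance to the candidate class's centroid.
        a_cal = np.linalg.norm(Xcal[ycal == c] - cents[c], axis=1)
        a_new = np.linalg.norm(x - cents[c])
        p[c] = (np.sum(a_cal >= a_new) + 1) / (len(a_cal) + 1)
    return p

# Aggregation: average the p-values over random calibration splits.
x_test = np.array([0.1, -0.2])  # clearly class-0 territory
n_splits = 10
agg = {0: 0.0, 1: 0.0}
for _ in range(n_splits):
    idx = rng.permutation(len(X))
    cal, tr = idx[:100], idx[100:]
    p = mondrian_p(X[tr], y[tr], X[cal], y[cal], x_test)
    agg = {c: agg[c] + p[c] / n_splits for c in (0, 1)}

eps = 0.2  # significance level, illustrative
pred_set = [c for c in (0, 1) if agg[c] > eps]
print("aggregated p-values:", agg, "-> prediction set:", pred_set)
```

    The Mondrian (per-class) p-values are what let conformal prediction stay calibrated on imbalanced data without resampling the training set.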

  15. An expanded calibration study of the explicitly correlated CCSD(T)-F12b method using large basis set standard CCSD(T) atomization energies

    NASA Astrophysics Data System (ADS)

    Feller, David; Peterson, Kirk A.

    2013-08-01

    The effectiveness of the recently developed, explicitly correlated coupled cluster method CCSD(T)-F12b is examined in terms of its ability to reproduce atomization energies derived from complete basis set extrapolations of standard CCSD(T). Most of the standard method findings were obtained with aug-cc-pV7Z or aug-cc-pV8Z basis sets. For a few homonuclear diatomic molecules it was possible to push the basis set to the aug-cc-pV9Z level. F12b calculations were performed with the cc-pVnZ-F12 (n = D, T, Q) basis set sequence and were also extrapolated to the basis set limit using a Schwenke-style, parameterized formula. A systematic bias was observed in the F12b method with the (VTZ-F12/VQZ-F12) basis set combination. This bias resulted in the underestimation of reference values associated with small molecules (valence correlation energies <0.5 Eh) and an even larger overestimation of atomization energies for bigger systems. Consequently, caution should be exercised in the use of F12b for high accuracy studies. Root mean square and mean absolute deviation error metrics for this basis set combination were comparable to complete basis set values obtained with standard CCSD(T) and the aug-cc-pVDZ through aug-cc-pVQZ basis set sequence. However, the mean signed deviation was an order of magnitude larger. Problems partially due to basis set superposition error were identified with second row compounds which resulted in a weak performance for the smaller VDZ-F12/VTZ-F12 combination of basis sets.
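    A Schwenke-style two-point extrapolation, as referenced above, estimates the complete-basis-set (CBS) limit as a fixed linear combination of results from two consecutive basis sets. A minimal sketch; the coefficient value and the correlation energies are made up for illustration, not taken from the paper:

```python
# Schwenke-style two-point CBS extrapolation:
#   E_CBS = E_small + F * (E_large - E_small)
# where F is a coefficient fitted once per basis-set pair.
# The default F below is purely illustrative.
def cbs_two_point(e_small: float, e_large: float, F: float = 1.36) -> float:
    return e_small + F * (e_large - e_small)

# Hypothetical correlation energies (hartree) from a VTZ-F12/VQZ-F12 pair.
e_vtz, e_vqz = -0.4512, -0.4588
print(f"E_CBS ~ {cbs_two_point(e_vtz, e_vqz):.4f} Eh")
```

    With F > 1 the estimate extrapolates beyond the larger basis-set result, which is the intended behavior of the scheme.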

  16. Using Water-Testing Data Sets.

    ERIC Educational Resources Information Center

    Varrella, Gary F.

    1994-01-01

    Advocates an approach to teaching environmentally related studies based on constructivism. Presents an activity that makes use of data on chemicals in the water supply, and discusses obtaining and using data sets in the classroom. (LZ)

  17. Observing Special and Regular Education Classrooms.

    ERIC Educational Resources Information Center

    Hersh, Susan B.

    The paper describes an observation instrument originally developed as a research tool to assess both the special setting and the regular classroom. The instrument can also be used in determining appropriate placement for students with learning disabilities and for programming the transfer of skills learned in the special setting to the regular…

  18. Self-Contained Classrooms. Research Brief

    ERIC Educational Resources Information Center

    Walker, Karen

    2009-01-01

    Determining the ideal academic setting in which students can be successful continues to be one of the primary goals of educators. Is there a best classroom structure in which students can be successful? Although there is research on the academic gains in the block schedule and in traditional departmentalized settings, both of which are common in…

  19. How Time Is Spent in Elementary Classrooms

    ERIC Educational Resources Information Center

    Rosenshine, Barak V.

    2015-01-01

    The Beginning Teacher Evaluation Study (BTES) provides valuable information on how time is spent in elementary classrooms. Some of the major topics are: the average minutes per day which students spend engaged in reading and math activities, student engagement rates in different settings (that is, teacher-led settings versus seatwork) and…

  20. Allowing "Artistic Agency" in the Elementary Classroom

    ERIC Educational Resources Information Center

    Rufo, David

    2011-01-01

    The author was interested in seeing what would happen if children were given more latitude when making art in school. In January 2009, he began by setting up environments in his classroom wherein he hoped his students would feel free to create self-initiated forms of artmaking. Two times each week an hour was set aside for an activity called Open…

  1. The Networked Classroom

    ERIC Educational Resources Information Center

    Roschelle, Jeremy; Penuel, William R.; Abrahamson, Louis

    2004-01-01

    Classroom networks require every student to think actively, which enhances student participation in mathematics and science. Classroom-specific networks use software designed to enhance communication between teacher and students.

  2. In the Classroom.

    ERIC Educational Resources Information Center

    French, Michael P.; Danielson, Kathy Everts

    1991-01-01

    Presents seven reading activities involving the preschool classroom writing environment, using big books and predictable books, using cereal boxes to foster emergent literacy, using editorials, visual-auditory links, reading outside the classroom, and ownership of writing. (MG)

  3. Inside the Primary Classroom.

    ERIC Educational Resources Information Center

    Simon, Brian

    1980-01-01

    Presents some of the findings of the ORACLE research program (Observational Research and Classroom Learning Evaluation), a detailed observational study of teacher-student interaction, teaching styles, and management methods within a sample of primary classrooms. (Editor/SJL)

  4. Designing Cooperative Learning in the Science Classroom: Integrating the Peer Tutoring Small Investigation Group (PTSIG) within the Model of the Six Mirrors of the Classroom Model

    ERIC Educational Resources Information Center

    Lazarowitz, Reuven; Hertz-Lazarowitz, Rachel; Khalil, Mahmood; Ron, Salit

    2013-01-01

    The model of the six mirrors of the classroom and its use in teaching biology in a cooperative learning mode were implemented in high school classrooms. In this study we present: a) The model of the six mirrors of the classroom (MSMC). b) Cooperative learning settings: 1. The Group Investigation; 2. The Jigsaw Method; and 3. Peer Tutoring in Small…

  5. Classroom Management. TESOL Classroom Practice Series

    ERIC Educational Resources Information Center

    Farrell, Thomas S. C., Ed.

    2008-01-01

    This series captures the dynamics of the contemporary ESOL classroom. It showcases state-of-the-art curricula, materials, tasks, and activities reflecting emerging trends in language education and seeks to build localized language teaching and learning theories based on teachers' and students' unique experiences in and beyond the classroom. Each…

  6. Creating Respectful Classroom Environments

    ERIC Educational Resources Information Center

    Miller, Regina; Pedro, Joan

    2006-01-01

    Respect is a critical variable in education. It is critical to each individual child in the classroom environment as well as to the teaching and learning that takes place in the classroom. Children learn by example. Where do they get their examples? This article explores the parameters of teaching and encouraging respect in classrooms for young…

  7. Competition in the Classroom

    ERIC Educational Resources Information Center

    Jameson, Daphne

    2007-01-01

    In this article, the author shares the strategy she adopted to even out the participation among her multicultural students during their classroom discussions. The author realized that her students had different concepts about the classroom and different philosophies about competition. For the Americans and Indians, the classroom was a site of…

  8. Classroom Management for Ensembles.

    ERIC Educational Resources Information Center

    Bauer, William I.

    2001-01-01

    Discusses topics essential to good classroom management for ensemble music teachers. Explores the importance of planning and preparation, good teaching practice within the classroom, and using an effective discipline plan to deal with any behavior problems in the classroom. Includes a bibliography of further resources. (CMK)

  9. Classroom Use and Utilization.

    ERIC Educational Resources Information Center

    Fink, Ira

    2002-01-01

    Discusses how classrooms are distributed by size on a campus, how well they are used, and how their use changes with faculty and student needs and desires. Details how to analyze classroom space, use, and utilization, taking into account such factors as scheduling and classroom stations. (EV)

  10. Teaching Quality across School Settings

    ERIC Educational Resources Information Center

    Cohen, Julie; Brown, Michelle

    2016-01-01

    Districts are increasingly making personnel decisions based on teachers' impact on student-achievement gains and classroom observations. In some schools, however, a teacher's practices and their students' achievement may reflect not just individual but collaborative efforts. In other settings, teachers' instruction benefits less from the insights…

  11. A large-scale, high-resolution hydrological model parameter data set for climate change impact assessment for the conterminous US

    NASA Astrophysics Data System (ADS)

    Oubeidillah, A. A.; Kao, S.-C.; Ashfaq, M.; Naz, B. S.; Tootle, G.

    2014-01-01

    To extend geographical coverage, refine spatial resolution, and improve modeling efficiency, a computation- and data-intensive effort was conducted to organize a comprehensive hydrologic data set with post-calibrated model parameters for hydro-climate impact assessment. Several key inputs for hydrologic simulation - including meteorologic forcings, soil, land class, vegetation, and elevation - were collected from multiple best-available data sources and organized for 2107 hydrologic subbasins (8-digit hydrologic units, HUC8s) in the conterminous US at refined 1/24° (~4 km) spatial resolution. Using high-performance computing for intensive model calibration, a high-resolution parameter data set was prepared for the macro-scale variable infiltration capacity (VIC) hydrologic model. The VIC simulation was driven by Daymet daily meteorological forcing and was calibrated against US Geological Survey (USGS) WaterWatch monthly runoff observations for each HUC8. The results showed that this new parameter data set may help reasonably simulate runoff at most US HUC8 subbasins. Based on this exhaustive calibration effort, it is now possible to accurately estimate the resources required for further model improvement across the entire conterminous US. We anticipate that through this hydrologic parameter data set, the repeated effort of fundamental data processing can be lessened, so that research efforts can emphasize the more challenging task of assessing climate change impacts. The pre-organized model parameter data set will be provided to interested parties to support further hydro-climate impact assessment.
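    The calibration step described above — tuning model parameters so simulated monthly runoff matches USGS WaterWatch observations — reduces, in miniature, to minimizing a misfit metric over a parameter grid. The one-parameter toy "model" and synthetic data below are a stand-in for VIC, not the actual model or data:

```python
import numpy as np

# Synthetic forcing and "observed" runoff for one basin.
rng = np.random.default_rng(3)
precip = rng.gamma(2.0, 50.0, size=120)  # 10 years of monthly precip (mm)
true_coeff = 0.35                        # hidden runoff coefficient
observed = true_coeff * precip + rng.normal(0, 5, 120)

def rmse(sim, obs):
    """Root-mean-square error between simulated and observed runoff."""
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

# Calibration: pick the parameter on a grid that minimizes the misfit.
candidates = np.linspace(0.05, 0.95, 91)
errors = [rmse(c * precip, observed) for c in candidates]
best = candidates[int(np.argmin(errors))]
print(f"calibrated runoff coefficient: {best:.2f}")
```

    Real VIC calibration searches a multi-dimensional parameter space with more sophisticated optimizers, but the objective — minimize simulated-versus-observed runoff error per subbasin — has this shape.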

  12. Becoming urban science teachers by transforming middle-school classrooms: A study of the Urban Science Education Fellows Program

    NASA Astrophysics Data System (ADS)

    Furman, Melina Gabriela

The current scenario in American education shows a large achievement and opportunity gap in science between urban children in poverty and more privileged youth. Research has shown that one essential factor that accounts for this gap is the shortage of qualified science teachers in urban schools. Teaching science in a high-poverty school presents unique challenges to beginning teachers. Limited resources and support and a significant cultural divide with their students are some of the common problems that cause many novice teachers to quit their jobs or to start enacting what has been described as "the pedagogy of poverty." In this study I looked at the case of the Urban Science Education Fellows Program. This program aimed to prepare preservice teachers (i.e. "fellows") to enact socially just science pedagogies in urban classrooms. I conducted qualitative case studies of three fellows. Fellows worked over one year with science teachers in middle-school classrooms in order to develop transformative action research studies. My analysis focused on how fellows coauthored hybrid spaces within these studies that challenged the typical ways science was taught and learned in their classrooms towards a vision of socially just teaching. By coauthoring these hybrid spaces, fellows developed grounded generativity, i.e. a capacity to create new teaching scenarios rooted in the pragmatic realities of an authentic classroom setting. Grounded generativity included building upon their pedagogical beliefs in order to improvise pedagogies with others, repositioning themselves and their students differently in the classroom, and constructing symbols of possibility to guide their practice. I proposed authentic play as the mechanism that enabled fellows to coauthor hybrid spaces. Authentic play involved contexts of moderate risk and of distributed expertise and required fellows to be positioned at the intersection of the margins and the center of the classroom community of practice.…

  13. Runoff from a cornfield as affected by tillage and corn canopy: A large-scale simulated-rainfall hydrologic data set for model testing

    NASA Astrophysics Data System (ADS)

    Wauchope, R. D.; Sumner, H. R.; Truman, C. C.; Johnson, A. W.; Dowler, C. C.; Hook, J. E.; Gascho, G. J.; Davis, J. G.; Chandler, L. D.

    1999-09-01

A rainfall simulator was used to apply 5 cm of rainfall in 2 hours to two replicate 624 m² plots at six times during each of the growing seasons of 1992 and 1993. Because the simulator generated reproducible and time-invariant rainfall intensities, the resulting 24 hydrographs reproducibly reveal the effects of tractor wheel compaction, tillage, soil reconsolidation, surface sealing, and corn canopy development. A time series data set including weather, crop development, soils properties, evapotranspiration, and antecedent soil water is available. These data should provide hydrologic modelers, particularly those interested in modeling runoff with time resolutions of <1 day, with a useful validation data set.

  14. Just in Time to Flip Your Classroom

    NASA Astrophysics Data System (ADS)

    Lasry, Nathaniel; Dugdale, Michael; Charles, Elizabeth

    2014-01-01

With advocates like Sal Khan and Bill Gates, flipped classrooms are attracting an increasing amount of media and research attention.2 We had heard Khan's TED talk and were aware of the concept of inverted pedagogies in general. Yet it really hit home when we accidentally flipped our classroom. Our objective was to better prepare our students for class. We set out to effectively move some of our course content outside of class and decided to tweak the Just-in-Time Teaching approach (JiTT).3 To our surprise, this tweak—which we like to call the flip-JiTT—ended up completely flipping our classroom. What follows is a narrative of our experience and a procedure that any teacher can use to extend JiTT to a flipped classroom.

  15. Designing for the Active Classroom

    SciTech Connect

    Wilkerson, Andrea M.; Donohue, Amy; Davis, Robert G.

    2015-02-01

The article discusses trends in classroom design and then transitions to a discussion of the future of the classroom and how the lighting industry needs to prepare to meet its needs. The OSU Classroom Building is used as an example throughout: the article first discusses how trends in classroom design were incorporated into the Classroom Building, and then discusses how future lighting systems could enhance it, a clear departure from the actual lighting design and current technology.

  16. The Social Network Classroom

    NASA Astrophysics Data System (ADS)

    Bunus, Peter

Online social networking is an important part of the everyday life of college students. Despite the increasing popularity of online social networking among students and faculty members, its educational benefits are largely untested. This paper presents our experience in using social networking applications and video content distribution websites as a complement to traditional classroom education. In particular, the solution has been based on effective adaptation, extension, and integration of Facebook, Twitter, Blogger, YouTube, and iTunes services for delivering educational material to students on mobile platforms like iPods and third-generation mobile phones. The goals of the proposed educational platform, described in this paper, are to make the learning experience more engaging, to encourage collaborative work and knowledge sharing among students, and to provide an interactive platform for the educators to reach students and deliver lecture material in a totally new way.

  17. Setting Structure and the Problem of the Match.

    ERIC Educational Resources Information Center

    Grannis, Joseph C.; Jackson, David E.

    Greene's finding that children's involvement was higher in the more pupil-controlled classrooms is in apparent conflict with CCEP's expectation that involvement is a measure independent of setting. A resolution is suggested in the study of a set of 20 all-day behavior stream observations of individual children in 3 Follow Through classrooms and…

  18. Ecological Analysis of Early Childhood Settings: Implications for Mainstreaming.

    ERIC Educational Resources Information Center

    Peterson, Karen L.

    In an effort to help developmentally delayed or disabled children succeed in an integrated or regular early childhood classroom setting, the Rural Area Model Preschool Project staff developed an ecological inventory to identify the behaviors and skills expected of preschoolers in classroom settings. The inventory was used for 2 months in eight…

  19. Prediction using step-wise L1, L2 regularization and feature selection for small data sets with large number of features

    PubMed Central

    2011-01-01

Background Machine learning methods are nowadays used for many biological prediction problems involving drugs, ligands or polypeptide segments of a protein. In order to build a prediction model, a so-called training data set of molecules with measured target properties is needed. For many such problems the size of the training data set is limited, as measurements have to be performed in a wet lab. Furthermore, the considered problems are often complex, such that it is not clear which molecular descriptors (features) may be suitable to establish a strong correlation with the target property. In many applications all available descriptors are used. This can lead to difficult machine learning problems, when thousands of descriptors are considered and only a few (e.g. fewer than a hundred) molecules are available for training. Results The CoEPrA contest provides four data sets, which are typical for biological regression problems (few molecules in the training data set and thousands of descriptors). We applied the same two-step training procedure for all four regression tasks. In the first stage, we used optimized L1 regularization to select the most relevant features. Thus, the initial set of more than 6,000 features was reduced to about 50. In the second stage, we used only the selected features from the preceding stage, applying a milder L2 regularization, which generally yielded further improvement of prediction performance. Our linear model employed a soft loss function which minimizes the influence of outliers. Conclusions The proposed two-step method showed good results on all four CoEPrA regression tasks. Thus, it may be useful for many other biological prediction problems where only a small number of molecules, described by thousands of descriptors, are available for training. PMID:22026913
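The two-step procedure in this abstract (an L1 stage that prunes thousands of descriptors down to a handful, then a milder L2 refit on the survivors) can be sketched on synthetic data. This is an illustrative reconstruction, not the authors' code: the coordinate-descent Lasso, the ridge refit, and all problem sizes and penalty values below are assumptions chosen to mimic the few-samples/many-features regime.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=50):
    """Coordinate-descent Lasso (L1 stage): soft-thresholding drives most weights to zero."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]          # partial residual excluding feature j
            rho = X[:, j] @ r / n
            norm = (X[:, j] @ X[:, j]) / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / norm
    return w

def ridge(X, y, lam):
    """Closed-form ridge regression (L2 stage): milder shrinkage on the kept features."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Synthetic task in the CoEPrA-like regime: 40 samples, 200 features, 3 informative
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 200))
true_w = np.zeros(200)
true_w[[3, 17, 60]] = [2.0, -1.5, 1.0]
y = X @ true_w + rng.normal(0, 0.1, size=40)

w1 = lasso_cd(X, y, lam=0.2)                        # stage 1: L1 feature selection
kept = np.flatnonzero(np.abs(w1) > 1e-6)            # descriptors surviving the L1 stage
w2 = ridge(X[:, kept], y, lam=0.1)                  # stage 2: L2 refit on survivors
```

The design rationale mirrors the paper's: L1 handles the selection problem that L2 alone cannot (ridge never zeroes a coefficient), while the L2 refit removes most of the downward bias that the L1 penalty puts on the selected coefficients. The abstract's robust "soft loss" is omitted here for brevity; this sketch uses plain squared error.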

  20. Rewards, Intrinsic Motivation, and Achievement in Intact Classrooms

    ERIC Educational Resources Information Center

    Luis, Melissa Ann

    2011-01-01

    The purpose of this study was to examine the effects of performance-contingent rewards in a real-world setting, namely the sixth grade math classroom. This study is significant in that it represents a field study on the effects of rewards in the classroom. The purpose of this study was to investigate what effect, if any, the choice of a reward had…