Science.gov

Sample records for large classroom setting

  1. The Utility of Concept Maps to Facilitate Higher-Level Learning in a Large Classroom Setting

    PubMed Central

    Carr-Lopez, Sian M.; Vyas, Deepti; Patel, Rajul A.; Gnesa, Eric H.

    2014-01-01

    Objective. To describe the utility of concept mapping in a cardiovascular therapeutics course within a large classroom setting. Design. Students enrolled in a cardiovascular care therapeutics course completed concept maps for each major chronic cardiovascular condition. A grading rubric was used to facilitate peer-assessment of the concept map. Assessment. Students were administered a survey at the end of the course assessing their perceptions on the usefulness of the concept maps during the course and also during APPEs to assess utility beyond the course. Question item analyses were conducted on cumulative final examinations comparing student performance on concept-mapped topics compared to nonconcept-mapped topics. Conclusion. Concept maps help to facilitate meaningful learning within the course and the majority of students utilized them beyond the course. PMID:26056408

  2. Calibrated Peer Review: A New Tool for Integrating Information Literacy Skills in Writing-Intensive Large Classroom Settings

    ERIC Educational Resources Information Center

    Fosmire, Michael

    2010-01-01

    Calibrated Peer Review[TM] (CPR) is a program that can significantly enhance the ability to integrate intensive information literacy exercises into large classroom settings. CPR is founded on a solid pedagogic base for learning, and it is formulated in such a way that information skills can easily be inserted. However, there is no mention of its…

  3. Impact of problem-based learning in a large classroom setting: student perception and problem-solving skills.

    PubMed

    Klegeris, Andis; Hurren, Heather

    2011-12-01

    Problem-based learning (PBL) can be described as a learning environment where the problem drives the learning. This technique usually involves learning in small groups, which are supervised by tutors. It is becoming evident that PBL in a small-group setting has a robust positive effect on student learning and skills, including better problem-solving skills and an increase in overall motivation. However, very little research has been done on the educational benefits of PBL in a large classroom setting. Here, we describe a PBL approach (using tutorless groups) that was introduced as a supplement to standard didactic lectures in University of British Columbia Okanagan undergraduate biochemistry classes consisting of 45-85 students. PBL was chosen as an effective method to assist students in learning biochemical and physiological processes. By monitoring student attendance and using informal and formal surveys, we demonstrated that PBL has a significant positive impact on student motivation to attend and participate in the course work. Student responses indicated that PBL is superior to traditional lecture format with regard to the understanding of course content and retention of information. We also demonstrated that student problem-solving skills are significantly improved, but additional controlled studies are needed to determine how much PBL exercises contribute to this improvement. These preliminary data indicated several positive outcomes of using PBL in a large classroom setting, although further studies aimed at assessing student learning are needed to further justify implementation of this technique in courses delivered to large undergraduate classes. PMID:22139779

  4. Active Learning in a Large Medical Classroom Setting for Teaching Renal Physiology

    ERIC Educational Resources Information Center

    Dietz, John R.; Stevenson, Frazier T.

    2011-01-01

    In this article, the authors describe an active learning exercise which has been used to replace some lecture hours in the renal portion of an integrated, organ system-based curriculum for first-year medical students. The exercise takes place in a large auditorium with ~150 students. The authors, who are faculty members, lead the discussions,…

  5. Setting Up a Classroom Business

    ERIC Educational Resources Information Center

    Morgan, Madeline L.

    1977-01-01

    Junior high school home economics students plan and operate a holiday boutique in their school. The organization, operation, and evaluation involved in setting up a simulated business in the classroom are described. (BM)

  6. A Classroom Tariff-Setting Game

    ERIC Educational Resources Information Center

    Winchester, Niven

    2006-01-01

    The author outlines a classroom tariff-setting game that allows students to explore the consequences of import tariffs imposed by large countries (countries able to influence world prices). Groups of students represent countries, which are organized into trading pairs. Each group's objective is to maximize welfare by choosing an appropriate ad…

  7. Controlling Setting Events in the Classroom

    ERIC Educational Resources Information Center

    Chan, Paula E.

    2016-01-01

    Teachers face the challenging job of differentiating instruction for the diverse needs of their students. This task is difficult enough with happy students who are eager to learn; unfortunately students often enter the classroom in a bad mood because of events that happened outside the classroom walls. These events--called setting events--can…

  8. Simulating Problem Solving and Classroom Settings

    ERIC Educational Resources Information Center

    Liedtke, Werner; Vance, James

    1978-01-01

    Some problems and classroom problem-solving settings presented in a mathematics methods course for elementary teachers are described, and some results of students' involvement in the program are illustrated. (MN)

  9. Improvement in Generic Problem-Solving Abilities of Students by Use of Tutor-Less Problem-Based Learning in a Large Classroom Setting

    ERIC Educational Resources Information Center

    Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather

    2013-01-01

    Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such…

  10. Implementing iPads in the Inclusive Classroom Setting

    ERIC Educational Resources Information Center

    Maich, Kimberly; Hall, Carmen

    2016-01-01

    This column provides practical suggestions to help guide teachers in utilizing classroom sets of iPads. Following a brief introduction to tablet technology in inclusive classrooms and the origin of these recommendations from a case study focus group, important elements of setting up classroom iPad use, from finding funding to teaching apps, are…

  12. Classroom Management: Setting Up the Classroom for Learning

    ERIC Educational Resources Information Center

    Sterling, Donna R.

    2009-01-01

    Student learning is directly related to classroom control established the first week of school (Wong and Wong 2001)--what you do the first day counts, and what you do the first 10 minutes counts even more. This article shares the advanced planning aspects of classroom management that should be in place before students enter the classroom for the…

  13. A Practical Setting of Distance Learning Classroom.

    ERIC Educational Resources Information Center

    Wang, Shousan; Buck, Lawrence

    1996-01-01

    Describes a distance-learning classroom developed and used by Central Connecticut State University for nurse training, educational statistics, mathematics, and technology courses. Discusses initial engineering, video cameras, video source switching, lighting, audio, and other technical and related aspects. Block diagrams and lists of equipment for…

  14. Tangential Floor in a Classroom Setting

    ERIC Educational Resources Information Center

    Marti, Leyla

    2012-01-01

    This article examines floor management in two classroom sessions: a task-oriented computer lesson and a literature lesson. Recordings made in the computer lesson show the organization of floor when a task is given to students. Temporary or "incipient" side floors (Jones and Thornborrow, 2004) emerge beside the main floor. In the literature lesson,…

  15. Classroom Assessment Tools and Students' Affective Stances: KFL Classroom Settings

    ERIC Educational Resources Information Center

    Byon, Andrew Sangpil

    2005-01-01

    The growth of KFL (Korean-as-a-Foreign-Language) programmes in the US college setting has been truly remarkable in the last three decades. However, despite the gradual and steady growth of the non-heritage student population, the predominant group has been heritage students in most KFL programmes. In addition, teaching these two groups of students…

  16. Student Engagement and Success in the Large Astronomy 101 Classroom

    NASA Astrophysics Data System (ADS)

    Jensen, J. B.

    2014-07-01

    The large auditorium classroom presents unique challenges to maintaining student engagement. During the fall 2012 semester, I adopted several specific strategies for increasing student engagement and reducing anonymity with the goal of maximizing student success in the large class. I measured attendance and student success in two classes, one with 300 students and one with 42, but otherwise taught as similarly as possible. While the students in the large class probably did better than they would have in a traditional lecture setting, attendance was still significantly lower in the large class, resulting in lower student success than in the small control class by all measures. I will discuss these results and compare them with classes in previous semesters, including other small classes and large Distance Education classes conducted live over remote television link.

  17. Setting Variables, Classroom Interaction, and Multiple Pupil Outcomes. Final Report.

    ERIC Educational Resources Information Center

    Soar, Robert S.; Soar, Ruth M.

    Four general problems (two substantive, two methodological) were addressed in a research project: (1) Does the nature of the pupil or the setting make a difference in the teaching style which is most effective? (2) Does the cognitive level of the learning objective make a difference? (3) How can relationships within the classroom be analyzed? and…

  18. Observation Instrument of Play Behaviour in a Classroom Setting

    ERIC Educational Resources Information Center

    Berkhout, Louise; Hoekman, Joop; Goorhuis-Brouwer, Sieneke M.

    2012-01-01

    The objective of this study was to develop an instrument to observe the play behaviour of a whole group of children from four to six years of age in a classroom setting on the basis of video recording. The instrument was developed in collaboration with experienced teachers and experts on play. Categories of play were derived from the literature…

  20. Social Studies Instruction in a Non-Classroom Setting.

    ERIC Educational Resources Information Center

    Murphy, Margaret M.

    Certain areas in the social studies can be effectively taught in a non-classroom setting. This experiment determined if, in a supermarket situation, consumer preferences (as measured in sales figures and augmented by questionnaire data) could be altered by the addition of nutritional information to the labels of sixteen items which had moderate…

  1. Enhancing Feedback via Peer Learning in Large Classrooms

    ERIC Educational Resources Information Center

    Zher, Ng Huey; Hussein, Raja Maznah Raja; Saat, Rohaida Mohd

    2016-01-01

    Feedback has been lauded as a key pedagogical tool in higher education. Unfortunately, the value of feedback falls short when being carried out in large classrooms. In this study, strategies for sustaining feedback in large classroom based on peer learning are explored. All the characteristics identified within the concept of peer learning were…

  2. Examining the Effectiveness of Team-Based Learning (TBL) in Different Classroom Settings

    ERIC Educational Resources Information Center

    Yuretich, Richard F.; Kanner, Lisa C.

    2015-01-01

    The problem of effective learning in college classrooms, especially in a large lecture setting, has been a topic of discussion for a considerable span of time. Most efforts to improve learning incorporate various forms of student-active learning, such as in-class investigations or problems, group discussions, collaborative examinations and…

  3. Radial sets: interactive visual analysis of large overlapping sets.

    PubMed

    Alsallakh, Bilal; Aigner, Wolfgang; Miksch, Silvia; Hauser, Helwig

    2013-12-01

    In many applications, data tables contain multi-valued attributes that often store the memberships of the table entities to multiple sets, such as which languages a person masters, which skills an applicant documents, or which features a product comes with. With a growing number of entities, the resulting element-set membership matrix becomes very rich in information about how these sets overlap. Many analysis tasks targeted at set-typed data are concerned with these overlaps as salient features of such data. This paper presents Radial Sets, a novel visual technique to analyze set memberships for a large number of elements. Our technique uses frequency-based representations to enable quickly finding and analyzing different kinds of overlaps between the sets, and relating these overlaps to other attributes of the table entities. Furthermore, it enables various interactions to select elements of interest, find out if they are over-represented in specific sets or overlaps, and if they exhibit a different distribution for a specific attribute compared to the rest of the elements. These interactions allow formulating highly expressive visual queries on the elements in terms of their set memberships and attribute values. As we demonstrate via two usage scenarios, Radial Sets enable revealing and analyzing a multitude of overlapping patterns between large sets, beyond the limits of state-of-the-art techniques. PMID:24051816
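
    The frequency summaries that a Radial Sets view is built from can be illustrated with a small sketch. The people and languages below are invented stand-ins for the paper's examples; the code simply inverts an element-set membership table and tabulates pairwise overlap sizes.

```python
from itertools import combinations

# Invented element-set memberships (e.g., which languages a person masters).
memberships = {
    "ana":    {"python", "sql"},
    "bruno":  {"python", "r"},
    "carla":  {"sql"},
    "dmitri": {"python", "sql", "r"},
}

# Invert the table: for each set, collect the elements that belong to it.
sets = {}
for person, langs in memberships.items():
    for lang in langs:
        sets.setdefault(lang, set()).add(person)

# Pairwise overlap sizes: the frequency-based quantities a Radial Sets
# chart encodes, here computed by brute force over all set pairs.
overlaps = {(a, b): len(sets[a] & sets[b])
            for a, b in combinations(sorted(sets), 2)}
print(overlaps)  # {('python', 'r'): 2, ('python', 'sql'): 2, ('r', 'sql'): 1}
```

    The visual technique itself then maps these counts to radial glyphs; the aggregation step above is the data-side preparation.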

  4. Observations of Children’s Interactions with Teachers, Peers, and Tasks across Preschool Classroom Activity Settings

    PubMed Central

    Booren, Leslie M.; Downer, Jason T.; Vitiello, Virginia E.

    2014-01-01

    This descriptive study examined classroom activity settings in relation to children’s observed behavior during classroom interactions, child gender, and basic teacher behavior within the preschool classroom. A total of 145 children were observed for an average of 80 minutes during 8 occasions across 2 days using the inCLASS, an observational measure that conceptualizes behavior into teacher, peer, task, and conflict interactions. Findings indicated that on average children’s interactions with teachers were higher in teacher-structured settings, such as large group. On average, children’s interactions with peers and tasks were more positive in child-directed settings, such as free choice. Children experienced more conflict during recess and routines/transitions. Finally, gender differences were observed within small group and meals. The implications of these findings might encourage teachers to be thoughtful and intentional about what types of support and resources are provided so children can successfully navigate the demands of particular settings. These findings are not meant to discourage certain teacher behaviors or imply value of certain classroom settings; instead, by providing an evidence-based picture of the conditions under which children display the most positive interactions, teachers can be more aware of choices within these settings and have a powerful way to assist in professional development and interventions. PMID:25717282

  5. Using News Articles to Build a Critical Literacy Classroom in an EFL Setting

    ERIC Educational Resources Information Center

    Park, Yujong

    2011-01-01

    This article examines an effort to support critical literacy in an English as a foreign language (EFL) setting by analyzing one college EFL reading classroom in which students read and responded to articles from "The New Yorker". Data include transcribed audiotapes of classroom interaction and interviews with students, classroom materials, and…

  7. Observations of Children's Interactions with Teachers, Peers, and Tasks across Preschool Classroom Activity Settings

    ERIC Educational Resources Information Center

    Booren, Leslie M.; Downer, Jason T.; Vitiello, Virginia E.

    2012-01-01

    Research Findings: This descriptive study examined classroom activity settings in relation to children's observed behavior during classroom interactions, child gender, and basic teacher behavior within the preschool classroom. A total of 145 children were observed for an average of 80 min during 8 occasions across 2 days using the Individualized…

  8. Lessons Learned from a Multiculturally, Economically Diverse Classroom Setting.

    ERIC Educational Resources Information Center

    Lyman, Lawrence

    For her sabbatical a professor of teacher education at Emporia State University returned to the elementary classroom after a 20-year absence to teach in a third/fourth combination classroom in the Emporia, Kansas Public Schools. The return to elementary classroom teaching provided the professor with the opportunity to utilize some of the social…

  9. On Flipping the Classroom in Large First Year Calculus Courses

    ERIC Educational Resources Information Center

    Jungic, Veselin; Kaur, Harpreet; Mulholland, Jamie; Xin, Cindy

    2015-01-01

    Over the course of two years, 2012-2014, we have implemented a "flipping" the classroom approach in three of our large enrolment first year calculus courses: differential and integral calculus for scientists and engineers. In this article we describe the details of our particular approach and share with the reader some experiences of…

  11. Peer Educators in Classroom Settings: Effective Academic Partners

    ERIC Educational Resources Information Center

    Owen, Julie E.

    2011-01-01

    Involving undergraduates in the design, delivery, and evaluation of classroom-based learning enhances student ownership of the learning environment and stimulates peer interest in the transformative possibilities of education. As bell hooks (1994) eloquently describes, the process of honoring student voices in the classroom enhances "the…

  12. The Emergence of Student Creativity in Classroom Settings: A Case Study of Elementary Schools in Korea

    ERIC Educational Resources Information Center

    Cho, Younsoon; Chung, Hye Young; Choi, Kyoulee; Seo, Choyoung; Baek, Eunjoo

    2013-01-01

    This research explores the emergence of student creativity in classroom settings, specifically within two content areas: science and social studies. Fourteen classrooms in three elementary schools in Korea were observed, and the teachers and students were interviewed. The three types of student creativity emerging in the teaching and learning…

  13. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    ERIC Educational Resources Information Center

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  15. Improving Classroom Acoustics: Utilizing Hearing-Assistive Technology and Communication Strategies in the Educational Setting.

    ERIC Educational Resources Information Center

    Crandell, Carl C.; Smaldino, Joseph J.

    1999-01-01

    This article examines acoustical, technological, and rehabilitative solutions for improving classroom acoustics and speech perception in classroom settings. These procedures include: physical acoustical modifications of the room, personal hearing aids, hearing-assistive technologies, modifications in speaker-listener distance, optimizing visual…

  16. Clickers in the large classroom: current research and best-practice tips.

    PubMed

    Caldwell, Jane E

    2007-01-01

    Audience response systems (ARS) or clickers, as they are commonly called, offer a management tool for engaging students in the large classroom. Basic elements of the technology are discussed. These systems have been used in a variety of fields and at all levels of education. Typical goals of ARS questions are discussed, as well as methods of compensating for the reduction in lecture time that typically results from their use. Examples of ARS use occur throughout the literature and often detail positive attitudes from both students and instructors, although exceptions do exist. When used in classes, ARS clickers typically have either a benign or positive effect on student performance on exams, depending on the method and extent of their use, and create a more positive and active atmosphere in the large classroom. These systems are especially valuable as a means of introducing and monitoring peer learning methods in the large lecture classroom. So that the reader may use clickers effectively in his or her own classroom, a set of guidelines for writing good questions and a list of best-practice tips have been culled from the literature and experienced users. PMID:17339389

  17. Understanding Bystander Perceptions of Cyberbullying in Inclusive Classroom Settings

    ERIC Educational Resources Information Center

    Guckert, Mary

    2013-01-01

    Cyberbullying is a pervasive problem that puts students' academic success and their ability to feel safe in school at risk. As most students with disabilities are served in inclusive classrooms, there is a growing concern that students with special needs are at an increased risk of online bullying harassment. Enhancing responsible…

  18. Twelve Practical Strategies To Prevent Behavioral Escalation in Classroom Settings.

    ERIC Educational Resources Information Center

    Shukla-Mehta, Smita; Albin, Richard W.

    2003-01-01

    Twelve practical strategies that can be used by classroom teachers to prevent behavioral escalation are discussed, including reinforce calm, know the triggers, pay attention to anything unusual, do not escalate, intervene early, know the function of problem behavior, use extinction wisely, teach prosocial behavior, and teach academic survival…

  19. Thinking Routines: Replicating Classroom Practices within Museum Settings

    ERIC Educational Resources Information Center

    Wolberg, Rochelle Ibanez; Goff, Allison

    2012-01-01

    This article describes thinking routines as tools to guide and support young children's thinking. These learning strategies, developed by Harvard University's Project Zero Classroom, actively engage students in constructing meaning while also understanding their own thinking process. The authors discuss how thinking routines can be used in both…

  20. Setting of Classroom Environments for Hearing Impaired Children

    ERIC Educational Resources Information Center

    Turan, Zerrin

    2007-01-01

    This paper aims to explain effects of acoustical environments in sound perception of hearing impaired people. Important aspects of sound and hearing impairment are explained. Detrimental factors in acoustic conditions for speech perception are mentioned. Necessary acoustic treatment in classrooms and use of FM systems to eliminate these factors…

  2. Knowledge Discovery in Large Data Sets

    SciTech Connect

    Simas, Tiago; Silva, Gabriel; Miranda, Bruno; Ribeiro, Rita

    2008-12-05

    In this work we briefly address the problem of unsupervised classification on large datasets, on the order of 100,000,000 objects. The objects are variable objects, which make up around 10% of the 1,000,000,000 astronomical objects that will be collected by the GAIA/ESA mission. We tested unsupervised classification algorithms on known datasets such as the OGLE and Hipparcos catalogs. Moreover, we are building several templates to represent the main classes of variable objects, as well as new classes, to build a synthetic dataset of this dimension. In the future we will run the GAIA satellite scanning law on these templates to obtain a testable large dataset.
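
    As a rough illustration of unsupervised classification on such catalogs, the sketch below runs a minimal k-means on two invented features (say, period and amplitude of a variable object). The abstract does not specify which algorithms or features the authors used on the OGLE and Hipparcos data, so this is a generic stand-in.

```python
# Minimal k-means sketch for unsupervised classification. Data and features
# are invented; the paper's actual pipeline is not described in the abstract.

def kmeans(points, k, iters=20):
    # Deterministic initialization: spread initial centers across the input.
    centers = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center (squared Euclidean distance).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Recompute each center as the mean of its cluster (keep old if empty).
        centers = [tuple(sum(v) / len(c) for v in zip(*c)) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Two well-separated synthetic groups stand in for distinct variability classes.
points = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15), (5.0, 4.9), (5.1, 5.2), (4.9, 5.0)]
centers, clusters = kmeans(points, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]
```

    At the scale the paper targets (10^8 objects), the same idea would of course need a mini-batch or distributed variant rather than this toy loop.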

  3. Reliability of the 5-min psychomotor vigilance task in a primary school classroom setting.

    PubMed

    Wilson, Andrew; Dollman, James; Lushington, Kurt; Olds, Timothy

    2010-08-01

    This study evaluated the reliability of the 5-min psychomotor vigilance task (PVT) in a single-sex Australian primary school. Seventy-five male students (mean age = 11.82 years, SD = 1.12) completed two 5-min PVTs using a Palm personal digital assistant (PDA) in (1) an isolated setting and (2) a classroom setting. Of this group of students, a subsample of 37 students completed a test-retest reliability trial within the classroom setting. Using a mixed-model analysis, there was no significant difference in the mean response time (RT) or number of lapses (RTs ≥ 500 msec) between the isolated and the classroom setting. There was, however, an order effect for the number of lapses in the isolated setting, with the number of lapses being greater if the isolated test was conducted second. Test-retest intraclass correlation coefficients (ICCs) in the classroom setting indicated moderate to high reliability (mean RT = .84, lapses = .59). Bland-Altman analysis showed no systematic difference between the two settings. Findings suggest that the 5-min PDA PVT is a reliable measure of sustained attention in the classroom setting in this sample of primary-aged schoolchildren. The results provide further evidence for the versatility of this measuring device for larger interventions outside the laboratory. PMID:20805597
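
    The Bland-Altman analysis mentioned above amounts to computing the bias (mean difference) between paired measurements and its 95% limits of agreement. A minimal sketch, with invented response times rather than the study's data:

```python
from statistics import mean, stdev

# Invented paired response times (ms) from the two settings; not the study's data.
isolated  = [310, 295, 330, 305, 320]
classroom = [315, 300, 325, 310, 330]

diffs = [a - b for a, b in zip(isolated, classroom)]
bias = mean(diffs)                # systematic difference between settings
spread = 1.96 * stdev(diffs)      # half-width of the 95% limits of agreement
limits = (bias - spread, bias + spread)
print(float(bias), [round(x, 1) for x in limits])  # -4.0 [-14.7, 6.7]
```

    A bias near zero with narrow limits, as the study reports, indicates no systematic difference between the settings.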

  4. Large-N in Volcano Settings: Volcanosri

    NASA Astrophysics Data System (ADS)

    Lees, J. M.; Song, W.; Xing, G.; Vick, S.; Phillips, D.

    2014-12-01

    We seek a paradigm shift in the approach we take to volcano monitoring, trading high fidelity for large numbers of sensors to increase coverage and resolution. Accessibility, danger, and the risk of equipment loss require that we develop systems that are independent and inexpensive. Furthermore, rather than simply record data on hard disk for later analysis, we desire a system that will work autonomously, capitalizing on wireless technology and in-field network analysis. To this end we are currently producing a low-cost seismic array which will incorporate, at the very basic level, seismological tools for first-cut analysis of a volcano in crisis mode. At the advanced end we expect to perform tomographic inversions in the network in near real time. Geophone (4 Hz) sensors connected to a low-cost recording system will be installed on an active volcano, where triggering, earthquake location, and velocity analysis will take place independent of human interaction. Stations are designed to be inexpensive and possibly disposable. In one of the first implementations, the seismic nodes consist of an Arduino Due processor board with an attached Seismic Shield. The Arduino Due processor board contains an Atmel SAM3X8E ARM Cortex-M3 CPU. This 32-bit 84 MHz processor can filter and perform coarse seismic event detection on a 1600-sample signal in fewer than 200 milliseconds. The Seismic Shield contains a GPS module, a 900 MHz high-power mesh network radio, an SD card, a seismic amplifier, and a 24-bit ADC. External sensors can be attached either to this 24-bit ADC or to the internal multichannel 12-bit ADC on the Arduino Due processor board, allowing the node to support multiple sensors. By utilizing a high-speed 32-bit processor, complex signal processing tasks can be performed simultaneously on multiple sensors. Using a 10 W solar panel, a second system being developed can run autonomously and collect data on 3 channels at 100 Hz for 6 months with the installed 16 GB SD card. Initial designs and test results will be presented and discussed.
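
    Coarse in-node event detection of the kind described is commonly done with a short-term/long-term average (STA/LTA) trigger. The abstract does not name the exact algorithm the firmware uses, so the sketch below is an illustrative stand-in, not the Arduino code itself:

```python
# STA/LTA trigger sketch: flag samples where recent signal energy (short-term
# average) jumps well above the background energy (long-term average).

def sta_lta_trigger(samples, sta_n=5, lta_n=50, threshold=4.0):
    """Return sample indices where the STA/LTA energy ratio exceeds threshold."""
    triggers = []
    for i in range(lta_n, len(samples)):
        sta = sum(s * s for s in samples[i - sta_n:i]) / sta_n   # recent energy
        lta = sum(s * s for s in samples[i - lta_n:i]) / lta_n   # background energy
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Quiet background with a short high-amplitude burst starting at index 100.
signal = [0.1] * 100 + [2.0] * 10 + [0.1] * 50
hits = sta_lta_trigger(signal)
print(hits[0])  # 101: the trigger fires one sample after the burst begins
```

    The windowed sums make the per-sample cost small and fixed, which is consistent with the sub-200 ms budget the abstract quotes for processing a 1600-sample signal on the Cortex-M3.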

  5. Activity Settings and Daily Routines in Preschool Classrooms: Diverse Experiences in Early Learning Settings for Low-Income Children

    PubMed Central

    Fuligni, Allison Sidle; Howes, Carollee; Huang, Yiching; Hong, Sandra Soliday; Lara-Cinisomo, Sandraluz

    2011-01-01

    This paper examines activity settings and daily classroom routines experienced by 3- and 4-year-old low-income children in public center-based preschool programs, private center-based programs, and family child care homes. Two daily routine profiles were identified using a time-sampling coding procedure: a High Free-Choice pattern in which children spent a majority of their day engaged in child-directed free-choice activity settings combined with relatively low amounts of teacher-directed activity, and a Structured-Balanced pattern in which children spent relatively equal proportions of their day engaged in child-directed free-choice activity settings and teacher-directed small- and whole-group activities. Daily routine profiles were associated with program type and curriculum use but not with measures of process quality. Children in Structured-Balanced classrooms had more opportunities to engage in language and literacy and math activities, whereas children in High Free-Choice classrooms had more opportunities for gross motor and fantasy play. Being in a Structured-Balanced classroom was associated with children’s language scores but profiles were not associated with measures of children’s math reasoning or socio-emotional behavior. Consideration of teachers’ structuring of daily routines represents a valuable way to understand nuances in the provision of learning experiences for young children in the context of current views about developmentally appropriate practice and school readiness. PMID:22665945

  6. Using Self-Management Procedures to Improve Classroom Social Skills in Multiple General Education Settings

    ERIC Educational Resources Information Center

    Peterson, Lloyd Douglas; Young, K. Richard; Salzberg, Charles L.; West, Richard P.; Hill, Mary

    2006-01-01

    This study used self-monitoring, coupled with a student/teacher matching strategy, to improve the classroom social skills of five inner-city middle school students, who were at risk for school failure. Using a multiple-probe across students and settings (class periods) design, we evaluated intervention effects in up to six different settings

  7. Corrective Feedback and Learner Uptake in Communicative Classrooms across Instructional Settings

    ERIC Educational Resources Information Center

    Sheen, YoungHee

    2004-01-01

    This paper reports similarities and differences in teachers' corrective feedback and learners' uptake across instructional settings. Four communicative classroom settings--French Immersion, Canada ESL, New Zealand ESL and Korean EFL--were examined using Lyster and Ranta's taxonomy of teachers' corrective feedback moves and learner uptake. The…

  8. Content-Based Instruction for English Language Learners: An Exploration across Multiple Classroom Settings

    ERIC Educational Resources Information Center

    Park, Seo Jung

    2009-01-01

    This study explored the content-based literacy instruction of English language learners (ELLs) across multiple classroom settings in U.S. elementary schools. The following research questions guided the study: (a) How are ELLs taught English in two types of instructional settings: regular content-area literacy instruction in the all-English…

  10. Compact, Convex, and Symmetric Sets Are Discs. Classroom Notes

    ERIC Educational Resources Information Center

    Lynch, Mark

    2004-01-01

    Define the centre of a parallelogram to be the intersection of its diagonals. It was shown in an earlier paper that the intersection of arbitrarily many parallelograms with the same centre is the unit disc about that centre in a metric defined using ideas from Linear Algebra. In this note, it is shown that this characterizes compact, convex sets…

  11. Generalizability and Decision Studies to Inform Observational and Experimental Research in Classroom Settings

    ERIC Educational Resources Information Center

    Bottema-Beutel, Kristen; Lloyd, Blair; Carter, Erik W.; Asmus, Jennifer M.

    2014-01-01

    Attaining reliable estimates of observational measures can be challenging in school and classroom settings, as behavior can be influenced by multiple contextual factors. Generalizability (G) studies can enable researchers to estimate the reliability of observational data, and decision (D) studies can inform how many observation sessions are…

  12. Mobile-IT Education (MIT.EDU): M-Learning Applications for Classroom Settings

    ERIC Educational Resources Information Center

    Sung, M.; Gips, J.; Eagle, N.; Madan, A.; Caneel, R.; DeVaul, R.; Bonsen, J.; Pentland, A.

    2005-01-01

    In this paper, we describe the Mobile-IT Education (MIT.EDU) system, which demonstrates the potential of using a distributed mobile device architecture for rapid prototyping of wireless mobile multi-user applications for use in classroom settings. MIT.EDU is a stable, accessible system that combines inexpensive, commodity hardware, a flexible…

  13. Improving Preschool Classroom Processes: Preliminary Findings from a Randomized Trial Implemented in Head Start Settings

    ERIC Educational Resources Information Center

    Raver, C. Cybele; Jones, Stephanie M.; Li-Grining, Christine P.; Metzger, Molly; Champion, Kina M.; Sardin, Latriese

    2008-01-01

    A primary aim of the Chicago School Readiness Project was to improve teachers' emotionally supportive classroom practices in Head Start-funded preschool settings. Using a clustered randomized controlled trial (RCT) design, the Chicago School Readiness Project randomly assigned a treatment versus control condition to 18 Head Start sites, which…

  14. Conceptualizing the Classroom of Target Students: A Qualitative Investigation of Panelists' Experiences during Standard Setting

    ERIC Educational Resources Information Center

    Hein, Serge F.; Skaggs, Gary

    2010-01-01

    Increasingly, research has focused on the cognitive processes associated with various standard-setting activities. This qualitative study involved an examination of 16 third-grade reading teachers' experiences with the cognitive task of conceptualizing an entire classroom of hypothetical target students when the single-passage bookmark method or…

  15. Performance in an Online Introductory Course in a Hybrid Classroom Setting

    ERIC Educational Resources Information Center

    Aly, Ibrahim

    2013-01-01

    This study compared the academic achievement between undergraduate students taking an introductory managerial accounting course online (N = 104) and students who took the same course in a hybrid classroom setting (N = 203). Student achievement was measured using scores from twelve weekly online assignments, two major online assignments, a final…

  16. Turkish Special Education Teachers' Implementation of Functional Analysis in Classroom Settings

    ERIC Educational Resources Information Center

    Erbas, Dilek; Yucesoy, Serife; Turan, Yasemin; Ostrosky, Michaelene M.

    2006-01-01

    Three Turkish special education teachers conducted a functional analysis to identify variables that might initiate or maintain the problem behaviors of three children with developmental disabilities. The analysis procedures were conducted in natural classroom settings. In Phase 1, following initial training in functional analysis procedures, the…

  17. Civility in the University Classroom: An Opportunity for Faculty to Set Expectations

    ERIC Educational Resources Information Center

    Ward, Chris; Yates, Dan

    2014-01-01

    This research examines the types of uncivil behaviors frequently encountered in university classrooms. These behaviors range from arriving late to class, to texting in class, to sending unprofessional emails, and they can often undermine a professor's teaching. Setting reasonable and consistent expectations is a combination of university policy,…

  18. Developing a Positive Mind-Set toward the Use of Technology for Classroom Instruction

    ERIC Educational Resources Information Center

    Okojie, Mabel C. P. O.; Olinzock, Anthony

    2006-01-01

    The aim of this paper is to examine various indicators associated with the development of a positive mind-set toward the use of technology for instruction. The paper also examines the resources available to help teachers keep pace with technological innovation. Electronic classrooms have some complexities associated with them; therefore, support…

  19. The Impact of Physical Settings on Pre-Schoolers Classroom Organization

    ERIC Educational Resources Information Center

    Tadjic, Mirko; Martinec, Miroslav; Farago, Amalija

    2015-01-01

    The physical setting plays an important role in the lives of pre-schoolers and can be an important component of children's experience and development when it is wisely and meaningfully designed. The classroom organization enhances and supports the pre-schooler's capability to perform activities himself, to initiate and finish tasks, and creates the…

  20. How Passive-Aggressive Behavior in Emotionally Disturbed Children Affects Peer Interactions in a Classroom Setting.

    ERIC Educational Resources Information Center

    Hardt, Janet

    Passive-aggressive behavior in an emotionally disturbed child affects the child's progress and affects peer interactions in classroom settings. Passive-aggressive personalities are typically helpless, dependent, impulsive, overly anxious, poorly oriented to reality, and procrastinating. The characteristics of passive-aggressive children need to be…

  1. Analysis of Two Early Childhood Education Settings: Classroom Variables and Peer Verbal Interaction

    ERIC Educational Resources Information Center

    Hojnoski, Robin L.; Margulies, Allison S.; Barry, Amberly; Bose-Deakins, Jillaynne; Sumara, Kimberly M.; Harman, Jennifer L.

    2008-01-01

    Descriptive and ecobehavioral analyses were used to explore the daily activity contexts in classroom settings reflecting two distinct models of early childhood education. Activity context, social configurations, teacher behavior, and child behavior were explored, with specific consideration given to peer verbal behavior as an indicator of social…

  2. A Collaborative Model for Developing Classroom Management Skills in Urban Professional Development School Settings

    ERIC Educational Resources Information Center

    Dobler, Elizabeth; Kesner, Cathy; Kramer, Rebecca; Resnik, Marilyn; Devin, Libby

    2009-01-01

    This article describes a school-university partnership that focuses on the development of classroom management skills for preservice teachers in an urban setting, through collaboration between mentors, principals, and a university supervisor. To prepare preservice teachers for the unique challenges of urban schools, three key elements were…

  3. Use of Big-Screen Films in Multiple Childbirth Education Classroom Settings

    PubMed Central

    Kaufman, Tamara

    2010-01-01

    Although two recent films, Orgasmic Birth and Pregnant in America, were intended for the big screen, they can also serve as valuable teaching resources in multiple childbirth education settings. Each film conveys powerful messages about birth and today's birthing culture. Depending on a childbirth educator's classroom setting (hospital, birthing center, or home birth environment), particular portions in each film, along with extra clips featured on the films' DVDs, can enhance an educator's curriculum and spark compelling discussions with class participants. PMID:21358831

  4. Teaching Methodology in a "Large Power Distance" Classroom: A South Korean Context

    ERIC Educational Resources Information Center

    Jambor, Paul Z.

    2005-01-01

    This paper looks at South Korea as an example of a collectivist society having a rather large power distance dimension value. In a traditional Korean classroom the teacher is at the top of the classroom hierarchy, while the students are the passive participants. Gender and age play a role in the hierarchy between students themselves. Teaching…

  5. Strategies for Engaging FCS Learners in a Large-Format Classroom: Embedded Videos

    ERIC Educational Resources Information Center

    Leslie, Catherine Amoroso

    2014-01-01

    This article presents a method for utilizing technology to increase student engagement in large classroom formats. In their lives outside the classroom, students spend considerable time interfacing with media, and they are receptive to information conveyed in electronic formats. Research has shown that multimedia is an effective learning resource;…

  7. INTERIOR VIEW, SETTING LARGE CORE WITH ASSISTANCE FROM THE OVERHEAD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW, SETTING LARGE CORE WITH ASSISTANCE FROM THE OVERHEAD RAIL CRANE IN BOX FLOOR MOLD AREA (WORKERS: DAN T. WELLS AND TRUMAN CARLISLE). - Stockham Pipe & Fittings Company, Ductile Iron Foundry, 4000 Tenth Avenue North, Birmingham, Jefferson County, AL

  8. Adaptive, multiresolution visualization of large data sets using parallel octrees.

    SciTech Connect

    Freitag, L. A.; Loy, R. M.

    1999-06-10

    The interactive visualization and exploration of large scientific data sets is a challenging and difficult task; their size often far exceeds the performance and memory capacity of even the most powerful graphics workstations. To address this problem, we have created a technique that combines hierarchical data reduction methods with parallel computing to allow interactive exploration of large data sets while retaining full-resolution capability. The hierarchical representation is built in parallel by strategically inserting field data into an octree data structure. We provide functionality that allows the user to interactively adapt the resolution of the reduced data sets so that resolution is increased in regions of interest without sacrificing local graphics performance. We describe the creation of the reduced data sets using a parallel octree, the software architecture of the system, and the performance of this system on the data from a Rayleigh-Taylor instability simulation.
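The hierarchical reduction described above rests on inserting point data into an octree, where a cell subdivides into eight children once it holds too many samples. A minimal serial sketch of that data structure; the class name and leaf capacity are illustrative assumptions, and the actual system is parallel and stores field values, not bare points.

```python
# Minimal octree sketch of the hierarchical-reduction idea: points are
# inserted into a cube that splits into eight children when a leaf exceeds
# a capacity. Names and capacity are illustrative, not the system's API.

class Octree:
    def __init__(self, center, half, capacity=4):
        self.center, self.half, self.capacity = center, half, capacity
        self.points, self.children = [], None

    def _child_index(self, p):
        cx, cy, cz = self.center
        # One bit per axis: which side of the center the point falls on.
        return (p[0] >= cx) | ((p[1] >= cy) << 1) | ((p[2] >= cz) << 2)

    def insert(self, p):
        if self.children is None:
            self.points.append(p)
            if len(self.points) > self.capacity:
                self._subdivide()
            return
        self.children[self._child_index(p)].insert(p)

    def _subdivide(self):
        h = self.half / 2
        cx, cy, cz = self.center
        self.children = [
            Octree((cx + (h if i & 1 else -h),
                    cy + (h if i & 2 else -h),
                    cz + (h if i & 4 else -h)), h, self.capacity)
            for i in range(8)
        ]
        for q in self.points:  # push stored points down one level
            self.children[self._child_index(q)].insert(q)
        self.points = []

    def depth(self):
        if self.children is None:
            return 1
        return 1 + max(c.depth() for c in self.children)

# Five clustered points force two levels of subdivision in one octant.
tree = Octree((0.0, 0.0, 0.0), 1.0)
for v in (0.1, 0.2, 0.3, 0.4, 0.5):
    tree.insert((v, v, v))
```

Resolution adapts naturally: densely sampled regions produce deeper subtrees, which is what lets the viewer refine only regions of interest.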

  9. Looking at large data sets using binned data plots

    SciTech Connect

    Carr, D.B.

    1990-04-01

    This report addresses the monumental challenge of developing exploratory analysis methods for large data sets. The goals of the report are to increase awareness of large-data-set problems and to contribute simple graphical methods that address some of them. The graphical methods focus on two- and three-dimensional data and common tasks such as finding outliers and tail structure, assessing central structure, and comparing central structures. The methods handle large-sample-size problems through binning, incorporate information from statistical models, and adapt image processing algorithms. Examples demonstrate the application of the methods to a variety of publicly available large data sets. The most novel application addresses the "too many plots to examine" problem by using cognostics, computer-guiding diagnostics, to prioritize plots. The particular application prioritizes views of computational fluid dynamics solution sets on the fly. That is, as each time step of a solution set is generated on a parallel processor, the cognostics algorithms assess virtual plots based on the previous time step. Work in such areas is in its infancy and the examples suggest numerous challenges that remain. 35 refs., 15 figs.
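The core of a binned data plot is tallying points on a coarse grid instead of drawing each point, so plot cost depends on grid size rather than sample size. A small sketch of 2D binning; the grid resolution and ranges are illustrative choices, not the report's.

```python
# Sketch of the binning idea behind "binned data plots": rather than
# scatter-plotting millions of points, tally counts on a coarse 2D grid
# and render the grid. Grid size and ranges are illustrative.

def bin2d(xs, ys, nx, ny, xlim, ylim):
    """Count points falling into an nx-by-ny grid over the given ranges."""
    counts = [[0] * nx for _ in range(ny)]
    (x0, x1), (y0, y1) = xlim, ylim
    for x, y in zip(xs, ys):
        if x0 <= x < x1 and y0 <= y < y1:
            i = int((x - x0) / (x1 - x0) * nx)
            j = int((y - y0) / (y1 - y0) * ny)
            counts[j][i] += 1
    return counts

xs = [0.05, 0.15, 0.15, 0.95]
ys = [0.05, 0.15, 0.18, 0.95]
counts = bin2d(xs, ys, nx=10, ny=10, xlim=(0.0, 1.0), ylim=(0.0, 1.0))
```

Outliers and tail structure show up as isolated occupied bins far from the central mass, which is what the report's graphical methods exploit.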

  10. Classrooms.

    ERIC Educational Resources Information Center

    Butin, Dan

    This paper addresses classroom design trends and the key issues schools should consider for better classroom space flexibility and adaptability. Classroom space design issues when schools embrace technology are discussed, as are design considerations when rooms must accommodate different grade levels, the importance of lighting, furniture…

  11. Treatment of psychotic children in a classroom environment: I. Learning in a large group.

    PubMed

    Koegel, R L; Rincover, A

    1974-01-01

    The purpose of this study was to investigate systematically the feasibility of modifying the behavior of autistic children in a classroom environment. In the first experiment, eight autistic children were taught certain basic classroom behaviors (including attending to the teacher upon command, imitation, and an elementary speaking and recognition vocabulary) that were assumed to be necessary for subsequent learning to take place in the classroom. Based on research documenting the effectiveness of one-to-one (teacher-child ratio) procedures for modifying such behaviors, these behaviors were taught in one-to-one sessions. It was, however, found that behaviors taught in a one-to-one setting were not performed consistently in a classroom-sized group, or even in a group as small as two children with one teacher. Further, the children evidenced no acquisition of new behaviors in a classroom environment over a four-week period. Therefore, Experiment II introduced a treatment procedure based upon "fading in" the classroom stimulus situation from the one-to-one stimulus situation. Such treatment was highly effective in producing both a transfer in stimulus control and the acquisition of new behaviors in a kindergarten/first-grade classroom environment. PMID:4465373

  12. Reducing Information Overload in Large Seismic Data Sets

    SciTech Connect

    HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.; CARR,DORTHE B.; AGUILAR-CHANG,JULIO

    2000-08-02

    Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g., single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site.
As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially provide a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics are identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets. The tools could also be used as a means to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize, and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. All of these software tools would provide the researcher with unprecedented power without having to learn the intricacies and complexities of relational database systems.
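The dendrogram tool's core step, grouping waveforms by correlation, can be sketched as a correlation distance plus an agglomerative merge. The functions below are simplified stand-ins for the MatSeis implementation, whose actual interface is not shown in the abstract; the cutoff value and the greedy single-linkage loop are illustrative.

```python
# Sketch of waveform-correlation clustering of the kind the dendrogram
# tool performs: traces are compared by normalized correlation, converted
# to a distance, and merged single-linkage style. Simplified stand-in.

def corr_distance(a, b):
    """1 - normalized correlation coefficient between two equal-length traces."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    da, db = [x - ma for x in a], [x - mb for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = (sum(x * x for x in da) * sum(y * y for y in db)) ** 0.5
    return 1.0 - num / den

def single_linkage(traces, cutoff):
    """Greedily merge clusters whose closest members fall within the cutoff."""
    clusters = [[i] for i in range(len(traces))]
    merged = True
    while merged:
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(corr_distance(traces[a], traces[b])
                        for a in clusters[i] for b in clusters[j])
                if d <= cutoff:
                    clusters[i] += clusters.pop(j)
                    merged = True
                    break
            if merged:
                break
    return clusters

# Two near-identical "events" and one dissimilar trace.
t1 = [0, 1, 2, 1, 0, -1, -2, -1]
t2 = [0, 1.1, 2.1, 0.9, 0, -1, -2.2, -1]
t3 = [2, -2, 2, -2, 2, -2, 2, -2]
clusters = single_linkage([t1, t2, t3], cutoff=0.1)
```

The similar pair merges into one cluster while the dissimilar trace stays alone, mirroring how highly correlated events group together in a dendrogram.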

  13. Intelligent Archiving and Physics Mining of Large Data Sets (Invited)

    NASA Astrophysics Data System (ADS)

    Karimabadi, H.

    2009-12-01

    There are unique challenges in all aspects related to large data sets, from storage, search and access, to analysis and file sharing. With few exceptions, the adoption of the latest technologies to deal with the management and mining of large data sets has been slow in heliosciences. Web services such as CDAweb have been very successful and have been widely adopted by the community. There are also significant efforts going towards Virtual Observatories (VxOs). The main thrust of VxOs has so far been on data discovery, aggregation and uniform presentation. While work remains, many VxOs can now be used to access data. However, data is not knowledge, and the challenge of extracting physics from the large data sets remains. Here we review our efforts on (i) implementing advanced data mining techniques as part of the data-to-knowledge discovery pipeline, and (ii) using a social networking paradigm in the development of a science collaboratory environment that enables sharing of large files and creation of projects, among other features. We will present new data mining software that works on a variety of data formats and demonstrate its capability through several examples of analysis of spacecraft data. The use of such techniques in intelligent archiving will be discussed. Finally, the use of our science collaboratory service and its unique sharing features, such as universal accessibility of staged files, will be illustrated.

  14. Spatial compounding of large sets of 3D echocardiography images

    NASA Astrophysics Data System (ADS)

    Yao, Cheng; Simpson, John M.; Jansen, Christian H. P.; King, Andrew P.; Penney, Graeme P.

    2009-02-01

    We present novel methodologies for compounding large numbers of 3D echocardiography volumes. Our aim is to investigate the effect of using an increased number of images, and to compare the performance of different compounding methods on image quality. Three sets of 3D echocardiography images were acquired from three volunteers. Each set of data (containing 10+ images) was registered using external tracking followed by state-of-the-art image registration. Four compounding methods were investigated: mean, maximum, and two methods derived from phase-based compounding. The compounded images were compared by calculating signal-to-noise ratios and contrast at manually identified anatomical positions within the images, and by visual inspection by experienced echocardiographers. Our results indicate that signal-to-noise ratio and contrast can be improved using an increased number of images, and that a coherent compounded image can be produced using large (10+) numbers of 3D volumes.
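Two of the compounding methods compared above, mean and maximum, reduce to simple voxel-wise operations over co-registered volumes. A toy sketch; the short lists below stand in for registered 3D echo volumes, and the values are illustrative.

```python
# Minimal sketch of voxel-wise compounding: mean compounding averages
# co-registered volumes (suppressing uncorrelated speckle noise), while
# max compounding keeps the brightest sample at each voxel.

def compound_mean(volumes):
    """Voxel-wise average across registered volumes."""
    return [sum(v) / len(volumes) for v in zip(*volumes)]

def compound_max(volumes):
    """Voxel-wise maximum across registered volumes."""
    return [max(v) for v in zip(*volumes)]

# Three noisy acquisitions of the same 4-voxel region.
vols = [
    [10, 0, 5, 8],
    [12, 2, 5, 6],
    [8, 1, 5, 10],
]
mean_img = compound_mean(vols)
max_img = compound_max(vols)
```

Averaging more volumes is what drives the signal-to-noise improvement the authors report: uncorrelated noise cancels while consistent anatomy reinforces.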

  15. Interactive Web-Based Map: Applications to Large Data Sets in the Geosciences

    NASA Astrophysics Data System (ADS)

    Garbow, Z. A.; Olson, N. R.; Yuen, D. A.; Boggs, J. M.

    2001-12-01

    Current advances in computer hardware, information technology and data collection techniques have produced very large data sets, sometimes more than terabytes, in a wide variety of scientific and engineering disciplines. We must harness this opportunity to visualize and extract useful information from geophysical and geological data. We approach this data-mining task by using a map-like, client-server interface over the web for interrogating these huge data sets. The spatial data are mapped onto a two-dimensional grid from which the user (client) can quiz the data, with the map interface acting as a user extension. The data are stored on a high-end compute server. The computational gateway separating the client and the server can be the front-end of an electronic publication, an electronic classroom, a Grid system device, or e-business. We have used a combination of JAVA, JAVA-3D and Perl for processing the data and communicating between the client and the server. The user can interrogate the geospatial data over any particular region with arbitrary length scales and pose relevant statistical questions, such as histogram plots and local statistics. We have applied this method to the following data sets: (1) the distribution of prime numbers, (2) two-dimensional mantle convection, (3) three-dimensional mantle convection, (4) high-resolution satellite reflectance data over the Upper Midwest at multiple wavelengths, and (5) molecular dynamics describing the flow of blood in narrow vessels. Using this map-interface concept, the user can interrogate these data over the web. This strategy for dissecting large data sets can be easily applied to other areas, such as satellite geodesy and earthquake data. This mode of data query may function in an adequately covered wireless web environment with a transfer rate of around 10 Mbit/sec.

  16. STEME: a robust, accurate motif finder for large data sets.

    PubMed

    Reid, John E; Wernisch, Lorenz

    2014-01-01

    Motif finding is a difficult problem that has been studied for over 20 years. Some older popular motif finders are not suitable for analysis of the large data sets generated by next-generation sequencing. We recently published an efficient approximation (STEME) to the EM algorithm that is at the core of many motif finders such as MEME. This approximation allows the EM algorithm to be applied to large data sets. In this work we describe several efficient extensions to STEME that are based on the MEME algorithm. Together with the original STEME EM approximation, these extensions make STEME a fully-fledged motif finder with similar properties to MEME. We discuss the difficulty of objectively comparing motif finders. We show that STEME performs comparably to existing prominent discriminative motif finders, DREME and Trawler, on 13 sets of transcription factor binding data in mouse ES cells. We demonstrate the ability of STEME to find long degenerate motifs which these discriminative motif finders do not find. As part of our method, we extend an earlier method due to Nagarajan et al. for the efficient calculation of motif E-values. STEME's source code is available under an open source license and STEME is available via a web interface. PMID:24625410

  17. Robust Coordination for Large Sets of Simple Rovers

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Agogino, Adrian

    2006-01-01

    The ability to coordinate sets of rovers in an unknown environment is critical to the long-term success of many of NASA's exploration missions. Such coordination policies must have the ability to adapt in unmodeled or partially modeled domains and must be robust against environmental noise and rover failures. In addition, such coordination policies must accommodate a large number of rovers without excessive and burdensome hand-tuning. In this paper we present a distributed coordination method that addresses these issues in the domain of controlling a set of simple rovers. The application of these methods allows reliable and efficient robotic exploration in dangerous, dynamic, and previously unexplored domains. Most control policies for space missions are directly programmed by engineers or created through the use of planning tools, and are appropriate for single-rover missions or missions requiring the coordination of a small number of rovers. Such methods typically require significant amounts of domain knowledge and are difficult to scale to large numbers of rovers. The method described in this article aims to address cases where a large number of rovers need to coordinate to solve a complex, time-dependent problem in a noisy environment. In this approach, each rover decomposes a global utility, representing the overall goal of the system, into rover-specific utilities that properly assign credit to the rover's actions. Each rover then has the responsibility to create a control policy that maximizes its own rover-specific utility. We show a method of creating rover utilities that are "aligned" with the global utility, such that when the rovers maximize their own utility, they also maximize the global utility. In addition, we show that our method creates rover utilities that allow the rovers to create their control policies quickly and reliably.
Our distributed learning method allows large sets of rovers to be used in unmodeled domains, while providing robustness against rover failures and changing environments. In experimental simulations we show that our method scales well with large numbers of rovers, in addition to being robust against noisy sensor inputs and noisy servo control. The results show that our method is able to scale to large numbers of rovers and achieves up to 400% performance improvement over standard machine learning methods.
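The utility decomposition described above can be illustrated with a difference utility: each rover is credited with its marginal contribution to the global utility, so a rover that maximizes its own utility also raises the team's. The observation model below (sites observed, duplicates add nothing) is an illustrative assumption, not the paper's simulation.

```python
# Hedged sketch of an "aligned" rover-specific utility: the difference
# between the global utility and the global utility with rover i's
# contribution removed. Observation model is illustrative.

def global_utility(observations):
    """Value of distinct sites observed by the team (duplicates add nothing)."""
    return len(set(observations))

def rover_utility(observations, i):
    """Difference utility: marginal contribution of rover i to the global utility."""
    without_i = observations[:i] + observations[i + 1:]
    return global_utility(observations) - global_utility(without_i)

# Rovers 0 and 1 redundantly observe site "A"; rover 2 alone covers "B".
obs = ["A", "A", "B"]
```

The redundant rover receives zero credit while the rover covering a unique site receives full credit, which is exactly the alignment property: selfishly improving one's own utility cannot lower the global one.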

  18. Optimizing distance-based methods for large data sets

    NASA Astrophysics Data System (ADS)

    Scholl, Tobias; Brenner, Thomas

    2015-10-01

    Distance-based methods for measuring the spatial concentration of industries have become increasingly popular in the spatial econometrics community. However, a limiting factor for using these methods is their computational complexity, since both their memory requirements and running times are in O(n^2). In this paper, we present an algorithm with constant memory requirements and shorter running time, enabling distance-based methods to deal with large data sets. We discuss three recent distance-based methods in spatial econometrics: the D&O-Index by Duranton and Overman (Rev Econ Stud 72(4):1077-1106, 2005), the M-function by Marcon and Puech (J Econ Geogr 10(5):745-762, 2010) and the Cluster-Index by Scholl and Brenner (Reg Stud (ahead-of-print):1-15, 2014). Finally, we present an alternative calculation for the latter index that allows the use of data sets with millions of firms.
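One way to achieve the constant-memory behavior the abstract claims is to stream pairwise distances into a fixed number of histogram bins rather than materializing the O(n^2) distance matrix. A sketch under that assumption; the bin width is illustrative, and note this naive double loop keeps O(n^2) running time, whereas the paper's algorithm also shortens the running time.

```python
# Sketch of the memory idea: instead of storing all O(n^2) pairwise
# distances (as naive distance-based methods do), stream each distance
# into a fixed set of histogram bins, keeping memory constant in n.

def distance_histogram(points, bin_width, n_bins):
    """Histogram of all pairwise Euclidean distances, never stored as a matrix."""
    bins = [0] * n_bins
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = ((points[i][0] - points[j][0]) ** 2 +
                 (points[i][1] - points[j][1]) ** 2) ** 0.5
            k = min(int(d / bin_width), n_bins - 1)  # clamp to last bin
            bins[k] += 1
    return bins

# Corners of the unit square: four side pairs (d=1), two diagonals (d=sqrt(2)).
pts = [(0, 0), (0, 1), (1, 0), (1, 1)]
hist = distance_histogram(pts, bin_width=0.4, n_bins=4)
```

Kernel-smoothed densities such as the D&O-Index can then be estimated from the binned counts, which is the essence of trading exact pairwise storage for a fixed-size summary.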

  19. A large-scale crop protection bioassay data set

    PubMed Central

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J. P.; Bellis, Louisa J.; Bento, A. Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P.

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database. PMID:26175909

  1. Towards effective analysis of large grain boundary data sets

    NASA Astrophysics Data System (ADS)

    Glowinski, K.; Morawiec, A.

    2015-04-01

    Grain boundaries affect the properties of polycrystals. Novel experimental techniques for three-dimensional orientation mapping give new opportunities for studying this influence. Large networks of boundaries can be analyzed based on all five 'macroscopic' boundary parameters. We demonstrate the benefits of applying two methods for improving these analyses. The fractions of geometrically special boundaries in ferrite are estimated based on 'approximate' distances to the nearest special boundaries; using these parameters shortens the time needed to process boundary data sets. Moreover, grain-boundary distributions for nickel are obtained using kernel density estimation; this approach leads to distribution functions more accurate than those obtained by partitioning the space into bins.
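The contrast between binning and kernel density estimation can be sketched in one dimension: a Gaussian kernel centred on each observation gives a smooth estimate where a coarse histogram would be blocky. The bandwidth and kernel choice here are illustrative, not the paper's five-parameter estimator:

```python
# 1-D Gaussian kernel density estimate: average of kernels centred on the
# samples, evaluated at a query point x.
import math

def kde(samples, x, bandwidth):
    norm = 1.0 / (len(samples) * bandwidth * math.sqrt(2 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                      for s in samples)
```

Unlike a histogram, the estimate varies continuously with x and does not depend on where bin edges happen to fall.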

  2. Support vector machine classifiers for large data sets.

    SciTech Connect

    Gertz, E. M.; Griffin, J. D.

    2006-01-31

    This report concerns the generation of support vector machine classifiers for solving the pattern recognition problem in machine learning. Several methods are proposed based on interior point methods for convex quadratic programming. Software implementations are developed by adapting the object-oriented package OOQP to the problem structure and by using the software package PETSc to perform time-intensive computations in a distributed setting. Linear systems arising from classification problems with moderately large numbers of features are solved by using two techniques--one a parallel direct solver, the other a Krylov-subspace method incorporating novel preconditioning strategies. Numerical results are provided, and computational experience is discussed.
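The report solves the SVM training problem as a convex QP with interior-point methods; as a much simpler stand-in that illustrates the same objective, this sketch minimises the regularised hinge loss by subgradient descent. All parameter values are illustrative:

```python
# Train a linear SVM by subgradient descent on the regularised hinge loss
# (a simplification of the QP formulation in the report).
def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):          # labels y in {-1, +1}
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:                # point inside margin: hinge active
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:                         # only the regulariser contributes
                w = [wj - lr * lam * wj for wj in w]
    return w, b
```

The QP approach in the report finds the exact optimum of this same objective; the subgradient loop above merely approaches it.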

  3. Observations of Teacher-Child Interactions in Classrooms Serving Latinos and Dual Language Learners: Applicability of the Classroom Assessment Scoring System in Diverse Settings

    ERIC Educational Resources Information Center

    Downer, Jason T.; Lopez, Michael L.; Grimm, Kevin J.; Hamagami, Aki; Pianta, Robert C.; Howes, Carollee

    2012-01-01

    With the rising number of Latino and dual language learner (DLL) children attending pre-k and the importance of assessing the quality of their experiences in those settings, this study examined the extent to which a commonly used assessment of teacher-child interactions, the Classroom Assessment Scoring System (CLASS), demonstrated similar…

  5. Setting-level influences on implementation of the responsive classroom approach.

    PubMed

    Wanless, Shannon B; Patton, Christine L; Rimm-Kaufman, Sara E; Deutsch, Nancy L

    2013-02-01

    We used mixed methods to examine the association between setting-level factors and observed implementation of a social and emotional learning intervention (Responsive Classroom® approach; RC). In study 1 (N = 33 3rd grade teachers after the first year of RC implementation), we identified relevant setting-level factors and uncovered the mechanisms through which they related to implementation. In study 2 (N = 50 4th grade teachers after the second year of RC implementation), we validated our most salient Study 1 finding across multiple informants. Findings suggested that teachers perceived setting-level factors, particularly principal buy-in to the intervention and individualized coaching, as influential to their degree of implementation. Further, we found that intervention coaches' perspectives of principal buy-in were more related to implementation than principals' or teachers' perspectives. Findings extend the application of setting theory to the field of implementation science and suggest that interventionists may want to consider particular accounts of school setting factors before determining the likelihood of schools achieving high levels of implementation. PMID:23065349

  6. Comparing Outcomes from Field and Classroom Based Settings for Undergraduate Geoscience Courses

    NASA Astrophysics Data System (ADS)

    Skinner, M. R.; Harris, R. A.; Flores, J.

    2011-12-01

    Field-based learning can be found in nearly every course offered in Geology at Brigham Young University. For example, in our Structural Geology course, field studies substitute for labs. Students collect their own data from several different structural settings of the Wasatch Range. Our curriculum also includes a two-week, sophomore-level field course that introduces students to interpreting field relations themselves and sets the stage for much of what they learn in their upper-division courses. Our senior-level six-week field geology course includes classical field mapping with exercises in petroleum and mineral exploration, environmental geology and geological hazards. Experiments with substituting field-based general education courses for those in traditional classroom settings indicate that student cognition, course enjoyment and recruiting of majors significantly increase in a field-based course. We offer a field-based introductory geology course (Geo 102) that is taught in seven, six-hour field trips during which students travel to localities of geologic interest to investigate a variety of fundamental geological problems. We compare the outcomes of Geo 102 with a traditional classroom-based geology course (Geo 101). For the comparison, both courses are taught by the same instructor, use the same text and supplementary materials and take the same exams. The results of 7 years of reporting indicate that test scores and final grades are one-half grade point higher for Geo 102 students versus those in traditional introductory courses. Student evaluations of the course are also 0.8-1.4 points higher on a scale of 1-8, and are consistently the highest in the Department and College. Other observations include increased attendance, attention and curiosity.
The latter two are measured by the number of students asking questions of other students as well as the instructors, and the total number of questions asked during class time in the field versus the classroom. Normal classroom involvement includes two or three students asking nearly all of the questions, while in Geo 102 it is closer to half the class, and not the same students each time. Not only do more individuals participate in asking questions in Geo 102, but each participant asks more questions as well. Questions asked in class are generally specific to the discussion, while field questions are commonly multidisciplinary in nature. Field-based courses also encourage more students to collaborate with each other and to integrate shared observations due to the many different aspects of the geosciences present at each site. One of the most important payoffs is the 50% increase in the number of students changing their major to geology in the field-based versus classroom-based courses. Field-based learning increases the depth of student understanding of the subjects they investigate as well as student involvement and enthusiasm in the class. The tradeoff we make for realizing significant individual and group discovery in the field is that more responsibility is placed on the student to understand the broad-based geologic concepts found in the text. The field-based approach allows students to immediately apply their learning in real-world settings.

  7. Implementing Concept-Based Learning in a Large Undergraduate Classroom

    ERIC Educational Resources Information Center

    Morse, David; Jutras, France

    2008-01-01

    An experiment explicitly introducing learning strategies to a large, first-year undergraduate cell biology course was undertaken to see whether awareness and use of strategies had a measurable impact on student performance. The construction of concept maps was selected as the strategy to be introduced because of an inherent coherence with a course…

  8. Interaction and Uptake in Large Foreign Language Classrooms

    ERIC Educational Resources Information Center

    Ekembe, Eric Enongene

    2014-01-01

    Interaction determines and affects the conditions of language acquisition, especially in contexts where exposure to the target language is limited. This is believed to be successful only within the context of small classes (Chavez, 2009). This paper examines learners' progress resulting from interaction in large classes. Using pre-, post-, and…

  9. Manifold sequencing: Efficient processing of large sets of sequencing reactions

    SciTech Connect

    Lagerkvist, A.; Stewart, J.; Lagerstroem-Fermer, M.; Landegren, U.

    1994-03-15

    Automated instruments for DNA sequencing greatly simplify data collection in the Sanger sequencing procedure. By contrast, the so-called front-end problems of preparing sequencing templates, performing sequencing reactions, and loading these on the instruments remain major obstacles to extensive sequencing projects. The authors describe here the use of a manifold support to prepare and perform sequencing reactions on large sets of templates in parallel, as well as to load the reaction products on a sequencing instrument. In this manner, all reaction steps are performed without pipetting the samples. The strategy is applied to sequencing PCR-amplified clones of the human mitochondrial D-loop and for detection of heterozygous positions in the human major histocompatibility complex class II gene HLA-DQB, amplified from genomic DNA samples. This technique will promote sequencing in a clinical context and could form the basis of more efficient genomic sequencing strategies. 24 refs., 5 figs.

  10. Fitting semiparametric random effects models to large data sets.

    PubMed

    Pennell, Michael L; Dunson, David B

    2007-10-01

    For large data sets, it can be difficult or impossible to fit models with random effects using standard algorithms due to memory limitations or high computational burdens. In addition, it would be advantageous to use the abundant information to relax assumptions, such as normality of random effects. Motivated by data from an epidemiologic study of childhood growth, we propose a 2-stage method for fitting semiparametric random effects models to longitudinal data with many subjects. In the first stage, we use a multivariate clustering method to identify G
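The first-stage idea above, reducing each subject's longitudinal record to a summary and clustering the summaries before fitting, can be sketched with a toy 1-D k-means; the summary statistic (the subject mean) and this clustering routine are illustrative stand-ins for the multivariate method in the paper:

```python
# Toy 1-D k-means over subject-level summaries: assign each value to the
# nearest centre, recompute centres as group means, repeat.
def kmeans_1d(values, k, iters=50):
    centers = sorted(values)[::max(1, len(values) // k)][:k]   # spread initial centres
    groups = [[] for _ in centers]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            nearest = min(range(len(centers)), key=lambda c: abs(v - centers[c]))
            groups[nearest].append(v)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups
```

In the two-stage scheme, the second stage would then fit the random effects model within (or across) the identified groups rather than over all subjects at once.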

  11. Visualization of large, multidimensional multivariate data sets. Phase 1

    SciTech Connect

    Lucas, R.D.; Mills, K.; Kim, Y.

    1987-09-01

    The project establishes the technical feasibility of a visualization workstation for very large data sets. The Phase 1 system consists of an IBM PC/AT with 2 Mbytes of expanded memory, frame buffer, and write-once optical disk drive. The latter provides random access to 200 Mbytes on a removable medium. Data from a supercomputer (or from any process, such as an experiment, that generates voluminous data in matrix form) can be written to this medium and easily transported (e.g., mailed) to the user's worksite. Software has been developed that will afford the user interactive visual access to these data in the form of orthogonal sections and contour surface renderings. Strategies for displaying multi-variate three-dimensional data and for producing interactive animated displays of data in three-dimensions plus time are developed.
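The "orthogonal sections" the workstation displays amount to slicing a 3-D scalar field along one axis at a fixed index. A minimal sketch, with a nested-list volume standing in for the optical-disk data set:

```python
# Extract an axis-aligned 2-D section from a 3-D volume.
def section(volume, axis, index):
    nx, ny, nz = len(volume), len(volume[0]), len(volume[0][0])
    if axis == 0:                       # y-z plane at x = index
        return [[volume[index][j][k] for k in range(nz)] for j in range(ny)]
    if axis == 1:                       # x-z plane at y = index
        return [[volume[i][index][k] for k in range(nz)] for i in range(nx)]
    return [[volume[i][j][index] for j in range(ny)] for i in range(nx)]   # x-y plane
```

Interactive browsing then reduces to re-rendering `section(volume, axis, index)` as the user sweeps the index, which only ever touches one plane of the data at a time.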

  12. Visualizing large data sets in the earth sciences

    NASA Technical Reports Server (NTRS)

    Hibbard, William; Santek, David

    1989-01-01

    The authors describe the capabilities of McIDAS, an interactive visualization system that is vastly increasing the ability of earth scientists to manage and analyze data from remote sensing instruments and numerical simulation models. McIDAS provides animated three-dimensional images and highly interactive displays. The software can manage, analyze, and visualize large data sets that span many physical variables (such as temperature, pressure, humidity, and wind speed), as well as time and three spatial dimensions. The McIDAS system manages data from at least 100 different sources. The data management tools consist of data structures for storing different data types in files, libraries of routines for accessing these data structures, system commands for performing housekeeping functions on the data files, and reformatting programs for converting external data to the system's data structures. The McIDAS tools for three-dimensional visualization of meteorological data run on an IBM mainframe and can load up to 128-frame animation sequences into the workstations. A highly interactive version of the system can provide an interactive window into data sets containing tens of millions of points produced by numerical models and remote sensing instruments. The visualizations are being used for teaching as well as by scientists.

  13. Parallel Analysis Tools for Ultra-Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Jacob, Robert; Krishna, Jayesh; Xu, Xiabing; Mickelson, Sheri; Wilde, Mike; Peterson, Kara; Bochev, Pavel; Latham, Robert; Tautges, Tim; Brown, David; Brownrigg, Richard; Haley, Mary; Shea, Dennis; Huang, Wei; Middleton, Don; Schuchardt, Karen; Yin, Jian

    2013-04-01

    While climate models have used parallelism for several years, the post-processing tools are still mostly single-threaded applications and many are closed source. These tools are becoming a bottleneck in the production of new climate knowledge when they confront terabyte-sized output from high-resolution climate models. The ParVis project is using and creating Free and Open Source tools that bring data and task parallelism to climate model analysis to enable analysis of large climate data sets. ParVis is using the Swift task-parallel language to implement a diagnostic suite that generates over 600 plots of atmospheric quantities. ParVis has also created a Parallel Gridded Analysis Library (ParGAL) which implements many common climate analysis operations in a data-parallel fashion using the Message Passing Interface. ParGAL has in turn been built on sophisticated packages for describing grids in parallel (the Mesh Oriented database, MOAB), performing vector operations on arbitrary grids (Intrepid), and reading data in parallel (PnetCDF). ParGAL is being used to implement a parallel version of the NCAR Command Language (NCL) called ParNCL. ParNCL/ParGAL not only speeds up analysis of large datasets but also allows operations to be performed on native grids, eliminating the need to transform data to latitude-longitude grids. All of the tools ParVis is creating are available as free and open source software.
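The data-parallel pattern ParGAL applies via MPI can be sketched without MPI: each "rank" reduces its own chunk of grid cells to a partial result, and a final reduce combines the partials, which is how a global mean works under an MPI_Reduce-style collective. This serial, dependency-free sketch mimics that structure; the chunking scheme is illustrative:

```python
# Map-reduce sketch of a data-parallel mean: each rank produces a partial
# (sum, count) over its chunk; a final reduce combines all partials.
def parallel_mean(values, n_ranks):
    chunk = (len(values) + n_ranks - 1) // n_ranks     # ceil division
    partials = []
    for r in range(n_ranks):           # in ParGAL each rank runs concurrently
        local = values[r * chunk:(r + 1) * chunk]
        partials.append((sum(local), len(local)))
    total, count = map(sum, zip(*partials))
    return total / count
```

Because the combine step only sees (sum, count) pairs, the result is identical regardless of how the cells are split across ranks, which is what makes the operation safe to parallelise.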

  14. The Incredible Years Teacher Classroom Management Program: Using Coaching to Support Generalization to Real-World Classroom Settings

    ERIC Educational Resources Information Center

    Reinke, Wendy M.; Stormont, Melissa; Webster-Stratton, Carolyn; Newcomer, Lori L.; Herman, Keith C.

    2012-01-01

    This article focuses on the Incredible Years Teacher Classroom Management Training (IY TCM) intervention as an example of an evidence-based program that embeds coaching within its design. First, the core features of the IY TCM program are described. Second, the IY TCM coaching model and processes utilized to facilitate high fidelity of…

  16. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.

  17. Science Teacher Beliefs and Classroom Practice Related to Constructivism in Different School Settings

    NASA Astrophysics Data System (ADS)

    Savasci, Funda; Berlin, Donna F.

    2012-02-01

    Science teacher beliefs and classroom practice related to constructivism and factors that may influence classroom practice were examined in this cross-case study. Data from four science teachers in two schools included interviews, demographic questionnaire, Classroom Learning Environment Survey (preferred/perceived), and classroom observations and documents. Using an inductive analytic approach, results suggested that the teachers embraced constructivism, but classroom observations did not confirm implementation of these beliefs for three of the four teachers. The most preferred constructivist components were personal relevance and student negotiation; the most perceived component was critical voice. Shared control was the least preferred, least perceived, and least observed constructivist component. School type, grade, student behavior/ability, curriculum/standardized testing, and parental involvement may influence classroom practice.

  18. Challenges Associated With Using Large Data Sets for Quality Assessment and Research in Clinical Settings

    PubMed Central

    Cohen, Bevin; Vawdrey, David K.; Liu, Jianfang; Caplan, David; Furuya, E. Yoko; Mis, Frederick W.; Larson, Elaine

    2015-01-01

    The rapidly expanding use of electronic records in health-care settings is generating unprecedented quantities of data available for clinical, epidemiological, and cost-effectiveness research. Several challenges are associated with using these data for clinical research, including issues surrounding access and information security, poor data quality, inconsistency of data within and across institutions, and a paucity of staff with expertise to manage and manipulate large clinical data sets. In this article, we describe our experience with assembling a data-mart and conducting clinical research using electronic data from four facilities within a single hospital network in New York City. We culled data from several electronic sources, including the institution’s admission-discharge-transfer system, cost accounting system, electronic health record, clinical data warehouse, and departmental records. The final data-mart contained information for more than 760,000 discharges occurring from 2006 through 2012. Using categories identified by the National Institutes of Health Big Data to Knowledge initiative as a framework, we outlined challenges encountered during the development and use of a domain-specific data-mart and recommend approaches to overcome these challenges. PMID:26351216
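Assembling a data-mart row of the kind described means joining records from separate source systems on a shared identifier. A minimal sketch of such a join, with invented field names (the discharge identifier and cost field are assumptions for illustration):

```python
# Join admission-discharge-transfer (ADT) records with cost-accounting
# records on a shared discharge identifier.
def build_mart(adt_rows, cost_rows):
    costs = {r["discharge_id"]: r for r in cost_rows}   # index one side
    mart = []
    for adt in adt_rows:
        cost = costs.get(adt["discharge_id"], {})       # tolerate missing rows
        mart.append({**adt, "total_cost": cost.get("total_cost")})
    return mart
```

The `.get()` fallbacks reflect one of the article's challenges: records present in one source system are often missing or inconsistent in another, so the join must not assume every discharge appears in every feed.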

  20. The Effects of a Teacher-Child Play Intervention on Classroom Compliance in Young Children in Child Care Settings

    ERIC Educational Resources Information Center

    Levine, Darren G.; Ducharme, Joseph M.

    2013-01-01

    The current study evaluated the effects of a teacher-conducted play intervention on preschool-aged children's compliance in child care settings. Study participants included 8 children ranging in age from 3 to 5 years and 5 early childhood education teachers within 5 classrooms across 5 child care centers. A combination ABAB and multiple baseline…

  1. Student-Centred Anti-Smoking Education: Comparing a Classroom-Based and an Out-of-School Setting

    ERIC Educational Resources Information Center

    Geier, Christine S.; Bogner, Franz X.

    2010-01-01

    The present study monitored a student-centred educational anti-smoking intervention with fifth graders by focusing on their cognitive achievement and intrinsic motivation. In order to assess the potential influence of the setting on self-directed learning, the intervention was conducted in two different learning environments: a classroom-based…

  2. A Case Based Analysis Preparation Strategy for Use in a Classroom Management for Inclusive Settings Course: Preliminary Observations

    ERIC Educational Resources Information Center

    Niles, William J.; Cohen, Alan

    2012-01-01

    Case based instruction (CBI) is a pedagogical option in teacher preparation growing in application but short on practical means to implement the method. This paper presents an analysis strategy and questions developed to help teacher trainees focus on classroom management issues embedded in a set of "real" cases. An analysis of teacher candidate…

  3. An Exploration Tool for Very Large Spectrum Data Sets

    NASA Astrophysics Data System (ADS)

    Carbon, Duane F.; Henze, Christopher

    2015-01-01

    We present an exploration tool for very large spectrum data sets such as the SDSS, LAMOST, and 4MOST data sets. The tool works in two stages: the first uses batch processing and the second runs interactively. The latter employs the NASA hyperwall, a configuration of 128 workstation displays (8x16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. The stellar subset of the Sloan Digital Sky Survey DR10 was chosen to show how the tool may be used. In stage one, SDSS files for 569,738 stars are processed through our data pipeline. The pipeline fits each spectrum using an iterative continuum algorithm, distinguishing emission from absorption and handling molecular absorption bands correctly. It then measures 1659 discrete atomic and molecular spectral features that were carefully preselected based on their likelihood of being visible at some spectral type. The depths relative to the local continuum at each feature wavelength are determined for each spectrum: these depths, the local S/N level, and DR10-supplied variables such as magnitudes, colors, positions, and radial velocities are the basic measured quantities used on the hyperwall. In stage two, each hyperwall panel is used to display a 2-D scatter plot showing the depth of feature A vs the depth of feature B for all of the stars. A and B change from panel to panel. The relationships between the various (A,B) strengths and any distinctive clustering are immediately apparent when examining and inter-comparing the different panels on the hyperwall. The interactive software allows the user to select the stars in any interesting region of any 2-D plot on the hyperwall, immediately rendering the same stars on all the other 2-D plots in a unique color. The process may be repeated multiple times, each selection displaying a distinctive color on all the plots. At any time, the spectra of the selected stars may be examined in detail on a connected workstation display. 
We illustrate how our approach allows us to quickly isolate and examine such interesting stellar subsets as EMP stars, CV stars and C-rich stars.
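The pipeline's basic measurement, the depth of each feature relative to the local continuum, can be sketched as the fractional dip of the flux below the continuum at the feature wavelength. The flat treatment of emission here (clipping to zero) is an illustrative simplification of the pipeline's separate handling of emission:

```python
# Fractional feature depths relative to the local continuum.
def feature_depths(fluxes, continuum):
    # Negative depths would mean emission above the continuum; the
    # pipeline treats emission separately, so clip them to zero here.
    return [max(0.0, 1.0 - f / c) for f, c in zip(fluxes, continuum)]
```

These per-feature depths are exactly the quantities scattered against one another on each hyperwall panel (depth of feature A vs depth of feature B).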

  4. Classrooms that Work: Teaching Generic Skills in Academic and Vocational Settings.

    ERIC Educational Resources Information Center

    Stasz, Cathleen; And Others

    This report documents the second of two studies on teaching and learning generic skills in high schools. It extends the earlier work by providing a model for designing classroom instruction in both academic and vocational classrooms where teaching generic skills is an instructional goal. Ethnographic field methods were used to observe, record, and…

  5. Connecting scientific research and classroom instruction: Developing authentic problem sets for the undergraduate organic chemistry curriculum

    NASA Astrophysics Data System (ADS)

    Raker, Jeffrey R.

    Reform efforts in science education have called for instructional methods and resources that mirror the practice of science. However, few methods for designing such materials have been documented in the literature. The purpose of this study was to develop problem sets for sophomore-level organic chemistry instruction. This research adapted an instructional design methodology from the science education literature for the creation of new curricular problem sets. The first phase of this study was to establish an understanding of current curricular problems in sophomore-level organic chemistry instruction. A sample of 792 problems was collected from four organic chemistry courses. These problems were assessed using three literature-reported problem typologies. Two of these problem typologies have previously been used to understand general chemistry problems; comparisons between general and organic chemistry problems were thus made. Data from this phase were used to develop a set of five problems for practicing organic chemists. The second phase of this study was to explore practicing organic chemists' experiences solving problems in the context of organic synthesis research. Eight practicing organic chemists were interviewed and asked to solve two to three of the problems developed in phase one of this research. These participants spoke of three problem types: project level, synthetic planning, and day-to-day. Three knowledge types (internal knowledge, knowledgeable others, and literature) were used in solving these problems in research practice and in the developed problems. A set of guiding factors and implications was derived from this data and the chemistry education literature for the conversion of the problems for practicing chemists to problems for undergraduate students. The five problems were then converted accordingly.
The third and last phase of this study was to explore undergraduate students' experiences solving problems in the classroom. Eight undergraduate students from four different organic chemistry courses were interviewed and asked to solve three of the problems converted at the end of phase two. Data from these interviews were used to understand the types, methods, and knowledge used by undergraduate students in the problem-solving process. Data from all three phases were used to put forward seven ideas for the development of problems for undergraduate students.

  6. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…

  8. Increasing the Writing Performance of Urban Seniors Placed At-Risk through Goal-Setting in a Culturally Responsive and Creativity-Centered Classroom

    ERIC Educational Resources Information Center

    Estrada, Brittany; Warren, Susan

    2014-01-01

    Efforts to support marginalized students require not only identifying systemic inequities, but providing a classroom infrastructure that supports the academic achievement of all students. This action research study examined the effects of implementing goal-setting strategies and emphasizing creativity in a culturally responsive classroom (CRC) on…

  9. Evaluation of Data Visualization Software for Large Astronomical Data Sets

    NASA Astrophysics Data System (ADS)

    Doyle, Matthew; Taylor, Roger S.; Kanbur, Shashi; Schofield, Damian; Donalek, Ciro; Djorgovski, Stanislav G.; Davidoff, Scott

    2016-01-01

    This study investigates the efficacy of a 3D visualization application used to classify various types of stars using data derived from large synoptic sky surveys. Evaluation methodology included a cognitive walkthrough that prompted participants to identify a specific star type (Supernovae, RR Lyrae or Eclipsing Binary) and retrieve variable information (MAD, magratio, amplitude, frequency) from the star. This study also implemented a heuristic evaluation that applied usability standards such as the Shneiderman Visual Information Seeking Mantra to the initial iteration of the application. Findings from the evaluation indicated that improvements could be made to the application by developing effective spatial organization and implementing data reduction techniques such as linking, brushing, and small multiples.

  10. Processing large remote sensing image data sets on Beowulf clusters

    USGS Publications Warehouse

    Steinwand, Daniel R.; Maddox, Brian; Beckmann, Tim; Schmidt, Gail

    2003-01-01

    High-performance computing is often concerned with the speed at which floating-point calculations can be performed. The architectures of many parallel computers and/or their network topologies are based on these investigations. Often, benchmarks resulting from these investigations are compiled with little regard to how a large dataset would move about in these systems. This part of the Beowulf study addresses that concern by looking at specific applications software and system-level modifications. Applications include an implementation of a smoothing filter for time-series data, a parallel implementation of the decision tree algorithm used in the Landcover Characterization project, a parallel Kriging algorithm used to fit point data collected in the field on invasive species to a regular grid, and modifications to the Beowulf project's resampling algorithm to handle larger, higher resolution datasets at a national scale. Systems-level investigations include a feasibility study on Flat Neighborhood Networks and modifications of that concept with Parallel File Systems.
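The smoothing filter mentioned in this abstract is not specified there; a minimal sketch of one common choice, a centered moving average over a time series (the function name and window size are illustrative assumptions, not the study's actual filter), could look like:

```python
import numpy as np

def smooth(series, window=3):
    """Centered moving-average smoothing filter (illustrative sketch).

    The Beowulf study's actual filter is not described in the abstract;
    this simply averages each sample with its neighbors using uniform
    weights that sum to 1.
    """
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="same")
```

In a cluster setting such as the one described, a long series could be split into overlapping chunks and each chunk smoothed on a separate node, which is one reason data movement matters more than raw floating-point speed.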

  11. Value-based customer grouping from large retail data sets

    NASA Astrophysics Data System (ADS)

    Strehl, Alexander; Ghosh, Joydeep

    2000-04-01

    In this paper, we propose OPOSSUM, a novel similarity-based clustering algorithm using constrained, weighted graph- partitioning. Instead of binary presence or absence of products in a market-basket, we use an extended 'revenue per product' measure to better account for management objectives. Typically the number of clusters desired in a database marketing application is only in the teens or less. OPOSSUM proceeds top-down, which is more efficient and takes a small number of steps to attain the desired number of clusters as compared to bottom-up agglomerative clustering approaches. OPOSSUM delivers clusters that are balanced in terms of either customers (samples) or revenue (value). To facilitate data exploration and validation of results we introduce CLUSION, a visualization toolkit for high-dimensional clustering problems. To enable closed loop deployment of the algorithm, OPOSSUM has no user-specified parameters. Thresholding heuristics are avoided and the optimal number of clusters is automatically determined by a search for maximum performance. Results are presented on a real retail industry data-set of several thousand customers and products, to demonstrate the power of the proposed technique.
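The "revenue per product" measure above replaces binary market-basket vectors with revenue-weighted ones; pairwise similarities between such vectors then drive the graph partitioning. The abstract does not reproduce the similarity function, but extended Jaccard similarity is a common choice in this line of work and serves as a plausible sketch (the data below is illustrative):

```python
import numpy as np

def extended_jaccard(x, y):
    """Extended Jaccard similarity between two nonnegative revenue vectors.

    Reduces to the ordinary Jaccard coefficient when x and y are binary,
    so it generalizes presence/absence market-basket similarity.
    """
    dot = np.dot(x, y)
    return dot / (np.dot(x, x) + np.dot(y, y) - dot)

# Two customers described by revenue spent per product (illustrative data).
a = np.array([12.0, 0.0, 3.5])
b = np.array([10.0, 1.0, 4.0])
similarity = extended_jaccard(a, b)
```

A full similarity matrix built this way would form the weighted graph that a balanced partitioner then splits into the desired number of customer clusters.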

  12. Classroom Management Strategies for Young Children with Challenging Behavior within Early Childhood Settings

    ERIC Educational Resources Information Center

    Jolivette, Kristine; Steed, Elizabeth A.

    2010-01-01

    Many preschool, Head Start, and kindergarten educators of young children express concern about the number of children who exhibit frequent challenging behaviors and report that managing these behaviors is difficult within these classrooms. This article describes research-based strategies with practical applications that can be used as part of…

  14. Multimodal Literacy Practices in the Indigenous Sámi Classroom: Children Navigating in a Complex Multilingual Setting

    ERIC Educational Resources Information Center

    Pietikäinen, Sari; Pitkänen-Huhta, Anne

    2013-01-01

    This article explores multimodal literacy practices in a transforming multilingual context of an indigenous and endangered Sámi language classroom. Looking at literacy practices as embedded in a complex and shifting terrain of language ideologies, language norms, and individual experiences and attitudes, we examined how multilingual Sámi children…

  15. Observing Students in Classroom Settings: A Review of Seven Coding Schemes

    ERIC Educational Resources Information Center

    Volpe, Robert J.; DiPerna, James C.; Hintze, John M.; Shapiro, Edward S.

    2005-01-01

    A variety of coding schemes are available for direct observational assessment of student classroom behavior. These instruments have been used for a number of assessment tasks including screening children in need of further evaluation for emotional and behavior problems, diagnostic assessment of emotional and behavior problems, assessment of…

  16. The Positive and Negative Effects of the Use of Humor in the Classroom Setting.

    ERIC Educational Resources Information Center

    Steele, Karen E.

    A study examined the effectiveness of humor in reducing students' stress and tension and in fostering a positive environment, thus enhancing learning. A 10-item survey assessing classroom teachers' use of humor was administered to a sample of 65 high school sophomores. Results were analyzed in terms of number…

  17. Service user involvement in pre-registration mental health nurse education classroom settings: a review of the literature.

    PubMed

    Terry, J

    2012-11-01

    Service user involvement in pre-registration nurse education is now a requirement, yet little is known about how students engage with users in the classroom, how such initiatives are being evaluated, how service users are prepared themselves to teach students, or the potential influence on clinical practice. The aim of this literature review was to bring together published articles on service user involvement in classroom settings in pre-registration mental health nurse education programmes, including their evaluations. A comprehensive review of the literature was carried out via computer search engines and the Internet, as well as a hand search of pertinent journals and references. This produced eight papers that fitted the inclusion criteria, comprising four empirical studies and four review articles, which were then reviewed using a seven-item checklist. The articles revealed a range of teaching and learning strategies had been employed, ranging from exposure to users' personal stories, to students being required to demonstrate awareness of user perspectives in case study presentations, with others involving eLearning and assessment skills initiatives. This review concludes that further longitudinal research is needed to establish the influence of user involvement in the classroom over time. PMID:22296494

  18. The Classroom Observation Schedule to Measure Intentional Communication (COSMIC): an observational measure of the intentional communication of children with autism in an unstructured classroom setting.

    PubMed

    Pasco, Greg; Gordon, Rosanna K; Howlin, Patricia; Charman, Tony

    2008-11-01

    The Classroom Observation Schedule to Measure Intentional Communication (COSMIC) was devised to provide ecologically valid outcome measures for a communication-focused intervention trial. Ninety-one children with autism spectrum disorder aged 6 years 10 months (SD 16 months) were videoed during their everyday snack, teaching and free play activities. Inter-rater reliability was high and relevant items showed significant associations with comparable items from concurrent Autism Diagnostic Observation Schedule-Generic (Lord et al. 2000, J Autism Dev Disord 30(3):205-223) assessments. In a subsample of 28 children initial differences in rates of initiations, initiated speech/vocalisation and commenting were predictive of language and communication competence 15 months later. Results suggest that the use of observational measures of intentional communication in natural settings is a valuable assessment strategy for research and clinical practice. PMID:18401692

  19. Feasibility and Acceptability of Adapting the Eating in the Absence of Hunger Assessment for Preschoolers in the Classroom Setting.

    PubMed

    Soltero, Erica G; Ledoux, Tracey; Lee, Rebecca E

    2015-12-01

    Eating in the Absence of Hunger (EAH) represents a failure to self-regulate intake leading to overconsumption. Existing research on EAH has come from the clinical setting, limiting our understanding of this behavior. The purpose of this study was to describe the adaptation of the clinical EAH paradigm for preschoolers to the classroom setting and evaluate the feasibility and acceptability of measuring EAH in the classroom. The adapted protocol was implemented in childcare centers in Houston, Texas (N=4) and Phoenix, Arizona (N=2). The protocol was feasible, economical, and time efficient, eliminating previously identified barriers to administering the EAH assessment such as limited resources and the time constraint of delivering the assessment to participants individually. Implementation challenges included difficulty in choosing palatable test snacks that were in compliance with childcare center food regulations and the limited control over the meal that was administered prior to the assessment. The adapted protocol will allow for broader use of the EAH assessment and encourage researchers to incorporate the assessment into longitudinal studies in order to further our understanding of the causes and emergence of EAH. PMID:26172567

  20. Response Grids: Practical Ways to Display Large Data Sets with High Visual Impact

    ERIC Educational Resources Information Center

    Gates, Simon

    2013-01-01

    Spreadsheets are useful for large data sets but they may be too wide or too long to print as conventional tables. Response grids offer solutions to the challenges posed by any large data set. They have wide application throughout science and for every subject and context where visual data displays are designed, within education and elsewhere.…

  2. Demands Upon Children Regarding Quality of Achievement: Standard Setting in Preschool Classrooms.

    ERIC Educational Resources Information Center

    Potter, Ellen F.

    Focusing particularly on messages transmitted by socializing agents in preschool settings, this exploratory study investigates (1) the incidence of communication events in which standards for achievement are expressed, (2) the nature of the standards, and (3) variations across settings in the nature of standard-setting events. The relationship of…

  3. Mobile-Phone-Based Classroom Response Systems: Students' Perceptions of Engagement and Learning in a Large Undergraduate Course

    ERIC Educational Resources Information Center

    Dunn, Peter K.; Richardson, Alice; Oprescu, Florin; McDonald, Christine

    2013-01-01

    Using a Classroom Response System (CRS) has been associated with positive educational outcomes, by fostering student engagement and by allowing immediate feedback to both students and instructors. This study examined a low-cost CRS (VotApedia) in a large first-year class, where students responded to questions using their mobile phones. This study…

  4. Safety and science at sea: connecting science research settings to the classroom through live video

    NASA Astrophysics Data System (ADS)

    Cohen, E.; Peart, L. W.

    2011-12-01

    Many science teachers start the year off with classroom safety topics. Annual repetition helps with mastery of this important and basic knowledge, while helping schools to meet their legal obligations for safe lab science. Although these lessons are necessary, they are often topical, rarely authentic and relatively dull. Interesting connections can, however, be drawn between the importance of safety in science classrooms and the importance of safety in academic laboratories, fieldwork, shipboard research, and commercial research. Teachers can leverage these connections through live video interactions with scientists in the field, thereby creating an authentic learning environment. During the School of Rock 2009, a professional teacher research experience aboard the Integrated Ocean Drilling Program's research vessel JOIDES Resolution, safety and nature-of-science curricula were created to help address this need. By experimenting with various topics and locations on the ship that were accessible and applicable to middle school learning, 43 highly visual "safety signs" and activities were identified and presented "live" by graduate students, teachers, and scientists, as well as the ship's mates, doctor, and technical staff. Students were exposed to realistic science process skills along with safety content from the world's only riserless, deep-sea drilling research vessel. The once-in-a-lifetime experience caused the students' eyes to brighten behind their safety glasses, especially as they recognized the same eye wash station and safety gear they have to wear and attended a ship's fire and safety drill alongside scientists in hard hats and personal flotation devices. This collaborative and replicable live-video approach will connect basic safety content and nature-of-science process skills for a memorable and authentic learning experience for students.

  5. A GA-based clustering algorithm for large data sets with mixed numeric and categorical values

    NASA Astrophysics Data System (ADS)

    Li, Jie; Gao, Xinbo; Jiao, Licheng

    2003-09-01

    In the field of data mining, one often needs to perform cluster analysis on large data sets with mixed numeric and categorical values. However, most existing clustering algorithms are efficient only for numeric data rather than mixed data sets. To this end, this paper presents a novel clustering algorithm for these mixed data sets by modifying the common cost function, the trace of the within-cluster dispersion matrix. A genetic algorithm (GA) is used to optimize the new cost function to obtain a valid clustering result. Experimental results illustrate that the GA-based clustering algorithm is feasible for large data sets with mixed numeric and categorical values.
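The modified cost function itself is not reproduced in the abstract. A common way to sketch a mixed numeric/categorical dissimilarity, in the spirit of k-prototypes rather than the paper's exact formulation (all names and the gamma weighting here are illustrative assumptions), is:

```python
import numpy as np

def mixed_distance(a_num, a_cat, b_num, b_cat, gamma=1.0):
    """Dissimilarity between two mixed-type records (illustrative sketch).

    Squared Euclidean distance on the numeric attributes plus a
    gamma-weighted count of categorical mismatches. A GA could then
    minimize the total within-cluster sum of such distances.
    """
    num_part = float(np.sum((a_num - b_num) ** 2))
    cat_part = sum(1 for u, v in zip(a_cat, b_cat) if u != v)
    return num_part + gamma * cat_part
```

With a per-record distance like this, each GA chromosome can encode a candidate assignment of records to clusters, and its fitness is the (negated) total within-cluster dissimilarity.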

  6. Operational Aspects of Dealing with the Large BaBar Data Set

    SciTech Connect

    Trunov, Artem G

    2003-06-13

    To date, the BaBar experiment has stored over 0.7PB of data in an Objectivity/DB database. Approximately half of this data set comprises simulated data, of which more than 70% has been produced at more than 20 collaborating institutes outside of SLAC. Managing such a large data set and providing access to physicists in a timely manner is a challenging and complex problem. We describe the operational aspects of managing such a large distributed data set, including importing and exporting data from geographically spread BaBar collaborators, as well as problems common to dealing with such large data sets.

  7. Variables Associated with a Sense of Classroom Community and Academic Persistence in an Urban Community College Online Setting.

    ERIC Educational Resources Information Center

    Cadieux, Cynthia P.

    This study used a self-report survey to investigate the sense of classroom community in an online community college classroom. The study also aimed to investigate the relationship between classroom community and academic performance. The author administered a 40-item questionnaire either directly or via the course web site three times during the…

  8. A Day in Third Grade: A Large-Scale Study of Classroom Quality and Teacher and Student Behavior

    ERIC Educational Resources Information Center

    Elementary School Journal, 2005

    2005-01-01

    Observations of 780 third-grade classrooms described classroom activities, child-teacher interactions, and dimensions of the global classroom environment, which were examined in relation to structural aspects of the classroom and child behavior. One child per classroom was targeted for observation in relation to classroom quality and teacher and…

  9. Selective amplification of protein-coding regions of large sets of genes using statistically designed primer sets

    SciTech Connect

    Lopez-Nieto, C.E.; Nigam, S.K. |

    1996-07-01

    We describe a novel approach to design a set of primers selective for large groups of genes. This method is based on the distribution frequency of all nucleotide combinations (octa- to decanucleotides), and the combined ability of primer pairs, based on these oligonucleotides, to detect genes. By analyzing 1000 human mRNAs, we found that a surprisingly small subset of octanucleotides is shared by a high proportion of human protein-coding region sense strands. By computer simulation of polymerase chain reactions, a set based on only 30 primers was able to detect approximately 75% of known (and presumably unknown) human protein-coding regions. To validate the method and provide experimental support for the feasibility of the more ambitious goal of targeting human protein-coding regions, we sought to apply the technique to a large protein family: G-protein coupled receptors (GPCRs). Our results indicate that there is sufficient low level homology among human coding regions to allow design of a limited set of primer pairs that can selectively target coding regions in general, as well as genomic subsets (e.g., GPCRs). The approach should be generally applicable to human coding regions, and thus provide an efficient method for analyzing much of the transcriptionally active human genome. 23 refs., 5 figs., 2 tabs.
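The core idea above, finding a small set of octanucleotides shared by many coding sequences, can be sketched as a greedy set cover over k-mer occurrence sets. The function and parameter names are illustrative, and the authors' actual selection procedure (which also considers primer-pair behavior) is more involved:

```python
from collections import defaultdict

def greedy_primer_set(sequences, k=8, n_primers=30):
    """Pick up to n_primers k-mers that together cover as many
    sequences as possible (greedy set-cover sketch)."""
    # Map each k-mer to the set of sequence indices containing it.
    kmer_hits = defaultdict(set)
    for i, seq in enumerate(sequences):
        for j in range(len(seq) - k + 1):
            kmer_hits[seq[j:j + k]].add(i)
    covered, chosen = set(), []
    for _ in range(n_primers):
        # Choose the k-mer covering the most still-uncovered sequences.
        best = max(kmer_hits, key=lambda m: len(kmer_hits[m] - covered))
        if not kmer_hits[best] - covered:
            break  # no remaining k-mer adds coverage
        chosen.append(best)
        covered |= kmer_hits[best]
    return chosen, covered
```

Run on a large set of coding sequences, `covered` estimates the fraction of genes a primer set of a given size could detect, mirroring the abstract's ~75%-with-30-primers simulation.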

  10. Assessing the Effectiveness of Inquiry-based Learning Techniques Implemented in Large Classroom Settings

    NASA Astrophysics Data System (ADS)

    Steer, D. N.; McConnell, D. A.; Owens, K.

    2001-12-01

    Geoscience and education faculty at The University of Akron jointly developed a series of inquiry-based learning modules aimed at both non-major and major student populations enrolled in introductory geology courses. These courses typically serve 2500 students per year in four to six sections of 40-160 students each. Twelve modules were developed that contained common topics and assessments appropriate to Earth Science, Environmental Geology and Physical Geology classes. All modules were designed to meet four primary learning objectives agreed upon by Department of Geology faculty. These major objectives include: 1) Improvement of student understanding of the scientific method; 2) Incorporation of problem solving strategies involving analysis, synthesis, and interpretation; 3) Development of the ability to distinguish between inferences, data and observations; and 4) Obtaining an understanding of basic processes that operate on Earth. Additional objectives that may be addressed by selected modules include: 1) The societal relevance of science; 2) Use and interpretation of quantitative data to better understand the Earth; 3) Development of the students' ability to communicate scientific results; 4) Distinguishing differences between science, religion and pseudo-science; 5) Evaluation of scientific information found in the mass media; and 6) Building interpersonal relationships through in-class group work. Student pre- and post-instruction progress was evaluated by administering a test of logical thinking, an attitude toward science survey, and formative evaluations. Scores from the logical thinking instrument were used to form balanced four-person working groups based on the students' incoming cognitive level. Groups were required to complete a series of activities and/or exercises that targeted different cognitive domains based upon Bloom's taxonomy (knowledge, comprehension, application, analysis, synthesis and evaluation of information). 
Daily assessments of knowledge-level learning included evaluations of student responses to pre- and post-instruction conceptual test questions, short group exercises and content-oriented exam questions. Higher level thinking skills were assessed when students completed exercises that required the completion of Venn diagrams, concept maps and/or evaluation rubrics both during class periods and on exams. Initial results indicate that these techniques improved student attendance significantly and improved overall retention in the course by 8-14% over traditional lecture formats. Student scores on multiple choice exam questions were slightly higher (1-3%) for students taught in the active learning environment and short answer questions showed larger gains (7%) over students' scores in a more traditional class structure.

  11. Large Atomic Natural Orbital Basis Sets for the First Transition Row Atoms

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R. (Technical Monitor)

    1994-01-01

    Large atomic natural orbital (ANO) basis sets are tabulated for Sc to Cu. The primitive sets are taken from the large sets optimized by Partridge, namely (21s 13p 8d) for Sc and Ti and (20s 12p 9d) for V to Cu. These primitive sets are supplemented with three p, one d, six f, and four g functions. The ANO sets are derived from configuration interaction density matrices constructed as the average of the lowest states derived from the 3d^n 4s^2 and 3d^(n+1) 4s^1 occupations. For Ni, the 1S (3d^10) state is included in the averaging. The choice of basis sets for molecular calculations is discussed.

  12. Toddler Subtraction with Large Sets: Further Evidence for an Analog-Magnitude Representation of Number

    ERIC Educational Resources Information Center

    Slaughter, Virginia; Kamppi, Dorian; Paynter, Jessica

    2006-01-01

    Two experiments were conducted to test the hypothesis that toddlers have access to an analog-magnitude number representation that supports numerical reasoning about relatively large numbers. Three-year-olds were presented with subtraction problems in which initial set size and proportions subtracted were systematically varied. Two sets of cookies…

  13. Intercultural Education Set Forward: Operational Strategies and Procedures in Cypriot Classrooms

    ERIC Educational Resources Information Center

    Hajisoteriou, Christina

    2012-01-01

    Teachers in Cyprus are being called upon for the first time to teach within culturally diverse educational settings. Given the substantial role teachers play in the implementation of intercultural education, this paper explores the intercultural strategies and procedures adopted by primary school teachers in Cyprus. Interviews were carried out…

  15. An Academic Approach to Stress Management for College Students in a Conventional Classroom Setting.

    ERIC Educational Resources Information Center

    Carnahan, Robert E.; And Others

    Since the identification of stress and of the relationship of individual stress responses to physical and mental health, medical and behavioral professionals have been training individuals in coping strategies. To investigate the possibility of teaching cognitive coping skills to a nonclinical population in an academic setting, 41 college students…

  16. A Classroom Exercise in Spatial Analysis Using an Imaginary Data Set.

    ERIC Educational Resources Information Center

    Kopaska-Merkel, David C.

    One skill that elementary students need to acquire is the ability to analyze spatially distributed data. In this activity students are asked to complete the following tasks: (1) plot a set of data (related to "mud-sharks"--an imaginary fish) on a map of the state of Alabama, (2) identify trends in the data, (3) make graphs using the data…

  18. BEST in CLASS: A Classroom-Based Model for Ameliorating Problem Behavior in Early Childhood Settings

    ERIC Educational Resources Information Center

    Vo, Abigail; Sutherland, Kevin S.; Conroy, Maureen A.

    2012-01-01

    As more young children enter school settings to attend early childhood programs, early childhood teachers and school psychologists have been charged with supporting a growing number of young children with chronic problem behaviors that put them at risk for the development of emotional/behavioral disorders (EBDs). There is a need for effective,…

  19. Ready, Set, SCIENCE!: Putting Research to Work in K-8 Science Classrooms

    ERIC Educational Resources Information Center

    Michaels, Sarah; Shouse, Andrew W.; Schweingruber, Heidi A.

    2007-01-01

    What types of instructional experiences help K-8 students learn science with understanding? What do science educators, teachers, teacher leaders, science specialists, professional development staff, curriculum designers, and school administrators need to know to create and support such experiences? "Ready, Set, Science!" guides the way with an…

  20. Students with ASD in Mainstream Primary Education Settings: Teachers' Experiences in Western Australian Classrooms

    ERIC Educational Resources Information Center

    Soto-Chodiman, Rebecca; Pooley, Julie Ann; Cohen, Lynne; Taylor, Myra Frances

    2012-01-01

    The shift to inclusive education within Australia has resulted in increasing numbers of students with autism spectrum disorders (ASD) being placed in mainstream educational settings. This move has created new demands on teachers who are not necessarily trained to meet the challenge. Therefore, the present study aimed to develop an understanding of…

  1. Teleconsultation in School Settings: Linking Classroom Teachers and Behavior Analysts Through Web-Based Technology

    PubMed Central

    Frieder, Jessica E; Peterson, Stephanie M; Woodward, Judy; Crane, JaeLee; Garner, Marlane

    2009-01-01

    This paper describes a technically driven, collaborative approach to assessing the function of problem behavior using web-based technology. A case example is provided to illustrate the process used in this pilot project. A school team conducted a functional analysis with a child who demonstrated challenging behaviors in a preschool setting. Behavior analysts at a university setting provided the school team with initial workshop trainings, on-site visits, e-mail and phone communication, as well as live web-based feedback on functional analysis sessions. The school personnel implemented the functional analysis with high fidelity and scored the data reliably. Outcomes of the project suggest that there is great potential for collaboration via the use of web-based technologies for ongoing assessment and development of effective interventions. However, an empirical evaluation of this model should be conducted before wide-scale adoption is recommended. PMID:22477705

  2. Using Mobile Phones to Increase Classroom Interaction

    ERIC Educational Resources Information Center

    Cobb, Stephanie; Heaney, Rose; Corcoran, Olivia; Henderson-Begg, Stephanie

    2010-01-01

    This study examines the possible benefits of using mobile phones to increase interaction and promote active learning in large classroom settings. First year undergraduate students studying Cellular Processes at the University of East London took part in a trial of a new text-based classroom interaction system and evaluated their experience by…

  3. Interdependent Learning in an Open Classroom Setting: Dean Rusk Elementary School, 1972-73. Research and Development Report, Volume 7, Number 7, August 1973.

    ERIC Educational Resources Information Center

    Goettee, Margaret

    All special programs at Dean Rusk Elementary School, funded in part under Title I of the 1965 Elementary Secondary Education Act, combined to facilitate individualized instruction in the nongraded, open classroom setting of the school. To better meet the needs of the pupils during the 1972-73 school year, the Follow Through Program included, for…

  4. Impact of Abbreviated Lecture with Interactive Mini-cases vs Traditional Lecture on Student Performance in the Large Classroom

    PubMed Central

    Nykamp, Diane L.; Momary, Kathryn M.

    2014-01-01

Objective. To compare the impact of 2 different teaching and learning methods on student mastery of learning objectives in a pharmacotherapy module in the large classroom setting. Design. Two teaching and learning methods were implemented and compared in a required pharmacotherapy module for 2 years. The first year, multiple interactive mini-cases with in-class individual assessment and an abbreviated lecture were used to teach osteoarthritis; a traditional lecture with 1 in-class case discussion was used to teach gout. In the second year, the same topics were used but the methods were flipped. Student performance on pre/post individual readiness assessment tests (iRATs), case questions, and subsequent examinations was compared each year by the teaching and learning method and then between years by topic for each method. Students also voluntarily completed a 20-item evaluation of the teaching and learning methods. Assessment. Postpresentation iRATs were significantly higher than prepresentation iRATs for each topic each year with the interactive mini-cases; there was no significant difference in iRATs before and after traditional lecture. For osteoarthritis, postpresentation iRATs after interactive mini-cases in year 1 were significantly higher than postpresentation iRATs after traditional lecture in year 2; the difference in iRATs for gout per learning method was not significant. The difference between examination performance for osteoarthritis and gout was not significant when the teaching and learning methods were compared. On the student evaluations, 2 items were significant both years when answers were compared by teaching and learning method. Each year, students ranked their class participation higher with interactive cases than with traditional lecture, but both years they reported enjoying the traditional lecture format more. Conclusion. 
Multiple interactive mini-cases with an abbreviated lecture improved immediate mastery of learning objectives compared to a traditional lecture format, regardless of therapeutic topic, but did not improve student performance on subsequent examinations. PMID:25657376

  5. Experiments and other methods for developing expertise with design of experiments in a classroom setting

    NASA Technical Reports Server (NTRS)

    Patterson, John W.

    1990-01-01

The only way to gain genuine expertise in Statistical Process Control (SPC) and the design of experiments (DOX) is with repeated practice, but not on canned problems with dead data sets. Rather, one must negotiate a wide variety of problems, each with its own peculiarities and its own constantly changing data. The problems should not be of the type for which there is a single, well-defined answer that can be looked up in a fraternity file or in some text. The problems should match as closely as possible the open-ended types for which there is always an abundance of uncertainty. These are the only kinds that arise in real research, whether that be basic research in academe or engineering research in industry. To gain this kind of experience, either as a professional consultant or as an industrial employee, takes years. Vast amounts of money, not to mention careers, must be put at risk. The purpose here is to outline some realistic simulation-type lab exercises that are so simple and inexpensive to run that the students can repeat them as often as desired at virtually no cost. Simulations also allow the instructor to design problems whose outcomes are as noisy as desired but still predictable within limits. Also, the instructor and the students can learn a great deal more from the postmortem conducted after the exercise is completed. One never knows for sure what the true data should have been when dealing only with real-life experiments. To add a bit more realism to the exercises, it is sometimes desirable to make the students pay for each experimental result from a make-believe budget allocation for the problem.
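The kind of simulation-type exercise Patterson describes can be mocked up in a few lines. Below is a minimal, hypothetical sketch (not from the paper): the instructor hides a "true" response surface behind a noisy function, and students estimate factorial effects from replicated runs without ever seeing the truth.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hidden "true" process, known only to the instructor; students probe it run by run.
def run_experiment(temp, pressure, noise=1.5):
    """One simulated run: a hidden response surface plus process noise."""
    true = 50.0 + 4.0 * temp - 2.5 * pressure + 1.2 * temp * pressure
    return true + noise * rng.standard_normal()

# A 2^2 factorial in coded units (-1/+1), replicated to average out the noise.
levels = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
replicates = 8
y = np.array([[run_experiment(t, p) for t, p in levels] for _ in range(replicates)])
means = y.mean(axis=0)

# Classical effect estimates from the factorial contrasts.
effect_temp = (means[1] + means[3] - means[0] - means[2]) / 2
effect_pres = (means[2] + means[3] - means[0] - means[1]) / 2
interaction = (means[0] + means[3] - means[1] - means[2]) / 2
print(f"temp effect ~ {effect_temp:.2f}, pressure effect ~ {effect_pres:.2f}, "
      f"interaction ~ {interaction:.2f}")
```

With the hidden coefficients above, the true effects are +8, -5, and +2.4; the postmortem can then compare the students' noisy estimates against those known values, which is exactly what real experiments never allow.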

  6. Computing Steerable Principal Components of a Large Set of Images and Their Rotations

    PubMed Central

    Ponce, Colin; Singer, Amit

    2013-01-01

    We present here an efficient algorithm to compute the Principal Component Analysis (PCA) of a large image set consisting of images and, for each image, the set of its uniform rotations in the plane. We do this by pointing out the block circulant structure of the covariance matrix and utilizing that structure to compute its eigenvectors. We also demonstrate the advantages of this algorithm over similar ones with numerical experiments. Although it is useful in many settings, we illustrate the specific application of the algorithm to the problem of cryo-electron microscopy. PMID:21536533
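The block-circulant observation in this abstract is what makes the algorithm efficient: a DFT across the block index block-diagonalizes the covariance, reducing one large eigenproblem to many small ones. A minimal numerical sketch of that trick (illustrative only, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)
n, b = 4, 3  # number of in-plane rotations, block size (both tiny for illustration)

# Build blocks B_k with B_k = B_{n-k}^T so the full block-circulant matrix is symmetric.
B = [None] * n
B[0] = rng.standard_normal((b, b))
B[0] = B[0] + B[0].T
B[1] = rng.standard_normal((b, b))
B[2] = rng.standard_normal((b, b))
B[2] = B[2] + B[2].T
B[3] = B[1].T

# Dense block-circulant covariance: block (i, j) is B[(j - i) % n].
C = np.block([[B[(j - i) % n] for j in range(n)] for i in range(n)])

# Direct route: one (n*b) x (n*b) eigenproblem, O((n*b)^3).
direct = np.sort(np.linalg.eigvalsh(C))

# Structured route: a DFT across the block index leaves n independent
# b x b Hermitian eigenproblems.
Bhat = np.fft.fft(np.stack(B, axis=0), axis=0)  # Bhat[m] = sum_k B[k] exp(-2j*pi*k*m/n)
fast = np.sort(np.concatenate([np.linalg.eigvalsh(Bhat[m]) for m in range(n)]))

print("max eigenvalue discrepancy:", np.abs(direct - fast).max())
```

The two eigenvalue lists agree to machine precision; for realistic image stacks the structured route replaces one huge factorization with many small independent ones.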

  7. Out in the Classroom: Transgender Student Experiences at a Large Public University

    ERIC Educational Resources Information Center

    Pryor, Jonathan T.

    2015-01-01

    Faculty and peer interactions are 2 of the most important relationships for college students to foster (Astin, 1993). Transgender college students have an increasing visible presence on college campuses (Pusch, 2005), yet limited research exists on their experiences and struggles in the classroom environment (Garvey & Rankin, 2015; Renn,…

  8. Out in the Classroom: Transgender Student Experiences at a Large Public University

    ERIC Educational Resources Information Center

    Pryor, Jonathan T.

    2015-01-01

    Faculty and peer interactions are 2 of the most important relationships for college students to foster (Astin, 1993). Transgender college students have an increasing visible presence on college campuses (Pusch, 2005), yet limited research exists on their experiences and struggles in the classroom environment (Garvey & Rankin, 2015; Renn,…

  9. Classroom Response Systems for Implementing "Interactive Inquiry" in Large Organic Chemistry Classes

    ERIC Educational Resources Information Center

    Morrison, Richard W.; Caughran, Joel A.; Sauers, Angela L.

    2014-01-01

    The authors have developed "sequence response applications" for classroom response systems (CRSs) that allow instructors to engage and actively involve students in the learning process, probe for common misconceptions regarding lecture material, and increase interaction between instructors and students. "Guided inquiry" and…

  10. Coaching as a Key Component in Teachers' Professional Development: Improving Classroom Practices in Head Start Settings. OPRE Report 2012-4

    ERIC Educational Resources Information Center

    Lloyd, Chrrishana M.; Modlin, Emmily L.

    2012-01-01

    Head Start CARES (Classroom-based Approaches and Resources for Emotion and Social Skill Promotion) is a large-scale, national research demonstration that was designed to test the effects of a one-year program aimed at improving pre-kindergarteners' social and emotional readiness for school. To facilitate the delivery of the program, teachers…

  11. Classroom Discourse and Reading Comprehension in Bilingual Settings: A Case Study of Collaborative Reasoning in a Chinese Heritage Language Learners' Classroom

    ERIC Educational Resources Information Center

    Tsai, Hsiao-Feng

    2012-01-01

    This dissertation examines the participation of one Chinese teacher and five 13 to 15 year-old Chinese heritage students in a classroom in a Chinese community school during group discussions about narrative texts. In this study, the teacher used Collaborative Reasoning (CR) (Anderson, et al., 2001) to help the Chinese heritage students extend…

  12. Teaching Children to Organise and Represent Large Data Sets in a Histogram

    ERIC Educational Resources Information Center

    Nisbet, Steven; Putt, Ian

    2004-01-01

    Although some bright students in primary school are able to organise numerical data into classes, most attend to the characteristics of individuals rather than the group, and "see the trees rather than the forest". How can teachers in upper primary and early high school teach students to organise large sets of data with widely varying values into…

  13. Influences of large sets of environmental exposures on immune responses in healthy adult men.

    PubMed

    Yi, Buqing; Rykova, Marina; Jäger, Gundula; Feuerecker, Matthias; Hörl, Marion; Matzel, Sandra; Ponomarev, Sergey; Vassilieva, Galina; Nichiporuk, Igor; Choukèr, Alexander

    2015-01-01

Environmental factors have long been known to influence immune responses. In particular, clinical studies about the association between migration and increased risk of atopy/asthma have provided important information on the role of migration-associated large sets of environmental exposures in the development of allergic diseases. However, investigations of environmental effects on immune responses have mostly been limited to candidate environmental exposures, such as air pollution. The influences of large sets of environmental exposures on immune responses are still largely unknown. A simulated 520-d Mars mission provided an opportunity to investigate this topic. Six healthy males lived in a closed habitat simulating a spacecraft for 520 days. When they exited their "spacecraft" after the mission, the scenario was similar to that of migration, involving exposure to a new set of environmental pollutants and allergens. We measured multiple immune parameters in blood samples at chosen time points after the mission. At the early adaptation stage, highly enhanced cytokine responses were observed upon ex vivo antigen stimulation. For cell population frequencies, we found the subjects displayed increased neutrophils. These results presumably represent the immune changes that occur in healthy humans upon migration, indicating that large sets of environmental exposures may trigger aberrant immune activity. PMID:26306804

  14. DocCube: Multi-Dimensional Visualization and Exploration of Large Document Sets.

    ERIC Educational Resources Information Center

    Mothe, Josiane; Chrisment, Claude; Dousset, Bernard; Alaux, Joel

    2003-01-01

    Describes a user interface that provides global visualizations of large document sets to help users formulate the query that corresponds to their information needs. Highlights include concept hierarchies that users can browse to specify and refine information needs; knowledge discovery in databases and texts; and multidimensional modeling.…

  15. Using Content-Specific Lyrics to Familiar Tunes in a Large Lecture Setting

    ERIC Educational Resources Information Center

    McLachlin, Derek T.

    2009-01-01

    Music can be used in lectures to increase student engagement and help students retain information. In this paper, I describe my use of biochemistry-related lyrics written to the tune of the theme to the television show, The Flintstones, in a large class setting (400-800 students). To determine student perceptions, the class was surveyed several…

  16. Preschoolers' Nonsymbolic Arithmetic with Large Sets: Is Addition More Accurate than Subtraction?

    ERIC Educational Resources Information Center

    Shinskey, Jeanne L.; Chan, Cindy Ho-man; Coleman, Rhea; Moxom, Lauren; Yamamoto, Eri

    2009-01-01

    Adult and developing humans share with other animals analog magnitude representations of number that support nonsymbolic arithmetic with large sets. This experiment tested the hypothesis that such representations may be more accurate for addition than for subtraction in children as young as 3 1/2 years of age. In these tasks, the experimenter hid…

  17. Influences of large sets of environmental exposures on immune responses in healthy adult men

    PubMed Central

    Yi, Buqing; Rykova, Marina; Jäger, Gundula; Feuerecker, Matthias; Hörl, Marion; Matzel, Sandra; Ponomarev, Sergey; Vassilieva, Galina; Nichiporuk, Igor; Choukèr, Alexander

    2015-01-01

Environmental factors have long been known to influence immune responses. In particular, clinical studies about the association between migration and increased risk of atopy/asthma have provided important information on the role of migration-associated large sets of environmental exposures in the development of allergic diseases. However, investigations of environmental effects on immune responses have mostly been limited to candidate environmental exposures, such as air pollution. The influences of large sets of environmental exposures on immune responses are still largely unknown. A simulated 520-d Mars mission provided an opportunity to investigate this topic. Six healthy males lived in a closed habitat simulating a spacecraft for 520 days. When they exited their “spacecraft” after the mission, the scenario was similar to that of migration, involving exposure to a new set of environmental pollutants and allergens. We measured multiple immune parameters in blood samples at chosen time points after the mission. At the early adaptation stage, highly enhanced cytokine responses were observed upon ex vivo antigen stimulation. For cell population frequencies, we found the subjects displayed increased neutrophils. These results presumably represent the immune changes that occur in healthy humans upon migration, indicating that large sets of environmental exposures may trigger aberrant immune activity. PMID:26306804

  18. DocCube: Multi-Dimensional Visualization and Exploration of Large Document Sets.

    ERIC Educational Resources Information Center

    Mothe, Josiane; Chrisment, Claude; Dousset, Bernard; Alaux, Joel

    2003-01-01

    Describes a user interface that provides global visualizations of large document sets to help users formulate the query that corresponds to their information needs. Highlights include concept hierarchies that users can browse to specify and refine information needs; knowledge discovery in databases and texts; and multidimensional modeling.…

  19. Large-scale detection of metals with a small set of fluorescent DNA-like chemosensors.

    PubMed

    Yuen, Lik Hang; Franzini, Raphael M; Tan, Samuel S; Kool, Eric T

    2014-10-15

An important advantage of pattern-based chemosensor sets is their potential to detect and differentiate a large number of analytes with only few sensors. Here we test this principle at a conceptual limit by analyzing a large set of metal ion analytes covering essentially the entire periodic table, employing fluorescent DNA-like chemosensors on solid support. A tetrameric "oligodeoxyfluoroside" (ODF) library of 6561 members containing metal-binding monomers was screened for strong responders to 57 metal ions in solution. Our results show that a set of 9 chemosensors could successfully discriminate the 57 species, including alkali, alkaline earth, post-transition, transition, and lanthanide metals. As few as 6 ODF chemosensors could detect and differentiate 50 metals at 100 μM; sensitivity for some metals was achieved at midnanomolar ranges. A blind test with 50 metals further confirmed the discriminating power of the ODFs. PMID:25255102
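As a rough illustration of the pattern-based principle this abstract tests, the sketch below identifies analytes by nearest-neighbor matching of multi-sensor response fingerprints. The fingerprint values and noise level are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
n_analytes, n_sensors = 57, 9

# Hypothetical library of fluorescence-response fingerprints: one row per metal ion,
# one column per chemosensor (values stand in for normalized intensity changes).
library = rng.uniform(-1.0, 1.0, size=(n_analytes, n_sensors))

def identify(response):
    """Pattern-based identification: nearest library fingerprint (Euclidean distance)."""
    return int(np.argmin(np.linalg.norm(library - response, axis=1)))

# A "blind test": noisy re-measurements of each analyte should map back to itself.
noisy = library + 0.05 * rng.standard_normal(library.shape)
correct = sum(identify(noisy[i]) == i for i in range(n_analytes))
print(f"{correct}/{n_analytes} analytes correctly identified")
```

The point of the paper's result is that 9 well-chosen sensors give 57 fingerprints distinct enough for this kind of matching to succeed even with measurement noise.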

  20. Multivariate Cutoff Level Analysis (MultiCoLA) of large community data sets

    PubMed Central

    Gobet, Angélique; Quince, Christopher; Ramette, Alban

    2010-01-01

High-throughput sequencing techniques are becoming attractive to molecular biologists and ecologists as they provide a time- and cost-effective way to explore diversity patterns in environmental samples at an unprecedented resolution. An issue common to many studies is the definition of what fractions of a data set should be considered as rare or dominant. Yet this question has neither been satisfactorily addressed, nor has the impact of such a definition on data set structure and interpretation been fully evaluated. Here we propose a strategy, MultiCoLA (Multivariate Cutoff Level Analysis), to systematically assess the impact of various abundance or rarity cutoff levels on the resulting data set structure and on the consistency of the further ecological interpretation. We applied MultiCoLA to a 454 massively parallel tag sequencing data set of V6 ribosomal sequences from marine microbes in temperate coastal sands. Consistent ecological patterns were maintained after removing up to 35–40% rare sequences and similar patterns of beta diversity were observed after denoising the data set by using a preclustering algorithm of 454 flowgrams. This example validates the importance of exploring the impact of the definition of rarity in large community data sets. Future applications can be foreseen for data sets from different types of habitats, e.g. other marine environments, soil and human microbiota. PMID:20547594
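The core MultiCoLA idea, truncating a community table at increasing rarity cutoffs and checking whether between-sample structure is preserved, can be sketched as follows. The data are simulated, and the distance metric and correlation choice are illustrative assumptions, not necessarily those of the paper:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# Simulated community table: 20 samples x 500 taxa with a long-tailed abundance distribution.
counts = rng.poisson(rng.lognormal(0.0, 2.0, size=500), size=(20, 500))

# Between-sample structure of the full data set.
full = pdist(counts, metric="braycurtis")

# Truncate at increasing rarity cutoffs (drop taxa whose total abundance falls below
# the cutoff) and ask how well each truncated table preserves the full structure.
rhos = {}
for cutoff in (1, 5, 20, 100):
    kept = counts[:, counts.sum(axis=0) >= cutoff]
    rho, _ = spearmanr(full, pdist(kept, metric="braycurtis"))
    rhos[cutoff] = rho
    print(f"cutoff={cutoff:>3}  taxa kept={kept.shape[1]:>4}  Spearman rho vs full={rho:.3f}")
```

Plotting the correlation against the cutoff level shows how aggressively rare taxa can be removed before the ecological interpretation starts to drift, which is the question MultiCoLA systematizes.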

  1. An Analysis Framework Addressing the Scale and Legibility of Large Scientific Data Sets

    SciTech Connect

    Childs, H R

    2006-11-20

    Much of the previous work in the large data visualization area has solely focused on handling the scale of the data. This task is clearly a great challenge and necessary, but it is not sufficient. Applying standard visualization techniques to large scale data sets often creates complicated pictures where meaningful trends are lost. A second challenge, then, is to also provide algorithms that simplify what an analyst must understand, using either visual or quantitative means. This challenge can be summarized as improving the legibility or reducing the complexity of massive data sets. Fully meeting both of these challenges is the work of many, many PhD dissertations. In this dissertation, we describe some new techniques to address both the scale and legibility challenges, in hope of contributing to the larger solution. In addition to our assumption of simultaneously addressing both scale and legibility, we add an additional requirement that the solutions considered fit well within an interoperable framework for diverse algorithms, because a large suite of algorithms is often necessary to fully understand complex data sets. For scale, we present a general architecture for handling large data, as well as details of a contract-based system for integrating advanced optimizations into a data flow network design. We also describe techniques for volume rendering and performing comparisons at the extreme scale. For legibility, we present several techniques. Most noteworthy are equivalence class functions, a technique to drive visualizations using statistical methods, and line-scan based techniques for characterizing shape.

  2. A Complementary Graphical Method for Reducing and Analyzing Large Data Sets*

    PubMed Central

    Jing, X.; Cimino, J. J.

    2014-01-01

Summary Objectives Graphical displays can make data more understandable; however, large graphs can challenge human comprehension. We have previously described a filtering method to provide high-level summary views of large data sets. In this paper we demonstrate our method for setting and selecting thresholds to limit graph size while retaining important information by applying it to large single and paired data sets, taken from patient and bibliographic databases. Methods Four case studies are used to illustrate our method. The data are either patient discharge diagnoses (coded using the International Classification of Diseases, Clinical Modifications [ICD9-CM]) or Medline citations (coded using the Medical Subject Headings [MeSH]). We use combinations of different thresholds to obtain filtered graphs for detailed analysis. The setting and selection of thresholds, such as thresholds for node counts, class counts, ratio values, p values (for diff data sets), and percentiles of selected class count thresholds, are demonstrated in detail in the case studies. The main steps include: data preparation, data manipulation, computation, and threshold selection and visualization. We also describe the data models for different types of thresholds and the considerations for threshold selection. Results The filtered graphs are 1%-3% of the size of the original graphs. For our case studies, the graphs provide 1) the most heavily used ICD9-CM codes, 2) the codes with most patients in a research hospital in 2011, 3) a profile of publications on “heavily represented topics” in MEDLINE in 2011, and 4) validated knowledge about adverse effects of the medication rosiglitazone and new interesting areas in the ICD9-CM hierarchy associated with patients taking the medication pioglitazone. Conclusions Our filtering method reduces large graphs to a manageable size by removing relatively unimportant nodes. 
The graphical method provides summary views based on computation of usage frequency and semantic context of hierarchical terminology. The method is applicable to large data sets (such as a hundred thousand records or more) and can be used to generate new hypotheses from data sets coded with hierarchical terminologies. PMID:24727931
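The threshold-filtering idea can be illustrated with a toy hierarchy. The codes and counts below are invented for illustration; the sketch propagates usage counts up a terminology tree and keeps only nodes meeting a count threshold, which is the spirit of the summary views described above:

```python
from collections import defaultdict

# Toy hierarchical terminology: child -> parent, with per-leaf usage counts
# (e.g. how many patient records carry each code). All names are illustrative only.
parent = {"250.00": "250", "250.01": "250", "250": "endocrine",
          "401.9": "401", "401": "circulatory",
          "endocrine": "root", "circulatory": "root"}
usage = {"250.00": 120, "250.01": 3, "401.9": 45}

# Propagate leaf counts up the hierarchy so each node carries its subtree total.
total = defaultdict(int)
for node, n in usage.items():
    while node is not None:
        total[node] += n
        node = parent.get(node)

def filtered(threshold):
    """Summary view: keep only nodes whose subtree count reaches the threshold."""
    return {n for n, c in total.items() if c >= threshold}

print(filtered(10))   # drops the rarely used leaf "250.01"
print(filtered(100))  # keeps only the heavily used branch and its ancestors
```

Raising the threshold shrinks the graph toward its most heavily used branches, exactly the kind of 1%-3% reduction the case studies report, while the hierarchy guarantees that every retained node still has a path to the root.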

  3. Implementing Child-focused Activity Meter Utilization into the Elementary School Classroom Setting Using a Collaborative Community-based Approach

    PubMed Central

    Lynch, BA; Jones, A; Biggs, BK; Kaufman, T; Cristiani, V; Kumar, S; Quigg, S; Maxson, J; Swenson, L; Jacobson, N

    2016-01-01

Introduction The prevalence of pediatric obesity has increased over the past 3 decades and is a pressing public health problem. New technology advancements that can encourage more physical activity in children are needed. The Zamzee program is an activity meter linked to a motivational website designed for children 8–14 years of age. The objective of the study was to use a collaborative approach between a medical center, the private sector and local school staff to assess the feasibility of using the Zamzee Program in the school-based setting to improve physical activity levels in children. Methods This was a pilot 8-week observational study offered to all children in one fifth grade classroom. Body mass index (BMI), the amount of physical activity by 3-day recall survey, and satisfaction with usability of the Zamzee Program were measured pre- and post-study. Results Out of 11 children who enrolled in the study, 7 completed all study activities. In those who completed the study, the median (interquartile range) total activity time by survey increased by 17 (1042) minutes and the BMI percentile change was 0 (8). Both children and their caregivers found the Zamzee Activity Meter (6/7) and website (6/7) “very easy” or “easy” to use. Conclusion The Zamzee Program was found to be usable but did not significantly improve physical activity levels or BMI. Collaborative obesity intervention projects involving medical centers, the private sector and local schools are feasible, but their effectiveness needs to be evaluated in larger-scale studies.

  4. Validating a large geophysical data set: Experiences with satellite-derived cloud parameters

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie

    1992-01-01

    We are validating the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and are using the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The data handling needed for this work is possible only with the help of a suite of interactive graphical and numerical analysis tools. Level 3 (gridded) data is the common form in which large data sets of this type are distributed for scientific analysis. We find that Level 3 data is inadequate for the data comparisons required for validation. Level 2 data (individual measurements in geophysical units) is needed. A sampling problem arises when individual measurements, which are not uniformly distributed in space or time, are used for the comparisons. Standard 'interpolation' methods involve fitting the measurements for each data set to surfaces, which are then compared. We are experimenting with formal criteria for selecting geographical regions, based upon the spatial frequency and variability of measurements, that allow us to quantify the uncertainty due to sampling. As part of this project, we are also dealing with ways to keep track of constraints placed on the output by assumptions made in the computer code. 
The need to work with Level 2 data introduces a number of other data handling issues, such as accessing data files across machine types, meeting large data storage requirements, accessing other validated data sets, processing speed and throughput for interactive graphical work, and problems relating to graphical interfaces.
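The "fit each measurement set to a surface, then compare" step described above can be sketched with scattered-data interpolation. Everything here (the field, the sampling locations, the noise level) is invented for illustration and not from the HIRS2/MSU data:

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(3)

# Two "instruments" sampling the same field at different, non-uniform locations.
def field(lon, lat):
    return np.sin(lon / 20.0) + np.cos(lat / 15.0)

pts_a = rng.uniform(0, 100, size=(300, 2))
pts_b = rng.uniform(0, 100, size=(300, 2))
obs_a = field(pts_a[:, 0], pts_a[:, 1]) + 0.05 * rng.standard_normal(300)
obs_b = field(pts_b[:, 0], pts_b[:, 1]) + 0.05 * rng.standard_normal(300)

# Fit each measurement set to a surface on a common grid, then compare the surfaces.
gx, gy = np.meshgrid(np.linspace(10, 90, 30), np.linspace(10, 90, 30))
surf_a = griddata(pts_a, obs_a, (gx, gy), method="linear")
surf_b = griddata(pts_b, obs_b, (gx, gy), method="linear")

diff = surf_a - surf_b
print(f"mean |difference| on the common grid: {np.nanmean(np.abs(diff)):.3f}")
```

The sampling problem the abstract raises shows up here directly: where one instrument's measurements are sparse, the fitted surface is poorly constrained, so the residual surface mixes real disagreement with interpolation uncertainty.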

  5. The search for structure - Object classification in large data sets. [for astronomers

    NASA Technical Reports Server (NTRS)

    Kurtz, Michael J.

    1988-01-01

Research concerning object classification schemes is reviewed, focusing on large data sets. Classification techniques are discussed, including syntactic and decision-theoretic methods, fuzzy techniques, and stochastic and fuzzy grammars. Consideration is given to the automation of MK classification (Morgan and Keenan, 1973) and other problems associated with the classification of spectra. In addition, the classification of galaxies is examined, including the problems of systematic errors, blended objects, galaxy types, and galaxy clusters.

  6. Moving Large Data Sets Over High-Performance Long Distance Networks

    SciTech Connect

    Hodson, Stephen W; Poole, Stephen W; Ruwart, Thomas; Settlemyer, Bradley W

    2011-04-01

    In this project we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing large data sets to a destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes. We describe the device information required to achieve high levels of I/O performance and discuss how this data is applicable in use cases beyond data movement performance.

  7. Coffee Shops, Classrooms and Conversations: public engagement and outreach in a large interdisciplinary research Hub

    NASA Astrophysics Data System (ADS)

    Holden, Jennifer A.

    2014-05-01

Public engagement and outreach activities are increasingly using specialist staff for co-ordination, training and support for researchers, and are increasingly expected for large research investments. Here, the experience of public engagement and outreach at a large, interdisciplinary Research Hub is described. dot.rural, based at the University of Aberdeen UK, is an £11.8 million Research Councils UK Rural Digital Economy Hub, funded as part of the RCUK Digital Economy Theme (2009-2015). Digital Economy research aims to realise the transformational impact of digital technologies on aspects of the environment, community life, cultural experiences, future society, and the economy. The dot.rural Hub involves 92 researchers from 12 different disciplines, including Geography, Hydrology and Ecology. Public Engagement and Outreach is embedded in the dot.rural Digital Economy Hub via an Outreach Officer. Alongside this position, public engagement and outreach activities are a compulsory part of PhD student contracts. Public Engagement and Outreach activities at the dot.rural Hub involve individuals and groups in both formal and informal settings organised by dot.rural and other organisations. Activities in the realms of Education, Public Engagement, Traditional and Social Media are determined by a set of Underlying Principles designed for the Hub by the Outreach Officer. The underlying Engagement and Outreach principles match funding agency requirements and expectations alongside researcher demands and the user-led nature of Digital Economy Research. All activities include researchers alongside the Outreach Officer, are research informed, and are embedded into the specific projects that form the Hub. Successful public engagement activities have included participation in Café Scientifique series, workshops in primary and secondary schools, and online activities such as I'm a Scientist Get Me Out of Here. 
Activities have ranged from engaging 8 year olds, to making hydrographs more understandable to members of the public, to blogging birds, to engaging with remote, rural communities via Spiegeltents. This presentation will share successful public engagement and outreach events alongside some less successful experiences and lessons learnt along the way.

  8. COLLABORATIVE RESEARCH: Parallel Analysis Tools and New Visualization Techniques for Ultra-Large Climate Data Set

    SciTech Connect

    Middleton, Don; Haley, Mary

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  9. Non-rigid Registration for Large Sets of Microscopic Images on Graphics Processors

    PubMed Central

    Ruiz, Antonio; Ujaldon, Manuel; Cooper, Lee

    2014-01-01

    Microscopic imaging is an important tool for characterizing tissue morphology and pathology. 3D reconstruction and visualization of large sample tissue structure requires registration of large sets of high-resolution images. However, the scale of this problem presents a challenge for automatic registration methods. In this paper we present a novel method for efficient automatic registration using graphics processing units (GPUs) and parallel programming. Comparing a C++ CPU implementation against a Compute Unified Device Architecture (CUDA) implementation using CUDA libraries and pthreads, we achieve a speed-up factor of up to 4.11× with a single GPU and 6.68× with a GPU pair. We present execution times for a benchmark composed of two sets of large-scale images: mouse placenta (16K × 16K pixels) and breast cancer tumors (23K × 62K pixels). Registering a typical sample composed of 500 consecutive slides takes more than 12 hours in C++, which was reduced to less than 2 hours using two GPUs, with very promising scalability for extending those gains to a large number of GPUs in a distributed system. PMID:25328635

  10. Kernel Density Estimation, Kernel Methods, and Fast Learning in Large Data Sets.

    PubMed

    Shitong Wang; Jun Wang; Fu-Lai Chung

    2014-01-01

    Kernel methods such as the standard support vector machine and support vector regression trainings take O(N³) time and O(N²) space in their naïve implementations, where N is the training set size. Applying them to large data sets is thus computationally infeasible, and a replacement for the naïve method of finding the quadratic programming (QP) solutions is highly desirable. By observing that many kernel methods can be linked to kernel density estimation (KDE), which can be efficiently implemented by approximation techniques, a new learning method called fast KDE (FastKDE) is proposed to scale up kernel methods. It is based on establishing a connection between KDE and the QP problems formulated for kernel methods using an entropy-based integrated-squared-error criterion. As a result, FastKDE approximation methods can be applied to solve these QP problems. In this paper, the latest advance in fast data reduction via KDE is exploited. With just a simple sampling strategy, the resulting FastKDE method can be used to scale up various kernel methods with a theoretical guarantee that their performance does not degrade much. It has a time complexity of O(m³), where m is the number of data points sampled from the training set. Experiments on different benchmarking data sets demonstrate that the proposed method has comparable performance with the state-of-the-art method and is effective for a wide range of kernel methods to achieve fast learning in large data sets. PMID:23797315
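The complexity reduction at the heart of this record — replacing an O(N³) solve over the full kernel matrix with an O(m³) solve over m sampled points — can be illustrated with kernel ridge regression as a stand-in for the kernel methods discussed. This is a minimal sketch of the sampling idea, not the FastKDE algorithm itself; the function name and parameters are illustrative:

```python
import numpy as np

def kernel_ridge_subsample(X, y, m=200, gamma=1.0, lam=1e-3, rng=None):
    """Fit kernel ridge regression on a random subsample of m points.

    The full solve costs O(N^3) time / O(N^2) space; restricting the
    kernel matrix to m sampled points reduces this to O(m^3) / O(m^2).
    """
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    Xs, ys = X[idx], y[idx]
    # RBF (Gaussian) kernel among the sampled points only: m x m, not N x N.
    d2 = ((Xs[:, None, :] - Xs[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * d2)
    alpha = np.linalg.solve(K + lam * np.eye(len(Xs)), ys)

    def predict(Xq):
        d2q = ((Xq[:, None, :] - Xs[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2q) @ alpha

    return predict

# Toy check: learn y = sin(x) from 5,000 points using only 200 of them.
X = np.linspace(0, 6, 5000)[:, None]
y = np.sin(X).ravel()
predict = kernel_ridge_subsample(X, y, m=200, rng=0)
err = np.abs(predict(X) - y).mean()
```

Whether such a subsample preserves accuracy is exactly the question the paper answers with its KDE-based theoretical guarantee; here the smooth target makes the 200-point fit essentially lossless.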

  11. Breeding and Genetics Symposium: really big data: processing and analysis of very large data sets.

    PubMed

    Cole, J B; Newman, S; Foertter, F; Aguilar, I; Coffey, M

    2012-03-01

    Modern animal breeding data sets are large and getting larger, due in part to recent availability of high-density SNP arrays and cheap sequencing technology. High-performance computing methods for efficient data warehousing and analysis are under development. Financial and security considerations are important when using shared clusters. Sound software engineering practices are needed, and it is better to use existing solutions when possible. Storage requirements for genotypes are modest, although full-sequence data will require greater storage capacity. Storage requirements for intermediate and results files for genetic evaluations are much greater, particularly when multiple runs must be stored for research and validation studies. The greatest gains in accuracy from genomic selection have been realized for traits of low heritability, and there is increasing interest in new health and management traits. The collection of sufficient phenotypes to produce accurate evaluations may take many years, and high-reliability proofs for older bulls are needed to estimate marker effects. Data mining algorithms applied to large data sets may help identify unexpected relationships in the data, and improved visualization tools will provide insights. Genomic selection using large data requires a lot of computing power, particularly when large fractions of the population are genotyped. Theoretical improvements have made possible the inversion of large numerator relationship matrices, permitted the solving of large systems of equations, and produced fast algorithms for variance component estimation. Recent work shows that single-step approaches combining BLUP with a genomic relationship (G) matrix have similar computational requirements to traditional BLUP, and the limiting factor is the construction and inversion of G for many genotypes. 
A naïve algorithm for creating G for 14,000 individuals required almost 24 h to run, but custom libraries and parallel computing reduced that to 15 min. Large data sets also create challenges for the delivery of genetic evaluations that must be overcome in a way that does not disrupt the transition from conventional to genomic evaluations. Processing time is important, especially as real-time systems for on-farm decisions are developed. The ultimate value of these systems is to decrease time-to-results in research, increase accuracy in genomic evaluations, and accelerate rates of genetic improvement. PMID:22100598
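The genomic relationship matrix G discussed above is commonly built with VanRaden's method 1, G = ZZ′ / (2 Σ pⱼ(1−pⱼ)), where Z centers each marker column by twice its allele frequency. A minimal numpy sketch under that assumption (the function name and toy genotypes are illustrative, not from the paper):

```python
import numpy as np

def genomic_relationship(M):
    """Genomic relationship matrix via VanRaden's method 1.

    M: (n_individuals, n_markers) genotypes coded 0/1/2.
    Returns G = Z Z' / (2 * sum_j p_j (1 - p_j)), where column j of Z is
    the genotype column centered by twice its allele frequency p_j.
    """
    p = M.mean(axis=0) / 2.0          # estimated allele frequency per marker
    Z = M - 2.0 * p                   # center each marker column
    denom = 2.0 * np.sum(p * (1.0 - p))
    return Z @ Z.T / denom

rng = np.random.default_rng(1)
# 50 individuals, 1,000 SNPs drawn under Hardy-Weinberg proportions.
M = rng.binomial(2, 0.5, size=(50, 1000)).astype(float)
G = genomic_relationship(M)
```

For unrelated individuals the diagonal of G averages about 1, and because the columns of Z sum to zero each row of G sums to zero; both properties make handy sanity checks. The O(n²·markers) construction and the subsequent inversion of G are the bottlenecks the abstract describes for large n.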

  12. The gradient boosting algorithm and random boosting for genome-assisted evaluation in large data sets.

    PubMed

    González-Recio, O; Jiménez-Montero, J A; Alenda, R

    2013-01-01

    In the next few years, with the advent of high-density single nucleotide polymorphism (SNP) arrays and genome sequencing, genomic evaluation methods will need to deal with a large number of genetic variants and an increasing sample size. The boosting algorithm is a machine-learning technique that may alleviate the drawbacks of dealing with such large data sets. This algorithm combines different predictors in a sequential manner with some shrinkage on them; each predictor is applied consecutively to the residuals from the committee formed by the previous ones to form a final prediction based on a subset of covariates. Here, a detailed description is provided and examples using a toy data set are included. A modification of the algorithm called "random boosting" was proposed to increase predictive ability and decrease computation time of genome-assisted evaluation in large data sets. Random boosting uses a random selection of markers to add a subsequent weak learner to the predictive model. These modifications were applied to a real data set composed of 1,797 bulls genotyped for 39,714 SNP. Deregressed proofs of 4 yield traits and 1 type trait from January 2009 routine evaluations were used as dependent variables. A 2-fold cross-validation scenario was implemented. Sires born before 2005 were used as a training sample (1,576 and 1,562 for production and type traits, respectively), whereas younger sires were used as a testing sample to evaluate predictive ability of the algorithm on yet-to-be-observed phenotypes. Comparison with the original algorithm was provided. The predictive ability of the algorithm was measured as Pearson correlations between observed and predicted responses. Further, estimated bias was computed as the average difference between observed and predicted phenotypes. 
The results showed that the modified boosting algorithm could be run in 1% of the time used by the original algorithm, with negligible differences in accuracy and bias. This modification may be used to speed up genome-assisted evaluation in large data sets such as those obtained from consortiums. PMID:23102953
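The random-boosting idea described above — each weak learner fit sequentially to the residuals with shrinkage, but with candidate markers drawn from a random subset — can be sketched as follows. All names and the toy data are hypothetical, and the one-variable least-squares learners stand in for whatever weak learner the authors used:

```python
import numpy as np

def random_boosting(X, y, n_rounds=200, shrink=0.1, frac=0.1, rng=None):
    """Minimal L2 gradient boosting with random marker subsets.

    Each round fits a one-variable least-squares learner to the current
    residuals, but only markers in a random subset of size frac * n_markers
    are candidates -- the "random boosting" modification.
    """
    rng = np.random.default_rng(rng)
    n, p = X.shape
    pred = np.zeros(n)
    model = []                                   # (marker index, scaled slope)
    for _ in range(n_rounds):
        resid = y - pred
        cand = rng.choice(p, size=max(1, int(frac * p)), replace=False)
        best = None
        for j in cand:
            xj = X[:, j]
            b = xj @ resid / (xj @ xj + 1e-12)   # least-squares slope
            sse = ((resid - b * xj) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, j, b)
        _, j, b = best
        pred += shrink * b * X[:, j]             # shrunken update
        model.append((j, shrink * b))
    return pred, model

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 500))              # 300 animals, 500 markers
beta = np.zeros(500); beta[:5] = [2, -2, 1.5, 1, -1]
y = X @ beta + rng.standard_normal(300)
pred, model = random_boosting(X, y, rng=0)
r = np.corrcoef(pred, y)[0, 1]
```

Scanning only frac·p markers per round is where the order-of-magnitude speed-up comes from; the shrinkage keeps the sequential fit stable even though each round sees a different candidate set.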

  13. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1994-01-01

    Envision is an interactive environment that provides researchers in the earth sciences convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy-to-use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ or ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.

  14. Information Visualization, Nonlinear Dimensionality Reduction and Sampling for Large and Complex Data Sets

    NASA Astrophysics Data System (ADS)

    Pesenson, Meyer Z.; Pesenson, I. Z.; McCollum, B.

    2010-01-01

    Recent and forthcoming increases in the amount and complexity of astronomy data are creating data sets that are not amenable to the methods of analysis with which astronomers are familiar. Traditional methods are often inadequate not merely because the data sets are too large and too complex to fully be analyzed "manually", but because many conventional algorithms and techniques cannot be scaled up enough to work effectively on the new data sets. It is essential to develop new methods for organization, scientific visualization (as opposed to illustrative visualization) and analysis of heterogeneous, multiresolution data across application domains. Scientific utilization of highly complex and massive data sets poses significant challenges, and calls for some mathematical approaches more advanced than are now generally available. In this paper, we both give an overview of several innovative developments that address these challenges, and describe a few specific examples of algorithms we have developed, as well as the ones we are developing in the course of this ongoing work. These approaches will enhance scientific visualization and data analysis capabilities, thus facilitating astronomical research and enabling discoveries. This work was carried out with partial funding from the National Geospatial-Intelligence Agency University Research Initiative (NURI), grant HM1582-08-1-0019.

  15. Main large data set features detection by a linear predictor model

    NASA Astrophysics Data System (ADS)

    Gutierrez, Carlos Enrique; Alsharif, Mohamad Reza; Khosravy, Mahdi; Yamashita, Katsumi; Miyagi, Hayao; Villa, Rafael

    2014-10-01

    The aim of the present paper is to explore and obtain a simple method capable of detecting the most important variables (features) in a large set of variables. To verify the performance of the approach described in the following sections, we used a set of news articles. Text sources are considered high-dimensional data, where each word is treated as a single variable. In our work, a linear predictor model is used to uncover the most influential variables, strongly reducing the dimension of the data set. Input data, classified into two categories, is arranged as a collection of plain-text documents, pre-processed and transformed into a numerical matrix containing around 10,000 different variables. We adjust the linear model's parameters based on its prediction results; the variables with the strongest effect on the output survive, while those with negligible effect are removed. In order to collect, automatically, a summarized set of features, we sacrifice some detail and accuracy in the prediction model, although we try to balance the squared error against the size of the subset obtained.
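As a rough sketch of this scheme — fit a linear predictor to a documents-by-terms matrix, then keep only the variables whose weights are largest in magnitude — the following uses ridge regression on synthetic term counts. All names and data are hypothetical, and ridge stands in for whatever linear model the authors adjusted:

```python
import numpy as np

def select_features(X, y, keep=20, lam=1.0):
    """Rank variables by the magnitude of their ridge-regression weights.

    A linear predictor is fit to the (documents x terms) matrix X and the
    binary category labels y; variables with negligible weight are dropped,
    keeping only the `keep` strongest ones.
    """
    n, p = X.shape
    # Ridge solution w = (X'X + lam I)^{-1} X'y
    w = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
    order = np.argsort(-np.abs(w))
    return order[:keep], w

rng = np.random.default_rng(0)
X = rng.poisson(0.3, size=(400, 1000)).astype(float)   # sparse term counts
informative = [3, 17, 42]                              # hypothetical key terms
y = (X[:, informative].sum(axis=1) > 0).astype(float)  # labels driven by them
kept, w = select_features(X, y, keep=10)
```

Because the labels depend only on the three planted terms, their weights dominate the 997 noise terms and they survive the cut, shrinking the variable set by two orders of magnitude exactly as the abstract describes.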

  16. Web-Queryable Large-Scale Data Sets for Hypothesis Generation in Plant Biology

    PubMed Central

    Brady, Siobhan M.; Provart, Nicholas J.

    2009-01-01

    The approaching end of the 21st century's first decade marks an exciting time for plant biology. Several National Science Foundation Arabidopsis 2010 Projects will conclude, and whether or not the stated goal of the National Science Foundation 2010 Program—to determine the function of 25,000 Arabidopsis genes by 2010—is reached, these projects and others in a similar vein, such as those performed by the AtGenExpress Consortium and various plant genome sequencing initiatives, have generated important and unprecedented large-scale data sets. While providing significant biological insights for the individual laboratories that generated them, these data sets, in conjunction with the appropriate tools, are also permitting plant biologists worldwide to gain new insights into their own biological systems of interest, often at a mouse click through a Web browser. This review provides an overview of several such genomic, epigenomic, transcriptomic, proteomic, and metabolomic data sets and describes Web-based tools for querying them in the context of hypothesis generation for plant biology. We provide five biological examples of how such tools and data sets have been used to provide biological insight. PMID:19401381

  17. Corrected small basis set Hartree-Fock method for large systems.

    PubMed

    Sure, Rebecca; Grimme, Stefan

    2013-07-15

    A quantum chemical method based on a Hartree-Fock calculation with a small Gaussian AO basis set is presented. Its main area of application is the computation of structures, vibrational frequencies, and noncovalent interaction energies in huge molecular systems. The method is suggested as a partial replacement of semiempirical approaches or density functional theory (DFT), in particular when self-interaction errors are acute. In order to get accurate results, three physically plausible atom pair-wise correction terms are applied for London dispersion interactions (D3 scheme), basis set superposition error (gCP scheme), and short-ranged basis set incompleteness effects. In total nine global empirical parameters are used. This so-called Hartree-Fock-3c (HF-3c) method is tested for geometries of small organic molecules, interaction energies and geometries of noncovalently bound complexes, for supramolecular systems, and protein structures. In the majority of realistic test cases good results approaching large basis set DFT quality are obtained at a tiny fraction of computational cost. PMID:23670872

  18. Distributed Computation of the knn Graph for Large High-Dimensional Point Sets

    PubMed Central

    Plaku, Erion; Kavraki, Lydia E.

    2009-01-01

    High-dimensional problems arising from robot motion planning, biology, data mining, and geographic information systems often require the computation of k nearest neighbor (knn) graphs. The knn graph of a data set is obtained by connecting each point to its k closest points. As the research in the above-mentioned fields progressively addresses problems of unprecedented complexity, the demand for computing knn graphs based on arbitrary distance metrics and large high-dimensional data sets increases, exceeding resources available to a single machine. In this work we efficiently distribute the computation of knn graphs for clusters of processors with message passing. Extensions to our distributed framework include the computation of graphs based on other proximity queries, such as approximate knn or range queries. Our experiments show nearly linear speedup with over one hundred processors and indicate that similar speedup can be obtained with several hundred processors. PMID:19847318
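The knn graph definition above — connect each point to its k closest points — is straightforward to compute in independent chunks of query points, which is also the natural unit of work to farm out across processors. A brute-force single-machine sketch (not the authors' message-passing implementation; names are illustrative):

```python
import numpy as np

def knn_graph(points, k, chunk=256):
    """Compute the k-nearest-neighbor graph by brute force, in chunks.

    Each chunk of query points is processed independently against the full
    point set, so chunks could be assigned to different processors and the
    resulting neighbor rows gathered afterwards.
    """
    n = len(points)
    neighbors = np.empty((n, k), dtype=int)
    for start in range(0, n, chunk):
        q = points[start:start + chunk]
        # Squared Euclidean distances from this chunk to all points.
        d2 = ((q[:, None, :] - points[None, :, :]) ** 2).sum(-1)
        # Exclude each point itself, then take the k closest.
        self_idx = np.arange(start, start + len(q))[:, None]
        np.put_along_axis(d2, self_idx, np.inf, axis=1)
        neighbors[start:start + len(q)] = np.argsort(d2, axis=1)[:, :k]
    return neighbors

pts = np.random.default_rng(0).standard_normal((500, 8))
nbrs = knn_graph(pts, k=5)
```

The chunk size bounds the memory per worker (chunk × n distances at a time), which is the constraint that forces distribution once n exceeds what a single machine can hold; arbitrary distance metrics drop in by replacing the squared-Euclidean line.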

  19. Phase Unwrapping for Large InSAR Data Sets Through Statistical-Cost Tiling

    NASA Astrophysics Data System (ADS)

    Chen, C. W.; Zebker, H. A.

    2001-12-01

    Two-dimensional phase unwrapping is a key step in the analysis of InSAR data, and many algorithms for this task have been proposed in recent years. Some of these algorithms have shown promise in handling the problem's intrinsic difficulties, but new difficulties arise when the dimensions of the interferometric input data exceed the limits imposed by computer memory constraints. Similarly, new phase unwrapping strategies may be required when sheer data volumes necessitate greater computational throughput. These issues are especially important in the contexts of large-scale topographic mapping projects such as SRTM and the Alaska DEM Project. We propose a technique for applying the statistical-cost, network-flow phase unwrapping algorithm (SNAPHU) of Chen and Zebker (2001) to large data sets. That is, we introduce a methodology whereby a large interferogram is unwrapped as a set of several smaller tiles. The tiles are unwrapped individually and then further divided into independent, irregularly shaped reliable regions. The phase offsets of these reliable regions are then computed in a secondary optimization problem that seeks to maximize the probability of the full unwrapped solution, using the same statistical models as employed in the primary phase unwrapping stage. The technique therefore approximates a maximum a posteriori probability (MAP) unwrapped solution over the full-sized interferogram. The secondary optimization problem is solved through the use of a nonlinear network-flow solver. We examine the performance of this technique on a real interferometric data set, and we find that the technique is less prone to unwrapping artifacts than more simple tiling approaches.

  20. Classroom-Based Interventions and Teachers' Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings

    ERIC Educational Resources Information Center

    Zhai, Fuhua; Raver, C. Cybele; Li-Grining, Christine

    2011-01-01

    Preschool teachers' job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers' perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job…

  2. What Do Children Write in Science? A Study of the Genre Set in a Primary Science Classroom

    ERIC Educational Resources Information Center

    Honig, Sheryl

    2010-01-01

    This article reports on the types of scientific writing found in two primary grade classrooms. These results are part of a larger two-year study whose purpose was to examine the development of informational writing of second- and third-grade students as they participated in integrated science-literacy instruction. The primary purpose of the…

  3. Analogies as Tools for Meaning Making in Elementary Science Education: How Do They Work in Classroom Settings?

    ERIC Educational Resources Information Center

    Guerra-Ramos, Maria Teresa

    2011-01-01

    In this paper there is a critical overview of the role of analogies as tools for meaning making in science education, their advantages and disadvantages. Two empirical studies on the use of analogies in primary classrooms are discussed and analysed. In the first study, the "string circuit" analogy was used in the teaching of electric circuits with…

  4. Child and Setting Characteristics Affecting the Adult Talk Directed at Preschoolers with Autism Spectrum Disorder in the Inclusive Classroom

    ERIC Educational Resources Information Center

    Irvin, Dwight W.; Boyd, Brian A.; Odom, Samuel L.

    2015-01-01

    Difficulty with social competence is a core deficit of autism spectrum disorder. Research on typically developing children and children with disabilities, in general, suggests the adult talk received in the classroom is related to their social development. The aims of this study were to examine (1) the types and amounts of adult talk children with…

  5. Science in the Classroom: Finding a Balance between Autonomous Exploration and Teacher-Led Instruction in Preschool Settings

    ERIC Educational Resources Information Center

    Nayfeld, Irena; Brenneman, Kimberly; Gelman, Rochel

    2011-01-01

    Research Findings: This paper reports on children's use of science materials in preschool classrooms during their free choice time. Baseline observations showed that children and teachers rarely spend time in the designated science area. An intervention was designed to "market" the science center by introducing children to 1 science tool, the…

  6. An Analogous Study of Children's Attitudes Toward School in an Open Classroom Environment as Opposed to a Conventional Setting.

    ERIC Educational Resources Information Center

    Zeli, Doris Conti

    A study sought to determine whether intermediate age children exposed to open classroom teaching strategy have a more positive attitude toward school than intermediate age children exposed to conventional teaching strategy. The hypothesis was that there would be no significant difference in attitude between the two groups. The study was limited to…

  7. Initial Validation of the Prekindergarten Classroom Observation Tool and Goal Setting System for Data-Based Coaching

    ERIC Educational Resources Information Center

    Crawford, April D.; Zucker, Tricia A.; Williams, Jeffrey M.; Bhavsar, Vibhuti; Landry, Susan H.

    2013-01-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based…

  12. Envision: An interactive system for the management and visualization of large geophysical data sets

    NASA Technical Reports Server (NTRS)

    Searight, K. R.; Wojtowicz, D. P.; Walsh, J. E.; Pathi, S.; Bowman, K. P.; Wilhelmson, R. B.

    1995-01-01

    Envision is a software project at the University of Illinois and Texas A&M, funded by NASA's Applied Information Systems Research Project. It provides researchers in the geophysical sciences convenient ways to manage, browse, and visualize large observed or model data sets. Envision integrates data management, analysis, and visualization of geophysical data in an interactive environment. It employs commonly used standards in data formats, operating systems, networking, and graphics. It also attempts, wherever possible, to integrate with existing scientific visualization and analysis software. Envision has an easy-to-use graphical interface, distributed process components, and an extensible design. It is a public domain package, freely available to the scientific community.

  13. Experimental set-up for three PHOEBUS type large-area heliostats at the PSA

    SciTech Connect

    Haeger, M.; Schiel, W.; Romero, M.; Schmitz-Goeb, M.

    1995-11-01

    Three large-area heliostat prototypes are being erected at the Plataforma Solar de Almeria by Spanish and German industry. The objective is to demonstrate their technical and economical suitability for a PHOEBUS power tower plant. The two different heliostat designs, comprising two 100 m² glass/metal faceted heliostats and one 150 m² stressed-membrane heliostat, are tested at a representative distance of 485 m from the PSA's CESA tower. The paper introduces the heliostat designs and the test set-up, including location, targets, flux measurement, data acquisition and control.

  14. A practical, bioinformatic workflow system for large data sets generated by next-generation sequencing

    PubMed Central

    Cantacessi, Cinzia; Jex, Aaron R.; Hall, Ross S.; Young, Neil D.; Campbell, Bronwyn E.; Joachim, Anja; Nolan, Matthew J.; Abubucker, Sahar; Sternberg, Paul W.; Ranganathan, Shoba; Mitreva, Makedonka; Gasser, Robin B.

    2010-01-01

    Transcriptomics (at the level of single cells, tissues and/or whole organisms) underpins many fields of biomedical science, from understanding the basic cellular function in model organisms, to the elucidation of the biological events that govern the development and progression of human diseases, and the exploration of the mechanisms of survival, drug-resistance and virulence of pathogens. Next-generation sequencing (NGS) technologies are contributing to a massive expansion of transcriptomics in all fields and are reducing the cost, time and performance barriers presented by conventional approaches. However, bioinformatic tools for the analysis of the sequence data sets produced by these technologies can be daunting to researchers with limited or no expertise in bioinformatics. Here, we constructed a semi-automated, bioinformatic workflow system, and critically evaluated it for the analysis and annotation of large-scale sequence data sets generated by NGS. We demonstrated its utility for the exploration of differences in the transcriptomes among various stages and both sexes of an economically important parasitic worm (Oesophagostomum dentatum) as well as the prediction and prioritization of essential molecules (including GTPases, protein kinases and phosphatases) as novel drug target candidates. This workflow system provides a practical tool for the assembly, annotation and analysis of NGS data sets, even for researchers with limited bioinformatics expertise. The custom-written Perl, Python and Unix shell computer scripts used can be readily modified or adapted to suit many different applications. This system is now utilized routinely for the analysis of data sets from pathogens of major socio-economic importance and can, in principle, be applied to transcriptomics data sets from any organism. PMID:20682560

  15. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    SciTech Connect

    Patchett, John M; Ahrens, James P; Lo, Li-Ta; Brownlee, Carson S; Mitchell, Christopher J; Hansen, Chuck

    2010-10-15

    Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.

  16. Treatment of severe pulmonary hypertension in the setting of the large patent ductus arteriosus.

    PubMed

    Niu, Mary C; Mallory, George B; Justino, Henri; Ruiz, Fadel E; Petit, Christopher J

    2013-05-01

    Treatment of the large patent ductus arteriosus (PDA) in the setting of pulmonary hypertension (PH) is challenging. Left patent, the large PDA can result in irreversible pulmonary vascular disease. Occlusion, however, may lead to right ventricular failure for certain patients with severe PH. Our center has adopted a staged management strategy using medical management, noninvasive imaging, and invasive cardiac catheterization to treat PH in the presence of a large PDA. This approach not only determines the safety of ductal closure but also leverages medical therapy to create an opportunity for safe PDA occlusion. We reviewed our experience with this approach. Patients with both severe PH and PDAs were studied. PH treatment history and hemodynamic data obtained during catheterizations were reviewed. Repeat catheterizations, echocardiograms, and clinical status at latest follow-up were also reviewed. Seven patients had both PH and large, unrestrictive PDAs. At baseline, all patients had near-systemic right ventricular pressures. Nine catheterizations were performed. Two patients underwent 2 catheterizations each due to poor initial response to balloon test occlusion. Six of 7 patients exhibited subsystemic pulmonary pressures during test occlusion and underwent successful PDA occlusion. One patient did not undergo PDA occlusion. In follow-up, 2 additional catheterizations were performed after successful PDA occlusion for subsequent hemodynamic assessment. At the latest follow-up, the 6 patients who underwent PDA occlusion are well, with continued improvement in PH. Five patients remain on PH treatment. A staged approach to PDA closure for patients with severe PH is an effective treatment paradigm. Aggressive treatment of PH creates a window of opportunity for PDA occlusion, echocardiography assists in identifying the timing for closure, and balloon test occlusion during cardiac catheterization is critical in determining the safety of closure. By safely eliminating the large PDA, this treatment algorithm can halt the perilous combination of large PDA shunting and PH in a population at high risk of morbidity and mortality. PMID:23629611

  17. Litho-kinematic facies model for large landslide deposits in arid settings

    SciTech Connect

    Yarnold, J.C.; Lombard, J.P.

    1989-04-01

    Reconnaissance field studies of six large landslide deposits in the S. Basin and Range suggest that a set of characteristic features is common to the deposits of large landslides in an arid setting. These include a coarse boulder cap, an upper massive zone, a lower disrupted zone, and a mixed zone overlying disturbed substrate. The upper massive zone is dominated by crackle breccia. This grades downward into a lower disrupted zone composed of a more matrix-rich breccia that is internally sheared, intruded by clastic dikes, and often contains a cataclasite layer at its base. An underlying discontinuous mixed zone is composed of material from the overlying breccia mixed with material entrained from the underlying substrate. Bedding in the substrate sometimes displays folding and contortion that die out downward. The authors' work suggests a spatial zonation of these characteristic features within many landslide deposits. In general, clastic dikes, the basal cataclasite, and folding in the substrate are observed mainly in distal parts of landslides. In most cases, total thickness, thickness of the basal disturbed and mixed zones, and the degree of internal shearing increase distally, whereas maximum clast size commonly decreases distally. Zonation of these features is interpreted to result from kinematics of emplacement that cause generally increased deformation in the distal regions of the landslide.

  18. A Technique for Moving Large Data Sets over High-Performance Long Distance Networks

    SciTech Connect

    Settlemyer, Bradley W; Dobson, Jonathan D; Hodson, Stephen W; Kuehn, Jeffery A; Poole, Stephen W; Ruwart, Thomas

    2011-01-01

    In this paper we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing the data to a remote destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes.

  19. Contextual settings, science stories, and large context problems: Toward a more humanistic science education

    NASA Astrophysics Data System (ADS)

    Stinner, Arthur

    This article addresses the need for and the problem of organizing a science curriculum around contextual settings and science stories that serve to involve and motivate students to develop an understanding of the world that is rooted in the scientific and the humanistic traditions. A program of activities placed around contextual settings, science stories, and contemporary issues of interest is recommended in an attempt to move toward a slow and secure abolition of the gulf between scientific knowledge and common sense beliefs. A conceptual development model is described to guide the connection between theory and evidence on a level appropriate for children, from early years to senior years. For the senior years it is also important to connect the activity of teaching to a sound theoretical structure. The theoretical structure must illuminate the status of theory in science, establish what counts as evidence, clarify the relationship between experiment and explanation, and make connections to the history of science. The article concludes with a proposed program of activities in terms of a sequence of theoretical and empirical experiences that involve contextual settings, science stories, large context problems, thematic teaching, and popular science literature teaching.

  20. Developing consistent Landsat data sets for large area applications: the MRLC 2001 protocol

    USGS Publications Warehouse

    Chander, G.; Huang, C.; Yang, L.; Homer, C.; Larson, C.

    2009-01-01

    One of the major efforts in large area land cover mapping over the last two decades was the completion of two U.S. National Land Cover Data sets (NLCD), developed with nominal 1992 and 2001 Landsat imagery under the auspices of the Multi-Resolution Land Characteristics (MRLC) Consortium. Following the successful generation of NLCD 1992, a second generation MRLC initiative was launched with two primary goals: (1) to develop a consistent Landsat imagery data set for the U.S. and (2) to develop a second generation National Land Cover Database (NLCD 2001). One of the key enhancements was the formulation of an image preprocessing protocol and implementation of a consistent image processing method. The core data set of the NLCD 2001 database consists of Landsat 7 Enhanced Thematic Mapper Plus (ETM+) images. This letter details the procedures for processing the original ETM+ images and more recent scenes added to the database. NLCD 2001 products include Anderson Level II land cover classes, percent tree canopy, and percent urban imperviousness at 30-m resolution derived from Landsat imagery. The products are freely available for download to the general public from the MRLC Consortium Web site at http://www.mrlc.gov.

  1. Fast maximum intensity projections of large medical data sets by exploiting hierarchical memory architectures.

    PubMed

    Kiefer, Gundolf; Lehmann, Helko; Weese, Jürgen

    2006-04-01

    Maximum intensity projections (MIPs) are an important visualization technique for angiographic data sets. Efficient data inspection requires frame rates of at least five frames per second at preserved image quality. Despite the advances in computer technology, this task remains a challenge. On the one hand, the sizes of computed tomography and magnetic resonance images are increasing rapidly. On the other hand, rendering algorithms do not automatically benefit from the advances in processor technology, especially for large data sets. This is due to the faster evolving processing power and the slower evolving memory access speed, which is bridged by hierarchical cache memory architectures. In this paper, we investigate memory access optimization methods and use them for generating MIPs on general-purpose central processing units (CPUs) and graphics processing units (GPUs), respectively. These methods can work on any level of the memory hierarchy, and we show that properly combined methods can optimize memory access on multiple levels of the hierarchy at the same time. We present performance measurements to compare different algorithm variants and illustrate the influence of the respective techniques. On current hardware, the efficient handling of the memory hierarchy for CPUs improves the rendering performance by a factor of 3 to 4. On GPUs, we observed that the effect is even larger, especially for large data sets. The methods can easily be adjusted to different hardware specifics, although their impact can vary considerably. They can also be used for rendering techniques other than MIPs, and their use for more general image processing tasks could be investigated in the future. PMID:16617627
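
    The projection operation itself is simple; the paper's contribution is how the memory hierarchy is exercised while computing it. The following NumPy sketch (synthetic data, illustrative only, not the authors' optimized implementation) shows the core MIP reduction, with a comment on the memory-order iteration that cache-aware variants try to preserve.

    ```python
    import numpy as np

    # Synthetic stand-in for an angiographic volume; real data sets
    # are orders of magnitude larger, which is the paper's motivation.
    rng = np.random.default_rng(0)
    volume = rng.integers(0, 256, size=(64, 64, 64), dtype=np.uint8)

    def mip(vol, axis=0):
        """Maximum intensity projection: keep the brightest voxel along one axis.

        NumPy performs the reduction by iterating the array in memory
        order, the kind of cache-friendly access pattern that the
        memory-hierarchy optimizations in the paper aim to preserve.
        """
        return vol.max(axis=axis)

    image = mip(volume, axis=0)
    assert image.shape == (64, 64)
    assert image.max() == volume.max()  # the projection keeps the global maximum
    ```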

  2. Suffix tree searcher: exploration of common substrings in large DNA sequence sets

    PubMed Central

    2014-01-01

    Background Large DNA sequence data sets require special bioinformatics tools to search and compare them. Such tools should be easy to use so that the data can be easily accessed by a wide array of researchers. In the past, the use of suffix trees for searching DNA sequences has been limited by a practical need to keep the trees in RAM. Newer algorithms solve this problem by using disk-based approaches. However, none of the fastest suffix tree algorithms have been implemented with a graphical user interface, preventing their incorporation into a feasible laboratory workflow. Results Suffix Tree Searcher (STS) is designed as an easy-to-use tool to index, search, and analyze very large DNA sequence datasets. The program accommodates very large numbers of very large sequences, with aggregate size reaching tens of billions of nucleotides. The program makes use of pre-sorted persistent "building blocks" to reduce the time required to construct new trees. STS comprises a graphical user interface written in Java and four C modules. All components are automatically downloaded when a web link is clicked. The underlying suffix tree data structure permits extremely fast searching for specific nucleotide strings, with wild cards or mismatches allowed. Complete tree traversals for detecting common substrings are also very fast. The graphical user interface allows the user to transition seamlessly between building, traversing, and searching the dataset. Conclusions Thus, STS provides a new resource for the detection of substrings common to multiple DNA sequences or within a single sequence, for truly huge data sets. The re-searching of sequence hits, allowing wild card positions or mismatched nucleotides, together with the ability to rapidly retrieve large numbers of sequence hits from the DNA sequence files, provides the user with an efficient method of evaluating the similarity between nucleotide sequences by multiple alignment or use of Logos. The ability to re-use existing suffix tree pieces considerably shortens index generation time. The graphical user interface enables quick mastery of the analysis functions, easy access to the generated data, and seamless workflow integration. PMID:25053142
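
    STS's disk-based suffix trees are well beyond a short sketch, but the query they accelerate, finding substrings common to multiple sequences, can be illustrated with a naive k-mer intersection. This is a stand-in, not STS's algorithm: a suffix tree answers the same question without fixing a length k, in time linear in the total sequence length.

    ```python
    def common_kmers(sequences, k):
        """Return the set of length-k substrings present in every sequence.

        Naive stand-in for a suffix-tree traversal: collect each
        sequence's k-mers into a set, then intersect the sets.
        """
        sets = (
            {seq[i:i + k] for i in range(len(seq) - k + 1)}
            for seq in sequences
        )
        result = next(sets)
        for s in sets:
            result &= s
        return result

    # Made-up toy sequences for illustration.
    seqs = ["GATTACAGATTACA", "TTGATTACATT", "ACAGATTACA"]
    print(sorted(common_kmers(seqs, 5)))  # → ['ATTAC', 'GATTA', 'TTACA']
    ```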

  3. The Large Synoptic Survey Telescope and Foundations for Data Exploitation of Petabyte Data Sets

    SciTech Connect

    Cook, K H; Nikolaev, S; Huber, M E

    2007-02-26

    The next generation of imaging surveys in astronomy, such as the Large Synoptic Survey Telescope (LSST), will require multigigapixel cameras that can process enormous amounts of data read out every few seconds. This huge increase in data throughput (compared to megapixel cameras and minute- to hour-long integrations of today's instruments) calls for a new paradigm for extracting the knowledge content. We have developed foundations for this new approach. In this project, we have studied the necessary processes for extracting information from large time-domain databases. In the process, we have produced significant scientific breakthroughs by developing new methods to probe both the elusive time and spatial variations in astrophysics data sets from the SuperMACHO (Massive Compact Halo Objects) survey, the Lowell Observatory Near-Earth Object Search (LONEOS), and the Taiwanese American Occultation Survey (TAOS). This project continues to contribute to the development of the scientific foundations for future wide-field, time-domain surveys. Our algorithm and pipeline development has provided the building blocks for the development of the LSST science software system. Our database design and performance measures have helped to size and constrain the LSST database design. LLNL made significant contributions to the foundations of the LSST, which has applications for large-scale imaging and data-mining activities at LLNL. These developments are being actively applied to the previously mentioned surveys, producing important scientific results that have been released to the scientific community; more continue to be published and referenced, enhancing LLNL's scientific stature.

  4. Child and setting characteristics affecting the adult talk directed at preschoolers with autism spectrum disorder in the inclusive classroom.

    PubMed

    Irvin, Dwight W; Boyd, Brian A; Odom, Samuel L

    2015-02-01

    Difficulty with social competence is a core deficit of autism spectrum disorder. Research on typically developing children and children with disabilities, in general, suggests the adult talk received in the classroom is related to their social development. The aims of this study were to examine (1) the types and amounts of adult talk children with autism spectrum disorder are exposed to in the preschool classroom and (2) the associations between child characteristics (e.g. language), activity area, and adult talk. Kontos' Teacher Talk classification was used to code videos (approximately 30 min in length) of 73 children with autism spectrum disorder (ages 3-5) in inclusive classrooms (n = 33) during center time. The results indicated that practical/personal assistance was the most common type of adult talk coded, and behavior management talk the least often coded. Child characteristics (i.e. age and autism severity) and activity area were found to be related to specific types of adult talk. Given the findings, implications for future research are discussed. PMID:24463432

  5. Classroom-based Interventions and Teachers' Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings.

    PubMed

    Zhai, Fuhua; Raver, C Cybele; Li-Grining, Christine

    2011-09-01

    Preschool teachers' job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers' perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job demands, and confidence in behavior management. Using a clustered randomized controlled trial (RCT) design, the CSRP provided multifaceted services to the treatment group, including teacher training and mental health consultation, which were accompanied by stress-reduction services and workshops. Overall, 90 teachers in 35 classrooms at 18 Head Start sites participated in the study. After adjusting for teacher and classroom factors and site fixed effects, we found that the CSRP had significant effects on the improvement of teachers' perceived job control and work-related resources. We also found that the CSRP decreased teachers' confidence in behavior management and had no statistically significant effects on job demands. Overall, we did not find significant moderation effects of teacher race/ethnicity, education, teaching experience, or teacher type. The implications for research and policy are discussed. PMID:21927538

  6. Twelve- to 14-Month-Old Infants Can Predict Single-Event Probability with Large Set Sizes

    ERIC Educational Resources Information Center

    Denison, Stephanie; Xu, Fei

    2010-01-01

    Previous research has revealed that infants can reason correctly about single-event probabilities with small but not large set sizes (Bonatti, 2008; Teglas "et al.", 2007). The current study asks whether infants can make predictions regarding single-event probability with large set sizes using a novel procedure. Infants completed two trials: A…

  8. Generating extreme weather event sets from very large ensembles of regional climate models

    NASA Astrophysics Data System (ADS)

    Massey, Neil; Guillod, Benoit; Otto, Friederike; Allen, Myles; Jones, Richard; Hall, Jim

    2015-04-01

    Extreme events can have large impacts on societies and are therefore being increasingly studied. In particular, climate change is expected to impact the frequency and intensity of these events. However, a major limitation when investigating extreme weather events is that, by definition, only few events are present in observations. A way to overcome this issue is to use large ensembles of model simulations. Using the volunteer distributed computing (VDC) infrastructure of weather@home [1], we run a very large number (tens of thousands) of RCM simulations over the European domain at a resolution of 25 km, with an improved land-surface scheme, nested within a free-running GCM. Using VDC allows many thousands of climate model runs to be computed. Using observations for the GCM boundary forcings, we can run historical "hindcast" simulations over the past 100 to 150 years. This allows us, owing to the chaotic variability of the atmosphere, to ascertain how likely an extreme event was, given the boundary forcings, and to derive synthetic event sets. The events in these sets did not actually occur in the observed record but could have occurred given the boundary forcings, with an associated probability. The event sets contain time series of fields of meteorological variables that allow impact modellers to assess the loss the event would incur. Projections of events into the future are achieved by modelling projections of the sea-surface temperature (SST) and sea-ice boundary forcings, combining the variability of the SST in the observed record with a range of warming signals derived from the varying responses of SSTs in the CMIP5 ensemble to elevated greenhouse gas (GHG) emissions in three RCP scenarios. Simulating the future with a range of SST responses, as well as a range of RCP scenarios, allows us to assess the uncertainty in the response to elevated GHG emissions that occurs in the CMIP5 ensemble. Numerous extreme weather events can be studied. Firstly, we analyse droughts in Europe with a focus on the UK in the context of the project MaRIUS (Managing the Risks, Impacts and Uncertainties of droughts and water Scarcity). We analyse the characteristics of the simulated droughts, the underlying physical mechanisms, and assess droughts observed in the recent past. Secondly, we analyse windstorms by applying an objective storm-identification and tracking algorithm to the ensemble output, isolating those storms that cause high loss and building a probabilistic storm catalogue, which can be used by impact modellers, insurance loss modellers, etc. Finally, we combine the model output with a heat-stress index to determine the detrimental effect on health of heat waves in Europe. [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.

  9. Innovation from within the Box: Evaluation of Online Problem Sets in a Series of Large Lecture Undergraduate Science Courses.

    ERIC Educational Resources Information Center

    Schaeffer, Evonne; Bhargava, Tina; Nash, John; Kerns, Charles; Stocker, Scott

    A technology-mediated solution to enhance the learning experience for students in a large lecture setting was evaluated. Online problem sets were developed to engage students in the content of a human biology course and implemented in the classes of eight faculty coordinators. The weekly problem sets contained several multiple choice problems,…

  10. Calculations of safe collimator settings and β* at the CERN Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Bruce, R.; Assmann, R. W.; Redaelli, S.

    2015-06-01

    The first run of the Large Hadron Collider (LHC) at CERN was very successful and resulted in important physics discoveries. One way of increasing the luminosity in a collider, which gave a very significant contribution to the LHC performance in the first run and can be used even if the beam intensity cannot be increased, is to decrease the transverse beam size at the interaction points by reducing the optical function β*. However, when doing so, the beam becomes larger in the final focusing system, which could expose its aperture to beam losses. For the LHC, which is designed to store beams with a total energy of 362 MJ, this is critical, since the loss of even a small fraction of the beam could cause a magnet quench or even damage. Therefore, the machine aperture has to be protected by the collimation system. The settings of the collimators constrain the maximum beam size that can be tolerated and therefore impose a lower limit on β*. In this paper, we present calculations to determine safe collimator settings and the resulting limit on β*, based on available aperture and operational stability of the machine. Our model was used to determine the LHC configurations in 2011 and 2012 and it was found that β* could be decreased significantly compared to the conservative model used in 2010. The gain in luminosity resulting from the decreased margins between collimators was more than a factor of 2, and a further contribution from the use of realistic aperture estimates based on measurements was almost as large. This has played an essential role in the rapid and successful accumulation of experimental data in the LHC.
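
    The trade-off the abstract describes rests on the standard relation σ = sqrt(β · ε_geom) between the optical function β and the RMS beam size. A minimal sketch, with illustrative LHC-era numbers (3.75 µm normalized emittance, a 4 TeV proton beam) that are assumptions for illustration, not values taken from the paper:

    ```python
    import math

    # Illustrative numbers only: nominal LHC normalized emittance and a
    # 4 TeV proton beam (m_p c^2 ≈ 0.938 GeV), as in the 2012 run.
    EPSILON_N = 3.75e-6    # normalized transverse emittance [m rad]
    GAMMA = 4000 / 0.938   # Lorentz factor at 4 TeV

    def beam_sigma(beta, epsilon_n=EPSILON_N, gamma=GAMMA):
        """RMS transverse beam size sigma = sqrt(beta * epsilon_geom),
        with geometric emittance epsilon_n / gamma (ultrarelativistic)."""
        return math.sqrt(beta * epsilon_n / gamma)

    # Halving beta* shrinks the beam at the interaction point by sqrt(2),
    # but beta (and hence sigma) grows in the final focusing magnets,
    # which is why collimator margins set the lower limit on beta*.
    print(beam_sigma(0.6) / beam_sigma(0.3))  # ratio ~ sqrt(2)
    ```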

  11. Ghost transmission: How large basis sets can make electron transport calculations worse

    SciTech Connect

    Herrmann, Carmen; Solomon, Gemma C.; Subotnik, Joseph E.; Mujica, Vladimiro; Ratner, Mark A.

    2010-01-01

    The Landauer approach has proven to be an invaluable tool for calculating the electron transport properties of single molecules, especially when combined with a nonequilibrium Green’s function approach and Kohn–Sham density functional theory. However, when using large nonorthogonal atom-centered basis sets, such as those common in quantum chemistry, one can find erroneous results if the Landauer approach is applied blindly. In fact, basis sets of triple-zeta quality or higher sometimes result in an artificially high transmission and possibly even qualitatively wrong conclusions regarding chemical trends. In these cases, transport persists when molecular atoms are replaced by basis functions alone (“ghost atoms”). The occurrence of such ghost transmission is correlated with low-energy virtual molecular orbitals of the central subsystem and may be interpreted as a biased and thus inaccurate description of vacuum transmission. An approximate practical correction scheme is to calculate the ghost transmission and subtract it from the full transmission. As a further consequence of this study, it is recommended that sensitive molecules be used for parameter studies, in particular those whose transmission functions show antiresonance features such as benzene-based systems connected to the electrodes in meta positions and other low-conducting systems such as alkanes and silanes.
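
    The correction scheme proposed in the abstract is arithmetically simple: compute the transmission with ghost atoms only and subtract it pointwise from the full transmission. A minimal sketch with a hypothetical function name and made-up transmission values (assuming both curves are evaluated on the same energy grid):

    ```python
    def corrected_transmission(t_full, t_ghost):
        """Approximate correction from the abstract: subtract the ghost
        transmission (basis functions without atoms) from the full
        transmission, energy point by energy point."""
        return [tf - tg for tf, tg in zip(t_full, t_ghost)]

    # Hypothetical transmission values on a small energy grid.
    t_full = [1e-4, 5e-4, 2e-3]
    t_ghost = [5e-5, 1e-4, 1e-4]
    print(corrected_transmission(t_full, t_ghost))
    ```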

  12. fastSTRUCTURE: Variational Inference of Population Structure in Large SNP Data Sets

    PubMed Central

    Raj, Anil; Stephens, Matthew; Pritchard, Jonathan K.

    2014-01-01

    Tools for estimating population structure from genetic data are now used in a wide variety of applications in population genetics. However, inferring population structure in large modern data sets imposes severe computational challenges. Here, we develop efficient algorithms for approximate inference of the model underlying the STRUCTURE program using a variational Bayesian framework. Variational methods pose the problem of computing relevant posterior distributions as an optimization problem, allowing us to build on recent advances in optimization theory to develop fast inference tools. In addition, we propose useful heuristic scores to identify the number of populations represented in a data set and a new hierarchical prior to detect weak population structure in the data. We test the variational algorithms on simulated data and illustrate using genotype data from the CEPH–Human Genome Diversity Panel. The variational algorithms are almost two orders of magnitude faster than STRUCTURE and achieve accuracies comparable to those of ADMIXTURE. Furthermore, our results show that the heuristic scores for choosing model complexity provide a reasonable range of values for the number of populations represented in the data, with minimal bias toward detecting structure when it is very weak. Our algorithm, fastSTRUCTURE, is freely available online at http://pritchardlab.stanford.edu/structure.html. PMID:24700103

  13. Numerical methods for accelerating the PCA of large data sets applied to hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Vogt, Frank; Mizaikoff, Boris; Tacke, Maurus

    2002-02-01

    Principal component analysis and regression (PCA, PCR) are widespread algorithms for the calibration of spectrometers and the evaluation of spectra. In many applications, however, there are huge amounts of calibration data, as is common in hyperspectral imaging, for instance. Such data sets often consist of several tens of thousands of spectra measured at several hundred wavelength positions. Hence, a PCA of calibration sets that large is computationally very time-consuming - in particular the included singular value decomposition (SVD). Since this procedure takes several hours of computation time on conventional personal computers, its calculation is often not feasible. In this paper a straightforward acceleration of the PCA is presented, which is achieved by data preprocessing consisting of three steps: data compression based on a wavelet transformation, exclusion of redundant data, and taking advantage of the matrix dimensions. Since the size of the calibration matrix determines the calculation time of the PCA, a reduction of its size enables the acceleration. Due to appropriate data preprocessing, the PCA of the discussed examples could be accelerated by more than one order of magnitude. It is demonstrated by means of synthetically generated spectra as well as by experimental data that, after preprocessing, the PCA results in calibration models comparable to the ones obtained by the conventional approach.
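
    One common way to exploit the matrix dimensions in PCA (which may or may not be the paper's exact step) is this: when a calibration matrix has far more wavelengths than spectra, the loadings can be obtained from the small Gram matrix instead of decomposing the large covariance matrix. A minimal NumPy illustration with synthetic data; the wavelet-compression step is omitted:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical calibration matrix: 50 spectra x 2000 wavelengths.
    X = rng.standard_normal((50, 2000))
    Xc = X - X.mean(axis=0)  # mean-center, as usual for PCA

    # Decompose the 50x50 Gram matrix instead of the 2000x2000
    # covariance matrix, then map its eigenvectors back to loadings.
    G = Xc @ Xc.T
    evals, U = np.linalg.eigh(G)
    order = np.argsort(evals)[::-1]          # descending eigenvalues
    evals, U = evals[order], U[:, order]
    loadings = Xc.T @ U / np.sqrt(np.clip(evals, 1e-12, None))

    # Sanity check against the direct SVD on the full matrix
    # (sign of each component is arbitrary, hence the abs).
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    assert np.allclose(np.abs(loadings[:, 0]), np.abs(Vt[0]), atol=1e-6)
    ```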

  14. Duplications in RB1CC1 are associated with schizophrenia; identification in large European sample sets.

    PubMed

    Degenhardt, F; Priebe, L; Meier, S; Lennertz, L; Streit, F; Witt, S H; Hofmann, A; Becker, T; Mössner, R; Maier, W; Nenadic, I; Sauer, H; Mattheisen, M; Buizer-Voskamp, J; Ophoff, R A; Rujescu, D; Giegling, I; Ingason, A; Wagner, M; Delobel, B; Andrieux, J; Meyer-Lindenberg, A; Heinz, A; Walter, H; Moebus, S; Corvin, A; Rietschel, M; Nöthen, M M; Cichon, S

    2013-01-01

    Schizophrenia (SCZ) is a severe and debilitating neuropsychiatric disorder with an estimated heritability of ~80%. Recently, de novo mutations, identified by next-generation sequencing (NGS) technology, have been suggested to contribute to the risk of developing SCZ. Although these studies show an overall excess of de novo mutations among patients compared with controls, it is not easy to pinpoint specific genes hit by de novo mutations as actually involved in the disease process. Importantly, support for a specific gene can be provided by the identification of additional alterations in several independent patients. We took advantage of existing genome-wide single-nucleotide polymorphism data sets to screen for deletions or duplications (copy number variations, CNVs) in genes previously implicated by NGS studies. Our approach was based on the observation that CNVs constitute part of the mutational spectrum in many human disease-associated genes. In a discovery step, we investigated whether CNVs in 55 candidate genes, suggested from NGS studies, were more frequent among 1637 patients compared with 1627 controls. Duplications in RB1CC1 were overrepresented among patients. This finding was followed up in large, independent European sample sets. In the combined analysis, totaling 8461 patients and 112 871 controls, duplications in RB1CC1 were found to be associated with SCZ (P=1.29 × 10⁻⁵; odds ratio=8.58). Our study provides evidence for rare duplications in RB1CC1 as a risk factor for SCZ. PMID:26151896

  15. Motif-based analysis of large nucleotide data sets using MEME-ChIP.

    PubMed

    Ma, Wenxiu; Noble, William S; Bailey, Timothy L

    2014-01-01

    MEME-ChIP is a web-based tool for analyzing motifs in large DNA or RNA data sets. It can analyze peak regions identified by ChIP-seq, cross-linking sites identified by CLIP-seq and related assays, as well as sets of genomic regions selected using other criteria. MEME-ChIP performs de novo motif discovery, motif enrichment analysis, motif location analysis and motif clustering, providing a comprehensive picture of the DNA or RNA motifs that are enriched in the input sequences. MEME-ChIP performs two complementary types of de novo motif discovery: weight matrix-based discovery for high accuracy; and word-based discovery for high sensitivity. Motif enrichment analysis using DNA or RNA motifs from human, mouse, worm, fly and other model organisms provides even greater sensitivity. MEME-ChIP's interactive HTML output groups and aligns significant motifs to ease interpretation. This protocol takes less than 3 h, and it provides motif discovery approaches that are distinct and complementary to other online methods. PMID:24853928

  16. Comparison of Bias Analysis Strategies Applied to a Large Data Set

    PubMed Central

    Lash, Timothy L.; Abrams, Barbara; Bodnar, Lisa M.

    2015-01-01

    Background Epidemiologic data sets continue to grow larger. Probabilistic-bias analyses, which simulate hundreds of thousands of replications of the original data set, may challenge desktop computational resources. Methods We implemented a probabilistic-bias analysis to evaluate the direction, magnitude, and uncertainty of the bias arising from misclassification of prepregnancy body mass index when studying its association with early preterm birth in a cohort of 773,625 singleton births. We compared 3 bias analysis strategies: (1) using the full cohort, (2) using a case-cohort design, and (3) weighting records by their frequency in the full cohort. Results Underweight and overweight mothers were more likely to deliver early preterm. A validation substudy demonstrated misclassification of prepregnancy body mass index derived from birth certificates. Probabilistic-bias analyses suggested that the association between underweight and early preterm birth was overestimated by the conventional approach, whereas the associations between overweight categories and early preterm birth were underestimated. The 3 bias analyses yielded equivalent results and challenged our typical desktop computing environment. Analyses applied to the full cohort, case cohort, and weighted full cohort required 7.75 days and 4 terabytes, 15.8 hours and 287 gigabytes, and 8.5 hours and 202 gigabytes, respectively. Conclusions Large epidemiologic data sets often include variables that are imperfectly measured, often because data were collected for other purposes. Probabilistic-bias analysis allows quantification of errors but may be difficult in a desktop computing environment. Solutions that allow these analyses in this environment can be achieved without new hardware and within reasonable computational time frames. PMID:24815306
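
    Strategy (3), weighting records by their frequency in the full cohort, can be sketched with a toy example: collapsing records to unique covariate patterns preserves all cell counts while shrinking the data by orders of magnitude, which is what makes desktop-scale bias analysis feasible. The categories and rates below are hypothetical, not the paper's data:

    ```python
    from collections import Counter
    import random

    random.seed(0)
    # Hypothetical cohort: each record is (BMI category, early preterm yes/no).
    cohort = [(random.choice(["under", "normal", "over", "obese"]),
               random.random() < 0.03) for _ in range(100_000)]

    # Collapse to unique covariate patterns, weighted by frequency.
    weighted = Counter(cohort)

    # Any analysis that loops over records can loop over the handful of
    # unique patterns instead, multiplying each contribution by its
    # weight; the full-cohort cell counts are reproduced exactly.
    assert sum(weighted.values()) == len(cohort)
    assert weighted[("under", True)] == cohort.count(("under", True))
    print(f"{len(cohort)} records collapsed to {len(weighted)} weighted rows")
    ```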

  17. A multivariate approach to filling gaps in large ecological data sets using probabilistic matrix factorization techniques

    NASA Astrophysics Data System (ADS)

    Schrodt, F. I.; Shan, H.; Kattge, J.; Reich, P.; Banerjee, A.; Reichstein, M.

    2012-12-01

    With the advent of remotely sensed data and coordinated efforts to create global databases, the ecological community has progressively become more data-intensive. However, in contrast to other disciplines, statistical ways of handling these large data sets, especially the gaps which are inherent to them, are lacking. Widely used theoretical approaches, for example model averaging based on Akaike's information criterion (AIC), are sensitive to missing values. Yet, the most common way of handling sparse matrices - the deletion of cases with missing data (complete case analysis) - is known to severely reduce statistical power as well as inducing biased parameter estimates. In order to address these issues, we present novel approaches to gap filling in large ecological data sets using matrix factorization techniques. Factorization based matrix completion was developed in a recommender system context and has since been widely used to impute missing data in fields outside the ecological community. Here, we evaluate the effectiveness of probabilistic matrix factorization techniques for imputing missing data in ecological matrices using two imputation techniques. Hierarchical Probabilistic Matrix Factorization (HPMF) effectively incorporates hierarchical phylogenetic information (phylogenetic group, family, genus, species and individual plant) into the trait imputation. Kernelized Probabilistic Matrix Factorization (KPMF) on the other hand includes environmental information (climate and soils) into the matrix factorization through kernel matrices over rows and columns. We test the accuracy and effectiveness of HPMF and KPMF in filling sparse matrices, using the TRY database of plant functional traits (http://www.try-db.org). TRY is one of the largest global compilations of plant trait databases (750 traits of 1 million plants), encompassing data on morphological, anatomical, biochemical, phenological and physiological features of plants. 
However, despite its unprecedented coverage, the TRY database is still very sparse, severely limiting joint trait analyses. Plant traits are the key to understanding how plants as primary producers adjust to changes in environmental conditions and in turn influence them. Forming the basis for Dynamic Global Vegetation Models (DGVMs), plant traits are also fundamental in global change studies for predicting future ecosystem changes. It is thus imperative that missing data are imputed in as accurate and precise a way as possible. In this study, we show the advantage of applying probabilistic matrix factorization techniques in incorporating hierarchical and environmental information for the prediction of missing plant traits, as compared to conventional imputation techniques such as the complete case and mean approaches. We will discuss advantages of the proposed imputation techniques over other widely used methods such as multiple imputation (MI), as well as possible applications to other data sets.
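As a concrete illustration of the matrix-completion idea (not the HPMF/KPMF code used in the study), a basic probabilistic matrix factorization can be fit by gradient descent on the observed cells only, with Gaussian priors acting as L2 regularizers; the toy trait matrix, latent rank, and hyperparameters are all invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy plants-x-traits matrix with gaps (np.nan), standing in for a tiny
# slice of a TRY-like table; the values are invented.
M = np.array([[1.0, 2.0, np.nan],
              [0.9, np.nan, 3.1],
              [np.nan, 2.1, 3.0],
              [1.1, 1.9, 2.9]])
mask = ~np.isnan(M)
M_obs = np.where(mask, M, 0.0)

k, lam, lr = 2, 0.1, 0.05                     # latent rank, prior weight, step
U = 0.1 * rng.standard_normal((M.shape[0], k))
V = 0.1 * rng.standard_normal((M.shape[1], k))

# MAP estimate for basic PMF: squared error on observed cells plus
# Gaussian priors on both factor matrices, minimized by gradient descent.
for _ in range(2000):
    E = np.where(mask, U @ V.T - M_obs, 0.0)  # residuals on observed cells
    U, V = U - lr * (E @ V + lam * U), V - lr * (E.T @ U + lam * V)

imputed = np.where(mask, M, U @ V.T)          # fill gaps, keep observations
fit_rmse = float(np.sqrt(np.mean(((U @ V.T)[mask] - M[mask]) ** 2)))
```

HPMF and KPMF extend this scheme by tying the priors to phylogenetic hierarchy or to kernels over rows and columns, which is what lets side information flow into the imputation.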

  18. Evaluation of flow resistance in gravel-bed rivers through a large field data set

    NASA Astrophysics Data System (ADS)

    Rickenmann, Dieter; Recking, Alain

    2011-07-01

    A data set of 2890 field measurements was used to test the ability of several conventional flow resistance equations to predict mean flow velocity in gravel bed rivers when used with no calibration. The tests were performed using both flow depth and discharge as input since discharge may be a more reliable measure of flow conditions in shallow flows. Generally better predictions are obtained when using flow discharge as input. The results indicate that the Manning-Strickler and the Keulegan equations show considerable disagreement with observed flow velocities for flow depths smaller than 10 times the characteristic grain diameter. Most equations show some systematic deviation for small relative flow depth. The use of new definitions for dimensionless variables in terms of nondimensional hydraulic geometry equations allows the development of a new flow resistance equation. The best overall performance is obtained by the Ferguson approach, which combines two power law flow resistance equations that are different for deep and shallow flows. To use this approach with flow discharge as input, a logarithmic matching equation in terms of the new dimensionless variables is proposed. For the domains of intermediate and large-scale roughness, the field data indicate a considerable increase in flow resistance as compared with the domain of small-scale roughness. The Ferguson approach is used to discuss the importance of flow resistance partitioning for bed load transport calculations at flow conditions with intermediate- and large-scale roughness in natural gravel, cobble, and boulder bed streams.
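The Ferguson approach mentioned above blends a deep-flow power law and a shallow-flow power law into a single variable-power resistance equation. A minimal sketch with flow depth as input; the coefficients a1 ≈ 6.5 and a2 ≈ 2.5 are the values commonly quoted for Ferguson's equation, and the example channel numbers are invented:

```python
import math

def ferguson_velocity(depth, slope, d84, a1=6.5, a2=2.5, g=9.81):
    """Mean flow velocity (m/s) from the Ferguson variable-power
    resistance equation: U/u* = a1*a2*R / sqrt(a1^2 + a2^2 * R^(5/3)),
    where R = depth/D84 is relative submergence and u* is shear velocity.
    Coefficient values here are the commonly quoted ones (an assumption)."""
    u_star = math.sqrt(g * depth * slope)      # shear velocity
    rel = depth / d84                          # relative submergence
    return u_star * a1 * a2 * rel / math.sqrt(a1**2 + a2**2 * rel**(5 / 3))

# Shallow flow (depth comparable to D84) gives a much lower U/u*, i.e.
# much higher flow resistance, than deep flow over the same bed.
deep = ferguson_velocity(depth=2.0, slope=0.005, d84=0.1)     # ~3.3 m/s
shallow = ferguson_velocity(depth=0.15, slope=0.005, d84=0.1)  # ~0.28 m/s
```

This captures the behaviour the field data show: conventional logarithmic or Manning-Strickler laws overpredict velocity once depth drops below roughly ten grain diameters, while the variable-power form transitions smoothly between the two regimes.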

  19. Stacking of large interferometric data sets in the image- and uv-domain - a comparative study

    NASA Astrophysics Data System (ADS)

    Lindroos, L.; Knudsen, K. K.; Vlemmings, W.; Conway, J.; Martí-Vidal, I.

    2015-02-01

    We present a new algorithm for stacking radio interferometric data in the uv-domain. The performance of uv-stacking is compared to the stacking of fully imaged data using simulated Atacama Large Millimeter/submillimeter Array and Karl G. Jansky Very Large Array (VLA) deep extragalactic surveys. We find that image- and uv-stacking produce similar results; however, uv-stacking is typically the more robust method. An advantage of the uv-stacking algorithm is the availability of uv-data post-stacking, which makes it possible to identify and remove problematic baselines. For deep VLA surveys, uv-stacking yields a signal-to-noise ratio that is up to 20 per cent higher than image-stacking. Furthermore, we have investigated stacking of resolved sources with a simulated VLA data set where 1.5 arcsec (10-12 kpc at z ˜ 1-4) sources are stacked. We find that uv-stacking, where a model is fitted directly to the visibilities, significantly improves the accuracy and robustness of the size estimates. While the scientific motivation for this work is studying faint, high-z galaxies, the algorithm analysed here would also be applicable in other fields of astronomy. Stacking of radio interferometric data is also expected to play a significant role in future surveys with telescopes such as the Low-Frequency Array and the Square Kilometre Array.
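The core of a uv-stacking step is a phase rotation of the visibilities to each target position followed by an average over targets. The simulation below (invented source positions, uv coverage, noise level, and a simplified sign convention) recovers the mean point-source flux without ever forming an image:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical point sources: direction cosines (l, m) and unit fluxes.
positions = [(1e-4, -2e-4), (-3e-4, 1e-4), (2e-4, 2e-4)]
fluxes = [1.0, 1.0, 1.0]

# Random uv coverage (in wavelengths) and the visibilities it would
# measure for those sources, plus complex Gaussian noise.
u = rng.uniform(-5e3, 5e3, 2000)
v = rng.uniform(-5e3, 5e3, 2000)
vis = sum(s * np.exp(-2j * np.pi * (u * l + v * m))
          for (l, m), s in zip(positions, fluxes))
vis = vis + 0.5 * (rng.standard_normal(2000) + 1j * rng.standard_normal(2000))

# uv-stacking: phase-rotate each target to the phase centre, then average.
# (The sign convention matches the simulated visibilities above.)
stack = np.mean([vis * np.exp(2j * np.pi * (u * l + v * m))
                 for l, m in positions], axis=0)
stacked_flux = stack.mean().real   # ~1.0 for these unit-flux sources
```

Because the stacked product is still a visibility set, model fitting (e.g. for source sizes) and baseline flagging remain possible after stacking, which is the advantage the abstract highlights over image-plane stacking.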

  20. Galaxy Evolution Insights from Spectral Modeling of Large Data Sets from the Sloan Digital Sky Survey

    SciTech Connect

    Hoversten, Erik A.; /Johns Hopkins U.

    2007-10-01

    This thesis centers on the use of spectral modeling techniques on data from the Sloan Digital Sky Survey (SDSS) to gain new insights into current questions in galaxy evolution. The SDSS provides a large, uniform, high-quality data set which can be exploited in a number of ways. One avenue pursued here is to use the large sample size to measure precisely the mean properties of galaxies of increasingly narrow parameter ranges. The other route taken is to look for rare objects which open up for exploration new areas in galaxy parameter space. The crux of this thesis is revisiting the classical Kennicutt method for inferring the stellar initial mass function (IMF) from the integrated light properties of galaxies. A large data set (≈ 10^5 galaxies) from the SDSS DR4 is combined with more in-depth modeling and quantitative statistical analysis to search for systematic IMF variations as a function of galaxy luminosity. Galaxy Hα equivalent widths are compared to a broadband color index to constrain the IMF. It is found that for the sample as a whole the best fitting IMF power-law slope above 0.5 M⊙ is Λ = 1.5 ± 0.1, with the error dominated by systematics. Galaxies brighter than around M_r,0.1 = -20 (including galaxies like the Milky Way, which has M_r,0.1 ≈ -21) are well fit by a universal Λ ≈ 1.4 IMF, similar to the classical Salpeter slope, and smooth, exponential star formation histories (SFH). Fainter galaxies prefer steeper IMFs, and the quality of the fits reveals that for these galaxies a universal IMF with smooth SFHs is actually a poor assumption. Related projects are also pursued. A targeted photometric search is conducted for strongly lensed Lyman break galaxies (LBG) similar to MS1512-cB58. The evolution of the photometric selection technique is described, as are the results of spectroscopic follow-up of the best targets.
The serendipitous discovery of two interesting blue compact dwarf galaxies is reported. These galaxies were identified by their extremely weak (< 150) [N II] λ6584 to Hα emission line ratios. Abundance analysis from emission line fluxes reveals that these galaxies have gas phase oxygen abundances 12 + log(O/H) ≈ 7.7 to 7.9, not remarkably low, and near infrared imaging detects an old stellar population. However, the measured nitrogen to oxygen ratios log(N/O) < -1.7 are anomalously low for blue compact dwarf galaxies. These objects may be useful for understanding the chemical evolution of nitrogen.

  1. Measurement, visualization and analysis of extremely large data sets with a nanopositioning and nanomeasuring machine

    NASA Astrophysics Data System (ADS)

    Birli, O.; Franke, K.-H.; Linß, G.; Machleidt, T.; Manske, E.; Schale, F.; Schwannecke, H.-C.; Sparrer, E.; Weiß, M.

    2013-04-01

    Nanopositioning and nanomeasuring machines (NPM machines) developed at the Ilmenau University of Technology allow the measurement of micro- and nanostructures with nanometer precision in a measurement volume of 25 mm × 25 mm × 5 mm (NMM-1) or 200 mm × 200 mm × 25 mm (NPMM-200). Various visual, tactile or atomic force sensors can all be used to measure specimens. Atomic force sensors have emerged as a powerful tool in nanotechnology. Large-scale AFM measurements are very time-consuming and in fact in a practical sense they are impossible over millimeter ranges due to low scanning speeds. A cascaded multi-sensor system can be used to implement a multi-scale measurement and testing strategy for nanopositioning and nanomeasuring machines. This approach involves capturing an overview image at the limit of optical resolution and automatically scanning the measured data for interesting test areas that are suitable for a higher-resolution measurement. These "fields of interest" can subsequently be measured in the same NPM machine using individual AFM sensor scans. The results involve extremely large data sets that cannot be handled by off-the-shelf software. Quickly navigating within terabyte-sized data files requires preprocessing to be done on the measured data to calculate intermediate images based on the principle of a visualization pyramid. This pyramid includes the measured data of the entire volume, prepared in the form of discrete measurement volumes (spatial tiles or cubes) with certain edge lengths at specific zoom levels. The functionality of the closed process chain is demonstrated using a blob analysis for automatically selecting regions of interest on the specimen. As expected, processing large amounts of data places particularly high demands on both computing power and the software architecture.
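The visualization pyramid described above can be indexed like a 3-D tile scheme: each zoom level halves the cube edge, and a coordinate maps to the cube containing it. The scheme below is a hypothetical illustration (the actual tiling and edge lengths of the NPM software are not specified in the abstract; the base edge here simply matches the 200 mm NPMM-200 axis):

```python
def tile_index(x_nm, y_nm, z_nm, level, base_edge_nm=400_000_000):
    """Map a measurement coordinate (in nm) to its cube index at a given
    pyramid level. Level 0 is one cube spanning the base edge; every
    further level halves the cube edge, so navigation only ever touches
    the handful of cubes intersecting the current view."""
    edge = base_edge_nm / (2 ** level)
    return (int(x_nm // edge), int(y_nm // edge), int(z_nm // edge))

# A probe position of (150 mm, 50 mm, 10 mm), expressed in nanometres,
# falls into cube (3, 1, 0) at level 3 (cube edge 50 mm):
idx = tile_index(150e6, 50e6, 10e6, level=3)
```

Precomputing a reduced-resolution image per cube per level is what makes terabyte-sized AFM mosaics navigable: the viewer loads only the cubes for the current zoom level instead of streaming the raw data.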

  2. Science Teachers' Decision-Making in Abstinence-Only-Until-Marriage (AOUM) Classrooms: Taboo Subjects and Discourses of Sex and Sexuality in Classroom Settings

    ERIC Educational Resources Information Center

    Gill, Puneet Singh

    2015-01-01

    Sex education, especially in the southeastern USA, remains steeped in an Abstinence-Only-Until-Marriage (AOUM) approach, which sets up barriers to the education of sexually active students. Research confirms that science education has the potential to facilitate discussion of controversial topics, including sex education. Science teachers in the…

  4. Any Questions? An Application of Weick's Model of Organizing to Increase Student Involvement in the Large-Lecture Classroom

    ERIC Educational Resources Information Center

    Ledford, Christy J. W.; Saperstein, Adam K.; Cafferty, Lauren A.; McClintick, Stacey H.; Bernstein, Ethan M.

    2015-01-01

    Microblogs, with their interactive nature, can engage students in community building and sensemaking. Using Weick's model of organizing as a framework, we integrated the use of micromessaging to increase student engagement in the large-lecture classroom. Students asked significantly more questions and asked a greater diversity of questions…

  5. An Evaluation of the Developmental Designs Approach and Professional Development Model on Classroom Management in 22 Middle Schools in a Large, Midwestern School District

    ERIC Educational Resources Information Center

    Hough, David L.

    2011-01-01

    This study presents findings from an evaluation of the Developmental Designs classroom management approach and professional development model during its first year of implementation across 22 middle schools in a large, Midwestern school district. The impact of this professional development model on teaching and learning as related to participants'…

  7. Efficient Implementation of an Optimal Interpolator for Large Spatial Data Sets

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; Mount, David M.

    2007-01-01

    Scattered data interpolation is a problem of interest in numerous areas such as electronic imaging, smooth surface modeling, and computational geometry. Our motivation arises from applications in geology and mining, which often involve large scattered data sets and a demand for high accuracy. The method of choice is ordinary kriging, because it is the best linear unbiased estimator. Unfortunately, this interpolant is computationally very expensive to compute exactly. For n scattered data points, computing the value of a single interpolant involves solving a dense linear system of size roughly n x n. This is infeasible for large n. In practice, kriging is solved approximately by local approaches that are based on considering only a relatively small number of points that lie close to the query point. There are many problems with this local approach, however. The first is that determining the proper neighborhood size is tricky, and is usually solved by ad hoc methods such as selecting a fixed number of nearest neighbors or all the points lying within a fixed radius. Such fixed neighborhood sizes may not work well for all query points, depending on the local density of the point distribution. Local methods also suffer from the problem that the resulting interpolant is not continuous. Meyer showed that while kriging produces smooth continuous surfaces, it has zero-order continuity along its borders. Thus, at interface boundaries where the neighborhood changes, the interpolant behaves discontinuously. Therefore, it is important to consider and solve the global system for each interpolant. However, solving such large dense systems for each query point is impractical. Recently a more principled approach to approximating kriging has been proposed, based on a technique called covariance tapering. The problems arise from the fact that the covariance functions that are used in kriging have global support.
Our implementations combine, utilize, and enhance a number of different approaches that have been introduced in the literature for solving large linear systems for interpolation of scattered data points. For very large systems, exact methods such as Gaussian elimination are impractical since they require O(n^3) time and O(n^2) storage. As Billings et al. suggested, we use an iterative approach. In particular, we use the SYMMLQ method for solving the large but sparse ordinary kriging systems that result from tapering. The main technical issue that needs to be overcome in our algorithmic solution is that the points' covariance matrix for kriging should be symmetric positive definite. The goal of tapering is to obtain a sparse approximate representation of the covariance matrix while maintaining its positive definiteness. Furrer et al. used tapering to obtain a sparse linear system of the form Ax = b, where A is the tapered symmetric positive definite covariance matrix. Thus, Cholesky factorization could be used to solve their linear systems. They implemented an efficient sparse Cholesky decomposition method. They also showed that if these tapers are used for a limited class of covariance models, the solution of the system converges to the solution of the original system. Matrix A in the ordinary kriging system, while symmetric, is not positive definite. Thus, their approach is not applicable to the ordinary kriging system. Therefore, we use tapering only to obtain a sparse linear system. Then, we use SYMMLQ to solve the ordinary kriging system. We show that solving large kriging systems becomes practical via tapering and iterative methods, and results in lower estimation errors compared to traditional local approaches, and significant memory savings compared to the original global system. We also developed a more efficient variant of the sparse SYMMLQ method for large ordinary kriging systems.
This approach adaptively finds the correct local neighborhood for each query point in the interpolation process.
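The tapering idea is easy to demonstrate: multiplying the covariance by a compactly supported Wendland-type taper zeroes all entries beyond a cutoff while (for suitable taper/model pairs) preserving positive definiteness of the covariance block. The sketch below builds the bordered ordinary kriging system, which is symmetric but not positive definite because of the unbiasedness constraint, exactly the property that motivates SYMMLQ in the text; since this toy problem is tiny, it uses a dense solve instead, and the covariance model, taper range, and data are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
pts = rng.uniform(0, 10, (50, 2))            # scattered sample locations
query = np.array([5.0, 5.0])

def cov(h, scale=3.0):
    return np.exp(-h / scale)                # exponential covariance model

def taper(h, theta=4.0):
    """Wendland-type compactly supported taper: zero beyond range theta."""
    t = np.clip(1.0 - h / theta, 0.0, None)
    return t ** 4 * (4.0 * h / theta + 1.0)

D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
C = cov(D) * taper(D)                        # tapered covariance: sparse
sparsity = float(np.mean(C == 0.0))          # fraction of zeroed entries

# Bordered ordinary kriging system [[C, 1], [1^T, 0]] [w; mu] = [c0; 1]:
# symmetric but NOT positive definite, hence SYMMLQ rather than Cholesky
# for the large sparse case described above.
n = len(pts)
A = np.zeros((n + 1, n + 1))
A[:n, :n] = C
A[:n, n] = A[n, :n] = 1.0                    # unbiasedness constraint
hq = np.linalg.norm(pts - query, axis=1)
b = np.append(cov(hq) * taper(hq), 1.0)

weights = np.linalg.solve(A, b)[:n]          # kriging weights at the query
```

The weights sum to one by construction (the constraint row), and well over half of the covariance entries are exactly zero after tapering, which is the sparsity an iterative solver exploits at scale.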

  8. PORTAAL: A Classroom Observation Tool Assessing Evidence-Based Teaching Practices for Active Learning in Large Science, Technology, Engineering, and Mathematics Classes.

    PubMed

    Eddy, Sarah L; Converse, Mercedes; Wenderoth, Mary Pat

    2015-01-01

    There is extensive evidence that active learning works better than a completely passive lecture. Despite this evidence, adoption of these evidence-based teaching practices remains low. In this paper, we offer one tool to help faculty members implement active learning. This tool identifies 21 readily implemented elements that have been shown to increase student outcomes related to achievement, logic development, or other relevant learning goals with college-age students. Thus, this tool both clarifies the research-supported elements of best practices for instructor implementation of active learning in the classroom setting and measures instructors' alignment with these practices. We describe how we reviewed the discipline-based education research literature to identify best practices in active learning for adult learners in the classroom and used these results to develop an observation tool (Practical Observation Rubric To Assess Active Learning, or PORTAAL) that documents the extent to which instructors incorporate these practices into their classrooms. We then use PORTAAL to explore the classroom practices of 25 introductory biology instructors who employ some form of active learning. Overall, PORTAAL documents how well aligned classrooms are with research-supported best practices for active learning and provides specific feedback and guidance to instructors to allow them to identify what they do well and what could be improved. PMID:26033871

  10. Clickers in College Classrooms: Fostering Learning with Questioning Methods in Large Lecture Classes

    ERIC Educational Resources Information Center

    Mayer, Richard E.; Stull, Andrew; DeLeeuw, Krista; Almeroth, Kevin; Bimber, Bruce; Chun, Dorothy; Bulger, Monica; Campbell, Julie; Knight, Allan; Zhang, Hangjin

    2009-01-01

    What can be done to promote student-instructor interaction in a large lecture class? One approach is to use a personal response system (or "clickers") in which students press a button on a hand-held remote control device corresponding to their answer to a multiple choice question projected on a screen, then see the class distribution of answers on…

  11. Taking Energy to the Physics Classroom from the Large Hadron Collider at CERN

    ERIC Educational Resources Information Center

    Cid, Xabier; Cid, Ramon

    2009-01-01

    In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…

  13. Registering coherent change detection products associated with large image sets and long capture intervals

    DOEpatents

    Perkins, David Nikolaus; Gonzales, Antonio I

    2014-04-08

    A set of co-registered coherent change detection (CCD) products is produced from a set of temporally separated synthetic aperture radar (SAR) images of a target scene. A plurality of transformations are determined, which transformations are respectively for transforming a plurality of the SAR images to a predetermined image coordinate system. The transformations are used to create, from a set of CCD products produced from the set of SAR images, a corresponding set of co-registered CCD products.
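Mechanically, the co-registration step amounts to pushing each image's coordinates through its estimated transformation into the predetermined coordinate system. A minimal sketch using a hypothetical 3x3 homogeneous (affine) transform; the patent abstract does not specify the transform model, so the matrix and points here are purely illustrative:

```python
import numpy as np

def to_reference(points, T):
    """Apply a 3x3 homogeneous transform T (one per SAR image, estimated
    elsewhere) to pixel coordinates, mapping them into the shared image
    coordinate system."""
    pts_h = np.c_[points, np.ones(len(points))]   # append homogeneous 1s
    mapped = pts_h @ T.T
    return mapped[:, :2] / mapped[:, 2:]          # back to 2-D coordinates

# Hypothetical transform: slight shear plus a (5, -3) pixel translation.
T = np.array([[1.0, 0.02, 5.0],
              [-0.02, 1.0, -3.0],
              [0.0, 0.0, 1.0]])
corners = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
registered = to_reference(corners, T)
```

Applying one such transform per SAR image, and therefore per CCD product derived from those images, is what leaves the whole set of change-detection products mutually aligned.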

  14. BACHSCORE. A tool for evaluating efficiently and reliably the quality of large sets of protein structures

    NASA Astrophysics Data System (ADS)

    Sarti, E.; Zamuner, S.; Cossio, P.; Laio, A.; Seno, F.; Trovato, A.

    2013-12-01

    In protein structure prediction it is of crucial importance, especially at the refinement stage, to score efficiently large sets of models by selecting the ones that are closest to the native state. We here present a new computational tool, BACHSCORE, that allows its users to rank different structural models of the same protein according to their quality, evaluated by using the BACH++ (Bayesian Analysis Conformation Hunt) scoring function. The original BACH statistical potential was already shown to discriminate with very good reliability the protein native state in large sets of misfolded models of the same protein. BACH++ features a novel upgrade in the solvation potential of the scoring function, now computed by adapting the LCPO (Linear Combination of Pairwise Orbitals) algorithm. This change further enhances the already good performance of the scoring function. BACHSCORE can be accessed directly through the web server: bachserver.pd.infn.it. Catalogue identifier: AEQD_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEQD_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 130159 No. of bytes in distributed program, including test data, etc.: 24 687 455 Distribution format: tar.gz Programming language: C++. Computer: Any computer capable of running an executable produced by a g++ compiler (4.6.3 version). Operating system: Linux, Unix OS-es. RAM: 1 073 741 824 bytes Classification: 3. Nature of problem: Evaluate the quality of a protein structural model, taking into account the possible “a priori” knowledge of a reference primary sequence that may be different from the amino-acid sequence of the model; the native protein structure should be recognized as the best model. 
Solution method: The contact potential scores the occurrence of any given type of residue pair in 5 possible contact classes (α-helical contact, parallel β-sheet contact, anti-parallel β-sheet contact, side-chain contact, no contact). The solvation potential scores the occurrence of any residue type in 2 possible environments: buried and solvent exposed. Residue environment is assigned by adapting the LCPO algorithm. Residues present in the reference primary sequence and not present in the model structure contribute to the model score as solvent exposed and as non-contacting all other residues. Restrictions: Input format file according to the Protein Data Bank standard. Additional comments: Parameter values used in the scoring function can be found in the file /folder-to-bachscore/BACH/examples/bach_std.par. Running time: Roughly one minute to score one hundred structures on a desktop PC, depending on their size.

  15. The Amazon shelf setting: tropical, energetic, and influenced by a large river

    NASA Astrophysics Data System (ADS)

    Nittrouer, Charles A.; DeMaster, David J.

    1996-04-01

    A Multidisciplinary Amazon Shelf SEDiment Study (AmasSeds) investigated the oceanic processes near the mouth of the Amazon River in order to understand the fate of its enormous discharge of water, solutes and particulates. In addition to receiving a large fluvial discharge, the continental shelf near the Amazon mouth is situated on the equator and experiences an extremely energetic physical regime. As such, it represents an end member in the spectrum of coastal marine settings, with regard to latitude, energy and discharge. The oceanic processes occurring on the Amazon shelf reflect these environmental characteristics. A range of interdisciplinary interactions was observed on the Amazon shelf in response to its low latitude, among them: plume dynamics have little influence from Coriolis acceleration, riverine particles experience intense weathering conditions, high primary productivity occurs during all seasons, and shoreline sedimentation involves mangrove vegetation. The high-energy conditions of the Amazon shelf result in: water motions dependent on suspended-sediment distributions, trace-metal adsorption controlled by seabed dynamics, severe restriction of macrobenthos, and deep physical reworking of sedimentary strata (to 1 m or more). The great discharge of fluvial materials (water, solutes, particulates) directly or indirectly causes three-dimensional estuarine-like processes and very high rates of primary productivity, sediment accumulation and carbon burial to occur on the shelf. Although AmasSeds research can link a wide range of interdisciplinary oceanic processes to latitude, energetics or discharge, in many cases the observed process is strongly influenced by a coupling of these characteristics. For example, the dominance of Fe and Mn oxides in controlling redox reactions is a result of tropical weathering that concentrates the Fe and Mn and of intense seabed reworking that regularly reoxidizes these phases.
Therefore, the importance of Fe and Mn oxides is a result of both latitude and energy considerations. A fourth characteristic of the Amazon system is its tectonic setting, which determines physiographic features such as drainage-basin size and shelf width. Low latitude and great discharge characterize all areas of the wet tropics; energy expenditure and tectonic setting vary with specific location. All four characteristics must be considered when extrapolating AmasSeds observations to other areas and attempting to predict or interpret oceanic processes.

  16. Classroom management programs for deaf children in state residential and large public schools.

    PubMed

    Wenkus, M; Rittenhouse, B; Dancer, J

    1999-12-01

    Personnel in 4 randomly selected state residential schools for the deaf and 3 randomly selected large public schools with programs for the deaf were surveyed to assess the types of management or disciplinary programs and strategies currently in use with deaf students and the rated effectiveness of such programs. Several behavioral management programs were identified by respondents, with Assertive Discipline most often listed. Ratings of program effectiveness were generally above average on a number of qualitative criteria. PMID:10710770

  17. Considerations for observational research using large data sets in radiation oncology.

    PubMed

    Jagsi, Reshma; Bekelman, Justin E; Chen, Aileen; Chen, Ronald C; Hoffman, Karen; Shih, Ya-Chen Tina; Smith, Benjamin D; Yu, James B

    2014-09-01

    The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. 
This review concludes that carefully designed and executed studies using observational data that possess these qualities hold substantial promise for advancing our understanding of many unanswered questions of importance to the field of radiation oncology. PMID:25195986

  18. Megapixel mythology and photospace: estimating photospace for camera phones from large image sets

    NASA Astrophysics Data System (ADS)

    Hultgren, Bror O.; Hertel, Dirk W.

    2008-01-01

    It is a myth that more pixels alone result in better images. The marketing of camera phones in particular has focused on their pixel counts. However, their performance varies considerably with the conditions of image capture. Camera phones are often used in low-light situations where the lack of a flash and limited exposure time produce underexposed, noisy and blurred images. Camera utilization can be described quantitatively by photospace distributions: statistical descriptions of how frequently pictures are taken at varying light levels and camera-subject distances. If the photospace distribution is known, the user-experienced distribution of quality can be determined either by direct measurement of subjective quality, or by photospace-weighting of objective attributes. Populating a photospace distribution requires examining large numbers of images taken under typical camera-phone usage conditions. ImagePhi was developed as a user-friendly software tool to interactively estimate the primary photospace variables, subject illumination and subject distance, from individual images. Additionally, subjective evaluations of image quality and failure modes for low-quality images can be entered into ImagePhi. ImagePhi has been applied to sets of images taken by typical users with a selection of popular camera phones varying in resolution. The estimated photospace distribution of camera-phone usage has been correlated with the distributions of failure modes. The subjective and objective data show that photospace conditions have a much bigger impact on the image quality of a camera phone than the pixel count of its imager. The 'megapixel myth' is thus seen to be less a myth than an ill-framed conditional assertion, whose conditions are to a large extent specified by the camera's operational state in photospace.
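
    The photospace-weighting step described in the abstract can be sketched numerically. The bin edges, simulated capture conditions, and per-bin quality scores below are illustrative assumptions, not data from the study:

```python
import numpy as np

# Sketch: a "photospace" distribution is a 2-D histogram of capture
# conditions -- subject illumination (lux) vs. subject distance (m).
# User-experienced quality is then a frequency-weighted average of a
# per-bin quality estimate. All numbers here are made up for illustration.

rng = np.random.default_rng(0)

# Simulated capture conditions for 1,000 camera-phone images (assumed data).
illuminance = rng.lognormal(mean=4.0, sigma=1.5, size=1000)   # lux
distance = rng.lognormal(mean=0.5, sigma=0.7, size=1000)      # metres

lux_edges = np.array([0, 10, 100, 1000, 10000, 1e6])
dist_edges = np.array([0, 0.5, 1, 2, 5, np.inf])

# Photospace distribution: fraction of images captured in each bin.
counts, _, _ = np.histogram2d(illuminance, distance,
                              bins=[lux_edges, dist_edges])
photospace = counts / counts.sum()

# Assumed per-bin quality scores (1 = poor ... 5 = excellent); in practice
# these would come from subjective ratings or objective metrics.
quality = np.array([[1, 1, 1, 1, 1],
                    [2, 2, 2, 1, 1],
                    [3, 3, 3, 2, 2],
                    [4, 4, 4, 3, 3],
                    [5, 5, 4, 4, 3]], dtype=float)

# Photospace-weighted mean quality experienced by the user.
weighted_quality = float((photospace * quality).sum())
print(f"photospace-weighted quality: {weighted_quality:.2f}")
```

    A higher pixel count would change none of the weights above, which is the abstract's point: the frequency of low-light, long-distance captures dominates the experienced quality.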

  19. Considerations for Observational Research Using Large Data Sets in Radiation Oncology

    SciTech Connect

    Jagsi, Reshma; Bekelman, Justin E.; Chen, Aileen; Chen, Ronald C.; Hoffman, Karen; Tina Shih, Ya-Chen; Smith, Benjamin D.; Yu, James B.

    2014-09-01

    The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. 
This review concludes that carefully designed and executed studies using observational data that possess these qualities hold substantial promise for advancing our understanding of many unanswered questions of importance to the field of radiation oncology.

  20. Information Theoretic Approaches to Rapid Discovery of Relationships in Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Knuth, K. H.; Rossow, W. B.

    2002-12-01

    Mutual information as the asymptotic Bayesian measure of independence is an excellent starting point for investigating the existence of possible relationships among climate-relevant variables in large data sets. As mutual information is a nonlinear function of its arguments, it is not beholden to the assumption of a linear relationship between the variables in question and can reveal features missed in linear correlation analyses. However, as mutual information is symmetric in its arguments, it only has the ability to reveal the probability that two variables are related. It provides no information as to how they are related; specifically, causal interactions or a relation based on a common cause cannot be detected. For this reason we also investigate the utility of a related quantity called the transfer entropy. The transfer entropy can be written as a difference between mutual informations and has the capability to reveal whether and how the variables are causally related. The application of these information theoretic measures is tested on some familiar examples using data from the International Satellite Cloud Climatology Project (ISCCP) to identify relations between global cloud cover and other variables, including equatorial Pacific sea surface temperature (SST), over seasonal and El Niño-Southern Oscillation (ENSO) cycles.
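
    The two quantities can be sketched with simple histogram estimators; the lag-1 coupled series below are synthetic stand-ins, not ISCCP data:

```python
import numpy as np

def entropy(*series, bins=8):
    """Joint Shannon entropy (nats) of one or more series via histograms."""
    hist, _ = np.histogramdd(np.column_stack(series), bins=bins)
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def mutual_info(x, y, bins=8):
    """I(X;Y) = H(X) + H(Y) - H(X,Y); symmetric, captures nonlinear links."""
    return entropy(x, bins=bins) + entropy(y, bins=bins) - entropy(x, y, bins=bins)

def transfer_entropy(x, y, bins=8):
    """T_{Y->X}: the difference of mutual informations
    I(X_{t+1}; X_t, Y_t) - I(X_{t+1}; X_t), expanded here into entropies."""
    x_next, x_now, y_now = x[1:], x[:-1], y[:-1]
    return (entropy(x_next, x_now, bins=bins)
            + entropy(x_now, y_now, bins=bins)
            - entropy(x_now, bins=bins)
            - entropy(x_next, x_now, y_now, bins=bins))

# Illustrative data: y drives x with a one-step lag.
rng = np.random.default_rng(1)
y = rng.normal(size=5000)
x = np.zeros(5000)
x[1:] = 0.8 * y[:-1] + 0.2 * rng.normal(size=4999)

print("I(x;y)   =", round(mutual_info(x, y), 3))       # weak: no instantaneous link
print("T_{y->x} =", round(transfer_entropy(x, y), 3))  # large: y's past predicts x
print("T_{x->y} =", round(transfer_entropy(y, x), 3))  # small: x does not drive y
```

    The asymmetry of the transfer entropy is what distinguishes "Y drives X" from the symmetric statement "X and Y are related" that mutual information alone can make.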

  1. Information Theoretic Approaches to Rapid Discovery of Relationships in Large Climate Data Sets

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.; Rossow, William B.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Mutual information as the asymptotic Bayesian measure of independence is an excellent starting point for investigating the existence of possible relationships among climate-relevant variables in large data sets. As mutual information is a nonlinear function of its arguments, it is not beholden to the assumption of a linear relationship between the variables in question and can reveal features missed in linear correlation analyses. However, as mutual information is symmetric in its arguments, it only has the ability to reveal the probability that two variables are related. It provides no information as to how they are related; specifically, causal interactions or a relation based on a common cause cannot be detected. For this reason we also investigate the utility of a related quantity called the transfer entropy. The transfer entropy can be written as a difference between mutual informations and has the capability to reveal whether and how the variables are causally related. The application of these information theoretic measures is tested on some familiar examples using data from the International Satellite Cloud Climatology Project (ISCCP) to identify relations between global cloud cover and other variables, including equatorial Pacific sea surface temperature (SST), over seasonal and El Niño-Southern Oscillation (ENSO) cycles.

  2. Analog and digital interface solutions for the common large-area display set (CLADS)

    NASA Astrophysics Data System (ADS)

    Hermann, David J.; Gorenflo, Ronald L.

    1997-07-01

    Battelle is under contract with Warner Robins Air Logistics Center to design a common large area display set (CLADS) for use in multiple airborne command, control, communications, computers and intelligence applications that currently use unique 19-inch cathode ray tubes (CRTs). The CLADS is a modular design, with common modules used wherever possible. Each CLADS includes an application-specific integration kit, which incorporates all of the unique interface components. Since there is no existing digital video interface standard for high-resolution workstations, a standard interface was developed for CLADS and documented as an interface specification. One of the application-specific modules, the application video interface module (AVIM), readily incorporates most of the required application electrical interfaces for a given system into a single module. The analog AVIM, however, poses unique design problems when folding multiple application interface requirements into a single common AVIM for the most prevalent workstation display interface: analog RGB video. Future workstation display interfaces will incorporate fully digital video between the graphics hardware and the digital display device. A digital AVIM is described which utilizes a Fibre Channel interface to deliver high-speed 1280 by 1024, 24-bit, 60 Hz digital video from a PCI graphics card to the CLADS. A video recording and playback device is described, as well as other common CLADS modules, including the display controller and power supply. This paper will discuss both the analog and digital AVIM interfaces, application BIT and power interfaces, as well as CLADS internal interfaces.

  3. Public-private partnerships with large corporations: setting the ground rules for better health.

    PubMed

    Galea, Gauden; McKee, Martin

    2014-04-01

    Public-private partnerships with large corporations offer potential benefits to the health sector but many concerns have been raised, highlighting the need for appropriate safeguards. In this paper we propose five tests that public policy makers may wish to apply when considering engaging in such a public-private partnership. First, are the core products and services provided by the corporation health enhancing or health damaging? In some cases, such as tobacco, the answer is obvious but others, such as food and alcohol, are contested. In such cases, the burden of proof is on the potential partners to show that their activities are health enhancing. Second, do potential partners put their policies into practice in the settings where they can do so, their own workplaces? Third, are the corporate social responsibility activities of potential partners independently audited? Fourth, do potential partners make contributions to the commons rather than to narrow programmes of their choosing? Fifth, is the role of the partner confined to policy implementation rather than policy development, which is ultimately the responsibility of government alone? PMID:24581699

  4. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
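
    The normalized-LCS similarity the paper builds on can be sketched with the standard quadratic dynamic program (the paper's faster hybrid algorithm is not reproduced here); the 2·LCS/(|a|+|b|) normalization and the outlier threshold below are common choices, not necessarily the authors':

```python
def lcs_length(a, b):
    """Classic O(len(a)*len(b)) dynamic-programming LCS length."""
    prev = [0] * (len(b) + 1)
    for ch in a:
        curr = [0]
        for j, other in enumerate(b, start=1):
            curr.append(prev[j - 1] + 1 if ch == other
                        else max(prev[j], curr[j - 1]))
        prev = curr
    return prev[-1]

def lcs_similarity(a, b):
    """Normalized LCS similarity in [0, 1] (one common normalization)."""
    if not a and not b:
        return 1.0
    return 2.0 * lcs_length(a, b) / (len(a) + len(b))

def outliers(seqs, threshold=0.5):
    """Toy anomaly screen: flag sequences whose best similarity to any
    other sequence falls below the threshold (illustrative criterion)."""
    flagged = []
    for i, s in enumerate(seqs):
        best = max(lcs_similarity(s, t) for j, t in enumerate(seqs) if j != i)
        if best < threshold:
            flagged.append(s)
    return flagged

print(outliers(["ABAB", "ABBA", "XXXX"]))  # the dissimilar sequence is flagged
```

    The expensive part is the pairwise LCS computation, which is why the paper devotes its first half to faster LCS algorithms before turning to outlier analysis.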

  5. Setting the Stage for Developing Pre-service Teachers' Conceptions of Good Science Teaching: The role of classroom videos

    NASA Astrophysics Data System (ADS)

    Wong, Siu Ling; Yung, Benny Hin Wai; Cheng, Man Wai; Lam, Kwok Leung; Hodson, Derek

    2006-01-01

    This paper reports findings about a curriculum innovation conducted at The University of Hong Kong. A CD-ROM consisting of videos of two lessons by different teachers demonstrating exemplary science teaching was used to elicit conceptions of good science teaching from student-teachers enrolled in the 1-year Postgraduate Diploma in Education at several stages during the programme. It was found that the videos elicited student-teachers’ conceptions and had an impact on those conceptions prior to the commencement of formal instruction. The videos extended student-teachers’ awareness of alternative teaching methods and approaches not experienced in their own schooling, broadened their awareness of different classroom situations, provided proof of the existence of good practices, and prompted them to reflect on their current preconceptions of good science teaching. In several ways, the videos acted as a catalyst in socializing the transition of student-teachers from the role of student to the role of teacher.

  6. Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer (7th Annual SFAF Meeting, 2012)

    SciTech Connect

    Copeland, Alex

    2012-06-01

    Alex Copeland on "Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  7. Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer (7th Annual SFAF Meeting, 2012)

    ScienceCinema

    Copeland, Alex [DOE JGI

    2013-02-11

    Alex Copeland on "Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  8. Perturbation corrections to Koopmans' theorem. V - A study with large basis sets

    NASA Technical Reports Server (NTRS)

    Chong, D. P.; Langhoff, S. R.

    1982-01-01

    The vertical ionization potentials of N2, F2 and H2O were calculated by perturbation corrections to Koopmans' theorem using six different basis sets. The largest set used includes several sets of polarization functions. Comparison is made with measured values and with results of computations using Green's functions.

  9. Classroom Management and the Librarian

    ERIC Educational Resources Information Center

    Blackburn, Heidi; Hays, Lauren

    2014-01-01

    As librarians take on more instructional responsibilities, the need for classroom management skills becomes vital. Unfortunately, classroom management skills are not taught in library school and therefore, many librarians are forced to learn how to manage a classroom on the job. Different classroom settings such as one-shot instruction sessions…

  10. Linked Scatter Plots, A Powerful Exploration Tool For Very Large Sets of Spectra

    NASA Astrophysics Data System (ADS)

    Carbon, Duane Francis; Henze, Christopher

    2015-08-01

    We present a new tool, based on linked scatter plots, that is designed to efficiently explore very large spectrum data sets such as the SDSS, APOGEE, LAMOST, GAIA, and RAVE data sets. The tool works in two stages: the first uses batch processing and the second runs interactively. In the batch stage, spectra are processed through our data pipeline which computes the depths relative to the local continuum at preselected feature wavelengths. These depths, and any additional available variables such as local S/N level, magnitudes, colors, positions, and radial velocities, are the basic measured quantities used in the interactive stage. The interactive stage employs the NASA hyperwall, a configuration of 128 workstation displays (8x16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. Each hyperwall panel is used to display a fully linked 2-D scatter plot showing the depth of feature A vs the depth of feature B for all of the spectra. A and B change from panel to panel. The relationships between the various (A,B) strengths and any distinctive clustering, as well as unique outlier groupings, are visually apparent when examining and inter-comparing the different panels on the hyperwall. In addition, the data links between the scatter plots allow the user to apply a logical algebra to the measurements. By graphically selecting the objects in any interesting region of any 2-D plot on the hyperwall, the tool immediately and clearly shows how the selected objects are distributed in all the other 2-D plots. The selection process may be repeated multiple times and, at each step, the selections can represent a sequence of logical constraints on the measurements, revealing those objects which satisfy all the constraints thus far. 
The spectra of the selected objects may be examined at any time on a connected workstation display. Using over 945,000,000 depth measurements from 569,738 SDSS DR10 stellar spectra, we illustrate how to quickly isolate and examine such interesting stellar subsets as EMP stars, C-rich EMP stars, and CV stars.
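
    The "logical algebra" of linked selections reduces to chained boolean masks over a shared object index; a minimal sketch with made-up feature names and thresholds (not the tool's actual pipeline):

```python
import numpy as np

# Every panel plots two feature depths for the same objects, so a graphical
# selection in one panel is just a boolean mask that every other panel can
# reuse. Feature names and thresholds below are illustrative only.

rng = np.random.default_rng(42)
n = 10_000
depths = {                       # fractional line depths in [0, 1]
    "CaII_K": rng.uniform(0, 1, n),
    "CH_G": rng.uniform(0, 1, n),
    "Halpha": rng.uniform(0, 1, n),
}

# Each "lasso" on a panel becomes one constraint; chaining them ANDs the masks.
selection = np.ones(n, dtype=bool)
selection &= depths["CaII_K"] < 0.2     # weak Ca II K (hypothetical cut)
selection &= depths["CH_G"] > 0.6       # strong G band (hypothetical cut)
selection &= depths["Halpha"] < 0.5

print(f"{selection.sum()} of {n} spectra satisfy all constraints")
# Every other linked panel then highlights depths[feature][selection].
```

    Because the mask indexes all feature arrays identically, a selection made on one (A,B) panel propagates to all other panels at the cost of a single boolean indexing operation per panel.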

  11. Validation and evaluation of common large-area display set (CLADS) performance specification

    NASA Astrophysics Data System (ADS)

    Hermann, David J.; Gorenflo, Ronald L.

    1998-09-01

    Battelle is under contract with Warner Robins Air Logistics Center to design a Common Large Area Display Set (CLADS) for use in multiple Command, Control, Communications, Computers, and Intelligence (C4I) applications that currently use 19-inch Cathode Ray Tubes (CRTs). Battelle engineers have built and fully tested pre-production prototypes of the CLADS design for AWACS, and are completing pre-production prototype displays for three other platforms simultaneously. With the CLADS design, any display technology that can be packaged to meet the form, fit, and function requirements defined by the Common Large Area Display Head Assembly (CLADHA) performance specification is a candidate for CLADS applications. This technology-independent feature reduced the risk of CLADS development, permits lifelong technology insertion upgrades without unnecessary redesign, and addresses many of the obsolescence problems associated with COTS technology-based acquisition. Performance and environmental testing was performed on the AWACS CLADS and continues on other platforms as a part of the performance specification validation process. A simulator assessment and flight assessment were successfully completed for the AWACS CLADS, and lessons learned from these assessments are being incorporated into the performance specifications. Draft CLADS specifications were released to potential display integrators and manufacturers for review in 1997, and the final version of the performance specifications is scheduled to be released to display integrators and manufacturers in May 1998. Initial USAF applications include replacements for the E-3 AWACS color monitor assembly, E-8 Joint STARS graphics display unit, and ABCCC airborne color display. Initial U.S. Navy applications include the E-2C ACIS display. For these applications, reliability and maintainability are key objectives. 
The common design will reduce the cost of operation and maintenance by an estimated $3.3M per year on E-3 AWACS alone. It is realistic to anticipate savings of over $30M per year as CLADS is implemented widely across DoD applications. As commonality and open systems interfaces begin to surface in DoD applications, the CLADS architecture can easily and cost-effectively absorb the changes, and avoid COTS obsolescence issues.

  12. Getting specific: making taxonomic and ecological sense of large sequencing data sets.

    PubMed

    Massana, Ramon

    2015-06-01

    Eukaryotic microbes comprise a diverse collection of phototrophic and heterotrophic creatures known to play fundamental roles in ecological processes. Some can be identified by light microscopy, generally the largest and those with conspicuous shapes, while the smallest can be counted by epifluorescence microscopy or flow cytometry but remain largely unidentified. Microbial diversity studies greatly advanced with the analysis of phylogenetic markers sequenced from natural assemblages. Molecular surveys began in 1990 targeting marine bacterioplankton (Giovannoni et al.) and first approached microbial eukaryotes in three studies published in 2001 (Díez et al.; López-García et al.; Moon-van der Staay et al.). These seminal studies, based on cloning and Sanger sequencing of the complete 18S rDNA, were critical for obtaining broad pictures of microbial diversity in contrasting habitats and for describing novel lineages by robust phylogenies, but were limited by the number of sequences obtained. So, inventories of species richness in a given sample and community comparisons through environmental gradients were very incomplete. These limitations have been overcome with the advent of high-throughput sequencing (HTS) methods, initially 454-pyrosequencing, today Illumina and soon others to come. In this issue of Molecular Ecology, Egge et al. show a nice example of the use of HTS to study the biodiversity and seasonal succession of a particularly important group of marine microbial eukaryotes, the haptophytes. Temporal changes were analysed first at the community level, then at the clade level, and finally at the lowest rank comparable to species. Interesting and useful ecological insights were obtained at each taxonomic scale. Haptophyte diversity differed along seasons in a systematic manner, with some species showing seasonal preferences and others being always present. 
Many of these species had no correspondence with known species, pointing out the high level of novelty in microbial assemblages, only accessible by molecular tools. Moreover, the number of species detected was limited, agreeing with a putative scenario of constrained evolutionary diversification in free-living small eukaryotes. This study illustrates the potential of HTS to address ecologically relevant questions in an accessible way by processing large data sets that, nonetheless, need to be treated with a fair understanding of their limitations. PMID:26095583

  13. Repulsive parallel MCMC algorithm for discovering diverse motifs from large sequence sets

    PubMed Central

    Ikebata, Hisaki; Yoshida, Ryo

    2015-01-01

    Motivation: The motif discovery problem consists of finding recurring patterns of short strings in a set of nucleotide sequences. This classical problem is receiving renewed attention as most early motif discovery methods lack the ability to handle the large data volumes of recent genome-wide ChIP studies. New ChIP-tailored methods focus on reducing computation time and pay little regard to the accuracy of motif detection. Unlike such methods, our method focuses on increasing the detection accuracy while maintaining the computation efficiency at an acceptable level. The major advantage of our method is that it can mine diverse multiple motifs undetectable by current methods. Results: The repulsive parallel Markov chain Monte Carlo (RPMCMC) algorithm that we propose is a parallel version of the widely used Gibbs motif sampler. RPMCMC is run on parallel interacting motif samplers. A repulsive force is generated when motifs produced by different samplers come close to one another. Thus, different samplers explore different motifs. In this way, we can detect much more diverse motifs than conventional methods can. Through application to 228 transcription factor ChIP-seq datasets of the ENCODE project, we show that the RPMCMC algorithm can find many reliable cofactor interacting motifs that existing methods are unable to discover. Availability and implementation: A C++ implementation of RPMCMC and discovered cofactor motifs for the 228 ENCODE ChIP-seq datasets are available from http://daweb.ism.ac.jp/yoshidalab/motif. Contact: ikebata.hisaki@ism.ac.jp, yoshidar@ism.ac.jp Supplementary information: Supplementary data are available from Bioinformatics online. PMID:25583120
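
    The repulsion idea can be illustrated on a toy problem (this is not the RPMCMC motif sampler itself): two Metropolis chains share one bimodal target, and an assumed Gaussian-shaped penalty lowers the acceptance of moves that approach another chain's state, so the chains tend to settle in different modes:

```python
import math
import random

random.seed(7)

def log_target(x):
    # Bimodal target with equal modes at -3 and +3 (log-sum-exp for safety).
    a = -0.5 * (x + 3) ** 2
    b = -0.5 * (x - 3) ** 2
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def log_repulsion(x, others, scale=1.0, strength=4.0):
    # Assumed penalty form: grows as x nears any other chain's state.
    return -strength * sum(math.exp(-((x - o) / scale) ** 2) for o in others)

states = [0.0, 0.0]          # two chains deliberately started together
for _ in range(5000):
    for i in range(len(states)):
        others = [s for j, s in enumerate(states) if j != i]
        prop = states[i] + random.gauss(0, 1)
        log_alpha = (log_target(prop) + log_repulsion(prop, others)
                     - log_target(states[i]) - log_repulsion(states[i], others))
        if random.random() < math.exp(min(0.0, log_alpha)):
            states[i] = prop

print("final chain states:", [round(s, 2) for s in states])
```

    In RPMCMC the "state" of each sampler is a motif model rather than a scalar, and the penalty is applied between similar motifs, but the mechanism is the same: repulsion discourages samplers from converging on one dominant motif.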

  14. Taking energy to the physics classroom from the Large Hadron Collider at CERN

    NASA Astrophysics Data System (ADS)

    Cid, Xabier; Cid, Ramón

    2009-01-01

    In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this topic, it is not easy for non-specialists to know where the data comes from. The aim of this article is to introduce at a secondary school level a few simple physical calculations about power phenomena that will be present in the LHC: stored beam energy, power and LHC dipole, energy stored in the compact muon solenoid (CMS), energy stored in the ATLAS solenoid and toroid, delivered energy for radiofrequency (RF) cavities, and energy dissipated in dump blocks. In addition, we will be talking about one of the most important scientific institutions in the world and introducing the greatest experiment in history. The calculations that you will find in this article are adapted from physics at secondary school level, and in most cases they are just very simple approaches to the correct results.
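
    One of the calculations mentioned, the stored beam energy, can be reproduced from the nominal LHC design figures (2808 bunches, 1.15 x 10^11 protons per bunch, 7 TeV per proton); treat the result as an order-of-magnitude classroom estimate:

```python
# Back-of-the-envelope LHC arithmetic of the kind the article suggests
# for secondary school. The bunch and energy figures are the published
# nominal design values, not measured operating parameters.

EV_TO_J = 1.602176634e-19     # joules per electron-volt (exact in SI)

bunches = 2808
protons_per_bunch = 1.15e11
energy_per_proton_eV = 7e12   # 7 TeV

protons = bunches * protons_per_bunch
beam_energy_J = protons * energy_per_proton_eV * EV_TO_J

print(f"protons per beam:   {protons:.3e}")
print(f"stored beam energy: {beam_energy_J / 1e6:.0f} MJ")
```

    The result, a few hundred megajoules per beam, is the kind of striking comparison (roughly the kinetic energy of a high-speed train) that makes these numbers accessible to students.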

  15. Gaining A Geological Perspective Through Active Learning in the Large Lecture Classroom

    NASA Astrophysics Data System (ADS)

    Kapp, J. L.; Richardson, R. M.; Slater, S. J.

    2008-12-01

    NATS 101, A Geological Perspective, is a general education course taken by non-science majors. We offer 600 seats per semester, with four large lecture sections taught by different faculty members. In the past we have offered optional once-a-week study groups taught by graduate teaching assistants. Students often feel overwhelmed by the science and associated jargon, and many are prone to skipping lectures altogether. Optional study groups are only attended by ~50% of the students. Faculty members find the class to be a lot of work, mainly due to the grading it generates. Activities given in lecture are often short multiple-choice or true-false assignments, limiting the depth of understanding we can evaluate. Our students often lack math and critical thinking skills, and we spend a lot of time in lecture reintroducing ideas students should have already gotten from the text. In summer 2007 we were funded to redesign the course. Our goals were to 1) cut the cost of running the course, and 2) improve student learning. Under our redesign, optional study groups were replaced by once-a-week mandatory break-out sessions where students complete activities that have been introduced in lecture. Break-out sessions substitute for one hour of lecture, and are run by undergraduate preceptors and graduate teaching assistants (GTAs). During the lecture period, lectures themselves are brief, with a large portion of the class devoted to active learning in small groups. Weekly reading quizzes are submitted via the online course management system. Break-out sessions allow students to spend more time interacting with their fellow students, undergraduate preceptors, and GTAs. They get one-on-one help in break-out sessions on assignments designed to enhance the lecture material. The active lecture format means less of their time is devoted to listening passively to a lecture, and more time is spent peer learning and interacting with the instructor. 
Completing quizzes online allows students more freedom in when and where they complete their work, and we provide instant feedback on their submitted work. The University of Wyoming Cognition in Astronomy, Physics and Earth sciences Research (CAPER) Team, who specialize in project evaluation, are leading the evaluation effort. We are comparing pre-test to post-test gains on the Geoscience Concept Inventory and Attitudes Toward Science surveys before and after the redesign, and inductive analysis of student interviews and reflective writing that describe student perceptions of the modified learning environment. The redesign has cut the cost of the class per student by more than half. This was achieved primarily in two ways: 1) by greatly reducing the number of hours spent by faculty and graduate teaching assistants on preparation, class time, and grading; and 2) reducing the number of graduate teaching assistants required for the class and replacing many of them with undergraduate preceptors. Undergraduate preceptors are not paid, but receive academic credit for their teaching service. The savings from the redesign is used to allow faculty more time to work on institutional priorities.

  16. Interactive 3-D Immersive Visualization for Analysis of Large Multi-Parameter Atmospheric Data Sets

    NASA Astrophysics Data System (ADS)

    Frenzer, J. B.; Hoell, J. M.; Holdzkom, J. J.; Jacob, D.; Fuelberg, H.; Avery, M.; Carmichael, G.; Hopkins, D. L.

    2001-12-01

    Significant improvements in the ability of atmospheric chemistry models to predict the transport and production of atmospheric constituents on regional and global scales have been realized over the past decade. Concurrent with the model improvements has been an increase in the size and complexity of atmospheric observational data sets. As a result, the challenge to provide efficient and realistic visualization of atmospheric data "products" has increased dramatically. Over the past several years, personnel from the Atmospheric Sciences Data Center (ASDC) at NASA's Langley Research Center have explored the merits of visualizing atmospheric data products using interactive, immersive visualization hardware and software. As part of this activity, the Virtual Global Explorer and Observatory (vGeo) software, developed by VRCO, Inc., has been utilized to support the visual analysis of large multivariate data sets. The vGeo software provides an environment in which the user can create, view, navigate, and interact with data, models, and images in an immersive 3-D environment. The vGeo visualization capability was employed during the March/April 2001 NASA Global Tropospheric Experiment Transport and Chemical Evolution over the Pacific (TRACE-P) mission [(GTE) http://www-gte.larc.nasa.gov] to support day-to-day flight-planning activities through the creation of virtual 3-D worlds containing modeled data and proposed aircraft flight paths. The GTE, a major activity within NASA's Earth Science Enterprise, is primarily an aircraft-based measurement program, supplemented by ground-based measurements and satellite observations, focused on understanding the impact of human activity on the global troposphere. The TRACE-P is the most recent campaign conducted by GTE and was deployed to Hong Kong and then to the Yokota Airbase, Japan. 
TRACE-P is the third in a series of GTE field campaigns in the northwestern Pacific region to understand the chemical composition of air masses emerging from the Asian Continent and their impact on the region. Since completing the field deployment phase of TRACE-P, the 3-D visualization capability has been used as a tool to combine and visually analyze TRACE-P data from multiple sources (e.g. model, airborne and ground based measurements, ozone sondes, and satellite observations). This capability to merge measurements into model data fields in a virtual 3-D world is perhaps the most exciting aspect of this new visualization capability. This allows for a more realistic contextual representation of the model/measurement results. The measured parameters along specific flights (of typical duration of 8 hrs) along with supporting ancillary measurements provide the "real" representation of the atmosphere at that specific point in time and space. The models provide the time evolution, and three-dimensional structure during the measurement period. When these are merged together the context of the observations is documented, and model predictions can be validated and/or improved. Specific TRACE-P case studies will be presented showing results from global and regional models coupled with airborne measurements for which the influence of transport on the spatial distribution of species measured on the aircraft was more clearly discerned within the 3-D environment than from conventional visualization techniques.

  17. Tools for Analysis and Visualization of Large Time-Varying CFD Data Sets

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; VanGelder, Allen

    1997-01-01

    In the second year, we continued to build upon and improve the scanline-based direct volume renderer that we developed in the first year of this grant. This extremely general rendering approach can handle regular or irregular grids, including overlapping multiple grids, and polygon mesh surfaces. It runs in parallel on multi-processors. It can also be used in conjunction with a k-d tree hierarchy, where approximate models and error terms are stored in the nodes of the tree, and approximate fast renderings can be created. We have extended our software to handle time-varying data where the data changes but the grid does not, and we are now working on extending it to handle more general time-varying data. We have also developed a new extension of our direct volume renderer that uses automatic decimation of the 3D grid, as opposed to an explicit hierarchy. We explored this alternative approach as being more appropriate for very large data sets, where the extra expense of a tree may be unacceptable. We also describe a new approach to direct volume rendering that uses hardware 3D textures and incorporates lighting effects. Volume rendering using hardware 3D textures is extremely fast, and machines capable of using this technique are becoming more moderately priced. While this technique is at present limited to regular grids, we are pursuing algorithms extending the approach to more general grid types. We have also begun to explore a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH '96. In our initial implementation, we automatically image the volume from 32 equidistant positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation. We are studying whether this will give a quantitative measure of the effects of approximation.
We have created new tools for exploring the differences between images produced by various rendering methods. Images created by our software can be stored in the SGI RGB format. Our idtools software reads in a pair of images and compares them using various metrics. The differences of the images under the RGB, HSV, and HSL color models can be calculated and shown. We can also calculate the auto-correlation function and the Fourier transform of the images and image differences. We will explore how these image differences compare in order to find useful metrics for quantifying the success of various visualization approaches. In general, progress was consistent with our research plan for the second year of the grant.
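As a rough illustration of the kind of per-channel image comparison described, here is a minimal sketch; the metric and the toy images are assumptions for demonstration, not the idtools implementation:

```python
import numpy as np

def rgb_difference(img_a, img_b):
    """Mean absolute per-channel difference between two RGB images
    (H x W x 3 arrays with values in 0..255)."""
    diff = np.abs(img_a.astype(float) - img_b.astype(float))
    return {c: diff[..., i].mean() for i, c in enumerate("RGB")}

# Two toy 2x2 images: identical except one pixel's red channel.
a = np.zeros((2, 2, 3), dtype=np.uint8)
b = a.copy()
b[0, 0, 0] = 40  # perturb red at one pixel
print(rgb_difference(a, b))  # {'R': 10.0, 'G': 0.0, 'B': 0.0}
```

The same pattern extends to HSV/HSL comparisons by converting the arrays to those color models before differencing.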

  18. Engaged: Making Large Classes Feel Small through Blended Learning Instructional Strategies that Promote Increased Student Performance

    ERIC Educational Resources Information Center

    Francis, Raymond W.

    2012-01-01

    It is not enough to be great at sharing information in a large classroom setting. To be an effective teacher you must be able to meaningfully engage your students with their peers and with the content. And you must do this regardless of class size or content. The issues of teaching effectively in large classroom settings have presented ongoing…

  19. Adaptation of Bharatanatyam Dance Pedagogy for Multicultural Classrooms: Questions and Relevance in a North American University Setting

    ERIC Educational Resources Information Center

    Banerjee, Suparna

    2013-01-01

    This article opens up questions around introducing Bharatanatyam, a form of Indian classical dance, to undergraduate learners within a North American university setting. The aim is to observe how the learners understood and received a particular cultural practice and to explore issues related to learning goals, curriculum content, approaches to…

  20. Training Health Service Technicians as Teacher Assistants in an Inpatient Residential Emotional/Behavior Disorder Classroom Setting

    ERIC Educational Resources Information Center

    Banks, Walter E.

    2012-01-01

    Schools have identified that the use of Teacher Assistants often provides needed additional support in the school setting. In a Health Care Facility that provides inpatient psychiatric services, children ages 5-14 are required to engage in school activities. Currently there are no Teacher Assistants trained in the facility. This study focuses on…

  2. On the performance of large Gaussian basis sets for the computation of total atomization energies

    NASA Technical Reports Server (NTRS)

    Martin, J. M. L.

    1992-01-01

    The total atomization energies of a number of molecules have been computed using an augmented coupled-cluster method and (5s4p3d2f1g) and (4s3p2d1f) atomic natural orbital (ANO) basis sets, as well as the correlation-consistent valence triple zeta plus polarization (cc-pVTZ) and correlation-consistent valence quadruple zeta plus polarization (cc-pVQZ) basis sets. The performance of the ANO and correlation-consistent basis sets is comparable throughout, although the latter can result in significant CPU time savings. Whereas the inclusion of g functions has significant effects on the computed ΣD(e) values, chemical accuracy is still not reached for molecules involving multiple bonds. A Gaussian-1 (G1) type correction lowers the error, but not much beyond the accuracy of the G1 model itself. Using separate corrections for sigma bonds, pi bonds, and valence pairs brings the mean absolute error down to less than 1 kcal/mol for the spdf basis sets, and to about 0.5 kcal/mol for the spdfg basis sets. Some conclusions on the success of the Gaussian-1 and Gaussian-2 models are drawn.

  3. STRESSOR DATA SETS FOR STUDYING SPECIES DIVERSITY AT LARGE SPATIAL SCALES

    EPA Science Inventory

    There is increasing scientific and societal concern over the impact of anthropogenic activities (e.g., habitat destruction, pollution) on biodiversity. The impact of anthropogenic activities on biodiversity is generally recognized as a global phenomenon. At large spatial scales, se...

  4. Improved student engagement, satisfaction, and learning outcomes in a "flipped" large-lecture setting

    NASA Astrophysics Data System (ADS)

    Ward, A. S.; Bettis, E. A., III; Russell, J. E.; Van Horne, S.; Rocheford, M. K.; Sipola, M.; Colombo, M. R.

    2014-12-01

    Large lecture courses are a traditional teaching practice at most large institutions of public higher education. They have historically provided an efficient way to deliver content to large numbers of students with the least amount of faculty resources. However, research on student learning indicates that the traditional lecture format does not provide the best learning experience for students; students learn better in active learning environments in which they engage in meaningful learning activities rather than just listening. In this study, we compare two offerings of Introduction to Environmental Science, a large-lecture general education course, offered in two formats by the same instructors in subsequent years. In the first offering (Spring 2013) the course was offered as a traditional large-lecture course, with lectures to large audiences and a limited number of exams for assessment. In the second offering (Spring 2014), the course included small-group discussion periods, peer review of writing assignments, guest lectures, and online learning, with limited traditional lecture. Our primary objective was to quantify differences in student engagement and learning outcomes between the two course offerings. Results of our study show that students in the transformed course indicated higher interest, engagement level, and satisfaction than students in the traditional lecture course. Furthermore, students in the transformed course reported increased behavioral, emotional, and cognitive engagement over those in the traditional course, as well as increased satisfaction with the course.

  5. MDH: A High Speed Multi-phase Dynamic Hash String Matching Algorithm for Large-Scale Pattern Set

    NASA Astrophysics Data System (ADS)

    Zhou, Zongwei; Xue, Yibo; Liu, Junda; Zhang, Wei; Li, Jun

    String matching is one of the key technologies in numerous network security applications and systems. Nowadays, increasing network bandwidth and pattern set sizes both call for high-speed string matching algorithms for large-scale pattern sets. This paper proposes a novel algorithm called Multi-phase Dynamic Hash (MDH), which cuts down the memory requirement by multi-phase hashing and exploits valuable pattern set information to speed up the searching procedure with dynamic-cut heuristics. The experimental results demonstrate that MDH can improve matching performance by 100% to 300% compared with other popular algorithms, while the memory requirement stays at a comparatively low level.
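The abstract does not give MDH's internals, but a generic hash-bucketed multi-pattern scan in the same spirit can be sketched as follows: patterns are bucketed by a hash of their first B bytes (B being the shortest pattern length), and the text is scanned with one hash probe per position. This is an illustrative baseline, not the MDH algorithm itself:

```python
from collections import defaultdict

def build_index(patterns):
    """Bucket patterns by a hash of their first B characters (B = shortest length)."""
    B = min(len(p) for p in patterns)
    buckets = defaultdict(list)
    for p in patterns:
        buckets[hash(p[:B])].append(p)
    return B, buckets

def search(text, patterns):
    """Report (position, pattern) for every pattern occurrence in text."""
    B, buckets = build_index(patterns)
    hits = []
    for i in range(len(text) - B + 1):
        # One hash probe per position; verify candidates to rule out collisions.
        for p in buckets.get(hash(text[i:i + B]), ()):
            if text.startswith(p, i):
                hits.append((i, p))
    return hits

print(search("a malicious payload pattern", ["malicious", "pattern", "pay"]))
# [(2, 'malicious'), (12, 'pay'), (20, 'pattern')]
```

Schemes like MDH refine this baseline chiefly by reducing the memory footprint of the buckets and by skipping text positions heuristically.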

  6. A Controlled Trial of Active versus Passive Learning Strategies in a Large Group Setting

    ERIC Educational Resources Information Center

    Haidet, Paul; Morgan, Robert O.; O'Malley, Kimberly; Moran, Betty Jeanne; Richards, Boyd F.

    2004-01-01

    Objective: To compare the effects of active and didactic teaching strategies on learning- and process-oriented outcomes. Design: Controlled trial. Setting: After-hours residents' teaching session. Participants: Family and Community Medicine, Internal Medicine, and Pediatrics residents at two academic medical institutions. Interventions: We…

  7. An Efficient Algorithm for Discovering Motifs in Large DNA Data Sets.

    PubMed

    Yu, Qiang; Huo, Hongwei; Chen, Xiaoyang; Guo, Haitao; Vitter, Jeffrey Scott; Huan, Jun

    2015-07-01

    The planted (l,d) motif discovery has been successfully used to locate transcription factor binding sites in dozens of promoter sequences over the past decade. However, not enough work has been done on identifying (l,d) motifs in next-generation sequencing (ChIP-seq) data sets, which contain thousands of input sequences and thereby make good identification in reasonable time a new challenge. To meet this need, we propose a new planted (l,d) motif discovery algorithm named MCES, which identifies motifs by mining and combining emerging substrings. Specifically, to handle larger data sets, we design a MapReduce-based strategy to mine emerging substrings in a distributed fashion. Experimental results on simulated data show that i) MCES is able to identify (l,d) motifs efficiently and effectively in thousands to millions of input sequences, and runs faster than state-of-the-art (l,d) motif discovery algorithms such as F-motif and TraverStringsR; ii) MCES is able to identify motifs without known lengths, and has better identification accuracy than the competing algorithm CisFinder. The validity of MCES is also tested on real data sets. MCES is freely available at http://sites.google.com/site/feqond/mces. PMID:25872217
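The core test underlying planted (l,d) motif discovery is whether a candidate l-mer occurs, within Hamming distance d, in every input sequence. A naive sketch of that definition follows; this is not MCES's emerging-substring method, just the check it ultimately answers, with toy sequences:

```python
def hamming(a, b):
    """Number of mismatched positions between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def occurs_within(seq, candidate, d):
    """True if some l-mer of seq is within Hamming distance d of candidate."""
    l = len(candidate)
    return any(hamming(seq[i:i + l], candidate) <= d
               for i in range(len(seq) - l + 1))

def is_planted_motif(sequences, candidate, d):
    """A candidate is a planted (l,d) motif if it occurs in every sequence."""
    return all(occurs_within(s, candidate, d) for s in sequences)

seqs = ["ACGTACGT", "TTACGAAC", "GGACGTTT"]
print(is_planted_motif(seqs, "ACGT", 1))  # True
```

Enumerating all candidates this way is exponential in l, which is why practical algorithms such as MCES prune the candidate space, here by mining frequent ("emerging") substrings instead.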

  8. Options in Education, Transcript for February 16, 1976: National Commitment to Equal Rights & Equal Educational Opportunity, Racial Conflict in the Classroom, Setting Up a Publishing Business, and Women in Education (Mathematics and Sex).

    ERIC Educational Resources Information Center

    George Washington Univ., Washington, DC. Inst. for Educational Leadership.

    "Options in Education" is a radio news program which focuses on issues and developments in education. This transcript contains discussions of the national commitment to desegregated education, racial conflict in the classroom, learning how to set up a publishing business, women in education (mathematics and sex) and education news highlights.…

  9. An “Electronic Fluorescent Pictograph” Browser for Exploring and Analyzing Large-Scale Biological Data Sets

    PubMed Central

    Nahal, Hardeep; Ammar, Ron; Wilson, Greg V.; Provart, Nicholas J.

    2007-01-01

    Background The exploration of microarray data and data from other high-throughput projects for hypothesis generation has become a vital aspect of post-genomic research. For the non-bioinformatics specialist, however, many of the currently available tools provide overwhelming amounts of data that are presented in a non-intuitive way. Methodology/Principal Findings In order to facilitate the interpretation and analysis of microarray data and data from other large-scale data sets, we have developed a tool, which we have dubbed the electronic Fluorescent Pictograph – or eFP – Browser, available at http://www.bar.utoronto.ca/, for exploring microarray and other data for hypothesis generation. This eFP Browser engine paints data from large-scale data sets onto pictographic representations of the experimental samples used to generate the data sets. We give examples of using the tool to present Arabidopsis gene expression data from the AtGenExpress Consortium (Arabidopsis eFP Browser), data for subcellular localization of Arabidopsis proteins (Cell eFP Browser), and mouse tissue atlas microarray data (Mouse eFP Browser). Conclusions/Significance The eFP Browser software is easily adaptable to microarray or other large-scale data sets from any organism and thus should prove useful to a wide community for visualizing and interpreting these data sets for hypothesis generation. PMID:17684564

  10. The PRRS Host Genomic Consortium (PHGC) Database: Management of large data sets.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In any consortium project where large amounts of phenotypic and genotypic data are collected across several research labs, issues arise with maintenance and analysis of datasets. The PRRS Host Genomic Consortium (PHGC) Database was developed to meet this need for the PRRS research community. The sch...

  11. Design of Availability-Dependent Distributed Services in Large-Scale Uncooperative Settings

    ERIC Educational Resources Information Center

    Morales, Ramses Victor

    2009-01-01

    Thesis Statement: "Availability-dependent global predicates can be efficiently and scalably realized for a class of distributed services, in spite of specific selfish and colluding behaviors, using local and decentralized protocols". Several types of large-scale distributed systems spanning the Internet have to deal with availability variations…

  12. Learning through Discussions: Comparing the Benefits of Small-Group and Large-Class Settings

    ERIC Educational Resources Information Center

    Pollock, Philip H.; Hamann, Kerstin; Wilson, Bruce M.

    2011-01-01

    The literature on teaching and learning heralds the benefits of discussion for student learner outcomes, especially its ability to improve students' critical thinking skills. Yet, few studies compare the effects of different types of face-to-face discussions on learners. Using student surveys, we analyze the benefits of small-group and large-class…

  14. A posteriori correction of camera characteristics from large image data sets

    PubMed Central

    Afanasyev, Pavel; Ravelli, Raimond B. G.; Matadeen, Rishi; De Carlo, Sacha; van Duinen, Gijs; Alewijnse, Bart; Peters, Peter J.; Abrahams, Jan-Pieter; Portugal, Rodrigo V.; Schatz, Michael; van Heel, Marin

    2015-01-01

    Large datasets are emerging in many fields of image processing including: electron microscopy, light microscopy, medical X-ray imaging, astronomy, etc. Novel computer-controlled instrumentation facilitates the collection of very large datasets containing thousands of individual digital images. In single-particle cryogenic electron microscopy (“cryo-EM”), for example, large datasets are required for achieving quasi-atomic resolution structures of biological complexes. Based on the collected data alone, large datasets allow us to precisely determine the statistical properties of the imaging sensor on a pixel-by-pixel basis, independent of any “a priori” normalization routinely applied to the raw image data during collection (“flat field correction”). Our straightforward “a posteriori” correction yields clean linear images as can be verified by Fourier Ring Correlation (FRC), illustrating the statistical independence of the corrected images over all spatial frequencies. The image sensor characteristics can also be measured continuously and used for correcting upcoming images. PMID:26068909
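The idea of estimating per-pixel sensor statistics purely from a large image stack can be illustrated with simulated data. The gains, offsets, and zero-mean/unit-variance normalization below are a toy model of "a posteriori" correction, not the paper's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a sensor: each pixel has its own gain and offset (unknown a priori).
gain = rng.uniform(0.8, 1.2, size=(8, 8))
offset = rng.uniform(-5, 5, size=(8, 8))

# A stack of 500 frames of a varying but spatially flat illumination.
signal = rng.uniform(50, 200, size=(500, 1, 1))
stack = gain * signal + offset + rng.normal(0, 1, size=(500, 8, 8))

# Estimate per-pixel statistics from the collected data alone...
pixel_mean = stack.mean(axis=0)
pixel_std = stack.std(axis=0)

# ...and normalize each pixel, removing its individual gain and offset.
corrected = (stack - pixel_mean) / pixel_std
print(corrected.shape)  # (500, 8, 8)
```

After this normalization every pixel has zero mean and unit variance across the stack, regardless of its original gain and offset.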

  16. Using a Classroom Response System for Promoting Interaction to Teaching Mathematics to Large Groups of Undergraduate Students

    ERIC Educational Resources Information Center

    Morais, Adolfo; Barragués, José Ignacio; Guisasola, Jenaro

    2015-01-01

    This work describes the design and evaluation of a proposal to use Classroom Response Systems (CRS), intended to promote participative classes of Mathematics at University. The proposal is based on Problem Based Learnig (PBL) and uses Robert's six hypotheses for mathematical teaching-learning. The results show that PBL is a relevant strategy to…

  17. Using Classroom-Based Assessment on a Large Scale: Supporting and Reporting on Student Learning with the Early Literacy Profile.

    ERIC Educational Resources Information Center

    Falk, Beverly; Ort, Suzanna Wichterle; Moirs, Katie

    This paper describes the development work and research findings of an initiative to create a statewide literacy assessment in New York to inform teaching and learning and report on group performance trends. The Early Literacy Profile (ELP) is a classroom-based, standards-referenced performance assessment for students in the primary grades…

  19. Caught you: threats to confidentiality due to the public release of large-scale genetic data sets

    PubMed Central

    2010-01-01

    Background Large-scale genetic data sets are frequently shared with other research groups and even released on the Internet to allow for secondary analysis. Study participants are usually not informed about such data sharing because data sets are assumed to be anonymous after personal identifiers are stripped off. Discussion The assumption of anonymity of genetic data sets, however, is tenuous because genetic data are intrinsically self-identifying. Two types of re-identification are possible: the "Netflix" type and the "profiling" type. The "Netflix" type needs another small genetic data set, usually with fewer than 100 SNPs but including a personal identifier. This second data set might originate from another clinical examination, a study of leftover samples, or forensic testing. When merged with the primary, unidentified set, it will re-identify all samples of that individual. Even with no second data set at hand, a "profiling" strategy can be developed to extract as much information as possible from a sample collection. Starting with the identification of ethnic subgroups along with predictions of body characteristics and diseases, the asthma kids case is used as a real-life example to illustrate that approach. Summary Depending on the degree of supplemental information, there is a good chance that at least a few individuals can be identified from an anonymized data set. Any re-identification, however, may potentially harm study participants because it will release individual genetic disease risks to the public. PMID:21190545

  20. Ssecrett and NeuroTrace: Interactive Visualization and Analysis Tools for Large-Scale Neuroscience Data Sets

    PubMed Central

    Jeong, Won-Ki; Beyer, Johanna; Hadwiger, Markus; Blue, Rusty; Law, Charles; Vázquez-Reina, Amelio; Reid, R. Clay; Lichtman, Jeff; Pfister, Hanspeter

    2010-01-01

    Data sets imaged with modern electron microscopes can range from tens of terabytes to about one petabyte. Two new tools, Ssecrett and NeuroTrace, support interactive exploration and analysis of large-scale optical and electron-microscopy images to help scientists reconstruct complex neural circuits of the mammalian nervous system. PMID:20650718

  1. Use of Large-Scale Data Sets to Study Educational Pathways of American Indian and Alaska Native Students

    ERIC Educational Resources Information Center

    Faircloth, Susan C.; Alcantar, Cynthia M.; Stage, Frances K.

    2014-01-01

    This chapter discusses issues and challenges encountered in using large-scale data sets to study educational experiences and subsequent outcomes for American Indian and Alaska Native (AI/AN) students. In this chapter, we argue that the linguistic and cultural diversity of Native peoples, coupled with the legal and political ways in which education…

  3. Creation of libraries of recurring mass spectra from large data sets assisted by a dual-column workflow.

    PubMed

    Mallard, W Gary; Andriamaharavo, N Rabe; Mirokhin, Yuri A; Halket, John M; Stein, Stephen E

    2014-10-21

    An analytical methodology has been developed for extracting recurrent unidentified spectra (RUS) from large GC/MS data sets. Spectra were first extracted from original data files by the Automated Mass Spectral Deconvolution and Identification System (AMDIS; Stein, S. E. J. Am. Soc. Mass Spectrom. 1999 , 10 , 770 - 781 ) using settings designed to minimize spurious spectra, followed by searching the NIST library with all unidentified spectra. The spectra that could not be identified were then filtered to remove poorly deconvoluted data and clustered. The results were assumed to be unidentified components. This was tested by requiring each unidentified spectrum to be found in two chromatographic columns with slightly different stationary phases. This methodology has been applied to a large set of pediatric urine samples. A library of spectra and retention indices for derivatized urine components, both identified and recurrent unidentified, has been created and is available for download. PMID:25233296

  4. New storage and presentation methods for rapid access to large sets of triggered records

    NASA Astrophysics Data System (ADS)

    Stoll, Dieter

    1993-02-01

    The recent advent of high-capacity storage media for seismological measurement instruments has rendered traditional methods of storing and accessing triggered events obsolete. Traditionally, the user was faced with the burden of remembering where and how data sets were stored. With hundreds of megabytes to administer, this approach is no longer feasible. A new approach using a relational database management system is presented here. The system is implemented under Unix™ using the commercially available INFORMIX® C/ISAM® DBMS. The new system is not only much faster and more convenient, but also offers new possibilities previously unavailable, such as direct access to data on the original recording media.

  5. An Examination of Classroom Social Environment on Motivation and Engagement of College Early Entrant Honors Students

    ERIC Educational Resources Information Center

    Maddox, Richard S.

    2010-01-01

    This study set out to examine the relationships between the classroom social environment, motivation, engagement and achievement of a group of early entrant Honors students at a large urban university. Prior research on the classroom environment, motivation, engagement and high ability students was examined, leading to the assumption that the…

  6. High-throughput film-densitometry: an efficient approach to generate large data sets.

    PubMed

    Typke, Dieter; Nordmeyer, Robert A; Jones, Arthur; Lee, Juyoung; Avila-Sakar, Agustin; Downing, Kenneth H; Glaeser, Robert M

    2005-01-01

    A film-handling machine (robot) has been built which can, in conjunction with a commercially available film densitometer, exchange and digitize over 300 electron micrographs per day. Implementation of robotic film handling effectively eliminates the delay and tedium associated with digitizing images when data are initially recorded on photographic film. The modulation transfer function (MTF) of the commercially available densitometer is significantly worse than that of a high-end, scientific microdensitometer. Nevertheless, its signal-to-noise ratio (S/N) is quite excellent, allowing substantial restoration of the output to "near-to-perfect" performance. Due to the large area of the standard electron microscope film that can be digitized by the commercial densitometer (up to 10,000 x 13,680 pixels with an appropriately coded holder), automated film digitization offers a fast and inexpensive alternative to high-end CCD cameras as a means of acquiring large amounts of image data in electron microscopy. PMID:15629654

  7. High-throughput film-densitometry: An efficient approach to generate large data sets

    SciTech Connect

    Typke, Dieter; Nordmeyer, Robert A.; Jones, Arthur; Lee, Juyoung; Avila-Sakar, Agustin; Downing, Kenneth H.; Glaeser, Robert M.

    2004-07-14

    A film-handling machine (robot) has been built which can, in conjunction with a commercially available film densitometer, exchange and digitize over 300 electron micrographs per day. Implementation of robotic film handling effectively eliminates the delay and tedium associated with digitizing images when data are initially recorded on photographic film. The modulation transfer function (MTF) of the commercially available densitometer is significantly worse than that of a high-end, scientific microdensitometer. Nevertheless, its signal-to-noise ratio (S/N) is quite excellent, allowing substantial restoration of the output to ''near-to-perfect'' performance. Due to the large area of the standard electron microscope film that can be digitized by the commercial densitometer (up to 10,000 x 13,680 pixels with an appropriately coded holder), automated film digitization offers a fast and inexpensive alternative to high-end CCD cameras as a means of acquiring large amounts of image data in electron microscopy.

  8. Parallel k-Means Clustering for Quantitative Ecoregion Delineation Using Large Data Sets

    SciTech Connect

    Kumar, Jitendra; Mills, Richard T; Hoffman, Forrest M; Hargrove Jr., William Walter

    2011-01-01

    Identification of geographic ecoregions has long been of interest to environmental scientists and ecologists for identifying regions of similar ecological and environmental conditions. Such classifications are important for predicting suitable species ranges, for stratification of ecological samples, and to help prioritize habitat preservation and remediation efforts. Hargrove and Hoffman (1999, 2009) have developed geographical spatio-temporal clustering algorithms and codes and have successfully applied them to a variety of environmental science domains, including ecological regionalization; environmental monitoring network design; analysis of satellite-, airborne-, and ground-based remote sensing, and climate model-model and model-measurement intercomparison. With the advances in state-of-the-art satellite remote sensing and climate models, observations and model outputs are available at increasingly high spatial and temporal resolutions. Long time series of these high resolution datasets are extremely large in size and growing. Analysis and knowledge extraction from these large datasets are not just algorithmic and ecological problems, but also pose a complex computational problem. This paper focuses on the development of a massively parallel multivariate geographical spatio-temporal clustering code for analysis of very large datasets using tens of thousands processors on one of the fastest supercomputers in the world.
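The serial core of such a clustering code is ordinary Lloyd's k-means over multivariate environmental observations. A minimal sketch on toy "map cells" follows; the farthest-point initialization and the two-regime (temperature, precipitation) data are illustrative choices, not the authors' parallel implementation:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain Lloyd's k-means with farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        # Next center: the point farthest from all centers chosen so far.
        d = np.min([((X - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        # Assign each observation to its nearest center...
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # ...then move each center to the mean of its members.
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Toy map cells: two well-separated climate regimes (temperature, precipitation).
rng = np.random.default_rng(42)
X = np.vstack([rng.normal([10, 300], 5, (100, 2)),
               rng.normal([25, 900], 5, (100, 2))])
labels, centers = kmeans(X, k=2)
print(sorted(np.bincount(labels).tolist()))  # [100, 100]
```

The parallel codes described in the abstract distribute exactly these assignment and update steps across processors, since both are embarrassingly parallel over observations.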

  9. Latest developments in the display of large-scale ionospheric and thermospheric data sets

    NASA Technical Reports Server (NTRS)

    Sojka, J. J.

    1992-01-01

    Over the past decade, data base sizes have continually increased and will continue to do so in the future. This problem of size is further compounded because the trend in present-day studies is to use data from many different locations and different instruments and then compare it with data from global scale physical models. The latter produce data bases of comparable if not even larger size. Much of the data can be viewed as 'image' time sequences and is most readily viewed on color display terminals. These data sets reside in national or owner-generated data bases linked together by computer networks. As the size increases, just moving this data around, taking 'quick-looks' at the data, or even storing it locally become severe problems compromising the scientific return from the data. Is the present-day technology with these analysis techniques being used in the best way? What are the prospects for reducing the storage and transmission size of the data sets? Examples of such problems and potential solutions are described in this paper.

  10. Large-Eddy Simulation of Premixed and Partially Premixed Turbulent Combustion Using a Level Set Method

    NASA Astrophysics Data System (ADS)

    Duchamp de Lageneste, Laurent; Pitsch, Heinz

    2001-11-01

    Level-set methods (G-equation) have been recently used in the context of RANS to model turbulent premixed (Hermann 2000) or partially premixed (Chen 1999) combustion. By directly taking into account unsteady effects, LES can be expected to improve predictions over RANS. Since the reaction zone thickness of premixed flames in technical devices is usually much smaller than the LES grid spacing, chemical reactions completely occur on the sub-grid scales and hence have to be modeled entirely. In the level-set methodology, the flame front is represented by an arbitrary iso-surface G0 of a scalar field G whose evolution is described by the so-called G-equation. This equation is only valid at G=G_0, and hence decoupled from other G levels. Heat release is then modeled using a flamelet approach in which temperature is determined as a function of G and the mixture-fraction Z. In the present study, the proposed approach has been formulated for LES and validated using data from a turbulent Bunsen burner experiment (Chen, Peters 1996). Simulation of an experimental Lean Premixed Prevapourised (LPP) dump combustor (Besson, Bruel 1999, 2000) under different premixed or partially premixed conditions will also be presented.
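
A minimal 1-D sketch of the G-equation may help fix ideas: the front is the zero crossing of G, convected by the flow at speed u and propagating at the laminar burning velocity sL, so it travels at u - sL. The grid, velocities, and time step below are assumed values, not those of the cited simulations:

```python
import numpy as np

# 1-D sketch of the G-equation: dG/dt + u * dG/dx = sL * |dG/dx|
# The flame front is the G = 0 iso-surface of the scalar field G.
nx = 400
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
u, sL = 0.5, 0.2            # assumed flow velocity and laminar burning velocity
G = x - 0.2                 # signed-distance initialisation: front at x = 0.2
dt = 0.4 * dx / (abs(u) + sL)   # CFL-limited time step
steps = 500

for _ in range(steps):
    Gx = np.gradient(G, dx)
    G = G - dt * (u * Gx - sL * np.abs(Gx))

front = x[np.argmin(np.abs(G))]           # current front location
expected = 0.2 + (u - sL) * steps * dt    # analytic front position
```

Because the initial field is a signed distance (|dG/dx| = 1), the exact solution just translates the front downstream, which gives a clean check on the discretisation.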

  11. Validating hierarchical verbal autopsy expert algorithms in a large data set with known causes of death

    PubMed Central

    Kalter, Henry D; Perin, Jamie; Black, Robert E

    2016-01-01

    Background Physician assessment historically has been the most common method of analyzing verbal autopsy (VA) data. Recently, the World Health Organization endorsed two automated methods, Tariff 2.0 and InterVA-4, which promise greater objectivity and lower cost. A disadvantage of the Tariff method is that it requires a training data set from a prior validation study, while InterVA relies on clinically specified conditional probabilities. We undertook to validate the hierarchical expert algorithm analysis of VA data, an automated, intuitive, deterministic method that does not require a training data set. Methods Using Population Health Metrics Research Consortium study hospital source data, we compared the primary causes of 1629 neonatal and 1456 1-59-month-old child deaths from VA expert algorithms arranged in a hierarchy to their reference standard causes. The expert algorithms were held constant, while five prior and one new “compromise” neonatal hierarchy, and three former child hierarchies, were tested. For each comparison, the reference standard data were resampled 1000 times within the range of cause-specific mortality fractions (CSMF) for one of three approximated community scenarios in the 2013 WHO global causes of death, plus one random mortality cause proportions scenario. We utilized CSMF accuracy to assess overall population-level validity, and the absolute difference between VA and reference standard CSMFs to examine particular causes. Chance-corrected concordance (CCC) and Cohen’s kappa were used to evaluate individual-level cause assignment. Results Overall CSMF accuracy for the best-performing expert algorithm hierarchy was 0.80 (range 0.57-0.96) for neonatal deaths and 0.76 (0.50-0.97) for child deaths. Performance for particular causes of death varied, with fairly flat estimated CSMF over a range of reference values for several causes. Performance at the individual diagnosis level was also less favorable than that for overall CSMF (neonatal: best CCC = 0.23, range 0.16-0.33; best kappa = 0.29, 0.23-0.35; child: best CCC = 0.40, 0.19-0.45; best kappa = 0.29, 0.07-0.35). Conclusions Expert algorithms in a hierarchy offer an accessible, automated method for assigning VA causes of death. Overall population-level accuracy is similar to that of more complex machine learning methods, but without the need for a training data set from a prior validation study.
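
CSMF accuracy, the population-level metric used above, is defined (Murray et al.) as 1 minus the total absolute CSMF error normalized by its worst possible value. A short sketch with hypothetical cause fractions:

```python
def csmf_accuracy(true, pred):
    """CSMF accuracy: 1 - sum|pred - true| / (2 * (1 - min(true)))."""
    err = sum(abs(pred[c] - true[c]) for c in true)
    return 1.0 - err / (2.0 * (1.0 - min(true.values())))

# Hypothetical cause-specific mortality fractions (must each sum to 1)
true = {"pneumonia": 0.30, "diarrhoea": 0.25, "malaria": 0.25, "other": 0.20}
pred = {"pneumonia": 0.35, "diarrhoea": 0.20, "malaria": 0.25, "other": 0.20}
acc = csmf_accuracy(true, pred)
```

The denominator 2 * (1 - min(true)) is the largest total error any prediction could make, so the score is 1 for a perfect prediction and 0 for the worst possible one.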

  12. Propagation of large uncertainty sets in orbital dynamics by automatic domain splitting

    NASA Astrophysics Data System (ADS)

    Wittig, Alexander; Di Lizia, Pierluigi; Armellin, Roberto; Makino, Kyoko; Bernelli-Zazzera, Franco; Berz, Martin

    2015-07-01

    Current approaches to uncertainty propagation in astrodynamics mainly refer to linearized models or Monte Carlo simulations. Naive linear methods fail in nonlinear dynamics, whereas Monte Carlo simulations tend to be computationally intensive. Differential algebra has already proven to be an efficient compromise by replacing thousands of pointwise integrations of Monte Carlo runs with the fast evaluation of the arbitrary order Taylor expansion of the flow of the dynamics. However, the current implementation of the DA-based high-order uncertainty propagator fails when the non-linearities of the dynamics prohibit good convergence of the Taylor expansion in one or more directions. We solve this issue by introducing automatic domain splitting. During propagation, the polynomial expansion of the current state is split into two polynomials whenever its truncation error reaches a predefined threshold. The resulting set of polynomials accurately tracks uncertainties, even in highly nonlinear dynamics. The method is tested on the propagation of (99942) Apophis post-encounter motion.
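
The splitting criterion can be illustrated in one dimension: fit a local polynomial surrogate of the flow over a box, and bisect the box whenever the error estimate exceeds a threshold. This toy uses least-squares polynomial fits in place of the paper's differential-algebra Taylor expansions:

```python
import numpy as np

def split_domains(f, a, b, degree=4, tol=1e-4, samples=64):
    """Fit a local polynomial surrogate on [a, b]; if its worst sampled error
    exceeds tol, bisect and recurse on both halves (1-D analogue of
    automatic domain splitting)."""
    x = np.linspace(a, b, samples)
    c = 0.5 * (a + b)
    p = np.polyfit(x - c, f(x), degree)     # expand about the box midpoint
    err = np.max(np.abs(np.polyval(p, x - c) - f(x)))
    if err <= tol:
        return [(a, b, p)]
    return (split_domains(f, a, c, degree, tol, samples)
            + split_domains(f, c, b, degree, tol, samples))

# A strongly nonlinear map: one polynomial box is not enough, so the set splits
boxes = split_domains(np.exp, -1.0, 3.0)
```

The resulting boxes tile the original domain, with smaller boxes where the nonlinearity (here, the growth of exp) is strongest.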

  13. The Same or Separate? An Exploration of Teachers' Perceptions of the Classroom Assignment of Twins in Prior to School and Kindergarten to Year Two School Settings

    ERIC Educational Resources Information Center

    Jones, Laura; De Gioia, Katey

    2010-01-01

    This article investigates the perceptions of 12 teachers from New South Wales, Australia, regarding the classroom assignment of twins. Analysis of semi-structured interviews with each of the teachers revealed four key findings: 1) teachers' perceptions about the classroom assignment of twins vary according to their previous experience and…

  14. A new tool called DISSECT for analysing large genomic data sets using a Big Data approach

    PubMed Central

    Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert

    2015-01-01

    Large-scale genetic and genomic data are increasingly available and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex traits analysis, we present DISSECT. DISSECT is a new and freely available software that is able to exploit the distributed-memory parallel computational architectures of compute clusters, to perform a wide range of genomic and epidemiologic analyses, which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010
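
The mixed-linear-model prediction task DISSECT tackles can be miniaturized as a ridge (GBLUP-flavoured) solve on simulated genotypes; the sample sizes, shrinkage parameter, and train/test split below are arbitrary stand-ins for the paper's 470,000 x 590,004 analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 400, 200                  # individuals x SNPs (tiny stand-in)
X = rng.binomial(2, 0.5, size=(n, p)).astype(float)
X -= X.mean(axis=0)              # centre the 0/1/2 genotype codes
beta = rng.normal(0.0, 1.0, p)   # simulated SNP effects
g = X @ beta                     # true genetic values
y = g + rng.normal(0.0, 0.5 * g.std(), n)   # phenotype, heritability ~0.8

# Ridge solve on a training split, then predict the held-out individuals
Xtr, Xte, ytr, yte = X[:300], X[300:], y[:300], y[300:]
lam = 10.0
beta_hat = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(p), Xtr.T @ ytr)
r = np.corrcoef(Xte @ beta_hat, yte)[0, 1]
```

With high simulated heritability and more training individuals than SNPs, the held-out prediction accuracy approaches the theoretical maximum, mirroring the paper's large-sample finding.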

  16. The coming of age of phosphoproteomics--from large data sets to inference of protein functions.

    PubMed

    Roux, Philippe P; Thibault, Pierre

    2013-12-01

    Protein phosphorylation is one of the most common post-translational modifications used in signal transduction to control cell growth, proliferation, and survival in response to both intracellular and extracellular stimuli. This modification is finely coordinated by a network of kinases and phosphatases that recognize unique sequence motifs and/or mediate their functions through scaffold and adaptor proteins. Detailed information on the nature of kinase substrates and site-specific phosphoregulation is required in order for one to better understand their pathophysiological roles. Recent advances in affinity chromatography and mass spectrometry (MS) sensitivity have enabled the large-scale identification and profiling of protein phosphorylation, but appropriate follow-up experiments are required in order to ascertain the functional significance of identified phosphorylation sites. In this review, we present meaningful technical details for MS-based phosphoproteomic analyses and describe important considerations for the selection of model systems and the functional characterization of identified phosphorylation sites. PMID:24037665

  17. A hybrid structure for the storage and manipulation of very large spatial data sets

    USGS Publications Warehouse

    Peuquet, Donna J.

    1982-01-01

    The map data input and output problem for geographic information systems is rapidly diminishing with the increasing availability of mass digitizing, direct spatial data capture, and graphics hardware based on raster technology. Although a large number of efficient raster-based algorithms exist for performing a wide variety of common tasks on these data, there are a number of procedures which are more efficiently performed in vector mode or for which raster-mode equivalents of current vector-based techniques have not yet been developed. This paper presents a hybrid spatial data structure, named the 'vaster' structure, which can utilize the advantages of both raster and vector structures while potentially eliminating, or greatly reducing, the need for raster-to-vector and vector-to-raster conversion. Other advantages of the vaster structure are also discussed.

  18. Setting up a Rayleigh Scattering Based Flow Measuring System in a Large Nozzle Testing Facility

    NASA Technical Reports Server (NTRS)

    Panda, Jayanta; Gomez, Carlos R.

    2002-01-01

    A molecular Rayleigh scattering based air density measurement system has been built in a large nozzle testing facility at NASA Glenn Research Center. The technique depends on the light scattering by gas molecules present in air; no artificial seeding is required. Light from a single mode, continuous wave laser was transmitted to the nozzle facility by optical fiber, and light scattered by gas molecules, at various points along the laser beam, is collected and measured by photon-counting electronics. By placing the laser beam and collection optics on synchronized traversing units, the point measurement technique is made effective for surveying density variation over a cross-section of the nozzle plume. Various difficulties associated with dust particles, stray light, high noise level and vibration are discussed. Finally, a limited amount of data from an underexpanded jet are presented and compared with expected variations to validate the technique.

  19. Estimation of melting points of large set of persistent organic pollutants utilizing QSPR approach.

    PubMed

    Watkins, Marquita; Sizochenko, Natalia; Rasulev, Bakhtiyor; Leszczynski, Jerzy

    2016-03-01

    The presence of polyhalogenated persistent organic pollutants (POPs), such as Cl/Br-substituted benzenes, biphenyls, diphenyl ethers, and naphthalenes, has been identified in all environmental compartments. Exposure to these compounds can pose potential risk not only for ecological systems, but also for human health. Therefore, efficient tools for comprehensive environmental risk assessment of POPs are required. Among the factors vital for environmental transport and fate processes is the melting point of a compound. In this study, we estimated the melting points of a large group (1419 compounds) of chloro- and bromo-derivatives of dibenzo-p-dioxins, dibenzofurans, biphenyls, naphthalenes, diphenyl ethers, and benzenes by utilizing quantitative structure-property relationship (QSPR) techniques. The compounds were classified by applying structure-based clustering methods followed by GA-PLS modeling. In addition, the random forest method was applied to develop more general models. Factors responsible for melting-point behavior and the predictive ability of each method are discussed. PMID:26874948

  20. Processing large sensor data sets for safeguards : the knowledge generation system.

    SciTech Connect

    Thomas, Maikel A.; Smartt, Heidi Anne; Matthews, Robert F.

    2012-04-01

    Modern nuclear facilities, such as reprocessing plants, present inspectors with significant challenges due in part to the sheer amount of equipment that must be safeguarded. The Sandia-developed and patented Knowledge Generation system was designed to automatically analyze large amounts of safeguards data to identify anomalous events of interest by comparing sensor readings with those expected from a process of interest and operator declarations. This paper describes a demonstration of the Knowledge Generation system using simulated accountability tank sensor data to represent part of a reprocessing plant. The demonstration indicated that Knowledge Generation has the potential to address several problems critical to the future of safeguards. It could be extended to facilitate remote inspections and trigger random inspections. Knowledge Generation could analyze data to establish trust hierarchies, to facilitate safeguards use of operator-owned sensors.
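
The core comparison, sensor readings versus the profile expected from operator declarations, can be sketched as robust residual thresholding; the simulated tank profile, noise level, and MAD-based rule are illustrative assumptions, not the patented Knowledge Generation logic:

```python
import numpy as np

def flag_anomalies(measured, expected, k=3.0):
    """Flag samples whose residual against the expected process profile
    exceeds k robust standard deviations (MAD-based)."""
    resid = measured - expected
    med = np.median(resid)
    sigma = 1.4826 * np.median(np.abs(resid - med))   # MAD -> Gaussian sigma
    return np.abs(resid - med) > k * sigma

# Simulated accountability-tank level: declared fill profile plus sensor noise,
# with a short undeclared withdrawal starting at t = 300
t = np.arange(600)
expected = 0.05 * t
rng = np.random.default_rng(0)
measured = expected + rng.normal(0.0, 0.2, t.size)
measured[300:310] -= 3.0          # the anomalous event of interest
flags = flag_anomalies(measured, expected)
```

Using the median and MAD rather than the mean and standard deviation keeps the threshold itself from being distorted by the anomaly it is meant to detect.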

  1. Mining pinyin-to-character conversion rules from large-scale corpus: a rough set approach.

    PubMed

    Wang, Xiaolong; Chen, Qingcai; Yeung, Daniel S

    2004-04-01

    This paper introduces a rough set technique for solving the problem of mining Pinyin-to-character (PTC) conversion rules. It first presents a text-structuring method that constructs a language information table from a corpus for each pinyin, which it then applies to a free-form textual corpus. Data generalization and rule extraction algorithms can then be used to eliminate redundant information and extract consistent PTC conversion rules. The design of the model also addresses a number of important issues such as the long-distance dependency problem, the storage requirements of the rule base, and the consistency of the extracted rules, while the performance of the extracted rules and the effects of different model parameters are evaluated experimentally. The results show that, with the smoothing method, high conversion precision (0.947) and recall (0.84) can be achieved even for rules represented directly by pinyin rather than words. A comparison with the baseline tri-gram model also shows that our method and the tri-gram language model complement each other well. PMID:15376833
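
The consistency requirement on extracted rules can be sketched directly: a (pinyin, context) condition is kept only if the corpus never maps it to more than one character. The toy corpus and placeholder characters below are hypothetical:

```python
from collections import defaultdict

def consistent_rules(samples):
    """Keep only those (pinyin, context) -> character rules whose condition
    maps to exactly one character everywhere in the corpus."""
    table = defaultdict(set)
    for pinyin, context, char in samples:
        table[(pinyin, context)].add(char)
    return {cond: next(iter(chars))
            for cond, chars in table.items() if len(chars) == 1}

# Toy corpus: (pinyin, preceding context, intended character placeholder)
corpus = [
    ("shi", "lao", "A"), ("shi", "lao", "A"),   # consistent: kept as a rule
    ("shi", "gu", "B"), ("shi", "gu", "C"),     # ambiguous: dropped
]
rules = consistent_rules(corpus)
```

Dropping ambiguous conditions is what keeps the extracted rule base consistent; the paper's generalization step then shrinks it further.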

  2. Data Mining on Large Data Set for Predicting Salmon Spawning Habitat

    SciTech Connect

    Xie, YuLong; Murray, Christopher J.; Hanrahan, Timothy P.; Geist, David R.

    2008-07-01

    Hydraulic properties related to river flow affect salmon spawning habitat. Accurate prediction of salmon spawning habitat and understanding of the properties that influence spawning behavior are of great interest for hydroelectric dam management. Previous research predicted salmon spawning habitat by deriving river-specific spawning suitability indices and employing a function estimation method such as logistic regression on several static river-flow-related properties, with some success. The objective of this study was two-fold. First, dynamic river flow properties associated with upstream dam operation were successfully derived from a huge set of time series of both water velocity and water depth for about one-fifth of a million habitat cells through principal component analysis (PCA) using nonlinear iterative partial least squares (NIPALS). The inclusion of dynamic variables in the models greatly improved model prediction. Second, nine machine learning methods were applied to the data, and it was found that decision tree and rule induction methods generally outperformed the commonly used logistic regression. Specifically, random forest, an advanced decision tree algorithm, provided unanimously better results. The over-prediction problem seen in previous studies was greatly alleviated.
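
NIPALS extracts principal components one at a time by alternating regressions and deflation, which is what makes it practical for very large data matrices. A compact sketch on a small random matrix (not the study's flow data):

```python
import numpy as np

def nipals_pca(X, n_comp, iters=500, tol=1e-12):
    """NIPALS PCA: find one score/loading pair at a time by alternating
    regressions, deflate X, and repeat."""
    X = X - X.mean(axis=0)
    T, P = [], []
    for _ in range(n_comp):
        t = X[:, int(np.argmax(X.var(axis=0)))].copy()  # start: most variable column
        for _ in range(iters):
            p_vec = X.T @ t / (t @ t)
            p_vec /= np.linalg.norm(p_vec)
            t_new = X @ p_vec
            if np.linalg.norm(t_new - t) < tol:
                t = t_new
                break
            t = t_new
        T.append(t)
        P.append(p_vec)
        X = X - np.outer(t, p_vec)    # deflation removes the found component
    return np.column_stack(T), np.column_stack(P)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8)) @ rng.normal(size=(8, 8))   # correlated variables
T, P = nipals_pca(X, 2)
```

Unlike a full SVD, NIPALS only ever touches the data through matrix-vector products, so it scales to matrices with hundreds of thousands of rows.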

  3. Cytotoxicity evaluation of large cyanobacterial strain set using selected human and murine in vitro cell models.

    PubMed

    Hrouzek, Pavel; Kapuścik, Aleksandra; Vacek, Jan; Voráčová, Kateřina; Paichlová, Jindřiška; Kosina, Pavel; Voloshko, Ludmila; Ventura, Stefano; Kopecký, Jiří

    2016-02-01

    The production of cytotoxic molecules interfering with mammalian cells is extensively reported in cyanobacteria. These compounds may have a use in pharmacological applications; however, their potential toxicity needs to be considered. We performed cytotoxicity tests of crude cyanobacterial extracts in six cell models in order to address the frequency of cyanobacterial cytotoxicity to human cells and the level of specificity to a particular cell line. A set of more than 100 cyanobacterial crude extracts isolated from soil habitats (mainly genera Nostoc and Tolypothrix) was tested by MTT test for in vitro toxicity on the hepatic and non-hepatic human cell lines HepG2 and HeLa, and three cell systems of rodent origin: Yac-1, Sp-2 and Balb/c 3T3 fibroblasts. Furthermore, a subset of the extracts was assessed for cytotoxicity against primary cultures of human hepatocytes as a model for evaluating potential hepatotoxicity. Roughly one third of cyanobacterial extracts caused cytotoxic effects (i.e. viability<75%) on human cell lines. Despite the sensitivity differences, high correlation coefficients among the inhibition values were obtained for particular cell systems. This suggests a prevailing general cytotoxic effect of extracts and their constituents. The non-transformed immortalized fibroblasts (Balb/c 3T3) and hepatic cancer line HepG2 exhibited good correlations with primary cultures of human hepatocytes. The presence of cytotoxic fractions in strongly cytotoxic extracts was confirmed by an activity-guided HPLC fractionation, and it was demonstrated that cyanobacterial cytotoxicity is caused by a mixture of components with similar hydrophobic/hydrophilic properties. The data presented here could be used in further research into in vitro testing based on human models for the toxicological monitoring of complex cyanobacterial samples. PMID:26519817

  4. Self Organizing Maps for the Clustering of Large Sets of Labeled Graphs

    NASA Astrophysics Data System (ADS)

    Zhang, Shujia; Hagenbuchner, Markus; Tsoi, Ah Chung; Sperduti, Alessandro

    Data mining on Web documents is one of the most challenging tasks in machine learning due to the large number of documents on the Web, the underlying structures (as one document may refer to another document), and the fact that the data is commonly not labeled (the class to which a document belongs is not known a priori). This paper considers the latest developments in Self-Organizing Maps (SOM), a machine learning approach, as one way of classifying documents on the Web. The most recent development, called the Probability Mapping Graph Self-Organizing Map (PMGraphSOM), is an extension of an earlier GraphSOM approach; it encodes undirected and cyclic graphs in a scalable fashion. This paper illustrates empirically the advantages of the PMGraphSOM over the original GraphSOM model in a data mining application involving graph-structured information. It is shown that the performance achieved can exceed the current state-of-the-art techniques on a given benchmark problem.
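
The classic vector-input SOM training loop underlying these models can be sketched as follows; the grid size, decay schedules, and synthetic data are assumptions, and GraphSOM/PMGraphSOM add graph-state encoding on top of this scheme:

```python
import numpy as np

def train_som(X, grid=(6, 6), epochs=30, lr0=0.5, sigma0=3.0, seed=0):
    """Classic SOM: pull the best-matching unit and its grid neighbours
    toward each input, with decaying learning rate and neighbourhood."""
    rng = np.random.default_rng(seed)
    gy, gx = grid
    W = rng.normal(size=(gy * gx, X.shape[1]))
    coords = np.array([(i, j) for i in range(gy) for j in range(gx)], dtype=float)
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)
        sigma = max(sigma0 * (1.0 - epoch / epochs), 0.5)
        for xvec in X[rng.permutation(len(X))]:
            bmu = int(np.argmin(((W - xvec) ** 2).sum(axis=1)))  # best matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2.0 * sigma ** 2))                 # neighbourhood weights
            W += lr * h[:, None] * (xvec - W)
    return W

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2.0, 0.3, (50, 3)), rng.normal(2.0, 0.3, (50, 3))])
W = train_som(X)
q_err = float(np.mean([np.min(np.linalg.norm(W - xv, axis=1)) for xv in X]))
```

After training, the quantization error (mean distance from each input to its best-matching unit) should be small, indicating the map has organised itself onto the data.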

  5. A large set of newly created interspecific Saccharomyces hybrids increases aromatic diversity in lager beers.

    PubMed

    Mertens, Stijn; Steensels, Jan; Saels, Veerle; De Rouck, Gert; Aerts, Guido; Verstrepen, Kevin J

    2015-12-01

    Lager beer is the most consumed alcoholic beverage in the world. Its production process is marked by a fermentation conducted at low (8 to 15°C) temperatures and by the use of Saccharomyces pastorianus, an interspecific hybrid between Saccharomyces cerevisiae and the cold-tolerant Saccharomyces eubayanus. Recent whole-genome-sequencing efforts revealed that the currently available lager yeasts belong to one of only two archetypes, "Saaz" and "Frohberg." This limited genetic variation likely reflects that all lager yeasts descend from only two separate interspecific hybridization events, which may also explain the relatively limited aromatic diversity between the available lager beer yeasts compared to, for example, wine and ale beer yeasts. In this study, 31 novel interspecific yeast hybrids were developed, resulting from large-scale robot-assisted selection and breeding between carefully selected strains of S. cerevisiae (six strains) and S. eubayanus (two strains). Interestingly, many of the resulting hybrids showed a broader temperature tolerance than their parental strains and reference S. pastorianus yeasts. Moreover, they combined a high fermentation capacity with a desirable aroma profile in laboratory-scale lager beer fermentations, thereby successfully enriching the currently available lager yeast biodiversity. Pilot-scale trials further confirmed the industrial potential of these hybrids and identified one strain, hybrid H29, which combines a fast fermentation, high attenuation, and the production of a complex, desirable fruity aroma. PMID:26407881

  6. Value-cell bar charts for visualizing large transaction data sets.

    PubMed

    Keim, Daniel A; Hao, Ming C; Dayal, Umeshwar; Lyons, Martha

    2007-01-01

    One of the common problems businesses need to solve is how to use large volumes of sales histories, Web transactions, and other data to understand the behavior of their customers and increase their revenues. Bar charts are widely used for daily analysis, but only show highly aggregated data. Users often need to visualize detailed multidimensional information reflecting the health of their businesses. In this paper, we propose an innovative visualization solution based on the use of value cells within bar charts to represent business metrics. The value of a transaction can be discretized into one or multiple cells: high-value transactions are mapped to multiple value cells, whereas many small-value transactions are combined into one cell. With value-cell bar charts, users can 1) visualize transaction value distributions and correlations, 2) identify high-value transactions and outliers at a glance, and 3) instantly display values at the transaction record level. Value-Cell Bar Charts have been applied with success to different sales and IT service usage applications, demonstrating the benefits of the technique over traditional charting techniques. A comparison with two variants of the well-known Treemap technique and our earlier work on Pixel Bar Charts is also included. PMID:17495340
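
The cell mapping described, high-value transactions spanning multiple cells and small ones pooled into shared cells, can be sketched directly; the cell value and transactions are hypothetical:

```python
def to_value_cells(transactions, cell_value):
    """Map transactions to value cells: a high-value transaction fills several
    whole cells; leftover and small values are pooled into shared cells."""
    cells, pool, pool_total = [], [], 0.0
    for txn_id, value in transactions:
        full, rest = divmod(value, cell_value)
        for _ in range(int(full)):
            cells.append([(txn_id, cell_value)])    # one whole cell per slice
        if rest > 0:
            if pool and pool_total + rest > cell_value:
                cells.append(pool)                  # shared cell is full: emit it
                pool, pool_total = [], 0.0
            pool.append((txn_id, rest))
            pool_total += rest
    if pool:
        cells.append(pool)
    return cells

txns = [("t1", 250.0), ("t2", 40.0), ("t3", 30.0), ("t4", 35.0)]
cells = to_value_cells(txns, cell_value=100.0)
```

Each cell then becomes one drawable unit of the bar chart, so bar area stays proportional to transaction value while individual records remain addressable.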

  7. Large-scale assessment of missed opportunity risks in a complex hospital setting.

    PubMed

    Peng, Yidong; Erdem, Ergin; Shi, Jing; Masek, Christopher; Woodbridge, Peter

    2016-03-01

    In this research, we apply a large-scale logistic regression analysis to assess patient missed-opportunity risks at a complex VA (US Department of Veterans Affairs) hospital in three categories: no-show alone, no-show combined with late patient cancellation, and no-show combined with late patient and clinic cancellations. The analysis includes unique explanatory variables related to VA patients for predicting missed-opportunity risks. Furthermore, we develop two aggregated weather indices by combining many weather measures and include them as explanatory variables. The results indicate that most of the explanatory variables considered are significant factors for predicting missed-opportunity risks. Afternoon appointments, a higher percentage of service connection, insurance coverage, married status, shorter lead times, and longer appointment lengths are consistently related to lower risks of missed opportunity. Furthermore, the VA patient-related factors and the two proposed weather indices are useful predictors of no-show and patient cancellation risks. More importantly, this research presents an effective procedure for VA hospitals and clinics to analyze missed-opportunity risks within the complex VA information technology system, and helps them develop proper interventions to mitigate the adverse effects caused by missed opportunities. PMID:25325215
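
A minimal version of the underlying model, logistic regression fit by gradient descent on a few hypothetical predictors (an afternoon flag, lead time, and a weather index), might look like this; the effect sizes are invented for illustration, not taken from the study:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=3000):
    """Logistic regression by plain batch gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.integers(0, 2, n).astype(float),   # afternoon appointment (0/1)
    rng.exponential(2.0, n),               # appointment lead time (weeks)
    rng.normal(0.0, 1.0, n),               # aggregated weather index
])
true_logit = -1.0 - 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.6 * X[:, 2]  # assumed effects
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)
w, b = fit_logistic(X, y)
```

With enough records, the fitted coefficients recover the signs of the simulated effects, which is the kind of evidence the study uses to label predictors as risk-increasing or risk-decreasing.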

  8. Inference of higher-order relationships in the cycads from a large chloroplast data set.

    PubMed

    Rai, Hardeep S; O'Brien, Heath E; Reeves, Patrick A; Olmstead, Richard G; Graham, Sean W

    2003-11-01

    We investigated higher-order relationships in the cycads, an ancient group of seed-bearing plants, by examining a large portion of the chloroplast genome from seven species chosen to exemplify our current understanding of taxonomic diversity in the order. The regions considered span approximately 13.5 kb of unaligned data per taxon, and comprise a diverse range of coding sequences, introns and intergenic spacers dispersed throughout the plastid genome. Our results provide substantial support for most of the inferred backbone of cycad phylogeny, and weak evidence that the sister-group of the cycads among living seed plants is Ginkgo biloba. Cycas (representing Cycadaceae) is the sister-group of the remaining cycads; Dioon is part of the next most basal split. Two of the three commonly recognized families of cycads (Zamiaceae and Stangeriaceae) are not monophyletic; Stangeria is embedded within Zamiaceae, close to Zamia and Ceratozamia, and not closely allied to the other genus of Stangeriaceae, Bowenia. In contrast to the other seed plants, cycad chloroplast genomes share two features with Ginkgo: a reduced rate of evolution and an elevated transition:transversion ratio. We demonstrate that the latter aspect of their molecular evolution is unlikely to have affected inference of cycad relationships in the context of seed-plant wide analyses. PMID:13678689

  9. Can Wide Consultation Help with Setting Priorities for Large-Scale Biodiversity Monitoring Programs?

    PubMed Central

    Boivin, Frédéric; Simard, Anouk; Peres-Neto, Pedro

    2014-01-01

    Climate and other global change phenomena affecting biodiversity require monitoring to track ecosystem changes and guide policy and management actions. Designing a biodiversity monitoring program is a difficult task that requires making decisions that often lack consensus due to budgetary constraints. As monitoring programs require long-term investment, they also require strong and continuing support from all interested parties. As such, stakeholder consultation is key to identifying priorities and making sound design decisions that have as much support as possible. Here, we present the results of a consultation conducted as an aid for designing a large-scale biodiversity monitoring program for the province of Québec (Canada). The consultation took the form of a survey with 13 discrete choices involving tradeoffs with respect to design priorities and 10 demographic questions (e.g., age, profession). The survey was sent to thousands of individuals with expected interest and knowledge in biodiversity and was completed by 621 participants. Overall, consensuses were few and it appeared difficult to create a design fulfilling the priorities of the majority. Most participants wanted 1) a monitoring design covering the entire territory and focusing on natural habitats; and 2) a focus on species related to ecosystem services, and on threatened and invasive species. The only demographic characteristic related to the type of prioritization was the declared level of knowledge in biodiversity (null to high), but even then the influence was quite small. PMID:25525798

  10. Evaluation in the Classroom.

    ERIC Educational Resources Information Center

    Becnel, Shirley

    Six classroom research-based instructional projects funded under Chapter 2 are described, and their outcomes are summarized. The projects each used computer hardware and software in the classroom setting. The projects and their salient points include: (1) the Science Technology Project, in which 48 teachers and 2,847 students in 18 schools used…

  11. Monitoring Classroom Behavior.

    ERIC Educational Resources Information Center

    Ingersoll, Gary M.

    This document describes an instructional packet designed to help teachers develop effective techniques for monitoring classroom behavior. Monitoring student classroom behavior requires the possession of a meaningful set of categories with which to describe student behavior, the ability to identify examples of those behaviors in the context of…

  12. Tips from the Classroom.

    ERIC Educational Resources Information Center

    Benedetti, Teresa; De Gaetano, Yvonne; Weinstein-McShane, Ruth; Paez, Doris; McCarty, Laurie; Ehlers-Zavala, Fabiola; Bakken, Jeffrey P.

    1997-01-01

    This group of classroom tips discusses the benefits of peer coaching, peer group conversation about teachers' classroom experiences, using visual displays for collegial sharing, using cultural brokers in educational settings, and the role of picture books in developing literacy skills in diverse students with disabilities. (Author/CK)

  13. Job-related diseases and occupations within a large workers' compensation data set.

    PubMed

    Leigh, J P; Miller, T R

    1998-03-01

    The objective of this report is to describe workers' job-related diseases and the occupations associated with those diseases. The methods include aggregation and analysis of job-related disease and occupation data from the Bureau of Labor Statistics' Supplementary Data System (SDS) for 1985 and 1986--the last years of data available with workers' compensation categories: death, permanent total, permanent partial, and temporary total and partial. Diseases are ranked according to their contribution to the four workers' compensation (WC) categories and also ranked within occupations according to the number of cases. Occupations are ranked according to their contribution to specific diseases within one of the four categories. The following diseases comprise the greatest numbers of deaths: heart attacks, asbestosis, silicosis, and stroke. Within the permanent total category, the diseases with the greatest contributions are heart attack, silicosis, strokes, and inflammation of the joints. For the permanent partial category, they are hearing loss, inflammation of joints, carpal tunnel syndrome, and heart attacks. For the temporary total and partial category, they are: inflammation of joints, carpal tunnel syndrome, dermatitis, and toxic poisoning. Hearing loss or inflammation of joints are associated with more than 300 occupations. Circulatory diseases comprise a larger share of job-related diseases than is generally acknowledged. Occupations contributing the most heart attack deaths are truck drivers, managers, janitors, supervisors, firefighters, and laborers. Ratios of numbers of deaths to numbers of disabilities are far higher for illnesses than injuries. Occupations that are consistent in their high ranking on most lists involving a variety of conditions include nonconstruction laborers, janitors, and construction laborers. 
    The large SDS, though dated, provides a tentative national look at the broad spectrum of occupational diseases, as defined by WC, and the occupations associated with those diseases in 1985 and 1986. Some description of the spectrum of diseases encountered today is possible, especially for occupations, such as those mentioned above, for which employment expanded in the 1990s. PMID:9481418

  14. Actual Versus Estimated Utility Factor of a Large Set of Privately Owned Chevrolet Volts

    SciTech Connect

    John Smart; Thomas Bradley; Stephen Schey

    2014-04-01

    In order to determine the overall fuel economy of a plug-in hybrid electric vehicle (PHEV), the amount of operation in charge depleting (CD) versus charge sustaining modes must be determined. Mode of operation is predominantly dependent on customer usage of the vehicle and is therefore highly variable. The utility factor (UF) concept was developed to quantify the distance a group of vehicles has traveled or may travel in CD mode. SAE J2841 presents a UF calculation method based on data collected from travel surveys of conventional vehicles. UF estimates have been used in a variety of areas, including the calculation of window sticker fuel economy, policy decisions, and vehicle design determination. The EV Project, a plug-in electric vehicle charging infrastructure demonstration being conducted across the United States, provides the opportunity to determine the real-world UF of a large group of privately owned Chevrolet Volt extended range electric vehicles. Using data collected from Volts enrolled in The EV Project, this paper compares the real-world UF of two groups of Chevrolet Volts to estimated UFs based on J2841. The actual fleet utility factors (FUFs) observed for the MY2011/2012 and MY2013 Volt groups studied were 72% and 74%, respectively. Using the EPA CD ranges, the method prescribed by J2841 estimates a FUF of 65% and 68% for the MY2011/2012 and MY2013 Volt groups, respectively. Volt drivers achieved higher percentages of distance traveled in EV mode for two reasons. First, they had fewer long-distance travel days than drivers in the national travel survey referenced by J2841. Second, they charged more frequently than the J2841 assumption of once per day: drivers of Volts in this study averaged over 1.4 charging events per day. Although actual CD range varied widely as driving conditions varied, the average CD ranges for the two Volt groups studied matched the EPA CD range estimates, so CD range variation did not affect FUF results.
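    The utility-factor idea described above lends itself to a compact illustration. The sketch below is a simplified stand-in, not the SAE J2841 weighting procedure: it treats the fleet utility factor as the CD-mode share of total distance, assuming a fixed CD range and one full charge per driving day, with invented vehicle-day distances.

```python
# Simplified fleet utility factor (FUF): fraction of total fleet distance
# that could be driven in charge-depleting (CD) mode, assuming a fixed CD
# range and one full charge per driving day. Illustrative sketch only;
# the SAE J2841 procedure weights survey travel days differently.

def fleet_utility_factor(daily_distances_km, cd_range_km):
    """Sum CD-capable distance over all vehicle-days, divide by total distance."""
    cd_distance = sum(min(d, cd_range_km) for d in daily_distances_km)
    return cd_distance / sum(daily_distances_km)

# Hypothetical vehicle-days: four short commutes and one long-distance day.
days_km = [20, 35, 50, 45, 300]
fuf = fleet_utility_factor(days_km, cd_range_km=56)  # ~56 km, an EPA-like CD range
```

A long-distance day caps out at the CD range, which is exactly why the Volt drivers' scarcity of such days pushed their observed FUF above the J2841 estimate.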

  15. Designing Websites for Displaying Large Data Sets and Images on Multiple Platforms

    NASA Astrophysics Data System (ADS)

    Anderson, A.; Wolf, V. G.; Garron, J.; Kirschner, M.

    2012-12-01

    The desire to build websites that analyze and display ever-increasing amounts of scientific data and images pushes designers toward sites that utilize large displays and use the display area as efficiently as possible. Yet scientists and users of their data increasingly wish to access these websites in the field and on mobile devices. This results in the need to develop websites that can support a wide range of devices and screen sizes, and to optimally use whatever display area is available. Historically, designers have addressed this issue by building two websites: one for mobile devices and one for desktop environments, resulting in increased cost, duplicated work, and longer development times. Recent advances in web design technology and techniques allow for the development of a single website that dynamically adjusts to the type of device being used to browse it (smartphone, tablet, desktop), and provide the opportunity to truly optimize whatever display area is available. HTML5 and CSS3 give web designers media query statements, which allow design style sheets to be aware of the size of the display being used and to format web content differently based upon the queried response. Web elements can be rendered in a different size, position, or even removed from the display entirely, based upon the size of the display area. Using HTML5/CSS3 media queries in this manner is referred to as "Responsive Web Design" (RWD). RWD in combination with technologies such as LESS and Twitter Bootstrap allows the web designer to build sites which not only dynamically respond to the browser display size being used, but do so in very controlled and intelligent ways, ensuring that good layout and graphic design principles are followed.
At the University of Alaska Fairbanks, the Alaska Satellite Facility SAR Data Center (ASF) recently redesigned their popular Vertex application and converted it from a traditional, fixed-layout website into a RWD site built on HTML5, LESS and Twitter Bootstrap. Vertex is a data portal for remotely sensed imagery of the earth, offering Synthetic Aperture Radar (SAR) data products from the global ASF archive. By using Responsive Web Design, ASF is able to provide access to a massive collection of SAR imagery and allow the user to use mobile devices and desktops to maximum advantage. ASF's Vertex web site demonstrates that with increased interface flexibility, scientists, managers and users can increase their personal effectiveness by accessing data portals from their preferred device as their science dictates.

  16. Mining unusual and rare stellar spectra from large spectroscopic survey data sets using the outlier-detection method

    NASA Astrophysics Data System (ADS)

    Wei, Peng; Luo, Ali; Li, Yinbi; Pan, Jingchang; Tu, Liangping; Jiang, Bin; Kong, Xiao; Shi, Zhixin; Yi, Zhenping; Wang, Fengfei; Liu, Jie; Zhao, Yongheng

    2013-05-01

    The large number of spectra obtained from sky surveys such as the Sloan Digital Sky Survey (SDSS) and the survey executed by the Large sky Area Multi-Object fibre Spectroscopic Telescope (LAMOST, also called the GuoShouJing Telescope) provide us with opportunities to search for peculiar or even unknown types of spectra. In response to the limitations of existing methods, a novel outlier-mining method, the Monte Carlo Local Outlier Factor (MCLOF), is proposed in this paper, which can be used to highlight unusual and rare spectra from large spectroscopic survey data sets. The MCLOF method exposes outliers automatically and efficiently by marking each spectrum with a number, i.e. using the outlier index as a flag for an unusual and rare spectrum. The Local Outlier Factor (LOF) represents how unusual and rare a spectrum is compared with other spectra, and the Monte Carlo method is used to compute the global LOF for each spectrum by randomly selecting samples in each independent iteration. Our MCLOF method is applied to over half a million stellar spectra (classified as STAR by the SDSS Pipeline) from the SDSS data release 8 (DR8), and a total of 37 033 spectra are selected as outliers with signal-to-noise ratio (S/N) ≥ 3 and outlier index ≥ 0.85. Some of these outliers are shown to be binary stars, emission-line stars, carbon stars and stars with unusual continua. The results show that our proposed method can efficiently highlight these unusual spectra from the survey data sets. In addition, some relatively rare and interesting spectra are selected, indicating that the proposed method can also be used to mine rare, even unknown, spectra. The proposed method is applicable not only to spectral survey data sets but also to other types of survey data sets. The spectra of all peculiar objects selected by our MCLOF method are available from a user-friendly website: http://sciwiki.lamost.org/Miningdr8/.
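    The MCLOF idea (compute a Local Outlier Factor on random subsamples, average per object, and normalize into an outlier index) can be sketched in miniature. The toy below uses brute-force LOF on invented 2-D points rather than spectra, and its thresholds and sample sizes are illustrative, not the authors' implementation:

```python
import math
import random

# Toy sketch of the Monte Carlo Local Outlier Factor (MCLOF) idea:
# brute-force LOF on random subsamples, averaged per object and
# normalized to a [0, 1] "outlier index". Illustrative only; the paper
# applies this to half a million SDSS spectra with a 0.85 cutoff.

def lof_scores(points, k):
    """Classical LOF via brute-force k-nearest neighbours (small data only)."""
    n = len(points)
    nbrs, kdist = [], []
    for i in range(n):
        dists = sorted((math.dist(points[i], points[j]), j)
                       for j in range(n) if j != i)
        nbrs.append([j for _, j in dists[:k]])
        kdist.append(dists[k - 1][0])
    def reach(i, j):  # reachability distance of i from neighbour j
        return max(kdist[j], math.dist(points[i], points[j]))
    lrd = [k / sum(reach(i, j) for j in nbrs[i]) for i in range(n)]
    return [sum(lrd[j] for j in nbrs[i]) / (k * lrd[i]) for i in range(n)]

def mclof(points, k=3, iters=30, frac=0.8, seed=0):
    rng = random.Random(seed)
    n = len(points)
    total, hits = [0.0] * n, [0] * n
    m = max(k + 2, int(frac * n))
    for _ in range(iters):  # each iteration scores one random subsample
        sample = rng.sample(range(n), m)
        for pos, score in zip(sample, lof_scores([points[i] for i in sample], k)):
            total[pos] += score
            hits[pos] += 1
    avg = [total[i] / hits[i] if hits[i] else 0.0 for i in range(n)]
    lo, hi = min(avg), max(avg)  # normalize averages to an outlier index
    return [(a - lo) / (hi - lo) if hi > lo else 0.0 for a in avg]

# A tight cluster plus one far point standing in for a peculiar spectrum.
data = [(0, 0), (0, 1), (1, 0), (1, 1), (0.5, 0.5), (10, 10)]
index = mclof(data)
```

The far point collects the maximal normalized index, which is the role the ≥ 0.85 cutoff plays in the survey-scale version.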

  17. The Viking viewer for connectomics: scalable multi-user annotation and summarization of large volume data sets

    PubMed Central

    ANDERSON, JR; MOHAMMED, S; GRIMM, B; JONES, BW; KOSHEVOY, P; TASDIZEN, T; WHITAKER, R; MARC, RE

    2011-01-01

    Modern microscope automation permits the collection of vast amounts of continuous anatomical imagery in both two and three dimensions. These large data sets present significant challenges for data storage, access, viewing, annotation and analysis. The cost and overhead of collecting and storing the data can be extremely high. Large data sets quickly exceed an individual's capability for timely analysis and present challenges in efficiently applying transforms, if needed. Finally annotated anatomical data sets can represent a significant investment of resources and should be easily accessible to the scientific community. The Viking application was our solution created to view and annotate a 16.5 TB ultrastructural retinal connectome volume and we demonstrate its utility in reconstructing neural networks for a distinctive retinal amacrine cell class. Viking has several key features. (1) It works over the internet using HTTP and supports many concurrent users limited only by hardware. (2) It supports a multi-user, collaborative annotation strategy. (3) It cleanly demarcates viewing and analysis from data collection and hosting. (4) It is capable of applying transformations in real-time. (5) It has an easily extensible user interface, allowing addition of specialized modules without rewriting the viewer. PMID:21118201

  19. Supporting Classroom Activities with the BSUL System

    ERIC Educational Resources Information Center

    Ogata, Hiroaki; Saito, Nobuji A.; Paredes J., Rosa G.; San Martin, Gerardo Ayala; Yano, Yoneo

    2008-01-01

    This paper presents the integration of ubiquitous computing systems into classroom settings, in order to provide basic support for classrooms and field activities. We have developed web application components using Java technology and configured a classroom with wireless network access and a web camera for our purposes. In this classroom, the…

  20. Knowledge and theme discovery across very large biological data sets using distributed queries: a prototype combining unstructured and structured data.

    PubMed

    Mudunuri, Uma S; Khouja, Mohamad; Repetski, Stephen; Venkataraman, Girish; Che, Anney; Luke, Brian T; Girard, F Pascal; Stephens, Robert M

    2013-01-01

    As the discipline of biomedical science continues to apply new technologies capable of producing unprecedented volumes of noisy and complex biological data, it has become evident that available methods for deriving meaningful information from such data are simply not keeping pace. In order to achieve useful results, researchers require methods that consolidate, store and query combinations of structured and unstructured data sets efficiently and effectively. As we move towards personalized medicine, the need to combine unstructured data, such as medical literature, with large amounts of highly structured and high-throughput data such as human variation or expression data from very large cohorts, is especially urgent. For our study, we investigated a likely biomedical query using the Hadoop framework. We ran queries using native MapReduce tools we developed as well as other open source and proprietary tools. Our results suggest that the available technologies within the Big Data domain can reduce the time and effort needed to utilize and apply distributed queries over large datasets in practical clinical applications in the life sciences domain. The methodologies and technologies discussed in this paper set the stage for a more detailed evaluation that investigates how various data structures and data models are best mapped to the proper computational framework. PMID:24312478
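    The distributed-query pattern the authors exercise on Hadoop can be shown in miniature with an in-process map/shuffle/reduce over toy documents. All document text below is invented; a real deployment runs the same three phases across HDFS blocks on many nodes:

```python
from collections import defaultdict

# Minimal in-process MapReduce, illustrating the pattern Hadoop runs at
# scale: map emits (key, value) pairs, shuffle groups them by key, and
# reduce aggregates each group. The "abstracts" below are invented.

def map_phase(doc_id, text):
    for term in text.lower().split():
        yield term, 1

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    return key, sum(values)

docs = {
    "abstract-1": "BRCA1 variant reported with BRCA1 expression change",
    "abstract-2": "variant linked to expression in cohort",
}
pairs = [p for d, t in docs.items() for p in map_phase(d, t)]
term_counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
```

Counting term frequencies across unstructured abstracts is the simplest instance of the "theme discovery" queries the prototype combines with structured variation data.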

  2. Development and Validation of Decision Forest Model for Estrogen Receptor Binding Prediction of Chemicals Using Large Data Sets.

    PubMed

    Ng, Hui Wen; Doughty, Stephen W; Luo, Heng; Ye, Hao; Ge, Weigong; Tong, Weida; Hong, Huixiao

    2015-12-21

    Some chemicals in the environment possess the potential to interact with the endocrine system in the human body. Multiple receptors are involved in the endocrine system; estrogen receptor α (ERα) plays very important roles in endocrine activity and is the most studied receptor. Understanding and predicting estrogenic activity of chemicals facilitates the evaluation of their endocrine activity. Hence, we have developed a decision forest classification model to predict chemical binding to ERα using a large training data set of 3308 chemicals obtained from the U.S. Food and Drug Administration's Estrogenic Activity Database. We tested the model using cross validations and external data sets of 1641 chemicals obtained from the U.S. Environmental Protection Agency's ToxCast project. The model showed good performance in both internal (92% accuracy) and external validations (~70-89% relative balanced accuracies), where the latter involved the validations of the model across different ER pathway-related assays in ToxCast. The important features that contribute to the prediction ability of the model were identified through informative descriptor analysis and were related to current knowledge of ER binding. Prediction confidence analysis revealed that the model had both high prediction confidence and accuracy for most predicted chemicals. The results demonstrated that the model constructed based on the large training data set is more accurate and robust for predicting ER binding of chemicals than the published models that have been developed using much smaller data sets. The model could be useful for the evaluation of ERα-mediated endocrine activity potential of environmental chemicals. PMID:26524122
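    The abstract does not spell out the decision-forest algorithm, so the sketch below substitutes a generic tree-ensemble stand-in: one single-split "stump" per molecular descriptor, combined by majority vote. All descriptor values and labels are invented, and real decision-forest trees are far richer than stumps:

```python
# Generic tree-ensemble stand-in for a binder/non-binder classifier:
# one decision stump per descriptor column, majority vote across stumps
# (each stump deliberately uses a distinct feature, echoing the decision
# forest's use of disjoint feature sets). Not the FDA decision forest
# algorithm; descriptors and labels below are invented.

def train_stump(xs, ys):
    """Pick the threshold/orientation with the best training accuracy."""
    best = None
    for t in sorted(set(xs)):
        for sign in (1, -1):
            preds = [1 if sign * (x - t) > 0 else 0 for x in xs]
            acc = sum(p == y for p, y in zip(preds, ys)) / len(ys)
            if best is None or acc > best[0]:
                best = (acc, t, sign)
    return best[1], best[2]

def train_forest(X, y):
    """Train one stump per descriptor column."""
    return [(f, *train_stump([row[f] for row in X], y)) for f in range(len(X[0]))]

def predict(forest, row):
    votes = sum(1 if sign * (row[f] - t) > 0 else 0 for f, t, sign in forest)
    return 1 if 2 * votes > len(forest) else 0

# Toy chemicals: three hypothetical descriptor values per compound.
X = [[0.9, 0.1, 0.8], [0.8, 0.9, 0.7], [0.2, 0.8, 0.1], [0.1, 0.2, 0.2]]
y = [1, 1, 0, 0]  # 1 = ER binder, 0 = non-binder (invented labels)
forest = train_forest(X, y)
```

Majority voting lets the ensemble shrug off the one noisy descriptor here, which is the basic robustness argument for forest-style models over single trees.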

  3. Combining rough set and RBF neural network for large-scale ship recognition in optical satellite images

    NASA Astrophysics Data System (ADS)

    Chunyan, Lu; Huanxin, Zou; Hao, Sun; Shilin, Zhou

    2014-03-01

    Large-scale ship recognition in optical remote sensing images is of great importance for many military applications. It aims to recognize the category of detected ships for effective maritime surveillance. The contributions of this paper can be summarized as follows: first, based on rough set theory, the common discernibility degree is used to compute the significance weight of each candidate feature and to select valid recognition features automatically; second, an RBF neural network is constructed based on the selected recognition features. Experiments on recorded optical satellite images show the proposed method is effective and achieves better classification rates at higher speed than state-of-the-art methods.
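    The paper's "common discernibility degree" is not defined in the abstract, so the sketch below uses the classical rough-set dependency degree as a stand-in for feature-significance weighting. The discretized ship decision table is invented:

```python
from collections import defaultdict

# Rough-set feature weighting, sketched with the classical dependency
# degree gamma(C, D): the fraction of objects whose condition-attribute
# equivalence class maps to a single decision class. A feature's
# significance is the drop in gamma when it is removed. This is a
# textbook stand-in for the paper's common discernibility degree;
# the discretized ship table below is invented.

def gamma(rows, cond, dec):
    classes = defaultdict(set)
    for row in rows:
        classes[tuple(row[i] for i in cond)].add(row[dec])
    pure = sum(1 for row in rows
               if len(classes[tuple(row[i] for i in cond)]) == 1)
    return pure / len(rows)

def significance(rows, cond, dec, feature):
    reduced = [i for i in cond if i != feature]
    return gamma(rows, cond, dec) - gamma(rows, reduced, dec)

# Columns: 0 = length bin, 1 = speed bin; decision column 2 = category.
table = [
    ("long",  "fast", "warship"),
    ("long",  "slow", "cargo"),
    ("short", "fast", "warship"),
    ("short", "slow", "cargo"),
]
w_length = significance(table, [0, 1], 2, 0)
w_speed = significance(table, [0, 1], 2, 1)
```

In this toy table the speed bin alone determines the category, so removing it collapses the dependency degree, while the length bin contributes nothing; features with zero significance are the ones such a selection step discards before the RBF network is trained.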

  4. Functional network construction in Arabidopsis using rule-based machine learning on large-scale data sets.

    PubMed

    Bassel, George W; Glaab, Enrico; Marquez, Julietta; Holdsworth, Michael J; Bacardit, Jaume

    2011-09-01

    The meta-analysis of large-scale postgenomics data sets within public databases promises to provide important novel biological knowledge. Statistical approaches including correlation analyses in coexpression studies of gene expression have emerged as tools to elucidate gene function using these data sets. Here, we present a powerful and novel alternative methodology to computationally identify functional relationships between genes from microarray data sets using rule-based machine learning. This approach, termed "coprediction," is based on the collective ability of groups of genes co-occurring within rules to accurately predict the developmental outcome of a biological system. We demonstrate the utility of coprediction as a powerful analytical tool using publicly available microarray data generated exclusively from Arabidopsis thaliana seeds to compute a functional gene interaction network, termed Seed Co-Prediction Network (SCoPNet). SCoPNet predicts functional associations between genes acting in the same developmental and signal transduction pathways irrespective of the similarity in their respective gene expression patterns. Using SCoPNet, we identified four novel regulators of seed germination (ALTERED SEED GERMINATION5, 6, 7, and 8), and predicted interactions at the level of transcript abundance between these novel and previously described factors influencing Arabidopsis seed germination. An online Web tool to query SCoPNet has been developed as a community resource to dissect seed biology and is available at http://www.vseed.nottingham.ac.uk/. PMID:21896882
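    The coprediction construction (genes that co-occur within accurate rules become linked) reduces at its core to a weighted co-occurrence count. The sketch below uses invented rules and placeholder gene names; real SCoPNet edges come from rule-based machine learning over seed microarray data:

```python
from collections import defaultdict
from itertools import combinations

# Sketch of the coprediction idea: every pair of genes co-occurring in a
# predictive rule gains edge weight proportional to that rule's accuracy.
# Rules and gene names below are invented placeholders.

def coprediction_network(rules):
    """rules: iterable of (set_of_genes, rule_accuracy) pairs."""
    weight = defaultdict(float)
    for genes, accuracy in rules:
        for a, b in combinations(sorted(genes), 2):
            weight[(a, b)] += accuracy
    return dict(weight)

rules = [
    ({"GENE_A", "GENE_B"}, 0.9),
    ({"GENE_A", "GENE_B", "GENE_C"}, 0.5),
    ({"GENE_B", "GENE_C"}, 0.4),
]
network = coprediction_network(rules)
```

Because the weight depends on shared predictive power rather than expression similarity, two genes can be strongly linked even when their expression profiles look nothing alike, which is the property the abstract emphasizes.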

  5. Achieving the Complete-Basis Limit in Large Molecular Clusters: Computationally Efficient Procedures to Eliminate Basis-Set Superposition Error

    NASA Astrophysics Data System (ADS)

    Richard, Ryan M.; Herbert, John M.

    2013-06-01

    Previous electronic structure studies that have relied on fragmentation have been primarily interested in those methods' abilities to replicate the supersystem energy (or a related energy difference) without recourse to the ability of those supersystem results to replicate experiment or high-accuracy benchmarks. Here we focus on replicating accurate ab initio benchmarks that are suitable for comparison to experimental data. In doing this it becomes imperative that we correct our methods for basis-set superposition error (BSSE) in a computationally feasible way. This criterion leads us to develop a new method for BSSE correction, which we term the many-body counterpoise correction, or MBn for short. MBn is truncated at order n, in much the same manner as a normal many-body expansion, leading to a decrease in computational time. Furthermore, its formulation in terms of fragments makes it especially suitable for use with pre-existing fragment codes. A secondary focus of this study is directed at assessing fragment methods' abilities to extrapolate to the complete basis set (CBS) limit as well as to compute approximate triples corrections. Ultimately, by analysis of (H_2O)_6 and (H_2O)_{10}F^- systems, it is concluded that with large enough basis sets (triple- or quadruple-zeta) fragment-based methods can replicate high-level benchmarks in a fraction of the time.
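    For reference, the two-fragment counterpoise correction that the many-body MBn scheme generalizes can be written in the standard Boys-Bernardi form; the order-n truncation itself is defined in the paper, not reproduced here.

```latex
% E_X(Y): energy of fragment X evaluated in basis Y.
% Counterpoise-corrected interaction energy of a dimer AB:
\Delta E^{\mathrm{CP}}_{\mathrm{int}} = E_{AB}(AB) - E_{A}(AB) - E_{B}(AB)
% Uncorrected interaction energy:
\Delta E_{\mathrm{int}} = E_{AB}(AB) - E_{A}(A) - E_{B}(B)
% Their difference is the BSSE estimate removed by the correction
% (each bracket is nonnegative because enlarging the basis can only
% lower each monomer energy variationally):
\Delta E^{\mathrm{CP}}_{\mathrm{int}} - \Delta E_{\mathrm{int}}
  = \bigl[E_{A}(A) - E_{A}(AB)\bigr] + \bigl[E_{B}(B) - E_{B}(AB)\bigr] \ge 0
```

The computational burden of the full correction grows with the number of fragments evaluated in the supersystem basis, which is the cost MBn's order-n truncation is designed to cut.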

  6. Do networking activities outside of the classroom protect students against being bullied? A field study with students in secondary school settings in Germany.

    PubMed

    Blickle, Gerhard; Meurs, James A; Schoepe, Christine

    2013-01-01

    Research has shown that having close relationships with fellow classmates can provide a buffer for students against bullying and the negative outcomes associated with it. But, research has not explicitly examined the potential benefits of social networking behaviors outside of the classroom for those who could be bullied. This study addresses this gap and finds that, although a bullying climate in the classroom increases overall bullying, students high on external networking activities did not experience an increase in the bullying they received when in a classroom with a high bullying climate. However, the same group of students reported the largest degree of received bulling under conditions of a low bullying climate. We discuss the implications of our results and provide directions for future research. PMID:24364126

  7. Goal Setting and Student Achievement: A Longitudinal Study

    ERIC Educational Resources Information Center

    Moeller, Aleidine J.; Theiler, Janine M.; Wu, Chaorong

    2012-01-01

    The connection between goals and student motivation has been widely investigated in the research literature, but the relationship of goal setting and student achievement at the classroom level has remained largely unexplored. This article reports the findings of a 5-year quasi-experimental study examining goal setting and student achievement in…

  8. Pre-Service Teachers and Classroom Authority

    ERIC Educational Resources Information Center

    Pellegrino, Anthony M.

    2010-01-01

    This study examined the classroom practices of five pre-service teachers from three secondary schools in a large southeastern state. Through classroom observations, survey responses, reviews of refection logs, and focus-group interview responses, we centered on the issue of developing classroom authority as a means to effective classroom…

  9. Culture in the Classroom

    ERIC Educational Resources Information Center

    Medin, Douglas L.; Bang, Megan

    2014-01-01

    Culture plays a large but often unnoticeable role in what we teach and how we teach children. We are a country of immense diversity, but in classrooms the dominant European-American culture has become the language of learning.

  10. The Impact of Brief Teacher Training on Classroom Management and Child Behavior in At-Risk Preschool Settings: Mediators and Treatment Utility

    ERIC Educational Resources Information Center

    Snyder, James; Low, Sabina; Schultz, Tara; Barner, Stacy; Moreno, Desirae; Garst, Meladee; Leiker, Ryan; Swink, Nathan; Schrepferman, Lynn

    2011-01-01

    Teachers from fourteen classrooms were randomly assigned to an adaptation of Incredible Years (IY) teacher training or to teacher training-as-usual. Observations were made of the behavior of 136 target preschool boys and girls nominated by teachers as having many or few conduct problems. Peer and teacher behavior were observed at baseline and post…

  11. "Designing Instrument for Science Classroom Learning Environment in Francophone Minority Settings: Accounting for Voiced Concerns among Teachers and Immigrant/Refugee Students"

    ERIC Educational Resources Information Center

    Bolivar, Bathélemy

    2015-01-01

    The "Instrument for Minority Immigrant Science Learning Environment" (I_MISLE), an 8-scale, 32-item instrument developed through a three-phase process (see Appendix I), when completed by teachers provides an accurate description of existing conditions in classrooms in which immigrant and refugee students are situated. Through the completion of the instrument…

  12. Navigating the Problem Space of Academia: Exploring Processes of Course Design and Classroom Teaching in Postsecondary Settings. WCER Working Paper No. 2014-1

    ERIC Educational Resources Information Center

    Hora, Matthew T.

    2014-01-01

    Policymakers and educators alike increasingly focus on faculty adoption of interactive teaching techniques as a way to improve undergraduate education. Yet, little empirical research exists that examines the processes whereby faculty make decisions about curriculum design and classroom teaching in real-world situations. In this study, I use the idea…

  14. A Case Study of Literacy Instruction Delivered to Kindergarten Struggling Readers within the Response to Intervention Model in Three Classroom Settings

    ERIC Educational Resources Information Center

    Zelenka, Valerie Lynn

    2010-01-01

    A portion of the 2004 reauthorization of the Individuals with Disabilities Education Act (IDEA, 2004), Response to Intervention (RtI), aims to prevent unnecessary student placement in special education. The intent of RtI is to provide all students with effective classroom instruction first and afford low-performing students with increasingly…

  15. Possible calcium centers for hydrogen storage applications: An accurate many-body study by AFQMC calculations with large basis sets

    NASA Astrophysics Data System (ADS)

    Purwanto, Wirawan; Krakauer, Henry; Zhang, Shiwei; Virgus, Yudistira

    2011-03-01

    Weak H2 physisorption energies present a significant challenge to first-principles theoretical modeling and prediction of materials for H storage. There has been controversy regarding the accuracy of DFT on systems involving Ca cations. We use the auxiliary-field quantum Monte Carlo (AFQMC) method to accurately predict the binding energy of the Ca+-4H2 complex. AFQMC scales as N_basis^3 and has demonstrated accuracy similar to or better than the gold-standard coupled cluster CCSD(T) method. We apply a modified Cholesky decomposition to achieve an efficient Hubbard-Stratonovich transformation in AFQMC at large basis sizes. We employ the largest correlation-consistent basis sets available, up to Ca/cc-pCV5Z, to extrapolate to the complete basis limit. The calculated potential energy curve exhibits binding with a double-well structure. Supported by DOE and NSF. Calculations were performed at OLCF Jaguar and CPD.

  16. My Classroom Physical Activity Pyramid: A Tool for Integrating Movement into the Classroom

    ERIC Educational Resources Information Center

    Orlowski, Marietta; Lorson, Kevin; Lyon, Anna; Minoughan, Susan

    2013-01-01

    The classroom teacher is a critical team member of a comprehensive school physical activity program and an activity-friendly school environment. Students spend more time in the classroom than in any other school setting or environment. Classrooms are busy places, and classroom teachers must make decisions about how to make the best use of their…

  18. The ambient dose equivalent at flight altitudes: a fit to a large set of data using a Bayesian approach.

    PubMed

    Wissmann, F; Reginatto, M; Möller, T

    2010-09-01

    The problem of finding a simple, generally applicable description of worldwide measured ambient dose equivalent rates at aviation altitudes between 8 and 12 km is difficult to solve due to the large variety of functional forms and parametrisations that are possible. We present an approach that uses Bayesian statistics and Monte Carlo methods to fit mathematical models to a large set of data and to compare the different models. About 2500 data points measured in the periods 1997-1999 and 2003-2006 were used. Since the data cover wide ranges of barometric altitude, vertical cut-off rigidity and phases in the solar cycle 23, we developed functions which depend on these three variables. Whereas the dependence on the vertical cut-off rigidity is described by an exponential, the dependences on barometric altitude and solar activity may be approximated by linear functions in the ranges under consideration. Therefore, a simple Taylor expansion was used to define different models and to investigate the relevance of the different expansion coefficients. With the method presented here, it is possible to obtain probability distributions for each expansion coefficient and thus to extract reliable uncertainties even for the dose rate evaluated. The resulting function agrees well with new measurements made at fixed geographic positions and during long haul flights covering a wide range of latitudes. PMID:20826891
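    The functional form described above (exponential in vertical cut-off rigidity, linear in barometric altitude and solar activity) can be fitted even without the Bayesian machinery. Below is a plain least-squares sketch on synthetic data, with a grid search over the exponential constant standing in for the paper's Monte Carlo treatment; every parameter value is invented:

```python
import numpy as np

# Least-squares sketch of the paper's functional form,
#     H(h, Rc, s) ~ (a0 + a1*h + a2*s) * exp(-b*Rc),
# linear in altitude h and solar activity s, exponential in cut-off
# rigidity Rc. A grid over b plus linear least squares stands in for the
# paper's Bayesian/Monte Carlo fit; all data below are synthetic.

def fit_dose_model(h, rc, s, dose, b_grid=None):
    b_grid = np.linspace(0.0, 1.0, 201) if b_grid is None else b_grid
    best = None
    for b in b_grid:
        # For fixed b the model is linear in (a0, a1, a2): solve by lstsq.
        X = np.column_stack([np.ones_like(h), h, s]) * np.exp(-b * rc)[:, None]
        coef, *_ = np.linalg.lstsq(X, dose, rcond=None)
        rss = float(np.sum((X @ coef - dose) ** 2))
        if best is None or rss < best[0]:
            best = (rss, coef, b)
    return best[1], best[2]

rng = np.random.default_rng(0)
h = rng.uniform(8.0, 12.0, 200)   # altitude, km
rc = rng.uniform(0.0, 15.0, 200)  # vertical cut-off rigidity, GV
s = rng.uniform(0.0, 1.0, 200)    # solar-activity proxy
dose = (2.0 + 0.5 * h + 0.3 * s) * np.exp(-0.12 * rc)  # invented "truth"
coef, b = fit_dose_model(h, rc, s, dose)
```

Unlike this point estimate, the Bayesian approach in the paper yields a full probability distribution for each expansion coefficient, which is what makes its dose-rate uncertainties reliable.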

  19. Petascale Global Kinetic Simulations of The Magnetosphere and Visualization Strategies for Analysis of Very Large Multi-Variate Data Sets

    NASA Astrophysics Data System (ADS)

    Karimabadi, H.; Loring, B.; Vu, H. X.; Omelchenko, Y.; Tatineni, M.; Majumdar, A.; Ayachit, U.; Geveci, B.

    2011-10-01

    3D global electromagnetic hybrid (fluid electrons, kinetic ions) simulations have long been considered the holy grail in kinetic modeling of the magnetosphere but high computational requirements have kept them out of reach. Petascale computers provide the computational power to make such simulations possible but peta computing poses two technical challenges. One is related to the development of efficient and scalable algorithms that can take advantage of the large number of cores. The second is related to knowledge extraction from the resulting simulation output. The challenge of science discovery from the extremely large data sets (˜ 200 TB from a single run) generated from global kinetic simulations is compounded by the multi-variate and "noisy" nature of the data. Here, we review our innovations to overcome both challenges. We have developed a highly scalable hybrid simulation code (H3D) that we used to perform the first petascale global kinetic simulation of the magnetosphere using 98,304 cores on the NSF Kraken supercomputer. To facilitate analysis of data from such runs, we have developed complex visualization pipeline including physics based algorithms to detect and track events of interest in the data. The effectiveness of this approach is illustrated through examples.

  20. QSAR prediction of estrogen activity for a large set of diverse chemicals under the guidance of OECD principles.

    PubMed

    Liu, Huanxiang; Papa, Ester; Gramatica, Paola

    2006-11-01

    A large number of environmental chemicals, known as endocrine-disrupting chemicals, are suspected of disrupting endocrine functions by mimicking or antagonizing natural hormones, and such chemicals may pose a serious threat to the health of humans and wildlife. They are thought to act through a variety of mechanisms, mainly estrogen-receptor-mediated mechanisms of toxicity. However, it is practically impossible to perform thorough toxicological tests on all potential xenoestrogens, and thus, the quantitative structure-activity relationship (QSAR) provides a promising method for the estimation of a compound's estrogenic activity. Here, QSAR models of the estrogen receptor binding affinity of a large data set of heterogeneous chemicals have been built using theoretical molecular descriptors, giving full consideration to the new OECD principles in regulation for QSAR acceptability, during model construction and assessment. An unambiguous multiple linear regression (MLR) algorithm was used to build the models, and model predictive ability was validated by both internal and external validation. The applicability domain was checked by the leverage approach to verify prediction reliability. The results obtained using several validation paths indicate that the proposed QSAR model is robust and satisfactory, and can provide a feasible and practical tool for the rapid screening of the estrogen activity of organic compounds. PMID:17112243
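    A minimal sketch of the workflow the abstract describes: an MLR model with external validation plus a leverage-based applicability-domain check. The descriptors, response and thresholds below are synthetic stand-ins, not the authors' data or exact validation protocol:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))                    # synthetic molecular descriptors
coef = np.array([1.5, -2.0, 0.7, 0.0, 0.3])
y = X @ coef + rng.normal(0.0, 0.5, 200)         # synthetic binding affinity

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
r2_ext = model.score(X_te, y_te)                 # external validation score

# Leverage-based applicability domain: h_i = x_i (X'X)^-1 x_i',
# with the commonly used warning threshold h* = 3 p' / n
Xc = np.column_stack([np.ones(len(X_tr)), X_tr])
XtX_inv = np.linalg.inv(Xc.T @ Xc)
h_star = 3 * Xc.shape[1] / Xc.shape[0]

def leverage(x):
    xa = np.concatenate([[1.0], x])
    return float(xa @ XtX_inv @ xa)

in_domain = np.array([leverage(x) <= h_star for x in X_te])
```

    Predictions for compounds with leverage above `h_star` fall outside the model's applicability domain and would be flagged as unreliable extrapolations.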

  1. WebViz:A Web-based Collaborative Interactive Visualization System for large-Scale Data Sets

    NASA Astrophysics Data System (ADS)

    Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.

    2010-12-01

    WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota’s Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built upon over the last 3 1/2 years. The motivation behind WebViz lies primarily in the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to the market, including the use of general-purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data ‘on the fly’, wherever he or she may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE’s custom hierarchical volume rendering software provides high-resolution visualizations on the order of 15 million pixels and has been employed for visualizing data primarily from simulations in astrophysics and geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web- and javascript-enabled cell phones.
Features in the current version include the ability for users to (1) securely log in, (2) launch multiple visualizations, (3) conduct collaborative visualization sessions, (4) delegate control aspects of a visualization to others, and (5) engage in collaborative chats with other users within the user interface of the web application. These features are all in addition to a full range of essential visualization functions, including 3-D camera and object orientation, position manipulation, time-stepping control, and custom color/alpha mapping.

  2. Hydraulic behavior of two areas of the Floridan aquifer system characterized by complex hydrogeologic settings and large groundwater withdrawals

    SciTech Connect

    Maslia, M.L. )

    1993-03-01

    Two areas of the Floridan aquifer system (FAS) that are characterized by complex hydrogeologic settings and exceedingly large ground-water withdrawals are the Dougherty Plain area of southwest GA and the Glynn County area of southeast GA. In southwest GA, large-scale withdrawals of ground water for agricultural and livestock irrigation amounted to about 148 million gallons per day (mg/d) during 1990. Large-scale pumping in Glynn County, primarily used for industrial purposes and centered in the City of Brunswick, amounted to about 88 mg/d during 1990. In southwest GA, the FAS consists primarily of the Ocala Limestone (OL) of late Eocene age. Confining the aquifer from above is a residual layer (50 ft thick) of sand and clay containing silicified boulders, which is derived from the chemical weathering of the OL. This area is characterized by karst topography marked by numerous depressions and sinkholes, high transmissivity (generally greater than 50,000 feet squared per day), and significant hydraulic connections to overlying streams and lakes. These characteristics, along with the seasonal nature of pumping and mean annual recharge of about 10 inches per year, have prevented permanent, long-term water-level declines. In the Glynn County area, the FAS can be more than 2,600 ft thick, consisting of a sequence of calcareous and dolomitic rocks that are Late Cretaceous to early Miocene in age. The aquifer system is confined above by clastic rocks of Middle Miocene age, having an average thickness of 400 ft. This area is characterized by post-depositional tectonic modification of the subsurface as opposed to simple karst development, thick confinement of the aquifer system, and significant amounts of vertical leakage of water from below. These characteristics and heavy, long-term pumping from the Upper Floridan aquifer (UFA) have caused a broad, shallow cone of depression to develop and saltwater to migrate upward, contaminating the freshwater zones of the UFA.

  3. Simulation studies as designed experiments: the comparison of penalized regression models in the "large p, small n" setting.

    PubMed

    Chaibub Neto, Elias; Bare, J Christopher; Margolin, Adam A

    2014-01-01

    New algorithms are continuously proposed in computational biology. Performance evaluation of novel methods is important in practice. Nonetheless, the field experiences a lack of rigorous methodology aimed at systematically and objectively evaluating competing approaches. Simulation studies are frequently used to show that a particular method outperforms another. Oftentimes, however, simulation studies are not well designed, and it is hard to characterize the particular conditions under which different methods perform better. In this paper we propose the adoption of well-established techniques in the design of computer and physical experiments for developing effective simulation studies. By following best practices in the planning of experiments we are better able to understand the strengths and weaknesses of competing algorithms, leading to more informed decisions about which method to use for a particular task. We illustrate the application of our proposed simulation framework with a detailed comparison of the ridge-regression, lasso and elastic-net algorithms in a large-scale study investigating the effects on predictive performance of sample size, number of features, true model sparsity, signal-to-noise ratio, and feature correlation, in situations where the number of covariates is usually much larger than the sample size. Analysis of data sets containing tens of thousands of features but only a few hundred samples is nowadays routine in computational biology, where "omics" features such as gene expression, copy number variation and sequence data are frequently used in the predictive modeling of complex phenotypes such as anticancer drug response. The penalized regression approaches investigated in this study are popular choices in this setting, and our simulations corroborate well-established results concerning the conditions under which each one of these methods is expected to perform best, while providing several novel insights. PMID:25289666
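    A toy version of such a "large p, small n" comparison can be set up in a few lines. The simulation parameters (n = 150, p = 2000, 10 true signals) and the regularization strengths are illustrative choices, not those used in the study:

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso, ElasticNet
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n, p, k = 150, 2000, 10            # n << p, only k nonzero coefficients
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:k] = 2.0                     # sparse true model
y = X @ beta + rng.normal(0.0, 1.0, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=50, random_state=0)
models = {
    "ridge": Ridge(alpha=10.0),
    "lasso": Lasso(alpha=0.3, max_iter=5000),
    "elastic-net": ElasticNet(alpha=0.3, l1_ratio=0.5, max_iter=5000),
}
scores = {name: r2_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
```

    Under high sparsity like this, the lasso's feature selection gives it an edge over ridge; a designed experiment would vary sparsity, signal-to-noise ratio and correlation systematically rather than fixing one scenario.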

  5. Comparison of large-scale global land precipitation from multisatellite and reanalysis products with gauge-based GPCC data sets

    NASA Astrophysics Data System (ADS)

    Prakash, Satya; Gairola, R. M.; Mitra, A. K.

    2015-07-01

    Reliable information on land precipitation, along with other atmospheric variables, is crucial for monsoon studies, ecosystem modelling, crop modelling and numerous other applications. In this paper, three multisatellite and three reanalysis precipitation products, namely the Global Precipitation Climatology Project (GPCP), the Climate Prediction Center Merged Analysis of Precipitation (CMAP1 and CMAP2), the European Centre for Medium-Range Weather Forecasts Interim Reanalysis (ERA-I) and the National Centers for Environmental Prediction reanalyses (NCEP1 and NCEP2), are compared with the recent version of the gauge-based gridded Global Precipitation Climatology Centre (GPCC) data sets over the global land region. The analysis is done at monthly scale and at 2.5° latitude × 2.5° longitude resolution for a 25-year (1986-2010) period. Large-scale prominent features of precipitation and its variability are qualitatively represented by all the precipitation products; however, the magnitudes differ considerably among them. Among the six precipitation products, GPCP agrees best with the gridded GPCC data sets. Among the three reanalysis precipitation products, ERA-I is in general better than NCEP1 and NCEP2. Even though NCEP2 improves on NCEP1 over the mid-latitudes, NCEP2 has a more serious problem over orographic regions than NCEP1. Moreover, all the precipitation estimates exhibit a similar interannual variability over the global and tropical land regions. Additionally, a regional comparison over the six global monsoon regions shows that all the precipitation estimates exhibit a similar interannual variability in seasonal monsoon precipitation, although there are some regional differences among these products in the representation of monsoon variability.
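    The basic statistics used in comparisons of this kind (mean bias, RMSE and pattern correlation of each product against a gauge-based reference) can be sketched on synthetic gridded fields; the grid dimensions and error models below are illustrative assumptions, not the actual products:

```python
import numpy as np

rng = np.random.default_rng(11)
# toy monthly precipitation fields on a 2.5-degree global grid (12 months x 72 x 144)
gauge = rng.gamma(2.0, 50.0, size=(12, 72, 144))          # gauge-based reference (mm/month)
est_a = gauge + rng.normal(0.0, 10.0, gauge.shape)        # estimate with small random error
est_b = 1.3 * gauge + rng.normal(0.0, 40.0, gauge.shape)  # biased, noisier estimate

def compare(est, ref):
    """Mean bias, RMSE and pattern correlation of an estimate against a reference."""
    bias = float(np.mean(est - ref))
    rmse = float(np.sqrt(np.mean((est - ref) ** 2)))
    corr = float(np.corrcoef(est.ravel(), ref.ravel())[0, 1])
    return bias, rmse, corr
```

    Ranking products by these scores, per region and per season, is essentially how one estimate is judged "better" than another against the GPCC reference.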

  6. Question Driven Instruction with Classroom Response Technology

    NASA Astrophysics Data System (ADS)

    Gerace, William; Beatty, Ian

    2007-10-01

    Essentially, a classroom response system is technology that: 1) allows an instructor to present a question or problem to the class; 2) allows students to enter their answers into some kind of device; and 3) instantly aggregates and summarizes students' answers for the instructor, usually as a histogram. Most response systems provide additional functionality. Some additional names for this class of system (or for subsets of the class) are classroom communication system (CCS), audience response system (ARS), voting machine system, audience feedback system, and--most ambitiously--CATAALYST system (for ``Classroom Aggregation Technology for Activating and Assessing Learning and Your Students' Thinking''). UMPERG has been teaching with and researching classroom response systems since 1993. We find that the technology has the potential to transform the way we teach science in large lecture settings. CRSs can serve as catalysts for creating a more interactive, student-centered classroom in the lecture hall, thereby allowing students to become more actively involved in constructing and using knowledge. CRSs not only make it easier to engage students in learning activities during lecture but also enhance the communication among students, and between the students and the instructor. This enhanced communication assists the students and the instructor in assessing understanding during class time, and affords the instructor the opportunity to devise instructional interventions that target students' needs as they arise.

  7. Using Technology To Implement Active Learning in Large Classes. Technical Report.

    ERIC Educational Resources Information Center

    Gerace, William J.; Dufresne, Robert J.; Leonard, William J.

    An emerging technology, classroom communication systems (CCSs), has the potential to transform the way we teach science in large-lecture settings. CCSs can serve as catalysts for creating a more interactive, student-centered classroom in the lecture hall, thereby allowing students to become more actively involved in constructing and using…

  8. Clickenomics: Using a Classroom Response System to Increase Student Engagement in a Large-Enrollment Principles of Economics Course

    ERIC Educational Resources Information Center

    Salemi, Michael K.

    2009-01-01

    One of the most important challenges facing college instructors of economics is helping students engage. Engagement is particularly important in a large-enrollment Principles of Economics course, where it can help students achieve a long-lived understanding of how economists use basic economic ideas to look at the world. The author reports how…

  9. Classroom Planetarium.

    ERIC Educational Resources Information Center

    Ankney, Paul

    1981-01-01

    Provides instructions for the construction of a paper mache classroom planetarium and suggests several student activities using this planetarium model. Lists reasons why students have difficulties in transferring classroom instruction in astronomy to the night sky. (DS)

  10. Identifying Cognate Binding Pairs among a Large Set of Paralogs: The Case of PE/PPE Proteins of Mycobacterium tuberculosis

    PubMed Central

    Riley, Robert; Pellegrini, Matteo; Eisenberg, David

    2008-01-01

    We consider the problem of how to detect cognate pairs of proteins that bind when each belongs to a large family of paralogs. To illustrate the problem, we have undertaken a genomewide analysis of interactions of members of the PE and PPE protein families of Mycobacterium tuberculosis. Our computational method uses structural information, operon organization, and protein coevolution to infer the interaction of PE and PPE proteins. Some 289 PE/PPE complexes were predicted out of a possible 5,590 PE/PPE pairs genomewide. Thirty-five of these predicted complexes were also found to have correlated mRNA expression, providing additional evidence for these interactions. We show that our method is applicable to other protein families, by analyzing interactions of the Esx family of proteins. Our resulting set of predictions is a starting point for genomewide experimental interaction screens of the PE and PPE families, and our method may be generally useful for detecting interactions of proteins within families having many paralogs. PMID:18787688

  11. Spatial Fingerprints of Community Structure in Human Interaction Network for an Extensive Set of Large-Scale Regions

    PubMed Central

    Kallus, Zsófia; Barankai, Norbert; Szüle, János; Vattay, Gábor

    2015-01-01

    Human interaction networks inferred from country-wide telephone activity recordings were recently used to redraw political maps by projecting their topological partitions into geographical space. The results showed remarkable spatial cohesiveness of the network communities and a significant overlap between the redrawn and the administrative borders. Here we present a similar analysis based on one of the most popular online social networks, represented by the ties between more than 5.8 million of its geo-located users. The worldwide coverage of their measured activity allowed us to analyze the large-scale regional subgraphs of entire continents and an extensive set of examples for single countries. We present results for North and South America, Europe and Asia. In our analysis we used the well-established method of modularity clustering after an aggregation of the individual links into a weighted graph connecting equal-area geographical pixels. Our results show fingerprints of both opposing forces: local conflicts that divide and cross-cultural trends of globalization that unite. PMID:25993329
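    The modularity-clustering step on a weighted aggregated graph can be illustrated with networkx; the toy graph below stands in for the pixel-level interaction network and is not the authors' data:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# toy weighted graph of geographical "pixels": two dense blocks joined by one weak tie
G = nx.Graph()
block_a = ["a1", "a2", "a3", "a4"]
block_b = ["b1", "b2", "b3", "b4"]
for block in (block_a, block_b):
    for i, u in enumerate(block):
        for v in block[i + 1:]:
            G.add_edge(u, v, weight=5.0)   # strong intra-community ties
G.add_edge("a1", "b1", weight=0.5)         # weak cross-community tie

# modularity clustering of the weighted graph
communities = greedy_modularity_communities(G, weight="weight")
```

    Projecting each detected community back onto the map positions of its member pixels is what produces the "redrawn borders" described in the abstract.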

  12. RelMon: A General Approach to QA, Validation and Physics Analysis through Comparison of Large Sets of Histograms

    NASA Astrophysics Data System (ADS)

    Piparo, Danilo

    2012-12-01

    The estimation of the compatibility of large amounts of histogram pairs is a recurrent problem in high energy physics. The issue is common to several different areas, from software quality monitoring to data certification, preservation and analysis. Given two sets of histograms, it is very important to be able to scrutinize the outcome of several goodness of fit tests, obtain a clear answer about the overall compatibility, easily spot the single anomalies and directly access the concerned histogram pairs. This procedure must be automated in order to reduce the human workload, therefore improving the process of identification of differences which is usually carried out by a trained human mind. Some solutions to this problem have been proposed, but they are experiment specific. RelMon depends only on ROOT and offers several goodness of fit tests (e.g. chi-squared or Kolmogorov-Smirnov). It produces highly readable web reports, in which aggregations of the comparisons rankings are available as well as all the plots of the single histogram overlays. The comparison procedure is fully automatic and scales smoothly towards ensembles of millions of histograms. Examples of RelMon utilisation within the regular workflows of the CMS collaboration and the advantages therewith obtained are described. Its interplay with the data quality monitoring infrastructure is illustrated as well as its role in the QA of the event reconstruction code, its integration in the CMS software release cycle process, CMS user data analysis and dataset validation.
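    RelMon itself is built on ROOT, but the goodness-of-fit tests it offers (e.g. chi-squared and Kolmogorov-Smirnov) can be sketched for a single histogram pair with scipy instead; the samples and the 0.01 compatibility threshold below are illustrative assumptions:

```python
import numpy as np
from scipy.stats import chisquare, ks_2samp

rng = np.random.default_rng(7)
ref = rng.normal(0.0, 1.0, 10_000)   # reference sample (e.g. previous release)
new = rng.normal(0.0, 1.0, 10_000)   # new sample to validate (compatible by construction)

# chi-squared test on binned counts, using the same binning for both histograms
bins = np.linspace(-4.0, 4.0, 41)
c_ref, _ = np.histogram(ref, bins)
c_new, _ = np.histogram(new, bins)
mask = c_ref > 0
obs = c_new[mask].astype(float)
# rescale expected counts so both sides sum to the same total, as chisquare requires
exp = c_ref[mask] * obs.sum() / c_ref[mask].sum()
chi2_stat, chi2_p = chisquare(obs, exp)

# Kolmogorov-Smirnov test directly on the unbinned samples
ks_stat, ks_p = ks_2samp(ref, new)
compatible = bool(chi2_p > 0.01 and ks_p > 0.01)
```

    A RelMon-style report aggregates one such verdict per histogram pair over millions of pairs and ranks the anomalies for human inspection.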

  13. Determination of counterfeit medicines by Raman spectroscopy: Systematic study based on a large set of model tablets.

    PubMed

    Neuberger, Sabine; Neusüß, Christian

    2015-08-10

    In the last decade, counterfeit pharmaceutical products have become a widespread issue for public health. Raman spectroscopy, which is easy, non-destructive and information-rich, is particularly suitable as a screening method for the fast characterization of chemicals and pharmaceuticals. Combined with chemometric techniques, it provides a powerful tool for the analysis and determination of counterfeit medicines. Here, for the first time, a systematic study of the benefits and limitations of Raman spectroscopy for the analysis of pharmaceutical samples was performed on a large set of model tablets varying with respect to chemical and physical properties. To discriminate between the different mixtures, a combination of dispersive Raman spectroscopy performed in backscattering mode and principal component analysis was used. Discrimination between samples with different coatings, a varying amount of active pharmaceutical ingredients and a diversity of excipients was possible. However, it was not possible to distinguish between variations in press power, mixing quality and granulation. As a showcase, the change in the Raman signals of commercial acetylsalicylic acid effervescent tablets under five different storage conditions was monitored. It was possible to detect small chemical changes caused by inappropriate storage conditions at an early stage. These results demonstrate that Raman spectroscopy combined with multivariate data analysis provides a powerful methodology for the fast and easy characterization of genuine and counterfeit medicines. PMID:25956227
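    The spectra-plus-PCA combination described above can be sketched as follows; the synthetic "spectra" (a single Gaussian band whose position differs between two groups) are a stand-in for real Raman measurements of different tablet formulations:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
wavenumbers = np.linspace(200.0, 1800.0, 400)

def spectrum(band_center):
    """Toy 'Raman spectrum': one Gaussian band plus measurement noise."""
    band = np.exp(-((wavenumbers - band_center) ** 2) / (2.0 * 30.0 ** 2))
    return band + rng.normal(0.0, 0.02, wavenumbers.size)

# two model formulations differing only in band position (e.g. a different excipient)
group_a = np.array([spectrum(800.0) for _ in range(20)])
group_b = np.array([spectrum(1000.0) for _ in range(20)])
X = np.vstack([group_a, group_b])

# project the spectra onto the first two principal components
scores = PCA(n_components=2).fit_transform(X)
```

    In the score plot, the two formulations separate along PC1; differences that leave the spectra essentially unchanged (press power, mixing quality) would not separate, matching the limitation the abstract reports.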

  14. Automatic detection of rate change in large data sets with an unsupervised approach: the case of influenza viruses.

    PubMed

    Labonté, Kasandra; Aris-Brosou, Stéphane

    2016-04-01

    Influenza viruses evolve at such a high rate that vaccine recommendations need to be changed, but not quite on a regular basis. This observation suggests that the rate of evolution of these viruses is not constant through time, which begs the question as to when such rate changes occur, and whether they do so independently of the host in which they circulate and (or) independently of their subtype. To address these outstanding questions, we introduce a novel heuristic, Mclust*, based on a two-tier clustering approach in a phylogenetic context to estimate (i) absolute rates of evolution and (ii) when rate change occurs. We employ the novel approach to compare the two influenza surface proteins, hemagglutinin and neuraminidase, that circulated in avian, human, and swine hosts between 1960 and 2014 in two subtypes: H3N2 and H1N1. We show that the algorithm performs well in most conditions, accounts for phylogenetic uncertainty by means of bootstrapping, and scales up to analyze very large data sets. Our results show that our approach is robust to the time-dependent artifact of rate estimation, and confirm pervasive punctuated evolution across hosts and subtypes. As such, the novel approach can potentially detect when vaccine composition needs to be updated. PMID:26966881

  15. Early Miocene Kirka-Phrigian caldera, western Anatolia - an example of large volume silicic magma generation in extensional setting

    NASA Astrophysics Data System (ADS)

    Seghedi, Ioan; Helvacı, Cahit

    2014-05-01

    Large rhyolitic ignimbrite occurrences are closely connected to the Early Miocene initiation of extensional processes in central-west Anatolia along the Tavşanlı-Afyon zones. Field correlations and petrographical, geochemical and geochronological data lead to a substantial reinterpretation of the ignimbrite surrounding the Kırka area, known for its world-class borate deposits, as representing the climactic event of a caldera collapse, unrecognized until now and newly named the "Kırka-Phrigian caldera". The caldera, which is roughly oval (24 km x 15 km) in shape and one of the largest in Turkey, is supposed to have formed in a single-stage collapse event at ~19 Ma that generated huge-volume extracaldera outflow ignimbrites. Transtensive/distensive tectonic stresses since 25 Ma resulted in the NNW-SSE elongation of the magma chamber and influenced the roughly elliptical shape of the subsided block (caldera floor) belonging to the apex of the Eskişehir-Afyon-Isparta volcanic area. Intracaldera post-collapse sedimentation and volcanism (at ~18 Ma) were controlled by subsidence-related faults, with the generation of a series of volcanic structures (mainly domes) showing a large compositional range from saturated silicic rhyolites and crystal-rich trachytes to undersaturated lamproites. Such a volcanic rock association is typical of lithospheric extension. In this scenario, enriched mantle components within the subcontinental lithospheric mantle begin to melt via decompression during the initiation of extension. Interaction of these melts with crustal rocks, fractionation processes and crustal anatexis driven by the heat contained in the ascending mantle melts produced the silicic compositions in a large crustal reservoir. Such silicic melts generated the initial eruptions of the Kırka-Phrigian caldera ignimbrites.
The rock volume and geochemical evidence suggest that the silicic volcanic rocks come from a long-lived magma chamber that evolved episodically; after caldera generation there is a shift to small-volume episodic rhyolitic, trachytic and lamproitic volcanism, the last indicating a more primitive magma input with an evident origin in an enriched mantle lithosphere. The volcanic rock succession provides a direct picture of the state of the magmatic system at the time of the eruptions that generated the caldera and post-caldera structures, and offers an excellent example of silicic magma generation and associated potassic and ultrapotassic intermediate-mafic rocks in a post-collisional extensional setting.

  16. Classroom Network Technology as a Support for Systemic Mathematics Reform: The Effects of TI MathForward on Student Achievement in a Large, Diverse District

    ERIC Educational Resources Information Center

    Penuel, William; Singleton, Corinne; Roschelle, Jeremy

    2011-01-01

    Low-cost, portable classroom network technologies have shown great promise in recent years for improving teaching and learning in mathematics. This paper explores the impacts on student learning in mathematics when a program to introduce network technologies into mathematics classrooms is integrated into a systemic reform initiative at the…

  17. The Printout: Desktop Publishing in the Classroom.

    ERIC Educational Resources Information Center

    Balajthy, Ernest; Link, Gordon

    1988-01-01

    Reviews software available to the classroom teacher for desktop publishing and describes specific classroom activities. Suggests using desktop publishing to produce large print texts for students with limited sight or for primary students. (NH)

  18. In silico assessment of adverse effects of a large set of 6-fluoroquinolones obtained from a study of tuberculosis chemotherapy.

    PubMed

    Tusar, Marjan; Minovski, Nikola; Fjodorova, Natalja; Novic, Marjana

    2012-09-01

    Among the different chemotherapeutic classes available today, the 6-fluoroquinolone (6-FQ) antibacterials are still one of the most effective cures in fighting tuberculosis (TB). Nowadays, the development of novel 6-FQs for the treatment of TB mainly depends on understanding how structural modifications of the main quinolone scaffold at specific positions affect the anti-mycobacterial activity. Alongside the structure-activity relationship (SAR) studies of the 6-FQ antibacterials, which can be considered a golden rule in the development of novel active antitubercular 6-FQs, the structure-side effects relationship (SSER) of these drugs must also be taken into account. In the present study we focus on a proficient implementation of the existing knowledge-based expert systems for the design of novel 6-FQ antibacterials with possibly enhanced biological activity against Mycobacterium tuberculosis as well as lower toxicity. Following the SAR in silico studies of the quinolone antibacterials against M. tuberculosis performed in our laboratory, a large set of 6-FQs was selected. Several new 6-FQ derivatives were proposed as drug candidates for further research and development. The 6-FQs identified as potentially effective against M. tuberculosis were subjected to an additional SSER study for prediction of their toxicological profile. The assessment of structurally driven adverse effects which might hamper the potential of new drug candidates is mandatory for effective drug design. We applied publicly available knowledge-based (expert) systems and Quantitative Structure-Activity Relationship (QSAR) models in order to prepare a priority list of active compounds. A preferred order of drug candidates was obtained, so that the less harmful candidates were identified for further testing. The TOXTREE expert system, as well as some QSAR models developed in the framework of the EC-funded project CAESAR, were used to assess toxicity.
CAESAR models were developed according to the OECD principles for the validation of QSAR, and they turned out to be appropriate tools for in silico tests regarding five different toxicity endpoints. Those endpoints, with high relevance for REACH, are: bioconcentration factor, skin sensitization, carcinogenicity, mutagenicity, and developmental toxicity. We used the above-mentioned freely available models to select a set of less harmful active 6-FQs as candidates for clinical studies. PMID:23062244

  19. Classroom Connectivity: Increasing Participation and Understanding Inside the Classroom

    ERIC Educational Resources Information Center

    Hegedus, Stephen

    2007-01-01

    This article shows how highly mobile computing, when used with new forms of network connectivity, can allow new forms of activities in the mathematics classroom. Examples are provided, such as the ability to share, harvest, and aggregate mathematical objects, and the ability for teachers and students to analyze the entire set of classroom…

  20. Empirical Mining of Large Data Sets Already Helps to Solve Practical Ecological Problems; A Panoply of Working Examples (Invited)

    NASA Astrophysics Data System (ADS)

    Hargrove, W. W.; Hoffman, F. M.; Kumar, J.; Spruce, J.; Norman, S. P.

    2013-12-01

    Here we present diverse examples where empirical mining and statistical analysis of large data sets have already been shown to be useful for a wide variety of practical decision-making problems within the realm of large-scale ecology. Because a full understanding and appreciation of particular ecological phenomena are possible only after hypothesis-directed research regarding the existence and nature of that process, some ecologists may feel that purely empirical data harvesting may represent a less-than-satisfactory approach. Restricting ourselves exclusively to process-driven approaches, however, may actually slow progress, particularly for more complex or subtle ecological processes. We may not be able to afford the delays caused by such directed approaches. Rather than attempting to formulate and ask every relevant question correctly, empirical methods allow trends, relationships and associations to emerge freely from the data themselves, unencumbered by a priori theories, ideas and prejudices that have been imposed upon them. Although they cannot directly demonstrate causality, empirical methods can be extremely efficient at uncovering strong correlations with intermediate "linking" variables. In practice, these correlative structures and linking variables, once identified, may provide sufficient predictive power to be useful themselves. Such correlation "shadows" of causation can be harnessed by, e.g., Bayesian Belief Nets, which bias ecological management decisions, made with incomplete information, toward favorable outcomes. Empirical data-harvesting also generates a myriad of testable hypotheses regarding processes, some of which may even be correct. 
Quantitative statistical regionalizations based on quantitative multivariate similarity have lent insights into carbon eddy-flux direction and magnitude, wildfire biophysical conditions, phenological ecoregions useful for vegetation type mapping and monitoring, forest disease risk maps (e.g., sudden oak death), global aquatic ecoregion risk maps for aquatic invasives, and forest vertical structure ecoregions (e.g., using extensive LiDAR data sets). Multivariate Spatio-Temporal Clustering, which quantitatively places alternative future conditions on a common footing with present conditions, allows prediction of present and future shifts in tree species ranges, given alternative climatic change forecasts. ForWarn, a forest disturbance detection and monitoring system mining 12 years of national 8-day MODIS phenology data, has been operating since 2010, producing national maps every 8 days showing many kinds of potential forest disturbances. Forest resource managers can view disturbance maps via a web-based viewer, and alerts are issued when particular forest disturbances are seen. Regression-based decadal trend analysis showing long-term forest thrive and decline areas, and individual-based, brute-force supercomputing to map potential movement corridors and migration routes across landscapes will also be discussed. As significant ecological changes occur with increasing rapidity, such empirical data-mining approaches may be the most efficient means to help land managers find the best, most-actionable policies and decision strategies.
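The multivariate clustering approach described above (grouping map cells by similarity of their environmental variables, so that present and future conditions can be assigned to the same clusters) can be illustrated with a toy k-means. This is a minimal sketch under stated assumptions, not the authors' Multivariate Spatio-Temporal Clustering code; the two-variable cells and cluster count are invented for illustration.

```python
import math

def kmeans(points, k, iters=20):
    """Tiny k-means over tuples of environmental variables.

    Deterministic initialization from the first k points; each iteration
    assigns every point to its nearest center, then recomputes centers
    as the mean of their assigned points.
    """
    centers = points[:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[nearest].append(p)
        centers = [tuple(sum(x) / len(g) for x in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

# Map cells described by hypothetical (temperature, precipitation) values;
# two well-separated groups, so k-means recovers two "ecoregions".
cells = [(1.0, 2.0), (1.2, 1.9), (8.0, 9.0), (8.3, 9.1)]
centers = kmeans(cells, k=2)
```

Once cluster centers are fixed from present-day conditions, projected future conditions can be assigned to the same centers, which is what puts both time periods on a common footing.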

  1. Classroom Management in Diverse Classrooms

    ERIC Educational Resources Information Center

    Milner, H. Richard, IV; Tenore, F. Blake

    2010-01-01

    Classroom management continues to be a serious concern for teachers and especially in urban and diverse learning environments. The authors present the culturally responsive classroom management practices of two teachers from an urban and diverse middle school to extend the construct, culturally responsive classroom management. The principles that…

  3. Teaching Cell Biology in the Large-Enrollment Classroom: Methods to Promote Analytical Thinking and Assessment of Their Effectiveness

    PubMed Central

    Kitchen, Elizabeth; Bell, John D.; Reeve, Suzanne; Sudweeks, Richard R.; Bradshaw, William S.

    2003-01-01

    A large-enrollment, undergraduate cellular biology lecture course is described whose primary goal is to help students acquire skill in the interpretation of experimental data. The premise is that this kind of analytical reasoning is not intuitive for most people and, in the absence of hands-on laboratory experience, will not readily develop unless instructional methods and examinations specifically designed to foster it are employed. Promoting scientific thinking forces changes in the roles of both teacher and student. We describe didactic strategies that include directed practice of data analysis in a workshop format, active learning through verbal and written communication, visualization of abstractions diagrammatically, and the use of ancillary small-group mentoring sessions with faculty. The implications for a teacher in reducing the breadth and depth of coverage, becoming coach instead of lecturer, and helping students to diagnose cognitive weaknesses are discussed. In order to determine the efficacy of these strategies, we have carefully monitored student performance and have demonstrated a large gain in a pre- and posttest comparison of scores on identical problems, improved test scores on several successive midterm examinations when the statistical analysis accounts for the relative difficulty of the problems, and higher scores in comparison to students in a control course whose objective was information transfer, not acquisition of reasoning skills. A novel analytical index (student mobility profile) is described that demonstrates that this improvement was not random, but a systematic outcome of the teaching/learning strategies employed. An assessment of attitudes showed that, in spite of finding it difficult, students endorse this approach to learning, but also favor curricular changes that would introduce an analytical emphasis earlier in their training. PMID:14506506

  4. A Large-Scale Inquiry-Based Astronomy Intervention Project: Impact on Students' Content Knowledge Performance and Views of their High School Science Classroom

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena; Deehan, James

    2015-08-01

In this paper, we present the results from a study of the impact on students involved in a large-scale inquiry-based astronomical high school education intervention in Australia. Students in this intervention were led through an educational design allowing them to undertake an investigative approach to understanding the lifecycle of stars, more aligned with the `ideal' picture of school science. Through the use of two instruments, one focused on content knowledge gains and the other on student views of school science, we explore the impact of this design. Overall, students made moderate content knowledge gains, although these gains were heavily dependent on the individual teacher, the number of times a teacher had implemented the intervention, and the depth to which the teacher engaged with the provided materials. In terms of students' views, there were significant global changes in their views of their experience of the science classroom. However, some areas showed no change, or slightly negative changes, some of which were expected and some of which were not. From these results, we comment on the necessity of sustained long-period implementations rather than single interventions, the requirement for similarly sustained professional development, and the importance of monitoring the impact of inquiry-based implementations. This is especially important as inquiry-based approaches to science are required by many new curriculum reforms, most notably, in this context, the new Australian curriculum currently being rolled out.

  5. The Nonsexist Classroom. Primary Place.

    ERIC Educational Resources Information Center

    Taus, Kay; Spann, Mary Beth

    1992-01-01

    Presents strategies to help teachers keep elementary classrooms free of sex-role stereotyping. The article explains how to set the tone, observe classroom behavior, share nonsexist lessons, provide role models, modify sexist statements, connect with parents, discuss female heroes, reverse traditional roles, use nonsexist photo files, and make role…

  6. Is Our Classroom an Ecological Place?

    ERIC Educational Resources Information Center

    Xia, Wang

    2006-01-01

    The essence of ecology is life and its diversity, integrity, openness and coexistence. When one contemplates and analyzes classroom from the perspective of ecology, classroom should contain open-ended and multiple goals instead of a single and pre-set goal; classroom is more flexible, allowing great diversity instead of being narrow-minded,…

  8. Classroom Ecological Inventory: A Process for Mainstreaming.

    ERIC Educational Resources Information Center

    Fuchs, Douglas; And Others

    1994-01-01

    Teachers in Tennessee are using the Classroom Ecological Inventory (CEI) to prepare students with mild disabilities for moves into mainstream settings. The CEI was field tested as part of the Peabody Reintegration Project and involves observation of the regular classroom, regular teacher interview, comparison of the special and regular classrooms

  9. A review of sea spray aerosol source functions using a large global set of sea salt aerosol concentration measurements

    NASA Astrophysics Data System (ADS)

    Grythe, H.; Ström, J.; Krejci, R.; Quinn, P.; Stohl, A.

    2013-08-01

Sea spray aerosols (SSA) are an important part of the climate system through their effects on the global radiative budget, both directly as scatterers and absorbers of solar and terrestrial radiation, and indirectly as cloud condensation nuclei (CCN) influencing cloud formation, lifetime and precipitation. In terms of their global mass, SSA have the largest uncertainty of all aerosols. In this study we review 21 SSA source functions from the literature, several of which are used in current climate models, and we also propose a new function. Even excluding outliers, the global annual SSA mass produced by these source functions spans roughly 3-70 Pg yr-1, with relatively little interannual variability for a given function. The FLEXPART Lagrangian model was run in backward mode for a large global set of observed SSA concentrations, comprised of several station networks and ship cruise measurement campaigns. FLEXPART backward calculations produce gridded emission sensitivity fields, which can subsequently be multiplied with gridded SSA production fluxes to obtain modeled SSA concentrations. This allowed all 21 source functions to be evaluated efficiently at the same time against the measurements. Another advantage of this method is that source-region information on wind speed and sea surface temperatures (SSTs) could be stored and used for improving the SSA source function parameterizations. The best source functions reproduced as much as 70% of the observed SSA concentration variability at several stations, which is comparable with "state of the art" aerosol models. The main driver of SSA production is wind, and we found that the best fit to the observation data could be obtained when the SSA production is proportional to U10^3.5, where U10 is the source-region averaged 10 m wind speed.
A strong influence of SST on SSA production could be detected as well, although the underlying physical mechanisms of the SST influence remain unclear. Our new source function gives a global SSA production for particles smaller than 10 μm of 9 Pg yr-1 and is the best fit to the observed concentrations.
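The reported wind-speed scaling can be written as a one-line function. This sketch expresses only the stated proportionality; the constant `k` and the function name are placeholders, not part of the published source function, which also depends on SST and particle size.

```python
def ssa_production(u10_ms, k=1.0, exponent=3.5):
    """Relative SSA mass production flux, proportional to U10**3.5.

    k is an assumed placeholder proportionality constant; the real
    source function's prefactor is not given in this abstract.
    """
    if u10_ms < 0:
        raise ValueError("wind speed must be non-negative")
    return k * u10_ms ** exponent

# Doubling the source-region wind speed raises production by 2**3.5 (~11.3x).
ratio = ssa_production(10.0) / ssa_production(5.0)
```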

  10. Imputation of low-frequency variants using the HapMap3 benefits from large, diverse reference sets.

    PubMed

    Jostins, Luke; Morley, Katherine I; Barrett, Jeffrey C

    2011-06-01

Imputation allows the inference of unobserved genotypes in low-density data sets, and is often used to test for disease association at variants that are poorly captured by standard genotyping chips (such as low-frequency variants). Although much effort has gone into developing the best imputation algorithms, less is known about the effects of reference set choice on imputation accuracy. We assess the improvements afforded by increases in reference size and diversity, specifically comparing the HapMap2 data set, which has been used to date for imputation, and the new HapMap3 data set, which contains more samples from a more diverse range of populations. We find that, for imputation into Western European samples, the HapMap3 reference provides more accurate imputation with better-calibrated quality scores than HapMap2, and that increasing the number of HapMap3 populations included in the reference set grants further improvements. Improvements are most pronounced for low-frequency variants (frequency <5%), with the largest and most diverse reference sets bringing the accuracy of imputation of low-frequency variants close to that of common ones. For low-frequency variants, reference set diversity can improve the accuracy of imputation, independent of reference sample size. HapMap3 reference sets provide significant increases in imputation accuracy relative to HapMap2, and are of particular use if highly accurate imputation of low-frequency variants is required. Our results suggest that, although the sample sizes from the 1000 Genomes Pilot Project will not allow reliable imputation of low-frequency variants, the larger sample sizes of the main project will. PMID:21364697

  11. Photometric selection of quasars in large astronomical data sets with a fast and accurate machine learning algorithm

    NASA Astrophysics Data System (ADS)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2014-03-01

Future astronomical surveys will produce data on ˜10^8 objects per night. In order to characterize and classify these sources, we will require algorithms that scale linearly with the size of the data, that can be easily parallelized and where the speedup of the parallel algorithm will be linear in the number of processing cores. In this paper, we present such an algorithm and apply it to the question of colour selection of quasars. We use non-parametric Bayesian classification and a binning algorithm implemented with hash tables (BASH tables). We show that this algorithm's run time scales linearly with the number of test set objects and is independent of the number of training set objects. We also show that it has the same classification accuracy as other algorithms. For current data set sizes, it is up to three orders of magnitude faster than commonly used naive kernel-density-estimation techniques and it is estimated to be about eight times faster than the current fastest algorithm using dual kd-trees for kernel density estimation. The BASH table algorithm scales linearly with the size of the test set data only, and so for future larger data sets, it will be even faster compared to other algorithms which all depend on the size of the test set and the size of the training set. Since it uses linear data structures, it is easier to parallelize compared to tree-based algorithms and its speedup is linear in the number of cores unlike tree-based algorithms whose speedup plateaus after a certain number of cores. Moreover, due to the use of hash tables to implement the binning, the memory usage is very small. While our analysis is for the specific problem of selection of quasars, the ideas are general and the BASH table algorithm can be applied to any density-estimation problem involving sparse high-dimensional data sets. 
Since sparse high-dimensional data sets are a common type of scientific data set, this method has the potential to be useful in a broad range of machine-learning applications in astrophysics.
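The core binning idea can be sketched as follows, assuming a simple fixed-width grid whose bin-index tuples serve as hash keys. This is an illustrative simplification, not the published BASH-table implementation; bin width, key scheme, and the density estimate are my assumptions. Note that the query lookup cost does not depend on the number of training points, which mirrors the scaling claim in the abstract.

```python
from collections import defaultdict

def build_bins(points, width):
    """Hash each training point to the tuple of its grid-bin indices."""
    bins = defaultdict(int)
    for p in points:
        key = tuple(int(x // width) for x in p)
        bins[key] += 1
    return bins

def density(bins, query, width, n_total):
    """Estimate density as the fraction of training points in the query's bin."""
    key = tuple(int(x // width) for x in query)
    return bins.get(key, 0) / n_total

# Toy 2-D training set; only the occupied bins are stored, so memory stays
# small for sparse high-dimensional data.
train = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.9)]
bins = build_bins(train, width=0.5)
d = density(bins, (0.2, 0.3), width=0.5, n_total=len(train))
```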

  12. Inter-species inference of gene set enrichment in lung epithelial cells from proteomic and large transcriptomic datasets

    PubMed Central

    Hormoz, Sahand; Bhanot, Gyan; Biehl, Michael; Bilal, Erhan; Meyer, Pablo; Norel, Raquel; Rhrissorrakrai, Kahn; Dayarian, Adel

    2015-01-01

    Motivation: Translating findings in rodent models to human models has been a cornerstone of modern biology and drug development. However, in many cases, a naive ‘extrapolation’ between the two species has not succeeded. As a result, clinical trials of new drugs sometimes fail even after considerable success in the mouse or rat stage of development. In addition to in vitro studies, inter-species translation requires analytical tools that can predict the enriched gene sets in human cells under various stimuli from corresponding measurements in animals. Such tools can improve our understanding of the underlying biology and optimize the allocation of resources for drug development. Results: We developed an algorithm to predict differential gene set enrichment as part of the sbv IMPROVER (systems biology verification in Industrial Methodology for Process Verification in Research) Species Translation Challenge, which focused on phosphoproteomic and transcriptomic measurements of normal human bronchial epithelial (NHBE) primary cells under various stimuli and corresponding measurements in rat (NRBE) primary cells. We find that gene sets exhibit a higher inter-species correlation compared with individual genes, and are potentially more suited for direct prediction. Furthermore, in contrast to a similar cross-species response in protein phosphorylation states 5 and 25 min after exposure to stimuli, gene set enrichment 6 h after exposure is significantly different in NHBE cells compared with NRBE cells. In spite of this difference, we were able to develop a robust algorithm to predict gene set activation in NHBE with high accuracy using simple analytical methods. Availability and implementation: Implementation of all algorithms is available as source code (in Matlab) at http://bhanot.biomaps.rutgers.edu/wiki/codes_SC3_Predicting_GeneSets.zip, along with the relevant data used in the analysis. 
Gene sets, gene expression and protein phosphorylation data are available on request. Contact: hormoz@kitp.ucsb.edu PMID:25152231

  13. A Zebra in the Classroom.

    ERIC Educational Resources Information Center

    Leake, Devin; Morvillo, Nancy

    1998-01-01

    Describes the care and breeding of zebra fish, suggests various experiments and observations easily performed in a classroom setting, and provides some ideas to further student interest and exploration of these organisms. (DDR)

  14. Pre-Service Teachers and Classroom Authority

    ERIC Educational Resources Information Center

    Pellegrino, Anthony M.

    2010-01-01

This study examined the classroom practices of five pre-service teachers from three secondary schools in a large southeastern state. Through classroom observations, survey responses, reviews of reflection logs, and focus-group interview responses, we centered on the issue of developing classroom authority as a means to effective classroom…

  15. Classroom Management That Works

    ERIC Educational Resources Information Center

    Cleve, Lauren

    2012-01-01

    The purpose of this study was to find the best classroom management strategies to use when teaching in an elementary school setting. I wanted to conduct the best possible management tools for a variety of age groups as well as meet educational standards. Through my research I found different approaches in different grade levels is an important…

  16. Learning in Tomorrow's Classrooms

    ERIC Educational Resources Information Center

    Bowman, Richard F.

    2015-01-01

    Teaching today remains the most individualistic of all the professions, with educators characteristically operating in a highly fragmented world of "their" courses, "their" skills, and "their" students. Learning will occur in the classrooms of the future through a sustainable set of complementary capabilities:…

  17. Classroom Activities.

    ERIC Educational Resources Information Center

    Stuart, Frances R.

    This pamphlet suggests activities that may be used in the elementary school classroom. Chapter I lists various short plays that children can easily perform which encourage their imagination. Chapter II details a few quiet classroom games such as "I Saw,""Corral the Wild Horse,""Who Has Gone from the Room," and "Six-Man-Football Checkers." A number…

  18. Classroom Management.

    ERIC Educational Resources Information Center

    Dinsmore, Terri Sue

    This paper is a report of a middle-school teacher's study of classroom management. The teacher/researcher was interested in how some of the techniques in the Kovalik Integrated Thematic Instruction model of training would influence the teacher/researcher's classroom management; the effects of direct instruction within a community circle; the…

  19. Classroom Connect.

    ERIC Educational Resources Information Center

    Richardson, Sandra

    1997-01-01

    Describes the World Wide Web site called Classroom Connect. Notes that it gives easy access to Global Resources and Directory of Educational Sites (GRADES), which lists only "high quality" sites. Briefly discusses 17 sites listed by GRADES, and seven sections of the Classroom Connect site. (RS)

  20. Outdoor Classrooms

    ERIC Educational Resources Information Center

    Mayes, Valynda

    2010-01-01

    An outdoor classroom is the ideal vehicle for community involvement: Parents, native plant societies, 4-H, garden clubs, and master naturalists are all resources waiting to be tapped, as are local businesses offering support. If you enlist your community in the development and maintenance of your outdoor classroom, the entire community will…

  1. Accompanying Readings & Tools for Enhancing Classroom Approaches for Addressing Barriers to Learning: Classroom-Focused Enabling.

    ERIC Educational Resources Information Center

    California Univ., Los Angeles. Center for Mental Health in Schools.

    This publication presents a set of readings and tools that accompany the education modules "Enhancing Classroom Approaches to Addressing Barriers to Learning: Classroom-Focused Enabling." Together, they delineate a preservice/inservice teacher preparation curriculum covering how regular classrooms and schools should be designed to ensure all…

  2. Classroom interactions and science inquiry: A comparative study examining differential implementation of a science program in two middle school classrooms

    NASA Astrophysics Data System (ADS)

    Goldberg, Jennifer Sarah

    This dissertation explores two classroom communities during the implementation of a new environmental science curriculum. The classrooms are similar in that both are located in the same middle school and led by experienced classroom teachers. Despite these similarities, differences among learning outcomes are found in analyses of student pre- and post-science tests in the two rooms. Through videotape analysis of classroom interaction within parallel curricular activities, learning opportunities are contrasted in terms of the social and cognitive organization of science activities and the roles played by teachers, students, and scientists as manifested in their discourse. In one classroom, tasks flow between whole class discussions and small group work. Curricular activities are interwoven with transitions eased as goals are shared with students. Scientific concepts are connected through various activities and related to ideas outside of the classroom. Furthermore, the classroom community is united, established largely through the teacher's discourse patterns, such as deictics (specifically, inclusive personal pronouns). Moreover, the teacher emphasizes that she is learning alongside the students. In the other classroom, the focus of their science period is typically centered around whole class instruction or small group work depending on the particular lesson. This organization accompanied by a heavy use of directives leads to an implicit goal of completing the assigned task. Curricular activities are isolated, with an emphasis on following protocol instructions. Through discursive patterns, such as endearing address terms and exclusive pronouns, a dichotomy is created between the teacher and student. As the designated expert, this teacher imparts her knowledge of science to the students. Several implications emerge from this study. Although pre-packaged, curricular lessons appear identical on paper, the enacted curriculum differs, even in similar settings. 
Without doubt, science curricula can be useful in providing suggested guidelines and much needed materials for the classroom, but such curricula do not necessarily translate into student inquiry. As researchers and educators, we need to look beyond the curricula into the classrooms themselves. Indeed, this research has convinced me that a better understanding of classroom communities can be gleaned through the study of lesson organization and the classroom roles.

  3. Creating a Family-Like Atmosphere in Child Care Settings: All the More Difficult in Large Child Care Centers.

    ERIC Educational Resources Information Center

    Whitehead, Linda C.; Ginsberg, Stacey I.

    1999-01-01

    Presents suggestions for creating family-like programs in large child-care centers in three areas: (1) physical environment, incorporating cozy spaces, beauty, and space for family interaction; (2) caregiving climate, such as sharing home photographs, and serving meals family style; and (3) family involvement, including regular conversations with…

  4. Developing a "Semi-Systematic" Approach to Using Large-Scale Data-Sets for Small-Scale Interventions: The "Baby Matterz" Initiative as a Case Study

    ERIC Educational Resources Information Center

    O'Brien, Mark

    2011-01-01

    The appropriateness of using statistical data to inform the design of any given service development or initiative often depends upon judgements regarding scale. Large-scale data sets, perhaps national in scope, whilst potentially important in informing the design, implementation and roll-out of experimental initiatives, will often remain unused…

  5. Key Issues and Strategies for Recruitment and Implementation in Large-Scale Randomized Controlled Trial Studies in Afterschool Settings. Afterschool Research Brief. Issue No. 2

    ERIC Educational Resources Information Center

    Jones, Debra Hughes; Vaden-Kiernan, Michael; Rudo, Zena; Fitzgerald, Robert; Hartry, Ardice; Chambers, Bette; Smith, Dewi; Muller, Patricia; Moss, Marcey A.

    2008-01-01

    Under the larger scope of the National Partnership for Quality Afterschool Learning, SEDL funded three awardees to carry out large-scale randomized controlled trials (RCT) assessing the efficacy of promising literacy curricula in afterschool settings on student academic achievement. SEDL provided analytic and technical support to the RCT studies…

  7. pXRF quantitative analysis of the Otowi Member of the Bandelier Tuff: Generating large, robust data sets to decipher trace element zonation in large silicic magma chambers

    NASA Astrophysics Data System (ADS)

    Van Hoose, A. E.; Wolff, J.; Conrey, R.

    2013-12-01

Advances in portable X-Ray fluorescence (pXRF) analytical technology have made it possible for high-quality, quantitative data to be collected in a fraction of the time required by standard, non-portable analytical techniques. Not only do these advances reduce analysis time, but data may also be collected in the field in conjunction with sampling. Rhyolitic pumice, being primarily glass, is an excellent material to be analyzed with this technology. High-quality, quantitative data for elements that are tracers of magmatic differentiation (e.g. Rb, Sr, Y, Nb) can be collected for whole, individual pumices and subsamples of larger pumices in 4 minutes. We have developed a calibration for powdered rhyolite pumice from the Otowi Member of the Bandelier Tuff analyzed with the Bruker Tracer IV pXRF using Bruker software and influence coefficients for pumice, which measures the following 19 oxides and elements: SiO2, TiO2, Al2O3, FeO*, MnO, CaO, K2O, P2O5, Zn, Ga, Rb, Sr, Y, Zr, Nb, Ba, Ce, Pb, and Th. With this calibration for the pXRF and thousands of individual powdered pumice samples, we have generated an unparalleled data set for a single eruptive unit with known trace element zonation. The Bandelier Tuff of the Valles-Toledo Caldera Complex, Jemez Mountains, New Mexico, is divided into three main eruptive events. For this study, we have chosen the 1.61 Ma, 450 km^3 Otowi Member, as it is primarily unwelded and pumice samples are easily accessible. The eruption began with a plinian phase from a single source located near the center of the current caldera and deposited the Guaje Pumice Bed. The initial Unit A of the Guaje is geochemically monotonous, but Units B through E, co-deposited with ignimbrite, show very strong chemical zonation in trace elements, progressing upwards through the deposits from highly differentiated compositions (Rb ~350 ppm, Nb ~200 ppm) to less differentiated ones (Rb ~100 ppm, Nb ~50 ppm). 
Co-erupted ignimbrites emplaced during column collapse show similar trace element zonation. The eruption culminated in caldera collapse after transitioning from a single central vent to ring fracture vents. Ignimbrites deposited at this time have lithic breccias and chaotic geochemical profiles. The geochemical discrepancy between early and late deposits warrants detailed, high-resolution sampling and analysis in order to fully understand the dynamics behind zonation processes. Samples were collected from locations that circumvent the caldera and prepared and analyzed in the field and the laboratory with the pXRF. Approximately 2,000 pumice samples will complete this unprecedented data set, allowing detailed reconstruction of trace element zonation around all sides of the Valles Caldera. These data are then used to constrain models of magma chamber processes that produce trace element zonation and how it is preserved in the deposits after a catastrophic, caldera-forming eruption.

  8. The large karstic holes at the top of the Syrian coastal Mountain Range. Importance of structural setting for the karstogenesis.

    NASA Astrophysics Data System (ADS)

    Mocochain, Ludovic; Blanpied, Christian; Bigot, Jean-Yves; Peyronel, Olivier; Gorini, Christian; Abdalla, Abdelkarim Al; Azki, Fawaz

    2015-04-01

Along the Eastern Mediterranean Sea, the Syrian Coastal Mountain Range extends from north to south over 150 km. The range is a monocline structure bounded by a major escarpment that dominates the Al-Gahb Graben to the east. The Coastal Mountain Range is mainly formed of Mesozoic limestones that show a major unconformity between the Upper Jurassic and Aptian deposits, and important erosion surfaces in the Upper Cretaceous deposits. Locally, the Juro-Cretaceous unconformity is marked by a layer of continental basalts with fossil woods that reveal a long emergence of the platform. The most recent carbonate deposits at the top of the Coastal Mountain Range are Turonian in age. In the central part of the Coastal Mountain Range, within a small area, the Cretaceous carbonates are affected by large karstic dolines. These dolines are, curiously, located at the top of the mountain range, a position that is not favorable to the development of large karstic holes.

  9. Statistical Analysis of a Large Sample Size Pyroshock Test Data Set Including Post Flight Data Assessment. Revision 1

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Anne M.

    2010-01-01

    The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. The analysis methods used to evaluate this unusually large amount of shock data will be discussed in this paper, with particular emphasis on population distributions and finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed a derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level thus contributing to the overall success of the EOS Terra mission. The effectiveness of the statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample size of shock data is demonstrated in this paper.

  10. A High Performance Spatial Query Engine for Large Event Data Sets Developed for the GLAST LAT Instrument

    NASA Astrophysics Data System (ADS)

    Stephens, T.

    2007-10-01

    The Large Area Telescope (LAT) is the primary instrument on-board the Gamma-ray Large Area Space Telescope (GLAST) satellite. The LAT is not a typical imaging instrument but rather records the energy and arrival direction of each individual gamma-ray photon. It is designed to operate primarily in sky survey mode, instead of pointing at fixed targets for long periods of time. The standard survey mode gives complete coverage of the celestial sphere every 2 orbits (~3 hrs). Additionally, the LAT has a very large (~2 sr) field of view, and an energy dependent point spread function (PSF). These factors combine to generate a large data volume and present a unique challenge in providing data to the user community for the study of astronomical sources. We present the design of the public data server at the GLAST Science Support Center (GSSC) for the LAT data as well as performance benchmarks of the initial implementation. The data server operates on event lists stored in FITS files. Based on the user's query, the photons matching the data cuts are extracted and presented to the user as a FITS file ready to be downloaded and used in the GLAST science analysis software. Running on a single CPU, the photon data server can extract and prepare the ~1.5 million photons matching a typical user's query from the ~80 million photons in a year's worth of observational data in approximately 40 s. The complete system is implemented as a small cluster of Linux PCs to improve performance even more by distributing the load of a single query or processing multiple queries simultaneously.
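
    A photon query of the kind described, a cone search plus an energy band cut over an event list, might look like the following sketch. The field names (RA, DEC, ENERGY) and the flat list-of-dicts event format are assumptions for illustration; the real server operates on FITS event files.

```python
import math

def cone_energy_cut(events, ra0, dec0, radius_deg, emin, emax):
    """Keep events within an angular radius of (ra0, dec0) and inside
    an energy band. Field names are illustrative assumptions."""
    out = []
    r = math.radians(radius_deg)
    for ev in events:  # ev = dict with RA, DEC (deg) and ENERGY (MeV)
        ra, dec = math.radians(ev["RA"]), math.radians(ev["DEC"])
        # angular separation via the spherical law of cosines
        cos_sep = (math.sin(math.radians(dec0)) * math.sin(dec)
                   + math.cos(math.radians(dec0)) * math.cos(dec)
                   * math.cos(ra - math.radians(ra0)))
        sep = math.acos(max(-1.0, min(1.0, cos_sep)))
        if sep <= r and emin <= ev["ENERGY"] <= emax:
            out.append(ev)
    return out

events = [{"RA": 10.0, "DEC": 20.0, "ENERGY": 500.0},
          {"RA": 10.2, "DEC": 20.1, "ENERGY": 50.0},   # fails energy cut
          {"RA": 80.0, "DEC": -5.0, "ENERGY": 700.0}]  # fails cone cut
kept = cone_energy_cut(events, 10.0, 20.0, 1.0, 100.0, 1000.0)
```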

  11. Moving toward an Empowering Setting in a First Grade Classroom Serving Primarily Working Class and Working Poor Latina/o Children: An Exploratory Analysis

    ERIC Educational Resources Information Center

    Silva, Janelle M.; Langhout, Regina Day

    2016-01-01

    Empowering settings are important places for people to develop leadership skills in order to enact social change. Yet, due to socio-cultural constructions of childhood in the US, especially constructions around working class and working poor children of Color, they are often not seen as capable or competent change agents, or in need of being in…

  12. The Effects of Positive Verbal Reinforcement on the Time Spent outside the Classroom for Students with Emotional and Behavioral Disorders in a Residential Setting

    ERIC Educational Resources Information Center

    Kennedy, Christina; Jolivette, Kristine

    2008-01-01

    To more effectively instruct the entire class, teachers of students with emotional behavioral disorders (EBD) often choose to send students who display inappropriate behavior out of the room. A multiple baseline across settings was used to evaluate the effects of increasing teacher positive verbal reinforcement on the amount of time 2 students…

  14. Sorting a large set of heavily used LiF:Mg,Ti thermoluminescent detectors into repeatable subsets of similar response.

    PubMed

    Kearfott, Kimberlee J; Newton, Jill P; Rafique, Muhammad

    2014-10-30

    A set of 920 heavily used LiF:Mg,Ti thermoluminescent dosimeters (TLDs) was placed into a polymethyl methacrylate (PMMA) plate attached to a 40×40×15 cm(3) PMMA phantom and irradiated to 4.52 mGy using a (137)Cs source. This was repeated three times to determine the mean and standard deviation of each TLD's sensitivity. Reader drift was tracked over time with 10 control dosimeters. Two test sets of 100 TLDs were divided into subsets with sensitivities within ±1% of their subset means. All dosimeters were re-irradiated four times to test the TLDs' response repeatability and determine the sensitivity uniformity within the subsets. Coefficients of variation revealed that, within a given subset, the dosimeters responded within ±2.5% of their subset mean in all calibrations. The coefficient of variation in any of the 200 TLDs' calibrations was below 6% across the four calibrations. The work validates the approach of performing three calibrations to separate heavily used and aged TLDs with overall sensitivity variations of ±25% into subsets that reproducibly respond within ±2.5%. PMID:25464196
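
    The subset-building step, grouping detectors whose sensitivities all fall within ±1% of their subset mean, can be sketched as a greedy pass over sorted sensitivities. This is an assumed grouping strategy, not necessarily the one used in the study, and the sensitivity values are invented.

```python
import statistics

def group_within_tolerance(sensitivities, tol=0.01):
    """Greedy grouping: walk the detectors in order of sensitivity and
    close the current subset as soon as adding the next detector would
    push any member beyond +/- tol of the subset mean."""
    order = sorted(sensitivities.items(), key=lambda kv: kv[1])
    subsets, current = [], []
    for tid, s in order:
        trial = [v for _, v in current] + [s]
        m = statistics.mean(trial)
        if current and any(abs(v - m) > tol * m for v in trial):
            subsets.append(current)
            current = []
        current.append((tid, s))
    if current:
        subsets.append(current)
    return subsets

# Invented mean sensitivities from three repeat irradiations
sens = {"T1": 0.98, "T2": 0.985, "T3": 1.00, "T4": 1.21, "T5": 1.215}
groups = group_within_tolerance(sens)
```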

  15. Flexible Classroom Furniture

    ERIC Educational Resources Information Center

    Hassell, Kim

    2011-01-01

    Classroom design for the 21st-century learning environment should accommodate a variety of learning skills and needs. The space should be large enough so it can be configured to accommodate a number of learning activities. This also includes furniture that provides flexibility and accommodates collaboration and interactive work among students and…

  17. The Learning Environment in Clicker Classrooms: Student Processes of Learning and Involvement in Large University-Level Courses Using Student Response Systems

    ERIC Educational Resources Information Center

    Trees, April R.; Jackson, Michele H.

    2007-01-01

    To explore what social and educational infrastructure is needed to support classroom use of student response systems (Roschelle et al., 2004), this study investigated the ways in which student characteristics and course design choices were related to students' assessments of the contribution of clicker use to their learning and involvement in the…

  18. Estimation of comprehensive forest variable sets from multiparameter SAR data over a large area with diverse species

    NASA Technical Reports Server (NTRS)

    Moghaddam, M.

    2001-01-01

    Polarimetric and multifrequency data from the NASA/JPL airborne synthetic aperture radar (AIRSAR) have been used in a multi-tier estimation algorithm to calculate a comprehensive set of forest canopy properties including branch layer moisture and thickness, trunk density, trunk water content and diameter, trunk height, and subcanopy soil moisture. The estimation algorithm takes advantage of species-specific allometric relations, and is applied to a 100 km x 100 km area in the Canadian boreal region containing many different vegetation species types. The results show very good agreement with ground measurements taken at several focused and auxiliary study sites. This paper expands on the results reported in [1] and applies the algorithm at the regional scale.

  19. Deep sequencing of large library selections allows computational discovery of diverse sets of zinc fingers that bind common targets

    PubMed Central

    Persikov, Anton V.; Rowland, Elizabeth F.; Oakes, Benjamin L.; Singh, Mona; Noyes, Marcus B.

    2014-01-01

    The Cys2His2 zinc finger (ZF) is the most frequently found sequence-specific DNA-binding domain in eukaryotic proteins. The ZF’s modular protein–DNA interface has also served as a platform for genome engineering applications. Despite decades of intense study, a predictive understanding of the DNA-binding specificities of either natural or engineered ZF domains remains elusive. To help fill this gap, we developed an integrated experimental-computational approach to enrich and recover distinct groups of ZFs that bind common targets. To showcase the power of our approach, we built several large ZF libraries and demonstrated their excellent diversity. As proof of principle, we used one of these ZF libraries to select and recover thousands of ZFs that bind several 3-nt targets of interest. We were then able to computationally cluster these recovered ZFs to reveal several distinct classes of proteins, all recovered from a single selection, to bind the same target. Finally, for each target studied, we confirmed that one or more representative ZFs yield the desired specificity. In sum, the described approach enables comprehensive large-scale selection and characterization of ZF specificities and should be a great aid in furthering our understanding of the ZF domain. PMID:24214968

  1. "Did Ronald McDonald also Tend to Scare You as a Child?": Working to Emplace Consumption, Commodities and Citizen-Students in a Large Classroom Setting

    ERIC Educational Resources Information Center

    Goodman, Michael K.

    2008-01-01

    So-called "radical" and "critical" pedagogy seems to be everywhere these days on the landscapes of geographical teaching praxis and theory. Part of the remit of radical/critical pedagogy involves a de-centring of the traditional "banking" method of pedagogical praxis. Yet, how do we challenge this "banking" model of knowledge transmission in both a…

  3. Strategy Training in a Task-Based Language Classroom

    ERIC Educational Resources Information Center

    Lai, Chun; Lin, Xiaolin

    2015-01-01

    Recent literature that examines the implementation of task-based language teaching (TBLT) in classroom settings has reported various challenges related to educational cultures, classroom management, teacher cognition and learner perceptions. To facilitate the smooth transition of TBLT from laboratory settings to classroom contexts, measures need…

  4. Efficient Computation of k-Nearest Neighbour Graphs for Large High-Dimensional Data Sets on GPU Clusters

    PubMed Central

    Dashti, Ali; Komarov, Ivan; D’Souza, Roshan M.

    2013-01-01

    This paper presents an implementation of the brute-force exact k-Nearest Neighbor Graph (k-NNG) construction for ultra-large high-dimensional data clouds. The proposed method uses Graphics Processing Units (GPUs) and is scalable with multi-levels of parallelism (between nodes of a cluster, between different GPUs on a single node, and within a GPU). The method is applicable to homogeneous computing clusters with a varying number of nodes and GPUs per node. We achieve a 6-fold speedup in data processing as compared with an optimized method running on a cluster of CPUs and bring a hitherto impossible k-NNG generation for a dataset of twenty million images with 15 k dimensionality into the realm of practical possibility. PMID:24086314
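
    The underlying computation, a brute-force exact k-NNG, is simple to state even though scaling it to twenty million high-dimensional points requires the GPU cluster described. A minimal CPU sketch in Python, for illustration only:

```python
import math

def knn_graph(points, k):
    """Brute-force exact k-NN graph: for every point, compute the
    distance to all other points and keep the k nearest indices."""
    graph = []
    for i, p in enumerate(points):
        dists = [(math.dist(p, q), j)
                 for j, q in enumerate(points) if j != i]
        dists.sort()  # ties broken by index
        graph.append([j for _, j in dists[:k]])
    return graph

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]
g = knn_graph(pts, 2)
```

    The paper parallelizes exactly this all-pairs distance computation across GPUs; the brute-force form stays exact, unlike approximate k-NN indexes.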

  6. Classroom Wildlife.

    ERIC Educational Resources Information Center

    Fleer, Daryl

    1984-01-01

    A game is used to study population control factors on a wolf pack and to explore human competition with these animals. A game board and chance cards to be photocopied for use in the classroom are provided. (DH)

  7. Approaching the complete basis set limit of CCSD(T) for large systems by the third-order incremental dual-basis set zero-buffer F12 method

    SciTech Connect

    Zhang, Jun Dolg, Michael

    2014-01-28

    The third-order incremental dual-basis set zero-buffer approach was combined with CCSD(T)-F12x (x = a, b) theory to develop a new approach, i.e., the inc3-db-B0-CCSD(T)-F12 method, which can be applied as a black-box procedure to efficiently obtain the near complete basis set (CBS) limit of the CCSD(T) energies also for large systems. We tested this method for several cases of different chemical nature: four complexes taken from the standard benchmark sets S66 and X40, the energy difference between isomers of water hexamer and the rotation barrier of biphenyl. The results show that our method has an error relative to the best estimation of CBS energy of only 0.2 kcal/mol or less. By parallelization, our method can accomplish the CCSD(T)-F12 calculations of about 60 correlated electrons and 800 basis functions in only several days, which by standard implementation are impossible for ordinary hardware. We conclude that the inc3-db-B0-CCSD(T)-F12a/AVTZ method, which is of CCSD(T)/AV5Z quality, is close to the limit of accuracy that one can achieve for large systems currently.

  8. Assessment of amyloid β-protein precursor gene mutations in a large set of familial and sporadic Alzheimer disease cases

    PubMed Central

    Tanzi, Rudolph E.; Vaula, Giovanna; Romano, Donna M.; Mortilla, Marzia; Huang, Tricia L.; Tupler, Rossella G.; Wasco, Wilma; Hyman, Bradley T.; Haines, Jonathan L.; Jenkins, Barbara J.; Kalaitsidaki, Marianna; Warren, Andrew C.; McInnis, Melvin C.; Antonarakis, Stylianos E.; Karlinsky, Harry; Percy, Maire E.; Connor, Linda; Growdon, John; Crapper-McLachlan, Donald R.; Gusella, James F.; St George-Hyslop, Peter H.

    1992-01-01

    A genetic locus associated with familial Alzheimer disease (FAD) and a candidate gene, APP, encoding the amyloid protein precursor have both been assigned previously to chromosome 21, and, in a few FAD families, mutations of APP have been detected. However, obligate crossovers between APP and FAD have also been reported in several FAD pedigrees, including FAD4, a large kindred showing highly suggestive evidence for linkage of the disorder to chromosome 21. In case the apparent APP crossover in FAD4 actually represented an intragenic recombination event or segregation of different mutations in different family branches, we have performed a more detailed assessment of APP as a candidate gene in this family. The entire coding region of the APP gene was sequenced for FAD4 and for FAD1, a second large kindred. No mutations were found, indicating that, in at least one chromosome 21–linked FAD pedigree, the gene defect is not accounted for by a mutation in the known coding region of the APP gene. A total of 25 well-characterized early- and late-onset FAD pedigrees were typed for genetic linkage to APP, to assess the percentage of FAD families predicted to carry mutations in the APP gene. None of the FAD families yielded positive lod scores at a recombination fraction of 0.0. To estimate the overall prevalence of FAD-associated mutations in the βA4 domain of APP, we sequenced exons 16 and 17 in 30 (20 early- and 10 late-onset) FAD kindreds and in 11 sporadic AD cases, and we screened 56 FAD kindreds and 81 cases of sporadic AD for the presence of the originally reported FAD-associated mutation, APP717 Val→Ile (by BclI digestion). No APP gene mutations were found in any of the FAD families or sporadic-AD samples examined in this study, suggesting that the mutations in exons 16 and 17 are a rare cause of FAD. Overall, these data suggest that APP gene mutations account for a very small portion of FAD. PMID:1642228

  9. News Teaching: The epiSTEMe project: KS3 maths and science improvement Field trip: Pupils learn physics in a stately home Conference: ShowPhysics welcomes fun in Europe Student numbers: Physics numbers increase in UK Tournament: Physics tournament travels to Singapore Particle physics: Hadron Collider sets new record Astronomy: Take your classroom into space Forthcoming Events

    NASA Astrophysics Data System (ADS)

    2010-05-01

    Teaching: The epiSTEMe project: KS3 maths and science improvement Field trip: Pupils learn physics in a stately home Conference: ShowPhysics welcomes fun in Europe Student numbers: Physics numbers increase in UK Tournament: Physics tournament travels to Singapore Particle physics: Hadron Collider sets new record Astronomy: Take your classroom into space Forthcoming Events

  10. LINC-NIRVANA for the large binocular telescope: setting up the world's largest near infrared binoculars for astronomy

    NASA Astrophysics Data System (ADS)

    Hofferbert, Ralph; Baumeister, Harald; Bertram, Thomas; Berwein, Jürgen; Bizenberger, Peter; Böhm, Armin; Böhm, Michael; Borelli, José Luis; Brangier, Matthieu; Briegel, Florian; Conrad, Albert; De Bonis, Fulvio; Follert, Roman; Herbst, Tom; Huber, Armin; Kittmann, Frank; Kürster, Martin; Laun, Werner; Mall, Ulrich; Meschke, Daniel; Mohr, Lars; Naranjo, Vianak; Pavlov, Aleksei; Pott, Jörg-Uwe; Rix, Hans-Walter; Rohloff, Ralf-Rainer; Schinnerer, Eva; Storz, Clemens; Trowitzsch, Jan; Yan, Zhaojun; Zhang, Xianyu; Eckart, Andreas; Horrobin, Matthew; Rost, Steffen; Straubmeier, Christian; Wank, Imke; Zuther, Jens; Beckmann, Udo; Connot, Claus; Heininger, Matthias; Hofmann, Karl-Heinz; Kröner, Tim; Nussbaum, Eddy; Schertl, Dieter; Weigelt, Gerd; Bergomi, Maria; Brunelli, Alessandro; Dima, Marco; Farinato, Jacopo; Magrin, Demetrio; Marafatto, Luca; Ragazzoni, Roberto; Viotto, Valentina; Arcidiacono, Carmelo; Bregoli, Giovanni; Ciliegi, Paolo; Cosentino, Guiseppe; Diolaiti, Emiliano; Foppiani, Italo; Lombini, Matteo; Schreiber, Laura; D'Alessio, Francesco; Li Causi, Gianluca; Lorenzetti, Dario; Vitali, Fabrizio; Bertero, Mario; Boccacci, Patrizia; La Camera, Andrea

    2013-08-01

    LINC-NIRVANA (LN) is the near-infrared, Fizeau-type imaging interferometer for the large binocular telescope (LBT) on Mt. Graham, Arizona (elevation of 3267 m). The instrument is currently being built by a consortium of German and Italian institutes under the leadership of the Max Planck Institute for Astronomy in Heidelberg, Germany. It will combine the radiation from both 8.4 m primary mirrors of LBT in such a way that the sensitivity of a 11.9 m telescope and the spatial resolution of a 22.8 m telescope will be obtained within a 10.5×10.5 arcsec scientific field of view. Interferometric fringes of the combined beams are tracked in an oval field with diameters of 1 and 1.5 arcmin. In addition, both incoming beams are individually corrected by LN's multiconjugate adaptive optics system to reduce atmospheric image distortion over a circular field of up to 6 arcmin in diameter. A comprehensive technical overview of the instrument is presented, comprising the detailed design of LN's four major systems for interferometric imaging and fringe tracking, both in the near infrared range of 1 to 2.4 μm, as well as atmospheric turbulence correction at two altitudes, both in the visible range of 0.6 to 0.9 μm. The resulting performance capabilities and a short outlook of some of the major science goals will be presented. In addition, the roadmap for the related assembly, integration, and verification process are discussed. To avoid late interface-related risks, strategies for early hardware as well as software interactions with the telescope have been elaborated. The goal is to ship LN to the LBT in 2014.

  11. PhyloMap: an algorithm for visualizing relationships of large sequence data sets and its application to the influenza A virus genome

    PubMed Central

    2011-01-01

    Background Results of phylogenetic analysis are often visualized as phylogenetic trees. Such a tree can typically only include up to a few hundred sequences. When more than a few thousand sequences are to be included, analyzing the phylogenetic relationships among them becomes a challenging task. The recent frequent outbreaks of influenza A viruses have resulted in the rapid accumulation of corresponding genome sequences. Currently, there are more than 7500 influenza A virus genomes in the database. There are no efficient ways of representing this huge data set as a whole, thus preventing a further understanding of the diversity of the influenza A virus genome. Results Here we present a new algorithm, "PhyloMap", which combines ordination, vector quantization, and phylogenetic tree construction to give an elegant representation of a large sequence data set. The use of PhyloMap on influenza A virus genome sequences reveals the phylogenetic relationships of the internal genes that cannot be seen when only a subset of sequences are analyzed. Conclusions The application of PhyloMap to influenza A virus genome data shows that it is a robust algorithm for analyzing large sequence data sets. It utilizes the entire data set, minimizes bias, and provides intuitive visualization. PhyloMap is implemented in JAVA, and the source code is freely available at http://www.biochem.uni-luebeck.de/public/software/phylomap.html PMID:21689434
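
    Of the three ingredients PhyloMap combines, the vector-quantization step is the easiest to illustrate: pick a few prototype values that summarize a large set. Below is a minimal one-dimensional Lloyd's k-means sketch, not PhyloMap's actual implementation; the values and starting centers are invented.

```python
import statistics

def kmeans_1d(values, centers, iters=20):
    """Lloyd's k-means in one dimension: assign each value to its
    nearest center, then move each center to its cluster mean."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for v in values:
            nearest = min(range(len(centers)),
                          key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # keep a center unchanged if its cluster emptied out
        centers = [statistics.mean(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers

# Invented 1-D "ordination coordinates" with two obvious groups
vals = [0.1, 0.2, 0.15, 5.0, 5.1, 4.9]
protos = kmeans_1d(vals, [0.0, 6.0])
```

    In PhyloMap the prototypes chosen this way stand in for thousands of genome sequences, so a tree built over the prototypes stays readable.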

  12. Rethinking the Christian Studies Classroom: Reflections on the Dynamics of Teaching Religion in Southern Public Universities

    ERIC Educational Resources Information Center

    Gravett, Sandie; Hulsether, Mark; Medine, Carolyn

    2011-01-01

    An extended set of conversations conducted by three religious studies faculty teaching at large public universities in the Southern United States spurred these reflections on how their institutional locations inflected issues such as the cultural expectations students bring to the classroom, how these expectations interact with the evolving…

  13. Eruptive history and tectonic setting of Medicine Lake Volcano, a large rear-arc volcano in the southern Cascades

    USGS Publications Warehouse

    Donnelly-Nolan, J. M.; Grove, T.L.; Lanphere, M.A.; Champion, D.E.; Ramsey, D.W.

    2008-01-01

    Medicine Lake Volcano (MLV), located in the southern Cascades ~55 km east-northeast of contemporaneous Mount Shasta, has been found by exploratory geothermal drilling to have a surprisingly silicic core mantled by mafic lavas. This unexpected result is very different from the long-held view derived from previous mapping of exposed geology that MLV is a dominantly basaltic shield volcano. Detailed mapping shows that < 6% of the ~2000 km2 of mapped MLV lavas on this southern Cascade Range shield-shaped edifice are rhyolitic and dacitic, but drill holes on the edifice penetrated more than 30% silicic lava. Argon dating yields ages in the range ~475 to 300 ka for early rhyolites. Dates on the stratigraphically lowest mafic lavas at MLV fall into this time frame as well, indicating that volcanism at MLV began about half a million years ago. Mafic compositions apparently did not dominate until ~300 ka. Rhyolite eruptions were scarce post-300 ka until late Holocene time. However, a dacite episode at ~200 to ~180 ka included the volcano's only ash-flow tuff, which was erupted from within the summit caldera. At ~100 ka, compositionally distinctive high-Na andesite and minor dacite built most of the present caldera rim. Eruption of these lavas was followed soon after by several large basalt flows, such that the combined area covered by eruptions between 100 ka and postglacial time amounts to nearly two-thirds of the volcano's area. Postglacial eruptive activity was strongly episodic and also covered a disproportionate amount of area. The volcano has erupted 9 times in the past 5200 years, one of the highest rates of late Holocene eruptive activity in the Cascades. Estimated volume of MLV is ~600 km3, giving an overall effusion rate of ~1.2 km3 per thousand years, although the rate for the past 100 kyr may be only half that. During much of the volcano's history, both dry HAOT (high-alumina olivine tholeiite) and hydrous calcalkaline basalts erupted together in close temporal and spatial proximity. Petrologic studies indicate that the HAOT magmas were derived by dry melting of spinel peridotite mantle near the crust-mantle boundary. Subduction-derived H2O-rich fluids played an important role in the generation of calcalkaline magmas. Petrology, geochemistry and proximity indicate that MLV is part of the Cascades magmatic arc and not a Basin and Range volcano, although Basin and Range extension impinges on the volcano and strongly influences its eruptive style. MLV may be analogous to Mount Adams in southern Washington, but not, as sometimes proposed, to the older distributed back-arc Simcoe Mountains volcanic field.

  14. Antithrombotic Utilization Trends after Noncardioembolic Ischemic Stroke or TIA in the Setting of Large Antithrombotic Trials (2002–2009)

    PubMed Central

    Khan, Amir S.; Qureshi, Adnan I.

    2015-01-01

    Background and Purpose Several large trials published over the last decade have significantly altered recommended guidelines for therapy following a noncardioembolic ischemic stroke or transient ischemic attack (TIA). The impact of these studies on patient usage of alternative antithrombotic agents has hitherto not been evaluated. We examined the usage of these agents in the United States over the last decade, with regard to the publication of the Management of Atherothrombosis with Clopidogrel in High-Risk Patients (MATCH), European/Australasian Stroke Prevention in Reversible Ischaemia Trial (ESPRIT), and Prevention Regimen for Effectively Avoiding Second Strokes (PRoFESS) clinical trials, in order to test the hypothesis that resulting recommendations are reflected in usage trends. Methods Antithrombotic utilization was prospectively collected as part of the National Ambulatory Medical Care Survey (NAMCS) on a total of 53,608,351 patients in the United States between 2002 and 2009. Patients with a history of ischemic stroke or TIA were included. Patients were excluded if there was a prior history of subarachnoid or intracerebral hemorrhage, or if other indications for antithrombotic treatment were present, including deep venous thrombosis, pulmonary embolism, atrial fibrillation or flutter, mechanical cardiac valve replacement, congestive heart failure, coronary artery disease, peripheral arterial disease, and rheumatoid arthritis. Annual utilization of the following antithrombotic strategies was compared in 53,608,351 patients: 1) aspirin monotherapy, 2) clopidogrel monotherapy, 3) combined clopidogrel and aspirin, 4) combined extended-release dipyridamole (ERDP) and aspirin, and 5) warfarin. Annual utilization was compared before and after publication of MATCH, ESPRIT, and PRoFESS in 2004, 2006, and 2008, respectively. Trend analysis was performed with the Mantel–Haenszel test for trends. Sensitivity analysis of demographic and clinical characteristics stratified by antithrombotic-usage group was performed using the Wald Chi-square test. Results Utilization of combined clopidogrel and aspirin increased from 3.3% to 6.7% after the MATCH trial (p<0.0001). Following the results of the ESPRIT trial, utilization of combination ERDP and aspirin decreased from 4% to 3% (p<0.0001), utilization of clopidogrel declined from 6.8% to 6% (p<0.0001), and utilization of aspirin remained essentially unchanged. After the PRoFESS trial, utilization of clopidogrel increased from 5% to 9% (p<0.0001), utilization of ERDP-aspirin increased from 3% to 4.6% (p<0.0001), and utilization of aspirin increased from 15.6% to 17.8% (p<0.0001). The proportion of patients on none of the five antithrombotic secondary prevention strategies steadily declined from a peak of 74% in 2003 to 57% by 2009. Conclusions The impact of the MATCH, ESPRIT, and PRoFESS trials on antithrombotic utilization has been variable. These findings highlight the importance of addressing factors that affect the implementation of findings from major clinical trials. PMID:25825628

  15. Towards Perceptual Interface for Visualization Navigation of Large Data Sets Using Gesture Recognition with Bezier Curves and Registered 3-D Data

    SciTech Connect

    Shin, M C; Tsap, L V; Goldgof, D B

    2003-03-20

    This paper presents a gesture recognition system for visualization navigation. Scientists are interested in developing interactive settings for exploring large data sets in an intuitive environment. The input consists of registered 3-D data. A geometric method using Bezier curves is used for the trajectory analysis and classification of gestures. The hand gesture speed is incorporated into the algorithm to enable correct recognition from trajectories with variations in hand speed. The method is robust and reliable: correct hand identification rate is 99.9% (from 1641 frames), modes of hand movements are correct 95.6% of the time, recognition rate (given the right mode) is 97.9%. An application to gesture-controlled visualization of 3D bioinformatics data is also presented.
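
    The curve machinery behind such trajectory analysis can be illustrated with de Casteljau's algorithm for evaluating a Bezier curve; gesture classification of the kind described compares hand trajectories against fitted curves like this. The control points below are invented.

```python
def bezier_point(ctrl, t):
    """Evaluate a Bezier curve at parameter t (0..1) by de Casteljau's
    algorithm: repeatedly interpolate between adjacent control points."""
    pts = list(ctrl)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# Cubic curve: the endpoints are interpolated exactly
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
start = bezier_point(ctrl, 0.0)
mid = bezier_point(ctrl, 0.5)
end = bezier_point(ctrl, 1.0)
```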

  16. The effect of between-set rest intervals on the oxygen uptake during and after resistance exercise sessions performed with large- and small-muscle mass.

    PubMed

    Farinatti, Paulo T V; Castinheiras Neto, Antonio G

    2011-11-01

    Between-set rest intervals (RIs) may influence accumulated fatigue, work volume, and therefore oxygen uptake (VO2) and energy expenditure (EE) during resistance training. The study investigated the effects of different RIs on VO2 and EE in resistance exercises performed with multiple sets and recruiting large and small-muscle mass. Ten healthy men performed 4 randomized protocols (5 sets of 10 repetitions with 15 repetition maximum workloads in either horizontal leg press [LP] or chest fly [CF] with an RI of 1 and 3 minutes). The VO2 was measured at rest, within sets, and during 90-minute postexercise recovery (excess postexercise oxygen consumption [EPOC]). The EE was estimated from VO2net (total VO2 - rest VO2). The VO2 increased in all protocols, being higher within the exercises and during EPOC in the LP than in the CF regardless of the RI. The 1-minute RI induced higher accumulated VO2 during LP (p < 0.05) but not during CF. The EPOC lasted approximately 40 minutes after LP1, LP3, and CF1, being longer than after CF3 (20 minutes, p < 0.05). Total EE was mainly influenced by muscle mass (p < 0.001) (LP3 = 91.1 ± 13.5 kcal ∼ LP1 = 88.7 ± 18.4 kcal > CF1 = 50.3 ± 14.4 kcal ∼ CF3 = 54.1 ± 12.0 kcal). In conclusion, total VO2 was always higher in LP than in CF. Shortening RI enhanced the accumulated fatigue throughout sets only in LP and increased VO2 in the initial few minutes of EPOC, whereas it did not influence total VO2 and EE in both exercises. Therefore, (a) the role of RI in preventing early fatigue seems to be more important when large-muscle groups are recruited; (b) resistance exercises recruiting large-muscle mass induce higher EE because of a greater EPOC magnitude. PMID:21993043
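The energy-expenditure estimate described above (EE derived from VO2net = total VO2 - rest VO2) can be sketched as follows, assuming the common approximation of about 5 kcal per liter of O2 (the exact caloric equivalent depends on the respiratory exchange ratio, which the study does not need us to fix here):

```python
# Sketch of the energy-expenditure estimate described above: EE is derived
# from net oxygen uptake (total VO2 minus resting VO2) using the common
# ~5 kcal per liter of O2 approximation (the exact value depends on RER).

def energy_expenditure_kcal(total_vo2_l, rest_vo2_l, kcal_per_liter=5.0):
    vo2_net = total_vo2_l - rest_vo2_l
    return vo2_net * kcal_per_liter

# Hypothetical session: 22.0 L total O2 vs 4.0 L resting over the same period
print(energy_expenditure_kcal(22.0, 4.0))  # 90.0 kcal, in the range reported for LP
```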

  17. Collaborative Classroom Management. Video to Accompany "A Biological Brain in a Cultural Classroom: Applying Biological Research to Classroom Management." [Videotape].

    ERIC Educational Resources Information Center

    2001

    This 43-minute VHS videotape is designed to be used in course and workshop settings with "A Biological Brain in a Cultural Classroom: Applying Biological Research to Classroom Management." The videotape's principal values are as an introduction to the issues explored in the book and as a catalyst for group discussions and activities related to…

  19. Tectonic stress inversion of large multi-phase fracture data sets: application of Win-Tensor to reveal the brittle tectonic history of the Lufilan Arc, DRC

    NASA Astrophysics Data System (ADS)

    Delvaux, Damien; Kipata, Louis; Sintubin, Manuel

    2013-04-01

    Large fault-slip data sets from multiphase orogenic regions present a particular challenge in paleostress reconstructions. The Lufilian Arc is an arcuate fold-and-thrust belt that formed during late Pan-African times as the result of combined N-S and E-W amalgamation of Gondwana in SE DR Congo and N Zambia. We studied more than 22 sites in the Lufilian Arc and its foreland and correlated the results with existing results from the Ubende belt of W Tanzania. Most studied sites are characterized by multiphase brittle deformation in which the observed brittle structures result from progressive saturation of the host rock by neoformed fractures and from the reactivation of early-formed fractures. The sites correspond to large mining exploitations with multiple large and continuous outcrops, which allow data sets large enough to be statistically significant and often record several successive brittle events. In this context, the reconstruction of tectonic stress necessitates an initial field-based separation of data, completed by a dynamic separation of the original data set into subsets. In the largest sites, several parts of the deposits were measured independently and are treated as sub-sites that are processed separately in an initial stage. The procedure used for interactive fault-slip data separation and stress inversion is illustrated by field examples (the Luiswishi and Manono mining sites). Applying this approach to all sites resulted in the reconstruction of the brittle tectonic history of the region, starting with two major phases of orogenic compression, followed by late-orogenic extension and extensional collapse. A regional tectonic inversion during the early Mesozoic, resulting from far-field stresses, marks the transition towards rift-related extension. More details in Kipata, Delvaux et al. (2013), Geologica Belgica 16/1-2: 001-017. Win-Tensor can be downloaded at: http://users.skynet.be/damien.delvaux/Tensor/tensor-index.html

  20. Tools to achieve the analysis of large data-set and handling intensity variations of sources with INTEGRAL/SPI : mapping of the sky and study of large-scale structures

    NASA Astrophysics Data System (ADS)

    Bouchet, Laurent

    Nowadays, analysis and reduction of ever-larger data sets becomes a crucial issue, especially when long periods of observation are combined. The INTEGRAL/SPI X/gamma-ray spectrometer (20 keV-8 MeV) is an instrument for which it is essential to process many exposures at the same time in order to increase the low signal-to-noise ratio of the weakest sources and/or of low-surface-brightness extended emission. Processing several years of data simultaneously (currently 10 years) requires computing not only the solution of a large system of equations (linear or non-linear) but also the associated uncertainties. In this context, traditional methods of data reduction are ineffective and sometimes not possible at all. Thanks to newly developed tools, processing large data sets from SPI is now possible both with a reasonable turnaround time and with low memory usage. We also propose techniques that help overcome difficulties related to the intensity variations of sources and background between consecutive exposures; they allow the construction of pseudo light-curves in a more rational way. We have developed a specific algorithm that involves the SPI transfer function. Based on these advanced tools, we have developed imaging algorithms. Finally, we show some applications to point-source studies and to the imaging and morphological study of the large-scale structures of the Galaxy (the 511 keV electron-positron annihilation line, the 26Al line, and the diffuse continuum).
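The core numerical task described above, solving a large linear system for source intensities together with the associated uncertainties, can be caricatured with a weighted least-squares model. This is a schematic with invented dimensions and data, not the SPI pipeline itself:

```python
import numpy as np

# Schematic of the numerical problem (NOT the SPI pipeline): solve a large
# linear model y = A x for source intensities x, and report uncertainties
# from the covariance (A^T W A)^-1 of the weighted least-squares solution.

rng = np.random.default_rng(0)
n_exposures, n_sources = 200, 5
A = rng.random((n_exposures, n_sources))       # response of each exposure to each source
x_true = np.array([1.0, 0.5, 2.0, 0.1, 0.8])
sigma = 0.05
y = A @ x_true + rng.normal(0, sigma, n_exposures)

W = np.eye(n_exposures) / sigma**2             # inverse-variance weights
cov = np.linalg.inv(A.T @ W @ A)               # parameter covariance matrix
x_hat = cov @ A.T @ W @ y                      # weighted least-squares solution
errors = np.sqrt(np.diag(cov))                 # 1-sigma uncertainties

print(np.abs(x_hat - x_true) < 5 * errors)     # intensities recovered within uncertainties
```

In the real problem the matrix is far larger and sparse, which is precisely why the abstract emphasizes memory usage and turnaround time; the dense inverse here is only for illustration.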

  1. Smart Classroom

    ERIC Educational Resources Information Center

    Kelly, Rhea, Ed.

    2006-01-01

    What makes a classroom "smart"? Presentation technologies such as projectors, document cameras, and LCD panels clearly fit the bill, but when considering other technologies for teaching, learning, and developing content, the possibilities become limited only by the boundaries of an institution's innovation. This article presents 32 best practices…

  2. Classroom Notes

    ERIC Educational Resources Information Center

    International Journal of Mathematical Education in Science and Technology, 2007

    2007-01-01

    In this issue's "Classroom Notes" section, the following papers are described: (1) "Sequences of Definite Integrals" by T. Dana-Picard; (2) "Structural Analysis of Pythagorean Monoids" by M.-Q Zhan and J. Tong; (3) "A Random Walk Phenomenon under an Interesting Stopping Rule" by S. Chakraborty; (4) "On Some Confidence Intervals for Estimating the…

  3. JPA Classroom.

    ERIC Educational Resources Information Center

    Howard, Elizabeth; And Others

    This instructional guide offers classroom lesson plans that can be used by teachers or police officers with a videotape to present a "Junior Police Academy" (JPA) program for middle school students. The guide also contains lesson plans and student activities to be used independently of the videotape. Following a description of the goals of the…

  4. Classroom Notes

    ERIC Educational Resources Information Center

    International Journal of Mathematical Education in Science and Technology, 2007

    2007-01-01

    In this issue's "Classroom Notes" section, the following papers are discussed: (1) "Constructing a line segment whose length is equal to the measure of a given angle" (W. Jacob and T. J. Osler); (2) "Generating functions for the powers of Fibonacci sequences" (D. Terrana and H. Chen); (3) "Evaluation of mean and variance integrals without…

  5. Supplementary Classroom.

    ERIC Educational Resources Information Center

    Douglas Fir Plywood Association, Tacoma, WA.

    Three prototype portable classrooms were developed for both conventional and component construction. One of these economical units was built for $7.50 per square foot. Construction of each type is explained through use of photographs and text. Included in the presentation are--(1) cluster grouping suggestions, (2) interior and exterior…

  6. Classroom Behavior

    ERIC Educational Resources Information Center

    Segal, Carmit

    2008-01-01

    This paper investigates the determinants and malleability of noncognitive skills. Using data on boys from the National Education Longitudinal Survey, I focus on youth behavior in the classroom as a measure of noncognitive skills. I find that student behavior during adolescence is persistent. The variation in behavior can be attributed to…

  8. Classroom Tech

    ERIC Educational Resources Information Center

    Instructor, 2006

    2006-01-01

    This article features the latest classroom technologies namely the FLY Pentop, WriteToLearn, and a new iris scan identification system. The FLY Pentop is a computerized pen from Leapster that "magically" understands what kids write and draw on special FLY paper. WriteToLearn is an automatic grading software from Pearson Knowledge Technologies and…

  9. Expanding Knowledge: From the Classroom into Cyberspace

    ERIC Educational Resources Information Center

    Barbas, Maria Potes Santa-Clara

    2006-01-01

    This paper is part of a larger project in the area of research. The main purpose of this mediated discourse was to implement, observe and analyse experiences of teachers in a training project developed for two different settings in the classroom. The first was between international classrooms through cyberspace and the second was a cyberspace…

  10. Bringing the Great Outdoors to Your Classroom.

    ERIC Educational Resources Information Center

    Kentucky State Dept. of Education, Frankfort. Div. of Program Development.

    This guide suggests ways to study the environment in the classroom. It provides information for setting up a classroom learning center including learning outcomes, activity suggestions, and resources and equipment for the learning center. Guidelines for a recycling program, a bird-feeding program, and terrarium-making are given. A bibliography of…

  12. Application of Transcultural Themes in International Classrooms

    ERIC Educational Resources Information Center

    Van Hook, Steven R.

    2007-01-01

    The effective use of transcultural themes and images may help promote positive resonance in international settings, such as found in the traditional and online classrooms of globalizing higher education. Findings of transculturally resonant themes and images may be applied to international classroom pedagogy through such means as multimedia…

  13. A geometrical correction for the inter- and intra-molecular basis set superposition error in Hartree-Fock and density functional theory calculations for large systems

    NASA Astrophysics Data System (ADS)

    Kruse, Holger; Grimme, Stefan

    2012-04-01

    A semi-empirical counterpoise-type correction for basis set superposition error (BSSE) in molecular systems is presented. An atom pair-wise potential corrects for the inter- and intra-molecular BSSE in supermolecular Hartree-Fock (HF) or density functional theory (DFT) calculations. This scheme, denoted geometrical counterpoise (gCP), depends only on the molecular geometry, i.e., no input from the electronic wave function is required, and hence it is applicable to molecules with tens of thousands of atoms. The four necessary parameters have been determined by a fit to standard Boys and Bernardi counterpoise corrections for Hobza's S66×8 set of non-covalently bound complexes (528 data points). The method's targets are small basis sets (e.g., minimal, split-valence, 6-31G*), but reliable results are also obtained for larger triple-ζ sets. The intermolecular BSSE is calculated by gCP to within a typical error of 10%-30%, which proves sufficient in many practical applications. The approach is suggested as a quantitative correction in production work and can also be routinely applied to estimate the magnitude of the BSSE beforehand. The applicability for biomolecules as the primary target is tested for the crambin protein, where gCP removes intramolecular BSSE effectively and yields conformational energies comparable to def2-TZVP basis results. Good mutual agreement is also found with Jensen's ACP(4) scheme in estimating the intramolecular BSSE of the phenylalanine-glycine-phenylalanine tripeptide, for which a relaxed rotational energy profile is also presented. A variety of minimal and double-ζ basis sets combined with gCP and the dispersion corrections DFT-D3 and DFT-NL are successfully benchmarked on the S22 and S66 sets of non-covalent interactions. Outstanding performance with a mean absolute deviation (MAD) of 0.51 kcal/mol (0.38 kcal/mol after D3-refit) is obtained at the gCP-corrected HF-D3/(minimal basis) level for the S66 benchmark. The gCP-corrected B3LYP-D3/6-31G* model chemistry yields MAD = 0.68 kcal/mol, which represents a huge improvement over plain B3LYP/6-31G* (MAD = 2.3 kcal/mol). Application of gCP-corrected B97-D3 and HF-D3 to a set of large protein-ligand complexes proves the robustness of the method. Analytical gCP gradients make optimizations of large systems feasible with small basis sets, as demonstrated for the inter-ring distances of 9-helicene and most of the complexes in Hobza's S22 test set. The method is implemented in a freely available FORTRAN program obtainable from the authors' website.
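The atom pair-wise structure of such a correction can be caricatured as follows. The functional form and all parameters below are purely illustrative, not the published gCP expressions: each atom carries a precomputed "missing basis" energy that is damped by a decay in the interatomic distance and summed over pairs:

```python
import math

# Highly simplified schematic of an atom pair-wise BSSE correction in the
# spirit of gCP: each atom A carries a precomputed "missing basis" energy
# e_miss[A], damped by an exponential decay in the distance to every other
# atom B. The form and ALL parameters here are ILLUSTRATIVE, not published.

def gcp_like_correction(coords, e_miss, sigma=1.0, alpha=1.2):
    """coords: list of (x, y, z); e_miss: per-atom energies.
    Sums e_miss[a] * exp(-alpha * R_ab) over all ordered atom pairs."""
    e = 0.0
    n = len(coords)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            r = math.dist(coords[a], coords[b])
            e += e_miss[a] * math.exp(-alpha * r)
    return sigma * e

# Two-atom toy system, 1.5 units apart
coords = [(0.0, 0.0, 0.0), (0.0, 0.0, 1.5)]
e_miss = [0.01, 0.01]
print(round(gcp_like_correction(coords, e_miss), 6))
```

The point of the schematic is the scaling: because only geometry enters, the cost is a pair loop, which is why such a correction remains feasible for systems with tens of thousands of atoms.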

  14. The Social Context of Urban Classrooms: Measuring Student Psychological Climate

    ERIC Educational Resources Information Center

    Frazier, Stacy L.; Mehta, Tara G.; Atkins, Marc S.; Glisson, Charles; Green, Philip D.; Gibbons, Robert D.; Kim, Jong Bae; Chapman, Jason E.; Schoenwald, Sonja K.; Cua, Grace; Ogle, Robert R.

    2015-01-01

    Classrooms are unique and complex work settings in which teachers and students both participate in and contribute to classroom processes. This article describes the measurement phase of a study that examined the social ecology of urban classrooms. Informed by the dimensions and items of an established measure of organizational climate, we designed…

  15. Systemize Classroom Management to Enhance Teaching and Learning

    ERIC Educational Resources Information Center

    Delman, Douglas J.

    2011-01-01

    Good classroom management is one of the most important goals teachers strive to establish from the first day of class. The rules, procedures, activities, and behaviors set the classroom tone throughout the school year. By revising, updating, and systemizing classroom management activities, teachers can eliminate many problems created by students…

  16. Practical Classroom Applications of Language Experience: Looking Back, Looking Forward.

    ERIC Educational Resources Information Center

    Nelson, Olga G., Ed.; Linek, Wayne M., Ed.

    The 38 essays in this book look back at language experience as an educational approach, provide practical classroom applications, and reconceptualize language experience as an overarching education process. Classroom teachers and reading specialists describe strategies in use in a variety of classroom settings and describe ways to integrate…

  18. Multilingual Label Quests: A Practice for the "Asymmetrical" Multilingual Classroom

    ERIC Educational Resources Information Center

    Bonacina-Pugh, Florence

    2013-01-01

    Research on multilingual classrooms usually focuses on contexts where both teachers and pupils share the same linguistic repertoire; what can be called "symmetrical" multilingual classrooms. This paper sets out to investigate whether (and how) pupils' multilingual resources can be used in classrooms where the teacher does not share pupils'…

  19. Environmentally Enriched Classrooms and the Development of Disadvantaged Preschool Children.

    ERIC Educational Resources Information Center

    Busse, Thomas V.; And Others

    This study evaluates the effects of placement of additional equipment in preschool classrooms on the cognitive, perceptual, and social development of urban Negro four-year-old children. Two Get Set classrooms in each of six areas of Philadelphia were paired for teachers, subjects, physical facilities and equipment. One classroom in each pair was…

  1. Methodologic implications of social inequalities for analyzing health disparities in large spatiotemporal data sets: an example using breast cancer incidence data (Northern and Southern California, 1988--2002).

    PubMed

    Chen, Jarvis T; Coull, Brent A; Waterman, Pamela D; Schwartz, Joel; Krieger, Nancy

    2008-09-10

    Efforts to monitor, investigate, and ultimately eliminate health disparities across racial/ethnic and socioeconomic groups can benefit greatly from spatiotemporal models that enable exploration of spatial and temporal variation in health. Hierarchical Bayes methods are well-established tools in the statistical literature for fitting such models, as they permit smoothing of unstable small-area rates. However, issues presented by 'real-life' surveillance data can be a barrier to routine use of these models by epidemiologists. These include (1) shifting of regional boundaries over time, (2) social inequalities in racial/ethnic residential segregation, which imply differential spatial structuring across different racial/ethnic groups, and (3) heavy computational burdens for large spatiotemporal data sets. Using data from a study of changing socioeconomic gradients in female breast cancer incidence in two population-based cancer registries covering the San Francisco Bay Area and Los Angeles County, CA (1988--2002), we illustrate a two-stage approach to modeling health disparities and census tract (CT) variation in incidence over time. In the first stage, we fit race- and year-specific spatial models using CT boundaries normalized to the U.S. Census 2000. In stage 2, temporal patterns in the race- and year-specific estimates of racial/ethnic and socioeconomic effects are explored using a variety of methods. Our approach provides a straightforward means of fitting spatiotemporal models in large data sets, while highlighting differences in spatial patterning across racial/ethnic populations and across time. PMID:18551507
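The two-stage strategy described above can be sketched numerically: stage 1 yields year-specific effect estimates with standard errors, and stage 2 summarizes their temporal pattern, here by an inverse-variance-weighted linear trend. All numbers below are hypothetical:

```python
import numpy as np

# Sketch of the two-stage strategy (illustrative, not the authors' code):
# stage 1 produces year-specific effect estimates with standard errors;
# stage 2 fits an inverse-variance-weighted linear trend to them.

years = np.array([1988, 1992, 1996, 2000, 2002], dtype=float)
est = np.array([0.40, 0.35, 0.31, 0.24, 0.20])   # hypothetical stage-1 estimates
se = np.array([0.05, 0.04, 0.04, 0.03, 0.03])    # hypothetical standard errors

w = 1.0 / se**2                                   # inverse-variance weights
X = np.column_stack([np.ones_like(years), years - years.mean()])
WX = X * w[:, None]
beta = np.linalg.solve(X.T @ WX, WX.T @ est)      # [mean level, slope per year]
print(round(beta[1], 4))  # negative slope: the (toy) gradient shrinks over time
```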

  2. Methodologic implications of social inequalities for analyzing health disparities in large spatiotemporal data sets: An example using breast cancer incidence data (Northern and Southern California, 1988–2002)

    PubMed Central

    Chen, Jarvis T; Coull, Brent A.; Waterman, Pamela D.; Schwartz, Joel; Krieger, Nancy

    2008-01-01

    Efforts to monitor, investigate, and ultimately eliminate health disparities across racial/ethnic and socioeconomic groups can benefit greatly from spatiotemporal models that enable exploration of spatial and temporal variation in health. Hierarchical Bayes methods are well-established tools in the statistical literature for fitting such models, as they permit smoothing of unstable small-area rates. However, issues presented by ‘real-life’ surveillance data can be a barrier to routine use of these models by epidemiologists. These include (1) shifting of regional boundaries over time, (2) social inequalities in racial/ethnic residential segregation, which imply differential spatial structuring across different racial/ethnic groups, and (3) heavy computational burdens for large spatiotemporal data sets. Using data from a study of changing socioeconomic gradients in female breast cancer incidence in two population-based cancer registries covering the San Francisco Bay Area and Los Angeles County, CA (1988–2002), we illustrate a two-stage approach to modeling health disparities and census tract (CT) variation in incidence over time. In the first stage, we fit race- and year-specific spatial models using CT boundaries normalized to the U.S. Census 2000. In stage 2, temporal patterns in the race- and year-specific estimates of racial/ethnic and socioeconomic effects are explored using a variety of methods. Our approach provides a straightforward means of fitting spatiotemporal models in large data sets, while highlighting differences in spatial patterning across racial/ethnic populations and across time. PMID:18551507

  3. River Modeling in Large and Ungauged Basins: Experience of Setting up the HEC RAS Model over the Ganges-Brahmaputra-Meghna Basins

    NASA Astrophysics Data System (ADS)

    Hossain, F.; Maswood, M.

    2014-12-01

    River modeling is the process of setting up a physically based hydrodynamic model that can simulate the water flow dynamics of a stream network against time-varying boundary conditions. Such river models are an important component of any flood forecasting system that forecasts river levels in flood-prone regions. However, many large river basins in the developing world, such as the Ganges, Brahmaputra, Meghna (GBM), Indus, Irrawaddy, Salween, Mekong, and Niger, are mostly ungauged. Such large basins lack the necessary in-situ measurements of river bed depth/slope, bathymetry (river cross sections), floodplain mapping, and boundary-condition flows for forcing a river model. For such basins, proxy approaches relying mostly on remote sensing data from space platforms are the only alternative. In this study, we share our experience of setting up the widely used 1-D HEC-RAS model over the entire GBM basin and its stream network. Good-quality in-situ measurements of river hydraulics (cross section, slope, flow) were available only for the downstream and flood-prone region of the basin, which comprises only 7% of the basin area. For the remaining 93% of the basin area, we resorted to data from the following satellite sensors to build a workable river model: (a) the Shuttle Radar Topography Mission (SRTM) for deriving bed slope; (b) LANDSAT/MODIS for updating the river network and flow directions generated from elevation data; (c) radar altimetry data to build depth-versus-width relationships at river locations; and (d) satellite-precipitation-based hydrologic modeling of lateral flows into main-stem rivers. In addition, we referred to an extensive body of literature to estimate the prevailing baseline hydraulics of rivers in the ungauged region. We measured the success of our approach by systematically testing how well the basin-wide river model could simulate river-level dynamics at two measured locations inside Bangladesh. Our experience of river modeling was replete with numerous hurdles that we did not anticipate and that often required a change in plan. In this study we summarize these key hurdles and offer a step-by-step approach to setting up river models for large ungauged river basins. Such a guide can be useful for the community wishing to set up HEC-RAS-type models in basins such as the Niger, Mekong, Irrawaddy, Indus, etc.
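One of the proxy steps above, deriving bed slope from SRTM elevations, amounts to a linear regression of sampled elevations on downstream distance. A minimal sketch with hypothetical values:

```python
import numpy as np

# Sketch of one proxy step mentioned above: estimating river bed slope from
# satellite (SRTM-like) elevation samples along a reach by regressing
# elevation on downstream distance. All values are hypothetical.

distance_m = np.array([0, 5000, 10000, 15000, 20000], dtype=float)
elevation_m = np.array([52.0, 51.1, 50.3, 49.4, 48.6])  # noisy DEM samples

slope, intercept = np.polyfit(distance_m, elevation_m, 1)
print(round(-slope * 1e5, 1))  # downhill slope expressed in cm per km
```

Regression over a long reach averages out the meter-scale vertical noise of SRTM, which would overwhelm a two-point slope estimate on flat deltaic terrain.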

  4. Resistance to Disruption in a Classroom Setting

    ERIC Educational Resources Information Center

    Parry-Cruwys, Diana E.; Neal, Carrie M.; Ahearn, William H.; Wheeler, Emily E.; Premchander, Raseeka; Loeb, Melissa B.; Dube, William V.

    2011-01-01

    Substantial experimental evidence indicates that behavior reinforced on a denser schedule is more resistant to disruption than is behavior reinforced on a thinner schedule. The present experiment studied resistance to disruption in a natural educational environment. Responding during familiar activities was reinforced on a multiple…

  5. Pivotal Response Teaching in the Classroom Setting

    ERIC Educational Resources Information Center

    Stahmer, Aubyn C.; Suhrheinrich, Jessica; Reed, Sarah; Bolduc, Cynthia; Schreibman, Laura

    2010-01-01

    Pivotal response teaching (PRT) is an empirically supported naturalistic behavioral intervention proven to be efficacious in the education of children with autism. This intervention involves loosely structured learning environments, teaching during ongoing interactions between student and teacher, child initiation of teaching episodes, child…

  6. Promoting Active Involvement in Classrooms

    ERIC Educational Resources Information Center

    Conderman, Greg; Bresnahan, Val; Hedin, Laura

    2012-01-01

    This article presents a rationale for using active involvement techniques, describes large- and small-group methods based on their documented effectiveness and applicability to K-12 classrooms, and illustrates their use. These approaches include ways of engaging students in large groups (e.g., unison responses, response cards, dry-erase boards,…

  8. mzDB: A File Format Using Multiple Indexing Strategies for the Efficient Analysis of Large LC-MS/MS and SWATH-MS Data Sets*

    PubMed Central

    Bouyssié, David; Dubois, Marc; Nasso, Sara; Gonzalez de Peredo, Anne; Burlet-Schiltz, Odile; Aebersold, Ruedi; Monsarrat, Bernard

    2015-01-01

    The analysis and management of MS data, especially those generated by data-independent MS acquisition, exemplified by SWATH-MS, pose significant challenges for proteomics bioinformatics. The large size and vast amount of information inherent to these data sets need to be properly structured to enable an efficient and straightforward extraction of the signals used to identify specific target peptides. Standard XML-based formats are not well suited to large MS data files, for example, those generated by SWATH-MS, and compromise high-throughput data processing and storing. We developed mzDB, an efficient file format for large MS data sets. It relies on the SQLite software library and consists of a standardized and portable server-less single-file database. An optimized 3D indexing approach is adopted, where the LC-MS coordinates (retention time and m/z), along with the precursor m/z for SWATH-MS data, are used to query the database for data extraction. In comparison with XML formats, mzDB saves ~25% of storage space and improves access times by a factor of two up to 2000-fold, depending on the particular data access. Similarly, mzDB also shows slightly to significantly lower access times in comparison with other formats like mz5. Both C++ and Java implementations, converting raw or XML formats to mzDB and providing access methods, will be released under a permissive license. mzDB can be easily accessed by the SQLite C library and its drivers for all major languages, and browsed with existing dedicated GUIs. The mzDB described here can boost existing mass spectrometry data analysis pipelines, offering unprecedented performance in terms of efficiency, portability, compactness, and flexibility. PMID:25505153
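The indexing idea behind a single-file, server-less peak store can be illustrated with a plain SQLite table. The schema below is illustrative, not the actual mzDB format: indexing the LC-MS coordinates lets a small (rt, m/z) window be extracted without scanning the whole run:

```python
import sqlite3

# Illustrative schema (NOT the real mzDB format): store peaks in a
# server-less SQLite database and index the LC-MS coordinates
# (retention time, m/z) so window extraction avoids a full scan.

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE peak (rt REAL, mz REAL, intensity REAL)")
con.execute("CREATE INDEX idx_peak_rt_mz ON peak (rt, mz)")

# Synthetic run: 5000 peaks sweeping rt 60-110 s, m/z cycling 400-424.5
peaks = [(60.0 + i * 0.01, 400.0 + (i % 50) * 0.5, 1000.0) for i in range(5000)]
con.executemany("INSERT INTO peak VALUES (?, ?, ?)", peaks)

# Extract one chromatographic window: rt in [80, 85] s, m/z in [410, 411]
rows = con.execute(
    "SELECT rt, mz, intensity FROM peak "
    "WHERE rt BETWEEN 80 AND 85 AND mz BETWEEN 410 AND 411"
).fetchall()
print(len(rows))  # 30 peaks fall in this window
```

The real format goes further (3-D bounding boxes including precursor m/z for SWATH data, plus compressed binary blobs), but the query pattern is the same range scan over indexed coordinates.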

  9. Multilevel and Diverse Classrooms

    ERIC Educational Resources Information Center

    Baurain, Bradley, Ed.; Ha, Phan Le, Ed.

    2010-01-01

    The benefits and advantages of classroom practices incorporating unity-in-diversity and diversity-in-unity are what "Multilevel and Diverse Classrooms" is all about. Multilevel classrooms--also known as mixed-ability or heterogeneous classrooms--are a fact of life in ESOL programs around the world. These classrooms are often not only multilevel…

  11. Consolidating the set of known human protein-protein interactions in preparation for large-scale mapping of the human interactome

    PubMed Central

    Ramani, Arun K; Bunescu, Razvan C; Mooney, Raymond J; Marcotte, Edward M

    2005-01-01

    Background Extensive protein interaction maps are being constructed for yeast, worm, and fly to ask how the proteins organize into pathways and systems, but no such genome-wide interaction map yet exists for the set of human proteins. To prepare for studies in humans, we wished to establish tests for the accuracy of future interaction assays and to consolidate the known interactions among human proteins. Results We established two tests of the accuracy of human protein interaction datasets and measured the relative accuracy of the available data. We then developed and applied natural language processing and literature-mining algorithms to recover from Medline abstracts 6,580 interactions among 3,737 human proteins. A three-part algorithm was used: first, human protein names were identified in Medline abstracts using a discriminator based on conditional random fields, then interactions were identified by the co-occurrence of protein names across the set of Medline abstracts, filtering the interactions with a Bayesian classifier to enrich for legitimate physical interactions. These mined interactions were combined with existing interaction data to obtain a network of 31,609 interactions among 7,748 human proteins, accurate to the same degree as the existing datasets. Conclusion These interactions and the accuracy benchmarks will aid interpretation of current functional genomics data and provide a basis for determining the quality of future large-scale human protein interaction assays. Projecting from the approximately 15 interactions per protein in the best-sampled interaction set to the estimated 25,000 human genes implies more than 375,000 interactions in the complete human protein interaction network. This set therefore represents no more than 10% of the complete network. PMID:15892868
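The co-occurrence step of the mining pipeline described above can be sketched as follows. The data are toy examples, and the published pipeline additionally uses conditional-random-field name recognition and a Bayesian classifier to filter the candidate pairs:

```python
from itertools import combinations
from collections import Counter

# Sketch of the co-occurrence step (illustrative, not the published pipeline):
# count how often two protein names appear in the same abstract; pairs that
# co-occur repeatedly are candidate interactions, to be filtered downstream
# (e.g. with a Bayesian classifier enriching for physical interactions).

abstracts = [
    {"TP53", "MDM2"},          # toy "abstracts", already reduced to the
    {"TP53", "MDM2", "EGFR"},  # set of protein names they mention
    {"EGFR", "GRB2"},
    {"TP53", "MDM2"},
]

pair_counts = Counter()
for names in abstracts:
    for a, b in combinations(sorted(names), 2):
        pair_counts[(a, b)] += 1

# Candidate interactions: pairs seen together in at least 2 abstracts
candidates = {pair for pair, n in pair_counts.items() if n >= 2}
print(candidates)  # {('MDM2', 'TP53')}
```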

  12. A large proportion of asymptomatic Plasmodium infections with low and sub-microscopic parasite densities in the low transmission setting of Temotu Province, Solomon Islands: challenges for malaria diagnostics in an elimination setting

    PubMed Central

    2010-01-01

    Background Many countries are scaling up malaria interventions towards elimination. This transition changes demands on malaria diagnostics from diagnosing ill patients to detecting parasites in all carriers including asymptomatic infections and infections with low parasite densities. Detection methods suitable to local malaria epidemiology must be selected prior to transitioning a malaria control programme to elimination. A baseline malaria survey conducted in Temotu Province, Solomon Islands in late 2008, as the first step in a provincial malaria elimination programme, provided malaria epidemiology data and an opportunity to assess how well different diagnostic methods performed in this setting. Methods During the survey, 9,491 blood samples were collected and examined by microscopy for Plasmodium species and density, with a subset also examined by polymerase chain reaction (PCR) and rapid diagnostic tests (RDTs). The performances of these diagnostic methods were compared. Results A total of 256 samples were positive by microscopy, giving a point prevalence of 2.7%. The species distribution was 17.5% Plasmodium falciparum and 82.4% Plasmodium vivax. In this low transmission setting, only 17.8% of the P. falciparum and 2.9% of P. vivax infected subjects were febrile (≥38°C) at the time of the survey. A significant proportion of infections detected by microscopy, 40% and 65.6% for P. falciparum and P. vivax respectively, had parasite density below 100/µL. There was an age correlation for the proportion of parasite density below 100/µL for P. vivax infections, but not for P. falciparum infections. PCR detected substantially more infections than microscopy (point prevalence of 8.71%), indicating a large number of subjects had sub-microscopic parasitemia. The concordance between PCR and microscopy in detecting single species was greater for P. vivax (135/162) compared to P. falciparum (36/118). The malaria RDT detected the 12 microscopy and PCR positive P. falciparum infections, but failed to detect 12/13 microscopy and PCR positive P. vivax infections. Conclusion Asymptomatic malaria infections and infections with low and sub-microscopic parasite densities are highly prevalent in Temotu province where malaria transmission is low. This presents a challenge for elimination since a large proportion of the parasite reservoir will not be detected by standard active and passive case detection. Therefore effective mass screening and treatment campaigns will most likely need more sensitive assays such as a field deployable molecular based assay. PMID:20822506
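    The diagnostic comparison being made can be illustrated with a small helper (an assumed sketch, not the survey's analysis code): the sensitivity of one test against another taken as the reference, computed from sets of sample identifiers.

```python
def sensitivity(test_positive, reference_positive):
    """Fraction of reference-positive samples that the test also detects.

    Both arguments are sets of sample identifiers; the reference set
    (e.g. PCR-confirmed positives) must be non-empty.
    """
    return len(test_positive & reference_positive) / len(reference_positive)
```

    For the P. vivax figures above (1 of 13 confirmed infections detected), the RDT's sensitivity works out to 1/13, roughly 8%.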

  13. Behavior Problems in Learning Activities and Social Interactions in Head Start Classrooms and Early Reading, Mathematics, and Approaches to Learning

    ERIC Educational Resources Information Center

    Bulotsky-Shearer, Rebecca J.; Fernandez, Veronica; Dominguez, Ximena; Rouse, Heather L.

    2011-01-01

    Relations between early problem behavior in preschool classrooms and a comprehensive set of school readiness outcomes were examined for a stratified random sample (N = 256) of 4-year-old children enrolled in a large, urban school district Head Start program. A series of multilevel models examined the unique contribution of early problem behavior…

  15. New Ways of Classroom Assessment. Revised

    ERIC Educational Resources Information Center

    Brown, J. D., Ed.

    2013-01-01

    In this revised edition in the popular New Ways Series, teachers have once again been given an opportunity to show how they do assessment in their classrooms on an everyday basis. Often feeling helpless when confronted with large-scale standardized testing practices, teachers here offer classroom testing created with the direct aim of helping…

  16. Learning the Three C's: Classroom Communication Climate.

    ERIC Educational Resources Information Center

    Myers, Scott A.

    A study examined the communication climate of a graduate teaching assistant's (GTA) college classroom. Because the teaching role is often new to the GTA, establishing a communication climate may be a significant factor in classroom management. One section of a public speaking class taught by a new graduate teaching assistant at a large midwestern…

  17. Creating Learning Communities in the Classroom

    ERIC Educational Resources Information Center

    Saville, Bryan K.; Lawrence, Natalie Kerr; Jakobsen, Krisztina V.

    2012-01-01

    There are many ways to construct classroom-based learning communities. Nevertheless, the emphasis is always on cooperative learning. In this article, the authors focus on three teaching methods--interteaching, team-based learning, and cooperative learning in large, lecture-based courses--that they have used successfully to create classroom-based…

  20. Bag-Tanks for Your Classroom.

    ERIC Educational Resources Information Center

    Wulfson, Stephen E.

    1981-01-01

    Suggests using plastic bags as aquaria and terraria. Describes techniques for converting plastic sheets into aquaria, how to set them up for classroom use, and other uses for plastic bag aquaria. (DS)

  1. Integrated QSPR models to predict the soil sorption coefficient for a large diverse set of compounds by using different modeling methods

    NASA Astrophysics Data System (ADS)

    Shao, Yonghua; Liu, Jining; Wang, Meixia; Shi, Lili; Yao, Xiaojun; Gramatica, Paola

    2014-05-01

    The soil sorption coefficient (Koc) is a key physicochemical parameter for assessing the environmental risk of organic compounds. To predict the soil sorption coefficient in a more effective and economical way, quantitative structure-property relationship (QSPR) models were developed here on a large, diverse dataset of 964 non-ionic organic compounds. Multiple linear regression (MLR), local lazy regression (LLR) and least squares support vector machine (LS-SVM) were utilized to develop QSPR models based on the four most relevant theoretical molecular descriptors selected by the genetic algorithms-variable subset selection (GA-VSS) procedure. The QSPR development strictly followed the OECD principles for QSPR model validation, so great attention was paid to internal and external validation, applicability domain and mechanistic interpretation. The obtained results indicate that the LS-SVM model performed better than the MLR and LLR models. For the best LS-SVM model, the correlation coefficient (R2) for the training set was 0.913 and the concordance correlation coefficient (CCC) for the prediction set was 0.917; the root-mean-square errors (RMSE) were 0.330 and 0.426, respectively. The results of internal and external validation, together with the applicability domain analysis, indicate that the QSPR models proposed in this work are predictive and could provide a useful tool for predicting the soil sorption coefficient of new compounds.
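    The MLR step of a QSPR workflow like this can be sketched with ordinary least squares (a generic illustration on made-up descriptors, not the authors' model; the GA-VSS descriptor selection and the LS-SVM variant are outside this sketch):

```python
import numpy as np

def fit_mlr(X, y):
    """Fit log Koc = b0 + b1*d1 + ... + bk*dk for a descriptor matrix X (n x k).

    Returns the coefficient vector (intercept first) and the training R^2.
    """
    A = np.column_stack([np.ones(len(X)), X])      # prepend intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares solution
    resid = y - A @ coef
    ss_tot = (y - y.mean()) @ (y - y.mean())
    r2 = 1.0 - (resid @ resid) / ss_tot
    return coef, r2
```

    External validation statistics such as CCC and RMSE on a held-out prediction set would then be computed from `A_test @ coef` against the measured values.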

  2. The Classroom Animal: Crickets.

    ERIC Educational Resources Information Center

    Kramer, David C.

    1985-01-01

    Suggests using crickets for classroom activities, providing background information on their anatomy and reproduction and tips on keeping individual organisms or a breeding colony in the classroom. (JN)

  3. Sharp Interface Immersed-Boundary/Level-Set Cartesian Grid Method for Large-Eddy Simulation of Two-Phase Flows with Surface-Piercing Moving Bodies

    NASA Astrophysics Data System (ADS)

    Yang, Jianming; Stern, Frederick

    2007-11-01

    A sharp interface Cartesian grid method for the large-eddy simulation of two-phase flows interacting with surface-piercing moving bodies is presented. The method is based on a sharp interface immersed boundary formulation for fluid flows with moving boundaries and a level set based ghost fluid method for two-phase interface treatment. A four-step fractional step method is adopted and a Lagrangian dynamic Smagorinsky subgrid-scale model is used for large-eddy simulations. The combination of immersed boundary method for solid/fluid boundaries and ghost-fluid method for fluid/fluid interfaces is discussed in detail. A variety of test cases with different scales ranging from bubble dynamics to ship hydrodynamics are performed for verification and validation purpose. Several examples of interest such as water exit and entry of a circular cylinder, landslide generated waves, and ship waves are demonstrated to showcase the accuracy and efficiency of our method. Approaches for extending it to high Reynolds number ship flows by means of wall-layer modeling are also discussed.

  4. Household malaria knowledge and its association with bednet ownership in settings without large–scale distribution programs: Evidence from rural Madagascar

    PubMed Central

    Krezanoski, Paul J.; Tsai, Alexander C.; Hamer, Davidson H.; Comfort, Alison B.; Bangsberg, David R.

    2014-01-01

    Background Insecticide-treated bednets are effective at preventing malaria. This study focuses on household-level factors that are associated with bednet ownership in a rural area of Madagascar which had not been a recipient of large-scale ITN distribution. Methods Data were gathered on individual and household characteristics, malaria knowledge, household assets and bednet ownership. Principal components analysis was used to construct both a wealth index based on household assets and a malaria knowledge index based on responses to questions about malaria. Bivariate and multivariate regressions were used to determine predictors of household bednet ownership and malaria knowledge. Results Forty-seven of 560 households (8.4%) owned a bednet. In multivariate analysis, a higher level of malaria knowledge among household members was the only variable significantly associated with bednet ownership (odds ratio 3.72). Conclusions In a setting of limited supply of affordable bednets, malaria knowledge was associated with an increased probability of household bednet ownership. Further studies should determine how such malaria knowledge evolves and whether malaria-specific education programs could help overcome the barriers to bednet ownership among at-risk households living outside the reach of large-scale bednet distribution programs. PMID:24976960
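    The principal-components wealth index can be sketched as follows (a minimal version of the standard asset-index construction, with a toy asset matrix; the actual survey used more variables and a separate knowledge index built the same way):

```python
import numpy as np

def wealth_index(assets):
    """First-principal-component asset index for households.

    assets: (n_households, n_assets) 0/1 ownership matrix. Constant asset
    columns (owned by everyone or no one) should be dropped beforehand,
    since standardization divides by each column's standard deviation.
    """
    X = np.asarray(assets, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize each asset
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending eigenvalues
    w = eigvecs[:, -1]                            # first principal component
    if w.sum() < 0:                               # sign convention:
        w = -w                                    # more assets -> higher index
    return X @ w
```

    The resulting scores are relative: they are centered on zero, so a household's index is only meaningful compared with others in the same sample.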

  5. Structural Analysis in the Classroom

    ERIC Educational Resources Information Center

    Gage, Nicholas A.; Lewis, Timothy J.

    2010-01-01

    The purpose of this article is to describe an applied method of assessing and manipulating environmental factors influencing student behavior. The assessment procedure is called structural analysis (SA) and can be a part of a functional behavioral assessment (FBA) process or a stand-alone set of procedures for teachers to use in their classrooms…

  6. Classroom Meetings: A Program Model.

    ERIC Educational Resources Information Center

    Frey, Andy; Doyle, Hallie Davis

    2001-01-01

    Describes a model for classroom meetings in an elementary school setting. Focuses on enhancing children's communication and problem-solving skills for typical students and those identified through special education. The purpose of the meetings is to provide a nurturing climate for the learning of social skills that the children can use in the…

  7. Classroom Culture Promotes Academic Resiliency

    ERIC Educational Resources Information Center

    DiTullio, Gina

    2014-01-01

    Resiliency is what propels many students to continue moving forward under difficult learning and life conditions. We intuitively think that such resilience is a character quality that cannot be taught. On the contrary, when a teacher sets the right conditions and culture for it in the classroom by teaching collaboration and communication skills,…

  9. Getting Started in Classroom Computing.

    ERIC Educational Resources Information Center

    Ahl, David H.

    Written for secondary students, this booklet provides an introduction to several computer-related concepts through a set of six classroom games, most of which can be played with little more than a sheet of paper and a pencil. The games are: 1) SECRET CODES--introduction to binary coding, punched cards, and paper tape; 2) GUESS--efficient methods…

  10. Flipped Classroom Modules for Large Enrollment General Chemistry Courses: A Low Barrier Approach to Increase Active Learning and Improve Student Grades

    ERIC Educational Resources Information Center

    Eichler, Jack F.; Peeples, Junelyn

    2016-01-01

    In the face of mounting evidence revealing active learning approaches result in improved student learning outcomes compared to traditional passive lecturing, there is a growing need to change the way instructors teach large introductory science courses. However, a large proportion of STEM faculty continues to use traditional instructor-centered…

  11. Global Internet Video Classroom: A Technology Supported Learner-Centered Classroom

    ERIC Educational Resources Information Center

    Lawrence, Oliver

    2010-01-01

    The Global Internet Video Classroom (GIVC) Project connected Chicago Civil Rights activists of the 1960s with Cape Town Anti-Apartheid activists of the 1960s in a classroom setting where learners from Cape Town and Chicago engaged activists in conversations about their motivation, principles, and strategies. The project was launched in order to…

  13. Classroom Management and Teachers' Coping Strategies: Inside Classrooms in Australia, China and Israel

    ERIC Educational Resources Information Center

    Romi, Shlomo; Lewis, Ramon; Roache, Joel

    2013-01-01

    This paper discusses the degree to which recently reported relationships between the classroom management techniques and coping styles of Australian teachers apply in two other national settings: China and Israel. Little is known about which teacher characteristics relate to their approach to classroom management, although researchers in Australia…

  15. All Together Now: Measuring Staff Cohesion in Special Education Classrooms

    PubMed Central

    Kratz, Hilary E.; Locke, Jill; Piotrowski, Zinnia; Ouellette, Rachel R.; Xie, Ming; Stahmer, Aubyn C.; Mandell, David S.

    2015-01-01

    This study sought to validate a new measure, the Classroom Cohesion Survey (CCS), designed to examine the relationship between teachers and classroom assistants in autism support classrooms. Teachers, classroom assistants, and external observers showed good inter-rater agreement on the CCS and good internal consistency for all scales. Simple factor structures were found for both teacher- and classroom assistant–rated scales, with one-factor solutions for both scales. Paired t tests revealed that on average, classroom assistants rated classroom cohesion stronger than teachers. The CCS may be an effective tool for measuring cohesion between classroom staff and may have an important impact on various clinical and implementation outcomes in school settings. PMID:26213443

  16. Classroom Management. Brief

    ERIC Educational Resources Information Center

    National Education Association Research Department, 2006

    2006-01-01

    In learning-centered classrooms, the emphasis of classroom management shifts from maintaining behavioral control to fostering student engagement and self-regulation as well as community responsibility. This brief describes classroom management in "learning centered" classrooms, where practices are consistent with recent research knowledge about…

  17. Comparison of Two Methods for Estimating the Sampling-Related Uncertainty of Satellite Rainfall Averages Based on a Large Radar Data Set

    NASA Technical Reports Server (NTRS)

    Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.

    2002-01-01

    The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
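    The non-parametric flavor of this estimate can be sketched directly (an illustrative resampling of a gap-free rain series, assumed names throughout; the paper's actual analysis and its scaling law are more involved):

```python
import numpy as np

def sampling_uncertainty(rain, step):
    """RMS error of sparse temporal sampling of a rain-rate series.

    rain: 1-D array of (e.g. hourly) rain rates with no gaps.
    step: sampling interval in array units (e.g. 6 for 6-hourly visits).
    Each phase offset 0..step-1 gives one possible "satellite" subsample;
    the spread of their means around the fully sampled mean is the
    sampling-related uncertainty.
    """
    full_mean = rain.mean()
    sub_means = np.array([rain[off::step].mean() for off in range(step)])
    return float(np.sqrt(np.mean((sub_means - full_mean) ** 2)))
```

    Repeating this over many space/time domains and sampling intervals is what allows the uncertainty to be characterized as a function of domain size, sampling frequency, and rainfall statistics.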

  18. Revoicing Classrooms: A Spatial Manifesto

    ERIC Educational Resources Information Center

    Fisher, Kenn

    2004-01-01

    Why is the physical learning environment in schools largely ignored by teachers within pedagogical practice? The cellular classroom has remained seemingly immutable since the Industrial Revolution, with spatiality playing a silent and subconscious role in schooling other than related to concerns around surveillance. Previous studies have shown…

  19. Consistency of Toddler Engagement across Two Settings

    ERIC Educational Resources Information Center

    Aguiar, Cecilia; McWilliam, R. A.

    2013-01-01

    This study documented the consistency of child engagement across two settings, toddler child care classrooms and mother-child dyadic play. One hundred twelve children, aged 14-36 months (M = 25.17, SD = 6.06), randomly selected from 30 toddler child care classrooms from the district of Porto, Portugal, participated. Levels of engagement were…

  1. Mendel in the Modern Classroom

    NASA Astrophysics Data System (ADS)

    Smith, Mike U.; Gericke, Niklas M.

    2015-01-01

    Mendel is an icon in the history of genetics and part of our common culture and modern biology instruction. The aim of this paper is to summarize the place of Mendel in the modern biology classroom. In the present article we will identify key issues that make Mendel relevant in the classroom today. First, we recount some of the historical controversies that have relevance to modern curricular design, such as Fisher's (Ann Sci 1:115-137, 1936/2008) claim that Mendel's data were too good to be true. We also address questions about Mendel's status as the father of genetics as well as questions about the sequencing of Mendel's work in genetics instruction in relation to modern molecular genetics and evolution. Next, we present a systematic set of examples of research based approaches to the use of Mendel in the modern classroom along with criticisms of these designs and questions about the historical accuracy of the story of Mendel as presented in the typical classroom. Finally, we identify gaps in our understanding in need of further study and present a selected set of resources that, along with the references cited, should be valuable to science educators interested in further study of the story of Mendel.

  2. Photonics Explorer: revolutionizing photonics in the classroom

    NASA Astrophysics Data System (ADS)

    Prasad, Amrita; Debaes, Nathalie; Cords, Nina; Fischer, Robert; Vlekken, Johan; Euler, Manfred; Thienpont, Hugo

    2012-10-01

    The `Photonics Explorer' is a unique intra-curricular optics kit designed to engage, excite and educate secondary school students about the fascination of working with light - hands-on, in their own classrooms. Developed with a pan European collaboration of experts, the kit equips teachers with class sets of experimental material provided within a supporting didactic framework, distributed in conjunction with teacher training courses. The material has been specifically designed to integrate into European science curricula. Each kit contains robust and versatile components sufficient for a class of 25-30 students to work in groups of 2-3. The didactic content is based on guided inquiry-based learning (IBL) techniques with a strong emphasis on hands-on experiments, team work and relating abstract concepts to real world applications. The content has been developed in conjunction with over 30 teachers and experts in pedagogy to ensure high quality and ease of integration. It is currently available in 7 European languages. The Photonics Explorer allows students not only to hone their essential scientific skills but also to really work as scientists and engineers in the classroom. Thus, it aims to encourage more young people to pursue scientific careers and avert the imminent lack of scientific workforce in Europe. 50 Photonics Explorer kits have been successfully tested in 7 European countries with over 1500 secondary school students. The positive impact of the kit in the classroom has been qualitatively and quantitatively evaluated. A non-profit organisation, EYESTvzw [Excite Youth for Engineering Science and Technology], is responsible for the large scale distribution of the Photonics Explorer.

  3. Twelve tips for "flipping" the classroom.

    PubMed

    Moffett, Jennifer

    2015-04-01

    The flipped classroom is a pedagogical model in which the typical lecture and homework elements of a course are reversed. The following tips outline the steps involved in making a successful transition to a flipped classroom approach. The tips are based on the available literature alongside the author's experience of using the approach in a medical education setting. Flipping a classroom has a number of potential benefits, for example increased educator-student interaction, but must be planned and implemented carefully to support effective learning. PMID:25154646

  4. Connecting classrooms to the Milky Way

    NASA Astrophysics Data System (ADS)

    Salomé, P.; Radiguet, A.; Albert, B.; Batrung, M.; Caillat, M.; Gheudin, M.; Libert, Y.; Ferlet, R.; Maestrini, A.; Melchior, A.-L.; Munier, J.-M.; Rudolph, A.

    2012-12-01

    'Connecting Classrooms to the Milky Way' is a project of the EU-HOU Consortium (Hands-On-Universe, Europe), involving 11 European countries. It is supported by the Lifelong Learning Programme of the European Community. The main goal of this project was to set up the first network of small radio telescopes dedicated to education across Europe and directly accessible from a simple Web interface. Any classroom connected to the Internet via any Web browser can now remotely control one of the radio telescopes and observe the HI emission coming from our Galaxy. The interface also provides the users with simple tools to analyse the data: (i) derive the Milky Way rotation curve and (ii) map the HI distribution in the spiral arms. A special emphasis has been made to enable the young generation to understand the challenges of these wavelengths, which are currently at the front line of the new instruments with the development of the ALMA (Atacama Large Millimeter Array) and SKA (Square Kilometre Array) projects.
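    The rotation-curve exercise such an interface automates rests on the tangent-point relation; a minimal sketch, with assumed standard values for the Sun's galactocentric distance and orbital speed:

```python
import math

R0 = 8.5    # kpc: assumed Sun-to-Galactic-centre distance (IAU 1985 value)
V0 = 220.0  # km/s: assumed solar orbital speed about the Galactic centre

def tangent_point(l_deg, v_max):
    """Tangent-point method for Galactic quadrant I (0 < l < 90 deg).

    The largest line-of-sight HI velocity v_max at longitude l comes from
    the tangent point at radius R = R0 sin(l), where the circular speed is
    V = v_max + V0 sin(l). Returns (R in kpc, V in km/s).
    """
    sin_l = math.sin(math.radians(l_deg))
    return R0 * sin_l, v_max + V0 * sin_l
```

    Repeating this for HI spectra at a range of longitudes yields (R, V) points that trace out the Milky Way rotation curve.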

  5. The contribution of children's self-regulation and classroom quality to children's adaptive behaviors in the kindergarten classroom.

    PubMed

    Rimm-Kaufman, Sara E; Curby, Tim W; Grimm, Kevin J; Nathanson, Lori; Brock, Laura L

    2009-07-01

    In this study, the authors examined the extent to which children's self-regulation upon kindergarten entrance and classroom quality in kindergarten contributed to children's adaptive classroom behavior. Children's self-regulation was assessed using a direct assessment upon entrance into kindergarten. Classroom quality was measured on the basis of multiple classroom observations during the kindergarten year. Children's adaptive classroom behavior in kindergarten was assessed through teacher report and classroom observations: Teachers rated children's cognitive and behavioral self-control and work habits during the spring of the kindergarten year; observers rated children's engagement and measured off-task behavior at 2-month intervals from November to May. Hierarchical linear models revealed that children's self-regulation upon school entry in a direct assessment related to teachers' report of behavioral self-control, cognitive self-control, and work habits in the spring of the kindergarten year. Classroom quality, particularly teachers' effective classroom management, was linked to children's greater behavioral and cognitive self-control, children's higher behavioral engagement, and less time spent off-task in the classroom. Classroom quality did not moderate the relation between children's self-regulation upon school entry and children's adaptive classroom behaviors in kindergarten. The discussion considers the implications of classroom management for supporting children's early development of behavioral skills that are important in school settings. PMID:19586173

  6. An Observational Study of Instructional and Curricular Practices Used with Gifted and Talented Students in Regular Classrooms. Research Monograph 93104.

    ERIC Educational Resources Information Center

    Westberg, Karen L.; And Others

    This report describes one part of the Classroom Practices Study, focusing on systematic observations of gifted and talented students in 46 third and fourth grade classrooms. The observations were designed to determine if and how teachers meet the needs of gifted and talented students in regular classroom settings. The Classroom Practices Record…

  7. Using Water-Testing Data Sets.

    ERIC Educational Resources Information Center

    Varrella, Gary F.

    1994-01-01

    Advocates an approach to teaching environmentally related studies based on constructivism. Presents an activity that makes use of data on chemicals in the water supply, and discusses obtaining and using data sets in the classroom. (LZ)

  8. Using short forms of classroom climate instruments to assess and improve classroom psychosocial environment

    NASA Astrophysics Data System (ADS)

    Fraser, Barry J.; Fisher, Darrell L.

    Despite international interest in research in the area of classroom environment, very little attention has been given to exploring how science teachers might apply ideas from the field of classroom environment in guiding practical improvements in science classrooms. In order to facilitate science teachers' use of classroom climate assessments, we developed economical short forms of the Classroom Environment Scale (CES), Individualized Classroom Environment Questionnaire (ICEQ), and My Class Inventory (MCI) which contain only approximately 25 items each and which are amenable to easy hand scoring. When each instrument was administered to a large sample of science classes, results supported each scale's internal consistency reliability, discriminant validity, and ability to differentiate between the perceptions of students in different classrooms. The methods for improving classrooms are illustrated by reporting some case studies of change attempts. For example, when the CES was used in an attempt to improve the environment of a ninth grade science class, the steps followed were, first, assessment of actual and preferred classroom environment in order to identify discrepancies between actual and preferred environment and, second, introducing interventions aimed at reducing these discrepancies. The interesting finding was that significant improvements occurred for the two dimensions on which change had been attempted.

  9. Self-Contained Classrooms. Research Brief

    ERIC Educational Resources Information Center

    Walker, Karen

    2009-01-01

    Determining the ideal academic setting in which students can be successful continues to be one of the primary goals of educators. Is there a best classroom structure in which students can be successful? Although there is research on the academic gains in the block schedule and in traditional departmentalized settings, both of which are common in…

  10. Edifying Teachers in the Networked Classroom.

    ERIC Educational Resources Information Center

    Weisser, Christian

    Most instructors today feel that using computers in classrooms to create electronic forums automatically results in a more egalitarian setting, but technology can become an effective cloak for otherwise oppressive practices. These settings can potentially reinscribe dominant ideologies, stifling students rather than empowering them. These…

  11. Allowing "Artistic Agency" in the Elementary Classroom

    ERIC Educational Resources Information Center

    Rufo, David

    2011-01-01

    The author was interested in seeing what would happen if children were given more latitude when making art in school. In January 2009, he began by setting up environments in his classroom wherein he hoped his students would feel free to create self-initiated forms of artmaking. Two times each week an hour was set aside for an activity called Open…

  12. Inside the Primary Classroom.

    ERIC Educational Resources Information Center

    Simon, Brian

    1980-01-01

    Presents some of the findings of the ORACLE research program (Observational Research and Classroom Learning Evaluation), a detailed observational study of teacher-student interaction, teaching styles, and management methods within a sample of primary classrooms. (Editor/SJL)

  13. Creating Respectful Classroom Environments

    ERIC Educational Resources Information Center

    Miller, Regina; Pedro, Joan

    2006-01-01

    Respect is a critical variable in education. It is critical to each individual child in the classroom environment as well as to the teaching and learning that takes place in the classroom. Children learn by example. Where do they get their examples? This article explores the parameters of teaching and encouraging respect in classrooms for young…

  14. Classroom Use and Utilization.

    ERIC Educational Resources Information Center

    Fink, Ira

    2002-01-01

    Discusses how classrooms are distributed by size on a campus, how well they are used, and how their use changes with faculty and student needs and desires. Details how to analyze classroom space, use, and utilization, taking into account such factors as scheduling and classroom stations. (EV)

  15. Analysing Bilingual Classroom Discourse

    ERIC Educational Resources Information Center

    Hasan, Ali S.

    2006-01-01

    The present paper analyses and evaluates spoken discourse in the bilingual classroom at Damascus University. It looks at the mechanism of classroom interaction: the use of questions, initiations, repetitions and expansions. Although this paper deals with classroom interaction at Damascus University, it is believed that the results arrived at may…

  16. Observing Classroom Practice

    ERIC Educational Resources Information Center

    Danielson, Charlotte

    2012-01-01

    Classroom observation is a crucial aspect of any system of teacher evaluation. No matter how skilled a teacher is in other aspects of teaching--such as careful planning, working well with colleagues, and communicating with parents--if classroom practice is deficient, that individual cannot be considered a good teacher. Classroom observations can…

  17. Competition in the Classroom

    ERIC Educational Resources Information Center

    Jameson, Daphne

    2007-01-01

    In this article, the author shares the strategy she adopted to even out the participation among her multicultural students during their classroom discussions. The author realized that her students had different concepts about the classroom and different philosophies about competition. For the Americans and Indians, the classroom was a site of…

  18. Classroom Management. TESOL Classroom Practice Series

    ERIC Educational Resources Information Center

    Farrell, Thomas S. C., Ed.

    2008-01-01

    This series captures the dynamics of the contemporary ESOL classroom. It showcases state-of-the-art curricula, materials, tasks, and activities reflecting emerging trends in language education and seeks to build localized language teaching and learning theories based on teachers' and students' unique experiences in and beyond the classroom. Each…

  19. Classroom Strategies: Classroom Management Systems. Volume 3.

    ERIC Educational Resources Information Center

    Speiss, Madeleine F.; And Others

    Classroom management is defined as procedures for arranging the classroom environment so that children learn what the teacher wants to teach them in the healthiest and most effective way possible. The Southwestern Cooperative Educational Laboratory presents a discussion of these procedures as they relate to social controls and components of…

  20. Classrooms 2000: Innovative Approaches to Classroom Technology.

    ERIC Educational Resources Information Center

    Schoomer, Elia

    2000-01-01

    Describes the next generation of technology classrooms based on experiences at Lehigh University (Pennsylvania). Topics include learner-centered rather than instructor-centered instruction; size; interactivity; and technology features, emphasizing flexibility and interactive technology. An appendix lists selected classroom technology Web sites and…