Science.gov

Sample records for large classroom setting

  1. Calibrated Peer Review: A New Tool for Integrating Information Literacy Skills in Writing-Intensive Large Classroom Settings

    ERIC Educational Resources Information Center

    Fosmire, Michael

    2010-01-01

    Calibrated Peer Review[TM] (CPR) is a program that can significantly enhance the ability to integrate intensive information literacy exercises into large classroom settings. CPR is founded on a solid pedagogic base for learning, and it is formulated in such a way that information skills can easily be inserted. However, there is no mention of its…

  2. Active Learning in a Large Medical Classroom Setting for Teaching Renal Physiology

    ERIC Educational Resources Information Center

    Dietz, John R.; Stevenson, Frazier T.

    2011-01-01

    In this article, the authors describe an active learning exercise which has been used to replace some lecture hours in the renal portion of an integrated, organ system-based curriculum for first-year medical students. The exercise takes place in a large auditorium with ~150 students. The authors, who are faculty members, lead the discussions,…

  3. Impact of Problem-Based Learning in a Large Classroom Setting: Student Perception and Problem-Solving Skills

    ERIC Educational Resources Information Center

    Klegeris, Andis; Hurren, Heather

    2011-01-01

    Problem-based learning (PBL) can be described as a learning environment where the problem drives the learning. This technique usually involves learning in small groups, which are supervised by tutors. It is becoming evident that PBL in a small-group setting has a robust positive effect on student learning and skills, including better…

  4. Controlling Setting Events in the Classroom

    ERIC Educational Resources Information Center

    Chan, Paula E.

    2016-01-01

    Teachers face the challenging job of differentiating instruction for the diverse needs of their students. This task is difficult enough with happy students who are eager to learn; unfortunately students often enter the classroom in a bad mood because of events that happened outside the classroom walls. These events--called setting events--can…

  5. Improvement in generic problem-solving abilities of students by use of tutor-less problem-based learning in a large classroom setting.

    PubMed

    Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather

    2013-01-01

    Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such techniques over classical teaching styles. Previously, we demonstrated that introduction of tutor-less PBL in a large third-year biochemistry undergraduate class increased student satisfaction and attendance. The current study assessed the generic problem-solving abilities of students from the same class at the beginning and end of the term, and compared student scores with similar data obtained in three classes not using PBL. Two generic problem-solving tests of equal difficulty were administered such that students took different tests at the beginning and the end of the term. Blinded marking showed a statistically significant 13% increase in the test scores of the biochemistry students exposed to PBL, while no trend toward significant change in scores was observed in any of the control groups not using PBL. Our study is among the first to demonstrate that use of tutor-less PBL in a large classroom leads to statistically significant improvement in generic problem-solving skills of students.
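    The pre/post comparison this abstract describes can be sketched as follows. The scores below are simulated for illustration (the study's actual data are not reproduced here), with an end-of-term gain built into the simulation; the study itself used blinded marking of two equally difficult tests.

```python
import numpy as np

def welch_t(a, b):
    """Welch's t statistic for comparing two groups with unequal variances."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    se2 = a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(se2)

# Hypothetical problem-solving test scores (percent) for one class at the
# beginning and end of term -- simulated, not the study's data.
rng = np.random.default_rng(0)
pre = rng.normal(60.0, 10.0, 80)
post = pre + rng.normal(8.0, 6.0, 80)      # simulated end-of-term gain

t = welch_t(post, pre)                      # test statistic for the gain
rel_gain = (post.mean() - pre.mean()) / pre.mean()   # relative improvement
```

    A large positive t indicates the score increase is unlikely to be chance, which is the kind of evidence the abstract reports.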

  6. Large Numbers and Calculators: A Classroom Activity.

    ERIC Educational Resources Information Center

    Arcavi, Abraham; Hadas, Nurit

    1989-01-01

    Described is an activity demonstrating how a scientific calculator can be used in a mathematics classroom to introduce new content while studying a conventional topic. Examples of reading and writing large numbers, and reading hidden results are provided. (YP)

  7. Implementing iPads in the Inclusive Classroom Setting

    ERIC Educational Resources Information Center

    Maich, Kimberly; Hall, Carmen

    2016-01-01

    This column provides practical suggestions to help guide teachers in utilizing classroom sets of iPads. Following a brief introduction to tablet technology in inclusive classrooms and the origin of these recommendations from a case study focus group, important elements of setting up classroom iPad use, from finding funding to teaching apps, are…

  8. Climate Setting in Second-Language Classrooms.

    ERIC Educational Resources Information Center

    Evans-Harvey, Cher

    1993-01-01

    Discusses the creation of a positive classroom climate, examines four dimensions of classroom climate (physical, academic, organizational, and social-emotional), and reviews techniques that teachers can use to promote a positive classroom climate. Teachers need to get to know their students, discuss the course objectives with their students, and…

  9. Tangential Floor in a Classroom Setting

    ERIC Educational Resources Information Center

    Marti, Leyla

    2012-01-01

    This article examines floor management in two classroom sessions: a task-oriented computer lesson and a literature lesson. Recordings made in the computer lesson show the organization of floor when a task is given to students. Temporary or "incipient" side floors (Jones and Thornborrow, 2004) emerge beside the main floor. In the literature lesson,…

  10. A Practical Setting of Distance Learning Classroom.

    ERIC Educational Resources Information Center

    Wang, Shousan; Buck, Lawrence

    1996-01-01

    Describes a distance-learning classroom developed and used by Central Connecticut State University for nurse training, educational statistics, mathematics, and technology courses. Discusses initial engineering, video cameras, video source switching, lighting, audio, and other technical and related aspects. Block diagrams and lists of equipment for…

  11. Student Engagement and Success in the Large Astronomy 101 Classroom

    NASA Astrophysics Data System (ADS)

    Jensen, J. B.

    2014-07-01

    The large auditorium classroom presents unique challenges to maintaining student engagement. During the fall 2012 semester, I adopted several specific strategies for increasing student engagement and reducing anonymity with the goal of maximizing student success in the large class. I measured attendance and student success in two classes, one with 300 students and one with 42, but otherwise taught as similarly as possible. While the students in the large class probably did better than they would have in a traditional lecture setting, attendance was still significantly lower in the large class, resulting in lower student success than in the small control class by all measures. I will discuss these results and compare them to classes in previous semesters, including other small classes and large Distance Education classes conducted live over a remote television link.


  12. Teaching Music in the Urban Classroom Set

    ERIC Educational Resources Information Center

    Frierson-Campbell, Carol Ed.

    2006-01-01

    The change needed in urban music education not only relates to the idea that music should be at the center of the curriculum; rather, it is that culturally relevant music should be a creative force at the center of reform in urban education. This set is the start of a national-level conversation aimed at making that goal a reality. In both…

  13. Observation Instrument of Play Behaviour in a Classroom Setting

    ERIC Educational Resources Information Center

    Berkhout, Louise; Hoekman, Joop; Goorhuis-Brouwer, Sieneke M.

    2012-01-01

    The objective of this study was to develop an instrument to observe the play behaviour of a whole group of children from four to six years of age in a classroom setting on the basis of video recording. The instrument was developed in collaboration with experienced teachers and experts on play. Categories of play were derived from the literature…

  14. Social Studies Instruction in a Non-Classroom Setting.

    ERIC Educational Resources Information Center

    Murphy, Margaret M.

    Certain areas in the social studies can be effectively taught in a non-classroom setting. This experiment determined if, in a supermarket situation, consumer preferences (as measured in sales figures and augmented by questionnaire data) could be altered by the addition of nutritional information to the labels of sixteen items which had moderate…

  15. Enhancing Feedback via Peer Learning in Large Classrooms

    ERIC Educational Resources Information Center

    Zher, Ng Huey; Hussein, Raja Maznah Raja; Saat, Rohaida Mohd

    2016-01-01

    Feedback has been lauded as a key pedagogical tool in higher education. Unfortunately, the value of feedback falls short when being carried out in large classrooms. In this study, strategies for sustaining feedback in large classroom based on peer learning are explored. All the characteristics identified within the concept of peer learning were…

  16. Examining the Effectiveness of Team-Based Learning (TBL) in Different Classroom Settings

    ERIC Educational Resources Information Center

    Yuretich, Richard F.; Kanner, Lisa C.

    2015-01-01

    The problem of effective learning in college classrooms, especially in a large lecture setting, has been a topic of discussion for a considerable span of time. Most efforts to improve learning incorporate various forms of student-active learning, such as in-class investigations or problems, group discussions, collaborative examinations and…

  17. Radial sets: interactive visual analysis of large overlapping sets.

    PubMed

    Alsallakh, Bilal; Aigner, Wolfgang; Miksch, Silvia; Hauser, Helwig

    2013-12-01

    In many applications, data tables contain multi-valued attributes that often store the memberships of the table entities to multiple sets such as which languages a person masters, which skills an applicant documents, or which features a product comes with. With a growing number of entities, the resulting element-set membership matrix becomes very rich in information about how these sets overlap. Many analysis tasks targeted at set-typed data are concerned with these overlaps as salient features of such data. This paper presents Radial Sets, a novel visual technique to analyze set memberships for a large number of elements. Our technique uses frequency-based representations to enable quickly finding and analyzing different kinds of overlaps between the sets, and relating these overlaps to other attributes of the table entities. Furthermore, it enables various interactions to select elements of interest, find out if they are over-represented in specific sets or overlaps, and if they exhibit a different distribution for a specific attribute compared to the rest of the elements. These interactions allow formulating highly-expressive visual queries on the elements in terms of their set memberships and attribute values. As we demonstrate via two usage scenarios, Radial Sets enable revealing and analyzing a multitude of overlapping patterns between large sets, beyond the limits of state-of-the-art techniques. PMID:24051816
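    The element-set membership matrix underlying such overlap analyses is easy to build directly; a minimal sketch with hypothetical memberships (this illustrates the data structure only, not the Radial Sets visualization itself):

```python
import numpy as np

# Hypothetical element-set memberships (e.g., languages mastered per person).
sets = ["EN", "FR", "DE"]
memberships = [
    {"EN"}, {"EN", "FR"}, {"FR"}, {"EN", "FR", "DE"},
    {"DE"}, {"EN", "DE"}, {"EN", "FR"},
]

# Binary element-set membership matrix: rows = elements, columns = sets.
M = np.array([[s in m for s in sets] for m in memberships], dtype=int)

# Pairwise overlap counts: entry (i, j) = number of elements in sets i and j;
# the diagonal holds the set sizes.
overlap = M.T @ M

# Degree histogram: how many elements belong to exactly k sets -- the kind of
# frequency-based summary that overlap visualizations are built on.
degrees = M.sum(axis=1)
hist = np.bincount(degrees)
```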

  18. Teacher and Student Research Using Large Data Sets

    NASA Astrophysics Data System (ADS)

    Croft, S. K.; Pompea, S. M.; Sparks, R. T.

    2005-12-01

    One of the objectives of teacher research experiences is to immerse the teacher in an authentic research situation to help the teacher understand what real research is all about: "to do science as scientists do." Experiences include doing experiments in laboratories, gathering data out in the field, and observing at professional observatories. However, a rapidly growing area of scientific research is in "data mining" increasingly large public data archives. In the earth and space sciences, such large archives are built around data from Landsat 7, the Sloan Digital Sky Survey, and in about seven years, the Large Synoptic Survey Telescope. The LSST will re-photograph the entire night sky every three days, resulting in a data flow of about 20 terabytes per night. The resulting LSST archive will make even simple storage and retrieval a huge challenge for professional scientists. It will be a much greater challenge to help K-12 teachers use such gargantuan files and collections of data effectively in the classroom and to understand and begin to practice the new research procedures involved in data mining. At NOAO we are exploring ways of using large data sets in formal educational settings like classrooms, and public settings like planetariums and museums. In our existing professional development programs, such as our Teacher leaders in Research Based Science Education, we have introduced teachers to research via on-site observing experiences and partnerships with active astronomers. To successfully initiate research in the classroom, we have found that teachers need training in specific science content, use of specialized software to work with the data, development of research questions and objectives, and explicit pedagogical strategies for classroom use. Our research projects are well defined, though not "canned," and incorporate specific types of data, such as solar images. These data can be replaced with new data from an archive for the classroom research

  19. Observations of Children’s Interactions with Teachers, Peers, and Tasks across Preschool Classroom Activity Settings

    PubMed Central

    Booren, Leslie M.; Downer, Jason T.; Vitiello, Virginia E.

    2014-01-01

    This descriptive study examined classroom activity settings in relation to children’s observed behavior during classroom interactions, child gender, and basic teacher behavior within the preschool classroom. 145 children were observed for an average of 80 minutes during 8 occasions across 2 days using the inCLASS, an observational measure that conceptualizes behavior into teacher, peer, task, and conflict interactions. Findings indicated that on average children’s interactions with teachers were higher in teacher-structured settings, such as large group. On average, children’s interactions with peers and tasks were more positive in child-directed settings, such as free choice. Children experienced more conflict during recess and routines/transitions. Finally, gender differences were observed within small group and meals. The implications of these findings might encourage teachers to be thoughtful and intentional about what types of support and resources are provided so children can successfully navigate the demands of particular settings. These findings are not meant to discourage certain teacher behaviors or imply value of certain classroom settings; instead, by providing an evidence-based picture of the conditions under which children display the most positive interactions, teachers can be more aware of choices within these settings and have a powerful way to assist in professional development and interventions. PMID:25717282

  20. Using Flipped Classroom Approach to Explore Deep Learning in Large Classrooms

    ERIC Educational Resources Information Center

    Danker, Brenda

    2015-01-01

    This project used two Flipped Classroom approaches to stimulate deep learning in large classrooms during the teaching of a film module as part of a Diploma in Performing Arts course at Sunway University, Malaysia. The flipped classes utilized either a blended learning approach where students first watched online lectures as homework, and then…

  1. Silent Students' Participation in a Large Active Learning Science Classroom

    ERIC Educational Resources Information Center

    Obenland, Carrie A.; Munson, Ashlyn H.; Hutchinson, John S.

    2012-01-01

    Active learning in large science classrooms furthers opportunities for students to engage in the content and in meaningful learning, yet students can still remain anonymously silent. This study aims to understand the impact of active learning on these silent students in a large General Chemistry course taught via Socratic questioning and…

  2. Teaching the Assessment of Normality Using Large Easily-Generated Real Data Sets

    ERIC Educational Resources Information Center

    Kulp, Christopher W.; Sprechini, Gene D.

    2016-01-01

    A classroom activity is presented, which can be used in teaching students statistics with an easily generated, large, real world data set. The activity consists of analyzing a video recording of an object. The colour data of the recorded object can then be used as a data set to explore variation in the data using graphs including histograms,…

  3. Spatial occupancy models for large data sets

    USGS Publications Warehouse

    Johnson, Devin S.; Conn, Paul B.; Hooten, Mevin B.; Ray, Justina C.; Pond, Bruce A.

    2013-01-01

    Since its development, occupancy modeling has become a popular and useful tool for ecologists wishing to learn about the dynamics of species occurrence over time and space. Such models require presence–absence data to be collected at spatially indexed survey units. However, only recently have researchers recognized the need to correct for spatially induced overdispersion by explicitly accounting for spatial autocorrelation in occupancy probability. Previous efforts to incorporate such autocorrelation have largely focused on logit-normal formulations for occupancy, with spatial autocorrelation induced by a random effect within a hierarchical modeling framework. Although useful, computational time generally limits such an approach to relatively small data sets, and there are often problems with algorithm instability, yielding unsatisfactory results. Further, recent research has revealed a hidden form of multicollinearity in such applications, which may lead to parameter bias if not explicitly addressed. Combining several techniques, we present a unifying hierarchical spatial occupancy model specification that is particularly effective over large spatial extents. This approach employs a probit mixture framework for occupancy and can easily accommodate a reduced-dimensional spatial process to resolve issues with multicollinearity and spatial confounding while improving algorithm convergence. Using open-source software, we demonstrate this new model specification using a case study involving occupancy of caribou (Rangifer tarandus) over a set of 1080 survey units spanning a large contiguous region (108,000 km²) in northern Ontario, Canada. Overall, the combination of a more efficient specification and open-source software allows for a facile and stable implementation of spatial occupancy models for large data sets.
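    The core occupancy idea can be illustrated with a basic, non-spatial single-season model fit by grid-search maximum likelihood. This is a simplified sketch, not the paper's probit spatial specification, and the survey data are simulated with known parameters:

```python
import numpy as np

def occupancy_loglik(psi, p, y, K):
    """Log-likelihood of a basic single-season occupancy model:
    each site is occupied with probability psi; an occupied site is
    detected on each of K independent visits with probability p."""
    y = np.asarray(y)
    det = y > 0                          # sites with at least one detection
    ll_det = (np.log(psi) + y[det] * np.log(p)
              + (K - y[det]) * np.log(1.0 - p)).sum()
    # Never-detected sites are either occupied-but-missed or truly empty.
    ll_non = (~det).sum() * np.log(psi * (1.0 - p) ** K + 1.0 - psi)
    return ll_det + ll_non

# Simulate presence-absence surveys with known parameters...
rng = np.random.default_rng(1)
K, n_sites, psi_true, p_true = 5, 500, 0.6, 0.4
occupied = rng.random(n_sites) < psi_true
y = np.where(occupied, rng.binomial(K, p_true, n_sites), 0)

# ...then recover them by maximising the likelihood over a coarse grid.
grid = np.linspace(0.05, 0.95, 19)
ll = np.array([[occupancy_loglik(a, b, y, K) for b in grid] for a in grid])
i, j = np.unravel_index(ll.argmax(), ll.shape)
psi_hat, p_hat = grid[i], grid[j]
```

    The spatial versions discussed in the abstract replace the single psi with a site-level probability driven by covariates plus a spatial random effect, which is what makes them computationally demanding at scale.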

  4. Lessons Learned from a Multiculturally, Economically Diverse Classroom Setting.

    ERIC Educational Resources Information Center

    Lyman, Lawrence

    For her sabbatical a professor of teacher education at Emporia State University returned to the elementary classroom after a 20-year absence to teach in a third/fourth combination classroom in the Emporia, Kansas Public Schools. The return to elementary classroom teaching provided the professor with the opportunity to utilize some of the social…

  5. Detecting novel associations in large data sets.

    PubMed

    Reshef, David N; Reshef, Yakir A; Finucane, Hilary K; Grossman, Sharon R; McVean, Gilean; Turnbaugh, Peter J; Lander, Eric S; Mitzenmacher, Michael; Sabeti, Pardis C

    2011-12-16

    Identifying interesting relationships between pairs of variables in large data sets is increasingly important. Here, we present a measure of dependence for two-variable relationships: the maximal information coefficient (MIC). MIC captures a wide range of associations both functional and not, and for functional relationships provides a score that roughly equals the coefficient of determination (R²) of the data relative to the regression function. MIC belongs to a larger class of maximal information-based nonparametric exploration (MINE) statistics for identifying and classifying relationships. We apply MIC and MINE to data sets in global health, gene expression, major-league baseball, and the human gut microbiota and identify known and novel relationships. PMID:22174245
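    A toy illustration of the idea behind MIC (this is not the published algorithm, which optimises mutual information over all grids up to a data-dependent size bound): bin the two variables at several resolutions, compute mutual information normalized by log(min(nx, ny)), and keep the maximum. A nonlinear but functional relationship scores high; independent noise scores near zero.

```python
import numpy as np

def grid_nmi(x, y, nx, ny):
    """Mutual information of x and y binned on an nx-by-ny grid,
    normalized by log(min(nx, ny)) as in the MIC definition."""
    counts, _, _ = np.histogram2d(x, y, bins=(nx, ny))
    pxy = counts / counts.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    mi = (pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum()
    return mi / np.log(min(nx, ny))

def toy_mic(x, y, resolutions=((2, 2), (3, 3), (4, 4), (5, 5))):
    # Toy stand-in for MIC: max normalized MI over a fixed set of grids.
    return max(grid_nmi(x, y, nx, ny) for nx, ny in resolutions)

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, 1000)
functional = toy_mic(x, x ** 2)                   # nonlinear relationship
noise = toy_mic(x, rng.uniform(-1, 1, 1000))      # no relationship
```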

  7. On Flipping the Classroom in Large First Year Calculus Courses

    ERIC Educational Resources Information Center

    Jungic, Veselin; Kaur, Harpreet; Mulholland, Jamie; Xin, Cindy

    2015-01-01

    Over the course of two years, 2012-2014, we have implemented a "flipping" the classroom approach in three of our large enrolment first year calculus courses: differential and integral calculus for scientists and engineers. In this article we describe the details of our particular approach and share with the reader some experiences of…

  8. On flipping the classroom in large first year calculus courses

    NASA Astrophysics Data System (ADS)

    Jungić, Veselin; Kaur, Harpreet; Mulholland, Jamie; Xin, Cindy

    2015-05-01

    Over the course of two years, 2012--2014, we have implemented a 'flipping' the classroom approach in three of our large enrolment first year calculus courses: differential and integral calculus for scientists and engineers. In this article we describe the details of our particular approach and share with the reader some experiences of both instructors and students.

  9. Reinforcement Strategies for Token Economies in a Special Classroom Setting

    ERIC Educational Resources Information Center

    Libb, J. Wesley; And Others

    1973-01-01

    Both academic and disruptive behavior in a classroom for children with behavioral problems were monitored under two different procedures for administering token reinforcement. Control may be more efficiently achieved by reinforcing academic behaviors incompatible with disruptive behaviors. (Author/JB)

  10. Clickers in the Large Classroom: Current Research and Best-Practice Tips

    PubMed Central

    2007-01-01

    Audience response systems (ARS) or clickers, as they are commonly called, offer a management tool for engaging students in the large classroom. Basic elements of the technology are discussed. These systems have been used in a variety of fields and at all levels of education. Typical goals of ARS questions are discussed, as well as methods of compensating for the reduction in lecture time that typically results from their use. Examples of ARS use occur throughout the literature and often detail positive attitudes from both students and instructors, although exceptions do exist. When used in classes, ARS clickers typically have either a benign or positive effect on student performance on exams, depending on the method and extent of their use, and create a more positive and active atmosphere in the large classroom. These systems are especially valuable as a means of introducing and monitoring peer learning methods in the large lecture classroom. So that the reader may use clickers effectively in his or her own classroom, a set of guidelines for writing good questions and a list of best-practice tips have been culled from the literature and experienced users. PMID:17339389

  11. Classroom Social Capital: Development of a Measure of Instrumental Social Support within Academic Settings

    ERIC Educational Resources Information Center

    Shecter, Julie

    2009-01-01

    Many universities implement programs and interventions to increase students' perceived instrumental social support within the classroom setting, yet to date, no measures exist to adequately assess such perceptions. In response to this need, the current research developed an operational definition of instrumental classroom social support and also…

  12. The Emergence of Student Creativity in Classroom Settings: A Case Study of Elementary Schools in Korea

    ERIC Educational Resources Information Center

    Cho, Younsoon; Chung, Hye Young; Choi, Kyoulee; Seo, Choyoung; Baek, Eunjoo

    2013-01-01

    This research explores the emergence of student creativity in classroom settings, specifically within two content areas: science and social studies. Fourteen classrooms in three elementary schools in Korea were observed, and the teachers and students were interviewed. The three types of student creativity emerging in the teaching and learning…

  13. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    ERIC Educational Resources Information Center

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  14. Teaching Nursing Research Using Large Data Sets.

    ERIC Educational Resources Information Center

    Brosnan, Christine A.; Eriksen, Lillian R.; Lin, Yu-Feng

    2002-01-01

    Describes a process for teaching nursing research via secondary analysis of data sets from the National Center for Health Statistics. Addresses advantages, potential problems and limitations, guidelines for students, and evaluation methods. (Contains 32 references.) (SK)

  15. Understanding Bystander Perceptions of Cyberbullying in Inclusive Classroom Settings

    ERIC Educational Resources Information Center

    Guckert, Mary

    2013-01-01

    Cyberbullying is a pervasive problem that puts students at risk of successful academic outcomes and the ability to feel safe in school. As most students with disabilities are served in inclusive classrooms, there is a growing concern that students with special needs are at an increased risk of online bullying harassment. Enhancing responsible…

  16. Helping Children Cope with Stress in the Classroom Setting.

    ERIC Educational Resources Information Center

    Fallin, Karen; Wallinga, Charlotte; Coleman, Mick

    2001-01-01

    Discusses children's experiences with stress, using key concepts of the cognitive-transactional model. Relates stressors to cognitive appraisal, identifies coping strategies, lists resources, and offers suggestions for interventions in the classroom. Recommends identifying and responding to daily stressors with children, facilitating coping…

  17. Thinking Routines: Replicating Classroom Practices within Museum Settings

    ERIC Educational Resources Information Center

    Wolberg, Rochelle Ibanez; Goff, Allison

    2012-01-01

    This article describes thinking routines as tools to guide and support young children's thinking. These learning strategies, developed by Harvard University's Project Zero Classroom, actively engage students in constructing meaning while also understanding their own thinking process. The authors discuss how thinking routines can be used in both…

  18. Setting of Classroom Environments for Hearing Impaired Children

    ERIC Educational Resources Information Center

    Turan, Zerrin

    2007-01-01

    This paper aims to explain effects of acoustical environments in sound perception of hearing impaired people. Important aspects of sound and hearing impairment are explained. Detrimental factors in acoustic conditions for speech perception are mentioned. Necessary acoustic treatment in classrooms and use of FM systems to eliminate these factors…

  19. Twelve Practical Strategies To Prevent Behavioral Escalation in Classroom Settings.

    ERIC Educational Resources Information Center

    Shukla-Mehta, Smita; Albin, Richard W.

    2003-01-01

    Twelve practical strategies that can be used by classroom teachers to prevent behavioral escalation are discussed, including reinforce calm, know the triggers, pay attention to anything unusual, do not escalate, intervene early, know the function of problem behavior, use extinction wisely, teach prosocial behavior, and teach academic survival…

  20. Knowledge Discovery in Large Data Sets

    SciTech Connect

    Simas, Tiago; Silva, Gabriel; Miranda, Bruno; Ribeiro, Rita

    2008-12-05

    In this work we briefly address the problem of unsupervised classification on large datasets of around 100,000,000 objects. The objects are variable objects, which make up around 10% of the 1,000,000,000 astronomical objects that will be collected by the GAIA/ESA mission. We tested unsupervised classification algorithms on known datasets such as the OGLE and Hipparcos catalogs. Moreover, we are building several templates to represent the main classes of variable objects, as well as new classes, to build a synthetic dataset of this size. In the future we will run the GAIA satellite scanning law on these templates to obtain a testable large dataset.
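    A minimal stand-in for the unsupervised classification step is plain k-means; the sketch below uses hypothetical two-feature summaries of variable objects (e.g., period and amplitude), since the abstract does not specify the actual features or algorithm.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means clustering: assign points to nearest center,
    then move each center to the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])   # guard empty clusters
    return labels, centers

# Hypothetical summaries for two well-separated classes of variable objects.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal([0.0, 0.0], 0.3, (200, 2)),
               rng.normal([3.0, 3.0], 0.3, (200, 2))])
labels, centers = kmeans(X, 2)
```

    At the 10⁸-object scale described above, the same idea would need mini-batch or distributed variants rather than this in-memory loop.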

  1. Reliability of the 5-min psychomotor vigilance task in a primary school classroom setting.

    PubMed

    Wilson, Andrew; Dollman, James; Lushington, Kurt; Olds, Timothy

    2010-08-01

    This study evaluated the reliability of the 5-min psychomotor vigilance task (PVT) in a single-sex Australian primary school. Seventy-five male students (mean age = 11.82 years, SD = 1.12) completed two 5-min PVTs using a Palm personal digital assistant (PDA) in (1) an isolated setting and (2) a classroom setting. Of this group of students, a subsample of 37 students completed a test-retest reliability trial within the classroom setting. Using a mixed-model analysis, there was no significant difference in the mean response time (RT) or number of lapses (RTs ≥ 500 msec) between the isolated and the classroom setting. There was, however, an order effect for the number of lapses in the isolated setting, with the number of lapses being greater if the isolated test was conducted second. Test-retest intraclass correlation coefficients (ICCs) in the classroom setting indicated moderate to high reliability (mean RT = .84, lapses = .59). Bland-Altman analysis showed no systematic difference between the two settings. Findings suggest that the 5-min PDA PVT is a reliable measure of sustained attention in the classroom setting in this sample of primary-aged schoolchildren. The results provide further evidence for the versatility of this measuring device for larger interventions outside the laboratory. PMID:20805597
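    The Bland-Altman analysis mentioned above reduces to the mean paired difference (bias) and its 95% limits of agreement; a sketch with hypothetical mean RTs in msec, not the study's data:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics for paired measurements:
    mean difference (bias) and 95% limits of agreement."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical test-retest mean RTs (msec) for a handful of students.
test = np.array([310, 295, 340, 305, 320, 290, 315])
retest = np.array([305, 300, 335, 310, 318, 288, 320])
bias, (lo, hi) = bland_altman(test, retest)
```

    A bias near zero with narrow limits, as in the simulated pairs here, is the pattern the study reports as "no systematic difference" between settings.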

  3. Large-N in Volcano Settings: Volcanosri

    NASA Astrophysics Data System (ADS)

    Lees, J. M.; Song, W.; Xing, G.; Vick, S.; Phillips, D.

    2014-12-01

    We seek a paradigm shift in the approach we take to volcano monitoring, trading some fidelity for large numbers of sensors to increase coverage and resolution. Accessibility, danger, and the risk of equipment loss require that we develop systems that are independent and inexpensive. Furthermore, rather than simply record data on hard disk for later analysis, we desire a system that will work autonomously, capitalizing on wireless technology and in-field network analysis. To this end we are currently producing a low-cost seismic array which will incorporate, at the very basic level, seismological tools for first-cut analysis of a volcano in crisis mode. At the advanced end we expect to perform tomographic inversions in the network in near real time. Geophone (4 Hz) sensors connected to a low-cost recording system will be installed on an active volcano, where triggering, earthquake location, and velocity analysis will take place independent of human interaction. Stations are designed to be inexpensive and possibly disposable. In one of the first implementations the seismic nodes consist of an Arduino Due processor board with an attached Seismic Shield. The Arduino Due processor board contains an Atmel SAM3X8E ARM Cortex-M3 CPU. This 32-bit, 84 MHz processor can filter and perform coarse seismic event detection on a 1600-sample signal in fewer than 200 milliseconds. The Seismic Shield contains a GPS module, 900 MHz high-power mesh network radio, SD card, seismic amplifier, and 24-bit ADC. External sensors can be attached either to this 24-bit ADC or to the internal multichannel 12-bit ADC on the Arduino Due processor board, allowing the node to support multiple sensors. By utilizing a high-speed 32-bit processor, complex signal-processing tasks can be performed simultaneously on multiple sensors. Using a 10 W solar panel, a second system under development can run autonomously and collect data on 3 channels at 100 Hz for 6 months.
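
    The abstract does not state which detection algorithm the nodes run; a classic choice for coarse, on-node event detection is the STA/LTA (short-term average over long-term average) trigger, sketched here in Python for illustration only.

```python
import numpy as np

def sta_lta_trigger(signal, sta_len=40, lta_len=400, threshold=4.0):
    """Flag samples where short-term average energy exceeds `threshold`
    times the long-term average energy (a classic STA/LTA detector)."""
    energy = np.asarray(signal, dtype=float) ** 2
    csum = np.concatenate([[0.0], np.cumsum(energy)])
    n = len(energy)
    t = np.arange(lta_len, n + 1)                 # window end points
    sta = (csum[t] - csum[t - sta_len]) / sta_len # trailing short window
    lta = (csum[t] - csum[t - lta_len]) / lta_len # trailing long window
    ratio = sta / np.maximum(lta, 1e-12)
    return t[ratio > threshold]                   # sample indices that trigger
```

    Running means via cumulative sums keep the cost at a few operations per sample, which is why this style of detector fits on a small microcontroller buffer.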

  4. Activity Settings and Daily Routines in Preschool Classrooms: Diverse Experiences in Early Learning Settings for Low-Income Children

    PubMed Central

    Fuligni, Allison Sidle; Howes, Carollee; Huang, Yiching; Hong, Sandra Soliday; Lara-Cinisomo, Sandraluz

    2011-01-01

    This paper examines activity settings and daily classroom routines experienced by 3- and 4-year-old low-income children in public center-based preschool programs, private center-based programs, and family child care homes. Two daily routine profiles were identified using a time-sampling coding procedure: a High Free-Choice pattern in which children spent a majority of their day engaged in child-directed free-choice activity settings combined with relatively low amounts of teacher-directed activity, and a Structured-Balanced pattern in which children spent relatively equal proportions of their day engaged in child-directed free-choice activity settings and teacher-directed small- and whole-group activities. Daily routine profiles were associated with program type and curriculum use but not with measures of process quality. Children in Structured-Balanced classrooms had more opportunities to engage in language and literacy and math activities, whereas children in High Free-Choice classrooms had more opportunities for gross motor and fantasy play. Being in a Structured-Balanced classroom was associated with children’s language scores but profiles were not associated with measures of children’s math reasoning or socio-emotional behavior. Consideration of teachers’ structuring of daily routines represents a valuable way to understand nuances in the provision of learning experiences for young children in the context of current views about developmentally appropriate practice and school readiness. PMID:22665945

  5. A Meta-Analysis of Interventions To Decrease Disruptive Classroom Behavior in Public Education Settings.

    ERIC Educational Resources Information Center

    Stage, Scott A.; Quiroz, David R.

    1997-01-01

    Describes meta-analysis of 99 studies that used interventions to decrease disruptive classroom behavior in public education settings. Overall, results indicate interventions yield comparable results to other studies investigating effectiveness of psychotherapy. Findings show that efficacious treatments used in public school settings decrease…

  6. Content-Based Instruction for English Language Learners: An Exploration across Multiple Classroom Settings

    ERIC Educational Resources Information Center

    Park, Seo Jung

    2009-01-01

    This study explored the content-based literacy instruction of English language learners (ELLs) across multiple classroom settings in U.S. elementary schools. The following research questions guided the study: (a) How are ELLs taught English in two types of instructional settings: regular content-area literacy instruction in the all-English…

  7. Technological Challenges: Designing Large Compressed Video and Multimedia Classrooms.

    ERIC Educational Resources Information Center

    Hart, Russ A.; Parker, Roger

    Designing a distance learning classroom requires integration of educational goals and philosophy with technology and ergonomics. The technological challenge and key to designing effective distance learning and multimedia classrooms is creating an environment in which the participants--students and teachers--may easily interact with instructional…

  8. Large Data at Small Universities: Astronomical processing using a computer classroom

    NASA Astrophysics Data System (ADS)

    Fuller, Nathaniel James; Clarkson, William I.; Fluharty, Bill; Belanger, Zach; Dage, Kristen

    2016-06-01

    The use of large computing clusters for astronomy research is becoming more commonplace as datasets expand, but access to these required resources is sometimes difficult for research groups working at smaller universities. As an alternative to purchasing processing time on an off-site computing cluster, or purchasing dedicated hardware, we show how one can easily build a crude on-site cluster by utilizing idle cycles on instructional computers in computer-lab classrooms. Since these computers are maintained as part of the educational mission of the university, the resource impact on the investigator is generally low. By using open-source Python routines, it is possible to have a large number of desktop computers working together via a local network to sort through large data sets. By running traditional analysis routines in an “embarrassingly parallel” manner, gains in speed are accomplished without requiring the investigator to learn how to write routines using highly specialized methodology. We demonstrate this concept here applied to (1) photometry of large-format images and (2) statistical significance tests for X-ray lightcurve analysis. In these scenarios, we see a speed-up factor which scales almost linearly with the number of cores in the cluster. Additionally, we show that the usage of the cluster does not severely limit performance for a local user, and indeed the processing can be performed while the computers are in use for classroom purposes.
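
    The "embarrassingly parallel" pattern described above can be sketched with Python's standard-library multiprocessing module. The per-frame `analyze` function here is a hypothetical placeholder, not the authors' photometry code; the point is that each task is independent, so a plain `Pool.map` suffices.

```python
from multiprocessing import Pool

def analyze(frame_id):
    """Hypothetical per-frame task; real use would do e.g. photometry on one image."""
    return frame_id, sum(i * i for i in range(frame_id + 1))

def run_parallel(frame_ids, workers=4):
    """Map the independent per-frame tasks across a pool of worker processes."""
    with Pool(processes=workers) as pool:
        return dict(pool.map(analyze, list(frame_ids)))
```

    Because no task depends on another, the speed-up scales with the number of workers until I/O or task granularity dominates, matching the near-linear scaling the authors report.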

  9. Performance in an Online Introductory Course in a Hybrid Classroom Setting

    ERIC Educational Resources Information Center

    Aly, Ibrahim

    2013-01-01

    This study compared the academic achievement between undergraduate students taking an introductory managerial accounting course online (N = 104) and students who took the same course in a hybrid classroom setting (N = 203). Student achievement was measured using scores from twelve weekly online assignments, two major online assignments, a final…

  10. Enhancing Knowledge Transfer in Classroom versus Online Settings: The Interplay among Instructor, Student, Content, and Context

    ERIC Educational Resources Information Center

    Nemanich, Louise; Banks, Michael; Vera, Dusya

    2009-01-01

    This article integrates management education and organizational learning theories to identify the factors that drive the differences in student outcomes between the online and classroom settings. We draw upon theory on knowledge transfer barriers in organizations to understand the interlinking relationships among presage conditions, deep learning…

  11. Developing a Positive Mind-Set toward the Use of Technology for Classroom Instruction

    ERIC Educational Resources Information Center

    Okojie, Mabel C. P. O.; Olinzock, Anthony

    2006-01-01

    The aim of this paper is to examine various indicators associated with the development of a positive mind-set toward the use of technology for instruction. The paper also examines the resources available to help teachers keep pace with technological innovation. Electronic classrooms have some complexities associated with them; therefore, support…

  12. Descriptive Analysis of Classroom Setting Events on the Social Behaviors of Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Boyd, Brian A.; Conroy, Maureen A.; Asmus, Jennifer M.; McKenney, Elizabeth L. W.; Mancil, G. Richmond

    2008-01-01

    Children with Autism Spectrum Disorder (ASD) are characterized by extreme deficits in social relatedness with same-age peers. The purpose of this descriptive study was to identify naturally occurring antecedent variables (i.e., setting events) in the classroom environments of children with ASD that promoted their engagement in peer-related social…

  13. Reliability Issues and Solutions for Coding Social Communication Performance in Classroom Settings

    ERIC Educational Resources Information Center

    Olswang, Lesley B.; Svensson, Liselotte; Coggins, Truman E.; Beilinson, Jill S.; Donaldson, Amy L.

    2006-01-01

    Purpose: To explore the utility of time-interval analysis for documenting the reliability of coding social communication performance of children in classroom settings. Of particular interest was finding a method for determining whether independent observers could reliably judge both occurrence and duration of ongoing behavioral dimensions for…

  14. The Impact of Physical Settings on Pre-Schoolers Classroom Organization

    ERIC Educational Resources Information Center

    Tadjic, Mirko; Martinec, Miroslav; Farago, Amalija

    2015-01-01

    The physical setting plays an important role in the lives of pre-schoolers and can be an important component of children's experience and development when it is wisely and meaningfully designed. The classroom organization enhances and supports pre-schoolers' capability to perform activities themselves and to initiate and finish tasks, creates the…

  15. Mobile-IT Education (MIT.EDU): M-Learning Applications for Classroom Settings

    ERIC Educational Resources Information Center

    Sung, M.; Gips, J.; Eagle, N.; Madan, A.; Caneel, R.; DeVaul, R.; Bonsen, J.; Pentland, A.

    2005-01-01

    In this paper, we describe the Mobile-IT Education (MIT.EDU) system, which demonstrates the potential of using a distributed mobile device architecture for rapid prototyping of wireless mobile multi-user applications for use in classroom settings. MIT.EDU is a stable, accessible system that combines inexpensive, commodity hardware, a flexible…

  16. The Relationship Between Interpersonal Relations Orientations and Preferred Classroom Physical Settings.

    ERIC Educational Resources Information Center

    Feitler, Fred C.; And Others

    This study reports relationships found between FIRO-B (Fundamental Interpersonal Relations Orientation) scores and preference for classroom spatial settings. It was hypothesized that differences in interpersonal needs would be reflected in preferences for particular physical environments in which to teach. The sample consisted of 276 graduates and…

  17. Generalizability and Decision Studies to Inform Observational and Experimental Research in Classroom Settings

    ERIC Educational Resources Information Center

    Bottema-Beutel, Kristen; Lloyd, Blair; Carter, Erik W.; Asmus, Jennifer M.

    2014-01-01

    Attaining reliable estimates of observational measures can be challenging in school and classroom settings, as behavior can be influenced by multiple contextual factors. Generalizability (G) studies can enable researchers to estimate the reliability of observational data, and decision (D) studies can inform how many observation sessions are…

  18. Analysis of Two Early Childhood Education Settings: Classroom Variables and Peer Verbal Interaction

    ERIC Educational Resources Information Center

    Hojnoski, Robin L.; Margulies, Allison S.; Barry, Amberly; Bose-Deakins, Jillaynne; Sumara, Kimberly M.; Harman, Jennifer L.

    2008-01-01

    Descriptive and ecobehavioral analyses were used to explore the daily activity contexts in classroom settings reflecting two distinct models of early childhood education. Activity context, social configurations, teacher behavior, and child behavior were explored, with specific consideration given to peer verbal behavior as an indicator of social…

  19. Teaching Elementary School Teachers Cognitive-Behavioral Techniques To Address ADDH Behaviors in the Classroom Setting.

    ERIC Educational Resources Information Center

    Vogelmann-Peper, Marcella

    This practicum was designed to address attention deficit and hyperactive behaviors (ADDH) in the elementary classroom setting. The primary goal was to provide teachers with an effective intervention technique which requires little time and addresses the ADDH syndrome. A second aim was to increase teachers' understanding of the ADDH syndrome and…

  20. Civility in the University Classroom: An Opportunity for Faculty to Set Expectations

    ERIC Educational Resources Information Center

    Ward, Chris; Yates, Dan

    2014-01-01

    This research examines the types of uncivil behaviors frequently encountered in university classrooms. These behaviors range from walking in late to class to texting in class and sending unprofessional emails. These behaviors can often undermine a professor's teaching. Setting reasonable and consistent expectations is a combination of university policy,…

  1. Use of Big-Screen Films in Multiple Childbirth Education Classroom Settings

    PubMed Central

    Kaufman, Tamara

    2010-01-01

    Although two recent films, Orgasmic Birth and Pregnant in America, were intended for the big screen, they can also serve as valuable teaching resources in multiple childbirth education settings. Each film conveys powerful messages about birth and today's birthing culture. Depending on a childbirth educator's classroom setting (hospital, birthing center, or home birth environment), particular portions in each film, along with extra clips featured on the films' DVDs, can enhance an educator's curriculum and spark compelling discussions with class participants. PMID:21358831

  2. Comparing Functional Analysis and Paired-choice Assessment Results in Classroom Settings

    PubMed Central

    Berg, Wendy K; Wacker, David P; Cigrand, Karla; Merkle, Steve; Wade, Jeanie; Henry, Kim; Wang, Yu-Chia

    2007-01-01

    The results of a functional analysis of problem behavior and a paired-choice assessment were compared to determine whether the same social reinforcers were identified for problem behavior and an appropriate response (time allocation). The two assessments were conducted in classroom settings with 4 adolescents with mental retardation who engaged in severe problem behavior. Each student's classroom teacher served as the therapist for all phases of assessment. The two assessment procedures identified the same social reinforcers for problem and appropriate behavior for 3 of 4 participants. PMID:17970268

  3. INTERIOR VIEW, SETTING LARGE CORE WITH ASSISTANCE FROM THE OVERHEAD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW, SETTING LARGE CORE WITH ASSISTANCE FROM THE OVERHEAD RAIL CRANE IN BOX FLOOR MOLD AREA (WORKERS: DAN T. WELLS AND TRUMAN CARLISLE). - Stockham Pipe & Fittings Company, Ductile Iron Foundry, 4000 Tenth Avenue North, Birmingham, Jefferson County, AL

  4. Strategies for Engaging FCS Learners in a Large-Format Classroom: Embedded Videos

    ERIC Educational Resources Information Center

    Leslie, Catherine Amoroso

    2014-01-01

    This article presents a method for utilizing technology to increase student engagement in large classroom formats. In their lives outside the classroom, students spend considerable time interfacing with media, and they are receptive to information conveyed in electronic formats. Research has shown that multimedia is an effective learning resource;…

  5. A Quantitative Evaluation of the Flipped Classroom in a Large Lecture Principles of Economics Course

    ERIC Educational Resources Information Center

    Balaban, Rita A.; Gilleskie, Donna B.; Tran, Uyen

    2016-01-01

    This research provides evidence that the flipped classroom instructional format increases student final exam performance, relative to the traditional instructional format, in a large lecture principles of economics course. The authors find that the flipped classroom directly improves performance by 0.2 to 0.7 standard deviations, depending on…

  6. Teaching Methodology in a "Large Power Distance" Classroom: A South Korean Context

    ERIC Educational Resources Information Center

    Jambor, Paul Z.

    2005-01-01

    This paper looks at South Korea as an example of a collectivist society having a rather large power distance dimension value. In a traditional Korean classroom the teacher is at the top of the classroom hierarchy, while the students are the passive participants. Gender and age play a role in the hierarchy between students themselves. Teaching…

  7. The Relation between High School Teacher Sense of Teaching Efficacy and Self-Reported Attitudes toward the Inclusive Classroom Settings

    ERIC Educational Resources Information Center

    Wright, Heather Dillehay

    2013-01-01

    The purpose of this study was to investigate if collective sense of teaching efficacy, general sense of teaching efficacy, or personal sense of teacher efficacy influenced teacher attitude toward inclusive classroom settings. Additionally, the study sought to determine if teacher attitude toward inclusive classroom settings differed when taking…

  8. Wound staging: can nurses apply classroom education to the clinical setting?

    PubMed

    Arnold, N; Watterworth, B

    1995-06-01

    Traditionally, education on wound staging has been conducted in the classroom using drawings, photographs or slides to illustrate examples of wound stages. These methods portray wounds two-dimensionally, but clinically, wounds are three-dimensional. Seven home care nurses in central Florida were given a pre-test and post-test of 16 slides, four of each stage. Field visits with three of these nurses were then conducted by an ET nurse to evaluate the application of this education into clinical practice in the home setting. The questions studied were: Do nurses learn to accurately assess stages of wounds from classroom education? Does classroom ability to stage wounds equate to ability to stage correctly in the clinical setting? Test results showed Stage II and Stage III wounds to be most problematic in the classroom. The most improvement was seen in the staging of Stage II and Stage IV wounds two-dimensionally. In the field, one nurse consistently staged wounds correctly while two had problems with correct staging. Additional investigation is required to determine if these results can be generalized to other home health nurses and if changes in clinical ability will occur over time. PMID:7612139

  9. Generalizability and decision studies to inform observational and experimental research in classroom settings.

    PubMed

    Bottema-Beutel, Kristen; Lloyd, Blair; Carter, Erik W; Asmus, Jennifer M

    2014-11-01

    Attaining reliable estimates of observational measures can be challenging in school and classroom settings, as behavior can be influenced by multiple contextual factors. Generalizability (G) studies can enable researchers to estimate the reliability of observational data, and decision (D) studies can inform how many observation sessions are necessary to achieve a criterion level of reliability. We conducted G and D studies using observational data from a randomized control trial focusing on social and academic participation of students with severe disabilities in inclusive secondary classrooms. Results highlight the importance of anchoring observational decisions to reliability estimates from existing or pilot data sets. We outline steps for conducting G and D studies and address options when reliability estimates are lower than desired. PMID:25354126

  10. Adaptive, multiresolution visualization of large data sets using parallel octrees.

    SciTech Connect

    Freitag, L. A.; Loy, R. M.

    1999-06-10

    The interactive visualization and exploration of large scientific data sets is a challenging and difficult task; their size often far exceeds the performance and memory capacity of even the most powerful graphics workstations. To address this problem, we have created a technique that combines hierarchical data reduction methods with parallel computing to allow interactive exploration of large data sets while retaining full-resolution capability. The hierarchical representation is built in parallel by strategically inserting field data into an octree data structure. We provide functionality that allows the user to interactively adapt the resolution of the reduced data sets so that resolution is increased in regions of interest without sacrificing local graphics performance. We describe the creation of the reduced data sets using a parallel octree, the software architecture of the system, and the performance of this system on the data from a Rayleigh-Taylor instability simulation.
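
    For illustration, a toy serial version of the underlying data structure, a point octree that splits each cube into eight children on demand, might look like the sketch below (the paper's implementation is parallel and far more elaborate).

```python
class Octree:
    """Toy point octree: a node stores up to 8 points, then splits its cube
    into eight children and pushes the points down by octant."""

    def __init__(self, center, half, depth=0, max_depth=8):
        self.center, self.half = center, half
        self.depth, self.max_depth = depth, max_depth
        self.points, self.children = [], None

    def insert(self, p):
        if self.children is None:
            if len(self.points) < 8 or self.depth == self.max_depth:
                self.points.append(p)
                return
            self._split()
        self.children[self._octant(p)].insert(p)

    def count(self):
        if self.children is None:
            return len(self.points)
        return len(self.points) + sum(c.count() for c in self.children)

    def _octant(self, p):
        cx, cy, cz = self.center
        return int(p[0] >= cx) + 2 * int(p[1] >= cy) + 4 * int(p[2] >= cz)

    def _split(self):
        h = self.half / 2.0
        cx, cy, cz = self.center
        self.children = [
            Octree((cx + (h if i & 1 else -h),
                    cy + (h if i & 2 else -h),
                    cz + (h if i & 4 else -h)),
                   h, self.depth + 1, self.max_depth)
            for i in range(8)
        ]
        pending, self.points = self.points, []
        for p in pending:
            self.insert(p)
```

    Coarser tree levels act as the reduced data set; descending the tree in a region of interest restores full resolution there, which is the adaptive-resolution idea the abstract describes.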

  11. Looking at large data sets using binned data plots

    SciTech Connect

    Carr, D.B.

    1990-04-01

    This report addresses the monumental challenge of developing exploratory analysis methods for large data sets. The goals of the report are to increase awareness of large data set problems and to contribute simple graphical methods that address some of the problems. The graphical methods focus on two- and three-dimensional data and common tasks such as finding outliers and tail structure, assessing central structure, and comparing central structures. The methods handle large sample size problems through binning, incorporate information from statistical models, and adapt image processing algorithms. Examples demonstrate the application of methods to a variety of publicly available large data sets. The most novel application addresses the “too many plots to examine” problem by using cognostics, computer guiding diagnostics, to prioritize plots. The particular application prioritizes views of computational fluid dynamics solution sets on the fly. That is, as each time step of a solution set is generated on a parallel processor the cognostics algorithms assess virtual plots based on the previous time step. Work in such areas is in its infancy and the examples suggest numerous challenges that remain. 35 refs., 15 figs.
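
    The binning step behind such plots reduces millions of (x, y) points to a grid of counts that can be drawn at constant cost. A minimal NumPy sketch, on synthetic data rather than the report's examples:

```python
import numpy as np

def binned_counts(x, y, bins=50):
    """Reduce a large (x, y) scatter to a grid of bin counts, the core
    summary behind a binned data plot."""
    counts, xedges, yedges = np.histogram2d(x, y, bins=bins)
    return counts, xedges, yedges
```

    The resulting count grid can then be rendered as hexagons, shaded cells, or contours; the plotting cost depends on the number of bins, not the sample size.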

  12. Iterative dictionary construction for compression of large DNA data sets.

    PubMed

    Kuruppu, Shanika; Beresford-Smith, Bryan; Conway, Thomas; Zobel, Justin

    2012-01-01

    Genomic repositories increasingly include individual as well as reference sequences, which tend to share long identical and near-identical strings of nucleotides. However, the sequential processing used by most compression algorithms, and the volumes of data involved, mean that these long-range repetitions are not detected. An order-insensitive, disk-based dictionary construction method can detect this repeated content and use it to compress collections of sequences. We explore a dictionary construction method that improves repeat identification in large DNA data sets. COMRAD, our adaptation of an existing disk-based method, identifies exact repeated content in collections of sequences with similarities within and across the set of input sequences. COMRAD compresses the data over multiple passes, which is an expensive process, but allows COMRAD to compress large data sets within reasonable time and space. COMRAD allows for random access to individual sequences and subsequences without decompressing the whole data set. COMRAD has no competitor in terms of the size of data sets that it can compress (extending to many hundreds of gigabytes) and, even for smaller data sets, the results are competitive compared to alternatives; as an example, 39 S. cerevisiae genomes compressed to 0.25 bits per base.

  13. Reducing Information Overload in Large Seismic Data Sets

    SciTech Connect

    HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.; CARR,DORTHE B.; AGUILAR-CHANG,JULIO

    2000-08-02

    Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g. single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research
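
    Single-linkage clustering at a fixed correlation threshold, the idea the dendrogram tool builds on, is equivalent to finding connected components among waveform pairs whose correlation exceeds the threshold. A NumPy-only sketch for illustration (the MatSeis tool itself is a MATLAB GUI, not this code):

```python
import numpy as np

def cluster_waveforms(waveforms, r_min=0.7):
    """Single-linkage clustering at a correlation threshold: two waveforms
    share a cluster if a chain of pairwise correlations >= r_min links them."""
    w = np.asarray(waveforms, dtype=float)
    r = np.corrcoef(w)                      # pairwise Pearson correlations
    n = len(w)
    parent = list(range(n))                 # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if r[i, j] >= r_min:
                parent[find(i)] = find(j)   # merge the two clusters

    roots = [find(i) for i in range(n)]
    relabel = {root: k for k, root in enumerate(dict.fromkeys(roots))}
    return [relabel[x] for x in roots]
```

    Cutting a single-linkage dendrogram at height 1 - r_min produces exactly these groups, so the threshold plays the role of the cut level in the dendrogram diagram.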

  14. Quantifying comparison of large detrital geochronology data sets

    NASA Astrophysics Data System (ADS)

    Saylor, J. E.; Sundell, K. E., II

    2015-12-01

    The increasing size of detrital geochronological data challenges existing approaches to data visualization and comparison, highlighting the need for quantitative techniques able to compare multiple large data sets. Using the DZstats software package we applied five metrics to twenty large synthetic data sets and one large empirical data set. The metrics included the Kolmogorov-Smirnov (K-S) and Kuiper tests as well as Cross-correlation, Likeness, and Similarity coefficients of probability density plots (PDPs), kernel density estimates (KDEs) and locally adaptive, variable-bandwidth KDEs (LA-KDEs). We evaluate the metrics' utility based on three criteria: 1) samples from the same population should become systematically more similar with increasing sample size; 2) the metrics should maximize the range of possible coefficients; and 3) the metrics should minimize artifacts resulting from sample-specific complexity. K-S and Kuiper test p-values, and all KDE and LA-KDE coefficients passed a maximum of one criterion. Likeness and Similarity coefficients of PDPs, as well as K-S and Kuiper test D- and V-values passed two of the criteria. Cross-correlation of PDPs passed all three. As hypothesis tests of derivation from a common source, individual K-S and Kuiper p-values too frequently reject the null hypothesis that samples come from a common source. However, mean p-values calculated by bootstrap subsampling and comparison of sample data sets yield a binary discrimination of identical versus different source populations. Cross-correlation and Likeness of PDPs, and Cross-correlation of KDEs yield the widest divergence in coefficients and thus a consistent discrimination between identical and different source populations, with Cross-correlation of PDPs requiring the smallest sample size. In light of this, we recommend standard acquisition of large (n > 300) detrital geochronology data sets and repeated subsampling for robust quantitative comparison using Likeness, Cross
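
    Several of the metrics above are simple to compute directly. The sketch below implements a two-sample K-S D statistic, a probability density plot (PDP) as a sum of Gaussians, and a PDP cross-correlation taken here as the squared Pearson coefficient; that definition, and the synthetic age data, are assumptions for illustration rather than the DZstats source.

```python
import numpy as np

def ks_d(a, b):
    """Two-sample Kolmogorov-Smirnov D: max gap between empirical CDFs."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return np.abs(cdf_a - cdf_b).max()

def pdp(ages, errors, grid):
    """Probability density plot: one Gaussian per age +/- 1-sigma error."""
    dens = np.zeros_like(grid)
    for mu, sig in zip(ages, errors):
        dens += np.exp(-0.5 * ((grid - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
    return dens / len(ages)

def cross_correlation(p, q):
    """Squared Pearson coefficient between two densities on a common grid."""
    r = np.corrcoef(p, q)[0, 1]
    return r * r
```

    Evaluating the CDF gap at every data point of both samples is sufficient because the empirical CDFs only change at those points.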

  15. Classrooms.

    ERIC Educational Resources Information Center

    Butin, Dan

    This paper addresses classroom design trends and the key issues schools should consider for better classroom space flexibility and adaptability. Classroom space design issues when schools embrace technology are discussed, as are design considerations when rooms must accommodate different grade levels, the importance of lighting, furniture…

  16. The attributes of an effective teacher differ between the classroom and the clinical setting.

    PubMed

    Haws, Jolene; Rannelli, Luke; Schaefer, Jeffrey P; Zarnke, Kelly; Coderre, Sylvain; Ravani, Pietro; McLaughlin, Kevin

    2016-10-01

    Most training programs use learners' subjective ratings of their teachers as the primary measure of teaching effectiveness. In a recent study we found that preclinical medical students' ratings of classroom teachers were associated with perceived charisma and physical attractiveness of the teacher, but not intellect. Here we explored whether the relationship between these variables and teaching effectiveness ratings holds in the clinical setting. We asked 27 Internal Medicine residents to rate teaching effectiveness of ten teachers with whom they had worked on a clinical rotation, in addition to rating each teacher's clinical skills, physical attractiveness, and charisma. We used linear regression to study the association between these explanatory variables and teaching effectiveness ratings. We found no association between rating of physical attractiveness and teaching effectiveness. Clinical skill and charisma were independently associated with rating of teaching effectiveness (regression coefficients [95 % confidence interval] 0.73 [0.60, 0.85], p < 0.001 and 0.12 [0.01, 0.23], p = 0.03, respectively). The variables associated with effectiveness of classroom and clinical teachers differ, suggesting context specificity in teaching effectiveness ratings. Context specificity may be explained by differences in the exposure that learners have to teachers in the classroom versus clinical setting, so that raters in the clinical setting may base ratings upon observed behaviours rather than stereotype data. Alternatively, since subjective ratings of teaching effectiveness inevitably incorporate learners' context-specific needs, the attributes that make a teacher effective in one context may not meet the needs of learners in a different context. PMID:26891679
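
    The analysis described above is ordinary least squares with multiple explanatory variables. A minimal sketch on synthetic ratings data (not the study's data; the coefficients below are chosen only to show that the fit recovers them):

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares with an intercept: returns [b0, b1, ...]."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef
```

    Each slope estimates the change in the rating per unit change in that predictor, holding the others fixed, which is the sense in which clinical skill and charisma were "independently associated" with effectiveness.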

  18. STEME: a robust, accurate motif finder for large data sets.

    PubMed

    Reid, John E; Wernisch, Lorenz

    2014-01-01

    Motif finding is a difficult problem that has been studied for over 20 years. Some older popular motif finders are not suitable for analysis of the large data sets generated by next-generation sequencing. We recently published an efficient approximation (STEME) to the EM algorithm that is at the core of many motif finders such as MEME. This approximation allows the EM algorithm to be applied to large data sets. In this work we describe several efficient extensions to STEME that are based on the MEME algorithm. Together with the original STEME EM approximation, these extensions make STEME a fully-fledged motif finder with similar properties to MEME. We discuss the difficulty of objectively comparing motif finders. We show that STEME performs comparably to existing prominent discriminative motif finders, DREME and Trawler, on 13 sets of transcription factor binding data in mouse ES cells. We demonstrate the ability of STEME to find long degenerate motifs which these discriminative motif finders do not find. As part of our method, we extend an earlier method due to Nagarajan et al. for the efficient calculation of motif E-values. STEME's source code is available under an open source license and STEME is available via a web interface. PMID:24625410
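
    As a toy illustration of the EM core that STEME approximates (not STEME's actual implementation), a single-motif EM iteration over candidate start positions can be sketched in pure Python; the sequences and motif width below are invented for illustration:

```python
def em_motif(seqs, w, iters=20, pseudo=0.1):
    """Toy EM for one motif of width w (OOPS model): each sequence
    is assumed to contain exactly one occurrence at an unknown start."""
    alpha = "ACGT"
    # initialize the position weight matrix (PWM) uniformly
    pwm = [{a: 0.25 for a in alpha} for _ in range(w)]
    bg = {a: 0.25 for a in alpha}          # uniform background model
    for _ in range(iters):
        counts = [{a: pseudo for a in alpha} for _ in range(w)]
        for s in seqs:
            # E-step: posterior over start positions in this sequence
            scores = []
            for i in range(len(s) - w + 1):
                p = 1.0
                for j in range(w):
                    p *= pwm[j][s[i + j]] / bg[s[i + j]]
                scores.append(p)
            z = sum(scores)
            # accumulate expected letter counts per motif column
            for i, sc in enumerate(scores):
                post = sc / z
                for j in range(w):
                    counts[j][s[i + j]] += post
        # M-step: renormalize each column into probabilities
        for j in range(w):
            tot = sum(counts[j].values())
            pwm[j] = {a: counts[j][a] / tot for a in alpha}
    return pwm

seqs = ["AAGATTGCA", "CCGATTACC", "TTGATTGTT", "GGGATTCGG"]
pwm = em_motif(seqs, w=5)
consensus = "".join(max(col, key=col.get) for col in pwm)
```

    Real motif finders add background models, ZOOPS/TCM modes, and the suffix-tree acceleration that makes STEME scale; the sketch only shows the E-step/M-step structure.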

  19. Registration of large data sets for multimodal inspection

    NASA Astrophysics Data System (ADS)

    Vedula, Venumadhav V. S.; Sheri, George

    2006-08-01

    Registration plays a key role in multimodal data fusion to extract synergistic information from multiple non-destructive evaluation (NDE) sources. One of the common techniques for registration of point datasets is the Iterative Closest Point (ICP) algorithm. Modern NDE techniques generally generate large datasets, and the conventional ICP algorithm requires a huge amount of time to register them to the desired accuracy. In this paper, we present algorithms to aid in the registration of large 3D NDE data sets in less time with the required accuracy. Various methods of coarse registration, partial registration and data reduction are used to realize this. It is shown that registration can be accomplished to the desired accuracy with more than 90% reduction in time compared to the conventional ICP algorithm. Volumes of interest (VOI) can be defined on the data sets and merged together so that only the features of interest are used in the registration. The proposed algorithm also provides the capability to eliminate noise in the data sets. Registration of Computed Tomography (CT) image data, Coordinate Measuring Machine (CMM) inspection data and CAD models is discussed in the present work. The algorithm is generic in nature and can be applied to any other NDE inspection data.
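
    The conventional ICP loop that such work accelerates can be sketched, in two dimensions and pure Python, as alternating nearest-neighbour matching with a closed-form rigid-transform solve. This is a minimal illustration of the baseline, not the authors' optimized algorithm:

```python
import math

def icp_2d(src, dst, iters=20):
    """Minimal 2D ICP: repeatedly match each source point to its
    nearest destination point, then solve the best rigid transform."""
    src = [list(p) for p in src]
    for _ in range(iters):
        # 1) correspondence: brute-force nearest neighbour, O(n^2)
        pairs = []
        for p in src:
            q = min(dst, key=lambda d: (d[0]-p[0])**2 + (d[1]-p[1])**2)
            pairs.append((p, q))
        n = len(pairs)
        # 2) closed-form rigid transform about the two centroids (2D Kabsch)
        cpx = sum(p[0] for p, _ in pairs) / n
        cpy = sum(p[1] for p, _ in pairs) / n
        cqx = sum(q[0] for _, q in pairs) / n
        cqy = sum(q[1] for _, q in pairs) / n
        s_sin = sum((p[0]-cpx)*(q[1]-cqy) - (p[1]-cpy)*(q[0]-cqx) for p, q in pairs)
        s_cos = sum((p[0]-cpx)*(q[0]-cqx) + (p[1]-cpy)*(q[1]-cqy) for p, q in pairs)
        c, s = math.cos(math.atan2(s_sin, s_cos)), math.sin(math.atan2(s_sin, s_cos))
        # 3) rotate about the source centroid, translate onto the target centroid
        for p in src:
            x, y = p[0] - cpx, p[1] - cpy
            p[0] = c*x - s*y + cqx
            p[1] = s*x + c*y + cqy
    return src

# demo: recover a known rotation + translation
dst = [(0, 0), (1, 0), (0, 1), (2, 2), (3, 1)]
a = math.radians(10)
src = [(math.cos(a)*x - math.sin(a)*y + 0.3,
        math.sin(a)*x + math.cos(a)*y - 0.2) for x, y in dst]
aligned = icp_2d(src, dst)
```

    Coarse registration, data reduction and VOI selection, as described in the abstract, all attack the cost of the O(n^2) matching step in loops like this one.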

  20. Unsupervised Source Separation for Large Planetary Data Sets

    NASA Astrophysics Data System (ADS)

    Schmidt, A.; Moussaoui, S.; Legendre, M.; Schmidt, F.

    2012-12-01

    Implementations of non-negative matrix approximation (NNMA) algorithms have reached a level of maturity that makes them suitable for application on large planetary datasets. Our results imply that they can effectively factor large collections of hyperspectral measurements into sources and abundances under linearity and non-negativity constraints with reasonable resources. This work presents our first steps towards automated large-scale analysis of hyperspectral data sets with the focus on spectral and geographical summarization using NNMA. The method we implement consists of various steps: after initial calibration, randomized NNMA is used to extract source spectra; these are then distributed on a virtual sphere and classified according to similarity and proximity; for each class of source, an abundance map is generated and presented to the user for interpretation. We use the method on both imaging and point spectrometer data but require a selection of appropriate imagery (for example, a consistent set of nadir pointing images with compatible illumination conditions). Results can be obtained with reasonable resources in minutes to hours, depending on the size of the dataset and the accuracy required. For the datasets we studied, results have been mostly in line with related work and community opinion. Future work includes treatment of overlapping image regions, incorporation of more robust features and inclusion of more datasets.
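
    A minimal sketch of matrix factorization under the stated non-negativity constraints, using the classic Lee-Seung multiplicative updates (the randomized large-scale NNMA variant used in the work above is not reproduced here; the demo matrix is invented):

```python
import random

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(r) for r in zip(*A)]

def nmf(V, r, iters=300, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates: factor V ~ W H with
    non-negative W (sources) and H (abundances)."""
    rng = random.Random(seed)
    n, m = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(r)] for _ in range(n)]
    H = [[rng.random() + 0.1 for _ in range(m)] for _ in range(r)]
    for _ in range(iters):
        Wt = transpose(W)
        num, den = matmul(Wt, V), matmul(matmul(Wt, W), H)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
             for i in range(r)]
        Ht = transpose(H)
        num, den = matmul(V, Ht), matmul(W, matmul(H, Ht))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(r)]
             for i in range(n)]
    return W, H

# demo: factor a small exactly rank-2 non-negative matrix
V = [[4.0, 2.0, 1.0, 3.0],
     [2.0, 4.0, 2.0, 2.0],
     [6.0, 6.0, 3.0, 5.0],
     [2.0, 1.0, 0.5, 1.5]]
W, H = nmf(V, r=2)
R = matmul(W, H)
err = sum((V[i][j] - R[i][j])**2 for i in range(4) for j in range(4))
```

    The multiplicative form keeps every entry non-negative by construction, which is why it is a common starting point for hyperspectral unmixing under linearity and non-negativity constraints.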

  1. Robust Coordination for Large Sets of Simple Rovers

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Agogino, Adrian

    2006-01-01

    The ability to coordinate sets of rovers in an unknown environment is critical to the long-term success of many of NASA's exploration missions. Such coordination policies must have the ability to adapt in unmodeled or partially modeled domains and must be robust against environmental noise and rover failures. In addition, such coordination policies must accommodate a large number of rovers without excessive and burdensome hand-tuning. In this paper we present a distributed coordination method that addresses these issues in the domain of controlling a set of simple rovers. The application of these methods allows reliable and efficient robotic exploration in dangerous, dynamic, and previously unexplored domains. Most control policies for space missions are directly programmed by engineers or created through the use of planning tools, and are appropriate for single rover missions or missions requiring the coordination of a small number of rovers. Such methods typically require significant amounts of domain knowledge, and are difficult to scale to large numbers of rovers. The method described in this article aims to address cases where a large number of rovers need to coordinate to solve a complex time-dependent problem in a noisy environment. In this approach, each rover decomposes a global utility, representing the overall goal of the system, into rover-specific utilities that properly assign credit to the rover's actions. Each rover then has the responsibility to create a control policy that maximizes its own rover-specific utility. We show a method of creating rover-utilities that are "aligned" with the global utility, such that when the rovers maximize their own utility, they also maximize the global utility. In addition, we show that our method creates rover-utilities that allow the rovers to create their control policies quickly and reliably. Our distributed learning method allows large sets of rovers to be used in unmodeled domains, while providing robustness against
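
    The utility decomposition can be illustrated with a toy difference-utility sketch, in which each rover's utility is its marginal contribution to the global utility. The global utility and POI names here are invented for illustration; this is not the paper's actual formulation:

```python
def global_utility(observations):
    """Toy global utility: number of distinct points of interest
    observed by the whole rover team."""
    return len(set().union(*observations)) if observations else 0

def rover_utility(observations, i):
    """Difference utility aligned with the global one: the rover's
    marginal contribution G(all) - G(all minus rover i)."""
    others = observations[:i] + observations[i + 1:]
    return global_utility(observations) - global_utility(others)

# one set of observed POIs per rover
obs = [{"poi_a", "poi_b"}, {"poi_b"}, {"poi_c"}]
```

    Because each rover-specific utility is the global utility minus a term independent of that rover's own actions, any action that increases the rover's utility also increases the global utility, which is the alignment property the abstract describes.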

  2. A large-scale crop protection bioassay data set.

    PubMed

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J P; Bellis, Louisa J; Bento, A Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database.

  3. Optimizing distance-based methods for large data sets

    NASA Astrophysics Data System (ADS)

    Scholl, Tobias; Brenner, Thomas

    2015-10-01

    Distance-based methods for measuring spatial concentration of industries have gained increasing popularity in the spatial econometrics community. However, a limiting factor for using these methods is their computational complexity, since both their memory requirements and running times are in {{O}}(n^2). In this paper, we present an algorithm with constant memory requirements and shorter running time, enabling distance-based methods to deal with large data sets. We discuss three recent distance-based methods in spatial econometrics: the D&O-Index by Duranton and Overman (Rev Econ Stud 72(4):1077-1106, 2005), the M-function by Marcon and Puech (J Econ Geogr 10(5):745-762, 2010) and the Cluster-Index by Scholl and Brenner (Reg Stud (ahead-of-print):1-15, 2014). Finally, we present an alternative calculation for the latter index that allows the use of data sets with millions of firms.
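
    The idea of trading the O(n^2) memory footprint for constant-size accumulators can be illustrated with a streaming pairwise-distance histogram: each pair is binned as it is computed, so the full distance matrix is never materialized. This is a sketch of the general principle, not the authors' algorithm:

```python
import math

def distance_histogram(points, bin_width, n_bins):
    """Accumulate pairwise distances into a fixed number of bins
    without storing the O(n^2) distance matrix."""
    bins = [0] * n_bins
    for i in range(len(points)):
        xi, yi = points[i]
        for j in range(i + 1, len(points)):
            d = math.hypot(points[j][0] - xi, points[j][1] - yi)
            k = min(int(d / bin_width), n_bins - 1)  # clamp overflow distances
            bins[k] += 1
    return bins

pts = [(0, 0), (1, 0), (0, 1), (3, 4)]
hist = distance_histogram(pts, bin_width=1.0, n_bins=8)
```

    Running time is still quadratic, but memory stays constant in n, which is what makes density-based indices feasible for data sets with millions of firms.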

  5. Support vector machine classifiers for large data sets.

    SciTech Connect

    Gertz, E. M.; Griffin, J. D.

    2006-01-31

    This report concerns the generation of support vector machine classifiers for solving the pattern recognition problem in machine learning. Several methods are proposed based on interior point methods for convex quadratic programming. Software implementations are developed by adapting the object-oriented package OOQP to the problem structure and by using the software package PETSc to perform time-intensive computations in a distributed setting. Linear systems arising from classification problems with moderately large numbers of features are solved by using two techniques--one a parallel direct solver, the other a Krylov-subspace method incorporating novel preconditioning strategies. Numerical results are provided, and computational experience is discussed.
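
    For contrast with the interior-point approach of the report, a common lightweight alternative for training SVMs on large data sets is stochastic sub-gradient descent on the hinge loss (Pegasos-style, without a bias term). The toy data below is invented; this is a sketch, not the report's method:

```python
import random

def train_linear_svm(data, labels, lam=0.01, epochs=300, seed=0):
    """Pegasos-style stochastic sub-gradient descent for a linear SVM;
    labels must be +1/-1."""
    rng = random.Random(seed)
    w = [0.0] * len(data[0])
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(data)), len(data)):
            t += 1
            eta = 1.0 / (lam * t)            # standard decaying step size
            x, y = data[i], labels[i]
            margin = y * sum(wj * xj for wj, xj in zip(w, x))
            # the regularizer always shrinks w; points inside the
            # margin also pull it toward them
            w = [wj * (1 - eta * lam) for wj in w]
            if margin < 1:
                w = [wj + eta * y * xj for wj, xj in zip(w, x)]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

data = [(2.0, 2.0), (3.0, 3.0), (2.0, 3.0),
        (-2.0, -2.0), (-3.0, -3.0), (-2.0, -1.0)]
labels = [1, 1, 1, -1, -1, -1]
w = train_linear_svm(data, labels)
```

    Each step touches a single example, so memory usage is independent of the number of training points, at the cost of the precise duality-gap control an interior-point solver provides.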

  6. Implementing Concept-Based Learning in a Large Undergraduate Classroom

    ERIC Educational Resources Information Center

    Morse, David; Jutras, France

    2008-01-01

    An experiment explicitly introducing learning strategies to a large, first-year undergraduate cell biology course was undertaken to see whether awareness and use of strategies had a measurable impact on student performance. The construction of concept maps was selected as the strategy to be introduced because of an inherent coherence with a course…

  7. Interaction and Uptake in Large Foreign Language Classrooms

    ERIC Educational Resources Information Center

    Ekembe, Eric Enongene

    2014-01-01

    Interaction determines and affects the conditions of language acquisition, especially in contexts where exposure to the target language is limited. This is believed to be successful only within the context of small classes (Chavez, 2009). This paper examines learners' progress resulting from interaction in large classes. Using pre-, post-, and…

  8. Comparing Outcomes from Field and Classroom Based Settings for Undergraduate Geoscience Courses

    NASA Astrophysics Data System (ADS)

    Skinner, M. R.; Harris, R. A.; Flores, J.

    2011-12-01

    Field based learning can be found in nearly every course offered in Geology at Brigham Young University. For example, in our Structural Geology course field studies substitute for labs. Students collect their own data from several different structural settings of the Wasatch Range. Our curriculum also includes a two-week, sophomore-level field course that introduces students to interpreting field relations themselves and sets the stage for much of what they learn in their upper-division courses. Our senior-level six-week field geology course includes classical field mapping with exercises in petroleum and mineral exploration, environmental geology and geological hazards. Experiments with substituting field-based general education courses for those in traditional classroom settings indicate that student cognition, course enjoyment and recruiting of majors significantly increase in a field-based course. We offer a field-based introductory geology course (Geo 102) that is taught in seven, six-hour field trips during which students travel to localities of geologic interest to investigate a variety of fundamental geological problems. We compare the outcomes of Geo 102 with a traditional classroom-based geology course (Geo 101). For the comparison both courses are taught by the same instructor, use the same text and supplementary materials and take the same exams. The results of 7 years of reporting indicate that test scores and final grades are one-half grade point higher for Geo 102 students versus those in traditional introductory courses. Student evaluations of the course are also 0.8-1.4 points higher on a scale of 1-8, and are consistently the highest in the Department and College. Other observations include increased attendance, attention and curiosity. The latter two are measured by the number of students asking questions of other students as well as the instructors, and the total number of questions asked during class time in the field versus the classroom.

  9. Towards effective analysis of large grain boundary data sets

    NASA Astrophysics Data System (ADS)

    Glowinski, K.; Morawiec, A.

    2015-04-01

    Grain boundaries affect properties of polycrystals. Novel experimental techniques for three-dimensional orientation mapping give new opportunities for studies of this influence. Large networks of boundaries can be analyzed based on all five ’macroscopic’ boundary parameters. We demonstrate benefits of applying two methods for improving these analyses. The fractions of geometrically special boundaries in ferrite are estimated based on ’approximate’ distances to the nearest special boundaries; by using these parameters, the times needed for processing boundary data sets are shortened. Moreover, grain-boundary distributions for nickel are obtained using kernel density estimation; this approach leads to distribution functions more accurate than those obtained based on partition of the space into bins.
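
    Kernel density estimation of the kind used for the nickel grain-boundary distributions can be illustrated in one dimension with a Gaussian kernel. This is a generic sketch; the actual work estimates densities over the five macroscopic boundary parameters:

```python
import math

def kde(samples, xs, bandwidth):
    """Gaussian kernel density estimate evaluated at the grid points xs."""
    norm = 1.0 / (len(samples) * bandwidth * math.sqrt(2 * math.pi))
    return [norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                       for s in samples)
            for x in xs]

samples = [1.0, 1.2, 0.8, 3.0, 3.1]           # two clusters of observations
xs = [i * 0.05 for i in range(-40, 121)]      # grid from -2.0 to 6.0
dens = kde(samples, xs, bandwidth=0.3)
```

    Unlike a binned histogram, the estimate is smooth and does not depend on where bin edges fall, which is why kernel estimates tend to give more accurate distribution functions than a partition of the space into bins.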

  10. Visualizing large data sets in the earth sciences

    NASA Technical Reports Server (NTRS)

    Hibbard, William; Santek, David

    1989-01-01

    The authors describe the capabilities of McIDAS, an interactive visualization system that is vastly increasing the ability of earth scientists to manage and analyze data from remote sensing instruments and numerical simulation models. McIDAS provides animated three-dimensional images and highly interactive displays. The software can manage, analyze, and visualize large data sets that span many physical variables (such as temperature, pressure, humidity, and wind speed), as well as time and three spatial dimensions. The McIDAS system manages data from at least 100 different sources. The data management tools consist of data structures for storing different data types in files, libraries of routines for accessing these data structures, system commands for performing housekeeping functions on the data files, and reformatting programs for converting external data to the system's data structures. The McIDAS tools for three-dimensional visualization of meteorological data run on an IBM mainframe and can load up to 128-frame animation sequences into the workstations. A highly interactive version of the system can provide an interactive window into data sets containing tens of millions of points produced by numerical models and remote sensing instruments. The visualizations are being used for teaching as well as by scientists.

  11. Parallel Analysis Tools for Ultra-Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Jacob, Robert; Krishna, Jayesh; Xu, Xiabing; Mickelson, Sheri; Wilde, Mike; Peterson, Kara; Bochev, Pavel; Latham, Robert; Tautges, Tim; Brown, David; Brownrigg, Richard; Haley, Mary; Shea, Dennis; Huang, Wei; Middleton, Don; Schuchardt, Karen; Yin, Jian

    2013-04-01

    While climate models have used parallelism for several years, the post-processing tools are still mostly single-threaded applications and many are closed source. These tools are becoming a bottleneck in the production of new climate knowledge when they confront terabyte-sized output from high-resolution climate models. The ParVis project is using and creating Free and Open Source tools that bring data and task parallelism to climate model analysis to enable analysis of large climate data sets. ParVis is using the Swift task-parallel language to implement a diagnostic suite that generates over 600 plots of atmospheric quantities. ParVis has also created a Parallel Gridded Analysis Library (ParGAL) which implements many common climate analysis operations in a data-parallel fashion using the Message Passing Interface. ParGAL has in turn been built on sophisticated packages for describing grids in parallel (the Mesh Oriented database, MOAB), performing vector operations on arbitrary grids (Intrepid) and reading data in parallel (PnetCDF). ParGAL is being used to implement a parallel version of the NCAR Command Language (NCL) called ParNCL. ParNCL/ParGAL not only speeds up analysis of large datasets but also allows operations to be performed on native grids, eliminating the need to transform data to latitude-longitude grids. All of the tools ParVis is creating are available as free and open source software.
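
    The task-parallel pattern ParVis applies to its diagnostic suite can be sketched with a thread pool that runs one analysis task per variable. The field names and values below are invented stand-ins for model output; this is not the Swift or ParGAL code:

```python
from concurrent.futures import ThreadPoolExecutor

def mean(series):
    """Toy per-variable diagnostic: the time mean of one field."""
    return sum(series) / len(series)

# one time series per atmospheric variable (toy stand-in for model output)
fields = {
    "temperature": [288.0, 289.5, 287.2, 290.1],
    "pressure": [1013.0, 1009.5, 1011.2, 1012.3],
    "humidity": [0.62, 0.58, 0.66, 0.61],
}

# task parallelism: each variable's diagnostic runs as an independent task
with ThreadPoolExecutor(max_workers=3) as pool:
    results = dict(zip(fields, pool.map(mean, fields.values())))
```

    Because the diagnostics are independent, hundreds of plots can be dispatched this way; a real pipeline would replace the toy `mean` with a plotting task reading from disk.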

  12. Ready, Set, Science! Putting Research To Work In K-8 Classrooms

    NASA Astrophysics Data System (ADS)

    van der Veen, Wil E.; Moody, T.

    2008-05-01

    What types of instructional experiences help students learn and understand science? What do professional development providers and curriculum designers need to know to create and support such experiences? Ready, Set, Science! is a book that provides a practical and accessible account of current research about teaching and learning science. Based on the groundbreaking National Research Council report "Taking Science to School: Learning and Teaching Science in Grades K-8” (2006), the book reviews principles derived from the latest educational research and applies them to effective teaching practice. Ready, Set, Science! is a MUST READ for everyone involved in K-12 education, or creating products intended for K-12 use. We will review Ready, Set, Science!'s new vision of science in education, its most important recommendations, and its implications for the place of astronomy in K-12 classrooms. We will review some useful suggestions on how to make student thinking visible and report on how we have put this into practice with teachers. We will engage the audience in a brief interactive demonstration of specific questioning techniques described in the book that help to make student thinking visible.

  13. Implementing concept-based learning in a large undergraduate classroom.

    PubMed

    Morse, David; Jutras, France

    2008-01-01

    An experiment explicitly introducing learning strategies to a large, first-year undergraduate cell biology course was undertaken to see whether awareness and use of strategies had a measurable impact on student performance. The construction of concept maps was selected as the strategy to be introduced because of an inherent coherence with a course structured by concepts. Data were collected over three different semesters of an introductory cell biology course, all teaching similar course material with the same professor and all evaluated using similar examinations. The first group, used as a control, did not construct concept maps, the second group constructed individual concept maps, and the third group first constructed individual maps then validated their maps in small teams to provide peer feedback about the individual maps. Assessment of the experiment involved student performance on the final exam, anonymous polls of student perceptions, failure rate, and retention of information at the start of the following year. The main conclusion drawn is that concept maps without feedback have no significant effect on student performance, whereas concept maps with feedback produced a measurable increase in student problem-solving performance and a decrease in failure rates.

  14. Implementing Concept-based Learning in a Large Undergraduate Classroom

    PubMed Central

    Jutras, France

    2008-01-01

    An experiment explicitly introducing learning strategies to a large, first-year undergraduate cell biology course was undertaken to see whether awareness and use of strategies had a measurable impact on student performance. The construction of concept maps was selected as the strategy to be introduced because of an inherent coherence with a course structured by concepts. Data were collected over three different semesters of an introductory cell biology course, all teaching similar course material with the same professor and all evaluated using similar examinations. The first group, used as a control, did not construct concept maps, the second group constructed individual concept maps, and the third group first constructed individual maps then validated their maps in small teams to provide peer feedback about the individual maps. Assessment of the experiment involved student performance on the final exam, anonymous polls of student perceptions, failure rate, and retention of information at the start of the following year. The main conclusion drawn is that concept maps without feedback have no significant effect on student performance, whereas concept maps with feedback produced a measurable increase in student problem-solving performance and a decrease in failure rates. PMID:18519616

  15. The Incredible Years Teacher Classroom Management Program: Using Coaching to Support Generalization to Real-World Classroom Settings

    ERIC Educational Resources Information Center

    Reinke, Wendy M.; Stormont, Melissa; Webster-Stratton, Carolyn; Newcomer, Lori L.; Herman, Keith C.

    2012-01-01

    This article focuses on the Incredible Years Teacher Classroom Management Training (IY TCM) intervention as an example of an evidence-based program that embeds coaching within its design. First, the core features of the IY TCM program are described. Second, the IY TCM coaching model and processes utilized to facilitate high fidelity of…

  16. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
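
    The map-caching idea can be sketched with simple memoization: each map is rendered once, and repeat requests are served from the cache instead of being regenerated. The function and argument names below are illustrative, not the NCCV code:

```python
from functools import lru_cache

calls = 0  # counts how many maps were actually rendered

@lru_cache(maxsize=256)
def render_map(model, scenario, season):
    """Stand-in for an expensive map render; real work would read the
    compact summary files and rasterize. Repeats hit the cache."""
    global calls
    calls += 1
    return f"map:{model}/{scenario}/{season}"

render_map("CCSM4", "rcp85", "summer")
render_map("CCSM4", "rcp85", "summer")     # served from cache, no re-render
render_map("GFDL-CM3", "rcp45", "winter")
```

    Under high load most requests repeat a small set of popular model/scenario combinations, so a cache like this cuts map-generation work by orders of magnitude, which is the effect the abstract describes.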

  17. Large Data set management tools in European Grid middleware

    NASA Astrophysics Data System (ADS)

    Cozzini, S.

    2005-12-01

    Today, scientific computational tasks are becoming more and more data-intensive, and in the Grid computing environment it is therefore important to have tools that easily integrate storage resources providing high capacity, redundancy and scalability. Data movement and replication are also important functions to guarantee optimized access to files. Making storage resources shareable through Grid middleware technology requires that those systems expose a uniform standard interface. The need for a homogeneous, transparent interface to storage led to the definition of the Storage Resource Manager (SRM) interface; space management for any storage manager implementation thus looks the same to a client. The StoRM project, a joint collaboration between CNAF/INAF, the ICTP Egrid project and CERN, aims at implementing the SRM standard over parallel file systems like GPFS and Lustre. In this talk I will briefly discuss data management tools developed within the most important European Grid projects and shortly present StoRM and its features. Finally, I will discuss typical case studies in which large data sets produced by ab-initio simulations can be easily handled within computational grids with appropriate tools like those described above.

  18. Challenges Associated With Using Large Data Sets for Quality Assessment and Research in Clinical Settings.

    PubMed

    Cohen, Bevin; Vawdrey, David K; Liu, Jianfang; Caplan, David; Furuya, E Yoko; Mis, Frederick W; Larson, Elaine

    2015-08-01

    The rapidly expanding use of electronic records in health-care settings is generating unprecedented quantities of data available for clinical, epidemiological, and cost-effectiveness research. Several challenges are associated with using these data for clinical research, including issues surrounding access and information security, poor data quality, inconsistency of data within and across institutions, and a paucity of staff with expertise to manage and manipulate large clinical data sets. In this article, we describe our experience with assembling a data-mart and conducting clinical research using electronic data from four facilities within a single hospital network in New York City. We culled data from several electronic sources, including the institution's admission-discharge-transfer system, cost accounting system, electronic health record, clinical data warehouse, and departmental records. The final data-mart contained information for more than 760,000 discharges occurring from 2006 through 2012. Using categories identified by the National Institutes of Health Big Data to Knowledge initiative as a framework, we outlined challenges encountered during the development and use of a domain-specific data-mart and recommend approaches to overcome these challenges.

  19. Challenges Associated With Using Large Data Sets for Quality Assessment and Research in Clinical Settings

    PubMed Central

    Cohen, Bevin; Vawdrey, David K.; Liu, Jianfang; Caplan, David; Furuya, E. Yoko; Mis, Frederick W.; Larson, Elaine

    2015-01-01

    The rapidly expanding use of electronic records in health-care settings is generating unprecedented quantities of data available for clinical, epidemiological, and cost-effectiveness research. Several challenges are associated with using these data for clinical research, including issues surrounding access and information security, poor data quality, inconsistency of data within and across institutions, and a paucity of staff with expertise to manage and manipulate large clinical data sets. In this article, we describe our experience with assembling a data-mart and conducting clinical research using electronic data from four facilities within a single hospital network in New York City. We culled data from several electronic sources, including the institution’s admission-discharge-transfer system, cost accounting system, electronic health record, clinical data warehouse, and departmental records. The final data-mart contained information for more than 760,000 discharges occurring from 2006 through 2012. Using categories identified by the National Institutes of Health Big Data to Knowledge initiative as a framework, we outlined challenges encountered during the development and use of a domain-specific data-mart and recommend approaches to overcome these challenges. PMID:26351216

  1. An Exploration Tool for Very Large Spectrum Data Sets

    NASA Astrophysics Data System (ADS)

    Carbon, Duane F.; Henze, Christopher

    2015-01-01

    We present an exploration tool for very large spectrum data sets such as the SDSS, LAMOST, and 4MOST data sets. The tool works in two stages: the first uses batch processing and the second runs interactively. The latter employs the NASA hyperwall, a configuration of 128 workstation displays (8x16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. The stellar subset of the Sloan Digital Sky Survey DR10 was chosen to show how the tool may be used. In stage one, SDSS files for 569,738 stars are processed through our data pipeline. The pipeline fits each spectrum using an iterative continuum algorithm, distinguishing emission from absorption and handling molecular absorption bands correctly. It then measures 1659 discrete atomic and molecular spectral features that were carefully preselected based on their likelihood of being visible at some spectral type. The depths relative to the local continuum at each feature wavelength are determined for each spectrum: these depths, the local S/N level, and DR10-supplied variables such as magnitudes, colors, positions, and radial velocities are the basic measured quantities used on the hyperwall. In stage two, each hyperwall panel is used to display a 2-D scatter plot showing the depth of feature A vs the depth of feature B for all of the stars. A and B change from panel to panel. The relationships between the various (A,B) strengths and any distinctive clustering are immediately apparent when examining and inter-comparing the different panels on the hyperwall. The interactive software allows the user to select the stars in any interesting region of any 2-D plot on the hyperwall, immediately rendering the same stars on all the other 2-D plots in a unique color. The process may be repeated multiple times, each selection displaying a distinctive color on all the plots. At any time, the spectra of the selected stars may be examined in detail on a connected workstation display. We illustrate…
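
    The stage-one measurement of feature depths relative to a fitted continuum can be sketched as follows. The function name, the window width, and the minimum-flux rule are illustrative assumptions, not details of the actual SDSS pipeline.

```python
import numpy as np

def feature_depths(wavelengths, flux, continuum, feature_wls, window=2.0):
    """Measure feature depths relative to the local continuum.

    A simplified sketch of a stage-one measurement: for each preselected
    feature wavelength, take the normalized flux deficit over nearby pixels.
    The window parameter and depth rule are illustrative assumptions.
    """
    norm = flux / continuum  # continuum-normalized spectrum
    depths = []
    for wl in feature_wls:
        mask = np.abs(wavelengths - wl) <= window
        # depth > 0: absorption below continuum; depth < 0: emission above it
        depths.append(1.0 - norm[mask].min())
    return np.array(depths)
```

    Depths computed this way, together with S/N and catalog variables, would form the per-star vectors plotted on the hyperwall panels.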

  2. Trans-dimensional Bayesian inference for large sequential data sets

    NASA Astrophysics Data System (ADS)

    Mandolesi, E.; Dettmer, J.; Dosso, S. E.; Holland, C. W.

    2015-12-01

    This work develops a sequential Monte Carlo method to infer seismic parameters of layered seabeds from large sequential reflection-coefficient data sets. The approach provides parameter estimates and uncertainties along survey tracks with the goal to aid in the detection of unexploded ordnance in shallow water. The sequential data are acquired by a moving platform with source and receiver array towed close to the seabed. This geometry requires consideration of spherical reflection coefficients, computed efficiently by massively parallel implementation of the Sommerfeld integral via Levin integration on a graphics processing unit. The seabed is parametrized with a trans-dimensional model to account for changes in the environment (i.e. changes in layering) along the track. The method combines advanced Markov chain Monte Carlo methods (annealing) with particle filtering (resampling). Since data from closely-spaced source transmissions (pings) often sample similar environments, the solution from one ping can be utilized to efficiently estimate the posterior for data from subsequent pings. Since reflection-coefficient data are highly informative, the likelihood function can be extremely peaked, resulting in little overlap between posteriors of adjacent pings. This is addressed by adding bridging distributions (via annealed importance sampling) between pings for more efficient transitions. The approach assumes the environment to be changing slowly enough to justify the local 1D parametrization. However, bridging allows rapid changes between pings to be addressed and we demonstrate the method to be stable in such situations. Results are in terms of trans-D parameter estimates and uncertainties along the track. The algorithm is examined for realistic simulated data along a track and applied to a dataset collected by an autonomous underwater vehicle on the Malta Plateau, Mediterranean Sea. [Work supported by the SERDP, DoD.]
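
    The resampling step that particle filtering contributes to such a method can be sketched as below. This systematic-resampling routine is a generic illustration, not the authors' trans-dimensional sampler, and all names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def resample(particles, log_weights):
    """Systematic resampling step, as used in particle filtering.

    Generic sketch of the resampling component: particles with high
    likelihood are replicated, low-weight ones are dropped, and the
    weights are reset to uniform afterwards.
    """
    w = np.exp(log_weights - log_weights.max())  # stable normalization
    w /= w.sum()
    n = len(particles)
    # one uniform draw, stratified positions across [0, 1)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(w), positions)
    return particles[idx], np.zeros(n)  # uniform log-weights after resampling
```

    Between pings, annealed importance sampling would interpose bridging distributions before a step like this, so that the peaked posterior of one ping can seed the next.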

  3. Science Teacher Beliefs and Classroom Practice Related to Constructivism in Different School Settings

    NASA Astrophysics Data System (ADS)

    Savasci, Funda; Berlin, Donna F.

    2012-02-01

    Science teacher beliefs and classroom practice related to constructivism and factors that may influence classroom practice were examined in this cross-case study. Data from four science teachers in two schools included interviews, demographic questionnaire, Classroom Learning Environment Survey (preferred/perceived), and classroom observations and documents. Using an inductive analytic approach, results suggested that the teachers embraced constructivism, but classroom observations did not confirm implementation of these beliefs for three of the four teachers. The most preferred constructivist components were personal relevance and student negotiation; the most perceived component was critical voice. Shared control was the least preferred, least perceived, and least observed constructivist component. School type, grade, student behavior/ability, curriculum/standardized testing, and parental involvement may influence classroom practice.

  4. The Effects of a Teacher-Child Play Intervention on Classroom Compliance in Young Children in Child Care Settings

    ERIC Educational Resources Information Center

    Levine, Darren G.; Ducharme, Joseph M.

    2013-01-01

    The current study evaluated the effects of a teacher-conducted play intervention on preschool-aged children's compliance in child care settings. Study participants included 8 children ranging in age from 3 to 5 years and 5 early childhood education teachers within 5 classrooms across 5 child care centers. A combination ABAB and multiple baseline…

  5. A Case Based Analysis Preparation Strategy for Use in a Classroom Management for Inclusive Settings Course: Preliminary Observations

    ERIC Educational Resources Information Center

    Niles, William J.; Cohen, Alan

    2012-01-01

    Case based instruction (CBI) is a pedagogical option in teacher preparation growing in application but short on practical means to implement the method. This paper presents an analysis strategy and questions developed to help teacher trainees focus on classroom management issues embedded in a set of "real" cases. An analysis of teacher candidate…

  6. The Interests of Full Disclosure: Agenda-Setting and the Practical Initiation of the Feminist Classroom

    ERIC Educational Resources Information Center

    Seymour, Nicole

    2007-01-01

    Several theoretical and pragmatic questions arise when one attempts to employ feminist pedagogy in the classroom (or to study it), such as how to strike a balance between classroom order and instructor de-centering and how to productively address student resistance. In this article, the author describes how she took on her final project for a…

  7. Examining Play among Young Children in Single-Age and Multi-Age Preschool Classroom Settings

    ERIC Educational Resources Information Center

    Youhne, Mia Song

    2009-01-01

    Advocates for multi-age classrooms claim multi-age groupings benefit children (Brynes, Shuster, & Jones, 1994). Currently, there is a lack of research examining play among students in multi-age classrooms. If indeed there is a positive benefit of play among children, research is needed to examine these behaviors among and between young children in…

  8. Using Reading Classroom Explorer's Interactive Notebook: Student-Initiated Inquiries in a Collaborative Setting.

    ERIC Educational Resources Information Center

    Hughes, Joan E.; Packard, Becky Wai-Ling; Reischl, Catherine H.; Pearson, P. David

    The Reading Classroom Explorer (RCE), a hypermedia learning environment for teacher education, was developed in 1996. The environment contains searchable video clips of six exemplary teachers teaching reading, transcripts of classroom clips, questions to spur thinking, reference citations, and an interactive notebook. A study explored: what sorts…

  9. Classrooms that Work: Teaching Generic Skills in Academic and Vocational Settings.

    ERIC Educational Resources Information Center

    Stasz, Cathleen; And Others

    This report documents the second of two studies on teaching and learning generic skills in high schools. It extends the earlier work by providing a model for designing classroom instruction in both academic and vocational classrooms where teaching generic skills is an instructional goal. Ethnographic field methods were used to observe, record, and…

  10. Teaching and learning in an integrated curriculum setting: A case study of classroom practices

    NASA Astrophysics Data System (ADS)

    MacMath, Sheryl Lynn

    Curriculum integration, while a commonly used educational term, remains a challenging concept to define and examine both in research and in classroom practice. Numerous types and definitions of curriculum integration exist in educational research, while, in comparison, teachers tend to focus on curriculum integration simply as a mixing of subject areas. To better understand curriculum integration in practice, this thesis details a case study that examines both teacher and student perspectives regarding a grade nine integrated unit on energy. Set in a public secondary school in Ontario, Canada, I comprehensively describe and analyze teacher understandings of, and challenges with, the implementation of an integrated unit, while also examining student perspectives and academic learning. My participants consisted of two high school teachers, a geography teacher and a science teacher, and their twenty-three students. Using data gathered from interviews before, during, and after the implementation of a 16-lesson unit, as well as observations throughout, I completed a case description and thematic analysis. My results illustrate the importance of examining why teachers choose to implement an integrated unit and the planning and scheduling challenges that exist. In addition, while the students in this study were academically successful, clarification is needed regarding whether student success can be linked to the integration of these two subjects or the types of activities these two teachers utilized.

  11. Evidence for teaching practice: the impact of clickers in a large classroom environment.

    PubMed

    Patterson, Barbara; Kilpatrick, Judith; Woebkenberg, Eric

    2010-10-01

    As the number of nursing students increases, the ability to actively engage all students in a large classroom is challenging and increasingly difficult. Clickers, or student response systems (SRS), are a relatively new technology in nursing education that use wireless technology and enable students to select individual responses to questions posed to them during class. The study design was a quasi-experimental comparison with one section of an adult medical-surgical course using the SRS and one receiving standard teaching. No significant differences between groups on any measure of performance were found. Focus groups were conducted to describe student perceptions of SRS. Three themes emerged: Being able to respond anonymously, validating an answer while providing immediate feedback, and providing an interactive and engaging environment. Although the clickers did not improve learning outcomes as measured by objective testing, perceptions shared by students indicated an increased degree of classroom engagement. Future research needs to examine other potential outcome variables. PMID:20044180

  12. Connecting scientific research and classroom instruction: Developing authentic problem sets for the undergraduate organic chemistry curriculum

    NASA Astrophysics Data System (ADS)

    Raker, Jeffrey R.

    Reform efforts in science education have called for instructional methods and resources that mirror the practice of science. Few research and design methods for creating such materials have been documented in the literature. The purpose of this study was to develop problem sets for sophomore-level organic chemistry instruction. This research adapted an instructional design methodology from the science education literature for the creation of new curricular problem sets. The first phase of this study was to establish an understanding of current curricular problems in sophomore-level organic chemistry instruction. A sample of 792 problems was collected from four organic chemistry courses. These problems were assessed using three literature-reported problem typologies. Two of these problem typologies have previously been used to understand general chemistry problems; comparisons between general and organic chemistry problems were thus made. Data from this phase were used to develop a set of five problems for practicing organic chemists. The second phase of this study was to explore practicing organic chemists' experiences solving problems in the context of organic synthesis research. Eight practicing organic chemists were interviewed and asked to solve two to three of the problems developed in phase one of this research. These participants spoke of three problem types: project level, synthetic planning, and day-to-day. Three knowledge types (internal knowledge, knowledgeable others, and literature) were used in solving these problems in research practice and in the developed problems. A set of guiding factors and implications was derived from these data and the chemistry education literature for the conversion of the problems for practicing chemists to problems for undergraduate students. The five problems were subsequently converted accordingly. The third and last phase of this study was to explore undergraduate students' experiences solving problems in…

  13. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…

  14. Processing large remote sensing image data sets on Beowulf clusters

    USGS Publications Warehouse

    Steinwand, Daniel R.; Maddox, Brian; Beckmann, Tim; Schmidt, Gail

    2003-01-01

    High-performance computing is often concerned with the speed at which floating-point calculations can be performed. The architectures of many parallel computers and/or their network topologies are based on these investigations. Often, benchmarks resulting from these investigations are compiled with little regard to how a large dataset would move about in these systems. This part of the Beowulf study addresses that concern by looking at specific applications software and system-level modifications. Applications include an implementation of a smoothing filter for time-series data, a parallel implementation of the decision tree algorithm used in the Landcover Characterization project, a parallel Kriging algorithm used to fit point data collected in the field on invasive species to a regular grid, and modifications to the Beowulf project's resampling algorithm to handle larger, higher resolution datasets at a national scale. Systems-level investigations include a feasibility study on Flat Neighborhood Networks and modifications of that concept with Parallel File Systems.
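
    The data-decomposition pattern behind an application like the smoothing filter might look like the following sketch, where threads stand in for cluster nodes. The names, the chunking strategy, and the kernel are illustrative assumptions, not the study's implementation.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def smooth_chunk(chunk):
    """3-point moving average applied to one chunk of a time series."""
    kernel = np.array([0.25, 0.5, 0.25])
    return np.convolve(chunk, kernel, mode="same")

def parallel_smooth(series, workers=4):
    """Split a long series into chunks and smooth them concurrently.

    Minimal sketch of data decomposition across workers; threads stand in
    for cluster nodes, and chunk boundaries are handled naively (each
    chunk is smoothed independently, so edge values are approximate).
    """
    chunks = np.array_split(series, workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return np.concatenate(list(pool.map(smooth_chunk, chunks)))
```

    On a real cluster the key cost is moving the chunks to the nodes and the results back, which is exactly the data-movement concern the study raises about benchmark-driven designs.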

  15. Value-based customer grouping from large retail data sets

    NASA Astrophysics Data System (ADS)

    Strehl, Alexander; Ghosh, Joydeep

    2000-04-01

    In this paper, we propose OPOSSUM, a novel similarity-based clustering algorithm using constrained, weighted graph- partitioning. Instead of binary presence or absence of products in a market-basket, we use an extended 'revenue per product' measure to better account for management objectives. Typically the number of clusters desired in a database marketing application is only in the teens or less. OPOSSUM proceeds top-down, which is more efficient and takes a small number of steps to attain the desired number of clusters as compared to bottom-up agglomerative clustering approaches. OPOSSUM delivers clusters that are balanced in terms of either customers (samples) or revenue (value). To facilitate data exploration and validation of results we introduce CLUSION, a visualization toolkit for high-dimensional clustering problems. To enable closed loop deployment of the algorithm, OPOSSUM has no user-specified parameters. Thresholding heuristics are avoided and the optimal number of clusters is automatically determined by a search for maximum performance. Results are presented on a real retail industry data-set of several thousand customers and products, to demonstrate the power of the proposed technique.
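
    The 'revenue per product' measure the abstract contrasts with binary market baskets can be illustrated as a simple cosine similarity over per-customer revenue vectors. This sketch omits OPOSSUM's constrained, weighted graph partitioning and balancing steps and is not the authors' implementation.

```python
import numpy as np

def revenue_similarity(revenue):
    """Cosine similarity between customers from per-product revenue vectors.

    Sketch of an extended 'revenue per product' measure in place of binary
    basket presence/absence; the partitioning and balancing machinery of
    OPOSSUM is not reproduced here.
    """
    revenue = np.asarray(revenue, dtype=float)
    norms = np.linalg.norm(revenue, axis=1, keepdims=True)
    unit = revenue / np.where(norms == 0, 1.0, norms)  # avoid divide-by-zero
    return unit @ unit.T  # customers x customers, entries in [0, 1]
```

    A similarity matrix like this would form the edge weights of the graph that OPOSSUM then partitions into balanced clusters.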

  16. The Single and Combined Effects of Multiple Intensities of Behavior Modification and Methylphenidate for Children with Attention Deficit Hyperactivity Disorder in a Classroom Setting

    ERIC Educational Resources Information Center

    Fabiano, Gregory A.; Pelham, William E., Jr.; Gnagy, Elizabeth M.; Burrows-MacLean, Lisa; Coles, Erika K.; Chacko, Anil; Wymbs, Brian T.; Walker, Kathryn S.; Arnold, Fran; Garefino, Allison; Keenan, Jenna K.; Onyango, Adia N.; Hoffman, Martin T.; Massetti, Greta M.; Robb, Jessica A.

    2007-01-01

    Currently behavior modification, stimulant medication, and combined treatments are supported as evidence-based interventions for attention deficit hyperactivity disorder in classroom settings. However, there has been little study of the relative effects of these two modalities and their combination in classrooms. Using a within-subject design, the…

  17. Increasing the Writing Performance of Urban Seniors Placed At-Risk through Goal-Setting in a Culturally Responsive and Creativity-Centered Classroom

    ERIC Educational Resources Information Center

    Estrada, Brittany; Warren, Susan

    2014-01-01

    Efforts to support marginalized students require not only identifying systemic inequities, but providing a classroom infrastructure that supports the academic achievement of all students. This action research study examined the effects of implementing goal-setting strategies and emphasizing creativity in a culturally responsive classroom (CRC) on…

  18. Turkish Pre-Service Teachers' Perceived Self-Efficacy Beliefs and Knowledge about Using Expository Text as an Instructional Tool in Their Future Classroom Settings

    ERIC Educational Resources Information Center

    Yildirim, Kasim; Ates, Seyit

    2012-01-01

    The aim of this research was to examine Turkish pre-service teachers' knowledge and perceived self-efficacy beliefs toward using expository text as an instructional tool in their future classroom settings. The research sample comprised 346 pre-service teachers who studied in different teacher preparation programs, which included elementary classroom and…

  19. Classroom Management Strategies for Young Children with Challenging Behavior within Early Childhood Settings

    ERIC Educational Resources Information Center

    Jolivette, Kristine; Steed, Elizabeth A.

    2010-01-01

    Many preschool, Head Start, and kindergarten educators of young children express concern about the number of children who exhibit frequent challenging behaviors and report that managing these behaviors is difficult within these classrooms. This article describes research-based strategies with practical applications that can be used as part of…

  20. Making Room for Group Work I: Teaching Engineering in a Modern Classroom Setting

    ERIC Educational Resources Information Center

    Wilkens, Robert J.; Ciric, Amy R.

    2005-01-01

    This paper describes the results of several teaching experiments in the teaching Studio of The University of Dayton's Learning-Teaching Center. The Studio is a state-of-the-art classroom with flexible seating arrangements and movable whiteboards and corkboards for small group discussions. The Studio has a communications system with a TV/VCR…

  1. The Attributes of an Effective Teacher Differ between the Classroom and the Clinical Setting

    ERIC Educational Resources Information Center

    Haws, Jolene; Rannelli, Luke; Schaefer, Jeffrey P.; Zarnke, Kelly; Coderre, Sylvain; Ravani, Pietro; McLaughlin, Kevin

    2016-01-01

    Most training programs use learners' subjective ratings of their teachers as the primary measure of teaching effectiveness. In a recent study we found that preclinical medical students' ratings of classroom teachers were associated with perceived charisma and physical attractiveness of the teacher, but not intellect. Here we explored whether the…

  2. The Timing of Feedback on Mathematics Problem Solving in a Classroom Setting

    ERIC Educational Resources Information Center

    Fyfe, Emily R.; Rittle-Johnson, Bethany

    2015-01-01

    Feedback is a ubiquitous learning tool that is theorized to help learners detect and correct their errors. The goal of this study was to examine the effects of feedback in a classroom context for children solving math equivalence problems (problems with operations on both sides of the equal sign). The authors worked with children in 7 second-grade…

  3. By What Token Economy? A Classroom Learning Tool for Inclusive Settings.

    ERIC Educational Resources Information Center

    Anderson, Carol; Katsiyannis, Antonis

    1997-01-01

    Describes a token economy that used tokens styled as license plates to elicit appropriate behavior in an inclusive fifth-grade class in which four students with behavior disorders were enrolled. Student involvement in establishing the "driving rules" of the classroom is explained, the components of a token economy are outlined, and steps for group…

  4. Mobile Learning in a Large Blended Computer Science Classroom: System Function, Pedagogies, and Their Impact on Learning

    ERIC Educational Resources Information Center

    Shen, Ruimin; Wang, Minjuan; Gao, Wanping; Novak, D.; Tang, Lin

    2009-01-01

    The computer science classes in China's institutions of higher education often have large numbers of students. In addition, many institutions offer "blended" classes that include both on-campus and online students. These large blended classrooms have long suffered from a lack of interactivity. Many online classes simply provide recorded instructor…

  5. Service user involvement in pre-registration mental health nurse education classroom settings: a review of the literature.

    PubMed

    Terry, J

    2012-11-01

    Service user involvement in pre-registration nurse education is now a requirement, yet little is known about how students engage with users in the classroom, how such initiatives are being evaluated, how service users are prepared themselves to teach students, or the potential influence on clinical practice. The aim of this literature review was to bring together published articles on service user involvement in classroom settings in pre-registration mental health nurse education programmes, including their evaluations. A comprehensive review of the literature was carried out via computer search engines and the Internet, as well as a hand search of pertinent journals and references. This produced eight papers that fitted the inclusion criteria, comprising four empirical studies and four review articles, which were then reviewed using a seven-item checklist. The articles revealed a range of teaching and learning strategies had been employed, ranging from exposure to users' personal stories, to students being required to demonstrate awareness of user perspectives in case study presentations, with others involving eLearning and assessment skills initiatives. This review concludes that further longitudinal research is needed to establish the influence of user involvement in the classroom over time. PMID:22296494

  6. Do emotional support and classroom organization earlier in the year set the stage for higher quality instruction?

    PubMed

    Curby, Timothy W; Rimm-Kaufman, Sara E; Abry, Tashia

    2013-10-01

    Many teachers believe that providing greater emotional and organizational supports in the beginning of the year strengthens their ability to teach effectively as the year progresses. Some interventions, such as the Responsive Classroom (RC) approach, explicitly embed this sequence into professional development efforts. We tested the hypothesis that earlier emotional and organizational supports set the stage for improved instruction later in the year in a sample of third- and fourth-grade teachers enrolled in a randomized controlled trial of the RC approach. Further, we examined the extent to which the model generalized for teachers using varying levels of RC practices as well as whether or not teachers were in the intervention or control groups. Teachers' emotional, organizational, and instructional interactions were observed using the Classroom Assessment Scoring System (Pianta, La Paro, & Hamre, 2008) on five occasions throughout the year. Results indicated a reciprocal relation between emotional and instructional supports. Specifically, higher levels of emotional support earlier in the year predicted higher instructional support later in the year. Also, higher levels of instructional support earlier in the year predicted higher emotional support later in the year. Classroom organization was not found to have longitudinal associations with the other domains across a year. This pattern was robust when controlling for the use of RC practices as well as across intervention and control groups. Further, teachers' use of RC practices predicted higher emotional support and classroom organization throughout the year, suggesting the malleability of this teacher characteristic. Discussion highlights the connection between teachers' emotional and instructional supports and how the use of RC practices improves teachers' emotionally supportive interactions with students.

  7. Engaging millennial learners: Effectiveness of personal response system technology with nursing students in small and large classrooms.

    PubMed

    Revell, Susan M Hunter; McCurry, Mary K

    2010-05-01

    Nurse educators must explore innovative technologies that make the most of the characteristics and learning styles of millennial learners. These students are comfortable with technology and prefer interactive classrooms with individual feedback and peer collaboration. This study evaluated the perceived effectiveness of personal response system (PRS) technology in enhancing student learning in small and large classrooms. PRS technology was integrated into two undergraduate courses, nursing research (n = 33) and junior medical-surgical nursing (n = 116). Multiple-choice, true-false, NCLEX-RN alternate format, and reading quiz questions were incorporated within didactic PowerPoint presentations. Data analysis of Likert-type and open-response questions supported the use of PRS technology as an effective strategy for educating millennial learners in both small and large classrooms. PRS technology promotes active learning, increases participation, and provides students and faculty with immediate feedback that reflects comprehension of content and increases faculty-student interaction.

  8. Response Grids: Practical Ways to Display Large Data Sets with High Visual Impact

    ERIC Educational Resources Information Center

    Gates, Simon

    2013-01-01

    Spreadsheets are useful for large data sets but they may be too wide or too long to print as conventional tables. Response grids offer solutions to the challenges posed by any large data set. They have wide application throughout science and for every subject and context where visual data displays are designed, within education and elsewhere.…

  9. Feasibility and Acceptability of Adapting the Eating in the Absence of Hunger Assessment for Preschoolers in the Classroom Setting.

    PubMed

    Soltero, Erica G; Ledoux, Tracey; Lee, Rebecca E

    2015-12-01

    Eating in the Absence of Hunger (EAH) represents a failure to self-regulate intake leading to overconsumption. Existing research on EAH has come from the clinical setting, limiting our understanding of this behavior. The purpose of this study was to describe the adaptation of the clinical EAH paradigm for preschoolers to the classroom setting and evaluate the feasibility and acceptability of measuring EAH in the classroom. The adapted protocol was implemented in childcare centers in Houston, Texas (N=4) and Phoenix, Arizona (N=2). The protocol was feasible, economical, and time efficient, eliminating previously identified barriers to administering the EAH assessment such as limited resources and the time constraint of delivering the assessment to participants individually. Implementation challenges included difficulty in choosing palatable test snacks that were in compliance with childcare center food regulations and the limited control over the meal that was administered prior to the assessment. The adapted protocol will allow for broader use of the EAH assessment and encourage researchers to incorporate the assessment into longitudinal studies in order to further our understanding of the causes and emergence of EAH.

  10. Silent and Vocal Students in a Large Active Learning Chemistry Classroom: Comparison of Performance and Motivational Factors

    ERIC Educational Resources Information Center

    Obenland, Carrie A.; Munson, Ashlyn H.; Hutchinson, John S.

    2013-01-01

    Active learning is becoming more prevalent in large science classrooms, and this study shows the impact on performance of being vocal during Socratic questioning in a General Chemistry course. 800 college students over a two year period were given a pre and post-test using the Chemistry Concept Reasoning Test. The pre-test results showed that…

  11. Mobile-Phone-Based Classroom Response Systems: Students' Perceptions of Engagement and Learning in a Large Undergraduate Course

    ERIC Educational Resources Information Center

    Dunn, Peter K.; Richardson, Alice; Oprescu, Florin; McDonald, Christine

    2013-01-01

    Using a Classroom Response System (CRS) has been associated with positive educational outcomes, by fostering student engagement and by allowing immediate feedback to both students and instructors. This study examined a low-cost CRS (VotApedia) in a large first-year class, where students responded to questions using their mobile phones. This study…

  12. Observing physical education teachers' need-supportive interactions in classroom settings.

    PubMed

    Haerens, Leen; Aelterman, Nathalie; Van den Berghe, Lynn; De Meyer, Jotie; Soenens, Bart; Vansteenkiste, Maarten

    2013-02-01

    According to self-determination theory, teachers can motivate students by supporting their psychological needs for relatedness, competence, and autonomy. The present study complements extant research (most of which relied on self-report measures) by relying on observations of need-supportive teaching in the domain of physical education (PE), which allows for the identification of concrete, real-life examples of how teacher need support manifests in the classroom. Seventy-four different PE lessons were coded for 5-min intervals to assess the occurrence of 21 need-supportive teaching behaviors. Factor analyses provided evidence for four interpretable factors, namely, relatedness support, autonomy support, and two components of structure (structure before and during the activity). Reasonable evidence was obtained for convergence between observed and student perceived need support. Yet, the low interrater reliability for two of the four scales indicates that these scales need further improvement.
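
    The interrater-reliability concern raised above is typically quantified with an agreement statistic such as Cohen's kappa. A minimal sketch follows, assuming two coders rated the same 5-min intervals; the study's exact statistic is not specified in the abstract.

```python
import numpy as np

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two coders rating the same observation intervals.

    Illustrative agreement statistic: observed agreement corrected for the
    agreement expected by chance. The study's actual coding scheme and
    reliability statistic may differ.
    """
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    categories = np.union1d(a, b)
    po = np.mean(a == b)  # observed proportion of agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in categories)  # chance
    return (po - pe) / (1.0 - pe)
```

    Values near 1 indicate strong agreement; values near 0 indicate agreement no better than chance, which is the situation flagged for two of the four observation scales.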

  13. Safety and science at sea: connecting science research settings to the classroom through live video

    NASA Astrophysics Data System (ADS)

    Cohen, E.; Peart, L. W.

    2011-12-01

    Many science teachers start the year off with classroom safety topics. Annual repetition helps with mastery of this important and basic knowledge, while helping schools to meet their legal obligations for safe lab science. Although these lessons are necessary, they are often topical, rarely authentic, and relatively dull. Interesting connections can, however, be drawn between the importance of safety in science classrooms and the importance of safety in academic laboratories, fieldwork, shipboard research, and commercial research. Teachers can leverage these connections through live video interactions with scientists in the field, thereby creating an authentic learning environment. During the School of Rock 2009, a professional teacher research experience aboard the Integrated Ocean Drilling Program's research vessel JOIDES Resolution, safety and nature-of-science curricula were created to help address this need. By experimenting with various topics and locations on the ship that were accessible and applicable to middle school learning, 43 highly visual "safety signs" and activities were identified and presented "live" by graduate students, teachers, and scientists, as well as the ship's mates, doctor, and technical staff. Students were exposed to realistic science process skills along with safety content from the world's only riserless, deep-sea drilling research vessel. The once-in-a-lifetime experience caused the students' eyes to brighten behind their safety glasses, especially as they recognized the same eyewash station and safety gear they have to wear and attended a ship's fire and safety drill alongside scientists in hard hats and personal flotation devices. This collaborative and replicable live video approach will connect basic safety content and nature-of-science process skills for a memorable and authentic learning experience for students.

  14. Effective Extensive Reading outside the Classroom: A Large-Scale Experiment

    ERIC Educational Resources Information Center

    Robb, Thomas; Kano, Makimi

    2013-01-01

    We report on a large-scale implementation of extensive reading (ER) in a university setting in Japan where all students were required to read outside class time as part of their course requirement. A pre/posttest comparison between the 2009 cohort of students who read outside of class and the 2008 cohort who did no outside reading shows that the…

  15. Zebrafish Expression Ontology of Gene Sets (ZEOGS): a tool to analyze enrichment of zebrafish anatomical terms in large gene sets.

    PubMed

    Prykhozhij, Sergey V; Marsico, Annalisa; Meijsing, Sebastiaan H

    2013-09-01

    The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas few or no enriched terms were found in the random gene sets. Next, we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene expression…
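
    The core computation described above, testing whether an anatomical term is overrepresented in an input gene set relative to a background, can be sketched as a hypergeometric overrepresentation test. This is a minimal illustration of the statistical idea, not ZEOGS itself; the function names and the toy gene-to-term mapping stand in for the ZFIN-derived annotations the tool actually uses.

    ```python
    from math import comb
    from collections import defaultdict

    def hypergeom_sf(k, M, n, N):
        """Upper-tail probability P(X >= k) of drawing at least k term-annotated
        genes when sampling N genes from a background of M, n of which carry the term."""
        return sum(comb(n, i) * comb(M - n, N - i)
                   for i in range(k, min(n, N) + 1)) / comb(M, N)

    def term_enrichment(gene_set, annotations, background):
        """Return {term: p_value} for anatomical terms overrepresented in gene_set.
        `annotations` maps each gene to the set of terms it is annotated with."""
        M, N = len(background), len(gene_set)
        term_bg, term_hits = defaultdict(int), defaultdict(int)
        for gene in background:
            for term in annotations.get(gene, ()):
                term_bg[term] += 1
                if gene in gene_set:
                    term_hits[term] += 1
        return {term: hypergeom_sf(k, M, term_bg[term], N)
                for term, k in term_hits.items()}
    ```

    With a toy background of 20 genes in which 5 are annotated "heart", an input set containing 4 of those 5 genes yields a small p-value (about 0.001), flagging "heart" as enriched.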

  16. The Distracting Effects of a Ringing Cell Phone: An Investigation of the Laboratory and the Classroom Setting

    PubMed Central

    Shelton, Jill T.; Elliott, Emily M.; Lynn, Sharon D.; Exner, Amanda L.

    2010-01-01

    The detrimental effects of a ringing phone on cognitive performance were investigated in four experiments. In Experiments 1 and 2, the effects of different types of sounds (a standard cell phone ring, irrelevant tones and an instrumental song commonly encountered by participants) on performance were examined. In Experiment 1, slower responses were observed in all auditory groups relative to a silence condition, but participants in the ring and song conditions recovered more slowly. In Experiment 2, participants who were warned about the potential for distraction recovered more quickly, suggesting a benefit of this prior knowledge. This investigation continued in a college classroom setting (Experiments 3a and 3b); students were exposed to a ringing cell phone during the lecture. Performance on a surprise quiz revealed low accuracy rates on material presented while the phone was ringing. These findings offer insight into top-down cognitive processes that moderate involuntary orienting responses associated with a common stimulus encountered in the environment. PMID:21234286

  17. Assessing the Effectiveness of Inquiry-based Learning Techniques Implemented in Large Classroom Settings

    NASA Astrophysics Data System (ADS)

    Steer, D. N.; McConnell, D. A.; Owens, K.

    2001-12-01

    Geoscience and education faculty at The University of Akron jointly developed a series of inquiry-based learning modules aimed at both non-major and major student populations enrolled in introductory geology courses. These courses typically serve 2500 students per year in four to six sections of 40-160 students each. Twelve modules were developed that contained common topics and assessments appropriate to Earth Science, Environmental Geology and Physical Geology classes. All modules were designed to meet four primary learning objectives agreed upon by Department of Geology faculty. These major objectives include: 1) Improvement of student understanding of the scientific method; 2) Incorporation of problem solving strategies involving analysis, synthesis, and interpretation; 3) Development of the ability to distinguish between inferences, data and observations; and 4) Obtaining an understanding of basic processes that operate on Earth. Additional objectives that may be addressed by selected modules include: 1) The societal relevance of science; 2) Use and interpretation of quantitative data to better understand the Earth; 3) Development of the students' ability to communicate scientific results; 4) Distinguishing differences between science, religion and pseudo-science; 5) Evaluation of scientific information found in the mass media; and 6) Building interpersonal relationships through in-class group work. Student pre- and post-instruction progress was evaluated by administering a test of logical thinking, an attitude toward science survey, and formative evaluations. Scores from the logical thinking instrument were used to form balanced four-person working groups based on the students' incoming cognitive level. Groups were required to complete a series of activities and/or exercises that targeted different cognitive domains based upon Bloom's taxonomy (knowledge, comprehension, application, analysis, synthesis and evaluation of information). 
Daily assessments of knowledge-level learning included evaluations of student responses to pre- and post-instruction conceptual test questions, short group exercises and content-oriented exam questions. Higher level thinking skills were assessed when students completed exercises that required the completion of Venn diagrams, concept maps and/or evaluation rubrics both during class periods and on exams. Initial results indicate that these techniques improved student attendance significantly and improved overall retention in the course by 8-14% over traditional lecture formats. Student scores on multiple choice exam questions were slightly higher (1-3%) for students taught in the active learning environment and short answer questions showed larger gains (7%) over students' scores in a more traditional class structure.

  18. Toddler Subtraction with Large Sets: Further Evidence for an Analog-Magnitude Representation of Number

    ERIC Educational Resources Information Center

    Slaughter, Virginia; Kamppi, Dorian; Paynter, Jessica

    2006-01-01

    Two experiments were conducted to test the hypothesis that toddlers have access to an analog-magnitude number representation that supports numerical reasoning about relatively large numbers. Three-year-olds were presented with subtraction problems in which initial set size and proportions subtracted were systematically varied. Two sets of cookies…

  19. An Academic Approach to Stress Management for College Students in a Conventional Classroom Setting.

    ERIC Educational Resources Information Center

    Carnahan, Robert E.; And Others

    Since the identification of stress and the relationship of individual stress responses to physical and mental health, medical and behavioral professionals have been training individuals in coping strategies. To investigate the possibility of teaching cognitive coping skills to a nonclinical population in an academic setting, 41 college students…

  20. BEST in CLASS: A Classroom-Based Model for Ameliorating Problem Behavior in Early Childhood Settings

    ERIC Educational Resources Information Center

    Vo, Abigail; Sutherland, Kevin S.; Conroy, Maureen A.

    2012-01-01

    As more young children enter school settings to attend early childhood programs, early childhood teachers and school psychologists have been charged with supporting a growing number of young children with chronic problem behaviors that put them at risk for the development of emotional/behavioral disorders (EBDs). There is a need for effective,…

  1. Intercultural Education Set Forward: Operational Strategies and Procedures in Cypriot Classrooms

    ERIC Educational Resources Information Center

    Hajisoteriou, Christina

    2012-01-01

    Teachers in Cyprus are being called upon for the first time to teach within culturally diverse educational settings. Given the substantial role teachers play in the implementation of intercultural education, this paper explores the intercultural strategies and procedures adopted by primary school teachers in Cyprus. Interviews were carried out…

  2. A Classroom Exercise in Spatial Analysis Using an Imaginary Data Set.

    ERIC Educational Resources Information Center

    Kopaska-Merkel, David C.

    One skill that elementary students need to acquire is the ability to analyze spatially distributed data. In this activity students are asked to complete the following tasks: (1) plot a set of data (related to "mud-sharks"--an imaginary fish) on a map of the state of Alabama, (2) identify trends in the data, (3) make graphs using the data…

  3. Setting Up an SSR Program in the Foreign Language Classroom: Some Questions and Answers

    ERIC Educational Resources Information Center

    Greenewald, M. Jane

    1978-01-01

    A Sustained Silent Reading (SSR) program provides a way to upgrade reading instruction by stressing self-selection, independent practice, and the use of reading as a communication tool. Guidelines are offered for setting up SSR programs, a resource library, and scheduling SSR periods. (Author/SW)

  4. Toddler subtraction with large sets: further evidence for an analog-magnitude representation of number.

    PubMed

    Slaughter, Virginia; Kamppi, Dorian; Paynter, Jessica

    2006-01-01

    Two experiments were conducted to test the hypothesis that toddlers have access to an analog-magnitude number representation that supports numerical reasoning about relatively large numbers. Three-year-olds were presented with subtraction problems in which initial set size and proportions subtracted were systematically varied. Two sets of cookies were presented and then covered. The experimenter visibly subtracted cookies from the hidden sets, and the children were asked to choose which of the resulting sets had more. In Experiment 1, performance was above chance when high proportions of objects (3 versus 6) were subtracted from large sets (of 9), and for the subset of older participants (older than 3 years, 5 months; n = 15) performance was also above chance when high proportions (10 versus 20) were subtracted from the very large sets (of 30). In Experiment 2, which was conducted exclusively with older 3-year-olds and incorporated an important methodological control, the pattern of results for the subtraction tasks was replicated. In both experiments, success on the tasks was not related to counting ability. The results of these experiments support the hypothesis that young children have access to an analog-magnitude system for representing large approximate quantities, as performance on these subtraction tasks showed a Weber's Law signature and was independent of conventional number knowledge.

  5. Impact of Abbreviated Lecture with Interactive Mini-cases vs Traditional Lecture on Student Performance in the Large Classroom

    PubMed Central

    Nykamp, Diane L.; Momary, Kathryn M.

    2014-01-01

    Objective. To compare the impact of 2 different teaching and learning methods on student mastery of learning objectives in a pharmacotherapy module in the large classroom setting. Design. Two teaching and learning methods were implemented and compared in a required pharmacotherapy module for 2 years. The first year, multiple interactive mini-cases with in-class individual assessment and an abbreviated lecture were used to teach osteoarthritis; a traditional lecture with 1 in-class case discussion was used to teach gout. In the second year, the same topics were used but the methods were flipped. Student performance on pre/post individual readiness assessment tests (iRATs), case questions, and subsequent examinations were compared each year by the teaching and learning method and then between years by topic for each method. Students also voluntarily completed a 20-item evaluation of the teaching and learning methods. Assessment. Postpresentation iRATs were significantly higher than prepresentation iRATs for each topic each year with the interactive mini-cases; there was no significant difference in iRATs before and after traditional lecture. For osteoarthritis, postpresentation iRATs after interactive mini-cases in year 1 were significantly higher than postpresentation iRATs after traditional lecture in year 2; the difference in iRATs for gout per learning method was not significant. The difference between examination performance for osteoarthritis and gout was not significant when the teaching and learning methods were compared. On the student evaluations, 2 items were significant both years when answers were compared by teaching and learning method. Each year, students ranked their class participation higher with interactive cases than with traditional lecture, but both years they reported enjoying the traditional lecture format more. Conclusion. Multiple interactive mini-cases with an abbreviated lecture improved immediate mastery of learning objectives compared to…

  6. Using Mobile Phones to Increase Classroom Interaction

    ERIC Educational Resources Information Center

    Cobb, Stephanie; Heaney, Rose; Corcoran, Olivia; Henderson-Begg, Stephanie

    2010-01-01

    This study examines the possible benefits of using mobile phones to increase interaction and promote active learning in large classroom settings. First year undergraduate students studying Cellular Processes at the University of East London took part in a trial of a new text-based classroom interaction system and evaluated their experience by…

  7. Interdependent Learning in an Open Classroom Setting: Dean Rusk Elementary School, 1972-73. Research and Development Report, Volume 7, Number 7, August 1973.

    ERIC Educational Resources Information Center

    Goettee, Margaret

    All special programs at Dean Rusk Elementary School, funded in part under Title I of the 1965 Elementary Secondary Education Act, combined to facilitate individualized instruction in the nongraded, open classroom setting of the school. To better meet the needs of the pupils during the 1972-73 school year, the Follow Through Program included, for…

  8. Out in the Classroom: Transgender Student Experiences at a Large Public University

    ERIC Educational Resources Information Center

    Pryor, Jonathan T.

    2015-01-01

    Faculty and peer interactions are 2 of the most important relationships for college students to foster (Astin, 1993). Transgender college students have an increasing visible presence on college campuses (Pusch, 2005), yet limited research exists on their experiences and struggles in the classroom environment (Garvey & Rankin, 2015; Renn,…

  9. Secondary Data Analysis of Large Data Sets in Urology: Successes and Errors to Avoid

    PubMed Central

    Schlomer, Bruce J.; Copp, Hillary L.

    2014-01-01

    Purpose Secondary data analysis is the use of data collected for research by someone other than the investigator. In the last several years there has been a dramatic increase in the number of these studies being published in urological journals and presented at urological meetings, especially involving secondary data analysis of large administrative data sets. Along with this expansion, skepticism of secondary data analysis studies has increased among many urologists. Materials and Methods In this narrative review we discuss the types of large data sets that are commonly used for secondary data analysis in urology, and discuss the advantages and disadvantages of secondary data analysis. A literature search was performed to identify urological secondary data analysis studies published since 2008 using commonly used large data sets, and examples of high quality studies published in high impact journals are given. We outline an approach for performing a successful hypothesis or goal driven secondary data analysis study and highlight common errors to avoid. Results More than 350 secondary data analysis studies using large data sets have been published on urological topics since 2008, with likely many more studies presented at meetings but never published. Studies that were not hypothesis or goal driven have likely made up some of this output and probably contributed to the increased skepticism of this type of research. However, many high quality, hypothesis driven studies addressing research questions that would have been difficult to conduct with other methods have been performed in the last few years. Conclusions Secondary data analysis is a powerful tool that can address questions which could not be adequately studied by another method. Knowledge of the limitations of secondary data analysis and of the data sets used is critical for a successful study. There are also important errors to avoid when planning and performing a secondary data analysis study. Investigators…

  10. Experiments and other methods for developing expertise with design of experiments in a classroom setting

    NASA Technical Reports Server (NTRS)

    Patterson, John W.

    1990-01-01

    The only way to gain genuine expertise in Statistical Process Control (SPC) and the design of experiments (DOX) is with repeated practice, but not on canned problems with dead data sets. Rather, one must negotiate a wide variety of problems, each with its own peculiarities and its own constantly changing data. The problems should not be of the type for which there is a single, well-defined answer that can be looked up in a fraternity file or in some text. The problems should match as closely as possible the open-ended types for which there is always an abundance of uncertainty. These are the only kinds that arise in real research, whether that be basic research in academe or engineering research in industry. To gain this kind of experience, either as a professional consultant or as an industrial employee, takes years. Vast amounts of money, not to mention careers, must be put at risk. The purpose here is to outline some realistic simulation-type lab exercises that are so simple and inexpensive to run that the students can repeat them as often as desired at virtually no cost. Simulations also allow the instructor to design problems whose outcomes are as noisy as desired but still predictable within limits. Also, the instructor and the students can learn a great deal more from the postmortem conducted after the exercise is completed. One never knows for sure what the true data should have been when dealing only with real-life experiments. To add a bit more realism to the exercises, it is sometimes desirable to make the students pay for each experimental result from a make-believe budget allocation for the problem.
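
    A cheap, repeatable simulation exercise of the kind described above can be built by hiding a "true" response model behind a noisy function that students probe one run at a time. The two-factor model below is hypothetical (the paper does not specify one); the instructor sets the noise level and keeps the true coefficients for the postmortem.

    ```python
    import random

    def run_experiment(x1, x2, noise=1.0, rng=random):
        """Return one noisy observation of a hidden two-factor response surface.
        The 'true' model (known only to the instructor) has two main effects and
        an interaction; `noise` is the standard deviation of the added error."""
        true_response = 10 + 3 * x1 - 2 * x2 + 0.5 * x1 * x2
        return true_response + rng.gauss(0, noise)
    ```

    Students might run a replicated 2x2 factorial design at x = ±1, estimate the effects from their noisy results, and pay a make-believe budget charge per call; afterward the instructor reveals the coefficients so the estimates can be compared against the truth.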

  11. Protein Identification False Discovery Rates for Very Large Proteomics Data Sets Generated by Tandem Mass Spectrometry*

    PubMed Central

    Reiter, Lukas; Claassen, Manfred; Schrimpf, Sabine P.; Jovanovic, Marko; Schmidt, Alexander; Buhmann, Joachim M.; Hengartner, Michael O.; Aebersold, Ruedi

    2009-01-01

    Comprehensive characterization of a proteome is a fundamental goal in proteomics. To achieve saturation coverage of a proteome or specific subproteome via tandem mass spectrometric identification of tryptic protein sample digests, proteomics data sets are growing dramatically in size and heterogeneity. The trend toward very large integrated data sets poses as yet unsolved challenges in controlling the uncertainty of protein identifications, going beyond well-established confidence measures for peptide-spectrum matches. We present MAYU, a novel strategy that reliably estimates false discovery rates for protein identifications in large scale data sets. We validated and applied MAYU using various large proteomics data sets. The data show that the size of the data set has an important and previously underestimated impact on the reliability of protein identifications. In particular, we found that protein false discovery rates are significantly elevated compared with those of peptide-spectrum matches. The function provided by MAYU is critical to control the quality of proteome data repositories and thereby to enhance any study relying on these data sources. The MAYU software is available as standalone software and is also integrated into the Trans-Proteomic Pipeline. PMID:19608599
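
    MAYU's protein-level error model builds on the standard target-decoy estimate of the peptide-spectrum-match FDR, which can be sketched briefly. The function below illustrates that baseline estimate only, not MAYU's algorithm, and its name is hypothetical; the abstract's point is that the protein-level FDR in large data sets is substantially higher than this PSM-level figure.

    ```python
    def decoy_fdr_curve(psms):
        """Given (score, is_decoy) pairs, return (score, fdr) at each successive
        score threshold, under the target-decoy assumption that the number of
        decoy hits above a threshold approximates the number of false target hits."""
        curve = []
        targets = decoys = 0
        for score, is_decoy in sorted(psms, key=lambda p: p[0], reverse=True):
            if is_decoy:
                decoys += 1
            else:
                targets += 1
            curve.append((score, decoys / max(targets, 1)))
        return curve
    ```

    Walking down the ranked list, the estimated FDR can only grow as decoys accumulate, so an analyst picks the lowest score threshold whose estimate stays below the desired level.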

  12. Setting Cut Scores on Large-Scale Assessments. The State Board Connection Issues in Brief.

    ERIC Educational Resources Information Center

    National Association of State Boards of Education, Alexandria, VA.

    This Brief builds on the work of the National Association of State Boards of Education 1997 Study Group on State Assessment Systems by examining one of the state board actions that is most likely to capture the public's attention: setting cut scores on state assessments. This process involves a large measure of human judgment and politics and a…

  13. Using Content-Specific Lyrics to Familiar Tunes in a Large Lecture Setting

    ERIC Educational Resources Information Center

    McLachlin, Derek T.

    2009-01-01

    Music can be used in lectures to increase student engagement and help students retain information. In this paper, I describe my use of biochemistry-related lyrics written to the tune of the theme to the television show, The Flintstones, in a large class setting (400-800 students). To determine student perceptions, the class was surveyed several…

  14. Influences of large sets of environmental exposures on immune responses in healthy adult men

    PubMed Central

    Yi, Buqing; Rykova, Marina; Jäger, Gundula; Feuerecker, Matthias; Hörl, Marion; Matzel, Sandra; Ponomarev, Sergey; Vassilieva, Galina; Nichiporuk, Igor; Choukèr, Alexander

    2015-01-01

    Environmental factors have long been known to influence immune responses. In particular, clinical studies of the association between migration and increased risk of atopy/asthma have provided important information on the role of migration-associated large sets of environmental exposures in the development of allergic diseases. However, investigations of environmental effects on immune responses have mostly been limited to candidate environmental exposures, such as air pollution. The influences of large sets of environmental exposures on immune responses are still largely unknown. A simulated 520-d Mars mission provided an opportunity to investigate this topic. Six healthy males lived in a closed habitat simulating a spacecraft for 520 days. When they exited their “spacecraft” after the mission, the scenario was similar to that of migration, involving exposure to a new set of environmental pollutants and allergens. We measured multiple immune parameters with blood samples at chosen time points after the mission. At the early adaptation stage, highly enhanced cytokine responses were observed upon ex vivo antigen stimulations. For cell population frequencies, we found that the subjects displayed increased neutrophils. These results presumably represent the immune changes that occur in healthy humans when migrating, indicating that large sets of environmental exposures may trigger aberrant immune activity. PMID:26306804

  15. Influences of large sets of environmental exposures on immune responses in healthy adult men.

    PubMed

    Yi, Buqing; Rykova, Marina; Jäger, Gundula; Feuerecker, Matthias; Hörl, Marion; Matzel, Sandra; Ponomarev, Sergey; Vassilieva, Galina; Nichiporuk, Igor; Choukèr, Alexander

    2015-08-26

    Environmental factors have long been known to influence immune responses. In particular, clinical studies of the association between migration and increased risk of atopy/asthma have provided important information on the role of migration-associated large sets of environmental exposures in the development of allergic diseases. However, investigations of environmental effects on immune responses have mostly been limited to candidate environmental exposures, such as air pollution. The influences of large sets of environmental exposures on immune responses are still largely unknown. A simulated 520-d Mars mission provided an opportunity to investigate this topic. Six healthy males lived in a closed habitat simulating a spacecraft for 520 days. When they exited their "spacecraft" after the mission, the scenario was similar to that of migration, involving exposure to a new set of environmental pollutants and allergens. We measured multiple immune parameters with blood samples at chosen time points after the mission. At the early adaptation stage, highly enhanced cytokine responses were observed upon ex vivo antigen stimulations. For cell population frequencies, we found that the subjects displayed increased neutrophils. These results presumably represent the immune changes that occur in healthy humans when migrating, indicating that large sets of environmental exposures may trigger aberrant immune activity.

  16. Preschoolers' Nonsymbolic Arithmetic with Large Sets: Is Addition More Accurate than Subtraction?

    ERIC Educational Resources Information Center

    Shinskey, Jeanne L.; Chan, Cindy Ho-man; Coleman, Rhea; Moxom, Lauren; Yamamoto, Eri

    2009-01-01

    Adult and developing humans share with other animals analog magnitude representations of number that support nonsymbolic arithmetic with large sets. This experiment tested the hypothesis that such representations may be more accurate for addition than for subtraction in children as young as 3 1/2 years of age. In these tasks, the experimenter hid…

  17. Teaching Children to Organise and Represent Large Data Sets in a Histogram

    ERIC Educational Resources Information Center

    Nisbet, Steven; Putt, Ian

    2004-01-01

    Although some bright students in primary school are able to organise numerical data into classes, most attend to the characteristics of individuals rather than the group, and "see the trees rather than the forest". How can teachers in upper primary and early high school teach students to organise large sets of data with widely varying values into…

  18. DocCube: Multi-Dimensional Visualization and Exploration of Large Document Sets.

    ERIC Educational Resources Information Center

    Mothe, Josiane; Chrisment, Claude; Dousset, Bernard; Alaux, Joel

    2003-01-01

    Describes a user interface that provides global visualizations of large document sets to help users formulate the query that corresponds to their information needs. Highlights include concept hierarchies that users can browse to specify and refine information needs; knowledge discovery in databases and texts; and multidimensional modeling.…

  19. Influences of large sets of environmental exposures on immune responses in healthy adult men.

    PubMed

    Yi, Buqing; Rykova, Marina; Jäger, Gundula; Feuerecker, Matthias; Hörl, Marion; Matzel, Sandra; Ponomarev, Sergey; Vassilieva, Galina; Nichiporuk, Igor; Choukèr, Alexander

    2015-01-01

    Environmental factors have long been known to influence immune responses. In particular, clinical studies of the association between migration and increased risk of atopy/asthma have provided important information on the role of migration-associated large sets of environmental exposures in the development of allergic diseases. However, investigations of environmental effects on immune responses have mostly been limited to candidate environmental exposures, such as air pollution. The influences of large sets of environmental exposures on immune responses are still largely unknown. A simulated 520-d Mars mission provided an opportunity to investigate this topic. Six healthy males lived in a closed habitat simulating a spacecraft for 520 days. When they exited their "spacecraft" after the mission, the scenario was similar to that of migration, involving exposure to a new set of environmental pollutants and allergens. We measured multiple immune parameters with blood samples at chosen time points after the mission. At the early adaptation stage, highly enhanced cytokine responses were observed upon ex vivo antigen stimulations. For cell population frequencies, we found that the subjects displayed increased neutrophils. These results presumably represent the immune changes that occur in healthy humans when migrating, indicating that large sets of environmental exposures may trigger aberrant immune activity. PMID:26306804

  20. Large-scale detection of metals with a small set of fluorescent DNA-like chemosensors.

    PubMed

    Yuen, Lik Hang; Franzini, Raphael M; Tan, Samuel S; Kool, Eric T

    2014-10-15

    An important advantage of pattern-based chemosensor sets is their potential to detect and differentiate a large number of analytes with only few sensors. Here we test this principle at a conceptual limit by analyzing a large set of metal ion analytes covering essentially the entire periodic table, employing fluorescent DNA-like chemosensors on solid support. A tetrameric "oligodeoxyfluoroside" (ODF) library of 6561 members containing metal-binding monomers was screened for strong responders to 57 metal ions in solution. Our results show that a set of 9 chemosensors could successfully discriminate the 57 species, including alkali, alkaline earth, post-transition, transition, and lanthanide metals. As few as 6 ODF chemosensors could detect and differentiate 50 metals at 100 μM; sensitivity for some metals was achieved at midnanomolar ranges. A blind test with 50 metals further confirmed the discriminating power of the ODFs. PMID:25255102
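
    The pattern-based readout described above amounts to comparing a vector of fluorescence responses, one entry per chemosensor, against a library of reference patterns. A minimal nearest-pattern classifier conveys the idea; it is an illustrative stand-in, since the abstract does not detail the authors' discrimination analysis.

    ```python
    def identify_analyte(response, library):
        """Return the library entry whose reference response pattern lies closest
        (Euclidean distance) to the observed sensor-response vector."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        return min(library, key=lambda name: dist(response, library[name]))
    ```

    With 9 sensors, each analyte is a point in a 9-dimensional response space; discrimination succeeds when the reference patterns of the 57 species are mutually well separated relative to measurement noise.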

  1. On basis set superposition error corrected stabilization energies for large n-body clusters.

    PubMed

    Walczak, Katarzyna; Friedrich, Joachim; Dolg, Michael

    2011-10-01

    In this contribution, we propose an approximate basis set superposition error (BSSE) correction scheme for the site-site function counterpoise and for the Valiron-Mayer function counterpoise correction of second order to account for the basis set superposition error in clusters with a large number of subunits. The accuracy of the proposed scheme has been investigated for a water cluster series at the CCSD(T), CCSD, MP2, and self-consistent field levels of theory using Dunning's correlation consistent basis sets. The BSSE corrected stabilization energies for a series of water clusters are presented. A study of the possible savings in computational resources has been carried out, along with a monitoring of the basis set dependence of the approximate BSSE corrections. PMID:21992293
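
    For context, the two-body counterpoise idea that the site-site and Valiron-Mayer schemes generalize to n subunits can be written explicitly. This is the standard Boys-Bernardi expression, not the authors' approximate scheme itself:

```latex
% Counterpoise-corrected interaction energy of a dimer AB; [AB] denotes
% the full dimer basis in which each monomer is recomputed.
\begin{align}
  E^{\mathrm{CP}}_{\mathrm{int}} &= E_{AB}[AB] - E_{A}[AB] - E_{B}[AB], \\
  \delta_{\mathrm{BSSE}} &= \bigl(E_{A}[AB] - E_{A}[A]\bigr) + \bigl(E_{B}[AB] - E_{B}[B]\bigr).
\end{align}
```

    For an n-body cluster, the number of such monomer-in-cluster-basis calculations grows rapidly with n, which is what motivates an approximate correction scheme.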

  2. An Analysis Framework Addressing the Scale and Legibility of Large Scientific Data Sets

    SciTech Connect

    Childs, Hank R.

    2006-01-01

    Much of the previous work in the large data visualization area has solely focused on handling the scale of the data. This task is clearly a great challenge and necessary, but it is not sufficient. Applying standard visualization techniques to large scale data sets often creates complicated pictures where meaningful trends are lost. A second challenge, then, is to also provide algorithms that simplify what an analyst must understand, using either visual or quantitative means. This challenge can be summarized as improving the legibility or reducing the complexity of massive data sets. Fully meeting both of these challenges is the work of many, many PhD dissertations. In this dissertation, we describe some new techniques to address both the scale and legibility challenges, in hope of contributing to the larger solution. In addition to our assumption of simultaneously addressing both scale and legibility, we add an additional requirement that the solutions considered fit well within an interoperable framework for diverse algorithms, because a large suite of algorithms is often necessary to fully understand complex data sets. For scale, we present a general architecture for handling large data, as well as details of a contract-based system for integrating advanced optimizations into a data flow network design. We also describe techniques for volume rendering and performing comparisons at the extreme scale. For legibility, we present several techniques. Most noteworthy are equivalence class functions, a technique to drive visualizations using statistical methods, and line-scan based techniques for characterizing shape.

  3. Extensions of parallel coordinates for interactive exploration of large multi-timepoint data sets.

    PubMed

    Blaas, Jorik; Botha, Charl P; Post, Frits H

    2008-01-01

    Parallel coordinate plots (PCPs) are commonly used in information visualization to provide insight into multi-variate data. These plots help to spot correlations between variables. PCPs have been successfully applied to unstructured datasets of up to a few million points. In this paper, we present techniques to enhance the usability of PCPs for the exploration of large, multi-timepoint volumetric data sets, containing tens of millions of points per timestep. The main difficulties that arise when applying PCPs to large numbers of data points are visual clutter and slow performance, making interactive exploration infeasible. Moreover, the spatial context of the volumetric data is usually lost. We describe techniques for preprocessing using data quantization and compression, and for fast GPU-based rendering of PCPs using joint density distributions for each pair of consecutive variables, resulting in a smooth, continuous visualization. Also, fast brushing techniques are proposed for interactive data selection in multiple linked views, including a 3D spatial volume view. These techniques have been successfully applied to three large data sets: Hurricane Isabel (Vis'04 contest), the ionization front instability data set (Vis'08 design contest), and data from a large-eddy simulation of cumulus clouds. With these data, we show how PCPs can be extended to successfully visualize and interactively explore multi-timepoint volumetric datasets with an order of magnitude more data points.
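
    The joint-density idea can be shown in miniature: for each pair of consecutive PCP axes, the paired values are binned into a 2D histogram, and the plot is rendered from these densities instead of from tens of millions of individual polylines. The plain-Python binning below is a simplified stand-in for the quantized, GPU-based version described in the paper.

```python
def joint_density(xs, ys, bins=4):
    """Bin paired values from two consecutive PCP axes into a 2D histogram.

    Values are assumed pre-normalized to [0, 1), as after quantization;
    each histogram cell counts the polylines crossing that region.
    """
    hist = [[0] * bins for _ in range(bins)]
    for x, y in zip(xs, ys):
        hist[int(x * bins)][int(y * bins)] += 1
    return hist

xs = [0.1, 0.1, 0.6, 0.9]
ys = [0.2, 0.2, 0.7, 0.1]
h = joint_density(xs, ys)
print(h[0][0])  # two of the four lines share the same (0, 0) cell
```

    Rendering from the histograms makes the cost per frame depend on the number of bins rather than on the number of data points, which is what restores interactivity at this scale.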

  4. A Complementary Graphical Method for Reducing and Analyzing Large Data Sets*

    PubMed Central

    Jing, X.; Cimino, J. J.

    2014-01-01

    Summary Objectives Graphical displays can make data more understandable; however, large graphs can challenge human comprehension. We have previously described a filtering method to provide high-level summary views of large data sets. In this paper we demonstrate our method for setting and selecting thresholds to limit graph size while retaining important information by applying it to large single and paired data sets, taken from patient and bibliographic databases. Methods Four case studies are used to illustrate our method. The data are either patient discharge diagnoses (coded using the International Classification of Diseases, Clinical Modifications [ICD9-CM]) or Medline citations (coded using the Medical Subject Headings [MeSH]). We use combinations of different thresholds to obtain filtered graphs for detailed analysis. The setting and selection of thresholds, such as thresholds for node counts, class counts, ratio values, p values (for diff data sets), and percentiles of selected class count thresholds, are demonstrated in detail in the case studies. The main steps include: data preparation, data manipulation, computation, and threshold selection and visualization. We also describe the data models for different types of thresholds and the considerations for threshold selection. Results The filtered graphs are 1%-3% of the size of the original graphs. For our case studies, the graphs provide 1) the most heavily used ICD9-CM codes, 2) the codes with most patients in a research hospital in 2011, 3) a profile of publications on “heavily represented topics” in MEDLINE in 2011, and 4) validated knowledge about adverse effects of the medication rosiglitazone and new interesting areas in the ICD9-CM hierarchy associated with patients taking pioglitazone. Conclusions Our filtering method reduces large graphs to a manageable size by removing relatively unimportant nodes. The graphical method provides summary views based on computation of usage
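
    A minimal sketch of the node-count filtering step, under the assumption that each graph node carries a usage count (e.g., patients per ICD9-CM code) and that a threshold is chosen as a percentile of those counts. The function and variable names are illustrative, not taken from the paper's implementation.

```python
def percentile_threshold(counts, pct):
    """Pick a count threshold at the given percentile of observed counts."""
    ordered = sorted(counts.values())
    index = min(int(len(ordered) * pct / 100), len(ordered) - 1)
    return ordered[index]

def filter_graph(counts, threshold):
    """Keep only nodes whose usage count meets the threshold."""
    return {node: c for node, c in counts.items() if c >= threshold}

# Hypothetical ICD9-CM codes with per-code patient counts.
counts = {"250.0": 120, "401.9": 300, "V72.3": 2, "E888": 1}
threshold = percentile_threshold(counts, 50)
print(filter_graph(counts, threshold))  # only the heavily used codes survive
```

    Dropping the long tail of rarely used nodes is what shrinks the graph to a few percent of its original size while keeping the heavily used codes visible.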

  5. PrestoPronto: a code devoted to handling large data sets

    NASA Astrophysics Data System (ADS)

    Figueroa, S. J. A.; Prestipino, C.

    2016-05-01

    The software PrestoPronto consists of a full graphical user interface (GUI) program aimed at the analysis of large X-ray Absorption Spectroscopy data sets. Written in Python, it is free and open source. The code is able to read large datasets, apply calibration and alignment corrections, and perform classical data analysis, from the extraction of the signal to the EXAFS fit. The package also includes GUI programs to perform Principal Component Analysis and Linear Combination Fits. The main benefit of this program is that it allows the evolution of time-resolved experiments coming from Quick-EXAFS (QEXAFS) and dispersive EXAFS beamlines to be followed quickly.

  6. Validating a large geophysical data set: Experiences with satellite-derived cloud parameters

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie

    1992-01-01

    We are validating the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and are using the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The data handling needed for this work is possible only with the help of a suite of interactive graphical and numerical analysis tools. Level 3 (gridded) data is the common form in which large data sets of this type are distributed for scientific analysis. We find that Level 3 data is inadequate for the data comparisons required for validation. Level 2 data (individual measurements in geophysical units) is needed. A sampling problem arises when individual measurements, which are not uniformly distributed in space or time, are used for the comparisons. Standard 'interpolation' methods involve fitting the measurements for each data set to surfaces, which are then compared. We are experimenting with formal criteria for selecting geographical regions, based upon the spatial frequency and variability of measurements, that allow us to quantify the uncertainty due to sampling. As part of this project, we are also dealing with ways to keep track of constraints placed on the output by assumptions made in the computer code. The need to work with Level 2 data introduces a number of other data handling issues, such as accessing data files across machine types, meeting large data storage requirements, accessing other validated data sets, processing speed
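
    The Level 2 versus Level 3 distinction above can be illustrated with a toy regridding: averaging scattered Level 2 retrievals onto a regular grid produces a Level 3 product, and the per-cell sample count needed to quantify sampling uncertainty is exactly what a plain gridded mean discards. The grid resolution and values below are arbitrary.

```python
def grid_level2(measurements, lat_step=10.0, lon_step=10.0):
    """Average scattered Level 2 measurements onto a regular grid.

    measurements: list of (lat, lon, value) tuples.
    Returns {cell: (mean, count)}; the count per cell is what is needed
    to assess sampling uncertainty, and is lost in a plain Level 3 mean.
    """
    sums, counts = {}, {}
    for lat, lon, value in measurements:
        cell = (int(lat // lat_step), int(lon // lon_step))
        sums[cell] = sums.get(cell, 0.0) + value
        counts[cell] = counts.get(cell, 0) + 1
    return {cell: (sums[cell] / counts[cell], counts[cell]) for cell in sums}

obs = [(12.0, 3.0, 0.5), (14.0, 7.0, 0.7), (55.0, 3.0, 0.2)]
print(grid_level2(obs))  # one cell averages two samples, the other only one
```

    Two cells with the same mean but very different sample counts are not equally trustworthy for a comparison, which is the sampling problem the abstract describes.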

  7. A Scalable Approach for Protein False Discovery Rate Estimation in Large Proteomic Data Sets.

    PubMed

    Savitski, Mikhail M; Wilhelm, Mathias; Hahne, Hannes; Kuster, Bernhard; Bantscheff, Marcus

    2015-09-01

    Calculating the number of confidently identified proteins and estimating false discovery rate (FDR) is a challenge when analyzing very large proteomic data sets such as entire human proteomes. Biological and technical heterogeneity in proteomic experiments further adds to the challenge, and there are strong differences in opinion regarding the conceptual validity of a protein FDR and no consensus regarding the methodology for protein FDR determination. There are also limitations inherent to the widely used classic target-decoy strategy that become particularly apparent when analyzing very large data sets and that lead to a strong over-representation of decoy identifications. In this study, we investigated the merits of the classic, as well as a novel target-decoy-based protein FDR estimation approach, taking advantage of a heterogeneous data collection comprised of ∼19,000 LC-MS/MS runs deposited in ProteomicsDB (https://www.proteomicsdb.org). The "picked" protein FDR approach treats target and decoy sequences of the same protein as a pair rather than as individual entities and chooses either the target or the decoy sequence depending on which receives the highest score. We investigated the performance of this approach in combination with q-value based peptide scoring to normalize sample-, instrument-, and search engine-specific differences. The "picked" target-decoy strategy performed best when protein scoring was based on the best peptide q-value for each protein, yielding a stable number of true positive protein identifications over a wide range of q-value thresholds. We show that this simple and unbiased strategy eliminates a conceptual issue in the commonly used "classic" protein FDR approach that causes overprediction of false-positive protein identification in large data sets. The approach scales from small to very large data sets without losing performance, consistently increases the number of true-positive protein identifications and is readily implemented in
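
    The "picked" target-decoy idea can be sketched as follows: each protein contributes either its target or its decoy sequence, whichever scores higher, and the protein FDR at a score threshold is then estimated from the surviving decoy fraction. The scores and accessions below are hypothetical.

```python
def pick(scores):
    """scores maps accession -> (target_score, decoy_score).

    The 'picked' strategy keeps only the higher-scoring member of each
    target/decoy pair, rather than counting both as independent entries.
    """
    return [("target", t) if t >= d else ("decoy", d)
            for t, d in scores.values()]

def fdr_at(picked, threshold):
    """Estimate FDR as decoys / targets among picked entries above threshold."""
    targets = sum(1 for kind, s in picked if kind == "target" and s >= threshold)
    decoys = sum(1 for kind, s in picked if kind == "decoy" and s >= threshold)
    return decoys / targets if targets else 0.0

scores = {"P1": (9.0, 1.0), "P2": (8.0, 2.0), "P3": (1.5, 7.0), "P4": (6.0, 0.5)}
picked = pick(scores)
print(fdr_at(picked, 5.0))  # 1 decoy against 3 targets above the threshold
```

    In the classic approach both members of every pair would enter the counts, so low-scoring decoys accumulate as the data set grows; pairing removes that over-representation.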

  8. The search for structure - Object classification in large data sets. [for astronomers

    NASA Technical Reports Server (NTRS)

    Kurtz, Michael J.

    1988-01-01

    Research concerning object classification schemes is reviewed, focusing on large data sets. Classification techniques are discussed, including syntactic and decision-theoretic methods, fuzzy techniques, and stochastic and fuzzy grammars. Consideration is given to the automation of MK classification (Morgan and Keenan, 1973) and other problems associated with the classification of spectra. In addition, the classification of galaxies is examined, including the problems of systematic errors, blended objects, galaxy types, and galaxy clusters.

  9. Moving Large Data Sets Over High-Performance Long Distance Networks

    SciTech Connect

    Hodson, Stephen W; Poole, Stephen W; Ruwart, Thomas; Settlemyer, Bradley W

    2011-04-01

    In this project we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing large data sets to a destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes. We describe the device information required to achieve high levels of I/O performance and discuss how this data is applicable in use cases beyond data movement performance.

  10. Coffee Shops, Classrooms and Conversations: public engagement and outreach in a large interdisciplinary research Hub

    NASA Astrophysics Data System (ADS)

    Holden, Jennifer A.

    2014-05-01

    Public engagement and outreach activities increasingly use specialist staff for co-ordination, training, and support for researchers; they are also becoming an expected part of large investments. Here, the experience of public engagement and outreach in a large, interdisciplinary Research Hub is described. dot.rural, based at the University of Aberdeen UK, is a £11.8 million Research Councils UK Rural Digital Economy Hub, funded as part of the RCUK Digital Economy Theme (2009-2015). Digital Economy research aims to realise the transformational impact of digital technologies on aspects of the environment, community life, cultural experiences, future society, and the economy. The dot.rural Hub involves 92 researchers from 12 different disciplines, including Geography, Hydrology and Ecology. Public Engagement and Outreach is embedded in the dot.rural Digital Economy Hub via an Outreach Officer. Alongside this position, public engagement and outreach activities are a compulsory part of PhD student contracts. Public Engagement and Outreach activities at the dot.rural Hub involve individuals and groups in both formal and informal settings organised by dot.rural and other organisations. Activities in the realms of Education, Public Engagement, and Traditional and Social Media are determined by a set of Underlying Principles designed for the Hub by the Outreach Officer. The Underlying Engagement and Outreach Principles match funding agency requirements and expectations alongside researcher demands and the user-led nature of Digital Economy research. All activities include researchers alongside the Outreach Officer, are research informed, and are embedded into the specific projects that form the Hub. Successful public engagement activities have included participation in the Café Scientifique series, workshops in primary and secondary schools, and online activities such as I'm a Scientist Get Me Out of Here. From how to engage 8 year olds with making hydrographs more understandable to members of

  11. Parallel analysis tools and new visualization techniques for ultra-large climate data set

    SciTech Connect

    Middleton, Don; Haley, Mary

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  12. Non-rigid Registration for Large Sets of Microscopic Images on Graphics Processors

    PubMed Central

    Ruiz, Antonio; Ujaldon, Manuel; Cooper, Lee

    2014-01-01

    Microscopic imaging is an important tool for characterizing tissue morphology and pathology. 3D reconstruction and visualization of large sample tissue structure requires registration of large sets of high-resolution images. However, the scale of this problem presents a challenge for automatic registration methods. In this paper we present a novel method for efficient automatic registration using graphics processing units (GPUs) and parallel programming. Comparing a C++ CPU implementation with Compute Unified Device Architecture (CUDA) libraries and pthreads running on the GPU, we achieve a speed-up factor of up to 4.11× with a single GPU and 6.68× with a GPU pair. We present execution times for a benchmark composed of two sets of large-scale images: mouse placenta (16K × 16K pixels) and breast cancer tumors (23K × 62K pixels). It takes more than 12 hours for the genetic case in C++ to register a typical sample composed of 500 consecutive slides, which was reduced to less than 2 hours using two GPUs, with very promising scalability for extending those gains to a large number of GPUs in a distributed system. PMID:25328635

  13. Breeding and Genetics Symposium: really big data: processing and analysis of very large data sets.

    PubMed

    Cole, J B; Newman, S; Foertter, F; Aguilar, I; Coffey, M

    2012-03-01

    Modern animal breeding data sets are large and getting larger, due in part to recent availability of high-density SNP arrays and cheap sequencing technology. High-performance computing methods for efficient data warehousing and analysis are under development. Financial and security considerations are important when using shared clusters. Sound software engineering practices are needed, and it is better to use existing solutions when possible. Storage requirements for genotypes are modest, although full-sequence data will require greater storage capacity. Storage requirements for intermediate and results files for genetic evaluations are much greater, particularly when multiple runs must be stored for research and validation studies. The greatest gains in accuracy from genomic selection have been realized for traits of low heritability, and there is increasing interest in new health and management traits. The collection of sufficient phenotypes to produce accurate evaluations may take many years, and high-reliability proofs for older bulls are needed to estimate marker effects. Data mining algorithms applied to large data sets may help identify unexpected relationships in the data, and improved visualization tools will provide insights. Genomic selection using large data requires a lot of computing power, particularly when large fractions of the population are genotyped. Theoretical improvements have made possible the inversion of large numerator relationship matrices, permitted the solving of large systems of equations, and produced fast algorithms for variance component estimation. Recent work shows that single-step approaches combining BLUP with a genomic relationship (G) matrix have similar computational requirements to traditional BLUP, and the limiting factor is the construction and inversion of G for many genotypes. 
A naïve algorithm for creating G for 14,000 individuals required almost 24 h to run, but custom libraries and parallel computing reduced that to
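
    The G-matrix construction mentioned above can be sketched for a tiny example using VanRaden's first method, G = ZZ'/k, where Z centers the 0/1/2 allele counts by twice the allele frequencies. The genotypes and frequencies below are made up for illustration.

```python
def genomic_relationship(genotypes, freqs):
    """Build VanRaden's genomic relationship matrix G = ZZ' / k.

    genotypes: one list of 0/1/2 allele counts per individual.
    freqs: allele frequency p for each marker.
    k = 2 * sum(p * (1 - p)) puts G on the numerator-relationship scale.
    """
    z = [[g - 2 * p for g, p in zip(ind, freqs)] for ind in genotypes]
    k = 2 * sum(p * (1 - p) for p in freqs)
    n, m = len(z), len(freqs)
    return [[sum(z[i][l] * z[j][l] for l in range(m)) / k for j in range(n)]
            for i in range(n)]

G = genomic_relationship([[0, 1, 2], [2, 1, 0]], [0.5, 0.5, 0.5])
print(G)  # the two opposite homozygotes are strongly negatively related
```

    The dense triple loop here is the O(n² m) construction that becomes the bottleneck for tens of thousands of genotyped individuals, which is why custom libraries and parallel computing bring hours down to minutes.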

  15. Neuron-synapse IC chip-set for large-scale chaotic neural networks.

    PubMed

    Horio, Y; Aihara, K; Yamamoto, O

    2003-01-01

    We propose a neuron-synapse integrated circuit (IC) chip-set for large-scale chaotic neural networks. We use switched-capacitor (SC) circuit techniques to implement a three-internal-state transiently-chaotic neural network model. The SC chaotic neuron chip faithfully reproduces complex chaotic dynamics in real numbers through continuous state variables of the analog circuitry. We can digitally control most of the model parameters by means of programmable capacitive arrays embedded in the SC chaotic neuron chip. Since the output of the neuron is transferred into a digital pulse according to the all-or-nothing property of an axon, we design a synapse chip with digital circuits. We propose a memory-based synapse circuit architecture to achieve a rapid calculation of a vast number of weighted summations. Both the SC neuron and the digital synapse circuits have been fabricated as ICs. We have tested these IC chips extensively and confirmed the functions and performance of the chip-set. The proposed neuron-synapse IC chip-set makes it possible to construct a scalable and reconfigurable large-scale chaotic neural network with 10000 neurons and 10000^2 synaptic connections. PMID:18244585
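
    The transiently chaotic dynamics implemented in silicon can be imitated in software with a simplified single-neuron map in the spirit of Aihara-style chaotic neuron models. The parameter values, and the reduction to a single internal state, are illustrative assumptions rather than the chip's three-internal-state model.

```python
import math

def simulate(steps, k=0.7, alpha=1.0, a=0.5, eps=0.05):
    """Iterate a simplified chaotic neuron map.

    y is the internal state; the output is a steep sigmoid of y,
    mimicking the all-or-nothing pulse passed to the synapse chip.
    """
    y, outputs = 0.1, []
    for _ in range(steps):
        x = 1.0 / (1.0 + math.exp(-y / eps))  # output nonlinearity
        y = k * y - alpha * x + a             # decay + self-feedback + bias
        outputs.append(x)
    return outputs

out = simulate(50)
print(min(out), max(out))  # the output visits both extremes of [0, 1]
```

    The weighted summations the synapse chip accelerates would feed additional input terms into the y update; here the single neuron only shows the irregular switching behaviour the analog circuitry reproduces.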

  16. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1994-01-01

    Envision is an interactive environment that provides researchers in the earth sciences with convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy to use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ and ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.

  17. Tiny videos: a large data set for nonparametric video retrieval and frame classification.

    PubMed

    Karpenko, Alexandre; Aarabi, Parham

    2011-03-01

    In this paper, we present a large database of over 50,000 user-labeled videos collected from YouTube. We develop a compact representation called "tiny videos" that achieves high video compression rates while retaining the overall visual appearance of the video as it varies over time. We show that frame sampling using affinity propagation-an exemplar-based clustering algorithm-achieves the best trade-off between compression and video recall. We use this large collection of user-labeled videos in conjunction with simple data mining techniques to perform related video retrieval, as well as classification of images and video frames. The classification results achieved by tiny videos are compared with the tiny images framework [24] for a variety of recognition tasks. The tiny images data set consists of 80 million images collected from the Internet. These are the largest labeled research data sets of videos and images available to date. We show that tiny videos are better suited for classifying scenery and sports activities, while tiny images perform better at recognizing objects. Furthermore, we demonstrate that combining the tiny images and tiny videos data sets improves classification precision in a wider range of categories.

  18. Basis set convergence of CCSD(T) equilibrium geometries using a large and diverse set of molecular structures.

    PubMed

    Spackman, Peter R; Jayatilaka, Dylan; Karton, Amir

    2016-09-14

    We examine the basis set convergence of the CCSD(T) method for obtaining the structures of the 108 neutral first- and second-row species in the W4-11 database (with up to five non-hydrogen atoms). This set includes a total of 181 unique bonds: 75 H-X, 49 X-Y, 43 X=Y, and 14 X≡Y bonds (where X and Y are first- and second-row atoms). As reference values, geometries optimized at the CCSD(T)/aug'-cc-pV(6+d)Z level of theory are used. We consider the basis set convergence of the CCSD(T) method with the correlation consistent basis sets cc-pV(n+d)Z and aug'-cc-pV(n+d)Z (n = D, T, Q, 5) and the Weigend-Ahlrichs def2-n ZVPP basis sets (n = T, Q). For each increase in the highest angular momentum present in the basis set, the root-mean-square deviation (RMSD) over the bond distances is decreased by a factor of ∼4. For example, the following RMSDs are obtained for the cc-pV(n+d)Z basis sets 0.0196 (D), 0.0050 (T), 0.0015 (Q), and 0.0004 (5) Å. Similar results are obtained for the aug'-cc-pV(n+d)Z and def2-n ZVPP basis sets. The double-zeta and triple-zeta quality basis sets systematically and significantly overestimate the bond distances. A simple and cost-effective way to improve the performance of these basis sets is to scale the bond distances by an empirical scaling factor of 0.9865 (cc-pV(D+d)Z) and 0.9969 (cc-pV(T+d)Z). This results in RMSDs of 0.0080 (scaled cc-pV(D+d)Z) and 0.0029 (scaled cc-pV(T+d)Z) Å. The basis set convergence of larger basis sets can be accelerated via standard basis-set extrapolations. In addition, the basis set convergence of explicitly correlated CCSD(T)-F12 calculations is investigated in conjunction with the cc-pVnZ-F12 basis sets (n = D, T). Typically, one "gains" two angular momenta in the explicitly correlated calculations. That is, the CCSD(T)-F12/cc-pVnZ-F12 level of theory shows similar performance to the CCSD(T)/cc-pV(n+2)Z level of theory. In particular, the following RMSDs are obtained for the cc-pVnZ-F12 basis sets 0.0019 (D
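
    One common form of the "standard basis-set extrapolations" mentioned above is the two-point inverse-cubic formula for the correlation energy, E(n) = E_CBS + A/n³, solved from results at two cardinal numbers n. The energies in the example are hypothetical, and this is a generic textbook scheme rather than the specific protocol of the paper.

```python
def extrapolate_cbs(e_small, n_small, e_large, n_large):
    """Two-point CBS extrapolation assuming E(n) = E_CBS + A / n**3."""
    w_small, w_large = n_small ** 3, n_large ** 3
    return (w_large * e_large - w_small * e_small) / (w_large - w_small)

# Hypothetical correlation energies (hartree) at triple- and quadruple-zeta.
e_cbs = extrapolate_cbs(-0.400, 3, -0.420, 4)
print(e_cbs)  # lies below the quadruple-zeta value, as expected
```

    Because the 1/n³ tail converges from above in magnitude, the extrapolated limit always overshoots the larger-basis result slightly, which is the intended acceleration of basis-set convergence.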

  20. Basis set convergence of CCSD(T) equilibrium geometries using a large and diverse set of molecular structures

    NASA Astrophysics Data System (ADS)

    Spackman, Peter R.; Jayatilaka, Dylan; Karton, Amir

    2016-09-01

    We examine the basis set convergence of the CCSD(T) method for obtaining the structures of the 108 neutral first- and second-row species in the W4-11 database (with up to five non-hydrogen atoms). This set includes a total of 181 unique bonds: 75 H-X, 49 X-Y, 43 X=Y, and 14 X≡Y bonds (where X and Y are first- and second-row atoms). As reference values, geometries optimized at the CCSD(T)/aug'-cc-pV(6+d)Z level of theory are used. We consider the basis set convergence of the CCSD(T) method with the correlation consistent basis sets cc-pV(n+d)Z and aug'-cc-pV(n+d)Z (n = D, T, Q, 5) and the Weigend-Ahlrichs def2-nZVPP basis sets (n = T, Q). For each increase in the highest angular momentum present in the basis set, the root-mean-square deviation (RMSD) over the bond distances is decreased by a factor of ~4. For example, the following RMSDs are obtained for the cc-pV(n+d)Z basis sets: 0.0196 (D), 0.0050 (T), 0.0015 (Q), and 0.0004 (5) Å. Similar results are obtained for the aug'-cc-pV(n+d)Z and def2-nZVPP basis sets. The double-zeta and triple-zeta quality basis sets systematically and significantly overestimate the bond distances. A simple and cost-effective way to improve the performance of these basis sets is to scale the bond distances by an empirical scaling factor of 0.9865 (cc-pV(D+d)Z) and 0.9969 (cc-pV(T+d)Z). This results in RMSDs of 0.0080 (scaled cc-pV(D+d)Z) and 0.0029 (scaled cc-pV(T+d)Z) Å. The basis set convergence of larger basis sets can be accelerated via standard basis-set extrapolations. In addition, the basis set convergence of explicitly correlated CCSD(T)-F12 calculations is investigated in conjunction with the cc-pVnZ-F12 basis sets (n = D, T). Typically, one "gains" two angular momenta in the explicitly correlated calculations. That is, the CCSD(T)-F12/cc-pVnZ-F12 level of theory shows similar performance to the CCSD(T)/cc-pV(n+2)Z level of theory. In particular, the following RMSDs are obtained for the cc-pVnZ-F12 basis sets: 0.0019 (D
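
    The bond-distance scaling described in this abstract amounts to a one-parameter correction followed by an RMSD evaluation. A minimal sketch follows; the bond lengths below are invented for illustration (only the 0.9865 scaling factor for cc-pV(D+d)Z comes from the abstract), so the printed RMSDs are not the W4-11 values.

```python
import numpy as np

# Hypothetical double-zeta CCSD(T) bond lengths (Å) and near-CBS reference
# values; illustrative numbers only, not taken from the W4-11 set.
calc = np.array([1.105, 0.975, 1.545, 1.220])
ref = np.array([1.090, 0.962, 1.526, 1.203])

def rmsd(a, b):
    """Root-mean-square deviation between two sets of bond distances."""
    return np.sqrt(np.mean((a - b) ** 2))

# Empirical scaling factor reported for cc-pV(D+d)Z bond distances.
scale = 0.9865
print(rmsd(calc, ref))          # RMSD of the raw double-zeta distances
print(rmsd(scale * calc, ref))  # RMSD after applying the empirical scale
```

Because double-zeta basis sets systematically overestimate bond lengths, a single multiplicative factor removes most of the error, which is why the scaled RMSD drops well below the unscaled one.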

  1. Coresets vs clustering: comparison of methods for redundancy reduction in very large white matter fiber sets

    NASA Astrophysics Data System (ADS)

    Alexandroni, Guy; Zimmerman Moreno, Gali; Sochen, Nir; Greenspan, Hayit

    2016-03-01

    Recent advances in Diffusion Weighted Magnetic Resonance Imaging (DW-MRI) of white matter in conjunction with improved tractography produce impressive reconstructions of White Matter (WM) pathways. These pathways (fiber sets) often contain hundreds of thousands of fibers, or more. In order to make fiber based analysis more practical, the fiber set needs to be preprocessed to eliminate redundancies and to keep only essential representative fibers. In this paper we demonstrate and compare two distinctive frameworks for selecting this reduced set of fibers. The first framework entails pre-clustering the fibers using k-means, followed by Hierarchical Clustering and replacing each cluster with one representative. For the second clustering stage, seven distance metrics were evaluated. The second framework is based on an efficient geometric approximation paradigm named coresets. Coresets present a new approach to optimization and have had great success, especially in tasks requiring large computation time and/or memory. We propose a modified version of the coresets algorithm, Density Coreset. It is used for extracting the main fibers from dense datasets, leaving a small set that represents the main structures and connectivity of the brain. A novel approach, based on a 3D indicator structure, is used for comparing the frameworks. This comparison was applied to High Angular Resolution Diffusion Imaging (HARDI) scans of 4 healthy individuals. We show that, among the clustering-based methods, the cosine distance gives the best performance. In comparing the clustering schemes with coresets, the Density Coreset method achieves the best performance.
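
    The clustering framework's core step, replacing each cluster with one representative fiber, can be sketched as follows. This is a simplified stand-in, not the paper's implementation: fibers are represented here as fixed-length feature vectors, the pre-clustering is plain k-means, and the representative is the cluster medoid (the member nearest the cluster center).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for fiber descriptors: each row is one "fiber" embedded as a
# fixed-length feature vector (real fibers would be resampled 3D polylines).
points = rng.normal(size=(200, 6))

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means, standing in for the paper's pre-clustering stage."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def representatives(X, labels, centers):
    """Replace each cluster by its medoid (member closest to the center)."""
    reps = []
    for j in range(len(centers)):
        members = np.flatnonzero(labels == j)
        if members.size:
            d = np.linalg.norm(X[members] - centers[j], axis=1)
            reps.append(members[d.argmin()])
    return np.array(reps)

labels, centers = kmeans(points, k=10)
reps = representatives(points, labels, centers)
print(len(reps))  # at most k representative "fibers" are kept
```

The redundancy reduction is the ratio of input fibers to representatives; here 200 inputs shrink to at most 10 kept fibers.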

  2. Heuristic method for searches on large data-sets organised using network models

    NASA Astrophysics Data System (ADS)

    Ruiz-Fernández, D.; Quintana-Pacheco, Y.

    2016-05-01

    Searches on large data-sets have become an important issue in recent years. An alternative, which has achieved good results, is the use of methods relying on data mining techniques, such as cluster-based retrieval. This paper proposes a heuristic search that is based on an organisational model that reflects similarity relationships among data elements. The search is guided by using quality estimators of model nodes, which are obtained by the progressive evaluation of the given target function for the elements associated with each node. The results of the experiments confirm the effectiveness of the proposed algorithm. High-quality solutions are obtained evaluating a relatively small percentage of elements in the data-sets.

  3. Distributed Computation of the knn Graph for Large High-Dimensional Point Sets

    PubMed Central

    Plaku, Erion; Kavraki, Lydia E.

    2009-01-01

    High-dimensional problems arising from robot motion planning, biology, data mining, and geographic information systems often require the computation of k nearest neighbor (knn) graphs. The knn graph of a data set is obtained by connecting each point to its k closest points. As the research in the above-mentioned fields progressively addresses problems of unprecedented complexity, the demand for computing knn graphs based on arbitrary distance metrics and large high-dimensional data sets increases, exceeding resources available to a single machine. In this work we efficiently distribute the computation of knn graphs for clusters of processors with message passing. Extensions to our distributed framework include the computation of graphs based on other proximity queries, such as approximate knn or range queries. Our experiments show nearly linear speedup with over one hundred processors and indicate that similar speedup can be obtained with several hundred processors. PMID:19847318
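
    The object being distributed in this work is simple to state: the knn graph connects every point to its k closest points. A minimal single-machine sketch (brute force, Euclidean distance; the paper's contribution is sharding exactly this computation across processors with message passing):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))   # toy high-dimensional point set
k = 5

# Brute-force knn graph: full pairwise distance matrix, then the k smallest
# entries per row. A distributed version assigns each processor a subset of
# the query rows and merges the results.
d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
np.fill_diagonal(d, np.inf)          # a point is not its own neighbour
knn = np.argsort(d, axis=1)[:, :k]   # indices of the k closest points

print(knn.shape)  # one row of k neighbour indices per point
```

The O(n²) distance matrix is what makes the single-machine approach infeasible for large sets, motivating the distributed framework in the paper.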

  4. Memory efficient principal component analysis for the dimensionality reduction of large mass spectrometry imaging data sets.

    PubMed

    Race, Alan M; Steven, Rory T; Palmer, Andrew D; Styles, Iain B; Bunch, Josephine

    2013-03-19

    A memory efficient algorithm for the computation of principal component analysis (PCA) of large mass spectrometry imaging data sets is presented. Mass spectrometry imaging (MSI) enables two- and three-dimensional overviews of hundreds of unlabeled molecular species in complex samples such as intact tissue. PCA, in combination with data binning or other reduction algorithms, has been widely used in the unsupervised processing of MSI data and as a dimensionality reduction method prior to clustering and spatial segmentation. Standard implementations of PCA require the data to be stored in random access memory. This imposes an upper limit on the amount of data that can be processed, necessitating a compromise between the number of pixels and the number of peaks to include. With increasing interest in multivariate analysis of large 3D multislice data sets and ongoing improvements in instrumentation, the ability to retain all pixels and many more peaks is increasingly important. We present a new method which has no limitation on the number of pixels and allows an increased number of peaks to be retained. The new technique was validated against the MATLAB (The MathWorks Inc., Natick, Massachusetts) implementation of PCA (princomp) and then used to reduce, without discarding peaks or pixels, multiple serial sections acquired from a single mouse brain which was too large to be analyzed with princomp. Then, k-means clustering was performed on the reduced data set. We further demonstrate with simulated data of 83 slices, comprising 20,535 pixels per slice and equaling 44 GB of data, that the new method can be used in combination with existing tools to process an entire organ. MATLAB code implementing the memory efficient PCA algorithm is provided.
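
    One standard way to get PCA without holding the full pixels-by-peaks matrix in RAM is to stream the data in blocks and accumulate only the mean and the peak-by-peak scatter matrix, then diagonalize the covariance. The sketch below illustrates that general out-of-core strategy (it is not the paper's MATLAB implementation, and the simulated chunked reader is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

def chunks(n_chunks=20, pixels_per_chunk=50, n_peaks=30):
    """Simulates reading an MSI data set one block of spectra at a time."""
    for _ in range(n_chunks):
        yield rng.normal(size=(pixels_per_chunk, n_peaks))

# Accumulate the column sums and the scatter matrix chunk by chunk; only
# O(n_peaks^2) memory is needed, never the full pixels x peaks matrix.
n, s, ss = 0, None, None
for block in chunks():
    n += len(block)
    s = block.sum(axis=0) if s is None else s + block.sum(axis=0)
    ss = block.T @ block if ss is None else ss + block.T @ block

mean = s / n
cov = (ss - n * np.outer(mean, mean)) / (n - 1)
evals, evecs = np.linalg.eigh(cov)   # principal axes from the covariance
components = evecs[:, ::-1]          # reorder to decreasing variance
print(components.shape)
```

Because the scatter matrix is only peaks-by-peaks, the number of pixels is unbounded, which mirrors the paper's goal of retaining all pixels while increasing the number of peaks.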

  5. Envision: An interactive system for the management and visualization of large geophysical data sets

    NASA Technical Reports Server (NTRS)

    Searight, K. R.; Wojtowicz, D. P.; Walsh, J. E.; Pathi, S.; Bowman, K. P.; Wilhelmson, R. B.

    1995-01-01

    Envision is a software project at the University of Illinois and Texas A&M, funded by NASA's Applied Information Systems Research Project. It provides researchers in the geophysical sciences with convenient ways to manage, browse, and visualize large observed or model data sets. Envision integrates data management, analysis, and visualization of geophysical data in an interactive environment. It employs commonly used standards in data formats, operating systems, networking, and graphics. It also attempts, wherever possible, to integrate with existing scientific visualization and analysis software. Envision has an easy-to-use graphical interface, distributed process components, and an extensible design. It is a public domain package, freely available to the scientific community.

  6. Viewpoints: Interactive Exploration of Large Multivariate Earth and Space Science Data Sets

    NASA Astrophysics Data System (ADS)

    Levit, C.; Gazis, P. R.

    2006-05-01

    Analysis and visualization of extremely large and complex data sets may be one of the most significant challenges facing earth and space science investigators in the forthcoming decades. While advances in hardware speed and storage technology have roughly kept up with (indeed, have driven) increases in database size, the same is not true of our ability to manage the complexity of these data. Current missions, instruments, and simulations produce so much data of such high dimensionality that they outstrip the capabilities of traditional visualization and analysis software. This problem can only be expected to get worse as data volumes increase by orders of magnitude in future missions and in ever-larger supercomputer simulations. For large multivariate data (more than 10^5 samples or records with more than 5 variables per sample), the interactive graphics response of most existing statistical analysis, machine learning, exploratory data analysis, and/or visualization tools such as Torch, MLC++, Matlab, S++/R, and IDL stutters, stalls, or stops working altogether. Fortunately, the graphics processing units (GPUs) built into all professional desktop and laptop computers currently on the market are capable of transforming, filtering, and rendering hundreds of millions of points per second. We present a prototype open-source cross-platform application which leverages much of the power latent in the GPU to enable smooth interactive exploration and analysis of large high-dimensional data using a variety of classical and recent techniques. The targeted application is the interactive analysis of large, complex, multivariate data sets, with dimensionalities that may surpass 100 and sample sizes that may exceed 10^6-10^8.

  7. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    SciTech Connect

    Patchett, John M; Ahrens, James P; Lo, Li-Ta; Brownlee, Carson S; Mitchell, Christopher J; Hansen, Chuck

    2010-10-15

    Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software based ray tracing, software based rasterization and hardware accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU and CPU based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software based ray-tracing offers a viable approach for scalable rendering of the projected future massive data sizes.

  8. An Analogous Study of Children's Attitudes Toward School in an Open Classroom Environment as Opposed to a Conventional Setting.

    ERIC Educational Resources Information Center

    Zeli, Doris Conti

    A study sought to determine whether intermediate age children exposed to open classroom teaching strategy have a more positive attitude toward school than intermediate age children exposed to conventional teaching strategy. The hypothesis was that there would be no significant difference in attitude between the two groups. The study was limited to…

  9. What Do Children Write in Science? A Study of the Genre Set in a Primary Science Classroom

    ERIC Educational Resources Information Center

    Honig, Sheryl

    2010-01-01

    This article reports on the types of scientific writing found in two primary grade classrooms. These results are part of a larger two-year study whose purpose was to examine the development of informational writing of second- and third-grade students as they participated in integrated science-literacy instruction. The primary purpose of the…

  10. Classroom-Based Interventions and Teachers' Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings

    ERIC Educational Resources Information Center

    Zhai, Fuhua; Raver, C. Cybele; Li-Grining, Christine

    2011-01-01

    Preschool teachers' job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers' perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job…

  11. Child and Setting Characteristics Affecting the Adult Talk Directed at Preschoolers with Autism Spectrum Disorder in the Inclusive Classroom

    ERIC Educational Resources Information Center

    Irvin, Dwight W.; Boyd, Brian A.; Odom, Samuel L.

    2015-01-01

    Difficulty with social competence is a core deficit of autism spectrum disorder. Research on typically developing children and children with disabilities, in general, suggests the adult talk received in the classroom is related to their social development. The aims of this study were to examine (1) the types and amounts of adult talk children with…

  12. Multimodal Literacy Practices in the Indigenous Sámi Classroom: Children Navigating in a Complex Multilingual Setting

    ERIC Educational Resources Information Center

    Pietikäinen, Sari; Pitkänen-Huhta, Anne

    2013-01-01

    This article explores multimodal literacy practices in a transforming multilingual context of an indigenous and endangered Sámi language classroom. Looking at literacy practices as embedded in a complex and shifting terrain of language ideologies, language norms, and individual experiences and attitudes, we examined how multilingual Sámi children…

  13. Analogies as Tools for Meaning Making in Elementary Science Education: How Do They Work in Classroom Settings?

    ERIC Educational Resources Information Center

    Guerra-Ramos, Maria Teresa

    2011-01-01

    In this paper there is a critical overview of the role of analogies as tools for meaning making in science education, their advantages and disadvantages. Two empirical studies on the use of analogies in primary classrooms are discussed and analysed. In the first study, the "string circuit" analogy was used in the teaching of electric circuits with…

  14. Initial Validation of the Prekindergarten Classroom Observation Tool and Goal Setting System for Data-Based Coaching

    ERIC Educational Resources Information Center

    Crawford, April D.; Zucker, Tricia A.; Williams, Jeffrey M.; Bhavsar, Vibhuti; Landry, Susan H.

    2013-01-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based…

  15. Science in the Classroom: Finding a Balance between Autonomous Exploration and Teacher-Led Instruction in Preschool Settings

    ERIC Educational Resources Information Center

    Nayfeld, Irena; Brenneman, Kimberly; Gelman, Rochel

    2011-01-01

    Research Findings: This paper reports on children's use of science materials in preschool classrooms during their free choice time. Baseline observations showed that children and teachers rarely spend time in the designated science area. An intervention was designed to "market" the science center by introducing children to 1 science tool, the…

  16. Litho-kinematic facies model for large landslide deposits in arid settings

    SciTech Connect

    Yarnold, J.C.; Lombard, J.P.

    1989-04-01

    Reconnaissance field studies of six large landslide deposits in the southern Basin and Range suggest that a set of characteristic features is common to the deposits of large landslides in an arid setting. These include a coarse boulder cap, an upper massive zone, a lower disrupted zone, and a mixed zone overlying disturbed substrate. The upper massive zone is dominated by crackle breccia. This grades downward into a lower disrupted zone composed of a more matrix-rich breccia that is internally sheared, intruded by clastic dikes, and often contains a cataclasite layer at its base. An underlying discontinuous mixed zone is composed of material from the overlying breccia mixed with material entrained from the underlying substrate. Bedding in the substrate sometimes displays folding and contortion that die out downward. The authors' work suggests a spatial zonation of these characteristic features within many landslide deposits. In general, clastic dikes, the basal cataclasite, and folding in the substrate are observed mainly in distal parts of landslides. In most cases, total thickness, thickness of the basal disturbed and mixed zones, and the degree of internal shearing increase distally, whereas maximum clast size commonly decreases distally. Zonation of these features is interpreted to result from kinematics of emplacement that cause generally increased deformation in the distal regions of the landslide.

  17. A flexible method for estimating the fraction of fitness influencing mutations from large sequencing data sets.

    PubMed

    Moon, Sunjin; Akey, Joshua M

    2016-06-01

    A continuing challenge in the analysis of massively large sequencing data sets is quantifying and interpreting non-neutrally evolving mutations. Here, we describe a flexible and robust approach based on the site frequency spectrum to estimate the fraction of deleterious and adaptive variants from large-scale sequencing data sets. We applied our method to approximately 1 million single nucleotide variants (SNVs) identified in high-coverage exome sequences of 6515 individuals. We estimate that the fraction of deleterious nonsynonymous SNVs is higher than previously reported; quantify the effects of genomic context, codon bias, chromatin accessibility, and number of protein-protein interactions on deleterious protein-coding SNVs; and identify pathways and networks that have likely been influenced by positive selection. Furthermore, we show that the fraction of deleterious nonsynonymous SNVs is significantly higher for Mendelian versus complex disease loci and in exons harboring dominant versus recessive Mendelian mutations. In summary, as genome-scale sequencing data accumulate in progressively larger sample sizes, our method will enable increasingly high-resolution inferences into the characteristics and determinants of non-neutral variation.
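
    The site frequency spectrum on which this method is based is simply the histogram of derived-allele counts across variant sites. A minimal sketch with hypothetical SNV counts (the sample size and counts here are invented, not from the 6515-exome data set):

```python
import numpy as np

rng = np.random.default_rng(3)

n_chrom = 20  # number of sampled chromosomes (hypothetical)
# Hypothetical derived-allele counts for 1000 segregating SNVs; each count
# lies between 1 and n_chrom - 1 (fixed and absent alleles don't segregate).
counts = rng.integers(1, n_chrom, size=1000)

# The (unfolded) site frequency spectrum: entry i is the number of SNVs
# observed on exactly i of the sampled chromosomes.
sfs = np.bincount(counts, minlength=n_chrom)[1:n_chrom]
print(sfs.sum())  # every SNV lands in exactly one frequency bin
```

An excess of low-frequency bins relative to neutral expectations is the signal such methods use to estimate the fraction of deleterious variants.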

  18. Large-scale similarity search profiling of ChEMBL compound data sets.

    PubMed

    Heikamp, Kathrin; Bajorath, Jürgen

    2011-08-22

    A large-scale similarity search investigation has been carried out on 266 well-defined compound activity classes extracted from the ChEMBL database. The analysis was performed using two widely applied two-dimensional (2D) fingerprints that mark opposite ends of the current performance spectrum of these types of fingerprints, i.e., MACCS structural keys and the extended connectivity fingerprint with bond diameter four (ECFP4). For each fingerprint, three nearest neighbor search strategies were applied. On the basis of these search calculations, a similarity search profile of the ChEMBL database was generated. Overall, the fingerprint search campaign was surprisingly successful. In 203 of 266 test cases (∼76%), a compound recovery rate of at least 50% was observed with at least the better performing fingerprint and one search strategy. The similarity search profile also revealed several general trends. For example, fingerprint searching was often characterized by an early enrichment of active compounds in database selection sets. In addition, compound activity classes have been categorized according to different similarity search performance levels, which helps to put the results of benchmark calculations into perspective. Therefore, a compendium of activity classes falling into different search performance categories is provided. On the basis of our large-scale investigation, the performance range of state-of-the-art 2D fingerprinting has been delineated for compound data sets directed against a wide spectrum of pharmaceutical targets.
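
    The similarity searching evaluated here ranks database compounds by fingerprint similarity to a query, typically the Tanimoto coefficient over binary bit vectors. A minimal sketch with random bit vectors standing in for MACCS keys or ECFP4 fingerprints (the fingerprints and database here are synthetic, not ChEMBL data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy binary fingerprints (rows), standing in for MACCS / ECFP4 bit vectors.
db = rng.integers(0, 2, size=(500, 64)).astype(bool)
query = db[0]  # use a database compound as its own query

def tanimoto(q, fps):
    """Tanimoto similarity of one query fingerprint against a matrix:
    |intersection| / |union| of the set bits, row by row."""
    inter = (q & fps).sum(axis=1)
    union = (q | fps).sum(axis=1)
    return inter / union

sims = tanimoto(query, db)
nearest = np.argsort(sims)[::-1][:5]   # nearest-neighbour style ranking
print(nearest[0])
```

Ranking the whole database by this score and checking how early known actives appear is the "early enrichment" behavior the profile in this study reports.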

  19. A Technique for Moving Large Data Sets over High-Performance Long Distance Networks

    SciTech Connect

    Settlemyer, Bradley W; Dobson, Jonathan D; Hodson, Stephen W; Kuehn, Jeffery A; Poole, Stephen W; Ruwart, Thomas

    2011-01-01

    In this paper we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing the data to a remote destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes.


  20. Contextual settings, science stories, and large context problems: Toward a more humanistic science education

    NASA Astrophysics Data System (ADS)

    Stinner, Arthur

    This article addresses the need for and the problem of organizing a science curriculum around contextual settings and science stories that serve to involve and motivate students to develop an understanding of the world that is rooted in the scientific and the humanistic traditions. A program of activities placed around contextual settings, science stories, and contemporary issues of interest is recommended in an attempt to move toward a slow and secure abolition of the gulf between scientific knowledge and common sense beliefs. A conceptual development model is described to guide the connection between theory and evidence on a level appropriate for children, from early years to senior years. For the senior years it is also important to connect the activity of teaching to a sound theoretical structure. The theoretical structure must illuminate the status of theory in science, establish what counts as evidence, clarify the relationship between experiment and explanation, and make connections to the history of science. The article concludes with a proposed program of activities in terms of a sequence of theoretical and empirical experiences that involve contextual settings, science stories, large context problems, thematic teaching, and popular science literature teaching.

  1. Developing consistent Landsat data sets for large area applications: the MRLC 2001 protocol

    USGS Publications Warehouse

    Chander, G.; Huang, C.; Yang, L.; Homer, C.; Larson, C.

    2009-01-01

    One of the major efforts in large area land cover mapping over the last two decades was the completion of two U.S. National Land Cover Data sets (NLCD), developed with nominal 1992 and 2001 Landsat imagery under the auspices of the MultiResolution Land Characteristics (MRLC) Consortium. Following the successful generation of NLCD 1992, a second generation MRLC initiative was launched with two primary goals: (1) to develop a consistent Landsat imagery data set for the U.S. and (2) to develop a second generation National Land Cover Database (NLCD 2001). One of the key enhancements was the formulation of an image preprocessing protocol and implementation of a consistent image processing method. The core data set of the NLCD 2001 database consists of Landsat 7 Enhanced Thematic Mapper Plus (ETM+) images. This letter details the procedures for processing the original ETM+ images and more recent scenes added to the database. NLCD 2001 products include Anderson Level II land cover classes, percent tree canopy, and percent urban imperviousness at 30-m resolution derived from Landsat imagery. The products are freely available for download to the general public from the MRLC Consortium Web site at http://www.mrlc.gov.

  2. Hierarchical Unbiased Graph Shrinkage (HUGS): A Novel Groupwise Registration for Large Data Set

    PubMed Central

    Ying, Shihui; Wu, Guorong; Wang, Qian; Shen, Dinggang

    2014-01-01

    Normalizing all images in a large data set into a common space is a key step in many clinical and research studies, e.g., for brain development, maturation, and aging. Recently, groupwise registration has been developed for simultaneous alignment of all images without selecting a particular image as template, thus potentially avoiding bias in the registration. However, most conventional groupwise registration methods do not explore the data distribution during the image registration. Thus, their performance could be affected by large inter-subject variations in the data set under registration. To solve this potential issue, we propose to use a graph to model the distribution of all image data sitting on the image manifold, with each node representing an image and each edge representing the geodesic pathway between two nodes (or images). Then, the procedure of warping all images to their population center turns into the dynamic shrinking of the graph nodes along their graph edges until all graph nodes become close to each other. Thus, the topology of image distribution on the image manifold is always preserved during the groupwise registration. More importantly, by modeling the distribution of all images via a graph, we can potentially reduce registration error since every time each image is warped only according to its nearby images with similar structures in the graph. We have evaluated our proposed groupwise registration method on both infant and adult data sets, by also comparing with the conventional group-mean based registration and the ABSORB methods. All experimental results show that our proposed method can achieve better performance in terms of registration accuracy and robustness. PMID:24055505

  3. Contamination in the MACHO data set and the puzzle of Large Magellanic Cloud microlensing

    NASA Astrophysics Data System (ADS)

    Griest, Kim; Thomas, Christian L.

    2005-05-01

    In a recent series of three papers, Belokurov, Evans & Le Du and Evans & Belokurov reanalysed the MACHO collaboration data and gave alternative sets of microlensing events and an alternative optical depth to microlensing towards the Large Magellanic Cloud (LMC). Although these authors examined less than 0.2 per cent of the data, they reported that by using a neural net program they had reliably selected a better (and smaller) set of microlensing candidates. Estimating the optical depth from this smaller set, they claimed that the MACHO collaboration overestimated the optical depth by a significant factor and that the MACHO microlensing experiment is consistent with lensing by known stars in the Milky Way and LMC. As we show below, the analysis by these authors contains several errors, and as a result their conclusions are incorrect. Their efficiency analysis is in error, and since they did not search through the entire MACHO data set, they do not know how many microlensing events their neural net would find in the data nor what optical depth their method would give. Examination of their selected events suggests that their method misses low signal-to-noise ratio events and thus would have lower efficiency than the MACHO selection criteria. In addition, their method is likely to give many more false positives (non-lensing events identified as lensing). Both effects would increase their estimated optical depth. Finally, we note that the EROS discovery that LMC event 23 is a variable star reduces the MACHO collaboration estimates of optical depth and the Macho halo fraction by around 8 per cent, and does open the question of additional contamination.

  4. Suffix tree searcher: exploration of common substrings in large DNA sequence sets

    PubMed Central

    2014-01-01

    Background Large DNA sequence data sets require special bioinformatics tools to search and compare them. Such tools should be easy to use so that the data can be easily accessed by a wide array of researchers. In the past, the use of suffix trees for searching DNA sequences has been limited by a practical need to keep the trees in RAM. Newer algorithms solve this problem by using disk-based approaches. However, none of the fastest suffix tree algorithms have been implemented with a graphical user interface, preventing their incorporation into a feasible laboratory workflow. Results Suffix Tree Searcher (STS) is designed as an easy-to-use tool to index, search, and analyze very large DNA sequence datasets. The program accommodates very large numbers of very large sequences, with aggregate size reaching tens of billions of nucleotides. The program makes use of pre-sorted persistent "building blocks" to reduce the time required to construct new trees. STS is comprised of a graphical user interface written in Java, and four C modules. All components are automatically downloaded when a web link is clicked. The underlying suffix tree data structure permits extremely fast searching for specific nucleotide strings, with wild cards or mismatches allowed. Complete tree traversals for detecting common substrings are also very fast. The graphical user interface allows the user to transition seamlessly between building, traversing, and searching the dataset. Conclusions Thus, STS provides a new resource for the detection of substrings common to multiple DNA sequences or within a single sequence, for truly huge data sets. The re-searching of sequence hits, allowing wild card positions or mismatched nucleotides, together with the ability to rapidly retrieve large numbers of sequence hits from the DNA sequence files, provides the user with an efficient method of evaluating the similarity between nucleotide sequences by multiple alignment or use of Logos. The ability to re
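
    The central query STS answers, substrings common to multiple sequences, runs in linear time on a suffix tree. As a hedged illustration of the query itself (not of the STS implementation or its disk-based suffix trees), here is the classic dynamic-programming version for two sequences, which is quadratic but easy to follow:

```python
def longest_common_substring(a, b):
    """Longest substring shared by two sequences. A suffix tree answers
    this in linear time; this O(len(a) * len(b)) DP version only
    illustrates the result on small inputs."""
    best, best_end = 0, 0
    prev = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                cur[j] = prev[j - 1] + 1        # extend the running match
                if cur[j] > best:
                    best, best_end = cur[j], i  # record where it ends in a
        prev = cur
    return a[best_end - best:best_end]

print(longest_common_substring("ACGTACGTGG", "TTACGTGGA"))  # → "TACGTGG"
```

For the tens-of-billions-of-nucleotides data sets the abstract describes, this quadratic scan is hopeless, which is exactly why STS builds disk-based suffix trees instead.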

  5. The Large Synoptic Survey Telescope and Foundations for Data Exploitation of Petabyte Data Sets

    SciTech Connect

    Cook, K H; Nikolaev, S; Huber, M E

    2007-02-26

    The next generation of imaging surveys in astronomy, such as the Large Synoptic Survey Telescope (LSST), will require multigigapixel cameras that can process enormous amounts of data read out every few seconds. This huge increase in data throughput (compared to megapixel cameras and minute- to hour-long integrations of today's instruments) calls for a new paradigm for extracting the knowledge content. We have developed foundations for this new approach. In this project, we have studied the necessary processes for extracting information from large time-domain databases and handling their systematics. In the process, we have produced significant scientific breakthroughs by developing new methods to probe both the elusive time and spatial variations in astrophysics data sets from the SuperMACHO (Massive Compact Halo Objects) survey, the Lowell Observatory Near-Earth Object Search (LONEOS), and the Taiwanese American Occultation Survey (TAOS). This project continues to contribute to the development of the scientific foundations for future wide-field, time-domain surveys. Our algorithm and pipeline development has provided the building blocks for the development of the LSST science software system. Our database design and performance measures have helped to size and constrain LSST database design. LLNL made significant contributions to the foundations of the LSST, which has applications for large-scale imaging and data-mining activities at LLNL. These developments are being actively applied to the previously mentioned surveys, producing important scientific results that have been released to the scientific community; more continue to be published and referenced, enhancing LLNL's scientific stature.

  6. Twelve- to 14-Month-Old Infants Can Predict Single-Event Probability with Large Set Sizes

    ERIC Educational Resources Information Center

    Denison, Stephanie; Xu, Fei

    2010-01-01

    Previous research has revealed that infants can reason correctly about single-event probabilities with small but not large set sizes (Bonatti, 2008; Teglas et al., 2007). The current study asks whether infants can make predictions regarding single-event probability with large set sizes using a novel procedure. Infants completed two trials: A…

  7. Radiometric Normalization of Large Airborne Image Data Sets Acquired by Different Sensor Types

    NASA Astrophysics Data System (ADS)

    Gehrke, S.; Beshah, B. T.

    2016-06-01

    successfully applied to large sets of heterogeneous imagery, including the adjustment of original sensor images prior to quality control and further processing as well as radiometric adjustment for ortho-image mosaic generation.

  8. Innovation from within the Box: Evaluation of Online Problem Sets in a Series of Large Lecture Undergraduate Science Courses.

    ERIC Educational Resources Information Center

    Schaeffer, Evonne; Bhargava, Tina; Nash, John; Kerns, Charles; Stocker, Scott

    A technology-mediated solution to enhance the learning experience for students in a large lecture setting was evaluated. Online problem sets were developed to engage students in the content of a human biology course and implemented in the classes of eight faculty coordinators. The weekly problem sets contained several multiple choice problems,…

  9. Child and setting characteristics affecting the adult talk directed at preschoolers with autism spectrum disorder in the inclusive classroom.

    PubMed

    Irvin, Dwight W; Boyd, Brian A; Odom, Samuel L

    2015-02-01

    Difficulty with social competence is a core deficit of autism spectrum disorder. Research on typically developing children and children with disabilities, in general, suggests the adult talk received in the classroom is related to their social development. The aims of this study were to examine (1) the types and amounts of adult talk children with autism spectrum disorder are exposed to in the preschool classroom and (2) the associations between child characteristics (e.g. language), activity area, and adult talk. Kontos' Teacher Talk classification was used to code videos approximately 30 min in length of 73 children with autism spectrum disorder (ages 3-5) in inclusive classrooms (n = 33) during center time. The results indicated practical/personal assistance was the most common type of adult talk coded, and behavior management talk least often coded. Child characteristics (i.e. age and autism severity) and activity area were found to be related to specific types of adult talk. Given the findings, implications for future research are discussed.

  11. Perl One-Liners: Bridging the Gap Between Large Data Sets and Analysis Tools.

    PubMed

    Hokamp, Karsten

    2015-01-01

    Computational analyses of biological data are becoming increasingly powerful, and researchers intending to carry out their own analyses can often choose from a wide array of tools and resources. However, their application might be obstructed by the wide variety of different data formats that are in use, from standard, commonly used formats to output files from high-throughput analysis platforms. The latter are often too large to be opened, viewed, or edited by standard programs, potentially leading to a bottleneck in the analysis. Perl one-liners provide a simple solution to quickly reformat, filter, and merge data sets in preparation for downstream analyses. This chapter presents example code that can be easily adjusted to meet individual requirements. An online version is available at http://bioinf.gen.tcd.ie/pol. PMID:26498621
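    As a concrete, hedged illustration of the reformat-and-filter idiom the chapter teaches, the sketch below re-expresses a typical one-liner in self-contained Python; the Perl line in the comment uses the standard -l, -a, -n, -e autosplit switches, and the file layout is invented:

```python
import io

def filter_columns(stream, min_value=100.0):
    r"""Keep tab-separated rows whose third column exceeds `min_value`
    and emit columns 1 and 3, mirroring a one-liner such as:
    perl -F'\t' -lane 'print "$F[0]\t$F[2]" if $F[2] > 100' data.tsv"""
    out = []
    for line in stream:
        fields = line.rstrip("\n").split("\t")
        if float(fields[2]) > min_value:
            out.append(fields[0] + "\t" + fields[2])
    return out

rows = filter_columns(io.StringIO("geneA\tchr1\t250.0\ngeneB\tchr2\t12.5\n"))
print(rows)  # ['geneA\t250.0']
```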

  12. Application of de Novo Sequencing to Large-Scale Complex Proteomics Data Sets.

    PubMed

    Devabhaktuni, Arun; Elias, Joshua E

    2016-03-01

    Dependent on concise, predefined protein sequence databases, traditional search algorithms perform poorly when analyzing mass spectra derived from wholly uncharacterized protein products. Conversely, de novo peptide sequencing algorithms can interpret mass spectra without relying on reference databases. However, such algorithms have been difficult to apply to complex protein mixtures, in part due to a lack of methods for automatically validating de novo sequencing results. Here, we present novel metrics for benchmarking de novo sequencing algorithm performance on large-scale proteomics data sets and present a method for accurately calibrating false discovery rates on de novo results. We also present a novel algorithm (LADS) that leverages experimentally disambiguated fragmentation spectra to boost sequencing accuracy and sensitivity. LADS improves sequencing accuracy on longer peptides relative to that of other algorithms and improves discriminability of correct and incorrect sequences. Using these advancements, we demonstrate accurate de novo identification of peptide sequences not identifiable using database search-based approaches. PMID:26743026

  13. Generating extreme weather event sets from very large ensembles of regional climate models

    NASA Astrophysics Data System (ADS)

    Massey, Neil; Guillod, Benoit; Otto, Friederike; Allen, Myles; Jones, Richard; Hall, Jim

    2015-04-01

    Extreme events can have large impacts on societies and are therefore being increasingly studied. In particular, climate change is expected to impact the frequency and intensity of these events. However, a major limitation when investigating extreme weather events is that, by definition, only few events are present in observations. A way to overcome this issue is to use large ensembles of model simulations. Using the volunteer distributed computing (VDC) infrastructure of weather@home [1], we run a very large number (tens of thousands) of RCM simulations over the European domain at a resolution of 25 km, with an improved land-surface scheme, nested within a free-running GCM. Using observations for the GCM boundary forcings, we can run historical "hindcast" simulations over the past 100 to 150 years. This allows us, due to the chaotic variability of the atmosphere, to ascertain how likely an extreme event was, given the boundary forcings, and to derive synthetic event sets. The events in these sets did not actually occur in the observed record but could have occurred given the boundary forcings, with an associated probability. The event sets contain time series of fields of meteorological variables that allow impact modellers to assess the loss the event would incur. Projections of events into the future are achieved by modelling projections of the sea-surface temperature (SST) and sea-ice boundary forcings, by combining the variability of the SST in the observed record with a range of warming signals derived from the varying responses of SSTs in the CMIP5 ensemble to elevated greenhouse gas (GHG) emissions in three RCP scenarios.
Simulating the future with a

  14. fastSTRUCTURE: variational inference of population structure in large SNP data sets.

    PubMed

    Raj, Anil; Stephens, Matthew; Pritchard, Jonathan K

    2014-06-01

    Tools for estimating population structure from genetic data are now used in a wide variety of applications in population genetics. However, inferring population structure in large modern data sets imposes severe computational challenges. Here, we develop efficient algorithms for approximate inference of the model underlying the STRUCTURE program using a variational Bayesian framework. Variational methods pose the problem of computing relevant posterior distributions as an optimization problem, allowing us to build on recent advances in optimization theory to develop fast inference tools. In addition, we propose useful heuristic scores to identify the number of populations represented in a data set and a new hierarchical prior to detect weak population structure in the data. We test the variational algorithms on simulated data and illustrate using genotype data from the CEPH-Human Genome Diversity Panel. The variational algorithms are almost two orders of magnitude faster than STRUCTURE and achieve accuracies comparable to those of ADMIXTURE. Furthermore, our results show that the heuristic scores for choosing model complexity provide a reasonable range of values for the number of populations represented in the data, with minimal bias toward detecting structure when it is very weak. Our algorithm, fastSTRUCTURE, is freely available online at http://pritchardlab.stanford.edu/structure.html.

  15. Motif-based analysis of large nucleotide data sets using MEME-ChIP.

    PubMed

    Ma, Wenxiu; Noble, William S; Bailey, Timothy L

    2014-01-01

    MEME-ChIP is a web-based tool for analyzing motifs in large DNA or RNA data sets. It can analyze peak regions identified by ChIP-seq, cross-linking sites identified by CLIP-seq and related assays, as well as sets of genomic regions selected using other criteria. MEME-ChIP performs de novo motif discovery, motif enrichment analysis, motif location analysis and motif clustering, providing a comprehensive picture of the DNA or RNA motifs that are enriched in the input sequences. MEME-ChIP performs two complementary types of de novo motif discovery: weight matrix-based discovery for high accuracy; and word-based discovery for high sensitivity. Motif enrichment analysis using DNA or RNA motifs from human, mouse, worm, fly and other model organisms provides even greater sensitivity. MEME-ChIP's interactive HTML output groups and aligns significant motifs to ease interpretation. This protocol takes less than 3 h, and it provides motif discovery approaches that are distinct and complementary to other online methods. PMID:24853928
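    Weight matrix-based discovery, mentioned above, represents a motif as one probability distribution over A/C/G/T per column; scanning a sequence for the best-scoring window is then a log-odds sum. A minimal sketch (not MEME-ChIP's implementation; the uniform 0.25 background and all names are assumptions):

```python
import math

def pwm_score(pwm, window, background=0.25):
    """Log2-odds score of `window` against a position weight matrix,
    given as one dict of base -> probability per motif column."""
    return sum(math.log2(col[base] / background)
               for col, base in zip(pwm, window))

def best_hit(pwm, seq):
    """Best-scoring window of the motif's width anywhere in `seq`."""
    w = len(pwm)
    return max(((i, pwm_score(pwm, seq[i:i + w]))
                for i in range(len(seq) - w + 1)),
               key=lambda hit: hit[1])

# A two-column motif strongly preferring "AC":
pwm = [{"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
       {"A": 0.1, "C": 0.7, "G": 0.1, "T": 0.1}]
print(best_hit(pwm, "TTACTT"))  # the "AC" at position 2 scores highest
```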

  16. Ghost transmission: How large basis sets can make electron transport calculations worse

    SciTech Connect

    Herrmann, Carmen; Solomon, Gemma C.; Subotnik, Joseph E.; Mujica, Vladimiro; Ratner, Mark A.

    2010-01-01

    The Landauer approach has proven to be an invaluable tool for calculating the electron transport properties of single molecules, especially when combined with a nonequilibrium Green’s function approach and Kohn–Sham density functional theory. However, when using large nonorthogonal atom-centered basis sets, such as those common in quantum chemistry, one can find erroneous results if the Landauer approach is applied blindly. In fact, basis sets of triple-zeta quality or higher sometimes result in an artificially high transmission and possibly even qualitatively wrong conclusions regarding chemical trends. In these cases, transport persists when molecular atoms are replaced by basis functions alone (“ghost atoms”). The occurrence of such ghost transmission is correlated with low-energy virtual molecular orbitals of the central subsystem and may be interpreted as a biased and thus inaccurate description of vacuum transmission. An approximate practical correction scheme is to calculate the ghost transmission and subtract it from the full transmission. As a further consequence of this study, it is recommended that sensitive molecules be used for parameter studies, in particular those whose transmission functions show antiresonance features such as benzene-based systems connected to the electrodes in meta positions and other low-conducting systems such as alkanes and silanes.

  18. Validating a large geophysical data set - Experiences with satellite-derived cloud parameters

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie

    1991-01-01

    The goal of this study is to validate the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and to use the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The need to work with Level 2 (point) data, rather than Level 3 (gridded) data for validation purposes is discussed, and some techniques developed for charting the assumptions made in deriving an algorithm and generating a code to produce geophysical quantities from measured radiances are presented.

  19. ECOSAR model performance with a large test set of industrial chemicals.

    PubMed

    Reuschenbach, Peter; Silvani, Maurizio; Dammann, Martina; Warnecke, Dietmar; Knacker, Thomas

    2008-05-01

    The widely used ECOSAR computer programme for QSAR prediction of chemical toxicity towards aquatic organisms was evaluated by using large data sets of industrial chemicals with varying molecular structures. Experimentally derived toxicity data covering acute effects on fish and Daphnia and green algae growth inhibition, for more than 1,000 randomly selected substances in total, were compared to the prediction results of the ECOSAR programme in order (1) to assess the capability of ECOSAR to correctly classify the chemicals into defined classes of aquatic toxicity according to rules of EU regulation and (2) to determine the number of correct predictions within tolerance factors from 2 to 1,000. Regarding ecotoxicity classification, 65% (fish), 52% (Daphnia) and 49% (algae) of the substances were correctly predicted into the classes "not harmful", "harmful", "toxic" and "very toxic". At all trophic levels about 20% of the chemicals were underestimated in their toxicity. The class of "not harmful" substances (experimental LC/EC50 > 100 mg/l) represents nearly half of the whole data set. The percentages for correct predictions of toxic effects on fish, Daphnia and algae growth inhibition were 69%, 64% and 60%, respectively, when a tolerance factor of 10 was allowed. Focussing on those experimental results which were verified by analytically measured concentrations, the predictability for Daphnia and algae toxicity was improved by approximately three percentage points, whereas for fish no improvement was determined. The calculated correlation coefficients demonstrated poor correlation when the complete data set was taken, but showed good results for some of the ECOSAR chemical classes. The results are discussed in the context of literature data on the performance of ECOSAR and other QSAR models.
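    The four classification bands and the factor-of-k scoring used in the evaluation are easy to state precisely. The sketch below assumes the conventional EU aquatic-toxicity cut-offs (≤1, 1-10, 10-100, >100 mg/l); only the >100 mg/l "not harmful" bound is stated explicitly in the abstract, so treat the lower band edges as assumptions:

```python
def toxicity_class(ec50_mg_per_l):
    """Map an acute LC/EC50 (mg/l) to the four classes named above.
    Band edges below 100 mg/l are assumed, not quoted from the paper."""
    if ec50_mg_per_l <= 1:
        return "very toxic"
    if ec50_mg_per_l <= 10:
        return "toxic"
    if ec50_mg_per_l <= 100:
        return "harmful"
    return "not harmful"

def within_tolerance(predicted, measured, factor=10):
    """Count a prediction as correct if it lies within a factor of
    `factor` of the measured value (the tolerance-factor criterion)."""
    return measured / factor <= predicted <= measured * factor

print(toxicity_class(150))      # not harmful
print(within_tolerance(5, 40))  # True: within a factor of 10
```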

  20. A multivariate approach to filling gaps in large ecological data sets using probabilistic matrix factorization techniques

    NASA Astrophysics Data System (ADS)

    Schrodt, F. I.; Shan, H.; Kattge, J.; Reich, P.; Banerjee, A.; Reichstein, M.

    2012-12-01

    With the advent of remotely sensed data and coordinated efforts to create global databases, the ecological community has progressively become more data-intensive. However, in contrast to other disciplines, statistical ways of handling these large data sets, especially the gaps which are inherent to them, are lacking. Widely used theoretical approaches, for example model averaging based on Akaike's information criterion (AIC), are sensitive to missing values. Yet, the most common way of handling sparse matrices - the deletion of cases with missing data (complete case analysis) - is known to severely reduce statistical power as well as inducing biased parameter estimates. In order to address these issues, we present novel approaches to gap filling in large ecological data sets using matrix factorization techniques. Factorization based matrix completion was developed in a recommender system context and has since been widely used to impute missing data in fields outside the ecological community. Here, we evaluate the effectiveness of probabilistic matrix factorization techniques for imputing missing data in ecological matrices using two imputation techniques. Hierarchical Probabilistic Matrix Factorization (HPMF) effectively incorporates hierarchical phylogenetic information (phylogenetic group, family, genus, species and individual plant) into the trait imputation. Kernelized Probabilistic Matrix Factorization (KPMF) on the other hand includes environmental information (climate and soils) into the matrix factorization through kernel matrices over rows and columns. We test the accuracy and effectiveness of HPMF and KPMF in filling sparse matrices, using the TRY database of plant functional traits (http://www.try-db.org). TRY is one of the largest global compilations of plant trait databases (750 traits of 1 million plants), encompassing data on morphological, anatomical, biochemical, phenological and physiological features of plants. 
However, despite the unprecedented
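    Stripped of the hierarchical priors (HPMF) and kernels (KPMF), the core matrix-completion idea is to factor the trait matrix from its observed entries only and read predictions off the factors at the gaps. A deliberately minimal, non-probabilistic sketch (all names and settings invented; TRY-scale data would need the real methods):

```python
import random

def mf_impute(matrix, rank=2, steps=4000, lr=0.05, reg=0.02, seed=0):
    """Fill `None` entries of `matrix` by fitting it as a product of two
    low-rank factors, using SGD over the observed entries only."""
    rng = random.Random(seed)
    n, m = len(matrix), len(matrix[0])
    U = [[rng.gauss(0, 0.1) for _ in range(rank)] for _ in range(n)]
    V = [[rng.gauss(0, 0.1) for _ in range(rank)] for _ in range(m)]
    observed = [(i, j, matrix[i][j]) for i in range(n) for j in range(m)
                if matrix[i][j] is not None]
    for _ in range(steps):
        i, j, x = observed[rng.randrange(len(observed))]
        err = x - sum(U[i][k] * V[j][k] for k in range(rank))
        for k in range(rank):
            u, v = U[i][k], V[j][k]
            U[i][k] += lr * (err * v - reg * u)  # regularized SGD step
            V[j][k] += lr * (err * u - reg * v)
    return [[matrix[i][j] if matrix[i][j] is not None
             else sum(U[i][k] * V[j][k] for k in range(rank))
             for j in range(m)] for i in range(n)]

grid = [[1, 2, 3], [2, 4, 6], [3, 6, None]]
print(mf_impute(grid, rank=1)[2][2])  # close to 9 for this rank-1 example
```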

  1. Evaluation of flow resistance in gravel-bed rivers through a large field data set

    NASA Astrophysics Data System (ADS)

    Rickenmann, Dieter; Recking, Alain

    2011-07-01

    A data set of 2890 field measurements was used to test the ability of several conventional flow resistance equations to predict mean flow velocity in gravel bed rivers when used with no calibration. The tests were performed using both flow depth and discharge as input since discharge may be a more reliable measure of flow conditions in shallow flows. Generally better predictions are obtained when using flow discharge as input. The results indicate that the Manning-Strickler and the Keulegan equations show considerable disagreement with observed flow velocities for flow depths smaller than 10 times the characteristic grain diameter. Most equations show some systematic deviation for small relative flow depth. The use of new definitions for dimensionless variables in terms of nondimensional hydraulic geometry equations allows the development of a new flow resistance equation. The best overall performance is obtained by the Ferguson approach, which combines two power law flow resistance equations that are different for deep and shallow flows. To use this approach with flow discharge as input, a logarithmic matching equation in terms of the new dimensionless variables is proposed. For the domains of intermediate and large-scale roughness, the field data indicate a considerable increase in flow resistance as compared with the domain of small-scale roughness. The Ferguson approach is used to discuss the importance of flow resistance partitioning for bed load transport calculations at flow conditions with intermediate- and large-scale roughness in natural gravel, cobble, and boulder bed streams.
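    The Ferguson approach referenced above blends a Manning-Strickler-type law for deep flows with a roughness-layer law for shallow flows in a single variable-power expression. A sketch, assuming the commonly quoted form and constants (a1 = 6.5, a2 = 2.5, and the use of D84 as the characteristic grain size, are all assumptions here):

```python
import math

def ferguson_velocity(depth, d84, slope, a1=6.5, a2=2.5, g=9.81):
    """Mean velocity (m/s) from the variable-power resistance equation
    (8/f)^0.5 = a1*a2*(d/D84) / sqrt(a1^2 + a2^2*(d/D84)^(5/3)),
    multiplied by the shear velocity u* = sqrt(g*d*S)."""
    u_star = math.sqrt(g * depth * slope)
    rel = depth / d84  # relative submergence d/D84
    resistance = a1 * a2 * rel / math.sqrt(a1**2 + a2**2 * rel**(5 / 3))
    return u_star * resistance

deep = ferguson_velocity(depth=1.0, d84=0.1, slope=0.01)
shallow = ferguson_velocity(depth=0.05, d84=0.1, slope=0.01)
print(round(deep, 2), round(shallow, 2))
```

    In the deep-flow limit the expression tends to a1*(d/D84)^(1/6), Manning-Strickler-like; in the shallow limit it tends to a2*(d/D84), reproducing the much higher flow resistance at intermediate- and large-scale roughness noted in the abstract.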

  2. Galaxy Evolution Insights from Spectral Modeling of Large Data Sets from the Sloan Digital Sky Survey

    SciTech Connect

    Hoversten, Erik A.

    2007-10-01

    This thesis centers on the use of spectral modeling techniques on data from the Sloan Digital Sky Survey (SDSS) to gain new insights into current questions in galaxy evolution. The SDSS provides a large, uniform, high quality data set which can be exploited in a number of ways. One avenue pursued here is to use the large sample size to measure precisely the mean properties of galaxies of increasingly narrow parameter ranges. The other route taken is to look for rare objects which open up for exploration new areas in galaxy parameter space. The crux of this thesis is revisiting the classical Kennicutt method for inferring the stellar initial mass function (IMF) from the integrated light properties of galaxies. A large data set (~10^5 galaxies) from the SDSS DR4 is combined with more in-depth modeling and quantitative statistical analysis to search for systematic IMF variations as a function of galaxy luminosity. Galaxy Hα equivalent widths are compared to a broadband color index to constrain the IMF. It is found that for the sample as a whole the best fitting IMF power law slope above 0.5 M☉ is Γ = 1.5 ± 0.1, with the error dominated by systematics. Galaxies brighter than around M_r,0.1 = -20 (including galaxies like the Milky Way, which has M_r,0.1 ~ -21) are well fit by a universal Γ ~ 1.4 IMF, similar to the classical Salpeter slope, and smooth, exponential star formation histories (SFH). Fainter galaxies prefer steeper IMFs, and the quality of the fits reveals that for these galaxies a universal IMF with smooth SFHs is actually a poor assumption. Related projects are also pursued. A targeted photometric search is conducted for strongly lensed Lyman break galaxies (LBG) similar to MS1512-cB58. The evolution of the photometric selection technique is described, as are the results of spectroscopic follow-up of the best targets. The serendipitous discovery of two interesting blue compact dwarf galaxies is reported. These

  3. Measurement, visualization and analysis of extremely large data sets with a nanopositioning and nanomeasuring machine

    NASA Astrophysics Data System (ADS)

    Birli, O.; Franke, K.-H.; Linß, G.; Machleidt, T.; Manske, E.; Schale, F.; Schwannecke, H.-C.; Sparrer, E.; Weiß, M.

    2013-04-01

    Nanopositioning and nanomeasuring machines (NPM machines) developed at the Ilmenau University of Technology allow the measurement of micro- and nanostructures with nanometer precision in a measurement volume of 25 mm × 25 mm × 5 mm (NMM-1) or 200 mm × 200 mm × 25 mm (NPMM-200). Various visual, tactile or atomic force sensors can all be used to measure specimens. Atomic force sensors have emerged as a powerful tool in nanotechnology. Large-scale AFM measurements are very time-consuming and in fact in a practical sense they are impossible over millimeter ranges due to low scanning speeds. A cascaded multi-sensor system can be used to implement a multi-scale measurement and testing strategy for nanopositioning and nanomeasuring machines. This approach involves capturing an overview image at the limit of optical resolution and automatically scanning the measured data for interesting test areas that are suitable for a higher-resolution measurement. These "fields of interest" can subsequently be measured in the same NPM machine using individual AFM sensor scans. The results involve extremely large data sets that cannot be handled by off-the-shelf software. Quickly navigating within terabyte-sized data files requires preprocessing to be done on the measured data to calculate intermediate images based on the principle of a visualization pyramid. This pyramid includes the measured data of the entire volume, prepared in the form of discrete measurement volumes (spatial tiles or cubes) with certain edge lengths at specific zoom levels. The functionality of the closed process chain is demonstrated using a blob analysis for automatically selecting regions of interest on the specimen. As expected, processing large amounts of data places particularly high demands on both computing power and the software architecture.
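    The visualization pyramid described above stores precomputed intermediate images per zoom level so that navigation never touches the full terabyte-scale file. A toy 2-D sketch of the downsampling step (the real pipeline tiles 3-D measurement volumes; names are invented):

```python
def build_pyramid(level0):
    """Build zoom levels by 2x2 block averaging until a single cell
    remains; each level is a plain list-of-lists 'image'."""
    levels = [level0]
    cur = level0
    while len(cur) > 1 and len(cur[0]) > 1:
        cur = [[(cur[i][j] + cur[i][j + 1]
                 + cur[i + 1][j] + cur[i + 1][j + 1]) / 4
                for j in range(0, len(cur[0]) - 1, 2)]
               for i in range(0, len(cur) - 1, 2)]
        levels.append(cur)
    return levels

levels = build_pyramid([[1.0] * 4 for _ in range(4)])
print([len(lvl) for lvl in levels])  # [4, 2, 1]
```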

  4. Listserv Lemmings and Fly-brarians on the Wall: A Librarian-Instructor Team Taming the Cyberbeast in the Large Classroom.

    ERIC Educational Resources Information Center

    Dickstein, Ruth; McBride, Kari Boyd

    1998-01-01

    Computer technology can empower students if they have the tools to find their way through print and online sources. This article describes how a reference librarian and a faculty instructor collaborated to teach research strategies and critical thinking skills (including analysis and evaluation of resources) in a large university classroom using a…

  5. Any Questions? An Application of Weick's Model of Organizing to Increase Student Involvement in the Large-Lecture Classroom

    ERIC Educational Resources Information Center

    Ledford, Christy J. W.; Saperstein, Adam K.; Cafferty, Lauren A.; McClintick, Stacey H.; Bernstein, Ethan M.

    2015-01-01

    Microblogs, with their interactive nature, can engage students in community building and sensemaking. Using Weick's model of organizing as a framework, we integrated the use of micromessaging to increase student engagement in the large-lecture classroom. Students asked significantly more questions and asked a greater diversity of questions…

  6. Issues in Estimating Program Effects and Studying Implementation in Large-Scale Educational Experiments: The Case of a Connected Classroom Technology Program

    ERIC Educational Resources Information Center

    Shin, Hye Sook

    2009-01-01

    Using data from a nationwide, large-scale experimental study of the effects of a connected classroom technology on student learning in algebra (Owens et al., 2004), this dissertation focuses on challenges that can arise in estimating treatment effects in educational field experiments when samples are highly heterogeneous in terms of various…

  7. An Evaluation of the Developmental Designs Approach and Professional Development Model on Classroom Management in 22 Middle Schools in a Large, Midwestern School District

    ERIC Educational Resources Information Center

    Hough, David L.

    2011-01-01

    This study presents findings from an evaluation of the Developmental Designs classroom management approach and professional development model during its first year of implementation across 22 middle schools in a large, Midwestern school district. The impact of this professional development model on teaching and learning as related to participants'…

  8. Efficient Implementation of an Optimal Interpolator for Large Spatial Data Sets

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; Mount, David M.

    2007-01-01

    Scattered data interpolation is a problem of interest in numerous areas such as electronic imaging, smooth surface modeling, and computational geometry. Our motivation arises from applications in geology and mining, which often involve large scattered data sets and a demand for high accuracy. The method of choice is ordinary kriging, because it is the best linear unbiased estimator. Unfortunately, this interpolant is computationally very expensive to compute exactly. For n scattered data points, computing the value of a single interpolant involves solving a dense linear system of size roughly n x n. This is infeasible for large n. In practice, kriging is solved approximately by local approaches that are based on considering only a relatively small number of points that lie close to the query point. There are many problems with this local approach, however. The first is that determining the proper neighborhood size is tricky; it is usually solved by ad hoc methods such as selecting a fixed number of nearest neighbors or all the points lying within a fixed radius. Such fixed neighborhood sizes may not work well for all query points, depending on the local density of the point distribution. Local methods also suffer from the problem that the resulting interpolant is not continuous. Meyer showed that while kriging produces smooth continuous surfaces, it has zero-order continuity along its borders. Thus, at interface boundaries where the neighborhood changes, the interpolant behaves discontinuously. Therefore, it is important to consider and solve the global system for each interpolant. However, solving such large dense systems for each query point is impractical. Recently a more principled approach to approximating kriging has been proposed, based on a technique called covariance tapering. The problems arise from the fact that the covariance functions that are used in kriging have global support. Our implementations combine, utilize, and enhance a number of different
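    The global system discussed above is compact to write down: an (n+1) x (n+1) matrix of pairwise covariances plus an unbiasedness constraint, solved once per query point. A self-contained sketch of ordinary kriging with an assumed exponential covariance model (illustrative only; it shows the cost the authors set out to tame, not their tapering method):

```python
import math

def solve(a, b):
    """Gaussian elimination with partial pivoting, for small dense systems."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def ordinary_krige(pts, vals, query, cov=lambda h: math.exp(-h)):
    """Ordinary kriging estimate at `query` using ALL n points (the
    global system); `cov` is an assumed exponential covariance model."""
    n = len(pts)
    a = [[cov(math.dist(pts[i], pts[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    a.append([1.0] * n + [0.0])  # unbiasedness: weights sum to 1
    b = [cov(math.dist(query, p)) for p in pts] + [1.0]
    weights = solve(a, b)[:n]    # drop the Lagrange multiplier
    return sum(w * z for w, z in zip(weights, vals))
```

    Each query performs a dense solve here, which is exactly why, at large n, practitioners fall back on local neighborhoods or on approximations such as the covariance tapering discussed above. Note that, with no nugget term, kriging is an exact interpolator: querying at a data point returns that point's value.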

  9. PORTAAL: A Classroom Observation Tool Assessing Evidence-Based Teaching Practices for Active Learning in Large Science, Technology, Engineering, and Mathematics Classes.

    PubMed

    Eddy, Sarah L; Converse, Mercedes; Wenderoth, Mary Pat

    2015-01-01

    There is extensive evidence that active learning works better than a completely passive lecture. Despite this evidence, adoption of these evidence-based teaching practices remains low. In this paper, we offer one tool to help faculty members implement active learning. This tool identifies 21 readily implemented elements that have been shown to increase student outcomes related to achievement, logic development, or other relevant learning goals with college-age students. Thus, this tool both clarifies the research-supported elements of best practices for instructor implementation of active learning in the classroom setting and measures instructors' alignment with these practices. We describe how we reviewed the discipline-based education research literature to identify best practices in active learning for adult learners in the classroom and used these results to develop an observation tool (Practical Observation Rubric To Assess Active Learning, or PORTAAL) that documents the extent to which instructors incorporate these practices into their classrooms. We then use PORTAAL to explore the classroom practices of 25 introductory biology instructors who employ some form of active learning. Overall, PORTAAL documents how well aligned classrooms are with research-supported best practices for active learning and provides specific feedback and guidance to instructors to allow them to identify what they do well and what could be improved. PMID:26033871

  12. Opening the Black Box: Prospects for Using International Large-Scale Assessments to Explore Classroom Effects

    ERIC Educational Resources Information Center

    Schmidt, William H.; Burroughs, Nathan A.

    2013-01-01

    In this article, the authors review International Large-Scale Assessment (ILSA)-based research over the last several decades, with specific attention on cross-national analysis of mean differences between and variation within countries in mathematics education. They discuss the role of sampling design and "opportunity to learn" (OTL)…

  13. Taking Energy to the Physics Classroom from the Large Hadron Collider at CERN

    ERIC Educational Resources Information Center

    Cid, Xabier; Cid, Ramon

    2009-01-01

    In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…

  14. Clickers in College Classrooms: Fostering Learning with Questioning Methods in Large Lecture Classes

    ERIC Educational Resources Information Center

    Mayer, Richard E.; Stull, Andrew; DeLeeuw, Krista; Almeroth, Kevin; Bimber, Bruce; Chun, Dorothy; Bulger, Monica; Campbell, Julie; Knight, Allan; Zhang, Hangjin

    2009-01-01

    What can be done to promote student-instructor interaction in a large lecture class? One approach is to use a personal response system (or "clickers") in which students press a button on a hand-held remote control device corresponding to their answer to a multiple choice question projected on a screen, then see the class distribution of answers on…

  15. Science Teachers' Decision-Making in Abstinence-Only-Until-Marriage (AOUM) Classrooms: Taboo Subjects and Discourses of Sex and Sexuality in Classroom Settings

    ERIC Educational Resources Information Center

    Gill, Puneet Singh

    2015-01-01

    Sex education, especially in the southeastern USA, remains steeped in an Abstinence-Only-Until-Marriage (AOUM) approach, which sets up barriers to the education of sexually active students. Research confirms that science education has the potential to facilitate discussion of controversial topics, including sex education. Science teachers in the…

  16. Strategies for reducing large fMRI data sets for independent component analysis.

    PubMed

    Wang, Ze; Wang, Jiongjiong; Calhoun, Vince; Rao, Hengyi; Detre, John A; Childress, Anna R

    2006-06-01

    In independent component analysis (ICA), principal component analysis (PCA) is generally used to reduce the raw data to a few principal components (PCs) through eigenvector decomposition (EVD) on the data covariance matrix. Although this works for spatial ICA (sICA) on moderately sized fMRI data, it is intractable for temporal ICA (tICA), since typical fMRI data have a high spatial dimension, resulting in an unmanageable data covariance matrix. To solve this problem, two practical data reduction methods are presented in this paper. The first solution is to calculate the PCs of tICA from the PCs of sICA. This approach works well for moderately sized fMRI data; however, it is highly computationally intensive, even intractable, when the number of scans increases. The second solution proposed is to perform PCA decomposition via a cascade recursive least squared (CRLS) network, which provides a uniform data reduction solution for both sICA and tICA. Without the need to calculate the covariance matrix, CRLS extracts PCs directly from the raw data, and the PC extraction can be terminated after computing an arbitrary number of PCs without the need to estimate the whole set of PCs. Moreover, when the whole data set becomes too large to be loaded into the machine memory, CRLS-PCA can save data retrieval time by reading the data once, while the conventional PCA requires numerous data retrieval steps for both covariance matrix calculation and PC extractions. Real fMRI data were used to evaluate the PC extraction precision, computational expense, and memory usage of the presented methods.
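The second solution's key idea, extracting principal components directly from raw data without ever forming the covariance matrix, can be illustrated with a much simpler incremental update (Oja's rule) than the CRLS network itself; the toy data, learning rate, and iteration counts below are invented for the sketch.

```python
import math

def oja_first_pc(samples, eta=0.05, passes=30):
    """Streaming estimate of the leading principal component of zero-mean
    2-D data: no covariance matrix is formed (the same goal as CRLS-PCA)."""
    w = [1.0, 0.0]
    for _ in range(passes):
        for x in samples:
            y = w[0] * x[0] + w[1] * x[1]              # project onto current estimate
            w = [w[0] + eta * y * (x[0] - y * w[0]),   # Oja's rule update
                 w[1] + eta * y * (x[1] - y * w[1])]
    norm = math.hypot(w[0], w[1])
    return [w[0] / norm, w[1] / norm]

# Zero-mean toy data: large variance along (1,1)/sqrt(2), tiny variance orthogonal.
u = 0.7071
data = []
for k in range(210):
    t = ((k % 21) - 10) / 10.0            # spread along the dominant direction
    s = 0.05 * ((k % 7) - 3) / 3.0        # small orthogonal component
    data.append((t * u + s * u, t * u - s * u))
```

Calling `oja_first_pc(data)` recovers a unit vector aligned (up to sign) with (1,1)/sqrt(2), and the extraction can stop after any number of components, which mirrors the early-termination property the abstract highlights.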

  17. Registering coherent change detection products associated with large image sets and long capture intervals

    SciTech Connect

    Perkins, David Nikolaus; Gonzales, Antonio I

    2014-04-08

    A set of co-registered coherent change detection (CCD) products is produced from a set of temporally separated synthetic aperture radar (SAR) images of a target scene. A plurality of transformations are determined, which transformations are respectively for transforming a plurality of the SAR images to a predetermined image coordinate system. The transformations are used to create, from a set of CCD products produced from the set of SAR images, a corresponding set of co-registered CCD products.

  18. Classroom management programs for deaf children in state residential and large public schools.

    PubMed

    Wenkus, M; Rittenhouse, B; Dancer, J

    1999-12-01

    Personnel in 4 randomly selected state residential schools for the deaf and 3 randomly selected large public schools with programs for the deaf were surveyed to assess the types of management or disciplinary programs and strategies currently in use with deaf students and the rated effectiveness of such programs. Several behavioral management programs were identified by respondents, with Assertive Discipline most often listed. Ratings of program effectiveness were generally above average on a number of qualitative criteria. PMID:10710770

  20. Considerations for observational research using large data sets in radiation oncology.

    PubMed

    Jagsi, Reshma; Bekelman, Justin E; Chen, Aileen; Chen, Ronald C; Hoffman, Karen; Shih, Ya-Chen Tina; Smith, Benjamin D; Yu, James B

    2014-09-01

    The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. This review concludes that carefully designed and executed studies using observational data that possess these qualities hold

  1. Megapixel mythology and photospace: estimating photospace for camera phones from large image sets

    NASA Astrophysics Data System (ADS)

    Hultgren, Bror O.; Hertel, Dirk W.

    2008-01-01

    It is a myth that more pixels alone result in better images. The marketing of camera phones in particular has focused on their pixel numbers. However, their performance varies considerably according to the conditions of image capture. Camera phones are often used in low-light situations where the lack of a flash and limited exposure time will produce underexposed, noisy and blurred images. Camera utilization can be quantitatively described by photospace distributions, a statistical description of the frequency of pictures taken at varying light levels and camera-subject distances. If the photospace distribution is known, the user-experienced distribution of quality can be determined either by direct measurement of subjective quality or by photospace-weighting of objective attributes. Populating a photospace distribution requires examining large numbers of images taken under typical camera phone usage conditions. ImagePhi was developed as a user-friendly software tool to interactively estimate the primary photospace variables, subject illumination and subject distance, from individual images. Additionally, subjective evaluations of image quality and failure modes for low quality images can be entered into ImagePhi. ImagePhi has been applied to sets of images taken by typical users with a selection of popular camera phones varying in resolution. The estimated photospace distribution of camera phone usage has been correlated with the distributions of failure modes. The subjective and objective data show that photospace conditions have a much bigger impact on image quality of a camera phone than the pixel count of its imager. The 'megapixel myth' is thus seen to be less a myth than an ill-framed conditional assertion, whose conditions are to a large extent specified by the camera's operational state in photospace.

  4. Public-private partnerships with large corporations: setting the ground rules for better health.

    PubMed

    Galea, Gauden; McKee, Martin

    2014-04-01

    Public-private partnerships with large corporations offer potential benefits to the health sector but many concerns have been raised, highlighting the need for appropriate safeguards. In this paper we propose five tests that public policy makers may wish to apply when considering engaging in such a public-private partnership. First, are the core products and services provided by the corporation health enhancing or health damaging? In some cases, such as tobacco, the answer is obvious but others, such as food and alcohol, are contested. In such cases, the burden of proof is on the potential partners to show that their activities are health enhancing. Second, do potential partners put their policies into practice in the settings where they can do so, their own workplaces? Third, are the corporate social responsibility activities of potential partners independently audited? Fourth, do potential partners make contributions to the commons rather than to narrow programmes of their choosing? Fifth, is the role of the partner confined to policy implementation rather than policy development, which is ultimately the responsibility of government alone? PMID:24581699

  5. Information Theoretic Approaches to Rapid Discovery of Relationships in Large Climate Data Sets

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.; Rossow, William B.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Mutual information, as the asymptotic Bayesian measure of independence, is an excellent starting point for investigating the existence of possible relationships among climate-relevant variables in large data sets. As mutual information is a nonlinear function of its arguments, it is not beholden to the assumption of a linear relationship between the variables in question and can reveal features missed in linear correlation analyses. However, as mutual information is symmetric in its arguments, it only has the ability to reveal the probability that two variables are related; it provides no information as to how they are related. Specifically, causal interactions or a relation based on a common cause cannot be detected. For this reason we also investigate the utility of a related quantity called the transfer entropy. The transfer entropy can be written as a difference between mutual informations and has the capability to reveal whether and how the variables are causally related. The application of these information theoretic measures is tested on some familiar examples using data from the International Satellite Cloud Climatology Project (ISCCP) to identify relations between global cloud cover and other variables, including equatorial Pacific sea surface temperature (SST), over seasonal and El Nino Southern Oscillation (ENSO) cycles.
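A plug-in estimate of the mutual information between two paired discrete sequences takes only a few lines; the sketch below (nats, simple frequency counts, no bias correction) illustrates the quantity the abstract starts from and is not the authors' estimator. The transfer entropy they go on to use can likewise be assembled as a difference of such (conditional) mutual informations.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in nats from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))          # joint counts
    px, py = Counter(xs), Counter(ys)   # marginal counts
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())
```

For two identical fair-binary sequences this returns log 2 (about 0.693 nats); for independent sequences it returns 0 up to sampling noise, which is what makes it usable as a screening statistic for possible relationships.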

  6. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
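The normalized LCS similarity at the heart of the clustering step can be sketched with the standard dynamic program; normalizing by the longer sequence's length is one common choice and may differ from the paper's, and the fast hybrid algorithm described above is not reproduced here.

```python
def lcs_length(a, b):
    """Length of the longest common subsequence; O(len(a)*len(b)) time,
    O(len(b)) memory via a rolling row."""
    prev = [0] * (len(b) + 1)
    for ca in a:
        cur = [0]
        for j, cb in enumerate(b, 1):
            cur.append(prev[j - 1] + 1 if ca == cb else max(prev[j], cur[j - 1]))
        prev = cur
    return prev[-1]

def lcs_similarity(a, b):
    """Normalized LCS: 1.0 for identical sequences, 0.0 for disjoint ones."""
    if not a or not b:
        return 0.0
    return lcs_length(a, b) / max(len(a), len(b))
```

Sequences whose similarity to every cluster is low would then surface as candidate outliers; this quadratic baseline is precisely what makes the lower-complexity LCS algorithms discussed in the paper worthwhile for large sequence sets.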

  7. Analog and digital interface solutions for the common large-area display set (CLADS)

    NASA Astrophysics Data System (ADS)

    Hermann, David J.; Gorenflo, Ronald L.

    1997-07-01

    Battelle is under contract with Warner Robins Air Logistics Center to design a common large area display set (CLADS) for use in multiple airborne command, control, communications, computers and intelligence applications that currently use unique 19 inch cathode ray tubes (CRTs). The CLADS is a modular design, with common modules used wherever possible. Each CLADS includes an application-specific integration kit, which incorporates all of the unique interface components. Since there is no existing digital video interface standard for high resolution workstations, a standard interface was developed for CLADS and documented as an interface specification. One of the application-specific modules, the application video interface module (AVIM), readily incorporates most of the required application electrical interfaces for a given system into a single module. The analog AVIM, however, poses unique design problems when folding multiple application interface requirements into a single common AVIM for the most prevalent workstation display interface: analog RGB video. Future workstation display interfaces will incorporate fully digital video between the graphics hardware and the digital display device. A digital AVIM is described which utilizes a fiber channel interface to deliver high-speed 1280 by 1024, 24-bit, 60 Hz digital video from a PCI graphics card to the CLADS. A video recording and playback device is described, as well as other common CLADS modules, including the display controller and power supply. This paper will discuss both the analog and digital AVIM interfaces, application BIT and power interfaces, as well as CLADS internal interfaces.

  8. Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer (7th Annual SFAF Meeting, 2012)

    ScienceCinema

    Copeland, Alex [DOE JGI]

    2016-07-12

    Alex Copeland on "Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  9. Linked Scatter Plots, A Powerful Exploration Tool For Very Large Sets of Spectra

    NASA Astrophysics Data System (ADS)

    Carbon, Duane Francis; Henze, Christopher

    2015-08-01

    We present a new tool, based on linked scatter plots, that is designed to efficiently explore very large spectrum data sets such as the SDSS, APOGEE, LAMOST, GAIA, and RAVE data sets. The tool works in two stages: the first uses batch processing and the second runs interactively. In the batch stage, spectra are processed through our data pipeline which computes the depths relative to the local continuum at preselected feature wavelengths. These depths, and any additional available variables such as local S/N level, magnitudes, colors, positions, and radial velocities, are the basic measured quantities used in the interactive stage. The interactive stage employs the NASA hyperwall, a configuration of 128 workstation displays (8x16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. Each hyperwall panel is used to display a fully linked 2-D scatter plot showing the depth of feature A vs the depth of feature B for all of the spectra. A and B change from panel to panel. The relationships between the various (A,B) strengths and any distinctive clustering, as well as unique outlier groupings, are visually apparent when examining and inter-comparing the different panels on the hyperwall. In addition, the data links between the scatter plots allow the user to apply a logical algebra to the measurements. By graphically selecting the objects in any interesting region of any 2-D plot on the hyperwall, the tool immediately and clearly shows how the selected objects are distributed in all the other 2-D plots. The selection process may be repeated multiple times and, at each step, the selections can represent a sequence of logical constraints on the measurements, revealing those objects which satisfy all the constraints thus far. The spectra of the selected objects may be examined at any time on a connected workstation display. Using over 945,000,000 depth measurements from 569,738 SDSS DR10 stellar spectra, we illustrate how to quickly
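The "logical algebra" of chained graphical selections reduces to set intersection over object indices, which can be sketched as follows; the feature names and depth values are invented stand-ins for the pipeline's real measurements.

```python
# Each record holds the measured line depths for one spectrum (toy values).
depths = [
    {"CaK": 0.8, "Hbeta": 0.2, "NaD": 0.1},
    {"CaK": 0.3, "Hbeta": 0.7, "NaD": 0.5},
    {"CaK": 0.9, "Hbeta": 0.6, "NaD": 0.4},
    {"CaK": 0.2, "Hbeta": 0.1, "NaD": 0.9},
]

def select(records, predicate):
    """One 'graphical selection' on a 2-D panel: the set of object indices
    whose measurements fall in the selected region."""
    return {i for i, r in enumerate(records) if predicate(r)}

# Chained selections on different panels are intersections of index sets.
strong_cak = select(depths, lambda r: r["CaK"] > 0.5)                  # first panel
strong_both = strong_cak & select(depths, lambda r: r["Hbeta"] > 0.5)  # add a cut
```

Because every panel is linked to the same index sets, highlighting the surviving objects on all 128 panels simultaneously is just a membership test per plotted point.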

  10. Setting the Stage for Developing Pre-service Teachers' Conceptions of Good Science Teaching: The role of classroom videos

    NASA Astrophysics Data System (ADS)

    Wong, Siu Ling; Yung, Benny Hin Wai; Cheng, Man Wai; Lam, Kwok Leung; Hodson, Derek

    2006-01-01

    This paper reports findings about a curriculum innovation conducted at The University of Hong Kong. A CD-ROM consisting of videos of two lessons by different teachers demonstrating exemplary science teaching was used to elicit conceptions of good science teaching of student-teachers enrolled for the 1-year Postgraduate Diploma in Education at several stages during the programme. It was found that the videos elicited student-teachers’ conceptions and had an impact on those conceptions prior to the commencement of formal instruction. It has extended student-teachers’ awareness of alternative teaching methods and approaches not experienced in their own schooling, broadened their awareness of different classroom situations, provided proof of existence of good practices, and prompted them to reflect on their current preconceptions of good science teaching. In several ways, the videos acted as a catalyst in socializing the transition of student-teachers from the role of student to the role of teacher.

  11. Mobile-phone-based classroom response systems: Students' perceptions of engagement and learning in a large undergraduate course

    NASA Astrophysics Data System (ADS)

    Dunn, Peter K.; Richardson, Alice; Oprescu, Florin; McDonald, Christine

    2013-12-01

    Using a Classroom Response System (CRS) has been associated with positive educational outcomes, by fostering student engagement and by allowing immediate feedback to both students and instructors. This study examined a low-cost CRS (VotApedia) in a large first-year class, where students responded to questions using their mobile phones. This study explored whether the use of VotApedia retained the advantages of other CRS, overcame some of the challenges of other CRS, and whether new challenges were introduced by using VotApedia. These issues were studied within three themes: students' perceptions of using VotApedia; the impact of VotApedia on their engagement; and the impact of VotApedia on their learning. Data were collected from an online survey, focus groups and student feedback on teaching and course content. The results indicated that using VotApedia retains the pedagogical advantages of other CRS, while overcoming some of the challenges presented by using other CRS, without introducing any new challenges.

  12. The Sheffield experiment: the effects of centralising accident and emergency services in a large urban setting

    PubMed Central

    Simpson, A; Wardrope, J; Burke, D

    2001-01-01

    Objectives: To assess the effects of centralisation of accident and emergency (A&E) services in a large urban setting. The end points were the quality of patient care judged by time to see a doctor or nurse practitioner, time to admission and the cost of the A&E service as a whole. Methods: Sheffield is a large industrial city with a population of 471 000. In 1994 Sheffield health authority took a decision to centralise a number of services including the A&E services. This study presents data collected over a three year period before, during and after the centralisation of adult A&E services from two sites to one site and the centralisation of children's A&E services to a separate site. A minor injury unit was also established along with an emergency admissions unit. The study used information from the A&E departments' computer system and routinely available financial data. Results: There has been a small decrease in the number of new patient attendances using the Sheffield A&E system. Most patients go to the correct department. The numbers of acute admissions through the adult A&E have doubled. Measures of process efficiency show some improvement in times to admission. There has been measurable deterioration in the time to be seen for minor injuries in the A&E departments. This is partly offset by the very good waiting time to be seen in the minor injuries unit. The costs of providing the service within Sheffield have increased. Conclusion: Centralisation of A&E services in Sheffield has led to concentration of the most ill patients in a single adult department and separate paediatric A&E department. Despite a greatly increased number of admissions at the adult site this change has not resulted in increased waiting times for admission because of the transfer of adequate beds to support the changes. There has however been a deterioration in the time to see a clinician, especially in the A&E departments. The waiting times at the minor injury unit are very short

  13. Getting specific: making taxonomic and ecological sense of large sequencing data sets.

    PubMed

    Massana, Ramon

    2015-06-01

    in microbial assemblages, only accessible by molecular tools. Moreover, the number of species detected was limited, agreeing with a putative scenario of constrained evolutionary diversification in free-living small eukaryotes. This study illustrates the potential of HTS to address ecologically relevant questions in an accessible way by processing large data sets that, nonetheless, need to be treated with a fair understanding of their limitations. PMID:26095583

  14. BACHSCORE. A tool for evaluating efficiently and reliably the quality of large sets of protein structures

    NASA Astrophysics Data System (ADS)

    Sarti, E.; Zamuner, S.; Cossio, P.; Laio, A.; Seno, F.; Trovato, A.

    2013-12-01

    In protein structure prediction it is of crucial importance, especially at the refinement stage, to score efficiently large sets of models by selecting the ones that are closest to the native state. We here present a new computational tool, BACHSCORE, that allows its users to rank different structural models of the same protein according to their quality, evaluated by using the BACH++ (Bayesian Analysis Conformation Hunt) scoring function. The original BACH statistical potential was already shown to discriminate with very good reliability the protein native state in large sets of misfolded models of the same protein. BACH++ features a novel upgrade in the solvation potential of the scoring function, now computed by adapting the LCPO (Linear Combination of Pairwise Overlaps) algorithm. This change further enhances the already good performance of the scoring function. BACHSCORE can be accessed directly through the web server: bachserver.pd.infn.it. Catalogue identifier: AEQD_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEQD_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 130159 No. of bytes in distributed program, including test data, etc.: 24 687 455 Distribution format: tar.gz Programming language: C++. Computer: Any computer capable of running an executable produced by a g++ compiler (4.6.3 version). Operating system: Linux, Unix OS-es. RAM: 1 073 741 824 bytes Classification: 3. Nature of problem: Evaluate the quality of a protein structural model, taking into account the possible “a priori” knowledge of a reference primary sequence that may be different from the amino-acid sequence of the model; the native protein structure should be recognized as the best model. Solution method: The contact potential scores the occurrence of any given type of residue pair in 5 possible

  15. Classroom Management and the Librarian

    ERIC Educational Resources Information Center

    Blackburn, Heidi; Hays, Lauren

    2014-01-01

    As librarians take on more instructional responsibilities, the need for classroom management skills becomes vital. Unfortunately, classroom management skills are not taught in library school; therefore, many librarians are forced to learn how to manage a classroom on the job. Different classroom settings such as one-shot instruction sessions…

  16. Repulsive parallel MCMC algorithm for discovering diverse motifs from large sequence sets

    PubMed Central

    Ikebata, Hisaki; Yoshida, Ryo

    2015-01-01

    Motivation: The motif discovery problem consists of finding recurring patterns of short strings in a set of nucleotide sequences. This classical problem is receiving renewed attention as most early motif discovery methods lack the ability to handle large data of recent genome-wide ChIP studies. New ChIP-tailored methods focus on reducing computation time and pay little regard to the accuracy of motif detection. Unlike such methods, our method focuses on increasing the detection accuracy while maintaining the computation efficiency at an acceptable level. The major advantage of our method is that it can mine diverse multiple motifs undetectable by current methods. Results: The repulsive parallel Markov chain Monte Carlo (RPMCMC) algorithm that we propose is a parallel version of the widely used Gibbs motif sampler. RPMCMC is run on parallel interacting motif samplers. A repulsive force is generated when different motifs produced by different samplers come near each other. Thus, different samplers explore different motifs. In this way, we can detect much more diverse motifs than conventional methods can. Through application to 228 transcription factor ChIP-seq datasets of the ENCODE project, we show that the RPMCMC algorithm can find many reliable cofactor interacting motifs that existing methods are unable to discover. Availability and implementation: A C++ implementation of RPMCMC and discovered cofactor motifs for the 228 ENCODE ChIP-seq datasets are available from http://daweb.ism.ac.jp/yoshidalab/motif. Contact: ikebata.hisaki@ism.ac.jp, yoshidar@ism.ac.jp Supplementary information: Supplementary data are available from Bioinformatics online. PMID:25583120
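    The repulsion idea described in this abstract — penalizing a sampler's motif when it resembles motifs held by other samplers, so that parallel samplers spread out over distinct motifs — can be illustrated with a toy sketch. This is not the RPMCMC algorithm itself: the `score`, `similarity`, and penalty `strength` below are illustrative assumptions standing in for the Gibbs-sampler machinery.

```python
def similarity(m1, m2):
    # fraction of matching positions between two equal-length motif strings
    return sum(a == b for a, b in zip(m1, m2)) / len(m1)

def score(motif, sequences):
    # toy motif score: for each sequence, take the best positional match count
    # of the motif against any window, and sum over sequences
    L = len(motif)
    total = 0
    for seq in sequences:
        best = max(sum(a == b for a, b in zip(motif, seq[i:i + L]))
                   for i in range(len(seq) - L + 1))
        total += best
    return total

def repulsive_score(motif, others, sequences, strength=5.0):
    # subtract a repulsion penalty proportional to similarity with motifs
    # currently held by the other parallel samplers
    penalty = strength * sum(similarity(motif, o) for o in others)
    return score(motif, sequences) - penalty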

  17. Gaining A Geological Perspective Through Active Learning in the Large Lecture Classroom

    NASA Astrophysics Data System (ADS)

    Kapp, J. L.; Richardson, R. M.; Slater, S. J.

    2008-12-01

    NATS 101 A Geological Perspective is a general education course taken by non-science majors. We offer 600 seats per semester, with four large lecture sections taught by different faculty members. In the past we have offered optional once-a-week study groups taught by graduate teaching assistants. Students often feel overwhelmed by the science and associated jargon, and many are prone to skipping lectures altogether. Optional study groups are only attended by ~50% of the students. Faculty members find the class to be a lot of work, mainly due to the grading it generates. Activities given in lecture are often short multiple-choice or true/false assignments, limiting the depth of understanding we can evaluate. Our students often lack math and critical thinking skills, and we spend a lot of time in lecture reintroducing ideas students should have already gotten from the text. In summer 2007 we were funded to redesign the course. Our goals were to 1) cut the cost of running the course, and 2) improve student learning. Under our redesign, optional study groups were replaced by mandatory once-a-week break-out sessions where students complete activities that have been introduced in lecture. Break-out sessions substitute for one hour of lecture, and are run by undergraduate preceptors and graduate teaching assistants (GTAs). During the lecture period, lectures themselves are brief with a large portion of the class devoted to active learning in small groups. Weekly reading quizzes are submitted via the online course management system. Break-out sessions allow students to spend more time interacting with their fellow students, undergraduate preceptors, and GTAs. They get one-on-one help in break-out sessions on assignments designed to enhance the lecture material. The active lecture format means less of their time is devoted to listening passively to a lecture, and more time is spent peer learning and interacting with the instructor. Completing quizzes online allows students

  18. Improving Library Effectiveness: A Proposal for Applying Fuzzy Set Concepts in the Management of Large Collections.

    ERIC Educational Resources Information Center

    Robinson, Earl J.; Turner, Stephen J.

    1981-01-01

    Fuzzy set theory, a mathematical modeling technique that allows for the consideration of such factors as "professional expertise" in decision making, is discussed as a tool for use in libraries--specifically in collection management. The fundamentals of fuzzy set theory are introduced and a reference list is included. (JL)
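    To make the fuzzy-set idea concrete in a collection-management context: a title's degree of membership in a set such as "candidates for weeding" can be a grade between 0 and 1, combined from fuzzy factors with the min operator (fuzzy AND). The factors, breakpoints, and function name below are hypothetical illustrations, not taken from the proposal itself.

```python
def weeding_candidate(age_years, circulations_per_year):
    # membership grade in the fuzzy set "old" (assumed: 0 at 5 years, 1 at 20+)
    old = min(1.0, max(0.0, (age_years - 5) / 15))
    # membership grade in "rarely circulated" (assumed: 1 at 0/yr, 0 at 4+/yr)
    rare = max(0.0, 1.0 - circulations_per_year / 4.0)
    # fuzzy AND: a weeding candidate is both old and rarely circulated
    return min(old, rare)
```

A title 20 years old that never circulates gets grade 1.0; a 5-year-old title gets 0.0 regardless of use; intermediate cases get intermediate grades, which is where "professional expertise" can be encoded in the membership functions.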

  19. Problems in the Cataloging of Large Microform Sets or, Learning to Expect the Unexpected.

    ERIC Educational Resources Information Center

    Joachim, Martin D.

    1989-01-01

    Describes problems encountered during the cataloging of three major microform sets at the Indiana University Libraries. Areas discussed include size and contents of the sets, staffing for the project, equipment, authority work, rare book cataloging rules, serials, language of materials, musical scores, and manuscripts. (CLB)

  20. Tools for Analysis and Visualization of Large Time-Varying CFD Data Sets

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; VanGelder, Allen

    1997-01-01

    In the second year, we continued to build upon and improve the scanline-based direct volume renderer that we developed in the first year of this grant. This extremely general rendering approach can handle regular or irregular grids, including overlapping multiple grids, and polygon mesh surfaces. It runs in parallel on multi-processors. It can also be used in conjunction with a k-d tree hierarchy, where approximate models and error terms are stored in the nodes of the tree, and approximate fast renderings can be created. We have extended our software to handle time-varying data where the data changes but the grid does not. We are now working on extending it to handle more general time-varying data. We have also developed a new extension of our direct volume renderer that uses automatic decimation of the 3D grid, as opposed to an explicit hierarchy. We explored this alternative approach as being more appropriate for very large data sets, where the extra expense of a tree may be unacceptable. We also describe a new approach to direct volume rendering that uses hardware 3D textures and incorporates lighting effects. Volume rendering using hardware 3D textures is extremely fast, and machines capable of using this technique are becoming more moderately priced. While this technique, at present, is limited to use with regular grids, we are pursuing possible algorithms extending the approach to more general grid types. We have also begun to explore a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH '96. In our initial implementation, we automatically image the volume from 32 equidistant positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation. We are studying whether this will give a quantitative measure of the effects of approximation. We have created new tools for exploring the

  1. Scalable Algorithms for Unsupervised Classification and Anomaly Detection in Large Geospatiotemporal Data Sets

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Hoffman, F. M.; Kumar, J.

    2015-12-01

    The increasing availability of high-resolution geospatiotemporal datasets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery and mining of ecological data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe some unsupervised knowledge discovery and anomaly detection approaches based on highly scalable parallel algorithms for k-means clustering and singular value decomposition, consider a few practical applications thereof to the analysis of climatic and remotely-sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
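    For reference, the clustering kernel underlying such analyses is standard k-means. A serial toy version is sketched below; the paper's contribution is the scalable parallel formulation on HPC platforms, which this sketch does not attempt to reproduce.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    # points: list of equal-length numeric tuples
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # assignment step: each point joins its nearest center
        # (squared Euclidean distance)
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(p, centers[i])))
            clusters[j].append(p)
        # update step: move each center to the mean of its cluster
        for i, c in enumerate(clusters):
            if c:
                centers[i] = tuple(sum(coord) / len(c) for coord in zip(*c))
    return centers, clusters
```

On two well-separated blobs the algorithm recovers the obvious partition; the parallel versions discussed in the abstract distribute the assignment step (the dominant cost) across processors.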

  2. Semantic feature production norms for a large set of living and nonliving things.

    PubMed

    McRae, Ken; Cree, George S; Seidenberg, Mark S; McNorgan, Chris

    2005-11-01

    Semantic features have provided insight into numerous behavioral phenomena concerning concepts, categorization, and semantic memory in adults, children, and neuropsychological populations. Numerous theories and models in these areas are based on representations and computations involving semantic features. Consequently, empirically derived semantic feature production norms have played, and continue to play, a highly useful role in these domains. This article describes a set of feature norms collected from approximately 725 participants for 541 living (dog) and nonliving (chair) basic-level concepts, the largest such set of norms developed to date. This article describes the norms and numerous statistics associated with them. Our aim is to make these norms available to facilitate other research, while obviating the need to repeat the labor-intensive methods involved in collecting and analyzing such norms. The full set of norms may be downloaded from www.psychonomic.org/archive. PMID:16629288

  3. Adaptation of Bharatanatyam Dance Pedagogy for Multicultural Classrooms: Questions and Relevance in a North American University Setting

    ERIC Educational Resources Information Center

    Banerjee, Suparna

    2013-01-01

    This article opens up questions around introducing Bharatanatyam, a form of Indian classical dance, to undergraduate learners within a North American university setting. The aim is to observe how the learners understood and received a particular cultural practice and to explore issues related to learning goals, curriculum content, approaches to…

  4. Addressing Methodological Challenges in Large Communication Data Sets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care.

    PubMed

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2016-07-01

    In this article, we present strategies for collecting and coding a large longitudinal communication data set collected across multiple sites, consisting of more than 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication data sets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multisite secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research.

  6. Large-Scale Disturbance Events in Terrestrial Ecosystems Detected using Global Satellite Data Sets

    NASA Astrophysics Data System (ADS)

    Potter, C.; Tan, P.; Kumar, V.; Klooster, S.

    2004-12-01

    Studies are being conducted to evaluate patterns in a 19-year record of global satellite observations of vegetation phenology from the Advanced Very High Resolution Radiometer (AVHRR), as a means to characterize large-scale ecosystem disturbance events and regimes. The fraction of photosynthetically active radiation absorbed by vegetation canopies (FPAR) worldwide has been computed at a monthly time interval from 1982 to 2000 and gridded at a spatial resolution of 8-km globally. Potential disturbance events were identified in the FPAR time series by locating anomalously low values (FPAR-LO) that lasted longer than 12 consecutive months at any 8-km pixel. We can find verifiable evidence of numerous disturbance types across North America, including major regional patterns of cold and heat waves, forest fires, tropical storms, and large-scale forest logging. Based on this analysis, an historical picture is emerging of periodic droughts and heat waves, possibly coupled with herbivorous insect outbreaks, as among the most important causes of ecosystem disturbance in North America. In South America, large areas of northeastern Brazil appear to have been impacted in the early 1990s by severe drought. Amazon tropical forest disturbance can be detected at large scales particularly in the mid 1990s. In Asia, large-scale disturbance events appear in the mid 1980s and the late 1990s across boreal and temperate forest zones, as well as in cropland areas of western India. In northern Europe and central Africa, large-scale forest disturbance appears in the mid 1990s.
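    The FPAR-LO criterion above — anomalously low values persisting beyond 12 consecutive months at a pixel — amounts to a run-length scan over each pixel's monthly time series. A sketch of that scan follows; the function name and the idea of passing an explicit anomaly threshold are assumptions for illustration.

```python
def disturbance_events(fpar, threshold, min_run=12):
    """Return (start_month, run_length) for every run where monthly FPAR
    stays below threshold for at least min_run consecutive months."""
    events, start = [], None
    for t, value in enumerate(fpar):
        if value < threshold:
            if start is None:
                start = t  # a low-FPAR run begins
        else:
            if start is not None and t - start >= min_run:
                events.append((start, t - start))  # run long enough to count
            start = None
    # handle a run that continues to the end of the record
    if start is not None and len(fpar) - start >= min_run:
        events.append((start, len(fpar) - start))
    return events
```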

  7. T1DBase: update 2011, organization and presentation of large-scale data sets for type 1 diabetes research.

    PubMed

    Burren, Oliver S; Adlem, Ellen C; Achuthan, Premanand; Christensen, Mikkel; Coulson, Richard M R; Todd, John A

    2011-01-01

    T1DBase (http://www.t1dbase.org) is a web platform that supports the type 1 diabetes (T1D) community. It integrates genetic, genomic and expression data relevant to T1D research across mouse, rat and human and presents this to the user as a set of web pages and tools. This update describes the incorporation of new data sets, tools and curation efforts as well as a new website design to simplify site use. New data sets include curated summary data from four genome-wide association studies relevant to T1D; HaemAtlas, a data set and tool to query gene expression levels in haematopoietic cells; and a manually curated table of human T1D susceptibility loci, incorporating genetic overlap with other related diseases. These developments will continue to support T1D research and allow easy access to large and complex T1D relevant data sets.

  8. An Efficient Algorithm for Discovering Motifs in Large DNA Data Sets.

    PubMed

    Yu, Qiang; Huo, Hongwei; Chen, Xiaoyang; Guo, Haitao; Vitter, Jeffrey Scott; Huan, Jun

    2015-07-01

    The planted (l,d) motif discovery has been successfully used to locate transcription factor binding sites in dozens of promoter sequences over the past decade. However, there has not been enough work done in identifying (l,d) motifs in the next-generation sequencing (ChIP-seq) data sets, which contain thousands of input sequences and thereby bring new challenges for making a good identification in reasonable time. To meet this need, we propose a new planted (l,d) motif discovery algorithm named MCES, which identifies motifs by mining and combining emerging substrings. Specifically, to handle larger data sets, we design a MapReduce-based strategy to mine emerging substrings in a distributed fashion. Experimental results on the simulated data show that i) MCES is able to identify (l,d) motifs efficiently and effectively in thousands to millions of input sequences, and runs faster than the state-of-the-art (l,d) motif discovery algorithms, such as F-motif and TraverStringsR; ii) MCES is able to identify motifs without known lengths, and has a better identification accuracy than the competing algorithm CisFinder. Also, the validity of MCES is tested on real data sets. MCES is freely available at http://sites.google.com/site/feqond/mces.

  9. Number Bias for the Discrimination of Large Visual Sets in Infancy

    ERIC Educational Resources Information Center

    Brannon, Elizabeth M.; Abbott, Sara; Lutz, Donna J.

    2004-01-01

    This brief report attempts to resolve the claim that infants preferentially attend to continuous variables over number [e.g. Psychol. Sci. 10 (1999) 408; Cognit. Psychol. 44 (2002) 33] with the finding that when continuous variables are controlled, infants as young as 6 months of age discriminate large numerical values [e.g. Psychol. Sci. 14 (2003)…

  10. A posteriori correction of camera characteristics from large image data sets

    PubMed Central

    Afanasyev, Pavel; Ravelli, Raimond B. G.; Matadeen, Rishi; De Carlo, Sacha; van Duinen, Gijs; Alewijnse, Bart; Peters, Peter J.; Abrahams, Jan-Pieter; Portugal, Rodrigo V.; Schatz, Michael; van Heel, Marin

    2015-01-01

    Large datasets are emerging in many fields of image processing, including electron microscopy, light microscopy, medical X-ray imaging, and astronomy. Novel computer-controlled instrumentation facilitates the collection of very large datasets containing thousands of individual digital images. In single-particle cryogenic electron microscopy (“cryo-EM”), for example, large datasets are required for achieving quasi-atomic resolution structures of biological complexes. Based on the collected data alone, large datasets allow us to precisely determine the statistical properties of the imaging sensor on a pixel-by-pixel basis, independent of any “a priori” normalization routinely applied to the raw image data during collection (“flat field correction”). Our straightforward “a posteriori” correction yields clean linear images as can be verified by Fourier Ring Correlation (FRC), illustrating the statistical independence of the corrected images over all spatial frequencies. The image sensor characteristics can also be measured continuously and used for correcting upcoming images. PMID:26068909
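    The pixel-by-pixel statistics mentioned above can be estimated directly from a large stack of frames: each pixel's empirical mean acts as an offset and its empirical standard deviation as a gain. The sketch below is a simplified gain/offset normalization in pure Python for clarity; the authors' actual procedure, including the FRC verification, is more involved.

```python
def pixel_statistics(frames):
    # frames: list of equal-sized 2-D images (lists of rows);
    # returns the per-pixel mean and (population) standard deviation
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    mean = [[sum(f[i][j] for f in frames) / n for j in range(w)]
            for i in range(h)]
    std = [[(sum((f[i][j] - mean[i][j]) ** 2 for f in frames) / n) ** 0.5
            for j in range(w)] for i in range(h)]
    return mean, std

def correct(frame, mean, std, eps=1e-9):
    # normalize each pixel by its own empirical offset (mean) and gain (std);
    # eps guards against dead pixels with zero variance
    return [[(frame[i][j] - mean[i][j]) / (std[i][j] + eps)
             for j in range(len(frame[0]))] for i in range(len(frame))]
```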

  11. Psychology in an Interdisciplinary Setting: A Large-Scale Project to Improve University Teaching

    ERIC Educational Resources Information Center

    Koch, Franziska D.; Vogt, Joachim

    2015-01-01

    At a German university of technology, a large-scale project was funded as a part of the "Quality Pact for Teaching", a programme launched by the German Federal Ministry of Education and Research to improve the quality of university teaching and study conditions. The project aims at intensifying interdisciplinary networking in teaching,…

  12. Learning through Discussions: Comparing the Benefits of Small-Group and Large-Class Settings

    ERIC Educational Resources Information Center

    Pollock, Philip H.; Hamann, Kerstin; Wilson, Bruce M.

    2011-01-01

    The literature on teaching and learning heralds the benefits of discussion for student learner outcomes, especially its ability to improve students' critical thinking skills. Yet, few studies compare the effects of different types of face-to-face discussions on learners. Using student surveys, we analyze the benefits of small-group and large-class…

  13. Design of Availability-Dependent Distributed Services in Large-Scale Uncooperative Settings

    ERIC Educational Resources Information Center

    Morales, Ramses Victor

    2009-01-01

    Thesis Statement: "Availability-dependent global predicates can be efficiently and scalably realized for a class of distributed services, in spite of specific selfish and colluding behaviors, using local and decentralized protocols". Several types of large-scale distributed systems spanning the Internet have to deal with availability variations…

  14. Use of Large-Scale Data Sets to Study Educational Pathways of American Indian and Alaska Native Students

    ERIC Educational Resources Information Center

    Faircloth, Susan C.; Alcantar, Cynthia M.; Stage, Frances K.

    2014-01-01

    This chapter discusses issues and challenges encountered in using large-scale data sets to study educational experiences and subsequent outcomes for American Indian and Alaska Native (AI/AN) students. In this chapter, we argue that the linguistic and cultural diversity of Native peoples, coupled with the legal and political ways in which education…

  15. Using a Classroom Response System for Promoting Interaction to Teaching Mathematics to Large Groups of Undergraduate Students

    ERIC Educational Resources Information Center

    Morais, Adolfo; Barragués, José Ignacio; Guisasola, Jenaro

    2015-01-01

    This work describes the design and evaluation of a proposal to use Classroom Response Systems (CRS), intended to promote participative mathematics classes at university. The proposal is based on Problem-Based Learning (PBL) and uses Robert's six hypotheses for mathematical teaching-learning. The results show that PBL is a relevant strategy to…

  16. The Impact of Mobile Learning on Students' Learning Behaviours and Performance: Report from a Large Blended Classroom

    ERIC Educational Resources Information Center

    Wang, Minjuan; Shen, Ruimin; Novak, Daniel; Pan, Xiaoyan

    2009-01-01

    Chinese classrooms, whether on school grounds or online, have long suffered from a lack of interactivity. Many online classes simply provide recorded instructor lectures, which only reinforces the negative effects of passive nonparticipatory learning. At Shanghai Jiaotong University, researchers and developers actively seek technologic…

  17. Options in Education, Transcript for February 16, 1976: National Commitment to Equal Rights & Equal Educational Opportunity, Racial Conflict in the Classroom, Setting Up a Publishing Business, and Women in Education (Mathematics and Sex).

    ERIC Educational Resources Information Center

    George Washington Univ., Washington, DC. Inst. for Educational Leadership.

    "Options in Education" is a radio news program which focuses on issues and developments in education. This transcript contains discussions of the national commitment to desegregated education, racial conflict in the classroom, learning how to set up a publishing business, women in education (mathematics and sex) and education news highlights.…

  18. High-throughput film-densitometry: An efficient approach to generate large data sets

    SciTech Connect

    Typke, Dieter; Nordmeyer, Robert A.; Jones, Arthur; Lee, Juyoung; Avila-Sakar, Agustin; Downing, Kenneth H.; Glaeser, Robert M.

    2004-07-14

    A film-handling machine (robot) has been built which can, in conjunction with a commercially available film densitometer, exchange and digitize over 300 electron micrographs per day. Implementation of robotic film handling effectively eliminates the delay and tedium associated with digitizing images when data are initially recorded on photographic film. The modulation transfer function (MTF) of the commercially available densitometer is significantly worse than that of a high-end, scientific microdensitometer. Nevertheless, its signal-to-noise ratio (S/N) is quite excellent, allowing substantial restoration of the output to "near-to-perfect" performance. Due to the large area of the standard electron microscope film that can be digitized by the commercial densitometer (up to 10,000 x 13,680 pixels with an appropriately coded holder), automated film digitization offers a fast and inexpensive alternative to high-end CCD cameras as a means of acquiring large amounts of image data in electron microscopy.

  19. Parallel k-Means Clustering for Quantitative Ecoregion Delineation Using Large Data Sets

    SciTech Connect

    Kumar, Jitendra; Mills, Richard T; Hoffman, Forrest M; Hargrove Jr., William Walter

    2011-01-01

    Identification of geographic ecoregions has long been of interest to environmental scientists and ecologists for identifying regions of similar ecological and environmental conditions. Such classifications are important for predicting suitable species ranges, for stratification of ecological samples, and to help prioritize habitat preservation and remediation efforts. Hargrove and Hoffman (1999, 2009) have developed geographical spatio-temporal clustering algorithms and codes and have successfully applied them to a variety of environmental science domains, including ecological regionalization; environmental monitoring network design; analysis of satellite-, airborne-, and ground-based remote sensing, and climate model-model and model-measurement intercomparison. With the advances in state-of-the-art satellite remote sensing and climate models, observations and model outputs are available at increasingly high spatial and temporal resolutions. Long time series of these high resolution datasets are extremely large in size and growing. Analysis and knowledge extraction from these large datasets are not just algorithmic and ecological problems, but also pose a complex computational problem. This paper focuses on the development of a massively parallel multivariate geographical spatio-temporal clustering code for analysis of very large datasets using tens of thousands of processors on one of the fastest supercomputers in the world.

  20. Development of a core set from a large rice collection using a modified heuristic algorithm to retain maximum diversity.

    PubMed

    Chung, Hun-Ki; Kim, Kyu-Won; Chung, Jong-Wook; Lee, Jung-Ro; Lee, Sok-Young; Dixit, Anupam; Kang, Hee-Kyoung; Zhao, Weiguo; McNally, Kenneth L; Hamilton, Ruraidh S; Gwag, Jae-Gyun; Park, Yong-Jin

    2009-12-01

    A new heuristic approach was undertaken for the establishment of a core set for the diversity research of rice. As a result, 107 entries were selected from the 10 368 characterized accessions. The core set derived using this new approach provided a good representation of the characterized accessions present in the entire collection. No significant differences for the mean, range, standard deviation and coefficient of variation of each trait were observed between the core and existing collections. We also compared the diversity of core sets established using this Heuristic Core Collection (HCC) approach with those of core sets established using the conventional clustering methods. This modified heuristic algorithm can also be used to select genotype data with allelic richness and reduced redundancy, and to facilitate management and use of large collections of plant genetic resources in a more efficient way.
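    One common heuristic for retaining maximum diversity in a core set is greedy maximin selection: repeatedly add the accession whose minimum distance to the current core is largest. The sketch below illustrates that general idea only; it is not the modified Heuristic Core Collection (HCC) algorithm of the study, and the distance function is supplied by the caller.

```python
def select_core(accessions, distance, core_size):
    # greedy maximin: seed with an arbitrary accession, then repeatedly add
    # the candidate farthest (in its minimum distance) from the current core
    core = [accessions[0]]
    candidates = list(accessions[1:])
    while len(core) < core_size and candidates:
        best = max(candidates,
                   key=lambda a: min(distance(a, c) for c in core))
        core.append(best)
        candidates.remove(best)
    return core
```

With 1-D "accessions" [0, 1, 2, 10, 11] and absolute difference as the distance, a core of size 3 seeded at 0 picks 11 (farthest from 0) and then 2, spreading the core across the range.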

  1. Large-Eddy Simulation of Premixed and Partially Premixed Turbulent Combustion Using a Level Set Method

    NASA Astrophysics Data System (ADS)

    Duchamp de Lageneste, Laurent; Pitsch, Heinz

    2001-11-01

    Level-set methods (G-equation) have been recently used in the context of RANS to model turbulent premixed (Hermann 2000) or partially premixed (Chen 1999) combustion. By directly taking into account unsteady effects, LES can be expected to improve predictions over RANS. Since the reaction zone thickness of premixed flames in technical devices is usually much smaller than the LES grid spacing, chemical reactions completely occur on the sub-grid scales and hence have to be modeled entirely. In the level-set methodology, the flame front is represented by an arbitrary iso-surface G0 of a scalar field G whose evolution is described by the so-called G-equation. This equation is only valid at G=G_0, and hence decoupled from other G levels. Heat release is then modeled using a flamelet approach in which temperature is determined as a function of G and the mixture-fraction Z. In the present study, the proposed approach has been formulated for LES and validated using data from a turbulent Bunsen burner experiment (Chen, Peters 1996). Simulation of an experimental Lean Premixed Prevapourised (LPP) dump combustor (Besson, Bruel 1999, 2000) under different premixed or partially premixed conditions will also be presented.
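    The G-equation formulation summarized above is commonly written as follows; this is the standard form for the propagation of the flame-front iso-surface $G = G_0$, not necessarily the exact variant used in the cited study:

```latex
\frac{\partial G}{\partial t} + \mathbf{u} \cdot \nabla G = s_T \, |\nabla G|
```

where $\mathbf{u}$ is the local flow velocity and $s_T$ the (turbulent) burning velocity. Consistent with the abstract, the equation is meaningful only on the level set $G = G_0$ and is therefore decoupled from the other G levels.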

  2. Large deformation solid-fluid interaction via a level set approach.

    SciTech Connect

    Schunk, Peter Randall; Noble, David R.; Baer, Thomas A.; Rao, Rekha Ranjana; Notz, Patrick K.; Wilkes, Edward Dean

    2003-12-01

    Solidification and blood flow seemingly have little in common, but each involves a fluid in contact with a deformable solid. In these systems, the solid-fluid interface moves as the solid advects and deforms, often traversing the entire domain of interest. Currently, these problems cannot be simulated without innumerable expensive remeshing steps, mesh manipulations or decoupling the solid and fluid motion. Despite the wealth of progress recently made in mechanics modeling, this glaring inadequacy persists. We propose a new technique that tracks the interface implicitly and circumvents the need for remeshing and remapping the solution onto the new mesh. The solid-fluid boundary is tracked with a level set algorithm that changes the equation type dynamically depending on the phases present. This novel approach to coupled mechanics problems promises to give accurate stresses, displacements and velocities in both phases, simultaneously.

  3. Plastic set of smooth large radii of curvature thermal conductance specimens at light loads.

    NASA Technical Reports Server (NTRS)

    Mckinzie, D. J., Jr.

    1972-01-01

    Thermal contact conductance test data at high vacuum were obtained from two Armco iron specimens having smooth, large radii of curvature, convex, one-half wave length surfaces. The data are compared with calculations based on two macroscopic elastic deformation theories and an empirical expression. Major disagreement with the theories and fair agreement with the empirical expression resulted. Plastic deformation of all the contacting surfaces was verified from surface analyzer statistics. These results indicate that the theoretical assumption of macroscopic elastic deformation is inadequate for accurate prediction of heat transfer with light loads for Armco iron specimens similar to those used in this investigation.

  5. A new tool called DISSECT for analysing large genomic data sets using a Big Data approach

    PubMed Central

    Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert

    2015-01-01

    Large-scale genetic and genomic data are increasingly available, and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex-trait analysis, we present DISSECT. DISSECT is new, freely available software that exploits the distributed-memory parallel computational architectures of compute clusters to perform a wide range of genomic and epidemiologic analyses which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010

  6. Setting up a Rayleigh Scattering Based Flow Measuring System in a Large Nozzle Testing Facility

    NASA Technical Reports Server (NTRS)

    Panda, Jayanta; Gomez, Carlos R.

    2002-01-01

    A molecular Rayleigh scattering based air density measurement system has been built in a large nozzle testing facility at NASA Glenn Research Center. The technique depends on the scattering of light by gas molecules present in air; no artificial seeding is required. Light from a single-mode, continuous-wave laser was transmitted to the nozzle facility by optical fiber, and light scattered by gas molecules at various points along the laser beam was collected and measured by photon-counting electronics. By placing the laser beam and collection optics on synchronized traversing units, the point measurement technique is made effective for surveying density variation over a cross-section of the nozzle plume. Various difficulties associated with dust particles, stray light, high noise level and vibration are discussed. Finally, a limited amount of data from an underexpanded jet is presented and compared with expected variations to validate the technique.

  7. Three-dimensional nanostructure determination from a large diffraction data set recorded using scanning electron nanodiffraction

    DOE PAGES

    Meng, Yifei; Zuo, Jian-Min

    2016-07-04

    A diffraction-based technique is developed for the determination of three-dimensional nanostructures. The technique employs high-resolution and low-dose scanning electron nanodiffraction (SEND) to acquire three-dimensional diffraction patterns, with the help of a special sample holder for large-angle rotation. Grains are identified in three-dimensional space based on crystal orientation and on reconstructed dark-field images from the recorded diffraction patterns. Application to a nanocrystalline TiN thin film shows that the three-dimensional morphology of columnar TiN grains of tens of nanometres in diameter can be reconstructed using an algebraic iterative algorithm under specified prior conditions, together with their crystallographic orientations. The principles can be extended to multiphase nanocrystalline materials as well. Thus, the tomographic SEND technique provides an effective and adaptive way of determining three-dimensional nanostructures.
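    The "algebraic iterative algorithm" family used for such reconstructions can be illustrated with a minimal Kaczmarz-style sketch. This is a toy two-unknown system, not the actual SEND projection geometry; the matrix and measurements below are invented purely for illustration.

    ```python
    # Minimal sketch of an algebraic (Kaczmarz-type) iterative reconstruction:
    # cyclically project the current estimate onto the hyperplane of each
    # measurement row of A x = b.

    def kaczmarz(A, b, sweeps=200):
        """Solve A x = b by cyclic row projections (Kaczmarz / ART)."""
        n = len(A[0])
        x = [0.0] * n
        for _ in range(sweeps):
            for row, bi in zip(A, b):
                dot = sum(r * xi for r, xi in zip(row, x))
                norm2 = sum(r * r for r in row)
                if norm2 == 0.0:
                    continue
                step = (bi - dot) / norm2
                x = [xi + step * r for xi, r in zip(x, row)]
        return x

    # Toy "projection" system: two rays through a 2-pixel object.
    A = [[1.0, 1.0], [1.0, -1.0]]
    b = [3.0, 1.0]          # consistent with the object (2, 1)
    x = kaczmarz(A, b)      # converges to [2.0, 1.0]
    ```

    For a consistent system the iteration converges to the solution; real tomographic reconstructions add the prior conditions the abstract mentions (e.g. non-negativity) between sweeps.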

  8. External beam IBA set-up with large-area thin Si3N4 window

    NASA Astrophysics Data System (ADS)

    Palonen, V.; Mizohata, K.; Nissinen, T.; Räisänen, J.

    2016-08-01

    A compact external beam setup has been constructed for Particle Induced X-ray Emission (PIXE) and Nuclear Reaction Analysis (NRA). The key issue in the design has been to obtain a wide beam spot with maximized beam current by utilizing a thin Si3N4 exit window. The employed exit window support enables use of foils with a thickness of 100 nm for a beam spot 4 mm in diameter. The durable thin foil and the large beam spot size are especially important for the complementary external beam NRA measurements. The path between the exit foil and the sample is filled with flowing helium to minimize radiation hazard as well as energy loss and straggling, and to cool the samples. For sample-independent beam current monitoring and irradiation fluence measurement, indirect charge integration, based on secondary electron current measurement from a beam profilometer, is utilized.

  9. Estimation of melting points of large set of persistent organic pollutants utilizing QSPR approach.

    PubMed

    Watkins, Marquita; Sizochenko, Natalia; Rasulev, Bakhtiyor; Leszczynski, Jerzy

    2016-03-01

    The presence of polyhalogenated persistent organic pollutants (POPs), such as Cl/Br-substituted benzenes, biphenyls, diphenyl ethers, and naphthalenes, has been identified in all environmental compartments. Exposure to these compounds can pose potential risk not only for ecological systems, but also for human health. Therefore, efficient tools for comprehensive environmental risk assessment of POPs are required. Among the factors vital for environmental transport and fate processes is the melting point of a compound. In this study, we estimated the melting points of a large group (1419 compounds) of chloro- and bromo-derivatives of dibenzo-p-dioxins, dibenzofurans, biphenyls, naphthalenes, diphenyl ethers, and benzenes by utilizing quantitative structure-property relationship (QSPR) techniques. The compounds were classified by applying structure-based clustering methods followed by GA-PLS modeling. In addition, the random forest method was applied to develop more general models. Factors responsible for melting point behavior and the predictive ability of each method are discussed.

  10. A hybrid structure for the storage and manipulation of very large spatial data sets

    USGS Publications Warehouse

    Peuquet, Donna J.

    1982-01-01

    The map data input and output problem for geographic information systems is rapidly diminishing with the increasing availability of mass digitizing, direct spatial data capture and graphics hardware based on raster technology. Although a large number of efficient raster-based algorithms exist for performing a wide variety of common tasks on these data, there are a number of procedures which are more efficiently performed in vector mode or for which raster mode equivalents of current vector-based techniques have not yet been developed. This paper presents a hybrid spatial data structure, named the 'vaster' structure, which can utilize the advantages of both raster and vector structures while potentially eliminating, or greatly reducing, the need for raster-to-vector and vector-to-raster conversion. Other advantages of the vaster structure are also discussed.

  11. Processing large sensor data sets for safeguards : the knowledge generation system.

    SciTech Connect

    Thomas, Maikel A.; Smartt, Heidi Anne; Matthews, Robert F.

    2012-04-01

    Modern nuclear facilities, such as reprocessing plants, present inspectors with significant challenges due in part to the sheer amount of equipment that must be safeguarded. The Sandia-developed and patented Knowledge Generation system was designed to automatically analyze large amounts of safeguards data to identify anomalous events of interest by comparing sensor readings with those expected from a process of interest and operator declarations. This paper describes a demonstration of the Knowledge Generation system using simulated accountability tank sensor data to represent part of a reprocessing plant. The demonstration indicated that Knowledge Generation has the potential to address several problems critical to the future of safeguards. It could be extended to facilitate remote inspections and trigger random inspections. Knowledge Generation could analyze data to establish trust hierarchies, to facilitate safeguards use of operator-owned sensors.

  12. New large solar photocatalytic plant: set-up and preliminary results.

    PubMed

    Malato, S; Blanco, J; Vidal, A; Fernández, P; Cáceres, J; Trincado, P; Oliveira, J C; Vincent, M

    2002-04-01

    A European industrial consortium called SOLARDETOX has been created as the result of an EC-DGXII BRITE-EURAM-III-financed project on solar photocatalytic detoxification of water. The project objective was to develop a simple, efficient and commercially competitive water-treatment technology, based on compound parabolic collector (CPC) solar collectors and TiO2 photocatalysis, to make easy design and installation possible. The design, set-up and preliminary results of the main project deliverable, the first European industrial solar detoxification treatment plant, are presented. This plant has been designed for the batch treatment of 2 m3 of water with a 100 m2 collector-aperture area and aqueous aerated suspensions of polycrystalline TiO2 irradiated by sunlight. Fully automatic control reduces operation and maintenance manpower. Plant behaviour has been compared (using dichloroacetic acid and cyanide at 50 mg l(-1) initial concentration as model compounds) with the small CPC pilot plants installed at the Plataforma Solar de Almería several years ago. The first results with high-content cyanide (1 g l(-1)) waste water are presented and the plant treatment capacity is calculated.

  13. Data Mining on Large Data Set for Predicting Salmon Spawning Habitat

    SciTech Connect

    Xie, YuLong; Murray, Christopher J.; Hanrahan, Timothy P.; Geist, David R.

    2008-07-01

    Hydraulic properties related to river flow affect salmon spawning habitat. Accurate prediction of salmon spawning habitat and understanding of the properties that influence spawning behavior are of great interest for hydroelectric dam management. Previous research predicted salmon spawning habitat by deriving river-specific spawning suitability indices and employing a function estimation method such as logistic regression on several static river-flow-related properties, with some success. The objective of this study was two-fold. First, dynamic river flow properties associated with upstream dam operation were successfully derived from a huge set of time series of both water velocity and water depth for about one-fifth of a million habitat cells through principal component analysis (PCA) using nonlinear iterative partial least squares (NIPALS). The inclusion of dynamic variables greatly improved model prediction. Second, nine machine learning methods were applied to the data, and it was found that decision tree and rule induction methods generally outperformed the commonly used logistic regression. Specifically, random forest, an advanced decision tree algorithm, consistently provided better results. The over-prediction problem of previous studies was greatly alleviated.
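    The NIPALS iteration used above to extract principal components from the velocity/depth time series can be sketched in pure Python. The data matrix below is a tiny, invented, mean-centered example; the study itself operated on roughly 200,000 habitat cells.

    ```python
    # NIPALS sketch: extract the first principal component of a mean-centered
    # matrix X by alternating between scores t and loadings p until t stops
    # changing.

    def nipals_first_pc(X, n_iter=100, tol=1e-12):
        """Return (scores t, loadings p) of the first principal component."""
        n, m = len(X), len(X[0])
        t = [row[0] for row in X]                  # initialize scores with a column
        for _ in range(n_iter):
            # p = X^T t / (t^T t), then normalize to unit length
            tt = sum(ti * ti for ti in t)
            p = [sum(X[i][j] * t[i] for i in range(n)) / tt for j in range(m)]
            pnorm = sum(pj * pj for pj in p) ** 0.5
            p = [pj / pnorm for pj in p]
            # t = X p
            t_new = [sum(X[i][j] * p[j] for j in range(m)) for i in range(n)]
            if sum((a - b) ** 2 for a, b in zip(t_new, t)) < tol:
                t = t_new
                break
            t = t_new
        return t, p

    # Mean-centered toy data: two perfectly correlated "habitat cell" variables,
    # so the first loading vector is (1/sqrt(2), 1/sqrt(2)) up to sign.
    X = [[-2.0, -2.0], [0.0, 0.0], [2.0, 2.0]]
    t, p = nipals_first_pc(X)
    ```

    Further components are obtained in the same way after deflating X by the outer product of t and p; only the first component is shown to keep the sketch short.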

  14. Improved Species-Specific Lysine Acetylation Site Prediction Based on a Large Variety of Features Set.

    PubMed

    Wuyun, Qiqige; Zheng, Wei; Zhang, Yanping; Ruan, Jishou; Hu, Gang

    2016-01-01

    Lysine acetylation is a major post-translational modification. It plays a vital role in numerous essential biological processes, such as gene expression and metabolism, and is related to some human diseases. To fully understand the regulatory mechanism of acetylation, identification of acetylation sites is the first and most important step. However, experimental identification of protein acetylation sites is often time consuming and expensive. Therefore, alternative computational methods are necessary. Here, we developed a novel tool, KA-predictor, to predict species-specific lysine acetylation sites based on a support vector machine (SVM) classifier. We incorporated different types of features and employed an efficient feature selection on each type to form the final optimal feature set for model learning. Our predictor was highly competitive for the majority of species when compared with other methods. Feature contribution analysis indicated that HSE features, introduced here for the first time for lysine acetylation prediction, significantly improved the predictive performance. In particular, we constructed a high-accuracy structure dataset of H. sapiens from the PDB to analyze the structural properties around lysine acetylation sites. Our datasets and a user-friendly local tool of KA-predictor are freely available at http://sourceforge.net/p/ka-predictor. PMID:27183223

  16. Cytotoxicity evaluation of large cyanobacterial strain set using selected human and murine in vitro cell models.

    PubMed

    Hrouzek, Pavel; Kapuścik, Aleksandra; Vacek, Jan; Voráčová, Kateřina; Paichlová, Jindřiška; Kosina, Pavel; Voloshko, Ludmila; Ventura, Stefano; Kopecký, Jiří

    2016-02-01

    The production of cytotoxic molecules interfering with mammalian cells is extensively reported in cyanobacteria. These compounds may have a use in pharmacological applications; however, their potential toxicity needs to be considered. We performed cytotoxicity tests of crude cyanobacterial extracts in six cell models in order to address the frequency of cyanobacterial cytotoxicity to human cells and the level of specificity to a particular cell line. A set of more than 100 cyanobacterial crude extracts isolated from soil habitats (mainly genera Nostoc and Tolypothrix) was tested by MTT test for in vitro toxicity on the hepatic and non-hepatic human cell lines HepG2 and HeLa, and three cell systems of rodent origin: Yac-1, Sp-2 and Balb/c 3T3 fibroblasts. Furthermore, a subset of the extracts was assessed for cytotoxicity against primary cultures of human hepatocytes as a model for evaluating potential hepatotoxicity. Roughly one third of cyanobacterial extracts caused cytotoxic effects (i.e. viability<75%) on human cell lines. Despite the sensitivity differences, high correlation coefficients among the inhibition values were obtained for particular cell systems. This suggests a prevailing general cytotoxic effect of extracts and their constituents. The non-transformed immortalized fibroblasts (Balb/c 3T3) and hepatic cancer line HepG2 exhibited good correlations with primary cultures of human hepatocytes. The presence of cytotoxic fractions in strongly cytotoxic extracts was confirmed by an activity-guided HPLC fractionation, and it was demonstrated that cyanobacterial cytotoxicity is caused by a mixture of components with similar hydrophobic/hydrophilic properties. The data presented here could be used in further research into in vitro testing based on human models for the toxicological monitoring of complex cyanobacterial samples. PMID:26519817

  17. Problem-based learning: facilitating multiple small teams in a large group setting.

    PubMed

    Hyams, Jennifer H; Raidal, Sharanne L

    2013-01-01

    Problem-based learning (PBL) is often described as resource demanding due to the high staff-to-student ratio required in a traditional PBL tutorial class where there is commonly one facilitator to every 5-16 students. The veterinary science program at Charles Sturt University, Australia, has developed a method of group facilitation which readily allows one or two staff members to facilitate up to 30 students at any one time while maintaining the benefits of a small PBL team of six students. Multi-team facilitation affords obvious financial and logistic advantages, but there are also important pedagogical benefits derived from uniform facilitation across multiple groups, enhanced discussion and debate between groups, and the development of self-facilitation skills in students. There are few disadvantages to the roaming facilitator model, provided that several requirements are addressed. These requirements include a suitable venue, large whiteboards, a structured approach to support student engagement with each disclosure, a detailed facilitator guide, and an open, collaborative, and communicative environment.

  18. A Large Set of Newly Created Interspecific Saccharomyces Hybrids Increases Aromatic Diversity in Lager Beers

    PubMed Central

    Mertens, Stijn; Steensels, Jan; Saels, Veerle; De Rouck, Gert; Aerts, Guido

    2015-01-01

    Lager beer is the most consumed alcoholic beverage in the world. Its production process is marked by a fermentation conducted at low (8 to 15°C) temperatures and by the use of Saccharomyces pastorianus, an interspecific hybrid between Saccharomyces cerevisiae and the cold-tolerant Saccharomyces eubayanus. Recent whole-genome-sequencing efforts revealed that the currently available lager yeasts belong to one of only two archetypes, “Saaz” and “Frohberg.” This limited genetic variation likely reflects that all lager yeasts descend from only two separate interspecific hybridization events, which may also explain the relatively limited aromatic diversity between the available lager beer yeasts compared to, for example, wine and ale beer yeasts. In this study, 31 novel interspecific yeast hybrids were developed, resulting from large-scale robot-assisted selection and breeding between carefully selected strains of S. cerevisiae (six strains) and S. eubayanus (two strains). Interestingly, many of the resulting hybrids showed a broader temperature tolerance than their parental strains and reference S. pastorianus yeasts. Moreover, they combined a high fermentation capacity with a desirable aroma profile in laboratory-scale lager beer fermentations, thereby successfully enriching the currently available lager yeast biodiversity. Pilot-scale trials further confirmed the industrial potential of these hybrids and identified one strain, hybrid H29, which combines a fast fermentation, high attenuation, and the production of a complex, desirable fruity aroma. PMID:26407881

  19. Can Wide Consultation Help with Setting Priorities for Large-Scale Biodiversity Monitoring Programs?

    PubMed Central

    Boivin, Frédéric; Simard, Anouk; Peres-Neto, Pedro

    2014-01-01

    Climate and other global change phenomena affecting biodiversity require monitoring to track ecosystem changes and guide policy and management actions. Designing a biodiversity monitoring program is a difficult task that requires making decisions that often lack consensus due to budgetary constraints. As monitoring programs require long-term investment, they also require strong and continuing support from all interested parties. As such, stakeholder consultation is key to identifying priorities and making sound design decisions that have as much support as possible. Here, we present the results of a consultation conducted to serve as an aid for designing a large-scale biodiversity monitoring program for the province of Québec (Canada). The consultation took the form of a survey with 13 discrete choices involving tradeoffs in design priorities and 10 demographic questions (e.g., age, profession). The survey was sent to thousands of individuals with expected interests and knowledge about biodiversity and was completed by 621 participants. Overall, consensuses were few and it appeared difficult to create a design fulfilling the priorities of the majority. Most participants wanted 1) a monitoring design covering the entire territory and focusing on natural habitats, and 2) a focus on species related to ecosystem services, on threatened species and on invasive species. The only demographic characteristic related to the type of prioritization was the declared level of knowledge in biodiversity (null to high), but even then the influence was quite small. PMID:25525798

  20. "Tools For Analysis and Visualization of Large Time- Varying CFD Data Sets"

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; vanGelder, Allen

    1999-01-01

    During the four years of this grant (including the one-year extension), we explored many aspects of the visualization of large CFD (Computational Fluid Dynamics) datasets. These included new direct volume rendering approaches, hierarchical methods, volume decimation, error metrics, parallelization, hardware texture mapping, and methods for analyzing and comparing images. First, we implemented an extremely general direct volume rendering approach that can be used to render rectilinear, curvilinear, or tetrahedral grids, including overlapping multiple-zone grids and time-varying grids. Next, we developed techniques for associating the sample data with a k-d tree, a simple hierarchical data model that approximates samples in the regions covered by each node of the tree, together with an error metric for the accuracy of the model. We also explored a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH (Association for Computing Machinery Special Interest Group on Computer Graphics) '96. In our initial implementation, we automatically image the volume from 32 approximately evenly distributed positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation.

  1. Inference of higher-order relationships in the cycads from a large chloroplast data set.

    PubMed

    Rai, Hardeep S; O'Brien, Heath E; Reeves, Patrick A; Olmstead, Richard G; Graham, Sean W

    2003-11-01

    We investigated higher-order relationships in the cycads, an ancient group of seed-bearing plants, by examining a large portion of the chloroplast genome from seven species chosen to exemplify our current understanding of taxonomic diversity in the order. The regions considered span approximately 13.5 kb of unaligned data per taxon, and comprise a diverse range of coding sequences, introns and intergenic spacers dispersed throughout the plastid genome. Our results provide substantial support for most of the inferred backbone of cycad phylogeny, and weak evidence that the sister-group of the cycads among living seed plants is Ginkgo biloba. Cycas (representing Cycadaceae) is the sister-group of the remaining cycads; Dioon is part of the next most basal split. Two of the three commonly recognized families of cycads (Zamiaceae and Stangeriaceae) are not monophyletic; Stangeria is embedded within Zamiaceae, close to Zamia and Ceratozamia, and not closely allied to the other genus of Stangeriaceae, Bowenia. In contrast to the other seed plants, cycad chloroplast genomes share two features with Ginkgo: a reduced rate of evolution and an elevated transition:transversion ratio. We demonstrate that the latter aspect of their molecular evolution is unlikely to have affected inference of cycad relationships in the context of seed-plant wide analyses.

  2. A large set of newly created interspecific Saccharomyces hybrids increases aromatic diversity in lager beers.

    PubMed

    Mertens, Stijn; Steensels, Jan; Saels, Veerle; De Rouck, Gert; Aerts, Guido; Verstrepen, Kevin J

    2015-12-01

    Lager beer is the most consumed alcoholic beverage in the world. Its production process is marked by a fermentation conducted at low (8 to 15°C) temperatures and by the use of Saccharomyces pastorianus, an interspecific hybrid between Saccharomyces cerevisiae and the cold-tolerant Saccharomyces eubayanus. Recent whole-genome-sequencing efforts revealed that the currently available lager yeasts belong to one of only two archetypes, "Saaz" and "Frohberg." This limited genetic variation likely reflects that all lager yeasts descend from only two separate interspecific hybridization events, which may also explain the relatively limited aromatic diversity between the available lager beer yeasts compared to, for example, wine and ale beer yeasts. In this study, 31 novel interspecific yeast hybrids were developed, resulting from large-scale robot-assisted selection and breeding between carefully selected strains of S. cerevisiae (six strains) and S. eubayanus (two strains). Interestingly, many of the resulting hybrids showed a broader temperature tolerance than their parental strains and reference S. pastorianus yeasts. Moreover, they combined a high fermentation capacity with a desirable aroma profile in laboratory-scale lager beer fermentations, thereby successfully enriching the currently available lager yeast biodiversity. Pilot-scale trials further confirmed the industrial potential of these hybrids and identified one strain, hybrid H29, which combines a fast fermentation, high attenuation, and the production of a complex, desirable fruity aroma. PMID:26407881

  3. On divide-and-conquer strategies for parsimony analysis of large data sets: Rec-I-DCM3 versus TNT.

    PubMed

    Goloboff, Pablo A; Pol, Diego

    2007-06-01

    Roshan et al. recently described a "divide-and-conquer" technique for parsimony analysis of large data sets, Rec-I-DCM3, and stated that it compares very favorably to results using the program TNT. Their technique is based on selecting subsets of taxa to create reduced data sets or subproblems, finding most-parsimonious trees for each reduced data set, recombining all parts together, and then performing global TBR swapping on the combined tree. Here, we contrast this approach to sectorial searches, a divide-and-conquer algorithm implemented in TNT. This algorithm also uses a guide tree to create subproblems, with the first-pass state sets of the nodes that join the selected sectors with the rest of the topology; this allows exact length calculations for the entire topology (that is, any solution N steps shorter than the original, for the reduced subproblem, must also be N steps shorter for the entire topology). We show here that, for sectors of similar size analyzed with the same search algorithms, subdividing data sets with sectorial searches produces better results than subdividing with Rec-I-DCM3. Roshan et al.'s claim that Rec-I-DCM3 outperforms the techniques in TNT was caused by a poor experimental design and algorithmic settings used for the runs in TNT. In particular, for finding trees at or very close to the minimum known length of the analyzed data sets, TNT clearly outperforms Rec-I-DCM3. Finally, we show that the performance of Rec-I-DCM3 is bound by the efficiency of TBR implementation for the complete data set, as this method behaves (after some number of iterations) as a technique for cyclic perturbations and improvements more than as a divide-and-conquer strategy.

  4. Actual Versus Estimated Utility Factor of a Large Set of Privately Owned Chevrolet Volts

    SciTech Connect

    John Smart; Thomas Bradley; Stephen Schey

    2014-04-01

    In order to determine the overall fuel economy of a plug-in hybrid electric vehicle (PHEV), the amount of operation in charge depleting (CD) versus charge sustaining mode must be determined. Mode of operation is predominantly dependent on customer usage of the vehicle and is therefore highly variable. The utility factor (UF) concept was developed to quantify the distance a group of vehicles has traveled, or may travel, in CD mode. SAE J2841 presents a UF calculation method based on data collected from travel surveys of conventional vehicles. UF estimates have been used in a variety of areas, including the calculation of window-sticker fuel economy, policy decisions, and vehicle design determination. The EV Project, a plug-in electric vehicle charging infrastructure demonstration being conducted across the United States, provides the opportunity to determine the real-world UF of a large group of privately owned Chevrolet Volt extended-range electric vehicles. Using data collected from Volts enrolled in The EV Project, this paper compares the real-world UF of two groups of Chevrolet Volts to estimated UFs based on J2841. The actual fleet utility factors (FUFs) observed for the MY2011/2012 and MY2013 Volt groups studied were 72% and 74%, respectively. Using the EPA CD ranges, the method prescribed by J2841 estimates a FUF of 65% and 68% for the MY2011/2012 and MY2013 Volt groups, respectively. Volt drivers achieved higher percentages of distance traveled in EV mode for two reasons. First, they had fewer long-distance travel days than drivers in the national travel survey referenced by J2841. Second, they charged more frequently than the J2841 assumption of once per day: drivers of Volts in this study averaged over 1.4 charging events per day. Although actual CD range varied widely as driving conditions varied, the average CD ranges for the two Volt groups studied matched the EPA CD range estimates, so CD range variation did not affect FUF results.
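    The fleet utility factor idea can be sketched as below. This is the simple per-vehicle-day form, assuming one full charge per driving day; SAE J2841 proper weights daily distances by travel-survey distributions, and all distances here are invented for illustration.

    ```python
    # Sketch of a fleet utility factor (FUF): the fraction of total distance
    # that could be driven in charge-depleting (CD) mode, assuming each
    # driving day starts with a full charge.

    def fleet_utility_factor(daily_miles, cd_range):
        """FUF = sum(min(d, cd_range)) / sum(d) over all vehicle-days."""
        cd_miles = sum(min(d, cd_range) for d in daily_miles)
        return cd_miles / sum(daily_miles)

    # Hypothetical daily distances (miles) for a small fleet, CD range 38 mi.
    days = [12, 30, 45, 8, 120, 25, 38, 60]
    fuf = fleet_utility_factor(days, 38)   # 227/338, about 0.67
    ```

    Note how a single long-distance day (120 miles) pulls the FUF down, which mirrors the paper's first explanation for why Volt drivers beat the J2841 estimate: fewer long-distance travel days than the national travel survey assumed.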

  5. Designing Websites for Displaying Large Data Sets and Images on Multiple Platforms

    NASA Astrophysics Data System (ADS)

    Anderson, A.; Wolf, V. G.; Garron, J.; Kirschner, M.

    2012-12-01

    The desire to build websites that analyze and display ever-increasing amounts of scientific data and images pushes web designs toward large displays, and toward using the display area as efficiently as possible. Yet scientists and users of their data increasingly wish to access these websites in the field and on mobile devices. This results in the need to develop websites that support a wide range of devices and screen sizes, and that optimally use whatever display area is available. Historically, designers have addressed this issue by building two websites: one for mobile devices and one for desktop environments, resulting in increased cost, duplication of work, and longer development times. Recent advances in web design technology and techniques allow the development of a single website that dynamically adjusts to the type of device being used to browse it (smartphone, tablet, desktop). In addition, they provide the opportunity to truly optimize whatever display area is available. HTML5 and CSS3 give web designers media query statements, which allow style sheets to be aware of the size of the display being used and to format web content differently based upon the queried response. Web elements can be rendered in a different size or position, or even removed from the display entirely, based upon the size of the display area. Using HTML5/CSS3 media queries in this manner is referred to as "Responsive Web Design" (RWD). RWD, in combination with technologies such as LESS and Twitter Bootstrap, allows the web designer to build websites that not only dynamically respond to the browser display size being used, but do so in controlled and intelligent ways, ensuring that good layout and graphic design principles are followed. At the University of Alaska Fairbanks, the Alaska Satellite Facility SAR Data Center (ASF) recently redesigned their popular Vertex application and converted it from a

  6. Evaluation in the Classroom.

    ERIC Educational Resources Information Center

    Becnel, Shirley

    Six classroom research-based instructional projects funded under Chapter 2 are described, and their outcomes are summarized. The projects each used computer hardware and software in the classroom setting. The projects and their salient points include: (1) the Science Technology Project, in which 48 teachers and 2,847 students in 18 schools used…

  7. Large population sizes mitigate negative effects of variable weather conditions on fruit set in two spring woodland orchids.

    PubMed

    Jacquemyn, Hans; Brys, Rein; Honnay, Olivier

    2009-08-23

    Global circulation models predict increased climatic variability, which could increase variability in demographic rates and affect long-term population viability. In animal-pollinated species, pollination services, and thus fruit and seed set, may be highly variable among years and sites, and depend on both local environmental conditions and climatic variables. Orchid species may be particularly vulnerable to disruption of their pollination services, as most species depend on pollinators for successful fruit set and because seed germination and seedling recruitment are to some extent dependent on the amount of fruits and seeds produced. Better insights into the factors determining fruit and seed set are therefore indispensable for a better understanding of the population dynamics and viability of orchid populations under changing climatic conditions. However, very few studies have investigated spatio-temporal variation in fruit set in orchids. Here, we quantified fruit production during five consecutive years (2002-2006) in eight populations of the non-rewarding orchid Orchis purpurea and 13 populations of the rewarding Neottia (Listera) ovata. Fruit production in large populations was much more stable than in small populations and was less affected by extreme weather conditions. Our results highlight the potential vulnerability of small orchid populations to an increasingly variable climate through highly unpredictable fruit-set patterns.

  8. The Differentiated Classroom Observation Scale

    ERIC Educational Resources Information Center

    Cassady, Jerrell C.; Neumeister, Kristie L. Speirs; Adams, Cheryll M.; Cross, Tracy L.; Dixon, Felicia A.; Pierce, Rebecca L.

    2004-01-01

    This article presents a new classroom observation scale that was developed to examine the differential learning activities and experiences of gifted children educated in regular classroom settings. The Differentiated Classroom Observation Scale (DCOS) is presented in total, with clarification of the coding practices and strategies. Although the…

  9. Mining unusual and rare stellar spectra from large spectroscopic survey data sets using the outlier-detection method

    NASA Astrophysics Data System (ADS)

    Wei, Peng; Luo, Ali; Li, Yinbi; Pan, Jingchang; Tu, Liangping; Jiang, Bin; Kong, Xiao; Shi, Zhixin; Yi, Zhenping; Wang, Fengfei; Liu, Jie; Zhao, Yongheng

    2013-05-01

    The large number of spectra obtained from sky surveys such as the Sloan Digital Sky Survey (SDSS) and the survey executed by the Large sky Area Multi-Object fibre Spectroscopic Telescope (LAMOST, also called GuoShouJing Telescope) provide us with opportunities to search for peculiar or even unknown types of spectra. In response to the limitations of existing methods, a novel outlier-mining method, the Monte Carlo Local Outlier Factor (MCLOF), is proposed in this paper, which can be used to highlight unusual and rare spectra from large spectroscopic survey data sets. The MCLOF method exposes outliers automatically and efficiently by marking each spectrum with a number, i.e. using the outlier index as a flag for an unusual and rare spectrum. The Local Outlier Factor (LOF) represents how unusual and rare a spectrum is compared with other spectra, and the Monte Carlo method is used to compute the global LOF for each spectrum by randomly selecting samples in each independent iteration. Our MCLOF method is applied to over half a million stellar spectra (classified as STAR by the SDSS Pipeline) from the SDSS data release 8 (DR8), and a total of 37 033 spectra are selected as outliers with signal-to-noise ratio (S/N) ≥ 3 and outlier index ≥0.85. Some of these outliers are shown to be binary stars, emission-line stars, carbon stars and stars with unusual continuum. The results show that our proposed method can efficiently highlight these unusual spectra from the survey data sets. In addition, some relatively rare and interesting spectra are selected, indicating that the proposed method can also be used to mine rare, even unknown, spectra. The proposed method is applicable not only to spectral survey data sets but also to other types of survey data sets. The spectra of all peculiar objects selected by our MCLOF method are available from a user-friendly website: http://sciwiki.lamost.org/Miningdr8/.
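
    The flavor of Monte Carlo outlier mining can be sketched as follows. This is our toy reconstruction, not the authors' MCLOF code: a plain k-nearest-neighbour distance stands in for the true Local Outlier Factor, and all names, data, and parameters are invented:

```python
import math
import random

def knn_distance(points, i, k):
    """Distance from points[i] to its k-th nearest neighbour."""
    d = sorted(math.dist(points[i], p) for j, p in enumerate(points) if j != i)
    return d[k - 1]

def mc_outlier_index(points, n_iter=100, sample_size=20, k=4, top_n=2, seed=0):
    """Fraction of random subsamples in which a point scores among the
    top_n highest local outlier scores (a toy stand-in for MCLOF)."""
    rng = random.Random(seed)
    flagged = [0] * len(points)
    sampled = [0] * len(points)
    for _ in range(n_iter):
        idx = rng.sample(range(len(points)), sample_size)
        sub = [points[i] for i in idx]
        scores = [knn_distance(sub, j, k) for j in range(sample_size)]
        cut = sorted(scores, reverse=True)[top_n - 1]  # threshold at the top_n-th score
        for j, i in enumerate(idx):
            sampled[i] += 1
            if scores[j] >= cut:
                flagged[i] += 1
    return [f / s if s else 0.0 for f, s in zip(flagged, sampled)]

rng = random.Random(1)
points = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(40)] + [(10.0, 10.0)]
index = mc_outlier_index(points)
# the planted outlier (the last point) receives the highest outlier index
```

    Thresholding the resulting index (the paper uses index ≥ 0.85 together with an S/N cut) then selects candidate unusual spectra for inspection.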

  10. Knowledge and theme discovery across very large biological data sets using distributed queries: a prototype combining unstructured and structured data.

    PubMed

    Mudunuri, Uma S; Khouja, Mohamad; Repetski, Stephen; Venkataraman, Girish; Che, Anney; Luke, Brian T; Girard, F Pascal; Stephens, Robert M

    2013-01-01

    As the discipline of biomedical science continues to apply new technologies capable of producing unprecedented volumes of noisy and complex biological data, it has become evident that available methods for deriving meaningful information from such data are simply not keeping pace. In order to achieve useful results, researchers require methods that consolidate, store and query combinations of structured and unstructured data sets efficiently and effectively. As we move towards personalized medicine, the need to combine unstructured data, such as medical literature, with large amounts of highly structured and high-throughput data such as human variation or expression data from very large cohorts, is especially urgent. For our study, we investigated a likely biomedical query using the Hadoop framework. We ran queries using native MapReduce tools we developed as well as other open source and proprietary tools. Our results suggest that the available technologies within the Big Data domain can reduce the time and effort needed to utilize and apply distributed queries over large datasets in practical clinical applications in the life sciences domain. The methodologies and technologies discussed in this paper set the stage for a more detailed evaluation that investigates how various data structures and data models are best mapped to the proper computational framework. PMID:24312478
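
    The distributed-query pattern these tools share can be illustrated in-process with the classic map/shuffle/reduce steps; the documents, gene names, and function names below are invented and bear no relation to the study's actual queries:

```python
from collections import defaultdict

def map_phase(text, genes):
    """Map: emit a (gene, 1) pair for every tracked gene mentioned in a document."""
    return [(g, 1) for g in genes if g in text]

def reduce_phase(pairs):
    """Shuffle + reduce: group pairs by key and sum the counts."""
    grouped = defaultdict(int)
    for key, value in pairs:
        grouped[key] += value
    return dict(grouped)

docs = ["BRCA1 variant linked to risk", "TP53 and BRCA1 expression"]  # toy "unstructured" side
genes = ["BRCA1", "TP53"]                                             # toy "structured" side
pairs = [p for text in docs for p in map_phase(text, genes)]
counts = reduce_phase(pairs)   # {'BRCA1': 2, 'TP53': 1}
```

    On Hadoop the map and reduce functions run on separate nodes over partitions of the data, but the contract is the same.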

  11. Knowledge and Theme Discovery across Very Large Biological Data Sets Using Distributed Queries: A Prototype Combining Unstructured and Structured Data

    PubMed Central

    Repetski, Stephen; Venkataraman, Girish; Che, Anney; Luke, Brian T.; Girard, F. Pascal; Stephens, Robert M.

    2013-01-01

    As the discipline of biomedical science continues to apply new technologies capable of producing unprecedented volumes of noisy and complex biological data, it has become evident that available methods for deriving meaningful information from such data are simply not keeping pace. In order to achieve useful results, researchers require methods that consolidate, store and query combinations of structured and unstructured data sets efficiently and effectively. As we move towards personalized medicine, the need to combine unstructured data, such as medical literature, with large amounts of highly structured and high-throughput data such as human variation or expression data from very large cohorts, is especially urgent. For our study, we investigated a likely biomedical query using the Hadoop framework. We ran queries using native MapReduce tools we developed as well as other open source and proprietary tools. Our results suggest that the available technologies within the Big Data domain can reduce the time and effort needed to utilize and apply distributed queries over large datasets in practical clinical applications in the life sciences domain. The methodologies and technologies discussed in this paper set the stage for a more detailed evaluation that investigates how various data structures and data models are best mapped to the proper computational framework. PMID:24312478

  12. A Rainbow for the Classroom.

    ERIC Educational Resources Information Center

    Russell, R. D.

    1989-01-01

    Describes an experiment producing a visible spectrum with inexpensive equipment available in the physics classroom. Discusses some related equations, apparatus settings, and instructional methods. (YP)

  13. A Large-Scale Empirical Evaluation of Cross-Validation and External Test Set Validation in (Q)SAR.

    PubMed

    Gütlein, Martin; Helma, Christoph; Karwath, Andreas; Kramer, Stefan

    2013-06-01

    (Q)SAR model validation is essential to ensure the quality of inferred models and to indicate future model predictivity on unseen compounds. Proper validation is also one of the requirements of regulatory authorities for accepting a (Q)SAR model and approving its use in real-world scenarios as an alternative testing method. However, at the same time, the question of how to validate a (Q)SAR model, in particular whether to employ variants of cross-validation or external test set validation, is still under discussion. In this paper, we empirically compare k-fold cross-validation with external test set validation. To this end we introduce a workflow that realistically simulates the common problem setting of building predictive models for relatively small datasets. The workflow allows the built and validated models to be applied to large amounts of unseen data, and the performance of the different validation approaches to be compared. The experimental results indicate that cross-validation produces better-performing (Q)SAR models than external test set validation and reduces the variance of the results, while at the same time underestimating the performance on unseen compounds. The experimental results reported in this paper suggest that, contrary to current conception in the community, cross-validation may play a significant role in evaluating the predictivity of (Q)SAR models.
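
    The contrast between the two validation schemes can be sketched on synthetic data with a deliberately trivial nearest-class-mean model (entirely our own toy, not the paper's workflow):

```python
import random
import statistics

def nearest_mean_predict(train, x):
    """Classify x by the closer of the two class means."""
    mean0 = statistics.mean(v for v, y in train if y == 0)
    mean1 = statistics.mean(v for v, y in train if y == 1)
    return 0 if abs(x - mean0) < abs(x - mean1) else 1

def accuracy(train, test):
    return sum(nearest_mean_predict(train, x) == y for x, y in test) / len(test)

def kfold_cv(data, k=5):
    """k-fold cross-validation: every compound serves exactly once as test data."""
    folds = [data[i::k] for i in range(k)]
    accs = []
    for i in range(k):
        train = [d for j, f in enumerate(folds) if j != i for d in f]
        accs.append(accuracy(train, folds[i]))
    return statistics.mean(accs)

rng = random.Random(0)
data = [(rng.gauss(0, 1), 0) for _ in range(100)] + [(rng.gauss(2, 1), 1) for _ in range(100)]
rng.shuffle(data)
cv_acc = kfold_cv(data)                     # all 200 points used for both training and testing
ext_acc = accuracy(data[:150], data[150:])  # single 75/25 external split
```

    The single external split withholds a quarter of the (often scarce) data from training and rests on one arbitrary partition, which is the variance issue the paper quantifies.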

  14. Performance-friendly rule extraction in large water data-sets with AOC posets and relational concept analysis

    NASA Astrophysics Data System (ADS)

    Dolques, Xavier; Le Ber, Florence; Huchard, Marianne; Grac, Corinne

    2016-02-01

    In this paper, we consider data analysis methods for knowledge extraction from large water data-sets. More specifically, we try to connect physico-chemical parameters and the characteristics of taxons living in sample sites. Among these data analysis methods, we consider formal concept analysis (FCA), which is a recognized tool for classification and rule discovery on object-attribute data. Relational concept analysis (RCA) relies on FCA and deals with sets of object-attribute data provided with relations. RCA produces more informative results but at the expense of an increase in complexity. Besides, in numerous applications of FCA, the partially ordered set of concepts introducing attributes or objects (AOC poset, for Attribute-Object-Concept poset) is used rather than the concept lattice in order to reduce combinatorial problems. AOC posets are much smaller and easier to compute than concept lattices and still contain the information needed to rebuild the initial data. This paper introduces a variant of the RCA process based on AOC posets rather than concept lattices. This approach is compared with RCA based on iceberg lattices. Experiments are performed with various scaling operators, and a specific operator is introduced to deal with noisy data. We show that using AOC poset on water data-sets provides a reasonable concept number and allows us to extract meaningful implication rules (association rules whose confidence is 1), whose semantics depends on the chosen scaling operator.
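
    The object-attribute core of FCA can be shown in a few lines. The context below (sample sites versus binary physico-chemical indicators) and every name in it are invented for illustration; real RCA/AOC-poset tooling is far more elaborate:

```python
from itertools import combinations

# Toy formal context: objects (sample sites) and their binary attributes.
context = {
    "site1": {"high_nitrate", "low_oxygen"},
    "site2": {"high_nitrate", "low_oxygen", "pesticides"},
    "site3": {"low_oxygen"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Objects possessing all the given attributes."""
    return {o for o, a in context.items() if attrs <= a}

def intent(objs):
    """Attributes shared by all the given objects."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

# Brute-force enumeration of formal concepts (fine for toy contexts).
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        e = extent(set(attrs))
        concepts.add((frozenset(e), frozenset(intent(e))))
```

    Implication rules with confidence 1 fall out of the same machinery: here extent({"high_nitrate"}) ⊆ extent({"low_oxygen"}), i.e. high_nitrate → low_oxygen. The AOC poset keeps only the concepts that introduce an object or an attribute, which is what keeps the structure small on large water data-sets.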

  15. Development and Validation of Decision Forest Model for Estrogen Receptor Binding Prediction of Chemicals Using Large Data Sets.

    PubMed

    Ng, Hui Wen; Doughty, Stephen W; Luo, Heng; Ye, Hao; Ge, Weigong; Tong, Weida; Hong, Huixiao

    2015-12-21

    Some chemicals in the environment possess the potential to interact with the endocrine system in the human body. Multiple receptors are involved in the endocrine system; estrogen receptor α (ERα) plays very important roles in endocrine activity and is the most studied receptor. Understanding and predicting estrogenic activity of chemicals facilitates the evaluation of their endocrine activity. Hence, we have developed a decision forest classification model to predict chemical binding to ERα using a large training data set of 3308 chemicals obtained from the U.S. Food and Drug Administration's Estrogenic Activity Database. We tested the model using cross validations and external data sets of 1641 chemicals obtained from the U.S. Environmental Protection Agency's ToxCast project. The model showed good performance in both internal (92% accuracy) and external validations (∼ 70-89% relative balanced accuracies), where the latter involved the validations of the model across different ER pathway-related assays in ToxCast. The important features that contribute to the prediction ability of the model were identified through informative descriptor analysis and were related to current knowledge of ER binding. Prediction confidence analysis revealed that the model had both high prediction confidence and accuracy for most predicted chemicals. The results demonstrated that the model constructed based on the large training data set is more accurate and robust for predicting ER binding of chemicals than the published models that have been developed using much smaller data sets. The model could be useful for the evaluation of ERα-mediated endocrine activity potential of environmental chemicals.


  17. Supporting Classroom Activities with the BSUL System

    ERIC Educational Resources Information Center

    Ogata, Hiroaki; Saito, Nobuji A.; Paredes J., Rosa G.; San Martin, Gerardo Ayala; Yano, Yoneo

    2008-01-01

    This paper presents the integration of ubiquitous computing systems into classroom settings, in order to provide basic support for classrooms and field activities. We have developed web application components using Java technology and configured a classroom with wireless network access and a web camera for our purposes. In this classroom, the…

  18. msIQuant--Quantitation Software for Mass Spectrometry Imaging Enabling Fast Access, Visualization, and Analysis of Large Data Sets.

    PubMed

    Källback, Patrik; Nilsson, Anna; Shariatgorji, Mohammadreza; Andrén, Per E

    2016-04-19

    This paper presents msIQuant, a novel instrument- and manufacturer-independent quantitative mass spectrometry imaging software suite that uses the standardized open-access data format imzML. Its data processing structure enables rapid image display and the analysis of very large data sets (>50 GB) without any data reduction. In addition, msIQuant provides many tools for image visualization, including multiple interpolation methods, low-intensity transparency display, and image fusion. It also has a quantitation function that automatically generates calibration curves from a series of standards, which can be used to determine the concentrations of specific analytes. Regions of interest in a tissue section can be analyzed in terms of a number of quantities, including the number of pixels, average intensity, standard deviation of intensity, and median and quartile intensities. Moreover, the suite's export functions enable simplified postprocessing of data and report creation. We demonstrate its potential through several applications, including the quantitation of small molecules such as drugs and neurotransmitters. The msIQuant suite is a powerful tool for accessing and evaluating very large data sets, quantifying drugs and endogenous compounds in tissue areas of interest, and processing mass spectra and images.
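
    The calibration-curve step can be sketched with ordinary least squares: fit intensity against the standard series, then invert the fitted line to turn a measured pixel intensity into a concentration. The numbers and units below are hypothetical:

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

standard_conc = [0.0, 1.0, 2.0, 5.0, 10.0]          # calibration series (e.g. pmol/pixel)
standard_sig = [10.0, 110.0, 210.0, 510.0, 1010.0]  # measured intensities for the standards
slope, intercept = fit_line(standard_conc, standard_sig)
conc = (350.0 - intercept) / slope                   # pixel intensity 350 → ≈ 3.4
```

    Aggregating such per-pixel estimates over a region of interest yields the region statistics (mean, median, quartiles) that the suite reports.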

  19. Functional network construction in Arabidopsis using rule-based machine learning on large-scale data sets.

    PubMed

    Bassel, George W; Glaab, Enrico; Marquez, Julietta; Holdsworth, Michael J; Bacardit, Jaume

    2011-09-01

    The meta-analysis of large-scale postgenomics data sets within public databases promises to provide important novel biological knowledge. Statistical approaches including correlation analyses in coexpression studies of gene expression have emerged as tools to elucidate gene function using these data sets. Here, we present a powerful and novel alternative methodology to computationally identify functional relationships between genes from microarray data sets using rule-based machine learning. This approach, termed "coprediction," is based on the collective ability of groups of genes co-occurring within rules to accurately predict the developmental outcome of a biological system. We demonstrate the utility of coprediction as a powerful analytical tool using publicly available microarray data generated exclusively from Arabidopsis thaliana seeds to compute a functional gene interaction network, termed Seed Co-Prediction Network (SCoPNet). SCoPNet predicts functional associations between genes acting in the same developmental and signal transduction pathways irrespective of the similarity in their respective gene expression patterns. Using SCoPNet, we identified four novel regulators of seed germination (ALTERED SEED GERMINATION5, 6, 7, and 8), and predicted interactions at the level of transcript abundance between these novel and previously described factors influencing Arabidopsis seed germination. An online Web tool to query SCoPNet has been developed as a community resource to dissect seed biology and is available at http://www.vseed.nottingham.ac.uk/. PMID:21896882

  20. The plateau in mnemonic resolution across large set sizes indicates discrete resource limits in visual working memory.

    PubMed

    Anderson, David E; Awh, Edward

    2012-07-01

    The precision of visual working memory (WM) representations declines monotonically with increasing storage load. Two distinct models of WM capacity predict different shapes for this precision-by-set-size function. Flexible-resource models, which assert a continuous allocation of resources across an unlimited number of items, predict a monotonic decline in precision across a large range of set sizes. Conversely, discrete-resource models, which assert a relatively small item limit for WM storage, predict that precision will plateau once this item limit is exceeded. Recent work has demonstrated such a plateau in mnemonic precision. Moreover, the set size at which mnemonic precision reached asymptote has been strongly predicted by estimated item limits in WM. In the present work, we extend this evidence in three ways. First, we show that this empirical pattern generalizes beyond orientation memory to color memory. Second, we rule out encoding limits as the source of discrete limits by demonstrating equivalent performance across simultaneous and sequential presentations of the memoranda. Finally, we demonstrate that the analytic approach commonly used to estimate precision yields flawed parameter estimates when the range of stimulus space is narrowed (e.g., a 180° rather than a 360° orientation space) and typical numbers of observations are collected. Such errors in parameter estimation reconcile an apparent conflict between our findings and others based on different stimuli. These findings provide further support for discrete-resource models of WM capacity.
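
    The "analytic approach commonly used to estimate precision" is mixture modeling of recall errors; one standard form (our notation, not taken from this abstract) treats each report as either a noisy memory of the target or a random guess:

```latex
p(\hat{\theta}) \;=\; (1 - g)\, f_{\kappa}(\hat{\theta} - \theta) \;+\; g \cdot \frac{1}{2\pi}
```

    where \theta is the stored value, \hat{\theta} the reported value, g the guess rate, and f_{\kappa} a von Mises density with concentration \kappa; precision is typically summarized as the inverse of the fitted circular standard deviation. Narrowing the stimulus range (e.g., 180° instead of 360°) changes the support of the uniform guessing component, which is one route to the flawed parameter estimates the authors describe.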

  1. Achieving the Complete-Basis Limit in Large Molecular Clusters: Computationally Efficient Procedures to Eliminate Basis-Set Superposition Error

    NASA Astrophysics Data System (ADS)

    Richard, Ryan M.; Herbert, John M.

    2013-06-01

    Previous electronic structure studies that rely on fragmentation have been primarily interested in those methods' ability to replicate the supersystem energy (or a related energy difference), without regard to the ability of those supersystem results to replicate experiment or high-accuracy benchmarks. Here we focus on replicating accurate ab initio benchmarks that are suitable for comparison to experimental data. In doing this it becomes imperative that we correct our methods for basis-set superposition error (BSSE) in a computationally feasible way. This criterion leads us to develop a new method for BSSE correction, which we term the many-body counterpoise correction, or MBn for short. MBn is truncated at order n, in much the same manner as a normal many-body expansion, leading to a decrease in computational time. Furthermore, its formulation in terms of fragments makes it especially suitable for use with pre-existing fragment codes. A secondary focus of this study is assessing fragment methods' ability to extrapolate to the complete basis set (CBS) limit as well as to compute approximate triples corrections. Ultimately, from analysis of (H_2O)_6 and (H_2O)_{10}F^- systems, we conclude that with large enough basis sets (triple- or quadruple-zeta) fragment-based methods can replicate high-level benchmarks in a fraction of the time.
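
    For reference, MBn generalizes the standard two-body (Boys-Bernardi) counterpoise correction, which for a dimer AB evaluates every term in the full dimer basis (superscripts denote the basis, subscripts the fragment; the notation here is ours, not the paper's):

```latex
\Delta E_{\mathrm{int}}^{\mathrm{CP}} \;=\; E_{AB}^{AB} \;-\; E_{A}^{AB} \;-\; E_{B}^{AB}
```

    Truncating the analogous many-body counterpoise expansion at order n avoids computing every fragment in the full supersystem basis, which is what makes MBn computationally feasible for large clusters.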

  2. Culture in the Classroom

    ERIC Educational Resources Information Center

    Medin, Douglas L.; Bang, Megan

    2014-01-01

    Culture plays a large but often unnoticeable role in what we teach and how we teach children. We are a country of immense diversity, but in classrooms the dominant European-American culture has become the language of learning.

  3. Possible calcium centers for hydrogen storage applications: An accurate many-body study by AFQMC calculations with large basis sets

    NASA Astrophysics Data System (ADS)

    Purwanto, Wirawan; Krakauer, Henry; Zhang, Shiwei; Virgus, Yudistira

    2011-03-01

    Weak H2 physisorption energies present a significant challenge to first-principles theoretical modeling and prediction of materials for H storage. There has been controversy regarding the accuracy of DFT on systems involving Ca cations. We use the auxiliary-field quantum Monte Carlo (AFQMC) method to accurately predict the binding energy of Ca+ · 4H2. AFQMC scales as N_basis^3 and has demonstrated accuracy similar to or better than the gold-standard coupled cluster CCSD(T) method. We apply a modified Cholesky decomposition to achieve efficient Hubbard-Stratonovich transformation in AFQMC at large basis sizes. We employ the largest correlation-consistent basis sets available, up to Ca/cc-pCV5Z, to extrapolate to the complete basis limit. The calculated potential energy curve exhibits binding with a double-well structure. Supported by DOE and NSF. Calculations were performed at OLCF Jaguar and CPD.

  4. Navigating the Problem Space of Academia: Exploring Processes of Course Design and Classroom Teaching in Postsecondary Settings. WCER Working Paper No. 2014-1

    ERIC Educational Resources Information Center

    Hora, Matthew T.

    2014-01-01

    Policymakers and educators alike increasingly focus on faculty adoption of interactive teaching techniques as a way to improve undergraduate education. Yet little empirical research exists that examines the processes whereby faculty make decisions about curriculum design and classroom teaching in real-world situations. In this study, I use the idea…

  5. A Case Study of Literacy Instruction Delivered to Kindergarten Struggling Readers within the Response to Intervention Model in Three Classroom Settings

    ERIC Educational Resources Information Center

    Zelenka, Valerie Lynn

    2010-01-01

    A portion of the 2004 reauthorization of the Individuals with Disabilities Education Act (IDEA, 2004), Response to Intervention (RtI), aims to prevent unnecessary student placement in special education. The intent of RtI is to provide all students with effective classroom instruction first and afford low-performing students with increasingly…

  6. Initial Development and Piloting of a Learning-Based, Classroom Assessment and Consultation System: New Perspectives on the Rhetoric of Improving Instruction in Higher Education Settings.

    ERIC Educational Resources Information Center

    Loup, Karen S.; And Others

    Results are reported of three years of research and development, piloting, and extended field testing of a classroom-based assessment and professional consultation system used to assess important teaching and learning variables in higher education contexts. Of particular interest is the focus of the total system on enhancing learning and newer…

  7. The Impact of Brief Teacher Training on Classroom Management and Child Behavior in At-Risk Preschool Settings: Mediators and Treatment Utility

    ERIC Educational Resources Information Center

    Snyder, James; Low, Sabina; Schultz, Tara; Barner, Stacy; Moreno, Desirae; Garst, Meladee; Leiker, Ryan; Swink, Nathan; Schrepferman, Lynn

    2011-01-01

    Teachers from fourteen classrooms were randomly assigned to an adaptation of Incredible Years (IY) teacher training or to teacher training-as-usual. Observations were made of the behavior of 136 target preschool boys and girls nominated by teachers as having many or few conduct problems. Peer and teacher behavior were observed at baseline and post…

  8. "Designing Instrument for Science Classroom Learning Environment in Francophone Minority Settings: Accounting for Voiced Concerns among Teachers and Immigrant/Refugee Students"

    ERIC Educational Resources Information Center

    Bolivar, Bathélemy

    2015-01-01

    The three-phase "Instrument for Minority Immigrant Science Learning Environment" (I_MISLE), an 8-scale, 32-item instrument (see Appendix I), when completed by teachers provides an accurate description of existing conditions in classrooms in which immigrant and refugee students are situated. Through the completion of the instrument…

  9. QSAR prediction of estrogen activity for a large set of diverse chemicals under the guidance of OECD principles.

    PubMed

    Liu, Huanxiang; Papa, Ester; Gramatica, Paola

    2006-11-01

    A large number of environmental chemicals, known as endocrine-disrupting chemicals, are suspected of disrupting endocrine functions by mimicking or antagonizing natural hormones, and such chemicals may pose a serious threat to the health of humans and wildlife. They are thought to act through a variety of mechanisms, mainly estrogen-receptor-mediated mechanisms of toxicity. However, it is practically impossible to perform thorough toxicological tests on all potential xenoestrogens, so the quantitative structure-activity relationship (QSAR) provides a promising method for estimating a compound's estrogenic activity. Here, QSAR models of the estrogen receptor binding affinity of a large data set of heterogeneous chemicals have been built using theoretical molecular descriptors, giving full consideration, during model construction and assessment, to the new OECD principles for the regulatory acceptability of QSARs. An unambiguous multiple linear regression (MLR) algorithm was used to build the models, and model predictive ability was validated by both internal and external validation. The applicability domain was checked by the leverage approach to verify prediction reliability. The results obtained through several validation paths indicate that the proposed QSAR model is robust and satisfactory, and can provide a feasible and practical tool for the rapid screening of the estrogen activity of organic compounds.

  10. The ambient dose equivalent at flight altitudes: a fit to a large set of data using a Bayesian approach.

    PubMed

    Wissmann, F; Reginatto, M; Möller, T

    2010-09-01

    The problem of finding a simple, generally applicable description of worldwide measured ambient dose equivalent rates at aviation altitudes between 8 and 12 km is difficult to solve due to the large variety of functional forms and parametrisations that are possible. We present an approach that uses Bayesian statistics and Monte Carlo methods to fit mathematical models to a large set of data and to compare the different models. About 2500 data points measured in the periods 1997-1999 and 2003-2006 were used. Since the data cover wide ranges of barometric altitude, vertical cut-off rigidity and phases in the solar cycle 23, we developed functions which depend on these three variables. Whereas the dependence on the vertical cut-off rigidity is described by an exponential, the dependences on barometric altitude and solar activity may be approximated by linear functions in the ranges under consideration. Therefore, a simple Taylor expansion was used to define different models and to investigate the relevance of the different expansion coefficients. With the method presented here, it is possible to obtain probability distributions for each expansion coefficient and thus to extract reliable uncertainties even for the dose rate evaluated. The resulting function agrees well with new measurements made at fixed geographic positions and during long haul flights covering a wide range of latitudes.
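
A minimal version of such a Bayesian fit can be sketched with a random-walk Metropolis sampler. The functional form follows the abstract (linear in altitude and solar activity, exponential in cut-off rigidity), but the coefficients, uncertainties, and data below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "measurements" of ambient dose equivalent rate: linear in
# barometric altitude h and solar phase s, exponential in vertical
# cut-off rigidity rc (all numbers invented)
n = 400
h = rng.uniform(8, 12, n)     # altitude, km
rc = rng.uniform(0, 15, n)    # vertical cut-off rigidity, GV
s = rng.uniform(0, 1, n)      # normalized solar-activity phase
sigma = 0.05                  # assumed measurement uncertainty
d = (1.0 + 0.5 * (h - 10) - 0.4 * s) * np.exp(-0.08 * rc)
d += rng.normal(scale=sigma, size=n)

def log_post(theta):
    """Log-posterior with flat priors: just the Gaussian log-likelihood."""
    a0, a1, a2, b = theta
    model = (a0 + a1 * (h - 10) + a2 * s) * np.exp(-b * rc)
    return -0.5 * np.sum((d - model) ** 2) / sigma**2

# Random-walk Metropolis: draws a posterior distribution for each coefficient
theta = np.array([1.0, 0.0, 0.0, 0.05])
lp = log_post(theta)
samples = []
for i in range(20000):
    prop = theta + rng.normal(scale=[0.01, 0.01, 0.02, 0.002])
    lpp = log_post(prop)
    if np.log(rng.uniform()) < lpp - lp:   # Metropolis acceptance rule
        theta, lp = prop, lpp
    if i > 5000:                           # discard burn-in
        samples.append(theta.copy())
samples = np.asarray(samples)
mean, sd = samples.mean(axis=0), samples.std(axis=0)
```

The per-coefficient sample means and standard deviations play the role of the reliable uncertainties described in the abstract, propagating directly into an uncertainty on the evaluated dose rate.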

  11. My Classroom Physical Activity Pyramid: A Tool for Integrating Movement into the Classroom

    ERIC Educational Resources Information Center

    Orlowski, Marietta; Lorson, Kevin; Lyon, Anna; Minoughan, Susan

    2013-01-01

    The classroom teacher is a critical team member of a comprehensive school physical activity program and an activity-friendly school environment. Students spend more time in the classroom than in any other school setting or environment. Classrooms are busy places, and classroom teachers must make decisions about how to make the best use of their…

  12. WebViz: A Web-based Collaborative Interactive Visualization System for Large-Scale Data Sets

    NASA Astrophysics Data System (ADS)

    Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.

    2010-12-01

    WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota’s Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built upon over the last 3 1/2 years. The motivation behind WebViz lies primarily with the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to the market, including the use of general-purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data ‘on the fly’, wherever they may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE’s custom hierarchical volume rendering software provides high-resolution visualizations on the order of 15 million pixels and has been employed for visualizing data primarily from simulations in astrophysics to geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web- and javascript-enabled cell phones. Features in the current version include the ability for users to (1) securely log in, (2) launch multiple visualizations, (3) conduct collaborative visualization sessions, (4) delegate control aspects of a visualization to others, and (5) engage in collaborative chats with other users within the user interface.

  13. Hydraulic behavior of two areas of the Floridan aquifer system characterized by complex hydrogeologic settings and large groundwater withdrawals

    SciTech Connect

    Maslia, M.L.

    1993-03-01

    Two areas of the Floridan aquifer system (FAS) that are characterized by complex hydrogeologic settings and exceedingly large ground-water withdrawals are the Dougherty Plain area of southwest GA and the Glynn County area of southeast GA. In southwest GA, large-scale withdrawals of ground water for agricultural and livestock irrigation amounted to about 148 million gallons per day (mg/d) during 1990. Large-scale pumping in Glynn County, primarily used for industrial purposes and centered in the City of Brunswick, amounted to about 88 mg/d during 1990. In southwest GA, the FAS consists primarily of the Ocala Limestone (OL) of late Eocene age. Confining the aquifer from above is a residual layer (50 ft thick) of sand and clay containing silicified boulders, which is derived from the chemical weathering of the OL. This area is characterized by karst topography marked by numerous depressions and sinkholes, high transmissivity (generally greater than 50,000 feet squared per day), and significant hydraulic connections to overlying streams and lakes. These characteristics, along with the seasonal nature of pumping and mean annual recharge of about 10 inches, have prevented permanent, long-term water-level declines. In the Glynn County area, the FAS can be more than 2,600 ft thick, consisting of a sequence of calcareous and dolomitic rocks that are Late Cretaceous to early Miocene in age. The aquifer system is confined above by clastic rocks of Middle Miocene age having an average thickness of 400 ft. This area is characterized by post-depositional tectonic modification of the subsurface as opposed to simple karst development, thick confinement of the aquifer system, and significant amounts of vertical leakage of water from below. These characteristics and heavy, long-term pumping from the Upper Floridan aquifer (UFA) have caused a broad, shallow cone of depression to develop and the upward migration of saltwater to contaminate the freshwater zones of the UFA.

  14. Clickenomics: Using a Classroom Response System to Increase Student Engagement in a Large-Enrollment Principles of Economics Course

    ERIC Educational Resources Information Center

    Salemi, Michael K.

    2009-01-01

    One of the most important challenges facing college instructors of economics is helping students engage. Engagement is particularly important in a large-enrollment Principles of Economics course, where it can help students achieve a long-lived understanding of how economists use basic economic ideas to look at the world. The author reports how…

  15. Teaching Cell Biology in the Large-Enrollment Classroom: Methods to Promote Analytical Thinking and Assessment of Their Effectiveness

    ERIC Educational Resources Information Center

    Kitchen, Elizabeth; Bell, John D.; Reeve, Suzanne; Sudweeks, Richard R.; Bradshaw, William S.

    2003-01-01

    A large-enrollment, undergraduate cellular biology lecture course is described whose primary goal is to help students acquire skill in the interpretation of experimental data. The premise is that this kind of analytical reasoning is not intuitive for most people and, in the absence of hands-on laboratory experience, will not readily develop unless…

  16. Using Technology To Implement Active Learning in Large Classes. Technical Report.

    ERIC Educational Resources Information Center

    Gerace, William J.; Dufresne, Robert J.; Leonard, William J.

    An emerging technology, classroom communication systems (CCSs), has the potential to transform the way we teach science in large-lecture settings. CCSs can serve as catalysts for creating a more interactive, student-centered classroom in the lecture hall, thereby allowing students to become more actively involved in constructing and using…

  17. Spatial Fingerprints of Community Structure in Human Interaction Network for an Extensive Set of Large-Scale Regions

    PubMed Central

    Kallus, Zsófia; Barankai, Norbert; Szüle, János; Vattay, Gábor

    2015-01-01

    Human interaction networks inferred from country-wide telephone activity recordings were recently used to redraw political maps by projecting their topological partitions into geographical space. The results showed remarkable spatial cohesiveness of the network communities and a significant overlap between the redrawn and the administrative borders. Here we present a similar analysis based on one of the most popular online social networks represented by the ties between more than 5.8 million of its geo-located users. The worldwide coverage of their measured activity allowed us to analyze the large-scale regional subgraphs of entire continents and an extensive set of examples for single countries. We present results for North and South America, Europe and Asia. In our analysis we used the well-established method of modularity clustering after an aggregation of the individual links into a weighted graph connecting equal-area geographical pixels. Our results show fingerprints of both of the opposing forces of dividing local conflicts and of uniting cross-cultural trends of globalization. PMID:25993329
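
The modularity score that underlies the clustering step can be written down directly. The toy weighted adjacency below stands in for the aggregated graph of equal-area geographical pixels and is purely illustrative:

```python
import numpy as np

# Toy weighted adjacency for six "geographical pixels": two dense groups
# bridged by weak ties (illustrative stand-in for the aggregated graph)
A = np.array([
    [0, 5, 4, 0, 0, 1],
    [5, 0, 6, 0, 0, 0],
    [4, 6, 0, 1, 0, 0],
    [0, 0, 1, 0, 7, 3],
    [0, 0, 0, 7, 0, 4],
    [1, 0, 0, 3, 4, 0],
], dtype=float)

def modularity(A, labels):
    """Newman's weighted modularity:
    Q = (1/2m) * sum_ij (A_ij - k_i*k_j / 2m) * delta(c_i, c_j)."""
    k = A.sum(axis=1)   # weighted degrees
    two_m = A.sum()     # twice the total edge weight
    same = labels[:, None] == labels[None, :]
    return float(((A - np.outer(k, k) / two_m) * same).sum() / two_m)

cohesive = np.array([0, 0, 0, 1, 1, 1])   # the two apparent communities
arbitrary = np.array([0, 1, 0, 1, 0, 1])  # an arbitrary split for comparison
q_cohesive = modularity(A, cohesive)
q_arbitrary = modularity(A, arbitrary)
```

Modularity clustering searches over partitions to maximize Q; here the spatially cohesive split scores far higher than the arbitrary one, which is exactly the effect the abstract reports at continental scale.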

  18. Automatic detection of rate change in large data sets with an unsupervised approach: the case of influenza viruses.

    PubMed

    Labonté, Kasandra; Aris-Brosou, Stéphane

    2016-04-01

    Influenza viruses evolve at such a high rate that vaccine recommendations need to be updated, though not on a regular basis. This observation suggests that the rate of evolution of these viruses is not constant through time, which raises the questions of when such rate changes occur and whether they do so independently of the host in which they circulate and (or) of their subtype. To address these outstanding questions, we introduce a novel heuristic, Mclust*, based on a two-tier clustering approach in a phylogenetic context to estimate (i) absolute rates of evolution and (ii) when rate change occurs. We employ the novel approach to compare the two influenza surface proteins, hemagglutinin and neuraminidase, that circulated in avian, human, and swine hosts between 1960 and 2014 in two subtypes: H3N2 and H1N1. We show that the algorithm performs well in most conditions, accounts for phylogenetic uncertainty by means of bootstrapping, and scales up to analyze very large data sets. Our results show that our approach is robust to the time-dependent artifact of rate estimation, and confirm pervasive punctuated evolution across hosts and subtypes. As such, the novel approach can potentially detect when vaccine composition needs to be updated. PMID:26966881

  19. Wide-range photoabsorption cross-sections of simple metals: large basis-set OPW calculations for sodium.

    PubMed

    Kitamura, Hikaru

    2013-02-13

    Photoabsorption cross-sections of simple metals are formulated through a solid-state band theory based on the orthogonalized-plane-wave (OPW) method in Slater's local-exchange approximation, where interband transitions of core and conduction electrons are evaluated up to the soft x-ray regime by using large basis sets. The photoabsorption cross-sections of a sodium crystal are computed for a wide photon energy range from 3 to 1800 eV. It is found that the numerical results reproduce the existing x-ray databases fairly well for energies above the L(2,3)-edge (31 eV), verifying a consistency between solid-state and atomic models for inner-shell photoabsorption; additional oscillatory structures in the present spectra manifest solid-state effects. Our computed results in the vacuum ultraviolet regime (6-30 eV) are also in better agreement with experimental data compared to earlier theories, although some discrepancies remain in the range of 20-30 eV. The influence of the core eigenvalues on the absorption spectra is examined. PMID:23334229

  20. Identifying Cognate Binding Pairs among a Large Set of Paralogs: The Case of PE/PPE Proteins of Mycobacterium tuberculosis

    PubMed Central

    Riley, Robert; Pellegrini, Matteo; Eisenberg, David

    2008-01-01

    We consider the problem of how to detect cognate pairs of proteins that bind when each belongs to a large family of paralogs. To illustrate the problem, we have undertaken a genomewide analysis of interactions of members of the PE and PPE protein families of Mycobacterium tuberculosis. Our computational method uses structural information, operon organization, and protein coevolution to infer the interaction of PE and PPE proteins. Some 289 PE/PPE complexes were predicted out of a possible 5,590 PE/PPE pairs genomewide. Thirty-five of these predicted complexes were also found to have correlated mRNA expression, providing additional evidence for these interactions. We show that our method is applicable to other protein families, by analyzing interactions of the Esx family of proteins. Our resulting set of predictions is a starting point for genomewide experimental interaction screens of the PE and PPE families, and our method may be generally useful for detecting interactions of proteins within families having many paralogs. PMID:18787688
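
One of the evidence layers, correlated mRNA expression, can be sketched as a simple Pearson-correlation filter over predicted pairs; the expression profiles and support threshold below are hypothetical, not the authors' actual criteria:

```python
import numpy as np

def expression_support(expr_a, expr_b, threshold=0.8):
    """Return the Pearson correlation of two mRNA expression profiles and
    whether it passes a (hypothetical) support threshold for a predicted
    cognate pair."""
    r = float(np.corrcoef(expr_a, expr_b)[0, 1])
    return r, r >= threshold

# Hypothetical expression profiles across eight growth conditions
pe = np.array([1.0, 2.1, 3.2, 2.0, 0.9, 1.5, 2.6, 3.1])
ppe_cognate = pe * 1.8 + np.array([0.05, -0.02, 0.01, 0.03,
                                   -0.04, 0.02, 0.0, -0.01])
ppe_unrelated = np.array([2.0, 0.5, 1.9, 0.4, 2.1, 0.6, 2.0, 0.5])

r1, supported1 = expression_support(pe, ppe_cognate)
r2, supported2 = expression_support(pe, ppe_unrelated)
```

A filter of this kind would only add supporting evidence on top of the structural, operon, and coevolution signals; correlation alone does not establish a physical interaction.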

  1. Spatial fingerprints of community structure in human interaction network for an extensive set of large-scale regions.

    PubMed

    Kallus, Zsófia; Barankai, Norbert; Szüle, János; Vattay, Gábor

    2015-01-01

    Human interaction networks inferred from country-wide telephone activity recordings were recently used to redraw political maps by projecting their topological partitions into geographical space. The results showed remarkable spatial cohesiveness of the network communities and a significant overlap between the redrawn and the administrative borders. Here we present a similar analysis based on one of the most popular online social networks represented by the ties between more than 5.8 million of its geo-located users. The worldwide coverage of their measured activity allowed us to analyze the large-scale regional subgraphs of entire continents and an extensive set of examples for single countries. We present results for North and South America, Europe and Asia. In our analysis we used the well-established method of modularity clustering after an aggregation of the individual links into a weighted graph connecting equal-area geographical pixels. Our results show fingerprints of both of the opposing forces of dividing local conflicts and of uniting cross-cultural trends of globalization. PMID:25993329

  2. Early Miocene Kirka-Phrigian caldera, western Anatolia - an example of large volume silicic magma generation in extensional setting

    NASA Astrophysics Data System (ADS)

    Seghedi, Ioan; Helvacı, Cahit

    2014-05-01

    Large rhyolitic ignimbrite occurrences are closely connected to the Early Miocene initiation of extensional processes in central-west Anatolia along the Taşvanlı-Afyon zones. Field correlations and petrographical, geochemical and geochronological data lead to a substantial reinterpretation of the ignimbrite surrounding the Kırka area, known for its world-class borate deposits, as representing the climactic event of a caldera collapse, unknown up to now and newly named the "Kırka-Phrigian caldera". The caldera, which is roughly oval (24 km x 15 km) in shape and one of the largest in Turkey, is inferred to have formed in a single-stage collapse event at ~19 Ma that generated huge volumes of extracaldera outflow ignimbrites. Transtensive/distensive tectonic stresses since ~25 Ma resulted in the NNW-SSE elongation of the magma chamber and influenced the roughly elliptical shape of the subsided block (caldera floor) belonging to the apex of the Eskişehir-Afyon-Isparta volcanic area. Intracaldera post-collapse sedimentation and volcanism (at ~18 Ma) were controlled through subsidence-related faults, with generation of a series of volcanic structures (mainly domes) showing a large compositional range from saturated silicic rhyolites and crystal-rich trachytes to undersaturated lamproites. Such a volcanic rock association is typical of lithospheric extension. In this scenario, enriched mantle components within the subcontinental lithospheric mantle begin to melt via decompression melting during the initiation of extension. Interaction of these melts with crustal rocks, fractionation processes and crustal anatexis driven by the heat contained in the ascending mantle melts produced the silicic compositions in a large crustal reservoir. Such silicic melts generated the initial eruptions of the Kırka-Phrigian caldera ignimbrites. The rock volume and geochemical evidence suggest that the silicic volcanic rocks come from a long-lived magma chamber that evolved episodically; after caldera

  3. Creating a Classroom Library.

    ERIC Educational Resources Information Center

    Hepler, Susan; And Others

    1992-01-01

    Presents ideas for creating classroom libraries, noting how to set up a library (create a space, build and organize the collection, and set rules), where to find books at bargain prices (e.g., garage sales, libraries, book clubs, and grants), basic books to include, and information on authors and illustrators. (SM)

  4. Your Outdoor Classroom

    ERIC Educational Resources Information Center

    Hinman, Laurie

    2005-01-01

    Physical education is still taught in outdoor settings in many warmer climates of the United States. Even when indoor facilities are available, physical education may be moved outside because of other curricular needs or facility issues. How can physical educators make the outdoor setting seem more like an indoor classroom? Outdoor teaching…

  5. Development of a large-sample watershed-scale hydrometeorological data set for the contiguous USA: data set characteristics and assessment of regional variability in hydrologic model performance

    NASA Astrophysics Data System (ADS)

    Newman, A. J.; Clark, M. P.; Sampson, K.; Wood, A.; Hay, L. E.; Bock, A.; Viger, R. J.; Blodgett, D.; Brekke, L.; Arnold, J. R.; Hopson, T.; Duan, Q.

    2015-01-01

    We present a community data set of daily forcing and hydrologic response data for 671 small- to medium-sized basins across the contiguous United States (median basin size of 336 km2) that spans a very wide range of hydroclimatic conditions. Area-averaged forcing data for the period 1980-2010 was generated for three basin spatial configurations - basin mean, hydrologic response units (HRUs) and elevation bands - by mapping daily, gridded meteorological data sets to the subbasin (Daymet) and basin polygons (Daymet, Maurer and NLDAS). Daily streamflow data was compiled from the United States Geological Survey National Water Information System. The focus of this paper is to (1) present the data set for community use and (2) provide a model performance benchmark using the coupled Snow-17 snow model and the Sacramento Soil Moisture Accounting Model, calibrated using the shuffled complex evolution global optimization routine. After optimization minimizing daily root mean squared error, 90% of the basins have Nash-Sutcliffe efficiency scores ≥0.55 for the calibration period and 34% ≥ 0.8. This benchmark provides a reference level of hydrologic model performance for a commonly used model and calibration system, and highlights some regional variations in model performance. For example, basins with a more pronounced seasonal cycle generally have a negative low flow bias, while basins with a smaller seasonal cycle have a positive low flow bias. Finally, we find that data points with extreme error (defined as individual days with a high fraction of total error) are more common in arid basins with limited snow and, for a given aridity, fewer extreme error days are present as the basin snow water equivalent increases.
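
The benchmark metric, Nash-Sutcliffe efficiency (NSE), is simple to compute: NSE = 1 - SSE/Var(obs), where 1.0 is a perfect simulation and 0.0 means no more skill than predicting the observed mean. A sketch with toy streamflow values:

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1.0 is a perfect fit; 0.0 means no more skill than the observed mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])     # toy daily streamflow
perfect = nse(obs, obs)                       # 1.0: identical simulation
mean_only = nse(np.full(5, obs.mean()), obs)  # 0.0: constant-mean simulation
```

A calibrated basin with NSE >= 0.8, as 34% of the benchmark basins achieve, captures most of the observed daily variance; negative NSE would mean the model is worse than the climatological mean.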

  6. Classroom Network Technology as a Support for Systemic Mathematics Reform: The Effects of TI MathForward on Student Achievement in a Large, Diverse District

    ERIC Educational Resources Information Center

    Penuel, William; Singleton, Corinne; Roschelle, Jeremy

    2011-01-01

    Low-cost, portable classroom network technologies have shown great promise in recent years for improving teaching and learning in mathematics. This paper explores the impacts on student learning in mathematics when a program to introduce network technologies into mathematics classrooms is integrated into a systemic reform initiative at the…

  7. A Study of Classroom Response System Clickers: Increasing Student Engagement and Performance in a Large Undergraduate Lecture Class on Architectural Research

    ERIC Educational Resources Information Center

    Bachman, Leonard; Bachman, Christine

    2011-01-01

    This study examines the effectiveness of a classroom response system (CRS) and architecture students' perceptions of real-time feedback. CRS is designed to increase active engagement of students by their responses to a question or prompt via wireless keypads. Feedback is immediately portrayed on a classroom projector for discussion. The authors…

  8. Empirical Mining of Large Data Sets Already Helps to Solve Practical Ecological Problems; A Panoply of Working Examples (Invited)

    NASA Astrophysics Data System (ADS)

    Hargrove, W. W.; Hoffman, F. M.; Kumar, J.; Spruce, J.; Norman, S. P.

    2013-12-01

    Here we present diverse examples where empirical mining and statistical analysis of large data sets have already been shown to be useful for a wide variety of practical decision-making problems within the realm of large-scale ecology. Because a full understanding and appreciation of particular ecological phenomena are possible only after hypothesis-directed research regarding the existence and nature of that process, some ecologists may feel that purely empirical data harvesting represents a less-than-satisfactory approach. Restricting ourselves exclusively to process-driven approaches, however, may actually slow progress, particularly for more complex or subtle ecological processes. We may not be able to afford the delays caused by such directed approaches. Rather than attempting to formulate and ask every relevant question correctly, empirical methods allow trends, relationships and associations to emerge freely from the data themselves, unencumbered by a priori theories, ideas and prejudices that have been imposed upon them. Although they cannot directly demonstrate causality, empirical methods can be extremely efficient at uncovering strong correlations with intermediate "linking" variables. In practice, these correlative structures and linking variables, once identified, may provide sufficient predictive power to be useful themselves. Such correlation "shadows" of causation can be harnessed by, e.g., Bayesian Belief Nets, which bias ecological management decisions, made with incomplete information, toward favorable outcomes. Empirical data harvesting also generates a myriad of testable hypotheses regarding processes, some of which may even be correct. Statistical regionalizations based on quantitative multivariate similarity have lent insights into carbon eddy-flux direction and magnitude, wildfire biophysical conditions, phenological ecoregions useful for vegetation type mapping and monitoring, forest disease risk maps (e.g., sudden oak

  9. Teaching cell biology in the large-enrollment classroom: methods to promote analytical thinking and assessment of their effectiveness.

    PubMed

    Kitchen, Elizabeth; Bell, John D; Reeve, Suzanne; Sudweeks, Richard R; Bradshaw, William S

    2003-01-01

    A large-enrollment, undergraduate cellular biology lecture course is described whose primary goal is to help students acquire skill in the interpretation of experimental data. The premise is that this kind of analytical reasoning is not intuitive for most people and, in the absence of hands-on laboratory experience, will not readily develop unless instructional methods and examinations specifically designed to foster it are employed. Promoting scientific thinking forces changes in the roles of both teacher and student. We describe didactic strategies that include directed practice of data analysis in a workshop format, active learning through verbal and written communication, visualization of abstractions diagrammatically, and the use of ancillary small-group mentoring sessions with faculty. The implications for a teacher in reducing the breadth and depth of coverage, becoming coach instead of lecturer, and helping students to diagnose cognitive weaknesses are discussed. In order to determine the efficacy of these strategies, we have carefully monitored student performance and have demonstrated a large gain in a pre- and posttest comparison of scores on identical problems, improved test scores on several successive midterm examinations when the statistical analysis accounts for the relative difficulty of the problems, and higher scores in comparison to students in a control course whose objective was information transfer, not acquisition of reasoning skills. A novel analytical index (student mobility profile) is described that demonstrates that this improvement was not random, but a systematic outcome of the teaching/learning strategies employed. An assessment of attitudes showed that, in spite of finding it difficult, students endorse this approach to learning, but also favor curricular changes that would introduce an analytical emphasis earlier in their training.

  10. Teaching Cell Biology in the Large-Enrollment Classroom: Methods to Promote Analytical Thinking and Assessment of Their Effectiveness

    PubMed Central

    Kitchen, Elizabeth; Bell, John D.; Reeve, Suzanne; Sudweeks, Richard R.; Bradshaw, William S.

    2003-01-01

    A large-enrollment, undergraduate cellular biology lecture course is described whose primary goal is to help students acquire skill in the interpretation of experimental data. The premise is that this kind of analytical reasoning is not intuitive for most people and, in the absence of hands-on laboratory experience, will not readily develop unless instructional methods and examinations specifically designed to foster it are employed. Promoting scientific thinking forces changes in the roles of both teacher and student. We describe didactic strategies that include directed practice of data analysis in a workshop format, active learning through verbal and written communication, visualization of abstractions diagrammatically, and the use of ancillary small-group mentoring sessions with faculty. The implications for a teacher in reducing the breadth and depth of coverage, becoming coach instead of lecturer, and helping students to diagnose cognitive weaknesses are discussed. In order to determine the efficacy of these strategies, we have carefully monitored student performance and have demonstrated a large gain in a pre- and posttest comparison of scores on identical problems, improved test scores on several successive midterm examinations when the statistical analysis accounts for the relative difficulty of the problems, and higher scores in comparison to students in a control course whose objective was information transfer, not acquisition of reasoning skills. A novel analytical index (student mobility profile) is described that demonstrates that this improvement was not random, but a systematic outcome of the teaching/learning strategies employed. An assessment of attitudes showed that, in spite of finding it difficult, students endorse this approach to learning, but also favor curricular changes that would introduce an analytical emphasis earlier in their training. PMID:14506506

  11. A Large-Scale Inquiry-Based Astronomy Intervention Project: Impact on Students' Content Knowledge Performance and Views of their High School Science Classroom

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena; Deehan, James

    2015-08-01

    In this paper, we present the results from a study of the impact on students involved in a large-scale inquiry-based astronomical high school education intervention in Australia. Students in this intervention were led through an educational design allowing them to undertake an investigative approach to understanding the lifecycle of stars, more aligned with the 'ideal' picture of school science. Through the use of two instruments, one focused on content knowledge gains and the other on student views of school science, we explore the impact of this design. Overall, students made moderate content knowledge gains, although these gains were heavily dependent on the individual teacher, the number of times a teacher implemented the intervention, and the depth to which an individual teacher went with the provided materials. In terms of students' views, there were significant global changes in their views of their experience of the science classroom. However, some areas showed no change or slightly negative changes, some of which were expected and some not. From these results, we comment on the necessity of sustained long-period implementations rather than single interventions, the requirement for similarly sustained professional development, and the importance of monitoring the impact of inquiry-based implementations. This is especially important as inquiry-based approaches to science are required by many new curriculum reforms, most notably, in this context, the new Australian curriculum currently being rolled out.

  12. Classroom Management in Diverse Classrooms

    ERIC Educational Resources Information Center

    Milner, H. Richard, IV; Tenore, F. Blake

    2010-01-01

    Classroom management continues to be a serious concern for teachers and especially in urban and diverse learning environments. The authors present the culturally responsive classroom management practices of two teachers from an urban and diverse middle school to extend the construct, culturally responsive classroom management. The principles that…

  13. Classroom Dimensions and Classroom Types.

    ERIC Educational Resources Information Center

    Kendall, Arthur J.; Solomon, Daniel

    Although classroom "openness" has been much discussed in recent years, there has been little effort to investigate to what degree this openness occurs within a general sample of classrooms. The purpose of this study is to identify significant attributes of classroom activity and organization relevant to the concepts of "traditional" and "open" and…

  14. How do you assign persistent identifiers to extracts from large, complex, dynamic data sets that underpin scholarly publications?

    NASA Astrophysics Data System (ADS)

    Wyborn, Lesley; Car, Nicholas; Evans, Benjamin; Klump, Jens

    2016-04-01

    Persistent identifiers in the form of a Digital Object Identifier (DOI) are becoming more mainstream, assigned at both the collection and dataset level. For static datasets, this is a relatively straightforward matter. However, many new data collections are dynamic, with new data being appended, models and derivative products being revised with new data, or the data itself revised as processing methods are improved. Further, because data collections are becoming accessible as services, researchers can log in and dynamically create user-defined subsets for specific research projects; they can also easily mix and match data from multiple collections, each of which can have a complex history. Inevitably, extracts from such dynamic data sets underpin scholarly publications, and this presents new challenges. The National Computational Infrastructure (NCI) has been experiencing these issues and making progress towards addressing them. The NCI is a large node of the Research Data Services (RDS) initiative of the Australian Government's research infrastructure, which currently makes available over 10 PBytes of priority research collections, ranging from geosciences, geophysics, environment, and climate, through to astronomy, bioinformatics, and social sciences. Data are replicated to, or are produced at, NCI and then processed there to higher-level data products or directly analysed. Individual datasets range from multi-petabyte computational models and large-volume raster arrays down to gigabyte-size, ultra-high-resolution datasets. To facilitate access, maximise reuse and enable integration across the disciplines, datasets have been organized on a platform called the National Environmental Research Data Interoperability Platform (NERDIP). Combined, the NERDIP data collections form a rich and diverse asset for researchers: their co-location and standardization optimises the value of existing data, and forms a new resource to underpin data-intensive science. New publication
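
One possible scheme for minting reproducible identifiers for user-defined extracts of a dynamic collection (purely illustrative, and not NCI's actual practice) is to combine the parent collection's identifier with hashes of the normalized query and the returned content:

```python
import hashlib
import json

def extract_identifier(collection_id, query, data_bytes):
    """Combine the parent collection identifier with a hash of the normalized
    query and a content hash of the returned bytes, so the same extract of the
    same data always yields the same identifier (illustrative scheme only)."""
    q_norm = json.dumps(query, sort_keys=True, separators=(",", ":"))
    q_digest = hashlib.sha256(q_norm.encode("utf-8")).hexdigest()[:8]
    c_digest = hashlib.sha256(data_bytes).hexdigest()[:16]
    return f"{collection_id}/extract/{q_digest}-{c_digest}"

# Hypothetical collection DOI and query; key order does not change the result
pid_a = extract_identifier("10.0000/example.collection",
                           {"var": "tasmax", "bbox": [110, -45, 155, -10]},
                           b"subset-bytes-v1")
pid_b = extract_identifier("10.0000/example.collection",
                           {"bbox": [110, -45, 155, -10], "var": "tasmax"},
                           b"subset-bytes-v1")
```

Because the content hash changes whenever the underlying data are appended or reprocessed, the identifier distinguishes extracts of different versions of a dynamic collection even when the query is identical.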

  15. Is Our Classroom an Ecological Place?

    ERIC Educational Resources Information Center

    Xia, Wang

    2006-01-01

    The essence of ecology is life and its diversity, integrity, openness and coexistence. When one contemplates and analyzes the classroom from the perspective of ecology, the classroom should contain open-ended and multiple goals instead of a single, pre-set goal; the classroom should be more flexible, allowing great diversity instead of being narrow-minded,…

  16. Photometric selection of quasars in large astronomical data sets with a fast and accurate machine learning algorithm

    NASA Astrophysics Data System (ADS)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2014-03-01

    Future astronomical surveys will produce data on ~10^8 objects per night. In order to characterize and classify these sources, we will require algorithms that scale linearly with the size of the data, that can be easily parallelized and where the speedup of the parallel algorithm will be linear in the number of processing cores. In this paper, we present such an algorithm and apply it to the question of colour selection of quasars. We use non-parametric Bayesian classification and a binning algorithm implemented with hash tables (BASH tables). We show that this algorithm's run time scales linearly with the number of test set objects and is independent of the number of training set objects. We also show that it has the same classification accuracy as other algorithms. For current data set sizes, it is up to three orders of magnitude faster than commonly used naive kernel-density-estimation techniques and it is estimated to be about eight times faster than the current fastest algorithm using dual kd-trees for kernel density estimation. The BASH table algorithm scales linearly with the size of the test set data only, and so for future larger data sets, it will be even faster compared to other algorithms which all depend on the size of the test set and the size of the training set. Since it uses linear data structures, it is easier to parallelize compared to tree-based algorithms and its speedup is linear in the number of cores unlike tree-based algorithms whose speedup plateaus after a certain number of cores. Moreover, due to the use of hash tables to implement the binning, the memory usage is very small. While our analysis is for the specific problem of selection of quasars, the ideas are general and the BASH table algorithm can be applied to any density-estimation problem involving sparse high-dimensional data sets. Since sparse high-dimensional data sets are a common type of scientific data set, this method has the potential to be useful in a broad range of
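
    The core of the binning idea behind the BASH tables, hashing each training point into a grid cell so that a density lookup for a test point is a constant-time operation, can be sketched in a few lines. This is a minimal illustration under assumed details (a fixed bin width, density taken as the in-cell fraction of training points), not the authors' implementation:

```python
from collections import defaultdict

def build_bash_table(train_points, bin_width):
    """Hash-table binning: count training points per grid cell.
    Cost is linear in the training set, paid once up front."""
    table = defaultdict(int)
    for p in train_points:
        key = tuple(int(x // bin_width) for x in p)
        table[key] += 1
    return table

def density_estimate(table, test_point, bin_width, n_train):
    """Density proxy for one test point: fraction of training points
    landing in the same cell. One hash lookup, so evaluating the whole
    test set is linear in the number of test objects only."""
    key = tuple(int(x // bin_width) for x in test_point)
    return table[key] / n_train
```

    A classifier in this style would build one table per class and compare the class-conditional estimates for each test object.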

  17. Improving Interactions in the Large Language Class.

    ERIC Educational Resources Information Center

    Raymond, Patricia M.; Raymond, Jacques; Pilon, Daniel

    1998-01-01

    Describes a prototypical microcomputer system that improves the interactions between teacher and large language classes in a traditional language classroom setting. This system achieves dynamic interactions through multiple student/professor interventions, immediate and delayed feedback, and individual teacher/student conferences. The system uses…

  18. The Classroom as Public Space.

    ERIC Educational Resources Information Center

    Weiss, Robert O.

    The necessity for maintaining and extending the public space within which argumentation may appear, whether or not represented in the classroom, stems largely from pressures which have increasingly restricted that space. To function as public spaces, classrooms must enable students as citizens to confer in an unrestricted fashion about matters of…

  19. Impacts of Flipped Classroom in High School Health Education

    ERIC Educational Resources Information Center

    Chen, Li-Ling

    2016-01-01

    As advanced technology increasingly infiltrates the classroom, the flipped classroom has come to light in secondary educational settings. The flipped classroom is a new instructional approach that intends to flip the traditional teacher-centered classroom into a student-centered one. The purpose of this research is to investigate the impact of the…

  20. A Zebra in the Classroom.

    ERIC Educational Resources Information Center

    Leake, Devin; Morvillo, Nancy

    1998-01-01

    Describes the care and breeding of zebra fish, suggests various experiments and observations easily performed in a classroom setting, and provides some ideas to further student interest and exploration of these organisms. (DDR)

  1. Pre-Service Teachers and Classroom Authority

    ERIC Educational Resources Information Center

    Pellegrino, Anthony M.

    2010-01-01

    This study examined the classroom practices of five pre-service teachers from three secondary schools in a large southeastern state. Through classroom observations, survey responses, reviews of reflection logs, and focus-group interview responses, we centered on the issue of developing classroom authority as a means to effective classroom…

  2. Computation of a stabilizing set of feedback matrices of a large-scale nonlinear musculoskeletal dynamic model.

    PubMed

    Dhaher, Y Y

    2001-02-01

    The purpose of this study is to present a general mathematical framework to compute a set of feedback matrices which stabilize an unstable nonlinear anthropomorphic musculoskeletal dynamic model. This method is activity-specific and involves four fundamental stages. First, from muscle activation data (input) and motion degrees-of-freedom (output) a dynamic experimental model is obtained using system identification schemes. Second, a nonlinear musculoskeletal dynamic model which contains the same number of muscles and degrees-of-freedom and best represents the activity being considered is proposed. Third, the nonlinear musculoskeletal model (anthropomorphic model) is replaced by a family of linear systems, parameterized by the same set of input/output data (nominal points) used in the identification of the experimental model. Finally, a set of stabilizing output feedback matrices, parameterized again by the same set of nominal points, is computed such that when combined with the anthropomorphic model, the combined system resembles the structural form of the experimental model. The method is illustrated for the human squat activity. PMID:11264866
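
    The final stage, computing a feedback gain that moves the closed-loop poles of one linearized system into the stable half-plane, can be illustrated with classical pole placement via Ackermann's formula. This is a generic single-input state-feedback sketch on an invented 2×2 unstable linearization, not the paper's output-feedback construction:

```python
import numpy as np

def ackermann_gain(A, B, poles):
    """Ackermann's formula: state-feedback gain K such that the
    eigenvalues of A - B K are the desired poles (single-input only)."""
    n = A.shape[0]
    # Controllability matrix [B, AB, ..., A^(n-1) B]
    C = np.hstack([np.linalg.matrix_power(A, i) @ B for i in range(n)])
    # Desired characteristic polynomial evaluated at A: p(A)
    coeffs = np.poly(poles)  # leading coefficient first
    pA = sum(c * np.linalg.matrix_power(A, n - i) for i, c in enumerate(coeffs))
    e_last = np.zeros((1, n))
    e_last[0, -1] = 1.0
    return e_last @ np.linalg.inv(C) @ pA

# Hypothetical linearization at one nominal point: x' = A x + B u,
# unstable because A has eigenvalues +/- sqrt(2).
A = np.array([[0.0, 1.0],
              [2.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

K = ackermann_gain(A, B, [-1.0, -2.0])
closed_eigs = np.linalg.eigvals(A - B @ K)  # now -1 and -2: stabilized
```

    In the paper's framework, a gain like K would be computed at every nominal point, yielding the parameterized family of stabilizing feedback matrices.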

  4. My Classroom: Kazakhstan

    ERIC Educational Resources Information Center

    Whitaker, Lauren

    2016-01-01

    Yulia Bulatkulova discovered her passion for English language teaching at a young age as a result of the example set by an esteemed childhood English teacher, Elvira Kuyanova. This article discusses how Ms. Bulatkulova's interactions with her students, both inside and outside the classroom, demonstrate that she has followed in the footsteps of her…

  5. The Paperless Music Classroom

    ERIC Educational Resources Information Center

    Giebelhausen, Robin

    2016-01-01

    In an age where the world is becoming ever more aware of paper consumption, educators are turning toward technology to cut back on paper waste. Besides the environmental reasons, a paperless music classroom helps students develop their musicianship in new and exciting ways. This article will look at the considerations for setting up a paperless…

  6. 'Flipping' the Classroom.

    PubMed

    Billings, Diane M

    2016-09-01

    This article is one in a series on the roles of adjunct clinical faculty and preceptors, who teach nursing students and new graduates to apply knowledge in clinical settings. This article describes the benefits and challenges of using a "flipped" classroom to promote active engagement among learners and more meaningful interaction between learners and educators. PMID:27560340

  7. Classroom Management That Works

    ERIC Educational Resources Information Center

    Cleve, Lauren

    2012-01-01

    The purpose of this study was to find the best classroom management strategies to use when teaching in an elementary school setting. I wanted to identify the best possible management tools for a variety of age groups as well as meet educational standards. Through my research I found that different approaches in different grade levels is an important…

  8. Learning in Tomorrow's Classrooms

    ERIC Educational Resources Information Center

    Bowman, Richard F.

    2015-01-01

    Teaching today remains the most individualistic of all the professions, with educators characteristically operating in a highly fragmented world of "their" courses, "their" skills, and "their" students. Learning will occur in the classrooms of the future through a sustainable set of complementary capabilities:…

  10. Developing a "Semi-Systematic" Approach to Using Large-Scale Data-Sets for Small-Scale Interventions: The "Baby Matterz" Initiative as a Case Study

    ERIC Educational Resources Information Center

    O'Brien, Mark

    2011-01-01

    The appropriateness of using statistical data to inform the design of any given service development or initiative often depends upon judgements regarding scale. Large-scale data sets, perhaps national in scope, whilst potentially important in informing the design, implementation and roll-out of experimental initiatives, will often remain unused…

  11. Key Issues and Strategies for Recruitment and Implementation in Large-Scale Randomized Controlled Trial Studies in Afterschool Settings. Afterschool Research Brief. Issue No. 2

    ERIC Educational Resources Information Center

    Jones, Debra Hughes; Vaden-Kiernan, Michael; Rudo, Zena; Fitzgerald, Robert; Hartry, Ardice; Chambers, Bette; Smith, Dewi; Muller, Patricia; Moss, Marcey A.

    2008-01-01

    Under the larger scope of the National Partnership for Quality Afterschool Learning, SEDL funded three awardees to carry out large-scale randomized controlled trials (RCT) assessing the efficacy of promising literacy curricula in afterschool settings on student academic achievement. SEDL provided analytic and technical support to the RCT studies…

  12. The Effect of Repeated Reading with Pairs of Students in a Large-Group Setting on Fluency and Comprehension for Students at Risk for Reading Failure

    ERIC Educational Resources Information Center

    Frame, John N.

    2011-01-01

    Problem: Some students are failing to develop acceptable reading skills; however, instructional time allocated to reading fluency can increase reading comprehension. The purpose of this study was to compare students who received repeated reading with pairs of students in a large-group setting with those who did not in terms of reading fluency,…

  13. Outdoor Classrooms

    ERIC Educational Resources Information Center

    Mayes, Valynda

    2010-01-01

    An outdoor classroom is the ideal vehicle for community involvement: Parents, native plant societies, 4-H, garden clubs, and master naturalists are all resources waiting to be tapped, as are local businesses offering support. If you enlist your community in the development and maintenance of your outdoor classroom, the entire community will…

  14. Classroom Activities.

    ERIC Educational Resources Information Center

    Stuart, Frances R.

    This pamphlet suggests activities that may be used in the elementary school classroom. Chapter I lists various short plays that children can easily perform which encourage their imagination. Chapter II details a few quiet classroom games such as "I Saw," "Corral the Wild Horse," "Who Has Gone from the Room," and "Six-Man-Football Checkers." A number…

  15. Classroom Management.

    ERIC Educational Resources Information Center

    Dinsmore, Terri Sue

    This paper is a report of a middle-school teacher's study of classroom management. The teacher/researcher was interested in how some of the techniques in the Kovalik Integrated Thematic Instruction model of training would influence the teacher/researcher's classroom management; the effects of direct instruction within a community circle; the…

  16. pXRF quantitative analysis of the Otowi Member of the Bandelier Tuff: Generating large, robust data sets to decipher trace element zonation in large silicic magma chambers

    NASA Astrophysics Data System (ADS)

    Van Hoose, A. E.; Wolff, J.; Conrey, R.

    2013-12-01

    Advances in portable X-ray fluorescence (pXRF) analytical technology have made it possible for high-quality, quantitative data to be collected in a fraction of the time required by standard, non-portable analytical techniques. Not only do these advances reduce analysis time, but data may also be collected in the field in conjunction with sampling. Rhyolitic pumice, being primarily glass, is an excellent material to be analyzed with this technology. High-quality, quantitative data for elements that are tracers of magmatic differentiation (e.g. Rb, Sr, Y, Nb) can be collected for whole, individual pumices and subsamples of larger pumices in 4 minutes. We have developed a calibration for powdered rhyolite pumice from the Otowi Member of the Bandelier Tuff analyzed with the Bruker Tracer IV pXRF using Bruker software and influence coefficients for pumice, which measures the following 19 oxides and elements: SiO2, TiO2, Al2O3, FeO*, MnO, CaO, K2O, P2O5, Zn, Ga, Rb, Sr, Y, Zr, Nb, Ba, Ce, Pb, and Th. With this calibration for the pXRF and thousands of individual powdered pumice samples, we have generated an unparalleled data set for any single eruptive unit with known trace element zonation. The Bandelier Tuff of the Valles-Toledo Caldera Complex, Jemez Mountains, New Mexico, is divided into three main eruptive events. For this study, we have chosen the 1.61 Ma, 450 km³ Otowi Member, as it is primarily unwelded and pumice samples are easily accessible. The eruption began with a plinian phase from a single source located near the center of the current caldera and deposited the Guaje Pumice Bed. The initial Unit A of the Guaje is geochemically monotonous, but Units B through E, co-deposited with ignimbrite, show very strong chemical zonation in trace elements, progressing upwards through the deposits from highly differentiated compositions (Rb ~350 ppm, Nb ~200 ppm) to less differentiated (Rb ~100 ppm, Nb ~50 ppm). Co-erupted ignimbrites emplaced during column collapse show
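
    At its simplest, an empirical pXRF calibration for a single element is a least-squares line relating measured intensities to reference concentrations; a production calibration like the one described here would additionally apply inter-element influence coefficients. The sketch below shows only the single-element case, and the standards and count rates are invented for illustration:

```python
import numpy as np

# Hypothetical reference standards: known Rb concentrations (ppm) for
# powdered pumice standards versus their raw pXRF intensities (counts/s).
reference_ppm = np.array([50.0, 100.0, 200.0, 350.0])
intensity_cps = np.array([210.0, 405.0, 820.0, 1415.0])

# Fit a linear calibration curve ppm = a * counts + b by least squares.
a, b = np.polyfit(intensity_cps, reference_ppm, 1)

def counts_to_ppm(counts):
    """Convert a raw pXRF intensity into a concentration estimate
    using the fitted single-element calibration line."""
    return a * counts + b
```

    With a calibration of this kind in hand, each 4-minute field measurement maps directly to a concentration estimate, which is what makes the rapid generation of thousands-of-sample data sets practical.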

  17. Statistical Analysis of a Large Sample Size Pyroshock Test Data Set Including Post Flight Data Assessment. Revision 1

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Anne M.

    2010-01-01

    The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. The analysis methods used to evaluate this unusually large amount of shock data will be discussed in this paper, with particular emphasis on population distributions and finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed a derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level, thus contributing to the overall success of the EOS Terra mission. The effectiveness of the statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample size of shock data is demonstrated in this paper.
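
    One standard way of condensing a family of ground-test shock levels into a single enveloping interface level is a normal tolerance limit such as P95/50 (95% probability with 50% confidence), which for large samples reduces to the mean plus 1.645 standard deviations of the dB levels. The sketch below illustrates that generic statistic only; it is not the paper's derivation, and the input values are hypothetical:

```python
import math

def p95_50_level(levels_db):
    """Large-sample P95/50 normal tolerance limit for shock levels in dB:
    mean + 1.645 * sample standard deviation. Assumes the dB levels are
    approximately normally distributed within one family of data."""
    n = len(levels_db)
    mean = sum(levels_db) / n
    var = sum((x - mean) ** 2 for x in levels_db) / (n - 1)
    return mean + 1.645 * math.sqrt(var)
```

    A flight measurement falling below such a level, as all of the Terra flight measurements did, is then consistent with the assurance claim.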

  18. The large karstic holes at the top of the Syrian coastal Mountain Range. Importance of structural setting for the karstogenesis.

    NASA Astrophysics Data System (ADS)

    Mocochain, Ludovic; Blanpied, Christian; Bigot, Jean-Yves; Peyronel, Olivier; Gorini, Christian; Abdalla, Abdelkarim Al; Azki, Fawaz

    2015-04-01

    Along the Eastern Mediterranean Sea, the Syrian Coastal Mountain Range spreads from north to south over 150 km. This range is a monocline structure bounded by a major escarpment that dominates the Al-Gahb Graben to the East. The Coastal Mountain Range is mainly formed of Mesozoic limestone that shows a major unconformity between the Upper Jurassic and Aptian deposits, and important erosion in the Upper Cretaceous deposits. Locally, the Juro-Cretaceous unconformity is characterized by a layer of continental basalts with fossil woods that reveals a long emersion of the platform. The most recent carbonate deposits at the top of the Coastal Mountain Range are Turonian in age. In the central part of the Coastal Mountain Range, in a small area, the Cretaceous carbonates are affected by large karstic dolines. These dolines are curiously located at the top of the mountain range, a position that is not obviously favourable for the development of large karstic holes.

  19. Fast and Accurate Protein False Discovery Rates on Large-Scale Proteomics Data Sets with Percolator 3.0

    NASA Astrophysics Data System (ADS)

    The, Matthew; MacCoss, Michael J.; Noble, William S.; Käll, Lukas

    2016-08-01

    Percolator is a widely used software tool that increases yield in shotgun proteomics experiments and assigns reliable statistical confidence measures, such as q values and posterior error probabilities, to peptides and peptide-spectrum matches (PSMs) from such experiments. Percolator's processing speed has been sufficient for typical data sets consisting of hundreds of thousands of PSMs. With our new scalable approach, we can now also analyze millions of PSMs in a matter of minutes on a commodity computer. Furthermore, with the increasing awareness of the need for reliable statistics at the protein level, we compared several easy-to-understand protein inference methods and implemented the best-performing method, grouping proteins by their corresponding sets of theoretical peptides and then considering only the best-scoring peptide for each protein, in the Percolator package. We used Percolator 3.0 to analyze the data from a recent study of the draft human proteome containing 25 million spectra (PM:24870542). The source code and Ubuntu, Windows, MacOS, and Fedora binary packages are available from http://percolator.ms/ under an Apache 2.0 license.
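
    The protein inference method described above, grouping proteins that share the same set of theoretical peptides and scoring each group by its single best-scoring peptide, can be sketched as follows. The data layout (plain dicts of peptide sets and peptide scores) is an assumption for illustration, not Percolator's actual API:

```python
from collections import defaultdict

def infer_protein_groups(protein_peptides, peptide_scores):
    """Group proteins with identical theoretical peptide sets, then
    score each group by its best-scoring observed peptide.
    Returns (protein group, best score) pairs, best first."""
    groups = defaultdict(list)
    for protein, peptides in protein_peptides.items():
        groups[frozenset(peptides)].append(protein)
    scored = []
    for pep_set, proteins in groups.items():
        observed = [peptide_scores[p] for p in pep_set if p in peptide_scores]
        if observed:  # skip groups with no observed peptides
            scored.append((sorted(proteins), max(observed)))
    return sorted(scored, key=lambda t: -t[1])
```

    Treating indistinguishable proteins as one group avoids splitting evidence among proteins the data cannot tell apart, which is part of why this simple method performed well in the authors' comparison.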

  1. Sorting a large set of heavily used LiF:Mg,Ti thermoluminescent detectors into repeatable subsets of similar response.

    PubMed

    Kearfott, Kimberlee J; Newton, Jill P; Rafique, Muhammad

    2014-10-30

    A set of 920 heavily used LiF:Mg,Ti thermoluminescent dosimeters (TLDs) was placed into a polymethyl methacrylate (PMMA) plate attached to a 40×40×15 cm³ PMMA phantom and irradiated to 4.52 mGy using a ¹³⁷Cs source. This was repeated three times to determine the mean and standard deviation of each TLD's sensitivity. Reader drift was tracked over time with 10 control dosimeters. Two test sets of 100 TLDs were divided into subsets with sensitivities within ±1% of their subset means. All dosimeters were re-irradiated four times to test the TLDs' response repeatability and determine the sensitivity uniformity within the subsets. Coefficients of variation revealed that, within a given subset, the dosimeters responded within ±2.5% of their subset mean in all calibrations. The coefficient of variation in any of the 200 TLDs' calibrations was below 6% across the four calibrations. The work validates the approach of performing three calibrations to separate heavily used and aged TLDs with overall sensitivity variations of ±25% into subsets that reproducibly respond within ±2.5%.
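
    The subset-sorting step can be pictured as a greedy pass over detectors ordered by sensitivity, starting a new subset whenever the next detector falls outside the tolerance band around the running subset mean. This is an illustrative sketch, not the authors' actual procedure, and checking against the running mean rather than the final subset mean is a simplification:

```python
def partition_by_sensitivity(sensitivities, tolerance=0.01):
    """Greedy grouping of detectors (id -> measured sensitivity) into
    subsets whose members lie within +/- tolerance of the running
    subset mean. Sorting first keeps each subset contiguous in value."""
    order = sorted(sensitivities.items(), key=lambda kv: kv[1])
    subsets, current = [], []
    for det_id, s in order:
        if current:
            mean = sum(v for _, v in current) / len(current)
            if abs(s - mean) > tolerance * mean:
                subsets.append(current)  # close this subset, start fresh
                current = []
        current.append((det_id, s))
    if current:
        subsets.append(current)
    return subsets
```

    Detectors grouped this way can then be calibrated and reused as interchangeable members of their subset, which is the practical payoff reported above.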

  2. The Effects of Positive Verbal Reinforcement on the Time Spent outside the Classroom for Students with Emotional and Behavioral Disorders in a Residential Setting

    ERIC Educational Resources Information Center

    Kennedy, Christina; Jolivette, Kristine

    2008-01-01

    To more effectively instruct the entire class, teachers of students with emotional behavioral disorders (EBD) often choose to send students who display inappropriate behavior out of the room. A multiple baseline across settings was used to evaluate the effects of increasing teacher positive verbal reinforcement on the amount of time 2 students…

  3. Moving toward an Empowering Setting in a First Grade Classroom Serving Primarily Working Class and Working Poor Latina/o Children: An Exploratory Analysis

    ERIC Educational Resources Information Center

    Silva, Janelle M.; Langhout, Regina Day

    2016-01-01

    Empowering settings are important places for people to develop leadership skills in order to enact social change. Yet, due to socio-cultural constructions of childhood in the US, especially constructions around working class and working poor children of Color, they are often not seen as capable or competent change agents, or in need of being in…

  4. Flexible Classroom Furniture

    ERIC Educational Resources Information Center

    Hassell, Kim

    2011-01-01

    Classroom design for the 21st-century learning environment should accommodate a variety of learning skills and needs. The space should be large enough so it can be configured to accommodate a number of learning activities. This also includes furniture that provides flexibility and accommodates collaboration and interactive work among students and…

  5. "Did Ronald McDonald also Tend to Scare You as a Child?": Working to Emplace Consumption, Commodities and Citizen-Students in a Large Classroom Setting

    ERIC Educational Resources Information Center

    Goodman, Michael K.

    2008-01-01

    So-called "radical" and "critical" pedagogy seems to be everywhere these days on the landscapes of geographical teaching praxis and theory. Part of the remit of radical/critical pedagogy involves a de-centring of the traditional "banking" method of pedagogical praxis. Yet, how do we challenge this "banking" model of knowledge transmission in both a…

  6. Learning to Stand: The Acceptability and Feasibility of Introducing Standing Desks into College Classrooms

    PubMed Central

    Benzo, Roberto M.; Gremaud, Allene L.; Jerome, Matthew; Carr, Lucas J.

    2016-01-01

    Prolonged sedentary behavior is an independent risk factor for multiple negative health outcomes. Evidence supports introducing standing desks into K-12 classrooms and work settings to reduce sitting time, but no studies have been conducted in the college classroom environment. The present study explored the acceptability and feasibility of introducing standing desks in college classrooms. A total of 993 students and 149 instructors completed a single online needs assessment survey. This cross-sectional study was conducted during the fall semester of 2015 at a large Midwestern University. The large majority of students (95%) reported they would prefer the option to stand in class. Most students (82.7%) reported they currently sit during their entire class time. Most students (76.6%) and instructors (86.6%) reported being in favor of introducing standing desks into college classrooms. More than half of students and instructors predicted having access to standing desks in class would improve students’ “physical health”, “attention”, and “restlessness”. Collectively, these findings support the acceptability of introducing standing desks in college classrooms. Future research is needed to test the feasibility, cost-effectiveness and efficacy of introducing standing desks in college classrooms. Such studies would be useful for informing institutional policies regarding classroom designs. PMID:27537901

  9. A large, precise set of polarization observables for deuteron-proton breakup at 130 MeV

    SciTech Connect

    Stephan, E.; Biegun, A.; Klos, B.; Micherdzinska, A.; Zipper, W.; Kistryn, St.; Bodek, K.; Ciepal, I.; Golak, J.; Skibinski, R.; Sworst, R.; Witala, H.; Zejma, J.; Kalantar-Nayestanaki, N.; Kis, M.; Mahjour-Shafiei, M.; Deltuva, A.; Fonseca, A. C.; Epelbaum, E.; Nogga, A.

    2008-04-29

    High-precision vector (A_x, A_y) and tensor (A_xx, A_xy, A_yy) analyzing powers for the ¹H(d⃗,pp)n breakup reaction were measured at 130 MeV beam energy with the detection system covering a large part of the phase space. Results are compared with rigorous theoretical calculations based on realistic nucleon-nucleon potentials, also with a so-called three-nucleon force included, as well as on chiral perturbation theory. Theoretical predictions generally describe the data quite well, but in some regions discrepancies have been observed, which indicate incompleteness of the present-day treatment of three-nucleon dynamics.

  10. Worsening Hypoxemia in the Face of Increasing PEEP: A Case of Large Pulmonary Embolism in the Setting of Intracardiac Shunt.

    PubMed

    Granati, Glen T; Teressa, Getu

    2016-01-01

    BACKGROUND Patent foramen ovale (PFO) are common, normally resulting in a left-to-right shunt or no net shunting. Pulmonary embolism (PE) can cause sustained increases in pulmonary vascular resistance (PVR) and right atrial pressure. Increasing positive end-expiratory pressure (PEEP) improves oxygenation at the expense of increasing intrathoracic pressures (ITP). Airway pressure release ventilation (APRV) decreases shunt fraction, improves ventilation/perfusion (V/Q) matching, increases cardiac output, and decreases right atrial pressure by facilitating low airway pressure. CASE REPORT A 40-year-old man presented with dyspnea and hemoptysis. Oxygen saturation (SaO2) was 80% on room air with an A-a gradient of 633 mmHg. Post-intubation SaO2 dropped to 71% on assist control, FiO2 100%, and PEEP of 5 cmH2O. Successive increases in PEEP dropped SaO2 to 60-70% and blood pressure plummeted. APRV was initiated, with improvement in SaO2 to 95% and improvement in blood pressure. Hemiparesis developed, and CT head showed infarction. CT pulmonary angiogram found a large pulmonary embolism. Transthoracic echocardiogram detected a right-to-left intracardiac shunt with a large PFO. CONCLUSIONS There should be suspicion for a PFO when severe hypoxemia paradoxically worsens in response to increasing airway pressures. Concomitant venous and arterial thromboemboli should prompt evaluation for intra-cardiac shunt. Patients with PFO and hypoxemia should be evaluated for causes of a sustained right-to-left pressure gradient, such as PE. Management should aim to decrease PVR and optimize V/Q matching by treating the inciting incident (e.g., thrombolytics in PE) and by minimizing ITP. APRV can minimize PVR and maximize V/Q ratios and should be considered in treating patients similar to the one whose case is presented here. PMID:27377010

  11. Discovery of a large set of SNP and SSR genetic markers by high-throughput sequencing of pepper (Capsicum annuum).

    PubMed

    Nicolaï, M; Pisani, C; Bouchet, J-P; Vuylsteke, M; Palloix, A

    2012-08-13

Genetic markers based on single nucleotide polymorphisms (SNPs) are in increasing demand for genome mapping and fingerprinting of breeding populations in crop plants. Recent advances in high-throughput sequencing provide the opportunity for whole-genome resequencing and identification of allelic variants by mapping the reads to a reference genome. However, for many species, such as pepper (Capsicum annuum), a reference genome sequence is not yet available. To this end, we sequenced the C. annuum cv. "Yolo Wonder" transcriptome using Roche 454 pyrosequencing and assembled de novo 23,748 isotigs and 60,370 singletons. Mapping of 10,886,425 reads obtained by the Illumina GA II sequencing of C. annuum cv. "Criollo de Morelos 334" to the "Yolo Wonder" transcriptome allowed for SNP identification. By setting a threshold value that allows selecting reliable SNPs with minimal loss of information, 11,849 reliable SNPs spread across 5919 isotigs were identified. In addition, 853 simple sequence repeats (SSRs) were obtained. This information has been made available online.
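The reliability thresholding described in this abstract can be sketched as a simple filter over candidate variant sites. This is a minimal illustration, not the authors' pipeline: the field names, depth cutoff, and allele-fraction cutoff below are all hypothetical assumptions.

```python
# Sketch: retain SNP candidates whose read support passes reliability cutoffs.
# Field names and threshold values are illustrative assumptions only.

def filter_snps(candidates, min_depth=8, min_alt_fraction=0.9):
    """Keep sites with enough mapped reads and a near-fixed alternate allele."""
    reliable = []
    for site in candidates:
        depth = site["ref_reads"] + site["alt_reads"]
        if depth < min_depth:
            continue  # too few reads mapped to this isotig position
        if site["alt_reads"] / depth >= min_alt_fraction:
            reliable.append(site)
    return reliable

calls = [
    {"isotig": "isotig001", "pos": 120, "ref_reads": 1, "alt_reads": 14},
    {"isotig": "isotig001", "pos": 340, "ref_reads": 6, "alt_reads": 7},
    {"isotig": "isotig002", "pos": 55,  "ref_reads": 0, "alt_reads": 3},
]
print(len(filter_snps(calls)))  # only the first site passes both cutoffs
```

Raising `min_depth` trades SNP yield for reliability, which is the trade-off the abstract's "minimal loss of information" threshold addresses.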

  12. Large reptiles and cold temperatures: Do extreme cold spells set distributional limits for tropical reptiles in Florida?

    USGS Publications Warehouse

    Mazzotti, Frank J.; Cherkiss, Michael S.; Parry, Mark; Beauchamp, Jeff; Rochford, Mike; Smith, Brian J.; Hart, Kristen M.; Brandt, Laura A.

    2016-01-01

    Distributional limits of many tropical species in Florida are ultimately determined by tolerance to low temperature. An unprecedented cold spell during 2–11 January 2010, in South Florida provided an opportunity to compare the responses of tropical American crocodiles with warm-temperate American alligators and to compare the responses of nonnative Burmese pythons with native warm-temperate snakes exposed to prolonged cold temperatures. After the January 2010 cold spell, a record number of American crocodiles (n = 151) and Burmese pythons (n = 36) were found dead. In contrast, no American alligators and no native snakes were found dead. American alligators and American crocodiles behaved differently during the cold spell. American alligators stopped basking and retreated to warmer water. American crocodiles apparently continued to bask during extreme cold temperatures resulting in lethal body temperatures. The mortality of Burmese pythons compared to the absence of mortality for native snakes suggests that the current population of Burmese pythons in the Everglades is less tolerant of cold temperatures than native snakes. Burmese pythons introduced from other parts of their native range may be more tolerant of cold temperatures. We documented the direct effects of cold temperatures on crocodiles and pythons; however, evidence of long-term effects of cold temperature on their populations within their established ranges remains lacking. Mortality of crocodiles and pythons outside of their current established range may be more important in setting distributional limits.

  14. Strategy Training in a Task-Based Language Classroom

    ERIC Educational Resources Information Center

    Lai, Chun; Lin, Xiaolin

    2015-01-01

    Recent literature that examines the implementation of task-based language teaching (TBLT) in classroom settings has reported various challenges related to educational cultures, classroom management, teacher cognition and learner perceptions. To facilitate the smooth transition of TBLT from laboratory settings to classroom contexts, measures need…

  15. Approaching the complete basis set limit of CCSD(T) for large systems by the third-order incremental dual-basis set zero-buffer F12 method

    SciTech Connect

    Zhang, Jun Dolg, Michael

    2014-01-28

The third-order incremental dual-basis set zero-buffer approach was combined with CCSD(T)-F12x (x = a, b) theory to develop a new approach, i.e., the inc3-db-B0-CCSD(T)-F12 method, which can be applied as a black-box procedure to efficiently obtain the near complete basis set (CBS) limit of the CCSD(T) energies also for large systems. We tested this method for several cases of different chemical nature: four complexes taken from the standard benchmark sets S66 and X40, the energy difference between isomers of water hexamer and the rotation barrier of biphenyl. The results show that our method has an error relative to the best estimation of CBS energy of only 0.2 kcal/mol or less. By parallelization, our method can accomplish CCSD(T)-F12 calculations with about 60 correlated electrons and 800 basis functions in only a few days; such calculations are impossible with standard implementations on ordinary hardware. We conclude that the inc3-db-B0-CCSD(T)-F12a/AVTZ method, which is of CCSD(T)/AV5Z quality, is close to the limit of accuracy that one can achieve for large systems currently.

  16. Worsening Hypoxemia in the Face of Increasing PEEP: A Case of Large Pulmonary Embolism in the Setting of Intracardiac Shunt

    PubMed Central

    Granati, Glen T.; Teressa, Getu

    2016-01-01

Patient: Male, 40 Final Diagnosis: Patent foramen ovale Symptoms: Dyspnea exertional • hemoptysis • shortness of breath Medication: — Clinical Procedure: Airway pressure release ventilation Specialty: Critical Care Medicine Objective: Rare co-existence of disease or pathology Background: Patent foramen ovale (PFO) is common, normally resulting in a left-to-right shunt or no net shunting. Pulmonary embolism (PE) can cause sustained increased pulmonary vascular resistance (PVR) and right atrial pressure. Increasing positive end-expiratory pressure (PEEP) improves oxygenation at the expense of increasing intrathoracic pressures (ITP). Airway pressure release ventilation (APRV) decreases shunt fraction, improves ventilation/perfusion (V/Q) matching, increases cardiac output, and decreases right atrial pressure by facilitating low airway pressure. Case Report: A 40-year-old man presented with dyspnea and hemoptysis. Oxygen saturation (SaO2) was 80% on room air, with an A-a gradient of 633 mmHg. Post-intubation SaO2 dropped to 71% on assist control, FiO2 100%, and PEEP of 5 cmH2O. Successive PEEP increases dropped SaO2 to 60–70% and blood pressure plummeted. APRV was initiated, with improvement in SaO2 to 95% and improvement in blood pressure. Hemiparesis developed, and CT of the head showed infarction. CT pulmonary angiogram found a large pulmonary embolism. Transthoracic echocardiogram detected a right-to-left intracardiac shunt with a large PFO. Conclusions: There should be suspicion for a PFO when severe hypoxemia paradoxically worsens in response to increasing airway pressures. Concomitant venous and arterial thromboemboli should prompt evaluation for an intracardiac shunt. Patients with PFO and hypoxemia should be evaluated for causes of a sustained right-to-left pressure gradient, such as PE. Management should aim to decrease PVR and optimize V/Q matching by treating the inciting incident (e.g., thrombolytics in PE) and by minimizing ITP. APRV can minimize PVR and maximize V/Q ratios and

  17. Application of a statistical software package for analysis of large patient dose data sets obtained from RIS.

    PubMed

    Fazakerley, J; Charnock, P; Wilde, R; Jones, R; Ward, M

    2010-01-01

For the purpose of patient dose audit, clinical audit and radiology workload analysis, data from Radiology Information Systems (RIS) at many hospitals were collected in a database, and the analysis was automated using a statistical package and Visual Basic coding. The database is a Structured Query Language database, which can be queried using an off-the-shelf statistical package, Statistica. Macros were created to automatically convert the data from different hospitals to a consistent format ready for analysis. These macros can also be used to automate further analysis, such as detailing mean kV, mAs and entrance surface dose per room and per gender. Standard deviation and standard error of the mean are also generated. Graphs can be generated to illustrate trends in doses across variables such as room and gender. Collectively, this information can be used to generate a report. A process that once could take up to 1 d to complete now takes around 1 h. A major benefit in providing the service to hospital trusts is that less resource is now required to report on RIS data, making continuous dose audit more likely. Time that was spent sorting through data can now be spent improving the analysis to benefit the customer. Data sets from RIS are well suited to dose audits because the huge volume of available data provides the basis for very accurate analysis. Macros written in Statistica Visual Basic have helped sort and consistently analyse these data, and the ability to analyse by exposure factors has provided a more detailed report to the customer.
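The per-room summary statistics this abstract describes (mean, standard deviation, and standard error of the mean for exposure factors) can be sketched in a few lines. The record layout below is an assumption for illustration; real RIS exports vary between hospitals, and the original work used Statistica macros rather than Python.

```python
# Sketch: per-room mean kV with SD and SEM, as in the automated RIS audit.
import math
import statistics
from collections import defaultdict

records = [
    {"room": "XR1", "gender": "F", "kv": 70.0, "mas": 10.0},
    {"room": "XR1", "gender": "M", "kv": 74.0, "mas": 12.0},
    {"room": "XR1", "gender": "F", "kv": 72.0, "mas": 11.0},
    {"room": "XR2", "gender": "M", "kv": 81.0, "mas": 16.0},
]

by_room = defaultdict(list)
for rec in records:
    by_room[rec["room"]].append(rec["kv"])

for room, kvs in sorted(by_room.items()):
    mean = statistics.mean(kvs)
    sd = statistics.stdev(kvs) if len(kvs) > 1 else 0.0  # sample SD needs n > 1
    sem = sd / math.sqrt(len(kvs))                       # standard error of the mean
    print(f"{room}: n={len(kvs)} mean kV={mean:.1f} SD={sd:.2f} SEM={sem:.2f}")
```

The same grouping key could be `(room, gender)` to reproduce the per-gender breakdown mentioned in the abstract.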

  18. Organizational development trajectory of a large academic radiotherapy department set up similarly to a prospective clinical trial: the MAASTRO experience

    PubMed Central

    Boersma, L; Dekker, A; Hermanns, E; Houben, R; Govers, M; van Merode, F; Lambin, P

    2015-01-01

    Objective: To simultaneously improve patient care processes and clinical research activities by starting a hypothesis-driven reorganization trajectory mimicking the rigorous methodology of a prospective clinical trial. Methods: The design of this reorganization trajectory was based on the model of a prospective trial. It consisted of (1) listing problems and analysing their potential causes, (2) defining interventions, (3) defining end points and (4) measuring the effect of the interventions (i.e. at baseline and after 1 and 2 years). The primary end point for patient care was the number of organizational root causes of incidents/near incidents; for clinical research, it was the number of patients in trials. There were several secondary end points. We analysed the data using two sample z-tests, χ2 test, a Mann–Whitney U test and the one-way analysis of variance with Bonferroni correction. Results: The number of organizational root causes was reduced by 27% (p < 0.001). There was no effect on the percentage of patients included in trials. Conclusion: The reorganizational trajectory was successful for the primary end point of patient care and had no effect on clinical research. Some confounding events hampered our ability to draw strong conclusions. Nevertheless, the transparency of this approach can give medical professionals more confidence in moving forward with other organizational changes in the same way. Advances in knowledge: This article is novel because managerial interventions were set up similarly to a prospective clinical trial. This study is the first of its kind in radiotherapy, and this approach can contribute to discussions about the effectiveness of managerial interventions. PMID:25679320
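One of the tests named in the abstract, the two-sample z-test, is easy to sketch for proportions (e.g., comparing incident rates at baseline and after the interventions). The counts below are invented for illustration and are not the MAASTRO data.

```python
# Sketch: two-sample z-test for proportions with a pooled variance estimate.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Return the z statistic for H0: p_a == p_b."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical baseline vs. year-2 counts; |z| > 1.96 is significant at the 5% level.
z = two_proportion_z(110, 400, 80, 400)
print(round(z, 3))
```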

  19. Fast selection of miRNA candidates based on large-scale pre-computed MFE sets of randomized sequences

    PubMed Central

    2014-01-01

Background Small RNAs are important regulators of genome function, yet their prediction in genomes is still a major computational challenge. Statistical analyses of pre-miRNA sequences indicated that their 2D structure tends to have a minimal free energy (MFE) significantly lower than the MFE values of equivalently randomized sequences with the same nucleotide composition, in contrast to other classes of non-coding RNA. The computation of many MFEs is, however, too intensive to allow for genome-wide screenings. Results Using a local grid infrastructure, MFE distributions of random sequences were pre-calculated on a large scale. These distributions follow a normal distribution and can be used to determine the MFE distribution for any given sequence composition by interpolation, allowing on-the-fly calculation of the normal distribution for any candidate sequence composition. Conclusion The speedup achieved makes genome-wide screening with this characteristic of a pre-miRNA sequence practical. Although this property alone is not sufficiently discriminative to distinguish miRNAs from other sequences, the MFE-based P-value should be added to the set of parameters used to select potential miRNA candidates for experimental verification. PMID:24418292
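The final scoring step the abstract describes, an MFE-based P-value against a pre-computed normal null, amounts to a one-sided normal tail probability. The mu and sigma values below are invented for illustration; in the actual method they would be interpolated from the pre-computed distributions for the candidate's nucleotide composition.

```python
# Sketch: P(MFE_random <= mfe) under a normal null distribution of MFEs
# for randomized sequences with the same composition.
import math

def mfe_p_value(mfe, mu, sigma):
    """One-sided lower-tail probability via the normal CDF (math.erf)."""
    z = (mfe - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# A pre-miRNA-like candidate folds far more stably than random sequences:
p = mfe_p_value(mfe=-42.0, mu=-25.0, sigma=5.0)
print(f"{p:.2e}")  # very small p => unlikely under the randomized null
```

A low P-value flags the candidate as having an unusually stable fold, the hallmark property the screening exploits.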

  20. Large Increases In Spending On Postacute Care In Medicare Point To The Potential For Cost Savings In These Settings

    PubMed Central

    Chandra, Amitabh; Dalton, Maurice A.; Holmes, Jonathan

    2013-01-01

    Identifying policies that will cut or constrain US health care spending and spending growth dominates reform efforts, yet little is known about whether the drivers of spending levels and of spending growth are the same. Policies that produce a one-time reduction in the level of spending, for example by making hospitals more efficient, may do little to reduce subsequent annual spending growth. To identify factors causing health care spending to grow the fastest, we focused on three conditions in the Medicare population: heart attacks, congestive heart failure, and hip fractures. We found that spending on postacute care—long-term hospital care, rehabilitation care, and skilled nursing facility care—was the fastest growing major spending category and accounted for a large portion of spending growth in 1994–2009. During that period average spending for postacute care doubled for patients with hip fractures, more than doubled for those with congestive heart failure, and more than tripled for those with heart attacks. We conclude that policies aimed at controlling acute care spending, such as bundled payments for short-term hospital spending and physician services, are likely to be more effective if they include postacute care, as is currently being tested under Medicare’s Bundled Payment for Care Improvement Initiative. PMID:23650319

  1. SU-E-I-58: Experiences in Setting Up An Online Fluoroscopy Tracking System in a Large Healthcare System

    SciTech Connect

    Fisher, R; Wunderle, K; Lingenfelter, M

    2015-06-15

Purpose: To transition from a paper-based to an online system for tracking the fluoroscopic case information required by state regulation, and to conform to NCRP patient dose tracking suggestions. Methods: State regulations require documentation of operator, equipment, and some metric of tube output for fluoroscopy exams. This information was previously collected in paper logs, which was cumbersome and inefficient for the large number of fluoroscopic units across multiple locations within the system. The “tech notes” feature within Siemens’ Syngo workflow RIS was utilized to create an entry form for technologists to input case information, which was sent to a third-party vendor for archiving and display through an online, web-based portal. Results: Over 55k cases were logged in the first year of implementation, with approximately 6,500 cases per month once fully online. A system was built for area managers to oversee and correct data, which has increased the accuracy of inputted values. A high-dose report was built to automatically send notifications when patients exceed trigger levels. In addition to meeting regulatory requirements, the new system allows for larger-scale QC of fluoroscopic cases by allowing comparison of data from specific procedures, locations, equipment, and operators, so that instances that fall outside of reference levels can be identified for further evaluation. The system has also drastically improved identification of operators without documented equipment-specific training. Conclusion: The transition to online fluoroscopy logs has improved efficiency in meeting state regulatory requirements and allowed for identification of particular procedures, equipment, and operators in need of additional attention in order to optimize patient and personnel doses, while high-dose alerts improve patient care and follow-up. Future efforts are focused on incorporating case information from outside of radiology, as well as on automating processes for
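The high-dose notification logic described in the Results can be sketched as a comparison of each logged case against a per-procedure trigger level. The procedure names, trigger values, and field names below are hypothetical assumptions, not the actual system's configuration.

```python
# Sketch: flag logged fluoroscopy cases whose dose metric exceeds the
# per-procedure trigger level, as a basis for automatic notifications.
TRIGGER_LEVELS_GY = {"cardiac_ablation": 5.0, "embolization": 3.0}  # hypothetical

def high_dose_cases(cases):
    """Return cases whose reference air kerma exceeds the procedure trigger."""
    flagged = []
    for case in cases:
        trigger = TRIGGER_LEVELS_GY.get(case["procedure"])
        if trigger is not None and case["air_kerma_gy"] > trigger:
            flagged.append(case)
    return flagged

log = [
    {"id": 1, "procedure": "embolization", "air_kerma_gy": 3.6},
    {"id": 2, "procedure": "cardiac_ablation", "air_kerma_gy": 2.1},
]
print([c["id"] for c in high_dose_cases(log)])  # only case 1 exceeds its trigger
```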

  2. Eruptive history and tectonic setting of Medicine Lake Volcano, a large rear-arc volcano in the southern Cascades

    USGS Publications Warehouse

    Donnelly-Nolan, J. M.; Grove, T.L.; Lanphere, M.A.; Champion, D.E.; Ramsey, D.W.

    2008-01-01

Medicine Lake Volcano (MLV), located in the southern Cascades ~ 55 km east-northeast of contemporaneous Mount Shasta, has been found by exploratory geothermal drilling to have a surprisingly silicic core mantled by mafic lavas. This unexpected result is very different from the long-held view derived from previous mapping of exposed geology that MLV is a dominantly basaltic shield volcano. Detailed mapping shows that < 6% of the ~ 2000 km2 of mapped MLV lavas on this southern Cascade Range shield-shaped edifice are rhyolitic and dacitic, but drill holes on the edifice penetrated more than 30% silicic lava. Argon dating yields ages in the range ~ 475 to 300 ka for early rhyolites. Dates on the stratigraphically lowest mafic lavas at MLV fall into this time frame as well, indicating that volcanism at MLV began about half a million years ago. Mafic compositions apparently did not dominate until ~ 300 ka. Rhyolite eruptions were scarce post-300 ka until late Holocene time. However, a dacite episode at ~ 200 to ~ 180 ka included the volcano's only ash-flow tuff, which was erupted from within the summit caldera. At ~ 100 ka, compositionally distinctive high-Na andesite and minor dacite built most of the present caldera rim. Eruption of these lavas was followed soon after by several large basalt flows, such that the combined area covered by eruptions between 100 ka and postglacial time amounts to nearly two-thirds of the volcano's area. Postglacial eruptive activity was strongly episodic and also covered a disproportionate amount of area. The volcano has erupted 9 times in the past 5200 years, one of the highest rates of late Holocene eruptive activity in the Cascades. Estimated volume of MLV is ~ 600 km3, giving an overall effusion rate of ~ 1.2 km3 per thousand years, although the rate for the past 100 kyr may be only half that. During much of the volcano's history, both dry HAOT (high-alumina olivine tholeiite) and hydrous calcalkaline

  3. Eruptive history and tectonic setting of Medicine Lake Volcano, a large rear-arc volcano in the southern Cascades

    NASA Astrophysics Data System (ADS)

    Donnelly-Nolan, Julie M.; Grove, Timothy L.; Lanphere, Marvin A.; Champion, Duane E.; Ramsey, David W.

    2008-10-01

Medicine Lake Volcano (MLV), located in the southern Cascades ˜ 55 km east-northeast of contemporaneous Mount Shasta, has been found by exploratory geothermal drilling to have a surprisingly silicic core mantled by mafic lavas. This unexpected result is very different from the long-held view derived from previous mapping of exposed geology that MLV is a dominantly basaltic shield volcano. Detailed mapping shows that < 6% of the ˜ 2000 km2 of mapped MLV lavas on this southern Cascade Range shield-shaped edifice are rhyolitic and dacitic, but drill holes on the edifice penetrated more than 30% silicic lava. Argon dating yields ages in the range ˜ 475 to 300 ka for early rhyolites. Dates on the stratigraphically lowest mafic lavas at MLV fall into this time frame as well, indicating that volcanism at MLV began about half a million years ago. Mafic compositions apparently did not dominate until ˜ 300 ka. Rhyolite eruptions were scarce post-300 ka until late Holocene time. However, a dacite episode at ˜ 200 to ˜ 180 ka included the volcano's only ash-flow tuff, which was erupted from within the summit caldera. At ˜ 100 ka, compositionally distinctive high-Na andesite and minor dacite built most of the present caldera rim. Eruption of these lavas was followed soon after by several large basalt flows, such that the combined area covered by eruptions between 100 ka and postglacial time amounts to nearly two-thirds of the volcano's area. Postglacial eruptive activity was strongly episodic and also covered a disproportionate amount of area. The volcano has erupted 9 times in the past 5200 years, one of the highest rates of late Holocene eruptive activity in the Cascades. Estimated volume of MLV is ˜ 600 km3, giving an overall effusion rate of ˜ 1.2 km3 per thousand years, although the rate for the past 100 kyr may be only half that. During much of the volcano's history, both dry HAOT (high-alumina olivine tholeiite) and hydrous calcalkaline basalts erupted

  4. "Just Don't": The Suppression and Invitation of Dialogue in the Mathematics Classroom

    ERIC Educational Resources Information Center

    Wagner, David; Herbel-Eisenmann, Beth

    2008-01-01

    Responding to concerns raised by grade 11 mathematics students, we examined a broad set of mathematics classroom transcripts from multiple teachers to examine how the word "just" was and could be used to suppress and invite dialogue. We used corpus linguistics tools to process and quantify the large body of text, not to describe the nature of the…

  5. A Model for the Social Aspects of Classroom Organization. Final Report.

    ERIC Educational Resources Information Center

    Talavage, Joseph

    An initial effort is made to investigate social aspects of the classroom within a mathematical framework called general system theory. The objective of the study is to set the stage for a theory of social behavior in the large which, when verified, may be employed to guide computer simulations of detailed social situations. A model of a…

  6. The Impact of Course Delivery Systems on Student Achievement and Sense of Community: A Comparison of Learning Community versus Stand-Alone Classroom Settings in an Open-Enrollment Inner City Public Community College

    ERIC Educational Resources Information Center

    Bandyopadhyay, Pamela

    2010-01-01

    This study examined the effects of two types of course delivery systems (learning community classroom environments versus stand-alone classroom environments) on the achievement of students who were simultaneously enrolled in remedial and college-level social science courses at an inner city open-enrollment public community college. This study was…

  7. Small Atomic Orbital Basis Set First-Principles Quantum Chemical Methods for Large Molecular and Periodic Systems: A Critical Analysis of Error Sources.

    PubMed

    Sure, Rebecca; Brandenburg, Jan Gerit; Grimme, Stefan

    2016-04-01

    In quantum chemical computations the combination of Hartree-Fock or a density functional theory (DFT) approximation with relatively small atomic orbital basis sets of double-zeta quality is still widely used, for example, in the popular B3LYP/6-31G* approach. In this Review, we critically analyze the two main sources of error in such computations, that is, the basis set superposition error on the one hand and the missing London dispersion interactions on the other. We review various strategies to correct those errors and present exemplary calculations on mainly noncovalently bound systems of widely varying size. Energies and geometries of small dimers, large supramolecular complexes, and molecular crystals are covered. We conclude that it is not justified to rely on fortunate error compensation, as the main inconsistencies can be cured by modern correction schemes which clearly outperform the plain mean-field methods. PMID:27308221

  8. Social epidemiology of a large outbreak of chickenpox in the Colombian sugar cane producer region: a set theory-based analysis.

    PubMed

    Idrovo, Alvaro J; Albavera-Hernández, Cidronio; Rodríguez-Hernández, Jorge Martín

    2011-07-01

    There are few social epidemiologic studies on chickenpox outbreaks, although previous findings suggested the important role of social determinants. This study describes the context of a large outbreak of chickenpox in the Cauca Valley region, Colombia (2003 to 2007), with an emphasis on macro-determinants. We explored the temporal trends in chickenpox incidence in 42 municipalities to identify the places with higher occurrences. We analyzed municipal characteristics (education quality, vaccination coverage, performance of health care services, violence-related immigration, and area size of planted sugar cane) through analyses based on set theory. Edwards-Venn diagrams were used to present the main findings. The results indicated that three municipalities had higher incidences and that poor quality education was the attribute most prone to a higher incidence. Potential use of set theory for exploratory outbreak analyses is discussed. It is a tool potentially useful to contrast units when only small sample sizes are available.
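The set-theoretic comparison this abstract applies can be sketched directly with Python sets: each candidate determinant becomes the set of municipalities carrying that attribute, which is then intersected with the set of high-incidence municipalities. The municipality names and attribute assignments below are hypothetical, not the study's data.

```python
# Sketch: contrast high-incidence municipalities against candidate attributes
# using set operations, in the spirit of a Venn-diagram analysis.
high_incidence = {"A", "B", "C"}          # municipalities with higher incidence
poor_education = {"A", "B", "C", "D"}     # attribute: poor education quality
low_vaccination = {"B", "E"}              # attribute: low vaccination coverage

shared_edu = high_incidence & poor_education    # every high-incidence town has it
shared_vax = high_incidence & low_vaccination   # only one does
missing_edu = high_incidence - poor_education   # empty: no counterexamples
print(sorted(shared_edu), sorted(shared_vax), sorted(missing_edu))
```

In this toy data, poor education quality covers all high-incidence municipalities with no counterexamples, mirroring how the study identified it as the attribute most associated with higher incidence.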

  9. Towards Perceptual Interface for Visualization Navigation of Large Data Sets Using Gesture Recognition with Bezier Curves and Registered 3-D Data

    SciTech Connect

    Shin, M C; Tsap, L V; Goldgof, D B

    2003-03-20

    This paper presents a gesture recognition system for visualization navigation. Scientists are interested in developing interactive settings for exploring large data sets in an intuitive environment. The input consists of registered 3-D data. A geometric method using Bezier curves is used for the trajectory analysis and classification of gestures. The hand gesture speed is incorporated into the algorithm to enable correct recognition from trajectories with variations in hand speed. The method is robust and reliable: correct hand identification rate is 99.9% (from 1641 frames), modes of hand movements are correct 95.6% of the time, recognition rate (given the right mode) is 97.9%. An application to gesture-controlled visualization of 3D bioinformatics data is also presented.
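The Bezier-curve trajectory analysis mentioned above rests on summarizing a hand path with a cubic curve. A minimal sketch of the underlying curve evaluation (de Casteljau's algorithm) is below; the control points are hypothetical and the full classification step of the paper is not reproduced.

```python
# Sketch: evaluate a cubic Bezier curve, the primitive used to model and
# compare hand-gesture trajectories.
def bezier_point(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier at parameter t in [0, 1] (de Casteljau)."""
    lerp = lambda a, b, u: tuple(ai + (bi - ai) * u for ai, bi in zip(a, b))
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

ctrl = ((0, 0), (1, 2), (3, 2), (4, 0))  # hypothetical "arc" gesture
print(bezier_point(*ctrl, 0.0))  # (0.0, 0.0): curve starts at the first control point
print(bezier_point(*ctrl, 1.0))  # (4.0, 0.0): and ends at the last
```

Fitting such a curve to tracked hand positions yields four control points per gesture, a compact feature vector that can then be classified.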

  11. Small Atomic Orbital Basis Set First‐Principles Quantum Chemical Methods for Large Molecular and Periodic Systems: A Critical Analysis of Error Sources

    PubMed Central

    Sure, Rebecca; Brandenburg, Jan Gerit

    2015-01-01

    Abstract In quantum chemical computations the combination of Hartree–Fock or a density functional theory (DFT) approximation with relatively small atomic orbital basis sets of double‐zeta quality is still widely used, for example, in the popular B3LYP/6‐31G* approach. In this Review, we critically analyze the two main sources of error in such computations, that is, the basis set superposition error on the one hand and the missing London dispersion interactions on the other. We review various strategies to correct those errors and present exemplary calculations on mainly noncovalently bound systems of widely varying size. Energies and geometries of small dimers, large supramolecular complexes, and molecular crystals are covered. We conclude that it is not justified to rely on fortunate error compensation, as the main inconsistencies can be cured by modern correction schemes which clearly outperform the plain mean‐field methods. PMID:27308221

  12. A success story: A large urban district offers a working model for implementing multisensory teaching into the resource and regular classroom.

    PubMed

    Hutcheson, L; Selig, H; Young, N

    1990-01-01

A large urban school district contracted with a private nonprofit educational foundation to train 126 special education resource teachers in the last three years in an Orton-Gillingham-based program. These teachers are currently teaching learning-disabled students in groups of 8-10 at the elementary level and 10-13 students at the secondary level. Learning-disabled students who qualify for Special Education, either in reading or spelling, or both, are receiving the instruction. The teachers took a Basic Introductory Class (90 hours of Advanced Academic Credit offered by the Texas Education Agency, or six hours of graduate credit at a local university) in order to teach the program in the resource setting. A two-year Advanced Training included annual on-site observations, two half-day workshops each fall and spring, and a two-day advanced workshop in the second summer. First grade teachers, one selected from each of the 164 campuses, supervisors, and principals attended a 25-hour course on "Recognizing Dyslexia: Using Multisensory Teaching and Discovery Techniques." The first grade teachers and special education resource teachers collaborated to provide in-service training for their colleagues. Research, conducted by the district's Research Department, reveals statistically significant gains in reading and spelling ability for the learning-disabled resource students as measured by the Woodcock Reading Mastery Test-Revised and the Test of Written Spelling.

  13. Tectonic stress inversion of large multi-phase fracture data sets: application of Win-Tensor to reveal the brittle tectonic history of the Lufilan Arc, DRC

    NASA Astrophysics Data System (ADS)

    Delvaux, Damien; Kipata, Louis; Sintubin, Manuel

    2013-04-01

    Large fault-slip data sets from multiphase orogenic regions present a particular challenge in paleostress reconstructions. The Lufilian Arc is an arcuate fold-and-thrust belt that formed during late Pan-African times as the result of the combined N-S and E-W amalgamation of Gondwana in SE DR Congo and northern Zambia. We studied more than 22 sites in the Lufilian Arc and its foreland, and correlated the results with existing results from the Ubende belt of western Tanzania. Most of the sites studied are characterized by multiphase brittle deformation in which the observed brittle structures result from progressive saturation of the host rock by newly formed fractures and the reactivation of earlier fractures. The sites correspond to large mining operations with multiple large, continuous outcrops, which yield data sets large enough to be statistically significant and often record several successive brittle events. In this context, the reconstruction of tectonic stress requires an initial field-based separation of the data, completed by a dynamic separation of the original data set into subsets. At the largest sites, several parts of the deposit were measured independently and treated as sub-sites that are processed separately in an initial stage. The procedure used for interactive fault-slip data separation and stress inversion is illustrated by field examples (the Luiswishi and Manono mining sites). Applying this approach to all sites resulted in a reconstruction of the brittle tectonic history of the region, starting with two major phases of orogenic compression, followed by late-orogenic extension and extensional collapse. A regional tectonic inversion during the early Mesozoic, driven by far-field stresses, marks the transition towards rift-related extension.
    More details in Kipata, Delvaux et al. (2013), Geologica Belgica 16/1-2: 001-017. Win-Tensor can be downloaded at: http://users.skynet.be/damien.delvaux/Tensor/tensor-index.html
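
The inversion step rests on the Wallace-Bott assumption: on each fault plane, slip occurs parallel to the resolved shear stress, so a candidate stress tensor is scored by the misfit angle between predicted shear direction and the observed slickenline. The sketch below illustrates only that core computation; it is not Win-Tensor's implementation, and all names are illustrative.

```python
import math

def normalize(v):
    """Return v scaled to unit length."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def resolved_shear(stress, normal):
    """Direction of the shear stress on a plane with unit normal `normal`:
    the traction vector minus its component along the normal (Wallace-Bott)."""
    t = [sum(stress[i][j] * normal[j] for j in range(3)) for i in range(3)]
    tn = sum(t[i] * normal[i] for i in range(3))          # normal traction
    shear = [t[i] - tn * normal[i] for i in range(3)]     # shear traction
    return normalize(shear)

def misfit_angle(stress, normal, slip):
    """Angle (degrees) between the predicted shear direction and the
    observed slip direction; small angles mean the tensor fits the fault."""
    s = resolved_shear(stress, normalize(normal))
    u = normalize(slip)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(s, u))))
    return math.degrees(math.acos(dot))
```

A stress inversion then searches for the reduced stress tensor minimizing these misfit angles over a (sub)set of fault-slip data, which is also how dynamically separated subsets are scored.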

  14. Collaborative Classroom Management. Video to Accompany "A Biological Brain in a Cultural Classroom: Applying Biological Research to Classroom Management." [Videotape].

    ERIC Educational Resources Information Center

    2001

    This 43-minute VHS videotape is designed to be used in course and workshop settings with "A Biological Brain in a Cultural Classroom: Applying Biological Research to Classroom Management." The videotape's principal values are as an introduction to the issues explored in the book and as a catalyst for group discussions and activities related to…

  15. Thermal comfort in tropical classrooms

    SciTech Connect

    Kwok, A.G.

    1998-10-01

    This paper examines the comfort criteria of ANSI/ASHRAE Standard 55-1992 for their applicability in tropical classrooms. A field study conducted in Hawaii used a variety of methods to collect the data: survey questionnaires, physical measurements, interviews, and behavioral observations. A total of 3,544 students and teachers completed questionnaires in 29 naturally ventilated and air-conditioned classrooms in six schools during two seasons. The majority of classrooms failed to meet the physical specifications of the Standard 55 comfort zone. Thermal neutrality, preference, and acceptability results are compared with other field studies and the Standard 55 criteria. Acceptability votes by occupants of both naturally ventilated and air-conditioned classrooms exceeded the standard's 80% acceptability criterion, regardless of whether physical conditions were in or out of the comfort zone. Responses from these two school populations suggest not only a basis for separate comfort standards but also energy conservation opportunities through raising thermostat set points.

  16. A geometrical correction for the inter- and intra-molecular basis set superposition error in Hartree-Fock and density functional theory calculations for large systems.

    PubMed

    Kruse, Holger; Grimme, Stefan

    2012-04-21

    chemistry yields MAD=0.68 kcal/mol, which represents a huge improvement over plain B3LYP/6-31G* (MAD=2.3 kcal/mol). Application of gCP-corrected B97-D3 and HF-D3 to a set of large protein-ligand complexes proves the robustness of the method. Analytical gCP gradients make optimizations of large systems feasible with small basis sets, as demonstrated for the inter-ring distances of 9-helicene and most of the complexes in Hobza's S22 test set. The method is implemented in a freely available FORTRAN program obtainable from the authors' website. PMID:22519309

  17. Automated quantitative dose-response modeling and point of departure determination for large toxicogenomic and high-throughput screening data sets.

    PubMed

    Burgoon, Lyle D; Zacharewski, Timothy R

    2008-08-01

    Regulatory and homeland security agencies undertake safety and risk assessments to assess the potential hazards of radiation, chemical, biological, and pharmaceutical agents. By law, these assessments must be science-based to ensure public safety and environmental quality. These agencies use dose-response modeling and benchmark dose methods to identify points of departure across single end points elicited by the agent. Regulatory agencies have also begun to examine toxicogenomic data to identify novel biomarkers of exposure and assess potential toxicity. The ToxResponse Modeler streamlines analyses and point of departure (POD) calculations across hundreds of responses (e.g., differential gene expression, changes in metabolite levels) through an automated process capable of large-scale modeling and model selection. The application identifies the best-fit dose-response model utilizing particle swarm optimization and calculates the probabilistic POD. The application analyzed a publicly available 2,3,7,8-tetrachlorodibenzo-p-dioxin dose-response data set of hepatic gene expression data in C57BL/6 mice to identify putative biomarkers. The Gene Ontology mapped these responses to specific functions to differentiate adaptive effects from toxic responses. In principle, safety and risk assessors could use the automated ToxResponse Modeler to analyze any large dose-response data set including outputs from high-throughput screening assays to assist with the ranking and prioritization of compounds that warrant further investigation or development.
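
The pipeline described here, fitting a dose-response model by particle swarm optimization (PSO) and deriving a point of departure (POD), can be sketched in miniature. This is not the ToxResponse Modeler's code: the Hill model, the PSO constants, and the benchmark-response POD definition below are illustrative assumptions.

```python
import random
import math

def hill(dose, bottom, top, ec50, n):
    """Four-parameter Hill dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / max(dose, 1e-12)) ** n)

def sse(params, doses, responses):
    """Sum of squared errors of a Hill fit against observed responses."""
    bottom, top, ec50, n = params
    return sum((hill(d, bottom, top, ec50, n) - r) ** 2
               for d, r in zip(doses, responses))

def pso_fit(doses, responses, bounds, n_particles=40, n_iter=200, seed=0):
    """Minimal particle swarm optimizer minimizing sse() within `bounds`."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [sse(p, doses, responses) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = sse(pos[i], doses, responses)
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest

def benchmark_dose(params, bmr=0.1):
    """A simple POD: the dose where the response rises a fraction `bmr`
    of the dynamic range above the bottom plateau (inverted Hill equation)."""
    bottom, top, ec50, n = params
    return ec50 * (bmr / (1.0 - bmr)) ** (1.0 / n)
```

Run per response (gene, metabolite) across the dose series; the best-fit model's benchmark dose then serves as that end point's POD.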

  18. River Modeling in Large and Ungauged Basins: Experience of Setting up the HEC RAS Model over the Ganges-Brahmaputra-Meghna Basins

    NASA Astrophysics Data System (ADS)

    Hossain, F.; Maswood, M.

    2014-12-01

    River modeling is the process of setting up a physically-based hydrodynamic model that can simulate the water flow dynamics of a stream network against time-varying boundary conditions. Such river models are an important component of any flood forecasting system that forecasts river levels in flood-prone regions. However, many large river basins in the developing world, such as the Ganges, Brahmaputra, Meghna (GBM), Indus, Irrawaddy, Salween, Mekong and Niger, are mostly ungauged. Such large basins lack the necessary in-situ measurements of river bed depth/slope, bathymetry (river cross section), floodplain mapping and boundary condition flows for forcing a river model. For such basins, proxy approaches relying mostly on remote sensing data from space platforms are the only alternative. In this study, we share our experience of setting up the widely-used 1-D river model over the entire GBM basin and its stream network. Good-quality in-situ measurements of river hydraulics (cross section, slope, flow) were available only for the downstream and flood-prone region of the basin, which comprises only 7% of the basin area. For the remaining 93% of the basin area, we resorted to data from the following satellite sensors to build a workable river model: a) Shuttle Radar Topography Mission (SRTM) for deriving bed slope; b) LANDSAT/MODIS for updating the river network and flow directions generated from elevation data; c) radar altimetry data to build depth-versus-width relationships at river locations; d) satellite-precipitation-based hydrologic modeling of lateral flows into main-stem rivers. In addition, we referred to an extensive body of literature to estimate the prevailing baseline hydraulics of rivers in the ungauged region. We measured the success of our approach by systematically testing how well the basin-wide river model could simulate river level dynamics at two measured locations inside Bangladesh. Our experience of river modeling was replete with numerous

  19. Classroom Tech

    ERIC Educational Resources Information Center

    Instructor, 2006

    2006-01-01

    This article features the latest classroom technologies namely the FLY Pentop, WriteToLearn, and a new iris scan identification system. The FLY Pentop is a computerized pen from Leapster that "magically" understands what kids write and draw on special FLY paper. WriteToLearn is an automatic grading software from Pearson Knowledge Technologies and…

  20. Smart Classroom

    ERIC Educational Resources Information Center

    Kelly, Rhea, Ed.

    2006-01-01

    What makes a classroom "smart"? Presentation technologies such as projectors, document cameras, and LCD panels clearly fit the bill, but when considering other technologies for teaching, learning, and developing content, the possibilities become limited only by the boundaries of an institution's innovation. This article presents 32 best practices…

  1. Classroom Notes

    ERIC Educational Resources Information Center

    International Journal of Mathematical Education in Science and Technology, 2007

    2007-01-01

    In this issue's "Classroom Notes" section, the following papers are discussed: (1) "Constructing a line segment whose length is equal to the measure of a given angle" (W. Jacob and T. J. Osler); (2) "Generating functions for the powers of Fibonacci sequences" (D. Terrana and H. Chen); (3) "Evaluation of mean and variance integrals without…

  2. Classroom Tips.

    ERIC Educational Resources Information Center

    Stevens, Jacqueline; And Others

    1993-01-01

    Describes five classroom activities or projects used in Canadian social studies classes. Includes discussions of the use of artifacts, a field trip to Spain, a simulation of the Earth Summit meeting, and the application of Mahatma Gandhi's philosophy to current problems. (CFR)

  3. Classroom Notes

    ERIC Educational Resources Information Center

    International Journal of Mathematical Education in Science and Technology, 2007

    2007-01-01

    In this issue's "Classroom Notes" section, the following papers are described: (1) "Sequences of Definite Integrals" by T. Dana-Picard; (2) "Structural Analysis of Pythagorean Monoids" by M.-Q Zhan and J. Tong; (3) "A Random Walk Phenomenon under an Interesting Stopping Rule" by S. Chakraborty; (4) "On Some Confidence Intervals for Estimating the…

  4. Group Goal Setting

    ERIC Educational Resources Information Center

    Sparks, Dennis C.

    1978-01-01

    Action goal setting uses the power of peer influence in a healthy and constructive manner, and provides appropriate follow-up for many counseling and classroom activities. This process could help individuals of all ages take more control over their behavior and create life-styles congruent with their abilities, interests, and values. (Author)

  5. Expanding Knowledge: From the Classroom into Cyberspace

    ERIC Educational Resources Information Center

    Barbas, Maria Potes Santa-Clara

    2006-01-01

    This paper is part of a larger project in the area of research. The main purpose of this mediated discourse was to implement, observe and analyse experiences of teachers in a training project developed for two different settings in the classroom. The first was between international classrooms through cyberspace and the second was a cyberspace…

  6. Should Supervisors Intervene during Classroom Visits?

    ERIC Educational Resources Information Center

    Marshall, Kim

    2015-01-01

    Real-time coaching has become the go-to supervisory model in some schools (especially charters), with supervisors routinely jumping in during teacher observations and sometimes taking over the class to model a more effective approach. The author sets out goals and guidelines for impromptu classroom visits that include visiting each classroom at…

  7. The Inclusive Classroom: How Inclusive Is Inclusion?

    ERIC Educational Resources Information Center

    Reid, Claudette M.

    2010-01-01

    This paper presents the position that inclusion is limited; inclusion does not go far enough. The inclusive classroom has been assessed to be of benefit to both the teacher and the student. There are, however, limits set on inclusion. In most classrooms only children with learning disabilities are included, omitting those with severe disabilities,…

  8. Enhancing Vocabulary Development in Multiple Classroom Contexts.

    ERIC Educational Resources Information Center

    Harmon, Janis M.; Staton, Denise G.

    1999-01-01

    Describes ways teachers can enhance students' vocabulary development through multiple contexts available in typical middle school classroom settings. Addresses questions about vocabulary learning and offers suggestions for enhancing vocabulary with narrative and expository texts that involve multiple classroom contexts. Considers the Vocab-o-gram…

  9. mzDB: a file format using multiple indexing strategies for the efficient analysis of large LC-MS/MS and SWATH-MS data sets.

    PubMed

    Bouyssié, David; Dubois, Marc; Nasso, Sara; Gonzalez de Peredo, Anne; Burlet-Schiltz, Odile; Aebersold, Ruedi; Monsarrat, Bernard

    2015-03-01

    The analysis and management of MS data, especially those generated by data-independent MS acquisition, exemplified by SWATH-MS, pose significant challenges for proteomics bioinformatics. The large size and vast amount of information inherent to these data sets need to be properly structured to enable an efficient and straightforward extraction of the signals used to identify specific target peptides. Standard XML-based formats are not well suited to large MS data files, for example, those generated by SWATH-MS, and compromise high-throughput data processing and storing. We developed mzDB, an efficient file format for large MS data sets. It relies on the SQLite software library and consists of a standardized and portable server-less single-file database. An optimized 3D indexing approach is adopted, where the LC-MS coordinates (retention time and m/z), along with the precursor m/z for SWATH-MS data, are used to query the database for data extraction. In comparison with XML formats, mzDB saves ∼25% of storage space and improves access times from twofold up to 2000-fold, depending on the particular data access. Similarly, mzDB also shows slightly to significantly lower access times in comparison with other formats like mz5. Both C++ and Java implementations, converting raw or XML formats to mzDB and providing access methods, will be released under a permissive license. mzDB can be easily accessed by the SQLite C library and its drivers for all major languages, and browsed with existing dedicated GUIs. The mzDB described here can boost existing mass spectrometry data analysis pipelines, offering unprecedented performance in terms of efficiency, portability, compactness, and flexibility.
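
The single-file, indexed-query idea behind mzDB can be illustrated with Python's built-in sqlite3 module. mzDB's real schema is considerably more elaborate (bounding-box data slices, 3D indexing including precursor m/z); the table and column names below are invented for the sketch.

```python
import sqlite3

# A miniature LC-MS peak store: one SQLite database holding (rt, m/z,
# intensity) triples, with a composite index so window queries are fast.
con = sqlite3.connect(":memory:")   # a file path would give a portable single-file DB
con.execute("""
    CREATE TABLE peak (
        rt  REAL,   -- retention time, seconds
        mz  REAL,   -- mass-to-charge ratio
        ity REAL    -- intensity
    )""")
con.execute("CREATE INDEX idx_rt_mz ON peak(rt, mz)")

peaks = [(10.0, 500.1, 1e4), (10.5, 500.2, 2e4),
         (11.0, 730.4, 5e3), (60.0, 500.15, 8e3)]
con.executemany("INSERT INTO peak VALUES (?, ?, ?)", peaks)

def extract_xic(con, rt_lo, rt_hi, mz_lo, mz_hi):
    """Extract all peaks inside a retention-time x m/z window."""
    cur = con.execute(
        "SELECT rt, mz, ity FROM peak "
        "WHERE rt BETWEEN ? AND ? AND mz BETWEEN ? AND ? ORDER BY rt",
        (rt_lo, rt_hi, mz_lo, mz_hi))
    return cur.fetchall()

window = extract_xic(con, 9.0, 12.0, 500.0, 500.3)
```

Because SQLite is server-less and single-file, such a database is portable across languages via the SQLite drivers, which is the property the format exploits.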

  10. mzDB: A File Format Using Multiple Indexing Strategies for the Efficient Analysis of Large LC-MS/MS and SWATH-MS Data Sets*

    PubMed Central

    Bouyssié, David; Dubois, Marc; Nasso, Sara; Gonzalez de Peredo, Anne; Burlet-Schiltz, Odile; Aebersold, Ruedi; Monsarrat, Bernard

    2015-01-01

    The analysis and management of MS data, especially those generated by data-independent MS acquisition, exemplified by SWATH-MS, pose significant challenges for proteomics bioinformatics. The large size and vast amount of information inherent to these data sets need to be properly structured to enable an efficient and straightforward extraction of the signals used to identify specific target peptides. Standard XML-based formats are not well suited to large MS data files, for example, those generated by SWATH-MS, and compromise high-throughput data processing and storing. We developed mzDB, an efficient file format for large MS data sets. It relies on the SQLite software library and consists of a standardized and portable server-less single-file database. An optimized 3D indexing approach is adopted, where the LC-MS coordinates (retention time and m/z), along with the precursor m/z for SWATH-MS data, are used to query the database for data extraction. In comparison with XML formats, mzDB saves ∼25% of storage space and improves access times from twofold up to 2000-fold, depending on the particular data access. Similarly, mzDB also shows slightly to significantly lower access times in comparison with other formats like mz5. Both C++ and Java implementations, converting raw or XML formats to mzDB and providing access methods, will be released under a permissive license. mzDB can be easily accessed by the SQLite C library and its drivers for all major languages, and browsed with existing dedicated GUIs. The mzDB described here can boost existing mass spectrometry data analysis pipelines, offering unprecedented performance in terms of efficiency, portability, compactness, and flexibility. PMID:25505153

  11. Portraits of Whole Language Classrooms: Learning for All Ages.

    ERIC Educational Resources Information Center

    Mills, Heidi, Ed.; Clyde, Jean Anne, Ed.

    Highlighting typical days in a variety of whole-language classrooms, this book describes learners of all ages, beginning with a home day-care setting through preschool programs and elementary classrooms to a junior high and high school. The book also describes a special education site and an English-as-a-Second Language classroom, and concludes in…

  12. Systemize Classroom Management to Enhance Teaching and Learning

    ERIC Educational Resources Information Center

    Delman, Douglas J.

    2011-01-01

    Good classroom management is one of the most important goals teachers strive to establish from the first day of class. The rules, procedures, activities, and behaviors set the classroom tone throughout the school year. By revising, updating, and systemizing classroom management activities, teachers can eliminate many problems created by students…

  13. Environmentally Enriched Classrooms and the Development of Disadvantaged Preschool Children.

    ERIC Educational Resources Information Center

    Busse, Thomas V.; And Others

    This study evaluates the effects of placement of additional equipment in preschool classrooms on the cognitive, perceptual, and social development of urban Negro four-year-old children. Two Get Set classrooms in each of six areas of Philadelphia were paired for teachers, subjects, physical facilities and equipment. One classroom in each pair was…

  14. Improving the Teacher's Awareness of Nonverbal Communication in the Classroom.

    ERIC Educational Resources Information Center

    Kachur, Donald; And Others

    The emphasis in this paper is on developing teacher awareness of how nonverbal communication fits into the classroom setting. Various positive and negative aspects of this phase of communication in the classroom are explored. A classroom teacher is observed closely by students every day, and her/his attitude, feelings, mood or state of mind,…

  15. Practical Classroom Applications of Language Experience: Looking Back, Looking Forward.

    ERIC Educational Resources Information Center

    Nelson, Olga G., Ed.; Linek, Wayne M., Ed.

    The 38 essays in this book look back at language experience as an educational approach, provide practical classroom applications, and reconceptualize language experience as an overarching education process. Classroom teachers and reading specialists describe strategies in use in a variety of classroom settings and describe ways to integrate…

  16. The Social Context of Urban Classrooms: Measuring Student Psychological Climate

    ERIC Educational Resources Information Center

    Frazier, Stacy L.; Mehta, Tara G.; Atkins, Marc S.; Glisson, Charles; Green, Philip D.; Gibbons, Robert D.; Kim, Jong Bae; Chapman, Jason E.; Schoenwald, Sonja K.; Cua, Grace; Ogle, Robert R.

    2015-01-01

    Classrooms are unique and complex work settings in which teachers and students both participate in and contribute to classroom processes. This article describes the measurement phase of a study that examined the social ecology of urban classrooms. Informed by the dimensions and items of an established measure of organizational climate, we designed…

  17. Multilingual Label Quests: A Practice for the "Asymmetrical" Multilingual Classroom

    ERIC Educational Resources Information Center

    Bonacina-Pugh, Florence

    2013-01-01

    Research on multilingual classrooms usually focuses on contexts where both teachers and pupils share the same linguistic repertoire; what can be called "symmetrical" multilingual classrooms. This paper sets out to investigate whether (and how) pupils' multilingual resources can be used in classrooms where the teacher does not share pupils'…

  18. Examining the large-scale convergence of photosynthesis-weighted tree leaf temperatures through stable oxygen isotope analysis of multiple data sets.

    PubMed

    Song, Xin; Barbour, Margaret M; Saurer, Matthias; Helliker, Brent R

    2011-12-01

    The idea that photosynthesis-weighted tree canopy leaf temperature (T(canδ)) can be resolved through analysis of the oxygen isotope composition of tree wood cellulose (δ(18)O(wc)) has led to the observation of boreal-to-subtropical convergence of T(canδ) to c. 20°C. To further assess the validity of this large-scale convergence, we used the isotope approach to calculate T(canδ) for independent δ(18)O(wc) data sets with broad climatic coverage. For the boreal-to-subtropical data sets, we found that the deviation of T(canδ) from the growing season temperature systematically increases with decreasing mean annual temperature. Across the whole data set we calculated a mean T(canδ) of 19.48°C and an SD of 2.05°C, while for the tropical data set the mean T(canδ) was 26.40 ± 1.03°C, significantly higher than the boreal-to-subtropical mean. Our study thus offers independent isotopic support for the concept that boreal-to-subtropical trees display a conserved T(canδ) near 20°C. The isotopic analysis cannot distinguish between the possibility that leaf temperatures are generally elevated above ambient air temperature in cooler environments and the possibility that leaf temperature equals air temperature while the leaf/air temperature at which photosynthesis occurs has a weighted average near 20°C in cooler environments. Future work will separate these potential explanations. PMID:21899555

  19. Resistance to disruption in a classroom setting.

    PubMed

    Parry-Cruwys, Diana E; Neal, Carrie M; Ahearn, William H; Wheeler, Emily E; Premchander, Raseeka; Loeb, Melissa B; Dube, William V

    2011-01-01

    Substantial experimental evidence indicates that behavior reinforced on a denser schedule is more resistant to disruption than is behavior reinforced on a thinner schedule. The present experiment studied resistance to disruption in a natural educational environment. Responding during familiar activities was reinforced on a multiple variable-interval (VI) 7-s VI 30-s schedule for 6 participants with developmental disabilities. Resistance to disruption was measured by presenting a distracting item. Response rates in the disruption components were compared to within-session response rates in prior baseline components. Results were consistent with the predictions of behavioral momentum theory for 5 of 6 participants.

  20. Pivotal Response Teaching in the Classroom Setting

    ERIC Educational Resources Information Center

    Stahmer, Aubyn C.; Suhrheinrich, Jessica; Reed, Sarah; Bolduc, Cynthia; Schreibman, Laura

    2010-01-01

    Pivotal response teaching (PRT) is an empirically supported naturalistic behavioral intervention proven to be efficacious in the education of children with autism. This intervention involves loosely structured learning environments, teaching during ongoing interactions between student and teacher, child initiation of teaching episodes, child…

  1. Resistance to disruption in a classroom setting.

    PubMed

    Parry-Cruwys, Diana E; Neal, Carrie M; Ahearn, William H; Wheeler, Emily E; Premchander, Raseeka; Loeb, Melissa B; Dube, William V

    2011-01-01

    Substantial experimental evidence indicates that behavior reinforced on a denser schedule is more resistant to disruption than is behavior reinforced on a thinner schedule. The present experiment studied resistance to disruption in a natural educational environment. Responding during familiar activities was reinforced on a multiple variable-interval (VI) 7-s VI 30-s schedule for 6 participants with developmental disabilities. Resistance to disruption was measured by presenting a distracting item. Response rates in the disruption components were compared to within-session response rates in prior baseline components. Results were consistent with the predictions of behavioral momentum theory for 5 of 6 participants. PMID:21709794

  2. Resistance to Disruption in a Classroom Setting

    ERIC Educational Resources Information Center

    Parry-Cruwys, Diana E.; Neal, Carrie M.; Ahearn, William H.; Wheeler, Emily E.; Premchander, Raseeka; Loeb, Melissa B.; Dube, William V.

    2011-01-01

    Substantial experimental evidence indicates that behavior reinforced on a denser schedule is more resistant to disruption than is behavior reinforced on a thinner schedule. The present experiment studied resistance to disruption in a natural educational environment. Responding during familiar activities was reinforced on a multiple…

  3. Promoting Active Involvement in Classrooms

    ERIC Educational Resources Information Center

    Conderman, Greg; Bresnahan, Val; Hedin, Laura

    2012-01-01

    This article presents a rationale for using active involvement techniques, describes large- and small-group methods based on their documented effectiveness and applicability to K-12 classrooms, and illustrates their use. These approaches include ways of engaging students in large groups (e.g., unison responses, response cards, dry-erase boards,…

  4. Nurturing Mathematical Promise in a Regular Elementary Classroom: Exploring the Role of the Teacher and Classroom Environment

    ERIC Educational Resources Information Center

    Dimitriadis, Christos

    2016-01-01

    This article presents findings from a case study of an in-classroom program based on ability grouping for Year 2 (ages 6-7) primary (elementary) children identified as high ability in mathematics. The study examined the role of classroom setting, classroom environment, and teacher's approach in realizing and developing mathematical promise. The…

  5. Multilevel and Diverse Classrooms

    ERIC Educational Resources Information Center

    Baurain, Bradley, Ed.; Ha, Phan Le, Ed.

    2010-01-01

    The benefits and advantages of classroom practices incorporating unity-in-diversity and diversity-in-unity are what "Multilevel and Diverse Classrooms" is all about. Multilevel classrooms--also known as mixed-ability or heterogeneous classrooms--are a fact of life in ESOL programs around the world. These classrooms are often not only multilevel…

  6. A large proportion of asymptomatic Plasmodium infections with low and sub-microscopic parasite densities in the low transmission setting of Temotu Province, Solomon Islands: challenges for malaria diagnostics in an elimination setting

    PubMed Central

    2010-01-01

    Background Many countries are scaling up malaria interventions towards elimination. This transition changes the demands on malaria diagnostics from diagnosing ill patients to detecting parasites in all carriers, including asymptomatic infections and infections with low parasite densities. Detection methods suited to the local malaria epidemiology must be selected prior to transitioning a malaria control programme to elimination. A baseline malaria survey conducted in Temotu Province, Solomon Islands in late 2008, as the first step in a provincial malaria elimination programme, provided malaria epidemiology data and an opportunity to assess how well different diagnostic methods performed in this setting. Methods During the survey, 9,491 blood samples were collected and examined by microscopy for Plasmodium species and density, with a subset also examined by polymerase chain reaction (PCR) and rapid diagnostic tests (RDTs). The performances of these diagnostic methods were compared. Results A total of 256 samples were positive by microscopy, giving a point prevalence of 2.7%. The species distribution was 17.5% Plasmodium falciparum and 82.4% Plasmodium vivax. In this low transmission setting, only 17.8% of P. falciparum-infected and 2.9% of P. vivax-infected subjects were febrile (≥38°C) at the time of the survey. A significant proportion of infections detected by microscopy, 40% and 65.6% for P. falciparum and P. vivax respectively, had parasite densities below 100/μL. There was an age correlation for the proportion of parasite densities below 100/μL for P. vivax infections, but not for P. falciparum infections. PCR detected substantially more infections than microscopy (point prevalence of 8.71%), indicating that a large number of subjects had sub-microscopic parasitemia. The concordance between PCR and microscopy in detecting single species was greater for P. vivax (135/162) than for P. falciparum (36/118). The malaria RDT detected the 12 microscopy and PCR positive P
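
The survey's headline statistics, point prevalence and between-method concordance, are simple arithmetic over the sample counts; a minimal sketch (function names are illustrative, not from the study):

```python
def point_prevalence(positives, total):
    """Point prevalence as a percentage of samples testing positive."""
    return 100.0 * positives / total

def method_concordance(paired):
    """Fraction of samples on which two diagnostic methods agree.
    `paired` holds (method_a_positive, method_b_positive) booleans."""
    agree = sum(1 for a, b in paired if a == b)
    return agree / len(paired)

# The survey's microscopy figure: 256 positives among 9,491 samples -> ~2.7%.
prevalence = point_prevalence(256, 9491)
```

In an elimination setting the interesting discordant cell is PCR-positive/microscopy-negative, since it quantifies the sub-microscopic reservoir the abstract highlights.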

  7. Integrated QSPR models to predict the soil sorption coefficient for a large diverse set of compounds by using different modeling methods

    NASA Astrophysics Data System (ADS)

    Shao, Yonghua; Liu, Jining; Wang, Meixia; Shi, Lili; Yao, Xiaojun; Gramatica, Paola

    2014-05-01

    The soil sorption coefficient (Koc) is a key physicochemical parameter for assessing the environmental risk of organic compounds. To predict the soil sorption coefficient in a more effective and economical way, quantitative structure-property relationship (QSPR) models were developed based on a large, diverse dataset of 964 non-ionic organic compounds. Multiple linear regression (MLR), local lazy regression (LLR) and least squares support vector machine (LS-SVM) were used to develop QSPR models based on the four most relevant theoretical molecular descriptors selected by the genetic algorithms-variable subset selection (GA-VSS) procedure. The QSPR development strictly followed the OECD principles for QSPR model validation, so great attention was paid to internal and external validation, the applicability domain and mechanistic interpretation. The obtained results indicate that the LS-SVM model performed better than the MLR and LLR models. For the best LS-SVM model, the correlation coefficient (R2) for the training set was 0.913 and the concordance correlation coefficient (CCC) for the prediction set was 0.917. The root-mean-square errors (RMSE) were 0.330 and 0.426, respectively. The results of internal and external validation, together with the applicability domain analysis, indicate that the QSPR models proposed in our work are predictive and could provide a useful tool for predicting the soil sorption coefficient of new compounds.
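
The validation statistics quoted (CCC and RMSE) are standard and easy to reproduce; a minimal sketch, assuming Lin's definition of the concordance correlation coefficient:

```python
import math

def rmse(obs, pred):
    """Root-mean-square error between observed and predicted values."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def ccc(obs, pred):
    """Lin's concordance correlation coefficient: 1 only when predictions
    match observations exactly; penalizes both scatter and location/scale shift."""
    n = len(obs)
    mo = sum(obs) / n
    mp = sum(pred) / n
    vo = sum((o - mo) ** 2 for o in obs) / n
    vp = sum((p - mp) ** 2 for p in pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred)) / n
    return 2.0 * cov / (vo + vp + (mo - mp) ** 2)
```

Unlike R2, CCC drops below 1 for a systematically biased model even when the correlation is perfect, which is why OECD-style external validation favors it.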

  8. i-ADHoRe 3.0—fast and sensitive detection of genomic homology in extremely large data sets

    PubMed Central

    Proost, Sebastian; Fostier, Jan; De Witte, Dieter; Dhoedt, Bart; Demeester, Piet; Van de Peer, Yves; Vandepoele, Klaas

    2012-01-01

    Comparative genomics is a powerful means to gain insight into the evolutionary processes that shape the genomes of related species. As the number of sequenced genomes increases, the development of software to perform accurate cross-species analyses becomes indispensable. However, many implementations that have the ability to compare multiple genomes exhibit unfavorable computational and memory requirements, limiting the number of genomes that can be analyzed in one run. Here, we present a software package to unveil genomic homology based on the identification of conservation of gene content and gene order (collinearity), i-ADHoRe 3.0, and its application to eukaryotic genomes. The use of efficient algorithms and support for parallel computing enable the analysis of large-scale data sets. Unlike other tools, i-ADHoRe can process the Ensembl data set, containing 49 species, in 1 h. Furthermore, the profile search is more sensitive to detect degenerate genomic homology than chaining pairwise collinearity information based on transitive homology. From ultra-conserved collinear regions between mammals and birds, by integrating coexpression information and protein–protein interactions, we identified more than 400 regions in the human genome showing significant functional coherence. The different algorithmical improvements ensure that i-ADHoRe 3.0 will remain a powerful tool to study genome evolution. PMID:22102584
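
At its core, collinearity detection chains homologous gene pairs whose order is conserved between two genomes. A minimal single-chain sketch via longest increasing subsequence follows; i-ADHoRe itself is far more general (profiles, inversions, gap and cluster heuristics), and the names here are illustrative.

```python
from bisect import bisect_left

def collinear_chain(genome_a, genome_b, homologs):
    """Length of the longest collinear (same-orientation) chain of
    homologous gene pairs, found as a longest increasing subsequence
    over genome-B positions of the anchors."""
    pos_b = {g: i for i, g in enumerate(genome_b)}
    # Anchor points: (position in A, position in B) for each homologous pair.
    anchors = sorted((i, pos_b[h])
                     for i, g in enumerate(genome_a)
                     for h in homologs.get(g, []) if h in pos_b)
    # Patience-sorting LIS on the B coordinates.
    tails = []
    for _, j in anchors:
        k = bisect_left(tails, j)
        if k == len(tails):
            tails.append(j)
        else:
            tails[k] = j
    return len(tails)
```

Real implementations additionally bound the gap between consecutive anchors and score chains statistically, so that only significantly collinear segments are reported.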

  9. Building and calibrating a large-extent and high resolution coupled groundwater-land surface model using globally available data-sets

    NASA Astrophysics Data System (ADS)

    Sutanudjaja, E. H.; Van Beek, L. P.; de Jong, S. M.; van Geer, F.; Bierkens, M. F.

    2012-12-01

    The current generation of large-scale hydrological models generally lacks a groundwater model component simulating lateral groundwater flow. Large-scale groundwater models are rare due to a lack of hydro-geological data required for their parameterization and a lack of groundwater head data required for their calibration. In this study, we propose an approach to develop a large-extent fully-coupled land surface-groundwater model by using globally available datasets and calibrate it using a combination of discharge observations and remotely-sensed soil moisture data. The underlying objective is to devise a collection of methods that enables one to build and parameterize large-scale groundwater models in data-poor regions. The model used, PCR-GLOBWB-MOD, has a spatial resolution of 1 km x 1 km and operates on a daily basis. It consists of a single-layer MODFLOW groundwater model that is dynamically coupled to the PCR-GLOBWB land surface model. This fully-coupled model accommodates two-way interactions between surface water levels and groundwater head dynamics, as well as between upper soil moisture states and groundwater levels, including a capillary rise mechanism to sustain upper soil storage and thus to fulfill high evaporation demands (during dry conditions). As a test bed, we used the Rhine-Meuse basin, where more than 4000 groundwater head time series have been collected for validation purposes. The model was parameterized using globally available data-sets on surface elevation, drainage direction, land-cover, soil and lithology. Next, the model was calibrated using a brute force approach and massive parallel computing, i.e. by running the coupled groundwater-land surface model for more than 3000 different parameter sets. Here, we varied minimal soil moisture storage and saturated conductivities of the soil layers as well as aquifer transmissivities. Using different regularization strategies and calibration criteria we compared three calibration scenarios
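
The brute-force calibration loop described, run the coupled model for every parameter set and rank the runs by an objective, can be sketched as below. The Nash-Sutcliffe objective and the toy model are illustrative assumptions; the actual study evaluated more than 3000 parameter sets on massively parallel hardware against discharge and soil-moisture criteria.

```python
from itertools import product

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit, <=0 is no better
    than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def brute_force_calibrate(run_model, obs, param_grid):
    """Evaluate every combination in `param_grid` (dict of name -> values),
    calling `run_model(params) -> simulated series`, and return the best
    (score, params) pair. Embarrassingly parallel in practice."""
    best = None
    for combo in product(*param_grid.values()):
        params = dict(zip(param_grid.keys(), combo))
        score = nse(obs, run_model(params))
        if best is None or score > best[0]:
            best = (score, params)
    return best
```

For example, calibrating a one-parameter toy model against synthetic observations recovers the generating parameter; swapping in a coupled groundwater-land surface model only changes what `run_model` does.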

  10. Behavior Problems in Learning Activities and Social Interactions in Head Start Classrooms and Early Reading, Mathematics, and Approaches to Learning

    ERIC Educational Resources Information Center

    Bulotsky-Shearer, Rebecca J.; Fernandez, Veronica; Dominguez, Ximena; Rouse, Heather L.

    2011-01-01

    Relations between early problem behavior in preschool classrooms and a comprehensive set of school readiness outcomes were examined for a stratified random sample (N = 256) of 4-year-old children enrolled in a large, urban school district Head Start program. A series of multilevel models examined the unique contribution of early problem behavior…

  11. The Changing College Classroom.

    ERIC Educational Resources Information Center

    Paulien, Daniel K.

    1998-01-01

    Describes the ways in which college classrooms are changing as a result of technology, furnishings, and educational needs requiring more space and different classroom-design concepts. Explains why the traditional tablet armchair classroom is becoming unpopular. (GR)

  12. Creating Learning Communities in the Classroom

    ERIC Educational Resources Information Center

    Saville, Bryan K.; Lawrence, Natalie Kerr; Jakobsen, Krisztina V.

    2012-01-01

    There are many ways to construct classroom-based learning communities. Nevertheless, the emphasis is always on cooperative learning. In this article, the authors focus on three teaching methods--interteaching, team-based learning, and cooperative learning in large, lecture-based courses--that they have used successfully to create classroom-based…

  13. New Ways of Classroom Assessment. Revised

    ERIC Educational Resources Information Center

    Brown, J. D., Ed.

    2013-01-01

    In this revised edition in the popular New Ways Series, teachers have once again been given an opportunity to show how they do assessment in their classrooms on an everyday basis. Often feeling helpless when confronted with large-scale standardized testing practices, teachers here offer classroom testing created with the direct aim of helping…

  14. How Tablets Are Utilized in the Classroom

    ERIC Educational Resources Information Center

    Ditzler, Christine; Hong, Eunsook; Strudler, Neal

    2016-01-01

    New technologies are a large part of the educational landscape in the 21st century. Emergent technologies are implemented in the classroom at an exponential rate. The newest technology to be added to the daily classroom is the tablet computer. Understanding students' and teachers' perceptions about the role of tablet computers is important as this…

  15. Bag-Tanks for Your Classroom.

    ERIC Educational Resources Information Center

    Wulfson, Stephen E.

    1981-01-01

    Suggests using plastic bags as aquaria and terraria. Describes techniques for converting plastic sheets into aquaria, how to set them up for classroom use, and other uses for plastic bag aquaria. (DS)

  16. Harnessing mobile devices in the classroom.

    PubMed

    Smith, Charlene M

    2012-12-01

    This column describes the use of mobile devices in the classroom to support nurses' competency in information literacy. Nurses, as knowledge workers, require competency in information literacy and mobile technology to access accurate and current information promptly in practice settings.

  17. The Classroom Animal: Crickets.

    ERIC Educational Resources Information Center

    Kramer, David C.

    1985-01-01

    Suggests using crickets for classroom activities, providing background information on their anatomy and reproduction and tips on keeping individual organisms or a breeding colony in the classroom. (JN)

  18. Flipped Classroom Modules for Large Enrollment General Chemistry Courses: A Low Barrier Approach to Increase Active Learning and Improve Student Grades

    ERIC Educational Resources Information Center

    Eichler, Jack F.; Peeples, Junelyn

    2016-01-01

    In the face of mounting evidence revealing active learning approaches result in improved student learning outcomes compared to traditional passive lecturing, there is a growing need to change the way instructors teach large introductory science courses. However, a large proportion of STEM faculty continues to use traditional instructor-centered…

  19. Price Discrimination: A Classroom Experiment

    ERIC Educational Resources Information Center

    Aguiló, Paula; Sard, Maria; Tugores, Maria

    2016-01-01

    In this article, the authors describe a classroom experiment aimed at familiarizing students with different types of price discrimination (first-, second-, and third-degree price discrimination). During the experiment, the students were asked to decide what tariffs to set as monopolists for each of the price discrimination scenarios under…

  20. Classroom Culture Promotes Academic Resiliency

    ERIC Educational Resources Information Center

    DiTullio, Gina

    2014-01-01

    Resiliency is what propels many students to continue moving forward under difficult learning and life conditions. We intuitively think that such resilience is a character quality that cannot be taught. On the contrary, when a teacher sets the right conditions and culture for it in the classroom by teaching collaboration and communication skills,…

  1. Getting Started in Classroom Computing.

    ERIC Educational Resources Information Center

    Ahl, David H.

    Written for secondary students, this booklet provides an introduction to several computer-related concepts through a set of six classroom games, most of which can be played with little more than a sheet of paper and a pencil. The games are: 1) SECRET CODES--introduction to binary coding, punched cards, and paper tape; 2) GUESS--efficient methods…

  2. Overview: Patterns of Classroom Authority.

    ERIC Educational Resources Information Center

    Borman, Kathryn M.

    1978-01-01

    The following themes integrate the collection of papers presented in this issue: ethnographic approaches; the importance of the teacher's role as arbitrator; the necessity to study firsthand the interaction of teachers and students; and the need not to separate the classroom from its larger institutional and social settings. (Author/MC)

  3. The Development of the Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS): A Large-Scale Data Sharing Initiative

    PubMed Central

    Lutomski, Jennifer E.; Baars, Maria A. E.; Schalk, Bianca W. M.; Boter, Han; Buurman, Bianca M.; den Elzen, Wendy P. J.; Jansen, Aaltje P. D.; Kempen, Gertrudis I. J. M.; Steunenberg, Bas; Steyerberg, Ewout W.; Olde Rikkert, Marcel G. M.; Melis, René J. F.

    2013-01-01

    Introduction In 2008, the Ministry of Health, Welfare and Sport commissioned the National Care for the Elderly Programme. While numerous research projects in older persons’ health care were to be conducted under this national agenda, the Programme further advocated the development of The Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS), which would be integrated into all funded research protocols. In this context, we describe the TOPICS data sharing initiative (www.topics-mds.eu). Materials and Methods A working group drafted the TOPICS-MDS prototype, which was subsequently approved by a multidisciplinary panel. Using instruments validated for older populations, information was collected on demographics, morbidity, quality of life, functional limitations, mental health, social functioning and health service utilisation. For informal caregivers, information was collected on demographics, hours of informal care and quality of life (including subjective care-related burden). Results Between 2010 and 2013, a total of 41 research projects contributed data to TOPICS-MDS, resulting in preliminary data available for 32,310 older persons and 3,940 informal caregivers. The majority of studies sampled were from primary care settings and inclusion criteria differed across studies. Discussion TOPICS-MDS is a public data repository which contains essential data to better understand health challenges experienced by older persons and informal caregivers. Such findings are relevant for countries where increasing health-related expenditure has necessitated the evaluation of contemporary health care delivery. Although open sharing of data can be difficult to achieve in practice, proactively addressing issues of data protection, conflicting data analysis requests and funding limitations during the TOPICS-MDS developmental phase has fostered a data sharing culture. To date, TOPICS-MDS has been successfully incorporated into 41 research projects, thus supporting the

  4. HIV Testing among Patients with Presumptive Tuberculosis: How Do We Implement in a Routine Programmatic Setting? Results of a Large Operational Research from India

    PubMed Central

    Kumar, Ajay MV; Gupta, Devesh; Kumar, Ashok; Gupta, R. S.; Kanchar, Avinash; Rao, Raghuram; Shastri, Suresh; Suryakanth, MD; Rangaraju, Chethana; Naik, Balaji; Guddemane, Deepak K.; Bhat, Prashant; Nair, Achuthan Sreenivas; Harries, Anthony David; Dewan, Puneet

    2016-01-01

    Background In March 2012, the World Health Organization recommended that HIV testing should be offered to all patients with presumptive TB (previously called TB suspects). How this is best implemented and monitored in routine health care settings in India was not known. An operational research study was conducted in Karnataka State (South India, population 64 million, accounts for 10% of India’s HIV burden) to test processes and document the results and challenges of screening presumptive TB patients for HIV within routine health care settings. Methods In this cross-sectional study conducted between January-March 2012, all presumptive TB patients attending public sector sputum microscopy centres state-wide were offered HIV testing by the laboratory technician, and referred to the nearest public sector HIV counselling and testing services, usually within the same facility. The HIV status of the patients was recorded in the routine TB laboratory form and TB laboratory register. The laboratory register was compiled to obtain the number of presumptive TB patients whose HIV status was ascertained, and the number found HIV positive. Aggregate data on reasons for non-testing were compiled at district level. Results Overall, 115,308 patients with presumptive TB were examined for sputum smear microscopy at 645 microscopy centres state-wide. Of these, HIV status was ascertained for 62,847 (55%), among whom 7,559 (12%) were HIV-positive, and of these, 3,034 (40%) were newly diagnosed. Reasons for non-testing were reported for 37,700 (72%) of the 52,461 patients without HIV testing; non-availability of testing services at the site of sputum collection was cited by health staff for 54% of respondents. Only 4% of patients opted out of HIV testing. Conclusion Offering HIV testing routinely to presumptive TB patients detected large numbers of previously-undetected instances of HIV infection. Several operational challenges were noted which provide useful lessons for improving uptake of HIV testing in this

  5. Global Internet Video Classroom: A Technology Supported Learner-Centered Classroom

    ERIC Educational Resources Information Center

    Lawrence, Oliver

    2010-01-01

    The Global Internet Video Classroom (GIVC) Project connected Chicago Civil Rights activists of the 1960s with Cape Town Anti-Apartheid activists of the 1960s in a classroom setting where learners from Cape Town and Chicago engaged activists in conversations about their motivation, principles, and strategies. The project was launched in order to…

  6. Classroom Management and Teachers' Coping Strategies: Inside Classrooms in Australia, China and Israel

    ERIC Educational Resources Information Center

    Romi, Shlomo; Lewis, Ramon; Roache, Joel

    2013-01-01

    This paper discusses the degree to which recently reported relationships between the classroom management techniques and coping styles of Australian teachers apply in two other national settings: China and Israel. Little is known about which teacher characteristics relate to their approach to classroom management, although researchers in Australia…

  7. Comparison of Two Methods for Estimating the Sampling-Related Uncertainty of Satellite Rainfall Averages Based on a Large Radar Data Set

    NASA Technical Reports Server (NTRS)

    Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.

    2002-01-01

    The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
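The sampling-uncertainty idea above can be mimicked on synthetic data: treat a dense time series as the "radar truth", form averages from regular subsamples at each revisit interval (one per phase offset), and measure their spread around the full average. The random rain series, domain size, and intervals below are illustrative placeholders, not the study's radar data set.

```python
import random

random.seed(0)

# Toy "radar" rain-rate series at hourly resolution over a 30-day accumulation.
hours = 30 * 24
rain = [max(0.0, random.gauss(0.1, 0.5)) for _ in range(hours)]
true_mean = sum(rain) / hours

def subsample_means(series, interval):
    """Means from regular sampling every `interval` hours, one per phase offset."""
    return [sum(series[off::interval]) / len(series[off::interval])
            for off in range(interval)]

# The spread of the subsampled means around the full average characterizes
# the sampling-related uncertainty for that revisit interval.
for dt in (1, 3, 6, 12):
    means = subsample_means(rain, dt)
    rmse = (sum((m - true_mean) ** 2 for m in means) / len(means)) ** 0.5
    print(f"dt={dt:2d} h  sampling rmse={rmse:.4f}")
```

Sparser sampling (larger `dt`) generally yields a larger spread, which is the qualitative behavior the abstract's scaling law quantifies across space/time domains.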

  8. Classroom risks and resources: Teacher burnout, classroom quality and children's adjustment in high needs elementary schools.

    PubMed

    Hoglund, Wendy L G; Klingle, Kirsten E; Hosan, Naheed E

    2015-10-01

    The current paper presents two related sets of findings on the classroom context in high needs elementary schools. First, we investigated change over one school term in teacher burnout (emotional exhaustion, depersonalization, personal accomplishment) and classroom quality (emotional and instructional support, organization) and assessed the degree to which burnout and classroom quality co-varied over the term with each other and with aggregate externalizing behaviors (average child externalizing behaviors in the classroom). These analyses describe the classroom context in which the children are nested. Second, we examined change over one school term in children's social adjustment (relationship quality with teachers and friends) and academic adjustment (school engagement, literacy skills) and assessed how adjustment co-varied over time with child externalizing behaviors and was predicted by teacher burnout, classroom quality and aggregate externalizing behaviors. These models were tested with a sample of low-income, ethnically diverse children in kindergarten to grade 3 and their teachers. The children and teachers were assessed three times over one school term. Personal accomplishment co-varied positively with overall classroom quality. Reciprocally, classroom organization co-varied positively with overall teacher burnout. Aggregate externalizing behaviors co-varied positively with depersonalization and negatively with personal accomplishment and overall classroom quality, including emotional support and organization. In turn, teacher burnout interacted with aggregate externalizing behaviors to predict change in child social and academic adjustment. Alternatively, classroom quality interacted with aggregate and child externalizing behaviors to predict change in child social and academic adjustment.

  10. Classroom Discipline. Research Roundup.

    ERIC Educational Resources Information Center

    Bielefeldt, Talbot

    1989-01-01

    Recent research in classroom discipline tends to show that discipline is a by-product of effective instruction and classroom management. The five publications reviewed in this annotated bibliography explore aspects of the complex classroom environment that relate to student discipline. Walter Doyle's chapter on "Classroom Organization and…

  11. Revoicing Classrooms: A Spatial Manifesto

    ERIC Educational Resources Information Center

    Fisher, Kenn

    2004-01-01

    Why is the physical learning environment in schools largely ignored by teachers within pedagogical practice? The cellular classroom has remained seemingly immutable since the Industrial Revolution, with spatiality playing a silent and subconscious role in schooling other than related to concerns around surveillance. Previous studies have shown…

  12. The role of large strike-slip faults in a convergent continental setting - first results from the Dzhungarian Fault in Eastern Kazakhstan

    NASA Astrophysics Data System (ADS)

    Grützner, Christoph; Campbell, Grace; Elliott, Austin; Walker, Richard; Abdrakhmatov, Kanatbek

    2016-04-01

    The Tien Shan and the Dzhungarian Ala-tau mountain ranges in Eastern Kazakhstan and China take up a significant portion of the total convergence between India and Eurasia, despite the fact that they are more than 1000 km away from the actual plate boundary. Shortening is accommodated by large thrust faults that strike more or less perpendicular to the convergence vector, and by a set of conjugate strike-slip faults. Some of these strike-slip faults are major features of several hundred kilometres length and have produced great historical earthquakes. In most cases, little is known about their slip-rates and earthquake history, and thus, about their role in the regional tectonic setting. This study deals with the NW-SE trending Dzhungarian Fault, a more than 350 km-long, right-lateral strike-slip feature. It borders the Dzhungarian Ala-tau range and forms one edge of the so-called Dzhungarian Gate. The fault curves from a ~305° strike at its NW tip in Kazakhstan to a ~328° strike in China. No historical ruptures are known from the Kazakh part of the fault. A possible rupture in 1944 in the Chinese part remains debated. We used remote sensing, Structure-from-Motion (SfM), differential GPS, field mapping, and Quaternary dating of offset geological markers in order to map the fault-related morphology and to measure the slip rate of the fault at several locations along strike. We also aimed to find out the age of the last surface rupturing earthquake and to determine earthquake recurrence intervals and magnitudes. We were further interested in the relation between horizontal and vertical motion along the fault and possible fault segmentation. Here we present first results from our 2015 survey. High-resolution digital elevation models of offset river terraces allowed us to determine the slip vector of the most recent earthquake. Preliminary dating results from abandoned fluvial terraces allow us to speculate on a late Holocene surface rupturing event. Morphological

  13. Consistency of Toddler Engagement across Two Settings

    ERIC Educational Resources Information Center

    Aguiar, Cecilia; McWilliam, R. A.

    2013-01-01

    This study documented the consistency of child engagement across two settings, toddler child care classrooms and mother-child dyadic play. One hundred twelve children, aged 14-36 months (M = 25.17, SD = 6.06), randomly selected from 30 toddler child care classrooms from the district of Porto, Portugal, participated. Levels of engagement were…

  14. Traveling Tags: The Informal Literacies of Mexican Newcomers in and out of the Classroom

    ERIC Educational Resources Information Center

    Bruna, Katherine Richardson

    2007-01-01

    This article documents tagging as one of several informal literacy practices used by newcomer Mexican youth in a Midwest school and classroom setting. Specifically, it details how tagging travels into the classroom. Using the tool of interactional ethnography to analyze videotaped classroom observation data of an English Learner Science setting, I…

  15. Photonics Explorer: revolutionizing photonics in the classroom

    NASA Astrophysics Data System (ADS)

    Prasad, Amrita; Debaes, Nathalie; Cords, Nina; Fischer, Robert; Vlekken, Johan; Euler, Manfred; Thienpont, Hugo

    2012-10-01

    The 'Photonics Explorer' is a unique intra-curricular optics kit designed to engage, excite and educate secondary school students about the fascination of working with light - hands-on, in their own classrooms. Developed with a pan-European collaboration of experts, the kit equips teachers with class sets of experimental material provided within a supporting didactic framework, distributed in conjunction with teacher training courses. The material has been specifically designed to integrate into European science curricula. Each kit contains robust and versatile components sufficient for a class of 25-30 students to work in groups of 2-3. The didactic content is based on guided inquiry-based learning (IBL) techniques with a strong emphasis on hands-on experiments, team work and relating abstract concepts to real world applications. The content has been developed in conjunction with over 30 teachers and experts in pedagogy to ensure high quality and ease of integration. It is currently available in 7 European languages. The Photonics Explorer allows students not only to hone their essential scientific skills but also to really work as scientists and engineers in the classroom. Thus, it aims to encourage more young people to pursue scientific careers and avert the imminent shortage of scientific workforce in Europe. 50 Photonics Explorer kits have been successfully tested in 7 European countries with over 1500 secondary school students. The positive impact of the kit in the classroom has been qualitatively and quantitatively evaluated. A non-profit organisation, EYESTvzw [Excite Youth for Engineering Science and Technology], is responsible for the large scale distribution of the Photonics Explorer.

  16. Mendel in the Modern Classroom

    NASA Astrophysics Data System (ADS)

    Smith, Mike U.; Gericke, Niklas M.

    2015-01-01

    Mendel is an icon in the history of genetics and part of our common culture and modern biology instruction. The aim of this paper is to summarize the place of Mendel in the modern biology classroom. In the present article we will identify key issues that make Mendel relevant in the classroom today. First, we recount some of the historical controversies that have relevance to modern curricular design, such as Fisher's (Ann Sci 1:115-137, 1936/2008) claim that Mendel's data were too good to be true. We also address questions about Mendel's status as the father of genetics as well as questions about the sequencing of Mendel's work in genetics instruction in relation to modern molecular genetics and evolution. Next, we present a systematic set of examples of research based approaches to the use of Mendel in the modern classroom along with criticisms of these designs and questions about the historical accuracy of the story of Mendel as presented in the typical classroom. Finally, we identify gaps in our understanding in need of further study and present a selected set of resources that, along with the references cited, should be valuable to science educators interested in further study of the story of Mendel.

  17. Conformal Prediction Classification of a Large Data Set of Environmental Chemicals from ToxCast and Tox21 Estrogen Receptor Assays.

    PubMed

    Norinder, Ulf; Boyer, Scott

    2016-06-20

    Quantitative structure-activity relationships (QSAR) are critical to exploitation of the chemical information in toxicology databases. Exploitation can be extraction of chemical knowledge from the data but also making predictions of new chemicals based on quantitative analysis of past findings. In this study, we analyzed the ToxCast and Tox21 estrogen receptor data sets using Conformal Prediction to enhance the full exploitation of the information in these data sets. We applied aggregated conformal prediction (ACP) to the ToxCast and Tox21 estrogen receptor data sets using support vector machine classifiers to compare overall performance of the models but, more importantly, to explore the performance of ACP on data sets that are significantly enriched in one class without employing sampling strategies of the training set. ACP was also used to investigate the problem of applicability domain using both data sets. Comparison of ACP to previous results obtained on the same data sets using traditional QSAR approaches indicated similar overall balanced performance to methods in which careful training set selections were made, e.g., sensitivity and specificity for the external Tox21 data set of 70-75% and far superior results to those obtained using traditional methods without training set sampling, where the corresponding results showed a clear imbalance of 50 and 96%, respectively. Application of conformal prediction to imbalanced data sets facilitates an unambiguous analysis of all data, allows accurate predictive models to be built which display similar accuracy in external validation to internal validation, and, most importantly, allows an unambiguous treatment of the applicability domain.
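The core mechanics of conformal prediction can be shown in a minimal sketch: score how "nonconforming" a new example is with respect to each label, compare against a held-out calibration set, and return every label whose p-value exceeds the significance level. The study used aggregated conformal prediction with support vector machines on large assay data; the nearest-centroid nonconformity score, the tiny toy data, and the "active"/"inactive" labels below are illustrative assumptions, not the paper's setup.

```python
import math

def centroid(points):
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def conformal_predict(train, calib, x_new, eps):
    """Label-conditional inductive conformal prediction with a nearest-centroid
    nonconformity score. Returns the set of labels whose p-value exceeds the
    significance level eps (under exchangeability, the true label is excluded
    with probability at most eps)."""
    cents = {lab: centroid([x for x, y in train if y == lab])
             for lab in {y for _, y in train}}
    score = lambda x, y: dist(x, cents[y])  # nonconformity: distance to own class centroid
    calib_scores = {lab: sorted(score(x, y) for x, y in calib if y == lab)
                    for lab in cents}
    pred = set()
    for lab, scores in calib_scores.items():
        a = score(x_new, lab)
        p = (sum(1 for s in scores if s >= a) + 1) / (len(scores) + 1)
        if p > eps:
            pred.add(lab)
    return pred

# Two well-separated toy classes: the prediction set should contain the near one.
train = [([0.0, 0.0], "inactive"), ([0.2, 0.1], "inactive"),
         ([5.0, 5.0], "active"), ([5.1, 4.9], "active")]
calib = [([0.1, 0.1], "inactive"), ([0.0, 0.2], "inactive"),
         ([0.2, 0.0], "inactive"), ([0.1, 0.0], "inactive"),
         ([5.0, 5.1], "active"), ([4.9, 5.0], "active"),
         ([5.1, 5.1], "active"), ([5.0, 4.9], "active")]
print(conformal_predict(train, calib, [0.05, 0.1], eps=0.2))
```

Shrinking `eps` demands higher confidence and so grows the prediction set; on imbalanced data the per-label calibration is what lets the minority class be treated on equal footing without resampling the training set.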

  18. Connecting classrooms to the Milky Way

    NASA Astrophysics Data System (ADS)

    Salomé, P.; Radiguet, A.; Albert, B.; Batrung, M.; Caillat, M.; Gheudin, M.; Libert, Y.; Ferlet, R.; Maestrini, A.; Melchior, A.-L.; Munier, J.-M.; Rudolph, A.

    2012-12-01

    'Connecting Classrooms to the Milky Way' is a project of the EU-HOU Consortium (Hands-On-Universe, Europe), involving 11 European countries. It is supported by the Lifelong Learning Programme of the European Community. The main goal of this project was to set up the first network of small radio-telescopes dedicated to education all around Europe and directly accessible from a simple Web interface. Any classroom connected to the Internet via any Web browser can now remotely control one of the radio-telescopes and observe the HI emission coming from our Galaxy. The interface also provides the users with simple tools to analyse the data: (i) derive the Milky-Way rotation curve and (ii) map the spiral arms' HI distribution. A special emphasis has been made to enable the young generation to understand the challenges of these wavelengths, which are currently at the frontline of new instrumentation with the development of the ALMA (Atacama Large Millimeter Array) and SKA (Square Kilometer Array) projects.
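A standard classroom route from an HI spectrum to a rotation-curve point is the tangent-point method: in the first Galactic quadrant, the terminal (maximum) line-of-sight velocity at longitude l probes the ring at galactocentric radius R = R0 sin(l), rotating at V(R) = v_term + V0 sin(l). The sketch below shows that relation; the adopted R0 and V0 and the sample terminal velocities are illustrative textbook values, not output of the EU-HOU interface.

```python
import math

R0 = 8.5    # kpc, Sun's galactocentric radius (illustrative IAU value)
V0 = 220.0  # km/s, Sun's circular speed (illustrative)

def rotation_point(l_deg, v_term):
    """Tangent-point method (first quadrant): terminal velocity v_term of the
    HI line at longitude l_deg gives one point (R, V) on the rotation curve."""
    s = math.sin(math.radians(l_deg))
    return R0 * s, v_term + V0 * s

# Toy terminal velocities (km/s) as might be read off HI spectra.
for l, vt in [(30.0, 110.0), (45.0, 65.0), (60.0, 30.0)]:
    R, V = rotation_point(l, vt)
    print(f"l={l:4.1f} deg -> R={R:.2f} kpc, V={V:.1f} km/s")
```

Repeating this over many longitudes traces the (roughly flat) rotation curve that students can then compare against the visible-mass prediction.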

  19. Twelve tips for "flipping" the classroom.

    PubMed

    Moffett, Jennifer

    2015-04-01

    The flipped classroom is a pedagogical model in which the typical lecture and homework elements of a course are reversed. The following tips outline the steps involved in making a successful transition to a flipped classroom approach. The tips are based on the available literature alongside the author's experience of using the approach in a medical education setting. Flipping a classroom has a number of potential benefits, for example increased educator-student interaction, but must be planned and implemented carefully to support effective learning. PMID:25154646

  1. Using Water-Testing Data Sets.

    ERIC Educational Resources Information Center

    Varrella, Gary F.

    1994-01-01

    Advocates an approach to teaching environmentally related studies based on constructivism. Presents an activity that makes use of data on chemicals in the water supply, and discusses obtaining and using data sets in the classroom. (LZ)

  2. How Time Is Spent in Elementary Classrooms

    ERIC Educational Resources Information Center

    Rosenshine, Barak V.

    2015-01-01

    The Beginning Teacher Evaluation Study (BTES) provides valuable information on how time is spent in elementary classrooms. Some of the major topics are: the average minutes per day which students spend engaged in reading and math activities, student engagement rates in different settings (that is, teacher-led settings versus seatwork) and…

  3. Self-Contained Classrooms. Research Brief

    ERIC Educational Resources Information Center

    Walker, Karen

    2009-01-01

    Determining the ideal academic setting in which students can be successful continues to be one of the primary goals of educators. Is there a best classroom structure in which students can be successful? Although there is research on the academic gains in the block schedule and in traditional departmentalized settings, both of which are common in…

  4. Allowing "Artistic Agency" in the Elementary Classroom

    ERIC Educational Resources Information Center

    Rufo, David

    2011-01-01

    The author was interested in seeing what would happen if children were given more latitude when making art in school. In January 2009, he began by setting up environments in his classroom wherein he hoped his students would feel free to create self-initiated forms of artmaking. Two times each week an hour was set aside for an activity called Open…

  5. Designing Cooperative Learning in the Science Classroom: Integrating the Peer Tutoring Small Investigation Group (PTSIG) within the Model of the Six Mirrors of the Classroom Model

    ERIC Educational Resources Information Center

    Lazarowitz, Reuven; Hertz-Lazarowitz, Rachel; Khalil, Mahmood; Ron, Salit

    2013-01-01

    The model of the six mirrors of the classroom and its use in teaching biology in a cooperative learning mode were implemented in high school classrooms. In this study we present: a) The model of the six mirrors of the classroom (MSMC). b) Cooperative learning settings: 1. The Group Investigation; 2. The Jigsaw Method; and 3. Peer Tutoring in Small…

  6. In the Classroom.

    ERIC Educational Resources Information Center

    French, Michael P.; Danielson, Kathy Everts

    1991-01-01

    Presents seven reading activities involving the preschool classroom writing environment, using big books and predictable books, using cereal boxes to foster emergent literacy, using editorials, visual-auditory links, reading outside the classroom, and ownership of writing. (MG)

  7. The Networked Classroom

    ERIC Educational Resources Information Center

    Roschelle, Jeremy; Penuel, William R.; Abrahamson, Louis

    2004-01-01

    Classroom networks require every student to think actively, which enhances student participation in mathematics and science. Classroom-specific networks use software designed to enhance communication between teacher and students.

  8. Observing Classroom Practice

    ERIC Educational Resources Information Center

    Danielson, Charlotte

    2012-01-01

    Classroom observation is a crucial aspect of any system of teacher evaluation. No matter how skilled a teacher is in other aspects of teaching--such as careful planning, working well with colleagues, and communicating with parents--if classroom practice is deficient, that individual cannot be considered a good teacher. Classroom observations can…

  9. Competition in the Classroom

    ERIC Educational Resources Information Center

    Jameson, Daphne

    2007-01-01

    In this article, the author shares the strategy she adopted to even out the participation among her multicultural students during their classroom discussions. The author realized that her students had different concepts about the classroom and different philosophies about competition. For the Americans and Indians, the classroom was a site of…

  10. Multicultural Classroom Management.

    ERIC Educational Resources Information Center

    Grossman, Herbert

    1991-01-01

    Discusses the harmful effects of culturally inappropriate, prejudicial, and disempowering classroom management techniques often employed with students who are not Euro-American or middle class. Teachers need to adapt classroom management techniques to their student population, eliminate prejudicial classroom management, and replace techniques that…

  11. Classroom Use and Utilization.

    ERIC Educational Resources Information Center

    Fink, Ira

    2002-01-01

    Discusses how classrooms are distributed by size on a campus, how well they are used, and how their use changes with faculty and student needs and desires. Details how to analyze classroom space, use, and utilization, taking into account such factors as scheduling and classroom stations. (EV)

  12. Classroom Strategies: Classroom Management Systems. Volume 3.

    ERIC Educational Resources Information Center

    Speiss, Madeleine F.; And Others

    Classroom management is defined as procedures for arranging the classroom environment so that children learn what the teacher wants to teach them in the healthiest and most effective way possible. The Southwestern Cooperative Educational Laboratory presents a discussion of these procedures as they relate to social controls and components of…

  13. Classroom Management. TESOL Classroom Practice Series

    ERIC Educational Resources Information Center

    Farrell, Thomas S. C., Ed.

    2008-01-01

    This series captures the dynamics of the contemporary ESOL classroom. It showcases state-of-the-art curricula, materials, tasks, and activities reflecting emerging trends in language education and seeks to build localized language teaching and learning theories based on teachers' and students' unique experiences in and beyond the classroom. Each…

  14. Teaching Quality across School Settings

    ERIC Educational Resources Information Center

    Cohen, Julie; Brown, Michelle

    2016-01-01

    Districts are increasingly making personnel decisions based on teachers' impact on student-achievement gains and classroom observations. In some schools, however, a teacher's practices and their students' achievement may reflect not just individual but collaborative efforts. In other settings, teachers' instruction benefits less from the insights…

  15. Becoming urban science teachers by transforming middle-school classrooms: A study of the Urban Science Education Fellows Program

    NASA Astrophysics Data System (ADS)

    Furman, Melina Gabriela

    The current scenario in American education shows a large achievement and opportunity gap in science between urban children in poverty and more privileged youth. Research has shown that one essential factor that accounts for this gap is the shortage of qualified science teachers in urban schools. Teaching science in a high poverty school presents unique challenges to beginner teachers. Limited resources and support and a significant cultural divide with their students are some of the common problems that cause many novice teachers to quit their jobs or to start enacting what has been described as "the pedagogy of poverty." In this study I looked at the case of the Urban Science Education Fellows Program. This program aimed to prepare preservice teachers (i.e. "fellows") to enact socially just science pedagogies in urban classrooms. I conducted qualitative case studies of three fellows. Fellows worked over one year with science teachers in middle-school classrooms in order to develop transformative action research studies. My analysis focused on how fellows coauthored hybrid spaces within these studies that challenged the typical ways science was taught and learned in their classrooms towards a vision of socially just teaching. By coauthoring these hybrid spaces, fellows developed grounded generativity, i.e. a capacity to create new teaching scenarios rooted in the pragmatic realities of an authentic classroom setting. Grounded generativity included building upon their pedagogical beliefs in order to improvise pedagogies with others, repositioning themselves and their students differently in the classroom and constructing symbols of possibility to guide their practice. I proposed authentic play as the mechanism that enabled fellows to coauthor hybrid spaces. Authentic play involved contexts of moderate risk and of distributed expertise and required fellows to be positioned at the intersection of the margins and the center of the classroom community of practice.…

  16. Just in Time to Flip Your Classroom

    NASA Astrophysics Data System (ADS)

    Lasry, Nathaniel; Dugdale, Michael; Charles, Elizabeth

    2014-01-01

    With advocates like Sal Khan and Bill Gates, flipped classrooms are attracting an increasing amount of media and research attention. We had heard Khan's TED talk and were aware of the concept of inverted pedagogies in general. Yet it really hit home when we accidentally flipped our classroom. Our objective was to better prepare our students for class. We set out to effectively move some of our course content outside of class and decided to tweak the Just-in-Time Teaching approach (JiTT). To our surprise, this tweak—which we like to call the flip-JiTT—ended up completely flipping our classroom. What follows is a narrative of our experience and a procedure that any teacher can use to extend JiTT to a flipped classroom.

  17. Molecular Phenotyping Small (Asian) versus Large (Western) Plaque Psoriasis Shows Common Activation of IL-17 Pathway Genes, but Different Regulatory Gene Sets

    PubMed Central

    Kim, Jaehwan; Oh, Chil-Hwan; Jeon, Jiehyun; Baek, Yoosang; Ahn, Jaewoo; Kim, Dong Joo; Lee, Hyun-Soo; da Rosa, Joel Correa; Suárez-Fariñas, Mayte; Lowes, Michelle A.; Krueger, James G.

    2015-01-01

    Psoriasis is present in all racial groups, but in varying frequencies and severity. Considering that small plaque psoriasis is specific to the Asian population and severe psoriasis is more predominant in the Western population, we defined Asian small and intermediate plaque psoriasis as psoriasis subtypes, and compared their molecular signatures with classic subtype of Western large plaque psoriasis. Two different characteristics of psoriatic spreading—vertical growth and radial expansion—were contrasted between subtypes, and genomic data were correlated to histologic and clinical measurements. Compared to Western large plaque psoriasis, Asian small plaque psoriasis revealed limited psoriasis spreading, but IL-17A and IL-17-regulated pro-inflammatory cytokines were highly expressed. Paradoxically, IL-17A and IL-17-regulated pro-inflammatory cytokines were lower in Western large plaque psoriasis, while T cells and dendritic cells in total psoriatic skin area were exponentially increased. Negative immune regulators, such as CD69 and FAS, were decreased in both Western large plaque psoriasis and psoriasis with accompanying arthritis or obesity, and their expression was correlated with psoriasis severity index. Based on the disease subtype comparisons, we propose that dysregulation of T cell expansion enabled by downregulation of immune negative regulators is the main mechanism for development of large plaque psoriasis subtypes. PMID:26763436

  18. Molecular Phenotyping Small (Asian) versus Large (Western) Plaque Psoriasis Shows Common Activation of IL-17 Pathway Genes but Different Regulatory Gene Sets.

    PubMed

    Kim, Jaehwan; Oh, Chil-Hwan; Jeon, Jiehyun; Baek, Yoosang; Ahn, Jaewoo; Kim, Dong Joo; Lee, Hyun-Soo; Correa da Rosa, Joel; Suárez-Fariñas, Mayte; Lowes, Michelle A; Krueger, James G

    2016-01-01

    Psoriasis is present in all racial groups, but in varying frequencies and severity. Considering that small plaque psoriasis is specific to the Asian population and severe psoriasis is more predominant in the Western population, we defined Asian small and intermediate plaque psoriasis as psoriasis subtypes and compared their molecular signatures with the classic subtype of Western large plaque psoriasis. Two different characteristics of psoriatic spreading (vertical growth and radial expansion) were contrasted between subtypes, and genomic data were correlated to histologic and clinical measurements. Compared with Western large plaque psoriasis, Asian small plaque psoriasis revealed limited psoriasis spreading, but IL-17A and IL-17-regulated proinflammatory cytokines were highly expressed. Paradoxically, IL-17A and IL-17-regulated proinflammatory cytokines were lower in Western large plaque psoriasis, whereas T cells and dendritic cells in total psoriatic skin area were exponentially increased. Negative immune regulators, such as CD69 and FAS, were decreased in both Western large plaque psoriasis and psoriasis with accompanying arthritis or obesity, and their expression was correlated with psoriasis severity index. Based on the disease subtype comparisons, we propose that dysregulation of T-cell expansion enabled by downregulation of immune negative regulators is the main mechanism for development of large plaque psoriasis subtypes. PMID:26763436


  20. Ecological Analysis of Early Childhood Settings: Implications for Mainstreaming.

    ERIC Educational Resources Information Center

    Peterson, Karen L.

    In an effort to help developmentally delayed or disabled children succeed in an integrated or regular early childhood classroom setting, the Rural Area Model Preschool Project staff developed an ecological inventory to identify the behaviors and skills expected of preschoolers in classroom settings. The inventory was used for 2 months in eight…