Note: This page contains sample records for the topic large classroom setting from Science.gov.
While these samples are representative of the content of Science.gov,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of Science.gov
to obtain the most current and comprehensive results. Last update: August 15, 2014.
Calibrated Peer Review[TM] (CPR) is a program that can significantly enhance the ability to integrate intensive information literacy exercises into large classroom settings. CPR is founded on a solid pedagogic base for learning, and it is formulated in such a way that information skills can easily be inserted. However, there is no mention of its…
This guide, presented in four chapters, provides activities to train teachers to set up a multicultural classroom. In Section I the activity acquaints the participant with six types of classroom areas: private area, individual free-work area with seating, learning center with surfaces for sitting work, learning centers without surfaces for sitting…
In this article, the authors describe an active learning exercise which has been used to replace some lecture hours in the renal portion of an integrated, organ system-based curriculum for first-year medical students. The exercise takes place in a large auditorium with ~150 students. The authors, who are faculty members, lead the discussions,…
Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such…
Student learning is directly related to classroom control established the first week of school (Wong and Wong 2001)--what you do the first day counts, and what you do the first 10 minutes counts even more. This article shares the advanced planning aspects
The purpose of this paper is to show how a large group of students can work collaboratively in a synchronous way within the classroom using the cheapest possible technological support. Making use of the features of Single Display Groupware and of Multiple Mice we propose a computer-supported collaborative learning approach for big groups within…
Human interactions are multimodal in nature. From simple to complex forms of transferal of information, human beings draw on a multiplicity of communicative modes, such as intonation and gaze, to make sense of everyday experiences. Likewise, the learning process, either within traditional classrooms or Virtual Learning Environments, is shaped by…
Described are materials developed to add to existing complex-number units and how they have been used in second-year algebra and precalculus classes. Discussed is software developed, objectives, and classroom materials. Two worksheets are included. A computer program is provided. (CW)
The objective of this study was to develop an instrument to observe the play behaviour of a whole group of children from four to six years of age in a classroom setting on the basis of video recording. The instrument was developed in collaboration with experienced teachers and experts on play. Categories of play were derived from the literature…
Berkhout, Louise; Hoekman, Joop; Goorhuis-Brouwer, Sieneke M.
Certain areas in the social studies can be effectively taught in a non-classroom setting. This experiment determined if, in a supermarket situation, consumer preferences (as measured in sales figures and augmented by questionnaire data) could be altered by the addition of nutritional information to the labels of sixteen items which had moderate…
The large auditorium classroom presents unique challenges to maintaining student engagement. During the fall 2012 semester, I adopted several specific strategies for increasing student engagement and reducing anonymity with the goal of maximizing student success in the large class. I measured attendance and student success in two classes, one with 300 students and one with 42, but otherwise taught as similarly as possible. While the students in the large class probably did better than they would have in a traditional lecture setting, attendance was still significantly lower in the large class, resulting in lower student success than in the small control class by all measures. I will discuss these results and compare them to classes in previous semesters, including other small classes and large Distance Education classes conducted live over remote television link.
This set of 30 interactive problems, developed for high school physics, addresses the learners' ability to distinguish between mass and weight, determine net force, construct free-body diagrams, relate acceleration to net force and mass, and combine Newton's Second Law analysis with kinematics to solve for unknown quantities. Editor's Note: Of special note for students with disabilities: audio-guided solutions are available for each problem. The Physics Classroom is a set of resources created for learners and teachers of high school physics. It includes comprehensive tutorials, problem sets with solutions, extra help for struggling learners, Shockwave animations, multimedia learning modules, labs, and photo gallery.
This article examines an effort to support critical literacy in an English as a foreign language (EFL) setting by analyzing one college EFL reading classroom in which students read and responded to articles from "The New Yorker". Data include transcribed audiotapes of classroom interaction and interviews with students, classroom materials, and…
Active learning in large science classrooms furthers opportunities for students to engage in the content and in meaningful learning, yet students can still remain anonymously silent. This study aims to understand the impact of active learning on these silent students in a large General Chemistry course taught via Socratic questioning and…
Obenland, Carrie A.; Munson, Ashlyn H.; Hutchinson, John S.
Case study teaching is difficult in large classes, especially in fixed-seat amphitheaters. The development of audience response systems, or "clickers," for use in classrooms has opened up exciting new possibilities for creating and implementing interactive case studies, particularly in large introductory science courses.
The process of analyzing large data sets often includes an early exploratory stage to first, develop a basic understanding of the data and its interrelationships and second, to prepare and clean up the data for hypothesis formulation and testing. This prel...
This research explores the emergence of student creativity in classroom settings, specifically within two content areas: science and social studies. Fourteen classrooms in three elementary schools in Korea were observed, and the teachers and students were interviewed. The three types of student creativity emerging in the teaching and learning…
Many universities implement programs and interventions to increase students' perceived instrumental social support within the classroom setting, yet to date, no measures exist to adequately assess such perceptions. In response to this need, the current research developed an operational definition of instrumental classroom social support and also…
This article describes thinking routines as tools to guide and support young children's thinking. These learning strategies, developed by Harvard University's Project Zero Classroom, actively engage students in constructing meaning while also understanding their own thinking process. The authors discuss how thinking routines can be used in both…
Examined 2- to 5-year-olds' play with peers in their child care classroom and on the playground. Found that children were more likely to engage in interactive dramatic play outdoors than indoors. Outdoors, older children were more likely to interact with peers than were younger children. Outdoors offered older children functional and dramatic play…
The view of Etzioni, that member socialization and communication tend to define orientations toward consensus (and in turn role performance) was tested using students in a college classroom. The Etzioni model was moderately predictive of role performance, while addition of a cost-benefits factor boosted explained variation. (Author/GDC)
Audience response systems (ARS) or clickers, as they are commonly called, offer a management tool for engaging students in the large classroom. Basic elements of the technology are discussed. These systems have been used in a variety of fields and at all levels of education. Typical goals of ARS questions are discussed, as well as methods of compensating for the reduction in lecture time that typically results from their use. Examples of ARS use occur throughout the literature and often detail positive attitudes from both students and instructors, although exceptions do exist. When used in classes, ARS clickers typically have either a benign or positive effect on student performance on exams, depending on the method and extent of their use, and create a more positive and active atmosphere in the large classroom. These systems are especially valuable as a means of introducing and monitoring peer learning methods in the large lecture classroom. So that the reader may use clickers effectively in his or her own classroom, a set of guidelines for writing good questions and a list of best-practice tips have been culled from the literature and experienced users.
In this paper we present a multiview registration method for aligning range data. We first align scans pairwise with each other and use the pairwise alignments as constraints that the multiview step enforces while evenly diffusing the pairwise registration errors. This approach is especially suitable for registering large data sets, since using constraints from pairwise alignments does not require
This study evaluated the reliability of the 5-min psychomotor vigilance task (PVT) in a single-sex Australian primary school. Seventy-five male students (mean age = 11.82 years, SD = 1.12) completed two 5-min PVTs using a Palm personal digital assistant (PDA) in (1) an isolated setting and (2) a classroom setting. Of this group of students, a subsample of 37 students completed a test-retest reliability trial within the classroom setting. Using a mixed-model analysis, there was no significant difference in the mean response time (RT) or number of lapses (RTs ≥ 500 msec) between the isolated and the classroom setting. There was, however, an order effect for the number of lapses in the isolated setting, with the number of lapses being greater if the isolated test was conducted second. Test-retest intraclass correlation coefficients (ICCs) in the classroom setting indicated moderate to high reliability (mean RT = .84, lapses = .59). Bland-Altman analysis showed no systematic difference between the two settings. Findings suggest that the 5-min PDA PVT is a reliable measure of sustained attention in the classroom setting in this sample of primary-aged schoolchildren. The results provide further evidence for the versatility of this measuring device for larger interventions outside the laboratory. PMID:20805597
This paper examines activity settings and daily classroom routines experienced by 3- and 4-year-old low-income children in public center-based preschool programs, private center-based programs, and family child care homes. Two daily routine profiles were identified using a time-sampling coding procedure: a High Free-Choice pattern in which…
Since its development, occupancy modeling has become a popular and useful tool for ecologists wishing to learn about the dynamics of species occurrence over time and space. Such models require presence–absence data to be collected at spatially indexed survey units. However, only recently have researchers recognized the need to correct for spatially induced overdispersion by explicitly accounting for spatial autocorrelation in occupancy probability. Previous efforts to incorporate such autocorrelation have largely focused on logit-normal formulations for occupancy, with spatial autocorrelation induced by a random effect within a hierarchical modeling framework. Although useful, computational time generally limits such an approach to relatively small data sets, and there are often problems with algorithm instability, yielding unsatisfactory results. Further, recent research has revealed a hidden form of multicollinearity in such applications, which may lead to parameter bias if not explicitly addressed. Combining several techniques, we present a unifying hierarchical spatial occupancy model specification that is particularly effective over large spatial extents. This approach employs a probit mixture framework for occupancy and can easily accommodate a reduced-dimensional spatial process to resolve issues with multicollinearity and spatial confounding while improving algorithm convergence. Using open-source software, we demonstrate this new model specification using a case study involving occupancy of caribou (Rangifer tarandus) over a set of 1080 survey units spanning a large contiguous region (108,000 km²) in northern Ontario, Canada. Overall, the combination of a more efficient specification and open-source software allows for a facile and stable implementation of spatial occupancy models for large data sets.
Johnson, Devin S.; Conn, Paul B.; Hooten, Mevin B.; Ray, Justina C.; Pond, Bruce A.
This study used self-monitoring, coupled with a student/teacher matching strategy, to improve the classroom social skills of five inner-city middle school students, who were at risk for school failure. Using a multiple-probe across students and settings (class periods) design, we evaluated intervention effects in up to six different settings.…
Peterson, Lloyd Douglas; Young, K. Richard; Salzberg, Charles L.; West, Richard P.; Hill, Mary
This study explored the content-based literacy instruction of English language learners (ELLs) across multiple classroom settings in U.S. elementary schools. The following research questions guided the study: (a) How are ELLs taught English in two types of instructional settings: regular content-area literacy instruction in the all-English…
Define the centre of a parallelogram to be the intersection of its diagonals. It was shown in an earlier paper that the intersection of arbitrarily many parallelograms with the same centre is the unit disc about that centre in a metric defined using ideas from Linear Algebra. In this note, it is shown that this characterizes compact, convex sets,…
An instructional package consisting of modeling, written and behavioral rehearsal, and feedback was used to teach communication behaviors to emotionally handicapped students, ages ranging between 11 and 17 years. Results show communication behaviors can be delivered effectively and economically in a classroom setting. (Author)
Purpose: To explore the utility of time-interval analysis for documenting the reliability of coding social communication performance of children in classroom settings. Of particular interest was finding a method for determining whether independent observers could reliably judge both occurrence and duration of ongoing behavioral dimensions for…
In this paper, we describe the Mobile-IT Education (MIT.EDU) system, which demonstrates the potential of using a distributed mobile device architecture for rapid prototyping of wireless mobile multi-user applications for use in classroom settings. MIT.EDU is a stable, accessible system that combines inexpensive, commodity hardware, a flexible…
Sung, M.; Gips, J.; Eagle, N.; Madan, A.; Caneel, R.; DeVaul, R.; Bonsen, J.; Pentland, A.
This article arose from a project based on exploring the nature and value of interactive teaching with and without information and communications technology (ICT). It describes one facet of the project which involves developing an understanding of children's perceptions of learning in classroom settings. This was explored through interviews with pupils aged between 3 and 7 years. These interviews utilised
This study compared the academic achievement between undergraduate students taking an introductory managerial accounting course online (N = 104) and students who took the same course in a hybrid classroom setting (N = 203). Student achievement was measured using scores from twelve weekly online assignments, two major online assignments, a final…
Increasingly, research has focused on the cognitive processes associated with various standard-setting activities. This qualitative study involved an examination of 16 third-grade reading teachers' experiences with the cognitive task of conceptualizing an entire classroom of hypothetical target students when the single-passage bookmark method or…
This study evaluated the reliability of the 5-min psychomotor vigilance task (PVT) in a single-sex Australian primary school. Seventy-five male students (mean age = 11.82 years, SD = 1.12) completed two 5-min PVTs using a Palm personal digital assistant (PDA) in (1) an isolated setting and (2) a classroom setting. Of this group of students, a subsample of 37 students
Andrew Wilson; James Dollman; Kurt Lushington; Timothy Olds
Although two recent films, Orgasmic Birth and Pregnant in America, were intended for the big screen, they can also serve as valuable teaching resources in multiple childbirth education settings. Each film conveys powerful messages about birth and today's birthing culture. Depending on a childbirth educator's classroom setting (hospital, birthing center, or home birth environment), particular portions in each film, along with extra clips featured on the films' DVDs, can enhance an educator's curriculum and spark compelling discussions with class participants.
Head Start CARES (Classroom-based Approaches and Resources for Emotion and Social Skill Promotion) is a large-scale, national research demonstration that was designed to test the effects of a one-year program aimed at improving pre-kindergarteners' social ...
This paper looks at South Korea as an example of a collectivist society having a rather large power distance dimension value. In a traditional Korean classroom the teacher is at the top of the classroom hierarchy, while the students are the passive participants. Gender and age play a role in the hierarchy between students themselves. Teaching…
Describes a process for teaching nursing research via secondary analysis of data sets from the National Center for Health Statistics. Addresses advantages, potential problems and limitations, guidelines for students, and evaluation methods. (Contains 32 references.) (SK)
In this paper, we describe the Mobile-IT Education (MIT.EDU) system, which demonstrates the potential of using a distributed mobile device architecture for rapid prototyping of wireless mobile multi-user applications for use in classroom settings. MIT.EDU is a stable, accessible system that combines inexpensive, commodity hardware, a flexible sensor/peripheral interconnection bus, and a powerful, light-weight distributed sensing, classification, and
M. Sung; Jonathan Gips; Nathan Eagle; Anmol Madan; Ron Caneel; Richard W. DeVaul; J. Bonsen; Alex Pentland
Visual data-mining techniques have proven valuable in exploratory data analysis, and they have strong potential in the exploration of large databases. Detecting interesting local patterns in large data sets is a key research challenge. Particularly challenging today is finding and deploying efficient and scalable visualization strategies for exploring large geospatial data sets. One way is to share ideas from the
Daniel A. Keim; Christian Panse; Mike Sips; Stephen C. North
Presents K-12 and college classrooms considered outstanding in a competition, which judged the most outstanding learning environments at educational institutions nationwide. Jurors spent two days reviewing projects, focusing on concepts and ideas that made them exceptional. For each citation, the article offers information on the firm, client,…
Science teacher beliefs and classroom practice related to constructivism and factors that may influence classroom practice were examined in this cross-case study. Data from four science teachers in two schools included interviews, demographic questionnaire, Classroom Learning Environment Survey (preferred/perceived), and classroom observations and…
Objective: To examine the learning styles of undergraduate athletic training students to determine their consistency in traditional classroom versus clinical settings. Design and Setting: Subjects completed the Learning Styles Inventory twice, once focusing on learning new information in the classroom and the other focusing on learning new information in the clinical setting. The order of focus regarding setting (classroom or clinical) was counterbalanced across subjects. Subjects: A total of 26 undergraduate athletic training students from a Committee on the Accreditation of Allied Health Education Programs accredited athletic training education program (16 women and 10 men; mean age, 24.42 ± 6.44 years) who were currently assigned to a clinical practicum as part of their academic program served as subjects. Measurements: I performed 4 paired t tests, 1 for each learning mode, to determine if differences existed between the classroom and clinical settings. The percentage of respondents whose learning styles changed across settings was also calculated. Results: The paired t tests revealed a significant difference between the Reflective Observation and Active Experimentation modes across settings. In addition, 58% of respondents' learning styles changed according to setting focus. Conclusions: It appears that learning styles do indeed shift, depending on the domain through which an individual is learning. Consequently, teaching strategies incorporated in 1 setting may not be equally effective in the other setting. Each learning setting should, therefore, be treated separately in order to accommodate individual learning styles and maximize learning achievement. 
Furthermore, if learning styles are to be considered when designing athletic training education, these findings indicate that in order to ensure the validity of the resulting learning style profile, it may be necessary to provide the respondent with a specific focus, either that of a classroom or clinical setting, before completing the Learning Styles Inventory.
This paper summarizes the special interest group discussion about slide sets for use by Astronomy 101 instructors. The NASA Science Mission Directorate Astrophysics Education and Public Outreach Forum is coordinating the development of a pilot series of slide sets to help Astronomy 101 instructors incorporate new discoveries in their classrooms. The “Astro 101 slide sets” are presentations of 5–7 slides on a new development or discovery from a NASA Astrophysics mission relevant to topics in introductory astronomy courses. We intend for these slide sets to help Astronomy 101 instructors include new developments (discoveries not yet in their textbooks) into the broader context of the course. With their modular design and non-technical language, the slide sets may also serve audiences beyond Astronomy 101 instruction and are adaptable to different needs. An example on exoplanets was highlighted in this session. In this paper, we outline the community feedback, which falls into the broad categories of content, format, uses, relevant topics, and future adaptations.
Meinke, B.; Schultz, G.; Bianchi, L.; Blair, W. P.; Len, P. M.
Field based learning can be found in nearly every course offered in Geology at Brigham Young University. For example, in our Structural Geology course field studies substitute for labs. Students collect their own data from several different structural settings of the Wasatch Range. Our curriculum also includes a two-week, sophomore-level field course that introduces students to interpreting field relations themselves and sets the stage for much of what they learn in their upper-division courses. Our senior-level six-week field geology course includes classical field mapping with exercises in petroleum and mineral exploration, environmental geology and geological hazards. Experiments with substituting field-based general education courses for those in traditional classroom settings indicate that student cognition, course enjoyment and recruiting of majors significantly increase in a field-based course. We offer a field-based introductory geology course (Geo 102) that is taught in seven, six-hour field trips during which students travel to localities of geologic interest to investigate a variety of fundamental geological problems. We compare the outcomes of Geo 102 with a traditional classroom-based geology course (Geo 101). For the comparison both courses are taught by the same instructor, use the same text and supplementary materials and take the same exams. The results of 7 years of reporting indicate that test scores and final grades are one-half grade point higher for Geo 102 students versus those in traditional introductory courses. Student evaluations of the course are also 0.8-1.4 points higher on a scale of 1-8, and are consistently the highest in the Department and College. Other observations include increased attendance, attention and curiosity. The latter two are measured by the number of students asking questions of other students as well as the instructors, and the total number of questions asked during class time in the field versus the classroom.
Normal classroom involvement includes two or three students asking nearly all of the questions, while in Geo 102 it is closer to half the class, and not the same students each time. Not only do more individuals participate in asking questions in Geo 102, but each participant asks more questions as well. Questions asked in class are generally specific to the discussion, while field questions are commonly multidisciplinary in nature. Field-based courses also encourage more students to collaborate with each other and to integrate shared observations due to the many different aspects of the geosciences present at each site. One of the most important pay-offs is the 50% increase in the number of students changing their major to geology in the field-based versus classroom-based courses. Field-based learning increases the depth of student understanding of the subjects they investigate as well as student involvement and enthusiasm in the class. The tradeoff we make for realizing significant individual and group discovery in the field is that more responsibility is placed on the student to understand the broad based geologic concepts found in the text. The field based approach allows the students to immediately apply their learning in real world applications.
This article focuses on the Incredible Years Teacher Classroom Management Training (IY TCM) intervention as an example of an evidence-based program that embeds coaching within its design. First, the core features of the IY TCM program are described. Second, the IY TCM coaching model and processes utilized to facilitate high fidelity of…
Reinke, Wendy M.; Stormont, Melissa; Webster-Stratton, Carolyn; Newcomer, Lori L.; Herman, Keith C.
INTERIOR VIEW, SETTING LARGE CORE WITH ASSISTANCE FROM THE OVERHEAD RAIL CRANE IN BOX FLOOR MOLD AREA (WORKERS: DAN T. WELLS AND TRUMAN CARLISLE). - Stockham Pipe & Fittings Company, Ductile Iron Foundry, 4000 Tenth Avenue North, Birmingham, Jefferson County, AL
In 2004, we launched a new calculus-based, introductory physics sequence at Washington University. Designed as an alternative to our traditional lecture-based sequence, the primary objectives for this new course were to actively engage students in the learning process, to significantly strengthen students' conceptual reasoning skills, to help students develop higher level quantitative problem solving skills necessary for analyzing ``real world'' problems, and to integrate modern physics into the curriculum. This talk will describe our approach, using The Six Ideas That Shaped Physics text by Thomas Moore, to creating an active learning environment in large classes as well as share our perspective on key elements for success and challenges that we face in the large class environment.
An experiment explicitly introducing learning strategies to a large, first-year undergraduate cell biology course was undertaken to see whether awareness and use of strategies had a measurable impact on student performance. The construction of concept maps was selected as the strategy to be introduced because of an inherent coherence with a course structured by concepts. Data were collected over three different semesters of an introductory cell biology course, all teaching similar course material with the same professor and all evaluated using similar examinations. The first group, used as a control, did not construct concept maps, the second group constructed individual concept maps, and the third group first constructed individual maps then validated their maps in small teams to provide peer feedback about the individual maps. Assessment of the experiment involved student performance on the final exam, anonymous polls of student perceptions, failure rate, and retention of information at the start of the following year. The main conclusion drawn is that concept maps without feedback have no significant effect on student performance, whereas concept maps with feedback produced a measurable increase in student problem-solving performance and a decrease in failure rates.
Several fast algorithms for clustering very large data sets have been proposed in the literature. CLARA is a combination of a sampling procedure and the classical PAM algorithm, while CLARANS adopts a serial randomized search strategy to find the optimal set of medoids. GAC-R3 and GAC-RARw exploit genetic search heuristics for solving clustering problems. In this research, we conducted an
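The sampling idea behind CLARA can be sketched in a few lines: run a PAM-style medoid search on small random samples, then keep whichever medoid set scores best against the full data set. The following is an illustrative sketch, not the implementation studied in the record above; the function names (`pam`, `clara`) and the parameters `sample_size` and `n_samples` are assumptions for illustration.

```python
import random

def dist(a, b):
    """Euclidean distance between two points given as tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def total_cost(points, medoids):
    """Sum of distances from each point to its nearest medoid."""
    return sum(min(dist(p, m) for m in medoids) for p in points)

def pam(points, k, max_iter=100):
    """Greedy PAM-style search: swap a medoid with a non-medoid
    whenever the swap lowers total cost; stop at a local optimum."""
    medoids = random.sample(points, k)
    for _ in range(max_iter):
        best = total_cost(points, medoids)
        improved = False
        for i in range(k):
            for p in points:
                if p in medoids:
                    continue
                trial = medoids[:i] + [p] + medoids[i + 1:]
                cost = total_cost(points, trial)
                if cost < best:
                    best, medoids, improved = cost, trial, True
        if not improved:
            break
    return medoids

def clara(points, k, sample_size=40, n_samples=5):
    """CLARA-style sampling: run PAM on several random samples and
    keep the medoid set with the lowest cost over the FULL data."""
    best_medoids, best_cost = None, float("inf")
    for _ in range(n_samples):
        sample = random.sample(points, min(sample_size, len(points)))
        medoids = pam(sample, k)
        cost = total_cost(points, medoids)
        if cost < best_cost:
            best_medoids, best_cost = medoids, cost
    return best_medoids
```

Because each PAM run touches only a sample, the expensive swap search stays cheap even when `points` is large; only the final cost evaluation scans the full data set.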
Classroom teachers are in the front line of introducing students to formal learning, including assessments, which can be assumed to continue for students should they extend their schooling past the expected mandatory 12 years. The purpose of the present investigation was to survey secondary teachers' beliefs of classroom and large-scale tests for…
This report addresses the monumental challenge of developing exploratory analysis methods for large data sets. The goals of the report are to increase awareness of large data sets problems and to contribute simple graphical methods that address some of the problems. The graphical methods focus on two- and three-dimensional data and common tasks such as finding outliers and tail structure, assessing central structure and comparing central structures. The methods handle large sample size problems through binning, incorporate information from statistical models and adapt image processing algorithms. Examples demonstrate the application of methods to a variety of publicly available large data sets. The most novel application addresses the "too many plots to examine" problem by using cognostics, computer guiding diagnostics, to prioritize plots. The particular application prioritizes views of computational fluid dynamics solution sets on the fly. That is, as each time step of a solution set is generated on a parallel processor the cognostics algorithms assess virtual plots based on the previous time step. Work in such areas is in its infancy and the examples suggest numerous challenges that remain. 35 refs., 15 figs.
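The binning idea mentioned in the report reduces an arbitrarily large sample to a fixed-size grid of counts before any plotting happens. A minimal sketch of 2-D binning follows; the function name `bin2d` and the small epsilon guard against division by zero are assumptions, not details taken from the report.

```python
def bin2d(xs, ys, nx, ny):
    """Aggregate (x, y) points into an nx-by-ny grid of counts, so
    millions of points reduce to a small density summary suitable
    for plotting. Returns counts[row][col] with row indexing y."""
    xmin, xmax = min(xs), max(xs)
    ymin, ymax = min(ys), max(ys)
    counts = [[0] * nx for _ in range(ny)]
    for x, y in zip(xs, ys):
        # Map each coordinate to a bin index; clamp the maximum
        # value into the last bin. The tiny epsilon avoids a
        # zero-width range when all values are equal.
        i = min(int((x - xmin) / (xmax - xmin + 1e-12) * nx), nx - 1)
        j = min(int((y - ymin) / (ymax - ymin + 1e-12) * ny), ny - 1)
        counts[j][i] += 1
    return counts
```

A density plot (or the report's image-processing steps) can then operate on the small `counts` grid instead of the raw sample.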
Advocates for multi-age classrooms claim multi-age groupings benefit children (Brynes, Shuster, & Jones, 1994). Currently, there is a lack of research examining play among students in multi-age classrooms. If indeed there is a positive benefit of play among children, research is needed to examine these behaviors among and between young children in…
This report documents the second of two studies on teaching and learning generic skills in high schools. It extends the earlier work by providing a model for designing classroom instruction in both academic and vocational classrooms where teaching generic skills is an instructional goal. Ethnographic field methods were used to observe, record, and…
Since 2002 we have been investigating the use of an electronic classroom communication system in large first-year lecture classes. Handheld keypads were distributed to teams of students during a lecture class. Students used the keypads to answer two-step multiple-choice problems after a discussion within their group. The questions were generated using students' answers from previous exams. We have evaluated our use of the classroom communication system using a survey about how comfortable students are with this type of interaction. In addition, we have tried to determine if the use of the classroom communication system can be linked to student performance on exams. Our results show that students are comfortable with this technology and feel that, on the whole, interactive lectures are useful. At first glance, there is an improvement in students' exam performance, but there are too many competing factors to clearly say that this improvement is solely due to the use of the classroom communication system. Even though this paper is based in physics and a physics example is used to illustrate points, the technique can be applied to other discipline areas.
Background Genetic analyses in large sample populations are important for a better understanding of the variation between populations, for designing conservation programs, and for detecting rare mutations which may be risk factors for a variety of diseases, among other reasons. However, these analyses frequently assume that the participating individuals or animals are mutually unrelated, which may not be the case in large samples, leading to erroneous conclusions. In order to retain as much data as possible while minimizing the risk of false positives, it is useful to identify a large subset of relatively unrelated individuals in the population. This can be done using a heuristic for finding a large set of independent nodes in an undirected graph. We describe a fast randomized heuristic for this purpose. The same methodology can also be used for identifying a suitable set of markers for analyzing population stratification, and in other instances where a rapid heuristic for maximal independent sets in large graphs is needed. Results We present FastIndep, a fast random heuristic algorithm for finding a maximal independent set of nodes in an arbitrary undirected graph, along with an efficient implementation in C++. On a 64-bit Linux or MacOS platform the execution time is a few minutes, even with a graph of several thousand nodes. The algorithm can discover multiple solutions of the same cardinality. FastIndep can be used to discover unlinked markers and unrelated individuals in populations. Conclusions The methods presented here provide a quick and efficient way to identify sets of unrelated individuals in large populations and unlinked markers in marker panels. The C++ source code and instructions, along with utilities for generating the input files in the appropriate format, are available at http://taurus.ansci.iastate.edu/wiki/people/jabr/Joseph_Abraham.html
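The abstract does not give FastIndep's internals, but a generic greedy randomized heuristic for maximal independent sets, restarted with different seeds and keeping the largest result, conveys the idea (function names are ours; nodes stand in for individuals, edges for relatedness):

```python
import random

def random_mis(nodes, edges, seed=0):
    """Greedy randomized maximal independent set.

    Visits nodes in random order, adding each one whose neighbours
    are all still outside the set. The result is always maximal
    (no node can be added), though not necessarily maximum.
    """
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    rng = random.Random(seed)
    order = list(nodes)
    rng.shuffle(order)
    chosen = set()
    for n in order:
        if adj[n].isdisjoint(chosen):
            chosen.add(n)
    return chosen

def best_of(nodes, edges, tries=20):
    # Different seeds can yield different maximal sets; keep the largest.
    return max((random_mis(nodes, edges, seed=s) for s in range(tries)), key=len)
```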
When working with large data sets, users perform three primary types of activities: data manipulation, data analysis, and data visualization. The data manipulation process involves the selection and transformation of data prior to viewing. This paper addresses user goals for this process and the interactive interface mechanisms that support them. We consider three classes of data manipulation goals: controlling the
Reform efforts in science education have called for instructional methods and resources that mirror the practice of science. Few research and design methods for creating such materials have been documented in the literature. The purpose of this study was to develop problem sets for sophomore-level organic chemistry instruction. This research adapted an instructional design methodology from the science education literature for the creation of new curricular problem sets. The first phase of this study was to establish an understanding of current curricular problems in sophomore-level organic chemistry instruction. A sample of 792 problems was collected from four organic chemistry courses. These problems were assessed using three problem typologies reported in the literature. Two of these problem typologies have previously been used to understand general chemistry problems; comparisons between general and organic chemistry problems were thus made. Data from this phase were used to develop a set of five problems for practicing organic chemists. The second phase of this study was to explore practicing organic chemists' experiences solving problems in the context of organic synthesis research. Eight practicing organic chemists were interviewed and asked to solve two to three of the problems developed in phase one of this research. These participants spoke of three problem types: project level, synthetic planning, and day-to-day. Three knowledge types (internal knowledge, knowledgeable others, and literature) were used in solving these problems, both in research practice and in the developed problems. A set of guiding factors and implications was derived from these data and the chemistry education literature for the conversion of the problems for practicing chemists into problems for undergraduate students. A subsequent conversion process for the five problems occurred.
The third and last phase of this study was to explore undergraduate students' experiences solving problems in the classroom. Eight undergraduate students from four different organic chemistry courses were interviewed and asked to solve three of the problems converted at the end of phase two. Data from these interviews were used to understand the problem types, methods, and knowledge used by undergraduate students in the problem-solving process. Data from all three phases were used to propose seven ideas for the development of problems for undergraduate students.
Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g., single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site.
As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially aid in providing a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics can be identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets. They could also be used as a means to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize, and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. All of these software tools would provide the researcher with unprecedented power without having to learn the intricacies and complexities of relational database systems.
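The bottom-up merging that a dendrogram records can be sketched with a naive single-linkage clusterer. This is an O(n^3) illustration of the concept, not the MatSeis dendrogram tool; real use would feed it waveform-correlation distances rather than scalars:

```python
def single_linkage(items, dist, k):
    """Agglomerative clustering with single linkage.

    Repeatedly merges the two clusters whose closest members are
    nearest, until k clusters remain: the same bottom-up process a
    dendrogram records, cut at the level that leaves k clusters.
    """
    clusters = [[x] for x in items]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Single linkage: distance between closest members.
                d = min(dist(a, b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters
```

Recording the merge distance at each step, instead of stopping at k clusters, yields the full dendrogram.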
Summary: With scientific data available at geocoded locations, investigators are increasingly turning to spatial process models for carrying out statistical inference. Over the last decade, hierarchical models implemented through Markov chain Monte Carlo methods have become especially popular for spatial modelling, given their flexibility and power to fit models that would be infeasible with classical methods as well as their avoidance of possibly inappropriate asymptotics. However, fitting hierarchical spatial models often involves expensive matrix decompositions whose computational complexity increases in cubic order with the number of spatial locations, rendering such models infeasible for large spatial data sets. This computational burden is exacerbated in multivariate settings with several spatially dependent response variables. It is also aggravated when data are collected at frequent time points and spatiotemporal process models are used. With regard to this challenge, our contribution is to work with what we call predictive process models for spatial and spatiotemporal data. Every spatial (or spatiotemporal) process induces a predictive process model (in fact, arbitrarily many of them). The latter models project process realizations of the former to a lower dimensional subspace, thereby reducing the computational burden. Hence, we achieve the flexibility to accommodate non-stationary, non-Gaussian, possibly multivariate, possibly spatiotemporal processes in the context of large data sets. We discuss attractive theoretical properties of these predictive processes. We also provide a computational template encompassing these diverse settings. Finally, we illustrate the approach with simulated and real data sets.
Banerjee, Sudipto; Gelfand, Alan E.; Finley, Andrew O.; Sang, Huiyan
As the number of nursing students increases, actively engaging all students in a large classroom becomes increasingly difficult. Clickers, or student response systems (SRS), are a relatively new technology in nursing education that use wireless technology and enable students to select individual responses to questions posed to them during class. The study design was a quasi-experimental comparison with one section of an adult medical-surgical course using the SRS and one receiving standard teaching. No significant differences between groups on any measure of performance were found. Focus groups were conducted to describe student perceptions of SRS. Three themes emerged: being able to respond anonymously, validating an answer while providing immediate feedback, and providing an interactive and engaging environment. Although the clickers did not improve learning outcomes as measured by objective testing, perceptions shared by students indicated an increased degree of classroom engagement. Future research needs to examine other potential outcome variables. PMID:20044180
Patterson, Barbara; Kilpatrick, Judith; Woebkenberg, Eric
Ann L. Brown, University of California, Berkeley. The lion's share of my current research program is devoted to the study of learning in the blooming, buzzing confusion of inner-city classrooms. My high-level goal is to transform grade-school classrooms from work sites where students perform assigned tasks under the management of teachers into communities of learning (Bereiter & Scardamalia, 1989; Brown &
The authors describe the capabilities of McIDAS, an interactive visualization system that is vastly increasing the ability of earth scientists to manage and analyze data from remote sensing instruments and numerical simulation models. McIDAS provides animated three-dimensional images and highly interactive displays. The software can manage, analyze, and visualize large data sets that span many physical variables (such as
Currently, behavior modification, stimulant medication, and combined treatments are supported as evidence-based interventions for attention deficit hyperactivity disorder in classroom settings. However, there has been little study of the relative effects of these two modalities and their combination in classrooms. Using a within-subject design, the…
Fabiano, Gregory A.; Pelham, William E., Jr.; Gnagy, Elizabeth M.; Burrows-MacLean, Lisa; Coles, Erika K.; Chacko, Anil; Wymbs, Brian T.; Walker, Kathryn S.; Arnold, Fran; Garefino, Allison; Keenan, Jenna K.; Onyango, Adia N.; Hoffman, Martin T.; Massetti, Greta M.; Robb, Jessica A.
This recently completed study examined whether attribution theory can explain helping behavior in an interdependent classroom environment that utilized a cooperative-learning model. The study focused on student participants enrolled in six community college communication classes taught by the same instructor. Three levels of cooperative learning were employed. Survey data were collected from student participants presented with situations describing a group member
The present paper aims to achieve a better understanding of the process of vocabulary acquisition by examining the development of lexical knowledge in both classroom and study abroad contexts. Taking Ife, Vives Boix, and Meara's (2000) study as a starting point, this study attempts to determine whether development in both levels of vocabulary…
A variety of coding schemes are available for direct observational assessment of student classroom behavior. These instruments have been used for a number of assessment tasks including screening children in need of further evaluation for emotional and behavior problems, diagnostic assessment of emotional and behavior problems, assessment of…
Volpe, Robert J.; DiPerna, James C.; Hintze, John M.; Shapiro, Edward S.
Many preschool, Head Start, and kindergarten educators of young children express concern about the number of children who exhibit frequent challenging behaviors and report that managing these behaviors is difficult within these classrooms. This article describes research-based strategies with practical applications that can be used as part of…
Preparing elementary students for online learning begins with basic computer competency. Computer competency skills were taught using integration of learned skills in the regular academic curriculum to sixth grade students under two conditions: (a) in a classroom with four computers, and (b) in a computer lab. Students of mixed ability (N = 53) were given pretest and posttest measures of
Audrey C. Rule; Manuel T. Barrera; C. Jolene Dockstader; John A. Derr
Osborne (1969) presented adolescent hearing-impaired students with five minutes of free time contingent upon remaining seated for 15 minutes in the classroom. This procedure substantially reduced the frequency of out-of-seat behavior. Since the publication of this seminal work, 49 studies have used free time as a consequence to change behavior. This review critically examines these 49 studies.
Service user involvement in pre-registration nurse education is now a requirement, yet little is known about how students engage with users in the classroom, how such initiatives are being evaluated, how service users are prepared themselves to teach students, or the potential influence on clinical practice. The aim of this literature review was to bring together published articles on service user involvement in classroom settings in pre-registration mental health nurse education programmes, including their evaluations. A comprehensive review of the literature was carried out via computer search engines and the Internet, as well as a hand search of pertinent journals and references. This produced eight papers that fitted the inclusion criteria, comprising four empirical studies and four review articles, which were then reviewed using a seven-item checklist. The articles revealed that a range of teaching and learning strategies had been employed, ranging from exposure to users' personal stories, to students being required to demonstrate awareness of user perspectives in case study presentations, with others involving eLearning and assessment skills initiatives. This review concludes that further longitudinal research is needed to establish the influence of user involvement in the classroom over time. PMID:22296494
Automated instruments for DNA sequencing greatly simplify data collection in the Sanger sequencing procedure. By contrast, the so-called front-end problems of preparing sequencing templates, performing sequencing reactions, and loading these on the instruments remain major obstacles to extensive sequencing projects. We describe here the use of a manifold support to prepare and perform sequencing reactions on large sets of templates in parallel, as well as to load the reaction products on a sequencing instrument. In this manner, all reaction steps are performed without pipetting the samples. The strategy is applied to sequencing PCR-amplified clones of the human mitochondrial D-loop and for detection of heterozygous positions in the human major histocompatibility complex class II gene HLA-DQB, amplified from genomic DNA samples. This technique will promote sequencing in a clinical context and could form the basis of more efficient genomic sequencing strategies. PMID:8134382
Lagerkvist, A; Stewart, J; Lagerström-Fermér, M; Landegren, U
The authors describe the capabilities of McIDAS, an interactive visualization system that is vastly increasing the ability of earth scientists to manage and analyze data from remote sensing instruments and numerical simulation models. McIDAS provides animated three-dimensional images and highly interactive displays. The software can manage, analyze, and visualize large data sets that span many physical variables (such as temperature, pressure, humidity, and wind speed), as well as time and three spatial dimensions. The McIDAS system manages data from at least 100 different sources. The data management tools consist of data structures for storing different data types in files, libraries of routines for accessing these data structures, system commands for performing housekeeping functions on the data files, and reformatting programs for converting external data to the system's data structures. The McIDAS tools for three-dimensional visualization of meteorological data run on an IBM mainframe and can load up to 128-frame animation sequences into the workstations. A highly interactive version of the system can provide an interactive window into data sets containing tens of millions of points produced by numerical models and remote sensing instruments. The visualizations are being used for teaching as well as by scientists.
While climate models have used parallelism for several years, the post-processing tools are still mostly single-threaded applications and many are closed source. These tools are becoming a bottleneck in the production of new climate knowledge when they confront terabyte-sized output from high-resolution climate models. The ParVis project is using and creating Free and Open Source tools that bring data and task parallelism to climate model analysis to enable analysis of large climate data sets. ParVis is using the Swift task-parallel language to implement a diagnostic suite that generates over 600 plots of atmospheric quantities. ParVis has also created a Parallel Gridded Analysis Library (ParGAL) which implements many common climate analysis operations in a data-parallel fashion using the Message Passing Interface. ParGAL has in turn been built on sophisticated packages for describing grids in parallel (the Mesh Oriented database, MOAB), performing vector operations on arbitrary grids (Intrepid), and reading data in parallel (PnetCDF). ParGAL is being used to implement a parallel version of the NCAR Command Language (NCL) called ParNCL. ParNCL/ParGAL not only speeds up analysis of large datasets but also allows operations to be performed on native grids, eliminating the need to transform data to latitude-longitude grids. All of the tools ParVis is creating are available as free and open source software.
Identity by descent (IBD) inference is the task of computationally detecting genomic segments that are shared between individuals by means of common familial descent. Accurate IBD detection plays an important role in various genomic studies, ranging from mapping disease genes to exploring ancient population histories. The majority of recent work in the field has focused on improving the accuracy of inference, targeting shorter genomic segments that originate from a more ancient common ancestor. The accuracy of these methods, however, is achieved at the expense of high computational cost, resulting in a prohibitively long running time when applied to large cohorts. To enable the study of large cohorts, we introduce SpeeDB, a method that facilitates fast IBD detection in large unphased genotype data sets. Given a target individual and a database of individuals that potentially share IBD segments with the target, SpeeDB applies an efficient opposite-homozygous filter, which excludes chromosomal segments from the database that are highly unlikely to be IBD with the corresponding segments from the target individual. The remaining segments can then be evaluated by any IBD detection method of choice. When examining simulated individuals sharing 4 cM IBD regions, SpeeDB filtered out 99.5% of genomic regions from consideration while retaining 99% of the true IBD segments. Applying the SpeeDB filter prior to detecting IBD in simulated fourth cousins resulted in an overall running time that was 10,000x faster than inferring IBD without the filter and retained 99% of the true IBD segments in the output.
Huang, Lin; Bercovici, Sivan; Rodriguez, Jesse M.; Batzoglou, Serafim
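The opposite-homozygous filter described in the SpeeDB abstract can be sketched directly: a site where one individual carries zero copies of the minor allele and the other carries two is incompatible with identity by descent, so any window containing such a site can be excluded before running a full IBD-detection method. The function name, window scheme, and genotype encoding (minor-allele counts 0/1/2) are our own illustrative choices:

```python
def candidate_windows(target, other, window):
    """Opposite-homozygous filter in the spirit of SpeeDB.

    Genotypes are minor-allele counts (0, 1, or 2). A site where one
    individual is 0 and the other is 2 rules out IBD, so any window
    containing such a site is dropped; surviving windows go on to a
    full IBD-detection method of choice.
    """
    n = len(target)
    survivors = []
    for start in range(0, n, window):
        end = min(start + window, n)
        # abs(g1 - g2) == 2 occurs only for the 0-vs-2 case.
        if all(abs(target[i] - other[i]) != 2 for i in range(start, end)):
            survivors.append((start, end))
    return survivors
```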
Nurse educators must explore innovative technologies that make the most of the characteristics and learning styles of millennial learners. These students are comfortable with technology and prefer interactive classrooms with individual feedback and peer collaboration. This study evaluated the perceived effectiveness of personal response system (PRS) technology in enhancing student learning in small and large classrooms. PRS technology was integrated into two undergraduate courses, nursing research (n = 33) and junior medical-surgical nursing (n = 116). Multiple-choice, true-false, NCLEX-RN alternate format, and reading quiz questions were incorporated within didactic PowerPoint presentations. Data analysis of Likert-type and open-response questions supported the use of PRS technology as an effective strategy for educating millennial learners in both small and large classrooms. PRS technology promotes active learning, increases participation, and provides students and faculty with immediate feedback that reflects comprehension of content and increases faculty-student interaction. PMID:20055325
This Teaching Resource provides lecture notes, slides, and a problem set for a series of lectures introducing the mathematical concepts behind gene-set enrichment analysis (GSEA), which were part of a course entitled “Systems Biology: Biomedical Modeling.” GSEA is a statistical functional enrichment analysis commonly applied to identify enrichment of biological functional categories in sets of ranked differentially expressed genes from genome-wide mRNA expression data sets.
Neil R. Clark (Mount Sinai School of Medicine, New York); Avi Ma'ayan (Mount Sinai School of Medicine, New York)
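The running-sum statistic at the heart of GSEA can be illustrated in its unweighted form. This is a simplified sketch of the classic Kolmogorov-Smirnov-like score; the published method additionally weights steps by correlation with the phenotype and assesses significance by permutation:

```python
def enrichment_score(ranked_genes, gene_set):
    """Unweighted GSEA-style running-sum enrichment score.

    Walks the ranked gene list, stepping up by 1/(number of hits) at
    genes in the set and down by 1/(number of misses) otherwise; the
    score is the maximum deviation of the running sum from zero.
    Assumes the set has at least one hit and one miss in the list.
    """
    hits = sum(1 for g in ranked_genes if g in gene_set)
    misses = len(ranked_genes) - hits
    running, best = 0.0, 0.0
    for g in ranked_genes:
        running += 1.0 / hits if g in gene_set else -1.0 / misses
        if abs(running) > abs(best):
            best = running
    return best
```

A set concentrated at the top of the ranking drives the score toward +1, one concentrated at the bottom toward -1.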
Observations of 780 third-grade classrooms described classroom activities, child-teacher interactions, and dimensions of the global classroom environment, which were examined in relation to structural aspects of the classroom and child behavior. One child per classroom was targeted for observation in relation to classroom quality and teacher and…
We report on a large-scale implementation of extensive reading (ER) in a university setting in Japan where all students were required to read outside class time as part of their course requirement. A pre/posttest comparison between the 2009 cohort of students who read outside of class and the 2008 cohort who did no outside reading shows that the…
In model identification, calibration or sensitivity analysis, the model-parameter values may be required to yield model-output values that satisfy specified constraints, for given initial conditions and forcing. Inequality constraints on scalar functions of the model outputs (henceforth called output bounds) confine the parameters to their feasible set. Output bounds are fundamental to regional sensitivity analysis and a desirable addition to
Reform efforts in science education have called for instructional methods and resources that mirror the practice of science. Few research and design methods for creating such materials have been documented in the literature. The purpose of this study was to develop problem sets for sophomore-level organic chemistry instruction. This research adapted an instructional design methodology from the science education
Since the identification of stress and the relationship of individual stress responses to physical and mental health, medical and behavioral professionals have been training individuals in coping strategies. To investigate the possibility of teaching cognitive coping skills to a nonclinical population in an academic setting, 41 college students…
One skill that elementary students need to acquire is the ability to analyze spatially distributed data. In this activity students are asked to complete the following tasks: (1) plot a set of data (related to "mud-sharks"--an imaginary fish) on a map of the state of Alabama, (2) identify trends in the data, (3) make graphs using the data…
As more young children enter school settings to attend early childhood programs, early childhood teachers and school psychologists have been charged with supporting a growing number of young children with chronic problem behaviors that put them at risk for the development of emotional/behavioral disorders (EBDs). There is a need for effective,…
Vo, Abigail K.; Sutherland, Kevin S.; Conroy, Maureen A.
Working within a person-process-context framework, we investigated the relation of the level of preschool children's compliance to child temperament, caregiver-child interaction in the child care setting, child care quality, and contextual chaos. Participants were 86 preschoolers and their teachers. Our database included both questionnaires and…
Viewpoints is an interactive tool for exploratory visual analysis of large high-dimensional (multivariate) data. It uses linked scatterplots to find relations in a few seconds that can take much longer with other plotting tools. Its features include linked scatter plots with brushing, dynamic histograms, normalization, and outlier detection/removal.
We are given a large database of customer transactions. Each transaction consists of items purchased by a customer in a visit. We present an efficient algorithm that generates all significant association rules between items in the database. The algorithm incorporates buffer management and novel estimation and pruning techniques. We also present results of applying this algorithm to sales data obtained from a
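The notion of a significant association rule (item A implies item B with sufficient support and confidence) can be illustrated with a toy pairwise generator. This is a didactic sketch, not the paper's algorithm: real algorithms such as Apriori prune candidate itemsets level by level instead of enumerating all pairs, and handle itemsets larger than two:

```python
from itertools import combinations

def association_rules(transactions, min_support, min_confidence):
    """Toy generator of pairwise association rules A -> B.

    Counts item and pair supports, then emits (lhs, rhs, support,
    confidence) tuples whose support and confidence clear the given
    thresholds.
    """
    n = len(transactions)
    item_count = {}
    pair_count = {}
    for t in transactions:
        items = sorted(set(t))
        for i in items:
            item_count[i] = item_count.get(i, 0) + 1
        for pair in combinations(items, 2):
            pair_count[pair] = pair_count.get(pair, 0) + 1
    rules = []
    for (a, b), c in pair_count.items():
        if c / n < min_support:
            continue
        for lhs, rhs in ((a, b), (b, a)):
            # Confidence: fraction of lhs transactions also containing rhs.
            conf = c / item_count[lhs]
            if conf >= min_confidence:
                rules.append((lhs, rhs, c / n, conf))
    return rules
```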
This paper describes a technically driven, collaborative approach to assessing the function of problem behavior using web-based technology. A case example is provided to illustrate the process used in this pilot project. A school team conducted a functional analysis with a child who demonstrated challenging behaviors in a preschool setting. Behavior analysts at a university setting provided the school team with initial workshop trainings, on-site visits, e-mail and phone communication, as well as live web-based feedback on functional analysis sessions. The school personnel implemented the functional analysis with high fidelity and scored the data reliably. Outcomes of the project suggest that there is great potential for collaboration via the use of web-based technologies for ongoing assessment and development of effective interventions. However, an empirical evaluation of this model should be conducted before wide-scale adoption is recommended.
The Classroom Observation Schedule to Measure Intentional Communication (COSMIC) was devised to provide ecologically valid outcome measures for a communication-focused intervention trial. Ninety-one children with autism spectrum disorder aged 6 years 10 months (SD 16 months) were videoed during their everyday snack, teaching and free play…
Pasco, Greg; Gordon, Rosanna K.; Howlin, Patricia; Charman, Tony
High-performance computing is often concerned with the speed at which floating-point calculations can be performed. The architectures of many parallel computers and/or their network topologies are based on these investigations. Often, benchmarks resulting from these investigations are compiled with little regard to how a large dataset would move about in these systems. This part of the Beowulf study addresses that concern by looking at specific applications software and system-level modifications. Applications include an implementation of a smoothing filter for time-series data, a parallel implementation of the decision tree algorithm used in the Landcover Characterization project, a parallel Kriging algorithm used to fit point data collected in the field on invasive species to a regular grid, and modifications to the Beowulf project's resampling algorithm to handle larger, higher resolution datasets at a national scale. Systems-level investigations include a feasibility study on Flat Neighborhood Networks and modifications of that concept with Parallel File Systems.
Steinwand, Daniel R.; Maddox, Brian; Beckmann, Tim; Schmidt, Gail
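The time-series smoothing filter mentioned above can be sketched as a centered moving average. This is a generic illustration of the filter class, not the study's code; the parallelization note in the docstring reflects the standard chunk-with-overlap approach rather than the authors' specific implementation:

```python
def moving_average(series, width):
    """Centered moving-average smoothing filter for a time series.

    Each output value averages the samples in a window around the
    corresponding input sample; at the edges, only the in-range part
    of the window is used. In a parallel setting, the series can be
    split into chunks that overlap by width // 2 samples so that
    every interior window stays complete.
    """
    half = width // 2
    out = []
    for i in range(len(series)):
        lo = max(0, i - half)
        hi = min(len(series), i + half + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out
```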
In this paper, we propose OPOSSUM, a novel similarity-based clustering algorithm using constrained, weighted graph-partitioning. Instead of binary presence or absence of products in a market-basket, we use an extended 'revenue per product' measure to better account for management objectives. Typically the number of clusters desired in a database marketing application is only in the teens or less. OPOSSUM proceeds top-down, which is more efficient and takes a small number of steps to attain the desired number of clusters as compared to bottom-up agglomerative clustering approaches. OPOSSUM delivers clusters that are balanced in terms of either customers (samples) or revenue (value). To facilitate data exploration and validation of results we introduce CLUSION, a visualization toolkit for high-dimensional clustering problems. To enable closed loop deployment of the algorithm, OPOSSUM has no user-specified parameters. Thresholding heuristics are avoided and the optimal number of clusters is automatically determined by a search for maximum performance. Results are presented on a real retail industry data-set of several thousand customers and products, to demonstrate the power of the proposed technique.
Near-infrared spectroscopy (NIRS) studies have revealed that mental arithmetic tasks are accompanied by detectable event-related hemodynamic responses. Thus a NIRS-based Brain Computer Interface (BCI) has the potential for investigating how best to teach mathematics in a classroom setting. This paper presents a novel computational intelligence method of applying a rough set-based neuro-fuzzy system (RNFS) in a NIRS-based BCI for assessing numerical
Kai Keng Ang; Cuntai Guan; Kerry Lee; Jie Qi Lee; Shoko Nioka; Britton Chance
The effects of classroom noise and background speech on speech perception, measured by word-to-picture matching, and listening comprehension, measured by execution of oral instructions, were assessed in first- and third-grade children and adults in a classroom-like setting. For speech perception, in addition to noise, reverberation time (RT) was varied by conducting the experiment in two virtual classrooms with mean RT = 0.47 versus RT = 1.1 s. Children were more impaired than adults by background sounds in both speech perception and listening comprehension. Classroom noise evoked a reliable disruption in children's speech perception even under conditions of short reverberation. RT had no effect on speech perception in silence, but evoked a severe increase in the impairments due to background sounds in all age groups. For listening comprehension, impairments due to background sounds were found in the children, stronger for first- than for third-graders, whereas adults were unaffected. Compared to classroom noise, background speech had a smaller effect on speech perception, but a stronger effect on listening comprehension, remaining significant when speech perception was controlled. This indicates that background speech affects higher-order cognitive processes involved in children's comprehension. Children's ratings of the sound-induced disturbance were low overall and uncorrelated to the actual disruption, indicating that the children did not consciously realize the detrimental effects. The present results confirm earlier findings on the substantial impact of noise and reverberation on children's speech perception, and extend these to classroom-like environmental settings and listening demands closely resembling those faced by children at school. PMID:20871182
Head Start CARES (Classroom-based Approaches and Resources for Emotion and Social Skill Promotion) is a large-scale, national research demonstration that was designed to test the effects of a one-year program aimed at improving pre-kindergarteners' social and emotional readiness for school. To facilitate the delivery of the program, teachers…
Geoscience and education faculty at The University of Akron jointly developed a series of inquiry-based learning modules aimed at both non-major and major student populations enrolled in introductory geology courses. These courses typically serve 2500 students per year in four to six sections of 40-160 students each. Twelve modules were developed that contained common topics and assessments appropriate to Earth Science, Environmental Geology and Physical Geology classes. All modules were designed to meet four primary learning objectives agreed upon by Department of Geology faculty. These major objectives include: 1) Improvement of student understanding of the scientific method; 2) Incorporation of problem solving strategies involving analysis, synthesis, and interpretation; 3) Development of the ability to distinguish between inferences, data and observations; and 4) Obtaining an understanding of basic processes that operate on Earth. Additional objectives that may be addressed by selected modules include: 1) The societal relevance of science; 2) Use and interpretation of quantitative data to better understand the Earth; 3) Development of the students' ability to communicate scientific results; 4) Distinguishing differences between science, religion and pseudo-science; 5) Evaluation of scientific information found in the mass media; and 6) Building interpersonal relationships through in-class group work. Student pre- and post-instruction progress was evaluated by administering a test of logical thinking, an attitude toward science survey, and formative evaluations. Scores from the logical thinking instrument were used to form balanced four-person working groups based on the students' incoming cognitive level. Groups were required to complete a series of activities and/or exercises that targeted different cognitive domains based upon Bloom's taxonomy (knowledge, comprehension, application, analysis, synthesis and evaluation of information).
Daily assessments of knowledge-level learning included evaluations of student responses to pre- and post-instruction conceptual test questions, short group exercises and content-oriented exam questions. Higher level thinking skills were assessed when students completed exercises that required the completion of Venn diagrams, concept maps and/or evaluation rubrics both during class periods and on exams. Initial results indicate that these techniques improved student attendance significantly and improved overall retention in the course by 8-14% over traditional lecture formats. Student scores on multiple choice exam questions were slightly higher (1-3%) for students taught in the active learning environment and short answer questions showed larger gains (7%) over students' scores in a more traditional class structure.
The only way to gain genuine expertise in Statistical Process Control (SPC) and the design of experiments (DOX) is with repeated practice, but not on canned problems with dead data sets. Rather, one must negotiate a wide variety of problems each with its own peculiarities and its own constantly changing data. The problems should not be of the type for which there is a single, well-defined answer that can be looked up in a fraternity file or in some text. The problems should match as closely as possible the open-ended types for which there is always an abundance of uncertainty. These are the only kinds that arise in real research, whether that be basic research in academe or engineering research in industry. To gain this kind of experience, either as a professional consultant or as an industrial employee, takes years. Vast amounts of money, not to mention careers, must be put at risk. The purpose here is to outline some realistic simulation-type lab exercises that are so simple and inexpensive to run that the students can repeat them as often as desired at virtually no cost. Simulations also allow the instructor to design problems whose outcomes are as noisy as desired but still predictable within limits. Also the instructor and the students can learn a great deal more from the postmortem conducted after the exercise is completed. One never knows for sure what the true data should have been when dealing only with real life experiments. To add a bit more realism to the exercises, it is sometimes desirable to make the students pay for each experimental result from a make-believe budget allocation for the problem.
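A simulation-type exercise of the kind proposed might look like the following sketch, where the instructor controls the true mean and noise and students compute x-bar control limits (names and parameter values here are illustrative, not from the paper):

```python
# A minimal simulated SPC exercise: generate subgrouped process data with
# instructor-chosen mean and noise, then compute Shewhart x-bar chart limits
# from the subgroup means. All parameter values are illustrative.
import random
import statistics

def simulate_subgroups(mean, sd, n_subgroups, subgroup_size, seed=0):
    rng = random.Random(seed)  # seeded so the "true" answer is reproducible
    return [[rng.gauss(mean, sd) for _ in range(subgroup_size)]
            for _ in range(n_subgroups)]

def xbar_limits(subgroups):
    means = [statistics.mean(g) for g in subgroups]
    grand = statistics.mean(means)
    s = statistics.stdev(means)
    return grand - 3 * s, grand, grand + 3 * s  # LCL, center line, UCL

data = simulate_subgroups(mean=10.0, sd=0.5, n_subgroups=25, subgroup_size=5)
lcl, center, ucl = xbar_limits(data)
print(round(center, 2))
```

Because the simulation's true mean is known, the postmortem can compare the students' estimated limits against the values the instructor built in.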
A belief that pre-registration nursing programmes in the United Kingdom were not adequately equipping students with fundamental clinical skills has led to increasing interest in alternative methods for developing students' practical skills, as an adjunct to their placement experiences. Whilst recent literature offers insight into the operational aspects of developing and running clinical skills facilities in higher education institutions, there is limited evidence regarding the types of procedures taught, or the risks and benefits to students practising these procedures on each other. This study therefore sought to identify the current status of peer-practised learning within pre-registration nurse education. A survey approach was adopted and questionnaires were sent to all Higher Education Institutes delivering pre-registration nursing and midwifery programmes in the United Kingdom (n=72). Ethical approval was acquired and principles of strict confidentiality were adhered to throughout. Both quantitative and qualitative data were obtained. Quantitative data were analysed using SPSS (version 11.5), and qualitative data were systematically scrutinised for emerging themes. The findings support the notion that peer-practised learning in the classroom setting is a desirable method of teaching and learning core clinical skills from a teacher perspective. However, notable inconsistencies in the range of procedures students are allowed to perform on each other were found. The mechanisms of risk assessment and concept of consent were also found to be decidedly variable. PMID:18499523
Objectives: Graphical displays can make data more understandable; however, large graphs can challenge human comprehension. We have previously described a filtering method to provide high-level summary views of large data sets. In this paper we demonstrate our method for setting and selecting thresholds to limit graph size while retaining important information by applying it to large single and paired data sets, taken from patient and bibliographic databases. Methods: Four case studies are used to illustrate our method. The data are either patient discharge diagnoses (coded using the International Classification of Diseases, Clinical Modifications [ICD9-CM]) or Medline citations (coded using the Medical Subject Headings [MeSH]). We use combinations of different thresholds to obtain filtered graphs for detailed analysis. Threshold setting and selection, such as thresholds for node counts, class counts, ratio values, p values (for diff data sets), and percentiles of selected class count thresholds, are demonstrated in detail in the case studies. The main steps include: data preparation, data manipulation, computation, and threshold selection and visualization. We also describe the data models for different types of thresholds and the considerations for threshold selection. Results: The filtered graphs are 1%-3% of the size of the original graphs. For our case studies, the graphs provide 1) the most heavily used ICD9-CM codes, 2) the codes with most patients in a research hospital in 2011, 3) a profile of publications on "heavily represented topics" in MEDLINE in 2011, and 4) validated knowledge about adverse effects of the medication rosiglitazone and new interesting areas in the ICD9-CM hierarchy associated with patients taking the medication pioglitazone. Conclusions: Our filtering method reduces large graphs to a manageable size by removing relatively unimportant nodes.
The graphical method provides summary views based on computation of usage frequency and semantic context of hierarchical terminology. The method is applicable to large data sets (such as a hundred thousand records or more) and can be used to generate new hypotheses from data sets coded with hierarchical terminologies. PMID:24727931
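The node-count threshold, the simplest of the filters listed, can be sketched as follows (an assumed minimal form; the paper's method combines several threshold types with the terminological hierarchy):

```python
# A sketch of the core filtering idea, assumed from the description rather
# than taken from the authors' code: keep only nodes of a coded hierarchy
# whose usage counts meet a chosen threshold, so the displayed graph shrinks
# to the heavily used part of the terminology.

def filter_nodes(counts, threshold):
    """counts: {code: usage count}. Returns the set of codes passing the threshold."""
    return {code for code, n in counts.items() if n >= threshold}

# Hypothetical ICD9-CM-style usage counts.
counts = {"250": 120, "250.0": 90, "250.1": 20, "401": 300, "401.1": 5}
print(sorted(filter_nodes(counts, 50)))
```

Combining several such filters (node counts, class counts, ratios) multiplies their pruning effect, which is how the 1%-3% reductions reported above become possible.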
Welcome the Madagascar hissing cockroach into your classroom--they are not your average pest! This article describes the basic biology of this relatively tame creature, and how to set up and care for a classroom colony. It includes a list of suggested inquiry-centered classroom activities that you and your students will find both educational and fun!
Alcohol use and other drug use affect patient healthcare outcomes. This article describes a classroom-to-clinical approach teaching nursing students to utilize motivational interviewing techniques to support patient behavior change. Through the lens of a universal prevention method, nursing students learned about reward circuit activation leading to risky substance use and the difference between addiction and at-risk use. Specific assessment tools and motivational interviewing techniques were presented in the classroom. Students then applied their knowledge in simulation laboratories and clinical rotations. PMID:24743176
Kane, Irene; Mitchell, Ann M; Puskar, Kathryn R; Hagle, Holly; Talcott, Kimberly; Fioravanti, Marie; Droppa, Mandy; Luongo, Peter F; Lindsay, Dawn
Abstract The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of the spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, it uses the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas few or no enriched terms were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific.
Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene expression pattern features of any large zebrafish gene set.
Public engagement and outreach activities increasingly rely on specialist staff for co-ordination, training and support of researchers, and such activities are increasingly expected of large research investments. Here, the experience of public engagement and outreach at a large, interdisciplinary Research Hub is described. dot.rural, based at the University of Aberdeen UK, is a £11.8 million Research Councils UK Rural Digital Economy Hub, funded as part of the RCUK Digital Economy Theme (2009-2015). Digital Economy research aims to realise the transformational impact of digital technologies on aspects of the environment, community life, cultural experiences, future society, and the economy. The dot.rural Hub involves 92 researchers from 12 different disciplines, including Geography, Hydrology and Ecology. Public engagement and outreach is embedded in the dot.rural Digital Economy Hub via an Outreach Officer, and public engagement and outreach activities are a compulsory part of PhD student contracts. Public engagement and outreach activities at the dot.rural Hub involve individuals and groups in both formal and informal settings, organised by dot.rural and other organisations. Activities in the realms of education, public engagement, and traditional and social media are guided by a set of Underlying Principles designed for the Hub by the Outreach Officer. These principles match funding agency requirements and expectations alongside researcher demands and the user-led nature of Digital Economy research. All activities include researchers alongside the Outreach Officer, are research-informed, and are embedded into the specific projects that form the Hub. Successful public engagement activities have included participation in a Café Scientifique series, workshops in primary and secondary schools, and online activities such as I'm a Scientist, Get Me Out of Here.
Examples range from engaging 8-year-olds, to making hydrographs more understandable to members of the public, to blogging birds, to engaging with remote, rural communities through Spiegeltents. This presentation shares successful public engagement and outreach events alongside some less successful experiences and lessons learnt along the way.
Research Findings: This paper reports on children's use of science materials in preschool classrooms during their free choice time. Baseline observations showed that children and teachers rarely spend time in the designated science area. An intervention was designed to "market" the science center by introducing children to 1 science tool, the…
We describe a novel approach to design a set of primers selective for large groups of genes. This method is based on the distribution frequency of all nucleotide combinations (octa- to decanucleotides), and the combined ability of primer pairs, based on these oligonucleotides, to detect genes. By analyzing 1000 human mRNAs, we found that a surprisingly small subset of octanucleotides is shared by a high proportion of human protein-coding region sense strands. By computer simulation of polymerase chain reactions, a set based on only 30 primers was able to detect approximately 75% of known (and presumably unknown) human protein-coding regions. To validate the method and provide experimental support for the feasibility of the more ambitious goal of targeting human protein-coding regions, we sought to apply the technique to a large protein family: G-protein coupled receptors (GPCRs). Our results indicate that there is sufficient low level homology among human coding regions to allow design of a limited set of primer pairs that can selectively target coding regions in general, as well as genomic subsets (e.g., GPCRs). The approach should be generally applicable to human coding regions, and thus provide an efficient method for analyzing much of the transcriptionally active human genome. 23 refs., 5 figs., 2 tabs.
Lopez-Nieto, C.E.; Nigam, S.K. (Brigham and Women's Hospital and Harvard Medical School, Boston, MA, United States)
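The frequency-ranking step of the primer-design idea can be sketched as below (assumed details; the actual method also simulates PCR with primer pairs and optimizes their combined gene coverage):

```python
# A sketch of the octanucleotide-frequency idea: count how many coding
# sequences contain each octamer, then rank octamers by the number of
# sequences they cover. Sequences here are toy examples.
from collections import defaultdict

def octamer_coverage(sequences, k=8):
    """Map each k-mer to the set of sequence indices containing it."""
    hits = defaultdict(set)
    for i, seq in enumerate(sequences):
        for j in range(len(seq) - k + 1):
            hits[seq[j:j + k]].add(i)
    return hits

def top_octamers(sequences, n):
    hits = octamer_coverage(sequences)
    return sorted(hits, key=lambda o: -len(hits[o]))[:n]

seqs = ["ATGGCCAAGGCCAAG", "TTGGCCAAGGTTTTT", "ATGGCCAAGGCCAAA"]
best = top_octamers(seqs, 1)[0]
print(best)
```

On real mRNA data the surprise reported above is that the ranking is extremely top-heavy: a few dozen octamers cover most coding sequences.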
This article presents a new classroom observation scale that was developed to examine the differential learning activities and experiences of gifted children educated in regular classroom settings. The Differentiated Classroom Observation Scale (DCOS) is presented in total, with clarification of the coding practices and strategies. Although the DCOS was developed to examine the impact of differentiated classroom practices for gifted
Jerrell C. Cassady; Kristie L. Speirs Neumeister; Cheryll M. Adams; Tracy L. Cross; Felicia A. Dixon; Rebecca L. Pierce
Large atomic natural orbital (ANO) basis sets are tabulated for Sc to Cu. The primitive sets are taken from the large sets optimized by Partridge, namely (21s 13p 8d) for Sc and Ti and (20s 12p 9d) for V to Cu. These primitive sets are supplemented with three p, one d, six f, and four g functions. The ANO sets are derived from configuration interaction density matrices constructed as the average of the lowest states derived from the 3d^n 4s^2 and 3d^(n+1) 4s^1 occupations. For Ni, the ^1S (3d^10) state is included in the averaging. The choice of basis sets for molecular calculations is discussed.
Bauschlicher, Charles W., Jr.; Langhoff, Stephen R. (Technical Monitor)
Determining the number of clusters present in a data set automatically is a very important problem. Conventional clustering techniques assume a certain number of clusters, and then try to find out the possible cluster structure associated to the above number. For very large and complex data sets it is not easy to guess this number of clusters. There exists validity
This Teaching Resource provides lecture notes, slides, and a problem set for a series of lectures from a course entitled “Systems Biology: Biomedical Modeling.” The materials are a lecture introducing the mathematical concepts behind principal components analysis (PCA). The lecture describes how to handle large data sets with correlation methods and unsupervised clustering with this popular method of analysis, PCA.
Neil R. Clark (Mount Sinai School of Medicine, New York); Avi Ma'ayan (Mount Sinai School of Medicine, New York)
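The core of the lecture topic, extracting principal components from the eigen-decomposition of a covariance matrix, can be shown in a dependency-free two-dimensional sketch (closed-form 2x2 eigenvalues; real data sets would use a library SVD routine):

```python
# A small worked PCA sketch: center 2-D data, form the sample covariance
# matrix, and read the principal variances off its eigenvalues (closed form
# for the 2x2 case). Data points are illustrative.
import math

def pca_2d(points):
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    cxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    cyy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    cxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    # Eigenvalues of [[cxx, cxy], [cxy, cyy]] via the characteristic polynomial.
    tr, det = cxx + cyy, cxx * cyy - cxy * cxy
    disc = math.sqrt(tr * tr / 4 - det)
    return tr / 2 + disc, tr / 2 - disc  # largest, smallest variance

points = [(0, 0), (1, 1), (2, 2), (3, 3.1)]
lam1, lam2 = pca_2d(points)
print(lam1 > 10 * lam2)  # nearly collinear data: one dominant component
```

The dominant eigenvalue dwarfing the other is exactly the situation in which PCA compresses a correlated data set onto a few components.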
Objective: To explore new graphical methods for reducing and analyzing large data sets in which the data are coded with a hierarchical terminology. Methods: We use a hierarchical terminology to organize a data set and display it in a graph. We reduce the size and complexity of the data set by considering the terminological structure and the data set itself (using a variety of thresholds) as well as contributions of child level nodes to parent level nodes. Results: We found that our methods can reduce large data sets to manageable size and highlight the differences among graphs. The thresholds used as filters to reduce the data set can be used alone or in combination. We applied our methods to two data sets containing information about how nurses and physicians query online knowledge resources. The reduced graphs make the differences between the two groups readily apparent. Conclusions: This is a new approach to reduce size and complexity of large data sets and to simplify visualization. This approach can be applied to any data sets that are coded with hierarchical terminologies.
Preschool teachers' job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers' perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job demands, and confidence in behavior management. Using a clustered randomized controlled trial (RCT) design, the CSRP provided multifaceted services to the treatment group, including teacher training and mental health consultation, which were accompanied by stress-reduction services and workshops. Overall, 90 teachers in 35 classrooms at 18 Head Start sites participated in the study. After adjusting for teacher and classroom factors and site fixed effects, we found that the CSRP had significant effects on the improvement of teachers' perceived job control and work-related resources. We also found that the CSRP decreased teachers' confidence in behavior management and had no statistically significant effects on job demands. Overall, we did not find significant moderation effects of teacher race/ethnicity, education, teaching experience, or teacher type. The implications for research and policy are discussed. PMID:21927538
Zhai, Fuhua; Raver, C Cybele; Li-Grining, Christine
This paper describes an analysis of simulated multistatic active sonar data from a large sonobuoy field having large measurement errors, a low probability of detection and a high false alarm rate. The data comprises the first of five scenarios collectively known as scenario one of the Metron Blind data set. Single- and multi-hypothesis Kalman filter trackers are used to process
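A single-state textbook Kalman filter conveys the predict/update cycle used by such trackers (the paper's multistatic sonar trackers use far richer motion and measurement models; everything below is generic):

```python
# A minimal one-dimensional Kalman filter sketch of the kind underlying the
# trackers described above. Generic textbook form; constants are illustrative.

def kalman_step(x, p, z, q, r):
    """One predict+update cycle for a constant-state model.
    x, p: state estimate and its variance; z: measurement;
    q: process noise variance; r: measurement noise variance."""
    # Predict: the state is modeled as constant, so only uncertainty grows.
    p = p + q
    # Update with measurement z.
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p

x, p = 0.0, 1.0  # poor initial guess, high uncertainty
for z in [1.2, 0.9, 1.1, 1.0]:
    x, p = kalman_step(x, p, z, q=0.01, r=0.25)
print(round(x, 2))
```

With noisy measurements scattered around 1.0, the estimate converges toward 1.0 while its variance shrinks, which is the behavior multi-hypothesis variants exploit per track.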
In this work we present the first experimental results of land subsidence mapping for large areas using Coherent Point Target SAR interferometry with a reduced set of images in the North China Plain (NCP). Classical Permanent Scatterer InSAR (PSI) is limited for surface deformation monitoring over short temporal spans by its dependency on large volumes of data
Daqing Ge; Yan Wang; Ling Zhang; Ye Xia; Xiaofang Guo
Research Findings: Quality of care for preschool children in inclusive and noninclusive classrooms was examined in two studies. In Study 1, comparisons across a large sample of classrooms (N = 1, 313) showed that inclusive classrooms were higher than noninclusive classrooms in global quality as well as on two dimensions of quality…
Hestenes, Linda L.; Cassidy, Deborah J.; Shim, Jonghee; Hegde, Archana V.
Suggests ways for making classrooms fun, including giving children opportunities to work together, moving things around, letting children learn to set their own controls, setting up the classroom so students can experiment and discover together, and providing bonuses or incentives for the child who is falling behind. (CR)
This study presents findings from an evaluation of the Developmental Designs classroom management approach and professional development model during its first year of implementation across 22 middle schools in a large, Midwestern school district. The impact of this professional development model on teaching and learning as related to participants'…
Using data from a nationwide, large-scale experimental study of the effects of a connected classroom technology on student learning in algebra (Owens et al., 2004), this dissertation focuses on challenges that can arise in estimating treatment effects in educational field experiments when samples are highly heterogeneous in terms of various…
Several fast algorithms for clustering very large data sets have been proposed in the literature, including CLARA, CLARANS, GAC-R3, and GAC-RARw. CLARA is a combination of a sampling procedure and the classical PAM algorithm, while CLARANS adopts a serial randomized search strategy to find the optimal set of medoids. GAC-R3 and GAC-RARw exploit genetic search heuristics for solving clustering problems.
A method is given for the calculation of gas-phase bond dissociation energies (BDEs) for relatively large molecules. The method combines the use of locally dense basis sets (LDBS) with density functional theory, using the B3LYP functional. For water and propene, primary and secondary regions are defined and the BDE is tested for consistency with respect to full (primary basis) calculations. Results obtained with LDBS closely approach the limit of using large basis sets throughout. An application of biochemical interest is the determination of the O-H BDE of α-tocopherol, which contains 81 atoms.
We present here an efficient algorithm to compute the Principal Component Analysis (PCA) of a large image set consisting of images and, for each image, the set of its uniform rotations in the plane. We do this by pointing out the block circulant structure of the covariance matrix and utilizing that structure to compute its eigenvectors. We also demonstrate the advantages of this algorithm over similar ones with numerical experiments. Although it is useful in many settings, we illustrate the specific application of the algorithm to the problem of cryo-electron microscopy.
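The structural fact the algorithm exploits, that a circulant matrix is diagonalized by the discrete Fourier transform, can be checked directly (a small numerical demonstration, not the paper's blockwise image-covariance computation):

```python
# Demonstration: the eigenvalues of a circulant matrix are the DFT of its
# first row, and the Fourier vectors are its eigenvectors. The paper applies
# this blockwise to the covariance of an image set closed under rotations.
import cmath

def dft(row):
    n = len(row)
    return [sum(row[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                for j in range(n)) for k in range(n)]

def circulant_times(row, v):
    """Multiply the circulant matrix with first row `row` by vector v."""
    n = len(row)
    return [sum(row[(j - i) % n] * v[j] for j in range(n)) for i in range(n)]

row = [4.0, 1.0, 0.0, 1.0]        # first row of a small circulant matrix
eigvals = dft(row)
n = len(row)
k = 1
# The k-th Fourier vector should be an eigenvector with eigenvalue eigvals[k].
v = [cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n)]
Av = circulant_times(row, v)
print(abs(Av[0] - eigvals[k] * v[0]) < 1e-9)
```

Because the eigenvectors are known in advance, only the eigenvalues (one FFT per block) need computing, which is the source of the algorithm's speedup.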
This paper reports findings about a curriculum innovation conducted at The University of Hong Kong. A CD-ROM consisting of videos of two lessons by different teachers demonstrating exemplary science teaching was used to elicit conceptions of good science teaching of student-teachers enrolled for the 1-year Postgraduate Diploma in Education at several stages during the programme. It was found that the videos elicited student-teachers’ conceptions and had impact on those conceptions prior to the commencement of formal instruction. It has extended student-teachers’ awareness of alternative teaching methods and approaches not experienced in their own schooling, broadened their awareness of different classroom situations, provided proof of existence of good practices, and prompted them to reflect on their current preconceptions of good science teaching. In several ways, the videos acted as a catalyst in socializing the transition of student-teachers from the role of student to the role of teacher.
Wong, Siu Ling; Yung, Benny Hin Wai; Cheng, Man Wai; Lam, Kwok Leung; Hodson, Derek
Comprehensive characterization of a proteome is a fundamental goal in proteomics. To achieve saturation coverage of a proteome or specific subproteome via tandem mass spectrometric identification of tryptic protein sample digests, proteomics data sets are growing dramatically in size and heterogeneity. The trend toward very large integrated data sets poses so far unsolved challenges to control the uncertainty of protein identifications going beyond well established confidence measures for peptide-spectrum matches. We present MAYU, a novel strategy that reliably estimates false discovery rates for protein identifications in large scale data sets. We validated and applied MAYU using various large proteomics data sets. The data show that the size of the data set has an important and previously underestimated impact on the reliability of protein identifications. We particularly found that protein false discovery rates are significantly elevated compared with those of peptide-spectrum matches. The function provided by MAYU is critical to control the quality of proteome data repositories and thereby to enhance any study relying on these data sources. The MAYU software is available as standalone software and also integrated into the Trans-Proteomic Pipeline.
Reiter, Lukas; Claassen, Manfred; Schrimpf, Sabine P.; Jovanovic, Marko; Schmidt, Alexander; Buhmann, Joachim M.; Hengartner, Michael O.; Aebersold, Ruedi
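The target-decoy intuition behind such false discovery rate estimates can be sketched as follows (the generic score-threshold form; MAYU's protein-level model is considerably more elaborate):

```python
# A hedged sketch of the generic target-decoy idea: the number of decoy hits
# above a score threshold estimates the number of false target hits, giving
# FDR ~ decoys / targets. Scores below are illustrative.

def fdr_at_threshold(target_scores, decoy_scores, threshold):
    t = sum(1 for s in target_scores if s >= threshold)
    d = sum(1 for s in decoy_scores if s >= threshold)
    return d / t if t else 0.0

targets = [9.1, 8.7, 8.2, 7.5, 6.9, 6.1, 5.0, 4.2]
decoys = [6.5, 5.2, 4.8, 4.1, 3.3, 2.9, 2.0, 1.5]
print(fdr_at_threshold(targets, decoys, 6.0))
```

MAYU's key observation is that this peptide-level estimate does not transfer directly to proteins: as data sets grow, protein-level FDR rises faster, which is why a dedicated protein-level model is needed.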
In this article, the authors review International Large-Scale Assessment (ILSA)-based research over the last several decades, with specific attention on cross-national analysis of mean differences between and variation within countries in mathematics education. They discuss the role of sampling design and "opportunity to learn" (OTL)…
In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…
Indigenous languages are powerful symbols of self-determination and sovereignty for tribal communities in the United States, and many community-based programs have been developed to support and maintain them. The successes of these programs, however, have been difficult to replicate at large research institutions. This article examines the issues…
Kernel principal component analysis (KPCA) is a popular nonlinear feature extraction method. It generally uses an eigen-decomposition technique to extract the principal components. But the method is infeasible for large-scale data sets because of storage and computational problems. To overcome these disadvantages, an efficient iterative method of computing kernel principal components is proposed. First, the Gram matrix is transformed into
Analysis and visualization of extremely large and complex data sets may be one of the most significant challenges facing earth and space science investigators in the forthcoming decades. While advances in hardware speed and storage technology have roughly kept up with (indeed, have driven) increases in database size, the same is not true of our abilities to manage the complexity of
Recently, several nonlinear shape normalization methods have been proposed in order to compensate for shape distortions in large-set handwritten characters. In this paper, these methods are reviewed from two points of view: feature projection and feature density equalization. The former makes a feature projection histogram by projecting a certain feature at each point onto the horizontal or vertical axis, and the latter
The promotion of reflective practice, while widely advocated in higher education settings, nonetheless presents numerous challenges. This is an under-researched aspect of the discourse on reflective practice. A key challenge for those working in the field of teacher education within higher education is to promote a culture of reflection in large…
Symbolic data analysis (Bock and Diday (2000)) is concerned with the extension of classical data analysis and statistical methods to more complex data called symbolic data. In this paper we're interested in a new strategy (Baune (2006)) for clustering large symbolic quantitative data sets, based on two steps. The goal of the first step of the procedure is to reduce
Although some bright students in primary school are able to organise numerical data into classes, most attend to the characteristics of individuals rather than the group, and "see the trees rather than the forest". How can teachers in upper primary and early high school teach students to organise large sets of data with widely varying values into…
This paper analyses the use of factor analysis for instrumental variable estimation when the number of instruments tends to infinity. We consider cases where the unobserved factors are the optimal instruments but also cases where the factors are not necessarily the optimal instruments but can provide a summary of a large set of instruments. Further, the situation where
Current advances in computer hardware, information technology and data collection techniques have produced very large data sets in a wide variety of scientific and engineering disciplines. We must harness this opportunity to visualize and extract useful information from geophysical and geological data. We have taken the task of data mining by implementing a map-like approach over a web server for
Zachary A. Garbow; Nicholas R. Olson; David A. Yuen; John M. Boggs
We are validating the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and are using the analysis of these data as one prototype for studying large geophysical data sets in general. The H...
Summary: The distributed brain systems associated with performance of a verbal fluency task were identified in a nondirected correlational analysis of neurophysiological data obtained with positron tomography. This analysis used a recursive principal-component analysis developed specifically for large data sets. This analysis is interpreted in terms of functional connectivity, defined as the temporal correlation of a neurophysiological index measured in
K. J. Friston; C. D. Frith; P. F. Liddle; R. S. J. Frackowiak
This report presents findings from two focus groups involving nine Crisis Intervention Team (CIT)-trained officers stationed at a large, international airport. The objective was to uncover themes that could inform crisis intervention approaches in special settings. The focus groups described officers’ motivations for participating in CIT and perceived benefits of CIT training. Additionally, the groups discussed special issues pertaining to
Joanne A. McGriff; Beth Broussard; Berivan N. Demir Neubert; Nancy J. Thompson; Michael T. Compton
High-throughput sequencing techniques are becoming attractive to molecular biologists and ecologists as they provide a time- and cost-effective way to explore diversity patterns in environmental samples at an unprecedented resolution. An issue common to many studies is the definition of what fractions of a data set should be considered as rare or dominant. Yet this question has neither been satisfactorily addressed, nor has the impact of such a definition on data set structure and interpretation been fully evaluated. Here we propose a strategy, MultiCoLA (Multivariate Cutoff Level Analysis), to systematically assess the impact of various abundance or rarity cutoff levels on the resulting data set structure and on the consistency of the further ecological interpretation. We applied MultiCoLA to a 454 massively parallel tag sequencing data set of V6 ribosomal sequences from marine microbes in temperate coastal sands. Consistent ecological patterns were maintained after removing up to 35–40% rare sequences and similar patterns of beta diversity were observed after denoising the data set by using a preclustering algorithm of 454 flowgrams. This example validates the importance of exploring the impact of the definition of rarity in large community data sets. Future applications can be foreseen for data sets from different types of habitats, e.g. other marine environments, soil and human microbiota.
Gobet, Angelique; Quince, Christopher; Ramette, Alban
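The cutoff-level idea can be sketched in miniature: recompute sample dissimilarities after discarding progressively more of the rare taxa, then check how well the original structure is preserved. The counts below are hypothetical, and Bray-Curtis plus Pearson correlation stand in for MultiCoLA's full multivariate workflow:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical community table: 10 samples x 50 taxa with a long tail of rare taxa
counts = rng.poisson(lam=np.geomspace(100, 0.5, 50), size=(10, 50))

def braycurtis(M):
    """Pairwise Bray-Curtis dissimilarities between the rows of M."""
    s = len(M)
    D = np.zeros((s, s))
    for i in range(s):
        for j in range(i + 1, s):
            den = (M[i] + M[j]).sum()
            D[i, j] = D[j, i] = np.abs(M[i] - M[j]).sum() / den if den else 0.0
    return D

D_full = braycurtis(counts)
iu = np.triu_indices(len(counts), k=1)

# How well does each truncated data set preserve the full-data structure?
consistency = {}
for cutoff in (0, 1, 5, 20):
    kept = counts[:, counts.sum(axis=0) > cutoff]   # drop taxa at or below the cutoff
    consistency[cutoff] = np.corrcoef(D_full[iu], braycurtis(kept)[iu])[0, 1]
```

A correlation near 1 at a given cutoff suggests, as in the paper, that the rare fraction below that threshold can be removed without distorting the ecological pattern.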
A test was devised for exploring the question of whether it will be possible to identify genes in large-scale genome studies solely by sequence comparison with current sequence collections. To this end, a facsimile data set was constructed by dividing GenBank Release 56 randomly into two halves, one to serve as a reference set and the other intended to simulate raw data anticipated from large genome sequence projects. All supplementary information and identifying marks were removed from the test set after assignment of random identification numbers to each entry and their encryption. Because noncoding intervening sequences (introns) are underrepresented in GenBank, a program that introduced (simulated) introns into mRNA and prokaryotic sequences was devised. In a further attempt to make the problem of identification more realistic, random base substitutions and single-base deletions were also incorporated. The randomly ordered entries were concatenated, along with random intergenic flanking sequences, into a single long "chromosome" 33 Mb in length and then cut into "cosmids" 50-100 kb long. The chopping process was conducted in such a way that terminal overlaps would allow the order of the entries in the chromosome to be reconstituted. Finally, the sequences of a substantial fraction of the cosmids were converted to their complements. Preliminary searching of 10 test cosmids revealed that more than two-thirds of the entries in the test set should be readily identifiable by type of gene product solely on the basis of comparison with the reference set. These preliminary results suggest that existing computer regimens and sequence collections would be able to identify the majority of eukaryotic genes in any new raw data set, the existence of introns notwithstanding. Moreover, the analysis can be conducted in pace with the data collection so that the search results and summary identifications will be instantly available to the research community at large. PMID:2081603
Seely, O; Feng, D F; Smith, D W; Sulzbach, D; Doolittle, R F
The active learning exercise described here has been used to replace some lecture hours in the renal portion of an integrated, organ system-based curriculum for first-year medical students. The exercise takes place in a large auditorium with approximately 150 students. Two faculty members, Dr. Dietz (a physiologist) and Dr. Panzarino (a nephrologist), lead the discussions, which are based on two clinical cases developed from actual patient data. The cases have already been published in the APS Teaching Archive (Objects 197 & 192). The students are pre-assigned to groups of 5 or 6 and designated to sit in clusters to facilitate their individual group discussions. Each faculty member wears a lapel microphone and carries a handheld microphone to pass between the student groups.
PhD John R. Dietz (University of South Florida Department of Physiology and Biophysics)
We are validating the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and are using the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The data handling needed for this work is possible only with the help of a suite of interactive graphical and numerical analysis tools. Level 3 (gridded) data is the common form in which large data sets of this type are distributed for scientific analysis. We find that Level 3 data is inadequate for the data comparisons required for validation. Level 2 data (individual measurements in geophysical units) is needed. A sampling problem arises when individual measurements, which are not uniformly distributed in space or time, are used for the comparisons. Standard 'interpolation' methods involve fitting the measurements for each data set to surfaces, which are then compared. We are experimenting with formal criteria for selecting geographical regions, based upon the spatial frequency and variability of measurements, that allow us to quantify the uncertainty due to sampling. As part of this project, we are also dealing with ways to keep track of constraints placed on the output by assumptions made in the computer code. 
The need to work with Level 2 data introduces a number of other data handling issues, such as accessing data files across machine types, meeting large data storage requirements, accessing other validated data sets, processing speed and throughput for interactive graphical work, and problems relating to graphical interfaces.
Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie
Amphibian embryos from the genus Xenopus are among the best species for understanding early vertebrate development and for studying basic cell biological processes. Xenopus, and in particular the diploid Xenopus tropicalis, is also ideal for functional genomics. Understanding the behavior of genes in this accessible model system will have a significant and beneficial impact on the understanding of similar genes in other vertebrate systems. Here we describe the analysis of 219,270 X. tropicalis expressed sequence tags (ESTs) from four early developmental stages. From these, we have deduced a set of unique expressed sequences comprising approximately 20,000 clusters and 16,000 singletons. Furthermore, we developed a computational method to identify clones that contain the complete coding sequence and describe the creation for the first time of a set of approximately 7000 such clones, the full-length (FL) clone set. The entire EST set is cloned in a eukaryotic expression vector and is flanked by bacteriophage promoters for in vitro transcription, allowing functional experiments to be carried out without further subcloning. We have created a publicly available database containing the FL clone set and related clustering data (http://www.gurdon.cam.ac.uk/informatics/Xenopus.html) and we make the FL clone set publicly available as a resource to accelerate the process of gene discovery and function in this model organism. The creation of the unique set of expressed sequences and the FL clone set pave the way toward a large-scale systematic analysis of gene sequence, gene expression, and gene function in this vertebrate species. PMID:15223350
Gilchrist, Michael J; Zorn, Aaron M; Voigt, Jana; Smith, James C; Papalopulu, Nancy; Amaya, Enrique
NATS 101 A Geological Perspective is a general education course taken by non-science majors. We offer 600 seats per semester, with four large lecture sections taught by different faculty members. In the past we have offered optional once-a-week study groups taught by graduate teaching assistants. Students often feel overwhelmed by the science and associated jargon, and many are prone to skipping lectures altogether. Optional study groups are only attended by ~50% of the students. Faculty members find the class to be a lot of work, mainly due to the grading it generates. Activities given in lecture are often short multiple-choice or true/false assignments, limiting the depth of understanding we can evaluate. Our students often lack math and critical thinking skills, and we spend a lot of time in lecture reintroducing ideas students should have already gotten from the text. In summer 2007 we were funded to redesign the course. Our goals were to 1) cut the cost of running the course, and 2) improve student learning. Under our redesign, optional study groups were replaced by once-a-week mandatory break-out sessions where students complete activities that have been introduced in lecture. Break-out sessions substitute for one hour of lecture, and are run by undergraduate preceptors and graduate teaching assistants (GTAs). During the lecture period, lectures themselves are brief, with a large portion of the class devoted to active learning in small groups. Weekly reading quizzes are submitted via the online course management system. Break-out sessions allow students to spend more time interacting with their fellow students, undergraduate preceptors, and GTAs. They get one-on-one help in break-out sessions on assignments designed to enhance the lecture material. The active lecture format means less of their time is devoted to listening passively to a lecture, and more time is spent peer learning and interacting with the instructor.
Completing quizzes online allows students more freedom in when and where they complete their work, and we provide instant feedback on their submitted work. The University of Wyoming Cognition in Astronomy, Physics and Earth sciences Research (CAPER) Team, which specializes in project evaluation, is leading the evaluation effort. We are comparing pre-test to post-test gains on the Geoscience Concept Inventory and Attitudes Toward Science surveys before and after the redesign, and conducting inductive analysis of student interviews and reflective writing that describe student perceptions of the modified learning environment. The redesign has cut the cost of the class per student by more than half. This was achieved primarily in two ways: 1) by greatly reducing the number of hours spent by faculty and graduate teaching assistants on preparation, class time, and grading; and 2) reducing the number of graduate teaching assistants required for the class and replacing many of them with undergraduate preceptors. Undergraduate preceptors are not paid, but receive academic credit for their teaching service. The savings from the redesign is used to allow faculty more time to work on institutional priorities.
The SET-2400W is a newly designed whole-body PET scanner with a large axial field of view (20 cm). Its physical performance was investigated and evaluated. The scanner consists of four rings of 112 BGO detector units (22.8 mm in-plane × 50 mm axial × 30 mm depth). Each detector unit has a 6 (in-plane) × 8 (axial) matrix of BGO
There are many uncertainties in handwritten character recognition. Stochastic modeling is a flexible and general method for modeling such problems and entails the use of probabilistic models to deal with uncertain or incomplete information. This paper presents an efficient scheme for off-line recognition of large-set handwritten characters in the framework of stochastic models, the first-order hidden Markov models (HMMs). To
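The first-order HMM framework named here rests on standard recursions; as one illustration, the forward algorithm scores an observation sequence under the model. The parameters below are made up for the sketch and do not reproduce the paper's character models:

```python
# Toy first-order HMM: 2 hidden states, 3 observation symbols (made-up parameters)
start = [0.6, 0.4]
trans = [[0.7, 0.3],
         [0.4, 0.6]]
emit  = [[0.5, 0.4, 0.1],
         [0.1, 0.3, 0.6]]

def forward(obs):
    """Forward algorithm: total probability of an observation sequence."""
    alpha = [start[s] * emit[s][obs[0]] for s in range(2)]
    for o in obs[1:]:
        alpha = [emit[s][o] * sum(alpha[t] * trans[t][s] for t in range(2))
                 for s in range(2)]
    return sum(alpha)

p = forward([0, 1, 2])  # likelihood of observing symbols 0, 1, 2 in order
```

In a recognizer of this general kind, one HMM per character class would score the observation sequence extracted from an image, and the class whose model assigns the highest likelihood wins.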
The k-means algorithm is well known for its efficiency in clustering large data sets. However, working only on numeric values prohibits it from being used to cluster real-world data containing categorical values. In this paper we present two algorithms which extend the k-means algorithm to categorical domains and domains with mixed numeric and categorical values. The k-modes algorithm uses
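The truncated sentence above introduces the k-modes algorithm; a minimal sketch of its two ingredients, simple-matching dissimilarity and per-attribute modes in place of Euclidean distance and means, might look like this (toy data, not the paper's implementation):

```python
from collections import Counter
import random

random.seed(0)

def matching_dissim(a, b):
    """Simple matching distance: number of attributes on which the records differ."""
    return sum(x != y for x, y in zip(a, b))

def kmodes(records, k, iters=10):
    """A minimal k-modes clustering sketch for categorical records."""
    modes = random.sample(records, k)
    for _ in range(iters):
        # Assign each record to its nearest mode
        clusters = [[] for _ in range(k)]
        for r in records:
            clusters[min(range(k), key=lambda c: matching_dissim(r, modes[c]))].append(r)
        # Update each mode to the most frequent category per attribute
        new_modes = []
        for c, cl in enumerate(clusters):
            if not cl:                      # keep the old mode for an empty cluster
                new_modes.append(modes[c])
                continue
            new_modes.append(tuple(Counter(col).most_common(1)[0][0]
                                   for col in zip(*cl)))
        if new_modes == modes:
            break
        modes = new_modes
    return modes, clusters

data = [("red", "small"), ("red", "small"), ("red", "medium"),
        ("blue", "large"), ("blue", "large"), ("blue", "medium")]
modes, clusters = kmodes(data, k=2)
```

The structure mirrors Lloyd-style k-means exactly; only the dissimilarity and the centroid update change, which is why the extension scales to large data sets just as k-means does.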
Research concerning object classification schemes is reviewed, focusing on large data sets. Classification techniques are discussed, including syntactic and decision-theoretic methods, fuzzy techniques, and stochastic and fuzzy grammars. Consideration is given to the automation of MK classification (Morgan and Keenan, 1973) and other problems associated with the classification of spectra. In addition, the classification of galaxies is examined, including the problems of systematic errors, blended objects, galaxy types, and galaxy clusters.
We propose a neuron-synapse integrated circuit (IC) chip-set for large-scale chaotic neural networks. We use switched-capacitor (SC) circuit techniques to implement a three-internal-state transiently-chaotic neural network model. The SC chaotic neuron chip faithfully reproduces complex chaotic dynamics in real numbers through continuous state variables of the analog circuitry. We can digitally control most of the model parameters by means of
We describe the design and implementation of an interactive application that is capable of visualizing large remote data sets over the Internet. The application is accessible using a standard Internet browser, allowing users to run visualizations without needing to download any data or install any additional software. The application has two parts: a graphical user interface client that runs inside the user's browser, and a server code that performs all CPU and data intensive tasks, including rendering.
Federl, P.; Grimstrup, A.; Kiddle, C.; Taylor, A. R.; Robinson, K.; Stephure, M.; Yee, G.
In this project we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing large data sets to a destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes. We describe the device information required to achieve high levels of I/O performance and discuss how this data is applicable in use cases beyond data movement performance.
Hodson, Stephen W [ORNL; Poole, Stephen W [ORNL; Ruwart, Thomas [ORNL; Settlemyer, Bradley W [ORNL
"Classroom Assessment in Action" clarifies the multi-faceted roles of measurement and assessment and their applications in a classroom setting. Comprehensive in scope, Shermis and Di Vesta explain basic measurement concepts and show students how to interpret the results of standardized tests. From these basic concepts, the authors then provide…
In this paper, we present a large database of over 50,000 user-labeled videos collected from YouTube. We develop a compact representation called "tiny videos" that achieves high video compression rates while retaining the overall visual appearance of the video as it varies over time. We show that frame sampling using affinity propagation, an exemplar-based clustering algorithm, achieves the best trade-off between compression and video recall. We use this large collection of user-labeled videos in conjunction with simple data mining techniques to perform related video retrieval, as well as classification of images and video frames. The classification results achieved by tiny videos are compared with the tiny images framework for a variety of recognition tasks. The tiny images data set consists of 80 million images collected from the Internet. These are the largest labeled research data sets of videos and images available to date. We show that tiny videos are better suited for classifying scenery and sports activities, while tiny images perform better at recognizing objects. Furthermore, we demonstrate that combining the tiny images and tiny videos data sets improves classification precision in a wider range of categories. PMID:21252400
A quantum chemical method based on a Hartree-Fock calculation with a small Gaussian AO basis set is presented. Its main area of application is the computation of structures, vibrational frequencies, and noncovalent interaction energies in huge molecular systems. The method is suggested as a partial replacement of semiempirical approaches or density functional theory (DFT) in particular when self-interaction errors are acute. In order to get accurate results, three physically plausible atom pair-wise correction terms are applied for London dispersion interactions (D3 scheme), basis set superposition error (gCP scheme), and short-ranged basis set incompleteness effects. In total nine global empirical parameters are used. This so-called Hartree-Fock-3c (HF-3c) method is tested for geometries of small organic molecules, interaction energies and geometries of noncovalently bound complexes, for supramolecular systems, and protein structures. In the majority of realistic test cases good results approaching large basis set DFT quality are obtained at a tiny fraction of computational cost. PMID:23670872
Recent and forthcoming increases in the amount and complexity of astronomy data are creating data sets that are not amenable to the methods of analysis with which astronomers are familiar. Traditional methods are often inadequate not merely because the data sets are too large and too complex to fully be analyzed "manually", but because many conventional algorithms and techniques cannot be scaled up enough to work effectively on the new data sets. It is essential to develop new methods for organization, scientific visualization (as opposed to illustrative visualization) and analysis of heterogeneous, multiresolution data across application domains. Scientific utilization of highly complex and massive data sets poses significant challenges, and calls for some mathematical approaches more advanced than are now generally available. In this paper, we both give an overview of several innovative developments that address these challenges, and describe a few specific examples of algorithms we have developed, as well as the ones we are developing in the course of this ongoing work. These approaches will enhance scientific visualization and data analysis capabilities, thus facilitating astronomical research and enabling discoveries. This work was carried out with partial funding from the National Geospatial-Intelligence Agency University Research Initiative (NURI), grant HM1582-08-1-0019.
I will review a few short examples of how NVO and large astronomy data-producing projects can enable inquiry-based science through the use of scientific data in the classroom. This will include a brief report from a successful NASA IDEAS grant, which funded a workshop for geography teachers in Native American schools. Examples of how data mining of large data collections has impacted some non-traditional learning environments will also be presented. Even in such non-astronomy classroom settings, the use of astronomy data offers stimulation for learning and can have amazing results. Support for this work was provided in part by NSF through Cooperative Agreement AST0122449 to the Johns Hopkins University and through the NSF Cooperative Agreement to the LSST Corporation.
High-dimensional problems arising from robot motion planning, biology, data mining, and geographic information systems often require the computation of k nearest neighbor (knn) graphs. The knn graph of a data set is obtained by connecting each point to its k closest points. As the research in the above-mentioned fields progressively addresses problems of unprecedented complexity, the demand for computing knn graphs based on arbitrary distance metrics and large high-dimensional data sets increases, exceeding resources available to a single machine. In this work we efficiently distribute the computation of knn graphs for clusters of processors with message passing. Extensions to our distributed framework include the computation of graphs based on other proximity queries, such as approximate knn or range queries. Our experiments show nearly linear speedup with over one hundred processors and indicate that similar speedup can be obtained with several hundred processors.
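On a single machine the knn graph itself is straightforward to construct by brute force; the sketch below shows the sequential baseline whose distance computations a distributed framework like the one described would partition across processors (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)
pts = rng.normal(size=(100, 5))   # 100 points in 5 dimensions
k = 3

# All-pairs Euclidean distances: the O(n^2) work that a distributed version
# would split up, e.g. by assigning each processor a block of rows
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(D, np.inf)       # a point is never its own neighbor

# knn graph as an adjacency list: row i holds the indices of i's k closest points
knn = np.argsort(D, axis=1)[:, :k]
```

Because each row of the distance matrix can be computed independently, the near-linear speedup reported in the abstract is plausible: each processor owns a subset of query points and only the final adjacency lists need to be gathered.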
"Options in Education" is a radio news program which focuses on issues and developments in education. This transcript contains discussions of the national commitment to desegregated education, racial conflict in the classroom, learning how to set up a publishing business, women in education (mathematics and sex) and education news highlights.…
George Washington Univ., Washington, DC. Inst. for Educational Leadership.
A memory efficient algorithm for the computation of principal component analysis (PCA) of large mass spectrometry imaging data sets is presented. Mass spectrometry imaging (MSI) enables two- and three-dimensional overviews of hundreds of unlabeled molecular species in complex samples such as intact tissue. PCA, in combination with data binning or other reduction algorithms, has been widely used in the unsupervised processing of MSI data and as a dimensionality reduction method prior to clustering and spatial segmentation. Standard implementations of PCA require the data to be stored in random access memory. This imposes an upper limit on the amount of data that can be processed, necessitating a compromise between the number of pixels and the number of peaks to include. With increasing interest in multivariate analysis of large 3D multislice data sets and ongoing improvements in instrumentation, the ability to retain all pixels and many more peaks is increasingly important. We present a new method which has no limitation on the number of pixels and allows an increased number of peaks to be retained. The new technique was validated against the MATLAB (The MathWorks Inc., Natick, Massachusetts) implementation of PCA (princomp) and then used to reduce, without discarding peaks or pixels, multiple serial sections acquired from a single mouse brain which was too large to be analyzed with princomp. Then, k-means clustering was performed on the reduced data set. We further demonstrate with simulated data of 83 slices, comprising 20,535 pixels per slice and equaling 44 GB of data, that the new method can be used in combination with existing tools to process an entire organ. MATLAB code implementing the memory efficient PCA algorithm is provided. PMID:23394348
Race, Alan M; Steven, Rory T; Palmer, Andrew D; Styles, Iain B; Bunch, Josephine
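The memory-saving idea can be sketched generically: accumulate the small peaks-by-peaks scatter matrix chunk by chunk, so the full pixel-by-peak matrix never has to sit in RAM at once. This is the textbook streaming-covariance route on synthetic data, not the authors' released MATLAB code:

```python
import numpy as np

rng = np.random.default_rng(3)
n_pixels, n_peaks, chunk = 10_000, 50, 1_000
scale = np.linspace(3.0, 0.1, n_peaks)     # give the peaks unequal variances

S = np.zeros((n_peaks, n_peaks))           # running sum of outer products
m = np.zeros(n_peaks)                      # running sum for the mean
blocks = []                                # retained here only to verify the result

for _ in range(n_pixels // chunk):
    block = rng.normal(size=(chunk, n_peaks)) * scale   # one chunk of pixels
    blocks.append(block)
    S += block.T @ block
    m += block.sum(axis=0)

mean = m / n_pixels
cov = (S - n_pixels * np.outer(mean, mean)) / (n_pixels - 1)

# Principal components come from the small n_peaks x n_peaks covariance
evals, evecs = np.linalg.eigh(cov)
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

# Reference: covariance computed from the full matrix held in memory at once
cov_ref = np.cov(np.vstack(blocks), rowvar=False)
```

Since only the peaks-by-peaks accumulator grows with the number of peaks, the number of pixels is unlimited in principle, which is the property the abstract emphasizes.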
Envision is a software project at the University of Illinois and Texas A&M, funded by NASA's Applied Information Systems Research Project. It provides researchers in the geophysical sciences convenient ways to manage, browse, and visualize large observed or model data sets. Envision integrates data management, analysis, and visualization of geophysical data in an interactive environment. It employs commonly used standards in data formats, operating systems, networking, and graphics. It also attempts, wherever possible, to integrate with existing scientific visualization and analysis software. Envision has an easy-to-use graphical interface, distributed process components, and an extensible design. It is a public domain package, freely available to the scientific community.
Searight, K. R.; Wojtowicz, D. P.; Walsh, J. E.; Pathi, S.; Bowman, K. P.; Wilhelmson, R. B.
This paper presents the integration of ubiquitous computing systems into classroom settings, in order to provide basic support for classrooms and field activities. We have developed web application components using Java technology and configured a classroom with wireless network access and a web camera for our purposes. In this classroom, the students interact among each other and with the professor
Hiroaki Ogata; Nobuji A. Saito; Rosa G. J. Paredes; Gerardo Ayala San Martin; Yoneo Yano
It is not enough to be great at sharing information in a large classroom setting. To be an effective teacher you must be able to meaningfully engage your students with their peers and with the content. And you must do this regardless of class size or content. The issues of teaching effectively in large classroom settings have presented ongoing…
Analysis and visualization of extremely large and complex data sets may be one of the most significant challenges facing earth and space science investigators in the forthcoming decades. While advances in hardware speed and storage technology have roughly kept up with (indeed, have driven) increases in database size, the same is not true of our abilities to manage the complexity of these data. Current missions, instruments, and simulations produce so much data of such high dimensionality that they outstrip the capabilities of traditional visualization and analysis software. This problem can only be expected to get worse as data volumes increase by orders of magnitude in future missions and in ever-larger supercomputer simulations. For large multivariate data (more than 10^5 samples or records with more than 5 variables per sample) the interactive graphics response of most existing statistical analysis, machine learning, exploratory data analysis, and/or visualization tools such as Torch, MLC++, Matlab, S++/R, and IDL stutters, stalls, or stops working altogether. Fortunately, the graphics processing units (GPUs) built in to all professional desktop and laptop computers currently on the market are capable of transforming, filtering, and rendering hundreds of millions of points per second. We present a prototype open-source cross-platform application which leverages much of the power latent in the GPU to enable smooth interactive exploration and analysis of large high-dimensional data using a variety of classical and recent techniques. The targeted application is the interactive analysis of large, complex, multivariate data sets, with dimensionalities that may surpass 100 and sample sizes that may exceed 10^6-10^8.
Background Predicting protein residue-residue contacts is an important 2D prediction task. It is useful for ab initio structure prediction and understanding protein folding. In spite of steady progress over the past decade, contact prediction remains largely unsolved. Results Here we develop a new contact map predictor (SVMcon) that uses support vector machines to predict medium- and long-range contacts. SVMcon integrates profiles, secondary structure, relative solvent accessibility, contact potentials, and other useful features. On the same test data set, SVMcon's accuracy is 4% higher than the latest version of the CMAPpro contact map predictor. SVMcon recently participated in the seventh edition of the Critical Assessment of Techniques for Protein Structure Prediction (CASP7) experiment and was evaluated along with seven other contact map predictors. SVMcon was ranked as one of the top predictors, yielding the second best coverage and accuracy for contacts with sequence separation >= 12 on 13 de novo domains. Conclusion We describe SVMcon, a new contact map predictor that uses SVMs and a large set of informative features. SVMcon yields good performance on medium- to long-range contact predictions and can be modularly incorporated into a structure prediction pipeline.
In this paper we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing the data to a remote destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes.
Settlemyer, Bradley W [ORNL]; Dobson, Jonathan D [ORNL]; Hodson, Stephen W [ORNL]; Kuehn, Jeffery A [ORNL]; Poole, Stephen W [ORNL]; Ruwart, Thomas [ORNL]
A large-scale similarity search investigation has been carried out on 266 well-defined compound activity classes extracted from the ChEMBL database. The analysis was performed using two widely applied two-dimensional (2D) fingerprints that mark opposite ends of the current performance spectrum of these types of fingerprints, i.e., MACCS structural keys and the extended connectivity fingerprint with bond diameter four (ECFP4). For each fingerprint, three nearest neighbor search strategies were applied. On the basis of these search calculations, a similarity search profile of the ChEMBL database was generated. Overall, the fingerprint search campaign was surprisingly successful. In 203 of 266 test cases (~76%), a compound recovery rate of at least 50% was observed with at least the better performing fingerprint and one search strategy. The similarity search profile also revealed several general trends. For example, fingerprint searching was often characterized by an early enrichment of active compounds in database selection sets. In addition, compound activity classes have been categorized according to different similarity search performance levels, which helps to put the results of benchmark calculations into perspective. Therefore, a compendium of activity classes falling into different search performance categories is provided. On the basis of our large-scale investigation, the performance range of state-of-the-art 2D fingerprinting has been delineated for compound data sets directed against a wide spectrum of pharmaceutical targets. PMID:21728295
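The nearest-neighbor search strategies above rest on the Tanimoto coefficient between binary fingerprints. A minimal sketch of the 1-NN variant, with fingerprints represented as sets of on-bit indices (the bit sets below are invented; generating real MACCS or ECFP4 fingerprints requires a cheminformatics toolkit, which is not shown here):

```python
def tanimoto(a, b):
    """Tanimoto coefficient between two fingerprints given as sets of on-bit indices."""
    inter = len(a & b)
    union = len(a) + len(b) - inter
    return inter / union if union else 0.0

def nn_score(query, reference_set):
    """1-NN search strategy: score a query by its single most similar reference compound."""
    return max(tanimoto(query, ref) for ref in reference_set)

# Hypothetical toy fingerprints (real 2D fingerprints are far longer bit vectors)
actives = [{1, 4, 7, 9}, {2, 4, 7, 11}]
query = {1, 4, 7, 10}
score = nn_score(query, actives)   # 3 shared bits / 5 total on-bits = 0.6
```

Database compounds would then be ranked by this score, which is where the early enrichment of actives in selection sets is measured.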
Background From initial seed germination through reproduction, plants continuously reprogram their transcriptional repertoire to facilitate growth and development. This dynamic is mediated by a diverse but inextricably-linked catalog of regulatory proteins called transcription factors (TFs). Statistically quantifying TF binding site (TFBS) abundance in promoters of differentially expressed genes can be used to identify binding site patterns in promoters that are closely related to stress-response. Output from today’s transcriptomic assays necessitates statistically-oriented software to handle large promoter-sequence sets in a computationally tractable fashion. Results We present Marina, an open-source software for identifying over-represented TFBSs from amongst large sets of promoter sequences, using an ensemble of 7 statistical metrics and binding-site profiles. Through software comparison, we show that Marina can identify considerably more over-represented plant TFBSs compared to a popular software alternative. Conclusions Marina was used to identify over-represented TFBSs in a two time-point RNA-Seq study exploring the transcriptomic interplay between soybean (Glycine max) and soybean rust (Phakopsora pachyrhizi). Marina identified numerous abundant TFBSs recognized by transcription factors that are associated with defense-response such as WRKY, HY5 and MYB2. Comparing results from Marina to that of a popular software alternative suggests that regardless of the number of promoter-sequences, Marina is able to identify significantly more over-represented TFBSs.
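Marina's seven metrics are not reproduced here, but a one-sided Fisher exact (hypergeometric) test is one standard over-representation statistic of the kind it ensembles. A sketch with invented counts (8 of 10 selected promoters carrying a site versus 20 of 100 background promoters):

```python
from math import comb

def fisher_right_tail(k, n, K, N):
    """One-sided Fisher exact test: probability of seeing k or more
    TFBS-containing promoters in a sample of n differentially expressed
    genes, when K of the N background promoters contain the site."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

# Hypothetical counts: a strongly over-represented TFBS
p = fisher_right_tail(8, 10, 20, 100)
```

A small p here flags the binding site as over-represented in the differentially expressed set relative to the genome-wide background.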
This paper describes the development work and research findings of an initiative to create a statewide literacy assessment in New York to inform teaching and learning and report on group performance trends. The Early Literacy Profile (ELP) is a classroom-based, standards-referenced performance assessment for students in the primary grades…
The analysis of large data sets in meteorological and air quality studies is often made through the examination of specific case studies, especially when time-consuming computational models are employed. This paper presents the development of a tool to identify specific case studies, termed representative days, that would subsequently be modelled. The success of such tools should be judged on the discrimination between the specified cases and the degree to which they capture and recreate historical characteristics of the original data set. The developed approach utilises a principal component algorithm with varimax rotation (r-PCA) and the subtractive clustering algorithm coupled with a cluster validity criterion. In this paper, the developed tool is applied to a data set from the North Sea, utilizing two years' worth of data from the DNMI operational forecasting model. The results will be subsequently used in photochemical and radiative forcing modelling tools as part of the EC-funded project AEOLOS, with the ultimate goal of estimating the global warming potential of non-radioactive tracer substances such as SF6 and PFCs, which are heavily used in the oil industry.
Sfetsos, A.; Vlachogiannis, D.; Gounaris, N.; Stubos, A. K.
Normalizing all images in a large data set into a common space is a key step in many clinical and research studies, e.g., for brain development, maturation, and aging. Recently, groupwise registration has been developed for simultaneous alignment of all images without selecting a particular image as template, thus potentially avoiding bias in the registration. However, most conventional groupwise registration methods do not explore the data distribution during the image registration. Thus, their performance could be affected by large inter-subject variations in the data set under registration. To solve this potential issue, we propose to use a graph to model the distribution of all image data sitting on the image manifold, with each node representing an image and each edge representing the geodesic pathway between two nodes (or images). Then, the procedure of warping all images to their population center becomes the dynamic shrinking of the graph nodes along their graph edges until all graph nodes become close to each other. Thus, the topology of image distribution on the image manifold is always preserved during the groupwise registration. More importantly, by modeling the distribution of all images via a graph, we can potentially reduce registration error since at each step every image is warped only according to its nearby images with similar structures in the graph. We have evaluated our proposed groupwise registration method on both infant and adult data sets, by also comparing with the conventional group-mean based registration and the ABSORB methods. All experimental results show that our proposed method can achieve better performance in terms of registration accuracy and robustness. PMID:24055505
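The graph-shrinking idea can be illustrated with a toy analogue in which 2D points stand in for images on the manifold (the actual method warps images, which is not attempted here). Each node repeatedly moves a small step toward the mean of its k nearest neighbours, so the whole population contracts toward a common centre while only ever moving along local graph edges:

```python
import numpy as np

def shrink_graph(points, k=2, steps=50, step_size=0.5):
    """Toy analogue of graph-based groupwise registration: each node moves
    toward the mean of its k nearest neighbours, contracting the population
    toward a common centre via only local (similar-neighbour) moves."""
    pts = np.asarray(points, float).copy()
    for _ in range(steps):
        d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
        np.fill_diagonal(d, np.inf)                # never choose self as neighbour
        nbrs = np.argsort(d, axis=1)[:, :k]        # k nearest neighbours per node
        target = pts[nbrs].mean(axis=1)            # local population mean
        pts += step_size * (target - pts)          # small step along graph edges
    return pts

pts = shrink_graph(np.array([[0.0, 0], [1, 0], [0, 1], [1, 1]]))
spread = pts.std()   # all nodes end up close together
```

The point of the construction is that every move is defined only by nearby, similar neighbours, which is what preserves the topology of the distribution during shrinking.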
This article investigates the perceptions of 12 teachers from New South Wales, Australia, regarding the classroom assignment of twins. Analysis of semi-structured interviews with each of the teachers revealed four key findings: 1) teachers' perceptions about the classroom assignment of twins vary according to their previous experience and…
Battelle is under contract with Warner Robins Air Logistics Center to design a common large area display set (CLADS) for use in multiple airborne C4I applications that currently use unique 19 inch CRTs. Engineers at Battelle have determined that by taking advantage of the latest flat panel display technology and the commonality between C4I applications, one display head (21 inch diagonal, 1280 by 1024) can be used in multiple applications. In addition, common modules are being designed by Battelle to reduce the number of installation-specific circuit card assemblies required for a particular application. Initial USAF applications include replacements for the E-3 AWACS color monitor assembly, E-8 Joint STARS graphics display unit, and ABCCC airborne color display. Initial U.S. Navy applications include the E-2C ACIS display. For these applications reliability and maintainability are key objectives. The common design reduces the number of unique subassemblies in the USAF inventory by 56 to 66%. In addition to total module reductions, CLADS module/subassembly re-use across nine potential applications is estimated to be 73%. As more platforms implement CLADS, the percentage of module re-use increases. The new design is also expected to have a MTBF of at least 3350 hours, an order of magnitude better than one of the current systems. In the Joint STARS installation, more than 1400 pounds can be eliminated from the aircraft. In the E-3 installation, the CLADS is estimated to provide a power reduction of approximately 1750 watts per aircraft. This paper discusses the common large area display set design and its use in a variety of C4I applications that require a large area, high resolution, full color display.
Background Large DNA sequence data sets require special bioinformatics tools to search and compare them. Such tools should be easy to use so that the data can be easily accessed by a wide array of researchers. In the past, the use of suffix trees for searching DNA sequences has been limited by a practical need to keep the trees in RAM. Newer algorithms solve this problem by using disk-based approaches. However, none of the fastest suffix tree algorithms have been implemented with a graphical user interface, preventing their incorporation into a feasible laboratory workflow. Results Suffix Tree Searcher (STS) is designed as an easy-to-use tool to index, search, and analyze very large DNA sequence datasets. The program accommodates very large numbers of very large sequences, with aggregate size reaching tens of billions of nucleotides. The program makes use of pre-sorted persistent "building blocks" to reduce the time required to construct new trees. STS is comprised of a graphical user interface written in Java, and four C modules. All components are automatically downloaded when a web link is clicked. The underlying suffix tree data structure permits extremely fast searching for specific nucleotide strings, with wild cards or mismatches allowed. Complete tree traversals for detecting common substrings are also very fast. The graphical user interface allows the user to transition seamlessly between building, traversing, and searching the dataset. Conclusions Thus, STS provides a new resource for the detection of substrings common to multiple DNA sequences or within a single sequence, for truly huge data sets. The re-searching of sequence hits, allowing wild card positions or mismatched nucleotides, together with the ability to rapidly retrieve large numbers of sequence hits from the DNA sequence files, provides the user with an efficient method of evaluating the similarity between nucleotide sequences by multiple alignment or use of Logos. 
The ability to re-use existing suffix tree pieces considerably shortens index generation time. The graphical user interface enables quick mastery of the analysis functions, easy access to the generated data, and seamless workflow integration.
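STS is built on disk-based suffix trees, which are not reproduced here; a suffix array, a closely related index, conveys the same core idea of making substring search a binary search over sorted suffixes. The toy builder below is O(n^2 log n) and in-memory, nothing like the persistent building blocks the record describes, but the search logic is the same:

```python
def build_suffix_array(s):
    """Sorted start positions of all suffixes of s (quadratic toy construction)."""
    return sorted(range(len(s)), key=lambda i: s[i:])

def find_occurrences(s, sa, pattern):
    """All positions where pattern occurs in s, by binary search over sorted suffixes."""
    m = len(pattern)
    lo, hi = 0, len(sa)
    while lo < hi:                                  # leftmost suffix >= pattern
        mid = (lo + hi) // 2
        if s[sa[mid]:sa[mid] + m] < pattern:
            lo = mid + 1
        else:
            hi = mid
    start = lo
    lo, hi = start, len(sa)
    while lo < hi:                                  # leftmost suffix > pattern
        mid = (lo + hi) // 2
        if s[sa[mid]:sa[mid] + m] <= pattern:
            lo = mid + 1
        else:
            hi = mid
    return sorted(sa[start:lo])

sa = build_suffix_array("GATTACA")
positions = find_occurrences("GATTACA", sa, "TA")   # -> [3]
```

Once the index exists, each query costs O(m log n) string comparisons regardless of how large the sequence set is, which is why such indexes pay off for data sets of billions of nucleotides.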
This article describes a problem-based learning (PBL) approach (using tutorless groups) that was introduced as a supplement to standard didactic lectures in University of British Columbia Okanagan undergraduate biochemistry classes consisting of 45–85 students. PBL was chosen as an effective method to assist students in learning biochemical and physiological processes. By monitoring student attendance and using informal and formal surveys, we demonstrated that PBL has a significant positive impact on student motivation to attend and participate in the course work.
Andis Klegeris (University of British Columbia); Heather Hurren (University of British Columbia)
The Swift X-ray Telescope has obtained 0.2-10 keV x-ray data on numerous blazars over timescales ranging from seconds to more than 8 years. Much of these data come from intense target of opportunity observations that can be analyzed in a multiwavelength context and used to model jet parameters, particularly during flare states. Another large component of these data comes from monitoring that was obtained during a variety of flux states. By looking at this broad data set, one can evaluate variability timescales and limit the emission mechanisms and associated parameters. Some of these blazars are known to exhibit variability timescales on the order of minutes in the gamma-ray band and tens of minutes in the x-ray band. We report on our search for short timescale x-ray variability that could limit the size and nature of the emission region/s in blazar jets.
Background Micro- and minisatellites are among the most powerful genetic markers known to date. They have been used as tools for a large number of applications ranging from gene mapping to phylogenetic studies and isolate typing. However, identifying micro- and minisatellite markers on large sequence data sets is often a laborious process. Results FONZIE was designed to successively 1) perform a search for markers via the external software Tandem Repeat Finder, 2) exclude user-defined specific genomic regions, 3) screen for the size and the percent matches of each relevant marker found by Tandem Repeat Finder, 4) evaluate marker specificity (i.e., occurrence of the marker as a single copy in the genome) using BLAST2.0, 5) design minisatellite primer pairs via the external software Primer3, and 6) check the specificity of each final PCR product by BLAST. A final file returns to users all the results required to amplify markers. A biological validation of the approach was performed using the whole genome sequence of the phytopathogenic fungus Leptosphaeria maculans, showing that more than 90% of the minisatellite primer pairs generated by the pipeline amplified a PCR product, 44.8% of which showed agarose-gel resolvable polymorphism between isolates. Segregation analyses confirmed that the polymorphic minisatellites corresponded to single-locus markers. Conclusion FONZIE is a stand-alone and user-friendly application developed to minimize tedious manual operations, reduce errors, and speed up the search for efficient minisatellite and microsatellite markers starting from whole-genome sequence data. This pipeline facilitates the integration of data and provides a set of specific primer sequences for PCR amplification of single-locus markers. FONZIE is freely downloadable at: http://www.versailles-grignon.inra.fr/bioger/equipes/leptosphaeria_maculans/outils_d_analyses/fonzie
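Step 1 of the pipeline delegates repeat detection to Tandem Repeats Finder. A deliberately crude stand-in, useful only to show what a tandem array is, can be written with a back-referencing regular expression (this finds only perfect repeats; the real tool scores imperfect arrays statistically):

```python
import re

def find_tandem_repeats(seq, min_unit=2, max_unit=6, min_copies=3):
    """Crude tandem-repeat scan: report (start, unit, full array) for every
    perfect array whose unit is min_unit..max_unit bases long and occurs at
    least min_copies times consecutively. A toy stand-in for Tandem Repeats Finder."""
    pat = re.compile(r"(([ACGT]{%d,%d}?)\2{%d,})" % (min_unit, max_unit, min_copies - 1))
    return [(m.start(), m.group(2), m.group(1)) for m in pat.finditer(seq)]

hits = find_tandem_repeats("TTGACACACACGTT")   # -> [(3, 'AC', 'ACACACAC')]
```

Steps 2 to 6 would then filter these hits by location, size, genome-wide uniqueness (BLAST), and primer design feasibility (Primer3).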
Background The recent advent of high-throughput SNP genotyping technologies has opened new avenues of research for population genetics. In particular, a growing interest in the identification of footprints of selection, based on genome scans for adaptive differentiation, has emerged. Methodology/Principal Findings The purpose of this study is to develop an efficient model-based approach to perform Bayesian exploratory analyses for adaptive differentiation in very large SNP data sets. The basic idea is to start with a very simple model for neutral loci that is easy to implement under a Bayesian framework and to identify selected loci as outliers via Posterior Predictive P-values (PPP-values). Applications of this strategy are considered using two different statistical models. The first is interpreted in the context of populations evolving under pure genetic drift from a common ancestral population, while the second relies on populations under migration-drift equilibrium. Robustness and power of the two resulting Bayesian model-based approaches to detect SNPs under selection are further evaluated through extensive simulations. An application to a cattle data set is also provided. Conclusions/Significance The procedure described turns out to be much faster than former Bayesian approaches and also reasonably efficient, especially at detecting loci under positive selection.
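The PPP-value machinery is generic: draw parameters from the posterior, simulate replicate data under the neutral model, and ask how often the replicate statistic is at least as extreme as the observed one. A minimal sketch under an invented toy model (binomial allele counts with a posterior concentrated near 0.5; the actual drift and migration-drift models are not reproduced):

```python
import random

def ppp_value(y_obs, posterior_draws, simulate, statistic, seed=1):
    """Posterior Predictive P-value: share of posterior draws whose simulated
    replicate data yield a test statistic at least as extreme as observed."""
    rng = random.Random(seed)
    t_obs = statistic(y_obs)
    hits = sum(statistic(simulate(theta, rng)) >= t_obs
               for theta in posterior_draws)
    return hits / len(posterior_draws)

# Toy neutral model: allele count ~ Binomial(n, p); posterior draws are made up.
n = 10
def simulate(p, rng):
    return sum(rng.random() < p for _ in range(n))

draws = [0.5] * 200
p_small = ppp_value(9, draws, simulate, lambda y: y)   # extreme count -> small PPP
```

Loci with very small (or very large) PPP-values are poorly explained by the neutral model and are flagged as selection outliers.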
Tools for estimating population structure from genetic data are now used in a wide variety of applications in population genetics. However, inferring population structure in large modern data sets imposes severe computational challenges. Here, we develop efficient algorithms for approximate inference of the model underlying the STRUCTURE program using a variational Bayesian framework. Variational methods pose the problem of computing relevant posterior distributions as an optimization problem, allowing us to build on recent advances in optimization theory to develop fast inference tools. In addition, we propose useful heuristic scores to identify the number of populations represented in a data set and a new hierarchical prior to detect weak population structure in the data. We test the variational algorithms on simulated data and illustrate using genotype data from the CEPH–Human Genome Diversity Panel. The variational algorithms are almost two orders of magnitude faster than STRUCTURE and achieve accuracies comparable to those of ADMIXTURE. Furthermore, our results show that the heuristic scores for choosing model complexity provide a reasonable range of values for the number of populations represented in the data, with minimal bias toward detecting structure when it is very weak. Our algorithm, fastSTRUCTURE, is freely available online at http://pritchardlab.stanford.edu/structure.html.
Raj, Anil; Stephens, Matthew; Pritchard, Jonathan K.
The goal of this study is to validate the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and to use the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The need to work with Level 2 (point) data, rather than Level 3 (gridded) data for validation purposes is discussed, and some techniques developed for charting the assumptions made in deriving an algorithm and generating a code to produce geophysical quantities from measured radiances are presented.
Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie
MEME-ChIP is a web-based tool for analyzing motifs in large DNA or RNA data sets. It can analyze peak regions identified by ChIP-seq, cross-linking sites identified by CLIP-seq and related assays, as well as sets of genomic regions selected using other criteria. MEME-ChIP performs de novo motif discovery, motif enrichment analysis, motif location analysis and motif clustering, providing a comprehensive picture of the DNA or RNA motifs that are enriched in the input sequences. MEME-ChIP performs two complementary types of de novo motif discovery: weight matrix-based discovery for high accuracy; and word-based discovery for high sensitivity. Motif enrichment analysis using DNA or RNA motifs from human, mouse, worm, fly and other model organisms provides even greater sensitivity. MEME-ChIP's interactive HTML output groups and aligns significant motifs to ease interpretation. This protocol takes less than 3 h, and it provides motif discovery approaches that are distinct and complementary to other online methods. PMID:24853928
This article describes a multiparameter calibration model, which improves the accuracy of density functional theory (DFT) for the prediction of standard enthalpies of formation for a large set of organic compounds. The model applies atom based, bond based, electronic, and radical environmental correction terms to calibrate the calculated enthalpies of formation at B3LYP/6-31G(d,p) level by a least-square method. A diverse data set of 771 closed-shell compounds and radicals is used to train the model. The leave-one-out cross validation squared correlation coefficient q^2 of 0.84 and squared correlation coefficient r^2 of 0.86 for the final model are obtained. The mean absolute error in enthalpies of formation for the data set is reduced from 4.9 kcal/mol before calibration to 2.1 kcal/mol after calibration. Five-fold cross validation is also used to estimate the performance of the calibration model and similar results are obtained. PMID:20740557
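The least-squares calibration amounts to fitting correction coefficients c so that experimental enthalpies are approximated by the DFT values plus a linear combination of feature counts. A sketch with invented feature counts (two features only; the paper's atom, bond, electronic, and radical terms are not reproduced):

```python
import numpy as np

def fit_corrections(features, h_dft, h_exp):
    """Least-squares correction terms c so that h_exp ~= h_dft + X @ c, where
    each row of X counts calibration features of one molecule (atom types,
    bond types, ...)."""
    X = np.asarray(features, float)
    residual = np.asarray(h_exp, float) - np.asarray(h_dft, float)
    c, *_ = np.linalg.lstsq(X, residual, rcond=None)
    return c

# Invented example: rows count (carbons, hydrogens); residuals follow
# 1.5*nC - 0.5*nH exactly, so the fit recovers those coefficients.
X = [[1, 4], [2, 6], [3, 8]]
h_dft = [0.0, 0.0, 0.0]
h_exp = [-0.5, 0.0, 0.5]
c = fit_corrections(X, h_dft, h_exp)   # approximately [1.5, -0.5]
```

New predictions are then `h_dft + X_new @ c`, which is what drives the reported drop in mean absolute error after calibration.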
With the advent of remotely sensed data and coordinated efforts to create global databases, the ecological community has progressively become more data-intensive. However, in contrast to other disciplines, statistical ways of handling these large data sets, especially the gaps which are inherent to them, are lacking. Widely used theoretical approaches, for example model averaging based on Akaike's information criterion (AIC), are sensitive to missing values. Yet, the most common way of handling sparse matrices - the deletion of cases with missing data (complete case analysis) - is known to severely reduce statistical power as well as inducing biased parameter estimates. In order to address these issues, we present novel approaches to gap filling in large ecological data sets using matrix factorization techniques. Factorization based matrix completion was developed in a recommender system context and has since been widely used to impute missing data in fields outside the ecological community. Here, we evaluate the effectiveness of probabilistic matrix factorization techniques for imputing missing data in ecological matrices using two imputation techniques. Hierarchical Probabilistic Matrix Factorization (HPMF) effectively incorporates hierarchical phylogenetic information (phylogenetic group, family, genus, species and individual plant) into the trait imputation. Kernelized Probabilistic Matrix Factorization (KPMF) on the other hand includes environmental information (climate and soils) into the matrix factorization through kernel matrices over rows and columns. We test the accuracy and effectiveness of HPMF and KPMF in filling sparse matrices, using the TRY database of plant functional traits (http://www.try-db.org). TRY is one of the largest global compilations of plant trait databases (750 traits of 1 million plants), encompassing data on morphological, anatomical, biochemical, phenological and physiological features of plants. 
However, despite its unprecedented coverage, the TRY database is still very sparse, severely limiting joint trait analyses. Plant traits are the key to understanding how plants as primary producers adjust to changes in environmental conditions and in turn influence them. Forming the basis for Dynamic Global Vegetation Models (DGVMs), plant traits are also fundamental in global change studies for predicting future ecosystem changes. It is thus imperative that missing data are imputed in as accurate and precise a way as possible. In this study, we show the advantage of applying probabilistic matrix factorization techniques in incorporating hierarchical and environmental information for the prediction of missing plant traits as compared to conventional imputation techniques such as the complete case and mean approaches. We will discuss advantages of the proposed imputation techniques over other widely used methods such as multiple imputation (MI), as well as possible applications to other data sets.
Schrodt, F. I.; Shan, H.; Kattge, J.; Reich, P.; Banerjee, A.; Reichstein, M.
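The core of matrix-factorization imputation, stripped of the hierarchical (HPMF) and kernel (KPMF) side information, is fitting a low-rank model to the observed entries only and reading the missing entries off the fitted product. A plain alternating-least-squares sketch on an invented rank-1 "trait matrix":

```python
import numpy as np

def als_impute(M, rank=1, lam=1e-3, iters=100, seed=0):
    """Fill NaN entries of M with a low-rank factorization M ~= U V^T,
    fitted to the observed entries by alternating least squares. A plain
    stand-in for the probabilistic HPMF/KPMF variants in the record above."""
    rng = np.random.default_rng(seed)
    mask = ~np.isnan(M)
    n, m = M.shape
    U = rng.standard_normal((n, rank))
    V = rng.standard_normal((m, rank))
    reg = lam * np.eye(rank)
    for _ in range(iters):
        for i in range(n):                      # update each species' factors
            Vi = V[mask[i]]
            U[i] = np.linalg.solve(Vi.T @ Vi + reg, Vi.T @ M[i, mask[i]])
        for j in range(m):                      # update each trait's factors
            Uj = U[mask[:, j]]
            V[j] = np.linalg.solve(Uj.T @ Uj + reg, Uj.T @ M[mask[:, j], j])
    return np.where(mask, M, U @ V.T)

# Invented rank-1 trait matrix with one missing value (true value 6.0)
M = np.outer([1.0, 2, 3], [1.0, 2, 3, 4])
M[1, 2] = np.nan
filled = als_impute(M)
```

HPMF and KPMF extend this by placing hierarchical priors on the rows (taxonomy) or kernels over rows and columns (environment), which regularize exactly these factor updates.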
Perhaps the most important skill a good teacher should possess is the ability to control students. A teacher who can devise fascinating and unique lesson plans for her classroom is useless if she can't get the kids to sit down and listen to her instructions. Unfortunately, many beginning teachers simply are not prepared to manage their classrooms effectively. Managing a classroom means you must teach your students behavior expectations, not just post your rules on the classroom wall. Classroom management becomes even more of an issue when it applies to the active nature of the science classroom.
Scattered data interpolation is a problem of interest in numerous areas such as electronic imaging, smooth surface modeling, and computational geometry. Our motivation arises from applications in geology and mining, which often involve large scattered data sets and a demand for high accuracy. The method of choice is ordinary kriging. This is because it is a best unbiased estimator. Unfortunately, this interpolant is computationally very expensive to compute exactly. For n scattered data points, computing the value of a single interpolant involves solving a dense linear system of size roughly n x n. This is infeasible for large n. In practice, kriging is solved approximately by local approaches that are based on considering only a relatively small number of points that lie close to the query point. There are many problems with this local approach, however. The first is that determining the proper neighborhood size is tricky, and is usually solved by ad hoc methods such as selecting a fixed number of nearest neighbors or all the points lying within a fixed radius. Such fixed neighborhood sizes may not work well for all query points, depending on local density of the point distribution. Local methods also suffer from the problem that the resulting interpolant is not continuous. Meyer showed that while kriging produces smooth continuous surfaces, it has zero-order continuity along its borders. Thus, at interface boundaries where the neighborhood changes, the interpolant behaves discontinuously. Therefore, it is important to consider and solve the global system for each interpolant. However, solving such large dense systems for each query point is impractical. Recently a more principled approach to approximating kriging has been proposed based on a technique called covariance tapering. The problems arise from the fact that the covariance functions that are used in kriging have global support. 
Our implementations combine, utilize, and enhance a number of different approaches that have been introduced in literature for solving large linear systems for interpolation of scattered data points. For very large systems, exact methods such as Gaussian elimination are impractical since they require O(n^3) time and O(n^2) storage. As Billings et al. suggested, we use an iterative approach. In particular, we use the SYMMLQ method for solving the large but sparse ordinary kriging systems that result from tapering. The main technical issue that needs to be overcome in our algorithmic solution is that the points' covariance matrix for kriging should be symmetric positive definite. The goal of tapering is to obtain a sparse approximate representation of the covariance matrix while maintaining its positive definiteness. Furrer et al. used tapering to obtain a sparse linear system of the form Ax = b, where A is the tapered symmetric positive definite covariance matrix. Thus, Cholesky factorization could be used to solve their linear systems. They implemented an efficient sparse Cholesky decomposition method. They also showed that if these tapers are used for a limited class of covariance models, the solution of the system converges to the solution of the original system. Matrix A in the ordinary kriging system, while symmetric, is not positive definite. Thus, their approach is not applicable to the ordinary kriging system. Therefore, we use tapering only to obtain a sparse linear system. Then, we use SYMMLQ to solve the ordinary kriging system. We show that solving large kriging systems becomes practical via tapering and iterative methods, and results in lower estimation errors compared to traditional local approaches, and significant memory savings compared to the original global system. We also developed a more efficient variant of the sparse SYMMLQ method for large ordinary kriging systems. 
This approach adaptively finds the correct local neighborhood for each query point in the interpolation process.
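The tapering idea can be sketched concretely: multiply the covariance by a compactly supported, positive-definite taper so that all entries beyond the taper range become exactly zero, then solve the bordered ordinary kriging system. The toy below uses an assumed Gaussian covariance, a Wendland taper, and a dense solve standing in for the sparse SYMMLQ iteration used at scale:

```python
import numpy as np

def wendland(d, theta):
    """Compactly supported taper: smooth, positive definite, exactly zero for d >= theta."""
    t = np.minimum(d / theta, 1.0)
    return (1 - t) ** 4 * (4 * t + 1)

def tapered_ok_predict(coords, z, query, cov_range=2.0, theta=3.0):
    """Ordinary kriging with a tapered Gaussian covariance. Tapering zeroes all
    covariances beyond theta, so the full system is sparse; here a dense solve
    stands in for a sparse symmetric iterative solver such as SYMMLQ."""
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    C = np.exp(-(d / cov_range) ** 2) * wendland(d, theta)   # tapered covariance
    n = len(z)
    A = np.ones((n + 1, n + 1))                              # bordered OK system:
    A[:n, :n] = C                                            # [[C, 1], [1^T, 0]]
    A[n, n] = 0.0                                            # Lagrange row/column
    dq = np.linalg.norm(coords - query, axis=-1)
    b = np.append(np.exp(-(dq / cov_range) ** 2) * wendland(dq, theta), 1.0)
    w = np.linalg.solve(A, b)[:n]                            # kriging weights
    return w @ np.asarray(z, float)

coords = np.array([[0.0, 0], [1, 0], [0, 1], [5, 5]])
z = [1.0, 2.0, 3.0, 10.0]
pred = tapered_ok_predict(coords, z, np.array([0.0, 0]))     # exact at a data point
```

Note the Lagrange border is what makes the system symmetric but indefinite, which is exactly why a Cholesky factorization cannot be used and a solver such as SYMMLQ is needed.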
We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology, in order to classify the detected objects objectively and uniformly, and facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complex database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of the SKICAT system, and of some of the scientific results achieved to date. We also developed a user-friendly package for multivariate statistical analysis of small and moderate-size data sets, called STATPROG. 
The package was tested extensively on a number of real scientific applications and has produced real, published results.
Provides instructions for the construction of a paper mache classroom planetarium and suggests several student activities using this planetarium model. Lists reasons why students have difficulties in transferring classroom instruction in astronomy to the night sky. (DS)
Reviews research on classroom management, focusing on behavior modification, group management, teacher effects, management training, and planning. Five types of management skills and six principles for effective classroom organization identified by researchers are suggested for application by teachers. (PGD)
The Data for the Classroom collection gathers datasets that have accompanying instructional materials or other pertinent information for using the dataset in a classroom setting for grades K-16. The data may be numerical, visual, maps, charts, tables or images. The data may be observational, remotely sensed or model data. The primary component is that there are materials supporting the use and understanding of the data either by educators or directly by students. Additionally, the dataset itself is described.
It is a myth that more pixels alone result in better images. The marketing of camera phones in particular has focused on their pixel numbers. However, their performance varies considerably according to the conditions of image capture. Camera phones are often used in low-light situations where the lack of a flash and limited exposure time will produce underexposed, noisy and blurred images. Camera utilization can be quantitatively described by photospace distributions, a statistical description of the frequency of pictures taken at varying light levels and camera-subject distances. If the photospace distribution is known, the user-experienced distribution of quality can be determined either directly by direct measurement of subjective quality, or by photospace-weighting of objective attributes. The population of a photospace distribution requires examining large numbers of images taken under typical camera phone usage conditions. ImagePhi was developed as a user-friendly software tool to interactively estimate the primary photospace variables, subject illumination and subject distance, from individual images. Additionally, subjective evaluations of image quality and failure modes for low quality images can be entered into ImagePhi. ImagePhi has been applied to sets of images taken by typical users with a selection of popular camera phones varying in resolution. The estimated photospace distribution of camera phone usage has been correlated with the distributions of failure modes. The subjective and objective data show that photospace conditions have a much bigger impact on image quality of a camera phone than the pixel count of its imager. The 'megapixel myth' is thus seen to be less a myth than an ill framed conditional assertion, whose conditions are to a large extent specified by the camera's operational state in photospace.
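A photospace distribution, as described above, is just a normalized two-dimensional histogram of capture conditions. A sketch with invented per-image estimates of illumination and subject distance (the bin edges are assumptions, chosen as rough log-spaced ranges):

```python
import numpy as np

def photospace(lux, distance_m,
               lux_edges=(1, 10, 100, 1000, 10000, 100000),
               dist_edges=(0.3, 1, 3, 10, 100)):
    """Photospace distribution: relative frequency of captures falling in
    each (subject illumination, subject distance) cell."""
    h, _, _ = np.histogram2d(lux, distance_m, bins=[list(lux_edges), list(dist_edges)])
    return h / h.sum()

# Invented capture conditions for four snapshots (per-image estimates of the
# kind ImagePhi produces)
P = photospace([5, 50, 50, 5000], [0.5, 2, 2, 20])
```

Weighting an objective quality attribute by `P` cell-by-cell then yields the user-experienced quality distribution described in the record.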
Alex Copeland on "Assembly of large metagenome data sets using a Convey HC-1 hybrid core computer" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.
A new intelligent software tool for PACS systems called "iScout" has been developed that constructs and displays an overview of large series or studies before downloading the set of images to a PACS workstation. The overview consists of two orthogonal cross-section images that allow the user to select and download a subset of images, avoiding the long delays that can occur while downloading hundreds or even thousands of images. iScout also provides a navigational tool, allowing the user to click on anatomical regions and view the relevant slices, while displaying the anatomical location of the image currently being displayed by the PACS workstation software. The construction of an iScout can be done on either a workstation or a server with only minimal overhead that does not significantly affect the speed of loading. A working iScout tool has been integrated with multi-modality PACS workstation software (McKesson Medical Imaging Solutions), and it was found that the iScout can be generated on the workstation with a maximum added overhead of only 3.4 seconds while downloading a study containing 433 512×512 CT images. The iScout is flexible and can generate scouts for virtually all types of CT and MR images, as well as 3D ultrasound.
Mutual information as the asymptotic Bayesian measure of independence is an excellent starting point for investigating the existence of possible relationships among climate-relevant variables in large data sets. As mutual information is a nonlinear function of its arguments, it is not beholden to the assumption of a linear relationship between the variables in question and can reveal features missed in linear correlation analyses. However, as mutual information is symmetric in its arguments, it can only reveal the probability that two variables are related. It provides no information as to how they are related; specifically, causal interactions or a relation based on a common cause cannot be detected. For this reason we also investigate the utility of a related quantity called the transfer entropy. The transfer entropy can be written as a difference between mutual informations and has the capability to reveal whether and how the variables are causally related. The application of these information theoretic measures is tested on some familiar examples using data from the International Satellite Cloud Climatology Project (ISCCP) to identify relations between global cloud cover and other variables, including equatorial Pacific sea surface temperature (SST), over seasonal and El Niño Southern Oscillation (ENSO) cycles.
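The decomposition the abstract relies on can be written out. Taking X as the driven variable and Y as the candidate driver, the transfer entropy is a difference of mutual informations:

```latex
T_{Y \to X}
  = I\left(X_{t+1};\, X_t, Y_t\right) - I\left(X_{t+1};\, X_t\right)
  = H\left(X_{t+1} \mid X_t\right) - H\left(X_{t+1} \mid X_t, Y_t\right)
```

Unlike mutual information, this quantity is asymmetric in X and Y: in general T_{Y→X} ≠ T_{X→Y}, which is what allows it to indicate the direction of a causal interaction.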
Knuth, Kevin H.; Rossow, William B.; Clancy, Daniel (Technical Monitor)
Progress in technical development has allowed piecing together increasingly long DNA sequences from subfossil remains of both extinct and extant species. At the same time, more and more species are analyzed on the population level, leading to a better understanding of population dynamics over time. Finally, new sequencing techniques have allowed targeting complete nuclear genomes of extinct species. The sequences obtained yield insights into a variety of research fields. First, phylogenetic relationships can be resolved with much greater accuracy and it becomes possible to date divergence events of species during and before the Quaternary. Second, large data sets in population genetics facilitate the assessment of changes in genetic diversity over time, an approach that has substantially revised our views about phylogeographic patterns and population dynamics. In the future, the combination of population genetics with long DNA sequences, e.g. complete mitochondrial (mt) DNA genomes, should allow much more precise estimates of population size changes. This will enable us to make inferences about - and hopefully understand - the causes of faunal turnover and extinctions during the Quaternary. Third, with regard to the nuclear genome, complete genes and genomes can now be sequenced and studied with regard to their function, revealing insights about the numerous traits of extinct species that are not preserved in the fossil record.
This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
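For readers unfamiliar with the similarity measure, the baseline O(nm) dynamic-programming LCS is sketched below, with one common normalization (the paper's hybrid algorithm is faster; this sketch only illustrates the measure itself, and the normalization shown is an assumption, not necessarily the paper's):

```python
# Baseline dynamic-programming LCS, row by row, keeping only two rows.
def lcs_length(a, b):
    prev = [0] * (len(b) + 1)
    for x in a:
        cur = [0] * (len(b) + 1)
        for j, y in enumerate(b, 1):
            cur[j] = prev[j - 1] + 1 if x == y else max(prev[j], cur[j - 1])
        prev = cur
    return prev[-1]

def normalized_lcs(a, b):
    # One common normalization: LCS length over the longer sequence length,
    # giving 1.0 for identical sequences and 0.0 for disjoint ones.
    if not a and not b:
        return 1.0
    return lcs_length(a, b) / max(len(a), len(b))

print(lcs_length("ABCBDAB", "BDCABA"))  # 4 (e.g. "BCBA")
```

Pairwise values of `normalized_lcs` can then feed any standard clustering method, with outlying sequences examined afterwards as candidate anomalies.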
Physical education is still taught in outdoor settings in many warmer climates of the United States. Even when indoor facilities are available, physical education may be moved outside because of other curricular needs or facility issues. How can physical educators make the outdoor setting seem more like an indoor classroom? Outdoor teaching…
Early stage research does a fantastic job providing knowledge and proof-of-feasibility for new product concepts. However, the handful of data points required to validate a concept is typically insufficient to provide insight on the whole range of effects relevant to manufacturing the product. Moving to manufacturing brings larger data sets and variability; opportunistic analysis of these larger sets can yield
Significant improvements in the ability of atmospheric chemistry models to predict the transport and production of atmospheric constituents on regional and global scales have been realized over the past decade. Concurrent with the model improvements, has been an increase in the size and complexity of atmospheric observational data sets. As a result, the challenge to provide efficient and realistic visualization of atmospheric data "products" has increased dramatically. Over the past several years, personnel from the Atmospheric Sciences Data Center (ASDC) at NASA's Langley Research Center have explored the merits of visualizing atmospheric data products using interactive, immersive visualization hardware and software. As part of this activity, the Virtual Global Explorer and Observatory (vGeo) software, developed by VRCO, Inc., has been utilized to support the visual analysis of large multivariate data sets. The vGeo software provides an environment in which the user can create, view, navigate, and interact with data, models, and images in an immersive 3-D environment. The vGeo visualization capability was employed during the March/April 2001, NASA Global Tropospheric Experiment Transport and Chemical Evolution over the Pacific (TRACE-P) mission [(GTE) http://www-gte.larc.nasa.gov] to support day-to-day flight-planning activities through the creation of virtual 3-D worlds containing modeled data and proposed aircraft flight paths. The GTE, a major activity within NASA's Earth Science Enterprise, is primarily an aircraft-based measurement program, supplemented by ground-based measurements and satellite observations, focused on understanding the impact of human activity on the global troposphere. The TRACE-P is the most recent campaign conducted by GTE and was deployed to Hong Kong and then to the Yokota Airbase, Japan. 
TRACE-P is the third in a series of GTE field campaigns in the northwestern Pacific region to understand the chemical composition of air masses emerging from the Asian Continent and their impact on the region. Since completing the field deployment phase of TRACE-P, the 3-D visualization capability has been used as a tool to combine and visually analyze TRACE-P data from multiple sources (e.g. model, airborne and ground based measurements, ozone sondes, and satellite observations). This capability to merge measurements into model data fields in a virtual 3-D world is perhaps the most exciting aspect of this new visualization capability. This allows for a more realistic contextual representation of the model/measurement results. The measured parameters along specific flights (of typical duration of 8 hrs) along with supporting ancillary measurements provide the "real" representation of the atmosphere at that specific point in time and space. The models provide the time evolution, and three-dimensional structure during the measurement period. When these are merged together the context of the observations is documented, and model predictions can be validated and/or improved. Specific TRACE-P case studies will be presented showing results from global and regional models coupled with airborne measurements for which the influence of transport on the spatial distribution of species measured on the aircraft was more clearly discerned within the 3-D environment than from conventional visualization techniques.
Frenzer, J. B.; Hoell, J. M.; Holdzkom, J. J.; Jacob, D.; Fuelberg, H.; Avery, M.; Carmichael, G.; Hopkins, D. L.
In the second year, we continued to build upon and improve the scanline-based direct volume renderer that we developed in the first year of this grant. This extremely general rendering approach can handle regular or irregular grids, including overlapping multiple grids, and polygon mesh surfaces. It runs in parallel on multi-processors. It can also be used in conjunction with a k-d tree hierarchy, where approximate models and error terms are stored in the nodes of the tree, and approximate fast renderings can be created. We have extended our software to handle time-varying data where the data changes but the grid does not. We are now working on extending it to handle more general time-varying data. We have also developed a new extension of our direct volume renderer that uses automatic decimation of the 3D grid, as opposed to an explicit hierarchy. We explored this alternative approach as being more appropriate for very large data sets, where the extra expense of a tree may be unacceptable. We also describe a new approach to direct volume rendering that uses hardware 3D textures and incorporates lighting effects. Volume rendering using hardware 3D textures is extremely fast, and machines capable of using this technique are becoming more moderately priced. While this technique, at present, is limited to use with regular grids, we are pursuing possible algorithms extending the approach to more general grid types. We have also begun to explore a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH '96. In our initial implementation, we automatically image the volume from 32 equidistant positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation. We are studying whether this will give a quantitative measure of the effects of approximation.
We have created new tools for exploring the differences between images produced by various rendering methods. Images created by our software can be stored in the SGI RGB format. Our idtools software reads in a pair of images and compares them using various metrics. The differences between the images under the RGB, HSV, and HSL color models can be calculated and shown. We can also calculate the auto-correlation function and the Fourier transform of the image and image differences. We will explore how these image differences compare in order to find useful metrics for quantifying the success of various visualization approaches. In general, progress was consistent with our research plan for the second year of the grant.
Teachers from fourteen classrooms were randomly assigned to an adaptation of Incredible Years (IY) teacher training or to teacher training-as-usual. Observations were made of the behavior of 136 target preschool boys and girls nominated by teachers as having many or few conduct problems. Peer and teacher behavior were observed at baseline and post…
Classroom environmental factors that promote or inhibit a climate facilitating learning are discussed, and general principles that may serve as a guide to developing effective classroom settings are presented. (MJB)
Background The number of algorithms available to predict ligand-protein interactions is large and ever-increasing. The number of test cases used to validate these methods is usually small and problem dependent. Recently, several databases have been released for further understanding of protein-ligand interactions, having the Protein Data Bank as backend support. Nevertheless, it remains difficult to test docking methods on a large variety of complexes. In this paper we report the development of a new database of protein-ligand complexes tailored for testing of docking algorithms. Methods Using a new definition of molecular contact, small ligands contained in the 2005 PDB edition were identified and processed. The database was enriched with molecular properties. In particular, an automated typing of ligand atoms was performed. A filtering procedure was applied to select a non-redundant dataset of complexes. Data mining was performed to obtain information on the frequencies of different types of atomic contacts. Docking simulations were run with the program DOCK. Results We compiled a large database of small ligand-protein complexes, enriched with different calculated properties, that currently contains more than 6000 non-redundant structures. As an example to demonstrate the value of the new database, we derived a new set of chemical matching rules to be used in the context of the program DOCK, based on contact frequencies between ligand atoms and points representing the protein surface, and proved their enhanced efficiency with respect to the default set of rules included in that program. Conclusion The new database constitutes a valuable resource for the development of knowledge-based docking algorithms and for testing docking programs on large sets of protein-ligand complexes. The new chemical matching rules proposed in this work significantly increase the success rate in DOCKing simulations. The database developed in this work is available at .
Diago, Luis A; Morell, Persy; Aguilera, Longendri; Moreno, Ernesto
Abstract: A new active-set method for smooth box-constrained minimization is introduced. The algorithm combines an unconstrained method, including a new line-search which aims to add many constraints to the working set at a single iteration, with a recently introduced technique (spectral projected gradient) for dropping constraints from the working set. Global convergence is proved. A computer implementation is fully described and a numerical comparison assesses the
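The projection step that underlies such methods is simple to state: take a gradient step, then clip each coordinate back into its box. The sketch below is a plain projected-gradient iteration with a fixed step size, not the paper's active-set/spectral method; it only illustrates the projection onto box constraints:

```python
# Minimal projected-gradient sketch for min f(x) subject to lo <= x_i <= hi.
def projected_gradient(grad, x0, lo, hi, step=0.1, iters=500):
    clip = lambda v: min(max(v, lo), hi)       # projection onto the box
    x = [clip(v) for v in x0]
    for _ in range(iters):
        g = grad(x)
        # Gradient step followed by projection back into [lo, hi].
        x = [clip(xi - step * gi) for xi, gi in zip(x, g)]
    return x

# Minimize f(x) = ||x - c||^2 with c partly outside the box [-1, 1]:
c = [2.0, -3.0, 0.5]
grad = lambda x: [2 * (xi - ci) for xi, ci in zip(x, c)]
x = projected_gradient(grad, [0.0, 0.0, 0.0], lo=-1.0, hi=1.0)
print([round(v, 6) for v in x])  # [1.0, -1.0, 0.5]
```

Coordinates whose unconstrained minimizer lies outside the box end up pinned at a bound; in active-set terms, those are exactly the constraints in the working set at the solution.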
The Classroom Expernomics site was jointly developed by Gregory Delemeester, Associate Professor of Economics, Marietta College, and John Neral, Associate Professor of Economics, Frostburg State University. This site features the Classroom Expernomics newsletter aimed at encouraging the "use of economic experiments as teaching tools for the classroom." The newsletter has been published twice a year since the spring of 1992; all current and previous issues are available at the web site. Each newsletter contains two or three articles by various professors profiling their use of economics experiments in the classroom.
The total atomization energies of a number of molecules have been computed using an augmented coupled-cluster method and (5s4p3d2f1g) and (4s3p2d1f) atomic natural orbital (ANO) basis sets, as well as the correlation consistent valence triple zeta plus polarization (cc-pVTZ) and correlation consistent valence quadruple zeta plus polarization (cc-pVQZ) basis sets. The performance of ANO and correlation consistent basis sets is comparable throughout, although the latter can result in significant CPU time savings. Whereas the inclusion of g functions has significant effects on the computed Sigma D(e) values, chemical accuracy is still not reached for molecules involving multiple bonds. A Gaussian-1 (G1) type correction lowers the error, but not much beyond the accuracy of the G1 model itself. Using separate corrections for sigma bonds, pi bonds, and valence pairs brings down the mean absolute error to less than 1 kcal/mol for the spdf basis sets, and to about 0.5 kcal/mol for the spdfg basis sets. Some conclusions on the success of the Gaussian-1 and Gaussian-2 models are drawn.
T1DBase (http://www.t1dbase.org) is a web platform that supports the type 1 diabetes (T1D) research community. It integrates genetic, genomic and expression data relevant to T1D research across mouse, rat and human and presents this to the user as a set of web pages and tools. This update describes the incorporation of new data sets, tools and curation efforts, as well as a new website design to simplify site use. New data sets include curated summary data from four genome-wide association studies relevant to T1D, HaemAtlas (a data set and tool to query gene expression levels in haematopoietic cells) and a manually curated table of human T1D susceptibility loci, incorporating genetic overlap with other related diseases. These developments will continue to support T1D research and allow easy access to large and complex T1D-relevant data sets.
Burren, Oliver S.; Adlem, Ellen C.; Achuthan, Premanand; Christensen, Mikkel; Coulson, Richard M. R.; Todd, John A.
Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed...
C. Hansen C. J. Mitchell C. S. Browniee J. M. Patchett J. P. Ahrens L. Lo - Ta
In this paper, we study the problem of sorting a large collection of strings in external memory. Based on adaptive construction of a summary data structure, called an adaptive synopsis trie, we present a practical string sorting algorithm, DistStrSort, which is suitable for sorting string collections of large size in external memory, and also for more complex string processing problems
This poster describes a Physiology Understanding (PhUn) Week activity performed with 3rd and 5th grade elementary students in limited classroom space that demonstrated how the heart and lung work together to meet tissue demands. This poster was presented at the PhUn Week Poster Session, Experimental Biology 2011, by Michael J. Ryan, PhD, University of Mississippi Medical Center.
PhD Michael J Ryan (University of Mississippi Medical Center)
For constant $r$ and arbitrary $n$, it was known that in the graph $K_r^n$ any independent set of size close to the maximum is close to some independent set of maximum size. We prove that this statement holds for arbitrary $r$ and $n$.
Objective: To compare the effects of active and didactic teaching strategies on learning- and process-oriented outcomes. Design: Controlled trial. Setting: After-hours residents' teaching session. Participants: Family and Community Medicine, Internal Medicine, and Pediatrics residents at two academic medical institutions. Interventions: We…
Haidet, Paul; Morgan, Robert O.; O'Malley, Kimberly; Moran, Betty Jeanne; Richards, Boyd F.
Background Large-scale genetic data sets are frequently shared with other research groups and even released on the Internet to allow for secondary analysis. Study participants are usually not informed about such data sharing because data sets are assumed to be anonymous after stripping off personal identifiers. Discussion The assumption of anonymity of genetic data sets, however, is tenuous because genetic data are intrinsically self-identifying. Two types of re-identification are possible: the "Netflix" type and the "profiling" type. The "Netflix" type needs another small genetic data set, usually with fewer than 100 SNPs but including a personal identifier. This second data set might originate from another clinical examination, a study of leftover samples or forensic testing. When merged with the primary, unidentified set it will re-identify all samples of that individual. Even with no second data set at hand, a "profiling" strategy can be developed to extract as much information as possible from a sample collection. Starting with the identification of ethnic subgroups along with predictions of body characteristics and diseases, the asthma kids case is used as a real-life example to illustrate that approach. Summary Depending on the degree of supplemental information, there is a good chance that at least a few individuals can be identified from an anonymized data set. Any re-identification, however, may potentially harm study participants because it will release individual genetic disease risks to the public.
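The "Netflix"-type linkage attack described in this abstract amounts to genotype matching between an identified probe and an anonymized collection. The sketch below is purely illustrative (all sample IDs, SNP names and genotypes are made up), showing why even a small identified panel can single out a record:

```python
# Illustrative linkage sketch: match a small identified SNP panel against
# an anonymized genotype database.
def reidentify(anonymous_db, identified_probe, min_overlap=5):
    """Return IDs of anonymous records whose genotypes agree with the
    probe at every SNP the two records share (given enough shared SNPs)."""
    hits = []
    for sample_id, genotypes in anonymous_db.items():
        shared = set(genotypes) & set(identified_probe)
        if len(shared) >= min_overlap and all(
                genotypes[snp] == identified_probe[snp] for snp in shared):
            hits.append(sample_id)
    return hits

anon = {
    "S1": {"rs1": "AA", "rs2": "AG", "rs3": "GG", "rs4": "CT", "rs5": "TT"},
    "S2": {"rs1": "AG", "rs2": "GG", "rs3": "AG", "rs4": "CC", "rs5": "TT"},
}
probe = {"rs1": "AA", "rs2": "AG", "rs3": "GG", "rs4": "CT", "rs5": "TT"}  # identified panel
print(reidentify(anon, probe))  # ['S1']
```

With realistic allele frequencies, a few dozen independent SNPs already make a matching genotype essentially unique in any human population, which is the basis of the abstract's claim.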
Background The MEDLINE database contains over 12 million references to scientific literature, with about 3/4 of recent articles including an abstract of the publication. Retrieval of entries using queries with keywords is useful for human users that need to obtain small selections. However, particular analyses of the literature or database developments may need the complete ranking of all the references in the MEDLINE database as to their relevance to a topic of interest. This report describes a method that does this ranking using the differences in word content between MEDLINE entries related to a topic and the whole of MEDLINE, in a computational time appropriate for an article search query engine. Results We tested the capabilities of our system to retrieve MEDLINE references which are relevant to the subject of stem cells. We took advantage of the existing annotation of references with terms from the MeSH hierarchical vocabulary (Medical Subject Headings, developed at the National Library of Medicine). A training set of 81,416 references was constructed by selecting entries annotated with the MeSH term stem cells or some child in its subtree. Frequencies of all nouns, verbs, and adjectives in the training set were computed and the ratios of word frequencies in the training set to those in the entire MEDLINE were used to score references. Self-consistency of the algorithm, benchmarked with a test set containing the training set and an equal number of references randomly selected from MEDLINE, was better using nouns (79%) than adjectives (73%) or verbs (70%). The evaluation of the system with 6,923 references not used for training, containing 204 articles relevant to stem cells according to a human expert, indicated a recall of 65% for a precision of 65%. Conclusion This strategy appears to be useful for predicting the relevance of MEDLINE references to a given concept. The method is simple and can be used with any user-defined training set.
Choice of the part of speech of the words used for classification has important effects on performance. Lists of words, scripts, and additional information are available from the web address .
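The frequency-ratio scoring described above can be sketched in a few lines. The corpora, tokenization, and smoothing constant below are illustrative stand-ins, not the MEDLINE data or the authors' exact procedure:

```python
# Toy sketch: score a document by summing log ratios of word frequencies
# in a topic-specific training set versus the whole corpus.
import math
from collections import Counter

def word_freqs(docs):
    # Relative frequency of each whitespace token across a document list.
    counts = Counter(w for d in docs for w in d.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def score(doc, train_f, corpus_f, eps=1e-6):
    # Only words seen in the training set contribute; eps guards log(0).
    return sum(math.log(train_f.get(w, eps) / corpus_f.get(w, eps))
               for w in doc.lower().split() if w in train_f)

corpus = ["stem cells divide", "heart disease risk", "cells grow", "risk factors"]
train = ["stem cells divide", "cells grow"]
train_f, corpus_f = word_freqs(train), word_freqs(corpus)
print(score("stem cells", train_f, corpus_f) > score("disease risk", train_f, corpus_f))  # True
```

Restricting the tokens to nouns, as the abstract reports, only changes which words survive the `if w in train_f` filter; the scoring machinery is unchanged.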
Data sets imaged with modern electron microscopes can range from tens of terabytes to about one petabyte. Two new tools, Ssecrett and NeuroTrace, support interactive exploration and analysis of large-scale optical and electron-microscopy images to help scientists reconstruct complex neural circuits of the mammalian nervous system.
We report a comprehensive approach for outbreak investigations, including cluster analysis (Bernoulli model), an algorithm to build inferential models, and molecular techniques to confirm cases. Our approach may be a useful tool for exploiting the large amount of unsystematically collected information available during outbreak investigations in healthcare settings. PMID:24799653
Lanini, Simone; Cosić, Gorana; Menzo, Stefano; Puro, Vincenzo; Durić, Predrag; Garbuglia, Anna Rosa; Milošević, Vesna; Karać, Tatjana; Capobianchi, Maria Rosaria; Ippolito, Giuseppe
Coefficients of inbreeding are commonly used in mixed-model methods for forming inverses of Wright's numerator relationship matrix and transformation matrices used in variance component estimation and national cattle evaluation. Computation of exact coefficients of inbreeding from very large data sets has been believed to be too expensive or too difficult a task to perform. Approximate methods have been used instead.
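For small pedigrees, exact inbreeding coefficients can be computed directly with the classical tabular method for Wright's numerator relationship matrix A, where F_i = A[i][i] − 1. This sketch is the textbook recursion, not one of the scalable large-data-set methods the paragraph alludes to:

```python
# Tabular method for the numerator relationship matrix A.
def inbreeding(pedigree):
    """pedigree: list of (sire, dam) index pairs, ancestors listed before
    descendants; None marks an unknown parent. Returns F_i = A[i][i] - 1."""
    n = len(pedigree)
    A = [[0.0] * n for _ in range(n)]
    for i, (s, d) in enumerate(pedigree):
        for j in range(i):
            # Relationship with earlier animals: average of parental rows.
            aj = 0.0
            if s is not None:
                aj += 0.5 * A[j][s]
            if d is not None:
                aj += 0.5 * A[j][d]
            A[i][j] = A[j][i] = aj
        # Diagonal: 1 plus half the parents' relationship.
        A[i][i] = 1.0 + (0.5 * A[s][d] if s is not None and d is not None else 0.0)
    return [A[i][i] - 1.0 for i in range(n)]

# Animal 3 comes from a parent-offspring mating (sire 0 with daughter 2): F = 0.25.
print(inbreeding([(None, None), (None, None), (0, 1), (0, 2)]))  # [0.0, 0.0, 0.0, 0.25]
```

The O(n²) storage of A is exactly why, for millions of animals, approximate or specialized exact algorithms were preferred, as the paragraph notes.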
This paper is a report of a middle-school teacher's study of classroom management. The teacher/researcher was interested in how some of the techniques in the Kovalik Integrated Thematic Instruction model of training would influence the teacher/researcher's classroom management; the effects of direct instruction within a community circle; the…
This classroom screening device was developed by the Circle Preschool First Chance Project, a government-funded program to integrate handicapped children into regular classroom activities, for use in preschools, nursery schools, Head Start centers and other agencies working with young children. It is designed to give a gross measure of a child's…
Accurate quantitative structure-property relationship (QSPR) models based on a large data set containing a total of 3483 organic compounds were developed to predict chemicals' adsorption capability onto activated carbon in the gas phase. Both a global multiple linear regression (MLR) method and a local lazy regression (LLR) method were used to develop QSPR models. The results proved that LLR has a prediction accuracy 10% higher than that of the MLR model. By applying the LLR method we can predict the test set (787 compounds) with a Q2ext of 0.900 and a root mean square error (RMSE) of 0.129. The accurate model based on this large data set could be useful for predicting the adsorption properties of new compounds, since such a model covers a highly diverse structural space.
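The contrast between the two methods is that a global MLR fits one model to all compounds, while lazy regression fits a small model around each query at prediction time. A hedged one-descriptor sketch of the lazy idea (the paper's models use many molecular descriptors, and this is not the authors' implementation):

```python
# Local lazy regression sketch: fit an ordinary least-squares line on only
# the k nearest training points to each query (1-D case).
def local_lazy_predict(xs, ys, x_query, k=3):
    nearest = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x_query))[:k]
    nx, ny = [xs[i] for i in nearest], [ys[i] for i in nearest]
    mx, my = sum(nx) / k, sum(ny) / k
    sxx = sum((x - mx) ** 2 for x in nx)
    slope = sum((x - mx) * (y - my) for x, y in zip(nx, ny)) / sxx if sxx else 0.0
    return my + slope * (x_query - mx)

# Piecewise behavior that a single global line would average away:
xs = [0, 1, 2, 3, 4, 5]
ys = [0, 1, 2, 10, 11, 12]   # jump between x=2 and x=3
print(local_lazy_predict(xs, ys, 4.5))  # 11.5, from the local fit on x in {3, 4, 5}
```

Because each prediction only uses structurally similar neighbors, local fits can track relationships that vary across chemical space, which is a plausible reason for the reported accuracy gain over global MLR.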
Clustering has been an active research area of great practical importance in recent years. Most previous clustering models have focused on grouping objects with similar values on a (sub)set of dimensions (e.g., subspace cluster) and assumed that every object has an associated value on every dimension (e.g., bicluster). These existing cluster models may not always
Describes the care and breeding of zebra fish, suggests various experiments and observations easily performed in a classroom setting, and provides some ideas to further student interest and exploration of these organisms. (DDR)
Student use of electronic response technology has been prevalent in postsecondary institutions and is beginning to penetrate K-12 classroom settings. Despite these trends, research exploring the impact of this technology in these settings has been limited. The extant research has relied heavily on survey methodologies and largely has focused on student/teacher perception or implementation practices while remaining silent on learning
A procedure is described for making diatom microscopy slides from large numbers of sediment samples, simultaneously. The method uses small sediment samples, small test tubes, small amounts of hydrogen peroxide, no decantings, no centrifuging, and it requires no washing-up of glassware. It is, therefore, fast, cheap, gentle to the diatoms and needs little fume cupboard space.
We have developed an adaptive, automatic, correlation- and clustering-based method for greatly reducing the degree of picking inconsistency in large, digital seismic catalogs and for quantifying similarity within, and discriminating among, clusters of disparate waveform families. Innovations in the technique include (1) the use of eigenspectral methods for cross-spectral phase estimation and for providing subsample pick lag error estimates
Subgroup discovery, the domain of application of CN2-SD, is defined as: "given a population of individuals and a property of those individuals, we are interested in finding population subgroups that are as large as possible and have the most unusual statistical characteristics with respect to the property of interest". The subgroup discovery algorithm CN2-SD is based on a separate and conquer
José-Ramón Cano; Francisco Herrera; Manuel Lozano; Salvador García
Thesis Statement: "Availability-dependent global predicates can be efficiently and scalably realized for a class of distributed services, in spite of specific selfish and colluding behaviors, using local and decentralized protocols". Several types of large-scale distributed systems spanning the Internet have to deal with availability variations…
The use of random algorithms in many areas of computer science has enabled the solution of otherwise intractable problems. In this paper we propose that random sampling can make the visualisation of large datasets both more computationally efficient and more perceptually effective. We review the explicit uses of randomness and the related deterministic techniques in the visualisation literature. We then
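One concrete way to realize the paper's premise is reservoir sampling, which draws a uniform random sample of fixed size in a single pass over a dataset too large to plot in full. This sketch is the classic Algorithm R, offered here as an illustration rather than a technique taken from the paper:

```python
# Reservoir sampling (Algorithm R): uniform k-sample from a stream of
# unknown length, using O(k) memory.
import random

def reservoir_sample(stream, k, rng=random):
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)
        else:
            # Keep item i with probability k/(i+1), which preserves
            # uniformity over everything seen so far.
            j = rng.randrange(i + 1)
            if j < k:
                sample[j] = item
    return sample

points = ((x, x * x) for x in range(1_000_000))  # generator: never held in memory
subset = reservoir_sample(points, k=1000)
print(len(subset))  # 1000
```

Plotting the 1,000-point subset instead of a million points is both cheaper to render and often easier to read, which is the dual benefit the abstract claims for random sampling.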
The size of many data bases has grown to the point where they cannot fit into the fast memory of even large-memory machines, to say nothing of current workstations. If what we want to do is to use these data bases to construct predictions of various characteristics, then since the usual methods require that all data be held in fast memory, various work-arounds have
The purpose of this study was to find the best classroom management strategies to use when teaching in an elementary school setting. I wanted to conduct the best possible management tools for a variety of age groups as well as meet educational standards. Through my research I found different approaches in different grade levels is an important…
This study investigated gender differences in science learning between two pedagogical approaches: traditional lecture and narrative case studies using personal response systems (‘clickers’). Thirteen instructors of introductory biology classes at 12 different institutions across the USA and Canada used two types of pedagogy (Clicker Cases and traditional lecture) to teach eight topic areas. Three different sets of multiple regression analysis
Hosun Kang; Mary Lundeberg; Bjørn Wolter; Robert delMas; Clyde F. Herreid
In this paper, we describe a novel technique for visualizing large amounts of high-dimensional data, called 'circle segments'. The technique uses one colored pixel per data value and can therefore be classified as a pixel-per-value technique (Kei 96). The basic idea of the 'circle segments' visualization technique is to display the data dimensions as segments of a circle. If
Mihael Ankerst; Daniel A. Keim; Hans-Peter Kriegel
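A minimal sketch of the 'circle segments' mapping: each data dimension owns one angular segment of the circle, and that dimension's values are laid out from the centre outward. The coordinate formula below is my simplification of the pixel-by-pixel arrangement in the paper; the function name and layout are illustrative only.

```python
import math

def circle_segment_coords(dim, idx, n_dims, n_values):
    """Map the idx-th value of dimension `dim` to a point inside that
    dimension's segment of the circle (simplified: the real technique
    fills segments pixel line by pixel line)."""
    seg_width = 2 * math.pi / n_dims            # angular width of one segment
    frac = (idx + 0.5) / n_values               # position along the segment
    angle = dim * seg_width + frac * seg_width  # angle inside the segment
    radius = frac                               # grow outward from the centre
    return radius * math.cos(angle), radius * math.sin(angle)
```

Colouring each returned point by its data value then yields one pixel per value, grouped by dimension.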
Over the past decade, data base sizes have continually increased and will continue to do so in the future. This problem of size is further compounded because the trend in present-day studies is to use data from many different locations and different instruments and then compare it with data from global scale physical models. The latter produce data bases of comparable if not even larger size. Much of the data can be viewed as 'image' time sequences and is most readily viewed on color display terminals. These data sets reside in national or owner-generated data bases linked together by computer networks. As the size increases, just moving this data around, taking 'quick-looks' at the data, or even storing it locally become severe problems compromising the scientific return from the data. Is the present-day technology with these analysis techniques being used in the best way? What are the prospects for reducing the storage and transmission size of the data sets? Examples of such problems and potential solutions are described in this paper.
This study examines the effectiveness of a classroom response system (CRS) and architecture students' perceptions of real-time feedback. CRS is designed to increase active engagement of students by their responses to a question or prompt via wireless keypads. Feedback is immediately portrayed on a classroom projector for discussion. The authors…
Background A key question when analyzing high throughput data is whether the information provided by the measured biological entities (gene, metabolite expression for example) is related to the experimental conditions, or, rather, to some interfering signals, such as experimental bias or artefacts. Visualization tools are therefore useful to better understand the underlying structure of the data in a 'blind' (unsupervised) way. A well-established technique to do so is Principal Component Analysis (PCA). PCA is particularly powerful if the biological question is related to the highest variance. Independent Component Analysis (ICA) has been proposed as an alternative to PCA as it optimizes an independence condition to give more meaningful components. However, neither PCA nor ICA can overcome both the high dimensionality and noisy characteristics of biological data. Results We propose Independent Principal Component Analysis (IPCA) that combines the advantages of both PCA and ICA. It uses ICA as a denoising process of the loading vectors produced by PCA to better highlight the important biological entities and reveal insightful patterns in the data. The result is a better clustering of the biological samples on graphical representations. In addition, a sparse version is proposed that performs an internal variable selection to identify biologically relevant features (sIPCA). Conclusions On simulation studies and real data sets, we showed that IPCA offers a better visualization of the data than ICA and with a smaller number of components than PCA. Furthermore, a preliminary investigation of the list of genes selected with sIPCA demonstrates that the approach is well able to highlight relevant genes in the data with respect to the biological experiment. IPCA and sIPCA are both implemented in the R package mixomics dedicated to the analysis and exploration of high dimensional biological data sets, and on mixomics' web-interface.
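The IPCA pipeline, PCA for dimension reduction followed by a denoising pass over the loading vectors, can be sketched as follows. Note the denoising step here is a simple soft-threshold stand-in for the ICA rotation the authors use; it illustrates the pipeline, not the published algorithm.

```python
import numpy as np

def ipca(X, n_comp=2):
    """Sketch of the IPCA idea: compute PCA loadings, 'denoise' them,
    then project the samples onto the denoised loadings.
    Assumption: thresholding replaces the real ICA denoising step."""
    Xc = X - X.mean(axis=0)
    # PCA via SVD: rows of Vt are the loading vectors
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = Vt[:n_comp].T                    # shape (n_features, n_comp)
    # stand-in for ICA denoising: zero out small loading weights
    thresh = np.abs(loadings).mean(axis=0)
    loadings = np.where(np.abs(loadings) < thresh, 0.0, loadings)
    scores = Xc @ loadings                      # sample coordinates for plotting
    return scores, loadings
```

Plotting the two score columns against each other gives the kind of sample-clustering view the abstract describes.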
Thermal contact conductance test data at high vacuum were obtained from two Armco iron specimens having smooth, large radii of curvature, convex, one-half wave length surfaces. The data are compared with calculations based on two macroscopic elastic deformation theories and an empirical expression. Major disagreement with the theories and fair agreement with the empirical expression resulted. Plastic deformation of all the contacting surfaces was verified from surface analyzer statistics. These results indicate that the theoretical assumption of macroscopic elastic deformation is inadequate for accurate prediction of heat transfer with light loads for Armco iron specimens similar to those used in this investigation.
Owing to new advances in computer hardware, large text databases have become more prevalent than ever. Automatically mining information from these databases proves to be a challenge due to slow pattern/string matching techniques. In this paper we present a new, fast multi-string pattern matching method based on the well known Aho-Corasick algorithm. Advantages of our algorithm include: the ability to exploit the natural structure of text, the ability to perform significant character shifting, avoiding backtracking jumps that are not useful, efficiency in terms of matching time and avoiding the typical “sub-string” false positive errors. Our algorithm is applicable to many fields with free text, such as the health care domain and the scientific document field. In this paper, we apply the BSS algorithm to health care data and mine hundreds of thousands of medical concepts from a large Electronic Medical Record (EMR) corpus simultaneously and efficiently. Experimental results show the superiority of our algorithm when compared with the top-of-the-line multi-string matching algorithms.
Liu, Ying; Lita, Lucian Vlad; Niculescu, Radu Stefan; Mitra, Prasenjit; Giles, C. Lee
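The textbook Aho-Corasick automaton that the BSS method builds on can be sketched as follows (the paper's character-shifting and false-positive-avoidance extensions are not shown): a trie of the patterns is augmented with failure links so that all patterns are matched in a single pass over the text.

```python
from collections import deque

def build_automaton(patterns):
    """Build a plain Aho-Corasick automaton: trie + BFS failure links."""
    goto, fail, out = [{}], [0], [set()]
    for pat in patterns:                       # build the trie
        state = 0
        for ch in pat:
            if ch not in goto[state]:
                goto.append({}); fail.append(0); out.append(set())
                goto[state][ch] = len(goto) - 1
            state = goto[state][ch]
        out[state].add(pat)
    queue = deque(goto[0].values())            # BFS to set failure links
    while queue:
        state = queue.popleft()
        for ch, nxt in goto[state].items():
            queue.append(nxt)
            f = fail[state]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[nxt] = goto[f].get(ch, 0)
            out[nxt] |= out[fail[nxt]]         # inherit matches from suffix state
    return goto, fail, out

def search(text, patterns):
    """Return (start_index, pattern) for every occurrence in one pass."""
    goto, fail, out = build_automaton(patterns)
    state, hits = 0, []
    for i, ch in enumerate(text):
        while state and ch not in goto[state]:
            state = fail[state]
        state = goto[state].get(ch, 0)
        for pat in out[state]:
            hits.append((i - len(pat) + 1, pat))
    return hits
```

Matching time is linear in the text length plus the number of matches, independent of how many patterns are loaded.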
Cytologic diagnosis of palpable breast masses is an accepted method for diagnosis. However, the high nondiagnostic rate causes repeat biopsy, unnecessary delays, and increased costs. Our purpose is to evaluate the use of ultrasound (US)-guided large-core needle biopsy as part of the minimally invasive multidisciplinary diagnosis of palpable breast masses. We studied 502 consecutive patients with 510 palpable solid breast masses seen and evaluated by a multidisciplinary team. Patients had US-guided core biopsy. Clinical-imaging-pathologic correlation (CIPC) was done in all cases. Core biopsy was deemed conclusive if CIPC was congruent and was used to guide definitive management. The median age of our patients was 39 years. Median tumor size was 2.2 cm. Of these cases, 463 (91%) had a conclusive diagnosis on CIPC. Core needle findings on 47 masses were nondefinitive to guide therapy (fibroepithelial lesion, atypical ductal hyperplasia, intraductal papilloma, CIPC). Three cancers were detected in this group on excisional biopsy. In conclusion, US-guided large-core needle biopsy is a sensitive method for diagnosis of palpable breast masses. Multidisciplinary correlation of clinical findings, imaging, and pathology is essential for success. This approach improves use of operating room resources and maximizes patient participation in the decision-making process. PMID:15529839
This publication presents a set of readings and tools that accompany the education modules "Enhancing Classroom Approaches to Addressing Barriers to Learning: Classroom-Focused Enabling." Together, they delineate a preservice/inservice teacher preparation curriculum covering how regular classrooms and schools should be designed to ensure all…
California Univ., Los Angeles. Center for Mental Health in Schools.
Modern nuclear facilities, such as reprocessing plants, present inspectors with significant challenges due in part to the sheer amount of equipment that must be safeguarded. The Sandia-developed and patented Knowledge Generation system was designed to automatically analyze large amounts of safeguards data to identify anomalous events of interest by comparing sensor readings with those expected from a process of interest and operator declarations. This paper describes a demonstration of the Knowledge Generation system using simulated accountability tank sensor data to represent part of a reprocessing plant. The demonstration indicated that Knowledge Generation has the potential to address several problems critical to the future of safeguards. It could be extended to facilitate remote inspections and trigger random inspections. Knowledge Generation could analyze data to establish trust hierarchies, to facilitate safeguards use of operator-owned sensors.
Thomas, Maikel A.; Smartt, Heidi Anne; Matthews, Robert F.
The map data input and output problem for geographic information systems is rapidly diminishing with the increasing availability of mass digitizing, direct spatial data capture and graphics hardware based on raster technology. Although a large number of efficient raster-based algorithms exist for performing a wide variety of common tasks on these data, there are a number of procedures which are more efficiently performed in vector mode or for which raster mode equivalents of current vector-based techniques have not yet been developed. This paper presents a hybrid spatial data structure, named the 'vaster' structure, which can utilize the advantages of both raster and vector structures while potentially eliminating, or greatly reducing, the need for raster-to-vector and vector-to-raster conversion. Other advantages of the vaster structure are also discussed.
One of the major tasks in interpretation of data from large-scale stellar surveys is to determine the fundamental atmospheric parameters such as effective temperature, surface gravity, and metallicity. In most on-going and upcoming projects, they are derived spectroscopically, relying on classical one-dimensional (1D) model atmospheres and the assumption of LTE. This review discusses the present achievements and problems of non-local thermodynamic equilibrium (NLTE) line-formation calculations for FGK-type stars. The topics that are addressed include (i) the construction of comprehensive model atoms for the chemical elements with complex term system, (ii) possible systematic errors inherent in classical LTE spectroscopic determinations of stellar parameters and chemical abundances, (iii) the uncertainties in final NLTE results caused by the uncertainties in atomic data, and (iv) applications of the NLTE line-formation calculations coupled to the spatial and temporal average <3D> models to spectroscopic analyses.
Hydraulic properties related to river flow affect salmon spawning habitat. Accurate prediction of salmon spawning habitat, and an understanding of the properties that influence spawning behavior, are of great interest for hydroelectric dam management. Previous research predicted salmon spawning habitat by deriving river-specific spawning suitability indices and employing a function-estimation method such as logistic regression on several static river-flow-related properties, and had some success. The objective of this study was two-fold. First, dynamic river flow properties associated with upstream dam operation were successfully derived from a huge set of time series of both water velocity and water depth for about one fifth of a million habitat cells through principal component analysis (PCA) using nonlinear iterative partial least squares (NIPALS). The inclusion of dynamic variables greatly improved model prediction. Second, nine machine learning methods were applied to the data, and it was found that decision tree and rule induction methods generally outperformed the commonly used logistic regression. Specifically, random forest, an advanced decision tree algorithm, consistently provided better results. The over-prediction problem of previous studies was greatly alleviated.
Xie, YuLong; Murray, Christopher J.; Hanrahan, Timothy P.; Geist, David R.
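The NIPALS iteration mentioned above extracts principal components one at a time from matrix-vector products, which is why it scales to hundreds of thousands of habitat cells without ever forming a covariance matrix. A minimal sketch:

```python
import numpy as np

def nipals_pca(X, n_comp=2, tol=1e-10, max_iter=500):
    """NIPALS PCA: alternate between estimating the loading vector p and
    the score vector t for one component, then deflate and repeat."""
    X = X - X.mean(axis=0)            # centre the data
    scores, loadings = [], []
    for _ in range(n_comp):
        t = X[:, 0].copy()            # initial guess for the score vector
        for _ in range(max_iter):
            p = X.T @ t / (t @ t)     # regress columns of X on t
            p /= np.linalg.norm(p)    # normalise the loading vector
            t_new = X @ p             # updated scores
            if np.linalg.norm(t_new - t) < tol:
                t = t_new
                break
            t = t_new
        X = X - np.outer(t, p)        # deflate: remove this component
        scores.append(t); loadings.append(p)
    return np.array(scores).T, np.array(loadings).T
```

Deflation guarantees that successive loading vectors come out orthonormal, matching ordinary PCA.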
A European industrial consortium called SOLARDETOX has been created as the result of an EC-DGXII BRITE-EURAM-III-financed project on solar photocatalytic detoxification of water. The project objective was to develop a simple, efficient and commercially competitive water-treatment technology, based on compound parabolic collector (CPC) solar collectors and TiO2 photocatalysis, to make easy design and installation possible. The design, set-up and preliminary results of the main project deliverable, the first European industrial solar detoxification treatment plant, are presented. This plant has been designed for the batch treatment of 2 m3 of water with a 100 m2 collector-aperture area and aqueous aerated suspensions of polycrystalline TiO2 irradiated by sunlight. Fully automatic control reduces operation and maintenance manpower. Plant behaviour has been compared (using dichloroacetic acid and cyanide at 50 mg l(-1) initial concentration as model compounds) with the small CPC pilot plants installed at the Plataforma Solar de Almería several years ago. The first results with high-content cyanide (1 g l(-1)) waste water are presented and plant treatment capacity is calculated. PMID:11996143
WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota's Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built upon over the last 3 1/2 years. The motivation behind WebViz…
D. A. Yuen; E. McArthur; R. M. Weiss; J. Zhou; B. Yao
Five Standards-based strategies for successful inclusion of special-needs students in the secondary science classroom are described in this article. Use a multisensory approach; encourage collaboration among students; provide specific expectations and ass…
We have developed a direct iterative-variational technique for solving large systems of linear equations Ax = b, where N is the order of the matrix A and the length of the vectors x and b. The method, which has analogues in the conjugate gradient and Lanczos schemes as well as the direct configuration-interaction procedures of quantum chemistry, involves the construction of an orthonormal basis from successive applications of the general linear algebraic (LA) matrix A to an initial guess for the solution vector. The solution vector is expanded in this basis, and the coefficients are determined from a variational prescription. For m iterations, the number of operations to solve the LA equations is of the order N^2 m. Since the basis is orthonormal, the procedure is guaranteed to converge within N iterations, provided that the basis vectors remain linearly independent. In practice, the convergence is much more rapid (m << N). Another advantage of the method is that the whole matrix A need not be stored. In the more general case of multiple right-hand sides (x and b matrices), the method can be applied simultaneously to all of the solutions, thus avoiding many redundant operations that would arise from treating each column of x independently. We have applied the technique to the solution of LA systems that arise from converting radial coupled integrodifferential equations to an integral representation on a discrete quadrature, in particular, to problems of electron-atom and electron-molecule collisions. The order of A is given approximately by the product of the number of quadrature points n_p and the number of scattering channels n_c (N = n_c n_p). Since the method is direct, we need only store the potential, the regular and irregular components of the product Green's function, and the iterates, therefore drastically reducing the central memory requirements. The integrals are constructed from simple recursion relationships involving the stored quantities.
In addition, the direct approach results in savings in computational time, since the number of operations to generate an iterate scales as n_c^2 n_p rather than N^2. We generally start the iteration from the Born solution, although better starting approximations, such as the distorted wave, may enhance the convergence. As an example, we apply the method to electron collisions with the hydrogen molecular ion at the static level and treat large numbers of coupled channels (n_c ≈ 30).
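The conjugate-gradient scheme, the closest widely known analogue of the iterative-variational method described above, shares its key property that A enters only through matrix-vector products, so A itself need not be stored densely:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Conjugate-gradient solve of Ax = b for symmetric positive-definite A.
    Only products A @ p are needed; like the method above, it converges
    within N iterations in exact arithmetic and usually far sooner."""
    n = len(b)
    x = np.zeros(n)
    r = b.copy()                   # residual b - A x
    p = r.copy()                   # search direction
    rs = r @ r
    for _ in range(max_iter or 2 * n):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # optimal step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # new A-conjugate direction
        rs = rs_new
    return x
```

For multiple right-hand sides, as the abstract notes, the products with A can be batched across all columns at once.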
Motivation: Nucleotide sequence data are being produced at an ever increasing rate. Clustering such sequences by similarity is often an essential first step in their analysis—intended to reduce redundancy, define gene families or suggest taxonomic units. Exact clustering algorithms, such as hierarchical clustering, scale relatively poorly in terms of run time and memory usage, yet they are desirable because heuristic shortcuts taken during clustering might have unintended consequences in later analysis steps. Results: Here we present HPC-CLUST, a highly optimized software pipeline that can cluster large numbers of pre-aligned DNA sequences by running on distributed computing hardware. It allocates both memory and computing resources efficiently, and can process more than a million sequences in a few hours on a small cluster. Availability and implementation: Source code and binaries are freely available at http://meringlab.org/software/hpc-clust/; the pipeline is implemented in C++ and uses the Message Passing Interface (MPI) standard for distributed computing. Contact: firstname.lastname@example.org Supplementary Information: Supplementary data are available at Bioinformatics online.
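For intuition, exact single-linkage clustering of pre-aligned sequences, the kind of hierarchical clustering HPC-CLUST distributes, can be sketched naively as follows. This toy version is O(n^2) and in-memory, which is exactly the scaling problem the pipeline exists to avoid.

```python
def hamming_dist(a, b):
    """Fraction of mismatching positions between two pre-aligned sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def single_linkage(seqs, cutoff):
    """Merge clusters whenever any cross-cluster pair is within `cutoff`
    (naive single linkage; illustrative only, not the HPC-CLUST code)."""
    clusters = [[s] for s in seqs]
    merged = True
    while merged:
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                if any(hamming_dist(a, b) <= cutoff
                       for a in clusters[i] for b in clusters[j]):
                    clusters[i] += clusters.pop(j)
                    merged = True
                    break
            if merged:
                break
    return clusters
```

Because the sequences are pre-aligned, the pairwise distance is a cheap positional comparison rather than a full alignment.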
Problem-based learning (PBL) is often described as resource demanding due to the high staff-to-student ratio required in a traditional PBL tutorial class where there is commonly one facilitator to every 5-16 students. The veterinary science program at Charles Sturt University, Australia, has developed a method of group facilitation which readily allows one or two staff members to facilitate up to 30 students at any one time while maintaining the benefits of a small PBL team of six students. Multi-team facilitation affords obvious financial and logistic advantages, but there are also important pedagogical benefits derived from uniform facilitation across multiple groups, enhanced discussion and debate between groups, and the development of self-facilitation skills in students. There are few disadvantages to the roaming facilitator model, provided that several requirements are addressed. These requirements include a suitable venue, large whiteboards, a structured approach to support student engagement with each disclosure, a detailed facilitator guide, and an open, collaborative, and communicative environment. PMID:23975069
Students use bearing measurements to triangulate and determine objects' locations. Working in teams of two or three, they must put on their investigative hats as they take bearing measurements to specified landmarks in their classroom (or other rooms in the school) from a "mystery location." With the extension activity, students are challenged with creating their own maps of the classroom or other school location and comparing them with their classmates' efforts.
Institute Of Navigation And Integrated Teaching And Learning Program
In this activity, students will use bearing measurements to triangulate and determine objects' locations. Working in teams of two or three, students must put on their investigative hats as they take bearing measurements to specified landmarks in their classroom (or other rooms in the school) from a mystery location. With the extension activity, students are challenged with creating their own map of the classroom or other school location and comparing it with their classmates' efforts.
Lippis, Matt; Axelrad, Penny; Yowell, Janet; Zarske, Malinda S.
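The math underlying the bearing-triangulation exercise above can be sketched as follows; the coordinates and the degree convention (clockwise from north) are illustrative assumptions.

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Locate an object from two bearings (degrees clockwise from north)
    taken at known points p1 and p2, by intersecting the two bearing rays."""
    # bearing -> (east, north) direction components
    d1 = (math.sin(math.radians(bearing1)), math.cos(math.radians(bearing1)))
    d2 = (math.sin(math.radians(bearing2)), math.cos(math.radians(bearing2)))
    # solve p1 + t*d1 = p2 + s*d2 for t via the 2-D cross product
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < 1e-12:
        return None                       # parallel bearings: no fix
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

With classroom landmarks at known positions, two student bearing readings pin down the "mystery location" exactly; a third bearing exposes measurement error as a small triangle.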
The Paperless Classroom project explored the use of advanced technology to develop and demonstrate an interface to the large amounts of technical information available in electronic form. A prototype system was developed and field-tested at the AEGIS Trai...
This paper aims to present a heuristic algorithm with factor analysis and a local search optimization system for pattern identification problems as applied to large and multivariate aero-geophysical data. The algorithm was developed in MATLAB code using both multivariate and univariate methodologies. Two main analysis steps are detailed in the MATLAB code: the first deals with multivariate factor analysis to reduce the problem of dimension, and to orient the variables in an independent and orthogonal structure; and the second with the application of a novel local research optimization system based on univariate structure. The process of local search is simple and consistent because it solves a multivariate problem by summing up univariate and independent problems. Thus, it can reduce computational time and render the efficiency of estimates independent of the data bank. The aero-geophysical data include the results of the magnetometric and gammaspectrometric (TC, K, Th, and U channels) surveys for the Santa Maria region (RS, Brazil). After the classification, when the observations are superimposed on the regional map, one can see that data belonging to the same subspace appear closer to each other revealing some physical law governing area pattern distribution. The analysis of variance for the original variables as functions of the subspaces obtained results in different mean behaviors for all the variables. This result shows that the use of factor transformation captures the discriminative capacity of the original variables. The proposed algorithm for multivariate factor analysis and the local search system open up new challenges in aero-geophysical data handling and processing techniques.
da Silva Pereira, João Eduardo; Strieder, Adelir José; Amador, Janete Pereira; da Silva, José Luiz Silvério; Volcato Descovi Filho, Leônidas Luiz
This book is designed to acknowledge and support the work of teachers while explaining the implications of new knowledge for classroom practice. Ready, Set, Science! is an account of recent research into teaching and learning science. It is designed to help practitioners make sense of new research and use this research to inform their classroom practice.
Michaels, Sarah; Shouse, Andrew W.; Schweingruber, Heidi A.
The large number of spectra obtained from sky surveys such as the Sloan Digital Sky Survey (SDSS) and the survey executed by the Large sky Area Multi-Object fibre Spectroscopic Telescope (LAMOST, also called GuoShouJing Telescope) provide us with opportunities to search for peculiar or even unknown types of spectra. In response to the limitations of existing methods, a novel outlier-mining method, the Monte Carlo Local Outlier Factor (MCLOF), is proposed in this paper, which can be used to highlight unusual and rare spectra from large spectroscopic survey data sets. The MCLOF method exposes outliers automatically and efficiently by marking each spectrum with a number, i.e. using the outlier index as a flag for an unusual and rare spectrum. The Local Outlier Factor (LOF) represents how unusual and rare a spectrum is compared with other spectra, and the Monte Carlo method is used to compute the global LOF for each spectrum by randomly selecting samples in each independent iteration. Our MCLOF method is applied to over half a million stellar spectra (classified as STAR by the SDSS Pipeline) from the SDSS data release 8 (DR8) and a total of 37 033 spectra are selected as outliers with signal-to-noise ratio (S/N) ≥ 3 and outlier index ≥ 0.85. Some of these outliers are shown to be binary stars, emission-line stars, carbon stars and stars with unusual continuum. The results show that our proposed method can efficiently highlight these unusual spectra from the survey data sets. In addition, some relatively rare and interesting spectra are selected, indicating that the proposed method can also be used to mine rare, even unknown, spectra. The proposed method is applicable not only to spectral survey data sets but also to other types of survey data sets. The spectra of all peculiar objects selected by our MCLOF method are available from a user-friendly website: http://sciwiki.lamost.org/Miningdr8/.
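The Monte Carlo averaging idea behind MCLOF can be sketched as follows. For brevity this toy version scores 1-D points with a k-nearest-neighbour distance rather than the full Local Outlier Factor; both simplifications are mine, not the paper's.

```python
import random

def knn_dist(x, sample, k=3):
    """Mean distance from x to its k nearest neighbours in `sample`
    (a simple stand-in for the full LOF computation)."""
    dists = sorted(abs(x - s) for s in sample if s is not x)
    return sum(dists[:k]) / k

def mc_outlier_index(data, n_iter=50, sample_size=20, seed=0):
    """Average each point's outlier score over many random subsamples,
    as in the MCLOF idea: points that score high in most random draws
    are flagged as globally unusual."""
    rng = random.Random(seed)
    scores = {i: 0.0 for i in range(len(data))}
    for _ in range(n_iter):
        sample = rng.sample(data, sample_size)
        for i, x in enumerate(data):
            scores[i] += knn_dist(x, sample)
    return {i: s / n_iter for i, s in scores.items()}
```

Thresholding the averaged index (the paper uses ≥ 0.85 on its normalized scale) then selects candidate peculiar objects.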
Modern microscope automation permits the collection of vast amounts of continuous anatomical imagery in both two and three dimensions. These large data sets present significant challenges for data storage, access, viewing, annotation and analysis. The cost and overhead of collecting and storing the data can be extremely high. Large data sets quickly exceed an individual's capability for timely analysis and present challenges in efficiently applying transforms, if needed. Finally annotated anatomical data sets can represent a significant investment of resources and should be easily accessible to the scientific community. The Viking application was our solution created to view and annotate a 16.5 TB ultrastructural retinal connectome volume and we demonstrate its utility in reconstructing neural networks for a distinctive retinal amacrine cell class. Viking has several key features. (1) It works over the internet using HTTP and supports many concurrent users limited only by hardware. (2) It supports a multi-user, collaborative annotation strategy. (3) It cleanly demarcates viewing and analysis from data collection and hosting. (4) It is capable of applying transformations in real-time. (5) It has an easily extensible user interface, allowing addition of specialized modules without rewriting the viewer. PMID:21118201
Anderson, J R; Mohammed, S; Grimm, B; Jones, B W; Koshevoy, P; Tasdizen, T; Whitaker, R; Marc, R E
This interactive tutorial on Ohm's Law, part of The Physics Classroom tutorial collection, provides a conceptual foundation for understanding one of the most powerful formulas in physics. Multiple circuit diagrams and tables illustrate the relationships among voltage, current, and resistance before users explore the mathematics. The author includes two quizzes for users to gauge their own understanding and perform simple calculations related to resistance. This page is part of The Physics Classroom, a comprehensive set of interactive tutorials, labs, and simulations for students of introductory physics. The Physics Classroom is one of the ComPADRE digital library collections.
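The voltage-current-resistance relationships the tutorial's tables illustrate can be captured in a small helper; the function name and interface are illustrative.

```python
def ohms_law(voltage=None, current=None, resistance=None):
    """Solve V = I * R for whichever of the three quantities is omitted
    (volts, amps, ohms)."""
    if voltage is None:
        return current * resistance
    if current is None:
        return voltage / resistance
    if resistance is None:
        return voltage / current
    raise ValueError("leave exactly one quantity unspecified")
```

For example, a 12 V supply across a 4 Ω resistor drives a 3 A current.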
As the discipline of biomedical science continues to apply new technologies capable of producing unprecedented volumes of noisy and complex biological data, it has become evident that available methods for deriving meaningful information from such data are simply not keeping pace. In order to achieve useful results, researchers require methods that consolidate, store and query combinations of structured and unstructured data sets efficiently and effectively. As we move towards personalized medicine, the need to combine unstructured data, such as medical literature, with large amounts of highly structured and high-throughput data such as human variation or expression data from very large cohorts, is especially urgent. For our study, we investigated a likely biomedical query using the Hadoop framework. We ran queries using native MapReduce tools we developed as well as other open source and proprietary tools. Our results suggest that the available technologies within the Big Data domain can reduce the time and effort needed to utilize and apply distributed queries over large datasets in practical clinical applications in the life sciences domain. The methodologies and technologies discussed in this paper set the stage for a more detailed evaluation that investigates how various data structures and data models are best mapped to the proper computational framework. PMID:24312478
Mudunuri, Uma S; Khouja, Mohamad; Repetski, Stephen; Venkataraman, Girish; Che, Anney; Luke, Brian T; Girard, F Pascal; Stephens, Robert M
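The MapReduce programming model used in the study can be sketched in-process as follows; the gene-counting records are hypothetical, and this skeleton only stands in for, rather than reproduces, the Hadoop framework.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal in-process MapReduce: the mapper emits (key, value) pairs,
    pairs are shuffled into groups by key, and the reducer folds each group."""
    groups = defaultdict(list)
    for rec in records:
        for key, value in mapper(rec):       # map phase
            groups[key].append(value)        # shuffle: group by key
    return {key: reducer(key, values) for key, values in groups.items()}

# hypothetical example: count variant records per gene
records = [{"gene": "TP53"}, {"gene": "BRCA1"}, {"gene": "TP53"}]
counts = map_reduce(records,
                    mapper=lambda r: [(r["gene"], 1)],
                    reducer=lambda k, vs: sum(vs))
```

In Hadoop the same mapper/reducer pair runs in parallel across data partitions, which is what makes such queries scale to very large cohorts.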
Weak H2 physisorption energies present a significant challenge to even the best correlated theoretical many-body methods. We use the phaseless auxiliary-field quantum Monte Carlo method to accurately predict the binding energy of Ca+-4H2. Attention has recently focused on this model chemistry to test the reliability of electronic structure methods for H2 binding on dispersed alkaline earth metal centers. A modified Cholesky decomposition is implemented to realize the Hubbard-Stratonovich transformation efficiently with large Gaussian basis sets. We employ the largest correlation-consistent Gaussian-type basis sets available, up to cc-pCV5Z for Ca, to accurately extrapolate to the complete basis limit. The calculated potential energy curve exhibits binding with a double-well structure. PMID:22047226
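A pivoted (modified) Cholesky decomposition of the kind mentioned above factorises a positive semidefinite matrix into a small number of rank-1 terms, truncating once the residual falls below tolerance. A sketch, with the tolerance and interface as illustrative assumptions:

```python
import numpy as np

def pivoted_cholesky(M, tol=1e-10):
    """Pivoted Cholesky of a symmetric positive semidefinite matrix M,
    truncated when the largest residual diagonal drops below `tol`.
    Returns V such that M ≈ V @ V.T with few columns when M is low-rank."""
    M = M.astype(float).copy()
    vectors = []
    for _ in range(M.shape[0]):
        d = np.diag(M)
        p = int(np.argmax(d))          # pivot on the largest diagonal
        if d[p] < tol:
            break                      # remaining error is below tolerance
        L = M[:, p] / np.sqrt(d[p])    # next Cholesky vector
        vectors.append(L)
        M -= np.outer(L, L)            # subtract its rank-1 contribution
    return np.array(vectors).T
```

For near-low-rank matrices (such as two-electron integral matrices) the factor has far fewer columns than the matrix dimension, which is the compression the abstract exploits.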
Large-scale ship recognition in optical remote sensing images is of great importance for many military applications. It aims to recognize the category of detected ships for effective maritime surveillance. The contributions of the paper can be summarized as follows. First, based on rough set theory, the common discernibility degree is used to compute the significance weight of each candidate feature and to select valid recognition features automatically. Second, an RBF neural network is constructed from the selected recognition features. Experiments on recorded optical satellite images show the proposed method is effective and achieves better classification rates at higher speed than state-of-the-art methods.
Background We propose a sequence clustering algorithm and compare the partition quality and execution time of the proposed algorithm with those of a popular existing algorithm. The proposed clustering algorithm uses a grammar-based distance metric to determine partitioning for a set of biological sequences. The algorithm performs clustering in which new sequences are compared with cluster-representative sequences to determine membership. If comparison fails to identify a suitable cluster, a new cluster is created. Results The performance of the proposed algorithm is validated via comparison to the popular DNA/RNA sequence clustering approach, CD-HIT-EST, and to the recently developed algorithm, UCLUST, using two different sets of 16S rDNA sequences from 2,255 genera. The proposed algorithm maintains a comparable CPU execution time with that of CD-HIT-EST which is much slower than UCLUST, and has successfully generated clusters with higher statistical accuracy than both CD-HIT-EST and UCLUST. The validation results are especially striking for large datasets. Conclusions We introduce a fast and accurate clustering algorithm that relies on a grammar-based sequence distance. Its statistical clustering quality is validated by clustering large datasets containing 16S rDNA sequences.
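The representative-based clustering loop described above (compare each new sequence with cluster representatives; found a new cluster on failure) can be sketched as follows, with a simple positional identity score standing in for the grammar-based distance:

```python
def identity(a, b):
    """Fraction of identical positions between two sequences."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

def greedy_cluster(seqs, threshold=0.9):
    """Greedy representative clustering: each sequence joins the first
    cluster whose representative is similar enough, else founds a new one
    (sketch only; real tools use alignment- or grammar-based distances)."""
    clusters = []                        # list of (representative, members)
    for s in seqs:
        for rep, members in clusters:
            if identity(rep, s) >= threshold:
                members.append(s)
                break
        else:
            clusters.append((s, [s]))
    return clusters
```

Because each sequence is compared only with one representative per cluster, the cost is far below all-pairs comparison, at the price of order-dependent results.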
The technology revolution in image acquisition, instrumentation, and methods has resulted in vast data sets that far outstrip the human observers' ability to view, digest, and interpret modern medical images by using traditional methods. This may require a paradigm shift in the radiologic interpretation process. As human observers, radiologists must search for, detect, and interpret targets. Potential interventions should be based on an understanding of human perceptual and attentional abilities and limitations. New technologies and tools already in use in other fields can be adapted to the health care environment to improve medical image analysis, visualization, and navigation through large data sets. This historical psychophysical and technical review touches on a broad range of disciplines but focuses mainly on the analysis, visualization, and navigation of image data performed during the interpretive process. Advanced postprocessing, including three-dimensional image display, multimodality image fusion, quantitative measures, and incorporation of innovative human-machine interfaces, will likely be the future. Successful new paradigms will integrate image and nonimage data, incorporate workflow considerations, and be informed by evidence-based practices. This overview is meant to heighten the awareness of the complexities and limitations of how radiologists interact with images, particularly the large image sets generated today. Also addressed is how human-machine interface and informatics technologies could combine to transform the interpretation process in the future to achieve safer and better quality care for patients and a more efficient and effective work environment for radiologists. Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.11091276/-/DC1. PMID:21502391
Andriole, Katherine P; Wolfe, Jeremy M; Khorasani, Ramin; Treves, S Ted; Getty, David J; Jacobson, Francine L; Steigner, Michael L; Pan, John J; Sitek, Arkadiusz; Seltzer, Steven E
To help architectural students adapt to the realities of the work environment, Gerard Campbell of Holland College has set up his classroom as a design office. Working as a team, the students prepare a complete set of working drawings and construction documents, simulating an actual design process. (JOW)
The Jigsaw Classroom is a website for the jigsaw cooperative learning technique that strives to reduce racial conflict, promote better learning, improve motivation, and increase the enjoyment of the learning experience among school children. The website includes an overview of jigsaw techniques, jigsaw history, implementing tips, related books and articles, and links on cooperative learning, school violence, and jigsaw developer Elliot Aronson.
As a technician for the Continuing Education department at Confederation College, the author was approached by an Academic Support Strategist from the college's Learning Centre who was looking for a solution for one of her students. She was working with a hard-of-hearing student, and at the time, they were sitting together in the classrooms, sharing a…
In this issue's "Classroom Notes" section, the following papers are discussed: (1) "Constructing a line segment whose length is equal to the measure of a given angle" (W. Jacob and T. J. Osler); (2) "Generating functions for the powers of Fibonacci sequences" (D. Terrana and H. Chen); (3) "Evaluation of mean and variance integrals without…
International Journal of Mathematical Education in Science and Technology, 2007
As part of the Australian Antarctic Division, Classroom Antarctica offers dozens of downloadable Adobe Acrobat files that allow students to discover this unique continent. Subjects include the history of the scientific research undertaken on Antarctica, surviving its climate, its biological ecosystem, the land's physical characteristics and their effects on climate, and much more.
What makes a classroom "smart"? Presentation technologies such as projectors, document cameras, and LCD panels clearly fit the bill, but when considering other technologies for teaching, learning, and developing content, the possibilities become limited only by the boundaries of an institution's innovation. This article presents 32 best practices…
To more effectively instruct the entire class, teachers of students with emotional behavioral disorders (EBD) often choose to send students who display inappropriate behavior out of the room. A multiple baseline across settings was used to evaluate the effects of increasing teacher positive verbal reinforcement on the amount of time 2 students…
This 43-minute VHS videotape is designed to be used in course and workshop settings with "A Biological Brain in a Cultural Classroom: Applying Biological Research to Classroom Management." The videotape's principal values are as an introduction to the issues explored in the book and as a catalyst for group discussions and activities related to…
Comparative genomics is a powerful means to gain insight into the evolutionary processes that shape the genomes of related species. As the number of sequenced genomes increases, the development of software to perform accurate cross-species analyses becomes indispensable. However, many implementations that have the ability to compare multiple genomes exhibit unfavorable computational and memory requirements, limiting the number of genomes that can be analyzed in one run. Here, we present a software package to unveil genomic homology based on the identification of conservation of gene content and gene order (collinearity), i-ADHoRe 3.0, and its application to eukaryotic genomes. The use of efficient algorithms and support for parallel computing enable the analysis of large-scale data sets. Unlike other tools, i-ADHoRe can process the Ensembl data set, containing 49 species, in 1 h. Furthermore, the profile search is more sensitive in detecting degenerate genomic homology than chaining pairwise collinearity information based on transitive homology. From ultra-conserved collinear regions between mammals and birds, by integrating coexpression information and protein–protein interactions, we identified more than 400 regions in the human genome showing significant functional coherence. These algorithmic improvements ensure that i-ADHoRe 3.0 will remain a powerful tool to study genome evolution.
Proost, Sebastian; Fostier, Jan; De Witte, Dieter; Dhoedt, Bart; Demeester, Piet; Van de Peer, Yves; Vandepoele, Klaas
3D global electromagnetic hybrid (fluid electrons, kinetic ions) simulations have long been considered the holy grail in kinetic modeling of the magnetosphere, but high computational requirements have kept them out of reach. Petascale computers provide the computational power to make such simulations possible, but petascale computing poses two technical challenges. One is related to the development of efficient and scalable algorithms that can take advantage of the large number of cores. The second is related to knowledge extraction from the resulting simulation output. The challenge of science discovery from the extremely large data sets (~200 TB from a single run) generated by global kinetic simulations is compounded by the multivariate and "noisy" nature of the data. Here, we review our innovations to overcome both challenges. We have developed a highly scalable hybrid simulation code (H3D) that we used to perform the first petascale global kinetic simulation of the magnetosphere using 98,304 cores on the NSF Kraken supercomputer. To facilitate analysis of data from such runs, we have developed a complex visualization pipeline, including physics-based algorithms to detect and track events of interest in the data. The effectiveness of this approach is illustrated through examples.
Karimabadi, H.; Loring, B.; Vu, H. X.; Omelchenko, Y.; Tatineni, M.; Majumdar, A.; Ayachit, U.; Geveci, B.
The problem of finding a simple, generally applicable description of worldwide measured ambient dose equivalent rates at aviation altitudes between 8 and 12 km is difficult to solve due to the large variety of functional forms and parametrisations that are possible. We present an approach that uses Bayesian statistics and Monte Carlo methods to fit mathematical models to a large set of data and to compare the different models. About 2500 data points measured in the periods 1997-1999 and 2003-2006 were used. Since the data cover wide ranges of barometric altitude, vertical cut-off rigidity and phases in the solar cycle 23, we developed functions which depend on these three variables. Whereas the dependence on the vertical cut-off rigidity is described by an exponential, the dependences on barometric altitude and solar activity may be approximated by linear functions in the ranges under consideration. Therefore, a simple Taylor expansion was used to define different models and to investigate the relevance of the different expansion coefficients. With the method presented here, it is possible to obtain probability distributions for each expansion coefficient and thus to extract reliable uncertainties even for the dose rate evaluated. The resulting function agrees well with new measurements made at fixed geographic positions and during long haul flights covering a wide range of latitudes. PMID:20826891
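A minimal sketch of the functional form described here — linear in barometric altitude and solar activity, exponential in vertical cut-off rigidity — is shown below. The coefficients `c0..c3` and the solar-activity proxy `w` are hypothetical; the paper fits such coefficients with Bayesian/Monte Carlo methods, which are not reproduced.

```python
import math

def dose_rate(h, rc, w, c):
    # Illustrative model: ambient dose equivalent rate as a
    # function of barometric altitude h, vertical cut-off
    # rigidity rc, and a solar-activity proxy w.
    # Linear terms from a first-order Taylor expansion in h and w;
    # exponential decay with cut-off rigidity.
    c0, c1, c2, c3 = c
    return (c0 + c1 * h + c2 * w) * math.exp(-c3 * rc)
```

With this form, the dose rate rises with altitude and falls toward the geomagnetic equator (large cut-off rigidity), in line with the qualitative behavior the abstract describes.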
Behavior settings as control systems of behavior were investigated in two first grade classes in the Tucson Early Education Program (TEEM) experimental school. The two classrooms which served as the experimental units were observed during two behavior settings, individual choice and large group. The observation consisted of recording every 10…
This paper presents the position that inclusion is limited; inclusion does not go far enough. The inclusive classroom has been assessed to be of benefit both to the teacher and student. There are, however, limits set on inclusion. In most classrooms only children with learning disability are included omitting those with severe disabilities,…
This article explores the complex relationship between language and mathematics education in multilingual settings by presenting an analysis of one lesson from a multilingual primary mathematics classroom in South Africa taught by an appropriately qualified and experienced teacher. English emerged as a dominant language in this classroom, and this…
Describes ways teachers can enhance students' vocabulary development through multiple contexts available in typical middle school classroom settings. Addresses questions about vocabulary learning and offers suggestions for enhancing vocabulary with narrative and expository texts that involve multiple classroom contexts. Considers the Vocab-o-gram…
Good classroom management is one of the most important goals teachers strive to establish from the first day of class. The rules, procedures, activities, and behaviors set the classroom tone throughout the school year. By revising, updating, and systemizing classroom management activities, teachers can eliminate many problems created by students…
The purpose of the study was to examine the contributions of classroom context (activity settings, teacher behavior, contact with peers and teachers) to children's cognitive and social competence in early childhood classrooms. One hundred fourteen children (61 girls; mean age = 51.7 months) were observed in their early childhood classrooms during free play time. Children's cognitive and social competence were measured by
Research on multilingual classrooms usually focuses on contexts where both teachers and pupils share the same linguistic repertoire; what can be called "symmetrical" multilingual classrooms. This paper sets out to investigate whether (and how) pupils' multilingual resources can be used in classrooms where the teacher does not share pupils'…
The 38 essays in this book look back at language experience as an educational approach, provide practical classroom applications, and reconceptualize language experience as an overarching education process. Classroom teachers and reading specialists describe strategies in use in a variety of classroom settings and describe ways to integrate…
Suggests using crickets for classroom activities, providing background information on their anatomy and reproduction and tips on keeping individual organisms or a breeding colony in the classroom. (JN)
This study examined the effects of two types of course delivery systems (learning community classroom environments versus stand-alone classroom environments) on the achievement of students who were simultaneously enrolled in remedial and college-level social science courses at an inner city open-enrollment public community college. This study was…
Large rhyolitic ignimbrite occurrences are closely connected to the Early Miocene initiation of extensional processes in central-west Anatolia along the Tavşanlı-Afyon zones. Field correlations and petrographical, geochemical and geochronological data lead to a substantial reinterpretation of the ignimbrite surrounding the Kırka area, known for its world-class borate deposits, as representing the climactic event of a caldera collapse, unrecognized until now and newly named the "Kırka-Phrigian caldera". The caldera, roughly oval (24 km × 15 km) in shape and one of the largest in Turkey, is thought to have formed in a single-stage collapse event at ~19 Ma that generated huge-volume extracaldera outflow ignimbrites. Transtensive/distensive tectonic stresses since ~25 Ma resulted in the NNW-SSE elongation of the magma chamber and influenced the roughly elliptical shape of the subsided block (caldera floor) at the apex of the Eskişehir-Afyon-Isparta volcanic area. Intracaldera post-collapse sedimentation and volcanism (at ~18 Ma) were controlled by subsidence-related faults, with generation of a series of volcanic structures (mainly domes) showing a large compositional range from saturated silicic rhyolites and crystal-rich trachytes to undersaturated lamproites. Such a volcanic rock association is typical of lithospheric extension. In this scenario, enriched mantle components within the subcontinental lithospheric mantle begin to melt via decompression during the initiation of extension. Interaction of these melts with crustal rocks, fractionation processes and crustal anatexis driven by the heat of the ascending mantle melts produced the silicic compositions in a large crustal reservoir. Such silicic melts generated the initial eruptions of the Kırka-Phrigian caldera ignimbrites.
Rock volume and geochemical evidence suggest that the silicic volcanic rocks come from a long-lived magma chamber that evolved episodically; after caldera generation there is a shift to small-volume episodic rhyolitic, trachytic and lamproitic volcanism, the latter indicating a more primitive magma input with an evident origin in an enriched mantle lithosphere. The volcanic rock succession provides a direct picture of the state of the magmatic system at the time of the eruptions that generated the caldera and post-caldera structures, and offers an excellent example of silicic magma generation and associated potassic and ultrapotassic intermediate-mafic rocks in a post-collisional extensional setting.
Investigated children's classroom sociometry and size of their best-friend networks. For both classroom and playground settings, popular children had the most reciprocal best friends, while rejected children had the fewest, but had more on the playground than in the classroom. Results suggest that constraints and opportunities of different…
Explains how teachers can take advantage of the Internet to create classroom projects. The process involves locating collaborating partners via the World Wide Web, then determining which projects would be worthwhile to students. Presents guidelines for organizing a classroom project, discusses classroom considerations, examines teacher…
The goal of the current study was to examine the predictive relationships among a set of cognitive-motivational variables that have been found in previous studies to support academic achievement. Student perception of a classroom's achievement goal structure (classroom mastery, classroom performance-approach, classroom performance-avoidance) was…
Scholars conducting research in classrooms face a myriad of ethical issues somewhat unique to the educational setting. While the Code of Federal Regulations (45 CFR 46) generally provides that educational research be classified as exempt from review by Institutional Review Boards, those same regulations provide a host of special conditions under…
The purpose of this article is to describe an applied method of assessing and manipulating environmental factors influencing student behavior. The assessment procedure is called structural analysis (SA) and can be a part of a functional behavioral assessment (FBA) process or a stand-alone set of procedures for teachers to use in their classrooms.…
This research explored the effects of a constructivist approach using computer-projected simulations (CPS) and interactive engagement (IE) methods on 12th grade school students. The treatment lasted 18 weeks during the 1999-2000 fall semester and sought to evaluate three aspects of students' learning: (1) conceptual understanding of Newtonian mechanics, as measured by the Force Concept Inventory (FCI); (2) modification of their views about science, as measured by the Views About Science Survey (VASS); and (3) achievement on traditional examinations, as measured by their end-of-semester grades. Analysis of Covariance (ANCOVA) was applied to determine the differences between the mean scores of the experimental group students and students of the control group, who were exposed to traditional teaching methods only. The FCI data analysis showed that, after 18 weeks, conceptual understanding of Newtonian mechanics had markedly improved only in the experimental group (F(1,99) = 44.739, p < .001). By contrast, there was no statistically significant difference in students' performance on the VASS instrument between the groups (F(1,99) = .033, p = .856), confirming previous and comparable findings for studies of short implementation period. The lack of a statistically significant difference between the control and experimental groups in graded achievement, while controlling for students' previous achievement, was unexpected (F(1,99) = 1.178, p = .280). It is suggested that in this particular setting, the influence of a technical factor may have been overlooked: the monitored and systematic drill exercises using elaborate math formulae to prepare students for traditional math-loaded exams. Still, despite being intentionally deprived of such preparation throughout the study, students of the experimental group did not achieve less than their counterparts, and in addition, they gained a satisfactory understanding of Newtonian mechanics.
This result points to a positive correlation between a better grasp of basic physics concepts in a challenging, active-engagement environment and unproblematic achievement in traditional exams. Despite the modest sample size of the studied groups, students here, as elsewhere in the world, show a manifest readiness and capacity to master a proper understanding of Newtonian mechanics when guided by IE methods in a constructivist, semi-Socratic environment.
This web page contains an in-depth tutorial on the properties and behavior of waves, suitable for beginning students of physics. It features multiple animations to accompany the text, plus self-guided question-and-answer sets. Included is a discussion of the nature and anatomy of waves, how wave energy is transported, and an overview of standing waves and harmonic patterns. This item is part of The Physics Classroom, a web-based collection of tutorials for high school physics students.
So-called "radical" and "critical" pedagogy seems to be everywhere these days on the landscapes of geographical teaching praxis and theory. Part of the remit of radical/critical pedagogy involves a de-centring of the traditional "banking" method of pedagogical praxis. Yet, how do we challenge this "banking" model of knowledge transmission in both a…
Here we present diverse examples where empirical mining and statistical analysis of large data sets have already been shown to be useful for a wide variety of practical decision-making problems within the realm of large-scale ecology. Because a full understanding and appreciation of particular ecological phenomena are possible only after hypothesis-directed research regarding the existence and nature of that process, some ecologists may feel that purely empirical data harvesting may represent a less-than-satisfactory approach. Restricting ourselves exclusively to process-driven approaches, however, may actually slow progress, particularly for more complex or subtle ecological processes. We may not be able to afford the delays caused by such directed approaches. Rather than attempting to formulate and ask every relevant question correctly, empirical methods allow trends, relationships and associations to emerge freely from the data themselves, unencumbered by a priori theories, ideas and prejudices that have been imposed upon them. Although they cannot directly demonstrate causality, empirical methods can be extremely efficient at uncovering strong correlations with intermediate "linking" variables. In practice, these correlative structures and linking variables, once identified, may provide sufficient predictive power to be useful themselves. Such correlation "shadows" of causation can be harnessed by, e.g., Bayesian Belief Nets, which bias ecological management decisions, made with incomplete information, toward favorable outcomes. Empirical data-harvesting also generates a myriad of testable hypotheses regarding processes, some of which may even be correct. 
Quantitative statistical regionalizations based on multivariate similarity have lent insights into carbon eddy-flux direction and magnitude, wildfire biophysical conditions, phenological ecoregions useful for vegetation type mapping and monitoring, forest disease risk maps (e.g., sudden oak death), global aquatic ecoregion risk maps for aquatic invasives, and forest vertical structure ecoregions (e.g., using extensive LiDAR data sets). Multivariate Spatio-Temporal Clustering, which quantitatively places alternative future conditions on a common footing with present conditions, allows prediction of present and future shifts in tree species ranges, given alternative climatic change forecasts. ForWarn, a forest disturbance detection and monitoring system mining 12 years of national 8-day MODIS phenology data, has been operating since 2010, producing national maps every 8 days showing many kinds of potential forest disturbances. Forest resource managers can view disturbance maps via a web-based viewer, and alerts are issued when particular forest disturbances are seen. Regression-based decadal trend analysis showing long-term forest thrive and decline areas, and individual-based, brute-force supercomputing to map potential movement corridors and migration routes across landscapes will also be discussed. As significant ecological changes occur with increasing rapidity, such empirical data-mining approaches may be the most efficient means to help land managers find the best, most-actionable policies and decision strategies.
Hargrove, W. W.; Hoffman, F. M.; Kumar, J.; Spruce, J.; Norman, S. P.
As is observed in both experiment and theory, in the absence of hydrothermal convection, the majority of magma chamber heat loss occurs via conduction through the roof of the intrusion and into the cold country rock above. The formation of an upper solidification front (or Upper Border Series, UBS), recorded in the rocks both geochemically and texturally, is a natural outcome of the progression of the solidification front from the cold roof to the hot center of the magma chamber. There are, however, a few unique layered mafic intrusions for which little or no UBS exists. In this study, I examine the thermal evolution and crystallization rates of several classic layered intrusions as recorded in the extent of the preserved UBS. For those intrusions that have experienced crystallization at the roof, such as the Skaergaard Intrusion, the development of a UBS reduces the temperature gradient at the roof and effectively slows the rate of heat loss from the main magma body. However, for those intrusions that do not have a UBS, such as the Bushveld Complex, the cooling rate is controlled only by the maximum rate of conductive heat loss through the overlying roof rocks, which decreases with time. The implications are two-fold: (1) The relative thickness of the UBS in large intrusions may be the key to quantifying their cooling and solidification rates; and (2) The nature of the magma mush zone near the roof of an intrusion may depend principally on the long-term thermal evolution of the magma body. Particularly at the end stages of crystallization, when the liquids are likely to be highly evolved and high viscosities may inhibit convection, intrusions lacking a well-defined UBS may provide important insights into the mechanics of crystal-liquid separation, melt extraction, and compaction in felsic plutons as well as mafic intrusions. These results are important for long-lived (>500 kyr) or repeatedly replenished magma chambers in all tectonic settings.
Recent research in classroom discipline tends to show that discipline is a by-product of effective instruction and classroom management. The five publications reviewed in this annotated bibliography explore aspects of the complex classroom environment that relate to student discipline. Walter Doyle's chapter on "Classroom Organization and…
In learning-centered classrooms, the emphasis of classroom management shifts from maintaining behavioral control to fostering student engagement and self-regulation as well as community responsibility. This brief describes classroom management in "learning centered" classrooms, where practices are consistent with recent research knowledge about…
National Education Association Research Department, 2006
Relations between early problem behavior in preschool classrooms and a comprehensive set of school readiness outcomes were examined for a stratified random sample (N = 256) of 4-year-old children enrolled in a large, urban school district Head Start program. A series of multilevel models examined the unique contribution of early problem behavior…
Bulotsky-Shearer, Rebecca J.; Fernandez, Veronica; Dominguez, Ximena; Rouse, Heather L.
Sea spray aerosols (SSA) are an important part of the climate system through their effects on the global radiative budget, both directly as scatterers and absorbers of solar and terrestrial radiation, and indirectly as cloud condensation nuclei (CCN) influencing cloud formation, lifetime and precipitation. In terms of their global mass, SSA have the largest uncertainty of all aerosols. In this study we review 21 SSA source functions from the literature, several of which are used in current climate models, and we also propose a new function. Even excluding outliers, the global annual SSA mass produced by these source functions spans roughly 3-70 Pg yr⁻¹, with relatively little interannual variability for a given function. The FLEXPART Lagrangian model was run in backward mode for a large global set of observed SSA concentrations, comprising several station networks and ship cruise measurement campaigns. FLEXPART backward calculations produce gridded emission sensitivity fields, which can subsequently be multiplied with gridded SSA production fluxes to obtain modeled SSA concentrations. This allowed all 21 source functions to be evaluated efficiently against the measurements at the same time. Another advantage of this method is that source-region information on wind speed and sea surface temperature (SST) could be stored and used to improve the SSA source function parameterizations. The best source functions reproduced as much as 70% of the observed SSA concentration variability at several stations, which is comparable with state-of-the-art aerosol models. The main driver of SSA production is wind, and we found that the best fit to the observation data could be obtained when the SSA production is proportional to U10^3.5, where U10 is the 10 m wind speed averaged over the source region.
A strong influence of SST on SSA production could be detected as well, although the underlying physical mechanisms of the SST influence remain unclear. Our new source function gives a global SSA production for particles smaller than 10 µm of 9 Pg yr⁻¹ and provides the best fit to the observed concentrations.
Grythe, H.; Ström, J.; Krejci, R.; Quinn, P.; Stohl, A.
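The wind-speed dependence reported in the abstract above (SSA production proportional to U10^3.5) can be sketched as a one-line source function. The prefactor `a` and its units are placeholders, not the paper's fitted values, and the SST dependence is omitted.

```python
def ssa_mass_flux(u10, a=1.0e-9, exponent=3.5):
    # Wind-driven sea spray mass flux per unit area and time,
    # proportional to the 10 m wind speed raised to the 3.5 power.
    # The constant a (and its units) is a hypothetical placeholder.
    return a * u10 ** exponent
```

The steep exponent means that doubling the wind speed increases the flux by a factor of 2^3.5 (about 11), which is why storm regions dominate the global SSA budget in such parameterizations.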
The literature on college examination practices is relatively sparse, which is surprising given their importance for student learning and motivation. The authors investigated the examination practices of 215 faculty at a comprehensive university in order to identify which practices are normative and to examine relationships among key variables.…
Pivotal response teaching (PRT) is an empirically supported naturalistic behavioral intervention proven to be efficacious in the education of children with autism. This intervention involves loosely structured learning environments, teaching during ongoing interactions between student and teacher, child initiation of teaching episodes, child…
The Paperless Classroom project explored the use of advanced technology to design, develop, and demonstrate an interface to large amounts of technical information available in electronic form. Instructional materials and graphics were successfully authore...
A large urban school district contracted with a private nonprofit educational foundation to train 126 special education resource teachers over the last three years in an Orton-Gillingham-based program. These teachers are currently teaching learning-disabled students in groups of 8-10 at the elementary level and 10-13 students at the secondary level. Learning-disabled students who qualify for Special Education, either in reading or spelling, or both, are receiving the instruction. The teachers took a Basic Introductory Class (90 hours of Advanced Academic Credit offered by the Texas Education Agency, or six hours of graduate credit at a local university) in order to teach the program in the resource setting. A two-year Advanced Training included annual on-site observations, two half-day workshops each fall and spring, and a two-day advanced workshop in the second summer. First grade teachers, one selected from each of the 164 campuses, supervisors, and principals attended a 25-hour course on "Recognizing Dyslexia: Using Multisensory Teaching and Discovery Techniques." The first grade teachers and special education resource teachers collaborated to provide inservice training for their colleagues. Research, conducted by the district's Research Department, reveals statistically significant gains in reading and spelling ability for the learning-disabled resource students as measured by the Woodcock Reading Mastery Test-Revised and the Test of Written Spelling. PMID:24233627
The purpose of this study was to compare the efficacy and safety of extended-release dexmethylphenidate (d-MPH-ER) to those of d,l-MPH-ER and placebo in children with attention-deficit/hyperactivity disorder (ADHD) in a laboratory classroom setting. This multicenter, double-blind, crossover study randomized 82 children, 6 to 12 years of age, stabilized on a total daily dose to the nearest equivalent of 40 to 60 mg of d,l-MPH or 20 or 30 mg/day of d-MPH. Patients participated in a screening day and a practice day, and were randomized to 1 of 10 sequences of all five treatments in five separate periods. Treatments included d-MPH-ER (20 mg/day), d-MPH-ER (30 mg/day), d,l-MPH-ER (36 mg/day), d,l-MPH-ER (54 mg/day), and placebo. Primary efficacy was measured by the change from predose on the Swanson, Kotkin, Agler, M-Flynn, and Pelham (SKAMP) Rating Scale-Combined scores at 2-h postdose during the 12-h laboratory assessment (d-MPH-ER 20 mg/day vs. d,l-MPH-ER 36 mg/day). Adverse events were monitored throughout the study period. d-MPH-ER (20 mg/day) was significantly more effective than d,l-MPH-ER (36 mg/day) on the primary efficacy variable, change from predose to 2-h postdose in SKAMP-combined score. In general, d-MPH-ER had an earlier onset of action than d,l-MPH-ER, while d,l-MPH-ER had a stronger effect at 12-h postdose. No serious adverse events were reported. Treatment with either agent was associated with significant improvements in ADHD symptoms. d-MPH-ER and d,l-MPH-ER can be differentiated by the part of the day during which each is more effective. PMID:18362868
In the current work, three-dimensional QSAR studies for one large set of quinazoline type epidermal growth factor receptor (EGF-R) inhibitors were conducted using two types of molecular field analysis techniques: comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). These compounds belonging to six different structural classes were randomly divided into a training set of 122 compounds
Several popular machine learning methods--Associative Neural Networks (ASNN), Support Vector Machines (SVM), k Nearest Neighbors (kNN), a modified version of partial least-squares analysis (PLSM), backpropagation neural networks (BPNN), and Multiple Linear Regression Analysis (MLR)--implemented in the ISIDA, NASAWIN, and VCCLAB software have been used to perform QSPR modeling of the melting points of a structurally diverse data set of 717 bromides of nitrogen-containing organic cations (FULL), including 126 pyridinium bromides (PYR), 384 imidazolium and benzoimidazolium bromides (IMZ), and 207 quaternary ammonium bromides (QUAT). Several types of descriptors were tested: E-state indices, counts of atoms determined for E-state atom types, molecular descriptors generated by the DRAGON program, and different types of substructural molecular fragments. The predictive ability of the models was analyzed using a 5-fold external cross-validation procedure in which every compound in the parent set was included in one of five test sets. Among the 16 types of developed structure-melting point models, the nonlinear SVM, ASNN, and BPNN techniques demonstrate slightly better performance than the other methods. For the full set, the accuracy of predictions does not change significantly as a function of the type of descriptors. For the other sets, the performance of the descriptors varies as a function of the method and data set used. The root-mean-square error (RMSE) of prediction calculated on independent test sets is in the range of 37.5-46.4 degrees C (FULL), 26.2-34.8 degrees C (PYR), 38.8-45.9 degrees C (IMZ), and 34.2-49.3 degrees C (QUAT). The moderate accuracy of the predictions can be related to the quality of the experimental data used for obtaining the models, as well as to the difficulty of accounting for the structural features of ionic liquids in the solid state (polymorphic effects, eutectics, glass formation). PMID:17381081
Varnek, Alexandre; Kireeva, Natalia; Tetko, Igor V; Baskin, Igor I; Solov'ev, Vitaly P
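The 5-fold external cross-validation protocol described above, in which every compound lands in exactly one test set and error is computed on the pooled out-of-fold predictions, can be sketched as follows. This is a minimal illustration using ordinary least squares as a stand-in model; the descriptors, data, and the actual ISIDA/NASAWIN/VCCLAB implementations are not reproduced here.

```python
import numpy as np

def external_cv_rmse(X, y, n_folds=5, seed=0):
    """5-fold external cross-validation: every sample appears in exactly
    one test fold; RMSE is computed on the pooled out-of-fold predictions."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, n_folds)
    preds = np.empty_like(y, dtype=float)
    for test in folds:
        train = np.setdiff1d(idx, test)
        # multiple linear regression (least squares, with intercept) as the model
        Xb = np.column_stack([X[train], np.ones(len(train))])
        coef, *_ = np.linalg.lstsq(Xb, y[train], rcond=None)
        preds[test] = np.column_stack([X[test], np.ones(len(test))]) @ coef
    return float(np.sqrt(np.mean((preds - y) ** 2)))
```

Any of the learners named in the abstract could be dropped in place of the least-squares step; the fold bookkeeping is what the "external" protocol prescribes.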
The problem of finding a simple, generally applicable description of worldwide measured ambient dose equivalent rates at aviation altitudes between 8 and 12 km is difficult to solve due to the large variety of functional forms and parametrisations that are possible. We present an approach that uses Bayesian statistics and Monte Carlo methods to fit mathematical models to a large
This article documents tagging as one of several informal literacy practices used by newcomer Mexican youth in a Midwest school and classroom setting. Specifically, it details how tagging travels into the classroom. Using the tool of interactional ethnography to analyze videotaped classroom observation data of an English Learner Science setting, I…
Large-scale, outdoor microcosms were used to study the fate of anthracene, a polycyclic aromatic hydrocarbon, in the aquatic environment. The study provides a data set for describing the disposition of anthracene in the water and aufwuchs of the microcosms for the purpose of comp...
This dissertation explores the use of data collected at broadband seismic stations to investigate the source processes of large strike-slip earthquakes and the crust and upper mantle structure within active continental tectonic settings. First, we analyzed the rupture mechanisms of the 2002 Denali earthquake (M = 7.9) and the 2001 Kunlun earthquake (M = 7.8) using teleseismic P waveforms. According to our results,
Four tips for use in the English-as-a-Second-Language classroom are highlighted: Mr. Bean in the Classroom; Defining Your Future; Coin Questions; Our Futures: Simple, Progressive, and Perfect. (Author/VWL)
Epstein, Jim; Ashcraft, Nikki; Clarke, Paul M.; Wolf, Grant S.
Discusses schema theory in relation to the language classroom. Argues that as teachers themselves are former learners, the schemata they have developed both inside and outside the classroom will provide them assumptions about how people learn. (Author/VWL)
Describes the following uses for a video camera in the science classroom: video presentations, microscope work, taping and/or monitoring experiments, analyzing everyday phenomena, lesson enhancement, field trip alternative, and classroom management. (PR)
'Connecting Classrooms to the Milky Way' is a project of the EU-HOU Consortium (Hands-On-Universe, Europe), involving 11 European countries. It is supported by the Lifelong Learning Programme of the European Community. The main goal of this project was to set up the first network of small radio-telescopes dedicated to education all around Europe and directly accessible from a simple Web interface. Any classroom connected to the Internet via any Web browser can now remotely control one of the radio-telescopes and observe the HI emission coming from our Galaxy. The interface also provides the users with simple tools to analyse the data: (i) derive the Milky Way rotation curve and (ii) map the HI distribution in the spiral arms. A special emphasis has been made on enabling the young generation to understand the challenges of these wavelengths, which are currently at the frontline of new instruments with the development of the ALMA (Atacama Large Millimeter Array) and SKA (Square Kilometre Array) projects.
Salomé, P.; Radiguet, A.; Albert, B.; Batrung, M.; Caillat, M.; Gheudin, M.; Libert, Y.; Ferlet, R.; Maestrini, A.; Melchior, A.-L.; Munier, J.-M.; Rudolph, A.
Based on a recursive process of reducing the entropy, the general decision tree classifier with overlap has been analyzed. Several theorems have been proposed and proved. When the number of pattern classes is very large, the theorems can reveal both the advantages of a tree classifier and the main difficulties in its implementation. Suppose H is Shannon's entropy measure of
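The entropy-reduction criterion that drives such a tree classifier can be illustrated with a short sketch; the helper functions and the toy labels below are purely illustrative, not the notation of the paper.

```python
import math
from collections import Counter

def shannon_entropy(labels):
    """Shannon entropy H = -sum p_i * log2(p_i) of a class-label sample."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, split):
    """Entropy reduction achieved by partitioning `labels` into the
    sublists in `split` -- the quantity a tree node tries to maximize."""
    n = len(labels)
    return shannon_entropy(labels) - sum(
        len(part) / n * shannon_entropy(part) for part in split)
```

A perfect split of a balanced two-class sample reduces the entropy from 1 bit to 0, i.e. an information gain of 1.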
We study the phenomenon of nonlinear stochastic resonance (SR) in a complex noisy system formed by a finite number of interacting subunits driven by rectangular pulsed time periodic forces. We find that very large SR gains are obtained for subthreshold driving forces with frequencies much larger than the values observed in simpler one-dimensional systems. These effects are explained using simple considerations.
Cubero, David; Casado-Pascual, Jesús; Gómez-Ordóñez, José; Casado, José Manuel; Morillo, Manuel
Classroom management is defined as procedures for arranging the classroom environment so that children learn what the teacher wants to teach them in the healthiest and most effective way possible. The Southwestern Cooperative Educational Laboratory presents a discussion of these procedures as they relate to social controls and components of…
This series captures the dynamics of the contemporary ESOL classroom. It showcases state-of-the-art curricula, materials, tasks, and activities reflecting emerging trends in language education and seeks to build localized language teaching and learning theories based on teachers' and students' unique experiences in and beyond the classroom. Each…
The present paper analyses and evaluates spoken discourse in the bilingual classroom at Damascus University. It looks at the mechanism of classroom interaction: the use of questions, initiations, repetitions and expansions. Although this paper deals with classroom interaction at Damascus University, it is believed that the results arrived at may…
In this article, the author shares the strategy she adopted to even out the participation among her multicultural students during their classroom discussions. The author realized that her students had different concepts about the classroom and different philosophies about competition. For the Americans and Indians, the classroom was a site of…
Discusses how classrooms are distributed by size on a campus, how well they are used, and how their use changes with faculty and student needs and desires. Details how to analyze classroom space, use, and utilization, taking into account such factors as scheduling and classroom stations. (EV)
A search for cosmic-ray magnetic monopoles has been conducted using a fully coincident superconducting induction detector consisting of six independent high-order gradiometer coils forming the surfaces of a rectangular parallelepiped. The detector had an effective area for isotropic flux averaged over 4 pi sr of 1.0 sq m. Data have been collected from October 1986 to January 1989 with an accumulated live time of 13,410 h. No monopole candidate events were seen, setting a new lower monopole-flux limit for induction detectors of 3.8 x 10^-13 /sq cm/s/sr at the 90 percent confidence level.
Bermon, S.; Chi, C. C.; Tsuei, C. C.; Rozen, J. R.; Chaudhari, P.
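The quoted flux limit follows from standard Poisson statistics for zero observed events: the 90% CL upper bound corresponds to -ln(1 - 0.90) ≈ 2.3 expected events spread over the accumulated exposure. A minimal sketch using only the numbers given in the abstract:

```python
import math

def poisson_flux_limit(area_cm2, solid_angle_sr, live_hours, cl=0.90):
    """Upper limit on an isotropic flux when zero candidate events are
    observed: N_up = -ln(1 - CL) Poisson events divided by the exposure
    (effective area x solid angle x live time)."""
    n_up = -math.log(1.0 - cl)          # 2.303 events for CL = 0.90
    exposure = area_cm2 * solid_angle_sr * live_hours * 3600.0
    return n_up / exposure              # flux in cm^-2 s^-1 sr^-1

# Detector in the abstract: 1.0 sq m effective area averaged over 4 pi sr,
# 13,410 h of live time -> ~3.8e-13 /sq cm/s/sr
limit = poisson_flux_limit(1.0e4, 4 * math.pi, 13410)
```

Plugging in the abstract's exposure reproduces the published 3.8 x 10^-13 /sq cm/s/sr limit.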
The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. The analysis methods used to evaluate this unusually large amount of shock data will be discussed in this paper, with particular emphasis on population distributions and finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed a derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level thus contributing to the overall success of the EOS Terra mission. The effectiveness of the statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample size of shock data is demonstrated in this paper.
Advances in portable X-ray fluorescence (pXRF) analytical technology have made it possible for high-quality, quantitative data to be collected in a fraction of the time required by standard, non-portable analytical techniques. Not only do these advances reduce analysis time, but data may also be collected in the field in conjunction with sampling. Rhyolitic pumice, being primarily glass, is an excellent material to analyze with this technology. High-quality, quantitative data for elements that are tracers of magmatic differentiation (e.g. Rb, Sr, Y, Nb) can be collected for whole, individual pumices and subsamples of larger pumices in 4 minutes. We have developed a calibration for powdered rhyolite pumice from the Otowi Member of the Bandelier Tuff analyzed with the Bruker Tracer IV pXRF using Bruker software and influence coefficients for pumice, which measures the following 19 oxides and elements: SiO2, TiO2, Al2O3, FeO*, MnO, CaO, K2O, P2O5, Zn, Ga, Rb, Sr, Y, Zr, Nb, Ba, Ce, Pb, and Th. With this calibration for the pXRF and thousands of individual powdered pumice samples, we have generated a data set unparalleled for any single eruptive unit with known trace element zonation. The Bandelier Tuff of the Valles-Toledo Caldera Complex, Jemez Mountains, New Mexico, is divided into three main eruptive events. For this study, we have chosen the 1.61 Ma, 450 km3 Otowi Member, as it is primarily unwelded and pumice samples are easily accessible. The eruption began with a plinian phase from a single source located near the center of the current caldera and deposited the Guaje Pumice Bed. The initial Unit A of the Guaje is geochemically monotonous, but Units B through E, co-deposited with ignimbrite, show very strong chemical zonation in trace elements, progressing upwards through the deposits from highly differentiated compositions (Rb ~350 ppm, Nb ~200 ppm) to less differentiated (Rb ~100 ppm, Nb ~50 ppm).
Co-erupted ignimbrites emplaced during column collapse show similar trace element zonation. The eruption culminated in caldera collapse after transitioning from a single central vent to ring fracture vents. Ignimbrites deposited at this time have lithic breccias and chaotic geochemical profiles. The geochemical discrepancy between early and late deposits warrants detailed, high-resolution sampling and analysis in order to fully understand the dynamics behind zonation processes. Samples were collected from locations that circumvent the caldera and prepared and analyzed in the field and the laboratory with the pXRF. Approximately 2,000 pumice samples will complete this unprecedented data set, allowing detailed reconstruction of trace element zonation around all sides of the Valles Caldera. These data are then used to constrain models of magma chamber processes that produce trace element zonation and how it is preserved in the deposits after a catastrophic, caldera-forming eruption.
This page describes the use of portfolios as an assessment tool. It is one of a series of Classroom Assessment Techniques (CATs) provided by the Field-tested Learning Assessment Guide (FLAG) website. The CATs of FLAG were constructed as a resource for science, technology, engineering and mathematics instructors to emphasize deeper levels of learning and to give instructors valuable feedback during a course. This site provides an overview of what portfolios are, how and when to use them, and why they are useful for assessing student understanding. The site is also linked to a set of discipline-specific "tools" that can be downloaded for immediate use, as well as supplementary links and sources to further explore this assessment tool.
Slater, Timothy F.; The National Institute for Science Education; College Level One Team
Using silicon-based recording electrodes, we recorded neuronal activity of the dorsal hippocampus and dorsomedial entorhinal cortex from behaving rats. The entorhinal neurons were classified as principal neurons and interneurons based on monosynaptic interactions and wave-shapes. The hippocampal neurons were classified as principal neurons and interneurons based on monosynaptic interactions, wave-shapes and burstiness. The data set contains recordings from 7,736 neurons (6,100 classified as principal neurons, 1,132 as interneurons, and 504 cells that did not clearly fit into either category) obtained during 442 recording sessions from 11 rats (a total of 204.5 hours) while they were engaged in one of eight different behaviours/tasks. Both original and processed data (time stamp of spikes, spike waveforms, result of spike sorting and local field potential) are included, along with metadata of behavioural markers. Community-driven data sharing may offer cross-validation of findings, refinement of interpretations and facilitate discoveries.
Polarimetric and multifrequency data from the NASA/JPL airborne synthetic aperture radar (AIRSAR) have been used in a multi-tier estimation algorithm to calculate a comprehensive set of forest canopy properties including branch layer moisture and thickness, trunk density, trunk water content and diameter, trunk height, and subcanopy soil moisture. The estimation algorithm takes advantage of species-specific allometric relations, and is applied to a 100 km x 100 km area in the Canadian boreal region containing many different vegetation species types. The results show very good agreement with ground measurements taken at several focused and auxiliary study sites. This paper expands on the results reported in  and applies the algorithm on the regional scale.
Agriculture ranks among industries with the highest rates of occupational injury and fatality. Administrative medical data sets have long been thought to have potential for occupational injury surveillance. This research explores the feasibility of establishing an agricultural injury surveillance system in New York State that combines data from existing electronic sources. Prehospital Care Report (PCR) data containing the nature of the accident, type of injury, time and date, and patient disposition were received. Researchers also obtained both hospital inpatient and emergency department (ED) records for 2007 through 2009 from the Statewide Planning and Research Cooperative System (SPARCS). For SPARCS data, a computer algorithm identified all potential cases of agricultural injury using International Classification of Diseases (ICD)-9 codes. An attempt was then made to match PCR and SPARCS data using accident date, gender, age, and admitting hospital. Of the PCR records that were matched to SPARCS, 46.8% were found on subsequent inspection to not actually relate to the same incident. Total PCR counts for 2007 and 2008 showed considerable fluctuation, at 2,512,828 and 2,948,841, respectively. A total of 1275, 1336, and 1393 farm injuries were identified in the SPARCS records for 2007, 2008, and 2009, respectively. This study demonstrates that accurate matching of PCR and SPARCS records requires the use of unique personal identifiers. Further, annual fluctuations in PCR counts preclude their current use in a surveillance system. An electronic data set consisting of SPARCS data could be used for surveillance, but would benefit from the addition of PCR data as these become more consistent. PMID:24125048
Scott, Erika E; Krupa, Nicole L; Sorensen, Julie; Jenkins, Paul L
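The deterministic matching step described above (accident date, gender, age, and admitting hospital as the composite key) can be sketched as below; the field names are illustrative, not the actual PCR or SPARCS schema. Because the key contains no unique personal identifier, distinct incidents can share a key, which is consistent with the 46.8% of matches the study found to be spurious.

```python
from collections import defaultdict

def match_records(pcr_records, sparcs_records):
    """Deterministic record linkage on a composite key of accident date,
    gender, age, and admitting hospital. Each record is a dict with
    illustrative field names."""
    def key(rec):
        return (rec["date"], rec["gender"], rec["age"], rec["hospital"])
    by_key = defaultdict(list)
    for rec in sparcs_records:
        by_key[key(rec)].append(rec)
    # every PCR record is paired with every SPARCS record sharing its key
    return [(p, s) for p in pcr_records for s in by_key.get(key(p), [])]
```

Replacing the composite key with a unique personal identifier, as the study recommends, would eliminate the one-to-many pairings that this join permits.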
Background Variability of plasma sample collection and of proteomics technology platforms has been detrimental to generation of large proteomic profile datasets from human biospecimens. Methods We carried out a clinical trial-like protocol to standardize collection of plasma from 204 healthy and 216 breast cancer patient volunteers. The breast cancer patients provided follow up samples at 3 month intervals. We generated proteomics profiles from these samples with a stable and reproducible platform for differential proteomics that employs a highly consistent nanofabricated ChipCube™ chromatography system for peptide detection and quantification with fast, single dimension mass spectrometry (LC-MS). Protein identification is achieved with subsequent LC-MS/MS analysis employing the same ChipCube™ chromatography system. Results With this consistent platform, over 800 LC-MS plasma proteomic profiles from prospectively collected samples of 420 individuals were obtained. Using a web-based data analysis pipeline for LC-MS profiling data, analyses of all peptide peaks from these plasma LC-MS profiles reveals an average coefficient of variability of less than 15%. Protein identification of peptide peaks of interest has been achieved with subsequent LC-MS/MS analyses and by referring to a spectral library created from about 150 discrete LC-MS/MS runs. Verification of peptide quantity and identity is demonstrated with several Multiple Reaction Monitoring analyses. These plasma proteomic profiles are publicly available through ProteomeCommons. Conclusion From a large prospective cohort of healthy and breast cancer patient volunteers and using a nano-fabricated chromatography system, a consistent LC-MS proteomics dataset has been generated that includes more than 800 discrete human plasma profiles. This large proteomics dataset provides an important resource in support of breast cancer biomarker discovery and validation efforts.
The Large Area Telescope (LAT) is the primary instrument on-board the Fermi Gamma-ray Space Telescope (Fermi). The LAT is not a typical imaging instrument but rather records the energy and arrival direction of each individual gamma-ray photon. It is designed to operate primarily in sky survey mode instead of pointing at fixed targets for long periods of time. The standard survey mode gives complete coverage of the celestial sphere every 2 orbits (~3 hours). Additionally, the LAT has a very large (~2 sr) field of view, and an energy-dependent point spread function (PSF). These factors combine to generate a large data volume and present a unique challenge in providing data to the user community for the study of astronomical sources. We present the design of the public data server at the Fermi Science Support Center (FSSC) for the LAT data as well as performance benchmarks of the initial flight implementation. The data server operates on event lists stored in FITS files. Based on the user's query, the photons matching the data cuts are extracted and presented to the user as a FITS file ready to be downloaded and used in the Fermi science analysis software. Running on a single CPU, the photon data server can extract and prepare the 3-4 million photons matching a typical user's query from the ~120 million photons in a year's worth of observational data in approximately 20 seconds. The complete system is implemented as a small cluster of Linux PCs to improve performance even more by distributing the load of a single query or processing multiple queries simultaneously.
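The core of such a query is a cone-plus-energy cut on an event list. Below is a minimal sketch with a NumPy structured array standing in for the FITS event table; the column names and cut logic are illustrative, not the LAT data server's actual implementation.

```python
import numpy as np

def select_photons(events, ra0, dec0, radius_deg, emin, emax):
    """Extract photons inside an acceptance cone and energy band from an
    event list -- the kind of cut a photon data server applies. Column
    names (RA, DEC, ENERGY) are illustrative, not the exact FITS schema."""
    d2r = np.pi / 180.0
    # angular separation via the spherical law of cosines
    cos_sep = (np.sin(events["DEC"] * d2r) * np.sin(dec0 * d2r)
               + np.cos(events["DEC"] * d2r) * np.cos(dec0 * d2r)
               * np.cos((events["RA"] - ra0) * d2r))
    sep = np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0)))
    in_cone = sep <= radius_deg
    in_band = (events["ENERGY"] >= emin) & (events["ENERGY"] <= emax)
    return events[in_cone & in_band]
```

On real data the same boolean masking would be applied to columns read from the FITS event file before writing the selected rows back out.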
This study examines the contribution of the Responsive Classroom (RC) Approach, a set of teaching practices that integrate social and academic learning, to children's perceptions of their classroom, and children's academic and social performance over time. Three questions emerge: (a) What is the concurrent and cumulative relation between children's perceptions of the classroom and social and academic outcomes over time?
Laura L. Brock; Tracy K. Nishida; Cynthia Chiong; Kevin J. Grimm; Sara E. Rimm-Kaufman
Background Variability of plasma sample collection and of proteomics technology platforms has been detrimental to generation of large proteomic profile datasets from human biospecimens. Methods We carried out a clinical trial-like protocol to standardize collection of plasma from 204 healthy and 216 breast cancer patient volunteers. The breast cancer patients provided follow up samples at 3 month intervals. We generated proteomics profiles
Catherine P Riley; Xiang Zhang; Harikrishna Nakshatri; Bryan Schneider; Fred E Regnier; Jiri Adamec; Charles Buck
The Cys2His2 zinc finger (ZF) is the most frequently found sequence-specific DNA-binding domain in eukaryotic proteins. The ZF's modular protein-DNA interface has also served as a platform for genome engineering applications. Despite decades of intense study, a predictive understanding of the DNA-binding specificities of either natural or engineered ZF domains remains elusive. To help fill this gap, we developed an integrated experimental-computational approach to enrich and recover distinct groups of ZFs that bind common targets. To showcase the power of our approach, we built several large ZF libraries and demonstrated their excellent diversity. As proof of principle, we used one of these ZF libraries to select and recover thousands of ZFs that bind several 3-nt targets of interest. We were then able to computationally cluster these recovered ZFs to reveal several distinct classes of proteins, all recovered from a single selection, to bind the same target. Finally, for each target studied, we confirmed that one or more representative ZFs yield the desired specificity. In sum, the described approach enables comprehensive large-scale selection and characterization of ZF specificities and should be a great aid in furthering our understanding of the ZF domain. PMID:24214968
Persikov, Anton V; Rowland, Elizabeth F; Oakes, Benjamin L; Singh, Mona; Noyes, Marcus B
In shotgun proteomics, raw tandem MS data are processed with extraction tools to produce condensed peak lists that can be uploaded to database search engines. Many extraction tools are available, but to our knowledge a systematic comparison of such tools has not yet been carried out. Using raw data containing more than 400,000 tandem MS spectra acquired on an Orbitrap Velos, we compared 9 tandem MS extraction tools, freely available as well as commercial. We compared the tools with respect to the number of extracted MS/MS events, fragment ion information, number of matches, precursor mass accuracies, and agreement between tools. Processing a primary data set with 9 different tandem MS extraction tools resulted in a low overlap of identified peptides. The tools differ in the charge states assigned to precursors and in precursor and fragment ion masses, and we show that peptides identified very confidently using one extraction tool might not be matched when using another tool. We also found a bias towards peptides of lower charge state when extracting fragment ion data from higher-resolution raw data without deconvolution. Collecting and comparing the extracted data from the same raw data allows adjusting parameters and expectations and selecting the right tool for extraction of tandem MS data. PMID:22728601
Mancuso, Francesco; Bunkenborg, Jakob; Wierer, Michael; Molina, Henrik
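Two of the per-spectrum quantities being compared, precursor mass accuracy and the neutral mass implied by an assigned charge state, reduce to one-line formulas. A minimal sketch (the proton mass constant is standard; the function names are ours):

```python
def ppm_error(observed_mz, theoretical_mz):
    """Precursor mass accuracy in parts per million (ppm)."""
    return (observed_mz - theoretical_mz) / theoretical_mz * 1e6

def neutral_mass(mz, z, proton_mass=1.007276466):
    """Neutral mass implied by an observed m/z and an assigned charge
    state z, assuming protonation -- the quantity that shifts when two
    extraction tools disagree on the charge state."""
    return z * (mz - proton_mass)
```

A tool that assigns z = 2 instead of z = 3 to the same precursor peak implies a very different neutral mass, which is one way the low overlap between tools arises.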
Maximum intensity projection (MIP) is an important visualization method that has been widely used for the diagnosis of enhanced vessels or bones by rotating or zooming MIP images. With the rapid spread of multidetector-row computed tomography (MDCT) scanners, MDCT scans of a patient generate a large data set. However, previous acceleration methods for MIP rendering of such a data set failed to generate MIP images at interactive rates. In this paper, we propose novel culling methods in both object and image space for interactive MIP rendering of large medical data sets. In object space, for the visibility test of a block, we propose the initial occluder resulting from a preceding image to utilize temporal coherence and increase the block culling ratio substantially. In addition, we propose a hole filling method using mesh generation and rendering to improve the culling performance during the generation of the initial occluder. In image space, we find that there is a trade-off between the block culling ratio in object space and the culling efficiency in image space. In this paper, we classify the visible blocks into two types by their visibility, and we propose a balanced culling method that applies a different image-space culling algorithm to each type to exploit the trade-off and improve the rendering speed. Experimental results on twenty CT data sets showed that our method achieved a 3.85-times speedup on average, without any loss of image quality, compared with the conventional bricking method. Using our visibility culling method, we achieved interactive GPU-based MIP rendering of large medical data sets. PMID:22564547
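The projection itself is a per-pixel maximum along the viewing axis; the paper's contribution is culling away work that cannot change the result. A minimal single-axis sketch of both ideas (the real method uses occluder images, hole filling, and image-space culling, none of which are reproduced here):

```python
import numpy as np

def mip(volume, axis=0):
    """Maximum intensity projection: each output pixel is the maximum
    voxel value along the chosen viewing axis."""
    return volume.max(axis=axis)

def mip_with_block_culling(volume, block=16):
    """Object-space culling sketch: a brick of slices is skipped when its
    precomputed maximum cannot improve any pixel already on the image.
    This is a crude stand-in for the paper's visibility test."""
    depth, h, w = volume.shape
    image = np.full((h, w), -np.inf)
    for z0 in range(0, depth, block):
        chunk = volume[z0:z0 + block]
        # safe skip: every voxel in the brick is <= every current pixel
        if chunk.max() <= image.min():
            continue
        image = np.maximum(image, chunk.max(axis=0))
    return image
```

The skip test is conservative, so the culled projection is identical to the brute-force one; the paper's per-block visibility tests are far more selective but follow the same safety principle.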
From outcrops located in Provence (South-East France), we describe the distribution, the microstructures, and the petrophysical properties of deformation band networks related to different tectonic events. In a contractional setting, pervasively distributed networks of reverse-sense compactional-shear bands are observed in all the folded sand units of the foreland, whereas localized networks of clustered reverse-sense shear bands are observed only close to a large-scale thrust. In an extensional setting, networks of clustered normal-sense shear bands are generally observed adjacent to large-scale faults, although a few randomly distributed bands are also observed between these faults. Normal-sense cataclastic faults are also observed restricted to sand units, suggesting that faults can initiate in the sands in extension, which is not observed in contraction. Shear bands and faults show low-permeability cataclastic microstructures, whereas compactional-shear bands show crush microbreccia or protocataclastic microstructures of moderate permeability. This basin-scale analysis underlines the major role of tectonic setting (thrust-fault versus normal-fault Andersonian stress regime) and the influence of inherited large-scale faults on the formation of low-permeability shear bands. We also provide a geometrical analysis of the band network properties (spacing, thickness, shear/compaction ratio, degree of cataclasis, petrophysical properties) with respect to the host sand granulometry. This analysis suggests that granulometry, although less important than tectonic setting and the presence of large-scale faults, has a non-negligible effect on the geometry of the band networks.
Technology is changing the classroom requiring new design features and considerations to make the classroom flexible and interactive with the teaching process. The design of a Master Classroom, a product of the Classroom Improvement Project at the University of North Carolina at Chapel Hill, is described. These classrooms are specially-equipped to…
With advocates like Sal Khan and Bill Gates,1 flipped classrooms are attracting an increasing amount of media and research attention.2 We had heard Khan's TED talk and were aware of the concept of inverted pedagogies in general. Yet it really hit home when we accidentally flipped our classroom. Our objective was to better prepare our students for class. We set out to effectively move some of our course content outside of class and decided to tweak the Just-in-Time Teaching approach (JiTT).3 To our surprise, this tweak—which we like to call the flip-JiTT—ended up completely flipping our classroom. What follows is a narrative of our experience and a procedure that any teacher can use to extend JiTT to a flipped classroom.
Lasry, Nathaniel; Dugdale, Michael; Charles, Elizabeth
Background Results of phylogenetic analysis are often visualized as phylogenetic trees. Such a tree can typically only include up to a few hundred sequences. When more than a few thousand sequences are to be included, analyzing the phylogenetic relationships among them becomes a challenging task. The recent frequent outbreaks of influenza A viruses have resulted in the rapid accumulation of corresponding genome sequences. Currently, there are more than 7500 influenza A virus genomes in the database. There are no efficient ways of representing this huge data set as a whole, thus preventing a further understanding of the diversity of the influenza A virus genome. Results Here we present a new algorithm, "PhyloMap", which combines ordination, vector quantization, and phylogenetic tree construction to give an elegant representation of a large sequence data set. The use of PhyloMap on influenza A virus genome sequences reveals the phylogenetic relationships of the internal genes that cannot be seen when only a subset of sequences are analyzed. Conclusions The application of PhyloMap to influenza A virus genome data shows that it is a robust algorithm for analyzing large sequence data sets. It utilizes the entire data set, minimizes bias, and provides intuitive visualization. PhyloMap is implemented in JAVA, and the source code is freely available at http://www.biochem.uni-luebeck.de/public/software/phylomap.html
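The vector-quantization step that PhyloMap combines with ordination can be sketched as Lloyd's k-means over ordinated coordinates: thousands of sequences are summarized by a small set of representative centroids before tree construction. The farthest-point initialization and the toy data below are our choices, not details taken from the PhyloMap source.

```python
import numpy as np

def vector_quantize(points, k, n_iter=20):
    """Lloyd's k-means with farthest-point initialization: returns k
    representative centroids and the cluster label of every point."""
    # farthest-point initialization: deterministic and spread out
    centroids = [points[0].astype(float)]
    for _ in range(1, k):
        d = np.min([np.linalg.norm(points - c, axis=1) for c in centroids],
                   axis=0)
        centroids.append(points[d.argmax()].astype(float))
    centroids = np.array(centroids)
    for _ in range(n_iter):
        # assign each point to its nearest centroid, then recompute means
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels
```

In a PhyloMap-style pipeline the `points` would be the low-dimensional ordination coordinates of the sequences, and only the centroids (plus a sample per cluster) would enter the final tree.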
Battelle is under contract with Warner Robins Air Logistics Center to design a Common Large Area Display Set (CLADS) for use in multiple airborne C4I applications that currently use unique 19" CRTs. Battelle engineers have determined that, by taking advantage of the latest flat panel display technology and the commonality between C4I applications, one display set (21" diag., 1280 x 1024) can be designed for use in multiple applications. In addition, common modular driver and processing electronics are being designed by Battelle to reduce the number of installation-specific circuit card assemblies required for a particular application. Three initial applications include the E-3 (AWACS) color monitor assembly, the E-8 (JSTARS) graphics display unit, and the ABCCC airborne color display. For these three applications, reliability and maintainability are key drivers. The common design approach reduces the number of unique subassemblies in the USAF inventory by approximately 56 to 66 percent. The new design is also expected to have an MTBF of at least 3,350 hours, an order of magnitude better than one of the current systems. In the JSTARS installation, more than 1,400 lbs can be eliminated from the aircraft. In the E-3 installation, the CLADS is estimated to provide a power reduction of approximately 1,750 watts per aircraft. This paper will discuss the common large area display set design and its use in a variety of C4I applications that require a large-area, high-resolution, full-color display.
Orkis, Randall E.; Gorenflo, Ronald L.; Herman, David J.
A new algorithm for the automatic recognition of peak and baseline regions in spectra is presented. It is part of a study to devise a baseline correction method that is particularly suitable for the simple and fast treatment of large amounts of data of the same type, such as those coming from high-throughput instruments, images, process monitoring, etc. This algorithm is based on the continuous wavelet transform, and its parameters are automatically determined using the criteria of Shannon entropy and the statistical distribution of noise, requiring virtually no user intervention. It was assessed on simulated spectra with different noise levels and baseline amplitudes, successfully recognizing the baseline points in all cases except for a few extremely weak and noisy signals. It can be combined with various fitting methods for baseline estimation and correction. In this work, it was used together with an iterative polynomial fitting to successfully process a real Raman image of 40,000 pixels in about 2.5 h. PMID:24624486
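As an illustration of the general idea (not the paper's algorithm), the following sketch marks baseline points where a Ricker-wavelet response stays within a robust noise threshold, then fits a polynomial to those points. The wavelet width, threshold factor, and synthetic spectrum are invented for the example; the paper's automatic parameter selection via Shannon entropy is not reproduced here.

```python
import numpy as np

def mexican_hat(width):
    """Ricker (Mexican-hat) wavelet sampled on a symmetric grid."""
    n = int(10 * width) | 1                      # odd number of samples
    t = np.arange(n) - n // 2
    a = t / width
    return (1 - a ** 2) * np.exp(-a ** 2 / 2)

def baseline_mask(y, width, k=3.0):
    """Flag points whose wavelet response stays within k * (noise sigma)."""
    resp = np.convolve(y, mexican_hat(width), mode="same")
    sigma = np.median(np.abs(resp)) / 0.6745     # robust noise estimate
    return np.abs(resp) < k * sigma

# synthetic spectrum: linear baseline + one Gaussian peak + noise
x = np.linspace(0, 100, 1000)
rng = np.random.default_rng(1)
y = 0.02 * x + 5 * np.exp(-((x - 50) ** 2) / 4) + rng.normal(0, 0.05, x.size)

mask = baseline_mask(y, width=20.0)              # True where baseline
coef = np.polyfit(x[mask], y[mask], 1)           # fit baseline to those points
corrected = y - np.polyval(coef, x)              # baseline-corrected spectrum
```

The zero-mean wavelet suppresses the slowly varying baseline while responding strongly to the peak, which is why a simple noise-scaled threshold separates the two regions.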
Background Small RNAs are important regulators of genome function, yet their prediction in genomes is still a major computational challenge. Statistical analyses of pre-miRNA sequences indicated that their 2D structure tends to have a minimal free energy (MFE) significantly lower than the MFE values of equivalently randomized sequences with the same nucleotide composition, in contrast to other classes of non-coding RNA. The computation of many MFEs is, however, too intensive to allow for genome-wide screenings. Results Using a local grid infrastructure, MFE distributions of random sequences were pre-calculated on a large scale. These MFE values follow a normal distribution, and the pre-calculated distributions can be used to determine, by interpolation, the normal distribution for any candidate sequence composition on the fly. Conclusion The speedup achieved makes genome-wide screening with this characteristic of a pre-miRNA sequence practical. Although this property alone is not sufficiently discriminative to distinguish miRNAs from other sequences, the MFE-based P-value should be added to the parameters of choice to be included in the selection of potential miRNA candidates for experimental verification.
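A minimal sketch of the interpolation-plus-P-value idea, with an invented lookup table standing in for the paper's large-scale pre-calculated distributions (the table keys, values, and GC-content bucketing are assumptions):

```python
import math

# hypothetical precomputed table: (mean MFE, std) keyed by GC content,
# standing in for the large-scale pre-calculated distributions
MFE_TABLE = {0.3: (-20.0, 4.0), 0.4: (-25.0, 4.5), 0.5: (-30.0, 5.0)}

def interpolate_params(gc):
    """Linearly interpolate (mean, std) between tabulated compositions."""
    keys = sorted(MFE_TABLE)
    lo = max(k for k in keys if k <= gc)
    hi = min(k for k in keys if k >= gc)
    if lo == hi:
        return MFE_TABLE[lo]
    w = (gc - lo) / (hi - lo)
    (m1, s1), (m2, s2) = MFE_TABLE[lo], MFE_TABLE[hi]
    return m1 + w * (m2 - m1), s1 + w * (s2 - s1)

def mfe_p_value(mfe, gc):
    """P(random sequence has MFE <= observed) under the fitted normal."""
    mean, std = interpolate_params(gc)
    z = (mfe - mean) / std
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))   # normal CDF

p = mfe_p_value(-42.0, 0.45)   # strongly negative MFE -> small P-value
```

Because only the normal parameters are stored, each candidate costs a table lookup and one CDF evaluation rather than thousands of MFE computations on shuffled sequences.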
This study tested the effects of 5 classroom contextual features on the social status (perceived popularity and social preference) that peers accord to aggressive students in late elementary school, including classroom peer status hierarchy (whether within-classroom differences in popularity are large or small), classroom academic level, and grade…
Garandeau, Claire F.; Ahn, Hai-Jeong; Rodkin, Philip C.
While both federal agencies and professional associations emphasize the importance of neuroscience outreach, this goal seldom reaches the undergraduate neuroscience classroom. However, incorporating outreach into undergraduate neuroscience classes is an efficient means to reach not only future scientists, but also the future practitioners (K-12 teachers, social service workers, etc.) with whom neuroscientists hope to communicate. It also provides a vehicle for faculty members to engage in outreach activities that are typically un- or under-rewarded in faculty reviews. In this article, a Neuroscience Community Outreach Project (NCOP) is described. The project has been used in three offerings of a Cognitive Neuroscience course at a small liberal arts college, shared and applied at a large state university, and presented at a regional Society for Neuroscience meeting as an example of outreach opportunities for faculty. The NCOP assignment is a student-driven, modular activity that can be easily incorporated into existing neuroscience course frameworks. The assignment builds on student interests and connections in the community, providing a way for faculty at institutions without formal outreach programs to incorporate neuroscience outreach into the classroom and connect students to online resources. Several sample student projects are described across three broad domains (K-12 outreach, presentations to social service organizations, and media / popular press presentations). The article ends with a set of suggestions addressing common faculty concerns about incorporating community outreach into the undergraduate neuroscience classroom.
Medicine Lake Volcano (MLV), located in the southern Cascades ~55 km east-northeast of contemporaneous Mount Shasta, has been found by exploratory geothermal drilling to have a surprisingly silicic core mantled by mafic lavas. This unexpected result is very different from the long-held view derived from previous mapping of exposed geology that MLV is a dominantly basaltic shield volcano. Detailed mapping shows that < 6% of the ~2000 km² of mapped MLV lavas on this southern Cascade Range shield-shaped edifice are rhyolitic and dacitic, but drill holes on the edifice penetrated more than 30% silicic lava. Argon dating yields ages in the range ~475 to 300 ka for early rhyolites. Dates on the stratigraphically lowest mafic lavas at MLV fall into this time frame as well, indicating that volcanism at MLV began about half a million years ago. Mafic compositions apparently did not dominate until ~300 ka. Rhyolite eruptions were scarce post-300 ka until late Holocene time. However, a dacite episode at ~200 to ~180 ka included the volcano's only ash-flow tuff, which was erupted from within the summit caldera. At ~100 ka, compositionally distinctive high-Na andesite and minor dacite built most of the present caldera rim. Eruption of these lavas was followed soon after by several large basalt flows, such that the combined area covered by eruptions between 100 ka and postglacial time amounts to nearly two-thirds of the volcano's area. Postglacial eruptive activity was strongly episodic and also covered a disproportionate amount of area. The volcano has erupted 9 times in the past 5200 years, one of the highest rates of late Holocene eruptive activity in the Cascades. Estimated volume of MLV is ~600 km³, giving an overall effusion rate of ~1.2 km³ per thousand years, although the rate for the past 100 kyr may be only half that.
During much of the volcano's history, both dry HAOT (high-alumina olivine tholeiite) and hydrous calc-alkaline basalts erupted together in close temporal and spatial proximity. Petrologic studies indicate that the HAOT magmas were derived by dry melting of spinel peridotite mantle near the crust-mantle boundary. Subduction-derived H2O-rich fluids played an important role in the generation of calc-alkaline magmas. Petrology, geochemistry and proximity indicate that MLV is part of the Cascades magmatic arc and not a Basin and Range volcano, although Basin and Range extension impinges on the volcano and strongly influences its eruptive style. MLV may be analogous to Mount Adams in southern Washington, but not, as sometimes proposed, to the older distributed back-arc Simcoe Mountains volcanic field.
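The overall effusion rate quoted in the abstract follows directly from the volume and age estimates:

```python
volume_km3 = 600       # estimated volume of MLV
age_kyr = 500          # volcanism began about half a million years ago
rate = volume_km3 / age_kyr
print(rate)            # 1.2 km^3 per thousand years
```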
Donnelly-Nolan, Julie M.; Grove, Timothy L.; Lanphere, Marvin A.; Champion, Duane E.; Ramsey, David W.
The current scenario in American education shows a large achievement and opportunity gap in science between urban children in poverty and more privileged youth. Research has shown that one essential factor that accounts for this gap is the shortage of qualified science teachers in urban schools. Teaching science in a high poverty school presents unique challenges to beginner teachers. Limited resources and support and a significant cultural divide with their students are some of the common problems that cause many novice teachers to quit their jobs or to start enacting what has been described as "the pedagogy of poverty." In this study I looked at the case of the Urban Science Education Fellows Program. This program aimed to prepare preservice teachers (i.e. "fellows") to enact socially just science pedagogies in urban classrooms. I conducted qualitative case studies of three fellows. Fellows worked over one year with science teachers in middle-school classrooms in order to develop transformative action research studies. My analysis focused on how fellows coauthored hybrid spaces within these studies that challenged the typical ways science was taught and learned in their classrooms towards a vision of socially just teaching. By coauthoring these hybrid spaces, fellows developed grounded generativity, i.e. a capacity to create new teaching scenarios rooted in the pragmatic realities of an authentic classroom setting. Grounded generativity included building upon their pedagogical beliefs in order to improvise pedagogies with others, repositioning themselves and their students differently in the classroom and constructing symbols of possibility to guide their practice. I proposed authentic play as the mechanism that enabled fellows to coauthor hybrid spaces. Authentic play involved contexts of moderate risk and of distributed expertise and required fellows to be positioned at the intersection of the margins and the center of the classroom community of practice.
In all, this study demonstrates that engaging in classroom reform can support preservice teachers in developing specialized tools to teach science in urban classrooms.
"Teacher's Survival Guide: Differentiating Instruction in the Elementary Classroom" answers the most common questions about differentiation, including what it is, how teachers can set up a classroom that promotes differentiation, which topics should be differentiated, what strategies are most effective, and how teachers can assess students engaged…
To investigate the pedagogical effectiveness of oral corrective feedback (CF) on target language development, we conducted a meta-analysis that focused exclusively on 15 classroom-based studies (N = 827). The analysis was designed to investigate whether CF was effective in classroom settings and, if so, whether its effectiveness varied according…
We view college classroom teaching and learning as a multimedia authoring activity. The classroom provides a rich setting in which a number of different forms of communication co-exist, such as speech, writing and projected images. Much of the information in a lecture is currently poorly recorded or lost. Our hypothesis is that tools to aid in the capture
Gregory D. Abowd; Christopher G. Atkeson; Ami Feinstein; Cindy E. Hmelo; Rob Kooper; Sue Long; Mikiya Tani
One goal of high school teachers is to help students appreciate that reading does not end when they leave the classroom. When students find reading meaningful, they are more likely to see themselves as readers and choose to read long after they leave the classroom setting (Hinchman, Alvermann, Boyd, Brozo, & Vacca, 2003-2004; Wilhelm, 2001).…
We evaluated a trial-based approach to conducting functional analyses in classroom settings. Ten students referred for problem behavior were exposed to a series of assessment trials, which were interspersed among classroom activities throughout the day. Results of these trial-based functional analyses were compared to those of more traditional…
Bloom, Sarah E.; Iwata, Brian A.; Fritz, Jennifer N.; Roscoe, Eileen M.; Carreau, Abbey B.
This paper reports on the relationships among classroom teaching, learning activities and technology integration in the middle school classroom. The results are based on a comparison of three studies conducted across diverse middle school settings. The studies considered three primary questions: (1) Are specific learning activities identifiable…
This paper sets out to investigate (i) gender differences in whole class classroom interaction with a sample of teachers who were not using interactive whiteboards (IWBs) in their lessons; and (ii) the short-term and longer term impact of IWB use upon gender differences in classroom interaction. The study focused upon teacher-student interaction…
The present study set out to identify and examine dialogic educational interactions in Finnish pre-school classrooms. Video recordings of five observed pre-school classrooms that had shown a high or moderate quality of instructional support in literacy, maths and science studies were transcribed for micro-scale qualitative content analysis. Three…
Rasku-Puttonen, Helena; Lerkkanen, Marja-Kristiina; Poikkeus, Anna-Maija; Siekkinen, Martti
This article reports a study of the learning environments in computer networked classrooms. The study is unique in that it involved an evaluation of both the physical and psychosocial classroom environments in these computerised settings through the use of a combination of questionnaires and ergonomic evaluations. The study involved administering…
The study of learning environments provides a useful research framework for investigating the effects of educational innovations such as those which are associated with the use of the Internet in classroom settings. This study reports an investigation into the use of Internet technologies in high-school classrooms in Australia and Canada.…
This annotated bibliography contains a record of the evolution of a system for evaluating children in the classroom setting. The Barclay Classroom Climate Inventory is a needs assessment system in the affective-social domain. It is unique in that it taps three inputs, self-report, peer judgements and teacher expectations. These inputs are blended…
This report explores how students' multilingual literacies can become part of everyday classroom practices. It discusses the contribution made by the home language in English language learning and literacy by highlighting the connections between languages in mainstream classroom settings. The strategies highlighted here focus on the representation…
A data set consisting of a large number of terpenoids, widely distributed compounds found in abundance in higher plants, has been used to develop a quantitative structure property relationship (QSPR) for their Kovats retention index. QSPR models are usually obtained by splitting the data into two sets, calibration (or training) and prediction (or validation). All model building steps, especially the feature selection procedure, are performed using this initial splitting, and therefore the performance of the resulting models is highly dependent on the initial data splitting. To investigate the effects of data splitting on feature selection, in the current article we propose a combined data splitting-feature selection (CDFS) methodology for QSPR model development that produces several different training/validation/test sets and repeats all of the model building steps. In this method, data splitting is performed many times, and in each case feature selection is carried out. The resulting models are compared for similarity and dissimilarity between the selected descriptors. The final model is the one whose descriptors are the variables common to all of the resulting models. The method was applied to a QSPR study of a large data set containing the Kovats retention indices of 573 terpenoids. A final 8-parameter multilinear model with constitutional and topological indices was obtained. Cross-validation indicated that the model could reproduce more than 90% of the variance in the Kovats retention data. The relative error of prediction for an external test set of 50 compounds was 3.2%. Finally, to improve the results, the structure-retention relationships were also modeled with a nonlinear approach using artificial neural networks, which gave better results. PMID:17499073
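The CDFS idea of repeating the split-then-select cycle and intersecting the chosen descriptors can be sketched as below. This is a schematic reconstruction, not the authors' code: the correlation-based ranking, the 70% split fraction, and the toy data are all assumptions.

```python
import random

def select_features(x_train, y_train, k=3):
    """Rank features by absolute Pearson correlation with y; keep the top k."""
    def corr(col):
        n = len(col)
        mx, my = sum(col) / n, sum(y_train) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(col, y_train))
        sxx = sum((a - mx) ** 2 for a in col) ** 0.5
        syy = sum((b - my) ** 2 for b in y_train) ** 0.5
        return abs(sxy / (sxx * syy)) if sxx and syy else 0.0
    scores = [(corr([row[j] for row in x_train]), j)
              for j in range(len(x_train[0]))]
    return {j for _, j in sorted(scores, reverse=True)[:k]}

def cdfs(x, y, n_splits=10, k=2, seed=0):
    """Repeat random splitting; keep only descriptors selected every time."""
    rng = random.Random(seed)
    idx = list(range(len(x)))
    common = None
    for _ in range(n_splits):
        rng.shuffle(idx)
        train = idx[: int(0.7 * len(idx))]
        chosen = select_features([x[i] for i in train],
                                 [y[i] for i in train], k)
        common = chosen if common is None else common & chosen
    return common

# toy data: only feature 0 actually drives the response
rng = random.Random(1)
x = [[i / 50, rng.random(), rng.random(), rng.random()] for i in range(50)]
y = [2 * row[0] + 0.01 * rng.random() for row in x]
stable = cdfs(x, y)          # descriptors surviving every split
```

Descriptors that are selected only because of a lucky split drop out of the intersection, which is the robustness the abstract claims for the final model.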
Teachers must start with an organized classroom. Think through how you want your classroom arranged, how students will turn in work, and where supplies are located. Students should also be instructed how the classroom is set up and who should be retrieving supplies. Having numbered containers with supplies is a quick way to distribute materials and check that everything has been returned at the end of the period. This article outlines additional classroom management plans that will prepare new teachers for the first day of school and throughout the entire school year.
In this Biodiversity Counts activity, students learn the importance of maps and scale as they work together to create a map of the classroom. The printable four-page PDF handout includes a series of inquiry-based questions to get students thinking about what they already know about maps and their classroom and a worksheet with step-by-step directions for creating a scale and determining the size and relative location of objects in the classroom.
In this activity, learners discover the importance of maps and scale as they work together to create a map of the classroom. The printable four-page PDF handout includes: a series of inquiry-based questions to get learners thinking about what they already know about maps and their classroom; and a worksheet with step-by-step directions for creating a scale and determining the size and relative location of objects in the classroom.
Grounded in the context of the gap in civic participation, action-based civics curricula, and how classroom interactions may affect student development, we present the CIVVICS (Civic Interactions motiVating diVerse Individuals in Classroom Settings) observation tool. CIVVICS's four domains--Lesson Planning and Implementation, Classroom…
The New York City Department of Education has recently set forth new mandates for the redesign of classrooms. Teachers must be taught how to redesign their classrooms correctly so that all students will be provided with the necessary space to accommodate their environmental learning-style preferences. By altering the classroom, teachers give some…
A classroom observation rating scale, based upon a recent analysis of the literature and conceptually verified by open education advocates, effectively differentiated British and American open classrooms from American traditional classrooms. The influence of socio-economic settings was also demonstrated. For the three comparison groups, more…
The issue of classroom management in the English as a foreign language (EFL) setting has not been addressed adequately despite teachers' views of it as constituting one of their prioritized tasks. Among the aspects of classroom management, in particular, classroom discipline seems to warrant research focus because it contributes to "smooth and…
This paper examines the speech and writing of two young bilingual children in a classroom context. After summarising the diverse contexts in which such children learn, we present four sets of data: the informants' written texts, following a classroom activity (observing and handling tortoises); partial transcripts of a classroom feedback session in which they address the whole class; of their
Provides background information on the earthworm. Reviews basic anatomical, behavioral, and reproductive facts. Offers suggestions for procuring, maintaining, and breeding colonies for classroom use. (ML)
This article shares some personal reflections on several years of integrating educational technology into mathematics courses while retaining the direct interaction strengths of the traditional classroom.
One of the major acoustical concerns in classrooms is the establishment of effective verbal communication between teachers and students. Non-optimal acoustical conditions, resulting in reduced verbal communication, can cause two main problems. First, they can lead to reduced learning efficiency. Second, they can cause fatigue, stress, vocal strain and health problems, such as headaches and sore throats, among teachers who are forced to compensate for poor acoustical conditions by raising their voices. In addition, inadequate acoustical conditions can encourage the use of public address systems, and improper use of such amplifiers or loudspeakers can impair students' hearing. The social costs of poor classroom acoustics are large because they impair children's learning. This invisible problem has far-reaching implications for learning, but is easily solved. Much research has been carried out that accurately and concisely summarizes the findings on classroom acoustics. Still, a number of challenging questions remain unanswered. Most objective indices for speech intelligibility are essentially based on studies of western languages. Although several studies of tonal languages such as Mandarin have been conducted, there is much less work on Cantonese. In this research, measurements were made in unoccupied rooms to investigate the acoustical parameters and characteristics of the classrooms. Speech intelligibility tests based on English, Mandarin and Cantonese, together with a survey, were carried out on students aged from 5 to 22 years old. The research aims to investigate the differences in intelligibility between English, Mandarin and Cantonese in Hong Kong classrooms. The relationship between the speech transmission index (STI) and Phonetically Balanced (PB) word scores is further developed.
Together with an empirical relationship between classroom speech intelligibility and the reverberation time, the indoor ambient (background) noise level, the signal-to-noise ratio, and the speech transmission index, the study aims to establish a guideline for improving speech intelligibility in classrooms in any country and under any environmental conditions. The study showed that the acoustical conditions of most of the measured classrooms in Hong Kong are unsatisfactory. The selection of materials inside a classroom, especially the acoustic ceiling, is important at the design stage for shortening the reverberation time and improving speech intelligibility. The signal-to-noise ratio should be higher than 11 dB(A) for over 70% speech perception, for both tonal and non-tonal languages, without the use of a public address system. These unexpected results call for a revision of the standard design and the adoption of acceptable standards for classrooms in Hong Kong. The study also demonstrates a method for assessing classrooms in other cities with similar environmental conditions.
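The reported signal-to-noise threshold lends itself to a trivial check function (a sketch only; the 11 dB(A) figure is the study's reported finding, and the example levels are invented):

```python
def meets_guideline(signal_dba, noise_dba, min_snr_dba=11.0):
    """True when the signal-to-noise ratio clears the ~11 dB(A) threshold
    the study reports for >70% speech perception without amplification."""
    return (signal_dba - noise_dba) > min_snr_dba

# a teacher's voice at 65 dB(A) against two ambient noise levels
ok_quiet = meets_guideline(65.0, 50.0)   # SNR 15 dB(A)
ok_noisy = meets_guideline(65.0, 58.0)   # SNR 7 dB(A)
```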
This paper presents a gesture recognition system for visualization navigation. Scientists are interested in developing interactive settings for exploring large data sets in an intuitive environment. The input consists of registered 3-D data. A geometric method using Bezier curves is used for the trajectory analysis and classification of gestures. The hand gesture speed is incorporated into the algorithm to enable correct recognition from trajectories with variations in hand speed. The method is robust and reliable: correct hand identification rate is 99.9% (from 1641 frames), modes of hand movements are correct 95.6% of the time, recognition rate (given the right mode) is 97.9%. An application to gesture-controlled visualization of 3D bioinformatics data is also presented.
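A least-squares cubic Bezier fit of a sampled trajectory, followed by nearest-template classification in control-point space, gives a flavor of the geometric method above (a generic sketch, not the paper's system; the uniform parameterization, the templates, and their names are assumptions):

```python
import numpy as np

def bernstein_matrix(ts):
    """Cubic Bernstein basis evaluated at parameter values ts in [0, 1]."""
    t = np.asarray(ts)[:, None]
    return np.hstack([(1 - t) ** 3, 3 * t * (1 - t) ** 2,
                      3 * t ** 2 * (1 - t), t ** 3])

def fit_bezier(points):
    """Least-squares cubic Bezier control points for a sampled trajectory
    (uniform parameterization; arc-length parameterization is also common)."""
    b = bernstein_matrix(np.linspace(0, 1, len(points)))
    ctrl, *_ = np.linalg.lstsq(b, points, rcond=None)
    return ctrl, b @ ctrl                   # control points, fitted curve

def classify(ctrl, templates):
    """Nearest-template gesture label in control-point space."""
    return min(templates, key=lambda n: np.linalg.norm(ctrl - templates[n]))

# toy trajectory: samples of a known cubic Bezier ("swipe" gesture)
true_ctrl = np.array([[0.0, 0], [1, 2], [3, 2], [4, 0]])
pts = bernstein_matrix(np.linspace(0, 1, 40)) @ true_ctrl
ctrl, fitted = fit_bezier(pts)
err = np.linalg.norm(fitted - pts, axis=1).max()

templates = {"swipe": true_ctrl,
             "circle": np.array([[0.0, 0], [0, 3], [3, 3], [3, 0]])}
label = classify(ctrl, templates)
```

Incorporating hand speed, as the paper does, would amount to reparameterizing ts by the observed motion rather than uniformly.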
There are few social epidemiologic studies on chickenpox outbreaks, although previous findings suggested the important role of social determinants. This study describes the context of a large outbreak of chickenpox in the Cauca Valley region, Colombia (2003 to 2007), with an emphasis on macro-determinants. We explored the temporal trends in chickenpox incidence in 42 municipalities to identify the places with higher occurrences. We analyzed municipal characteristics (education quality, vaccination coverage, performance of health care services, violence-related immigration, and area size of planted sugar cane) through analyses based on set theory. Edwards-Venn diagrams were used to present the main findings. The results indicated that three municipalities had higher incidences and that poor quality education was the attribute most prone to a higher incidence. The potential use of set theory for exploratory outbreak analyses is discussed; it is a tool potentially useful for contrasting units when only small sample sizes are available. PMID:21808823
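The set-based contrast of municipalities described above can be sketched with plain Python sets (all municipality names and attribute memberships below are invented for illustration):

```python
# hypothetical municipality attribute sets (names and memberships invented)
high_incidence = {"A", "B", "C"}
attributes = {
    "poor_education": {"A", "B", "C", "D"},
    "low_vaccination": {"A", "E"},
    "high_immigration": {"B", "F"},
}

# fraction of high-incidence municipalities covered by each attribute
coverage = {name: len(high_incidence & members) / len(high_incidence)
            for name, members in attributes.items()}
best = max(coverage, key=coverage.get)   # attribute most associated with it
```

With only a handful of units, intersections like these can be read directly off an Edwards-Venn diagram, which is the appeal of the approach for small samples.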
Formative assessment involves the probing of students' ideas to determine their level of understanding during the instructional sequence. Often conceptualized as a cycle, formative assessment consists of the teacher posing an instructional task to students, collecting data about student understanding, and engaging in follow-up strategies such as clarifying student understanding and adjusting instruction to meet learning needs. Despite having been shown to increase student achievement in a variety of classroom settings, formative assessment remains a relatively weak area of teacher practice. Methods that enhance formative assessment strategies may therefore have a positive effect on student achievement. Audience response systems comprise a broad category of technologies that support richer classroom interaction and have the potential to facilitate formative assessment. Results from a large national research study, Classroom Connectivity in Promoting Mathematics and Science Achievement (CCMS), show that students in algebra classrooms where the teacher has implemented a type of audience response system experience significantly higher achievement gains compared to a control group. This suggests a role for audience response systems in promoting rich formative assessment. The importance of incorporating formative assessment strategies into regular classroom practice is widely recognized. However, it remains challenging to identify whether rich formative assessment is occurring during a particular class session. This dissertation uses teacher interviews and classroom observations to develop a fine-grained model of formative assessment in secondary science classrooms employing a type of audience response system. This model can be used by researchers and practitioners to characterize components of formative assessment practice in classrooms. A major component of formative assessment practice is the collection and aggregation of evidence of student learning.
This dissertation proposes the use of the assessment episode to characterize extended cycles of teacher-student interactions. Further, the model presented here provides a new methodology to describe the teacher's use of questioning and subsequent classroom discourse to uncover student learning. Additional components of this model of formative assessment focus on the recognition of student learning by the teacher and the resultant changes in instructional practice to enhance student understanding.
With smartphone sales currently surpassing laptop sales, it is hard not to think that these devices will have a place in the classroom. More specifically, with little to no monetary investment, classroom-centric mobile applications have the ability to suit the needs of teachers. Previously, programming such an item was a daunting task for the classroom teacher. But now, through the use of online visual tools, anyone has the ability to generate a mobile application to suit individual classroom needs. The "MY NASA DATA" (MND) project has begun work on such an application. Using online tools that are directed at the non-programmer, the team has developed two usable mobile applications ("apps") that fit right into the science classroom. The two apps generated include a cloud dichotomous key for cloud identification in the field, and an atmospheric science glossary to help with standardized testing key vocabulary and classroom assignments. Through the use of free online tools, teachers and students now have the ability to customize mobile applications to meet their individual needs. As an extension of the mobile applications, the MND team is planning web-based application programming interfaces (APIs) that will be generated from data that is currently included in the MND Live Access Server. This will allow teachers and students to choose data sets that they want to include in the mobile application without having to populate the API themselves. Through the use of easy to understand online mobile app tutorials and MND data sets, teachers will have the ability to generate unit-specific mobile applications to further engage and empower students in the science classroom.
Lewis, P.; Oostra, D.; Crecelius, S.; Chambers, L. H.
An improved method for the assessment of Social Development of secondary school students is described. For those with Social Development difficulties, the "Vineland Classroom" Edition can be used and interpreted to provide an Adaptive Behaviour Composite score. Prior to the present development, the "Classroom" Edition was only applicable to…
In this article, the authors describe a systematic approach to planning for the first days of school that is appropriate for today's demanding high school science classrooms. These strategies apply to any science subject and benefit student teachers, new teachers, and those teachers wishing to improve their classroom management skills. (Contains 3…
Four games for use in the foreign language classroom are described. The first, "A Shopping Game," by Gordon Hartig, is played on a game board in the German classroom and provides practice in producing sentences with the preposition "in," which in some instances takes the dative and in others takes the accusative. A diagram of the game board is…
To differentiate is to make different, distinct, or specialized (Costello, 1994). Differentiation has become popular in education as an instructional philosophy aimed at equitably meeting the learning needs of all students in the classroom. Differentiated planning and delivery of classroom guidance is also necessary for appropriate school…
Akos, Patrick; Cockman, Caroline R.; Strickland, Cindy A.
School restructuring movements have gained a great deal of national attention. This guidebook addresses the multi-age classroom movement, in which a mixed-age group of children stays with a given teacher for a number of years. The work provides a complete design for the mixed-age primary classroom, from philosophy and rationale to sample lesson…
This high school classroom exercise from the Center for Chronic Disease Prevention and Health Promotion gives an introduction to epidemiology. Visitors will find background materials (including an introduction to epidemiology and how to investigate an outbreak) and suggestions for classroom use.
A copyright infringement suit involving duplication of material for classroom use without permission or acknowledgement and related cases are discussed with reference to the fair use privilege, the Copyright Act of 1976, and congressional guidelines. Generally, fair use has been rejected as a blanket defense in classroom copying. (MJL)
This chapter of "Helping Teachers Manage Classrooms" presents strategies and processes that teachers can use to establish well-managed classrooms. These recommendations are based on the results of year-long descriptive studies of the management methods used by third grade teachers and by seventh and eighth grade English and mathematics teachers.…
Presents ideas for creating a welcoming classroom environment, including: decorating the room by hanging students' names from the ceiling; making a classroom community puzzle involving each student; and developing a variety of welcoming bulletin boards. A reproducible sheet includes cut-out shapes to make an underwater-theme bulletin board that…
There are three areas of interest in the topic of sociolinguistics in the classroom. First, the study of sociolinguistics is the interest of the professional linguist; second, the application of sociolinguistic principles is or should be the concern of all professional people who care about what goes on in the classroom, especially the teacher;…
A commonly advocated best practice for classroom assessment is to make the assessments authentic. Authentic is often used as meaning the mirroring of real-world tasks or expectations. There is no consensus, however, in the actual definition of the term or the characteristics of an authentic classroom assessment. Sometimes, the realistic component…
Frey, Bruce B.; Schmitt, Vicki L.; Allen, Justin P.
It is frequently said that evangelism or proselytizing has no place in the classroom. The purpose of this essay is to counter this generalization and to explore the nature of legitimate religious influence in the classroom. In doing so I will offer some criteria to help us determine what is and what is not acceptable by way of religious persuasion…
A semi-empirical counterpoise-type correction for basis set superposition error (BSSE) in molecular systems is presented. An atom pair-wise potential corrects for the inter- and intramolecular BSSE in supermolecular Hartree-Fock (HF) or density functional theory (DFT) calculations. This scheme, denoted geometrical counterpoise (gCP), depends only on the molecular geometry, i.e., no input from the electronic wave function is required, and it is hence applicable to molecules with tens of thousands of atoms. The four necessary parameters have been determined by a fit to standard Boys and Bernardi counterpoise corrections for Hobza's S66×8 set of non-covalently bound complexes (528 data points). The method targets small basis sets (e.g., minimal, split-valence, 6-31G*), but reliable results are also obtained for larger triple-ζ sets. The intermolecular BSSE is calculated by gCP with a typical error of 10%-30%, which proves sufficient in many practical applications. The approach is suggested as a quantitative correction in production work and can also be routinely applied to estimate the magnitude of the BSSE beforehand. The applicability for biomolecules as the primary target is tested for the crambin protein, where gCP removes intramolecular BSSE effectively and yields conformational energies comparable to def2-TZVP basis results. Good mutual agreement is also found with Jensen's ACP(4) scheme in estimating the intramolecular BSSE of the phenylalanine-glycine-phenylalanine tripeptide, for which a relaxed rotational energy profile is also presented. A variety of minimal and double-ζ basis sets combined with gCP and the dispersion corrections DFT-D3 and DFT-NL are successfully benchmarked on the S22 and S66 sets of non-covalent interactions. Outstanding performance with a mean absolute deviation (MAD) of 0.51 kcal/mol (0.38 kcal/mol after D3-refit) is obtained at the gCP-corrected HF-D3/(minimal basis) level for the S66 benchmark. 
The gCP-corrected B3LYP-D3/6-31G* model chemistry yields MAD=0.68 kcal/mol, which represents a huge improvement over plain B3LYP/6-31G* (MAD=2.3 kcal/mol). Application of gCP-corrected B97-D3 and HF-D3 to a set of large protein-ligand complexes proves the robustness of the method. Analytical gCP gradients make optimizations of large systems feasible with small basis sets, as demonstrated for the inter-ring distances of 9-helicene and most of the complexes in Hobza's S22 test set. The method is implemented in a freely available FORTRAN program obtainable from the author's website. PMID:22519309
A dramatic shift in research priorities has recently produced a large number of ambitious randomized trials in K-12 education. In most cases, the aim is to improve student academic learning by improving classroom instruction. Embedded in these studies are theories about how the quality of classroom instruction must improve if these interventions are to…
Much of what happens in primary classrooms reflects a number of rituals and routines that have largely become an unconscious part of teachers' repertoires. While these "rituals of practice" provide a framework or structure to learning in classrooms, they are often left unexamined. These taken-for-granted ways of teaching require close examination…
Students' degree of territoriality based on gender and seat preferences in different types of classroom arrangements was studied. The types of classroom arrangements included rows of tablet-arm chairs, U-shaped, clusters, and rows of tables with individual chairs. The study was carried out through a survey at a large public institution in the…
This article explores the construction of a bilingual professional identity in a bilingual creative-writing graduate program in southwest Texas by analyzing a classroom event and the participants' interpretation of it. In bilingual classrooms the resources available to construct professional identities include a large repertoire of linguistic…
This study explores the idea of teaching fundamental cockpit automation concepts and skills to aspiring professional pilots in a classroom setting, without the use of sophisticated aircraft or equipment simulators. Pilot participants from a local professional pilot academy completed eighteen hours of classroom instruction that placed a strong emphasis on understanding the underlying principles of cockpit automation systems and their use in a multi-crew cockpit. The instructional materials consisted solely of a single textbook. Pilots received no hands-on instruction or practice during their training. At the conclusion of the classroom instruction, pilots completed a written examination testing their mastery of what had been taught during the classroom meetings. Following the written exam, each pilot was given a check flight in a full-mission Level D simulator of a Boeing 747-400 aircraft. Pilots were given the opportunity to fly one practice leg, and were then tested on all concepts and skills covered in the class during a second leg. The results of the written exam and simulator checks strongly suggest that instruction delivered in a traditional classroom setting can lead to high levels of preparation without the need for expensive airplane or equipment simulators.
Several studies highlight the important role of soil moisture in the water and energy cycles. Soil moisture is variable on both temporal and spatial scales, characterized by small- and large-scale variability as well as short- and long-term processes within the system. Several soil moisture data sets ranging from the point to the global scale are available and provide a promising tool to investigate the spatio-temporal variability as well as spatial representativeness of soil moisture. In the current study we assess the large-scale spatial representativeness of soil moisture over the United States using point-scale as well as global-scale soil moisture data sets. The following three data sets are used: (i) point-scale in-situ measurements obtained through the International Soil Moisture Network (ISMN), which provide soil moisture measurements at different depths and are defined as the reference data set; at the global scale, (ii) the remote-sensing-based Essential Climate Variable soil moisture data set (ECV-SM) from the European Space Agency (ESA), representing surface soil moisture for the period 1978 to 2010, as well as (iii) soil moisture estimates from the land surface model ERA Interim/Land, including soil moisture for four different soil layers over the period 1979 to 2010. Following Orlowsky and Seneviratne (in press), the spatial representativeness at the point scale is determined by defining an area surrounding a station in which other stations exhibit similar temporal dynamics, according to a given cutoff of similarity. Consequently, the areal extent of this area gives the measure of spatial representativeness. This method is similarly applied to the gridded data sets, where the area is then defined by the areal extent of the grid cells that comply with the similarity criteria. The spatial representativeness is calculated for the period April to September for absolute values, as well as for short- and long-term anomalies. 
We use the top soil layer for all three products, and in addition the root zone soil layer for ERA Interim/Land and the in-situ measurements. First results show that the spatial pattern of ERA Interim/Land representativeness compares better to in-situ for the absolute values of soil moisture while for the anomalies of soil moisture the ECV-SM compares better. Additionally, for the absolute values of soil moisture, ERA Interim/Land compares better to in-situ for areas of large spatial representativeness, while for the anomalies of soil moisture ECV-SM compares better to in-situ for small spatial representativeness. Further investigation will link the identified spatial patterns to, among others, large-scale circulation. References Orlowsky, B. and S.I. Seneviratne (in press), Short Communication: On the spatial representativeness of temporal dynamics at European weather stations, Int. J. of Climatology.
Nicolai-Shaw, Nadine; Hirschi, Martin; Mittelbach, Heidi; Seneviratne, Sonia I.
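The similarity-cutoff idea behind the representativeness measure in the record above can be illustrated with a toy sketch. Everything here is invented for illustration (synthetic series, 20 stations, a 0.5 correlation cutoff, and a station count rather than the study's areal extent):

```python
# Toy illustration of spatial representativeness via a similarity cutoff.
# A station's "representativeness" here is the number of OTHER stations
# whose series correlate with it above the cutoff (hypothetical setup).
import numpy as np

rng = np.random.default_rng(3)
n_stations, n_days = 20, 365
common = rng.normal(size=n_days)                       # shared regional signal
series = 0.7 * common + 0.3 * rng.normal(size=(n_stations, n_days))

cutoff = 0.5
corr = np.corrcoef(series)                             # 20 x 20 correlation matrix
representativeness = (corr > cutoff).sum(axis=1) - 1   # exclude self-correlation
print(representativeness)
```

With the strong shared signal used here, every station represents nearly all others; weakening the common component or raising the cutoff shrinks each station's footprint.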
The meta-analysis of large-scale postgenomics data sets within public databases promises to provide important novel biological knowledge. Statistical approaches including correlation analyses in coexpression studies of gene expression have emerged as tools to elucidate gene function using these data sets. Here, we present a powerful and novel alternative methodology to computationally identify functional relationships between genes from microarray data sets using rule-based machine learning. This approach, termed “coprediction,” is based on the collective ability of groups of genes co-occurring within rules to accurately predict the developmental outcome of a biological system. We demonstrate the utility of coprediction as a powerful analytical tool using publicly available microarray data generated exclusively from Arabidopsis thaliana seeds to compute a functional gene interaction network, termed Seed Co-Prediction Network (SCoPNet). SCoPNet predicts functional associations between genes acting in the same developmental and signal transduction pathways irrespective of the similarity in their respective gene expression patterns. Using SCoPNet, we identified four novel regulators of seed germination (ALTERED SEED GERMINATION5, 6, 7, and 8), and predicted interactions at the level of transcript abundance between these novel and previously described factors influencing Arabidopsis seed germination. An online Web tool to query SCoPNet has been developed as a community resource to dissect seed biology and is available at http://www.vseed.nottingham.ac.uk/.
Bassel, George W.; Glaab, Enrico; Marquez, Julietta; Holdsworth, Michael J.; Bacardit, Jaume
Participation in classroom settings decreases with class size and diversity, thus creating passive modes of learning, due to feelings of shyness, peer pressure, and the like. Computing technology can help by creating a…
Tan Minh Truong; William G. Griswold; Matthew Ratto
The synthesis of 35 single-subject nonaversive treatments of classroom behavior problems indicated that reinforcement and feedback are highly effective as a remedy across a variety of behaviors, settings, and administrative arrangements. (Author/DB)
A shopping exercise for the foreign language classroom is described. In this exercise, students contribute unused items, "money" is provided, and the students then set up stores and buy and sell the items. (RM)
A bunch of intrepid teachers spent a week in Iceland in a quest to learn more about the country's challenging landscape, by engaging in a unique and inspiring professional development opportunity to learn about innovative ways to teach science and mathematics outside of a classroom setting. A 2008 Ofsted report highlighted the benefits of learning…
This manual for classroom teachers is designed to teach the application of contingency contracting procedures in the typical instructional setting. A "contingency contract" is an agreement between teacher and student whereby the student, upon demonstration of specific task achievement, receives a reward: permission to participate in a "reinforcing…
Westinghouse Learning Corp., Albuquerque, NM. Behavior Systems Div.
This paper describes research on dialogue between teachers and pupils during primary school science lessons, using talk from two classrooms to provide our examples. We consider whether teachers use dialogue to make education a cumulative, continuing process for guiding the development of children's understanding. Case studies of two teachers, using observational data taken from a larger data set, are used
Offers tips for using garage sales and other community sources for obtaining inexpensive classroom materials. Lists potential finds at yard sales and thrift shops. Suggestions include setting up a budget before shopping, recognizing the educational value of real objects for young children, and taking health and safety precautions before…
This research examined exposure to classroom noise of 25 full-time teaching staff in 14 preschool settings located across Western Sydney. The results indicated that one teacher exceeded the maximum permissible level of daily noise exposure for employees under the health and safety legislation. Three staff approached this level and 92% of teachers…
Uses of shortwave radio in the language classroom are discussed. Benefits are increased language skills and familiarity with the target language and country. Problems of transmission and reception are discussed; obtaining a license is explained. Advice is given on purchasing a radio set. (CHK)
Considers how a monolingual teacher supports linguistic diversity in a classroom of children who speak many different languages. Discusses teachers' attitudes about native language usage in school settings and addresses misconceptions about multiliteracy development. Presents 10 beginning ideas for monolingual teachers to foster multiliteracy. (SG)
Bullying takes on many forms and occurs in all classrooms, and the activities found in physical education often provide fertile ground for these behaviors. For example, dodgeball is often played in physical education settings, even though the American Alliance for Health, Physical Education, Recreation and Dance has clearly stated that dodgeball…
Despite the large research base grounded in behavioral theory for strategies to increase appropriate behavior and prevent or decrease inappropriate behavior in the classroom, a systematic review of multi-component universal classroom management research is necessary to establish the effects of teachers' universal classroom management approaches.…
Oliver, Regina M.; Wehby, Joseph H.; Reschly, Daniel J.
Exploration of space provides a compelling need for cell-based research into the basic mechanisms that underlie the profound changes that occur in terrestrial life transitioned to low-gravity environments. Toward that end, NASA developed a rotating bioreactor in which cells are cultured while continuously suspended in a cylinder in which the culture medium rotates with the cylinder. The randomization of the gravity vector accomplished by the continuous rotation, in a low-shear environment, provides an analog of microgravity. Because cultures grown in bioreactors develop structures and functions that are much closer to those exhibited by native tissue than can be achieved with traditional culture methods, bioreactors have contributed substantially to advancing research in the fields of cancer, diabetes, infectious disease modeling for vaccine production, drug efficacy, and tissue engineering. NASA has developed a Classroom Bioreactor (CB) that is built from parts that are easily obtained and assembled, user-friendly, and versatile. It can be easily used in simple school settings to examine cultures of seeds or cells. An educational brief provides assembly instructions and lesson plans that describe activities in science, math and technology that explore free fall, microgravity, orbits, bioreactors, structure-function relationships and the scientific method.
This document focuses on classroom discipline and how the teacher can maintain an environment that will optimize appropriate learning. Part 1 defines classroom discipline. Part 2 discusses classroom misbehavior and describes a number of classroom management techniques. Part 3 offers suggestions for control techniques. Part 4 discusses techniques…
Mathematics education research has documented several classroom practices that might influence student self-regulation. We know little, however, about the ways these classroom practices could be structured in real classroom settings. In this exploratory case study, we purposefully selected for extensive classroom observation a sixth-grade mathematics teacher who had participated in a professional development program focussed on NCTM standards and SRL in the mathematics classroom. The purpose was to explore how and to what extent she structured classroom practices to support strategic competence in her students. Four features of classroom practice were found as evidence for how strategic competence was potentially supported in this classroom: (a) allowing autonomy and shared responsibility during the early stages of learning, (b) focusing on student understanding, (c) creating contexts for students to learn about strategic learning and to exercise strategic behaviour, and (d) helping students to personalise strategies by recognising their ideas and strategic behaviours.
The soil sorption coefficient (Koc) is a key physicochemical parameter to assess the environmental risk of organic compounds. To predict the soil sorption coefficient in a more effective and economical way, here, quantitative structure-property relationship (QSPR) models were developed based on a large diverse dataset including 964 non-ionic organic compounds. Multiple linear regression (MLR), local lazy regression (LLR) and least squares support vector machine (LS-SVM) were utilized to develop QSPR models based on the four most relevant theoretical molecular descriptors selected by a genetic algorithms-variable subset selection (GA-VSS) procedure. The QSPR development strictly followed the OECD principles for QSPR model validation, thus great attention was paid to internal and external validations, applicability domain and mechanistic interpretation. The obtained results indicate that the LS-SVM model performed better than the MLR and the LLR models. For the best LS-SVM model, the correlation coefficient (R2) for the training set was 0.913 and the concordance correlation coefficient (CCC) for the prediction set was 0.917. The root-mean square errors (RMSE) were 0.330 and 0.426, respectively. The results of internal and external validations together with applicability domain analysis indicate that the QSPR models proposed in our work are predictive and could provide a useful tool for predicting the soil sorption coefficients of new compounds.
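The simplest model family named in the record above (MLR on a handful of descriptors) can be sketched in a few lines. This is an illustration only: the four descriptors, the data, and the coefficients below are synthetic stand-ins, not the paper's actual descriptors or dataset.

```python
# Minimal MLR-style QSPR sketch on synthetic data (hypothetical descriptors).
# Fits log Koc ~ four descriptors, reports training R2 and test-set RMSE.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))                            # four descriptors (made up)
true_w = np.array([0.8, -0.5, 0.3, 0.1])               # made-up coefficients
y = X @ true_w + 1.5 + rng.normal(scale=0.3, size=n)   # synthetic log Koc

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
r2 = r2_score(y_tr, model.predict(X_tr))               # training R2
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5  # prediction RMSE
print(f"training R2 = {r2:.3f}, prediction RMSE = {rmse:.3f}")
```

The train/test split mirrors the internal/external validation split the OECD principles call for, in miniature; the paper's LS-SVM model would replace `LinearRegression` with a kernel method.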
The current generation of large-scale hydrological models generally lacks a groundwater model component simulating lateral groundwater flow. Large-scale groundwater models are rare due to a lack of hydro-geological data required for their parameterization and a lack of groundwater head data required for their calibration. In this study, we propose an approach to develop a large-extent fully-coupled land surface-groundwater model by using globally available datasets and calibrate it using a combination of discharge observations and remotely-sensed soil moisture data. The underlying objective is to devise a collection of methods that enables one to build and parameterize large-scale groundwater models in data-poor regions. The model used, PCR-GLOBWB-MOD, has a spatial resolution of 1 km x 1 km and operates on a daily basis. It consists of a single-layer MODFLOW groundwater model that is dynamically coupled to the PCR-GLOBWB land surface model. This fully-coupled model accommodates two-way interactions between surface water levels and groundwater head dynamics, as well as between upper soil moisture states and groundwater levels, including a capillary rise mechanism to sustain upper soil storage and thus to fulfill high evaporation demands (during dry conditions). As a test bed, we used the Rhine-Meuse basin, where more than 4000 groundwater head time series have been collected for validation purposes. The model was parameterized using globally available data-sets on surface elevation, drainage direction, land-cover, soil and lithology. Next, the model was calibrated using a brute force approach and massive parallel computing, i.e. by running the coupled groundwater-land surface model for more than 3000 different parameter sets. Here, we varied minimal soil moisture storage and saturated conductivities of the soil layers as well as aquifer transmissivities. 
Using different regularization strategies and calibration criteria we compared three calibration scenarios: using discharge observations, using time series of remotely-sensed soil moisture fields (ERS Soil Water Index from TU Vienna), as well as a combination of both. Note that both sources of information are globally available. Each calibration strategy was subsequently validated using over 4000 groundwater head measurement time series. Comparison of the calibration strategies shows that remotely-sensed soil moisture data can be used for the calibration of upper soil hydraulic conductivities that determine groundwater recharge. However, discharge measurements should be included to calibrate the complete model, specifically to constrain aquifer transmissivities and runoff-infiltration partitioning processes. The combined approach using both remotely-sensed soil moisture data and discharge observations yielded a model that was able to fit both soil moisture and discharge reasonably well, as well as predicting the dynamics of groundwater heads with acceptable accuracy. However, absolute levels of groundwater head are only accurate in regions with shallow groundwater tables. Even though there is room for improvement, our study shows that with the global data-sets that are currently available, large-extent groundwater modeling in data-poor environments is certainly within reach.
Sutanudjaja, E. H.; Van Beek, L. P.; de Jong, S. M.; van Geer, F.; Bierkens, M. F.
Background Insecticide-treated bednets are effective at preventing malaria. This study focuses on household-level factors that are associated with bednet ownership in a rural area of Madagascar which had not been a recipient of large-scale ITN distribution. Methods Data were gathered on individual and household characteristics, malaria knowledge, household assets and bednet ownership. Principal components analysis was used to construct both a wealth index based on household assets and a malaria knowledge index based on responses to questions about malaria. Bivariate and multivariate regressions were used to determine predictors of household bednet ownership and malaria knowledge. Results Forty-seven of 560 households (8.4%) owned a bednet. In multivariate analysis, a higher level of malaria knowledge among household members was the only variable significantly associated with bednet ownership (odds ratio 3.72, P < 0.001). Among respondents, predictors of higher malaria knowledge included higher education levels, female sex and reporting fever as the most frequent or dangerous illness in the community. Household wealth was not a significant predictor of bednet ownership or respondent malaria knowledge. Conclusion In this setting of limited supply of affordable bednets, malaria knowledge was associated with an increased probability of household bednet ownership. Further studies should determine how such malaria knowledge evolves and whether malaria-specific education programs could help overcome the barriers to bednet ownership among at-risk households living outside the reach of large-scale bednet distribution programs.
Krezanoski, Paul J.; Tsai, Alexander C.; Hamer, Davidson H.; Comfort, Alison B.; Bangsberg, David R.
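The PCA-based wealth index described in the record above is a standard construction: score each household on the first principal component of its standardized asset indicators. The sketch below uses invented binary asset data, not the study's asset list:

```python
# Hedged sketch of a PCA wealth index (hypothetical asset data).
# Each household's index is its score on the first principal component
# of standardized asset-ownership indicators.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# 0/1 ownership indicators for 100 households x 5 hypothetical assets
assets = rng.integers(0, 2, size=(100, 5)).astype(float)

Z = StandardScaler().fit_transform(assets)      # standardize each indicator
wealth_index = PCA(n_components=1).fit_transform(Z).ravel()
print(wealth_index[:5])
```

Households can then be ranked or split into wealth quintiles by this score, which is how such indices typically enter the regression models the abstract describes.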
When taking students outside to see the stars is not an option, teachers can bring the stars inside the classroom. These instructions for building a portable planetarium also include suggestions for cross-cultural and social studies connections.
This project analyzes the regular patterns of social interaction in science classrooms and the verbal and nonverbal strategies by which the science content of lessons is communicated. Based on observation and recording of 60 lessons by 20 teachers in seco...
Describes the philosophy of a high school chemistry teacher, providing examples through classroom experiences, with emphasis on the use of demonstrations as instructional aids. Specific stoichiometry and conductometric titration demonstrations are discussed. (CS)
These classroom activities on forensics science from the University of Colorado at Boulder were designed to help students understand the process of scientific investigation and develop better laboratory and data-collection techniques.
Leslie Leinwand (University of Colorado at Boulder;)
Details an activity in which students create and study miniature impact craters in the classroom. Engages students in making detailed, meaningful observations, drawing inferences, reaching conclusions based on scientific evidence, and designing experiments to test selected variables. (DDR)
Procedures are described for practicing the art of scrimshaw in the classroom. Several materials are suggested for use. These include beef soup bones, old piano keys, nails, sandpaper, and lampblack or charcoal. (SA)
The use of invertebrates as classroom "pets" can develop students' skills in scientific inquiry and instill respect for science. Few materials are needed for projects involving invertebrates. Suggested activities using snails, crickets, earthworms, crayfish, and guppies are offered. (DF)
A number of chemical concepts can be easily illustrated in a more friendly way to children by using toys as teaching tools in the classroom. Some of the examples illustrated are shrinking toys, drinking birds and hand boiler.
Describes four approaches to utilizing and addressing cultural differences in the classroom: multicultural education, anti-bias curriculum, global education, and international education. Presents diversity education techniques in terms of direct communication (explicit), indirect communication (implicit), cultural information resources available…
Presents an overview of research on the ways in which classroom thermal environment, lighting conditions, ion state, and electromagnetic and air pollution affect learning and the performance of students and teachers. (SJL)
How well does competitive theory explain the outcome in experimental markets? The authors examined the results of a large number of classroom trading experiments that used a pit-trading design found in Experiments with Economic Principles, an introductory economics textbook by Bergstrom and Miller. They compared experimental outcomes with…
A group of fourth graders in Durham, North Carolina, are showing America the way to a clean energy future. They are installing solar panels on their classroom roof for a project that goes above and beyond a normal day in school. From researching solar panel installation, to generating funds for the project via Kickstarter, these are students who put their plans into action. Their accomplishments go beyond the classroom and stress the importance of getting people of all ages involved in renewable energy.
Greene's finding that children's involvement was higher in the more pupil-controlled classrooms is in apparent conflict with CCEP's expectation that involvement is a measure independent of setting. A resolution is suggested in the study of a set of 20 all-day behavior stream observations of individual children in 3 Follow Through classrooms and…
This study monitored classroom quality throughout three Head Start programs in the Southeastern United States, using the "Assessment Profile for Early Childhood Programs: Research Edition II." A random sample of classrooms was selected to represent high and low quality classrooms in urban and rural settings. Parents and teachers rated the social…
Described by the teacher is the removal of a 10-year-old hyperactive boy from drug therapy (Ritalin). The setting, open classroom, and the involvement of four regular classroom teachers and a progressive teacher employing behavior modification in a small group setting are thought to be related to the boy's academic and behavioral progress. The boy…
Background Students spend a large portion of their day in classrooms which may be a source of mold exposure. We examined the diversity and concentrations of molds in inner-city schools and described differences between classrooms within the same school. Methods Classroom airborne mold spores, collected over a 2-day period, were measured twice during the school year by direct microscopy. Results There were 180 classroom air samples collected from 12 schools. Mold was present in 100% of classrooms. Classrooms within the same school had differing mold levels and mold diversity scores. The total mold per classroom was 176.6 ± 4.2 spores/m3 (geometric mean ± geometric standard deviation) and ranged from 11.2 to 16,288.5 spores/m3. Mold diversity scores for classroom samples ranged from 1 to 19 (7.7 ± 3.5). The classroom, rather than the school, accounted for the majority of variance in the total mold count (62%) and in the mold diversity score (56%). The species with the highest concentrations and found most commonly included Cladosporium (29.3 ± 4.2 spores/m3), Penicillium/Aspergillus (15.0 ± 5.4 spores/m3), smut spores (12.6 ± 4.0 spores/m3), and basidiospores (6.6 ± 7.1 spores/m3). Conclusions Our study found that the school is a source of mold exposure, and that the classroom microenvironment in particular varies in spore quantity and species among classrooms within the same school. We also verified that visible mold may be a predictor of higher mold spore counts. Further studies are needed to determine the clinical significance of mold exposure relative to asthma morbidity in sensitized and non-sensitized asthmatic children.
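The "geometric mean ± geometric standard deviation" summary used in the record above is computed on log-transformed counts, which suits the highly skewed range reported (11.2 to 16,288.5 spores/m3). A sketch on made-up counts:

```python
# Geometric mean and geometric standard deviation of spore counts
# (data below are invented, not the study's measurements).
import numpy as np

spores = np.array([50.0, 120.0, 300.0, 80.0, 1500.0, 25.0])  # spores/m3

logs = np.log(spores)
gm = np.exp(logs.mean())        # geometric mean
gsd = np.exp(logs.std(ddof=1))  # geometric SD: a multiplicative factor, >= 1
print(f"{gm:.1f} x/÷ {gsd:.1f} spores/m3")
```

Note the geometric SD is dimensionless and multiplicative: roughly two-thirds of lognormal data fall between gm/gsd and gm*gsd, which is why a value like 4.2 can accompany a mean of 176.6.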
Explores ethnic and gender differences in classroom conversational styles by comparing student involvement in face-to-face and computer-mediated discussions. Finds that white males participated more frequently than other groups in the face-to-face setting, and white women benefited more from computer-mediated communication. Notes Hispanic women…
The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.
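The core idea in the study above, estimating a rainfall accumulation from regular discrete samples and quantifying the resulting error, can be illustrated with a small simulation. This is a toy sketch using synthetic, intermittent "rain" values, not the paper's radar data or its scaling law:

```python
import random

def sampling_error(rain, interval):
    """Relative error when a full accumulation is estimated from
    regular samples taken every `interval` time steps."""
    true_total = sum(rain)
    # Scale the subsampled total back up to the full accumulation period
    estimate = sum(rain[::interval]) * interval
    return abs(estimate - true_total) / true_total

random.seed(0)
# Synthetic hourly rain rates over 30 days: intermittent and skewed,
# loosely mimicking the character of natural rainfall
rain = [random.expovariate(1.0) if random.random() < 0.1 else 0.0
        for _ in range(30 * 24)]

# Relative error for sampling intervals of 1 h, 3 h, 6 h, and 12 h
errors = {h: sampling_error(rain, h) for h in (1, 3, 6, 12)}
```

Repeating such an experiment over many realizations (or, as in the study, over a large observed data set) yields the distribution of sampling error as a function of interval and accumulation period.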
This page describes the minute paper, one of a series of Classroom Assessment Techniques (CATs) provided by the Field-tested Learning Assessment Guide (FLAG) website. The CATs of FLAG were constructed as a resource for science, technology, engineering and mathematics (STEM) instructors to emphasize deeper levels of learning and to give instructors valuable feedback during a course. The minute paper is a concise note, taking one minute and written by students, that focuses on a short question presented by the instructor to the class. It provides real-time feedback from a class to find out if students recognized the main points of a class session and also helps the instructor make changes for the next class. This site provides an overview of this assessment instrument including information about how to use minute papers in the classroom. The site is also linked to a set of discipline-specific "tools" that can be downloaded for immediate use, as well as supplementary links and sources to further explore this assessment tool.
Zeilik, Michael; The National Institute for Science Education; College Level One Team
Explains the planning procedure for outdoor classrooms and introduces an integrated unit on monarch butterflies called the Monarch Watch program. Makes recommendations to solve financial problems of outdoor classrooms. (YDS)
Describes an experience with modeling as a teaching technique in the mathematics classroom as opposed to mathematical modeling. Offering models in the mathematics classroom is a good idea. Presents fundamental ideas for creating an effective learning environment with models. (WRM)
Describes a three-day classroom activity combining criminal investigations and scientific skills, especially observation skills. Provides detailed classroom procedures with an illustration of eight basic fingerprint patterns and a classification chart. (YP)
Though it may seem that classroom management comes naturally to some teachers, upon closer examination you'll probably discover that preparation and adaptation are more important than any innate ability when it comes to successful classroom management. Any experienced middle school science teacher can tell you that successful classroom management is an ongoing, evolving process: teachers need to modify their daily practices based on the observed behaviors and feedback of their students. This article describes some strategies to manage inquiry-based science classrooms effectively.
We proposed a partnership with the Rochester (New York) City School District to enhance their space science curriculum through teacher training seminars designed to increase teachers' knowledge of astronomy. On 1 April 1998, we facilitated the third grade science in-service program. We presented background science information on the Moon and demonstrated hands-on activities that teachers could transfer to their classrooms. During the 1998-99 school year, we visited several middle schools within the school district to facilitate the ``Comet in the Classroom'' program with sixth grade teachers. ``Comets in the Classroom'' presents background knowledge about comets and explains several hands-on activities regarding comets and their travels through the inner solar system. This work is funded through NASA's Initiative to Develop Education through Astronomy and Space Sciences (IDEAS) program.
This manual identifies the essential design elements of modern, high quality learning environments and includes discussions on facility programming, management, utilization, evaluation, and planning for future technology. Classrooms examined include general purpose classrooms, lecture halls, seminar rooms, and specialized classrooms such as…
Allen, Robert L.; Bowen, J. Thomas; Clabaugh, Sue; DeWitt, Beth B.; Francis, JoAllen; Kerstetter, John P.; Rieck, Donald A.
One of the most significant advances in interpreting thermochronological data is arguably our ability to extract information about the rate and trajectory of cooling over a range of temperatures, rather than relying on the simplifying assumption of a single closure temperature specified by a rate of monotonic cooling. Modern thermochronometry data, such as apatite fission track and (U-Th)/He analyses, are particularly amenable to this treatment, as acceptably well-calibrated kinetic models now exist for both systems. With ever larger data sets of this type being generated, the prospect of jointly inverting spatially distributed data offers new possibilities for constraining thermal and erosional histories over length scales approximating whole orogens and sub-continents. The challenge, though, is how to deal properly with the joint inversion of multiple samples in a self-consistent manner while also utilising all the available information contained in the data. We describe a new approach to this problem, called the Community of Family Circles (CFC) algorithm, which extracts information from spatially distributed apatite fission track (AFT) ages and track length distributions (TLD). The method is based on the rationale that the 3D geothermal field of the crust varies smoothly through space and time because of the efficiency of thermal diffusion. Our approach consists of seeking groups of spatially adjacent samples, or families, within a given circular radius for which a common thermal history is appropriate. The temperature offsets between individual time-temperature paths are determined relative to a low-pass filtered topographic surface, whose shape is assumed to mimic the shape of the isotherms in the partial annealing zone. 
This enables a single common thermal history to be shared, or interpolated, between the family members while still honouring each individual sample's temperature-offset requirements. The geothermal gradient can either be treated as a parameter in the inversion scheme or evaluated when local vertical profiles or heat flow measurements are available. As the data for each sample are inverted several times with different subsets, that is, as a member of different families, we then extract the subset with the lowest misfit and assign the sample to that "family", whose optimum time-temperature path is subsequently assigned to the sample. We thus obtain a set of thermal histories (one for each sample) which can then be interpolated to obtain exhumation rates or maximum-temperature maps. We demonstrate our approach on a variety of synthetic datasets, generated for different geomorphologies and sampling densities, using the 3D thermal code Pecube, in order to test the resolution and limits of the method. The approach is then applied to a 600 by 600 km area in northern Namibia where an extensive apatite fission track dataset, including ages and track length distributions, is available. We finally discuss extension of the technique to multiple thermochronometers, as well as possible future modifications and strategies for improving the flexibility and computational efficiency of the method.
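The family-selection step can be sketched in a few lines. This is a simplified illustration of the fixed-radius grouping idea only (Euclidean distance, hypothetical coordinates), not the published CFC algorithm with its thermal-history inversion:

```python
import math

def circle_families(samples, radius):
    """For each sample, collect all samples lying within `radius` of it.

    Each sample therefore appears in several candidate families, echoing
    the CFC idea of inverting a sample as a member of different families.
    """
    families = []
    for cx, cy in samples:
        members = [(x, y) for x, y in samples
                   if math.hypot(x - cx, y - cy) <= radius]
        families.append(members)
    return families

# Hypothetical sample coordinates in km
samples = [(0, 0), (10, 5), (50, 60), (55, 58), (200, 200)]
families = circle_families(samples, radius=25.0)
```

In the full method each family would then share a common thermal history (with per-sample temperature offsets), and each sample would keep the best-fitting family's time-temperature path.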
Background Somatic cell nuclear transfer (SCNT) using genetically engineered donor cells is currently the most widely used strategy to generate tailored pig models for biomedical research. Although this approach facilitates a spectrum of genetic modifications similar to that available in rodent models, the outcome in terms of live cloned piglets is quite variable. In this study, we aimed at a comprehensive analysis of the environmental and experimental factors that substantially influence the efficiency of generating genetically engineered pigs. Based on a considerable data set from 274 SCNT experiments (in total 18,649 reconstructed embryos transferred into 193 recipients), performed over a period of three years, we assessed the relative contributions of season, type of genetic modification, donor cell source, number of cloning rounds, and pre-selection of cloned embryos for early development to the cloning efficiency. Results 109 (56%) recipients became pregnant and 85 (78%) of them gave birth to offspring. Out of 318 cloned piglets, 243 (76%) were alive, but only 97 (40% of live births) were clinically healthy and showed normal development. The proportion of stillborn piglets was 24% (75/318), and another 31% (100/318) of the cloned piglets died soon after birth. The overall cloning efficiency, defined as the number of offspring born per SCNT embryos transferred, including only recipients that delivered, was 3.95%. SCNT experiments performed during winter using fetal fibroblasts or kidney cells after additive gene transfer resulted in the highest number of live and healthy offspring, while two or more rounds of cloning and nuclear transfer experiments performed during summer decreased the number of healthy offspring. 
Conclusion Although the effects of individual factors may be different between various laboratories, our results and analysis strategy will help to identify and optimize the factors, which are most critical to cloning success in programs aiming at the generation of genetically engineered pig models.
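The quoted rates can be cross-checked against the counts with simple arithmetic. The counts below are taken from the abstract; note that the 40% "clinically healthy" figure reproduces only as a fraction of the 243 live births, not of all 318 piglets:

```python
def pct(part, whole):
    """Percentage rounded to the nearest integer."""
    return round(100 * part / whole)

# Counts taken directly from the abstract
recipients, pregnant, delivered = 193, 109, 85
piglets, alive, healthy = 318, 243, 97
stillborn, early_deaths = 75, 100

checks = {
    "pregnancy rate": pct(pregnant, recipients),   # reported: 56%
    "delivery rate": pct(delivered, pregnant),     # reported: 78%
    "live births": pct(alive, piglets),            # reported: 76%
    "healthy of live": pct(healthy, alive),        # reported: 40%
    "stillborn": pct(stillborn, piglets),          # reported: 24%
    "early deaths": pct(early_deaths, piglets),    # reported: 31%
}
```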
The larger the classroom, the more likely it is that communications consist of a one-way flow from the instructor to students. Classroom Response Systems (CRSs) are frequently hailed as technologies capable of improving communication by opening space for dialogic engagement; yet a causal relationship is not documented in the literature. The data reported here stem from a mixed-methodology study and provide insights into motivations for CRS use and enacted CRS use across disciplines, as well as student and instructor perceptions of the tool's effects on teaching and learning. From these data emerged a framework of interaction (the C3 Framework) that situates CRS use from both the instructors' and learners' perspectives. The framework consists of an interdependent relationship between Concerns, Centeredness, and Control of discourse. Although this study took place in university classrooms, the C3 Framework presented here applies across educational settings.
Just a few days before my career as a fledgling science teacher began in a large public high school in New York City, a mentor suggested I might get some ideas about how to run a classroom from a book called The First Days Of School by Harry Wong. Although the book seemed to concentrate more on elementary students, I found that many of the principles in the book worked well for high school students. Even as I have begun to teach at the university level, many of Wong’s themes have persisted in my teaching style. Wong’s central thesis is that for learning to occur, a teacher must create the proper environment. In education jargon, a good climate for learning is generated via classroom management, an array of methods used by elementary and secondary school teachers to provide structure and routine to a class period via a seamless flow of complementary activities. Many college professors would likely consider classroom management to be chiefly a set of rules to maintain discipline and order among an otherwise unruly herd of schoolchildren, and therefore not a useful concept for mature university students. However, classroom management is much deeper than mere rules for behavior; it is an approach to instructional design that considers the classroom experience holistically. A typical professorial management style is to lecture for an hour or so and ask students to demonstrate learning via examinations several times in a semester. In contrast, a good high school teacher will manage a class from bell-to-bell to create a natural order and flow to a given lesson. In this presentation, I will argue for an approach to college lesson design similar to the classroom management style commonly employed by high school and elementary school teachers. 
I will suggest some simple, practical techniques learned during my high school experience that work just as well in college: warm-up and practice problems, time management, group activities, bulletin boards, learning environment, and standard procedures. Central to all of these suggestions is the basic concept of planning activities for students beyond passive absorption of lecture material and fitting them smoothly within the typical time constraints of a class period. Well-managed students learn better. I close with the observation that the most basic desires of students are independent of age; learners of all ages and levels prefer well-designed classroom experiences. In this context, books and resources intended for the professional development of secondary (and even elementary) teachers suddenly contain a wealth of techniques that, with some modification, might be useful at the university level.
This study examines the contribution of the "Responsive Classroom" (RC) Approach, a set of teaching practices that integrate social and academic learning, to children's perceptions of their classroom, and children's academic and social performance over time. Three questions emerge: (a) What is the concurrent and cumulative relation between…
Brock, Laura L.; Nishida, Tracy K.; Chiong, Cynthia; Grimm, Kevin J.; Rimm-Kaufman, Sara E.
This site describes the use of weekly reports as an assessment tool for student learning. It is one of a series of Classroom Assessment Techniques (CATs) provided by the Field-tested Learning Assessment Guide (FLAG) website. The CATs of FLAG were constructed as a resource for science, technology, engineering and mathematics instructors to emphasize deeper levels of learning and to give instructors valuable feedback during a course. Weekly reports provide rapid feedback about what students think they are learning and what conceptual difficulties they are experiencing. This site provides an overview of this assessment technique including information about how to use it. The site is also linked to a set of discipline-specific "tools" that can be downloaded for immediate use, as well as supplementary links and sources to further explore this assessment tool.
Etkina, Eugenia; The National Institute for Science Education; College Level One Team
Noting that preservice teachers and experienced teachers share a concern for classroom management, this study compared the beliefs of classroom teachers, intern teachers, and senior level practicum students regarding classroom management styles. Participating in the study were 43 early childhood and 44 elementary education preservice teachers, and…
There has been little analysis of how the "L" Shape design pattern might influence learning as well as be incorporated into the design of new school facilities. This article: (1) re-examines the "Fat L" (Dyck, 1994) Classroom as a design pattern which supports a range of activity settings; (2) defines activity settings; (3) describes the "Fat L"…
A history teacher in a girls independent high school set out to examine her own teaching process in order to revitalize her work and to understand more about how her students learn. She began to keep a journal of her observations of classroom work and, with the help of a colleague, to analyze the observations. In addition, she set aside class time…
The authors discuss the notion of Rapid Collaborative Knowledge Building (RCKB) in classroom settings. RCKB seeks to harness the collective intelligence of groups to learn faster, to envision new possibilities, and to reveal latent knowledge in a dynamic live setting. It is characterized by the notion of rapid cycles of knowledge building activities…
Although classroom inquiry is the primary pedagogy of science education, it has often been difficult to implement within conventional classroom cultures. This study turned to the alternatively structured Montessori learning environment to better understand the ways in which it fosters the essential elements of classroom inquiry, as defined by prominent policy documents. Specifically, we examined the opportunities present in Montessori classrooms for students to develop an interest in the natural world, generate explanations in science, and communicate about science. Using ethnographic research methods in four Montessori classrooms at the primary and elementary levels, this research captured a range of scientific learning opportunities. The study found that the Montessori learning environment provided opportunities for students to develop enduring interests in scientific topics and communicate about science in various ways. The data also indicated that explanation was largely teacher-driven in the Montessori classroom culture. This study offers lessons for both conventional and Montessori classrooms and suggests further research that bridges educational contexts.
Rinke, Carol R.; Gimbel, Steven J.; Haskell, Sophie
The effectiveness of the recently developed, explicitly correlated coupled cluster method CCSD(T)-F12b is examined in terms of its ability to reproduce atomization energies derived from complete basis set extrapolations of standard CCSD(T). Most of the standard-method reference values were obtained with aug-cc-pV7Z or aug-cc-pV8Z basis sets. For a few homonuclear diatomic molecules it was possible to push the basis set to the aug-cc-pV9Z level. F12b calculations were performed with the cc-pVnZ-F12 (n = D, T, Q) basis set sequence and were also extrapolated to the basis set limit using a Schwenke-style, parameterized formula. A systematic bias was observed in the F12b method with the (VTZ-F12/VQZ-F12) basis set combination. This bias resulted in the underestimation of reference values associated with small molecules (valence correlation energies <0.5 Eh) and an even larger overestimation of atomization energies for bigger systems. Consequently, caution should be exercised in the use of F12b for high-accuracy studies. Root mean square and mean absolute deviation error metrics for this basis set combination were comparable to complete basis set values obtained with standard CCSD(T) and the aug-cc-pVDZ through aug-cc-pVQZ basis set sequence; however, the mean signed deviation was an order of magnitude larger. Problems, partially due to basis set superposition error, were identified with second-row compounds, which resulted in weak performance for the smaller VDZ-F12/VTZ-F12 combination of basis sets.
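A Schwenke-style two-point extrapolation is linear in the energy increment between consecutive basis sets. The sketch below uses hypothetical correlation energies and an illustrative (not fitted) coefficient F; in practice the coefficient is parameterized separately for each basis-set pair:

```python
def schwenke_extrapolate(e_small, e_large, f):
    """Two-point, Schwenke-style linear CBS extrapolation:

        E_CBS = (E_large - E_small) * F + E_small

    where F is a coefficient fitted for a given basis-set pair.
    """
    return (e_large - e_small) * f + e_small

# Hypothetical valence correlation energies (hartree) for a
# VTZ-F12/VQZ-F12 pair; F = 1.36 is illustrative, not a fitted value
e_tz, e_qz = -0.4512, -0.4620
e_cbs = schwenke_extrapolate(e_tz, e_qz, f=1.36)
```

Because F > 1, the extrapolated energy lies below the larger-basis result, extending the convergence trend toward the basis set limit.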