Sample records for collection analysis interpretation

  1. 30 CFR 551.11 - Submission, inspection, and selection of geological data and information collected under a permit...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... geochemical) data and information describing each operation of analysis, processing, and interpretation; (2...

  2. 30 CFR 551.11 - Submission, inspection, and selection of geological data and information collected under a permit...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... geochemical) data and information describing each operation of analysis, processing, and interpretation; (2...

  3. 30 CFR 551.11 - Submission, inspection, and selection of geological data and information collected under a permit...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... geochemical) data and information describing each operation of analysis, processing, and interpretation; (2...

  4. Soils as an indicator of forest health: a guide to the collection, analysis, and interpretation of soil indicator data in the Forest Inventory and Analysis program

    Treesearch

    Katherine P. O'Neill; Michael C. Amacher; Charles H. Perry

    2005-01-01

    Documents the types of data collected as part of the Forest Inventory and Analysis soil indicator, the field and laboratory methods used, and the rationale behind these data collection procedures. Guides analysts and researchers on incorporating soil indicator data into reports and research studies.

  5. Teachers' Perception of Social Justice in Mathematics Classrooms

    ERIC Educational Resources Information Center

    Panthi, Ram Krishna; Luitel, Bal Chandra; Belbase, Shashidhar

    2017-01-01

    The purpose of this study was to explore mathematics teachers' perception of social justice in mathematics classrooms. We applied an interpretive qualitative method for data collection, analysis, and interpretation through an iterative process. We administered in-depth semi-structured interviews to capture the perceptions of three mathematics teachers…

  6. Teachers' Perception of Social Justice in Mathematics Classrooms

    ERIC Educational Resources Information Center

    Panthi, Ram Krishna; Luitel, Bal Chandra; Belbase, Shashidhar

    2018-01-01

    The purpose of this study was to explore mathematics teachers' perception of social justice in mathematics classrooms. We applied an interpretive qualitative method for data collection, analysis, and interpretation through an iterative process. We administered in-depth semi-structured interviews to capture the perceptions of three mathematics teachers…

  7. Traveler information services in rural tourism areas: appendix D, system/historical data analysis

    DOT National Transportation Integrated Search

    2000-06-30

    This document presents information regarding data collection and dissemination functions for traveler information services in rural areas. It documents data collection functions and information dissemination functions, and provides an interpretive de...

  8. Washington’s forest resources: Forest Inventory and Analysis, 2002–2011

    Treesearch

    Justin Holgerson; Sharon Stanton; Karen Waddell; Marin Palmer; Olaf Kuegler; Glenn Christensen

    2018-01-01

    This report highlights key findings from data collected by the Forest Inventory and Analysis program across all forest land in the state of Washington from 2002 through 2011, updating previously published findings from data collected up to 2006. We summarize and interpret basic resource information such as forest area, composition, ownership, volume, biomass, and...

  9. California's forest resources: Forest Inventory and Analysis, 2001–2010

    Treesearch

    Glenn A. Christensen; Karen L. Waddell; Sharon M. Stanton; Olaf Kuegler

    2016-01-01

    This report highlights key findings from the most recent (2001–2010) data collected by the Forest Inventory and Analysis program across all forest land in California, updating previously published findings from data collected from 2001 through 2005 (Christensen et al. 2008). We summarize and interpret basic resource information such as forest area, ownership, volume,...

  10. How accurate are interpretations of curriculum-based measurement progress monitoring data? Visual analysis versus decision rules.

    PubMed

    Van Norman, Ethan R; Christ, Theodore J

    2016-10-01

    Curriculum-based measurement of oral reading (CBM-R) is used to monitor the effects of academic interventions for individual students. Decisions to continue, modify, or terminate these interventions are made by interpreting time-series CBM-R data. Such interpretation is founded upon visual analysis or the application of decision rules. The purpose of this study was to compare the accuracy of visual analysis and decision rules. Visual analysts interpreted 108 CBM-R progress monitoring graphs one of three ways: (a) without graphic aids, (b) with a goal line, or (c) with a goal line and a trend line. Graphs differed along three dimensions, including trend magnitude, variability of observations, and duration of data collection. Automated trend line and data point decision rules were also applied to each graph. Inferential analyses permitted the estimation of the probability of a correct decision (i.e., the student is improving - continue the intervention, or the student is not improving - discontinue the intervention) for each evaluation method as a function of trend magnitude, variability of observations, and duration of data collection. All evaluation methods performed better when students made adequate progress. Visual analysis and decision rules performed similarly when observations were less variable. Results suggest that educators should collect data for more than six weeks, take steps to control measurement error, and visually analyze graphs when data are variable. Implications for practice and research are discussed. Copyright © 2016 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
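The trend-line decision rule contrasted with visual analysis above can be sketched in a few lines: fit an ordinary-least-squares trend to the weekly scores and continue the intervention only if the fitted slope at least matches the slope of the goal line. This is a hypothetical illustration (made-up scores, goal, and rule threshold), not the study's actual implementation:

```python
def trend_line_decision(scores, baseline, goal, weeks):
    """Hypothetical trend-line rule for CBM-R progress monitoring:
    fit an OLS trend to weekly scores and compare its slope to the
    slope of the goal line, (goal - baseline) / weeks."""
    n = len(scores)
    mean_x = (n - 1) / 2
    mean_y = sum(scores) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores)) \
            / sum((x - mean_x) ** 2 for x in range(n))
    goal_slope = (goal - baseline) / weeks
    return "continue" if slope >= goal_slope else "modify"

# Ten weekly scores rising ~1.6 words/week against a 1 word/week goal.
decision = trend_line_decision(
    [40, 41, 43, 44, 47, 48, 50, 51, 53, 54], baseline=40, goal=50, weeks=10)
# decision == "continue"
```

Such automated rules are what the study compared against human visual analysis of the same graphs.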

  11. Interpretive repertoires as mirrors on society and as tools for action: reflections on Zeyer and Roth's A mirror of society

    NASA Astrophysics Data System (ADS)

    Milne, Catherine

    2009-12-01

    I respond to Zeyer and Roth's (Cultural Studies of Science Education, 2009) paper on their use of interpretive repertoire analysis to explicate Swiss middle school students' dialogic responses to environmental issues. I focus on the strategy of interpretive repertoire analysis, making sense of the stance Zeyer and Roth take with this analysis by synthesizing their argument and comparing their analysis with that of other researchers who have also used this analytic tool. Interpretive repertoires are discourse resources, including mores, tropes, and metaphors, that can be evoked by speakers in support of a tenuous claim. So interpretive repertoires have rhetorical character and function. Interpretive repertoire analysis requires looking for patterns in the contradictions in the speech of a collective of participants that can be codified as interpretive repertoires. Interpretive repertoires provide insight into macro-structures that frame, and are used to justify, participants' behavior. My response to Zeyer and Roth's argument might itself be thought contradictory, but I think it is defensible. In this paper, I outline why I am excited by the possibilities I can imagine for this type of analysis in areas of science education research. However, I also felt the need to identify possible limitations of Zeyer and Roth's exclusive focus on environmental issues to the neglect of other issues, such as those associated with gender, embedded in participants' discourse. I argue that a critical and historical focus, in conjunction with interpretive repertoire analysis, offers a rich strategy for analysis in science education research, especially in the study of macrostructures such as gender, race, identity, and power.

  12. Manual for analysis of ethanol in biological liquids

    DOT National Transportation Integrated Search

    1977-01-01

    This manual covers selected aspects of the analysis of ethanol in biological liquids and the interpretation of the results of such analyses. Recommendations are made concerning the selection, collection, identification, and preservation of suitable b...

  13. 77 FR 58414 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ... science and engineering workforce and changes in their employment, education and demographic... engineering population. The 2013 NSCG will provide necessary input into the SESTAT. The National Science... clearinghouse for the collection, interpretation, and analysis of data on scientific and engineering resources...

  14. Too close to home? Experiences of Kurdish refugee interpreters working in UK mental health services.

    PubMed

    Green, Hannah; Sperlinger, David; Carswell, Kenneth

    2012-06-01

    Despite their essential role in the National Health Service, there is limited research on the experiences of refugee interpreters. To explore Kurdish refugee interpreters' experiences of working in UK mental health services. Six participants were interviewed and data collected were analysed using interpretative phenomenological analysis. The results showed that interpreters often felt overwhelmed by the emotional impact of interpreting in mental health services, particularly at the beginning of their careers. Interpreters struggled to negotiate complex and unclear roles and responsibilities. Interpreting for refugees with shared histories was particularly challenging. The study recommends that interpreters working in mental health services receive training on mental health issues and self-care and are assisted by frameworks to help make sense of the impact of the work, such as supervision.

  15. Visual Pattern Analysis in Histopathology Images Using Bag of Features

    NASA Astrophysics Data System (ADS)

    Cruz-Roa, Angel; Caicedo, Juan C.; González, Fabio A.

    This paper presents a framework to analyse visual patterns in a collection of medical images in a two-stage procedure. First, a set of representative visual patterns from the image collection is obtained by constructing a visual-word dictionary under a bag-of-features approach. Second, an analysis of the relationships between visual patterns and semantic concepts in the image collection is performed. The most important visual patterns for each semantic concept are identified using correlation analysis. A matrix visualization of the structure and organization of the image collection is generated using a cluster analysis. The experimental evaluation was conducted on a histopathology image collection, and the results showed clear relationships between visual patterns and semantic concepts that are, in addition, easy to interpret and understand.
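The correlation step described above can be sketched as follows: represent each image as a histogram over the visual-word dictionary, then rank visual words by the absolute correlation between their frequencies and a binary concept annotation. A toy illustration with made-up histograms and labels, not the paper's data or code:

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy)

def top_patterns(histograms, labels, k=1):
    """Rank visual words (histogram bins) by absolute correlation
    with a binary concept annotation across the image collection."""
    n_words = len(histograms[0])
    scored = [(abs(pearson([h[w] for h in histograms], labels)), w)
              for w in range(n_words)]
    return [w for _, w in sorted(scored, reverse=True)[:k]]

# Hypothetical 4-image collection: 3 visual words, one binary concept.
hists = [[5, 1, 0], [6, 0, 1], [1, 4, 1], [0, 5, 0]]
labels = [1, 1, 0, 0]
best = top_patterns(hists, labels, k=1)  # visual word 0 tracks the concept
```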

  16. Using recurrence plot analysis for software execution interpretation and fault detection

    NASA Astrophysics Data System (ADS)

    Mosdorf, M.

    2015-09-01

    This paper presents a method for software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace containing the executed assembly instructions. The results of this analysis are further processed with principal component analysis (PCA), which reduces the number of coefficients used for software execution classification. The method was applied to five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, and SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms), while others are more difficult to distinguish.
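A recurrence plot of an execution trace can be sketched in a few lines: mark every pair of time points whose states match (or lie within a tolerance), then summarize the plot with coefficients such as the recurrence rate. This is a minimal, hypothetical illustration; the paper's actual pipeline works on assembly-instruction traces and feeds many such coefficients into PCA:

```python
def recurrence_matrix(trace, eps=0):
    """Recurrence plot of a 1-D trace: R[i][j] = 1 when the states at
    times i and j are within eps of each other (exact match for eps=0)."""
    n = len(trace)
    return [[1 if abs(trace[i] - trace[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(rp):
    """Fraction of recurrent points: one scalar coefficient that could
    feed a PCA-based classifier of execution traces."""
    n = len(rp)
    return sum(sum(row) for row in rp) / (n * n)

# Hypothetical trace of opcode IDs from an instrumented run.
trace = [3, 7, 3, 7, 3, 9]
rp = recurrence_matrix(trace)
rate = recurrence_rate(rp)
```

Traces from a repetitive algorithm (such as a sorting loop) produce dense diagonal structure in the plot and a higher recurrence rate, which is what makes the traces separable.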

  17. Spit: saliva in nursing research, uses and methodological considerations in older adults.

    PubMed

    Woods, Diana Lynn; Mentes, Janet C

    2011-07-01

    Over the last 10 years, interest in the analysis of saliva as a biomarker for a variety of systemic diseases or for potential disease has soared. There are numerous advantages to using saliva as a biological fluid, particularly for nurse researchers working with vulnerable populations, such as frail older adults. Most notably, it is noninvasive and easier to collect than serum or urine. The authors describe their experiences with the use of saliva in research with older adults that examined (a) osmolality as an indicator of hydration status and (b) cortisol and behavioral symptoms of dementia. In particular, the authors discuss the timing of data collection along with data analysis and interpretation. For example, it is not enough to detect levels or rely solely on summary statistics; rather it is critical to characterize any rhythmicity inherent in the parameter of interest. Not accounting for rhythmicity in the analysis and interpretation of data can limit the interpretation of associations, thus impeding advances related to the contribution that an altered rhythm may make to individual vulnerability.
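One common way to "characterize rhythmicity" in salivary measures such as cortisol is single-component cosinor analysis: fit a cosine with a fixed (e.g., 24-hour) period by least squares and report the MESOR (rhythm-adjusted mean) and amplitude. A minimal sketch with synthetic values, not the authors' data or method:

```python
import math

def cosinor(times, values, period=24.0):
    """Single-component cosinor sketch: fit y = M + a*cos(wt) + b*sin(wt)
    by ordinary least squares and return (MESOR, amplitude)."""
    w = 2 * math.pi / period
    rows = [[1.0, math.cos(w * t), math.sin(w * t)] for t in times]
    # Normal equations A^T A x = A^T y, solved by Gaussian elimination.
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    aty = [sum(r[i] * y for r, y in zip(rows, values)) for i in range(3)]
    for i in range(3):                      # forward elimination
        for k in range(i + 1, 3):
            f = ata[k][i] / ata[i][i]
            ata[k] = [a - f * b for a, b in zip(ata[k], ata[i])]
            aty[k] -= f * aty[i]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                     # back substitution
        x[i] = (aty[i] - sum(ata[i][j] * x[j] for j in range(i + 1, 3))) / ata[i][i]
    mesor, a, b = x
    return mesor, math.hypot(a, b)

# Hypothetical cortisol-like values sampled every 4 hours over one day.
times = [0, 4, 8, 12, 16, 20]
values = [10 + 3 * math.cos(2 * math.pi * t / 24) for t in times]
mesor, amplitude = cosinor(times, values)   # ~10.0 and ~3.0
```

Comparing summary statistics alone would miss the amplitude and timing information this fit recovers, which is the point the authors make about rhythmicity.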

  18. Cultural Competence and School Counselor Training: A Collective Case Study

    ERIC Educational Resources Information Center

    Nelson, Judith A.; Bustamante, Rebecca; Sawyer, Cheryl; Sloan, Eva D.

    2015-01-01

    This collective case study investigated the experiences of bilingual counselors-in-training who assessed school-wide cultural competence in public schools. Analysis and interpretation of data resulted in the identification of 5 themes: eye-opening experiences, recognition of strengths, the role of school leaders, road maps for change, and…

  19. A Practical Guide to Interpretation of Large Collections of Incident Narratives Using the QUORUM Method

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W.

    1997-01-01

    Analysis of incident reports plays an important role in aviation safety. Typically, a narrative description, written by a participant, is a central part of an incident report. Because there are so many reports, and the narratives contain so much detail, it can be difficult to efficiently and effectively recognize patterns among them. Recognizing and addressing recurring problems, however, is vital to continuing safety in commercial aviation operations. A practical way to interpret large collections of incident narratives is to apply the QUORUM method of text analysis, modeling, and relevance ranking. In this paper, QUORUM text analysis and modeling are surveyed, and QUORUM relevance ranking is described in detail with many examples. The examples are based on several large collections of reports from the Aviation Safety Reporting System (ASRS) database, and a collection of news stories describing the disaster of TWA Flight 800, the Boeing 747 which exploded in mid-air and crashed near Long Island, New York, on July 17, 1996. Reader familiarity with this disaster should make the relevance-ranking examples more understandable. The ASRS examples illustrate the practical application of QUORUM relevance ranking.

  20. Enhanced Analysis of Falling Weight Deflectometer Data for Use With Mechanistic-Empirical Flexible Pavement Design and Analysis and Recommendations for Improvements to Falling Weight Deflectometers

    DOT National Transportation Integrated Search

    2017-03-01

    This report describes the efforts undertaken to review the status of falling weight deflectometer (FWD) equipment, data collection, analysis, and interpretation, including dynamic backcalculation, as they relate to the models and procedures incorpora...

  21. Preliminary assessment of an economical fugitive road dust sampler for the collection of bulk samples for geochemical analysis.

    PubMed

    Witt, Emitt C; Wronkiewicz, David J; Shi, Honglan

    2013-01-01

    Fugitive road dust collection for chemical analysis and interpretation has been limited by the quantity and representativeness of samples. Traditional methods of fugitive dust collection generally focus on point-collections that limit data interpretation to a small area or require the investigator to make gross assumptions about the origin of the sample collected. These collection methods often produce a limited quantity of sample that may hinder efforts to characterize the samples by multiple geochemical techniques, preserve a reference archive, and provide a spatially integrated characterization of the road dust health hazard. To achieve a "better sampling" for fugitive road dust studies, a cyclonic fugitive dust (CFD) sampler was constructed and tested. Through repeated and identical sample collection routes at two collection heights (50.8 and 88.9 cm above the road surface), the products of the CFD sampler were characterized using particle size and chemical analysis. The average particle size collected by the cyclone was 17.9 μm, whereas particles collected by a secondary filter were 0.625 μm. No significant difference was observed between the two sample heights tested and duplicates collected at the same height; however, greater sample quantity was achieved at 50.8 cm above the road surface than at 88.9 cm. The cyclone effectively removed 94% of the particles >1 μm, which substantially reduced the loading on the secondary filter used to collect the finer particles; therefore, suction is maintained for longer periods of time, allowing for an average sample collection rate of about 2 g/mi. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  22. Econo-ESA in semantic text similarity.

    PubMed

    Rahutomo, Faisal; Aritsugi, Masayoshi

    2014-01-01

    Explicit semantic analysis (ESA) utilizes an immense Wikipedia index matrix in its interpreter part. This part of the analysis multiplies a large matrix by a term vector to produce a high-dimensional concept vector. A similarity measurement between two texts is performed between two concept vectors with numerous dimensions. Both the interpretation and the similarity measurement steps are therefore expensive. This paper proposes an economical scheme of ESA, named econo-ESA. We investigate two aspects of this proposal: dimensional reduction and experiments with various data. We use eight recycled test collections in semantic text similarity. The experimental results show that both the dimensional reduction and the test collection characteristics can influence the results. They also show that an appropriate concept reduction of econo-ESA can decrease the cost with only minor differences in the results from the original ESA.
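The interpreter and the econo-ESA reduction described above can be sketched as: project a term-weight vector through a term-by-concept index matrix, compare texts by cosine similarity of the resulting concept vectors, and cut cost by restricting the projection to a reduced concept space. A toy index is used below; real ESA uses a Wikipedia-scale matrix:

```python
def interpret(term_vector, index, keep=None):
    """ESA interpreter sketch: map a {term_id: weight} vector to a
    concept vector via a term-by-concept index matrix. `keep` restricts
    the projection to the first `keep` concepts (the econo-ESA idea
    of working in a reduced concept space)."""
    n_concepts = keep if keep is not None else len(index[0])
    return [sum(w * index[t][c] for t, w in term_vector.items())
            for c in range(n_concepts)]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv)

# Toy index: 3 terms x 4 concepts.
index = [[0.9, 0.1, 0.0, 0.2],
         [0.8, 0.0, 0.1, 0.3],
         [0.0, 0.7, 0.9, 0.1]]
text_a = {0: 1.0, 1: 1.0}        # term ids -> weights
text_b = {1: 1.0, 2: 0.5}
sim_full = cosine(interpret(text_a, index), interpret(text_b, index))
sim_econo = cosine(interpret(text_a, index, keep=2),
                   interpret(text_b, index, keep=2))
```

The trade-off the paper studies is how far `keep` can be lowered before `sim_econo` drifts too far from `sim_full`.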

  23. Systematic Analysis and Interpretation of Collected Data for a Research Study: A Practical Methodological Framework for Writing Research Report

    ERIC Educational Resources Information Center

    Boaduo, Nana Adu-Pipim

    2011-01-01

    Two basic data sources required for research studies have been secondary and primary. Secondary data collection helps the researcher to provide relevant background to the study and are, in most cases, available for retrieval from recorded sources. Primary data collection requires the researcher to venture into the field where the study is to take…

  24. The Use of a Checklist and Qualitative Notebooks for an Interactive Process of Teaching and Learning Qualitative Research

    ERIC Educational Resources Information Center

    Frels, Rebecca K.; Sharma, Bipin; Onwuegbuzie, Anthony J.; Leech, Nancy L.; Stark, Marcella D.

    2011-01-01

    From the perspective of doctoral students and instructors, we explain a developmental, interactive process based upon the Checklist for Qualitative Data Collection, Data Analysis, and Data Interpretation (Onwuegbuzie, 2010) for students' writing assignments regarding: (a) the application of conceptual knowledge for collecting, analyzing, and…

  25. Analyzing jobs for redesign decisions.

    PubMed

    Conn, V S; Davis, N K; Occena, L G

    1996-01-01

    Job analysis, the collection and interpretation of information that describes job behaviors and activities performed by occupants of jobs, can provide nurse administrators with valuable information for redesigning effective and efficient systems of care.

  26. Analysis of Three Cobble Ring Sites at Abiquiu Reservoir, Rio Arriba County, New Mexico.

    DTIC Science & Technology

    1989-01-01

    major travel and migration route for both humans and large game animals (Bertram et al. 1987, Schander 1986). The Rio Chama flows in a general...for interpreting site characteristics in terms of settlement and subsistence. Theoretical research issues to be addressed are chronology, subsistence...artifacts, collection areas, and locations of permanent and temporary data points across the site landscape . These data are critical for interpreting

  27. A guide to soil sampling and analysis on the National Forests of the Inland Northwest United States

    Treesearch

    Deb Page-Dumroese; Al Harvey; Marty Jurgensen

    1995-01-01

    This guide details soil collection methods, sample analysis, and data translation. It outlines what field soil scientists need to make accurate interpretations of site information. Included are instructions for sampling typical Andisols found on National Forests of the Inland Northwest United States.

  28. Charisma, crowd psychology and altered states of consciousness.

    PubMed

    Lindholm, C

    1992-09-01

    This paper argues that an interpretive meaning-centered analysis is not adequate for understanding collective behavior that is outside the range of calculating rationality. Alternative approaches to collective irrational action are drawn from the work of Weber and Durkheim, as well as from the crowd psychologists Le Bon and Tarde. These approaches are then illustrated in a short analysis of the trajectories and recruitment techniques of two contemporary American religious annunciations: est and Scientology, and the findings are applied to the general social formation.

  29. Prediction of compression-induced image interpretability degradation

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Chen, Hua-Mei; Irvine, John M.; Wang, Zhonghai; Chen, Genshe; Nagy, James; Scott, Stephen

    2018-04-01

    Image compression is an important component in modern imaging systems as the volume of the raw data collected is increasing. To reduce the volume of data while collecting imagery useful for analysis, choosing the appropriate image compression method is desired. Lossless compression is able to preserve all the information, but it has limited reduction power. On the other hand, lossy compression, which may result in very high compression ratios, suffers from information loss. We model the compression-induced information loss in terms of the National Imagery Interpretability Rating Scale or NIIRS. NIIRS is a user-based quantification of image interpretability widely adopted by the Geographic Information System community. Specifically, we present the Compression Degradation Image Function Index (CoDIFI) framework that predicts the NIIRS degradation (i.e., a decrease of NIIRS level) for a given compression setting. The CoDIFI-NIIRS framework enables a user to broker the maximum compression setting while maintaining a specified NIIRS rating.

  30. Instruction Guide and Macro Analysis Tool for Community-led Air Monitoring

    EPA Pesticide Factsheets

    EPA has developed two tools for evaluating the performance of low-cost sensors and interpreting the data they collect to help citizen scientists, communities, and professionals interested in learning about local air quality.

  31. The Historical and Situated Nature of Design Experiments--Implications for Data Analysis

    ERIC Educational Resources Information Center

    Krange, I.; Ludvigsen, Sten

    2009-01-01

    This article is a methodological contribution to the use of design experiments in educational research. We will discuss the implications of a historical and situated interpretation of design experiments, the consequences this has for the analysis of the collected data, and empirically based suggestions to improve the designs of the computer-based…

  32. Branding Access through the Carolina Covenant: Fostering Institutional Image and Brand

    ERIC Educational Resources Information Center

    Harris, Michael S.; Barnes, Bradley

    2011-01-01

    This study analyzes the potential of major financial aid initiatives to serve as key elements of an institutional branding strategy. Concepts of branding and marketing serve as guiding frameworks for the analysis and interpretation of the findings. Using a case study approach, data were collected through interviews and document analysis at the…

  33. The Use of Multiple Regression Models to Determine if Conjoint Analysis Should Be Conducted on Aggregate Data.

    ERIC Educational Resources Information Center

    Fraas, John W.; Newman, Isadore

    1996-01-01

    In a conjoint-analysis consumer-preference study, researchers must determine whether the product factor estimates, which measure consumer preferences, should be calculated and interpreted for each respondent or collectively. Multiple regression models can determine whether to aggregate data by examining factor-respondent interaction effects. This…

  34. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Treesearch

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  35. Psychosocial identification of drivers responsible for fatal vehicular accidents in Boston

    DOT National Transportation Integrated Search

    1976-05-01

    This Final Report includes a total human factor data presentation, analysis, evaluation and interpretation of selected variables collected by the Boston University Traffic Accident Research Special Study team during the 30-month period of the experim...

  36. 30 CFR 285.112 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... authorized under subpart J of this part. Archaeological resource means any material remains of human life or... observation, contextual measurement, controlled collection, analysis, interpretation, and explanation). Best...; pipelines; and permanently moored vessels. Any group of OCS installations interconnected with walkways, or...

  37. Exploring the Micro-Social Geography of Children's Interactions in Preschool: A Long-Term Observational Study and Analysis Using Geographic Information Technologies

    ERIC Educational Resources Information Center

    Torrens, Paul M.; Griffin, William A.

    2013-01-01

    The authors describe an observational and analytic methodology for recording and interpreting dynamic microprocesses that occur during social interaction, making use of space--time data collection techniques, spatial-statistical analysis, and visualization. The scheme has three investigative foci: Structure, Activity Composition, and Clustering.…

  38. California's forest resources, 2001-2005: five-year Forest Inventory and Analysis Report.

    Treesearch

    Glenn A. Christensen; Sally J. Campbell; Jeremy S. Fried

    2008-01-01

    This report highlights key findings from the most recent (2001-2005) data collected by the Forest Inventory and Analysis Program across all forest land in California. We summarize and interpret basic resource information such as forest area, ownership, volume, biomass, and carbon stocks; structure and function topics such as biodiversity, forest age, dead wood, and...

  39. Community-led Air Sensor Evaluation: New Tools for Citizen Scientists Fact Sheet

    EPA Pesticide Factsheets

    EPA has developed a guide and analysis tool for citizen scientists to evaluate the performance of low-cost sensors and interpret the data they collect to help citizen scientists interested in learning about local air quality.

  40. Conducting qualitative research in audiology: a tutorial.

    PubMed

    Knudsen, Line V; Laplante-Lévesque, Ariane; Jones, Lesley; Preminger, Jill E; Nielsen, Claus; Lunner, Thomas; Hickson, Louise; Naylor, Graham; Kramer, Sophia E

    2012-02-01

    Qualitative research methodologies are being used more frequently in audiology as they allow for a better understanding of the perspectives of people with hearing impairment. This article describes why and how international interdisciplinary qualitative research can be conducted. This paper is based on a literature review and our recent experience conducting an international interdisciplinary qualitative study in audiology. We describe some available qualitative methods for sampling, data collection, and analysis, and we discuss the rationale for choosing particular methods. The focus is on four approaches which have all previously been applied to audiologic research: grounded theory, interpretative phenomenological analysis, conversation analysis, and qualitative content analysis. This article provides a review of methodological issues useful for those designing qualitative research projects in audiology or needing assistance in the interpretation of qualitative literature.

  41. The Web Measurement Environment (WebME): A Tool for Combining and Modeling Distributed Data

    NASA Technical Reports Server (NTRS)

    Tesoriero, Roseanne; Zelkowitz, Marvin

    1997-01-01

    Many organizations have incorporated data collection into their software processes for the purpose of process improvement. However, in order to improve, interpreting the data is just as important as collecting it. With the increased presence of the Internet and the ubiquity of the World Wide Web, the potential for software processes to be distributed among several physically separated locations has also grown. Because project data may be stored in multiple locations and in differing formats, obtaining and interpreting data from this type of environment becomes even more complicated. The Web Measurement Environment (WebME), a Web-based data visualization tool, is being developed to facilitate the understanding of collected data in a distributed environment. The WebME system will permit the analysis of development data in distributed, heterogeneous environments. This paper provides an overview of the system and its capabilities.

  42. The EnviroAtlas: Connecting ecosystems, people, and well-being

    EPA Science Inventory

    The EnviroAtlas is a web-based application containing a collection of geospatial data, analysis tools, and interpretive information focused on ecosystem goods and services. Ecosystem goods and services are essentially defined as the benefits that humans receive from nature and en...

  43. Deep Lake Explorer: Using citizen science to analyze underwater video from the Great Lakes

    EPA Science Inventory

    While underwater video collection technology continues to improve, advancements in underwater video analysis techniques have lagged. Crowdsourcing image interpretation using the Zooniverse platform has proven successful for many projects, but few projects to date have included vi...

  4. Collection analysis and interpretation of data on relationship between drugs and driving

    DOT National Transportation Integrated Search

    1972-02-01

    The purpose of this study was to determine if drug usage is related to driving history. Laboratory analyses of urine samples, in-depth interviews, and public driving records were obtained to investigate the relationship of traffic accidents and violati...

  5. Program Evaluation: A Review and Synthesis.

    ERIC Educational Resources Information Center

    Webber, Charles F.

    This paper reviews models of program evaluation. Major topics and issues found in the evaluation literature include quantitative versus qualitative approaches, identification and involvement of stakeholders, formulation of research questions, collection of data, analysis and interpretation of data, reporting of results, evaluation utilization, and…

  6. Is Heart Rate Variability Better Than Routine Vital Signs for Prehospital Identification of Major Hemorrhage

    DTIC Science & Technology

    2015-01-01

    different PRBC transfusion volumes. We performed multivariate regression analysis using HRV metrics and routine vital signs to test the hypothesis that...study sponsors did not have any role in the study design, data collection, analysis and interpretation of data, report writing, or the decision to...primary outcome was hemorrhagic injury plus different PRBC transfusion volumes.

  7. Case study of a survivor of suicide who lost all family members through parent-child collective suicide.

    PubMed

    Lee, Eunjin; Won Kim, Sung; Enright, Robert D

    2015-01-01

    South Korea is characterized by a high percentage of parent-child collective suicide. This case study explores one individual's personal experience as an adult survivor of suicide who lost his wife and his only son through parent-child collective suicide in South Korea. The study reports data from a semistructured interview, which were analyzed using interpretive phenomenological analysis (IPA). Two themes were identified through the analysis of the narratives of the survivor. The first theme provides a detailed picture of the survivor's explanation of why the parent-child collective suicide occurred. The second theme examines how the participant experienced complicated bereavement after his heart-breaking loss of both wife and son. We discuss the importance of support from other people or grief experts for the survivors of suicide who lose family to collective suicide.

  8. Students' Successes and Challenges Applying Data Analysis and Measurement Skills in a Fifth-Grade Integrated STEM Unit

    ERIC Educational Resources Information Center

    Glancy, Aran W.; Moore, Tamara J.; Guzey, Selcen; Smith, Karl A.

    2017-01-01

    An understanding of statistics and skills in data analysis are becoming more and more essential, yet research consistently shows that students struggle with these concepts at all levels. This case study documents some of the struggles four groups of fifth-grade students encounter as they collect, organize, and interpret data and then ultimately…

  9. Concept for facilitating analyst-mediated interpretation of qualitative chromatographic-mass spectral data: an alternative to manual examination of extracted ion chromatograms.

    PubMed

    Borges, Chad R

    2007-07-01

    A chemometrics-based data analysis concept has been developed as a substitute for manual inspection of extracted ion chromatograms (XICs), which facilitates rapid, analyst-mediated interpretation of GC- and LC/MS(n) data sets from samples undergoing qualitative batchwise screening for prespecified sets of analytes. Automatic preparation of data into two-dimensional row space-derived scatter plots (row space plots) eliminates the need to manually interpret hundreds to thousands of XICs per batch of samples while keeping all interpretation of raw data directly in the hands of the analyst, saving great quantities of human time without loss of integrity in the data analysis process. For a given analyte, two analyte-specific variables are automatically collected by a computer algorithm and placed into a data matrix (i.e., placed into row space): the first variable is the ion abundance corresponding to scan number x and analyte-specific m/z value y, and the second variable is the ion abundance corresponding to scan number x and analyte-specific m/z value z (a second ion). These two variables serve as the two axes of the aforementioned row space plots. In order to collect appropriate scan number (retention time) information, it is necessary to analyze, as part of every batch, a sample containing a mixture of all analytes to be tested. When pure standard materials of tested analytes are unavailable, but representative ion m/z values are known and retention time can be approximated, data are evaluated based on two-dimensional scores plots from principal component analysis of small time range(s) of mass spectral data. The time-saving efficiency of this concept is directly proportional to the percentage of negative samples and to the total number of samples processed simultaneously.
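    The row-space idea in this abstract can be sketched in a few lines: for each sample, the abundances of two analyte-specific ions at the analyte's retention scan become the coordinates of a 2-D scatter plot, so negatives cluster near the origin and presumptive positives stand apart. The helper names, the array layout, and the simple threshold rule below are illustrative assumptions, not the published algorithm.

    ```python
    import numpy as np

    def row_space_coordinates(spectra, scan_x, mz_y, mz_z):
        """Build row-space coordinates for each sample.

        spectra: dict mapping sample name -> 2-D array of ion abundance,
        indexed [scan, mz].  Returns {name: (abundance at (x, y),
        abundance at (x, z))} -- the two axes of the row space plot.
        """
        return {name: (data[scan_x, mz_y], data[scan_x, mz_z])
                for name, data in spectra.items()}

    def flag_candidates(coords, threshold):
        # A sample is a presumptive positive only if BOTH characteristic
        # ions exceed the noise threshold; the analyst then reviews the
        # flagged points rather than every XIC.
        return [name for name, (a, b) in coords.items()
                if a > threshold and b > threshold]
    ```

    With, say, three negative samples of low-level noise and one spiked sample, only the spiked sample plots away from the origin on both axes and gets flagged for analyst review.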

  10. Segmental hair analysis for differentiation of tilidine intake from external contamination using LC-ESI-MS/MS and MALDI-MS/MS imaging.

    PubMed

    Poetzsch, Michael; Baumgartner, Markus R; Steuer, Andrea E; Kraemer, Thomas

    2015-02-01

    Segmental hair analysis has been used for monitoring changes of consumption habit of drugs. Contamination from the environment or sweat might cause interpretative problems. For this reason, hair analysis results were compared in hair samples taken 24 h and 30 days after a single tilidine dose. The 24-h hair samples already showed high concentrations of tilidine and nortilidine. Analysis of wash water from sample preparation confirmed external contamination by sweat as reason. The 30-day hair samples were still positive for tilidine in all segments. Negative wash-water analysis proved incorporation from sweat into the hair matrix. Interpretation of a forensic case was requested where two children had been administered tilidine by their nanny and tilidine/nortilidine had been detected in all hair segments, possibly indicating multiple applications. Taking into consideration the results of the present study and of MALDI-MS imaging, a single application as cause for analytical results could no longer be excluded. Interpretation of consumption behaviour of tilidine based on segmental hair analysis has to be done with caution, even after typical wash procedures during sample preparation. External sweat contamination followed by incorporation into the hair matrix can mimic chronic intake. For assessment of external contamination, hair samples should not only be collected several weeks but also one to a few days after intake. MALDI-MS imaging of single hair can be a complementary tool for interpretation. Limitations for interpretation of segmental hair analysis shown here might also be applicable to drugs with comparable physicochemical and pharmacokinetic properties. Copyright © 2014 John Wiley & Sons, Ltd.

  11. Identification of the isomers using principal component analysis (PCA) method

    NASA Astrophysics Data System (ADS)

    Kepceoǧlu, Abdullah; Gündoǧdu, Yasemin; Ledingham, Kenneth William David; Kilic, Hamdi Sukur

    2016-03-01

    In this work, we have carried out a detailed statistical analysis of experimental mass spectra from xylene isomers. Principal Component Analysis (PCA) was used to identify the isomers, which cannot be distinguished using conventional statistical methods for interpretation of their mass spectra. Experiments were carried out using a linear TOF-MS coupled to a femtosecond laser system as an energy source for the ionisation processes. The collected data were analysed and interpreted using PCA as a multivariate analysis of these spectra. This demonstrates the strength of the method for distinguishing isomers that cannot be identified from conventional mass analysis of the dissociative ionisation of these molecules. PCA results as a function of laser pulse energy and background pressure in the spectrometer are also presented in this work.
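    The core of the approach above, projecting mean-centered spectra onto principal components, can be sketched with NumPy's SVD. The toy "isomer" fragment patterns in the usage below are invented for demonstration and are not the xylene data from the study; the point is that two spectra with identical total intensity but intensity shifted between fragment channels separate cleanly on the first component.

    ```python
    import numpy as np

    def pca_scores(X, n_components=2):
        """Project mean-centered spectra onto their top principal components.

        X: array of shape (n_samples, n_channels), one mass spectrum per row.
        Returns the scores in the reduced (n_samples, n_components) space.
        """
        Xc = X - X.mean(axis=0)            # center each m/z channel
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T    # rotate into PC coordinates
    ```

    For two groups of noisy spectra sharing a common peak but differing in one fragment channel, the first-component scores of the two groups do not overlap, which is exactly the separation that summed-intensity comparisons miss.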

  12. Analyzing Data Generated Through Deliberative Dialogue: Bringing Knowledge Translation Into Qualitative Analysis.

    PubMed

    Plamondon, Katrina M; Bottorff, Joan L; Cole, Donald C

    2015-11-01

    Deliberative dialogue (DD) is a knowledge translation strategy that can serve to generate rich data and bridge health research with action. An intriguing alternative to other modes of generating data, the purposeful and evidence-informed conversations characteristic of DD generate data inclusive of collective interpretations. These data are thus dialogic, presenting complex challenges for qualitative analysis. In this article, we discuss the nature of data generated through DD, orienting ourselves toward a theoretically grounded approach to analysis. We offer an integrated framework for analysis, balancing analytical strategies of categorizing and connecting with the use of empathetic and suspicious interpretive lenses. In this framework, data generation and analysis occur in concert, alongside engaging participants and synthesizing evidence. An example of application is provided, demonstrating nuances of the framework. We conclude with reflections on the strengths and limitations of the framework, suggesting how it may be relevant in other qualitative health approaches. © The Author(s) 2015.

  13. The effectiveness of cartographic visualisations in landscape archaeology

    NASA Astrophysics Data System (ADS)

    Fairbairn, David

    2018-05-01

    The use of maps and other geovisualisation methods has been longstanding in archaeology. Archaeologists employ advanced contemporary tools in their data collection, analysis and presentation. Maps can be used to render the 'big data' commonly collected by archaeological prospection techniques, but are also fundamental output instruments for the dissemination of archaeological interpretation and modelling. This paper addresses, through case studies, alternate methods of geovisualisation in archaeology and identifies the efficiencies of each.

  14. Statistical techniques applied to aerial radiometric surveys (STAARS): principal components analysis user's manual. [NURE program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.

    1981-01-01

    A Principal Components Analysis (PCA) program has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.

  15. Thellier GUI: An integrated tool for analyzing paleointensity data from Thellier-type experiments

    NASA Astrophysics Data System (ADS)

    Shaar, Ron; Tauxe, Lisa

    2013-03-01

    Thellier-type experiments are a method used to estimate the intensity of the ancient geomagnetic field from samples carrying thermoremanent magnetization. The analysis of Thellier-type experimental data is conventionally done by manually interpreting data from each specimen individually. The main limitations of this approach are: (1) manual interpretation is highly subjective and can be biased by misleading concepts, (2) the procedure is time consuming, and (3) unless the measurement data are published, the final results cannot be reproduced by readers. These issues compound when trying to combine paleointensity data from a collection of studies. Here, we address these problems by introducing the Thellier GUI: a comprehensive tool for interpreting Thellier-type experimental data. The tool presents a graphical user interface, which allows manual interpretation of the data, but also includes two new interpretation tools: (1) Thellier Auto Interpreter: an automatic interpretation procedure based on a given set of experimental requirements, and (2) Consistency Test: a self-test for the consistency of the results assuming groups of samples that should have the same paleointensity values. We apply the new tools to data from two case studies. These demonstrate that interpretation of non-ideal Arai plots is nonunique and different selection criteria can lead to significantly different conclusions. Hence, we recommend adopting the automatic interpretation approach, as it allows a more objective interpretation, which can be easily repeated or revised by others. When the analysis is combined with a Consistency Test, the credibility of the interpretations is enhanced. We also make the case that published paleointensity studies should include the measurement data (as supplementary files or as contributions to the MagIC database) so that results based on a particular data set can be reproduced and assessed by others.
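    A minimal sketch of the automatic-interpretation idea follows, assuming the usual Arai-plot convention (NRM remaining vs. pTRM gained) and two invented acceptance criteria, a minimum number of points and a minimum R² for the linear fit. The real Thellier Auto Interpreter applies a much fuller set of experimental requirements; this only illustrates how criteria-based selection replaces subjective segment picking.

    ```python
    import numpy as np

    def auto_interpret(ptrm, nrm, lab_field, min_points=4, min_r2=0.99):
        """Scan all contiguous segments of an Arai plot and return the
        paleointensity estimate (|slope| * lab field) from the best-fitting
        segment that satisfies the acceptance criteria, or None."""
        best = None  # (estimate, r2) of the best accepted segment so far
        n = len(ptrm)
        for i in range(n):
            for j in range(i + min_points, n + 1):
                x, y = ptrm[i:j], nrm[i:j]
                slope = np.polyfit(x, y, 1)[0]          # linear fit of the segment
                r2 = np.corrcoef(x, y)[0, 1] ** 2       # goodness of fit
                if r2 >= min_r2 and (best is None or r2 > best[1]):
                    best = (abs(slope) * lab_field, r2)
        return None if best is None else best[0]
    ```

    Because every acceptable segment is scored the same way, rerunning the procedure with the same criteria reproduces the same interpretation, which is the repeatability argument the abstract makes for the automatic approach.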

  16. 40 CFR 228.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... agencies, public data archives, and social and economic studies and records of affected areas. (d) The term... social and economic studies and records of areas which would be affected by use of the proposed site. (e...) The term disposal site evaluation study means the collection, analysis, and interpretation of all...

  17. 40 CFR 228.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... agencies, public data archives, and social and economic studies and records of affected areas. (d) The term... social and economic studies and records of areas which would be affected by use of the proposed site. (e...) The term disposal site evaluation study means the collection, analysis, and interpretation of all...

  18. 40 CFR 228.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... agencies, public data archives, and social and economic studies and records of affected areas. (d) The term... social and economic studies and records of areas which would be affected by use of the proposed site. (e...) The term disposal site evaluation study means the collection, analysis, and interpretation of all...

  19. 40 CFR 228.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... agencies, public data archives, and social and economic studies and records of affected areas. (d) The term... social and economic studies and records of areas which would be affected by use of the proposed site. (e...) The term disposal site evaluation study means the collection, analysis, and interpretation of all...

  20. 40 CFR 228.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... agencies, public data archives, and social and economic studies and records of affected areas. (d) The term... social and economic studies and records of areas which would be affected by use of the proposed site. (e...) The term disposal site evaluation study means the collection, analysis, and interpretation of all...

  1. Literary Aesthetics in the Narration of Dagara Folktales

    ERIC Educational Resources Information Center

    Kyiileyang, Martin

    2016-01-01

    Dagara folktales, like other African folktales, are embedded with various literary aesthetic features related to structure, language and performance. This paper examines major literary aesthetics found in Dagara folktales. The methodology used is based on the collection, analysis and interpretation of selected Dagara folktales gathered through…

  2. COMPARISONS AND CONTRASTS AMONG DIFFERENT SCALED ASSESSMENTS

    EPA Science Inventory

    A comparison of a regional (multi-state) and local (multi-county) scale assessment was done to evaluate similarities and differences in the collection, analysis, and interpretation of landscape data. The study areas included EPA Region 3 and a sub-region spanning North and Sout...

  3. Making Decisions with Data: Are We Environmentally Friendly?

    ERIC Educational Resources Information Center

    English, Lyn; Watson, Jane

    2016-01-01

    Statistical literacy is a vital component of numeracy. Students need to learn to critically evaluate and interpret statistical information if they are to become informed citizens. This article examines a Year 5 unit of work that uses the data collection and analysis cycle within a sustainability context.

  4. Using American sign language interpreters to facilitate research among deaf adults: lessons learned.

    PubMed

    Sheppard, Kate

    2011-04-01

    Health care providers commonly discuss depressive symptoms with clients, enabling earlier intervention. Such discussions rarely occur between providers and Deaf clients. Most culturally Deaf adults experience early-onset hearing loss, self-identify as part of a unique culture, and communicate in the visual language of American Sign Language (ASL). Communication barriers abound, and depression screening instruments may be unreliable. To train and use ASL interpreters for a qualitative study describing depressive symptoms among Deaf adults. Training included research versus community interpreting. During data collection, interpreters translated to and from voiced English and ASL. Training eliminated potential problems during data collection. Unexpected issues included participants asking for "my interpreter" and worrying about confidentiality or friendship in a small community. Lessons learned included the value of careful training of interpreters prior to initiating data collection, including resolution of possible role conflicts and ensuring conceptual equivalence in real-time interpreting.

  5. GeoDash: Assisting Visual Image Interpretation in Collect Earth Online by Leveraging Big Data on Google Earth Engine

    NASA Technical Reports Server (NTRS)

    Markert, Kel; Ashmall, William; Johnson, Gary; Saah, David; Mollicone, Danilo; Diaz, Alfonso Sanchez-Paus; Anderson, Eric; Flores, Africa; Griffin, Robert

    2017-01-01

    Collect Earth Online (CEO) is a free and open online implementation of the FAO Collect Earth system for collaboratively collecting environmental data through the visual interpretation of Earth observation imagery. The primary collection mechanism in CEO is human interpretation of land surface characteristics in imagery served via Web Map Services (WMS). However, interpreters may not have enough contextual information to classify samples by only viewing the imagery served via WMS, be they high resolution or otherwise. To assist in the interpretation and collection processes in CEO, SERVIR, a joint NASA-USAID initiative that brings Earth observations to improve environmental decision making in developing countries, developed the GeoDash system, an embedded and critical component of CEO. GeoDash leverages Google Earth Engine (GEE) by allowing users to set up custom browser-based widgets that pull from GEE's massive public data catalog. These widgets can be quick looks of other satellite imagery, time series graphs of environmental variables, and statistics panels of the same. Users can customize widgets with any of GEE's image collections, such as the historical Landsat collection with data available since the 1970s, select date ranges, image stretch parameters, graph characteristics, and create custom layouts, all on-the-fly to support plot interpretation in CEO. This presentation focuses on the implementation and potential applications, including the back-end links to GEE and the user interface with custom widget building. GeoDash takes large data volumes and condenses them into meaningful, relevant information for interpreters. While designed initially with national and global forest resource assessments in mind, the system will complement disaster assessments, agriculture management, project monitoring and evaluation, and more.

  6. GeoDash: Assisting Visual Image Interpretation in Collect Earth Online by Leveraging Big Data on Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Markert, K. N.; Ashmall, W.; Johnson, G.; Saah, D. S.; Anderson, E.; Flores Cordova, A. I.; Díaz, A. S. P.; Mollicone, D.; Griffin, R.

    2017-12-01

    Collect Earth Online (CEO) is a free and open online implementation of the FAO Collect Earth system for collaboratively collecting environmental data through the visual interpretation of Earth observation imagery. The primary collection mechanism in CEO is human interpretation of land surface characteristics in imagery served via Web Map Services (WMS). However, interpreters may not have enough contextual information to classify samples by only viewing the imagery served via WMS, be they high resolution or otherwise. To assist in the interpretation and collection processes in CEO, SERVIR, a joint NASA-USAID initiative that brings Earth observations to improve environmental decision making in developing countries, developed the GeoDash system, an embedded and critical component of CEO. GeoDash leverages Google Earth Engine (GEE) by allowing users to set up custom browser-based widgets that pull from GEE's massive public data catalog. These widgets can be quick looks of other satellite imagery, time series graphs of environmental variables, and statistics panels of the same. Users can customize widgets with any of GEE's image collections, such as the historical Landsat collection with data available since the 1970s, select date ranges, image stretch parameters, graph characteristics, and create custom layouts, all on-the-fly to support plot interpretation in CEO. This presentation focuses on the implementation and potential applications, including the back-end links to GEE and the user interface with custom widget building. GeoDash takes large data volumes and condenses them into meaningful, relevant information for interpreters. While designed initially with national and global forest resource assessments in mind, the system will complement disaster assessments, agriculture management, project monitoring and evaluation, and more.

  7. Improving Deliverability in Gas Storage Fields by Identifying the Timing and Sources of Damage Using Smart Well Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.H. Frantz Jr; K.G. Brown; W.K. Sawyer

    2006-03-01

    This report summarizes the work performed under contract DE-FC26-03NT41743. The primary objective of this study was to develop tools that would allow Underground Gas Storage (UGS) operators to use wellhead electronic flow measurement (EFM) data to quickly and efficiently identify trends in well damage over time, thus aiding in the identification of potential causes of the damage. Secondary objectives of this work included: (1) To assist UGS operators in the evaluation of hardware and software requirements for implementing an EFM system similar to the one described in this report, and (2) To provide a cost-benefit analysis framework UGS operators can use to evaluate economic benefits of installing wellhead EFM systems in their particular fields. Assessment of EFM data available for use, and selection of the specific study field are reviewed. The various EFM data processing tasks, including data collection, organization, extraction, processing, and interpretation are discussed. The process of damage assessment via pressure transient analysis of EFM data is outlined and demonstrated, including such tasks as quality control, semi-log analysis, and log-log analysis of pressure transient test data extracted from routinely collected EFM data. Output from pressure transient test analyses for 21 wells is presented, and the interpretation of these analyses to determine the timing of damage development is demonstrated using output from specific study wells. Development of processing and interpretation modules to handle EFM data interpretation in horizontal wells is also presented and discussed. A spreadsheet application developed to aid underground gas storage operators in the selection of EFM equipment is presented, discussed, and used to determine the cost benefit of installing EFM equipment in a gas storage field. Recommendations for future work related to EFM in gas storage fields are presented and discussed.

  8. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorensek, M.; Hamm, L.; Garcia, H.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  9. Interpretation of analytical toxicology results in life and at postmortem.

    PubMed

    Flanagan, Robert J; Connally, Geraldine

    2005-01-01

    Interpretation of analytical toxicology results from live patients is sometimes difficult. Possible factors may be related to: (i) the nature of the poison(s) present; (ii) sample collection, transport and storage; (iii) the analytical methodology used; (iv) the circumstances of exposure; (v) mechanical factors such as trauma or inhalation of stomach contents; and (vi) pharmacological factors such as tolerance or synergy. In some circumstances, detection of a drug or other poison may suffice to prove exposure. At the other extreme, the interpretation of individual measurements may be simplified by regulation. Examples here include whole blood alcohol (ethanol) in regard to driving a motor vehicle and blood lead assays performed to assess occupational exposure. With pharmaceuticals, the plasma or serum concentrations of drugs and metabolites attained during treatment often provide a basis for the interpretation of quantitative measurements. With illicit drugs, comparative information from casework may be all that is available. Postmortem toxicology is an especially complex area since changes in the composition of fluids such as blood depending on the site of collection from the body and the time elapsed since death, amongst other factors, may influence the result obtained. This review presents information to assist in the interpretation of analytical results, especially regarding postmortem toxicology. Collection and analysis of not only peripheral blood, but also other fluids/tissues is usually important in postmortem work. Alcohol, for example, can be either lost from, or produced in, blood especially if there has been significant trauma, hence measurements in urine or vitreous humour are needed to confirm the reliability of a blood result. Measurement of metabolites may also be valuable in individual cases.

  10. Arabic-speaking migrants' experiences of the use of interpreters in healthcare: a qualitative explorative study.

    PubMed

    Hadziabdic, Emina; Hjelm, Katarina

    2014-06-16

    Arabic-speaking migrants have constituted a growing population in recent years. This entails major challenges to ensure good communication in the healthcare encounter in order to provide individual and holistic healthcare. One of the solutions to ensure good communication between patient and healthcare staff who do not share the same language is to use a professional interpreter. To our knowledge, no previous qualitative studies have been found concerning Arabic-speaking migrants and the use of interpreters. This study aims to ascertain their individual experiences which can help extend our understanding of the studied area. A purposive sample of 13 Arabic-speaking persons with experience of using interpreters in healthcare encounters. Data were collected between November 2012 and March 2013 by four focus-group interviews and analysed with qualitative analysis according to a method described for focus groups. Four categories appeared from the analysis: 1) The professional interpreter as spokesperson; 2) Different types of interpreters and modes of interpretation adapting to the healthcare encounter; 3) The professional interpreter's task and personal properties affected the use of professional interpreters in a healthcare encounter; 4) Future planning of the use of professional interpreters in a healthcare encounter. The main findings were that the use of interpreters was experienced both as a possibility and as a problem. The preferred type of interpreters depended on the interpreter's dialect and ability to interpret correctly. Besides the professional interpreter's qualities of good skill in language and medical terminology, translation ability, neutrality and objectivity, Arabic-speaking participants stated that professional interpreters need to share the same origin, religion, dialect, gender and political views as the patient in order to facilitate the interpreter use and avoid inappropriate treatment. 
The study showed that the personal qualities of a good interpreter not only cover language ability but also origin, religion, dialect, gender and political views. Thus, there is need to develop strategies for personalized healthcare in order to avoid inappropriate communication, to satisfy the preferences of the person in need of interpreters and improve the impact of interpretation on the quality of healthcare.

  11. Disrupting the Pipeline: Critical Analyses of Student Pathways through Postsecondary STEM Education

    ERIC Educational Resources Information Center

    Metcalf, Heather E.

    2014-01-01

    Critical mixed methods approaches allow us to reflect upon the ways in which we collect, measure, interpret, and analyze data, providing novel alternatives for quantitative analysis. For institutional researchers, whose work influences institutional policies, programs, and practices, the approach has the transformative ability to expose and create…

  12. COLLABORATIVE, MULTI-TIME PERIOD LIDAR COLLECTION AND ANALYSIS FOR RESIDENTIAL DEVELOPMENT IMPACT ASSESSMENT AND MONITORING

    EPA Science Inventory

    The U.S. EPA Environmental Photographic Interpretation Center (EPIC) in Reston, Virginia is currently conducting collaborative landscape/stream ecology research in the Clarksburg Special Protection Area (CSPA) in Montgomery County, Maryland. The CSPA is an ar...

  13. Attitudes to Educational Issues: Development of an Instrument.

    ERIC Educational Resources Information Center

    Lemke, Jay L.; And Others

    To obtain a test which could be used for the collection, analysis, and interpretation of data on teachers' attitudes toward contemporary educational issues, the Attitudes to Educational Issues instrument (AEI) was developed. Statements were written in five-choice Likert format to express attitudes toward these six educational issues: (1)…

  14. How a national vegetation classification can help ecological research and management

    Treesearch

    Scott Franklin; Patrick Comer; Julie Evens; Exequiel Ezcurra; Don Faber-Langendoen; Janet Franklin; Michael Jennings; Carmen Josse; Chris Lea; Orie Loucks; Esteban Muldavin; Robert Peet; Serguei Ponomarenko; David Roberts; Ayzik Solomeshch; Todd Keeler-Wolf; James Van Kley; Alan Weakley; Alexa McKerrow; Marianne Burke; Carol Spurrier

    2015-01-01

    The elegance of classification lies in its ability to compile and systematize various terminological conventions and masses of information that are unattainable during typical research projects. Imagine a discipline without standards for collection, analysis, and interpretation; unfortunately, that describes much of 20th-century vegetation ecology.

  15. Field guide for collecting samples for analysis of volatile organic compounds in stream water for the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Shelton, Larry R.

    1997-01-01

    For many years, stream samples for analysis of volatile organic compounds have been collected without specific guidelines or a sampler designed to avoid analyte loss. In 1996, the U.S. Geological Survey's National Water-Quality Assessment Program began aggressively monitoring urban stream-water for volatile organic compounds. To assure representative samples and consistency in collection procedures, a specific sampler was designed to collect samples for analysis of volatile organic compounds in stream water. This sampler, and the collection procedures, were tested in the laboratory and in the field for compound loss, contamination, sample reproducibility, and functional capabilities. This report describes that sampler and its use, and outlines field procedures specifically designed to provide contaminant-free, reproducible volatile organic compound data from stream-water samples. These guidelines and the equipment described represent a significant change in U.S. Geological Survey instructions for collecting and processing stream-water samples for analysis of volatile organic compounds. They are intended to produce data that are both defensible and interpretable, particularly for concentrations below the microgram-per-liter level. The guidelines also contain detailed recommendations for quality-control samples.

  16. Enhancing graphical literacy skills in the high school science classroom via authentic, intensive data collection and graphical representation exposure

    NASA Astrophysics Data System (ADS)

    Palmeri, Anthony

    This research project was developed to provide extensive practice and exposure to data collection and data representation in a high school science classroom. The student population engaged in this study included 40 high school sophomores enrolled in two microbiology classes. Laboratory investigations and activities were deliberately designed to include quantitative data collection that necessitated organization and graphical representation. These activities were embedded into the curriculum and conducted in conjunction with the normal and expected course content, rather than as a separate entity. It was expected that routine practice with graph construction and interpretation would result in improved competency when graphing data and proficiency in analyzing graphs. To objectively test the effectiveness in achieving this goal, a pre-test and post-test that included graph construction, interpretation, interpolation, extrapolation, and analysis was administered. Based on the results of a paired T-Test, graphical literacy was significantly enhanced by extensive practice and exposure to data representation.
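The pre/post comparison described above rests on a paired t-test, which treats each student as their own control. A minimal sketch of that calculation, using invented scores rather than the study's data:

```python
import math
from statistics import mean, stdev

# Hypothetical pre- and post-intervention graphing scores for the same students
pre  = [55, 62, 48, 70, 66, 59, 73, 51]
post = [68, 71, 60, 78, 75, 70, 80, 63]

# Paired t-test: test whether the mean within-student difference is zero
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
t = mean(diffs) / (stdev(diffs) / math.sqrt(n))  # df = n - 1
print(f"t = {t:.2f} on {n - 1} degrees of freedom")
```

A large t on n-1 degrees of freedom (compared against the t-distribution) would indicate the post-test gain is unlikely to be chance, which is the form of evidence the study reports.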

  17. Interpreting comprehensive two-dimensional gas chromatography using peak topography maps with application to petroleum forensics.

    PubMed

    Ghasemi Damavandi, Hamidreza; Sen Gupta, Ananya; Nelson, Robert K; Reddy, Christopher M

    2016-01-01

    Comprehensive two-dimensional gas chromatography [Formula: see text] provides high-resolution separations across hundreds of compounds in a complex mixture, thus unlocking unprecedented information for intricate quantitative interpretation. We exploit this compound diversity across the [Formula: see text] topography to provide quantitative compound-cognizant interpretation beyond target compound analysis with petroleum forensics as a practical application. We focus on the [Formula: see text] topography of biomarker hydrocarbons, hopanes and steranes, as they are generally recalcitrant to weathering. We introduce peak topography maps (PTM) and topography partitioning techniques that consider a notably broader and more diverse range of target and non-target biomarker compounds compared to traditional approaches that consider approximately 20 biomarker ratios. Specifically, we consider a range of 33-154 target and non-target biomarkers with highest-to-lowest peak ratio within an injection ranging from 4.86 to 19.6 (precise numbers depend on biomarker diversity of individual injections). We also provide a robust quantitative measure for directly determining "match" between samples, without necessitating training data sets. We validate our methods across 34 [Formula: see text] injections from a diverse portfolio of petroleum sources, and provide quantitative comparison of performance against established statistical methods such as principal components analysis (PCA). Our data set includes a wide range of samples collected following the 2010 Deepwater Horizon disaster that released approximately 160 million gallons of crude oil from the Macondo well (MW). Samples that were clearly collected following this disaster exhibit statistically significant match [Formula: see text] using PTM-based interpretation against other closely related sources. 
PTM-based interpretation also provides higher differentiation between closely correlated but distinct sources than obtained using PCA-based statistical comparisons. In addition to results based on this experimental field data, we also provide extensive perturbation analysis of the PTM method over numerical simulations that introduce random variability of peak locations over the [Formula: see text] biomarker ROI image of the MW pre-spill sample (sample [Formula: see text] in Additional file 4: Table S1). We compare the robustness of the cross-PTM score against peak-location variability in both dimensions and benchmark the results against PCA over the same set of simulated images. A detailed description of the simulation experiment and discussion of results are provided in Additional file 1: Section S8. We provide a peak-cognizant informational framework for quantitative interpretation of [Formula: see text] topography. The proposed topographic analysis enables [Formula: see text] forensic interpretation across target petroleum biomarkers, while including the nuances of lesser-known non-target biomarkers clustered around the target peaks. This allows potential discovery of hitherto unknown connections between target and non-target biomarkers.
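The idea of a direct pairwise "match" measure that needs no training data can be illustrated with a cosine similarity over biomarker peak intensities. This is only a hedged stand-in for intuition: the actual cross-PTM score is defined over 2D peak topography, and the biomarker names and values below are invented.

```python
import math

def match_score(peaks_a, peaks_b):
    """Cosine similarity between two samples' biomarker peak intensities.

    An illustrative stand-in for a pairwise match measure; the PTM score
    itself operates on 2D peak topography maps, not flat vectors.
    """
    keys = set(peaks_a) | set(peaks_b)  # union of biomarker identities
    a = [peaks_a.get(k, 0.0) for k in keys]
    b = [peaks_b.get(k, 0.0) for k in keys]
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Invented peak intensities for a field sample and a candidate source
spill  = {"hopane_C29": 1.0, "hopane_C30": 0.8, "sterane_C27": 0.3}
source = {"hopane_C29": 0.9, "hopane_C30": 0.85, "sterane_C27": 0.25}
print(f"match = {match_score(spill, source):.3f}")  # close to 1 for related samples
```

Because hopanes and steranes resist weathering, their relative intensities stay comparatively stable between a spill sample and its source, which is why such a peak-based comparison is informative at all.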

  18. Solving the challenges of data preprocessing, uploading, archiving, retrieval, analysis and visualization for large heterogeneous paleo- and rock magnetic datasets

    NASA Astrophysics Data System (ADS)

    Minnett, R.; Koppers, A. A.; Tauxe, L.; Constable, C.; Jarboe, N. A.

    2011-12-01

    The Magnetics Information Consortium (MagIC) provides an archive for the wealth of rock- and paleomagnetic data and interpretations from studies on natural and synthetic samples. As with many fields, most peer-reviewed paleo- and rock magnetic publications only include high-level results. However, access to the raw data from which these results were derived is critical for compilation studies and when updating results based on new interpretation and analysis methods. MagIC provides a detailed metadata model with places for everything from raw measurements to their interpretations. Prior to MagIC, these raw data were extremely cumbersome to collect because they mostly existed in a lab's proprietary format on investigators' personal computers or undigitized in field notebooks. MagIC has developed a suite of offline and online tools to enable the paleomagnetic, rock magnetic, and affiliated scientific communities to easily contribute both their previously published data and data supporting an article undergoing peer review, to retrieve well-annotated published interpretations and raw data, and to analyze and visualize large collections of published data online. Here we present the technology we chose (including VBA in Excel spreadsheets, Python libraries, FastCGI JSON webservices, Oracle procedures, and jQuery user interfaces) and how we implemented it in order to serve the scientific community as seamlessly as possible. These tools are now in use in labs worldwide, have helped archive many valuable legacy studies and datasets, and routinely enable new contributions to the MagIC Database (http://earthref.org/MAGIC/).

  19. [The future of forensic DNA analysis for criminal justice].

    PubMed

    Laurent, François-Xavier; Vibrac, Geoffrey; Rubio, Aurélien; Thévenot, Marie-Thérèse; Pène, Laurent

    2017-11-01

    In the criminal framework, the analysis of approximately 20 DNA microsatellites enables the establishment of a genetic profile with a high statistical power of discrimination. This technique makes it possible to establish or exclude a match between a biological trace detected at a crime scene and a suspect whose DNA was collected via an oral swab. However, conventional techniques tend to complicate the interpretation of complex DNA samples, such as degraded DNA and DNA mixtures. The aim of this review is to highlight the power of new forensic DNA methods (including high-throughput sequencing and single-cell sequencing) to facilitate the expert's interpretation, in full compliance with existing French legislation. © 2017 médecine/sciences – Inserm.

  20. Development of data processing, interpretation and analysis system for the remote sensing of trace atmospheric gas species

    NASA Technical Reports Server (NTRS)

    Casas, Joseph C.; Saylor, Mary S.; Kindle, Earl C.

    1987-01-01

    The major emphasis is on the advancement of remote sensing technology. In particular, the gas filter correlation radiometer (GFCR) technique was applied to the measurement of trace gas species, such as carbon monoxide (CO), from airborne and Earth orbiting platforms. Through a series of low altitude aircraft flights, high altitude aircraft flights, and orbiting space platform flights, data were collected and analyzed, culminating in the first global map of carbon monoxide concentration in the middle troposphere and stratosphere. The four major areas of this remote sensing program, known as the Measurement of Air Pollution from Satellites (MAPS) experiment, are: (1) data acquisition, (2) data processing, analysis, and interpretation algorithms, (3) data display techniques, and (4) information processing.

  1. Determination of Nutrient Intakes by a Modified Visual Estimation Method and Computerized Nutritional Analysis for Dietary Assessments

    DTIC Science & Technology

    1987-09-01

    a useful average for population studies, does not delay data processing, and is relatively inexpensive. Using MVEM and observing recipe preparation procedures improve the...extensive review of the procedures and problems in design, collection, analysis, processing and interpretation of dietary survey data for individuals

  2. Assessment of water quality parameters using multivariate analysis for Klang River basin, Malaysia.

    PubMed

    Mohamed, Ibrahim; Othman, Faridah; Ibrahim, Adriana I N; Alaa-Eldin, M E; Yunus, Rossita M

    2015-01-01

    This case study uses several univariate and multivariate statistical techniques to evaluate and interpret a water quality data set obtained from the Klang River basin located within the state of Selangor and the Federal Territory of Kuala Lumpur, Malaysia. The river drains an area of 1,288 km², from the steep mountain rainforests of the main Central Range along Peninsular Malaysia to the river mouth in Port Klang, into the Straits of Malacca. Water quality was monitored at 20 stations, nine of which are situated along the main river and 11 along six tributaries. Data were collected from 1997 to 2007 for seven parameters used to evaluate the status of the water quality, namely dissolved oxygen, biochemical oxygen demand, chemical oxygen demand, suspended solids, ammoniacal nitrogen, pH, and temperature. The data were first investigated using descriptive statistical tools, followed by two practical multivariate analyses that reduced the data dimensions for better interpretation. The analyses employed were factor analysis and principal component analysis, which explained 60% and 81.6% of the total variation in the data, respectively. We found that the resulting latent variables from the factor analysis are interpretable and beneficial for describing the water quality in the Klang River. This study presents the usefulness of several statistical methods in evaluating and interpreting water quality data for the purpose of monitoring the effectiveness of water resource management. The results should provide more straightforward data interpretation as well as valuable insight for managers to conceive optimum action plans for controlling pollution in river water.
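The dimension-reduction step can be sketched with ordinary PCA via the eigenvalues of the parameter correlation matrix. The data below are randomly generated stand-ins with one induced correlation, not the Klang River measurements:

```python
import numpy as np

# Toy stand-in for a monitoring matrix: rows = samples, cols = parameters
# (DO, BOD, COD, SS, NH3-N, pH, temperature); values are invented.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 7))
X[:, 1] = X[:, 0] * 0.9 + rng.normal(scale=0.3, size=50)  # make BOD track DO here

# PCA on the correlation matrix: eigenvalues give variance per component
corr = np.corrcoef(X, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]  # sort descending
explained = eigvals / eigvals.sum()
print("variance explained by first two components:", round(float(explained[:2].sum()), 3))
```

The fraction of variance captured by the leading components is the quantity the abstract reports (60% and 81.6% for its two analyses); correlated parameters collapse onto shared components, which is what makes the reduced dimensions interpretable.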

  3. Real-time monitoring of CO2 storage sites: Application to Illinois Basin-Decatur Project

    USGS Publications Warehouse

    Picard, G.; Berard, T.; Chabora, E.; Marsteller, S.; Greenberg, S.; Finley, R.J.; Rinck, U.; Greenaway, R.; Champagnon, C.; Davard, J.

    2011-01-01

    Optimization of carbon dioxide (CO2) storage operations for efficiency and safety requires use of monitoring techniques and implementation of control protocols. The monitoring techniques consist of permanent sensors and tools deployed for measurement campaigns. Large amounts of data are thus generated. These data must be managed and integrated for interpretation at different time scales. A fast interpretation loop involves combining continuous measurements from permanent sensors as they are collected to enable a rapid response to detected events; a slower loop requires combining large datasets gathered over longer operational periods from all techniques. The purpose of this paper is twofold. First, it presents an analysis of the monitoring objectives to be performed in the slow and fast interpretation loops. Second, it describes the implementation of the fast interpretation loop with a real-time monitoring system at the Illinois Basin-Decatur Project (IBDP) in Illinois, USA. ?? 2011 Published by Elsevier Ltd.
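The fast interpretation loop amounts to acting on continuous measurements as they arrive. A minimal sketch of that pattern; the sensor quantity, units, and alarm threshold are invented for illustration, not IBDP values:

```python
PRESSURE_ALARM = 15.0  # MPa; illustrative threshold, not a real site parameter

def fast_loop(readings, threshold=PRESSURE_ALARM):
    """Scan a stream of (timestamp, value) pairs and flag exceedances."""
    events = []
    for ts, value in readings:
        if value > threshold:
            events.append((ts, value))  # in practice: trigger the response protocol
    return events

stream = [(0, 12.1), (1, 12.4), (2, 15.6), (3, 12.2)]
print(fast_loop(stream))  # -> [(2, 15.6)]
```

The slow loop, by contrast, would integrate the accumulated history of all monitoring techniques offline; the two loops differ in latency and in how much data they combine, not in kind.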

  4. Chemkin-II: A Fortran chemical kinetics package for the analysis of gas-phase chemical kinetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kee, R.J.; Rupley, F.M.; Miller, J.A.

    1989-09-01

    This document is the user's manual for the second-generation Chemkin package. Chemkin is a software package whose purpose is to facilitate the formation, solution, and interpretation of problems involving elementary gas-phase chemical kinetics. It provides an especially flexible and powerful tool for incorporating complex chemical kinetics into simulations of fluid dynamics. The package consists of two major software components: an Interpreter and a Gas-Phase Subroutine Library. The Interpreter is a program that reads a symbolic description of an elementary, user-specified chemical reaction mechanism. One output from the Interpreter is a data file that forms a link to the Gas-Phase Subroutine Library. This library is a collection of about 100 highly modular Fortran subroutines that may be called to return information on the equation of state, thermodynamic properties, and chemical production rates.
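Chemkin mechanism files specify each reaction's rate coefficient in modified Arrhenius form, and the production rates returned by the subroutine library are built from these. A minimal sketch of that rate expression; the coefficients below are invented, not taken from any real mechanism file:

```python
import math

R = 1.987  # gas constant in cal/(mol*K), matching activation energies in cal/mol

def arrhenius(A, b, Ea, T):
    """Modified Arrhenius rate constant k = A * T**b * exp(-Ea / (R*T)),
    the three-parameter form used for each reaction in a mechanism file."""
    return A * T**b * math.exp(-Ea / (R * T))

# Illustrative coefficients only
k_1000 = arrhenius(A=1.0e13, b=0.0, Ea=40000.0, T=1000.0)
print(f"k(1000 K) = {k_1000:.3e}")
```

An interpreter in the Chemkin sense parses the symbolic mechanism into tables of such coefficients, so that the library can evaluate every reaction's rate at a given temperature and composition without re-reading the mechanism text.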

  5. High-resolution sclerochronological analysis of the bivalve mollusk Saxidomus gigantea from Alaska and British Columbia: techniques for revealing environmental archives and archaeological seasonality

    USGS Publications Warehouse

    Hallman, Nadine; Burchell, Meghan; Schone, Bernd R.; Irvine, Gail V.; Maxwell, David

    2009-01-01

    The butter clam, Saxidomus gigantea, is one of the most commonly recovered bivalves from archaeological shell middens on the Pacific Coast of North America. This study presents the results of the sclerochronology of modern specimens of S. gigantea, collected monthly from Pender Island (British Columbia), and additional modern specimens from the Dundas Islands (BC) and Mink and Little Takli Islands (Alaska). The methods presented can be used as a template to interpret local environmental conditions and increase the precision of seasonality estimates in shellfish using sclerochronology and oxygen isotope analysis. This method can also identify, with a high degree of accuracy, the date of shell collection to the nearest fortnightly cycle, the time of day the shell was collected and the approximate tidal elevation (i.e., approx. water depth and distance from the shoreline) from which the shell was collected. Life-history traits of S. gigantea were analyzed to understand the timing of growth line formation, the duration of the growing season, the growth rate, and the reliability of annual increments. We also examine the influence of the tidal regime and freshwater mixing in estuarine locations and how these variables can affect both incremental structures and oxygen isotope values. The results of the sclerochronological analysis show that there is a latitudinal trend in shell growth that needs to be considered when using shells for seasonality studies. Oxygen isotope analysis reveals clear annual cycles with the most positive values corresponding to the annual winter growth lines, and the most negative values corresponding to high temperatures during the summer. Intra-annual increment widths demonstrate clear seasonal oscillations with broadest increments in summer and very narrow increments or no growth during the winter months. This study provides new insights into the biology, geochemistry and seasonal growth of S. 
gigantea, which are crucial for paleoclimate reconstructions and interpreting seasonality patterns of past human collection.

  6. Arabic-speaking migrants’ experiences of the use of interpreters in healthcare: a qualitative explorative study

    PubMed Central

    2014-01-01

    Introduction Arabic-speaking migrants have constituted a growing population in recent years. This entails major challenges to ensure good communication in the healthcare encounter in order to provide individual and holistic healthcare. One of the solutions to ensure good communication between patient and healthcare staff who do not share the same language is to use a professional interpreter. To our knowledge, no previous qualitative studies have been found concerning Arabic-speaking migrants and the use of interpreters. This study aims to ascertain their individual experiences, which can help extend our understanding of the studied area. Method A purposive sample of 13 Arabic-speaking persons with experience of using interpreters in healthcare encounters was recruited. Data were collected between November 2012 and March 2013 by four focus-group interviews and analysed with qualitative analysis according to a method described for focus groups. Results Four categories appeared from the analysis: 1) The professional interpreter as spokesperson; 2) Different types of interpreters and modes of interpretation adapting to the healthcare encounter; 3) The professional interpreter’s task and personal properties affected the use of professional interpreters in a healthcare encounter; 4) Future planning of the use of professional interpreters in a healthcare encounter. The main findings were that the use of interpreters was experienced both as a possibility and as a problem. The preferred type of interpreters depended on the interpreter’s dialect and ability to interpret correctly. Besides the professional interpreter’s qualities of good skill in language and medical terminology, translation ability, neutrality and objectivity, Arabic-speaking participants stated that professional interpreters need to share the same origin, religion, dialect, gender and political views as the patient in order to facilitate interpreter use and avoid inappropriate treatment. 
Conclusion The study showed that the personal qualities of a good interpreter not only cover language ability but also origin, religion, dialect, gender and political views. Thus, there is need to develop strategies for personalized healthcare in order to avoid inappropriate communication, to satisfy the preferences of the person in need of interpreters and improve the impact of interpretation on the quality of healthcare. PMID:24934755

  7. Radiology Workflow Dynamics: How Workflow Patterns Impact Radiologist Perceptions of Workplace Satisfaction.

    PubMed

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum; Field, Aaron; Wiegmann, Douglas; Yu, John-Paul J

    2017-04-01

    The study aimed to assess perceptions of reading room workflow and the impact separating image-interpretive and nonimage-interpretive task workflows can have on radiologist perceptions of workplace disruptions, workload, and overall satisfaction. A 14-question survey instrument was developed to measure radiologist perceptions of workplace interruptions, satisfaction, and workload prior to and following implementation of separate image-interpretive and nonimage-interpretive reading room workflows. The results were collected over 2 weeks preceding the intervention and 2 weeks following the end of the intervention. The results were anonymized and analyzed using univariate analysis. A total of 18 people responded to the preintervention survey: 6 neuroradiology fellows and 12 attending neuroradiologists. Fifteen people who were then present for the 1-month intervention period responded to the postintervention survey. Perceptions of workplace disruptions, image interpretation, quality of trainee education, ability to perform nonimage-interpretive tasks, and quality of consultations (P < 0.0001) all improved following the intervention. Mental effort and workload also improved across all assessment domains, as did satisfaction with quality of image interpretation and consultative work. Implementation of parallel dedicated image-interpretive and nonimage-interpretive workflows may improve markers of radiologist perceptions of workplace satisfaction. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  8. Collection of X-ray diffraction data from macromolecular crystals

    PubMed Central

    Dauter, Zbigniew

    2017-01-01

    Diffraction data acquisition is the final experimental stage of the crystal structure analysis. All subsequent steps involve mainly computer calculations. Optimally measured and accurate data make the structure solution and refinement easier and lead to more faithful interpretation of the final models. Here, the important factors in data collection from macromolecular crystals are discussed and strategies appropriate for various applications, such as molecular replacement, anomalous phasing, atomic-resolution refinement etc., are presented. Criteria useful for judging the diffraction data quality are also discussed. PMID:28573573

  9. Characterization of rock populations on planetary surfaces - Techniques and a preliminary analysis of Mars and Venus

    NASA Technical Reports Server (NTRS)

    Garvin, J. B.; Mouginis-Mark, P. J.; Head, J. W.

    1981-01-01

    A data collection and analysis scheme developed for the interpretation of rock morphology from lander images is reviewed with emphasis on rock population characterization techniques. Data analysis techniques are also discussed in the context of identifying key characteristics of a rock that place it in a single category with similar rocks. Actual rock characteristics observed from Viking and Venera lander imagery are summarized. Finally, some speculations regarding the block fields on Mars and Venus are presented.

  10. Discussing Laddering Application by the Means-End Chain Theory

    ERIC Educational Resources Information Center

    Veludo-de-Oliveira, Tania Modesto; Ikeda, Ana Akemi; Campomar, Marcos Cortez

    2006-01-01

    This article aims at analyzing laddering as a technique of qualitative research, emphasizing the procedures for data collection, analysis and interpretation, and its main limitations as well. "Laddering refers to an in-depth, one-on-one interviewing technique used to develop an understanding of how consumers translate the attributes of products…

  11. The Religious and Spiritual Experiences of Undergraduate Gay Males Attending a Religiously Affiliated Institution of Higher Education

    ERIC Educational Resources Information Center

    Adams, Melvin D., III

    2013-01-01

    This doctoral thesis studied the religious and spiritual experiences of undergraduate gay males at a Protestant affiliated higher education institution and how undergraduate gay males made sense of their personal journeys. Data were collected from four participants and analyzed using interpretative phenomenological analysis. Five themes emerged…

  12. Lithuanian Students' Choice of University: A Consumer Value Approach

    ERIC Educational Resources Information Center

    Bartkute, Darija

    2017-01-01

    Increasing competition within the Lithuanian educational market has paved the way for an analysis of the complex choice processes enrollees undergo in selecting a higher education institution. This research examines the concept of consumer value and its interpretation in the Lithuanian higher education setting. Based on data collected from 445…

  13. Hui Students' Identity Construction in Eastern China: A Postcolonial Critique

    ERIC Educational Resources Information Center

    Wang, Yuxiang; Phillion, JoAnn

    2011-01-01

    In this article, we explored Hui students' lived experiences in school in eastern China and the impact of their experiences on their identity construction. We used postcolonial theory as a theoretical framework and narrative inquiry as a research methodology to guide questions that we asked, data collection, data analysis, and interpretation and…

  14. Confirmatory Factor Analytic Structure and Measurement Invariance of Quantitative Autistic Traits Measured by the Social Responsiveness Scale-2

    ERIC Educational Resources Information Center

    Frazier, Thomas W.; Ratliff, Kristin R.; Gruber, Chris; Zhang, Yi; Law, Paul A.; Constantino, John N.

    2014-01-01

    Understanding the factor structure of autistic symptomatology is critical to the discovery and interpretation of causal mechanisms in autism spectrum disorder. We applied confirmatory factor analysis and assessment of measurement invariance to a large ("N" = 9635) accumulated collection of reports on quantitative autistic traits using…

  15. The U.S. EPA ToxCast Program: Moving from Data Generation to Application (SOT Tox21 update symposium presentation)

    EPA Science Inventory

    The U.S. EPA ToxCast program is entering its tenth year. Significant learning and progress have occurred towards collection, analysis, and interpretation of the data. The library of ~1,800 chemicals has been subject to ongoing characterization (e.g., identity, purity, stability...

  16. Dads, Data and Discourse: Theory, Analysis and Interpretation in Parenting Research.

    ERIC Educational Resources Information Center

    Holland, Annette

    This paper discusses the use of theoretical premises in the design and implementation of a study of men's perceptions of fatherhood. Forty Australian fathers participated in small discussion groups over a 7-week period regarding contemporary fatherhood. Data were collected using questionnaires, including the Perception of Parental Role Scales, the…

  17. Revealing the structural nature of the Cd isotopes

    NASA Astrophysics Data System (ADS)

    Garrett, P. E.; Diaz Varela, A.; Green, K. L.; Jamieson, D. S.; Jigmeddorj, B.; Wood, J. L.; Yates, S. W.

    2015-10-01

    The even-even Cd isotopes have provided fertile ground for the investigation of collectivity in nuclei. Soon after the development of the Bohr model, the stable Cd isotopes were identified as nearly harmonic vibrators based on their excitation energy patterns. The measurements of enhanced B(E2) values appeared to support this interpretation. Shape-coexisting rotational-like intruder bands were discovered, and mixing between the configurations was invoked to explain the deviation of the decay pattern of multiphonon vibrational states. Very recently, a detailed analysis of the low-lying levels of 110Cd, combining results of the (n,n'γ) reaction and high-statistics β decay, provided strong evidence that the mixing between configurations is weak, except for the ground-state band and "Kπ = 0+" intruder band. The analysis of the levels in 110Cd has now been extended to 3 MeV and, combined with data for 112Cd and previous Coulomb excitation data for 114Cd, enables a detailed map of the E2 collectivity in these nuclei, demanding a complete re-interpretation of the structure of the stable Cd isotopes.

  18. Quantifying and predicting interpretational uncertainty in cross-sections

    NASA Astrophysics Data System (ADS)

    Randle, Charles; Bond, Clare; Monaghan, Alison; Lark, Murray

    2015-04-01

Cross-sections are often constructed from data to create a visual impression of the geologist's interpretation of the sub-surface geology. However, as with all interpretations, this vision of the sub-surface geology is uncertain. We have designed and carried out an experiment with the aim of quantifying the uncertainty in geological cross-sections created by experts interpreting borehole data. By analysing different attributes of the data and interpretations we reflect on the main controls on uncertainty. A group of ten expert modellers at the British Geological Survey were asked to interpret an 11.4 km long cross-section from south-east Glasgow, UK. The data provided consisted of map and borehole data of the superficial deposits and shallow bedrock. Each modeller had a unique set of 11 boreholes removed from their dataset, to which their interpretations of the top of the bedrock were compared. By comparing the interpreted and actual bedrock elevations in these withheld boreholes, this methodology allowed quantification of how far from the 'correct answer' each interpretation is at 11 points along each interpreted cross-section line. This resulted in the collection of 110 measurements of the error for use in further analysis. To determine the potential controls on uncertainty, various attributes relating to the modeller, the interpretation and the data were recorded. Modellers were asked to fill out a questionnaire recording information such as how much 3D modelling experience they had and how long it took them to complete the interpretation. They were also asked to record their confidence in their interpretations graphically, in the form of a confidence level drawn onto the cross-section. Initial analysis showed that the majority of the experts' interpreted bedrock elevations fell within 5 metres of those recorded in the withheld boreholes. Their distribution is peaked and symmetrical about a mean of zero, indicating that there was no tendency for the experts to either under- or overestimate the elevation of the bedrock. More complex analysis took the form of linear mixed-effects modelling, used to determine whether there were any correlations between the error and any other parameter recorded in the questionnaire, the section or the initial dataset. This has resulted in the determination of both data-based and interpreter-based controls on uncertainty, adding insight into how uncertainty can be predicted as well as how interpretation workflows can be improved. Our results will inform further experiments across a wide variety of geological situations to build understanding and best-practice workflows for cross-section interpretation to reduce uncertainty.
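    The core comparison in this experiment — interpreted minus actual bedrock elevation at each withheld borehole — can be sketched in a few lines. This is a minimal illustration with invented elevations, not the study's data:

```python
# Illustrative only: invented bedrock elevations (metres) at 11 withheld boreholes.
interpreted = [12.1, 8.4, 15.0, 9.7, 11.2, 14.6, 7.9, 10.3, 13.8, 9.1, 12.5]
actual      = [11.0, 9.0, 14.2, 10.5, 11.0, 15.1, 8.6, 9.8, 14.0, 8.7, 13.0]

# Error = interpreted minus actual; a symmetric, zero-centred distribution
# indicates no systematic over- or under-estimation of bedrock elevation.
errors = [i - a for i, a in zip(interpreted, actual)]
mean_error = sum(errors) / len(errors)
within_5m = sum(abs(e) <= 5 for e in errors) / len(errors)

print(f"mean error: {mean_error:+.2f} m, within 5 m: {within_5m:.0%}")
```

Pooling such errors across all ten modellers gives the 110 measurements analysed in the study.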

  19. Characterization of Escherichia coli isolates from different fecal sources by means of classification tree analysis of fatty acid methyl ester (FAME) profiles.

    PubMed

    Seurinck, Sylvie; Deschepper, Ellen; Deboch, Bishaw; Verstraete, Willy; Siciliano, Steven

    2006-03-01

Microbial source tracking (MST) methods need to be rapid, inexpensive and accurate. Unfortunately, many MST methods provide a wealth of information that is difficult to interpret by the regulators who use this information to make decisions. This paper describes the use of classification tree analysis to interpret the results of an MST method based on fatty acid methyl ester (FAME) profiles of Escherichia coli isolates, and to present results in a format readily interpretable by water quality managers. E. coli isolates from raw sewage and from animal sources (cow, dog, gull, and horse) were obtained and their FAME profiles collected. Correct classification rates determined with leave-one-out cross-validation yielded a low overall correct classification rate of 61%. A higher overall correct classification rate of 85% was obtained when the animal isolates were pooled together and compared to the raw sewage isolates. Bootstrap aggregation, or adaptive resampling and combining, of the FAME profile data increased correct classification rates substantially. Other MST methods may be better suited to differentiating between different fecal sources, but classification tree analysis has enabled us to distinguish raw sewage from animal E. coli isolates, which previously had not been possible with other multivariate methods such as principal component analysis and cluster analysis.
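    The leave-one-out scoring used above can be sketched as follows. This is a minimal illustration with invented profile vectors and a simple nearest-neighbour classifier standing in for the paper's classification tree:

```python
# Toy stand-in for FAME profiles: each isolate is a short vector of fatty acid
# fractions labelled by source. Values and labels are invented for illustration.
profiles = [
    ([0.30, 0.50, 0.20], "sewage"), ([0.28, 0.52, 0.20], "sewage"),
    ([0.32, 0.48, 0.20], "sewage"), ([0.60, 0.25, 0.15], "animal"),
    ([0.58, 0.27, 0.15], "animal"), ([0.62, 0.23, 0.15], "animal"),
]

def nearest_neighbour_label(sample, training):
    """1-nearest-neighbour classifier (a simple stand-in for a tree)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training, key=lambda t: dist(sample, t[0]))[1]

# Leave-one-out cross-validation: hold out each isolate in turn,
# classify it from the rest, and tally correct classifications.
correct = 0
for i, (sample, label) in enumerate(profiles):
    training = profiles[:i] + profiles[i + 1:]
    if nearest_neighbour_label(sample, training) == label:
        correct += 1
rate = correct / len(profiles)
print(f"leave-one-out correct classification rate: {rate:.0%}")
```

With real FAME data the held-out classification would be done by a freshly grown tree (or a bagged ensemble of trees) rather than a nearest-neighbour rule.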

  20. Collecting and Interpreting Qualitative Materials. Third Edition

    ERIC Educational Resources Information Center

    Denzin, Norman K., Ed.; Lincoln, Yvonna, Ed.

    2007-01-01

    This book is the third volume of the paperback versions of "The SAGE Handbook of Qualitative Research, Third Edition." This portion of the handbook considers the tasks of collecting, analyzing, and interpreting empirical materials, and comprises the Handbook's Parts IV ("Methods of Collecting and Analyzing Empirical Materials") and V ("The Art and…

  1. Being-in-the-Chemotherapy-Suite versus Being-in-the-Oncology-Ward: An Analytical View of Two Hospital Sites Occupied by People Experiencing Cancer †

    PubMed Central

    Hughes, Catherine; van Heugten, Kate; Keeling, Sally; Szekely, Francisc

    2017-01-01

How do people with cancer occupy places within the health system during their journey through palliative care? The answer to this question was explored by the authors as part of a wider ethnographic study of eight people's journeys from referral to palliative care services to the end of life. This article reports on findings that have emerged from ongoing analysis completed in the years following data collection. An ethnographic research design was used to collect data about the participants and their family members over a three-year period. Data were collected using participant observation and semi-structured interviews. Over 380 transcripts based on field note entries and taped interviews were produced during the 1121 h of contact with participants and family members that made up the research period. Analysis of these texts identified two focal sites within Christchurch Hospital that were occupied by the participants: the Chemotherapy Suite and the Oncology Ward. Drawing on previous anthropological analyses, the researchers sought to understand how places affect people and how people affect places, using a model outlined by the American ethnographer Miles Richardson to analyse two distinct sites within one hospital. As explained in Richardson's article, whose title is echoed in the title of this article, a sense of place becomes apparent when comparing and contrasting two sites within the same location. Richardson's article is highly interpretative and relies not only on pre-existing theoretical frameworks but also on personal interpretation; the same approach has been used in the current article. Here, ethnographic methods require the researcher's interpretation of how participants occupied these sites. Following this approach, the Chemotherapy Suite is presented as a place where medicine dominates illness, and appears as distinct from the Oncology Ward, where disease predominates and death is secreted away. PMID:28587264

  2. Proxies and Other External Raters: Methodological Considerations

    PubMed Central

    Snow, A Lynn; Cook, Karon F; Lin, Pay-Shin; Morgan, Robert O; Magaziner, Jay

    2005-01-01

Objective The purpose of this paper is to introduce researchers to the measurement and subsequent analysis considerations involved when using externally rated data. We will define and describe two categories of externally rated data, recommend methodological approaches for analyzing and interpreting data in these two categories, and explore factors affecting agreement between self-rated and externally rated reports. We conclude with a discussion of needs for future research. Data Sources/Study Setting Data sources for this paper are previously published studies and reviews comparing self-rated with externally rated data. Study Design/Data Collection/Extraction Methods This is a psychometric conceptual paper. Principal Findings We define two types of externally rated data: proxy data and other-rated data. Proxy data refer to those collected from someone who speaks for a patient who cannot, will not, or is unavailable to speak for him or herself, whereas we use the term other-rated data to refer to situations in which the researcher collects ratings from a person other than the patient to gain multiple perspectives on the assessed construct. These two types of data differ in the way the measurement model is defined, the definition of the gold standard against which the measurements are validated, the analysis strategies appropriately used, and how the analyses are interpreted. There are many factors affecting the discrepancies between self- and external ratings, including characteristics of the patient, the proxy, and the rated construct. Several psychological theories can be helpful in predicting such discrepancies. Conclusions Externally rated data have an important place in health services research, but use of such data requires careful consideration of the nature of the data and how they will be analyzed and interpreted. PMID:16179002
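    Agreement between self-rated and externally rated reports is commonly quantified with percent agreement and Cohen's kappa, which corrects for chance agreement. A minimal sketch with invented binary ratings:

```python
# Illustrative agreement between self-ratings and external (proxy) ratings on a
# binary item (e.g. "has difficulty walking": 1 = yes, 0 = no).
# Ratings are invented for illustration.
self_rated  = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
proxy_rated = [1, 0, 1, 0, 0, 0, 1, 1, 1, 0]

n = len(self_rated)
observed = sum(s == p for s, p in zip(self_rated, proxy_rated)) / n

# Chance agreement from the marginal rates of each rater.
p_self = sum(self_rated) / n
p_proxy = sum(proxy_rated) / n
expected = p_self * p_proxy + (1 - p_self) * (1 - p_proxy)

# Cohen's kappa: observed agreement corrected for chance.
kappa = (observed - expected) / (1 - expected)
print(f"observed agreement: {observed:.2f}, Cohen's kappa: {kappa:.2f}")
```

Whether high agreement is even the goal depends on the measurement model: for proxy data it is, while for other-rated data the two perspectives may legitimately differ.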

  3. Toward Establishing the Validity of the Resource Interpreter's Self-Efficacy Instrument

    NASA Astrophysics Data System (ADS)

    Smith, Grant D.

    Interpretive rangers serve as one of the major educational resources that visitors may encounter during their visit to a park or other natural area, yet our understanding of their professional growth remains limited. This study helps address this issue by developing an instrument that evaluates the beliefs of resource interpreters regarding their capabilities of communicating with the public. The resulting 11-item instrument was built around the construct of Albert Bandura's self-efficacy theory (Bandura, 1977, 1986, 1997), used guidelines and principles developed over the course of 30 years of teacher efficacy studies (Bandura, 2006; Gibson & Dembo, 1984; Riggs & Enochs, 1990; Tschannen-Moran & Hoy, 2001; Tschannen-Moran, Hoy, & Hoy, 1998), and probed areas of challenge that are unique to the demands of resource interpretation (Brochu & Merriman, 2002; Ham, 1992; Knudson, Cable, & Beck, 2003; Larsen, 2003; Tilden, 1977). A voluntary convenience sample of 364 National Park Service rangers was collected in order to conduct the statistical analyses needed to winnow the draft instrument down from 47 items in its original form to 11 items in its final state. Statistical analyses used in this process included item-total correlation, index of discrimination, exploratory factor analysis, and confirmatory factor analysis.
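    One of the item-winnowing statistics mentioned, item-total correlation, can be sketched as follows (in its corrected form, correlating each item with the total of the remaining items; the respondent scores below are invented):

```python
# Illustrative corrected item-total correlation for one survey item:
# correlate an item's scores with the total of the remaining items.
# Respondent scores are invented for illustration (e.g. a 5-point scale).
import math

items = [  # rows = respondents, columns = items
    [4, 5, 4, 3], [2, 2, 3, 2], [5, 4, 5, 4],
    [3, 3, 3, 3], [1, 2, 1, 2], [4, 4, 5, 5],
]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

item0 = [row[0] for row in items]
rest_total = [sum(row[1:]) for row in items]
r = pearson(item0, rest_total)  # a high r suggests the item fits the scale
print(f"corrected item-total correlation for item 0: {r:.2f}")
```

Items with low corrected item-total correlations are candidates for removal, which is one way a 47-item draft gets winnowed toward a shorter final instrument.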

  4. An Integrated Service Platform for Remote Sensing Image 3D Interpretation and Draughting based on HTML5

    NASA Astrophysics Data System (ADS)

    LIU, Yiping; XU, Qing; ZHANG, Heng; LV, Liang; LU, Wanjie; WANG, Dandi

    2016-11-01

    The purpose of this paper is to solve the problems of traditional single-purpose systems for interpretation and draughting, such as inconsistent standards, limited functionality, dependence on plug-ins, closed architecture and a low level of integration. On the basis of a comprehensive analysis of target element composition, map representation and the features of similar systems, a 3D interpretation and draughting integrated service platform for multi-source, multi-scale and multi-resolution geospatial objects is established based on HTML5 and WebGL. The platform not only integrates object recognition, access, retrieval, three-dimensional display and test evaluation, but also supports the collection, transfer, storage, refreshing and maintenance of geospatial object data, showing clear prospects and potential for growth.

  5. Seeking support: An interpretative phenomenological analysis of an Internet message board for people with Complex Regional Pain Syndrome.

    PubMed

    Rodham, Karen; McCabe, Candy; Blake, David

    2009-07-01

    In this article, we report on the findings of a qualitative inquiry into how an online message board for people who have Complex Regional Pain Syndrome (CRPS) was used by its members. All messages (and responses) posted on the CRPS message board over a 4-month period were collected retrospectively. The data were analysed using the method of Interpretative Phenomenological Analysis. Members used the message board to seek (and provide) support to those with CRPS, and also to express their emotions, feelings and experiences linked to their condition. The message board provided an important source of support for a patient group that can otherwise become isolated as a result of their mobility problems. Furthermore, the analysis revealed the unrealistic hopes that patients can hold concerning the anticipated outcomes of their treatment. This is an important issue for healthcare professionals to explicitly address when interacting with the patient group.

  6. Forensic trace DNA: a review

    PubMed Central

    2010-01-01

    DNA analysis is frequently used to acquire information from biological material to aid enquiries associated with criminal offences, disaster victim identification and missing persons investigations. As the relevance and value of DNA profiling to forensic investigations has increased, so too has the desire to generate this information from smaller amounts of DNA. Trace DNA samples may be defined as any sample which falls below recommended thresholds at any stage of the analysis, from sample detection through to profile interpretation, and cannot be defined by a precise picogram amount. Here we review aspects associated with the collection, DNA extraction, amplification, profiling and interpretation of trace DNA samples. Contamination and transfer issues are also briefly discussed within the context of trace DNA analysis. Whilst several methodological changes have facilitated profiling from trace samples in recent years, it is also clear that many opportunities exist for further improvements. PMID:21122102

  7. Graphic Strategies for Analyzing and Interpreting Curricular Mapping Data

    PubMed Central

    Leonard, Sean T.

    2010-01-01

    Objective To describe curricular mapping strategies used in analyzing and interpreting curricular mapping data and present findings on how these strategies were used to facilitate curricular development. Design Nova Southeastern University's doctor of pharmacy curriculum was mapped to the college's educational outcomes. The mapping process included development of educational outcomes followed by analysis of course material and semi-structured interviews with course faculty members. Data collected per course outcome included learning opportunities and assessment measures used. Assessment Nearly 1,000 variables and 10,000 discrete rows of curricular data were collected. Graphic representations of curricular data were created using bar charts and stacked area graphs relating the learning opportunities to the educational outcomes. Graphs were used in the curricular evaluation and development processes to facilitate the identification of curricular holes, sequencing misalignments, learning opportunities, and assessment measures. Conclusion Mapping strategies that use graphic representations of curricular data serve as effective diagnostic and curricular development tools. PMID:20798804

  8. Graphic strategies for analyzing and interpreting curricular mapping data.

    PubMed

    Armayor, Graciela M; Leonard, Sean T

    2010-06-15

    To describe curricular mapping strategies used in analyzing and interpreting curricular mapping data and present findings on how these strategies were used to facilitate curricular development. Nova Southeastern University's doctor of pharmacy curriculum was mapped to the college's educational outcomes. The mapping process included development of educational outcomes followed by analysis of course material and semi-structured interviews with course faculty members. Data collected per course outcome included learning opportunities and assessment measures used. Nearly 1,000 variables and 10,000 discrete rows of curricular data were collected. Graphic representations of curricular data were created using bar charts and stacked area graphs relating the learning opportunities to the educational outcomes. Graphs were used in the curricular evaluation and development processes to facilitate the identification of curricular holes, sequencing misalignments, learning opportunities, and assessment measures. Mapping strategies that use graphic representations of curricular data serve as effective diagnostic and curricular development tools.

  9. Performance analysis of automated evaluation of Crithidia luciliae-based indirect immunofluorescence tests in a routine setting - strengths and weaknesses.

    PubMed

    Hormann, Wymke; Hahn, Melanie; Gerlach, Stefan; Hochstrate, Nicola; Affeldt, Kai; Giesen, Joyce; Fechner, Kai; Damoiseaux, Jan G M C

    2017-11-27

    Antibodies directed against dsDNA are a highly specific diagnostic marker for the presence of systemic lupus erythematosus and of particular importance in its diagnosis. To assess anti-dsDNA antibodies, the Crithidia luciliae-based indirect immunofluorescence test (CLIFT) is considered one of the best assay choices. To overcome the drawback of subjective result interpretation that is inherent to indirect immunofluorescence assays in general, automated systems have been introduced into the market in recent years. Among these systems is the EUROPattern Suite, an advanced automated fluorescence microscope equipped with different software packages, capable of automated pattern interpretation and result suggestion for ANA, ANCA and CLIFT analysis. We analyzed the performance of the EUROPattern Suite with its automated fluorescence interpretation for CLIFT in a routine setting, reflecting the everyday life of a diagnostic laboratory. Three hundred and twelve consecutive samples, sent to the Central Diagnostic Laboratory of the Maastricht University Medical Centre with a request for anti-dsDNA analysis, were collected over a period of 7 months. Agreement between EUROPattern assay analysis and the visual read was 93.3%. Sensitivity and specificity were 94.1% and 93.2%, respectively. The EUROPattern Suite performed reliably and greatly supported result interpretation. Automated image acquisition is readily performed, and automated image classification gives the operator a reliable recommendation for assay evaluation. The EUROPattern Suite optimizes workflow and contributes to standardization between different operators or laboratories.
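    The reported performance figures are standard confusion-matrix quantities. A minimal sketch with invented counts (not the study's data) shows how agreement, sensitivity and specificity are derived from automated-versus-visual-read tallies:

```python
# Illustrative sketch: agreement, sensitivity and specificity of an automated
# read against the visual read taken as reference. Counts are invented.
tp, fn = 95, 5    # reference-positive samples: automated positive / negative
tn, fp = 190, 10  # reference-negative samples: automated negative / positive

total = tp + fn + tn + fp
agreement = (tp + tn) / total   # fraction of concordant reads
sensitivity = tp / (tp + fn)    # reference positives correctly flagged
specificity = tn / (tn + fp)    # reference negatives correctly passed

print(f"agreement {agreement:.1%}, sensitivity {sensitivity:.1%}, "
      f"specificity {specificity:.1%}")
```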

  10. SAM GC-MS Chromatography Performed at Mars: Elements of Interpretation

    NASA Astrophysics Data System (ADS)

    Szopa, C.; Coll, P. J.; Buch, A.; François, P.; Cabane, M.; Coscia, D.; Teinturier, S.; Navarro-Gonzalez, R.; Glavin, D. P.; Freissinet, C.; Mahaffy, P. R.

    2013-12-01

    The characterisation of the chemical and mineralogical composition of regolith samples collected with the Curiosity rover is a primary objective of the SAM experiment. These data should provide essential clues on the past habitability of Gale crater. Interpretation of the data collected after SAM pyrolysis evolved gas analysis (EGA) and gas chromatography mass spectrometry (GC-MS) experiments on the first soil samples collected by MSL at the Rocknest Aeolian Deposit in Gale Crater has been challenging, due to the concomitant presence in the ovens of an oxychlorine phase present in the samples and a derivatization agent coming from the SAM wet chemistry experiment (Glavin et al., 2013). Moreover, accurate identification and quantification, in the SAM EGA mode, of volatiles released from the heated sample, or generated by reactions occurring in the SAM pyrolysis oven, is also difficult for a few compounds, due to their evolution over similar temperature ranges and the overlap of their MS signatures. Hence the GC analyses, coupled with MS, enabled the separation, identification and quantification of most of the volatile compounds detected. These results could only be obtained through tests and calibration performed with individual GC spare components and with the SAM testbed. This paper will present an overview of the interpretation of the chromatograms obtained when analyzing the Rocknest and John Klein solid samples delivered to SAM, on sols 96 and 199 respectively, supported by laboratory calibrations.

  11. Introducing Blended Learning: An Experience of Uncertainty for Students in the United Arab Emirates

    ERIC Educational Resources Information Center

    Kemp, Linzi J.

    2013-01-01

    The cultural dimension of Uncertainty Avoidance is analysed in this study of an introduction to blended learning for international students. Content analysis was conducted on the survey narratives collected from three cohorts of management undergraduates in the United Arab Emirates. Interpretation of certainty with blended learning was found in:…

  12. The Perceived and Real Value of Health Information Exchange in Public Health Surveillance

    ERIC Educational Resources Information Center

    Dixon, Brian Edward

    2011-01-01

    Public health agencies protect the health and safety of populations. A key function of public health agencies is surveillance or the ongoing, systematic collection, analysis, interpretation, and dissemination of data about health-related events. Recent public health events, such as the H1N1 outbreak, have triggered increased funding for and…

  13. The Meaning of the Future for the Oldest Old

    ERIC Educational Resources Information Center

    Nilsson, Margareta; Sarvimaki, Anneli; Ekman, Sirkka-Liisa

    2003-01-01

    The aim of the study was to highlight the oldest old people's view of their future from a perspective of philosophy of life. Data was collected by means of life story interviews with 15 persons. The analysis was performed by utilizing a phenomenological hermeneutic method and the interpretation was guided by the conceptual framework of philosophy…

  14. Institutional Research: What Problems Are We Trying to Solve?

    ERIC Educational Resources Information Center

    Longden, Bernard; Yorke, Mantz

    2009-01-01

    Institutional research in UK higher education is rarely consolidated into a central office function. This is in marked contrast to the position of IR in the USA, where most universities accord it a high status that is absent from the UK context. The collection, analysis and interpretation of data in the USA appears, on the whole, more systematic…

  15. Making the Instructional Curriculum as an Interactive, Contextualized Process: Case Studies of Seven ESOL Teachers

    ERIC Educational Resources Information Center

    Wette, Rosemary

    2009-01-01

    This article reports on data from interpretive case studies of seven well-qualified, experienced teachers of adult ESOL, collected through weekly interviews and analysis of documents and materials produced over the duration of a whole course for each teacher. Teachers' knowledge and experience was apparent in their ability to conceptualize and…

  16. Using Mixed Methods to Study First-Year College Impact on Liberal Arts Learning Outcomes

    ERIC Educational Resources Information Center

    Seifert, Tricia A.; Goodman, Kathleen; King, Patricia M.; Baxter Magolda, Marcia B.

    2010-01-01

    This study details the collection, analysis, and interpretation of data from a national multi-institutional longitudinal mixed methods study of college impact and student development of liberal arts outcomes. The authors found three sets of practices in the quantitative data that corroborated with the themes that emerged from the qualitative data:…

  17. Sensitivity of landscape metrics to pixel size

    Treesearch

    J. D. Wickham; K. H. Riitters

    1995-01-01

    Analysis of diversity and evenness metrics using land cover data is becoming formalized in landscape ecology. Diversity and evenness metrics are dependent on the pixel size (scale) over which the data are collected. Aerial photography was interpreted for land cover and converted into four raster data sets with 4, 12, 28, and 80 m pixel sizes, representing pixel sizes...
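    The pixel-size dependence of such metrics can be illustrated with Shannon diversity and evenness: coarser pixels tend to absorb rare cover classes, changing both values. The pixel counts below are invented for illustration:

```python
# Illustrative Shannon diversity (H') and evenness (H'/ln S) from land-cover
# pixel counts. Counts are invented; rare classes vanish at the coarser scale.
import math

def shannon(counts):
    """Return (diversity H', evenness) for a list of class pixel counts."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    h = -sum(p * math.log(p) for p in props)
    evenness = h / math.log(len(props)) if len(props) > 1 else 0.0
    return h, evenness

fine   = [500, 300, 150, 40, 10]  # e.g. 4 m pixels: rare classes still resolved
coarse = [520, 310, 170]          # e.g. 80 m pixels: rare classes absorbed

h_fine, e_fine = shannon(fine)
h_coarse, e_coarse = shannon(coarse)
print(f"fine: H'={h_fine:.2f} E={e_fine:.2f}; "
      f"coarse: H'={h_coarse:.2f} E={e_coarse:.2f}")
```

Comparing the two scales shows why metric values are only comparable between maps collected at the same pixel size.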

  18. Analysis of Military Nursing Practice Study Data Collected in Iraq

    DTIC Science & Technology

    2011-10-24

    She wanted to have that hope that she wanted to walk again and to do all that stuff again. The interpreters were not telling her that she won’t… This too was similar to the findings of this study, where nurses spoke of common nursing care such as ostomy/wound care within the added impact of the

  19. Contemporary Attitudes and Practice Patterns of North American Urologists in Investigating Stone-Forming Patients-A Survey of Endourological Society Members.

    PubMed

    McGuire, Barry B; Matulewicz, Richard S; Zuccarino-Crowe, Rian; Nadler, Robert B; Perry, Kent T

    2016-04-01

    Recent evidence suggests a low rate of metabolic assessment in stone formers, even in those deemed high risk. We wished to assess the attitudes and practice patterns of metabolic work-up among North American members of the Endourological Society as part of the management of stone-forming patients. A 12-question online multiple-choice questionnaire (using Survey Monkey®) was distributed to all members of the Endourological Society through e-mail. Descriptive analyses were performed. A total of 124 North American members of the Endourological Society responded (90% endourologists, 65% fellowship trained). Ninety-seven percent perform metabolic assessments without referring to a consultant. Eighty-three percent use a commercial analysis company and 17% request serum or urine parameters individually. Ninety-seven percent believe that 24-48-hour urine collection is a better way of assessing patients for metabolic abnormalities than a "basic analysis." Many respondents (37%) would be more likely to perform metabolic assessment if results were easier to interpret, and 35% would like assistance or advice in the interpretation of results. At initial investigation of a first-time stone former, 87% of respondents use serum chemistry, 48% use 24-hour urine, 26% use 48-hour urine (two consecutive 24-hour urine collections), 54% send the stone for analysis, and 7% do not investigate. For recurrent stone formers, 69% use serum chemistry, 73% use 24/48-hour urine, and 23% send the stone for analysis. On routine follow-up, 36% check serum chemistry, 55% use 24-hour urine, 2% use 48-hour urine, and 29% do not metabolically evaluate. The majority agree that pharmacologic therapy plays a strong role in preventing recurrence (90%). After initiating pharmacologic therapy, 59% reassess using serum chemistry, and 84% and 7% use 24- and 48-hour urine collection, respectively. Physicians re-evaluate patients after 1 month (7%), 1-2 months (10%), 2-4 months (44%), 4-6 months (30%), or after 6-12 months (7%). This snapshot assessment of Endourological Society members' practices in the metabolic investigation of stone-forming patients demonstrates wide testing variations. Many physicians expressed interest in assistance or advice in the interpretation of metabolic assessment results.

  20. Analysis of Shuttle Orbiter Reliability and Maintainability Data for Conceptual Studies

    NASA Technical Reports Server (NTRS)

    Morris, W. D.; White, N. H.; Ebeling, C. E.

    1996-01-01

    In order to provide a basis for estimating the expected support required of new systems during their conceptual design phase, Langley Research Center has recently collected Shuttle Orbiter reliability and maintainability data from the various data base sources at Kennedy Space Center. This information was analyzed to provide benchmarks, trends, and distributions to aid in the analysis of new designs. This paper presents a summation of those results and an initial interpretation of the findings.

  1. Interpretive policy analysis: Marshallese COFA migrants and the Affordable Care Act.

    PubMed

    McElfish, Pearl Anna; Purvis, Rachel S; Maskarinec, Gregory G; Bing, Williamina Ioanna; Jacob, Christopher J; Ritok-Lakien, Mandy; Rubon-Chutaro, Jellesen; Lang, Sharlynn; Mamis, Sammie; Riklon, Sheldon

    2016-06-11

    Since the enactment of the Affordable Care Act (ACA), the rate of uninsured in the United States has declined significantly. However, not all legal residents have benefited equally. As part of a community-based participatory research (CBPR) partnership with the Marshallese community, an interpretive policy analysis research project was conducted to document Marshallese Compact of Free Association (COFA) migrants' understanding and experiences regarding the ACA and related health policies. This article is structured to allow the voice of Marshallese COFA migrants to explain, in their own words, their understanding and interpretation of the ACA and related policies affecting their health. Qualitative data were collected from 48 participants in five focus groups conducted at the local community center and in three individual interviews for those unable to attend the focus groups. Marshallese community co-investigators participated throughout the research and writing process to ensure that cultural context and nuances in meaning were accurately captured and presented. Community co-investigators assisted with the development of the semi-structured interview guide, facilitated focus groups, and participated in qualitative data analysis. Content analysis revealed six consistent themes across all focus groups and individual interviews: understanding, experiences, effect on health, relational/historical lenses, economic contribution, and pleas. Working with Marshallese community co-investigators, we selected quotations that most represented the participants' collective experiences. The Marshallese view the ACA and their lack of coverage as part of the broader relationship between the Republic of the Marshall Islands (RMI) and the United States. The Marshallese state that they have honored the COFA relationship, and they believe the United States is failing to meet its obligations of care and support outlined in the COFA. While the ACA and Medicaid Expansion have reduced the national uninsured rate, Marshallese COFA migrants have not benefited equally from this policy. The lack of healthcare coverage for Marshallese COFA migrants exacerbates the health disparities this underserved population faces. This article is an important contribution to researchers because it presents the Marshallese's interpretation of the policy, which will help inform policy makers who are working to improve Marshallese COFA migrant health.

  2. Accuracy assessment of vegetation community maps generated by aerial photography interpretation: perspective from the tropical savanna, Australia

    NASA Astrophysics Data System (ADS)

    Lewis, Donna L.; Phinn, Stuart

    2011-01-01

    Aerial photography interpretation is the most common mapping technique in the world. However, unlike an algorithm-based classification of satellite imagery, the accuracy of maps generated by aerial photography interpretation is rarely assessed. Vegetation communities covering an area of 530 km² on Bullo River Station, Northern Territory, Australia, were mapped using an interpretation of 1:50,000 color aerial photography. Manual stereoscopic line-work was delineated at 1:10,000 and thematic maps were generated at 1:25,000 and 1:100,000. Multivariate and intuitive analysis techniques were employed to identify 22 vegetation communities within the study area. The accuracy assessment was based on 50% of a field dataset collected over a 4 year period (2006 to 2009); the remaining 50% of sites were used for map attribution. The overall accuracy and Kappa coefficient for both thematic maps were 66.67% and 0.63, respectively, calculated from standard error matrices. Our findings highlight the need for appropriate scales of mapping and for accuracy assessment of vegetation community maps generated by aerial photography interpretation.
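    Overall accuracy and the Kappa coefficient are computed from a standard error (confusion) matrix. A minimal sketch with an invented 3-class matrix (not the study's data):

```python
# Illustrative overall accuracy and Kappa coefficient from an error (confusion)
# matrix; rows = mapped class, columns = reference (field) class. Invented data.
matrix = [
    [40,  6,  4],
    [ 5, 30,  5],
    [ 5,  4, 26],
]

n = sum(sum(row) for row in matrix)
diagonal = sum(matrix[i][i] for i in range(len(matrix)))
overall_accuracy = diagonal / n

# Chance agreement from the row and column marginals.
row_totals = [sum(row) for row in matrix]
col_totals = [sum(col) for col in zip(*matrix)]
expected = sum(r * c for r, c in zip(row_totals, col_totals)) / n ** 2

# Kappa: agreement beyond what the marginals would produce by chance.
kappa = (overall_accuracy - expected) / (1 - expected)
print(f"overall accuracy {overall_accuracy:.1%}, kappa {kappa:.2f}")
```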

  3. Rapid DNA analysis for automated processing and interpretation of low DNA content samples.

    PubMed

    Turingan, Rosemary S; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F

    2016-01-01

    Short tandem repeat (STR) analysis of casework samples with low DNA content include those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle, or brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or foe, criminal forensics to identify suspects and exonerate the innocent, and medical examiner and coroner offices to identify missing persons. Processing LDC samples requires experienced laboratory personnel, isolated workstations, and sophisticated equipment, requires transport time, and involves complex procedures. We present a rapid DNA analysis system designed specifically to generate STR profiles from LDC samples in field-forward settings by non-technical operators. By performing STR in the field, close to the site of collection, rapid DNA analysis has the potential to increase throughput and to provide actionable information in real time. A Low DNA Content BioChipSet (LDC BCS) was developed and manufactured by injection molding. It was designed to function in the fully integrated Accelerated Nuclear DNA Equipment (ANDE) instrument previously designed for analysis of buccal swab and other high DNA content samples (Investigative Genet. 4(1):1-15, 2013). The LDC BCS performs efficient DNA purification followed by microfluidic ultrafiltration of the purified DNA, maximizing the quantity of DNA available for subsequent amplification and electrophoretic separation and detection of amplified fragments. The system demonstrates accuracy, precision, resolution, signal strength, and peak height ratios appropriate for casework analysis. The LDC rapid DNA analysis system is effective for the generation of STR profiles from a wide range of sample types. 
The technology broadens the range of sample types that can be processed and minimizes the time between sample collection, sample processing and analysis, and generation of actionable intelligence. The fully integrated Expert System is capable of interpreting a wide range of sample types and input DNA quantities, allowing samples to be processed and interpreted without a technical operator.

  4. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    PubMed

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

    Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to the analysis of medical time series data: (1) classical statistical approaches, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three models. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach (1) is much more flexible in terms of modeling effort and (2) offers individualized risk assessment, which is more cumbersome for classical statistical approaches.
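The classical baseline in this comparison, the Kaplan-Meier product-limit estimator, is compact enough to sketch directly. The following is an illustrative pure-Python version, not the authors' implementation; `durations` and `observed` are hypothetical inputs (follow-up time per subject, and 1 for an observed event vs. 0 for censoring):

```python
from collections import Counter

def kaplan_meier(durations, observed):
    """Product-limit estimate of the survival function S(t).

    At each distinct event time t with d events among n subjects still
    at risk, the survival estimate is multiplied by (1 - d/n).
    """
    n_at_risk = len(durations)
    events = Counter(t for t, e in zip(durations, observed) if e)
    censored = Counter(t for t, e in zip(durations, observed) if not e)
    surv, curve = 1.0, []
    for t in sorted(set(durations)):
        d = events.get(t, 0)
        if d:
            surv *= 1 - d / n_at_risk
            curve.append((t, surv))
        # subjects with events or censoring at t leave the risk set
        n_at_risk -= d + censored.get(t, 0)
    return curve

# e.g. four subjects, one censored at t = 2:
# kaplan_meier([1, 2, 2, 3], [1, 0, 1, 1]) -> [(1, 0.75), (2, 0.5), (3, 0.0)]
```

A dynamic Bayesian network, by contrast, has no comparably small closed form; the flexibility the authors highlight comes precisely from its richer (and heavier) model structure.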

  5. Cultural competency assessment tool for hospitals: evaluating hospitals' adherence to the culturally and linguistically appropriate services standards.

    PubMed

    Weech-Maldonado, Robert; Dreachslin, Janice L; Brown, Julie; Pradhan, Rohit; Rubin, Kelly L; Schiller, Cameron; Hays, Ron D

    2012-01-01

    The U.S. national standards for culturally and linguistically appropriate services (CLAS) in health care provide guidelines on policies and practices aimed at developing culturally competent systems of care. The Cultural Competency Assessment Tool for Hospitals (CCATH) was developed as an organizational tool to assess adherence to the CLAS standards. First, we describe the development of the CCATH and estimate the reliability and validity of the CCATH measures. Second, we discuss the managerial implications of the CCATH as an organizational tool to assess cultural competency. We pilot tested an initial draft of the CCATH, revised it based on a focus group and cognitive interviews, and then administered it in a field test with a sample of California hospitals. The reliability and validity of the CCATH were evaluated using factor analysis, analysis of variance, and Cronbach's alphas. Exploratory and confirmatory factor analyses identified 12 CCATH composites: leadership and strategic planning, data collection on inpatient population, data collection on service area, performance management systems and quality improvement, human resources practices, diversity training, community representation, availability of interpreter services, interpreter services policies, quality of interpreter services, translation of written materials, and clinical cultural competency practices. All the CCATH scales had internal consistency reliability of .65 or above, and the reliability was .70 or above for 9 of the 12 scales. Analysis of variance results showed that not-for-profit hospitals have higher CCATH scores than for-profit hospitals in five CCATH scales and higher CCATH scores than government hospitals in two CCATH scales. The CCATH showed adequate psychometric properties. 
Managers and policy makers can use the CCATH as a tool to evaluate hospital performance in cultural competency and identify and target improvements in hospital policies and practices that undergird the provision of CLAS.
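The internal-consistency figures reported for the CCATH scales are Cronbach's alphas. As a minimal illustration of how the coefficient is computed (this is a generic sketch, not the authors' analysis code), where `item_scores` is a hypothetical list of items, each holding one score per respondent:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance)."""
    k = len(item_scores)                      # number of items in the scale
    item_var = sum(pvariance(item) for item in item_scores)
    totals = [sum(vals) for vals in zip(*item_scores)]  # per-respondent total score
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Two identical items are perfectly consistent: alpha == 1.0
# cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]])
```

Scales with alpha of .70 or above, as reported for 9 of the 12 CCATH composites, are conventionally considered acceptable for group-level comparisons.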

  6. 3-D visualisation of palaeoseismic trench stratigraphy and trench logging using terrestrial remote sensing and GPR - combining techniques towards an objective multiparametric interpretation

    NASA Astrophysics Data System (ADS)

    Schneiderwind, S.; Mason, J.; Wiatr, T.; Papanikolaou, I.; Reicherter, K.

    2015-09-01

    Two normal faults on the Island of Crete and mainland Greece were studied to create and test an innovative workflow to make palaeoseismic trench logging more objective, and to visualise the sedimentary architecture within the trench wall in 3-D. This is achieved by combining classical palaeoseismic trenching techniques with multispectral approaches. A conventional trench log was first compared to the results of iso-cluster analysis of a true-colour photomosaic representing the spectrum of visible light. The disadvantages of passive data collection (e.g. illumination) were addressed by complementing the dataset with an active near-infrared backscatter image from t-LiDAR measurements. The multispectral analysis shows that distinct layers can be identified, and it compares well with the conventional trench log; adjacent stratigraphic units could be distinguished by their particular multispectral signatures. Based on the trench log, a 3-D interpretation of GPR data collected on the vertical trench wall was then possible. This is highly beneficial for measuring representative layer thicknesses, displacements and geometries at depth within the trench wall, so that misinterpretation due to cutting effects is minimised. Sedimentary feature geometries related to earthquake magnitude can be used to improve the accuracy of seismic hazard assessments. This manuscript therefore combines multiparametric approaches and shows: (i) how a 3-D visualisation of palaeoseismic trench stratigraphy and logging can be accomplished by combining t-LiDAR and GPR techniques, and (ii) how a multispectral digital analysis can offer additional advantages and a higher objectivity in the interpretation of palaeoseismic and stratigraphic information. The multispectral datasets are stored, allowing unbiased input for future (re-)investigations.

  7. Clinical and technical considerations in the analysis of gingival crevicular fluid.

    PubMed

    Wassall, Rebecca R; Preshaw, Philip M

    2016-02-01

    Despite the technical challenges involved when collecting, processing and analyzing gingival crevicular fluid samples, research using gingival crevicular fluid has played, and will continue to play, a fundamental role in expanding our understanding of periodontal pathogenesis and healing outcomes following treatment. A review of the literature, however, clearly demonstrates that there is considerable variation in the methods used for collection, processing and analysis of gingival crevicular fluid samples by different research groups around the world. Inconsistent or inadequate reporting impairs interpretation of results, prevents accurate comparison of data between studies and potentially limits the conclusions that can be made from a larger body of evidence. The precise methods used for collection and analysis of gingival crevicular fluid (including calibration studies required before definitive clinical studies) should be reported in detail, either in the methods section of published papers or as an online supplementary file, so that other researchers may reproduce the methodology. Only with clear and transparent reporting will the full impact of future gingival crevicular fluid research be realized. This paper discusses the complexities of gingival crevicular fluid collection and analysis and provides guidance to researchers working in this field. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Analysis of the Relations among the Components of Technological Pedagogical and Content Knowledge (TPACK): A Structural Equation Model

    ERIC Educational Resources Information Center

    Celik, Ismail; Sahin, Ismail; Akturk, Ahmet Oguz

    2014-01-01

    In the current study, the model of technological pedagogical and content knowledge (TPACK) is used as the theoretical framework in the process of data collection and interpretation of the results. This study analyzes the perceptions of 744 undergraduate students regarding their TPACK levels measured by responses to a survey developed by Sahin…

  9. Task Effects on Linguistic Complexity and Accuracy: A Large-Scale Learner Corpus Analysis Employing Natural Language Processing Techniques

    ERIC Educational Resources Information Center

    Alexopoulou, Theodora; Michel, Marije; Murakami, Akira; Meurers, Detmar

    2017-01-01

    Large-scale learner corpora collected from online language learning platforms, such as the EF-Cambridge Open Language Database (EFCAMDAT), provide opportunities to analyze learner data at an unprecedented scale. However, interpreting the learner language in such corpora requires a precise understanding of tasks: How does the prompt and input of a…

  10. Problem-Solving in Las Vegas: Students Are Building Skills and a Global Network.

    ERIC Educational Resources Information Center

    Budd, Gregory; Curry, Don

    1995-01-01

    Describes a project initiated at Silverado High School in Las Vegas, where students from Las Vegas and schools across the United States monitor the levels of radon in the atmosphere. Enables students to learn first hand about the collection, analysis, and interpretation of scientific data and to network with other students from the United States…

  11. The Examining Reading Motivation of Primary Students in the Terms of Some Variables

    ERIC Educational Resources Information Center

    Biyik, Merve Atas; Erdogan, Tolga; Yildiz, Mustafa

    2017-01-01

    The purpose of this research is to examine the reading motivation of primary 2nd, 3rd and 4th grade students in terms of gender, class and socioeconomic status. The research is structured as a descriptive survey. A "mixed method" was used in the collection, analysis and interpretation of the data. The sample consists of…

  12. 40 CFR Appendix N to Part 50 - Interpretation of the National Ambient Air Quality Standards for PM2.5

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... monitors utilize the same specific sampling and analysis method. Combined site data record is the data set... monitors are suitable monitors designated by a state or local agency in their annual network plan (and in... appendix. Seasonal sampling is the practice of collecting data at a reduced frequency during a season of...

  13. Statistics: The Shape of the Data. Used Numbers: Real Data in the Classroom. Grades 4-6.

    ERIC Educational Resources Information Center

    Russell, Susan Jo; Corwin, Rebecca B.

    A unit of study that introduces collecting, representing, describing, and interpreting data is presented. Suitable for students in grades 4 through 6, it provides a foundation for further work in statistics and data analysis. The investigations may extend from one to four class sessions and are grouped into three parts: "Introduction to Data…

  14. "Information at Their Hands": Applying Sociocultural Theory to an Analysis of the Pedagogical Moves of Pre-Service Science Teachers during a Science Lesson

    ERIC Educational Resources Information Center

    Shively, Christopher

    2013-01-01

    The National Science Education Standards (NSES) state that students must "experience scientific inquiry directly to gain a deep understanding of its characteristics" (Olson & Loucks-Horsley, 2000, p. 14). The standards also emphasize the use of technology to help students collect, organize, analyze, interpret and present data in ways…

  15. Evaluation of the Distance Education Pre-Service Teachers' Opinions about Teaching Practice Course (Case of Izmir City)

    ERIC Educational Resources Information Center

    Guven, Meral; Kurum, Dilruba; Saglam, Mustafa

    2012-01-01

    The aim of this study was to determine the distance education pre-service teachers' opinions about the teaching practice course. The study was conducted with descriptive method. For data collection, analysis and interpretation, qualitative research method was used. Out of the students enrolled at Open Education Faculty, Department of Pre-school…

  16. Drama and Oral Interpretation: Abstracts of Doctoral Dissertations Published in "Dissertation Abstracts International," July through December 1978 (Vol. 39 Nos. 1 through 6).

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Reading and Communication Skills, Urbana, IL.

    This collection of abstracts is part of a continuing series providing information on recent doctoral dissertations. The 15 titles deal with the following topics: interpersonal conflict and the nonviolent peacemaking tradition; theatrical transactional analysis; recycling existing spaces for theatre use; the effect of theatre study on the…

  17. Public Schools and Political Ideas: Canadian Educational Policy in Historical Perspective.

    ERIC Educational Resources Information Center

    Manzer, Ronald A.

    This book interprets the framework of political ideas and beliefs that structure individual and collective thinking about educational policies and give them meaning. The analysis begins with the state of education in the mid-19th century and brings up to date the prospective reforms of the early 1990s. The study argues that, from its foundation,…

  18. 75 FR 22565 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-29

    ... collect, assemble, interpret, analyze, report and publish surveys; research, study, statistical and... commercial entities, for surveys or research, where such releases are consistent with the mission of the...): To collect, assemble, interpret, analyze, report and publish surveys; research, study, statistical...

  19. The childbearing experience of women with spinal cord injury in Iran: a phenomenological study.

    PubMed

    Khazaeipour, Zahra; Nikbakht-Nasrabadi, Alireza; Mohammadi, Nooredin; Salehi-Nejad, Alireza; Shabany, Maryam

    2018-06-14

    This was a qualitative study conducted using an interpretative phenomenological approach. This study investigated the experience of pregnancy and childbirth in women with spinal cord injury (SCI). Brain and Spinal Cord Injury Research Center, Tehran University of Medical Sciences, Tehran, Iran. The data were collected using telecommunication and face-to-face semi-structured interviews with eight women with SCI. The study employed the interpretative phenomenological approach suggested by Van Manen in 2016 and thematic analysis to provide a comprehensive understanding of the childbearing experience of women with SCI. MAXQDA 10 software was used to manage the collected data. Five main themes have emerged from data analysis: "revivification", "fear and concern of motherhood with SCI", "flawed health care system", "maternal experience under a supportive umbrella", and "strengthening spirituality and religious belief". Childbearing had a positive effect on the family relationship, continuity of marriage, and quality of life following SCI. There are potential benefits in establishing a center that provides consultation on childbearing and childcare for women with SCI. Moreover, training for the medical team, which includes nurses, midwives, and specialists is highly recommended. Further research is needed to expand our understanding of childbearing from the perspectives of healthcare providers.

  20. Deciphering Sources of Variability in Clinical Pathology.

    PubMed

    Tripathi, Niraj K; Everds, Nancy E; Schultze, A Eric; Irizarry, Armando R; Hall, Robert L; Provencher, Anne; Aulbach, Adam

    2017-01-01

    The objectives of this session were to explore causes of variability in clinical pathology data due to preanalytical and analytical variables as well as study design and other procedures that occur in toxicity testing studies. The presenters highlighted challenges associated with such variability in differentiating test article-related effects from the effects of experimental procedures and its impact on overall data interpretation. These presentations focused on preanalytical and analytical variables and study design-related factors and their influence on clinical pathology data, and the importance of various factors that influence data interpretation including statistical analysis and reference intervals. Overall, these presentations touched upon the potential effects of many variables on clinical pathology parameters, including animal physiology, sample collection process, specimen handling and analysis, and study design, and offered some discussion points on how to manage those variables to ensure accurate interpretation of clinical pathology data in toxicity studies. This article is a brief synopsis of presentations given in a session entitled "Deciphering Sources of Variability in Clinical Pathology-It's Not Just about the Numbers" that occurred at the 35th Annual Symposium of the Society of Toxicologic Pathology in San Diego, California.

  1. Analysis of Garment Production Methods. Part 2: Comparison of Cost and Production between a Traditional Bundle System and Modular Manufacturing

    DTIC Science & Technology

    1992-02-01

    configuration. We have spent the last year observing two firms as they experimented with modular manufacturing. The following report will track the progress of...the transitions as they moved through the year. Incorporated into the analysis is the statistical interpretation of data collected from each firm, as...during the year. FEBRUARY The most noticeable change this month was the introduction of the new ergonomic chairs for the operators. Previously the

  2. The interpretation of hair analysis for drugs and drug metabolites.

    PubMed

    Cuypers, Eva; Flanagan, Robert J

    2018-02-01

    Head hair analysis for drugs and drug metabolites has been used widely with the aim of detecting exposure in the weeks or months prior to sample collection. However, inappropriate interpretation of results has likely led to serious miscarriages of justice, especially in child custody cases. The aim of this review is to assess critically what can, and perhaps more importantly, what cannot be claimed as regards the interpretation of hair test results in a given set of circumstances in order to inform future testing. We searched the PubMed database for papers published 2010-2016 using the terms "hair" and "drug" and "decontamination", the terms "hair" and "drug" and "contamination", the terms "hair" and "drug-facilitated crime", the terms "hair" and "ethyl glucuronide", and the terms "hair", "drug testing" and "analysis". Study of the reference lists of the 46 relevant papers identified 25 further relevant citations, giving a total of 71 citations. Hair samples: Drugs, drug metabolites and/or decomposition products may arise not only from deliberate drug administration, but also via deposition from a contaminated atmosphere if drug(s) have been smoked or otherwise vaporized in a confined area, transfer from contaminated surfaces via food/fingers, etc., and transfer from sweat and other secretions after a single large exposure, which could include anesthesia. Excretion in sweat of endogenous analytes such as γ-hydroxybutyric acid is a potential confounder if its use is to be investigated. Cosmetic procedures such as bleaching or heat treatment of hair may remove analytes prior to sample collection. Hair color and texture, the area of the head the sample is taken from, the growth rate of individual hairs, and how the sample has been stored, may also affect the interpretation of results. Toxicological analysis: Immunoassay results alone do not provide reliable evidence on which to base judicial decisions. 
Gas or liquid chromatography with mass spectrometric detection (GC- or LC-MS), if used with due caution, can give accurate analyte identification and high sensitivity, but many problems remain. Firstly, it is not possible to prepare assay calibrators or quality control material except by soaking "blank" hair in solutions of appropriate analytes, drying, and then subjecting the dried material to an analysis. The fact that solvents can be used to add analytes to hair shows that analytes can arrive not only on, but also in, hair from exogenous sources. A range of solvent-washing procedures has been advocated to "decontaminate" hair by removing adsorbed analytes, but these carry the risk of transporting adsorbed analytes into the medulla of the hair, thereby confounding the whole procedure. This is especially true if segmental analysis is being undertaken in order to provide a "time course" of drug exposure. Proposed clinical applications of hair analysis: There have been a number of reports where drugs seemingly administered during the perpetration of a crime have been detected in head hair. However, detailed evaluation of these reports is difficult without full understanding of the possible effects of any "decontamination" procedures used and of other variables such as hair color or cosmetic hair treatment. Similarly, in child custody cases and where the aim is to demonstrate abstinence from drug or alcohol use, the issues of possible exogenous sources of analyte, and of the large variations in analyte concentrations reported in known users, continue to confound the interpretation of results in individual cases. Interpretation of results of head hair analysis must take into account all the available circumstantial and other evidence, especially as regards the methodology employed and the possibility of surface contamination of the hair prior to collection.

  3. Seismic facies analysis based on self-organizing map and empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian

    2015-01-01

    Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and their time window has an obvious effect on the validity of classification and requires iterative experimentation and prior knowledge. In general, clustering is sensitive to noise when the waveform serves as the input data, especially with a narrow window. To overcome this limitation, the Empirical Mode Decomposition (EMD) method is introduced into waveform classification based on the self-organizing map (SOM). We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are collected for validation. The application results show that seismic facies analysis can be improved and can better aid interpretation. Its tolerance for noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
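A 1D grid SOM of the kind described here can be sketched in a few lines. The following is an illustrative NumPy version with assumed hyperparameters (initial learning rate, neighbourhood width, iteration count), not the authors' implementation; in the paper's workflow the input rows would be EMD-denoised waveform windows:

```python
import numpy as np

def train_som_1d(data, n_nodes=4, n_iter=500, seed=0):
    """Train a 1-D grid SOM on waveform vectors (rows of `data`)."""
    rng = np.random.default_rng(seed)
    # initialise node weights from distinct training samples
    w = data[rng.choice(len(data), n_nodes, replace=False)].astype(float)
    for it in range(n_iter):
        x = data[rng.integers(len(data))]
        lr = 0.5 * (1 - it / n_iter)                        # decaying learning rate
        sigma = max(n_nodes / 2 * (1 - it / n_iter), 0.5)   # shrinking neighbourhood
        bmu = np.argmin(np.linalg.norm(w - x, axis=1))      # best-matching unit
        grid_dist = np.abs(np.arange(n_nodes) - bmu)
        h = np.exp(-grid_dist**2 / (2 * sigma**2))          # neighbourhood on the 1-D grid
        w += lr * h[:, None] * (x - w)                      # pull nodes toward the sample
    return w

def classify(data, w):
    """Assign each waveform to its nearest SOM node (its facies class)."""
    return np.argmin(np.linalg.norm(data[:, None] - w[None], axis=2), axis=1)
```

With two well-separated waveform families and two nodes, the trained map assigns each family to a different node, which is the facies-labelling behaviour the method relies on.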

  4. Using service data: tools for taking action.

    PubMed

    1992-01-01

    Program performance can be improved through use of a simple information system. The focus of the discussion is on analysis of service data, decision making, and program improvement. Clinic managers must collect and analyze their own data and not wait for supervisors from central or district offices to conduct a thorough examination. Local decision making has the advantage of allowing monitoring and modification of services in a timely way that is responsive to client needs. Information can be shared throughout all levels of local and central administration. The model for decision making is based on data collection, data analysis, decision making, action, evaluation, information dissemination, and feedback. Data need to be collected on types of clients (new acceptor or continuing user), type of contraceptive method and quantity dispensed, and how the client learned about the clinic. Supply data also need to be collected on contraceptive methods on hand, number dispensed by method to clients, and projected supplies; requests for additional supplies can thus be made in a timely and appropriate way. The basic clinic forms are the family planning (FP) client record, the client referral card, an appointment card, a complication card, a daily FP activity register, a FP activities worksheet, a monthly summary of FP activities, and a commodities request/receipt form. A suggestion sheet from users addresses issues about performance targets, continuing users, dropouts, staff motivation, and setting up a system. Suggestions are also provided on the importance of staff training in data collection and analysis and in creating awareness of the program's objectives. Discussion is directed to how to interpret new acceptor data and to look for patterns. A sample chart is provided of a summary of FP activities, possible interpretations, and possible actions to take. Analysis is given for new acceptor trends, contraceptive method mix, and sources of information. 
A short example illustrates how client card data and bar graphs of method mix, broken down by desire for more or no more children, revealed that couples' childbearing desires did not affect method choice.
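The monthly summary of FP activities described above is essentially a tally of the daily activity register. A minimal sketch (the record format and field names here are invented for illustration, not taken from the source):

```python
from collections import Counter

def monthly_summary(visits):
    """Tally a month of register entries into a summary of FP activities.

    visits: list of (client_type, method) tuples, where client_type is
    'new' (new acceptor) or 'continuing' (continuing user).
    """
    summary = {"new": Counter(), "continuing": Counter()}
    for client_type, method in visits:
        summary[client_type][method] += 1
    return summary

# e.g. monthly_summary([("new", "pill"), ("new", "IUD"), ("continuing", "pill")])
# gives new-acceptor counts per method plus continuing-user counts per method,
# the two figures the sample chart asks managers to track month by month.
```

From such tallies a manager can read off the new acceptor trend and the contraceptive method mix directly, which is the local decision-making loop the article advocates.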

  5. Getting the picture: A mixed-methods inquiry into how visual representations are interpreted by students, incorporated within textbooks, and integrated into middle-school science classrooms

    NASA Astrophysics Data System (ADS)

    Lee, Victor Raymond

    Modern-day middle school science textbooks are heavily populated with colorful images, technical diagrams, and other forms of visual representations. These representations are commonly perceived by educators to be useful aids to support student learning of unfamiliar scientific ideas. However, as the number of representations in science textbooks has seemingly increased in recent decades, concerns have been voiced that many of these representations are actually undermining instructional goals; they may be introducing substantial conceptual and interpretive difficulties for students. To date, very little empirical work has been done to examine how the representations used in instructional materials have changed, and what influences these changes exert on student understanding. Furthermore, there has also been limited attention given to the extent to which current representational-use routines in science classrooms may mitigate or limit interpretive difficulties. This dissertation seeks to do three things: First, it examines the nature of the relationship between published representations and students' reasoning about the natural world. Second, it considers the ways in which representations are used in textbooks and how that has changed over a span of five decades. Third, this dissertation provides an in-depth look into how middle school science classrooms naturally use these visual representations and what kinds of support are being provided. With respect to the three goals of this dissertation, three pools of data were collected and analyzed for this study. First, interview data was collected in which 32 middle school students interpreted and reasoned with a set of more and less problematic published textbook representations. Quantitative analyses of the interview data suggest that, counter to what has been anticipated in the literature, there were no significant differences in the conceptualizations of students in the different groups. 
An accompanying qualitative analysis probes further into why this was the case. In addition to the interview data, a corpus of graphic representations from 34 science textbooks (published between 1943 and 2005) was catalogued and examined for compositional trends and changes. This historical textbook analysis of images and illustrations reveals that, consistent with expectations, there has indeed been an overall increase in the number of representations in a given instructional unit. Yet, despite the increase, there is very little shift in the instructional functions that those representations serve. The most dramatic changes appear in the individual representations themselves and in how they are used to relate scientific ideas to middle school students. Finally, a set of video-recorded classroom observations with three different teachers was collected in order to study representational-use routines. A numerical analysis of classroom episodes suggests that it is fairly common for the majority of representations that are used to appear fleetingly and not be discussed again. When representations are reused or reintroduced, a qualitative analysis reveals that they are often accompanied by interpretive support from the teacher, which may steer students away from misinterpretations.

  6. Diagnosing collisions of magnetized, high energy density plasma flows using a combination of collective Thomson scattering, Faraday rotation, and interferometry (invited).

    PubMed

    Swadling, G F; Lebedev, S V; Hall, G N; Patankar, S; Stewart, N H; Smith, R A; Harvey-Thompson, A J; Burdiak, G C; de Grouchy, P; Skidmore, J; Suttle, L; Suzuki-Vidal, F; Bland, S N; Kwek, K H; Pickworth, L; Bennett, M; Hare, J D; Rozmus, W; Yuan, J

    2014-11-01

    A suite of laser-based diagnostics is used to study interactions of magnetised, supersonic, radiatively cooled plasma flows produced using the Magpie pulsed power generator (1.4 MA, 240 ns rise time). Collective optical Thomson scattering measures the time-resolved local flow velocity and temperature across 7-14 spatial positions. The scattering spectrum is recorded from multiple directions, allowing more accurate reconstruction of the flow velocity vectors. The areal electron density is measured using 2D interferometry; optimisation and analysis are discussed. The Faraday rotation diagnostic, operating at 1053 nm, measures the magnetic field distribution in the plasma. Measurements obtained simultaneously by these diagnostics are used to constrain analysis, increasing the accuracy of interpretation.

  7. Patient movement characteristics and the impact on CBCT image quality and interpretability.

    PubMed

    Spin-Neto, Rubens; Costa, Cláudio; Salgado, Daniela Mra; Zambrana, Nataly Rm; Gotfredsen, Erik; Wenzel, Ann

    2018-01-01

    To assess the impact of patient movement characteristics and metal/radiopaque materials in the field-of-view (FOV) on CBCT image quality and interpretability. 162 CBCT examinations were performed in 134 consecutive (i.e. prospective data collection) patients (age average: 27.2 years; range: 9-73). An accelerometer-gyroscope system registered patient's head position during examination. The threshold for movement definition was set at ≥0.5-mm movement distance based on accelerometer-gyroscope recording. Movement complexity was defined as uniplanar/multiplanar. Three observers scored independently: presence of stripe (i.e. streak) artefacts (absent/"enamel stripes"/"metal stripes"/"movement stripes"), overall unsharpness (absent/present) and image interpretability (interpretable/not interpretable). Kappa statistics assessed interobserver agreement. χ² tests analysed whether movement distance, movement complexity and metal/radiopaque material in the FOV affected image quality and image interpretability. Relevant risk factors (p ≤ 0.20) were entered into a multivariate logistic regression analysis with "not interpretable" as the outcome. Interobserver agreement for image interpretability was good (average = 0.65). Movement distance and presence of metal/radiopaque materials significantly affected image quality and interpretability. There were 22-28 cases, in which the observers stated the image was not interpretable. Small movements (i.e. <3 mm) did not significantly affect image interpretability. For movements ≥ 3 mm, the risk that a case was scored as "not interpretable" was significantly (p ≤ 0.05) increased [OR 3.2-11.3; 95% CI (0.70-65.47)]. Metal/radiopaque material was also a significant (p ≤ 0.05) risk factor (OR 3.61-5.05). Patient movement ≥3 mm and metal/radiopaque material in the FOV significantly affected CBCT image quality and interpretability.
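The odds ratios and confidence intervals reported here come from multivariate logistic regression. For a single binary risk factor, the same quantities can be illustrated from a 2×2 table using the Wald (Woolf) method; the counts below are invented for illustration and are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# e.g. 10/20 "not interpretable" among moved scans vs 5/40 among still scans
# (hypothetical counts): odds_ratio_ci(10, 20, 5, 40) -> OR 4.0 with its CI
```

A CI that excludes 1.0 corresponds to the p ≤ 0.05 significance criterion used in the abstract; the multivariate regression additionally adjusts each OR for the other risk factors in the model.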

  8. Validation of educational assessments: a primer for simulation and beyond.

    PubMed

    Cook, David A; Hatala, Rose

    2016-01-01

Simulation plays a vital role in health professions assessment. This review provides a primer on assessment validation for educators and education researchers. We focus on simulation-based assessment of health professionals, but the principles apply broadly to other assessment approaches and topics. Validation refers to the process of collecting validity evidence to evaluate the appropriateness of the interpretations, uses, and decisions based on assessment results. Contemporary frameworks view validity as a hypothesis, and validity evidence is collected to support or refute the validity hypothesis (i.e., that the proposed interpretations and decisions are defensible). In validation, the educator or researcher defines the proposed interpretations and decisions, identifies and prioritizes the most questionable assumptions in making these interpretations and decisions (the "interpretation-use argument"), empirically tests those assumptions using existing or newly collected evidence, and then summarizes the evidence as a coherent "validity argument." A framework proposed by Messick identifies potential evidence sources: content, response process, internal structure, relationships with other variables, and consequences. Another framework proposed by Kane identifies key inferences in generating useful interpretations: scoring, generalization, extrapolation, and implications/decision. We propose an eight-step approach to validation that applies to either framework: Define the construct and proposed interpretation, make explicit the intended decision(s), define the interpretation-use argument and prioritize needed validity evidence, identify candidate instruments and/or create/adapt a new instrument, appraise existing evidence and collect new evidence as needed, keep track of practical issues, formulate the validity argument, and make a judgment: does the evidence support the intended use?
Rigorous validation first prioritizes and then empirically evaluates key assumptions in the interpretation and use of assessment scores. Validation science would be improved by more explicit articulation and prioritization of the interpretation-use argument, greater use of formal validation frameworks, and more evidence informing the consequences and implications of assessment.

  9. Platforms for Single-Cell Collection and Analysis.

    PubMed

    Valihrach, Lukas; Androvic, Peter; Kubista, Mikael

    2018-03-11

Single-cell analysis has become an established method to study cell heterogeneity and to characterize rare cells. Despite the high cost and technical constraints, applications are increasing every year in all fields of biology. Following the trend, there is a tremendous development of tools for single-cell analysis, especially in the RNA sequencing field. Every improvement increases sensitivity and throughput. Collecting a large amount of data also stimulates the development of new approaches for bioinformatic analysis and interpretation. However, the essential requirement for any analysis is the collection of single cells of high quality. The single-cell isolation must be fast, effective, and gentle to maintain the native expression profiles. Classical methods for single-cell isolation are micromanipulation, microdissection, and fluorescence-activated cell sorting (FACS). In the last decade several new and highly efficient approaches have been developed, which not only supplement but may fully replace the traditional ones. These new techniques are based on microfluidic chips, droplets, micro-well plates, and automatic collection of cells using capillaries, magnets, an electric field, or a punching probe. In this review we summarize the current methods and developments in this field. We discuss the advantages of the different commercially available platforms and their applicability, and also provide remarks on future developments.

  10. Platforms for Single-Cell Collection and Analysis

    PubMed Central

    Valihrach, Lukas; Androvic, Peter; Kubista, Mikael

    2018-01-01

Single-cell analysis has become an established method to study cell heterogeneity and to characterize rare cells. Despite the high cost and technical constraints, applications are increasing every year in all fields of biology. Following the trend, there is a tremendous development of tools for single-cell analysis, especially in the RNA sequencing field. Every improvement increases sensitivity and throughput. Collecting a large amount of data also stimulates the development of new approaches for bioinformatic analysis and interpretation. However, the essential requirement for any analysis is the collection of single cells of high quality. The single-cell isolation must be fast, effective, and gentle to maintain the native expression profiles. Classical methods for single-cell isolation are micromanipulation, microdissection, and fluorescence-activated cell sorting (FACS). In the last decade several new and highly efficient approaches have been developed, which not only supplement but may fully replace the traditional ones. These new techniques are based on microfluidic chips, droplets, micro-well plates, and automatic collection of cells using capillaries, magnets, an electric field, or a punching probe. In this review we summarize the current methods and developments in this field. We discuss the advantages of the different commercially available platforms and their applicability, and also provide remarks on future developments. PMID:29534489

  11. Open science resources for the discovery and analysis of Tara Oceans data

    PubMed Central

    Pesant, Stéphane; Not, Fabrice; Picheral, Marc; Kandels-Lewis, Stefanie; Le Bescot, Noan; Gorsky, Gabriel; Iudicone, Daniele; Karsenti, Eric; Speich, Sabrina; Troublé, Romain; Dimier, Céline; Searson, Sarah; Acinas, Silvia G.; Bork, Peer; Boss, Emmanuel; Bowler, Chris; Vargas, Colomban De; Follows, Michael; Gorsky, Gabriel; Grimsley, Nigel; Hingamp, Pascal; Iudicone, Daniele; Jaillon, Olivier; Kandels-Lewis, Stefanie; Karp-Boss, Lee; Karsenti, Eric; Krzic, Uros; Not, Fabrice; Ogata, Hiroyuki; Pesant, Stéphane; Raes, Jeroen; Reynaud, Emmanuel G.; Sardet, Christian; Sieracki, Mike; Speich, Sabrina; Stemmann, Lars; Sullivan, Matthew B.; Sunagawa, Shinichi; Velayoudon, Didier; Weissenbach, Jean; Wincker, Patrick

    2015-01-01

The Tara Oceans expedition (2009–2013) sampled contrasting ecosystems of the world oceans, collecting environmental data and plankton, from viruses to metazoans, for later analysis using modern sequencing and state-of-the-art imaging technologies. It surveyed 210 ecosystems in 20 biogeographic provinces, collecting over 35,000 samples of seawater and plankton. The interpretation of such an extensive collection of samples in their ecological context requires means to explore, assess and access raw and validated data sets. To address this challenge, the Tara Oceans Consortium offers open science resources, including the use of open access archives for nucleotides (ENA) and for environmental, biogeochemical, taxonomic and morphological data (PANGAEA), and the development of online discovery tools and collaborative annotation tools for sequences and images. Here, we present an overview of Tara Oceans Data, and we provide detailed registries (data sets) of all campaigns (from port-to-port), stations and sampling events. PMID:26029378

  12. Open science resources for the discovery and analysis of Tara Oceans data

    NASA Astrophysics Data System (ADS)

    2015-05-01

The Tara Oceans expedition (2009-2013) sampled contrasting ecosystems of the world oceans, collecting environmental data and plankton, from viruses to metazoans, for later analysis using modern sequencing and state-of-the-art imaging technologies. It surveyed 210 ecosystems in 20 biogeographic provinces, collecting over 35,000 samples of seawater and plankton. The interpretation of such an extensive collection of samples in their ecological context requires means to explore, assess and access raw and validated data sets. To address this challenge, the Tara Oceans Consortium offers open science resources, including the use of open access archives for nucleotides (ENA) and for environmental, biogeochemical, taxonomic and morphological data (PANGAEA), and the development of online discovery tools and collaborative annotation tools for sequences and images. Here, we present an overview of Tara Oceans Data, and we provide detailed registries (data sets) of all campaigns (from port-to-port), stations and sampling events.

  13. Source apportion of atmospheric particulate matter: a joint Eulerian/Lagrangian approach.

    PubMed

    Riccio, A; Chianese, E; Agrillo, G; Esposito, C; Ferrara, L; Tirimberio, G

    2014-12-01

PM2.5 samples were collected during an annual monitoring campaign (January 2012-January 2013) in the urban area of Naples, one of the major cities in Southern Italy. Samples were collected by means of a standard gravimetric sampler (Tecora Echo model) and characterized chemically by ion chromatography. In total, 143 samples were collected and their ionic composition determined. We extend traditional source apportionment techniques, usually based on multivariate factor analysis, interpreting the chemical analysis results within a Lagrangian framework. The Hybrid Single-Particle Lagrangian Integrated Trajectory Model (HYSPLIT) model was used, providing linkages to the source regions in the upwind areas. Results were analyzed in order to quantify the relative weight of different source types/areas. Model results suggested that PM concentrations are strongly affected not only by local emissions but also by transboundary emissions, especially from Eastern and Northern European countries and from Saharan dust episodes.
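One common way to link receptor measurements to upwind source regions via back-trajectories, in the spirit of (though not necessarily identical to) the joint Eulerian/Lagrangian approach described above, is the Potential Source Contribution Function (PSCF): grid cells visited disproportionately often on high-PM days score highly. A minimal sketch on toy trajectories (the coordinates, PM values, and threshold are made up):

```python
from collections import defaultdict

def pscf(trajectories, pm_values, threshold, cell=5.0):
    """PSCF on a lat/lon grid: m_ij / n_ij per cell, where n_ij counts all
    trajectory endpoints falling in the cell and m_ij only those belonging
    to trajectories that arrived on a high-PM day (pm > threshold)."""
    n = defaultdict(int)
    m = defaultdict(int)
    for traj, pm in zip(trajectories, pm_values):
        for lat, lon in traj:
            key = (int(lat // cell), int(lon // cell))
            n[key] += 1
            if pm > threshold:
                m[key] += 1
    return {k: m[k] / n[k] for k in n}

# Two toy back-trajectories ending at Naples, one on a high-PM day.
trajs = [[(41.0, 14.2), (44.0, 20.0)],   # arrival day: PM2.5 = 48 ug/m3
         [(41.0, 14.2), (36.0, 5.0)]]    # arrival day: PM2.5 = 12 ug/m3
scores = pscf(trajs, pm_values=[48.0, 12.0], threshold=25.0)
print(scores)
```

Cells traversed only on the polluted day score 1.0, flagging them as candidate source regions; shared cells near the receptor are diluted toward the base rate.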

  14. Open science resources for the discovery and analysis of Tara Oceans data.

    PubMed

    Pesant, Stéphane; Not, Fabrice; Picheral, Marc; Kandels-Lewis, Stefanie; Le Bescot, Noan; Gorsky, Gabriel; Iudicone, Daniele; Karsenti, Eric; Speich, Sabrina; Troublé, Romain; Dimier, Céline; Searson, Sarah

    2015-01-01

    The Tara Oceans expedition (2009-2013) sampled contrasting ecosystems of the world oceans, collecting environmental data and plankton, from viruses to metazoans, for later analysis using modern sequencing and state-of-the-art imaging technologies. It surveyed 210 ecosystems in 20 biogeographic provinces, collecting over 35,000 samples of seawater and plankton. The interpretation of such an extensive collection of samples in their ecological context requires means to explore, assess and access raw and validated data sets. To address this challenge, the Tara Oceans Consortium offers open science resources, including the use of open access archives for nucleotides (ENA) and for environmental, biogeochemical, taxonomic and morphological data (PANGAEA), and the development of on line discovery tools and collaborative annotation tools for sequences and images. Here, we present an overview of Tara Oceans Data, and we provide detailed registries (data sets) of all campaigns (from port-to-port), stations and sampling events.

  15. An innovative and shared methodology for event reconstruction using images in forensic science.

    PubMed

    Milliet, Quentin; Jendly, Manon; Delémont, Olivier

    2015-09-01

    This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the community of practices at stake in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. On the value of information for Industry 4.0

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr

    2018-03-01

Industry 4.0, or the fourth industrial revolution, that blurs the boundaries between the physical and the digital, is underpinned by vast amounts of data collected by sensors that monitor processes and components of smart factories that continuously communicate amongst one another and with the network hubs via the internet of things. Yet, collection of those vast amounts of data, which are inherently imperfect and burdened with uncertainties and noise, entails costs including hardware and software, data storage, processing, interpretation and integration into the decision-making process, to name just a few of the main expenditures. This paper discusses a framework for rationalizing the adoption of (big) data collection for Industry 4.0. The pre-posterior Bayesian decision analysis is used to that end and industrial process evolution with time is conceptualized as a stochastic observable and controllable dynamical system. The chief underlying motivation is to be able to use the collected data in such a way as to derive the most benefit from them by successfully trading off the management of risks pertinent to failure of the monitored processes and/or their components against the cost of data collection, processing and interpretation. This enables formulation of optimization problems for data collection, e.g. for selecting the monitoring system type, topology and/or time of deployment. An illustrative example utilizing monitoring of the operation of an assembly line and optimizing the topology of a monitoring system is provided to illustrate the theoretical concepts.
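The pre-posterior analysis sketched in the abstract can be made concrete: the value of a monitoring signal is the reduction in expected loss it buys, computed by averaging the best posterior decision over every signal the sensor could emit. A minimal sketch with hypothetical numbers (failure probability, losses, and sensor error rates below are all invented for illustration):

```python
def value_of_information(prior_fail, likelihoods, loss_fail, cost_mitigate):
    """Pre-posterior Bayesian analysis of an imperfect monitoring signal.

    likelihoods[signal][state] = P(signal | state), state in {'ok', 'fail'}.
    Without data the decision-maker mitigates iff prior risk exceeds the
    mitigation cost; with data the prior is updated per possible signal.
    """
    # Expected loss of the best decision on the prior alone.
    loss_prior = min(prior_fail * loss_fail, cost_mitigate)

    # Expected loss when the decision may depend on the observed signal.
    loss_post = 0.0
    for sig, lk in likelihoods.items():
        p_sig = lk['fail'] * prior_fail + lk['ok'] * (1 - prior_fail)
        post_fail = lk['fail'] * prior_fail / p_sig
        loss_post += p_sig * min(post_fail * loss_fail, cost_mitigate)
    return loss_prior - loss_post  # the most it is rational to pay for data

# Hypothetical: 10% failure risk, 100 k-unit failure loss, 15 k-unit fix,
# sensor with 90% sensitivity and a 20% false-alarm rate.
voi = value_of_information(
    0.10,
    {'alarm': {'fail': 0.9, 'ok': 0.2},
     'quiet': {'fail': 0.1, 'ok': 0.8}},
    loss_fail=100.0, cost_mitigate=15.0)
print(round(voi, 2))
```

If the monitoring system costs less than this value to deploy and operate, collecting the data is worthwhile; otherwise it is not, which is exactly the trade-off the paper formalizes.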

  17. Android and ODK based data collection framework to aid in epidemiological analysis

    PubMed Central

    Raja, A.; Tridane, A.; Gaffar, A.; Lindquist, T.; Pribadi, K.

    2014-01-01

Periodic collection of field data, analysis and interpretation of data are key to a good healthcare service. This data is used by the subsequent decision makers to recognize preventive measures, provide timely support to the affected and to help measure the effects of their interventions. While the resources required for good disease surveillance and proactive healthcare are available more readily in developed countries, the lack of these in developing countries may compromise the quality of service provided. This combined with the critical nature of some diseases makes this an essential issue to be addressed. Taking advantage of the rapid growth of cell phone usage and related infrastructure in developed as well as developing countries, several systems have been established to address the gaps in data collection. Android, being an open-source platform, has gained considerable popularity in this aspect. Open Data Kit is one such tool developed to aid in data collection. The aim of this paper is to present a prototype framework built using few such existing tools and technologies to address data collection for seasonal influenza, commonly referred to as the flu. PMID:24678381

  18. Evaluation of Primary/Preferred Language Data Collection

    PubMed Central

Duong, Linh M.; Singh, Simple D.; Buchanan, Natasha; Phillips, Joan L.; Gerlach, Ken

    2015-01-01

    A literature review was conducted to identify peer-reviewed articles related to primary/preferred language and interpreter-use data collection practices in hospitals, clinics, and outpatient settings to assess its completeness and quality. In January 2011, Embase (Ovid), MEDLINE (Ovid), PubMed, and Web of Science databases were searched for eligible studies. Primary and secondary inclusion criteria were applied to selected eligible articles. This extensive literature search yielded 768 articles after duplicates were removed. After primary and secondary inclusion criteria were applied, 28 eligible articles remained for data abstraction. All 28 articles in this review reported collecting primary/preferred language data, but only 18% (5/28) collected information on interpreter use. This review revealed that there remains variability in the way that primary/preferred language and interpreter use data are collected; all studies used various methodologies for evaluating and abstracting these data. Likewise, the sources from which the data were abstracted differed. PMID:23443456

  19. Evaluation of primary/preferred language data collection.

    PubMed

    Duong, Linh M; Singh, Simple D; Buchanan, Natasha; Phillips, Joan L; Gerlach, Ken

    2012-01-01

    A literature review was conducted to identify peer-reviewed articles related to primary/preferred language and interpreter-use data collection practices in hospitals, clinics, and outpatient settings to assess its completeness and quality. In January 2011, Embase (Ovid), MEDLINE (Ovid), PubMed, and Web of Science databases were searched for eligible studies. Primary and secondary inclusion criteria were applied to selected eligible articles. This extensive literature search yielded 768 articles after duplicates were removed. After primary and secondary inclusion criteria were applied, 28 eligible articles remained for data abstraction. All 28 articles in this review reported collecting primary/preferred language data, but only 18% (5/28) collected information on interpreter use. This review revealed that there remains variability in the way that primary/preferred language and interpreter use data are collected; all studies used various methodologies for evaluating and abstracting these data. Likewise, the sources from which the data were abstracted differed.

  20. Coping with stress: dream interpretation in the Mapuche family.

    PubMed

    Degarrod, L N

    1990-06-01

Dreams are shared and interpreted daily within the family unit among the Mapuche Indians of Chile. This anthropological study examines the communicative aspect of dream sharing and interpreting among Mapuche families undergoing emotional and physical stress. Specifically, it investigates the ways in which the Mapuche dream interpretation system provides the family members with another means of interaction and a way of solving their problems. It also examines how individuals influence their attitudes towards one another by communally participating in the dream interpretation process, and in its narrative performance. The data used in this research consist of dreams and their interpretations, collected in the natural setting from two families whose members were suffering from witchcraft and fear of death. This information was collected over a period of 17 months from October 1985 to March 1987 with a Fulbright-Hays Doctoral Dissertation Grant.

  1. The interpretation of dream meaning: Resolving ambiguity using Latent Semantic Analysis in a small corpus of text.

    PubMed

    Altszyler, Edgar; Ribeiro, Sidarta; Sigman, Mariano; Fernández Slezak, Diego

    2017-11-01

Computer-based dream content analysis relies on word frequencies within predefined categories in order to identify different elements in text. As a complementary approach, we explored the capabilities and limitations of word-embedding techniques to identify word usage patterns among dream reports. These tools allow us to quantify word associations in text and to identify the meaning of target words. Word-embeddings have been extensively studied in large datasets, but only a few studies analyze semantic representations in small corpora. To fill this gap, we compared Skip-gram and Latent Semantic Analysis (LSA) capabilities to extract semantic associations from dream reports. LSA showed better performance than Skip-gram in small size corpora in two tests. Furthermore, LSA captured relevant word associations in dream collections, even in cases with low-frequency words or small numbers of dreams. Word associations in dream reports can thus be quantified by LSA, which opens new avenues for dream interpretation and decoding. Copyright © 2017 Elsevier Inc. All rights reserved.
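The representation LSA starts from is a word-by-document count matrix, with word associations read off as cosine similarities between row vectors; the truncated-SVD step that gives LSA its smoothing power over sparse counts is omitted in this minimal sketch. The three "reports" below are toy stand-ins, not the study's corpus:

```python
import math

def word_vectors(reports):
    """Word-by-document count vectors (the matrix LSA factorizes)."""
    words = {w for r in reports for w in r.split()}
    return {w: [r.split().count(w) for r in reports] for w in words}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy stand-ins for dream reports.
reports = [
    "falling from a high tower",
    "falling from a tall cliff",
    "flying over the sea",
]
vecs = word_vectors(reports)
print(round(cosine(vecs["falling"], vecs["from"]), 2))    # co-occurring words
print(round(cosine(vecs["falling"], vecs["flying"]), 2))  # no shared document
```

Raw counts place words that never share a document at similarity zero even when they are semantically close; it is precisely this sparsity problem in small corpora that the SVD step of LSA, evaluated in the study, is meant to overcome.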

  2. Job satisfaction among neonatal nurses.

    PubMed

    Archibald, Cynthia

    2006-01-01

The purpose of this study was to understand the job satisfaction of nurses who work in intensive care nurseries. The design used a convenience sample of eight nurses with an average of 11 years of active and current experience as neonatal intensive care nurses. METHOD/DATA COLLECTION: Human rights were protected according to the institutional guidelines. Data collection included semi-structured, intensive face-to-face interviews, observation, and field notes. The interviews were tape recorded and transcribed. Colaizzi's (1978) interpretation method was used to interpret and analyze the data using significant statements, formulated meanings, and clustering. Each participant was allowed to review the typed interview as one means of establishing credibility. Analysis of the described experiences revealed that nurses were able to identify enough satisfying situations that compelled them to continue working in the neonatal intensive care unit (NICU). These rewards included compensation, team spirit, support from physicians, and advocacy. Knowledge of the factors that contribute to nurses' job satisfaction can provide a useful framework to implement policies to improve working conditions for nurses.

  3. Painful languages of the body: experiences of headache among women in two Peruvian communities.

    PubMed

    Darghouth, Sarah; Pedersen, Duncan; Bibeau, Gilles; Rousseau, Cecile

    2006-09-01

    This exploratory study focuses on the understandings of and experiences with headache in two settings in Peru: the Quechua-speaking district of Ayacucho, in southern Peru, and a poor urban district of Lima Metropolitana. More specifically, it explores the personal and collective meanings constructed around women's headache experiences. Structured and open-ended interviews were administered to patients suffering headache to elicit interpretations of headache episodes. An analysis of the collected narratives suggests that headache is often comprehended in a polysemic framework, where meanings ascribed in bodily, emotional, family, and social terms articulate individual and shared notions of suffering within larger contexts of social dislocation. Often woven into experiences of solitude, headache accounts are lived and told in dynamic temporal spaces, and narrate dissolution of family ties and tensions associated with women's roles. The results underscore the significance of patients' subjective interpretations of painful experiences and underscore the connections between bodily and emotional pain and distress experienced at family, community, and larger social levels.

  4. Meeting the challenge of interpretation: Hearing the voices of people with intellectual and developmental disability through I-Poems.

    PubMed

    Corby, Deirdre; Taggart, Laurence; Cousins, Wendy

    2018-06-01

    Including the inner perspectives of people who have intellectual disability can pose methodological challenges to qualitative researchers. This article explains how the Listening Guide was applied as an additional step in the analysis during a study which used hermeneutic interviews with people with intellectual disability as the sole method of data collection. An argument is made for the systematic application of the guide with a focus on the use of I-Poems. This article advances qualitative methodological approaches and concludes that this method of drawing attention to the participants' own voices provides a unique basis for interpreting interviews and tasks researchers to examine the use of the Listening Guide.

  5. New interpretations of the Fort Clark State Historic Site based on aerial color and thermal infrared imagery

    NASA Astrophysics Data System (ADS)

    Heller, Andrew Roland

The Fort Clark State Historic Site (32ME2) is a well-known site on the upper Missouri River, North Dakota. The site was the location of two Euroamerican trading posts and a large Mandan-Arikara earthlodge village. In 2004, Dr. Kenneth L. Kvamme and Dr. Tommy Hailey surveyed the site using aerial color and thermal infrared imagery collected from a powered parachute. Individual images were stitched together into large image mosaics and registered to Wood's 1993 interpretive map of the site using Adobe Photoshop. The analysis of those image mosaics resulted in the identification of more than 1,500 archaeological features, including as many as 124 earthlodges.

  6. Exit Presentation

    NASA Technical Reports Server (NTRS)

    Melone, Kate

    2016-01-01

    Skills Acquired: Tensile Testing: Prepare materials and setting up the tensile tests; Collect and interpret (messy) data. Outgassing Testing: Understand TML (Total Mass Loss) and CVCM (Collected Volatile Condensable Material); Collaboration with other NASA centers. Z2 (NASA's Prototype Space Suit Development) Support: Hands on building mockups of components; Analyze data; Work with others, understanding what both parties need in order to make a run successful. LCVG (Liquid Cooling and Ventilation Garment) Flush and Purge Console: Both formal design and design review process; How to determine which components to use - flow calculations, pressure ratings, size, etc.; Hazard Analysis; How to make design tradeoffs.

  7. Intervention research: GAO experiences.

    PubMed

    Grasso, P G

    1996-04-01

    This paper describes tools of program evaluation that may prove useful in conducting research on occupational health and safety interventions. It presents examples of three studies conducted by the U.S. General Accounting Office that illustrate a variety of techniques for collecting and analyzing data on program interventions, including analysis of extant data, synthesis of results of existing studies, and combining data from administrative files with survey results. At the same time, it stresses the importance and difficulty of constructing an adequate "theory" of how the intervention is expected to affect outcomes, both for guiding data collection and for allowing adequate interpretation of results.

  8. A compilation of K-Ar ages for southern California

    USGS Publications Warehouse

    Miller, Fred K.; Morton, Douglas M.; Morton, Janet L.; Miller, David M.

    2014-01-01

The purpose of this report is to make available a large body of conventional K-Ar ages for granitic, volcanic, and metamorphic rocks collected in southern California. Although one interpretive map is included, the report consists primarily of a systematic listing, without discussion or interpretation, of published and unpublished ages that may be of value in future regional and other geologic studies. From 1973 to 1979, 468 rock samples from southern California were collected for conventional K-Ar dating under a regional geologic mapping project of Southern California (predecessor of the Southern California Areal Mapping Project). Most samples were collected and dated between 1974 and 1977. Sixty-one samples (13 percent of those collected) were either discarded for various reasons or had their original collection data lost. For the remaining samples, 518 conventional K-Ar ages are reported here; coexisting mineral pairs were dated from many samples. Of these K-Ar ages, 225 are previously unpublished and are identified as such in table 1. All ages are by conventional K-Ar analysis; no 40Ar/39Ar dating was done. After the 1970s sampling reported here, 33 additional samples were collected and 38 conventional K-Ar ages determined under projects directed at (1) characterizing the Mesozoic and Cenozoic igneous rocks in and on both sides of the Transverse Ranges and (2) clarifying the Mesozoic and Cenozoic tectonics of the eastern Mojave Desert. Although previously published (Beckerman et al., 1982), another eight samples and 11 conventional K-Ar ages are included here because they augment those completed under the previous two projects.
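Conventional K-Ar ages like those compiled here come from the standard age equation t = (1/λ) ln(1 + (λ/λe)(⁴⁰Ar*/⁴⁰K)). A sketch using the Steiger & Jäger (1977) decay-constant conventions; the input ratio is an invented example, not a value from the report:

```python
import math

# 40K decay constants (Steiger & Jäger, 1977 conventions; units: yr^-1).
LAMBDA_TOTAL = 5.543e-10   # total decay constant of 40K
LAMBDA_EC = 0.581e-10      # electron-capture branch yielding radiogenic 40Ar

def k_ar_age(ar40_star_over_k40):
    """Conventional K-Ar age in years from the 40Ar*/40K mole ratio."""
    return math.log(
        1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_star_over_k40) / LAMBDA_TOTAL

# Hypothetical ratio, giving an age of roughly 100 Myr (printed in Myr).
print(round(k_ar_age(6.08e-3) / 1e6, 1))
```

Because only the electron-capture branch of ⁴⁰K produces argon, the branching ratio λ/λe enters the logarithm; mineral pairs from the same sample, as dated in this compilation, should yield concordant t values if the systems stayed closed.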

  9. Collection and analysis of remotely sensed data from the Rhode River Estuary Watershed

    NASA Technical Reports Server (NTRS)

    Jenkins, D. W.; Williamson, F. S. L.

    1973-01-01

The remote sensing study to survey the Rhode River watershed for spray irrigation with secondarily treated sewage is reported. The standardization of autumn coloration changes with Munsell color chips is described along with the mapping of old field vegetation for the spray irrigation project. The interpretation and verification of salt marsh vegetation by remote sensing of the watershed is discussed.

  10. Whole genome sequencing options for bacterial strain typing and epidemiologic analysis based on single nucleotide polymorphism versus gene-by-gene-based approaches.

    PubMed

    Schürch, A C; Arredondo-Alonso, S; Willems, R J L; Goering, R V

    2018-04-01

Whole genome sequence (WGS)-based strain typing finds increasing use in the epidemiologic analysis of bacterial pathogens in both public health as well as more localized infection control settings. This minireview describes methodologic approaches that have been explored for WGS-based epidemiologic analysis and considers the challenges and pitfalls of data interpretation. Sources: personal collection of relevant publications. When applying WGS to study the molecular epidemiology of bacterial pathogens, genomic variability between strains is translated into measures of distance by determining single nucleotide polymorphisms in core genome alignments or by indexing allelic variation in hundreds to thousands of core genes, assigning types to unique allelic profiles. Interpreting isolate relatedness from these distances is highly organism specific, and attempts to establish species-specific cutoffs are unlikely to be generally applicable. In cases where single nucleotide polymorphism or core gene typing do not provide the resolution necessary for accurate assessment of the epidemiology of bacterial pathogens, inclusion of accessory gene or plasmid sequences may provide the additional required discrimination. As with all epidemiologic analysis, realizing the full potential of the revolutionary advances in WGS-based approaches requires understanding and dealing with issues related to the fundamental steps of data generation and interpretation. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
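The SNP-based distances described above reduce, for each isolate pair, to counting differing columns of a core-genome alignment while skipping gaps and ambiguous bases. A minimal sketch on toy aligned sequences (the isolate names and 8-bp "genomes" are invented):

```python
def snp_distance(seq_a, seq_b, ignore="N-"):
    """Pairwise SNP distance over a core-genome alignment.

    Columns where either sequence carries a character in `ignore`
    (ambiguous base or gap) are skipped, as is common practice.
    """
    assert len(seq_a) == len(seq_b), "alignment columns must match"
    return sum(1 for a, b in zip(seq_a, seq_b)
               if a != b and a not in ignore and b not in ignore)

def distance_matrix(isolates):
    """All pairwise SNP distances, keyed by (name, name) tuples."""
    names = list(isolates)
    return {(x, y): snp_distance(isolates[x], isolates[y])
            for i, x in enumerate(names) for y in names[i + 1:]}

# Toy aligned core genomes, not real data.
isolates = {"iso1": "ACGTACGT",
            "iso2": "ACGTACGA",
            "iso3": "ACNTTCGA"}
print(distance_matrix(isolates))
```

The review's central caution applies at exactly this point: whether a distance of 1 versus 2 SNPs implies transmission is organism specific, so no universal cutoff can be read off such a matrix.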

  11. Development and evaluation of a thermochemistry concept inventory for college-level general chemistry

    NASA Astrophysics Data System (ADS)

    Wren, David A.

The research presented in this dissertation culminated in a 10-item Thermochemistry Concept Inventory (TCI). The development of the TCI can be divided into two main phases: qualitative studies and quantitative studies. Both phases focused on the primary stakeholders of the TCI, college-level general chemistry instructors and students. Each phase was designed to collect evidence for the validity of the interpretations and uses of TCI testing data. A central use of TCI testing data is to identify student conceptual misunderstandings, which are represented as incorrect options of multiple-choice TCI items. Therefore, quantitative and qualitative studies focused heavily on collecting evidence at the item level, where important interpretations may be made by TCI users. Qualitative studies included student interviews (N = 28) and online expert surveys (N = 30). Think-aloud student interviews (N = 12) were used to identify conceptual misunderstandings used by students. Novice response process validity interviews (N = 16) helped provide information on how students interpreted and answered TCI items and were the basis of item revisions. Practicing general chemistry instructors (N = 18), or experts, defined boundaries of thermochemistry content included on the TCI. Once TCI items were in the later stages of development, an online version of the TCI was used in an expert response process validity survey (N = 12), to provide expert feedback on item content, format and consensus of the correct answer for each item. Quantitative studies included three phases: beta testing of TCI items (N = 280), pilot testing of a 12-item TCI (N = 485), and a large data collection using a 10-item TCI (N = 1331). In addition to traditional classical test theory analysis, Rasch model analysis was also used for evaluation of testing data at the test and item level.
The TCI was administered as both a formative assessment (beta and pilot testing) and a summative assessment (the large data collection), with items performing well in both settings. One item, item K, did not have acceptable psychometric properties when the TCI was used as a quiz (summative assessment), but it was retained in the final version of the TCI based on the acceptable psychometric properties it displayed in pilot testing (formative assessment).
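The item-level evaluation described above relies on statistics such as item difficulty and discrimination. A minimal classical-test-theory sketch in Python, using hypothetical 0/1 response data (the function and data are illustrative, not the study's actual analysis):

```python
# Classical test theory item statistics for dichotomously scored items:
# difficulty (proportion correct) and point-biserial discrimination
# (correlation of the item score with the rest score).
from statistics import mean, pstdev

def item_statistics(responses):
    """responses: list of rows, one per student, with 0/1 per item."""
    n_items = len(responses[0])
    stats = []
    for i in range(n_items):
        item = [row[i] for row in responses]
        # Difficulty: proportion of students answering this item correctly.
        difficulty = mean(item)
        # Rest score: total on the other items, avoiding self-correlation.
        rest = [sum(row) - row[i] for row in responses]
        # Point-biserial: Pearson correlation of item score with rest score.
        mi, mr = mean(item), mean(rest)
        cov = mean((x - mi) * (y - mr) for x, y in zip(item, rest))
        sd_i, sd_r = pstdev(item), pstdev(rest)
        disc = cov / (sd_i * sd_r) if sd_i and sd_r else 0.0
        stats.append((difficulty, disc))
    return stats
```

An item with near-zero or negative discrimination, as reported for item K under summative use, would stand out in this kind of per-item summary.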

  12. The meaning of "independence" for older people in different residential settings.

    PubMed

    Hillcoat-Nallétamby, Sarah

    2014-05-01

    Drawing on older people's understandings of "independence" and Collopy's work on autonomy, the article elaborates an interpretive framework of the concept in relation to 3 residential settings-the private dwelling-home, the extra-care, and the residential-care settings. Data include 91 qualitative interviews with frail, older people living in each setting, collected as part of a larger Welsh study. Thematic analysis techniques were employed to identify patterns in meanings of independence across settings and then interpreted using Collopy's conceptualizations of autonomy, as well as notions of space and interdependencies. Independence has multiple meanings for older people, but certain meanings are common to all settings: Accepting help at hand; doing things alone; having family, friends, and money as resources; and preserving physical and mental capacities. Concepts of delegated, executional, authentic, decisional, and consumer autonomy, as well as social interdependencies and spatial and social independence, do provide appropriate higher order interpretive constructs of these meanings across settings. A broader interpretive framework of "independence" should encompass concepts of relative independence, autonomy(ies), as well as spatial and social independence, and can provide more nuanced interpretations of structured dependency and institutionalization theories when applied to different residential settings.

  13. Efficacy of hair analysis for monitoring exposure to uranium: a mini-review.

    PubMed

    Joksić, Agnes Šömen; Katz, Sidney A

    2014-01-01

    In spite of the ease with which samples may be collected and the stability of the samples after collection, the use of hair mineral analysis for monitoring environmental exposures and evaluating heavy metal poisonings has remained controversial since its initial applications for these purposes in the early 1950s. The major arguments against hair mineral analysis have been the absence of biokinetic models and/or metabolic data that adequately describe the incorporation of trace elements into hair, the absence of correlations between the concentrations of trace elements in hair and their concentrations in other tissues, the inability to distinguish between trace elements deposited in the hair endogenously and those deposited on the hair exogenously, the absence of reliable reference ranges for interpreting the results of hair mineral analysis, and the lack of standard procedures for collecting, preparing, and analyzing hair samples. The developments of the past two decades addressing these objections are reviewed here, and arguments supporting the use of hair analysis for monitoring environmental and/or occupational exposures to uranium are made on the basis of the information presented in this review.

  14. Introducing W.A.T.E.R.S.: a workflow for the alignment, taxonomy, and ecology of ribosomal sequences.

    PubMed

    Hartman, Amber L; Riddle, Sean; McPhillips, Timothy; Ludäscher, Bertram; Eisen, Jonathan A

    2010-06-12

    For more than two decades, microbiologists have used a highly conserved microbial gene as a phylogenetic marker for bacteria and archaea. The small-subunit ribosomal RNA gene, also known as 16S rRNA, is encoded by ribosomal DNA, 16S rDNA, and has provided a powerful comparative tool to microbial ecologists. Over time, the microbial ecology field has matured from small-scale studies in a select number of environments to massive collections of sequence data that are paired with dozens of corresponding collection variables. As the complexity of data and tool sets has grown, the need for flexible automation and maintenance of the core processes of 16S rDNA sequence analysis has increased correspondingly. We present WATERS, an integrated approach to 16S rDNA analysis that bundles a suite of publicly available 16S rDNA analysis software tools into a single software package. The "toolkit" includes sequence alignment, chimera removal, OTU determination, taxonomy assignment, and phylogenetic tree construction, as well as a host of ecological analysis and visualization tools. WATERS employs a flexible, collection-oriented "workflow" approach using the open-source Kepler system as a platform. By packaging available software tools into a single automated workflow, WATERS simplifies 16S rDNA analyses, especially for those without specialized bioinformatics or programming expertise. In addition, WATERS, like some of the newer comprehensive rRNA analysis tools, allows researchers to minimize the time dedicated to carrying out tedious informatics steps and to focus their attention instead on the biological interpretation of the results. One advantage of WATERS over other comprehensive tools is that the use of the Kepler workflow system facilitates result interpretation and reproducibility via a data provenance sub-system.
Furthermore, new "actors" can be added to the workflow as desired and we see WATERS as an initial seed for a sizeable and growing repository of interoperable, easy-to-combine tools for asking increasingly complex microbial ecology questions.
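One step in pipelines like this, OTU determination, can be illustrated with a toy greedy clustering of pre-aligned sequences at a 97% identity threshold. The real workflow wraps dedicated tools; this sketch (function names and sequences are hypothetical) only shows the idea:

```python
# Greedy OTU clustering of pre-aligned 16S rDNA sequences: each sequence
# joins the first OTU whose centroid it matches at >= threshold identity,
# otherwise it founds a new OTU.
def identity(a, b):
    """Fraction of matching positions between two equal-length aligned sequences."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def greedy_otus(seqs, threshold=0.97):
    """Return a list of (centroid, members) OTU clusters."""
    otus = []
    for s in seqs:
        for centroid, members in otus:
            if identity(s, centroid) >= threshold:
                members.append(s)
                break
        else:
            otus.append((s, [s]))  # no centroid matched: start a new OTU
    return otus
```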

  15. Palaeoenvironmental reconstructions on the Mozambique coast as a tool to understand human evolution: from modern analogues to borehole interpretation

    NASA Astrophysics Data System (ADS)

    Gomes, Ana; Skosey-LaLonde, Elena; Zinsious, Brandon; Gonçalves, Célia; Bicho, Nuno; Raja, Mussa; Cascalheira, João; Haws, Jonathan

    2017-04-01

    In the framework of the project "Stone Age Vilankulos: Modern Human Origins Research South of the Rio Save, Mozambique" a geoarchaeological survey was conducted in 2016 aiming to better understand the environmental history and landscape evolution of the study area, including the environmental context of human occupation. During the survey, 23 sediment surface samples were collected across a variety of environments, namely: a freshwater environment - the Elephant River basin in Southwestern Mozambique - and brackish and marine tidal environments - the Inhambane coastal area, Southeastern Mozambique. These samples will be used as modern analogues to interpret the sedimentological and paleontological record of 4 cores collected in a mangrove area of the Inhambane estuary and then reconstruct its palaeoenvironmental evolution. All the sampling points were georeferenced and the study area was overflown with a drone to collect photogrammetric data. Both surface and core samples were used for diatom, texture and geochemical analysis. Diatoms will be used as the main palaeontological proxy, because they are unicellular algae with a short life cycle that are highly sensitive to environmental variables such as salinity, sediment texture, and duration of tidal inundation. Preliminary data from the modern diatom analysis showed that diatom diversity is high and equitability is low in all environments. Sedimentological descriptions and dating of the cores are also presented. The work was supported by the project PTDC/EPHARQ/4168/2014, funded by the Portuguese Foundation for Science and Technology.
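Diversity and equitability of an assemblage are commonly quantified with the Shannon index H' and Pielou's evenness J' = H'/ln(S). A minimal sketch (the counts are hypothetical, and the abstract does not state which indices were used):

```python
# Shannon diversity H' and Pielou's evenness J' from species counts.
import math

def shannon(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over nonzero proportions."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

def pielou_evenness(counts):
    """J' = H' / ln(S), where S is the number of species present."""
    s = sum(1 for c in counts if c > 0)
    return shannon(counts) / math.log(s) if s > 1 else 0.0
```

High diversity with low equitability, as reported here, corresponds to many species present but with strongly unequal abundances (J' well below 1).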

  16. Multiple-Locus Variable-Number Tandem-Repeat Analysis in Genotyping Yersinia enterocolitica Strains from Human and Porcine Origins

    PubMed Central

    Laukkanen-Ninios, R.; Ortiz Martínez, P.; Siitonen, A.; Fredriksson-Ahomaa, M.; Korkeala, H.

    2013-01-01

    Sporadic and epidemiologically linked Yersinia enterocolitica strains (n = 379) isolated from fecal samples from human patients, tonsil or fecal samples from pigs collected at slaughterhouses, and pork samples collected at meat stores were genotyped using multiple-locus variable-number tandem-repeat analysis (MLVA) with six loci, i.e., V2A, V4, V5, V6, V7, and V9. In total, 312 different MLVA types were found. Similar types were detected (i) in fecal samples collected from human patients over 2 to 3 consecutive years, (ii) in samples from humans and pigs, and (iii) in samples from pigs that originated from the same farms. Among porcine strains, we found farm-specific MLVA profiles. Variations in the numbers of tandem repeats from one to four for variable-number tandem-repeat (VNTR) loci V2A, V5, V6, and V7 were observed within a farm. MLVA was applicable for serotypes O:3, O:5,27, and O:9 and appeared to be a highly discriminating tool for distinguishing sporadic and outbreak-related strains. With long-term use, interpretation of the results became more challenging due to variations in more-discriminating loci, as was observed for strains originating from pig farms. Additionally, we encountered unexpectedly short V2A VNTR fragments and sequenced them. According to the sequencing results, updated guidelines for interpreting V2A VNTR results were prepared. PMID:23637293
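MLVA typing as described above amounts to comparing tuples of repeat counts at the six VNTR loci. A minimal sketch of that comparison (strain IDs and repeat counts are hypothetical):

```python
# Compare MLVA profiles: each strain is a tuple of repeat counts at the
# six VNTR loci (V2A, V4, V5, V6, V7, V9); identical tuples share a type.
LOCI = ("V2A", "V4", "V5", "V6", "V7", "V9")

def loci_differing(a, b):
    """Number of VNTR loci at which two profiles differ."""
    return sum(x != y for x, y in zip(a, b))

def group_by_type(profiles):
    """Map each distinct MLVA type (tuple) to the strain IDs sharing it."""
    types = {}
    for strain_id, profile in profiles.items():
        types.setdefault(profile, []).append(strain_id)
    return types
```

Profiles differing at a single, highly variable locus (as observed within farms for V2A, V5, V6 and V7) would show `loci_differing == 1`, which is why long-term interpretation needs locus-specific guidelines.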

  17. SP mountain data analysis

    NASA Technical Reports Server (NTRS)

    Rawson, R. F.; Hamilton, R. E.; Liskow, C. L.; Dias, A. R.; Jackson, P. L.

    1981-01-01

    An analysis of synthetic aperture radar data of SP Mountain was undertaken to demonstrate the use of digital image processing techniques to aid in geologic interpretation of SAR data. These data were collected with the ERIM X- and L-band airborne SAR using like- and cross-polarizations. The resulting signal films were used to produce computer compatible tapes, from which four-channel imagery was generated. Slant range-to-ground range and range-azimuth-scale corrections were made in order to facilitate image registration; intensity corrections were also made. Manual interpretation of the imagery showed that L-band represented the geology of the area better than X-band. Several differences between the various images were also noted. Further digital analysis of the corrected data was done for enhancement purposes. This analysis included application of an MSS differencing routine and development of a routine for removal of relief displacement. It was found that accurate registration of the SAR channels is critical to the effectiveness of the differencing routine. Use of the relief displacement algorithm on the SP Mountain data demonstrated the feasibility of the technique.
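The slant-range-to-ground-range correction mentioned above has a simple flat-terrain form: given platform altitude h, a slant range sr maps to a ground range of sqrt(sr² − h²). A sketch with illustrative values (the actual ERIM processing chain is not documented here):

```python
# Flat-terrain slant-range to ground-range conversion for airborne SAR.
import math

def ground_range(slant_range, altitude):
    """Ground range for a given slant range and platform altitude (same units)."""
    if slant_range < altitude:
        raise ValueError("slant range cannot be less than platform altitude")
    return math.sqrt(slant_range**2 - altitude**2)
```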

  18. Applicability of SWOT analysis for measuring quality of public oral health services as perceived by adult patients in Finland. Strengths, weaknesses, opportunities and threats.

    PubMed

    Toivanen, T; Lahti, S; Leino-Kilpi, H

    1999-10-01

    To determine the applicability of SWOT analysis for measuring the quality of public oral health services from the adult client's perspective. Data were collected using a structured questionnaire developed in an earlier study. The study group consisted of all adult (over 18 years of age) clients (n = 256) using public municipal oral health services in Kirkkonummi, Finland, during 2 weeks in 1995. Before treatment, patients filled out a questionnaire that measured the importance of their expectations in different aspects of oral care. After the appointment, they filled out a similar questionnaire that measured the enactment of these expectations in the treatment situation. The response rate was 51%. The difference between subjective importance and enactment of expectations was tested by Wilcoxon's signed rank test. Results were interpreted using both a conventional analysis of "expectation enacted or not" and SWOT analysis, which is used in strategic planning to identify areas of strengths (S), weaknesses (W), opportunities (O) and threats (T) in an organisation. In 28 statements out of 35, the two analyses revealed similar interpretations. In most areas the patient-perceived quality of the services was good. Weaknesses were found in the following areas: communicating to patients the causes and risk of developing oral diseases, informing them about different treatment possibilities, and including patients in decision-making when choosing restorative materials. SWOT analysis provided a more structured interpretation of the results, which can be more easily transferred to the development of services.
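The paired comparison of importance and enactment ratings uses Wilcoxon's signed-rank test. A minimal stdlib sketch of the statistic with a normal approximation (ratings are hypothetical; a statistics package would normally be used instead):

```python
# Wilcoxon signed-rank statistic for paired ratings: drop zero differences,
# rank absolute differences with average ranks for ties, sum the ranks of
# positive differences, and form a normal-approximation z score.
import math

def wilcoxon_signed_rank(before, after):
    diffs = [a - b for a, b in zip(after, before) if a != b]
    n = len(diffs)
    # Rank absolute differences, averaging ranks across ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average of positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    # Normal approximation to the null distribution of W+.
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma if sigma else 0.0
    return w_plus, z
```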

  19. DOMstudio: an integrated workflow for Digital Outcrop Model reconstruction and interpretation

    NASA Astrophysics Data System (ADS)

    Bistacchi, Andrea

    2015-04-01

    Different remote sensing technologies, including photogrammetry and LIDAR, allow collecting 3D datasets that can be used to create 3D digital representations of outcrop surfaces, called Digital Outcrop Models (DOM), or sometimes Virtual Outcrop Models (VOM). Irrespective of the remote sensing technique used, DOMs can be represented either by photorealistic point clouds (PC-DOM) or textured surfaces (TS-DOM). The former are datasets composed of millions of points with XYZ coordinates and RGB colour, whilst the latter are triangulated surfaces onto which images of the outcrop have been mapped or "textured" (applying a technology originally developed for movies and videogames). Here we present a workflow that exploits both kinds of dataset, PC-DOMs and TS-DOMs, in an integrated and efficient, yet flexible, way. The workflow is composed of three main steps: (1) data collection and processing, (2) interpretation, and (3) modelling. Data collection can be performed with photogrammetry, LIDAR, or other techniques. The quality of photogrammetric datasets obtained with Structure From Motion (SFM) techniques has improved tremendously over the past few years, and this is becoming the most effective way to collect DOM datasets. The main advantages of photogrammetry over LIDAR are the very simple and lightweight field equipment (a digital camera) and the arbitrary spatial resolution, which can be increased simply by getting closer to the outcrop or by using a different lens. Concerns about the precision of close-range photogrammetric surveys, though justified in the past, are no longer a problem when modern software and acquisition schemes are applied. In any case, LIDAR is a well-tested technology and is still very common. Irrespective of the data collection technology, the output will be a photorealistic point cloud and a collection of oriented photos, plus additional imagery in special projects (e.g. infrared images). This dataset can be used as-is (PC-DOM), or a 3D triangulated surface can be interpolated from the point cloud and the images used to associate a texture with this surface (TS-DOM). In the DOMstudio workflow we use both PC-DOMs and TS-DOMs. In particular, the latter are obtained by projecting the original images onto the triangulated surface without any downsampling, thus retaining the original resolution and quality of the images collected in the field. In the DOMstudio interpretation step, the PC-DOM is considered the best option for fracture analysis in outcrops where facets corresponding to fractures are present. This allows orientation statistics (e.g. stereoplots, Fisher statistics, etc.) to be obtained directly from a point cloud in which, for each point, the unit vector normal to the outcrop surface has been calculated. A recent development in this kind of processing is the possibility to automatically select (segment) subset point clouds representing single fracture surfaces, which can be used in studies of fracture length, spacing, etc., yielding parameters like the frequency-length distribution, P21, etc. PC-DOM interpretation can be combined or complemented, depending on the outcrop morphology, with an interpretation carried out on a TS-DOM in terms of traces, which are the linear intersections of "geological" surfaces (fractures, faults, bedding, etc.) with the outcrop surface. This kind of interpretation is very well suited to outcrops with smooth surfaces and can be performed either by manual picking or by applying image analysis techniques to the images associated with the DOM. In this case, a huge mass of data, at very high resolution, can be collected very effectively. For applications like lithological or mineral mapping, TS-DOM datasets are the only suitable support. Finally, the DOMstudio workflow produces output in formats compatible with all common geomodelling packages (e.g. Gocad/Skua, Petrel, Move), allowing the quantitative data collected on DOMs to be used directly to generate and calibrate geological, structural, or geostatistical models. I will present examples of applications including hydrocarbon reservoir analogue studies, studies of fault zone architecture, lithological mapping of sedimentary and metamorphic rocks, and studies of the surfaces of planets and small bodies in the Solar System.
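The orientation-statistics step mentioned above can be sketched: given unit normals estimated at the points of a segmented fracture surface, compute the resultant mean direction and the Fisher concentration parameter, commonly estimated as κ ≈ (N−1)/(N−R). The normals below are hypothetical:

```python
# Fisher statistics from unit normal vectors: resultant length R, mean
# direction, and the concentration estimate kappa ~ (N - 1) / (N - R).
import math

def fisher_stats(normals):
    n = len(normals)
    rx = sum(v[0] for v in normals)
    ry = sum(v[1] for v in normals)
    rz = sum(v[2] for v in normals)
    r = math.sqrt(rx * rx + ry * ry + rz * rz)  # resultant vector length
    mean_dir = (rx / r, ry / r, rz / r)
    # Tightly clustered normals give R close to N, hence large kappa.
    kappa = (n - 1) / (n - r) if n > r else float("inf")
    return mean_dir, kappa
```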

  20. Understanding the behavior of Giardia and Cryptosporidium in an urban watershed: Explanation and application of techniques to collect and evaluate monitoring data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crockett, C.S.; Haas, C.N.

    1996-11-01

    Due to current proposed regulations requiring monitoring for protozoans and demonstration of adequate protozoan removal depending on source water concentrations detected, many utilities are considering or are engaged in protozoan monitoring activities within their watershed so that proper watershed management and treatment modifications can reduce their impact on drinking water safety and quality. However, due to the difficulties associated with the current analytical methods and sample collection, many sampling efforts collect data that cannot be interpreted or lack the tools to interpret the information obtained. Therefore, it is necessary to determine how to develop an effective sampling program tailored to a utility's specific needs to provide interpretable data, and to develop tools for evaluating such data. The following case study describes the process by which a utility learned how to collect and interpret monitoring data for its specific needs and provides concepts and tools which other utilities can use to aid in their own macro- and micro-watershed management efforts.

  1. Distributed data collection for a database of radiological image interpretations

    NASA Astrophysics Data System (ADS)

    Long, L. Rodney; Ostchega, Yechiam; Goh, Gin-Hua; Thoma, George R.

    1997-01-01

    The National Library of Medicine, in collaboration with the National Center for Health Statistics and the National Institute for Arthritis and Musculoskeletal and Skin Diseases, has built a system for collecting radiological interpretations for a large set of x-ray images acquired as part of the data gathered in the second National Health and Nutrition Examination Survey. This system is capable of delivering across the Internet 5- and 10-megabyte x-ray images to Sun workstations equipped with X Window based 2048 X 2560 image displays, for the purpose of having these images interpreted for the degree of presence of particular osteoarthritic conditions in the cervical and lumbar spines. The collected interpretations can then be stored in a database at the National Library of Medicine, under control of the Illustra DBMS. This system is a client/server database application which integrates (1) distributed server processing of client requests, (2) a customized image transmission method for faster Internet data delivery, (3) distributed client workstations with high resolution displays, image processing functions and an on-line digital atlas, and (4) relational database management of the collected data.

  2. OzFlux data: network integration from collection to curation

    NASA Astrophysics Data System (ADS)

    Isaac, Peter; Cleverly, James; McHugh, Ian; van Gorsel, Eva; Ewenz, Cacilia; Beringer, Jason

    2017-06-01

    Measurement of the exchange of energy and mass between the surface and the atmospheric boundary-layer by the eddy covariance technique has undergone great change in the last 2 decades. Early studies of these exchanges were confined to brief field campaigns in carefully controlled conditions followed by months of data analysis. Current practice is to run tower-based eddy covariance systems continuously over several years due to the need for continuous monitoring as part of a global effort to develop local-, regional-, continental- and global-scale budgets of carbon, water and energy. Efficient methods of processing the increased quantities of data are needed to maximise the time available for analysis and interpretation. Standardised methods are needed to remove differences in data processing as possible contributors to observed spatial variability. Furthermore, public availability of these data sets assists with undertaking global research efforts. The OzFlux data path has been developed (i) to provide a standard set of quality control and post-processing tools across the network, thereby facilitating inter-site integration and spatial comparisons; (ii) to increase the time available to researchers for analysis and interpretation by reducing the time spent collecting and processing data; (iii) to propagate both data and metadata to the final product; and (iv) to facilitate the use of the OzFlux data by adopting a standard file format and making the data available from web-based portals. Discovery of the OzFlux data set is facilitated through incorporation in FLUXNET data syntheses and the publication of collection metadata via the RIF-CS format. This paper serves two purposes. The first is to describe the data sets, along with their quality control and post-processing, for the other papers of this Special Issue. 
The second is to provide an example of one solution to the data collection and curation challenges that are encountered by similar flux tower networks worldwide.
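The standardised quality-control pass described above can be illustrated with a toy flagging routine: reject samples outside plausible range limits and flag implausible jumps between consecutive records. The thresholds and flag names are hypothetical, not OzFlux's actual scheme:

```python
# Simple per-sample QC flags for a flux time series: "range" for values
# outside [lo, hi], "spike" for jumps larger than a threshold relative to
# the previous sample, "ok" otherwise.
def qc_flags(values, lo, hi, spike):
    flags = []
    for i, v in enumerate(values):
        if not (lo <= v <= hi):
            flags.append("range")
        elif i > 0 and abs(v - values[i - 1]) > spike:
            flags.append("spike")
        else:
            flags.append("ok")
    return flags
```

Note this naive spike test also flags the first good value after a bad one; production QC tools compare against despiked or gap-filled neighbours instead.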

  3. Diagnosing collisions of magnetized, high energy density plasma flows using a combination of collective Thomson scattering, Faraday rotation, and interferometry (invited)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swadling, G. F., E-mail: swadling@imperial.ac.uk; Lebedev, S. V.; Hall, G. N.

    2014-11-15

    A suite of laser based diagnostics is used to study interactions of magnetised, supersonic, radiatively cooled plasma flows produced using the Magpie pulsed power generator (1.4 MA, 240 ns rise time). Collective optical Thomson scattering measures the time-resolved local flow velocity and temperature across 7–14 spatial positions. The scattering spectrum is recorded from multiple directions, allowing more accurate reconstruction of the flow velocity vectors. The areal electron density is measured using 2D interferometry; optimisation and analysis are discussed. The Faraday rotation diagnostic, operating at 1053 nm, measures the magnetic field distribution in the plasma. Measurements obtained simultaneously by these diagnostics are used to constrain analysis, increasing the accuracy of interpretation.

  4. NASA/BLM APT, phase 2. Volume 2: Technology demonstration. [Arizona

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Techniques described include: (1) steps in the preprocessing of LANDSAT data; (2) the training of a classifier; (3) maximum likelihood classification and precision; (4) geometric correction; (5) class description; (6) digitizing; (7) digital terrain data; (8) an overview of sample design; (9) allocation and selection of primary sample units; (10) interpretation of secondary sample units; (11) data collection ground plots; (12) data reductions; (13) analysis for productivity estimation and map verification; (14) cost analysis; and (15) LANDSAT digital products. The evaluation of the pre-inventory planning for P.J. is included.

  5. Hearing children of Deaf parents: Gender and birth order in the delegation of the interpreter role in culturally Deaf families.

    PubMed

    Moroe, Nomfundo F; de Andrade, Victor

    2018-01-01

    Culturally, hearing children born to Deaf parents may have to mediate two different positions within the hearing and Deaf cultures. However, there appears to be little written about the experiences of hearing children born to Deaf parents in the South African context. This study sought to investigate the roles of children of Deaf adults (CODAs) as interpreters in Deaf-parented families, more specifically, the influence of gender and birth order in language brokering. Two male and eight female participants between the ages of 21 and 40 years were recruited through purposive and snowball sampling strategies. A qualitative design was employed and data were collected using a semi-structured, open-ended interview format. Themes which emerged were analysed using thematic analysis. The findings indicated that there was no formal assignment of the interpreter role; however, female children tended to assume the role of interpreter more often than the male children. Also, it appeared as though the older children shifted the responsibility for interpreting to younger siblings. The participants in this study indicated that they interpreted in situations where they felt they were not developmentally or emotionally ready, or in situations which they felt were better suited for older siblings or for siblings of another gender. This study highlights a need for the formalisation of interpreting services for Deaf people in South Africa in the form of professional interpreters rather than the reliance on hearing children as interpreters in order to mediate between Deaf and hearing cultures.

  6. Hearing children of Deaf parents: Gender and birth order in the delegation of the interpreter role in culturally Deaf families

    PubMed Central

    de Andrade, Victor

    2018-01-01

    Background Culturally, hearing children born to Deaf parents may have to mediate two different positions within the hearing and Deaf cultures. However, there appears to be little written about the experiences of hearing children born to Deaf parents in the South African context. Objective This study sought to investigate the roles of children of Deaf adults (CODAs) as interpreters in Deaf-parented families, more specifically, the influence of gender and birth order in language brokering. Method Two male and eight female participants between the ages of 21 and 40 years were recruited through purposive and snowball sampling strategies. A qualitative design was employed and data were collected using a semi-structured, open-ended interview format. Themes which emerged were analysed using thematic analysis. Results The findings indicated that there was no formal assignment of the interpreter role; however, female children tended to assume the role of interpreter more often than the male children. Also, it appeared as though the older children shifted the responsibility for interpreting to younger siblings. The participants in this study indicated that they interpreted in situations where they felt they were not developmentally or emotionally ready, or in situations which they felt were better suited for older siblings or for siblings of another gender. Conclusion This study highlights a need for the formalisation of interpreting services for Deaf people in South Africa in the form of professional interpreters rather than the reliance on hearing children as interpreters in order to mediate between Deaf and hearing cultures. PMID:29850437

  7. Validation of Alternatives to High Volatile Organic Compound Solvents Used in Aeronautical Antifriction Bearing Cleaning

    DTIC Science & Technology

    2006-10-17

    Depot level maintenance cleaning. Data analysis and interpretation are based on analytical test results as well as visual inspections performed on

  8. Assessing GPS Constellation Resiliency in an Urban Canyon Environment

    DTIC Science & Technology

    2015-03-26

    Taipei, Taiwan as his area of interest. His GPS constellation is modeled in the Satellite Toolkit (STK) where augmentation satellites can be added and... interaction. SEAS also provides a visual display of the simulation which is useful for verification and debugging portions of the analysis. Furthermore... entire system. Interpreting the model is aided by the visual display of the agents moving in the region of interest. Furthermore, SEAS collects

  9. A Componential Interpretation of the General Factor in Human Intelligence.

    DTIC Science & Technology

    1980-10-01

    individual-differences data. If one delves into the nature of variation across stimulus types rather than across subjects, however, a result... the conclusion we will reach from an evaluation of the data we have collected, we assert here that individual differences in general intelligence can... data. The technique we used was nonmetric multidimensional scaling rather than factor analysis, however (see Kruskal, 1964a, 1964b; Shepard, 1962a

  10. Collectivity, Distributivity, and the Interpretation of Plural Numerical Expressions in Child and Adult Language

    ERIC Educational Resources Information Center

    Syrett, Kristen; Musolino, Julien

    2013-01-01

    Sentences containing plural numerical expressions (e.g., "two boys") can give rise to two interpretations (collective and distributive), arising from the fact that their representation admits of a part-whole structure. We present the results of a series of experiments designed to explore children's understanding of this distinction…

  11. Urban environmental health applications of remote sensing

    NASA Technical Reports Server (NTRS)

    Rush, M.; Goldstein, J.; Hsi, B. P.; Olsen, C. B.

    1974-01-01

    An urban area was studied through the use of the inventory-by-surrogate method rather than by direct interpretation of photographic imagery. Prior uses of remote sensing in urban and public health research are examined. The effects of crowding, poor housing conditions, air pollution, and street conditions on public health are considered. Color infrared photography was used to categorize land use features and the grid method was used in photo interpretation analysis. The incidence of shigella and salmonella, hepatitis, meningitis, tuberculosis, myocardial infarction and venereal disease were studied, together with mortality and morbidity rates. Sample census data were randomly collected and validated. The hypothesis that land use and residential quality are associated with and act as an influence upon health and physical well-being was studied and confirmed.

  12. Aerodynamics in the amusement park: interpreting sensor data for acceleration and rotation

    NASA Astrophysics Data System (ADS)

    Löfstrand, Marcus; Pendrill, Ann-Marie

    2016-09-01

    The sky roller ride depends on interaction with the air to create a rolling motion. In this paper, we analyse forces, torque and angular velocities during different parts of the ride, combining theoretical analysis with photos and videos, as well as accelerometer and gyroscopic data that may be collected, e.g., with a smartphone. When interpreting the results, it must be taken into account that the sensors and their coordinate system rotate together with the rider. The sky roller offers many examples for physics education, from simple circular motion to acceleration and rotation involving several axes, as well as the relation between wing orientation, torque and angular velocities, and using barometric pressure to determine the elevation gain.
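Because the sensors rotate with the rider, interpreting the accelerometer data requires undoing the accumulated roll. A minimal sketch: integrate the gyroscope rate about the ride axis to track the roll angle, then rotate the in-plane accelerometer components back into a fixed frame (sample data, axis convention, and function name are illustrative, not the paper's actual procedure):

```python
# Rotate accelerometer components from the rider's rotating frame back
# into a fixed frame, using the gyroscope roll rate about the x axis.
import math

def unroll(acc_samples, gyro_rates, dt):
    """acc_samples: (ay, az) pairs in the sensor frame;
    gyro_rates: roll rate (rad/s) about the ride axis at each sample;
    dt: sampling interval (s)."""
    theta, fixed = 0.0, []
    for (ay, az), omega in zip(acc_samples, gyro_rates):
        c, s = math.cos(theta), math.sin(theta)
        # Rotate the sensor-frame components back through the accumulated roll.
        fixed.append((c * ay - s * az, s * ay + c * az))
        theta += omega * dt  # simple Euler integration of the roll rate
    return fixed
```

After a quarter turn, gravity that appears on the sensor's y axis is restored to the fixed z axis, which is the kind of check the paper's combined photo/video/sensor analysis enables.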

  13. Growthcurver: an R package for obtaining interpretable metrics from microbial growth curves.

    PubMed

    Sprouffske, Kathleen; Wagner, Andreas

    2016-04-19

    Plate readers can measure the growth curves of many microbial strains in a high-throughput fashion. The hundreds of absorbance readings collected simultaneously for hundreds of samples create technical hurdles for data analysis. Growthcurver summarizes the growth characteristics of microbial growth curve experiments conducted in a plate reader. The data are fitted to a standard form of the logistic equation, and the fitted parameters have clear interpretations in terms of population-level characteristics such as doubling time, carrying capacity, and growth rate. Growthcurver is an easy-to-use R package available for installation from the Comprehensive R Archive Network (CRAN). The source code is available under the GNU General Public License and can be obtained from GitHub (Sprouffske K, Growthcurver sourcecode, 2016).
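
    The logistic fit at the heart of this kind of analysis can be sketched independently (hypothetical data and SciPy rather than the package's R code):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, k, n0, r):
    # Standard logistic growth form: carrying capacity k, initial
    # population n0, intrinsic growth rate r.
    return k / (1 + ((k - n0) / n0) * np.exp(-r * t))

# Synthetic noiseless absorbance readings (invented values: k=1.2,
# n0=0.05, r=0.8), standing in for one well of a plate reader:
t = np.linspace(0, 24, 49)
od = logistic(t, 1.2, 0.05, 0.8)

(k, n0, r), _ = curve_fit(logistic, t, od, p0=[1.0, 0.1, 0.5])
doubling_time = np.log(2) / r   # population doubling time from r
```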

  14. How Does an Environmental Educator Address Student Engagement in a Meaningful Watershed Educational Experience (MWEE)?

    NASA Astrophysics Data System (ADS)

    Char, Chelia

    Children represent the future, and thus by providing them with effective environmental educational experiences, educators may be taking a critical step in preventing "the probable serious environmental problems in the future" (Gokhan, 2010, p. 56). The Meaningful Watershed Educational Experience (MWEE) is an excellent example of one such education program. MWEEs aim to educate and enhance students' relationships with the Chesapeake Bay Watershed through an integration of classroom activities and fieldwork. As environmental educators and role models, field interpreters are a major component of, and a significant influence on, the local MWEE programs; however, their perspectives on how they have impacted the programs have yet to be examined. Through a qualitative analysis with a specific focus on the behavioral, emotional, and cognitive dimensions of student engagement, the researcher intended to address this void. The focus of the study was to examine how the local MWEE field interpreters understood and addressed student engagement in a field setting. This was measured via data collected from observations of, and semi-structured one-on-one interviews with, each field interpreter involved with the local MWEE programs. Data analysis uncovered that the field interpreters demonstrated a strong awareness of student engagement. Furthermore, they defined, recognized, and addressed student engagement within the constructs of the emotional, behavioral, and cognitive dimensions. Ultimately, the individual experiences of each MWEE field interpreter provide insight into the phenomenon; however, further research is required to strengthen the awareness of how, if at all, their perspectives of student engagement directly impact student outcomes.

  15. Dynamics of essential collective motions in proteins: Theory

    NASA Astrophysics Data System (ADS)

    Stepanova, Maria

    2007-11-01

    A general theoretical background is introduced for characterization of conformational motions in protein molecules, and for building reduced coarse-grained models of proteins, based on the statistical analysis of their phase trajectories. Using the projection operator technique, a system of coupled generalized Langevin equations is derived for essential collective coordinates, which are generated by principal component analysis of molecular dynamic trajectories. The number of essential degrees of freedom is not limited in the theory. An explicit analytic relation is established between the generalized Langevin equation for essential collective coordinates and that for the all-atom phase trajectory projected onto the subspace of essential collective degrees of freedom. The theory introduced is applied to identify correlated dynamic domains in a macromolecule and to construct coarse-grained models representing the conformational motions in a protein through a few interacting domains embedded in a dissipative medium. A rigorous theoretical background is provided for identification of dynamic correlated domains in a macromolecule. Examples of domain identification in protein G are given and employed to interpret NMR experiments. Challenges and potential outcomes of the theory are discussed.
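
    The principal component step that generates the essential collective coordinates can be sketched as follows; a toy two-dimensional-mode "trajectory" stands in for real molecular dynamics data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "trajectory": 500 frames of 6 coordinates with one dominant
# collective mode plus small fluctuations (a hypothetical stand-in
# for an all-atom MD trajectory).
frames = 500
mode = rng.normal(size=frames)
traj = np.outer(mode, [1.0, 0.8, 0.6, 0.1, 0.0, 0.0])
traj += 0.05 * rng.normal(size=traj.shape)

x = traj - traj.mean(axis=0)           # remove the average structure
cov = x.T @ x / (frames - 1)           # covariance of the coordinates
evals, evecs = np.linalg.eigh(cov)
order = np.argsort(evals)[::-1]        # sort modes by variance
evals, evecs = evals[order], evecs[:, order]

essential = x @ evecs[:, :2]           # essential collective coordinates
explained = evals[0] / evals.sum()     # variance captured by mode 1
```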

  16. Geostatistics for spatial genetic structures: study of wild populations of perennial ryegrass.

    PubMed

    Monestiez, P; Goulard, M; Charmet, G

    1994-04-01

    Methods based on geostatistics were applied to quantitative traits of agricultural interest measured on a collection of 547 wild populations of perennial ryegrass in France. The mathematical background of these methods, which resembles spatial autocorrelation analysis, is briefly described. When a single variable is studied, the spatial structure analysis is similar to spatial autocorrelation analysis, and a spatial prediction method, called "kriging", gives a filtered map of the spatial pattern over all the sampled area. When complex interactions of agronomic traits with different evaluation sites define a multivariate structure for the spatial analysis, geostatistical methods allow the spatial variations to be broken down into two main spatial structures with ranges of 120 km and 300 km, respectively. The predicted maps that corresponded to each range were interpreted as a result of the isolation-by-distance model and as a consequence of selection by environmental factors. Practical collecting methodology for breeders may be derived from such spatial structures.
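
    The building block of such geostatistical analyses, the empirical semivariogram, can be sketched as follows (synthetic coordinates and trait values, not the study's data):

```python
import numpy as np

def empirical_semivariogram(coords, values, bins):
    """Classical estimator: gamma(h) = half the mean squared difference
    between pairs of observations, binned by separation distance."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)      # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = np.empty(len(bins) - 1)
    for i in range(len(bins) - 1):
        m = (d >= bins[i]) & (d < bins[i + 1])
        gamma[i] = 0.5 * sq[m].mean() if m.any() else np.nan
    return gamma

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(200, 2))     # sampling locations
# A smooth north-south trend plus noise (hypothetical trait values):
values = coords[:, 1] / 50 + rng.normal(scale=0.3, size=200)
gamma = empirical_semivariogram(coords, values,
                                bins=np.array([0, 20, 40, 60, 80]))
```

    A spatially structured variable shows the semivariance rising with distance, which is what kriging then models and filters.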

  17. Antileishmanial compounds from Cordia fragrantissima collected in Burma (Myanmar).

    PubMed

    Mori, Kanami; Kawano, Marii; Fuchino, Hiroyuki; Ooi, Takashi; Satake, Motoyoshi; Agatsuma, Yutaka; Kusumi, Takenori; Sekita, Setsuko

    2008-01-01

    A methanol extract of the wood of Cordia fragrantissima, collected in Burma (Myanmar), was found to exhibit significant activity against Leishmania major. Bioassay-guided fractionation of this extract using several chromatographic techniques afforded three new compounds (1-3) and five known compounds (4-8). The structures of the new compounds were revealed on the basis of spectroscopic data interpretation and by X-ray crystallographic analysis. Interestingly, the new compounds, despite the presence of asymmetric carbons, were found to be racemates. The activities of the isolates from C. fragrantissima and several derivatives were evaluated against the promastigote forms of Leishmania major, L. panamensis, and L. guyanensis.

  18. Ecological tolerances of Miocene larger benthic foraminifera from Indonesia

    NASA Astrophysics Data System (ADS)

    Novak, Vibor; Renema, Willem

    2018-01-01

    To provide a comprehensive palaeoenvironmental reconstruction based on larger benthic foraminifera (LBF), a quantitative analysis of their assemblage composition is needed. Besides microfacies analysis, which includes the environmental preferences of foraminiferal taxa, statistical analyses should also be employed. Therefore, detrended correspondence analysis and cluster analysis were performed on relative abundance data of identified LBF assemblages deposited in mixed carbonate-siliciclastic (MCS) systems and blue-water (BW) settings. Studied MCS system localities include ten sections from the central part of the Kutai Basin in East Kalimantan, ranging from late Burdigalian to Serravallian age. The BW samples were collected from eleven sections of the Bulu Formation on Central Java, dated as Serravallian. Results from the detrended correspondence analysis reveal significant differences between these two environmental settings. The cluster analysis produced five clusters of samples: clusters 1 and 2 comprise dominantly MCS samples, clusters 3 and 4 are dominated by BW samples, and cluster 5 shows a mixed composition with both MCS and BW samples. The clustering results were then subjected to indicator species analysis, which generated three groups among the LBF taxa: typical assemblage indicators, regularly occurring taxa, and rare taxa. By interpreting the results of the detrended correspondence analysis, cluster analysis, and indicator species analysis, along with the environmental preferences of the identified LBF taxa, a palaeoenvironmental model is proposed for the distribution of LBF in Miocene MCS systems and adjacent BW settings of Indonesia.
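
    The cluster-analysis step can be sketched with SciPy on a hypothetical relative-abundance matrix; the taxa and the two assemblage types below are invented stand-ins for the MCS and BW settings, not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
# Hypothetical relative-abundance matrix: 12 samples x 5 taxa, drawn
# from two contrasting assemblage compositions (rows sum to 1).
type_a = rng.dirichlet([16, 8, 2, 2, 2], size=6)   # "MCS-like" samples
type_b = rng.dirichlet([2, 2, 2, 8, 16], size=6)   # "BW-like" samples
abund = np.vstack([type_a, type_b])

z = linkage(abund, method='ward')                  # agglomerative tree
labels = fcluster(z, t=2, criterion='maxclust')    # cut into 2 clusters
```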

  19. Thermal and Evolved Gas Analysis of "Nanophase" Carbonates: Implications for Thermal and Evolved Gas Analysis on Mars Missions

    NASA Technical Reports Server (NTRS)

    Lauer, Howard V., Jr.; Archer, P. D., Jr.; Sutter, B.; Niles, P. B.; Ming, Douglas W.

    2012-01-01

    Data collected by the Mars Phoenix Lander's Thermal and Evolved Gas Analyzer (TEGA) suggested the presence of calcium-rich carbonates, as indicated by a high-temperature CO2 release, while a low-temperature (approx. 400-680 C) CO2 release suggested possible Mg- and/or Fe-carbonates [1,2]. Interpretations of the data collected by Mars remote instruments are made by comparing the mission data to a database on the thermal properties of well-characterized Martian analog materials collected under reduced and Earth-ambient pressures [3,4]. We propose that "nano-phase" carbonates may also be contributing to the low-temperature CO2 release. The objectives of this paper are to (1) characterize the thermal and evolved gas properties of carbonates of varying particle size, (2) evaluate the CO2 releases from CO2-treated CaO samples, and (3) examine the secondary CO2 release from reheated calcite of varying particle size.

  20. The Challenges of Data Rate and Data Accuracy in the Analysis of Volcanic Systems: An Assessment Using Multi-Parameter Data from the 2012-2013 Eruption Sequence at White Island, New Zealand

    NASA Astrophysics Data System (ADS)

    Jolly, A. D.; Christenson, B. W.; Neuberg, J. W.; Fournier, N.; Mazot, A.; Kilgour, G.; Jolly, G. E.

    2014-12-01

    Volcano monitoring is usually undertaken with the collection of both automated and manual data that form a multi-parameter time series having a wide range of sampling rates and measurement accuracies. Assessments of hazards and risks ultimately rely on incorporating this information into usable form, first for scientists to interpret, and then for the public and relevant stakeholders. One important challenge is in building appropriate and efficient strategies to compare and interpret data from these exceptionally different datasets. The White Island volcanic system entered a new eruptive state beginning in mid-2012 and continuing through the present time. Eruptive activity during this period comprised small phreatic and phreato-magmatic events in August 2012, August 2013 and October 2013, and the intrusion of a small dome that was first observed in November 2012. We examine the chemical and geophysical dataset to assess the effects of small magma batches on the shallow hydrothermal system. The analysis incorporates high-data-rate (100 Hz) seismic and infrasound data, lower-data-rate (1 Hz to 5 min sampling interval) GPS, tilt-meter, and gravity data, and very-low-data-rate geochemical time series (sampling intervals from days to months). The analysis is further informed by visual observations of lake level changes, geysering activity through crater lake vents, and changes in fumarolic discharges. We first focus on the problems of incorporating the range of observables into coherent, time-frame-dependent conceptual models. We then show examples where high-data-rate information may be improved through new processing methods and where low-data-rate information may be collected more frequently without loss of fidelity. By this approach we hope to improve the accuracy and efficiency of interpretations of volcano unrest and thereby improve hazard assessments.
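
    One simple way to bring such different data rates onto a common timeline is to aggregate the fast channel and interpolate the slow one. A minimal sketch with invented channels (a 100 Hz waveform and a once-per-minute geochemical series):

```python
import numpy as np

# Hypothetical multi-rate monitoring streams over a 10-minute window.
t_hi = np.arange(0, 600, 0.01)              # 100 Hz timeline (seconds)
seismic = np.sin(2 * np.pi * 5 * t_hi)      # stand-in 5 Hz waveform
t_lo = np.arange(0, 600, 60)                # one sample per minute
chem = np.linspace(10.0, 12.0, len(t_lo))   # slowly drifting quantity

t_common = np.arange(0, 600, 60)            # common 1-minute grid
# Downsample the fast channel: RMS amplitude per one-minute window.
rms = np.sqrt(np.mean(seismic.reshape(len(t_common), -1) ** 2, axis=1))
# Upsample the slow channel by linear interpolation onto the same grid.
chem_i = np.interp(t_common, t_lo, chem)
```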

  1. Cultural competency assessment tool for hospitals: Evaluating hospitals’ adherence to the culturally and linguistically appropriate services standards

    PubMed Central

    Weech-Maldonado, Robert; Dreachslin, Janice L.; Brown, Julie; Pradhan, Rohit; Rubin, Kelly L.; Schiller, Cameron; Hays, Ron D.

    2016-01-01

    Background The U.S. national standards for culturally and linguistically appropriate services (CLAS) in health care provide guidelines on policies and practices aimed at developing culturally competent systems of care. The Cultural Competency Assessment Tool for Hospitals (CCATH) was developed as an organizational tool to assess adherence to the CLAS standards. Purposes First, we describe the development of the CCATH and estimate the reliability and validity of the CCATH measures. Second, we discuss the managerial implications of the CCATH as an organizational tool to assess cultural competency. Methodology/Approach We pilot tested an initial draft of the CCATH, revised it based on a focus group and cognitive interviews, and then administered it in a field test with a sample of California hospitals. The reliability and validity of the CCATH were evaluated using factor analysis, analysis of variance, and Cronbach’s alphas. Findings Exploratory and confirmatory factor analyses identified 12 CCATH composites: leadership and strategic planning, data collection on inpatient population, data collection on service area, performance management systems and quality improvement, human resources practices, diversity training, community representation, availability of interpreter services, interpreter services policies, quality of interpreter services, translation of written materials, and clinical cultural competency practices. All the CCATH scales had internal consistency reliability of .65 or above, and the reliability was .70 or above for 9 of the 12 scales. Analysis of variance results showed that not-for-profit hospitals have higher CCATH scores than for-profit hospitals in five CCATH scales and higher CCATH scores than government hospitals in two CCATH scales. Practice Implications The CCATH showed adequate psychometric properties. 
Managers and policy makers can use the CCATH as a tool to evaluate hospital performance in cultural competency and identify and target improvements in hospital policies and practices that undergird the provision of CLAS. PMID:21934511
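
    The internal-consistency figures reported above are Cronbach's alphas; the coefficient itself is straightforward to compute from a respondents-by-items matrix. A sketch with hypothetical ratings (not CCATH data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for one scale; `items` is respondents x items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of scale total
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(3)
# Hypothetical responses: a shared latent score plus item noise, so the
# four items should hang together (alpha well above the .65 benchmark).
latent = rng.normal(size=(100, 1))
scores = latent + 0.5 * rng.normal(size=(100, 4))
alpha = cronbach_alpha(scores)
```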

  2. Nostradamus (1503-66)

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    Doctor, astrologer, born in St Rémy, France. Took on the role of a prophet and wrote Centuries, a collection of predictions in rhyme (1555-8). The predictions are expressed in obscure and enigmatic terms, which are both difficult to interpret and open to many interpretations, and so can be interpreted as successful prophesies, including what Catherine of Medici interpreted as the manner of deat...

  3. Theory Interpretations in PVS

    NASA Technical Reports Server (NTRS)

    Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)

    2001-01-01

    The purpose of this task was to provide a mechanism for theory interpretations in a prototype verification system (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that an axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is a part of PVS version 3.0, which will be publicly released in mid-2001.

  4. Design of integration-ready metasurface-based infrared absorbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ogando, Karim, E-mail: karim@cab.cnea.gov.ar; Pastoriza, Hernán

    2015-07-28

    We introduce an integration-ready design of a metamaterial infrared absorber that is highly compatible with many kinds of fabrication processes. We present the results of an exhaustive experimental characterization, including an analysis of the effects of single meta-atom geometrical parameters and of the collective arrangement. We compare the results with the theoretical interpretations proposed in the literature. Based on these results, we develop a set of practical design rules for metamaterial absorbers in the infrared region.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doug Blankenship

    PDFs of seismic reflection profiles 101, 110, and 111, local to the West Flank FORGE site. The 45 line-kilometers of seismic reflection data were collected in 2001 using vibroseis trucks. The initial analysis and interpretation of these data were performed by Unruh et al. (2001). Optim processed the data by inverting the P-wave first arrivals to create a 2-D velocity structure. Kirchhoff images were then created for each line using the velocity tomograms (Unruh et al., 2001).

  6. The value of psychosocial group activity in nursing education: A qualitative analysis.

    PubMed

    Choi, Yun-Jung

    2018-05-01

    Nursing faculty often struggle to find effective teaching strategies for nursing students that integrate group work into nursing students' learning activities. This study was conducted to evaluate students' experiences in a psychiatric and mental health nursing course using psychosocial group activities to develop therapeutic communication and interpersonal relationship skills, as well as to introduce psychosocial nursing interventions. A qualitative research design was used. The study explored nursing students' experiences of the course in accordance with the inductive, interpretative, and constructive approaches via focus group interviews. Participants were 17 undergraduate nursing students who registered for a psychiatric and mental health nursing course. The collected data were analyzed by qualitative content analysis. The analysis resulted in 28 codes, 14 interpretive codes, 4 themes (developing interpersonal relationships, learning problem-solving skills, practicing cooperation and altruism, and getting insight and healing), and a core theme (interdependent growth in self-confidence). The psychosocial group activity provided constructive opportunities for the students to work independently and interdependently as healthcare team members through reflective learning experiences. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. The application of sensitivity analysis to models of large scale physiological systems

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1974-01-01

    A survey of the literature on sensitivity analysis as it applies to biological systems is reported, along with a brief development of sensitivity theory. A simple population model and a more complex thermoregulatory model illustrate the investigatory techniques and the interpretation of parameter sensitivity analysis. The role of sensitivity analysis in validating and verifying models, in identifying relative parameter influence, and in estimating errors in model behavior due to uncertainty in input data is presented. This analysis is valuable to the simulationist and the experimentalist in allocating resources for data collection. A method for reducing highly complex, nonlinear models to simple linear algebraic models that could be useful for making rapid, first-order calculations of system behavior is presented.
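
    A parameter sensitivity analysis of a simple population model, as described above, can be sketched with normalized finite-difference sensitivities; the logistic model and parameter values below are illustrative, not the report's:

```python
import numpy as np

def logistic_pop(t, r, k, n0):
    # Closed-form logistic population model.
    return k / (1 + (k / n0 - 1) * np.exp(-r * t))

def rel_sensitivity(f, t, params, name, h=1e-6):
    """Normalized sensitivity S = (p / f) * df/dp, central differences."""
    up = dict(params); up[name] *= 1 + h
    dn = dict(params); dn[name] *= 1 - h
    df = (f(t, **up) - f(t, **dn)) / (2 * h * params[name])
    return params[name] * df / f(t, **params)

params = {"r": 0.5, "k": 1000.0, "n0": 10.0}
t = np.linspace(0, 20, 5)
s_r = rel_sensitivity(logistic_pop, t, params, "r")  # growth-rate influence
s_k = rel_sensitivity(logistic_pop, t, params, "k")  # capacity influence
```

    Early in the trajectory the output is insensitive to k; near saturation the sensitivity to k approaches 1 while the sensitivity to r fades, which is the kind of relative-influence ranking the analysis provides.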

  8. Exploring the onset of collective motion in self-organised trails of social organisms

    NASA Astrophysics Data System (ADS)

    Brigatti, E.; Hernández, A.

    2018-04-01

    We investigate the emergence of self-organised trails between two specific target areas in the collective motion of social organisms by means of an agent-based model. We present numerical evidence that the efficiency of navigation increases with colony size. Moreover, the shift from diffusive to directed motion can be quantitatively characterised by identifying and measuring a well-defined crossover point. This point corresponds to the minimal number of individuals necessary for the onset of collective cooperation. Finally, by means of a finite-size scaling analysis, we describe its scaling behaviour as a function of the environment size. This last result can be of particular interest for interpreting empirical observations or for the design of artificial swarms.
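
    The shift from diffusive to directed motion is commonly quantified with a polar order parameter; a minimal sketch (not the authors' model) on two invented heading distributions:

```python
import numpy as np

def polar_order(headings):
    """phi = |mean unit heading vector|: near 0 for diffusive (random)
    motion, near 1 for collectively directed motion."""
    return np.abs(np.exp(1j * headings).mean())

rng = np.random.default_rng(4)
disordered = rng.uniform(0, 2 * np.pi, size=1000)  # random walkers
ordered = rng.normal(0.0, 0.2, size=1000)          # aligned trail-followers

phi_dis = polar_order(disordered)
phi_ord = polar_order(ordered)
```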

  9. Prehospital Acute ST-Elevation Myocardial Infarction Identification in San Diego: A Retrospective Analysis of the Effect of a New Software Algorithm.

    PubMed

    Coffey, Christanne; Serra, John; Goebel, Mat; Espinoza, Sarah; Castillo, Edward; Dunford, James

    2018-05-03

    A significant increase in false positive ST-elevation myocardial infarction (STEMI) electrocardiogram interpretations was noted after replacement of all of the City of San Diego's 110 monitor-defibrillator units with a new brand. These concerns were brought to the manufacturer and a revised interpretive algorithm was implemented. This study evaluated the effects of a revised interpretation algorithm to identify STEMI when used by San Diego paramedics. Data were reviewed 6 months before and 6 months after the introduction of a revised interpretation algorithm. True-positive and false-positive interpretations were identified. Factors contributing to an incorrect interpretation were assessed and patient demographics were collected. A total of 372 (234 preimplementation, 138 postimplementation) cases met inclusion criteria. There was a significant reduction in false positive STEMI (150 preimplementation, 40 postimplementation; p < 0.001) after implementation. The most common factors resulting in false positive before implementation were right bundle branch block, left bundle branch block, and atrial fibrillation. The new algorithm corrected for these misinterpretations with most postimplementation false positives attributed to benign early repolarization and poor data quality. Subsequent follow-up at 10 months showed maintenance of the observed reduction in false positives. This study shows that introducing a revised 12-lead interpretive algorithm resulted in a significant reduction in the number of false positive STEMI electrocardiogram interpretations in a large urban emergency medical services system. Rigorous testing and standardization of new interpretative software is recommended before introduction into a clinical setting to prevent issues resulting from inappropriate cardiac catheterization laboratory activations. Copyright © 2018 Elsevier Inc. All rights reserved.
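
    The reported reduction can be checked against the abstract's counts with a two-proportion z-test; this normal-approximation sketch is illustrative and is not the study's statistical code:

```python
import math

# Counts from the abstract: 150/234 false-positive STEMI interpretations
# before the revised algorithm, 40/138 after.
fp_pre, n_pre = 150, 234
fp_post, n_post = 40, 138
p1, p2 = fp_pre / n_pre, fp_post / n_post

p_pool = (fp_pre + fp_post) / (n_pre + n_post)   # pooled proportion
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_pre + 1 / n_post))
z = (p1 - p2) / se
# Two-sided p-value from the standard normal tail.
p_value = math.erfc(abs(z) / math.sqrt(2))
```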

  10. Real-time complex event processing for cloud resources

    NASA Astrophysics Data System (ADS)

    Adam, M.; Cordeiro, C.; Field, L.; Giordano, D.; Magnoni, L.

    2017-10-01

    The ongoing integration of clouds into the WLCG raises the need for detailed health and performance monitoring of the virtual resources in order to prevent problems of degraded service and interruptions due to undetected failures. When working at scale, the existing monitoring diversity can lead to a metric overflow whereby the operators need to manually collect and correlate data from several monitoring tools and frameworks, resulting in tens of different metrics to be constantly interpreted and analyzed per virtual machine. In this paper we present an Esper-based standalone application which is able to process complex monitoring events coming from various sources and automatically interpret the data in order to issue alarms on the status of the resources, without interfering with the actual resources and data sources. We will describe how this application has been used with both commercial and non-commercial cloud activities, allowing the operators to be quickly alarmed and to react to misbehaving VMs and LHC experiments' workflows. We will present the pattern analysis mechanisms being used, as well as the surrounding Elastic and REST API interfaces where the alarms are collected and served to users.
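
    The kind of rule such an engine evaluates, e.g. alarming when a rolling window of a metric crosses a threshold, can be sketched in a few lines. This is illustrative only; it is not the Esper API, and the metric names are invented:

```python
from collections import deque

def alarm_stream(metrics, window=5, threshold=0.9):
    """Yield (source, rolling_mean) alarms when the rolling mean of a
    single metric stream exceeds `threshold`. Simplification: all
    events are assumed to come from one stream."""
    buf = deque(maxlen=window)
    for source, value in metrics:
        buf.append(value)
        if len(buf) == window and sum(buf) / window > threshold:
            yield (source, sum(buf) / window)

# Hypothetical CPU-load samples for one VM; only the last full window
# has a rolling mean above the threshold.
events = [("vm1", v) for v in [0.2, 0.5, 0.95, 0.96, 0.97, 0.98, 0.99]]
alarms = list(alarm_stream(events))
```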

  11. Narratives of empowerment and compliance: studies of communication in online patient support groups.

    PubMed

    Wentzer, Helle S; Bygholm, Ann

    2013-12-01

    New technologies enable new forms of patient participation in health care. The article discusses whether communication in online patient support groups is a source of individual as well as collective empowerment, or is to be understood within the tradition of compliance. The discussion is based on a qualitative analysis of patient communication in two online groups on the Danish portal sundhed.dk, one for lung patients and one for women with fertility problems. The object of study is the total sum of postings during a specific period of time - a total of 4301 posts are included. The text material was analyzed according to the textual paradigm of Paul Ricoeur and his three steps of critical interpretation. Thus, the analysis moves from describing communicative characteristics of the site to a thorough semantic analysis of its narrative structure of construing meaning, interaction, and collective identity, and finally to its role as a source of collective action. The meta-narratives of the two groups confirm that online patient support groups serve individual empowerment and collective group identity, but not collective empowerment. The collective identities of patienthood on the two sites are created by the users (patients) through specific styles of communication and interaction, referred to as 'multi-logical narratives'. In spite of the potential of online communities to open up health care to the critical voice of the public, the analysis points to a synthesis of the otherwise opposite positions of empowerment and compliance in patient care. On a collective level, the site empowers the individual users to comply with 'doctor's recommendations' as a group. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  12. Analysis of Within-Test Variability of Non-Destructive Test Methods to Evaluate Compressive Strength of Normal Vibrated and Self-Compacting Concretes

    NASA Astrophysics Data System (ADS)

    Nepomuceno, Miguel C. S.; Lopes, Sérgio M. R.

    2017-10-01

    Non-destructive tests (NDT) have been used in recent decades for the assessment of the in-situ quality and integrity of concrete elements. An important step in the application of NDT methods concerns the interpretation and validation of the test results. In general, the interpretation of NDT results should involve three distinct phases leading to the development of conclusions: processing of the collected data, analysis of within-test variability, and quantitative evaluation of the property under investigation. The analysis of within-test variability can provide valuable information, since it can be compared with the within-test variability associated with the NDT method in use, either to provide a measure of quality control or to detect the presence of abnormal circumstances during the in-situ application. This paper reports the analysis of the experimental results of the within-test variability of NDT obtained for normal vibrated concrete and self-compacting concrete. The NDT reported include the surface hardness test, ultrasonic pulse velocity test, penetration resistance test, pull-off test, pull-out test, and maturity test. The obtained results are discussed and conclusions are presented.
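
    The within-test variability analysis described above typically starts from the coefficient of variation of replicate readings at one location; a minimal sketch with hypothetical rebound-hammer readings:

```python
import numpy as np

def within_test_cov(replicates):
    """Within-test coefficient of variation (%) for repeated readings
    taken at the same test location: 100 * sample std / mean."""
    r = np.asarray(replicates, dtype=float)
    return 100 * r.std(ddof=1) / r.mean()

# Hypothetical surface-hardness (rebound) readings at one location:
readings = [38, 40, 41, 39, 42, 40, 38, 41, 40, 39]
cov = within_test_cov(readings)
```

    Comparing this value against the variability expected for the method flags either poor testing conditions or genuinely abnormal material.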

  13. Sources of Safety Data and Statistical Strategies for Design and Analysis: Clinical Trials.

    PubMed

    Zink, Richard C; Marchenko, Olga; Sanchez-Kam, Matilde; Ma, Haijun; Jiang, Qi

    2018-03-01

    There has been an increased emphasis on the proactive and comprehensive evaluation of safety endpoints to ensure patient well-being throughout the medical product life cycle. In fact, depending on the severity of the underlying disease, it is important to plan for a comprehensive safety evaluation at the start of any development program. Statisticians should be intimately involved in this process and contribute their expertise to study design, safety data collection, analysis, reporting (including data visualization), and interpretation. In this manuscript, we review the challenges associated with the analysis of safety endpoints and describe the safety data that are available to influence the design and analysis of premarket clinical trials. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from clinical trials compared to other sources. Clinical trials are an important source of safety data that contribute to the totality of safety information available to generate evidence for regulators, sponsors, payers, physicians, and patients. This work is a result of the efforts of the American Statistical Association Biopharmaceutical Section Safety Working Group.

  14. Water quality analysis of the Rapur area, Andhra Pradesh, South India using multivariate techniques

    NASA Astrophysics Data System (ADS)

    Nagaraju, A.; Sreedhar, Y.; Thejaswi, A.; Sayadi, Mohammad Hossein

    2017-10-01

    The groundwater samples from the Rapur area were collected from different sites to evaluate the major-ion chemistry. The large number of data can lead to difficulties in the integration, interpretation, and representation of the results. Two multivariate statistical methods, hierarchical cluster analysis (HCA) and factor analysis (FA), were applied to evaluate their usefulness in classifying and identifying the geochemical processes controlling groundwater geochemistry. Four statistically significant clusters were obtained from 30 sampling stations. This resulted in two important clusters, viz. cluster 1 (pH, Si, CO3, Mg, SO4, Ca, K, HCO3, alkalinity, Na, Na + K, Cl, and hardness) and cluster 2 (EC and TDS), which are released to the study area from different sources. The application of different multivariate statistical techniques, such as principal component analysis (PCA), assists in the interpretation of complex data matrices for a better understanding of the water quality of a study area. From the PCA it is clear that the first factor (factor 1), which accounted for 36.2% of the total variance, had high positive loadings on EC, Mg, Cl, TDS, and hardness. Based on the PCA scores, four significant cluster groups of sampling locations were detected on the basis of the similarity of their water quality.
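
    The PCA step can be sketched on standardized data, where the explained variance per component falls out of the eigendecomposition of the correlation matrix. The synthetic matrix below mimics one salinity-like factor driving three columns; it is not the study's data:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical water-quality matrix: 30 samples x 5 variables, with a
# common factor behind the first three columns (think EC, TDS, hardness).
factor = rng.normal(size=(30, 1))
data = np.hstack([
    factor + 0.3 * rng.normal(size=(30, 1)),
    factor + 0.3 * rng.normal(size=(30, 1)),
    factor + 0.3 * rng.normal(size=(30, 1)),
    rng.normal(size=(30, 2)),                 # unrelated variables
])

z = (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)  # standardize
corr = z.T @ z / (len(z) - 1)                # correlation matrix
evals = np.sort(np.linalg.eigvalsh(corr))[::-1]
explained = evals / evals.sum()              # variance share per component
```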

  15. The Government Finance Database: A Common Resource for Quantitative Research in Public Financial Analysis

    PubMed Central

    Pierson, Kawika; Hand, Michael L.; Thompson, Fred

    2015-01-01

    Quantitative public financial management research focused on local governments is limited by the absence of a common database for empirical analysis. While the U.S. Census Bureau distributes government finance data that some scholars have utilized, the arduous process of collecting, interpreting, and organizing the data has led its adoption to be prohibitive and inconsistent. In this article we offer a single, coherent resource that contains all of the government financial data from 1967-2012, uses easy to understand natural-language variable names, and will be extended when new data is available. PMID:26107821

  16. Evaluation of electrostatic precipitator during SRC combustion tests. Final task report Apr--Aug 1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, G.B.; Barrett, W.J.

    1978-07-01

    The report deals with the evaluation of an electrostatic precipitator (ESP) and associated environmental factors during the burning of solvent refined coal (SRC) in a boiler at Plant Mitchell of the Georgia Power Company. The effort was part of an overall study of the use of SRC in a full-scale electric power plant. Results of a performance evaluation of the ESP are reported and interpreted. Samples of stack emissions were collected with a Source Assessment Sampling System (SASS) train for chemical analysis; results of the analysis are to be reported later.

  17. The Government Finance Database: A Common Resource for Quantitative Research in Public Financial Analysis.

    PubMed

    Pierson, Kawika; Hand, Michael L; Thompson, Fred

    2015-01-01

    Quantitative public financial management research focused on local governments is limited by the absence of a common database for empirical analysis. While the U.S. Census Bureau distributes government finance data that some scholars have utilized, the arduous process of collecting, interpreting, and organizing the data has made its adoption prohibitive and inconsistent. In this article we offer a single, coherent resource that contains all of the government financial data from 1967-2012, uses easy-to-understand natural-language variable names, and will be extended when new data become available.

  18. A new method for reporting and interpreting textural composition of spawning gravel.

    Treesearch

    Fredrick B. Lotspeich; Fred H. Everest

    1981-01-01

    A new method has been developed for collecting, sorting, and interpreting data on gravel quality. Samples are collected with a tri-tube freeze-core device and dry-sorted by using sieves based on the Wentworth scale. An index to the quality of gravel is obtained by dividing geometric mean particle size by the sorting coefficient (a measure of the distribution of grain sizes) of...
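    The index described above divides the geometric mean particle size by a sorting coefficient. A minimal sketch of one common formulation is below; the function names and the So = sqrt(d75/d25) form of the sorting coefficient are assumptions for illustration, not taken verbatim from the paper.

```python
import math

def geometric_mean_size(midpoints, weights):
    """Weighted geometric mean particle size: dg = prod(d_i ** w_i),
    where w_i are the weight fractions retained on each sieve class."""
    total = sum(weights)
    return math.exp(sum((w / total) * math.log(d)
                        for d, w in zip(midpoints, weights)))

def gravel_quality_index(midpoints, weights, d75, d25):
    """Gravel quality index: geometric mean size divided by the sorting
    coefficient So = sqrt(d75 / d25) (one common formulation)."""
    so = math.sqrt(d75 / d25)
    return geometric_mean_size(midpoints, weights) / so
```

    A perfectly sorted sample (all grains one size) has So = 1, so the index reduces to the mean grain size; poor sorting (large d75/d25 spread) lowers the index.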

  19. Scalable Solutions for Processing and Searching Very Large Document Collections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crossno, Patricia Joyce; Dunlavy, Daniel M.; Stanton, Eric T.

    This report is a summary of the accomplishments of the 'Scalable Solutions for Processing and Searching Very Large Document Collections' LDRD, which ran from FY08 through FY10. Our goal was to investigate scalable text analysis; specifically, methods for information retrieval and visualization that could scale to extremely large document collections. Towards that end, we designed, implemented, and demonstrated a scalable framework for text analysis - ParaText - as a major project deliverable. Further, we demonstrated the benefits of using visual analysis in text analysis algorithm development, improved performance of heterogeneous ensemble models in data classification problems, and the advantages of information-theoretic methods in user analysis and interpretation in cross-language information retrieval. The project involved 5 members of the technical staff and 3 summer interns (including one who worked two summers). It resulted in a total of 14 publications, 3 new software libraries (2 open source and 1 internal to Sandia), several new end-user software applications, and over 20 presentations. Several follow-on projects have already begun or will start in FY11, with additional projects currently in proposal.

  20. Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents

    NASA Astrophysics Data System (ADS)

    Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.

    2016-12-01

    Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, through their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes are found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but that it is distinct from conventional statistical measures such as coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
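    The Shannon entropy used above can be estimated from a histogram of the signal's values; a minimal sketch follows (the bin count and function name are illustrative choices, not the authors' exact procedure).

```python
import math
from collections import Counter

def shannon_entropy(series, n_bins=16):
    """Shannon entropy (bits) of a signal, estimated from a histogram of
    its values - a measure of the randomness of the time series."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_bins or 1.0    # degenerate constant-signal case
    bins = Counter(min(int((x - lo) / width), n_bins - 1) for x in series)
    n = len(series)
    return -sum((c / n) * math.log2(c / n) for c in bins.values())
```

    A constant signal gives entropy 0, while a signal spread evenly over the bins approaches log2(n_bins); a rise in entropy through time would flag the kind of increased randomness the abstract associates with effusion or larger explosions.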

  1. Investigations into near-real-time surveying for geophysical data collection using an autonomous ground vehicle

    USGS Publications Warehouse

    Phelps, Geoffrey A.; Ippolito, C.; Lee, R.; Spritzer, R.; Yeh, Y.

    2014-01-01

    The U.S. Geological Survey and the National Aeronautics and Space Administration are cooperatively investigating the utility of unmanned vehicles for near-real-time autonomous geophysical surveys. Initially focused on unmanned ground vehicle collection of magnetic data, this cooperative effort has brought unmanned surveying, precision guidance, near-real-time communication, on-the-fly data processing, and near-real-time data interpretation into the realm of ground geophysical surveying, all of which offer advantages over current methods of manned collection of ground magnetic data. A demonstration mission has shown that unmanned ground vehicles can successfully collect geophysical data and offer advantages in data collection, processing, and interpretation. We view the current experiment as an initial phase in further unmanned vehicle data-collection missions, including aerial surveying.

  2. Facilitating the exploitation of ERTS imagery using snow enhancement techniques

    NASA Technical Reports Server (NTRS)

    Wobber, F. J. (Principal Investigator); Martin, K. R.; Amato, R. V.

    1973-01-01

    The author has identified the following significant results. New fracture detail within the New England test area has been interpreted from ERTS-1 images. Comparative analysis of snow-free imagery (1096-15065 and 1096-15072) has demonstrated that MSS bands 5 and 7 supply the greatest amount of geological fracture detail. Interpretation of the first snow-covered ERTS-1 images (1132-15074 and 1168-15065), in correlation with ground snow depth data, indicates that a heavy blanket of snow (greater than 9 inches) accentuates major structural features while a light dusting (less than 1 inch) accentuates more subtle topographic expressions. Snow cover was found to accentuate drainage patterns which are indicative of lithological and/or structural variations. Snow cover provided added enhancement for viewing and detecting topographically expressed fractures and faults. A recent field investigation was conducted within the New England test area to field check lineaments observed from analysis of ERTS-1 imagery, collect snow depth readings, and obtain structural joint readings at key locations in the test area.

  3. The experience of high-frequency gambling behavior of older adult females in the United Kingdom: An interpretative phenomenological analysis.

    PubMed

    Pattinson, Julie; Parke, Adrian

    2017-01-01

    The prevalence of older adult female gambling participation and gambling disorder is increasing in the UK, and there is a paucity of published research available to understand possible risk factors for frequent gambling in this demographic. The aim of the current study was to identify and explore motivations and patterns of gambling behavior in high-frequency older adult female gamblers in the UK, from the perspective of the individual and in the context of their experience of aging. Ten UK older adult female high-frequency gamblers were recruited via stratified purposive sampling, with a mean age of 70.4 years. Data was collected via semistructured interviews and analyzed using interpretative phenomenological analysis. Three core themes representative of the experience of this phenomenon emerged from the transcripts, including: Filling voids, emotional escape, and overspending. The present study has provided a contextualized understanding of motivating factors and several age-related vulnerabilities that may account for high gambling frequency in this population.

  4. Validity of active fault identification through magnetic anomalous using earthquake mechanism, microgravity and topography structure analysis in Cisolok area

    NASA Astrophysics Data System (ADS)

    Setyonegoro, Wiko; Kurniawan, Telly; Ahadi, Suaidi; Rohadi, Supriyanto; Hardy, Thomas; Prayogo, Angga S.

    2017-07-01

    This research was conducted to determine magnetic anomaly values in order to identify normal and reverse faulting of the Meratus type trending northeast-southwest in Cisolok, Sukabumi. Data collection was performed by setting a measurement grid at intervals of 5 meters using a Precision Proton Magnetometer (PPM) GSM-19T. Identifying an active fault from magnetic data alone requires additional parameters; the purpose of this study is therefore to identify the active fault from the magnetic anomaly, in relation to the subsurface structure, through validation against earthquake mechanism, microgravity, and topographic structure analyses in Java Island. Qualitative interpretation was done by analyzing the residual anomaly reduced to the pole, while quantitative interpretation was done by analyzing the pattern of residual anomalies through computation. The quantitative interpretation shows a reduced-to-the-pole magnetic field anomaly ranging from -700 nT to 700 nT, while the qualitative interpretation from modeling along paths AA', BB' and CC' shows magnetic anomalies at the coordinates of liquefaction sources with values of 1028.04, 1416.21, -1565, and -1686.91. The magnetic anomalies measured in Cisolok indicate a high content of alumina (Al) and iron (Fe), identified as appearing through the fault gap trending northeast through Rajamandala toward the Lembang Fault, associated with a normal-fault mechanism with a slip rate of 2 mm/year.

  5. Social Archaeological Approaches in Port and Harbour Studies

    NASA Astrophysics Data System (ADS)

    Rogers, Adam

    2013-12-01

    This introductory article to the special issue of the Journal of Maritime Archaeology offers a comparative perspective on the theme of archaeological theory and social archaeological approaches to ports and harbours. As a specialist in Roman archaeology I was keen to explore the way in which specialists in other areas of archaeology approached the archaeology of ports and harbours, and whether different approaches and perspectives may be able to add nuances to the way in which material is interpreted. The volume brings together a collection of exciting new studies which explore social themes in port and harbour studies, with the intention of encouraging debate and the use of new interpretative perspectives. This article examines a number of interpretative themes, including those relating to architectural analysis, human behaviour, action and experience, and artefact analysis. These themes help us to move towards a more theoretically informed port and harbour archaeology which focuses on meaning as well as description. The emphasis on theory within archaeology allows us to be more ambitious in our interpretative frameworks, including in Roman archaeology, which has not tended to embrace the theoretical aspects of the archaeological discipline with as much enthusiasm as some other areas of archaeology.

  6. [Psychosocial rehabilitation: perceptions of the mental health staff].

    PubMed

    Jorge, Maria Salete Bessa; Randemark, Norma Faustino Rocha; Queiroz, Maria Veraci Oliveira; Ruiz, Erasmo Miessa

    2006-01-01

    This qualitative study aimed to interpret mental health professionals' perspectives on the psychosocial rehabilitation of people with mental disorders and to learn how they carry it out in their professional practice. Data were collected through semi-structured interviews with 8 mental health professionals working in a Center of Psychosocial Attention. After repeated readings, excerpts of the interviews were organized into subcategories and categories and interpreted in light of the literature. The results indicated that psychosocial rehabilitation is a process whose implementation still requires effectively overcoming the traditional paradigm of mental health and illness, which shapes conceptions and therapeutic practices, and requires professionals' confidence in the users' capacity to live as citizens in the most varied segments of social life.

  7. Constraining back-arc basin formation in the eastern Coral Sea: preliminary results from the ECOSAT voyage

    NASA Astrophysics Data System (ADS)

    Seton, M.; Williams, S.; Mortimer, N. N.; Meffre, S.; Moore, J.; Micklethwaite, S.; Zahirovic, S.

    2013-12-01

    The eastern Coral Sea region is an underexplored area at the northeastern corner of the Australian plate, where long-lived interaction between the Pacific and Australian plate boundaries has resulted in an intricate assemblage of deep oceanic basins and ridges, continental fragments and volcanic products. A paucity of marine geophysical and geological data from this complex region has resulted in the lack of a clear conceptual framework to describe its formation, ultimately affecting our understanding of the connection between the plate boundaries of the SW Pacific and SE Asia. In particular, the tectonic relationship between two back-arc basins, the Santa Cruz and d'Entrecasteaux Basins, and the South Rennell Trough, has yet to be resolved. In October-November, 2012, we collected 6,200 km of marine magnetic, 6,800 km of gravity and over 13,600 km2 of swath bathymetry data from the eastern Coral Sea onboard the RV Southern Surveyor. A complementary dredging program yielded useful samples from 14 seafloor sites. Our preliminary geochemical interpretation of the dredge samples obtained from the South Rennell Trough reveal volcanic rocks resembling MORB or BABB-type basalts, similar in composition to the recently re-analysed and dated ORSTOM dredges from the area that yielded ~28 Ma MORB-like basalts. Swath bathymetry profiles from the Santa Cruz Basin reveal that the South Rennell Trough extends into this basin, with seafloor spreading fabric being parallel to the trough. Preliminary analysis of the three full and four partial new magnetic anomaly profiles across the Santa Cruz Basin, coupled with limited existing profiles, reveals that the basin may have formed between Chrons 13-18 (~32-38 Ma), with an extinct spreading ridge along the inferred continuation of the South Rennell Trough, consistent with ORSTOM age dates. 
Our results suggest that the South Rennell Trough is an extinct southwestward propagating spreading ridge, which may have initiated along a pre-existing zone of weakness. A preliminary interpretation of the 4 magnetic profiles collected in the d'Entrecasteaux Basin and existing profiles of seafloor fabric shows that this basin does not share a common seafloor spreading history with the Santa Cruz Basin, as has been suggested previously. Our preliminary interpretation of the relationship between the Santa Cruz Basin, South Rennell Trough and d'Entrecasteaux Basin requires a re-interpretation of existing models of the SW Pacific to take into account a southwestward propagating spreading ridge between 38-32 Ma, contemporaneous with seafloor spreading further south in the North Loyalty Basin. Further work on age-dating and geochemical analysis of the newly collected dredge samples and an in-depth analysis of the magnetic anomalies in the d'Entrecasteaux Basin may further yield important information concerning the tectonic evolution of the area.

  8. Analysis and Interpretation of Artifact Collections from Site 3CT271, Randolph Estate Development, Crittenden County, Arkansas

    DTIC Science & Technology

    1991-02-01

    The county has many streams, bayous, and lakes. Major drainages in Crittenden County include the Tyronza River, Fifteenmile Bayou, Tenmile Bayou, and... and Canada geese (Branta canadensis). Fish from the larger streams, oxbow lakes, and beaver ponds, such as the flathead catfish and alligator gar... type site for the Big Lake phase. Similar components have been recently recognized along the Mississippi River drainage just east of the project area

  9. Forensic DNA testing.

    PubMed

    Butler, John M

    2011-12-01

    Forensic DNA testing has a number of applications, including parentage testing, identifying human remains from natural or man-made disasters or terrorist attacks, and solving crimes. This article provides background information followed by an overview of the process of forensic DNA testing, including sample collection, DNA extraction, PCR amplification, short tandem repeat (STR) allele separation and sizing, typing and profile interpretation, statistical analysis, and quality assurance. The article concludes with discussions of possible problems with the data and other forensic DNA testing techniques.
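    The statistical-analysis step mentioned above commonly combines per-locus STR genotype frequencies into a random match probability via the product rule. A simplified sketch, under Hardy-Weinberg assumptions (function name illustrative; real casework applies population-substructure corrections not shown here):

```python
def random_match_probability(loci):
    """Random match probability via the product rule: per locus, the
    expected genotype frequency is 2pq for a heterozygote or p**2 for a
    homozygote, multiplied across independent loci."""
    rmp = 1.0
    for p, q, heterozygous in loci:
        rmp *= 2 * p * q if heterozygous else p * p
    return rmp
```

    Multiplying across the 13-plus loci of a standard STR panel is what drives the reported probabilities down to the one-in-billions range.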

  10. Interpreting biomarker data from the COPHES/DEMOCOPHES twin projects: Using external exposure data to understand biomarker differences among countries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smolders, R., E-mail: roel.smolders@vito.be; Den Hond, E.; Koppen, G.

    In 2011 and 2012, the COPHES/DEMOCOPHES twin projects performed the first ever harmonized human biomonitoring survey in 17 European countries. In more than 1800 mother–child pairs, individual lifestyle data were collected and cadmium, cotinine and certain phthalate metabolites were measured in urine. Total mercury was determined in hair samples. While the main goal of the COPHES/DEMOCOPHES twin projects was to develop and test harmonized protocols and procedures, the goal of the current paper is to investigate whether the observed differences in biomarker values among the countries implementing DEMOCOPHES can be interpreted using information from external databases on environmental quality and lifestyle. In general, 13 countries having implemented DEMOCOPHES provided high-quality data from external sources that were relevant for interpretation purposes. However, some data were not available for reporting or were not in line with predefined specifications. Therefore, only part of the external information could be included in the statistical analyses. Nonetheless, there was a highly significant correlation between national levels of fish consumption and mercury in hair, the strength of antismoking legislation was significantly related to urinary cotinine levels, and we were able to show indications that urinary cadmium levels were also associated with environmental quality and food quality. These results again show the potential of biomonitoring data to provide added value for (the evaluation of) evidence-informed policy making. - Highlights: • External data was collected to interpret HBM data from DEMOCOPHES. • Hg in hair could be related to fish consumption across different countries. • Urinary cotinine was related to strictness of anti-smoking legislation. • Urinary Cd was borderline significantly related to air and food quality. • Lack of comparable data among countries hampered the analysis.
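    Country-level associations of the kind reported above (e.g. fish consumption versus hair mercury) reduce to a correlation across the participating countries; a minimal sketch using the Pearson coefficient (a hypothetical helper, not the study's actual statistical model):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired series, e.g.
    national fish-consumption levels vs. mean hair-mercury levels."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sxy / (sx * sy)
```

    With only around 13 country-level data points, as in DEMOCOPHES, the significance of such a coefficient depends heavily on the small sample size.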

  11. In situ analysis of Mars soil samples with the SAM GCMS instrumentation onboard Curiosity: interpretation and comparison of measurements done at the Rocknest and Yellowknife Bay sites

    NASA Astrophysics Data System (ADS)

    Szopa, Cyril; Coll, Patrice; Cabane, Michel; Coscia, David; Buch, Arnaud; Francois, Pascaline; Millan, Maeva; Teinturier, Sammy; Navarro-Gonzales, Rafael; Glavin, Daniel; Freissinet, Caro; Steele, Andrew; Eigenbrode, Jen; Mahaffy, Paul

    2014-05-01

    The characterisation of the chemical and mineralogical composition of regolith samples collected with the Curiosity rover is a primary objective of the SAM experiment. These data should provide essential clues on the past habitability of Gale crater. Amongst the SAM suite of instruments [1], SAM-GC (Gas Chromatograph) is devoted to identifying and quantifying volatiles evolved from the thermal (heating up to about 900°C) or chemical (derivatization procedure) treatment of any soil sample collected by the Curiosity rover. With the aim of searching for potential organic molecules outgassed from the samples, a SAM-GC analytical channel composed of a thermal-desorption injector and an MXT-CLP chromatographic column, designed for the separation of a wide range of volatile organic molecules, was used for all the measurements made to date. Three solid samples have been analyzed with GCMS: one sand sample collected at the Rocknest site, and two rock samples (John Klein and Cumberland, respectively) collected at the Yellowknife Bay site using Curiosity's drill. All the measurements were successful, producing complex chromatograms with both detectors used in SAM-GC, i.e., a thermal conductivity detector and the SAM quadrupole mass spectrometer. Their interpretation has already revealed an oxychlorine phase in the samples, which is at the origin of the chlorohydrocarbons clearly identified [2], but this represents only a fraction of the GCMS signal recorded [3,4]. This work presents a systematic comparison of the GCMS measurements made for the different samples collected, supported by reference data obtained in the laboratory with spare models of the gas chromatograph, with the aim of bringing new elements of interpretation to the SAM measurements. References: [1] Mahaffy, P. et al. (2012) Space Sci Rev, 170, 401-478. [2] Glavin, D. et al. (2013), JGR. [3] Leshin, L. et al. (2013), Science. [4] Ming, D. et al. (2013), Science, 32, 64-67.
Acknowledgements: SAM-GC team acknowledges support from the French Space Agency (CNES), French National Programme of Planetology (PNP), National French Council (CNRS), Pierre Simon Laplace Institute, Institut Universitaire de France (IUF) and ESEP Labex.

  12. Non-Western interpreters' experiences of trauma: the protective role of culture following exposure to oppression.

    PubMed

    Johnson, Howard; Thompson, Andrew; Downs, Maria

    2009-08-01

    Many people flee their countries of origin after suffering severe trauma and there is a need to explore how socio-cultural factors are implicated in the experience of both trauma and posttraumatic growth. Interpreters who have been through a trauma are in a unique position to be able to reflect on cultural context. This study explored how interpreters working in the UK who had formerly suffered trauma in their country of origin, and who identified themselves as coping well, managed their experience of trauma. The qualitative method Interpretative Phenomenological Analysis (IPA) was used. Nine interpreters were interviewed following a semi-structured guide and the resulting transcripts were analysed according to IPA principles. Three key themes emerged from the data that were labelled as: trauma in the context of wider shared oppression; resisting and responding; and cultural protection and growth. Many participants described their lives prior to arriving in the UK as involving a collective traumatisation as a result of being a victim of oppression related to their cultural identity. The participants described the importance of staying connected to their culture. Giving and providing social support, religious practices, and the role of interpreter facilitated remaining connected. A sense of shared victimisation provided a protective backdrop from which the participants could make sense of the personal traumas they had experienced. The role of interpreting was important as it helped maintain cultural identity. The findings are discussed in relation to theories of both PTSD and Posttraumatic Growth. The results have implications for the work of clinicians supporting non-Western people who have been traumatised.

  13. Collectivity, Distributivity, and the Interpretation of Plural Numerical Expressions in Child and Adult Language

    PubMed Central

    Syrett, Kristen; Musolino, Julien

    2013-01-01

    Sentences containing plural numerical expressions (e.g., two boys) can give rise to two interpretations (collective and distributive), arising from the fact that their representation admits of a part-whole structure. We present the results of a series of experiments designed to explore children’s understanding of this distinction and its implications for the acquisition of linguistic expressions with number words. We show that preschoolers access both interpretations, indicating that they have the requisite linguistic and conceptual machinery to generate the corresponding representations. Furthermore, they can shift their interpretation in response to structural and lexical manipulations. However, they are not fully adult-like: unlike adults, they are drawn to the distributive interpretation, and are not yet aware of the lexical semantics of each and together, which should favor one or another interpretation. This research bridges a gap between a well-established body of work in cognitive psychology on the acquisition of number words and more recent work investigating children’s knowledge of the syntactic and semantic properties of sentences featuring numerical expressions. PMID:24223477

  14. Collectivity, Distributivity, and the Interpretation of Plural Numerical Expressions in Child and Adult Language.

    PubMed

    Syrett, Kristen; Musolino, Julien

    2013-01-01

    Sentences containing plural numerical expressions (e.g., two boys) can give rise to two interpretations (collective and distributive), arising from the fact that their representation admits of a part-whole structure. We present the results of a series of experiments designed to explore children's understanding of this distinction and its implications for the acquisition of linguistic expressions with number words. We show that preschoolers access both interpretations, indicating that they have the requisite linguistic and conceptual machinery to generate the corresponding representations. Furthermore, they can shift their interpretation in response to structural and lexical manipulations. However, they are not fully adult-like: unlike adults, they are drawn to the distributive interpretation, and are not yet aware of the lexical semantics of each and together, which should favor one or another interpretation. This research bridges a gap between a well-established body of work in cognitive psychology on the acquisition of number words and more recent work investigating children's knowledge of the syntactic and semantic properties of sentences featuring numerical expressions.

  15. Beyond data collection in digital mapping: interpretation, sketching and thought process elements in geological map making

    NASA Astrophysics Data System (ADS)

    Watkins, Hannah; Bond, Clare; Butler, Rob

    2016-04-01

    Geological mapping techniques have advanced significantly in recent years from paper fieldslips to Toughbook, smartphone and tablet mapping; but how do the methods used to create a geological map affect the thought processes that result in the final map interpretation? Geological maps have many key roles in the field of geosciences including understanding geological processes and geometries in 3D, interpreting geological histories and understanding stratigraphic relationships in 2D and 3D. Here we consider the impact of the methods used to create a map on the thought processes that result in the final geological map interpretation. As mapping technology has advanced in recent years, the way in which we produce geological maps has also changed. Traditional geological mapping is undertaken using paper fieldslips, pencils and compass clinometers. The map interpretation evolves through time as data is collected. This interpretive process that results in the final geological map is often supported by recording in a field notebook, observations, ideas and alternative geological models explored with the use of sketches and evolutionary diagrams. In combination the field map and notebook can be used to challenge the map interpretation and consider its uncertainties. These uncertainties and the balance of data to interpretation are often lost in the creation of published 'fair' copy geological maps. The advent of Toughbooks, smartphones and tablets in the production of geological maps has changed the process of map creation. Digital data collection, particularly through the use of inbuilt gyrometers in phones and tablets, has changed smartphones into geological mapping tools that can be used to collect large amounts of geological data quickly. With GPS functionality this data is also geospatially located, assuming good GPS connectivity, and can be linked to georeferenced in-field photography. 
In contrast, line drawing, for example for lithological boundary interpretation and sketching, is yet to find the digital flow that is achieved with pencil on a notebook page or map. Free-form integrated sketching and notebook functionality in geological mapping software packages is in its nascence. Hence, the result is a tendency for digital geological mapping to focus on the ease of data collection rather than on the thoughts and careful observations that come from notebook sketching and interpreting boundaries on a map in the field. The final digital geological map can be assessed for when and where data was recorded, but the thought processes of the mapper are less easily assessed, and the use of observations and sketching to generate ideas and interpretations may be inhibited by reliance on digital mapping methods. All mapping methods have their own distinct advantages and disadvantages, and with more recent technologies both hardware and software issues have arisen. We present field examples of using conventional fieldslip mapping, and compare these with more advanced technologies to highlight some of the main advantages and disadvantages of each method and discuss where geological mapping may be going in the future.

  16. Standards for data acquisition and software-based analysis of in vivo electroencephalography recordings from animals. A TASK1-WG5 report of the AES/ILAE Translational Task Force of the ILAE.

    PubMed

    Moyer, Jason T; Gnatkovsky, Vadym; Ono, Tomonori; Otáhal, Jakub; Wagenaar, Joost; Stacey, William C; Noebels, Jeffrey; Ikeda, Akio; Staley, Kevin; de Curtis, Marco; Litt, Brian; Galanopoulou, Aristea S

    2017-11-01

    Electroencephalography (EEG)-the direct recording of the electrical activity of populations of neurons-is a tremendously important tool for diagnosing, treating, and researching epilepsy. Although standard procedures for recording and analyzing human EEG exist and are broadly accepted, there are no such standards for research in animal models of seizures and epilepsy-recording montages, acquisition systems, and processing algorithms may differ substantially among investigators and laboratories. The lack of standard procedures for acquiring and analyzing EEG from animal models of epilepsy hinders the interpretation of experimental results and reduces the ability of the scientific community to efficiently translate new experimental findings into clinical practice. Accordingly, the intention of this report is twofold: (1) to review current techniques for the collection and software-based analysis of neural field recordings in animal models of epilepsy, and (2) to offer pertinent standards and reporting guidelines for this research. Specifically, we review current techniques for signal acquisition, signal conditioning, signal processing, data storage, and data sharing, and include applicable recommendations to standardize collection and reporting. We close with a discussion of challenges and future opportunities, and include a supplemental report of currently available acquisition systems and analysis tools. This work represents a collaboration on behalf of the American Epilepsy Society/International League Against Epilepsy (AES/ILAE) Translational Task Force (TASK1-Workgroup 5), and is part of a larger effort to harmonize video-EEG interpretation and analysis methods across studies using in vivo and in vitro seizure and epilepsy models. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  17. What Could Be Future Scenarios?—Lessons from the History of Public Health Surveillance for the Future

    PubMed Central

    Choi, Bernard C.K.

    2015-01-01

    This article provides insights into the future based on a review of the past and present of public health surveillance—the ongoing systematic collection, analysis, interpretation, and dissemination of health data for the planning, implementation, and evaluation of public health action. Public health surveillance dates back to the first recorded epidemic in 3180 BC in Egypt. A number of lessons and items of interest are summarised from a review of historical perspectives in the past 5,000 years and the current practice of surveillance. Some future scenarios are presented: exploring new frontiers; enhancing computer technology; improving epidemic investigations; improving data collection, analysis, dissemination and use; building on lessons from the past; building capacity; and enhancing global surveillance. It is concluded that learning from the past, reflecting on the present, and planning for the future can further enhance public health surveillance. PMID:29546093

  18. Comparison of air-coupled GPR data analysis results determined by multiple analysts

    NASA Astrophysics Data System (ADS)

    Martino, Nicole; Maser, Ken

    2016-04-01

Current bridge deck condition assessment using ground penetrating radar (GPR) requires a trained analyst to manually interpret substructure layering information from B-scan images in order to proceed with an intended analysis (pavement thickness, concrete cover, effects of rebar corrosion, etc.). For example, a recently developed method to rapidly and accurately analyze air-coupled GPR data based on the effects of rebar corrosion requires that a user "picks" a layer of rebar reflections in each B-scan image collected along the length of the deck. These "picks" carry information such as signal amplitude and two-way travel time. When a deck is new, or has little rebar corrosion, the resulting layer of rebar reflections is readily evident and there is little room for subjectivity. However, when a deck is severely deteriorated, the rebar layer may be difficult to identify, and different analysts may make different interpretations of the appropriate layer to analyze. One highly corroded bridge deck was assessed with a number of nondestructive evaluation techniques, including 2 GHz air-coupled GPR. Two trained analysts separately selected the rebar layer in each B-scan image, choosing as much information as possible, even in areas of significant deterioration. The selected data points were then post-processed, and the results from each analyst were contour plotted to identify any discrepancies. The paper describes the differences between ground-coupled and air-coupled GPR systems, the data collection and analysis methods used by the two analysts for one case study, and the results of the two analyses.
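The two-way travel time carried by each "pick" relates to reflector depth through an assumed wave velocity in the medium. A minimal sketch of that conversion (the permittivity and travel-time values below are hypothetical, not taken from the study):

```python
C = 299_792_458.0  # speed of light in vacuum (m/s)

def reflector_depth(twt_ns, rel_permittivity):
    """Depth to a reflector from two-way travel time (ns),
    assuming a uniform layer with the given relative permittivity."""
    v = C / rel_permittivity ** 0.5      # wave speed in the medium
    return v * (twt_ns * 1e-9) / 2.0     # one-way distance in metres

# e.g. a rebar pick at 1.2 ns in concrete (eps_r ~ 8, illustrative)
depth_m = reflector_depth(1.2, 8.0)     # roughly 6 cm of cover
```

Because the permittivity of deteriorated concrete is itself uncertain, two analysts picking slightly different layers can produce depth maps that differ by more than this simple formula suggests.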

  19. Evolved Gas Analysis and X-Ray Diffraction of Carbonate Samples from the 2009 Arctic Mars Analog Svalbard Expedition: Implications for Mineralogical Inferences from the Mars Science Laboratory

    NASA Technical Reports Server (NTRS)

    McAdam, A. C.; Mahaffy, P. R.; Blake, D. F.; Ming, D. W.; Franz, H. B.; Eigenbrode, J. L.; Steele, A.

    2010-01-01

The 2009 Arctic Mars Analog Svalbard Expedition (AMASE) investigated several geologic settings using methodologies and techniques being developed or considered for future Mars missions, such as the Mars Science Laboratory (MSL), ExoMars, and Mars Sample Return (MSR). AMASE-related research comprises both analyses conducted during the expedition and further analyses of collected samples using laboratory facilities at a variety of institutions. The Sample Analysis at Mars (SAM) instrument suite, which will be part of the Analytical Laboratory on MSL, consists of a quadrupole mass spectrometer (QMS), a gas chromatograph (GC), and a tunable laser spectrometer (TLS). An Evolved Gas Analysis Mass Spectrometer (EGA-MS) was used during AMASE to represent part of the capabilities of SAM. The other instrument included in the MSL Analytical Laboratory is CheMin, which uses X-Ray Diffraction (XRD) and X-Ray Fluorescence (XRF) to perform quantitative mineralogical characterization of samples. Field-portable versions of CheMin were used during AMASE 2009. Here, we discuss the preliminary interpretation of EGA and XRD analyses of selected AMASE carbonate samples and implications for mineralogical interpretations from MSL. Though CheMin will be the primary mineralogical tool on MSL, SAM EGA could be used to support XRD identifications or indicate the presence of volatile-bearing minerals which may be near or below XRD detection limits. Data collected with instruments in the field and in comparable laboratory setups (e.g., the SAM breadboard) will be discussed.

  20. Crustal insights from gravity and aeromagnetic analysis: Central North Slope, Alaska

    USGS Publications Warehouse

    Saltus, R.W.; Potter, C.J.; Phillips, J.D.

    2006-01-01

Aeromagnetic and gravity data are processed and interpreted to reveal deep and shallow information about the crustal structure of the central North Slope, Alaska. Regional aeromagnetic anomalies primarily reflect deep crustal features. Regional gravity anomalies are more complex and require detailed analysis. We constrain our geophysical models with seismic data and interpretations along two transects including the Trans-Alaska Crustal Transect. Combined geophysical analysis reveals a remarkable heterogeneity of the pre-Mississippian basement. In the central North Slope, pre-Mississippian basement consists of two distinct geophysical domains. To the southwest, the basement is dense and highly magnetic; this basement is likely mafic and mechanically strong, possibly acting as a buttress to basement involvement in Brooks Range thrusting. To the northeast, the central North Slope basement consists of lower density, moderately magnetic rocks with several discrete regions (intrusions?) of more magnetic rocks. A conjugate set of geophysical trends, northwest-southeast and southwest-northeast, may be a factor in the crustal response to tectonic compression in this domain. High-resolution gravity and aeromagnetic data, where available, reflect details of shallow fault and fold structure. The maps and profile models in this report should provide useful guidelines and complementary information for regional structural studies, particularly in combination with detailed seismic reflection interpretations. Future challenges include collection of high-resolution gravity and aeromagnetic data for the entire North Slope as well as additional deep crustal information from seismic, drilling, and other complementary methods. Copyright © 2006 The American Association of Petroleum Geologists. All rights reserved.

1. The Standard Deviation of Differential Index as an innovation diagnostic tool based on kinematic parameters for objective assessment of an upper limb motion pathology.

    PubMed

    Jurkojć, Jacek; Wodarski, Piotr; Michnik, Robert A; Bieniek, Andrzej; Gzik, Marek; Granek, Arkadiusz

    2017-01-01

Indexing methods are very popular for determining the degree of disability associated with motor dysfunctions. Currently, indexing methods dedicated to the upper limbs are not widely used, probably due to difficulties in their interpretation. This work presents the calculation algorithm of the new SDDI index, and an attempt is made to determine the level of physical dysfunction, along with a description of its kind, based on the interpretation of the calculated SDDI and PULMI indices. 23 healthy people (10 women and 13 men), who constituted the reference group, and a group of 3 people with mobility impairments participated in the tests. To examine the potential of the SDDI index, the participants repetitively performed two selected rehabilitation movements of the upper extremities. During the tests, kinematic values were recorded using the MVN BIOMECH inertial motion analysis system. The test results were collected as waveforms of 9 anatomical angles in 4 joints of the upper extremities. Then, SDDI and PULMI indices were calculated for each person with mobility impairments. Finally, an analysis was performed to check which abnormalities in upper extremity motion can influence the value of both indices, and an interpretation of those indices is presented. Joint analysis of the two indices provides information on whether the patient has correctly performed the set movement sequence and enables the determination of possible irregularities in the performance of the given movement.

  2. Microbial forensics: the next forensic challenge.

    PubMed

    Budowle, Bruce; Murch, Randall; Chakraborty, Ranajit

    2005-11-01

    Pathogens and toxins can be converted to bioweapons and used to commit bioterrorism and biocrime. Because of the potential and relative ease of an attack using a bioweapon, forensic science needs to be prepared to assist in the investigation to bring perpetrators to justice and to deter future attacks. A new subfield of forensics--microbial forensics--has been created, which is focused on characterization of evidence from a bioterrorism act, biocrime, hoax, or an inadvertent release. Forensic microbiological investigations are essentially the same as any other forensic investigation regarding processing. They involve crime scene(s) investigation, chain of custody practices, evidence collection, handling and preservation, evidence shipping, analysis of evidence, interpretation of results, and court presentation. In addition to collecting and analyzing traditional forensic evidence, the forensic investigation will attempt to determine the etiology and identity of the causal agent, often in a similar fashion as in an epidemiologic investigation. However, for attribution, higher-resolution characterization is needed. The tools for attribution include genetic- and nongenetic-based assays and informatics to attempt to determine the unique source of a sample or at least eliminate some sources. In addition, chemical and physical assays may help determine the process used to prepare, store, or disseminate the bioweapon. An effective microbial forensics program will require development and/or validation of all aspects of the forensic investigative process, from sample collection to interpretation of results. Quality assurance (QA) and QC practices, comparable to those used by the forensic DNA science community, are being implemented. Lastly, partnerships with other laboratories will be requisite, because many of the necessary capabilities for analysis will not reside in the traditional forensic laboratory.

  3. Validation of the sex estimation method elaborated by Schutkowski in the Granada Osteological Collection of identified infant and young children: Analysis of the controversy between the different ways of analyzing and interpreting the results.

    PubMed

    Irurita Olivares, Javier; Alemán Aguilera, Inmaculada

    2016-11-01

Sex estimation of juveniles in the Physical and Forensic Anthropology context is currently a task with serious difficulties because the discriminatory bone characteristics are minimal until puberty. Also, the small number of osteological collections of children available for research has made it difficult to develop effective methodologies in this regard. This study tested the characteristics of the ilium and jaw proposed by Schutkowski in 1993 for sex estimation in subadults. The study sample consisted of 109 boys and 76 girls, ranging in age from 5 months of gestation to 6 years, from the identified osteological collection of Granada (Spain). For the analysis and interpretation of the results, we proposed changes with respect to previous studies because we believe they involved methodological errors relating to the calculation of the probabilities of correct assignment and the sex distribution in the sample. The results showed correct-assignment probabilities much lower than those obtained by Schutkowski as well as by other authors. The best results were obtained with the angle and depth of the sciatic notch, with 0.73 and 0.80 probability of correct assignment respectively if the male trait was observed. The results obtained with the other criteria were too weak to be valid in the context of Physical or Forensic Anthropology. From our results, we concluded that the Schutkowski method should not be used in a forensic context, and that the sciatic notch is the most dimorphic trait in subadults and, therefore, the most appropriate for developing more effective methods of estimating sex.
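The probability-of-correct-assignment question the abstract raises can be illustrated with Bayes' rule, weighting trait frequencies by the sample's actual sex composition rather than an assumed 50:50 split. The sample sizes below match the abstract; the trait frequencies are invented for illustration and are not the study's figures:

```python
def p_sex_given_trait(p_trait_m, p_trait_f, n_m, n_f):
    """P(male | male trait observed) via Bayes' rule, with priors
    taken from the sample's sex composition."""
    prior_m = n_m / (n_m + n_f)
    prior_f = 1.0 - prior_m
    num = p_trait_m * prior_m
    return num / (num + p_trait_f * prior_f)

# hypothetical: trait seen in 80% of boys, 25% of girls; 109 boys, 76 girls
p = p_sex_given_trait(0.80, 0.25, 109, 76)
```

An unbalanced sample inflates the apparent success rate for the majority sex, which is one reason a raw percentage of correct classifications can overstate a method's forensic value.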

  4. Patient-reported outcome measures in arthroplasty registries

    PubMed Central

    Bohm, Eric; Franklin, Patricia; Lyman, Stephen; Denissen, Geke; Dawson, Jill; Dunn, Jennifer; Eresian Chenok, Kate; Dunbar, Michael; Overgaard, Søren; Garellick, Göran; Lübbeke, Anne

    2016-01-01

    Abstract — The International Society of Arthroplasty Registries (ISAR) Patient-Reported Outcome Measures (PROMs) Working Group have evaluated and recommended best practices in the selection, administration, and interpretation of PROMs for hip and knee arthroplasty registries. The 2 generic PROMs in common use are the Short Form health surveys (SF-36 or SF-12) and EuroQol 5-dimension (EQ-5D). The Working Group recommends that registries should choose specific PROMs that have been appropriately developed with good measurement properties for arthroplasty patients. The Working Group recommend the use of a 1-item pain question (“During the past 4 weeks, how would you describe the pain you usually have in your [right/left] [hip/knee]?”; response: none, very mild, mild, moderate, or severe) and a single-item satisfaction outcome (“How satisfied are you with your [right/left] [hip/knee] replacement?”; response: very unsatisfied, dissatisfied, neutral, satisfied, or very satisfied). Survey logistics include patient instructions, paper- and electronic-based data collection, reminders for follow-up, centralized as opposed to hospital-based follow-up, sample size, patient- or joint-specific evaluation, collection intervals, frequency of response, missing values, and factors in establishing a PROMs registry program. The Working Group recommends including age, sex, diagnosis at joint, general health status preoperatively, and joint pain and function score in case-mix adjustment models. Interpretation and statistical analysis should consider the absolute level of pain, function, and general health status as well as improvement, missing data, approaches to analysis and case-mix adjustment, minimal clinically important difference, and minimal detectable change. 
The Working Group recommends data collection immediately before and 1 year after surgery, a threshold of 60% for acceptable frequency of response, documentation of non-responders, and documentation of incomplete or missing data. PMID:27228230

  5. Quality assurance and quality control of geochemical data—A primer for the research scientist

    USGS Publications Warehouse

    Geboy, Nicholas J.; Engle, Mark A.

    2011-01-01

    Geochemistry is a constantly expanding science. More and more, scientists are employing geochemical tools to help answer questions about the Earth and earth system processes. Scientists may assume that the responsibility of examining and assessing the quality of the geochemical data they generate is not theirs but rather that of the analytical laboratories to which their samples have been submitted. This assumption may be partially based on knowledge about internal and external quality assurance and quality control (QA/QC) programs in which analytical laboratories typically participate. Or there may be a perceived lack of time or resources to adequately examine data quality. Regardless of the reason, the lack of QA/QC protocols can lead to the generation and publication of erroneous data. Because the interpretations drawn from the data are primary products to U.S. Geological Survey (USGS) stakeholders, the consequences of publishing erroneous results can be significant. The principal investigator of a scientific study ultimately is responsible for the quality and interpretation of the project's findings, and thus must also play a role in the understanding, implementation, and presentation of QA/QC information about the data. Although occasionally ignored, QA/QC protocols apply not only to procedures in the laboratory but also in the initial planning of a research study and throughout the life of the project. Many of the tenets of developing a sound QA/QC program or protocols also parallel the core concepts of developing a good study: What is the main objective of the study? Will the methods selected provide data of enough resolution to answer the hypothesis? How should samples be collected? Are there known or unknown artifacts or contamination sources in the sampling and analysis methods? 
Assessing data quality requires communication between the scientists responsible for designing the study and those collecting samples, analyzing samples, treating data, and interpreting results. This primer has been developed to provide basic information and guidance about developing QA/QC protocols for geochemical studies. It is not intended to be a comprehensive guide but rather an introduction to key concepts tied to a list of relevant references for further reading. The guidelines are presented in stepwise order beginning with presampling considerations and continuing through final data interpretation. The goal of this primer is to outline basic QA/QC practices that scientists can use before, during, and after chemical analysis to ensure the validity of the data they collect with the goal of providing defendable results and conclusions.
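One routine QC computation covered by primers of this kind is the relative percent difference (RPD) between duplicate analyses of the same sample. A minimal sketch (the example concentrations are invented, and acceptance thresholds vary by method):

```python
def rpd(x1, x2):
    """Relative percent difference between duplicate measurements:
    |x1 - x2| divided by their mean, expressed as a percentage."""
    return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

# duplicate analyses of one sample, hypothetical values (mg/L)
difference = rpd(10.0, 11.0)   # ~9.5% RPD
```

Duplicates with an RPD above the project's acceptance criterion would be flagged for reanalysis or qualified in the final data set.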

  6. Interpretations of Graphs by University Biology Students and Practicing Scientists: Toward a Social Practice View of Scientific Representation Practices.

    ERIC Educational Resources Information Center

    Bowen, G. Michael; Roth, Wolff-Michael; McGinn, Michelle K.

    1999-01-01

    Describes a study of the similarities and differences in graph-related interpretations between scientists and college students engaged in collective graph interpretation. Concludes that while many students learned to provide correct answers to scientific graphing questions, they did not come to make linguistic distinctions or increase their…

  7. A robust ambient temperature collection and stabilization strategy: Enabling worldwide functional studies of the human microbiome

    PubMed Central

    Anderson, Ericka L.; Li, Weizhong; Klitgord, Niels; Highlander, Sarah K.; Dayrit, Mark; Seguritan, Victor; Yooseph, Shibu; Biggs, William; Venter, J. Craig; Nelson, Karen E.; Jones, Marcus B.

    2016-01-01

    As reports on possible associations between microbes and the host increase in number, more meaningful interpretations of this information require an ability to compare data sets across studies. This is dependent upon standardization of workflows to ensure comparability both within and between studies. Here we propose the standard use of an alternate collection and stabilization method that would facilitate such comparisons. The DNA Genotek OMNIgene∙Gut Stool Microbiome Kit was compared to the currently accepted community standard of freezing to store human stool samples prior to whole genome sequencing (WGS) for microbiome studies. This stabilization and collection device allows for ambient temperature storage, automation, and ease of shipping/transfer of samples. The device permitted the same data reproducibility as with frozen samples, and yielded higher recovery of nucleic acids. Collection and stabilization of stool microbiome samples with the DNA Genotek collection device, combined with our extraction and WGS, provides a robust, reproducible workflow that enables standardized global collection, storage, and analysis of stool for microbiome studies. PMID:27558918

  8. Analysis and Interpretation of Artifact Collections from four Archaeological Sites within the Country Club Gardens Permit Area, West Memphis, Crittenden County, Arkansas

    DTIC Science & Technology

    1990-12-01

Excavation at Chucalissa (40SY1). Laboratory work at C.H. Nash Museum. Testing of suspected mound site near Reelfoot Lake, Obion County, Tennessee... Ferguson 1974:2). The county has many streams, bayous, and lakes. Major drainages in Crittenden County include the Tyronza River, Fifteenmile Bayou... Branta canadensis). Fish from the larger streams, oxbow lakes and beaver ponds, such as the flathead catfish, alligator gar, drum, buffalo, largemouth

  9. A Cultural Resources Survey of the St. Charles Parish Hurricane Protection Levee, St. Charles Parish, Louisiana

    DTIC Science & Technology

    1988-09-01

Environments, Inc., 1260 Main Street, Baton Rouge, Louisiana 70802. Prepared for U.S. Army Corps of Engineers, New Orleans District, P.O. Box 60267... Geological interpretations were developed by Charles Pearson with the assistance of Sherwood Gagliano. Figures for the report were prepared by... November 1987. Analysis of the collected data and report preparation took place in December 1987 and January 1988. Although the hurricane protection levee

  10. Berlin Reflectance Spectral Library (BRSL)

    NASA Astrophysics Data System (ADS)

    Henckel, D.; Arnold, G.; Kappel, D.; Moroz, L. V.; Markus, K.

    2017-09-01

The Berlin Reflectance Spectral Library (BRSL) provides a collection of reflectance spectra between 0.3 and 17 µm. It was originally dedicated to supporting space missions to small solar system bodies. Meanwhile, the library also includes selections of biconical reflectance spectra for spectral data analysis of other planetary bodies. The library provides reference spectra of well-characterized terrestrial analogue materials and meteorites for the interpretation of remote-sensing reflectance spectra of planetary surfaces. We introduce the BRSL, summarize the data available, and describe how to access and use them for further relevant applications.

  11. Endometritis: Diagnostic Tools for Infectious Endometritis.

    PubMed

    Ferris, Ryan A

    2016-12-01

Infectious endometritis is among the leading causes of subfertility in the mare. However, the best way to reliably diagnose these cases of infectious endometritis can be confusing to the veterinary practitioner. The goal of this article is to describe how to perform various sample collection techniques, what analyses can be performed on these samples, and how to interpret the results of these analyses. Additionally, future technologies will be presented that are not currently used in equine reproduction practice. Published by Elsevier Inc.

  12. In Situ Analysis of Mars Soil and Rocks Sample with the Sam Gcms Instrumentation Onboard Curiosity : Interpretation and Comparison of Measurements Done during the First Martian Year of Curiosity on Mars

    NASA Astrophysics Data System (ADS)

    Szopa, C.; Coll, P. J.; Cabane, M.; Buch, A.; Coscia, D.; Millan, M.; Francois, P.; Belmahadi, I.; Teinturier, S.; Navarro-Gonzalez, R.; Glavin, D. P.; Freissinet, C.; Steele, A.; Eigenbrode, J. L.; Mahaffy, P. R.

    2014-12-01

The characterisation of the chemical and mineralogical composition of solid surface samples collected with the Curiosity rover is a primary objective of the SAM experiment. These data should provide essential clues on the past habitability of Gale crater. Amongst the SAM suite of instruments [1], SAM-GC (Gas Chromatograph) is devoted to identifying and quantifying volatiles evolved from the thermal (heating up to about 900°C) or chemical (derivatization procedure) treatment of any soil sample collected by the Curiosity rover. To search for potential organic molecules outgassed from the samples, SAM-GC analytical channels composed of a thermal-desorption injector and an MXT-CLP or MXT-Q chromatographic column were chosen for all the measurements made to date, in order to separate a wide range of volatile inorganic and organic molecules. Four solid samples have been analyzed with GCMS: one sand sample collected at the Rocknest site, two rock samples (John Klein and Cumberland) collected at the Yellowknife Bay site using the Curiosity drill, and one rock sample collected at the Kimberly site. All the measurements were successful, and they produced complex chromatograms with both detectors used for SAM-GC, i.e., a thermal conductivity detector and the SAM quadrupole mass spectrometer. Their interpretation has already revealed the presence of an oxychlorine phase in the samples, which is at the origin of the clearly identified chlorohydrocarbons [2], but this represents only a fraction of the GCMS signal recorded [3,4]. This work presents a systematic comparison of the GCMS measurements for the different samples collected, supported by reference data obtained in the laboratory with spare models of the gas chromatograph, with the aim of bringing new elements of interpretation to the SAM measurements. References: [1] Mahaffy, P. et al. (2012) Space Sci Rev, 170, 401-478. [2] Glavin, D. et al. (2013), JGR. [3] Leshin, L. et al. (2013), Science. [4] Ming, D. et al. (2013), Science, 32, 64-67. Acknowledgements: The SAM-GC team acknowledges support from the French Space Agency (CNES), French National Programme of Planetology (PNP), National French Council (CNRS), Pierre Simon Laplace Institute, Institut Universitaire de France (IUF) and ESEP Labex, and the great MSL team.

  13. Machine-readable files developed for the High Plains Regional Aquifer-System analysis in parts of Colorado, Kansas, Nebraska, New Mexico, Oklahoma, South Dakota, Texas, and Wyoming

    USGS Publications Warehouse

    Ferrigno, C.F.

    1986-01-01

Machine-readable files developed for the High Plains Regional Aquifer-System Analysis project are stored on two magnetic tapes available from the U.S. Geological Survey. The first tape contains computer programs that were used to prepare, store, retrieve, organize, and preserve the areal interpretive data collected by the project staff. The second tape contains 134 data files that can be divided into five general classes: (1) aquifer geometry data, (2) aquifer and water characteristics, (3) water levels, (4) climatological data, and (5) land use and water use data. (Author's abstract)

  14. Quality of nutrient data from streams and ground water sampled during water years 1992-2001

    USGS Publications Warehouse

    Mueller, David K.; Titus, Cindy J.

    2005-01-01

    Proper interpretation of water-quality data requires consideration of the effects that bias and variability might have on measured constituent concentrations. In this report, methods are described to estimate the bias due to contamination of samples in the field or laboratory and the variability due to sample collection, processing, shipment, and analysis. Contamination can adversely affect interpretation of measured concentrations in comparison to standards or criteria. Variability can affect interpretation of small differences between individual measurements or mean concentrations. Contamination and variability are determined for nutrient data from quality-control samples (field blanks and replicates) collected as part of the National Water-Quality Assessment (NAWQA) Program during water years 1992-2001. Statistical methods are used to estimate the likelihood of contamination and variability in all samples. Results are presented for five nutrient analytes from stream samples and four nutrient analytes from ground-water samples. Ammonia contamination can add at least 0.04 milligram per liter in up to 5 percent of all samples. This could account for more than 22 percent of measured concentrations at the low range of aquatic-life criteria (0.18 milligram per liter). Orthophosphate contamination, at least 0.019 milligram per liter in up to 5 percent of all samples, could account for more than 38 percent of measured concentrations at the limit to avoid eutrophication (0.05 milligram per liter). Nitrite-plus-nitrate and Kjeldahl nitrogen contamination is less than 0.4 milligram per liter in 99 percent of all samples; thus there is no significant effect on measured concentrations of environmental significance. Sampling variability has little or no effect on reported concentrations of ammonia, nitrite-plus-nitrate, orthophosphate, or total phosphorus sampled after 1998. 
The potential errors due to sampling variability are greater for the Kjeldahl nitrogen analytes and for total phosphorus sampled before 1999. The uncertainty in a mean of 10 concentrations caused by sampling variability is within a small range (1 to 7 percent) for all nutrients. These results can be applied to interpretation of environmental data collected during water years 1992-2001 in 52 NAWQA study units.
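The "at least X milligrams per liter in up to 5 percent of all samples" statements correspond to an upper percentile of field-blank concentrations. A hedged sketch of that summary, using invented blank values (not the NAWQA data) and the 0.18 mg/L aquatic-life criterion cited above:

```python
import math

def percentile(values, p):
    """Nonparametric percentile with linear interpolation,
    used here to summarize field-blank concentrations."""
    s = sorted(values)
    k = (len(s) - 1) * p / 100.0
    f, c = math.floor(k), math.ceil(k)
    if f == c:
        return s[int(k)]
    return s[f] + (s[c] - s[f]) * (k - f)

# hypothetical field-blank ammonia concentrations (mg/L)
blanks = [0.0, 0.0, 0.01, 0.01, 0.02, 0.02, 0.03, 0.03, 0.04, 0.05]
upper = percentile(blanks, 95)        # contamination in <=5% of samples
share = upper / 0.18 * 100.0          # percent of a 0.18 mg/L criterion
```

The published analysis additionally places confidence limits on such percentiles; this sketch shows only the point estimate.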

  15. Hamster Math: Authentic Experiences in Data Collection.

    ERIC Educational Resources Information Center

    Jorgensen, Beth

    1996-01-01

    Describes the data collection and interpretation project of primary grade students involving predicting, graphing, estimating, measuring, number problem construction, problem solving, and probability. (MKR)

  16. A User's Guide to the Encyclopedia of DNA Elements (ENCODE)

    PubMed Central

    2011-01-01

    The mission of the Encyclopedia of DNA Elements (ENCODE) Project is to enable the scientific and medical communities to interpret the human genome sequence and apply it to understand human biology and improve health. The ENCODE Consortium is integrating multiple technologies and approaches in a collective effort to discover and define the functional elements encoded in the human genome, including genes, transcripts, and transcriptional regulatory regions, together with their attendant chromatin states and DNA methylation patterns. In the process, standards to ensure high-quality data have been implemented, and novel algorithms have been developed to facilitate analysis. Data and derived results are made available through a freely accessible database. Here we provide an overview of the project and the resources it is generating and illustrate the application of ENCODE data to interpret the human genome. PMID:21526222

  17. Up the Beanstalk: An Evolutionary Organizational Structure for Libraries.

    ERIC Educational Resources Information Center

    Hoadley, Irene B.; Corbin, John

    1990-01-01

    Presents a functional organizational model for research libraries consisting of six major divisions and subunits: acquisition (buying, borrowing, leasing); organization (records creation, records maintenance); collections (collections management, selection, preservation, special collections and archives); interpretation (reference, instructional…

  18. Interpreting expressive performance through listener judgments of musical tension

    PubMed Central

    Farbood, Morwaread M.; Upham, Finn

    2013-01-01

This study examines listener judgments of musical tension for a recording of a Schubert song and its harmonic reduction. Continuous tension ratings collected in an experiment and quantitative descriptions of the piece's musical features, including dynamics, pitch height, harmony, onset frequency, and tempo, were analyzed from two different angles. In the first part of the analysis, the different processing timescales for disparate features contributing to tension were explored through the optimization of a predictive tension model. The results revealed the optimal time window for harmony was considerably longer (~22 s) than for any other feature (~1–4 s). In the second part of the analysis, tension ratings for the individual verses of the song and its harmonic reduction were examined and compared. The results showed that although the average tension ratings between verses were very similar, differences in how and when participants reported tension changes highlighted performance decisions made in the interpretation of the score, ambiguity in tension implications of the music, and the potential importance of contrast between verses and phrases. Analysis of the tension ratings for the harmonic reduction also provided a new perspective for better understanding how complex musical features inform listener tension judgments. PMID:24416024
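The per-feature processing timescales can be pictured as smoothing windows applied to each feature's time series before it enters the tension model. A minimal sketch of windowed averaging (the window lengths and the moving-average implementation are illustrative, not the study's actual model):

```python
import numpy as np

def smooth(feature, fs, window_s):
    """Moving average of a feature time series over a window of
    the given length in seconds -- a simple stand-in for the
    per-feature timescales the model optimizes."""
    n = max(1, int(round(window_s * fs)))
    kernel = np.ones(n) / n
    return np.convolve(feature, kernel, mode="same")

fs = 10.0                       # hypothetical feature sampling rate (Hz)
loudness = np.random.rand(600)  # 60 s of a hypothetical dynamics feature
fast = smooth(loudness, fs, 2.0)    # ~2 s window (dynamics-like)
slow = smooth(loudness, fs, 22.0)   # ~22 s window (harmony-like)
```

Optimizing one window length per feature, as the study describes, then amounts to searching over `window_s` for the value that best predicts the continuous tension ratings.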

  19. The development of participatory health research among incarcerated women in a Canadian prison

    PubMed Central

    Murphy, K.; Hanson, D.; Hemingway, C.; Ramsden, V.; Buxton, J.; Granger-Brown, A.; Condello, L-L.; Buchanan, M.; Espinoza-Magana, N.; Edworthy, G.; Hislop, T. G.

    2009-01-01

    This paper describes the development of a unique prison participatory research project, in which incarcerated women formed a research team, the research activities and the lessons learned. The participatory action research project was conducted in the main short sentence minimum/medium security women's prison located in a Western Canadian province. An ethnographic multi-method approach was used for data collection and analysis. Quantitative data was collected by surveys and analysed using descriptive statistics. Qualitative data was collected from orientation package entries, audio recordings, and written archives of research team discussions, forums and debriefings, and presentations. These data and ethnographic observations were transcribed and analysed using iterative and interpretative qualitative methods and NVivo 7 software. Up to 15 women worked each day as prison research team members; a total of 190 women participated at some time in the project between November 2005 and August 2007. Incarcerated women peer researchers developed the research processes including opportunities for them to develop leadership and technical skills. Through these processes, including data collection and analysis, nine health goals emerged. Lessons learned from the research processes were confirmed by the common themes that emerged from thematic analysis of the research activity data. Incarceration provides a unique opportunity for engagement of women as expert partners alongside academic researchers and primary care workers in participatory research processes to improve their health. PMID:25759141

  20. GSuite HyperBrowser: integrative analysis of dataset collections across the genome and epigenome.

    PubMed

    Simovski, Boris; Vodák, Daniel; Gundersen, Sveinung; Domanska, Diana; Azab, Abdulrahman; Holden, Lars; Holden, Marit; Grytten, Ivar; Rand, Knut; Drabløs, Finn; Johansen, Morten; Mora, Antonio; Lund-Andersen, Christin; Fromm, Bastian; Eskeland, Ragnhild; Gabrielsen, Odd Stokke; Ferkingstad, Egil; Nakken, Sigve; Bengtsen, Mads; Nederbragt, Alexander Johan; Thorarensen, Hildur Sif; Akse, Johannes Andreas; Glad, Ingrid; Hovig, Eivind; Sandve, Geir Kjetil

    2017-07-01

    Recent large-scale undertakings such as ENCODE and Roadmap Epigenomics have generated experimental data mapped to the human reference genome (as genomic tracks) representing a variety of functional elements across a large number of cell types. Despite the high potential value of these publicly available data for a broad variety of investigations, little attention has been given to the analytical methodology necessary for their widespread utilisation. We here present a first principled treatment of the analysis of collections of genomic tracks. We have developed novel computational and statistical methodology to permit comparative and confirmatory analyses across multiple and disparate data sources. We delineate a set of generic questions that are useful across a broad range of investigations and discuss the implications of choosing different statistical measures and null models. Examples include contrasting analyses across different tissues or diseases. The methodology has been implemented in a comprehensive open-source software system, the GSuite HyperBrowser. To make the functionality accessible to biologists, and to facilitate reproducible analysis, we have also developed a web-based interface providing an expertly guided and customizable way of utilizing the methodology. With this system, many novel biological questions can flexibly be posed and rapidly answered. Through a combination of streamlined data acquisition, interoperable representation of dataset collections, and customizable statistical analysis with guided setup and interpretation, the GSuite HyperBrowser represents a first comprehensive solution for integrative analysis of track collections across the genome and epigenome. The software is available at: https://hyperbrowser.uio.no. © The Author 2017. Published by Oxford University Press.
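    The kind of question this tool addresses — is the overlap between a track and a reference collection larger than expected under a null model? — can be sketched with a Monte Carlo permutation test. This is an illustrative simplification (uniform random placement on a single linear "genome", intervals treated as half-open base-pair ranges), not the GSuite HyperBrowser's actual algorithm or null models.

    ```python
    import random

    def overlap_bp(a, b):
        """Total base pairs of overlap between two sorted interval lists
        (intervals within each list assumed disjoint)."""
        total, i, j = 0, 0, 0
        while i < len(a) and j < len(b):
            lo = max(a[i][0], b[j][0])
            hi = min(a[i][1], b[j][1])
            if lo < hi:
                total += hi - lo
            if a[i][1] < b[j][1]:
                i += 1
            else:
                j += 1
        return total

    def permutation_p(track, reference, genome_len, n=1000, seed=1):
        """Monte Carlo p-value: how often does a uniformly random placement
        of the track's intervals overlap the reference at least as much as
        the observed track does?"""
        rng = random.Random(seed)
        ref_sorted = sorted(reference)
        observed = overlap_bp(sorted(track), ref_sorted)
        lengths = [e - s for s, e in track]
        hits = 0
        for _ in range(n):
            placed = []
            for L in lengths:
                s = rng.randrange(genome_len - L)
                placed.append((s, s + L))
            if overlap_bp(sorted(placed), ref_sorted) >= observed:
                hits += 1
        return (hits + 1) / (n + 1)

    # A track that coincides exactly with the reference should be
    # highly significant against random placement.
    track = [(100, 200), (500, 600)]
    reference = [(100, 200), (500, 600)]
    p = permutation_p(track, reference, genome_len=100_000, n=500)
    ```

    The abstract's point about "choosing different statistical measures and null models" corresponds to varying what `overlap_bp` measures and how `placed` intervals are randomized (e.g. preserving inter-interval spacing or chromosome structure).
    
    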

  1. Metaphorical interpretations of the educator-student relationship: An innovation in nursing educational research.

    PubMed

    Chan, Zenobia C Y; Chien, Wai Tong; Henderson, Saras

    2018-01-01

Previous research has shown that collecting and analysing metaphors is a useful strategy in seeking data that are difficult to collect via verbal interviews or that cannot be represented by statistics. This study explored nursing students' perceptions of the educator-student relationship using metaphorical interpretation. A qualitative study with a personal essay approach was adopted. A total of 124 students were recruited from a nursing school in Hong Kong. A personal essay form was distributed to the participants. They were asked to give a metaphor with explanations to describe the power dynamics in the educator-student relationship, within 200 words in English or Chinese. After some thought, the participants each gave their own metaphor individually, because the aim of this study was to collect their subjective experiences. The results were presented as follows: a) The overall description of the metaphors; b) The three groups of metaphors; c) The five natures of metaphors; d) The most significant metaphors; and e) The four thematic meanings - (i) nurturing role; (ii) guiding role; (iii) insufficient connection; and (iv) promoting development. The implications for research methods and nurse education of collecting and analyzing metaphors were discussed. Discrepancies in metaphorical interpretations are to be expected, as interpretations are dependent on the researchers' socio-cultural background, personal experiences, professional training, languages spoken, and other factors. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Perception of mental health in Pakistani nomads: An interpretative phenomenological analyses

    PubMed Central

    Choudhry, Fahad Riaz; Bokharey, Iram Zehra

    2013-01-01

    The study was conducted to explore the mental health issues of Pakistani nomads and to uncover their concept, ideation, and perception about mental health and illnesses. It was an exploratory study situated in the qualitative paradigm. The research strategy used was Interpretative Phenomenological Analysis (IPA), as the study was planned to explore the lived experiences of nomads regarding mental health and coping strategies and how they interpret those experiences. For data collection, focus group discussions (FGDs) were conducted. Seven participants were included in the FGDs, and two FGDs were conducted composed of both genders. The responses were recorded, and data were transcribed and analysed using IPA. Data verification procedures of peer review, which help to clarify researcher bias and rich thick description, were used. The major themes were lack of resources and myriad unfulfilled needs, specifically the basic needs (food, shelter, and drinking and bathing water). Moreover, a strong desire to fulfil the secondary needs of enjoyment and having luxuries was also reflected. A list of recommendations was forwarded for policy making of this marginalized community and to create awareness regarding mental health. PMID:24369779

  3. Remote sensing data applied to the evaluation of soil erosion caused by land-use. Ribeirao Anhumas Basin Area: A case study. [Brazil

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Dosanjosferreirapinto, S.; Kux, H. J. H.

    1980-01-01

Formerly covered by a tropical forest, the study area was deforested in the early 1940s for coffee plantation and cattle raising, which caused intense gully erosion problems. To develop a method to analyze the relationship between land use and soil erosion, visual interpretations of aerial photographs (scale 1:25,000) and MSS-LANDSAT imagery (scale 1:250,000), as well as automatic interpretation of computer compatible tapes by the IMAGE-100 system, were carried out. From visual interpretation the following data were obtained: land use and cover types, slope classes, ravine frequency, and a texture sketch map. During field work, soil samples were collected for texture and X-ray analysis. The texture sketch map indicates that the areas with higher slope angles have a higher susceptibility to the development of gullies. Also, the overgrazing of pastureland, together with the very friable lithologies (mainly sandstone) occurring in that area, seem to be the main factors influencing the catastrophic extension of ravines in the study site.

  4. Perception of mental health in Pakistani nomads: an interpretative phenomenological analyses.

    PubMed

    Choudhry, Fahad Riaz; Bokharey, Iram Zehra

    2013-12-19

    The study was conducted to explore the mental health issues of Pakistani nomads and to uncover their concept, ideation, and perception about mental health and illnesses. It was an exploratory study situated in the qualitative paradigm. The research strategy used was Interpretative Phenomenological Analysis (IPA), as the study was planned to explore the lived experiences of nomads regarding mental health and coping strategies and how they interpret those experiences. For data collection, focus group discussions (FGDs) were conducted. Seven participants were included in the FGDs, and two FGDs were conducted composed of both genders. The responses were recorded, and data were transcribed and analysed using IPA. Data verification procedures of peer review, which help to clarify researcher bias and rich thick description, were used. The major themes were lack of resources and myriad unfulfilled needs, specifically the basic needs (food, shelter, and drinking and bathing water). Moreover, a strong desire to fulfil the secondary needs of enjoyment and having luxuries was also reflected. A list of recommendations was forwarded for policy making of this marginalized community and to create awareness regarding mental health.

  5. Digital recovery, modification, and analysis of Tetra Tech seismic horizon mapping, National Petroleum Reserve Alaska (NPRA), northern Alaska

    USGS Publications Warehouse

    Saltus, R.W.; Kulander, Christopher S.; Potter, Christopher J.

    2002-01-01

    We have digitized, modified, and analyzed seismic interpretation maps of 12 subsurface stratigraphic horizons spanning portions of the National Petroleum Reserve in Alaska (NPRA). These original maps were prepared by Tetra Tech, Inc., based on about 15,000 miles of seismic data collected from 1974 to 1981. We have also digitized interpreted faults and seismic velocities from Tetra Tech maps. The seismic surfaces were digitized as two-way travel time horizons and converted to depth using Tetra Tech seismic velocities. The depth surfaces were then modified by long-wavelength corrections based on recent USGS seismic re-interpretation along regional seismic lines. We have developed and executed an algorithm to identify and calculate statistics on the area, volume, height, and depth of closed structures based on these seismic horizons. These closure statistics are tabulated and have been used as input to oil and gas assessment calculations for the region. Directories accompanying this report contain basic digitized data, processed data, maps, tabulations of closure statistics, and software relating to this project.
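    The time-to-depth conversion described in this abstract follows directly from the definition of two-way travel time: the wave travels down and back, so depth is velocity times half the recorded time. A minimal sketch (the velocity value is illustrative, not a Tetra Tech figure):

    ```python
    def twt_to_depth(twt_s, velocity_ms):
        """Convert two-way travel time (seconds) to depth (meters).
        The wave traverses the interval twice, hence the factor of 2."""
        return velocity_ms * twt_s / 2.0

    # e.g. a horizon picked at 1.5 s two-way time, with an average
    # interval velocity of 3000 m/s, sits at 2250 m depth.
    depth = twt_to_depth(1.5, 3000.0)  # 2250.0
    ```

    In practice, as in the report, velocities vary laterally and with depth, so each digitized grid node is converted with its own mapped velocity before the long-wavelength corrections are applied.
    
    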

  6. Intelligent data analysis to interpret major risk factors for diabetic patients with and without ischemic stroke in a small population

    PubMed Central

    Gürgen, Fikret; Gürgen, Nurgül

    2003-01-01

    This study proposes an intelligent data analysis approach to investigate and interpret the distinctive factors of diabetes mellitus patients with and without ischemic (non-embolic type) stroke in a small population. The database consists of a total of 16 features collected from 44 diabetic patients. Features include age, gender, duration of diabetes, cholesterol, high density lipoprotein, triglyceride levels, neuropathy, nephropathy, retinopathy, peripheral vascular disease, myocardial infarction rate, glucose level, medication and blood pressure. Metric and non-metric features are distinguished. First, the mean and covariance of the data are estimated and the correlated components are observed. Second, major components are extracted by principal component analysis. Finally, as common examples of local and global classification approach, a k-nearest neighbor and a high-degree polynomial classifier such as multilayer perceptron are employed for classification with all the components and major components case. Macrovascular changes emerged as the principal distinctive factors of ischemic-stroke in diabetes mellitus. Microvascular changes were generally ineffective discriminators. Recommendations were made according to the rules of evidence-based medicine. Briefly, this case study, based on a small population, supports theories of stroke in diabetes mellitus patients and also concludes that the use of intelligent data analysis improves personalized preventive intervention. PMID:12685939
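    The analysis pipeline this abstract describes — extract major components by PCA, then classify with a local method such as k-nearest neighbors — can be sketched with plain NumPy. The data here are synthetic stand-ins, not the 44-patient dataset, and the study also used a multilayer perceptron, which is omitted for brevity.

    ```python
    import numpy as np

    def pca(X, k):
        """Project centered data onto its top-k principal components
        via the SVD of the centered data matrix."""
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:k].T

    def knn_predict(Xtr, ytr, Xte, k=3):
        """Classify each test row by majority vote of its k nearest
        training rows (Euclidean distance)."""
        preds = []
        for x in Xte:
            d = np.linalg.norm(Xtr - x, axis=1)
            nearest = ytr[np.argsort(d)[:k]]
            preds.append(np.bincount(nearest).argmax())
        return np.array(preds)

    # Synthetic two-class data with 16 features, standing in for the
    # patient feature vectors (age, lipids, blood pressure, ...).
    rng = np.random.default_rng(42)
    X = np.vstack([rng.normal(0, 1, (30, 16)), rng.normal(3, 1, (30, 16))])
    y = np.array([0] * 30 + [1] * 30)
    Z = pca(X, 2)                        # 16 features -> 2 components
    preds = knn_predict(Z, y, Z, k=3)    # resubstitution, for illustration
    accuracy = (preds == y).mean()
    ```

    With a sample as small as the study's, resubstitution accuracy is optimistic; cross-validation would be the appropriate evaluation.
    
    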

  7. Hermeneutic phenomenological analysis: the 'possibility' beyond 'actuality' in thematic analysis.

    PubMed

    Ho, Ken H M; Chiang, Vico C L; Leung, Doris

    2017-07-01

    This article discusses the ways researchers may become open to manifold interpretations of lived experience through thematic analysis that follows the tradition of hermeneutic phenomenology. Martin Heidegger's thinking about historical contexts of understandings and the notions of 'alētheia' and 'techne' disclose what he called meaning of lived experience, as the 'unchanging Being of changing beings'. While these notions remain central to hermeneutic phenomenological research, novice phenomenologists usually face the problem of how to incorporate these philosophical tenets into thematic analysis. Discussion paper. This discussion paper is based on our experiences of hermeneutic analysis supported by the writings of Heidegger. Literature reviewed for this paper ranges from 1927 - 2014. We draw on data from a study of foreign domestic helpers in Hong Kong to demonstrate how 'dwelling' in the language of participants' 'ek-sistence' supported us in a process of thematic analysis. Data were collected from December 2013 - February 2016. Nurses doing hermeneutic phenomenology have to develop self-awareness of one's own 'taken-for-granted' thinking to disclose the unspoken meanings hidden in the language of participants. Understanding the philosophical tenets of hermeneutic phenomenology allows nurses to preserve possibilities of interpretations in thinking. In so doing, methods of thematic analysis can uncover and present the structure of the meaning of lived experience. We provide our readers with vicarious experience of how to begin cultivating thinking that is aligned with hermeneutic phenomenological philosophical tenets to conduct thematic analysis. © 2017 John Wiley & Sons Ltd.

  8. Geochemistry and the understanding of ground-water systems

    USGS Publications Warehouse

    Glynn, Pierre D.; Plummer, Niel

    2005-01-01

    Geochemistry has contributed significantly to the understanding of ground-water systems over the last 50 years. Historic advances include development of the hydrochemical facies concept, application of equilibrium theory, investigation of redox processes, and radiocarbon dating. Other hydrochemical concepts, tools, and techniques have helped elucidate mechanisms of flow and transport in ground-water systems, and have helped unlock an archive of paleoenvironmental information. Hydrochemical and isotopic information can be used to interpret the origin and mode of ground-water recharge, refine estimates of time scales of recharge and ground-water flow, decipher reactive processes, provide paleohydrological information, and calibrate ground-water flow models. Progress needs to be made in obtaining representative samples. Improvements are needed in the interpretation of the information obtained, and in the construction and interpretation of numerical models utilizing hydrochemical data. The best approach will ensure an optimized iterative process between field data collection and analysis, interpretation, and the application of forward, inverse, and statistical modeling tools. Advances are anticipated from microbiological investigations, the characterization of natural organics, isotopic fingerprinting, applications of dissolved gas measurements, and the fields of reaction kinetics and coupled processes. A thermodynamic perspective is offered that could facilitate the comparison and understanding of the multiple physical, chemical, and biological processes affecting ground-water systems.

  9. Child and adolescent mental health nursing seen through a social constructionist lens.

    PubMed

    Rasmussen, Philippa; Muir-Cochrane, Eimear; Henderson, Ann

    2015-11-01

To discuss the theoretical framework of social constructionism and justify its appropriateness for, and compatibility with, an interpretive approach to child and adolescent mental health (CAMH) nursing research. Recent changes to national nursing legislation in Australia have resulted in the removal of the separate register with regulatory authorities that existed for the specialty of mental health nursing. Aspects of mental health nursing are not easily defined, with some being tacit. CAMH nursing is a sub-specialty area of mental health in which the role and function of these nurses is also not overtly understood. An interpretive research study was designed to develop a deeper understanding of the role and work of CAMH nurses when working in an inpatient setting. REVIEW METHODS: An interpretive enquiry methodology was used for the study, with three sequential stages of data collection: document analysis, focus group interviews and semi-structured individual interviews. Social constructionism was the chosen theoretical framework for this study as it provided a useful lens for interpreting and understanding the work of the CAMH nurse. The social constructionist lens was simpatico with mental health nursing, as both involve making meaning of, or assessing, information and understanding social processes and interactions. IMPLICATIONS FOR RESEARCH/PRACTICE: A useful lens for further research into mental health nursing practice.

  10. Uncovering racial bias in nursing fundamentals textbooks.

    PubMed

    Byrne, M M

    2001-01-01

This article describes research that sought to identify and critique selected content areas from three nursing fundamentals textbooks for the presence or absence of racial bias embedded in the portrayal of African Americans. The analyzed content areas were the history of nursing, cultural content, and physical assessment/hygiene parameters. A researcher-developed guide was used for data collection and analysis of textual language, illustrations, linguistics, and references. A thematic analysis resulted in 11 themes reflecting the portrayal of African Americans in these sampled textbooks. An interpretive analysis, using the lens of Sadker and Sadker's categories of bias along with other literary and theoretical contexts, was used to explore the presence or absence of racial bias. Recommendations for nursing education are provided.

  11. Timing of specimen collection is crucial in urine screening of drug dependent mothers and newborns.

    PubMed

    Halstead, A C; Godolphin, W; Lockitch, G; Segal, S

    1988-01-01

    We compared results of urine drug analysis with clinical data and history to test the usefulness of peripartum drug screening and to establish guidelines for optimal testing. Urine from 28 mothers and 52 babies was analysed. Drugs not suspected by history were found in 10 mothers and six babies. Results assisted in the management of neonatal withdrawal in three babies. Drugs suspected by history were not found in 11/22 mothers and 23/35 babies. About half of these results were associated with delayed urine collection. In 12/28 mothers, drugs administered in hospital could have confused interpretation of screen results. We conclude that urine drug screening without strict protocols for specimen collection is of limited usefulness for management of drug abuse in pregnancy and neonatal drug withdrawal. We favour testing of maternal urine obtained before drugs are administered in hospital. Neonatal urine, if used, should be collected in the first day of life.

  12. Results and Interpretations of U.S. Geological Survey Data Collected In and Around the Tuba City Open Dump, Arizona

    USGS Publications Warehouse

    Johnson, Raymond H.; Otton, James K.; Horton, Robert J.

    2009-01-01

    This Open-File Report was originally an Administrative Report presentation to the Bureau of Indian Affairs based on U.S. Geological Survey data that has been collected and presented in four previous reports (Open-File Reports 2009-1020, 2008-1380, and 2008-1374, and an Administrative Report on geophysical data). This presentation was given at a technical meeting requested by the BIA on March 3 and 4, 2009, in Phoenix, Arizona. The idea for this meeting was for all the technical people working on issues related to the Tuba City Open Dump site to come together and share their data collection procedures, results, interpretations, and working hypotheses. The meeting goal was to have a clear record of each party's interpretations and a summary of additional data that would be needed to solve differences of opinion. The intention of this presentation is not to provide an exhaustive summary of U.S. Geological Survey efforts at the Tuba City Open Dump site given in the four previously published Open-File Reports listed above, since these reports have already been made available. This presentation briefly summarizes the data collected for those reports and provides results, interpretations, and working hypotheses relating to the data available in these reports. The major questions about the Tuba City Open Dump addressed by the U.S. Geological Survey are (1) what are the sources for uranium and other constituents found in the ground water in and around the Tuba City Open Dump, (2) what is the current distribution of ground water contaminants away from the Tuba City Open Dump (can plume limits be delineated), and (3) what controls the mobility of uranium and other constituents in and around the Tuba City Open Dump? Data collection, results, and interpretations by the U.S. Geological Survey that address these questions are presented herein.

  13. Cutting Corners: Provider Perceptions of Interpretation Services and Factors Related to Use of an Ad Hoc Interpreter.

    PubMed

    Mayo, Rachel; Parker, Veronica G; Sherrill, Windsor W; Coltman, Kinneil; Hudson, Matthew F; Nichols, Christina M; Yates, Adam M; Pribonic, Anne Paige

    2016-06-01

    This study assessed health providers' perceptions of factors related to professional interpretation services and the association between these factors and the potential use of ad hoc interpreters. Data were collected from a convenience sample of 150 health services providers at a large, regional health system in South Carolina. Providers rated "ability to communicate effectively during a clinical encounter" as paramount regarding the use of interpretation services. The most important factors related to the likely use of ad hoc interpreters (cutting corners) included locating a qualified interpreter, having to wait for a qualified interpreter, and technical difficulties regarding phone and video technology. Health care organizations may benefit from increasing staff awareness about patient safety and legal and regulatory risks involved with the use of ad hoc interpreters. © The Author(s) 2016.

  14. [Legal Analysis of the Implementation Rules of Delegation of Home Visits by Family Doctors to Non-Physician Health Professionals: Is the Implementation in Accordance with the Intention of the Law?

    PubMed

    Ruppel, T; van den Berg, N; Hoffmann, W

    2016-10-01

Objective: Triggered by the AGnES model project of the University Medicine Greifswald, the Code of Social Law V was changed by the German Lower and Upper Houses of Parliament (Bundestag and Bundesrat) in 2008 so that the delegation of GPs' activities to non-physician colleagues was allowed under highly restricted preconditions. Delegated home visits should become an integral part of standard care in Germany. In this study, the implementation of § 87 para 2b clause 5 SGB V, established in Annex 8 of the Federal Collective Agreement, was checked for its legality with respect to qualification requirements. Methods: The problem was examined using the legal methods of interpretation: interpretation according to the wording of the norm, and systematic, historical, and teleological interpretation. Results: Even though Parliament clearly required orientation to the AGnES model project (in order to assure safe and effective care in delegated home visits), the self-governing bodies' implementation of the law remained far behind these guidelines. The main outcome of the legal analysis was that the implementation arrangements of the Code of Social Law V are predominantly illegal. Conclusions: The parties to the Federal Collective Agreement have to change the arrangements to meet the requirements of Parliament and to avoid risks of liability for delegating GPs. © Georg Thieme Verlag KG Stuttgart · New York.

  15. Gender in occupational health research of farmworkers: A systematic review

    PubMed Central

    Habib, Rima R; Hojeij, Safa; Elzein, Kareem

    2014-01-01

    Background Farmwork is one of the most hazardous occupations for men and women. Research suggests sex/gender shapes hazardous workplace exposures and outcomes for farmworkers. This paper reviews the occupational health literature on farmworkers, assessing how gender is treated and interpreted in exposure-outcome studies. Methods The paper evaluates peer-reviewed articles on men and women farmworkers' health published between 2000 and 2012 in PubMed or SCOPUS. Articles were identified and analyzed for approaches toward sampling, data analysis, and use of exposure indicators in relation to sex/gender. Results 18% of articles reported on and interpreted sex/gender differences in health outcomes and exposures. Sex/gender dynamics often shaped health outcomes, yet adequate data was not collected on established sex/gender risk factors relating to study outcomes. Conclusion Research can better incorporate sex/gender analysis into design, analytical and interpretive approaches to better explore its mediation of health outcomes in light of emerging calls to mainstream gender research. Am. J. Ind. Med. 57:1344–1367, 2014. © 2014 The Authors. American Journal of Industrial Medicine Published by Wiley Periodicals, Inc. This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made. PMID:25227724

  16. [Recovering helpers in the addiction treatment system in Hungary: an interpretative phenomenological analysis].

    PubMed

    Kassai, Szilvia; Pintér, Judit Nóra; Rácz, József

    2015-01-01

The work of recovering helpers who work in addiction rehabilitation centres was studied. The aim was to investigate the process of addicts becoming recovering helpers, and to study what peer help means to them. Following an interpretative phenomenological analysis (IPA) design, subjects were selected and data were collected and analysed. Six participants (5 males, 1 female) who had been working as recovering helpers for at least one year at addiction rehabilitation centres took part. Semi-structured life story interviews were carried out and analysed according to IPA. Emerging themes from the interviews were identified and summarized, then interpreted as central themes. Important periods and turning points of the life story interviews were the experience of psychoactive drug use and the development of the addiction (which became "Turning Point No 1"), then the "rock bottom" experience ("Turning Point No 2"). The experience of the helping process was then examined; here the major themes identified were the development of the recovering self and the helping self, the wounded helper and the skilled helper, and the experience of the helping process itself. IPA was found to be a useful method for idiographic exploration of the development and the work of recovering helpers. The work of recovering helpers can be described as mentoring of addict clients. Our experiences might be used for training programs for recovering helpers as well as to help them adopt their professional role in addiction services.

  17. The impact of the participation of settlers to the formation of environment in Kampung Nelayan Belawan Medan

    NASA Astrophysics Data System (ADS)

    Marpaung, B. O. Y.; Waginah

    2018-03-01

Every community settlement that forms is related to the social, cultural, and economic conditions of the society in which it exists. Participation is a process involving human interaction; from these interactions arise activities that can potentially form new space (Hendriksen, et al., 2012). The problems addressed in this research concern community involvement in building dwellings, determining land use, and constructing roads and utilities in the Kampung Nelayan Belawan Medan settlement. The aim of this research is to identify that community involvement. In the process of collecting data, the researchers conducted field observations and interviews, then connected theory and interpretation of the data in determining the method of data analysis. The finding of this research is that the formation of settlement spaces in the fishing village is inseparable from community participation in Kampung Nelayan Belawan Medan.

  18. Principal facts and an approach to collecting gravity data using near-real-time observations in the vicinity of Barstow, California

    USGS Publications Warehouse

    Phelps, G.; Cronkite-Ratcliff, C.; Klofas, L.

    2013-01-01

    A gravity survey was done in the vicinity of Barstow, California, in which data were processed and analyzed in the field. The purpose of the data collection was to investigate possible changes in gravity across mapped Quaternary faults and to improve regional gravity coverage, adding to the existing national gravity database. Data were collected, processed, analyzed, and interpreted in the field in order to make decisions about where to collect data for the remainder of the survey. Geological targets in the Barstow area included the Cady Fault, the Manix Fault, and the Yermo Hills. Upon interpreting initial results, additional data were collected to more completely define the fault targets, rather than collecting data to improve the regional gravity coverage in an adjacent area. Both the Manix and Cady Faults showed gravitational expression of the subsurface in the form of steep gravitational gradients that we interpret to represent down-dropped blocks. The gravitational expression of the Cady Fault is on trend with the linear projection of the mapped fault, and the gravitational expression of the Manix Fault is north of the current northernmost mapped strand of the fault. The relative gravitational low over the Yermo Hills was confirmed and better constrained, indicating a significant thickness of sediments at the junction of the Calico, Manix, and Tin Can Alley Faults.
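    The "steep gravitational gradients" used here to infer down-dropped blocks can be illustrated with a first-difference horizontal gradient along a station profile. The numbers below are illustrative, not survey data from Barstow.

    ```python
    def horizontal_gradient(stations, gravity):
        """First-difference horizontal gravity gradient (mGal per meter)
        along a profile; each gradient is assigned to the midpoint of
        the station pair. Steep gradients suggest abrupt lateral density
        contrasts, e.g. a fault juxtaposing basement and basin fill."""
        grads = []
        for i in range(len(stations) - 1):
            dx = stations[i + 1] - stations[i]
            dg = gravity[i + 1] - gravity[i]
            grads.append(((stations[i] + stations[i + 1]) / 2, dg / dx))
        return grads

    # Illustrative profile: a ~5 mGal step between 400 m and 600 m,
    # consistent with a down-dropped block on the far side of a fault.
    x = [0, 200, 400, 600, 800, 1000]       # station positions, m
    g = [0.0, 0.1, 0.2, -4.8, -4.9, -5.0]   # relative gravity, mGal
    grads = horizontal_gradient(x, g)
    steepest = max(grads, key=lambda p: abs(p[1]))
    ```

    Processing data like this in the field, as the survey did, lets the steepest-gradient location steer where the next day's stations are placed.
    
    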

  19. 76 FR 17331 - Debt Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-29

    ... procedures for collection of debts through salary offset, administrative offset, tax refund offset, and... procedure and is interpretative in nature. The procedures contained in the interim final rule for salary...

  20. Hyporheic Exchange Flows and Biogeochemical Patterns near a Meandering Stream: East Fork of the Jemez River, Valles Caldera National Preserve, New Mexico

    NASA Astrophysics Data System (ADS)

    Christensen, H.; Wooten, J. P.; Swanson, E.; Senison, J. J.; Myers, K. D.; Befus, K. M.; Warden, J.; Zamora, P. B.; Gomez, J. D.; Wilson, J. L.; Groffman, A.; Rearick, M. S.; Cardenas, M. B.

    2012-12-01

    A study by the 2012 Hydrogeology Field Methods class of the University of Texas at Austin implemented multiple approaches to evaluate and characterize local hyporheic zone flow and biogeochemical trends in a highly meandering reach of the East Fork of the Jemez River, a fourth-order stream in northwestern New Mexico. This section of the Jemez River is strongly meandering and exhibits distinct riffle-pool morphology. The high stream sinuosity creates inter-meander hyporheic flow that is also largely influenced by local groundwater gradients. In this study, dozens of piezometers were used to map the water table, and flow vectors were then calculated. Surface water and groundwater samples were collected and preserved for later geochemical analysis by ICP-MS and HPLC, and unstable parameters and alkalinity were measured on-site. Additionally, information was collected from thermal monitoring of the streambed, stream gauging, and a series of electrical resistivity surveys forming a network across the site. Hyporheic flow paths are suggested by alternating gaining and losing sections of the stream, as determined by stream gauging at multiple locations along the reach. Water table maps and calculated fluxes across the sediment-water interface also indicate hyporheic flow paths. We find variability in the distribution of biogeochemical constituents (oxidation-reduction potential, nitrate, ammonium, and phosphate) along interpreted flow paths that is partly consistent with hyporheic exchange. The variability and heterogeneity of reducing and oxidizing conditions is interpreted to be a result of groundwater-surface water interaction. Two-dimensional mapping of biogeochemical parameters shows redox transitions along interpreted flow paths. Further analysis of various measured unstable chemical parameters reveals observable trends strongly delineated along these preferential flow paths that are consistent with the direction of groundwater flow and the assumed direction of inter-meander hyporheic flow.
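    The piezometer-based flux calculation described above infers vertical exchange across the sediment-water interface from head gradients via Darcy's law. A minimal sketch of that calculation follows; all numerical values (hydraulic conductivity, screen depth, head differences) are invented for illustration and are not taken from the study.

```python
# Darcy-flux sketch for streambed exchange:
#   q = K * (h_piezometer - h_stream) / dl
# where K is hydraulic conductivity [m/s], dl is the vertical separation
# between the piezometer screen and the streambed [m]. With this sign
# convention, positive q means upward flow (a gaining stream).
# All values below are illustrative, not data from the Jemez River study.

def darcy_flux(K, head_piezometer, head_stream, dl):
    """Specific discharge across the streambed; positive = upwelling."""
    dh = head_piezometer - head_stream
    return K * dh / dl

K = 1e-5   # m/s, assumed sandy streambed
dl = 0.5   # m, assumed piezometer screen depth below the bed

# Piezometer head minus stream stage at three hypothetical locations [m]
head_differences = {"P1": 0.02, "P2": -0.015, "P3": 0.0}
for name, dh in head_differences.items():
    q = darcy_flux(K, dh, 0.0, dl)
    regime = "gaining" if q > 0 else ("losing" if q < 0 else "neutral")
    print(f"{name}: q = {q:.2e} m/s ({regime})")
```

    Mapping the sign of q along the reach is one way the alternating gaining and losing sections described in the abstract can be delineated.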

  1. 14 CFR 65.55 - Knowledge requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 2 2014-01-01 2014-01-01 false Knowledge requirements. 65.55 Section 65.55 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN...) General system of weather and NOTAM collection, dissemination, interpretation, and use; (4) Interpretation...

  2. 14 CFR 65.55 - Knowledge requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 2 2012-01-01 2012-01-01 false Knowledge requirements. 65.55 Section 65.55 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN...) General system of weather and NOTAM collection, dissemination, interpretation, and use; (4) Interpretation...

  3. 14 CFR 61.155 - Aeronautical knowledge.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 2 2012-01-01 2012-01-01 false Aeronautical knowledge. 61.155 Section 61.155 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... system of weather and NOTAM collection, dissemination, interpretation, and use; (4) Interpretation and...

  4. [Analysis of qualitative data collection methods used in adolescent research].

    PubMed

    Ndengeyingoma, Assumpta; De Montigny, Francine; Miron, Jean-Marie

    2013-03-01

    There has been remarkable growth in research on adolescents in the last decade, particularly in nursing science. The goal of this article is to produce a synthesis of findings justifying the use of qualitative methods in collecting data from adolescents. A literature review identified relevant articles (N = 27) from digital databases. While the studies of adolescents covered different topics, the data collection methods were often similar. Most of the studies used more than one technique to reconcile scientific rigour with the way the adolescents expressed themselves. Understanding a phenomenon, its context, and the meaning given to the experience proved essential. In qualitative research on adolescents, it is important to use data collection methods that make it possible to clearly target the experience explored and to orient and guide the individual in deepening that experience in order to favour the emergence of his or her point of view. Data collection methods based on written communication have to be complemented with other methods more focused on oral communication so as to draw out interpretations reflecting adolescents' points of view as accurately as possible.

  5. Evaluation of Two Commercial Systems for Automated Processing, Reading, and Interpretation of Lyme Borreliosis Western Blots▿

    PubMed Central

    Binnicker, M. J.; Jespersen, D. J.; Harring, J. A.; Rollins, L. O.; Bryant, S. C.; Beito, E. M.

    2008-01-01

    The diagnosis of Lyme borreliosis (LB) is commonly made by serologic testing with Western blot (WB) analysis serving as an important supplemental assay. Although specific, the interpretation of WBs for diagnosis of LB (i.e., Lyme WBs) is subjective, with considerable variability in results. In addition, the processing, reading, and interpretation of Lyme WBs are laborious and time-consuming procedures. With the need for rapid processing and more objective interpretation of Lyme WBs, we evaluated the performances of two automated interpretive systems, TrinBlot/BLOTrix (Trinity Biotech, Carlsbad, CA) and BeeBlot/ViraScan (Viramed Biotech AG, Munich, Germany), using 518 serum specimens submitted to our laboratory for Lyme WB analysis. The results of routine testing with visual interpretation were compared to those obtained by BLOTrix analysis of MarBlot immunoglobulin M (IgM) and IgG and by ViraScan analysis of ViraBlot and ViraStripe IgM and IgG assays. BLOTrix analysis demonstrated an agreement of 84.7% for IgM and 87.3% for IgG compared to visual reading and interpretation. ViraScan analysis of the ViraBlot assays demonstrated agreements of 85.7% for IgM and 94.2% for IgG, while ViraScan analysis of the ViraStripe IgM and IgG assays showed agreements of 87.1 and 93.1%, respectively. Testing by the automated systems yielded an average time savings of 64 min/run compared to processing, reading, and interpretation by our current procedure. Our findings demonstrated that automated processing and interpretive systems yield results comparable to those of visual interpretation, while reducing the subjectivity and time required for Lyme WB analysis. PMID:18463211

  6. Evaluation of two commercial systems for automated processing, reading, and interpretation of Lyme borreliosis Western blots.

    PubMed

    Binnicker, M J; Jespersen, D J; Harring, J A; Rollins, L O; Bryant, S C; Beito, E M

    2008-07-01

    The diagnosis of Lyme borreliosis (LB) is commonly made by serologic testing with Western blot (WB) analysis serving as an important supplemental assay. Although specific, the interpretation of WBs for diagnosis of LB (i.e., Lyme WBs) is subjective, with considerable variability in results. In addition, the processing, reading, and interpretation of Lyme WBs are laborious and time-consuming procedures. With the need for rapid processing and more objective interpretation of Lyme WBs, we evaluated the performances of two automated interpretive systems, TrinBlot/BLOTrix (Trinity Biotech, Carlsbad, CA) and BeeBlot/ViraScan (Viramed Biotech AG, Munich, Germany), using 518 serum specimens submitted to our laboratory for Lyme WB analysis. The results of routine testing with visual interpretation were compared to those obtained by BLOTrix analysis of MarBlot immunoglobulin M (IgM) and IgG and by ViraScan analysis of ViraBlot and ViraStripe IgM and IgG assays. BLOTrix analysis demonstrated an agreement of 84.7% for IgM and 87.3% for IgG compared to visual reading and interpretation. ViraScan analysis of the ViraBlot assays demonstrated agreements of 85.7% for IgM and 94.2% for IgG, while ViraScan analysis of the ViraStripe IgM and IgG assays showed agreements of 87.1 and 93.1%, respectively. Testing by the automated systems yielded an average time savings of 64 min/run compared to processing, reading, and interpretation by our current procedure. Our findings demonstrated that automated processing and interpretive systems yield results comparable to those of visual interpretation, while reducing the subjectivity and time required for Lyme WB analysis.

  7. A community responds to collective trauma: an ecological analysis of the James Byrd murder in Jasper, Texas.

    PubMed

    Wicke, Thomas; Silver, Roxane Cohen

    2009-12-01

    The brutal murder of James Byrd Jr. in June 1998 unleashed a storm of media, interest groups, high-profile individuals, and criticism on the Southeast Texas community of Jasper. The crime and subsequent response, from within the community as well as across the world, engulfed the entire town in a collective trauma. Using natural disaster literature/theory and employing an ecological approach, Jasper, Texas was investigated via an interrupted time series analysis to identify how the community changed as compared to a control community (Center, Texas) on crime, economic, health, educational, and social capital measures collected at multiple pre- and post-crime time points between 1995 and 2003. Differences-in-differences (DD) analysis revealed significant post-event changes in Jasper, as well as a surprising degree of resilience and lack of negative consequences. Interviews with residents conducted between March 2005 and 2007 identified how the community responded to the crisis and augmented quantitative findings with qualitative, field-informed interpretation. Interviews suggest the intervention of external organizations exacerbated the severity of the events. However, using strengths of specific local social institutions, including faith-based, law enforcement, media, business sector, and civic government organizations, the community effectively responded to the initial threat and to the potential negative ramifications of external entities.
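    The differences-in-differences comparison described above estimates an effect as the pre-to-post change in the affected community minus the same change in the control community, netting out trends common to both. A minimal sketch follows; all numbers are invented for illustration and are not the study's measurements.

```python
# Differences-in-differences (DD) sketch:
#   effect = (treated change over time) - (control change over time)
# The control community's change stands in for what would have happened
# in the treated community absent the event. Numbers are hypothetical.

def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """DD estimate: treated change minus control change."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical average annual counts of some outcome, pre vs. post event
dd = diff_in_diff(treated_pre=120, treated_post=135,
                  control_pre=100, control_post=112)
print(dd)  # treated change (15) minus control change (12) = 3
```

    A DD estimate near zero on most measures would be consistent with the resilience the study reports: the treated community changed about as much as its control.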

  8. [Clinical reasoning in nursing, concept analysis].

    PubMed

    Côté, Sarah; St-Cyr Tribble, Denise

    2012-12-01

    Nurses work in situations of complex care requiring great clinical reasoning abilities. In the literature, clinical reasoning is often confused with other concepts, and it has no consensual definition. The aim was to conduct a concept analysis of a nurse's clinical reasoning in order to clarify and define it, distinguish it from other concepts, and better understand clinical reasoning. Rodgers's method of concept analysis was used, after literature was retrieved using clinical reasoning, concept analysis, nurse, intensive care, and decision making as keywords. The use of cognition, cognitive strategies, a systematic approach to analysis and data interpretation, and the generation of hypotheses and alternatives are attributes of clinical reasoning. The antecedents are experience, knowledge, memory, cues, intuition, and data collection. The consequences are decision making, action, clues, and problem resolution. This concept analysis helped to define clinical reasoning, to distinguish it from other concepts used synonymously, and to guide future research.

  9. Geophysical Analysis of an Urban Region in Southwestern Pennsylvania

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harbert, W.P.; Lipinski, B.A.; Kaminski, V.

    2006-12-01

    The goal of this project was to characterize the subsurface beneath an urban region of southwestern Pennsylvania, determine its geological structure, and attempt to image pathways for gas migration in the area. Natural gas was commercially produced from this region at the turn of the century, but the field, with more than 100 wells drilled, was closed approximately eighty years ago. There are surface expressions of gas migration visible in the study region. We applied geophysical methods to determine geological structure in this region, including a multi-frequency electromagnetic survey performed using the Geophex GEM-2 system, portable reflection seismic, and a System I/O-based reflection seismic survey. Processing and interpretation of the EM data included filtering 10 raw channels (in-phase and quadrature components measured at 5 frequencies), inverting the data for apparent conductivity using the EM1DFM software from the University of British Columbia, Canada, and further interpretation in terms of near-surface features at a maximum depth of up to 20 meters. Analysis of the collected seismic data included standard seismic processing and the use of the SurfSeis software package developed by the Kansas Geological Survey. Standard reflection processing of these data was completed using LandMark ProMAX 2D/3D and Parallel Geoscience Corporation software. Final stacked sections were then imported into a Seismic Micro Technologies Kingdom Suite+ geodatabase for visualization and analysis. Interpretation of these data was successful in identifying and confirming a region of unmined Freeport coal, determining regional stratigraphic structure, and identifying possible S-wave low-velocity anomalies in the shallow subsurface.

  10. The role of sample preparation in interpretation of trace element concentration variability in moss bioindication studies

    USGS Publications Warehouse

    Migaszewski, Z.M.; Lamothe, P.J.; Crock, J.G.; Galuszka, A.; Dolegowska, S.

    2011-01-01

    Trace element concentrations in plant bioindicators are often determined to assess the quality of the environment. Instrumental methods used for trace element determination require digestion of samples. There are different methods of sample preparation for trace element analysis, and the selection of the best method should be fitted for the purpose of a study. Our hypothesis is that the method of sample preparation is important for interpretation of the results. Here we compare the results of 36 element determinations performed by ICP-MS on ashed and on acid-digested (HNO3, H2O2) samples of two moss species (Hylocomium splendens and Pleurozium schreberi) collected in Alaska and in south-central Poland. We found that dry ashing of the moss samples prior to analysis resulted in considerably lower detection limits of all the elements examined. We also show that this sample preparation technique facilitated the determination of interregional and interspecies differences in the chemistry of trace elements. Compared to the Polish mosses, the Alaskan mosses displayed more positive correlations of the major rock-forming elements with ash content, reflecting those elements' geogenic origin. Of the two moss species, P. schreberi from both Alaska and Poland was also highlighted by a larger number of positive element pair correlations. The cluster analysis suggests that the more uniform element distribution pattern of the Polish mosses primarily reflects regional air pollution sources. Our study has shown that the method of sample preparation is an important factor in statistical interpretation of the results of trace element determinations. © 2010 Springer-Verlag.

  11. 36 CFR § 1290.6 - Originals and copies.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... JFK ASSASSINATION RECORDS GUIDANCE FOR INTERPRETATION AND IMPLEMENTATION OF THE PRESIDENT JOHN F. KENNEDY ASSASSINATION RECORDS COLLECTION ACT OF 1992 (JFK ACT) § 1290.6 Originals and copies. (a) For... President John F. Kennedy Assassination Records Collection (JFK Assassination Records Collection...

  12. 76 FR 47154 - Proposed Information Collection; Comment Request; California Signage Plan: Evaluation of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-04

    ... DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; California Signage Plan: Evaluation of Interpretive Signs AGENCY: National... This request is for a regular submission (new collection). The California Signage Plan is an organized...

  13. Analysis and interpretation of diffraction data from complex, anisotropic materials

    NASA Astrophysics Data System (ADS)

    Tutuncu, Goknur

    Most materials are elastically anisotropic and exhibit additional anisotropy beyond elastic deformation. For instance, in ferroelectric materials the main inelastic deformation mode is via domains, which are highly anisotropic crystallographic features. To quantify this anisotropy of ferroelectrics, advanced X-ray and neutron diffraction methods were employed. Extensive sets of data were collected from tetragonal BaTiO3, PZT and other ferroelectric ceramics. Data analysis was challenging due to the complex constitutive behavior of these materials. To quantify the elastic strain and texture evolution in ferroelectrics under loading, a number of data analysis techniques such as the single peak and Rietveld methods were used and their advantages and disadvantages compared. It was observed that the single peak analysis fails at low peak intensities especially after domain switching while the Rietveld method does not account for lattice strain anisotropy although it overcomes the low intensity problem via whole pattern analysis. To better account for strain anisotropy the constant stress (Reuss) approximation was employed within the Rietveld method and new formulations to estimate lattice strain were proposed. Along the way, new approaches for handling highly anisotropic lattice strain data were also developed and applied. All of the ceramics studied exhibited significant changes in their crystallographic texture after loading indicating non-180° domain switching. For a full interpretation of domain switching the spherical harmonics method was employed in Rietveld. A procedure for simultaneous refinement of multiple data sets was established for a complete texture analysis. To further interpret diffraction data, a solid mechanics model based on the self-consistent approach was used in calculating lattice strain and texture evolution during the loading of a polycrystalline ferroelectric. The model estimates both the macroscopic average response of a specimen and its hkl-dependent lattice strains for different reflections. It also tracks the number of grains (or domains) contributing to each reflection and allows for domain switching. The agreement between the model and experimental data was found to be satisfactory.

  14. A system architecture for online data interpretation and reduction in fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Röder, Thorsten; Geisbauer, Matthias; Chen, Yang; Knoll, Alois; Uhl, Rainer

    2010-01-01

    In this paper we present a high-throughput sample screening system that enables real-time data analysis and reduction for live cell analysis using fluorescence microscopy. We propose a novel system architecture capable of analyzing a large amount of samples during the experiment and thus greatly minimizing the post-analysis phase that is the common practice today. By utilizing data reduction algorithms, relevant information of the target cells is extracted from the online collected data stream, and then used to adjust the experiment parameters in real-time, allowing the system to dynamically react on changing sample properties and to control the microscope setup accordingly. The proposed system consists of an integrated DSP-FPGA hybrid solution to ensure the required real-time constraints, to execute efficiently the underlying computer vision algorithms and to close the perception-action loop. We demonstrate our approach by addressing the selective imaging of cells with a particular combination of markers. With this novel closed-loop system the amount of superfluous collected data is minimized, while at the same time the information entropy increases.

  15. Evaluation of SLAR and simulated thematic mapper MSS data for forest cover mapping using computer-aided analysis techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.; Dean, M. E.; Knowlton, D. J.; Latty, R. S.

    1982-01-01

    Kershaw County, South Carolina, was selected as the study site for analyzing simulated thematic mapper MSS data and dual-polarized X-band synthetic aperture radar (SAR) data. The impact of the improved spatial and spectral characteristics of the LANDSAT D thematic mapper data on computer-aided analysis for forest cover type mapping was examined, as well as the value of synthetic aperture radar data for differentiating forest and other cover types. The utility of pattern recognition techniques for analyzing SAR data was assessed. Topics covered include: (1) collection of TMS and reference data; (2) reformatting, geometric and radiometric rectification, and spatial resolution degradation of TMS data; (3) development of training statistics and test data sets; (4) evaluation of different numbers and combinations of wavelength bands on classification performance; (5) comparison among three classification algorithms; and (6) the effectiveness of the principal component transformation in data analysis. The collection, digitization, reformatting, and geometric adjustment of SAR data are also discussed. Image interpretation results and classification results are presented.

  16. [Qualitative dimensions of the scientific, technological and innovation production at Public Health].

    PubMed

    Luz, Madel Therezinha; Mattos, Rafael da Silva

    2010-07-01

    This article presents the results of a qualitative evaluation of the expansion of the Collective Health area, based on the annals of the triennial Collective Health Congresses held between 1997 and 2006 and promoted by Abrasco, the Brazilian Association of Collective Health. The specific objective was to estimate, through the analysis of substantive aspects and dimensions, the growth in importance of the area in the Brazilian scientific and social scenario over the last decade. The methodological strategy of the study was to account for the complexity and profusion of data referring to the dimensions of this multidisciplinary (and increasingly interdisciplinary) field of knowledge and intervention. From this perspective, document sources were analyzed and interpreted by applying theoretical, methodological, and analytical frameworks from the social sciences together with statistical techniques. It was observed that: (1) in the last decade, the Collective Health area expanded in all three of its subareas (Epidemiology, Planning/Management of Health Services, and Human Sciences); (2) there is a tendency toward more interactivity among the programs and with their communities and institutions; (3) the number of authors writing about the field, and of different authors per article, has grown; and (4) a substantial internal specialization is developing within the subareas.

  17. Research report: a grounded theory description of pastoral counseling.

    PubMed

    Townsend, Loren L

    2011-01-01

    Historically, clerical paradigms of ordained ministry have defined pastoral counseling. However, these fail to describe pastoral counselors in the complex social, theological and medical contexts in which they now work. This study asks the question: How do pastoral counselors in clinical practice describe what is uniquely "pastoral" about the counseling they offer clients? Grounded theory was used to propose a preliminary description and an intermediate theory of how pastoral counselors interpret "pastoral." Eighty-five pastoral counselors were selected for the study over a four year period using criteria to assure maximum variation. Interviews and pastoral identity statements were collected and coded, and theoretical models were organized using NVIVO, a computer assisted qualitative design and analysis software (CAQDAS) package. Results suggest that pastoral counselors share some common ideas regarding "pastoral identity" and clinical practice. How pastoral counselors interpret "pastoral" is highly context sensitive and varies widely.

  18. The University of Kansas Applied Sensing Program: An operational perspective

    NASA Technical Reports Server (NTRS)

    Martinko, E. A.

    1981-01-01

    The Kansas Applied Remote Sensing (KARS) program conducts demonstration projects and applied research on remote sensing techniques which enable local, regional, state, and federal agency personnel to better utilize available satellite and airborne remote sensing systems. As liaison with Kansas agencies for the Earth Resources Laboratory (ERL) Kansas demonstration project, KARS coordinated interagency communication, field data collection, hands-on training, and follow-on technical assistance, and worked with Kansas agency personnel in evaluating land cover maps provided by ERL. Short courses are being conducted to provide training in state-of-the-art remote sensing technology for university faculty, state personnel, and persons from private industry and the federal government. Topics considered in the intensive five-day courses are listed, covering the acquisition, interpretation, and application of information derived through remote sensing, with specific training and hands-on experience in image interpretation and the analysis of LANDSAT data.

  19. Evaluation of recycling programmes in household waste collection systems.

    PubMed

    Dahlén, Lisa; Lagerkvist, Anders

    2010-07-01

    A case study and a literature review have been carried out to address two questions: how can waste flow data from collection systems be interpreted and compared, and which factors are decisive in the results of recycling programmes in household waste collection systems? The aim is to contribute to the understanding of how recycling programmes affect the quantity of waste and sorting activities. It is shown how the results from various waste sorting systems can be interpreted and made comparable. A set of waste flow indicators is proposed which, together with generic system descriptions, can facilitate comparisons of different collection systems. The evaluation of collection systems depends on the system boundaries and will always be site-specific to some degree. Various factors are relevant, e.g. environmental objectives, technical function, operating costs, types of recyclable materials collected separately, property-close collection or drop-off systems, economic incentives, information strategies, residential structure, social codes, etc. Kerbside collection of recyclables and weight-based billing led to increased waste sorting activities in the case study. Forty-three decisive factors are listed and discussed.

  20. Considerations for estimating microbial environmental data concentrations collected from a field setting

    PubMed Central

    Silvestri, Erin E; Yund, Cynthia; Taft, Sarah; Bowling, Charlena Yoder; Chappie, Daniel; Garrahan, Kevin; Brady-Roberts, Eletha; Stone, Harry; Nichols, Tonya L

    2017-01-01

    In the event of an indoor release of an environmentally persistent microbial pathogen such as Bacillus anthracis, the potential for human exposure will be considered when remedial decisions are made. Microbial site characterization and clearance sampling data collected in the field might be used to estimate exposure. However, there are many challenges associated with estimating environmental concentrations of B. anthracis or other spore-forming organisms after such an event before being able to estimate exposure. These challenges include: (1) collecting environmental field samples that are adequate for the intended purpose, (2) conducting laboratory analyses and selecting the reporting format needed for the laboratory data, and (3) analyzing and interpreting the data using appropriate statistical techniques. This paper summarizes some key challenges faced in collecting, analyzing, and interpreting microbial field data from a contaminated site. Although the paper was written with considerations for B. anthracis contamination, it may also be applicable to other bacterial agents. It explores the implications and limitations of using field data for determining environmental concentrations both before and after decontamination. Several findings were of interest. First, to date, the only validated surface/sampling device combinations are swabs and sponge-sticks on stainless steel surfaces, thus limiting availability of quantitative analytical results which could be used for statistical analysis. Second, agreement needs to be reached with the analytical laboratory on the definition of the countable range and on reporting of data below the limit of quantitation. Finally, the distribution of the microbial field data and the statistical methods needed for a particular data set can vary depending on the data collected, and guidance is needed on appropriate statistical software for handling microbial data. Further, research is needed to develop better methods to estimate human exposure to pathogens using environmental data collected from a field setting. PMID:26883476
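    One of the reporting issues raised above is how to handle results below the limit of quantitation (LOQ). A minimal sketch of the common (and potentially biased) LOQ/2 substitution convention, alongside an explicitly censored treatment, follows; all counts and the LOQ value are invented for illustration and are not from the paper.

```python
# Below-LOQ handling sketch: two common conventions for non-detects.
# All values are hypothetical illustration data.
import statistics

LOQ = 10.0  # assumed limit of quantitation, CFU per sample
raw = [25.0, 14.0, None, 40.0, None, 18.0]  # None = reported "<LOQ"

# Convention 1: substitute LOQ/2 for each non-detect. Simple, but it can
# bias summary statistics, which is why agreement on reporting matters.
substituted = [v if v is not None else LOQ / 2 for v in raw]
print(statistics.mean(substituted))

# Convention 2: keep the censoring explicit. Report the quantifiable
# fraction and defer to censored-data estimators (e.g., maximum
# likelihood or Kaplan-Meier methods) in specialized software.
detected = [v for v in raw if v is not None]
print(len(detected) / len(raw))
```

    The contrast between the two conventions illustrates why the paper stresses agreeing with the laboratory, before analysis, on how below-LOQ results will be reported.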

  1. Service user involvement enhanced the research quality in a study using interpretative phenomenological analysis - the power of multiple perspectives.

    PubMed

    Mjøsund, Nina Helen; Eriksson, Monica; Espnes, Geir Arild; Haaland-Øverby, Mette; Jensen, Sven Liang; Norheim, Irene; Kjus, Solveig Helene Høymork; Portaasen, Inger-Lill; Vinje, Hege Forbech

    2017-01-01

    The aim of this study was to examine how service user involvement can contribute to the development of interpretative phenomenological analysis methodology and enhance research quality. Interpretative phenomenological analysis is a qualitative methodology used in nursing research internationally to understand human experiences that are essential to the participants. Service user involvement is requested in nursing research. We share experiences from 4 years of collaboration (2012-2015) on a mental health promotion project, which involved an advisory team. Five research advisors either with a diagnosis or related to a person with severe mental illness constituted the team. They collaborated with the research fellow throughout the entire research process and have co-authored this article. We examined the joint process of analysing the empirical data from interviews. Our analytical discussions were audiotaped, transcribed and subsequently interpreted following the guidelines for good qualitative analysis in interpretative phenomenological analysis studies. The advisory team became 'the researcher's helping hand'. Multiple perspectives influenced the qualitative analysis, which gave more insightful interpretations of nuances, complexity, richness or ambiguity in the interviewed participants' accounts. The outcome of the service user involvement was increased breadth and depth in findings. Service user involvement improved the research quality in a nursing research project on mental health promotion. The interpretative element of interpretative phenomenological analysis was enhanced by the emergence of multiple perspectives in the qualitative analysis of the empirical data. We argue that service user involvement and interpretative phenomenological analysis methodology can mutually reinforce each other and strengthen qualitative methodology. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.

  2. Relating ground truth collection to model sensitivity

    NASA Technical Reports Server (NTRS)

    Amar, Faouzi; Fung, Adrian K.; Karam, Mostafa A.; Mougin, Eric

    1993-01-01

The importance of collecting high-quality ground truth before a SAR mission over a forested area is twofold. First, the ground truth is used in the analysis and interpretation of the measured backscattering properties; second, it helps to justify the use of a scattering model to fit the measurements. Unfortunately, ground truth is often collected based on visual assessment of what is perceived to be important without regard to the mission itself. Sites are selected based on brief surveys of large areas, and the ground truth is collected by a process of selecting and grouping different scatterers. After the fact, it may turn out that some of the relevant parameters are missing. A three-layer canopy model based on the radiative transfer equations is used to determine, beforehand, the relevant parameters to be collected. Detailed analysis of the contribution to scattering and attenuation of various forest components is carried out. The goal is to identify the forest parameters which most influence the backscattering as a function of frequency (P-, L-, and C-bands) and incident angle. The influence on backscattering and attenuation of branch diameters, lengths, angular distribution, and permittivity; trunk diameters, lengths, and permittivity; and needle sizes, their angular distribution, and permittivity are studied in order to maximize the efficiency of the ground truth collection efforts. Preliminary results indicate that while a scatterer may not contribute to the total backscattering, its contribution to attenuation may be significant depending on the frequency.

  3. Supporting flight data analysis for Space Shuttle Orbiter Experiments at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Green, M. J.; Budnick, M. P.; Yang, L.; Chiasson, M. P.

    1983-01-01

The Space Shuttle Orbiter Experiments program is responsible for collecting flight data to extend the research and technology base for future aerospace vehicle design. The Infrared Imagery of Shuttle (IRIS), Catalytic Surface Effects, and Tile Gap Heating experiments sponsored by Ames Research Center are part of this program. The paper describes the software required to process the flight data which support these experiments. In addition, data analysis techniques, developed in support of the IRIS experiment, are discussed. Using the flight data base, the techniques have provided information useful in analyzing and correcting problems with the experiment, and in interpreting the IRIS image obtained during the entry of the third Shuttle mission.

  4. Supporting flight data analysis for Space Shuttle Orbiter experiments at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Green, M. J.; Budnick, M. P.; Yang, L.; Chiasson, M. P.

    1983-01-01

    The space shuttle orbiter experiments program is responsible for collecting flight data to extend the research and technology base for future aerospace vehicle design. The infrared imagery of shuttle (IRIS), catalytic surface effects, and tile gap heating experiments sponsored by Ames Research Center are part of this program. The software required to process the flight data which support these experiments is described. In addition, data analysis techniques, developed in support of the IRIS experiment, are discussed. Using the flight data base, the techniques provide information useful in analyzing and correcting problems with the experiment, and in interpreting the IRIS image obtained during the entry of the third shuttle mission.

  5. Interpreting forest and grassland biome productivity utilizing nested scales of image resolution and biogeographical analysis

    NASA Technical Reports Server (NTRS)

    Iverson, L. R.; Cook, E. A.; Graham, R. L.; Olson, J. S.; Frank, T.; Ke, Y.; Treworgy, C.; Risser, P. G.

    1986-01-01

Several hardware, software, and data collection problems encountered were overcome. The Geographic Information System (GIS) data from other systems were converted to ERDAS format for incorporation with the image data. Statistical analysis of the relationship between spectral values and productivity is being pursued. Several project sites, including Jackson, Pope, Boulder, Smokies, and Huntington Forest, are emerging as the most intensively studied areas, primarily owing to the availability of data and time. Progress with data acquisition and quality checking, more details on experimental sites, and brief summaries of research results and future plans are discussed. Material on personnel, collaborators, facilities, site background, and meetings and publications of the investigators is included.

  6. The Topological Weighted Centroid (TWC): A topological approach to the time-space structure of epidemic and pseudo-epidemic processes

    NASA Astrophysics Data System (ADS)

    Buscema, Massimo; Massini, Giulia; Sacco, Pier Luigi

    2018-02-01

This paper offers the first systematic presentation of the topological approach to the analysis of epidemic and pseudo-epidemic spatial processes. We introduce the basic concepts and proofs, and test the approach on a diverse collection of case studies of historically documented epidemic and pseudo-epidemic processes. The approach is found to consistently provide reliable estimates of the structural features of epidemic processes, and to provide useful analytical insights and interpretations of fragmentary pseudo-epidemic processes. Although this analysis has to be regarded as preliminary, we find that the approach's basic tenets are strongly corroborated by this first test and warrant future research in this vein.

  7. The influence of interpreters' professional background and experience on the interpretation of multimodality imaging of pulmonary lesions using 18F-3'-deoxy-fluorothymidine and 18F-fluorodeoxyglucose PET/CT.

    PubMed

    Xu, Bai-xuan; Liu, Chang-bin; Wang, Rui-min; Shao, Ming-zhe; Fu, Li-ping; Li, Yun-gang; Tian, Jia-he

    2013-01-01

Based on the results of a recently completed multicenter clinical trial on the incremental value of dual-tracer (18F-FDG and 18F-FLT), dual-modality (PET and CT) imaging in the differential diagnosis of pulmonary lesions, we investigated issues that might affect image interpretation and result reporting. The images were read in two separate sessions. First, the images were read and reported by physicians at the imaging center on completion of each PET/CT scan. At the end of the multicenter trial, all images collected during the trial were re-read by a panel of readers in an isolated, blinded, and independent manner. One hundred sixty-two patients passed data verification and entered the final analysis. The primary reporting results showed that adding 18F-FDG image information did not change clinical performance much in sensitivity, specificity, or accuracy, but the ratio between SUVFLT and SUVFDG did improve differentiation among the three subgroups of patients. The panel review showed that diagnostic performance varied with reading strategy. ANOVA indicated significant differences between 18F-FDG and 18F-FLT in SUV (F = 14.239, p = 0.004). CT had almost the same diagnostic performance as 18F-FLT. When the 18F-FDG, 18F-FLT, and CT images were read in pairs, both diagnostic sensitivity and specificity improved. The best diagnostic figures were obtained with the full-modality strategy, in which dual-tracer PET worked in combination with CT. With adequate experience and training, both radiologists and nuclear physicians are qualified to read PET/CT studies and to achieve similar diagnostic accuracy. Making full use of modality combinations and selecting the right criteria seems more important than professional background and personal experience in this new hybrid imaging technology, at least where a novel tracer or application is concerned.

  8. LANDSAT data for coastal zone management. [New Jersey

    NASA Technical Reports Server (NTRS)

    Mckenzie, S.

    1981-01-01

The lack of adequate, current data on land and water surface conditions in New Jersey led to the search for better data collection and analysis techniques. Four-channel MSS data of Cape May County and access to the OSER computer interpretation system were provided by NASA. The spectral resolution of the data was tested and a surface cover map was produced by going through the steps of supervised classification. Topics covered include classification; change detection and improvement of spectral and spatial resolution; merging LANDSAT and map data; and potential applications for New Jersey.

  9. Groundwater and streamflow information program Kansas Cooperative Water Science since 1895

    USGS Publications Warehouse

    Painter, Colin C.; Kramer, Ariele R.; Kelly, Brian P.

    2017-05-10

    The U.S. Geological Survey, in cooperation with State, local, and other Federal agencies, operates a network of streamgages throughout the State of Kansas. Data provided by this network are used to forecast floods, operate reservoirs, develop water policy, administer regulation of water, and perform interpretive analyses of streamflow. This data collection and analysis effort has been sustained since 1895 through cooperative matching fund programs that allow the USGS to work with cooperative agencies to solve groundwater and surface water challenges that affect citizens locally and throughout the Nation.  

  10. Forensic toxicology.

    PubMed

    Davis, Gregory G

    2012-01-01

Toxicologic analysis is an integral part of death investigation, and the use or abuse of an unsuspected substance belongs in the differential diagnosis of patients who have a sudden, unexpected change in their condition. History and physical findings may raise suspicion that intoxication played a role in a patient's decline or death, but suspicions cannot be confirmed unless toxicologic analysis is performed, and no toxicologic analysis is possible unless someone collects the proper specimens necessary for analysis. In a hospital autopsy the only specimens that can rightfully be collected are those within the restrictions stated in the autopsy permit. Autopsies performed by the medical examiner do not have these restrictions. Sometimes the importance of toxicologic testing in a case is not evident until days or weeks after the change in the patient's status, so retaining the appropriate specimens until the investigation of that case has ended is important. Proper interpretation of toxicologic findings requires integrating the clinical setting and findings with the toxicologic results in a way that makes medical sense. If called upon to testify concerning findings, answer the questions truthfully, politely, and in a way that is understandable to someone who has no special training in toxicology.

  11. [Laser microdissection for biology and medicine].

    PubMed

    Podgornyĭ, O V; Lazarev, V N; Govorun, V M

    2012-01-01

For routine extraction of DNA, RNA, proteins, and metabolites, small tissue pieces are placed into a lysing solution. These tissue pieces generally contain different cell types, so the lysate contains components of different cell types, which complicates the interpretation of molecular analysis results. Laser microdissection overcomes this problem. Laser microdissection is a method for procuring tissue samples containing defined cell subpopulations, individual cells, and even subcellular components under direct microscopic visualization. Collected samples can be subjected to a range of downstream molecular assays: DNA analysis, RNA transcript profiling, cDNA library generation and gene expression analysis, proteomic analysis, and metabolite profiling. Laser microdissection has wide applications in oncology (research and routine), cellular and molecular biology, biochemistry, and forensics. This paper reviews the principles of different laser microdissection instruments, examples of laser microdissection applications, and problems of sample preparation for laser microdissection.

  12. Australian Curriculum Linked Lessons: Statistics

    ERIC Educational Resources Information Center

    Day, Lorraine

    2014-01-01

    Students recognise and analyse data and draw inferences. They represent, summarise and interpret data and undertake purposeful investigations involving the collection and interpretation of data… They develop an increasingly sophisticated ability to critically evaluate chance and data concepts and make reasoned judgments and decisions, as well as…

  13. Remote sensing of wildland resources: A state-of-the-art review

    Treesearch

    Robert C. Aldrich

    1979-01-01

    A review, with literature citations, of current remote sensing technology, applications, and costs for wildland resource management, including collection, interpretation, and processing of data gathered through photographic and nonphotographic techniques for classification and mapping, interpretive information for specific applications, measurement of resource...

  14. Interpreting & Biomechanics. PEPNet Tipsheet

    ERIC Educational Resources Information Center

    PEPNet-Northeast, 2001

    2001-01-01

    Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint…

  15. What Could Be Future Scenarios?-Lessons from the History of Public Health Surveillance for the Future: --A keynote address presented at the 8th World Alliance for Risk Factor Surveillance (WARFS) Global Conference on October 30, 2013, Beijing, China.

    PubMed

    Choi, Bernard C K

    2015-01-01

    This article provides insights into the future based on a review of the past and present of public health surveillance-the ongoing systematic collection, analysis, interpretation, and dissemination of health data for the planning, implementation, and evaluation of public health action. Public health surveillance dates back to the first recorded epidemic in 3180 BC in Egypt. A number of lessons and items of interest are summarised from a review of historical perspectives in the past 5,000 years and the current practice of surveillance. Some future scenarios are presented: exploring new frontiers; enhancing computer technology; improving epidemic investigations; improving data collection, analysis, dissemination and use; building on lessons from the past; building capacity; and enhancing global surveillance. It is concluded that learning from the past, reflecting on the present, and planning for the future can further enhance public health surveillance.

  16. Achieving Integration in Mixed Methods Designs—Principles and Practices

    PubMed Central

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  17. Protocol for monitoring metals in Ozark National Scenic Riverways, Missouri: Version 1.0

    USGS Publications Warehouse

    Schmitt, Christopher J.; Brumbaugh, William G.; Besser, John M.; Hinck, Jo Ellen; Bowles, David E.; Morrison, Lloyd W.; Williams, Michael H.

    2008-01-01

    The National Park Service is developing a monitoring plan for the Ozark National Scenic Riverways in southeastern Missouri. Because of concerns about the release of lead, zinc, and other metals from lead-zinc mining to streams, the monitoring plan will include mining-related metals. After considering a variety of alternatives, the plan will consist of measuring the concentrations of cadmium, cobalt, lead, nickel, and zinc in composite samples of crayfish (Orconectes luteus or alternate species) and Asian clam (Corbicula fluminea) collected periodically from selected sites. This document, which comprises a protocol narrative and supporting standard operating procedures, describes the methods to be employed prior to, during, and after collection of the organisms, along with procedures for their chemical analysis and quality assurance; statistical analysis, interpretation, and reporting of the data; and for modifying the protocol narrative and supporting standard operating procedures. A list of supplies and equipment, data forms, and sample labels are also included. An example based on data from a pilot study is presented.

  18. Analyzing Hidden Semantics in Social Bookmarking of Open Educational Resources

    NASA Astrophysics Data System (ADS)

    Minguillón, Julià

    Web 2.0 services such as social bookmarking allow users to manage and share the links they find interesting, adding their own tags for describing them. This is especially interesting in the field of open educational resources, as delicious is a simple way to bridge the institutional point of view (i.e. learning object repositories) with the individual one (i.e. personal collections), thus promoting the discovering and sharing of such resources by other users. In this paper we propose a methodology for analyzing such tags in order to discover hidden semantics (i.e. taxonomies and vocabularies) that can be used to improve descriptions of learning objects and make learning object repositories more visible and discoverable. We propose the use of a simple statistical analysis tool such as principal component analysis to discover which tags create clusters that can be semantically interpreted. We will compare the obtained results with a collection of resources related to open educational resources, in order to better understand the real needs of people searching for open educational resources.
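The core computation described above can be sketched directly: take a (centered) resource-by-tag matrix, extract its principal components, and read candidate tag clusters off the signs of the loadings on the leading component. The matrix below is a hypothetical toy example, assumed for illustration only; the paper's actual delicious data set is not reproduced here.

```python
import numpy as np

# Toy resource-by-tag matrix: rows are bookmarked resources, columns are
# tag counts. Tags and counts are hypothetical illustrations.
X = np.array([
    [3.0, 2.0, 0.0, 0.0],   # resource described mostly by tags 1-2
    [2.0, 3.0, 0.0, 1.0],
    [0.0, 0.0, 4.0, 3.0],   # resource described mostly by tags 3-4
    [1.0, 0.0, 3.0, 4.0],
])

# Principal component analysis via SVD of the column-centered matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Loadings of each tag on the first principal component. Tags whose
# loadings share a sign tend to co-occur across resources, which is the
# kind of cluster that can then be interpreted semantically.
pc1 = Vt[0]
clusters = np.sign(pc1)
print(clusters)
```

With this toy data, tags 1-2 and tags 3-4 land in opposite-signed groups on the first component, mirroring how they co-occur across the rows.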

  19. Achieving integration in mixed methods designs-principles and practices.

    PubMed

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-12-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs-exploratory sequential, explanatory sequential, and convergent-and through four advanced frameworks-multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. © Health Research and Educational Trust.

  20. "When I saw walking I just kind of took it as wheeling": interpretations of mobility-related items in generic, preference-based health state instruments in the context of spinal cord injury.

    PubMed

    Michel, Yvonne Anne; Engel, Lidia; Rand-Hendriksen, Kim; Augestad, Liv Ariane; Whitehurst, David Gt

    2016-11-28

    In health economic analyses, health states are typically valued using instruments with few items per dimension. Due to the generic (and often reductionist) nature of such instruments, certain groups of respondents may experience challenges in describing their health state. This study is concerned with generic, preference-based health state instruments that provide information for decisions about the allocation of resources in health care. Unlike physical measurement instruments, preference-based health state instruments provide health state values that are dependent on how respondents interpret the items. This study investigates how individuals with spinal cord injury (SCI) interpret mobility-related items contained within six preference-based health state instruments. Secondary analysis of focus group transcripts originally collected in Vancouver, Canada, explored individuals' perceptions and interpretations of mobility-related items contained within the 15D, Assessment of Quality of Life 8-dimension (AQoL-8D), EQ-5D-5L, Health Utilities Index (HUI), Quality of Well-Being Scale Self-Administered (QWB-SA), and the 36-item Short Form health survey version 2 (SF-36v2). Ritchie and Spencer's 'Framework Approach' was used to perform thematic analysis that focused on participants' comments concerning the mobility-related items only. Fifteen individuals participated in three focus groups (five per focus group). Four themes emerged: wording of mobility (e.g., 'getting around' vs 'walking'), reference to aids and appliances, lack of suitable response options, and reframing of items (e.g., replacing 'walking' with 'wheeling'). These themes reflected item features that respondents perceived as relevant in enabling them to describe their mobility, and response strategies that respondents could use when faced with inaccessible items. 
Investigating perceptions of mobility-related items within the context of SCI highlights substantial variation in item interpretation across six preference-based health state instruments. Studying respondents' interpretations of items can help to understand discrepancies in the health state descriptions and values obtained from different instruments. This line of research warrants closer attention in the health economics and quality of life literature.

  1. Exhaled breath condensate – from an analytical point of view

    PubMed Central

    Dodig, Slavica; Čepelak, Ivana

    2013-01-01

Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process for EBC analysis was investigated across the pre-analytical (formation, collection, and storage of EBC), analytical (sensitivity of applied methods, standardization), and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical, and post-analytical requirements are met, EBC biomarkers and biomarker patterns can be selected and EBC analysis can hopefully be used in clinical practice, in both the diagnosis and the longitudinal follow-up of patients, resulting in better disease outcomes. PMID:24266297

  2. Land use/land cover mapping (1:25000) of Taiwan, Republic of China by automated multispectral interpretation of LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Sung, Q. C.; Miller, L. D.

    1977-01-01

Three methods were tested for collecting the training sets needed to establish the spectral signatures of the land uses/land covers sought, owing to the difficulty of retrospectively collecting representative ground control data. The computer preprocessing techniques applied to the digital images to improve the final classification results were geometric correction, spectral band or image ratioing, and statistical cleaning of the representative training sets. A minimal level of statistical verification was performed, based on comparisons between the airphoto estimates and the classification results. The verification lent further support to the selection of MSS bands 5 and 7. It also indicated that the maximum likelihood ratioing technique achieves classification results more consistent with the airphoto estimates than does stepwise discriminant analysis.

  3. Comparing Geologic Data Sets Collected by Planetary Analog Traverses and by Standard Geologic Field Mapping: Desert Rats Data Analysis

    NASA Technical Reports Server (NTRS)

    Feng, Wanda; Evans, Cynthia; Gruener, John; Eppler, Dean

    2014-01-01

    Geologic mapping involves interpreting relationships between identifiable units and landforms to understand the formative history of a region. Traditional field techniques are used to accomplish this on Earth. Mapping proves more challenging for other planets, which are studied primarily by orbital remote sensing and, less frequently, by robotic and human surface exploration. Systematic comparative assessments of geologic maps created by traditional mapping versus photogeology together with data from planned traverses are limited. The objective of this project is to produce a geologic map from data collected on the Desert Research and Technology Studies (RATS) 2010 analog mission using Apollo-style traverses in conjunction with remote sensing data. This map is compared with a geologic map produced using standard field techniques.

  4. [Notes on vital statistics for the study of perinatal health].

    PubMed

    Juárez, Sol Pía

    2014-01-01

Vital statistics, published by the National Statistics Institute in Spain, are a highly important source for the study of perinatal health nationwide. However, the process of data collection is not well-known and has implications both for the quality and interpretation of the epidemiological results derived from this source. The aim of this study was to present how the information is collected and some of the associated problems. This study is the result of an analysis of the methodological notes from the National Statistics Institute and first-hand information obtained from hospitals, the Central Civil Registry of Madrid, and the Madrid Institute for Statistics. Greater integration between these institutions is required to improve the quality of birth and stillbirth statistics. Copyright © 2014 SESPAS. Published by Elsevier España. All rights reserved.

  5. Automated Interpretation of Blood Culture Gram Stains by Use of a Deep Convolutional Neural Network.

    PubMed

    Smith, Kenneth P; Kang, Anthony D; Kirby, James E

    2018-03-01

    Microscopic interpretation of stained smears is one of the most operator-dependent and time-intensive activities in the clinical microbiology laboratory. Here, we investigated application of an automated image acquisition and convolutional neural network (CNN)-based approach for automated Gram stain classification. Using an automated microscopy platform, uncoverslipped slides were scanned with a 40× dry objective, generating images of sufficient resolution for interpretation. We collected 25,488 images from positive blood culture Gram stains prepared during routine clinical workup. These images were used to generate 100,213 crops containing Gram-positive cocci in clusters, Gram-positive cocci in chains/pairs, Gram-negative rods, or background (no cells). These categories were targeted for proof-of-concept development as they are associated with the majority of bloodstream infections. Our CNN model achieved a classification accuracy of 94.9% on a test set of image crops. Receiver operating characteristic (ROC) curve analysis indicated a robust ability to differentiate between categories with an area under the curve of >0.98 for each. After training and validation, we applied the classification algorithm to new images collected from 189 whole slides without human intervention. Sensitivity and specificity were 98.4% and 75.0% for Gram-positive cocci in chains and pairs, 93.2% and 97.2% for Gram-positive cocci in clusters, and 96.3% and 98.1% for Gram-negative rods. Taken together, our data support a proof of concept for a fully automated classification methodology for blood-culture Gram stains. Importantly, the algorithm was highly adept at identifying image crops with organisms and could be used to present prescreened, classified crops to technologists to accelerate smear review. This concept could potentially be extended to all Gram stain interpretive activities in the clinical laboratory. Copyright © 2018 American Society for Microbiology.
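The per-category sensitivity and specificity figures reported above follow the standard one-vs-rest confusion-matrix definitions. A minimal sketch, with hypothetical counts chosen only to reproduce the reported Gram-negative rod rates (96.3% and 98.1%); the study's real tallies are not reproduced here.

```python
# Sensitivity and specificity from one-vs-rest confusion counts, as used
# to evaluate a multi-class classifier one category at a time.

def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: fraction of actual positives correctly flagged."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negative rate: fraction of actual negatives correctly rejected."""
    return tn / (tn + fp)

# Hypothetical counts with Gram-negative rods as the positive class.
tp, fn = 52, 2    # slides with rods: correctly vs incorrectly classified
tn, fp = 106, 2   # slides without rods: correctly vs incorrectly classified

print(round(sensitivity(tp, fn), 3))  # 0.963
print(round(specificity(tn, fp), 3))  # 0.981
```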

  6. U.S. Geological Survey applied research studies of the Cheyenne River System, South Dakota; description and collation of data, water years 1987-88

    USGS Publications Warehouse

    Goddard, K. E.

    1990-01-01

    The Cheyenne River System in western South Dakota has been impacted by the discharge of about 100 million metric tons of gold-mill tailings to Whitewood Creek near Lead, South Dakota. In April 1985, the U.S. Geological Survey initiated an extensive series of research studies to investigate the magnitude of the impact and to define important processes acting on the contaminated sediments present in the system. The report presents all data collected during the 1987 and 1988 water years for these research studies. Some of the data included have been published previously. Data collected in the 1985 and 1986 water years have been published in a companion report (U.S. Geological Survey Open-File Report 88-484). Hydrologic, geochemical, and biologic data are available for sites on Whitewood Creek, and the Belle Fourche and Cheyenne Rivers. Data complexity varies from routine discharge and water-quality to very complex energy-dispersive x-ray analysis. Methods for sample collection, handling and preservation, and laboratory analysis are also presented. No interpretations or complex statistical summaries are included. (See also W89-08390) (USGS)

  7. Alterations in hematologic indices during long-duration spaceflight.

    PubMed

    Kunz, Hawley; Quiriarte, Heather; Simpson, Richard J; Ploutz-Snyder, Robert; McMonigal, Kathleen; Sams, Clarence; Crucian, Brian

    2017-01-01

Although a state of anemia is perceived to be associated with spaceflight, to date a peripheral blood hematologic assessment of red blood cell (RBC) indices has not been performed during long-duration space missions. This investigation collected whole blood samples from astronauts participating in up to 6 months of orbital spaceflight, and returned those samples (ambient storage) to Earth for analysis. As samples were always collected near undock of a returning vehicle, the delay from collection to analysis never exceeded 48 h. As a subset of a larger immunologic investigation, a complete blood count was performed. A parallel stability study of the effect of a 48 h delay on these parameters assisted interpretation of the in-flight data. We report that the RBC and hemoglobin were significantly elevated during flight, both parameters deemed stable through the delay of sample return. Although the stability data showed hematocrit to be mildly elevated at +48 h, there was an in-flight increase in hematocrit that was ~3-fold higher in magnitude than the anticipated increase due to the delay in processing. While susceptible to the possible influence of dehydration or plasma volume alterations, these results suggest astronauts do not develop persistent anemia during spaceflight.

  8. Revisiting Interpretation of Canonical Correlation Analysis: A Tutorial and Demonstration of Canonical Commonality Analysis

    ERIC Educational Resources Information Center

    Nimon, Kim; Henson, Robin K.; Gates, Michael S.

    2010-01-01

    In the face of multicollinearity, researchers encounter challenges when interpreting canonical correlation analysis (CCA) results. Although standardized function and structure coefficients provide insight into the canonical variates produced, they fall short when researchers want to fully report canonical effects. This article revisits the interpretation of…
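    The quantities discussed above, canonical correlations and structure coefficients, can be recomputed directly. Below is a minimal NumPy sketch on synthetic data; the data and variable names are invented for illustration, and this is not the article's own procedure:

```python
import numpy as np

def canonical_correlations(X, Y):
    """CCA via SVD of the whitened cross-covariance matrix."""
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)

    def inv_sqrt(C):  # inverse matrix square root of a covariance
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    n = len(X)
    Sxx, Syy, Sxy = Xc.T @ Xc / n, Yc.T @ Yc / n, Xc.T @ Yc / n
    U, s, Vt = np.linalg.svd(inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy))
    # s holds the canonical correlations; Wx, Wy map data to canonical variates
    Wx, Wy = inv_sqrt(Sxx) @ U, inv_sqrt(Syy) @ Vt.T
    return s, Xc @ Wx, Yc @ Wy

# Two synthetic variable sets sharing one latent factor
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
X = np.hstack([latent + 0.5 * rng.normal(size=(200, 1)) for _ in range(3)])
Y = np.hstack([latent + 0.5 * rng.normal(size=(200, 1)) for _ in range(2)])
s, U_var, V_var = canonical_correlations(X, Y)

# Structure coefficients: correlations of each X variable with the
# first canonical variate, the interpretive aid discussed above
struct_X = [np.corrcoef(X[:, j], U_var[:, 0])[0, 1] for j in range(3)]
```

A commonality analysis would go on to partition the explained variance into unique and shared components; this sketch stops at the coefficients such an analysis builds on.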

  9. Methodological challenges in qualitative content analysis: A discussion paper.

    PubMed

    Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit

    2017-09-01

    This discussion paper aims to map content analysis in the qualitative paradigm and to explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic by which categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. People's Collection Wales: Online Access to the Heritage of Wales from Museums, Archives and Libraries

    ERIC Educational Resources Information Center

    Tedd, Lucy A.

    2011-01-01

    Purpose: The People's Collection Wales aims to collect, interpret, distribute and discuss Wales' cultural heritage in an online environment. Individual users or local history societies are able to create their own digital collections, contribute relevant content, as well as access digital resources from heritage institutions. This paper aims to…

  11. Contributions to a neurophysiology of meaning: the interpretation of written messages could be an automatic stimulus-reaction mechanism before becoming conscious processing of information

    PubMed Central

    Convertini, Livia S.; Quatraro, Sabrina; Ressa, Stefania; Velasco, Annalisa

    2015-01-01

    Background. Even though the interpretation of natural language messages is generally conceived as the result of conscious processing of the message content, the influence of unconscious factors is also well known. What is still insufficiently known is the way such factors work. We have tackled interpretation assuming it is a process whose basic features are the same for the whole of humankind, and employing a naturalistic approach (careful observation of phenomena in conditions as close as possible to “natural” ones, and precise description before, and independently of, statistical analysis of the data). Methodology. Our field research involved a random sample of 102 adults. We presented them with a complete, real-world-like case of written communication using unabridged message texts. We collected data (participants’ written reports on their interpretations) in controlled conditions through a specially designed questionnaire (closed and open answers); we then analysed the data using qualitative and quantitative methods. Principal Findings. We gathered some evidence that, in written message interpretation, between reading and the attribution of conscious meaning, an intermediate step could exist (we named it “disassembling”) which looks like an automatic reaction to the text’s words and expressions. Thus, the process of interpretation would be a discontinuous sequence of three steps of different natures: the initial “decoding” step (i.e., reading, which requires technical abilities), disassembling (the automatic reaction, an unconscious passage), and the final conscious attribution of meaning. If this is true, words and expressions would first function like physical stimuli before being taken into account as symbols. Such a hypothesis, once confirmed, could help explain some links between the cultural (human communication) and the biological (stimulus-reaction mechanisms as the basis for meanings) dimensions of humankind. PMID:26528419

  12. In Dialogue with the Decorative Arts

    ERIC Educational Resources Information Center

    Powell, Olivia

    2017-01-01

    How can museum educators create dialogical experiences with European decorative arts? This question frames my essay and stems from the challenges I have faced introducing objects whose original functions seem to overshadow their aesthetic and interpretive value. Repeated efforts to spark rich dialogue and collective interpretation around pieces of…

  13. Deep Lake Explorer: Bringing citizen scientists to the underwater world of the Great Lakes

    EPA Science Inventory

    Deep Lake Explorer is a web application hosted on the Zooniverse platform that allows the public to interpret underwater video collected in the Great Lakes. Crowdsourcing image interpretation using the Zooniverse platform has proven successful for many projects, but few projects ...

  14. Aristotle's "Rhetoric": Reinterpreting Invention.

    ERIC Educational Resources Information Center

    Quandahl, Ellen

    1986-01-01

    Shows that Aristotle's common topics are part of a theory of interpretation rather than a collection of devices for invention. Argues that it is more Aristotelian and more useful to understand composing as interpretation and not invention. Uses scholarship to inform pedagogy and to reorient composing toward acts of reading. (EL)

  15. 76 FR 6796 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-08

    ... perform the ``Quantitative Survey of Physician Practices in Laboratory Test Ordering and Interpretation... technology. Written comments should be received within 60 days of this notice. Proposed Project Quantitative Survey of Physician Practices in Laboratory Test Ordering and Interpretation-NEW-the Office of...

  16. An evaluation of lithographed forest stereograms.

    Treesearch

    David A. Bernstein

    1961-01-01

    Aerial photo stereograms are valuable for showing neophyte photo interpreters the stereoscopic appearance of common objects and conditions. They are also useful for instruction in measuring heights, horizontal distances, and angles on photos. Collections of stereograms of known conditions are worthwhile reference material for interpretation work in unknown areas.

  17. Shoreline Change and Storm-Induced Beach Erosion Modeling: A Collection of Seven Papers

    DTIC Science & Technology

    1990-03-01

    reducing, and analyzing the data in a systematic manner. Most physical data needed for evaluating and interpreting shoreline and beach evolution processes...proposed development concepts using both physical and numerical models. b. Analyzed and interpreted model results. c. Provided technical documentation of... interpret study results in the context required for "Confirmation" hearings. 26 The Corps of Engineers, Los Angeles District (SPL), has also begun studies

  18. Mobile Collection and Automated Interpretation of EEG Data

    NASA Technical Reports Server (NTRS)

    Mintz, Frederick; Moynihan, Philip

    2007-01-01

    A system that would comprise mobile and stationary electronic hardware and software subsystems has been proposed for collection and automated interpretation of electroencephalographic (EEG) data from subjects in everyday activities in a variety of environments. By enabling collection of EEG data from mobile subjects engaged in ordinary activities (in contradistinction to collection from immobilized subjects in clinical settings), the system would expand the range of options and capabilities for performing diagnoses. Each subject would be equipped with one of the mobile subsystems, which would include a helmet that would hold floating electrodes (see figure) in those positions on the patient's head that are required in classical EEG data-collection techniques. A bundle of wires would couple the EEG signals from the electrodes to a multi-channel transmitter also located in the helmet. Electronic circuitry in the helmet transmitter would digitize the EEG signals and transmit the resulting data via a multidirectional RF patch antenna to a remote location. At the remote location, the subject's EEG data would be processed and stored in a database that would be auto-administered by a newly designed relational database management system (RDBMS). In this RDBMS, in nearly real time, the newly stored data would be subjected to automated interpretation that would involve comparison with other EEG data and concomitant peer-reviewed diagnoses stored in international brain databases administered by other similar RDBMSs.
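    As a rough sketch of the helmet transmitter's job, framing digitized multichannel samples for radio transfer, here is a minimal example. The frame layout, field widths, and function names are invented for illustration and are not the proposed system's actual format:

```python
import struct

def pack_eeg_frame(subject_id, timestamp_ms, samples):
    """Pack one multichannel EEG frame: a little-endian header
    (subject id, channel count, millisecond timestamp) followed by one
    signed 16-bit reading per channel. Purely illustrative layout."""
    header = struct.pack("<HBI", subject_id, len(samples), timestamp_ms)
    body = struct.pack("<%dh" % len(samples), *samples)
    return header + body

def unpack_eeg_frame(frame):
    """Invert pack_eeg_frame at the receiving (remote) end."""
    subject_id, n_channels, timestamp_ms = struct.unpack("<HBI", frame[:7])
    samples = struct.unpack("<%dh" % n_channels, frame[7:])
    return subject_id, timestamp_ms, list(samples)
```

A real system would add sequence numbers and error detection for the RF link; the point here is only the digitize-then-frame step described above.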

  19. [Importance of quality control for the detection of β-lactam antibiotic resistance in Enterobacteriaceae].

    PubMed

    Rivera, Alba; Larrosa, Nieves; Mirelis, Beatriz; Navarro, Ferran

    2014-02-01

    β-lactam antimicrobial agents are frequently used to treat infections caused by Enterobacteriaceae. The main mechanism of resistance to these antibiotics is the production of certain enzymes, collectively named β-lactamases. Owing to their substrate profiles and epidemiological implications, the most clinically important β-lactamases are the extended-spectrum β-lactamases, class C β-lactamases, and carbapenemases. Phenotypic detection of these enzymes may be complicated and is based on the use of specific inhibitors of each β-lactamase and on the loss of activity against certain indicator β-lactams. Various international committees postulate that it is no longer necessary to interpret susceptibility results or determine the mechanism of resistance. Several critics disagree, however, and consider that susceptibility results should be interpreted until more data are available on the clinical efficacy of treatment with β-lactams. Given these methodological difficulties and the constant changes in interpretation criteria, we consider training and external quality controls essential for keeping up to date in this field. For learning purposes, these external quality controls should always be accompanied by a review of the results and methodology used, and by an analysis of errors. In this paper we review and contextualize the aspects related to the detection and interpretation of these β-lactamases. Copyright © 2014 Elsevier España, S.L. All rights reserved.

  20. Collective Bargaining Agreements, Labor Relations, Division of Personnel

    Science.gov Websites

    Indexed collective bargaining agreements maintained by the Division of Personnel, Labor Relations, with links to agreement texts, interpretative memoranda, frequently asked questions, and a union contact list.

  1. Data analysis considerations for pesticides determined by National Water Quality Laboratory schedule 2437

    USGS Publications Warehouse

    Shoda, Megan E.; Nowell, Lisa H.; Stone, Wesley W.; Sandstrom, Mark W.; Bexfield, Laura M.

    2018-04-02

    In 2013, the U.S. Geological Survey National Water Quality Laboratory (NWQL) made a new method available for the analysis of pesticides in filtered water samples: laboratory schedule 2437. Schedule 2437 is an improvement on previous analytical methods because it determines the concentrations of 225 fungicides, herbicides, insecticides, and associated degradates in one method, at similar or lower concentrations than previously available methods. Additionally, the pesticides included in schedule 2437 were strategically identified in a prioritization analysis that assessed likelihood of occurrence, prevalence of use, and potential toxicity. When the NWQL reports pesticide concentrations for analytes in schedule 2437, the laboratory also provides supplemental information useful to data users for assessing method performance and understanding data quality. That supplemental information is discussed in this report, along with an initial analysis of analytical recovery of pesticides in water-quality samples analyzed by schedule 2437 during 2013–2015. A total of 523 field matrix spike samples and their paired environmental samples and 277 laboratory reagent spike samples were analyzed for this report (1,323 samples total). These samples were collected in the field as part of the U.S. Geological Survey National Water-Quality Assessment groundwater and surface-water studies and as part of the NWQL quality-control program. This report reviews how pesticide samples are processed by the NWQL, addresses how to obtain all the data necessary to interpret pesticide concentrations, explains the circumstances that result in a reporting-level change or the occurrence of a raised reporting level, and describes the calculation and assessment of recovery. This report also discusses reasons why a data user might choose to exclude data in an interpretive analysis and outlines the approach used to identify the potential for decreased data quality in the assessment of method recovery. The information provided in this report is essential to understanding pesticide data determined by schedule 2437 and should be reviewed before interpretation of these data.
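    The recovery calculation the report describes follows, in its standard matrix-spike form, from the measured spike contribution divided by the theoretical spike concentration. A minimal sketch, with invented concentrations and an assumed function name:

```python
def percent_recovery(spiked_conc, env_conc, expected_spike_conc):
    """Matrix-spike recovery: the measured spike contribution
    (spiked sample minus paired environmental sample) over the
    theoretical spike concentration, as a percentage."""
    return 100.0 * (spiked_conc - env_conc) / expected_spike_conc

# e.g. a field matrix spike measuring 9.5 ug/L where the paired
# environmental sample measured 1.0 ug/L and 10.0 ug/L was added:
recovery = percent_recovery(9.5, 1.0, 10.0)  # 85.0 %
```

Recoveries well below or above 100% across many spike pairs are the kind of signal a data user would weigh when deciding whether to exclude results from an interpretive analysis.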

  2. Volunteered Geographic Information: Interpretation, Visualisation and Social Computing (VGIscience)

    NASA Astrophysics Data System (ADS)

    Burghardt, Dirk; Nejdl, Wolfgang; Schiewe, Jochen; Sester, Monika

    2018-05-01

    In the past years, Volunteered Geographic Information (VGI) has emerged as a novel form of user-generated content, which involves the active generation of geo-data, for example in citizen science projects or during crisis mapping, as well as the passive collection of data via users' location-enabled mobile devices. In addition, more and more sensors are available that sense our environment in ever greater detail and dynamics. These data can be used for a variety of applications, not only for societal tasks in fields such as environment, health, or transport, but also for the development of commercial products and services. The interpretation, visualisation, and usage of such multi-source data are challenging because of the large heterogeneity, the differences in quality, the high update frequencies, the varying spatial-temporal resolution, subjective characteristics, and low semantic structuring. The German Research Foundation has therefore launched a priority programme, running over the next three to six years, that will support interdisciplinary research projects. This priority programme aims to provide a scientific basis for raising the potential of VGI and sensor data. The research questions, described in more detail in this short paper, span from the extraction of spatial information to visual analysis and knowledge presentation, taking into account the social context while collecting and using VGI.

  3. How system designers think: a study of design thinking in human factors engineering.

    PubMed

    Papantonopoulos, Sotiris

    2004-11-01

    The paper presents a descriptive study of design thinking in human factors engineering. The objective of the study is to analyse the role of interpretation in design thinking and the role of design practice in guiding interpretation. The study involved 10 system designers undertaking the allocation of cognitive functions in three production planning and control task scenarios. Allocation decisions were recorded, and verbal protocols of the design process were collected to elicit the subjects' thought processes. Verbal protocol analysis showed that, from their initial design deliberations onward, subjects treated the design of cognitive task allocation as a problem of applying a selected automation technology. This design strategy stands in contrast to the predominant view of system design, which stipulates that user requirements should be thoroughly analysed before any decisions about technology are made. Theoretical frameworks from design research and ontological design showed that the system design process may be better understood by recognizing the role of design hypotheses in system design, as well as the diverse interactions between interpretation and practice, means and ends, and design practice and the designer's pre-understanding, which together shape the design process. Ways to balance the bias exerted on the design process are discussed.

  4. English Language Learning Strategies Reported by Advanced Language Learners

    ERIC Educational Resources Information Center

    Lee, Juyeon; Heinz, Michael

    2016-01-01

    The purpose of the present study is to investigate effective English language learning strategies (LLSs) employed by successful language learners. The participants in this study were 20 student interpreters enrolled in the graduate school of interpretation and translation in Korea. Data on LLSs were collected through unstructured essay writing, a…

  5. Using Panorama Theater To Teach Middle School Social Studies.

    ERIC Educational Resources Information Center

    Chilcoat, George W.

    1995-01-01

    Describes how use of panorama theater to teach middle school social studies can aid in teaching the academic skills of defining a problem, locating and collecting data, organizing and designing tasks, drawing inferences, creating and building interpretations, revising and editing, and interpreting data. Presents a classroom example of a panorama…

  6. Developing the 18th indicator for interpreting indicators of rangeland health on Northern Great Plains rangelands

    USDA-ARS?s Scientific Manuscript database

    National Resources Inventory (NRI) resource assessment report shows little to no departure on Rangeland Health for most Northern Great Plains Rangelands. This information is supported by Interpreting Indicators of Rangeland Health (IIRH) data collected at local to regional scales. There is however a...

  7. Encountering Student Texts: Interpretive Issues in Reading Student Writing.

    ERIC Educational Resources Information Center

    Lawson, Bruce, Ed.; And Others

    Designed to raise the full range of hermeneutic concerns regarding evaluation of student writing, and to spur further research and discussion, this collection of essays focuses on a reconsideration of the interpretation and evaluation practices of writing teachers. Essays include: "A Reflective Conversation: 'Tempos of Meaning'"…

  8. 76 FR 44462 - Statement of General Policy or Interpretation; Commentary on the Fair Credit Reporting Act

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-26

    ... collection, assembly, and use of consumer report information and provides the framework for the credit... increased the obligations of users of consumer reports, particularly employers. Most significantly, the 1996... transferred authority to issue interpretive guidance under the FCRA to the Consumer Financial Protection...

  9. Instrumental Students' Strategies for Finding Interpretations: Complexity and Individual Variety

    ERIC Educational Resources Information Center

    Hultberg, Cecilia

    2008-01-01

    In this article, a qualitative, collaborative study on two students' preparation of public performances of guitar duos is presented. A cultural-psychological perspective was used, and data were collected in natural settings. Participants' processes of finding interpretations are characterized by complex strategies, based on individual familiarity…

  10. MTpy - Python Tools for Magnetotelluric Data Processing and Analysis

    NASA Astrophysics Data System (ADS)

    Krieger, Lars; Peacock, Jared; Thiel, Stephan; Inverarity, Kent; Kirkby, Alison; Robertson, Kate; Soeffky, Paul; Didana, Yohannes

    2014-05-01

    We present the Python package MTpy, which provides functions for the processing, analysis, and handling of magnetotelluric (MT) data sets. MT is a relatively immature and not widely applied geophysical method in comparison to other techniques such as seismology. As a result, data processing within the academic MT community is not thoroughly standardised and is often based on a loose collection of software adapted to the respective local specifications. We have developed MTpy to overcome the problems that arise from missing standards and to simplify the general handling of MT data. MTpy is written in Python, and the open-source code is freely available from a GitHub repository. The setup follows the modular approach of successful geoscience software packages such as GMT or ObsPy. It contains sub-packages and modules for the various tasks within the standard workflow of MT data processing and interpretation. In order to allow the inclusion of already existing and well-established software, MTpy provides not only pure Python classes and functions but also wrapper command-line scripts to run standalone tools, e.g. modelling and inversion codes. Our aim is to provide a flexible framework that is open for future dynamic extensions. MTpy has the potential to promote the standardisation of processing procedures and at the same time be a versatile supplement for existing algorithms. Here, we introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing, interpretation, and visualisation using MTpy on example data sets collected over different regions of Australia and the USA.

  11. Evaluation of arctic multibeam sonar data quality using nadir crossover error analysis and compilation of a full-resolution data product

    NASA Astrophysics Data System (ADS)

    Flinders, Ashton F.; Mayer, Larry A.; Calder, Brian A.; Armstrong, Andrew A.

    2014-05-01

    We document a new high-resolution multibeam bathymetry compilation for the Canada Basin and Chukchi Borderland in the Arctic Ocean - the United States Arctic Multibeam Compilation (USAMBC Version 1.0). The compilation preserves the highest native resolution of the bathymetric data, allowing for more detailed interpretation of seafloor morphology than has previously been possible. The compilation was created from multibeam bathymetry data available through openly accessible government and academic repositories. Much of the new data was collected during dedicated mapping cruises in support of the United States effort to map extended continental shelf regions beyond the 200 nm Exclusive Economic Zone. Data quality was evaluated using nadir-beam crossover-error statistics, making it possible to assess the precision of multibeam depth soundings collected from a wide range of vessels and sonar systems. Data were compiled into a single high-resolution grid through a vertical stacking method, preserving the highest-quality data source in any specific grid cell. The crossover-error analysis and the method of data compilation can be applied to other multi-source multibeam data sets, and are particularly useful for government agencies targeting extended continental shelf regions with limited hydrographic capabilities. Both the gridded compilation and an easily distributed geospatial PDF map are freely available through the University of New Hampshire's Center for Coastal and Ocean Mapping (ccom.unh.edu/theme/law-sea). The geospatial PDF is a full-resolution, small-file-size product that supports interpretation of Arctic seafloor morphology without the need for specialized gridding/visualization software.
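    The crossover-error idea, differencing depths where independent survey lines overlap and summarising the discrepancies, can be sketched generically. The pairing tolerance and data below are invented; this is not the USAMBC processing chain:

```python
import math
import statistics

def crossover_stats(track_a, track_b, radius=25.0):
    """Summarise depth differences where soundings from two survey
    lines pass within `radius` metres of each other (a crude stand-in
    for true nadir-beam crossover analysis). Brute-force O(n*m)
    pairing -- fine for a sketch, not for full survey grids."""
    diffs = [za - zb
             for xa, ya, za in track_a
             for xb, yb, zb in track_b
             if math.hypot(xa - xb, ya - yb) <= radius]
    return {"n": len(diffs),
            "mean": statistics.mean(diffs),
            "stdev": statistics.pstdev(diffs)}

# Two hypothetical survey lines crossing the same patch of seafloor,
# as (x, y, depth) tuples in metres
track_a = [(0.0, 0.0, 100.0), (50.0, 0.0, 110.0)]
track_b = [(0.0, 5.0, 101.0), (50.0, 5.0, 108.0)]
stats = crossover_stats(track_a, track_b)
```

A nonzero mean difference suggests a systematic bias between vessels or sonar systems, while the standard deviation estimates sounding precision, the quantity the compilation uses to rank data sources.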

  12. Validation of high throughput sequencing and microbial forensics applications

    PubMed Central

    2014-01-01

    High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation or software are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of specific application and the criteria described here comprise a foundation for investigators to establish, validate and implement HTS as a tool in microbial forensics, enhancing public safety and national security. PMID:25101166

  13. Validation of high throughput sequencing and microbial forensics applications.

    PubMed

    Budowle, Bruce; Connell, Nancy D; Bielecka-Oder, Anna; Colwell, Rita R; Corbett, Cindi R; Fletcher, Jacqueline; Forsman, Mats; Kadavy, Dana R; Markotic, Alemka; Morse, Stephen A; Murch, Randall S; Sajantila, Antti; Schmedes, Sarah E; Ternus, Krista L; Turner, Stephen D; Minot, Samuel

    2014-01-01

    High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation or software are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of specific application and the criteria described here comprise a foundation for investigators to establish, validate and implement HTS as a tool in microbial forensics, enhancing public safety and national security.

  14. Earthquake supersite project in the Messina Straits area (EQUAMES)

    NASA Astrophysics Data System (ADS)

    Mattia, Mario; Chiarabba, Claudio; Dell'Acqua, Fabio; Faccenna, Claudio; Lanari, Riccardo; Matteuzzi, Francesco; Neri, Giancarlo; Patanè, Domenico; Polonia, Alina; Prati, Claudio; Tinti, Stefano; Zerbini, Susanna

    2015-04-01

    A new permanent supersite is to be proposed to the GEO GSNL (Geohazard Supersites and National Laboratories) for the Messina Straits area (Italy). The justification for this new supersite lies in the area's geological and geophysical features and in its exposure to strong earthquakes, including in the recent past (1908). The Messina Supersite infrastructure (EQUAMES: EarthQUAkes in the MEssina Straits) will host, and contribute to the collection of, large amounts of data essential for the analysis of seismic hazard and risk in this high-seismic-risk area, including risk from earthquake-related processes such as submarine mass failures and tsunamis. In EQUAMES, data of different types will coexist with models and methods useful for their analysis and interpretation, and with first-level analysis products that can be of interest to different kinds of users. EQUAMES will help all interested scientific and non-scientific parties to find and use data, and will increase inter-institutional cooperation by addressing the following main topics in the Messina Straits area: • investigation of the geological and physical processes leading to earthquake preparation and generation; • analysis of seismic shaking at ground level (expected and observed); • combination of seismic hazard with vulnerability and exposure data for risk estimates; • analysis of tsunami generation, propagation, and coastal inundation deriving from earthquake occurrence, including through landslides due to instability of subaerial and submarine slopes; • the overall risk associated with earthquake activity in the Supersite area, including the different types of cascade effects. Many Italian and international institutions have shown an effective interest in this project, in which a large variety of geophysical and geological in-situ data will be collected and in which the INGV has the leading role with its large infrastructure of permanent seismic, GPS, and geochemical stations. The groups supporting EQUAMES bring together a range of expertise that will allow state-of-the-art analysis and interpretation of the data to be acquired. Finally, the availability of SAR data from different satellites (ERS, COSMO-SkyMed, Sentinel) can be key to important improvements in knowledge of the geodynamics of this area of the Mediterranean Sea.

  15. Incidence of a single subsegmental mismatched perfusion defect in single-photon emission computed tomography and planar ventilation/perfusion scans.

    PubMed

    Stubbs, Matthew; Chan, Kenneth; McMeekin, Helena; Navalkissoor, Shaunak; Wagner, Thomas

    2017-02-01

    This study aims to compare the incidence of ventilation/perfusion (V/Q) scans interpreted as indeterminate for the diagnosis of pulmonary embolism (PE) using single-photon emission computed tomography (SPECT) versus planar scintigraphy, and to consider the effect of variable interpretation of a single subsegmental V/Q mismatch (SSM). A total of 1300 consecutive V/Q scans were retrospectively reviewed. After exclusion and matching for age and sex, 542 SPECT and 589 planar scans were included in the analysis. European Association of Nuclear Medicine guidelines were used to interpret the V/Q scans, initially interpreting SSM as negative. Patients with SSM were followed up for 3 months and any further imaging for PE was collected. Indeterminate scans were significantly fewer in the SPECT than in the planar group on the basis of the initial report (7.7 vs. 12.2%, P<0.05). This held irrespective of classification of SSM as a negative scan (4.6 vs. 12.1%, P<0.0001) or an indeterminate scan (8.3 vs. 12.2%, P<0.05). Of the 21 patients who had SSM, 19 underwent computed tomography pulmonary angiography, and embolism was found in one patient. None of these patients had died at the 3-month follow-up. V/Q SPECT offers greater diagnostic certainty for PE, with a 41% reduction in indeterminate scans compared with planar scintigraphy, irrespective of the clinician's interpretation of SSM as negative or intermediate probability. Patients with SSM would not require further computed tomography pulmonary angiography.
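    The headline comparison (7.7% of 542 SPECT scans vs. 12.2% of 589 planar scans read as indeterminate) can be checked with a standard two-proportion z-test. The counts below are reconstructed from the rounded percentages, so this is an illustrative recomputation, not the paper's own analysis:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference in proportions, pooled SE."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval

# Counts reconstructed from the reported rates: 7.7% of 542 and 12.2% of 589
z, p = two_proportion_z(42, 542, 72, 589)
```

The resulting p-value falls below 0.05, consistent with the significance the abstract reports for the initial-report comparison.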

  16. The Australian Pharmaceutical Benefits Scheme data collection: a practical guide for researchers.

    PubMed

    Mellish, Leigh; Karanges, Emily A; Litchfield, Melisa J; Schaffer, Andrea L; Blanch, Bianca; Daniels, Benjamin J; Segrave, Alicia; Pearson, Sallie-Anne

    2015-11-02

The Pharmaceutical Benefits Scheme (PBS) is Australia's national drug subsidy program. This paper provides a practical guide for researchers using PBS data to examine prescribed medicine use. Excerpts of the PBS data collection are available in a variety of formats. We describe the core components of four publicly available extracts (the Australian Statistics on Medicines, PBS statistics online, the section 85 extract, and the under co-payment extract). We also detail common analytical challenges and key issues regarding the interpretation of utilisation estimates derived from the PBS collection and its various extracts. Research using routinely collected data is increasing internationally. PBS data are a valuable resource for Australian pharmacoepidemiological and pharmaceutical policy research. A detailed knowledge of the PBS, the nuances of data capture, and the extracts available for research purposes is necessary to ensure robust methodology, interpretation, and translation of study findings into policy and practice.

  17. Large-scale feature searches of collections of medical imagery

    NASA Astrophysics Data System (ADS)

    Hedgcock, Marcus W.; Karshat, Walter B.; Levitt, Tod S.; Vosky, D. N.

    1993-09-01

Large-scale feature searches of accumulated collections of medical imagery are required for multiple purposes, including clinical studies, administrative planning, epidemiology, teaching, quality improvement, and research. To perform a feature search of large collections of medical imagery, one can search either text descriptors of the imagery in the collection (usually the interpretation), or (if the imagery is in digital format) the imagery itself. At our institution, text interpretations of medical imagery are all available in our VA Hospital Information System. These are downloaded daily into an off-line computer. The text descriptors of most medical imagery are usually formatted as free text, and so require a user-friendly database search tool to make searches quick and easy for any user to design and execute. We are tailoring such a database search tool (Liveview), developed by one of the authors (Karshat). To further facilitate search construction, we are constructing (from our accumulated interpretation data) a dictionary of medical and radiological terms and synonyms. If the imagery database is digital, the imagery that the search discovers is easily retrieved from the computer archive. We describe our database search user interface, with examples, and compare the efficacy of computer-assisted imagery searches from a clinical text database with manual searches. Our initial work on direct feature searches of digital medical imagery is outlined.

  18. Capture and X-ray diffraction studies of protein microcrystals in a microfluidic trap array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Lyubimov, Artem Y. (Stanford University, Stanford, CA 94305)

A microfluidic platform has been developed for the capture and X-ray analysis of protein microcrystals, affording a means to improve the efficiency of XFEL and synchrotron experiments. X-ray free-electron lasers (XFELs) promise to enable the collection of interpretable diffraction data from samples that are refractory to data collection at synchrotron sources. At present, however, more efficient sample-delivery methods that minimize the consumption of microcrystalline material are needed to allow the application of XFEL sources to a wide range of challenging structural targets of biological importance. Here, a microfluidic chip is presented in which microcrystals can be captured at fixed, addressable points in a trap array from a small volume (<10 µl) of a pre-existing slurry grown off-chip. The device can be mounted on a standard goniostat for conducting diffraction experiments at room temperature without the need for flash-cooling. Proof-of-principle tests with a model system (hen egg-white lysozyme) demonstrated the high efficiency of the microfluidic approach for crystal harvesting, permitting the collection of sufficient data from only 265 single-crystal still images to permit determination and refinement of the structure of the protein. This work shows that microfluidic capture devices can be readily used to facilitate data collection from protein microcrystals grown in traditional laboratory formats, enabling analysis when cryopreservation is problematic or when only small numbers of crystals are available. Such microfluidic capture devices may also be useful for data collection at synchrotron sources.

  19. Exploring patterns enriched in a dataset with contrastive principal component analysis.

    PubMed

    Abid, Abubakar; Zhang, Martin J; Bagaria, Vivek K; Zou, James

    2018-05-30

    Visualization and exploration of high-dimensional data is a ubiquitous challenge across disciplines. Widely used techniques such as principal component analysis (PCA) aim to identify dominant trends in one dataset. However, in many settings we have datasets collected under different conditions, e.g., a treatment and a control experiment, and we are interested in visualizing and exploring patterns that are specific to one dataset. This paper proposes a method, contrastive principal component analysis (cPCA), which identifies low-dimensional structures that are enriched in a dataset relative to comparison data. In a wide variety of experiments, we demonstrate that cPCA with a background dataset enables us to visualize dataset-specific patterns missed by PCA and other standard methods. We further provide a geometric interpretation of cPCA and strong mathematical guarantees. An implementation of cPCA is publicly available, and can be used for exploratory data analysis in many applications where PCA is currently used.
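The core computation described above can be sketched in a few lines: cPCA eigendecomposes the difference between the foreground (target) and background covariance matrices, with a contrast parameter weighting the background. This is a minimal illustrative sketch on synthetic data, not the authors' released implementation; all names and values here are assumptions.

```python
import numpy as np

def cpca(foreground, background, alpha=1.0, n_components=2):
    """Contrastive PCA sketch (hypothetical helper, not the paper's code):
    find directions of high foreground variance and low background variance."""
    # Center each dataset and form its covariance matrix
    fg = foreground - foreground.mean(axis=0)
    bg = background - background.mean(axis=0)
    c_fg = fg.T @ fg / (len(fg) - 1)
    c_bg = bg.T @ bg / (len(bg) - 1)
    # Top eigenvectors of the contrast matrix C_fg - alpha * C_bg
    vals, vecs = np.linalg.eigh(c_fg - alpha * c_bg)
    order = np.argsort(vals)[::-1][:n_components]
    return fg @ vecs[:, order]  # foreground projected onto contrastive PCs

rng = np.random.default_rng(0)
bg = rng.normal(size=(500, 10)) * 3.0            # background: large shared variance
fg = rng.normal(size=(500, 10)); fg[:, 0] += 2.0  # foreground-specific structure
proj = cpca(fg, bg, alpha=2.0)
print(proj.shape)  # (500, 2)
```

In practice the contrast parameter alpha is swept over a range and the resulting projections inspected, since different alphas emphasize different dataset-specific patterns.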

  20. Additional Crime Scenes for Projectile Motion Unit

    NASA Astrophysics Data System (ADS)

    Fullerton, Dan; Bonner, David

    2011-12-01

Building students' ability to transfer physics fundamentals to real-world applications establishes a deeper understanding of underlying concepts while enhancing student interest. Forensic science offers a great opportunity for students to apply physics to highly engaging, real-world contexts. Integrating these opportunities into inquiry-based problem solving in a team environment provides a terrific backdrop for fostering communication, analysis, and critical thinking skills. One such activity, inspired jointly by the museum exhibit "CSI: The Experience"2 and David Bonner's TPT article "Increasing Student Engagement and Enthusiasm: A Projectile Motion Crime Scene,"3 provides students with three different crime scenes, each requiring an analysis of projectile motion. In this lesson, students socially engage in higher-order analysis of two-dimensional projectile motion problems by collecting information from 3-D scale models and collaborating with one another on its interpretation, in addition to the diagramming and mathematical analysis typical of problem solving in physics.

  1. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals.

    PubMed

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A

    2016-03-05

Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.
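As a rough sketch of the pipeline described above (quaternion-style features from four EEG channels, then DT/SVM/KNN classification), the following uses synthetic signal windows. The feature function is a hypothetical stand-in for the authors' QSA, not their method:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)

def quaternion_features(window):
    """Treat 4 EEG channels as quaternion components (w, x, y, z) and
    summarize the window; a hypothetical stand-in for the paper's QSA."""
    w, x, y, z = window  # each row: one channel over time
    magnitude = np.sqrt(w**2 + x**2 + y**2 + z**2)
    return [magnitude.mean(), magnitude.std(),
            np.mean(x * w), np.mean(y * w), np.mean(z * w)]

# Synthetic "left" vs "right" motor-imagery windows (4 channels x 128 samples)
X, y = [], []
for label in (0, 1):
    for _ in range(40):
        win = rng.normal(size=(4, 128)) + label * 0.5
        X.append(quaternion_features(win))
        y.append(label)
X, y = np.array(X), np.array(y)

# Compare the three classifier families named in the abstract
for clf in (DecisionTreeClassifier(), SVC(), KNeighborsClassifier()):
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(type(clf).__name__, round(score, 2))
```

With real BCI recordings, the windows would come from cue-locked epochs rather than random noise, and the feature set would follow the published QSA definition.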

  2. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals

    PubMed Central

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R.; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J.; Ibarra-Manzano, Mario A.

    2016-01-01

Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states. PMID:26959029

  3. Frequency-duration analysis of dissolved-oxygen concentrations in two southwestern Wisconsin streams

    USGS Publications Warehouse

    Greb, Steven R.; Graczyk, David J.

    2007-01-01

Historically, dissolved-oxygen (DO) data have been collected in the same manner as other water-quality constituents, typically at infrequent intervals as a grab sample or an instantaneous meter reading. Recent years have seen an increase in continuous water-quality monitoring with electronic dataloggers. This new technique requires new approaches in the statistical analysis of the continuous record. This paper presents an application of frequency-duration analysis to the continuous DO records of a cold-water and a warm-water stream in rural southwestern Wisconsin. This method offers a quick, concise way to summarize large time-series databases in an easily interpretable manner. Even though the two streams had similar mean DO concentrations, frequency-duration analyses showed distinct differences in their DO-concentration regimes. This type of analysis may also be useful in relating DO concentrations to biological effects and in predicting low-DO occurrences.
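A frequency-duration summary of a continuous record can be sketched as the fraction of the record during which the series meets or exceeds each concentration threshold. The record and thresholds below are hypothetical, and this is only an illustrative simplification of the method, not the authors' exact procedure:

```python
import numpy as np

def exceedance_curve(series, thresholds):
    """Fraction of the record during which the value meets or exceeds
    each threshold (illustrative sketch of frequency-duration analysis)."""
    series = np.asarray(series)
    return {t: float((series >= t).mean()) for t in thresholds}

# Hypothetical hourly DO record (mg/L) with a diel (24-hour) cycle plus noise
rng = np.random.default_rng(7)
hours = np.arange(1000)
do = 8 + 2 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.5, 1000)

curve = exceedance_curve(do, thresholds=[5, 6, 7, 8, 9])
print(curve)  # fraction of time each DO threshold is met or exceeded
```

Plotting these fractions against the thresholds gives the frequency-duration curve; two streams with identical means can still show clearly different curves, as the abstract notes.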

  4. Exploring learners' beliefs about science reading and scientific epistemic beliefs, and their relations with science text understanding

    NASA Astrophysics Data System (ADS)

    Yang, Fang-Ying; Chang, Cheng-Chieh; Chen, Li-Ling; Chen, Yi-Chun

    2016-07-01

The main purpose of this study was to explore learners' beliefs about science reading and scientific epistemic beliefs, and how these beliefs were associated with their understanding of science texts. About 400 10th graders were involved in the development and validation of the Beliefs about Science Reading Inventory (BSRI). To find the effects of reader beliefs and epistemic beliefs, a new group of 65 10th-grade students, whose reader and epistemic beliefs were assessed by the newly developed BSRI and an existing SEB questionnaire, were invited to take part in a science reading task. Students' text understanding in terms of concept gain and text interpretations was collected and analyzed. The correlation analysis found that when students had stronger beliefs about meaning construction based on personal goals and experiences (i.e. transaction beliefs), they produced more thematic and critical interpretations of the content of the article. The regression analysis suggested that students' SEBs could predict concept gain as a result of reading. Moreover, among all the beliefs examined in the study, transaction beliefs stood out as the best predictor of overall science-text understanding.

  5. Subsurface Structure Determination of Geothermal Area in Siogung-ogung, Samosir District, by Using Magnetic Method

    NASA Astrophysics Data System (ADS)

    Tampubolon, Togi; Hutahaean, Juniar; Siregar, Suryani N. J.

    2018-03-01

Subsurface research often uses the geomagnetic method, one of the geophysical methods for measuring magnetic field variations. This research was done to identify the subsurface rock structure and to determine rock types based on susceptibility values in the Siogung-ogung geothermal area, Pangururan, Samosir District. The total magnetic field was measured with a Proton Precession Magnetometer, positions were determined with a Global Positioning System, and the north axis was determined with a geological compass. Data collection was done randomly, with a total of 51 measuring points obtained. Data analysis started with the International Geomagnetic Reference Field (IGRF) correction to obtain the total magnetic field anomaly. The total magnetic anomaly was then analyzed using the Surfer 12 program, and a magnetic anomaly cross section was obtained with the Magdc for Windows program. The magnetic measurements indicated variation in magnetic field strength at each point, with a lowest magnetic intensity value of 41785.67 nanotesla and a highest of 43140.33 nanotesla. From the qualitative interpretation, the magnetic anomaly value ranges from -200.92 to 1154.45, whereas the quantitative interpretation of the model shows the existence of degradation and andesitic rocks, with corresponding susceptibility values.
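The IGRF correction step mentioned above amounts to subtracting a reference field value from each observed total-field reading, leaving the magnetic anomaly attributable to local geology. A minimal sketch with made-up numbers (the reference value and readings are assumptions, not the survey's data):

```python
import numpy as np

# Hypothetical observed total-field readings (nT) at four survey points,
# and an assumed regional IGRF reference value for the area.
observed = np.array([42100.0, 41900.5, 43050.2, 42500.8])
igrf_reference = 42200.0  # nT, assumed for illustration

# The total magnetic field anomaly is the observed field minus the reference
anomaly = observed - igrf_reference
print(anomaly)  # positive values suggest magnetic highs, negative values lows
```

A real workflow would also apply a diurnal-variation correction from a base-station record before gridding and contouring the anomaly.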

  6. IFCN-endorsed practical guidelines for clinical magnetoencephalography (MEG).

    PubMed

    Hari, Riitta; Baillet, Sylvain; Barnes, Gareth; Burgess, Richard; Forss, Nina; Gross, Joachim; Hämäläinen, Matti; Jensen, Ole; Kakigi, Ryusuke; Mauguière, François; Nakasato, Nobukatzu; Puce, Aina; Romani, Gian-Luca; Schnitzler, Alfons; Taulu, Samu

    2018-04-17

Magnetoencephalography (MEG) records weak magnetic fields outside the human head and thereby provides millisecond-accurate information about the neuronal currents supporting human brain function. MEG and electroencephalography (EEG) are closely related complementary methods and should be interpreted together whenever possible. This manuscript covers the basic physical and physiological principles of MEG and discusses the main aspects of state-of-the-art MEG data analysis. We provide guidelines for best practices of patient preparation, stimulus presentation, MEG data collection and analysis, as well as for MEG interpretation in routine clinical examinations. In 2017, about 200 whole-scalp MEG devices were in operation worldwide, many of them located in clinical environments. Yet, the established clinical indications for MEG examinations remain few, mainly restricted to the diagnostics of epilepsy and to preoperative functional evaluation of neurosurgical patients. We are confident that the extensive ongoing basic MEG research indicates potential for the evaluation of neurological and psychiatric syndromes, developmental disorders, and the integrity of cortical brain networks after stroke. Basic and clinical research is, thus, paving the way for new clinical applications to be identified by an increasing number of practitioners of MEG.

  7. Assessment of metal pollution based on multivariate statistical modeling of 'hot spot' sediments from the Black Sea.

    PubMed

    Simeonov, V; Massart, D L; Andreev, G; Tsakovski, S

    2000-11-01

The paper deals with the application of different statistical methods, namely cluster analysis, principal component analysis (PCA), and partial least-squares (PLS) modeling. These approaches are efficient tools for achieving a better understanding of the contamination of two gulf regions of the Black Sea. The objects of the study are a collection of marine sediment samples from the Varna and Bourgas "hot spot" gulf areas. In the present case, the use of cluster analysis and PCA makes it possible to separate three zones of the marine environment with different levels of pollution through interpretation of the sediment analyses (Bourgas gulf, Varna gulf, and the lake buffer zone). Further, the extraction of four latent factors offers a specific interpretation of the possible pollution sources and separates natural from anthropogenic factors, the latter originating from contamination by chemical, oil-refinery, and steel-work enterprises. Finally, PLS modeling gives a better opportunity to predict contaminant concentrations from a tracer element (or elements) as compared with the one-dimensional approach of the baseline models. The results of the study are important not only in the local aspect, as they allow a quick response in finding solutions and decision making, but also in a broader sense, as a useful environmetric methodology.

  8. Policies and practices of beach monitoring in the Great Lakes, USA: a critical review

    USGS Publications Warehouse

    Nevers, Meredith B.; Whitman, Richard L.

    2010-01-01

    Beaches throughout the Great Lakes are monitored for fecal indicator bacteria (typically Escherichia coli) in order to protect the public from potential sewage contamination. Currently, there is no universal standard for sample collection and analysis or results interpretation. Monitoring policies are developed by individual beach management jurisdictions, and applications are highly variable across and within lakes, states, and provinces. Extensive research has demonstrated that sampling decisions for time, depth, number of replicates, frequency of sampling, and laboratory analysis all influence the results outcome, as well as calculations of the mean and interpretation of the results in policy decisions. Additional shortcomings to current monitoring approaches include appropriateness and reliability of currently used indicator bacteria and the overall goal of these monitoring programs. Current research is attempting to circumvent these complex issues by developing new tools and methods for beach monitoring. In this review, we highlight the variety of sampling routines used across the Great Lakes and the extensive body of research that challenges comparisons among beaches. We also assess the future of Great Lakes monitoring and the advantages and disadvantages of establishing standards that are evenly applied across all beaches.

  9. Functionally relevant protein motions: Extracting basin-specific collective coordinates from molecular dynamics trajectories

    NASA Astrophysics Data System (ADS)

    Pan, Patricia Wang; Dickson, Russell J.; Gordon, Heather L.; Rothstein, Stuart M.; Tanaka, Shigenori

    2005-01-01

Functionally relevant motion of proteins has been associated with a number of atoms moving in a concerted fashion along so-called "collective coordinates." We present an approach to extract collective coordinates from conformations obtained from molecular dynamics simulations. The power of this technique for differentiating local structural fluctuations between classes of conformers obtained by clustering is illustrated by analyzing nanosecond-long trajectories for the response regulator protein Spo0F of Bacillus subtilis, generated both in vacuo and using an implicit-solvent representation. Conformational clustering is performed using automated histogram filtering of the inter-Cα distances. Orthogonal (varimax) rotation of the vectors obtained by principal component analysis of these interresidue distances for the members of individual clusters is key to the interpretation of collective coordinates dominating each conformational class. The rotated loadings plots isolate significant variation in interresidue distances, and these are associated with entire mobile secondary structure elements. From this we infer concerted motions of these structural elements. For the Spo0F simulations employing an implicit-solvent representation, collective coordinates obtained in this fashion are consistent with the location of the protein's known active sites and experimentally determined mobile regions.
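The distance-based analysis described above can be sketched as follows: each trajectory frame is flattened into its vector of pairwise inter-Cα distances, and principal components of those vectors serve as candidate collective coordinates. The trajectory here is synthetic, and the clustering and varimax-rotation steps of the full method are omitted:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
# Hypothetical trajectory: 200 frames of 20 C-alpha positions in 3-D
frames = rng.normal(size=(200, 20, 3))
# Give residues 5-9 a slow concerted drift, mimicking a mobile element
frames[:, 5:10] += np.sin(np.linspace(0, 6, 200))[:, None, None]

# Flatten each frame into its vector of pairwise inter-C-alpha distances
i, j = np.triu_indices(20, k=1)
distances = np.linalg.norm(frames[:, i] - frames[:, j], axis=2)  # (200, 190)

# Principal components of the distance vectors = candidate collective coordinates
pca = PCA(n_components=3).fit(distances)
print(pca.explained_variance_ratio_)
```

In the paper's workflow, a varimax rotation of the leading loadings would then concentrate each component on a small set of interresidue distances, making the mobile secondary-structure elements easier to identify.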

  10. Geological Interpretation of Bathymetric and Backscatter Imagery of the Sea Floor off Eastern Cape Cod, Massachusetts

    USGS Publications Warehouse

    Poppe, Larry J.; Paskevich, Valerie F.; Butman, Bradford; Ackerman, Seth D.; Danforth, William W.; Foster, Dave S.; Blackwood, Dann S.

    2006-01-01

    The imagery, interpretive data layers, and data presented herein were derived from multibeam echo-sounder data collected off Eastern Cape Cod, Massachusetts, and from the stations occupied to verify these acoustic data. The basic data layers show sea-floor topography, sun-illuminated shaded relief, and backscatter intensity; interpretive layers show the distributions of surficial sediment and sedimentary environments. Presented verification data include new and historical sediment grain-size analyses and a gallery of still photographs of the seabed. The multibeam data, which cover a narrow band of the sea floor extending from Provincetown around the northern tip of Cape Cod and south southeastward to off Monomoy Island, were collected during transits between concurrent mapping projects in the Stellwagen Bank National Marine Sanctuary (Valentine and others, 2001; Butman and others, 2004; and Valentine, 2005) and Great South Channel (Valentine and others, 2003a, b, c, d). Although originally collected to maximize the use of time aboard ship, these data provide a fundamental framework for research and management activities in this part of the Gulf of Maine (Noji and others, 2004), show the composition and terrain of the seabed, and provide information on sediment transport and benthic habitat. These data and interpretations also support ongoing modeling studies of the lower Cape's aquifer system (Masterson, 2004) and of erosional hotspots along the Cape Cod National Seashore (List and others, 2006).

  11. Sample collection of ash and burned soils from the October 2007 southern California Wildfires

    USGS Publications Warehouse

    Hoefen, Todd M.; Kokaly, Raymond F.; Martin, Deborah A.; Rochester, Carlton J.; Plumlee, Geoffrey S.; Mendez, Greg; Reichard, Eric G.; Fisher, Robert N.

    2009-01-01

Between November 2 and 9, 2007, scientists from the U.S. Geological Survey (USGS) collected samples of ash and burned soils from 28 sites in six areas burned as a result of the Southern California wildfires of October 2007, including the Harris, Witch, Santiago, Ammo, Canyon, and Grass Valley Fires. The primary goal of this sampling and analysis effort was to understand how differences in ash and burned-soil composition relate to vegetation type, underlying bedrock geology, burn intensity, and residential versus wildland settings. Sampling sites were chosen with the input of local experts from the USGS Water Resources and Biological Resources Disciplines to help understand possible effects of the fires on water supplies, ecosystems, and endangered species. The sampling was also carried out in conjunction with detailed field analysis of the spectral reflectance characteristics of the ash, so that chemical and mineralogical characteristics of the field samples could be used to help interpret data collected as part of an airborne, hyperspectral remote-sensing survey of several of the burned areas in mid-to-late November 2007. This report presents an overview of the field sampling methodologies used to collect the samples, includes representative photos of the sites sampled, and summarizes important characteristics of each of the collection sites. In this report we use the term “ash” to refer collectively to white mineral ash, which results from full combustion of vegetation, and black charred organic matter from partial combustion of vegetation or other materials. These materials were found intermingled as a deposited residue on the soil surface following the Southern California fires of 2007.

  12. A history of forensic anthropology.

    PubMed

    Ubelaker, Douglas H

    2018-04-01

    Forensic anthropology represents a dynamic and rapidly evolving complex discipline within anthropology and forensic science. Academic roots extend back to early European anatomists but development coalesced in the Americas through high-profile court testimony, assemblage of documented collections and focused research. Formation of the anthropology section of the American Academy of Forensic Sciences in 1972, the American Board of Forensic Anthropology in 1977/1978 and other organizational advances provided important stimuli for progress. While early pioneers concentrated on analysis of skeletonized human remains, applications today have expanded to include complex methods of search and recovery, the biomechanics of trauma interpretation, isotopic analysis related to diet and region of origin, age estimation of the living and issues related to humanitarian and human rights investigations. © 2018 Wiley Periodicals, Inc.

  13. "What About the Next Generation That's Coming?": The Recontextualization of Mothering Post-Refugee Resettlement.

    PubMed

    Hoffman, Sarah J; Robertson, Cheryl L; Tierney, Jessica Dockter

The purpose of this analysis was to explore the recontextualization of mothering in Karen refugees from Burma. We collected ethnographic data over an 11-month period with a cohort of 12 Karen women postresettlement. Using Spradley's method and tools of critical discourse analysis, we interpreted the migration narratives of the women, in particular the experiences they shared as mothers. These narratives were grounded in the space of cultural difference; thus, we engaged hybridity as a theoretical frame. Findings reflect the negotiation of mothering practices within the norms, structures, and policies of the country of resettlement. We identified the spaces of transformation a woman constructed to usher in change while sustaining a connection between herself, her culture, and her children.

  14. Self-perceived professional identity of pharmacy educators in South Africa.

    PubMed

    Burton, Sue; Boschmans, Shirley-Anne; Hoelson, Chris

    2013-12-16

    Objective. To identify, describe, and analyze the self-perceived professional identities of pharmacy educators within the South African context. Methods. Narrative interviews were conducted, recorded, and transcribed. Thematic analysis and interpretation of the transcripts were conducted using qualitative data analysis software. Results. Multiplicities of self-perceived professional identities were identified. All of these were multi-faceted and could be situated on a continuum between pharmacist identity on one end and academic identity on the other. In addition, 6 key determinants were recognized as underpinning the participants' self-perception of their professional identity. Conclusion. This study afforded a better understanding of who pharmacy educators in South Africa are as professionals. Moreover, the findings contribute to an international, collective understanding of the professional identity of pharmacy educators.

  15. Comment on "Effect of coal-fired power generation on visibility in a nearby National Park (Terhorst and Berkman, 2010)"

    NASA Astrophysics Data System (ADS)

    White, W. H.; Farber, R. J.; Malm, W. C.; Nuttall, M.; Pitchford, M. L.; Schichtel, B. A.

    2012-08-01

Few electricity-generating stations received more environmental scrutiny during the last quarter of the twentieth century than did the Mohave Power Project (MPP), a coal-fired facility near Grand Canyon National Park. Terhorst and Berkman (2010) examine regional aerosol monitoring data collected before and after the plant's 2006 retirement for retrospective evidence of MPP's impact on visibility in the Park. The authors' technical analysis is thoughtfully conceived and executed, but is misleadingly presented as discrediting previous studies and their interpretation by regulators. In reality, the Terhorst-Berkman analysis validates a consensus on MPP's visibility impact that was established years before its closure, in a collaborative assessment undertaken jointly by Federal regulators and MPP's owners.

  16. Study and interpretation of chemical composition of rainwater in selected urban and rural locations in India using multivariate analysis

    NASA Astrophysics Data System (ADS)

    Chakraborty, Bidisha; Gupta, Abhik

    2018-04-01

Rainwater is an important untapped resource for water managers: it can be collected and used directly for many purposes and simultaneously diverted to the ground to recharge depleting aquifers. Rainwater is among the purest forms of water until it is contaminated by atmospheric pollution. Evaluation of rainwater quality is also essential for non-potable applications and for matching quality to specific uses. Rainwater quality analysis is, therefore, carried out to understand the problems of rainwater contamination with various pollutants. Rainwater samples were collected from the pre-monsoon season of March 2010 to the post-monsoon season of October 2013 from seven sampling sites, namely Irongmara, Badarpur, Bongaigaon, Dolaigaon, BGR Township, Kolkata and Kharagpur, which characterise typical suburban, urban and industrialised locations. A total of 943 samples, collected in PVC bottles during this period with utmost care in sampling and storage, were analysed for heavy metals. Results for pH, EC, Pb, Cd, Ni, Zn, Cr and Co are reported in this study. The highest concentrations of elements were observed at the beginning of the rainfall season, when the large amounts of dust accumulated in the atmosphere are scavenged by rain. The pH values of the rainwater samples were within the World Health Organization (WHO) standard for drinking water. Multivariate statistical analysis, especially varimax rotation, was applied to bring into focus the hidden yet important variables that influence rainwater quality. It is also observed that rainwater contamination may not be restricted to industrial areas alone; vehicular emissions may also contribute significantly in certain areas.
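Varimax-rotated factor analysis of a samples-by-variables matrix can be sketched with scikit-learn (the `rotation="varimax"` option requires scikit-learn ≥ 0.24); the data below are synthetic stand-ins for the rainwater measurements, not the study's dataset:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(9)
# Hypothetical rainwater matrix: 100 samples x 6 variables
# (e.g. pH, EC, and four metal concentrations), driven by 2 latent sources
latent = rng.normal(size=(100, 2))
X = latent @ rng.normal(size=(2, 6)) + rng.normal(0, 0.3, size=(100, 6))

# Varimax rotation concentrates each variable's loading on one factor,
# making the factors easier to label as distinct pollution sources
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
print(np.round(fa.components_, 2))  # rotated loadings (factors x variables)
```

Variables that load heavily on the same rotated factor are then interpreted together, e.g. as a crustal-dust versus a vehicular-emission source.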

  17. Analysis of cannabis in oral fluid specimens by GC-MS with automatic SPE.

    PubMed

    Choi, Hyeyoung; Baeck, Seungkyung; Kim, Eunmi; Lee, Sooyeun; Jang, Moonhee; Lee, Juseon; Choi, Hwakyung; Chung, Heesun

    2009-12-01

Methamphetamine (MA) is the most commonly abused drug in Korea, followed by cannabis. Traditionally, MA analysis is carried out on both urine and hair samples, and cannabis analysis on urine samples only. Despite the fact that oral fluid has become increasingly popular as an alternative specimen in the fields of driving under the influence of drugs (DUID) and workplace drug testing, its application has not been expanded to drug analysis in Korea. Oral fluid is easy to collect and handle and can provide an indication of recent drug abuse. In this study, we present an analytical method using GC-MS to determine tetrahydrocannabinol (THC) and its main metabolite 11-nor-delta9-tetrahydrocannabinol-9-carboxylic acid (THC-COOH) in oral fluid. The validated method was applied to oral fluid samples collected from drug abuse suspects, and the results were compared with those in urine. The stability of THC and THC-COOH in oral fluid stored in different containers was also investigated. Oral fluid specimens from 12 drug abuse suspects, submitted by the police, were collected by direct expectoration. The samples were screened with a microplate ELISA. For confirmation, they were extracted using automated SPE with a mixed-mode cation-exchange cartridge, derivatized, and analyzed by GC-MS using selected ion monitoring (SIM). The concentrations of THC and THC-COOH in oral fluid showed large variation, and the results from oral fluid and urine samples from cannabis abusers did not show any correlation. Thus, detailed information about the time interval between drug use and sample collection is needed to interpret oral fluid results properly. In addition, further investigation of the detection time window of THC and THC-COOH in oral fluid is required before oral fluid can substitute for urine in drug testing.

  18. Collection and analysis of peritoneal fluid from healthy llamas and alpacas.

    PubMed

    Cebra, Christopher K; Tornquist, Susan J; Reed, Shannon K

    2008-05-01

    To describe a technique for abdominocentesis in camelids and report peritoneal fluid biochemical and cytologic findings from healthy llamas and alpacas. Prospective study. Animals: 17 adult llamas and 5 adult alpacas. Right paracostal abdominocentesis was performed. Peritoneal fluid was collected by gravity flow into tubes containing potassium-EDTA for cell count and cytologic evaluation and lithium heparin for biochemical analysis. Blood samples were collected via jugular venipuncture into heparinized tubes at the same time. Cytologic components were quantified. Fluid pH and concentrations of total carbon dioxide, sodium, potassium, chloride, lactate, and glucose were compared between peritoneal fluid and venous blood. All but 3 camelids had peritoneal fluid cell counts of < 3,000 nucleated cells/microL, with < 2,000 neutrophils/microL and < 1,040 large mononuclear cells/microL. All but 1 had peritoneal fluid protein concentrations of ≥ 2.5 g/dL. Peritoneal fluid of camelids generally contained slightly lower concentrations of glucose, lactate, and sodium than venous blood, and roughly equal concentrations of potassium and chloride. Peritoneal fluid was collected safely from healthy camelids. Compared with blood, peritoneal fluid usually had a low cell count and protein concentration, but some individuals had higher values. Electrolyte concentrations resembled those found in blood. High cell counts and protein concentrations found in peritoneal fluid of some healthy camelids may overlap with values found in diseased camelids, complicating interpretation of peritoneal fluid values.

  19. Roadway system assessment using bluetooth-based automatic vehicle identification travel time data.

    DOT National Transportation Integrated Search

    2012-12-01

    This monograph is an exposition of several practice-ready methodologies for automatic vehicle identification (AVI) data collection systems. This includes considerations in the physical setup of the collection system as well as the interpretation of...

  20. Analysis of PNGase F-resistant N-glycopeptides using SugarQb for Proteome Discoverer 2.1 reveals cryptic substrate specificities.

    PubMed

    Stadlmann, Johannes; Hoi, David M; Taubenschmid, Jasmin; Mechtler, Karl; Penninger, Josef M

    2018-05-18

    SugarQb (www.imba.oeaw.ac.at/sugarqb) is a freely available collection of computational tools for the automated identification of intact glycopeptides from high-resolution HCD MS/MS data-sets in the Proteome Discoverer environment. We report the migration of SugarQb to the latest, free version of Proteome Discoverer 2.1, and apply it to the analysis of PNGase F-resistant N-glycopeptides from mouse embryonic stem cells. The analysis of intact glycopeptides highlights unexpected technical limitations to PNGase F-dependent glycoproteomic workflows at the proteome level, and warrants a critical re-interpretation of seminal data-sets in the context of N-glycosylation-site prediction. This article is protected by copyright. All rights reserved.

  1. INSPIRE and SPIRES Log File Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Cole; /Wheaton Coll. /SLAC

    2012-08-31

    SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts were developed to collect and interpret the data contained in the log files. Common search patterns and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought, as well as the expected trend of user transition to INSPIRE.
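
    The report's log-parsing scripts are not reproduced here. As an illustrative sketch only (the actual SPIRES/INSPIRE log format is an assumption, modeled as Apache-style access-log lines), a script of the kind described might tally the most common successful search requests:

```python
import re
from collections import Counter

# Assumed Apache-style access log: host ident user [time] "request" status bytes
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def top_search_paths(lines, n=3):
    """Count the most frequently requested paths among HTTP 200 responses."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group("status") == "200":
            counts[m.group("path")] += 1
    return counts.most_common(n)

sample = [
    '1.2.3.4 - - [31/Aug/2012:10:00:00 +0000] "GET /search?p=higgs HTTP/1.1" 200 512',
    '1.2.3.4 - - [31/Aug/2012:10:00:05 +0000] "GET /search?p=susy HTTP/1.1" 200 498',
    '5.6.7.8 - - [31/Aug/2012:10:01:00 +0000] "GET /search?p=higgs HTTP/1.1" 200 512',
    '5.6.7.8 - - [31/Aug/2012:10:02:00 +0000] "GET /missing HTTP/1.1" 404 90',
]
print(top_search_paths(sample))
```

    A frustration detector could build on the same parse, e.g. flagging rapid repeated reformulations of near-identical queries from one host.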

  2. Data Model Performance in Data Warehousing

    NASA Astrophysics Data System (ADS)

    Rorimpandey, G. C.; Sangkop, F. I.; Rantung, V. P.; Zwart, J. P.; Liando, O. E. S.; Mewengkang, A.

    2018-02-01

    Data warehouses have become increasingly important in organizations that have large amounts of data. A data warehouse is not a product but part of a solution for the decision support system in those organizations. The data model is the starting point for designing and developing data warehouse architectures; it therefore needs stable interfaces and must remain consistent over a long period of time. The aim of this research is to determine which data model in data warehousing has the best performance. The research method is descriptive analysis, which has three main tasks: data collection and organization, analysis of data, and interpretation of data. The results, examined with statistical analysis, indicate that there is no statistically significant difference among the data models used in data warehousing. Organizations can therefore utilize any of the four proposed data models when designing and developing a data warehouse.

  3. Student-Centered Literacy Instruction: An Examination of an Elementary Teacher's Experience

    ERIC Educational Resources Information Center

    Wiezorek, Carolyn Marie

    2012-01-01

    In this qualitative study, I examined and interpreted the literacy instruction of a fourth grade instructor who identified herself as a student-centered teacher. I sought to understand and interpret the beliefs and attitudes of my participant, Julie. Through seven unstructured interviews and five observations, I collected, and simultaneously…

  4. Making Sense of Distributed Leadership: How Secondary School Educators Look at Job Redesign

    ERIC Educational Resources Information Center

    Louis, Karen Seashore; Mayrowetz, David; Murphy, Joseph; Smylie, Mark

    2013-01-01

    This paper examines how teachers and administrators who were involved in a multi-year effort to engage in distributed leadership interpreted their experiences. We lay out and apply an argument for using an interpretive perspective to study distributed leadership. Collective sensemaking around distributed leadership is illustrated by an in-depth…

  5. Comprehensive Interpretive Plans: A Framework of Questions

    ERIC Educational Resources Information Center

    Adams, Marianna; Koke, Judy

    2008-01-01

    As explored elsewhere in this publication, the purpose of a Comprehensive or Institution-wide Interpretive Plan (CIP) is to define or articulate the intellectual framework that connects the mission of an organization and its collections with the needs and interests of its audiences. In so doing, it should follow that this plan, shaped by the…

  6. Interpretacion y Biomecanica. Hoja de consejos de PEPNet (Interpreting and Biomechanics. PEPNet Tipsheet)

    ERIC Educational Resources Information Center

    DeGroote, Bill; Morrison, Carolyn

    2010-01-01

    This publication, written in Spanish, describes cumulative trauma disorder (CTD), which refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may…

  7. The Treasures of Plato's "Phaedrus": A Creative Interpretation for Teaching and Learning in Modern Day.

    ERIC Educational Resources Information Center

    Brandenburg, Maryanne

    This paper reflects upon Plato's "Phaedrus" from a background in education and experience teaching written business communications. The interpretation and development presented are guided by the Platonic method of collection and division, which is introduced in "Phaedrus." The paper begins with an evaluative overview, followed…

  8. 30 CFR 251.12 - Submission, inspection, and selection of geophysical data and information collected under a...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... information, and interpreted geophysical information including, but not limited to, shallow and deep subbottom...) You must notify the Regional Director, in writing, when you complete the initial processing and interpretation of any geophysical data and information. Initial processing is the stage of processing where the...

  9. Phase II Interim Report -- Assessment of Hydrocarbon Seepage Detection Methods on the Fort Peck Reservation, Northeast Montana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monson, Lawrence M.

    2002-04-24

    The following work was performed: (1) collected reconnaissance micro-magnetic data and background field data for Area 1, (2) identified and collected soil sample data in three anomalous regions of Area 1, (3) sampled soils in Northwest Poplar Oil Field, (4) graphed, mapped, and interpreted all data areas listed above, (5) registered for the AAPG Penrose Conference on Hydrocarbon Seepage Mechanisms and Migration (postponed from 9/16/01 until 4/7/02 in Vancouver, B.C.). Results include the identification and confirmation of an oil and gas prospect in the northwest part of Area 1 and the verification of a potential shallow gas prospect in the West Poplar Area. Correlation of hydrocarbon micro-seepage to TM tonal anomalies needs further data analysis.

  10. BRIEF REPORT: Beyond Clinical Experience: Features of Data Collection and Interpretation That Contribute to Diagnostic Accuracy

    PubMed Central

    Nendaz, Mathieu R; Gut, Anne M; Perrier, Arnaud; Louis-Simonet, Martine; Blondon-Choa, Katherine; Herrmann, François R; Junod, Alain F; Vu, Nu V

    2006-01-01

    BACKGROUND Clinical experience, features of the data collection process, or both affect diagnostic accuracy, but their respective roles are unclear. OBJECTIVE, DESIGN Prospective, observational study to determine the respective contributions of clinical experience and data collection features to diagnostic accuracy. METHODS Six internists, 6 second-year internal medicine residents, and 6 senior medical students worked up the same 7 cases with a standardized patient. Each encounter was audiotaped and immediately assessed by the subjects, who indicated the reasons underlying their data collection. We analyzed the encounters according to diagnostic accuracy, information collected, organ systems explored, diagnoses evaluated, and final decisions made, and we determined predictors of diagnostic accuracy by logistic regression models. RESULTS Several features significantly predicted diagnostic accuracy after correction for clinical experience: early exploration of the correct diagnosis (odds ratio [OR] 24.35) or of relevant diagnostic hypotheses (OR 2.22) to frame clinical data collection, a larger number of diagnostic hypotheses evaluated (OR 1.08), and collection of relevant clinical data (OR 1.19). CONCLUSION Some features of data collection and interpretation are related to diagnostic accuracy beyond clinical experience and should be explicitly included in clinical training and modeled by clinical teachers. Thoroughness in data collection should not be considered a privileged way to diagnostic success. PMID:17105525
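
    The odds ratios above come from multivariable logistic regression models fitted to the encounter data. As a minimal, self-contained illustration of the quantity being reported, the sketch below computes an unadjusted odds ratio with a 95% confidence interval (log-OR normal approximation) from a 2×2 table; the counts are invented for the example, not the study's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio for a 2x2 table, e.g.:
         a = correct diagnosis & early exploration of the correct hypothesis
         b = correct diagnosis & no early exploration
         c = incorrect diagnosis & early exploration
         d = incorrect diagnosis & no early exploration
    Returns (OR, 95% CI lower, 95% CI upper)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - 1.96 * se)
    upper = math.exp(math.log(or_) + 1.96 * se)
    return or_, lower, upper

# Hypothetical counts, for illustration only
print(odds_ratio(30, 10, 5, 15))
```

    A multivariable model would instead exponentiate each fitted coefficient to obtain adjusted ORs, which is how effects "after correction for clinical experience" are reported.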

  11. Thresholds of arsenic toxicity to Eisenia fetida in field-collected agricultural soils exposed to copper mining activities in Chile.

    PubMed

    Bustos, Víctor; Mondaca, Pedro; Verdejo, José; Sauvé, Sébastien; Gaete, Hernán; Celis-Diez, Juan L; Neaman, Alexander

    2015-12-01

    Several previous studies highlighted the importance of using field-collected soils-and not artificially-contaminated soils-for ecotoxicity tests. However, the use of field-collected soils presents several difficulties for interpretation of results, due to the presence of various contaminants and unavoidable differences in the physicochemical properties of the tested soils. The objective of this study was to estimate thresholds of metal toxicity in topsoils of 24 agricultural areas historically contaminated by mining activities in Chile. We performed standardized earthworm reproduction tests (OECD 222 and ISO 11268-2) with Eisenia fetida. Total soil concentrations of Cu, As, Zn, and Pb were in the ranges of 82-1295 mg kg(-1), 7-41 mg kg(-1), 86-345 mg kg(-1), and 25-97 mg kg(-1), respectively. In order to differentiate between the effects of different metals, we used regression analysis between soil metal concentrations and earthworm responses, as well as between metal concentrations in earthworm tissues and earthworm responses. Based on regression analysis, we concluded that As was a metal of prime concern for Eisenia fetida in soils affected by Cu mining activities, while Cu exhibited a secondary effect. In contrast, the effects of Zn and Pb were not significant. Soil electrical conductivity was another significant contributor to reproduction toxicity in the studied soils, forcing its integration in the interpretation of the results. By using soils with electrical conductivity ≤ 0.29 dS m(-1) (which corresponds to EC50 of salt toxicity to Eisenia fetida), it was possible to isolate the effect of soil salinity on earthworm reproduction. Despite the confounding effects of Cu, it was possible to determine EC10, EC25 and EC50 values for total soil As at 8 mg kg(-1), 14 mg kg(-1) and 22 mg kg(-1), respectively, for the response of the cocoon production. However, it was not possible to determine these threshold values for juvenile production. 
Likewise, we were able to determine EC10, EC25 and EC50 values of earthworm tissue As of 38 mg kg(-1), 47 mg kg(-1), and 57 mg kg(-1), respectively, for the response of the cocoon production. Finally, we determined the no-observed-effect concentration of tissue As in E. fetida of 24 mg kg(-1). Thus, the earthworm reproduction test is applicable for assessment of metal toxicity in field-collected soils with low electrical conductivity, while it might have limited applicability in soils with high electrical conductivity, because salinity-induced toxicity will hinder the interpretation of the results. Copyright © 2015 Elsevier Inc. All rights reserved.
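
    The ECx thresholds reported above are derived from fitted concentration-response curves. As a sketch only (assuming a two-parameter log-logistic model, which the abstract does not specify, and an invented slope), any ECx follows from the fitted EC50 and slope:

```python
def ecx(ec50, slope, p):
    """Effect concentration ECx under a two-parameter log-logistic
    concentration-response model y(c) = y0 / (1 + (c/EC50)**slope):
    returns the concentration reducing the response by fraction p
    (p=0.10 -> EC10, p=0.25 -> EC25, p=0.50 -> EC50).
    Derivation: y0*(1-p) = y0/(1+(c/EC50)**slope)
                => (c/EC50)**slope = p/(1-p)."""
    return ec50 * (p / (1 - p)) ** (1 / slope)

# EC50 = 22 mg/kg total soil As (from the study); the slope of 2.4 is a
# hypothetical value for illustration only.
for p in (0.10, 0.25, 0.50):
    print(f"EC{int(p * 100)} = {ecx(22.0, 2.4, p):.1f} mg/kg")
```

    The printed EC10 and EC25 depend on the hypothetical slope; only the EC50 is anchored to the study's value.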

  12. Data processing, multi-omic pathway mapping, and metabolite activity analysis using XCMS Online

    PubMed Central

    Forsberg, Erica M; Huan, Tao; Rinehart, Duane; Benton, H Paul; Warth, Benedikt; Hilmers, Brian; Siuzdak, Gary

    2018-01-01

    Systems biology is the study of complex living organisms, and as such, analysis on a systems-wide scale involves the collection of information-dense data sets that are representative of an entire phenotype. To uncover dynamic biological mechanisms, bioinformatics tools have become essential to facilitating data interpretation in large-scale analyses. Global metabolomics is one such method for performing systems biology, as metabolites represent the downstream functional products of ongoing biological processes. We have developed XCMS Online, a platform that enables online metabolomics data processing and interpretation. A systems biology workflow recently implemented within XCMS Online enables rapid metabolic pathway mapping using raw metabolomics data for investigating dysregulated metabolic processes. In addition, this platform supports integration of multi-omic (such as genomic and proteomic) data to garner further systems-wide mechanistic insight. Here, we provide an in-depth procedure showing how to effectively navigate and use the systems biology workflow within XCMS Online without a priori knowledge of the platform, including uploading liquid chromatography (LC)–mass spectrometry (MS) data from metabolite-extracted biological samples, defining the job parameters to identify features, correcting for retention time deviations, conducting statistical analysis of features between sample classes and performing predictive metabolic pathway analysis. Additional multi-omics data can be uploaded and overlaid with previously identified pathways to enhance systems-wide analysis of the observed dysregulations. We also describe unique visualization tools to assist in elucidation of statistically significant dysregulated metabolic pathways. Parameter input takes 5–10 min, depending on user experience; data processing typically takes 1–3 h, and data analysis takes ~30 min. PMID:29494574

  13. Automated detection of radioisotopes from an aircraft platform by pattern recognition analysis of gamma-ray spectra.

    PubMed

    Dess, Brian W; Cardarelli, John; Thomas, Mark J; Stapleton, Jeff; Kroutil, Robert T; Miller, David; Curry, Timothy; Small, Gary W

    2018-03-08

    A generalized methodology was developed for automating the detection of radioisotopes from gamma-ray spectra collected from an aircraft platform using sodium-iodide detectors. Employing data provided by the U.S. Environmental Protection Agency Airborne Spectral Photometric Environmental Collection Technology (ASPECT) program, multivariate classification models based on nonparametric linear discriminant analysis were developed for application to spectra that were preprocessed through a combination of altitude-based scaling and digital filtering. Training sets of spectra for use in building classification models were assembled from a combination of background spectra collected in the field and synthesized spectra obtained by superimposing laboratory-collected spectra of target radioisotopes onto field backgrounds. This approach eliminated the need for field experimentation with radioactive sources for use in building classification models. Through a bi-Gaussian modeling procedure, the discriminant scores that served as the outputs from the classification models were related to associated confidence levels. This provided an easily interpreted result regarding the presence or absence of the signature of a specific radioisotope in each collected spectrum. Through the use of this approach, classifiers were built for cesium-137 (137Cs) and cobalt-60 (60Co), two radioisotopes that are of interest in airborne radiological monitoring applications. The optimized classifiers were tested with field data collected from a set of six geographically diverse sites, three of which contained either 137Cs, 60Co, or both. When the optimized classification models were applied, the overall percentages of correct classifications for spectra collected at these sites were 99.9 and 97.9% for the 60Co and 137Cs classifiers, respectively. Copyright © 2018 Elsevier Ltd. All rights reserved.
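
    The classifiers described are nonparametric discriminant models built on preprocessed many-channel spectra. As a toy sketch of the underlying idea only (a two-class Fisher linear discriminant on invented 2-D features standing in for filtered spectral channels; this is not the ASPECT implementation):

```python
# Two-class Fisher linear discriminant: project onto w = Sw^-1 (m1 - m0)
# and threshold at the midpoint of the projected class means.
def mean(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def scatter(vectors, m):
    """2x2 within-class scatter matrix."""
    s = [[0.0, 0.0], [0.0, 0.0]]
    for v in vectors:
        d = [v[0] - m[0], v[1] - m[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

def fisher_direction(bg, tg):
    m0, m1 = mean(bg), mean(tg)
    s0, s1 = scatter(bg, m0), scatter(tg, m1)
    sw = [[s0[i][j] + s1[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[ sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det,  sw[0][0] / det]]
    diff = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [inv[0][0] * diff[0] + inv[0][1] * diff[1],
         inv[1][0] * diff[0] + inv[1][1] * diff[1]]
    t = 0.5 * (w[0] * (m0[0] + m1[0]) + w[1] * (m0[1] + m1[1]))
    return w, t

def classify(x, w, t):
    return int(w[0] * x[0] + w[1] * x[1] > t)  # 1 = target signature present

# Invented features: field backgrounds vs. backgrounds with target superimposed
background = [(1.0, 1.0), (1.2, 0.8), (0.8, 1.1), (1.1, 0.9)]
target     = [(3.0, 3.2), (2.8, 3.0), (3.1, 2.9), (2.9, 3.1)]
w, t = fisher_direction(background, target)
print([classify(x, w, t) for x in background + target])
```

    In the paper's scheme, the discriminant score itself (not just the sign) is kept and mapped to a confidence level via bi-Gaussian modeling of the score distributions.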

  14. A unifying framework for marginalized random intercept models of correlated binary outcomes

    PubMed Central

    Swihart, Bruce J.; Caffo, Brian S.; Crainiceanu, Ciprian M.

    2013-01-01

    We demonstrate that many current approaches for marginal modeling of correlated binary outcomes produce likelihoods that are equivalent to the copula-based models herein. These general copula models of underlying latent threshold random variables yield likelihood-based models for marginal fixed effects estimation and interpretation in the analysis of correlated binary data with exchangeable correlation structures. Moreover, we propose a nomenclature and set of model relationships that substantially elucidates the complex area of marginalized random intercept models for binary data. A diverse collection of didactic mathematical and numerical examples are given to illustrate concepts. PMID:25342871
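
    One well-known special case of the marginal/conditional correspondence catalogued in this framework, stated for the probit link where the marginalization has a closed form (a textbook result, not a formula quoted from the paper):

```latex
% Cluster-specific (conditional) probit model with a normal random intercept:
P(Y_{ij} = 1 \mid b_i) = \Phi\left( \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + b_i \right),
\qquad b_i \sim N(0, \sigma^2).
% Integrating the random intercept out yields a closed-form marginal model:
P(Y_{ij} = 1) = \Phi\left( \frac{\mathbf{x}_{ij}^{\top}\boldsymbol{\beta}}{\sqrt{1 + \sigma^2}} \right),
% so marginal coefficients are the conditional ones attenuated by 1/sqrt(1 + sigma^2),
% and the shared b_i induces an exchangeable correlation within clusters.
```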

  15. Interpreting measurements obtained with the cloud absorption radiometer

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The software developed for the analysis of data from the Cloud Absorption Radiometer (CAR) is discussed. The CAR is a multichannel radiometer designed to measure the radiation field in the middle of an optically thick cloud (the diffusion domain). It can also measure the surface albedo and escape function. The instrument currently flies on a C-131A aircraft operated by the University of Washington. Most of this data was collected during the First International satellite cloud climatology project Regional Experiment (FIRE) Marine Stratocumulus Intensive Field Observation program off San Diego during July 1987. Earlier flights of the CAR have also been studied.

  16. Use of Patient Registries and Administrative Datasets for the Study of Pediatric Cancer

    PubMed Central

    Rice, Henry E.; Englum, Brian R.; Gulack, Brian C.; Adibe, Obinna O.; Tracy, Elizabeth T.; Kreissman, Susan G.; Routh, Jonathan C.

    2015-01-01

    Analysis of data from large administrative databases and patient registries is increasingly being used to study childhood cancer care, although the value of these data sources remains unclear to many clinicians. Interpretation of large databases requires a thorough understanding of how the dataset was designed, how data were collected, and how to assess data quality. This review will detail the role of administrative databases and registry databases for the study of childhood cancer, tools to maximize information from these datasets, and recommendations to improve the use of these databases for the study of pediatric oncology. PMID:25807938

  17. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Experiments conducted in the Atlantic coastal zone indicated that plumes resulting from ocean dumping of acid wastes and sewage sludge have unique spectral characteristics. Remotely sensed wide area synoptic coverage provided information on these pollution features that was not readily available from other sources. Aircraft remotely sensed photographic and multispectral scanner data were interpreted by two methods. First, qualitative analyses in which pollution features were located, mapped, and identified without concurrent sea truth and, second, quantitative analyses in which concurrently collected sea truth was used to calibrate the remotely sensed data and to determine quantitative distributions of one or more parameters in a plume.

  18. Florence Nightingale a Hundred Years on: who she was and what she was not.

    PubMed

    McDonald, Lynn

    2010-01-01

    This article reviews Florence Nightingale's work 100 years after her death, based on surviving writing compiled for The Collected Works of Florence Nightingale. Described are her founding of a new profession for women, based on patient care, her pioneering work in statistics and data analysis, and her bold reform of the workhouse infirmaries. A section on historiography focuses on the negative impact of F. B. Smith's attack on Nightingale in 1982 and Monica Baly's progressively more negative interpretations from the 1970s to her death in 1998. Note is made of future research opportunities.

  19. Advances in interpretation of subsurface processes with time-lapse electrical imaging

    USGS Publications Warehouse

    Singha, Kamini; Day-Lewis, Frederick D.; Johnson, Tim B.; Slater, Lee D.

    2015-01-01

    Electrical geophysical methods, including electrical resistivity, time-domain induced polarization, and complex resistivity, have become commonly used to image the near subsurface. Here, we outline their utility for time-lapse imaging of hydrological, geochemical, and biogeochemical processes, focusing on new instrumentation, processing, and analysis techniques specific to monitoring. We review data collection procedures, parameters measured, and petrophysical relationships and then outline the state of the science with respect to inversion methodologies, including coupled inversion. We conclude by highlighting recent research focused on innovative applications of time-lapse imaging in hydrology, biology, ecology, and geochemistry, among other areas of interest.

  20. Advances in interpretation of subsurface processes with time-lapse electrical imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singha, Kamini; Day-Lewis, Frederick D.; Johnson, Timothy C.

    2015-03-15

    Electrical geophysical methods, including electrical resistivity, time-domain induced polarization, and complex resistivity, have become commonly used to image the near subsurface. Here, we outline their utility for time-lapse imaging of hydrological, geochemical, and biogeochemical processes, focusing on new instrumentation, processing, and analysis techniques specific to monitoring. We review data collection procedures, parameters measured, and petrophysical relationships and then outline the state of the science with respect to inversion methodologies, including coupled inversion. We conclude by highlighting recent research focused on innovative applications of time-lapse imaging in hydrology, biology, ecology, and geochemistry, among other areas of interest.

  1. Importance of hydrologic data for interpreting wetland maps and assessing wetland loss and mitigation

    USGS Publications Warehouse

    Carter, V.

    1991-01-01

    The US Geological Survey collects and disseminates, in written and digital formats, groundwater and surface-water information related to the tidal and nontidal wetlands of the United States. This information includes quantity, quality, and availability of groundwater and surface water; groundwater and surface-water interactions (recharge-discharge); groundwater flow; and the basic surface-water characteristics of streams, rivers, lakes, and wetlands. Water resources information in digital format can be used in geographic information systems (GISs) for many purposes related to wetlands. US Geological Survey wetland-related activities include collection of information important for assessing and mitigating coastal wetland loss and modification, hydrologic data collection and interpretation, GIS activities, identification of national trends in water quality and quantity, and process-oriented wetland research. -Author

  2. Analysis of anabolic steroids in hair: time courses in guinea pigs.

    PubMed

    Shen, Min; Xiang, Ping; Yan, Hui; Shen, Baohua; Wang, Mengye

    2009-09-01

    Sensitive, specific, and reproducible methods for the quantitative determination of eight anabolic steroids in guinea pig hair have been developed using LC/MS/MS and GC/MS/MS. Methyltestosterone, stanozolol, methandienone, nandrolone, trenbolone, boldenone, methenolone and DHEA were administered intraperitoneally in guinea pigs. After the first injection, black hair segments were collected on shaved areas of skin. The analysis of these segments revealed the distribution of anabolic steroids in the guinea pig hair. The major components in hair are the parent anabolic steroids. The time courses of the concentrations of the steroids in hair (except methenolone, which does not deposit in hair) demonstrated that the peak concentrations were reached on days 2-4, except stanozolol, which peaked on day 10 after administration. The concentrations in hair appeared to be related to the physicochemical properties of the drug compound and to the dosage. These studies on the distribution of drugs in the hair shaft and on the time course of their concentration changes provide information relevant to the optimal time and method of collecting hair samples. Such studies also provide basic data that will be useful in the application of hair analysis in the control of doping and in the interpretation of results.

  3. Vitreous humor analysis for the detection of xenobiotics in forensic toxicology: a review.

    PubMed

    Bévalot, Fabien; Cartiser, Nathalie; Bottinelli, Charline; Fanton, Laurent; Guitton, Jérôme

    2016-01-01

    Vitreous humor (VH) is a gelatinous substance contained in the posterior chamber of the eye, playing a mechanical role in the eyeball. It has been the subject of numerous studies in various forensic applications, primarily for the assessment of postmortem interval and for postmortem chemical analysis. Since most of the xenobiotics present in the bloodstream are detected in VH after crossing the selective blood-retinal barrier, VH is an alternative matrix useful for forensic toxicology. VH analysis offers particular advantages over other biological matrices: it is less prone to postmortem redistribution, is easy to collect, has relatively few interfering compounds for the analytical process, and shows sample stability over time after death. The present study is an overview of VH physiology, drug transport and elimination. Collection, storage, analytical techniques and interpretation of results from qualitative and quantitative points of view are dealt with. The distribution of xenobiotics in VH samples is thus discussed and illustrated by a table reporting the concentrations of 106 drugs from more than 300 case reports. For this purpose, a survey was conducted of publications found in the MEDLINE database from 1969 through April 30, 2015.

  4. Computation and application of tissue-specific gene set weights.

    PubMed

    Frost, H Robert

    2018-04-06

    Gene set testing, or pathway analysis, has become a critical tool for the analysis of high-dimensional genomic data. Although the function and activity of many genes and higher-level processes are tissue-specific, gene set testing is typically performed in a tissue-agnostic fashion, which impacts statistical power and the interpretation and replication of results. To address this challenge, we have developed a bioinformatics approach to compute tissue-specific weights for individual gene sets using information on tissue-specific gene activity from the Human Protein Atlas (HPA). We used this approach to create a public repository of tissue-specific gene set weights for 37 different human tissue types from the HPA and all collections in the Molecular Signatures Database (MSigDB). To demonstrate the validity and utility of these weights, we explored three different applications: the functional characterization of human tissues, multi-tissue analysis for systemic diseases and tissue-specific gene set testing. All data used in the reported analyses is publicly available. An R implementation of the method and tissue-specific weights for MSigDB gene set collections can be downloaded at http://www.dartmouth.edu/∼hrfrost/TissueSpecificGeneSets. rob.frost@dartmouth.edu.

  5. Lithostratigraphic interpretation from joint analysis of seismic tomography and magnetotelluric resistivity models using self-organizing map techniques

    NASA Astrophysics Data System (ADS)

    Bauer, K.; Muñoz, G.; Moeck, I.

    2012-12-01

    The combined interpretation of different models as derived from seismic tomography and magnetotelluric (MT) inversion represents a more efficient approach to determine the lithology of the subsurface compared with the separate treatment of each discipline. Such models can be developed independently or by application of joint inversion strategies. After the step of model generation using different geophysical methodologies, a joint interpretation work flow includes the following steps: (1) adjustment of a joint earth model based on the adapted, identical model geometry for the different methods, (2) classification of the model components (e.g. model blocks described by a set of geophysical parameters), and (3) re-mapping of the classified rock types to visualise their distribution within the earth model, and petrophysical characterization and interpretation. One possible approach for the classification of multi-parameter models is based on statistical pattern recognition, where different models are combined and translated into probability density functions. Classes of rock types are identified in these methods as isolated clusters with high probability density function values. Such techniques are well-established for the analysis of two-parameter models. Alternatively we apply self-organizing map (SOM) techniques, which have no limitations in the number of parameters to be analysed in the joint interpretation. Our SOM work flow includes (1) generation of a joint earth model described by so-called data vectors, (2) unsupervised learning or training, (3) analysis of the feature map by adopting image processing techniques, and (4) application of the knowledge to derive a lithological model which is based on the different geophysical parameters. We show the usage of the SOM work flow for a synthetic and a real data case study. 
Both tests rely on three geophysical properties: P velocity and vertical velocity gradient from seismic tomography, and electrical resistivity from MT inversion. The synthetic data are used as a benchmark test to demonstrate the performance of the SOM method. The real data were collected along a 40 km profile across parts of the NE German basin. The lithostratigraphic model from the joint SOM interpretation consists of eight litho-types and covers Cenozoic, Mesozoic and Paleozoic sediments down to 5 km depth. There is a remarkable agreement between the SOM-based model and regional marker horizons interpolated from surrounding 2D industrial seismic data. The most interesting results include (1) distinct properties of the Jurassic (low P velocity gradients, low resistivities) interpreted as the signature of shaly clastics, and (2) a pattern within the Upper Permian Zechstein with decreased resistivities and increased P velocities within the salt depressions on the one hand, and increased resistivities and decreased P velocities in the salt pillows on the other hand. In our interpretation this pattern is related to the flow of less dense salt matrix components into the pillows and the remaining brittle evaporites within the depressions.
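
    The unsupervised-learning step of such a SOM work flow can be illustrated with a minimal NumPy sketch. This is not the authors' code: the function names, grid size, and linear decay schedules are assumptions chosen for brevity. Each standardized "data vector" (e.g. P velocity, velocity gradient, log resistivity for one model cell) is matched to its best-matching unit (BMU) on a 2D lattice, and a Gaussian neighborhood pulls nearby units toward the sample.

```python
import numpy as np

def train_som(data, grid=(6, 6), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organizing map on standardized data vectors.

    data: (n_samples, n_features) array, one joint-model vector per cell.
    Returns the (rows, cols, n_features) weight lattice.
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid
    w = rng.normal(size=(rows, cols, data.shape[1]))
    # (row, col) coordinates of every map unit, for neighborhood distances
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                  indexing="ij"), axis=-1).astype(float)
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (1.0 - frac)              # linearly decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5  # shrinking neighborhood radius
        for x in data[rng.permutation(len(data))]:
            # best-matching unit: closest weight vector to the sample
            d = np.linalg.norm(w - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighborhood weight for every unit on the lattice
            g = np.exp(-np.sum((coords - np.array(bmu, float)) ** 2, axis=-1)
                       / (2 * sigma ** 2))
            w += lr * g[..., None] * (x - w)
    return w

def map_to_bmu(data, w):
    """Assign each data vector to the flat index of its best-matching unit."""
    flat = w.reshape(-1, w.shape[-1])
    return np.argmin(np.linalg.norm(flat[None, :, :] - data[:, None, :],
                                    axis=-1), axis=1)
```

    After training, clusters of adjacent BMUs on the map would be grouped into litho-types (the image-processing step of the work flow) and re-mapped onto the earth model.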

  6. The Defense Threat Reduction Agency's Technical Nuclear Forensics Research and Development Program

    NASA Astrophysics Data System (ADS)

    Franks, J.

    2015-12-01

    The Defense Threat Reduction Agency (DTRA) Technical Nuclear Forensics (TNF) Research and Development (R&D) Program's overarching goal is to design, develop, demonstrate, and transition advanced technologies and methodologies that improve the interagency operational capability to provide forensics conclusions after the detonation of a nuclear device. This goal is attained through the execution of three focus areas covering the span of the TNF process to enable strategic decision-making (attribution): Nuclear Forensic Materials Exploitation - development of targeted technologies, methodologies, and tools enabling the timely collection, analysis, and interpretation of detonation materials; Prompt Nuclear Effects Exploitation - improvement of ground-based capabilities to collect prompt nuclear device outputs and effects data for rapid, complementary, and corroborative information; and Nuclear Forensics Device Characterization - development of a validated and verified capability to reverse model a nuclear device with high confidence from observables (e.g., prompt diagnostics, sample analysis, etc.) seen after an attack. This presentation will outline DTRA's TNF R&D strategy and current investments, with efforts focusing on: (1) introducing new technical data collection capabilities (e.g., ground-based prompt diagnostics sensor systems; innovative debris collection and analysis); (2) developing new TNF process paradigms and concepts of operations to decrease timelines and uncertainties, and increase confidence in results; (3) enhancing validation and verification (V&V) of capabilities through technology evaluations and demonstrations; and (4) updating weapon output predictions to account for the modern threat environment. A key challenge in expanding these efforts to a global capability is the need for increased post-detonation TNF international cooperation, collaboration, and peer review.

  7. Assessing the validity of discourse analysis: transdisciplinary convergence

    NASA Astrophysics Data System (ADS)

    Jaipal-Jamani, Kamini

    2014-12-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.

  8. NeuroCa: integrated framework for systematic analysis of spatiotemporal neuronal activity patterns from large-scale optical recording data

    PubMed Central

    Jang, Min Jee; Nam, Yoonkey

    2015-01-01

    Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of ∼1000 neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements. PMID:26229973
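
    NeuroCa's actual spike-detection algorithms are described in the paper, not here; as a hypothetical stand-in, the core idea of flagging calcium transients in a single-cell ΔF/F trace can be sketched with a simple noise-adaptive threshold (median plus a multiple of the median absolute deviation). All names and parameters below are illustrative assumptions.

```python
import numpy as np

def detect_calcium_events(dff, thresh_mads=3.0):
    """Flag onset frames of calcium transients in a ΔF/F trace.

    A frame is an event onset when the trace rises above a noise-based
    threshold (median + thresh_mads * MAD) after having been below it.
    Returns the array of onset frame indices.
    """
    dff = np.asarray(dff, float)
    med = np.median(dff)
    mad = np.median(np.abs(dff - med)) + 1e-12  # robust noise estimate
    above = dff > med + thresh_mads * mad
    # onsets: first frame of each run of supra-threshold samples
    onsets = np.flatnonzero(above & ~np.concatenate(([False], above[:-1])))
    return onsets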

  9. Structural interpretation of seismic data and inherent uncertainties

    NASA Astrophysics Data System (ADS)

    Bond, Clare

    2013-04-01

    Geoscience is perhaps unique in its reliance on incomplete datasets and building knowledge from their interpretation. This interpretation basis for the science is fundamental at all levels, from creation of a geological map to interpretation of remotely sensed data. To teach and understand better the uncertainties in dealing with incomplete data, we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution in their final output that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies focused on large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placement but also whether interpreters thought faults existed at all or agreed on their sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations; experts are successful because of their application of these techniques. In a new set of experiments, a small number of experts were studied closely to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes. 
The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with associated further interpretation and analysis of the techniques and strategies employed. This resource will be of use to undergraduate, post-graduate, industry and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and for assessing the uncertainty in these interpretations. Bond, C.E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al., (2007). 'What do you think this is?: "Conceptual uncertainty" In geoscience interpretation'. GSA Today, 17, 4-10.

  10. Interpretive Repertoires as Mirrors on Society and as Tools for Action: Reflections on Zeyer and Roth's "A Mirror of Society"

    ERIC Educational Resources Information Center

    Milne, Catherine

    2009-01-01

    I respond to Zeyer and Roth's ("Cultural Studies of Science Education," 2009) paper on their use of interpretive repertoire analysis to explicate Swiss middle school students' dialogic responses to environmental issues. I focus on the strategy of interpretive repertoire analysis, making sense of the stance Zeyer and Roth take with this analysis by…

  11. Overcoming Language and Literacy Barriers: Using Student Response System Technology to Collect Quality Program Evaluation Data from Immigrant Participants

    ERIC Educational Resources Information Center

    Walker, Susan K.; Mao, Dung

    2016-01-01

    Student response system technology was employed for parenting education program evaluation data collection with Karen adults. The technology, with translation and use of an interpreter, provided an efficient and secure method that respected oral language and collective learning preferences and accommodated literacy needs. The method was popular…

  12. Customer involvement in greening the supply chain: an interpretive structural modeling methodology

    NASA Astrophysics Data System (ADS)

    Kumar, Sanjay; Luthra, Sunil; Haleem, Abid

    2013-04-01

    The role of customers in green supply chain management needs to be identified and recognized as an important research area. This paper is an attempt to explore the involvement aspect of customers towards greening of the supply chain (SC). An empirical research approach has been used to collect primary data to rank different variables for effective customer involvement in green concept implementation in SC. An interpretive structural-based model has been presented, and variables have been classified using matrice d'impacts croisés multiplication appliquée à un classement (MICMAC) analysis. Contextual relationships among variables have been established using experts' opinions. The research may help practicing managers to understand the interaction among variables affecting customer involvement. Further, this understanding may be helpful in framing the policies and strategies to green the SC. Analyzing interaction among variables for effective customer involvement in greening the SC to develop the structural model in the Indian perspective is an effort towards promoting environmental consciousness.
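
    The mechanics behind interpretive structural modelling (ISM) and MICMAC are well defined even though the paper's variables are not reproduced here: a binary contextual-relationship matrix is closed transitively into a reachability matrix, whose row sums (driving power) and column sums (dependence power) place each variable in the MICMAC quadrants. A minimal sketch, with hypothetical function names:

```python
import numpy as np

def reachability(adj):
    """Reachability matrix of a binary contextual-relationship matrix.

    Transitive closure via Warshall's algorithm; the diagonal is set to 1,
    as ISM assumes each variable reaches itself.
    """
    r = np.array(adj, dtype=bool)
    np.fill_diagonal(r, True)
    for k in range(len(r)):
        # any i reaching k, combined with k reaching j, means i reaches j
        r |= r[:, k:k + 1] & r[k:k + 1, :]
    return r.astype(int)

def micmac(adj):
    """Driving power (row sums) and dependence power (column sums)
    of the reachability matrix, as used in MICMAC classification."""
    r = reachability(adj)
    return r.sum(axis=1), r.sum(axis=0)
```

    For a simple chain of influences 0 → 1 → 2, variable 0 has the highest driving power (a "driver") and variable 2 the highest dependence (an "outcome"), which is exactly the kind of ranking the study derives from experts' opinions.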

  13. BioPAX – A community standard for pathway data sharing

    PubMed Central

    Demir, Emek; Cary, Michael P.; Paley, Suzanne; Fukuda, Ken; Lemer, Christian; Vastrik, Imre; Wu, Guanming; D’Eustachio, Peter; Schaefer, Carl; Luciano, Joanne; Schacherer, Frank; Martinez-Flores, Irma; Hu, Zhenjun; Jimenez-Jacinto, Veronica; Joshi-Tope, Geeta; Kandasamy, Kumaran; Lopez-Fuentes, Alejandra C.; Mi, Huaiyu; Pichler, Elgar; Rodchenkov, Igor; Splendiani, Andrea; Tkachev, Sasha; Zucker, Jeremy; Gopinath, Gopal; Rajasimha, Harsha; Ramakrishnan, Ranjani; Shah, Imran; Syed, Mustafa; Anwar, Nadia; Babur, Ozgun; Blinov, Michael; Brauner, Erik; Corwin, Dan; Donaldson, Sylva; Gibbons, Frank; Goldberg, Robert; Hornbeck, Peter; Luna, Augustin; Murray-Rust, Peter; Neumann, Eric; Reubenacker, Oliver; Samwald, Matthias; van Iersel, Martijn; Wimalaratne, Sarala; Allen, Keith; Braun, Burk; Whirl-Carrillo, Michelle; Dahlquist, Kam; Finney, Andrew; Gillespie, Marc; Glass, Elizabeth; Gong, Li; Haw, Robin; Honig, Michael; Hubaut, Olivier; Kane, David; Krupa, Shiva; Kutmon, Martina; Leonard, Julie; Marks, Debbie; Merberg, David; Petri, Victoria; Pico, Alex; Ravenscroft, Dean; Ren, Liya; Shah, Nigam; Sunshine, Margot; Tang, Rebecca; Whaley, Ryan; Letovksy, Stan; Buetow, Kenneth H.; Rzhetsky, Andrey; Schachter, Vincent; Sobral, Bruno S.; Dogrusoz, Ugur; McWeeney, Shannon; Aladjem, Mirit; Birney, Ewan; Collado-Vides, Julio; Goto, Susumu; Hucka, Michael; Le Novère, Nicolas; Maltsev, Natalia; Pandey, Akhilesh; Thomas, Paul; Wingender, Edgar; Karp, Peter D.; Sander, Chris; Bader, Gary D.

    2010-01-01

    BioPAX (Biological Pathway Exchange) is a standard language to represent biological pathways at the molecular and cellular level. Its major use is to facilitate the exchange of pathway data (http://www.biopax.org). Pathway data captures our understanding of biological processes, but its rapid growth necessitates development of databases and computational tools to aid interpretation. However, the current fragmentation of pathway information across many databases with incompatible formats presents barriers to its effective use. BioPAX solves this problem by making pathway data substantially easier to collect, index, interpret and share. BioPAX can represent metabolic and signaling pathways, molecular and genetic interactions and gene regulation networks. BioPAX was created through a community process. Through BioPAX, millions of interactions organized into thousands of pathways across many organisms, from a growing number of sources, are available. Thus, large amounts of pathway data are available in a computable form to support visualization, analysis and biological discovery. PMID:20829833

  14. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

    To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data range. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  15. Application of ground-penetrating radar imagery for three-dimensional visualisation of near-surface structures in ice-rich permafrost, Barrow, Alaska

    USGS Publications Warehouse

    Munroe, Jeffrey S.; Doolittle, James A.; Kanevskiy, Mikhail; Hinkel, Kenneth M.; Nelson, Frederick E.; Jones, Benjamin M.; Shur, Yuri; Kimble, John M.

    2007-01-01

    Three-dimensional ground-penetrating radar (3D GPR) was used to investigate the subsurface structure of ice-wedge polygons and other features of the frozen active layer and near-surface permafrost near Barrow, Alaska. Surveys were conducted at three sites located on landscapes of different geomorphic age. At each site, sediment cores were collected and characterised to aid interpretation of GPR data. At two sites, 3D GPR was able to delineate subsurface ice-wedge networks with high fidelity. Three-dimensional GPR data also revealed a fundamental difference in ice-wedge morphology between these two sites that is consistent with differences in landscape age. At a third site, the combination of two-dimensional and 3D GPR revealed the location of an active frost boil with ataxitic cryostructure. When supplemented by analysis of soil cores, 3D GPR offers considerable potential for imaging, interpreting and 3D mapping of near-surface soil and ice structures in permafrost environments.

  16. Using land-cover data to understand effects of agricultural and urban development on regional water quality

    USGS Publications Warehouse

    Karstensen, Krista A.; Warner, Kelly L.

    2010-01-01

    The Land-Cover Trends project is a collaborative effort between the Geographic Analysis and Monitoring Program of the U.S. Geological Survey (USGS), the U.S. Environmental Protection Agency (EPA) and the National Aeronautics and Space Administration (NASA) to understand the rates, trends, causes, and consequences of contemporary land-use and land-cover change in the United States. The data produced from this research can lead to an enriched understanding of the drivers of future land-use change, effects on environmental systems, and any associated feedbacks. USGS scientists are using the EPA Level III ecoregions as the geographic framework to process geospatial data collected between 1973 and 2000 to characterize ecosystem responses to land-use changes. General land-cover classes for these periods were interpreted from Landsat Multispectral Scanner, Thematic Mapper, and Enhanced Thematic Mapper Plus imagery to categorize and evaluate land-cover change using a modified Anderson Land-Use/Land-Cover Classification System for image interpretation.

  17. Rural families' interpretations of experiencing unexpected transition in the wake of a natural disaster.

    PubMed

    Fernandes, Gisele Cristina Manfrini; Boehs, Astrid Eggert; Denham, Sharon A; Nitschke, Rosane Gonçalves; Martini, Jussara Gue

    2017-02-13

    Natural disasters affect populations in various parts of the world. The impacts of disasters can cause many problems to the health of people and disruption to family life, potentially leading to an unexpected transition. The objective of this paper is to present the unexpected transitional experiences of rural families following a natural disaster. A multiple case study of six families was conducted with children and adolescents in a rural area affected by a 2008 disaster in southern Brazil. For data collection, we used participant observation, narrative interviews, genograms, ecomaps and an instrument called calendar routine. The analysis of the data resulted in different family interpretations about the changes resulting from the storm and compared life before and after the disaster. The loss of homes and loved ones, migration, unemployment, and losses from the farm were the main changes associated with new development tasks. The experiences of family transition after the disaster revealed that losses influenced social lives, daily routines and the preservation of cultural values.

  18. Spinal Cord Injury Model Systems: Review of Program and National Database From 1970 to 2015.

    PubMed

    Chen, Yuying; DeVivo, Michael J; Richards, J Scott; SanAgustin, Theresa B

    2016-10-01

    The Spinal Cord Injury Model Systems (SCIMS) centers have provided continuous, comprehensive multidisciplinary care for persons with spinal cord injury (SCI) in the United States since their inception in 1970. In addition, the research conducted and the analysis of data collected at these centers facilitate advances in the care and the overall quality of life for people with SCI. Over the past 45 years, the SCIMS program and National Spinal Cord Injury Database (NSCID) have undergone major revisions, which must be recognized in the planning, conduct, and interpretation of SCIMS research to prevent misinterpretation of findings. Therefore, we provide herein a brief review of the SCIMS program and the associated NSCID throughout its history, emphasizing changes and accomplishments within the past 15 years, to facilitate a better understanding and interpretation of the data presented in SCIMS research publications, including the articles published in this special issue of the Archives. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  19. 2D - 3D high resolution seismic survey on the Sea of Marmara - Western High

    NASA Astrophysics Data System (ADS)

    Saritas, H.; Cifci, G.; Géli, L.; Thomas, Y.; Marsset, B.; Rochat, A.; Westbrook, G. K.; Ker, S.; Atgin, O.; Akhun Çoşkun, S. D.; Grall, C.; Henry, P.; Gürçay, S.; Okay, S.; Çoşkun, S.; Özkan, Ö.; Barın, B.

    2012-04-01

    In the Sea of Marmara, the main strand of the NAF is made up of the Ganos (15 km long), Central Marmara (150 km), and North Boundary (45 km) fault segments (Okay et al., 2000). The Central Marmara Fault crosses the Western High, which is located between the Tekirdag and Central Marmara Basins. The Western High and the Cinarcik Basin are among the major regions of geological interest close to the NAF, where evidence of gas hydrates and gas escape has been observed during previous scientific cruises. To understand the movement of the NAF and the origin of the gas, data collection in recent cruises has focused on these areas. It started with the TAMAM (Turkish-American Marmara Multichannel) cruise in July 2008 aboard R/V Koca Piri Reis, which belongs to Dokuz Eylul University, and continued with MARMESONET (Marmara Demonstration Mission Program supported by the European Seafloor Observatory Network) in December 2009 aboard R/V Le Suroit, which belongs to IFREMER. This cruise consisted of two legs: Leg 1 collected multibeam and AUV data, and Leg 2 collected high-resolution 3D seismic data. The last cruise, PirMarmara, was carried out in June 2010 aboard R/V Koca Piri Reis, with the aim of collecting 2D high-resolution seismic data. These projects are grouped in the ESONET MARMARA-DM Project. 3D seismic data provide detailed information about fault distribution and subsurface structures, and computer-based interpretation and display of 3D seismic data allow for more thorough analysis than 2D seismic data. The objectives of this survey are to: locate gas strata and gas hydrate formations in the Western High; describe the geology of the area; understand tectonic movement related to the dextral strike-slip North Anatolian Fault; examine the mud volcano close to the NAF; determine the origin of the existing gas and the locations of gas escape; and investigate the formation of the Sea of Marmara with respect to the Western High. 
A further objective is to integrate the velocity information obtained from 2D seismic processing with the 3D seismic data for effective interpretation. In conclusion, several cruises have collected marine geological and geophysical data over the Western High. The investigations have focused on gas hydrates, gas escape, the location of the gas strata, and tectonic movement. The data have been processed and interpretation has begun. Keywords: Sea of Marmara, Western High, Gas field, Gas Hydrate, 2D-3D Seismic

  20. Social network models predict movement and connectivity in ecological landscapes

    USGS Publications Warehouse

    Fletcher, R.J.; Acevedo, M.A.; Reichert, Brian E.; Pias, Kyle E.; Kitchens, W.M.

    2011-01-01

    Network analysis is on the rise across scientific disciplines because of its ability to reveal complex, and often emergent, patterns and dynamics. Nonetheless, a growing concern in network analysis is the use of limited data for constructing networks. This concern is strikingly relevant to ecology and conservation biology, where network analysis is used to infer connectivity across landscapes. In this context, movement among patches is the crucial parameter for interpreting connectivity but because of the difficulty of collecting reliable movement data, most network analysis proceeds with only indirect information on movement across landscapes rather than using observed movement to construct networks. Statistical models developed for social networks provide promising alternatives for landscape network construction because they can leverage limited movement information to predict linkages. Using two mark-recapture datasets on individual movement and connectivity across landscapes, we test whether commonly used network constructions for interpreting connectivity can predict actual linkages and network structure, and we contrast these approaches to social network models. We find that currently applied network constructions for assessing connectivity consistently, and substantially, overpredict actual connectivity, resulting in considerable overestimation of metapopulation lifetime. Furthermore, social network models provide accurate predictions of network structure, and can do so with remarkably limited data on movement. Social network models offer a flexible and powerful way for not only understanding the factors influencing connectivity but also for providing more reliable estimates of connectivity and metapopulation persistence in the face of limited data.
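
    The contrast the authors draw — leveraging limited movement observations to predict linkages, rather than assuming connectivity from structure alone — can be illustrated with a toy distance-decay link model. This is a deliberately simplified stand-in for the latent social-network models in the paper; the function, data, and parameters below are hypothetical.

```python
import numpy as np

def fit_distance_decay(dist, linked, steps=2000, lr=0.1):
    """Fit P(link) = sigmoid(a - b * dist) by gradient descent.

    dist: pairwise inter-patch distances (1D array over patch pairs).
    linked: 0/1 indicator of observed movement between each pair.
    Returns the fitted intercept a and distance-decay coefficient b.
    """
    dist = np.asarray(dist, float)
    linked = np.asarray(linked, float)
    a, b = 0.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(a - b * dist)))
        err = p - linked                      # gradient of the log-loss
        a -= lr * err.mean()
        b -= lr * (-(err * dist).mean())      # d/db of (a - b*dist) is -dist
    return a, b
```

    Once fitted on the observed pairs, the model assigns link probabilities to unobserved pairs, which is the sense in which such models "predict linkages" from limited movement data; a plain distance-threshold network, by contrast, treats every pair within range as connected.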

  1. Interpretation Analysis as a Competitive Event.

    ERIC Educational Resources Information Center

    Nading, Robert M.

    Interpretation analysis is a new and interesting event on the forensics horizon which appears to be attracting an ever larger number of supporters. This event, developed by Larry Lambert of Ball State University in 1989, requires a student to perform all three disciplines of forensic competition (interpretation, public speaking, and limited…

  2. Secondary School Senior Capstone Projects: A Descriptive and Interpretive Case Study on Post-Secondary Students' Perspectives of Learning Transfer

    ERIC Educational Resources Information Center

    Yasuda, Vanessa Applbaum

    2017-01-01

    This descriptive and interpretive case study investigates how 12 undergraduate college students perceived participation in their high school Senior Capstone Project (SCP) impacted their college academic experience. Learning transfer was explored from the learner's perspective. Data was collected using qualitative methods in three sequential phases…

  3. Graphic Comprehension and Interpretation Skills of Preservice Teachers with Different Learning Approaches in a Technology-Aided Learning Environment

    ERIC Educational Resources Information Center

    Çelik, Harun; Pektas, Hüseyin Miraç

    2017-01-01

    A one-group quasi-experimental design and survey methodology were used to investigate the effect of virtual laboratory practices on preservice teachers' (N = 29) graphic comprehension and interpretation skills with different learning approaches. Pretest and posttest data were collected with the Test of Understanding Kinematic Graphs. The Learning…

  4. In-Service Teachers' Perceptions and Interpretations of Students' Errors in Mathematics

    ERIC Educational Resources Information Center

    Chauraya, Million; Mashingaidze, Samuel

    2017-01-01

    This paper reports on findings of a research study that investigated in-service secondary school teachers' perceptions and interpretations of students' errors in mathematics. The study used a survey research design in which a questionnaire with two sections was used to collect data. The first section sought to find out the teachers' perceptions of…

  5. Listening for Competence through Documentation: Assessing Children with Language Delays Using Digital Video

    ERIC Educational Resources Information Center

    Suarez, Stephanie Cox; Daniels, Karen J.

    2009-01-01

    This case study uses documentation as a tool for formative assessment to interpret the learning of twin boys with significantly delayed language skills. Reggio-inspired documentation (the act of collecting, interpreting, and reflecting on traces of learning from video, images, and observation notes) focused on the unfolding of the boys' nonverbal…

  6. Participatory GIS in design of the Wroclaw University of Science and Technology campus web map and spatial analysis of campus area quality

    NASA Astrophysics Data System (ADS)

    Blachowski, Jan; Łuczak, Jakub; Zagrodnik, Paulina

    2018-01-01

    Public participation geographic information system (GIS) and participatory mapping data collection methods are means that enhance capacity in generating, managing, and communicating spatial information in various fields ranging from local planning to environmental management. In this study these methods have been used in two ways: first, to gather information on the additional functionality of the campus web map expected by its potential users, i.e. students, staff and visitors, through a web-based survey; second, to collect geographically referenced information on campus areas that are liked and disliked in a geo-survey carried out with the ArcGIS Online GeoForm application. The results of the first survey were used to map facilities such as bicycle infrastructure, building entrances, wheelchair-accessible infrastructure and benches. The results of the second were used to analyse the most and the least attractive parts of the campus with heat-map and hot spot analyses in GIS. In addition, the answers have been studied with regard to the visual and functional aspects of the campus area raised in the survey. The thematic layers developed from the results of field mapping and geoprocessing of geosurvey data were included in the campus web map project. The paper describes the applied methodology of data collection, processing, analysis, interpretation and geovisualisation.
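
    The paper does not specify which hot spot statistic was used; the usual choice in GIS hot spot analysis is Getis-Ord Gi*, which scores each location by how much the values in its neighborhood (including itself) exceed the global mean, expressed as a z-score. A minimal NumPy sketch under that assumption, with binary distance-band weights:

```python
import numpy as np

def getis_ord_gi_star(values, coords, radius):
    """Getis-Ord Gi* z-scores for hot/cold spot detection.

    values: observations (e.g. counts of 'liked' survey points per cell).
    coords: (n, 2) point locations; binary weights within `radius`,
    including the focal point itself, as Gi* requires.
    """
    x = np.asarray(values, float)
    coords = np.asarray(coords, float)
    n = len(x)
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)   # population std dev
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = (d <= radius).astype(float)            # includes self (d_ii = 0)
    wsum = w.sum(axis=1)
    num = w @ x - xbar * wsum
    den = s * np.sqrt((n * (w ** 2).sum(axis=1) - wsum ** 2) / (n - 1))
    return num / den
```

    Cells with Gi* above about +1.96 would be mapped as statistically significant hot spots (clusters of liked areas), and those below −1.96 as cold spots.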

  7. Psychometric evaluation of the WHOQOL-BREF, Taiwan version, across five kinds of Taiwanese cancer survivors: Rasch analysis and confirmatory factor analysis.

    PubMed

    Lin, Chung-Ying; Hwang, Jing-Shiang; Wang, Wen-Chung; Lai, Wu-Wei; Su, Wu-Chou; Wu, Tzu-Yi; Yao, Grace; Wang, Jung-Der

    2018-04-13

    Quality of life (QoL) is important for clinicians to evaluate how cancer survivors judge their sense of well-being, and the WHOQOL-BREF may be a good tool for clinical use. However, at least three issues remain unresolved: (1) evidence for the psychometric properties of the WHOQOL-BREF for cancer patients is insufficient; (2) the scoring method used for the WHOQOL-BREF needs to be clarified; (3) it is unknown whether different types of cancer patients interpret the WHOQOL-BREF similarly. We recruited 1000 outpatients with head/neck cancer, 1000 with colorectal cancer, 965 with liver cancer, 1438 with lung cancer and 1299 with gynecologic cancers in a medical center. Data analyses included Rasch models, confirmatory factor analysis (CFA), and Pearson correlations. The mean WHOQOL-BREF domain scores were between 13.34 and 14.77 among all participants. CFA supported construct validity; Rasch models revealed that almost all items were embedded in their expected domains and were interpreted similarly across the five types of cancer patients; all correlation coefficients between Rasch scores and original domain scores were above 0.9. The linear relationship between Rasch scores and domain scores suggested that the current calculations for domain scores are applicable and without serious bias. Clinical practitioners may regularly collect and record the WHOQOL-BREF domain scores into electronic health records. Copyright © 2018. Published by Elsevier B.V.

  8. In Situ Three-Dimensional Reciprocal-Space Mapping of Diffuse Scattering Intensity Distribution and Data Analysis for Precursor Phenomenon in Shape-Memory Alloy

    NASA Astrophysics Data System (ADS)

    Cheng, Tian-Le; Ma, Fengde D.; Zhou, Jie E.; Jennings, Guy; Ren, Yang; Jin, Yongmei M.; Wang, Yu U.

    2012-01-01

    Diffuse scattering contains rich information on various structural disorders, thus providing a useful means to study the nanoscale structural deviations from the average crystal structures determined by Bragg peak analysis. Extraction of maximal information from diffuse scattering requires concerted efforts in high-quality three-dimensional (3D) data measurement, quantitative data analysis and visualization, theoretical interpretation, and computer simulations. Such an endeavor is undertaken to study the correlated dynamic atomic position fluctuations caused by thermal vibrations (phonons) in precursor state of shape-memory alloys. High-quality 3D diffuse scattering intensity data around representative Bragg peaks are collected by using in situ high-energy synchrotron x-ray diffraction and two-dimensional digital x-ray detector (image plate). Computational algorithms and codes are developed to construct the 3D reciprocal-space map of diffuse scattering intensity distribution from the measured data, which are further visualized and quantitatively analyzed to reveal in situ physical behaviors. Diffuse scattering intensity distribution is explicitly formulated in terms of atomic position fluctuations to interpret the experimental observations and identify the most relevant physical mechanisms, which help set up reduced structural models with minimal parameters to be efficiently determined by computer simulations. Such combined procedures are demonstrated by a study of phonon softening phenomenon in precursor state and premartensitic transformation of Ni-Mn-Ga shape-memory alloy.

  9. 78 FR 19314 - Agency Information Collection Activities; Submission for OMB Review; Comment Request: Transmittal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-29

    ... for OMB Review; Comment Request: Transmittal of Unemployment Insurance Materials ACTION: Notice...) sponsored information collection request (ICR) titled, ``Transmittal of Unemployment Insurance Materials..., interpretations, court opinions, etc. In addition, the Unemployment Compensation for Federal Civilian Employees...

  10. [The Scope, Quality and Safety Requirements of Drug Abuse Testing].

    PubMed

    Küme, Tuncay; Karakükcü, Çiğdem; Pınar, Aslı; Coşkunol, Hakan

    2017-01-01

    The aim of this review is to describe the scope and requirements of drug abuse testing. Drug abuse testing is one of the tools for determining drug use, and because its results underpin legal and administrative decisions it must fulfill quality and safety requirements. In the preanalytical phase these include selection of the appropriate test matrix and screening test panel, sampling within the detection window, patient consent, identification of the donor, an appropriate collection site, observed sample collection, identification and control of the sample, and a specimen chain of custody; in the analytical phase, analysis in authorized laboratories, specimen validity tests, reliable testing methods, strict quality control, and two-step analysis; and in the postanalytical phase, storage of the split specimen, confirmation of the split specimen in the event of an objection, a result chain of custody, appropriate cut-off concentrations, and appropriate interpretation of the result. The workflow and analytical processes of drug abuse testing are explained in the latest regulation of the Department of Medical Laboratory Services, Ministry of Health in Turkey. Clinical physicians must know and apply the quality and safety requirements of drug abuse testing according to these regulations.

  11. Spatio-temporal examination of precipitation isotopes from the North American monsoon in Arizona, New Mexico, and Utah from 2014 to 2017

    NASA Astrophysics Data System (ADS)

    Tulley-Cordova, C. L.; Bowen, G. J.

    2017-12-01

    A significant summertime feature of climate in the southwestern United States (US) is the North American monsoon (NAM), also known as the Mexican monsoon, the Arizona monsoon, and the southwestern United States monsoon. The NAM is a crucial contributor to total annual precipitation in the Four Corners region of the US, yet it has been poorly studied in this region using stable isotopes. This study characterizes the spatio-temporal changes of the NAM based on stable isotopic results from 40 sites, located within the boundaries of the Navajo Nation, in Arizona, New Mexico, and Utah from 2014 to 2017. Samples were collected monthly at each site from May to October. Examination of temporal trends in precipitation revealed strong monthly and interannual changes; spatial analysis showed weak large-scale relationships across the study area. Analysis of stable isotopes in precipitation, surface, ground, and spring waters can be used to interpret isotopic differences in the modern hydro-climate of the Navajo Nation and Colorado Plateau, helping to predict future hydro-climate changes and their implications for future water resources.

  12. U.S. Geological Survey applied research studies of the Cheyenne River system, South Dakota; description and collation of data, water years 1985-86

    USGS Publications Warehouse

    Goddard, Kimball E.

    1988-01-01

    The Cheyenne River system in Western South Dakota has been impacted by the discharge of about 100 million metric tons of gold-mill tailings to Whitewood Creek near Lead, South Dakota. In April 1985, the U.S. Geological Survey initiated an extensive series of research studies to investigate the magnitude of the impact and to define important processes acting on the contaminated sediments present in the system. The report presents all data collected during the 1985 and 1986 water years for these research studies. Some of the data included have been published previously. Hydrologic, geochemical, and biologic data are available for sites on Whitewood Creek, the Belle Fourche and Cheyenne Rivers, and for the Cheyenne River arm of Lake Oahe. Data complexity varies from routine discharge and water quality to very complex photon-correlation spectroscopy and energy-dispersive x-ray analysis. Methods for sample collection, handling and preservation, and laboratory analysis are also presented. No interpretations or complex statistical summaries are included. (USGS)

  13. Quality of core collections for effective utilisation of genetic resources review, discussion and interpretation.

    PubMed

    Odong, T L; Jansen, J; van Eeuwijk, F A; van Hintum, T J L

    2013-02-01

    Definition of clear criteria for evaluating the quality of core collections is a prerequisite for selecting high-quality cores. However, a critical examination of the different methods used in the literature for evaluating core-collection quality shows that there are no clear guidelines on the choice of evaluation criteria; as a result, inappropriate analyses are sometimes made, leading to false conclusions about the quality of core collections and the methods used to select them. The choice of criteria appears to be based mainly on their use in earlier publications rather than on the actual objectives of the core collection. In this study, we provide insight into the different criteria used for evaluating core collections. We also discuss different types of core collections and relate each type to its respective evaluation criteria. Two new criteria based on genetic distance are introduced. The consequences of the different evaluation criteria are illustrated using simulated and experimental data. We strongly recommend the distance-based criteria: they not only allow the simultaneous evaluation of all variables describing the accessions, but also provide intuitive and interpretable measures, compared with the univariate criteria generally used for evaluating core collections. Our findings will provide genebank curators and researchers with possibilities to make informed choices when creating, comparing and using core collections.
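    Distance-based criteria of the kind introduced here can be illustrated with a simple example: the mean distance from each core accession to its nearest neighbour within the core, which rewards cores that spread accessions evenly across the genetic space. This sketch is generic and not necessarily either of the paper's two new criteria:

```python
def mean_nearest_distance(core, dist):
    """Mean distance from each accession in `core` to its nearest other core
    accession, given a symmetric pairwise distance lookup dist[a][b].

    Illustrative distance-based quality criterion; higher values indicate a
    core whose accessions are more evenly spread.
    """
    total = 0.0
    for a in core:
        total += min(dist[a][b] for b in core if b != a)
    return total / len(core)
```

    Comparing this value across candidate cores of the same size gives a multivariate alternative to the univariate mean/variance comparisons criticized in the abstract.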

  14. Identification of cryovolcanism on Titan using fuzzy cognitive maps

    NASA Astrophysics Data System (ADS)

    Furfaro, Roberto; Kargel, Jeffrey S.; Lunine, Jonathan I.; Fink, Wolfgang; Bishop, Michael P.

    2010-04-01

    Future planetary exploration of Titan will require higher degrees of on-board automation, including autonomous determination of sites where the probability of significant scientific findings is highest. In this paper, a novel Artificial Intelligence (AI) method for identifying and interpreting sites with the highest potential for cryovolcanic activity is presented. We introduce the theory of fuzzy cognitive maps (FCM) as a tool for the analysis of remotely collected data in planetary exploration. A cognitive model embedded in a fuzzy logic framework is constructed via the synergistic interaction of planetary scientists and AI experts. As an application example, we show how FCM can be employed to solve the challenging problem of recognizing cryovolcanism from Synthetic Aperture Radar (SAR) Cassini data. The fuzzy cognitive map is constructed using what is currently known about cryovolcanism on Titan and relies on geological mapping performed by planetary scientists to interpret different locales as cryovolcanic in nature. The system is not conceived to replace human scientific interpretation, but to enhance the scientists' ability to deal with large amounts of data; it is a first step in designing AI systems that will be able, in the future, to autonomously make decisions in situations where human analysis and interpretation is not readily available or could not be sufficiently timely. The proposed FCM is tested on Cassini radar data to show the effectiveness of the system in reaching conclusions put forward by human experts and published in the literature. Four tests are performed using the Ta SAR image (October 2004 fly-by). Two regions (i.e. Ganesa Macula and the lobate high-backscattering region east of Ganesa) are interpreted by the designed FCM as exhibiting cryovolcanism, in agreement with the initial interpretation of the regions by Stofan et al. (2006). Importantly, the proposed FCM is shown to be flexible and adaptive as new data and knowledge are acquired during the course of exploration. Subsequently, the FCM has been modified to include topographic information derived from SAR stereo data. With this additional information, the map concludes that Ganesa Macula is not a cryovolcanic region. In conclusion, the FCM methodology is shown to be a critical and powerful component of future autonomous robotic spacecraft (e.g., orbiter(s), balloon(s), surface/lake lander(s), rover(s)) that will be deployed for the exploration of Titan.
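    The FCM machinery itself is compact: concepts hold activation levels in [0, 1], a signed weight matrix encodes expert-assigned causal influences, and activations are iterated through a squashing function until they stabilize. A minimal Python sketch, with concept names and weights purely illustrative (not the paper's actual map):

```python
import math

def fcm_step(state, W, lam=1.0):
    """One FCM update: each concept keeps its own activation (self-memory)
    plus the weighted influence of the others, squashed into (0, 1)."""
    n = len(state)
    new = []
    for j in range(n):
        total = state[j] + sum(state[i] * W[i][j] for i in range(n) if i != j)
        new.append(1.0 / (1.0 + math.exp(-lam * total)))  # logistic squashing
    return new

def fcm_run(state, W, iters=50, tol=1e-6):
    """Iterate the map until activations change by less than tol."""
    for _ in range(iters):
        nxt = fcm_step(state, W)
        if max(abs(a - b) for a, b in zip(state, nxt)) < tol:
            return nxt
        state = nxt
    return state

# Illustrative map: two evidence concepts feeding a "cryovolcanism" concept.
# W[i][j] is the influence of concept i on concept j (weights invented here).
W = [[0.0, 0.0, 0.7],   # e.g. radar-bright lobate flow
     [0.0, 0.0, 0.6],   # e.g. circular/dome-like feature
     [0.0, 0.0, 0.0]]   # cryovolcanism (output concept)
```

    Running the map with both evidence concepts initially active drives the output concept's steady-state activation well above 0.5, which is how such a map flags a locale as likely cryovolcanic.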

  15. Interpretive focus groups: a participatory method for interpreting and extending secondary analysis of qualitative data.

    PubMed

    Redman-MacLaren, Michelle; Mills, Jane; Tommbe, Rachael

    2014-01-01

    Participatory approaches to qualitative research practice constantly change in response to evolving research environments. Researchers are increasingly encouraged to undertake secondary analysis of qualitative data, despite epistemological and ethical challenges. Interpretive focus groups can be described as a more participative method for groups to analyse qualitative data. To facilitate interpretive focus groups with women in Papua New Guinea to extend analysis of existing qualitative data and co-create new primary data. The purpose of this was to inform a transformational grounded theory and subsequent health promoting action. A two-step approach was used in a grounded theory study about how women experience male circumcision in Papua New Guinea. Participants analysed portions or 'chunks' of existing qualitative data in story circles and built upon this analysis by using the visual research method of storyboarding. New understandings of the data were evoked when women in interpretive focus groups analysed the data 'chunks'. Interpretive focus groups encouraged women to share their personal experiences about male circumcision. The visual method of storyboarding enabled women to draw pictures to represent their experiences. This provided an additional focus for whole-of-group discussions about the research topic. Interpretive focus groups offer opportunity to enhance trustworthiness of findings when researchers undertake secondary analysis of qualitative data. The co-analysis of existing data and co-generation of new data between research participants and researchers informed an emergent transformational grounded theory and subsequent health promoting action.

  16. Interpretive focus groups: a participatory method for interpreting and extending secondary analysis of qualitative data

    PubMed Central

    Redman-MacLaren, Michelle; Mills, Jane; Tommbe, Rachael

    2014-01-01

    Background Participatory approaches to qualitative research practice constantly change in response to evolving research environments. Researchers are increasingly encouraged to undertake secondary analysis of qualitative data, despite epistemological and ethical challenges. Interpretive focus groups can be described as a more participative method for groups to analyse qualitative data. Objective To facilitate interpretive focus groups with women in Papua New Guinea to extend analysis of existing qualitative data and co-create new primary data. The purpose of this was to inform a transformational grounded theory and subsequent health promoting action. Design A two-step approach was used in a grounded theory study about how women experience male circumcision in Papua New Guinea. Participants analysed portions or ‘chunks’ of existing qualitative data in story circles and built upon this analysis by using the visual research method of storyboarding. Results New understandings of the data were evoked when women in interpretive focus groups analysed the data ‘chunks’. Interpretive focus groups encouraged women to share their personal experiences about male circumcision. The visual method of storyboarding enabled women to draw pictures to represent their experiences. This provided an additional focus for whole-of-group discussions about the research topic. Conclusions Interpretive focus groups offer opportunity to enhance trustworthiness of findings when researchers undertake secondary analysis of qualitative data. The co-analysis of existing data and co-generation of new data between research participants and researchers informed an emergent transformational grounded theory and subsequent health promoting action. PMID:25138532

  17. Mapping trees outside forests using high-resolution aerial imagery: a comparison of pixel- and object-based classification approaches.

    PubMed

    Meneguzzo, Dacia M; Liknes, Greg C; Nelson, Mark D

    2013-08-01

    Discrete trees and small groups of trees in nonforest settings are considered an essential resource around the world and are collectively referred to as trees outside forests (ToF). ToF provide important functions across the landscape, such as protecting soil and water resources, providing wildlife habitat, and improving farmstead energy efficiency and aesthetics. Despite the significance of ToF, forest and other natural resource inventory programs and geospatial land cover datasets that are available at a national scale do not include comprehensive information regarding ToF in the United States. Additional ground-based data collection and acquisition of specialized imagery to inventory these resources are expensive alternatives. As a potential solution, we identified two remote sensing-based approaches that use free high-resolution aerial imagery from the National Agriculture Imagery Program (NAIP) to map all tree cover in an agriculturally dominant landscape. We compared the results obtained using an unsupervised per-pixel classifier (independent component analysis [ICA]) and an object-based image analysis (OBIA) procedure in Steele County, Minnesota, USA. Three types of accuracy assessments were used to evaluate how each method performed in terms of: (1) producing a county-level estimate of total tree-covered area, (2) correctly locating tree cover on the ground, and (3) how tree cover patch metrics computed from the classified outputs compared to those delineated by a human photo interpreter. Both approaches were found to be viable for mapping tree cover over a broad spatial extent and could serve to supplement ground-based inventory data. The ICA approach produced an estimate of total tree cover more similar to the photo-interpreted result, but the output from the OBIA method was more realistic in terms of describing the actual observed spatial pattern of tree cover.
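    The second type of accuracy assessment, agreement with reference locations, is conventionally summarized from a confusion matrix via overall, producer's, and user's accuracy. A generic Python sketch (not the paper's exact procedure):

```python
def accuracy_metrics(cm):
    """Accuracy measures from a square confusion matrix cm[reference][predicted].

    Returns overall accuracy, producer's accuracy (recall) per class, and
    user's accuracy (precision) per class. Assumes no empty rows or columns.
    """
    k = len(cm)
    n = sum(sum(row) for row in cm)
    overall = sum(cm[i][i] for i in range(k)) / n
    producers = [cm[i][i] / sum(cm[i]) for i in range(k)]
    users = [cm[i][i] / sum(cm[j][i] for j in range(k)) for i in range(k)]
    return overall, producers, users
```

    For a two-class (tree / non-tree) map, producer's accuracy answers "how much real tree cover did the classifier find?" while user's accuracy answers "how much of the mapped tree cover is really trees?".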

  18. How to interpret a small increase in AUC with an additional risk prediction marker: decision analysis comes through.

    PubMed

    Baker, Stuart G; Schuit, Ewoud; Steyerberg, Ewout W; Pencina, Michael J; Vickers, Andrew; Moons, Karel G M; Mol, Ben W J; Lindeman, Karen S

    2014-09-28

    An important question in the evaluation of an additional risk prediction marker is how to interpret a small increase in the area under the receiver operating characteristic curve (AUC). Many researchers believe that a change in AUC is a poor metric because it increases only slightly with the addition of a marker with a large odds ratio. Because it is not possible on purely statistical grounds to choose between the odds ratio and AUC, we invoke decision analysis, which incorporates costs and benefits. For example, a timely estimate of the risk of later non-elective operative delivery can help a woman in labor decide if she wants an early elective cesarean section to avoid greater complications from possible later non-elective operative delivery. A basic risk prediction model for later non-elective operative delivery involves only antepartum markers. Because adding intrapartum markers to this risk prediction model increases AUC by 0.02, we questioned whether this small improvement is worthwhile. A key decision-analytic quantity is the risk threshold, here the risk of later non-elective operative delivery at which a patient would be indifferent between an early elective cesarean section and usual care. For a range of risk thresholds, we found that an increase in the net benefit of risk prediction requires collecting intrapartum marker data on 68 to 124 women for every correct prediction of later non-elective operative delivery. Because data collection is non-invasive, this test tradeoff of 68 to 124 is clinically acceptable, indicating the value of adding intrapartum markers to the risk prediction model. Copyright © 2014 John Wiley & Sons, Ltd.

  19. Clinical implementation of RNA signatures for pharmacogenomic decision-making

    PubMed Central

    Tang, Weihua; Hu, Zhiyuan; Muallem, Hind; Gulley, Margaret L

    2011-01-01

    RNA profiling is increasingly used to predict drug response, dose, or toxicity based on analysis of drug pharmacokinetic or pharmacodynamic pathways. Before implementing multiplexed RNA arrays in clinical practice, validation studies are carried out to demonstrate sufficient evidence of analytic and clinical performance, and to establish an assay protocol with quality assurance measures. Pathologists assure quality by selecting input tissue and by interpreting results in the context of the input tissue as well as the technologies that were used and the clinical setting in which the test was ordered. A strength of RNA profiling is the array-based measurement of tens to thousands of RNAs at once, including redundant tests for critical analytes or pathways to promote confidence in test results. Instrument and reagent manufacturers are crucial for supplying reliable components of the test system. Strategies for quality assurance include careful attention to RNA preservation and quality checks at pertinent steps in the assay protocol, beginning with specimen collection and proceeding through the various phases of transport, processing, storage, analysis, interpretation, and reporting. Specimen quality is checked by probing housekeeping transcripts, while spiked and exogenous controls serve as a check on analytic performance of the test system. Software is required to manipulate abundant array data and present it for interpretation by a laboratory physician who reports results in a manner facilitating therapeutic decision-making. Maintenance of the assay requires periodic documentation of personnel competency and laboratory proficiency. These strategies are shepherding genomic arrays into clinical settings to provide added value to patients and to the larger health care system. PMID:23226056

  20. Hybrid approach combining chemometrics and likelihood ratio framework for reporting the evidential value of spectra.

    PubMed

    Martyna, Agnieszka; Zadora, Grzegorz; Neocleous, Tereza; Michalska, Aleksandra; Dean, Nema

    2016-08-10

    Many chemometric tools are invaluable and have proven effective in data mining and in substantial dimensionality reduction of highly multivariate data. This has become vital for interpreting physicochemical data as advanced analytical techniques, which deliver much information in a single measurement run, have developed rapidly. It especially concerns spectra, which are frequently the subject of comparative analysis in, e.g., forensic science. In the presented study, microtraces collected from the scenes of hit-and-run accidents were analysed. Plastic containers and automotive plastics (e.g. bumpers, headlamp lenses) were subjected to Fourier transform infrared spectrometry, and car paints were analysed using Raman spectroscopy. In the forensic context, analytical results must be interpreted and reported according to the interpretation schemes acknowledged in forensic science, using the likelihood ratio (LR) approach. However, proper construction of LR models for highly multivariate data such as spectra requires chemometric tools for substantial data compression. Conversion from the classical feature representation to a distance representation was proposed for revealing hidden data peculiarities, and linear discriminant analysis was further applied to minimise the within-sample variability while maximising the between-sample variability. Both techniques enabled substantial reduction of data dimensionality. Univariate and multivariate likelihood ratio models were proposed for such data. It was shown that the combination of chemometric tools and the likelihood ratio approach can solve the comparison problem for highly multivariate and correlated data after proper extraction of the most relevant features and variance information hidden in the data structure. Copyright © 2016 Elsevier B.V. All rights reserved.
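    One common way to build a univariate LR from a comparison score is to model the score's distribution under the same-source and different-source propositions and take the density ratio. The sketch below assumes Gaussian score models purely for illustration; the paper's feature- and distance-based multivariate LR models are more elaborate:

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def score_lr(score, mu_same, sd_same, mu_diff, sd_diff):
    """Score-based likelihood ratio: density of the observed comparison score
    under the same-source model divided by its density under the
    different-source model. All distribution parameters are assumed to have
    been estimated from reference collections."""
    return normal_pdf(score, mu_same, sd_same) / normal_pdf(score, mu_diff, sd_diff)
```

    LR > 1 supports the same-source proposition, LR < 1 the different-source proposition; the magnitude conveys the strength of the evidence.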

  1. Characterization of spatial and temporal variability in hydrochemistry of Johor Straits, Malaysia.

    PubMed

    Abdullah, Pauzi; Abdullah, Sharifah Mastura Syed; Jaafar, Othman; Mahmud, Mastura; Khalik, Wan Mohd Afiq Wan Mohd

    2015-12-15

    Characterization of hydrochemistry changes in the Johor Straits over 5 years of monitoring was successfully carried out. Water quality data sets (27 stations and 19 parameters) collected in this area were interpreted using multivariate statistical analysis. Cluster analysis grouped all stations into four clusters ((Dlink/Dmax) × 100 < 90) and two clusters ((Dlink/Dmax) × 100 < 80) for site and period similarities, respectively. Principal component analysis rendered six significant components (eigenvalue > 1) that explained 82.6% of the total variance of the data set. The classification matrix of discriminant analysis assigned 88.9-92.6% and 83.3-100% correctness in spatial and temporal variability, respectively. Time-series analysis then confirmed that only four parameters did not change significantly over time. It is therefore imperative that the environmental impacts of reclamation and dredging works, municipal and industrial discharge, marine aquaculture and shipping activities in this area be effectively controlled and managed. Copyright © 2015 Elsevier Ltd. All rights reserved.
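    The eigenvalue > 1 retention rule used above (the Kaiser criterion) is straightforward to sketch: standardize the variables, eigendecompose the correlation matrix, and keep components whose eigenvalues exceed 1. A Python illustration on synthetic data, not the study's 27-station data set:

```python
import numpy as np

def pca_kaiser(X):
    """PCA on standardized columns of X (samples x variables).

    Retains components with eigenvalue > 1 (Kaiser criterion) and reports
    the fraction of total variance they explain.
    """
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    R = np.corrcoef(Z, rowvar=False)          # correlation matrix
    vals, vecs = np.linalg.eigh(R)            # ascending eigenvalues
    order = np.argsort(vals)[::-1]            # sort descending
    vals, vecs = vals[order], vecs[:, order]
    keep = vals > 1.0
    explained = vals[keep].sum() / vals.sum()
    return vals[keep], vecs[:, keep], explained
```

    On standardized data the eigenvalues sum to the number of variables, so a component with eigenvalue > 1 explains more variance than any single original variable, which is the rationale behind the criterion.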

  2. Comparison of velocity-log data collected using impeller and electromagnetic flowmeters

    USGS Publications Warehouse

    Newhouse, M.W.; Izbicki, J.A.; Smith, G.A.

    2005-01-01

    Previous studies have used flowmeters in environments that are within the expectations of their published ranges. Electromagnetic flowmeters have a published range from 0.1 to 79.0 m/min, and impeller flowmeters have a published range from 1.2 to 61.0 m/min. Velocity-log data collected in five long-screened production wells in the Pleasant Valley area of southern California showed that (1) electromagnetic flowmeter results were comparable within ±2% to results obtained using an impeller flowmeter for comparable depths; (2) the measured velocities from the electromagnetic flowmeter were up to 36% greater than the published maximum range; and (3) both data sets, collected without the use of centralizers or flow diverters, produced comparable and interpretable results. Although either method is acceptable for measuring wellbore velocities and the distribution of flow, the electromagnetic flowmeter enables collection of data over a greater range of flows. In addition, changes in fluid temperature and fluid resistivity, collected as part of the electromagnetic flowmeter log, are useful in the identification of flow and hydrogeologic interpretation.

  3. Comparison of velocity-log data collected using impeller and electromagnetic flowmeters.

    PubMed

    Newhouse, M W; Izbicki, J A; Smith, G A

    2005-01-01

    Previous studies have used flowmeters in environments that are within the expectations of their published ranges. Electromagnetic flowmeters have a published range from 0.1 to 79.0 m/min, and impeller flowmeters have a published range from 1.2 to 61.0 m/min. Velocity-log data collected in five long-screened production wells in the Pleasant Valley area of southern California showed that (1) electromagnetic flowmeter results were comparable within +/-2% to results obtained using an impeller flowmeter for comparable depths; (2) the measured velocities from the electromagnetic flowmeter were up to 36% greater than the published maximum range; and (3) both data sets, collected without the use of centralizers or flow diverters, produced comparable and interpretable results. Although either method is acceptable for measuring wellbore velocities and the distribution of flow, the electromagnetic flowmeter enables collection of data over a greater range of flows. In addition, changes in fluid temperature and fluid resistivity, collected as part of the electromagnetic flowmeter log, are useful in the identification of flow and hydrogeologic interpretation.

  4. State of the art in hair analysis for detection of drug and alcohol abuse.

    PubMed

    Pragst, Fritz; Balikova, Marie A

    2006-08-01

    Hair differs from other materials used for toxicological analysis because of its unique ability to serve as a long-term storage of foreign substances with respect to the temporal appearance in blood. Over the last 20 years, hair testing has gained increasing attention and recognition for the retrospective investigation of chronic drug abuse as well as intentional or unintentional poisoning. In this paper, we review the physiological basics of hair growth, mechanisms of substance incorporation, analytical methods, result interpretation and practical applications of hair analysis for drugs and other organic substances. Improved chromatographic-mass spectrometric techniques with increased selectivity and sensitivity and new methods of sample preparation have improved detection limits from the ng/mg range to below pg/mg. These technical advances have substantially enhanced the ability to detect numerous drugs and other poisons in hair. For example, it was possible to detect previous administration of a single very low dose in drug-facilitated crimes. In addition to its potential application in large scale workplace drug testing and driving ability examination, hair analysis is also used for detection of gestational drug exposure, cases of criminal liability of drug addicts, diagnosis of chronic intoxication and in postmortem toxicology. Hair has only limited relevance in therapy compliance control. Fatty acid ethyl esters and ethyl glucuronide in hair have proven to be suitable markers for alcohol abuse. Hair analysis for drugs is, however, not a simple routine procedure and needs substantial guidelines throughout the testing process, i.e., from sample collection to results interpretation.

  5. Phylogeny of a genomically diverse group of Elymus (Poaceae) allopolyploids reveals multiple levels of reticulation.

    PubMed

    Mason-Gamer, Roberta J

    2013-01-01

    The grass tribe Triticeae (=Hordeeae) comprises only about 300 species, but it is well known for the economically important crop plants wheat, barley, and rye. The group is also recognized as a fascinating example of evolutionary complexity, with a history shaped by numerous events of auto- and allopolyploidy and apparent introgression involving diploids and polyploids. The genus Elymus comprises a heterogeneous collection of allopolyploid genome combinations, all of which include at least one set of homoeologs, designated St, derived from Pseudoroegneria. The current analysis includes a geographically and genomically diverse collection of 21 tetraploid Elymus species, and a single hexaploid species. Diploid and polyploid relationships were estimated using four molecular data sets, including one that combines two regions of the chloroplast genome, and three from unlinked nuclear genes: phosphoenolpyruvate carboxylase, β-amylase, and granule-bound starch synthase I. Four gene trees were generated using maximum likelihood, and the phylogenetic placement of the polyploid sequences reveals extensive reticulation beyond allopolyploidy alone. The trees were interpreted with reference to numerous phenomena known to complicate allopolyploid phylogenies, and introgression was identified as a major factor in their history. The work illustrates the interpretation of complicated phylogenetic results through the sequential consideration of numerous possible explanations, and the results highlight the value of careful inspection of multiple independent molecular phylogenetic estimates, with particular focus on the differences among them.

  6. A Transparent and Transferable Framework for Tracking Quality Information in Large Datasets

    PubMed Central

    Smith, Derek E.; Metzger, Stefan; Taylor, Jeffrey R.

    2014-01-01

    The ability to evaluate the validity of data is essential to any investigation, and manual “eyes on” assessments of data quality have dominated in the past. Yet, as the size of collected data continues to increase, so does the effort required to assess their quality. This challenge is of particular concern for networks that automate their data collection, and has resulted in the automation of many quality assurance and quality control analyses. Unfortunately, the interpretation of the resulting data quality flags can become quite challenging with large data sets. We have developed a framework to summarize data quality information and facilitate interpretation by the user. Our framework consists of first compiling data quality information and then presenting it through two separate mechanisms: a quality report and a quality summary. The quality report presents the results of specific quality analyses as they relate to individual observations, while the quality summary takes a spatial or temporal aggregate of each quality analysis and provides a summary of the results. Included in the quality summary is a final quality flag, which further condenses data quality information to assess whether a data product is valid or not. This framework has the added flexibility to allow “eyes on” information on data quality to be incorporated for many data types. Furthermore, this framework can aid problem tracking and resolution, should sensor or system malfunctions arise. PMID:25379884
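    The report-then-summary idea can be sketched simply: per-observation flags from each quality test are aggregated into failure rates, and a final flag marks the product invalid if any rate exceeds a tolerance. The threshold and flag names below are illustrative, not the framework's actual rules:

```python
def quality_summary(flags, alpha=0.2):
    """Aggregate per-observation quality flags into a summary and final flag.

    `flags` maps a quality-test name to a list of 0/1 flags (0 = pass,
    1 = fail) over the aggregation window. The product is declared invalid
    if any test's failure rate exceeds `alpha` (an illustrative tolerance).
    """
    rates = {test: sum(f) / len(f) for test, f in flags.items()}
    final = "valid" if all(r <= alpha for r in rates.values()) else "invalid"
    return rates, final
```

    The per-test `rates` play the role of the quality summary, while the raw `flags` correspond to the observation-level quality report.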

  7. Contextual factors in maternal and newborn health evaluation: a protocol applied in Nigeria, India and Ethiopia.

    PubMed

    Sabot, Kate; Marchant, Tanya; Spicer, Neil; Berhanu, Della; Gautham, Meenakshi; Umar, Nasir; Schellenberg, Joanna

    2018-01-01

Understanding the context of a health programme is important in interpreting evaluation findings and in considering their external validity for other settings. Public health researchers can be imprecise and inconsistent in their usage of the word "context" and its application to their work. This paper presents an approach to defining context, to capturing relevant contextual information and to using such information to help interpret findings, from the perspective of a research group evaluating the effect of diverse innovations on coverage of evidence-based, life-saving interventions for maternal and newborn health in Ethiopia, Nigeria, and India. We define "context" as the background environment or setting of any programme, and "contextual factors" as those elements of context that could affect implementation of a programme. Through a structured, consultative process, contextual factors were identified while trying to strike a balance between comprehensiveness and feasibility. Thematic areas included demographics and socio-economics, epidemiological profile, health systems and service uptake, infrastructure, education, environment, politics, policy and governance. We outline an approach for capturing and using contextual factors while maximizing use of existing data. Methods include desk reviews, secondary data extraction and key informant interviews. Outputs include databases of contextual factors and summaries of existing maternal and newborn health policies and their implementation. Use of contextual data will be qualitative in nature and may assist in interpreting findings in both quantitative and qualitative aspects of programme evaluation. Applying this approach was more resource intensive than expected, in part because routinely available information was not consistently available across settings and more primary data collection was required than anticipated. Contextual data were used only minimally, partly because few evaluation results needed further explanation, but also because contextual data were not available for the precise units of analysis or time periods of interest. We would advise others to consider integrating contextual factors within other data collection activities, and to conduct regular reviews of maternal and newborn health policies. This approach and the lessons from its application could help inform the development of guidelines for the collection and use of contextual factors in public health evaluation.

  8. The Application of Non-Discrimination Law and Regulations To Collective Bargaining in Higher Education. Special Report No. 23.

    ERIC Educational Resources Information Center

    Academic Collective Bargaining Information Service, Washington, DC.

    This document explores some of the interrelationships between the collective bargaining process and equal employment issues. The National Labor Relations Act, the federal collective bargaining statute, is the focal point of the labor law discussion because it has had significant impact on the drafting and interpretation of state labor legislation…

  9. GC-MS analyses and chemometric processing to discriminate the local and long-distance sources of PAHs associated to atmospheric PM2.5.

    PubMed

    Masiol, Mauro; Centanni, Elena; Squizzato, Stefania; Hofer, Angelika; Pecorari, Eliana; Rampazzo, Giancarlo; Pavoni, Bruno

    2012-09-01

This study presents a procedure to differentiate the local and remote sources of particulate-bound polycyclic aromatic hydrocarbons (PAHs). Data were collected during an extended PM2.5 sampling campaign (2009-2010) carried out for 1 year in Venice-Mestre, Italy, at three stations with different emissive scenarios: urban, industrial, and semirural background. Diagnostic ratios and factor analysis were initially applied to point out the most probable sources. In a second step, the areal distribution of the identified sources was studied by applying discriminant analysis to the factor scores. Third, samples collected on days with similar atmospheric circulation patterns were grouped using a cluster analysis on wind data. Local contributions to PM2.5 and PAHs were then assessed by interpreting the cluster results together with the chemical data. Results showed that significantly lower levels of PM2.5 and PAHs occurred when faster winds changed air masses, whereas in the presence of scarce ventilation, locally emitted pollutants were trapped and concentrations increased. In this way, an estimate of the pollutant loads due to local sources can be derived from data collected on days with similar wind patterns. Long-range contributions were detected by a cluster analysis on the air mass back-trajectories. Results revealed that PM2.5 concentrations were relatively high when air masses had passed over the Po Valley. However, external sources did not significantly contribute to the PAH load. The proposed procedure can be applied to other environments with minor modifications, and the obtained information can be useful in designing local and national air pollution control strategies.
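The three-step chemometric procedure (factor analysis for source identification, discriminant analysis on the factor scores, clustering on wind data) can be sketched with scikit-learn on synthetic data. The array shapes, component counts, and cluster count below are illustrative assumptions, not the study's actual settings:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic PAH concentration matrix: 90 samples x 8 compounds,
# 30 samples per station (urban, industrial, semirural).
X = rng.lognormal(mean=0.0, sigma=1.0, size=(90, 8))
station = np.repeat([0, 1, 2], 30)

# Step 1: factor analysis on log-concentrations to extract latent sources.
fa = FactorAnalysis(n_components=3, random_state=0)
scores = fa.fit_transform(np.log(X))

# Step 2: discriminant analysis on the factor scores, testing how well
# the identified sources separate the three stations (areal distribution).
lda = LinearDiscriminantAnalysis().fit(scores, station)

# Step 3: cluster sampling days by wind conditions, so local contributions
# can be assessed within groups of similar circulation patterns.
wind = rng.normal(size=(90, 2))  # (u, v) wind components per sample
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(wind)
print(lda.score(scores, station), np.bincount(clusters))
```

On real data, cluster-wise summaries of the chemical concentrations would then distinguish calm (locally dominated) from ventilated (background) conditions.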

  10. Using Special Libraries to Interface with Developing Country Clientele.

    ERIC Educational Resources Information Center

    Schenck-Hamlin, Donna; George, Paulette Foss

    1986-01-01

    Describes two special collections focusing on postharvest systems of handling, transportation, storage, and marketing of food and feed grain. Highlights include information needs of developing countries (e.g., Egypt, Honduras, Pakistan), and information center activities (communication and marketing, collection building, interpreting client needs,…

  11. A proposed streamflow-data program for Wisconsin

    USGS Publications Warehouse

    Campbell, Roy E.; Dreher, Frederick C.

    1970-01-01

    The historical data acquired and the new data to be collected form the basis for analytical and interpretive reports. Recommendations were made as to expanding or initiating such studies. Streamflow data collection should be a continuing effort, reoriented as necessary to meet the changing needs.

  12. Population trends of North American shorebirds based on the International Shorebird Survey

    USGS Publications Warehouse

    Howe, M.A.; Geissler, P.H.; Harrington, B.A.

    1989-01-01

Shorebirds (Charadriiformes) are prime candidates for population decline because of their dependence on wetlands, which are being lost at a rapid pace. Thirty-six of the 49 species of shorebirds that breed in North America spend most of the year in Latin America. Because populations of most species breed and winter at remote sites, it may be feasible to monitor their numbers only at migration stopovers. In this study, we used statistical trend analysis methods, developed for the North American Breeding Bird Survey, to analyze data on shorebird populations during south-bound migration in the United States. Survey data were collected by volunteers in the International Shorebird Survey (ISS). Methodological concerns over both the ISS and the trend analysis procedures are discussed in detail, and biological interpretations of the results are suggested.

  13. Data analysis and interpretation of lunar dust exosphere

    NASA Technical Reports Server (NTRS)

    Andrews, George A., Jr.

    1992-01-01

The lunar horizon glow observed by Apollo astronauts and captured on film during the Surveyor mission is believed to result from the scattering of sunlight off lunar fines suspended in a dust layer over the lunar surface. For scale heights on the order of tens of kilometers, it is anticipated that the dust particles will be small enough to admit Rayleigh scattering. Such events would result in scattered light that is polarized to a degree that depends on observation angle, and would produce spectra containing large high-frequency components ('bluing'). Believing these signatures to be observable from ground-based telescopes, observational data have been collected from McDonald Observatory, and the reduction and analysis of these data are the focus of the present report.

  14. A Thematic Analysis of the Impact of MY MASCULINITY HELPS as a Tool for Sexual Violence Prevention.

    PubMed

    Grimmett, Marc A; Conley, Abigail H; Foster, Dominique; Clark, Cory W

    2018-04-01

The purpose of this study is to explore the impact of an educational documentary, MY MASCULINITY HELPS (MMH), as a sexual violence prevention tool. MMH is a short (i.e., 31 min) educational documentary that explores the role of African American men and boys in the prevention of sexual violence. Participants (N = 88) completed an electronic, qualitative questionnaire after viewing the documentary, and the data collected were analyzed and interpreted using thematic analysis. Findings from the study highlighted the power of documentary film to impact knowledge, beliefs, social norms related to masculinity and the role of African American men as allies, empowerment, and commitment to action. Implications of MMH as a prosocial bystander behavior intervention and educational tool are discussed.

  15. Aircrew Discourse: Exploring Strategies of Information and Action Management

    NASA Technical Reports Server (NTRS)

    Irwin, Cheryl M.; Veinott, Elizabeth S.; Shafto, Michael G. (Technical Monitor)

    1995-01-01

    This paper explores methodology issues encountered in the analysis of flightcrew communications in aviation simulation research. Examples are provided by two recent studies which are compared on three issues: level of analysis, data definition, and interpretation of the results. The data discussed were collected in a study comparing two levels of aircraft automation. The first example is an investigation of how pilots' information transfer strategies differed as a function of automation during low and high-workload flight phases. The second study focuses on how crews managed actions in the two aircraft during a ten minute, high-workload flight segment. Results indicated that crews in the two aircraft differed in their strategies of information and action management. The differences are discussed in terms of their operational and research significance.

  16. Self-Perceived Professional Identity of Pharmacy Educators in South Africa

    PubMed Central

    Boschmans, Shirley-Anne; Hoelson, Chris

    2013-01-01

    Objective. To identify, describe, and analyze the self-perceived professional identities of pharmacy educators within the South African context. Methods. Narrative interviews were conducted, recorded, and transcribed. Thematic analysis and interpretation of the transcripts were conducted using qualitative data analysis software. Results. Multiplicities of self-perceived professional identities were identified. All of these were multi-faceted and could be situated on a continuum between pharmacist identity on one end and academic identity on the other. In addition, 6 key determinants were recognized as underpinning the participants’ self-perception of their professional identity. Conclusion. This study afforded a better understanding of who pharmacy educators in South Africa are as professionals. Moreover, the findings contribute to an international, collective understanding of the professional identity of pharmacy educators. PMID:24371334

  17. The effects of leaf size and microroughness on the branch-scale collection efficiency of ultrafine particles

    DOE PAGES

    Huang, C. W.; Lin, M. Y.; Khlystov, A.; ...

    2015-03-02

In this study, wind tunnel experiments were performed to explore how leaf size and leaf microroughness impact the collection efficiency of ultrafine particles (UFP) at the branch scale. A porous media model previously used to characterize UFP deposition onto conifers (Pinus taeda and Juniperus chinensis) was employed to interpret these wind tunnel measurements for four different broadleaf species (Ilex cornuta, Quercus alba, Magnolia grandiflora, and Lonicera fragrantissima) and three wind speeds (0.3–0.9 m s⁻¹). Among the four broadleaf species considered, Ilex cornuta, with its partially folded shape and sharp edges, was the most efficient at collecting UFP, followed by the other three flat-shaped broadleaf species. The findings here suggest that a connection must exist between UFP collection and leaf dimension and roughness. This connection is shown to be primarily due to the thickness of a quasi-laminar boundary layer pinned to the leaf surface, assuming the flow over a leaf resembles that over a flat plate. A scaling analysis that utilizes a three-sublayer depositional model for a flat plate of finite size and roughness embedded within the quasi-laminar boundary layer illustrates these connections. The analysis shows that a longer leaf dimension allows a thicker quasi-laminar boundary layer to develop. A thicker quasi-laminar boundary layer in turn increases the overall resistance to UFP deposition by increasing the diffusional path length, thereby reducing the leaf-scale UFP collection efficiency. Finally, it is suggested that the effects of leaf microroughness are less relevant to the UFP collection efficiency than are the leaf dimensions for the four broadleaf species explored here.
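The flat-plate scaling invoked in the abstract can be illustrated with the standard Blasius laminar boundary-layer estimate, δ ≈ 5x/√Re_x. This is a generic textbook approximation, not the authors' three-sublayer model, and the leaf lengths and wind speed below are illustrative values within the reported range:

```python
import math

def boundary_layer_thickness(x, U, nu=1.5e-5):
    """Blasius laminar flat-plate boundary-layer thickness (m) at
    distance x (m) from the leading edge, for free-stream speed U (m/s);
    nu is the kinematic viscosity of air (m^2/s)."""
    Re_x = U * x / nu
    return 5.0 * x / math.sqrt(Re_x)

# A longer leaf dimension gives a thicker boundary layer at its trailing
# edge, hence a longer diffusional path and lower UFP collection efficiency.
for L in (0.02, 0.05, 0.10):  # hypothetical leaf lengths in metres
    delta = boundary_layer_thickness(L, U=0.6)  # mid-range wind speed
    print(f"L = {L:.2f} m -> delta = {delta * 1000:.2f} mm")
```

Since δ grows as √x for fixed wind speed, doubling the leaf dimension increases the boundary-layer depth by roughly 40%.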

  18. The effects of leaf size and microroughness on the branch-scale collection efficiency of ultrafine particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, C. W.; Lin, M. Y.; Khlystov, A.

In this study, wind tunnel experiments were performed to explore how leaf size and leaf microroughness impact the collection efficiency of ultrafine particles (UFP) at the branch scale. A porous media model previously used to characterize UFP deposition onto conifers (Pinus taeda and Juniperus chinensis) was employed to interpret these wind tunnel measurements for four different broadleaf species (Ilex cornuta, Quercus alba, Magnolia grandiflora, and Lonicera fragrantissima) and three wind speeds (0.3–0.9 m s⁻¹). Among the four broadleaf species considered, Ilex cornuta, with its partially folded shape and sharp edges, was the most efficient at collecting UFP, followed by the other three flat-shaped broadleaf species. The findings here suggest that a connection must exist between UFP collection and leaf dimension and roughness. This connection is shown to be primarily due to the thickness of a quasi-laminar boundary layer pinned to the leaf surface, assuming the flow over a leaf resembles that over a flat plate. A scaling analysis that utilizes a three-sublayer depositional model for a flat plate of finite size and roughness embedded within the quasi-laminar boundary layer illustrates these connections. The analysis shows that a longer leaf dimension allows a thicker quasi-laminar boundary layer to develop. A thicker quasi-laminar boundary layer in turn increases the overall resistance to UFP deposition by increasing the diffusional path length, thereby reducing the leaf-scale UFP collection efficiency. Finally, it is suggested that the effects of leaf microroughness are less relevant to the UFP collection efficiency than are the leaf dimensions for the four broadleaf species explored here.

  19. Institutional narratives in the discourse between oncology social workers and cancer patients' self-help organization.

    PubMed

    Kacen, Lea; Bakshy, Iris

    2005-09-01

    In this study, the authors examine a discourse between members of a cancer patients' self-help organization (CP-SHO) and oncological social workers (OSWs) on support groups for cancer patients. Eight OSWs and 8 CP-SHO volunteers served as the key research population. Using the interpretive-narrative approach to research, the authors apply a variety of data collection methods and a combination of data analysis methods: narrative analysis and discourse analysis. The findings point to the simultaneous existence of two institutional narratives for each organization, one internal and the other external. Discourse between the organizations takes place mainly at the external institutional narrative level, with each body maintaining the mistaken impression that the other's perception of reality is similar to its own (false consensus). In the meantime, the internal narratives that attest to the latent meaning of the discourse govern the interaction and prevent effective dialogue between the respective organizations.

  20. Poisoning: fact or fiction?

    PubMed

    Flanagan, Robert J

    2012-01-01

    Analytical toxicology is a complex discipline. Simply detecting a poison in a biological sample does not necessarily mean that the individual from whom the sample was obtained had been poisoned. An analysis can prove exposure and perhaps give an indication of the magnitude of exposure, but the results have to be placed in proper context. Even if sampling was ante-mortem an analysis does not necessarily prove the effects that the drug or poison had on the victim immediately before or at the time of sampling. Tolerance is one big issue, the mechanism of exposure (how the drug got into the body) is another, and of course with post-mortem work there are always additional considerations such as site of sample collection and the possibility of post-mortem change in analyte concentration. There are also questions of quality and reliability, and whether a particular analysis and the interpretation placed upon the result are appropriate in a particular case.

  1. Signal analysis of accelerometry data using gravity-based modeling

    NASA Astrophysics Data System (ADS)

    Davey, Neil P.; James, Daniel A.; Anderson, Megan E.

    2004-03-01

Triaxial accelerometers have been used to measure human movement parameters in swimming. Interpretation of the data is difficult due to interference sources, including interaction with external bodies. In this investigation the authors developed a model to simulate the physical movement of the lower back. Theoretical accelerometry outputs were derived, thus giving an ideal, or noiseless, dataset. An experimental data collection apparatus was developed by adapting a system to the aquatic environment for investigation of swimming. Model data were compared against recorded data and showed strong correlation. Comparison of recorded and modeled data can be used to identify changes in body movement; this is especially useful when cyclic patterns are present in the activity. The strong correlation between data sets allowed development of signal processing algorithms for swimming stroke analysis, applied first to the pure noiseless data set and then to performance data. Video analysis was also used to validate study results and has shown potential to provide acceptable results.

  2. [A logical framework derived from philosophy of language for analysis of the terms of traditional Chinese medicine and an example for analysis of "kidney essence"].

    PubMed

    Huang, Jian-hua; Li, Wen-wei; Bian, Qin; Shen, Zi-yin

    2011-09-01

    The true meanings of the terms of traditional Chinese medicine (TCM) need to be analyzed on a logical basis. It is not suitable to use a new term to interpret an old term of TCM, or arbitrarily specify the special term of TCM corresponding to some substances of modern medicine. In philosophy of language, language has a logical structure, which reflects the structure of the world, that is to say, language is the picture of the world in a logical sense. Using this idea, the authors collected the ancient literature on "kidney essence", and extracted each necessary condition for "kidney essence". All necessary conditions formed a sufficient condition to define the term "kidney essence". It is expected that this example can show the effectiveness of philosophy of language in analysis of the terms of TCM.

  3. unmarked: An R package for fitting hierarchical models of wildlife occurrence and abundance

    USGS Publications Warehouse

    Fiske, Ian J.; Chandler, Richard B.

    2011-01-01

Ecological research uses data collection techniques that are prone to substantial and unique types of measurement error to address scientific questions about species abundance and distribution. These data collection schemes include a number of survey methods in which unmarked individuals are counted, or determined to be present, at spatially referenced sites. Examples include site occupancy sampling, repeated counts, distance sampling, removal sampling, and double observer sampling. To appropriately analyze these data, hierarchical models have been developed to separately model explanatory variables of both a latent abundance or occurrence process and a conditional detection process. Because these models have a straightforward interpretation paralleling mechanisms under which the data arose, they have recently gained immense popularity. The common hierarchical structure of these models is well-suited for a unified modeling interface. The R package unmarked provides such a unified modeling framework, including tools for data exploration, model fitting, model criticism, post-hoc analysis, and model comparison.
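The hierarchical structure these models share (a latent occurrence process plus a conditional detection process) can be sketched for the simplest case, a single-season occupancy model. This is an illustrative Python re-implementation of the general idea on simulated data, not code from the unmarked package, and the sample sizes and true parameter values are arbitrary:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Simulate occupancy data: latent occurrence z_i ~ Bernoulli(psi);
# detection y_ij | z_i ~ Bernoulli(z_i * p), with J repeat visits per site.
rng = np.random.default_rng(42)
n_sites, J = 200, 4
psi_true, p_true = 0.6, 0.4
z = rng.binomial(1, psi_true, n_sites)
y = rng.binomial(1, p_true, (n_sites, J)) * z[:, None]

def neg_log_lik(theta):
    # theta holds logit(psi) and logit(p); the latent z is summed out.
    psi, p = expit(theta)
    det = y.sum(axis=1)
    lik_occupied = psi * p**det * (1 - p)**(J - det)
    lik_unoccupied = (1 - psi) * (det == 0)  # all-zero histories only
    return -np.log(lik_occupied + lik_unoccupied).sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = expit(fit.x)
print(f"psi_hat = {psi_hat:.2f}, p_hat = {p_hat:.2f}")
```

Covariates enter by replacing the constant psi and p with logit-linear functions of site- and visit-level predictors, which is exactly the interface unmarked unifies across survey types.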

  4. Amended annual report for Brookhaven National Laboratory: Epidemiologic surveillance - 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

Epidemiologic surveillance at DOE facilities consists of regular and systematic collection, analysis, and interpretation of data on absences due to illness and injury in the work force. Its purpose is to provide an early warning system for health problems occurring among employees at participating sites. Data are collected by coordinators at each site and submitted to the Epidemiologic Surveillance Data Center, located at the Oak Ridge Institute for Science and Education, where quality control procedures and analyses are carried out. Rates of absences and rates of diagnoses associated with absences are analyzed by occupation and other relevant variables. They may be compared with the disease experience of different groups within the DOE work force and with populations that do not work for DOE to identify disease patterns or clusters that may be associated with work activities. This report provides a final summary for BNL.

  5. US Geological Survey nutrient preservation experiment : experimental design, statistical analysis, and interpretation of analytical results

    USGS Publications Warehouse

    Patton, Charles J.; Gilroy, Edward J.

    1999-01-01

    Data on which this report is based, including nutrient concentrations in synthetic reference samples determined concurrently with those in real samples, are extensive (greater than 20,000 determinations) and have been published separately. In addition to confirming the well-documented instability of nitrite in acidified samples, this study also demonstrates that when biota are removed from samples at collection sites by 0.45-micrometer membrane filtration, subsequent preservation with sulfuric acid or mercury (II) provides no statistically significant improvement in nutrient concentration stability during storage at 4 degrees Celsius for 30 days. Biocide preservation had no statistically significant effect on the 30-day stability of phosphorus concentrations in whole-water splits from any of the 15 stations, but did stabilize Kjeldahl nitrogen concentrations in whole-water splits from three data-collection stations where ammonium accounted for at least half of the measured Kjeldahl nitrogen.

  6. Radioactive sample effects on EDXRF spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worley, Christopher G

    2008-01-01

Energy dispersive X-ray fluorescence (EDXRF) is a rapid, straightforward method to determine sample elemental composition. A spectrum can be collected in a few minutes or less, and elemental content can be determined easily if there is adequate energy resolution. Radioactive alpha emitters, however, emit X-rays during the alpha decay process that complicate spectral interpretation. This is particularly noticeable when using a portable instrument where the detector is located in close proximity to the instrument analysis window held against the sample. A portable EDXRF instrument was used to collect spectra from specimens containing plutonium-239 (a moderate alpha emitter) and americium-241 (a heavy alpha emitter). These specimens were then analyzed with a wavelength dispersive XRF (WDXRF) instrument to demonstrate how differently sample radiation-induced X-ray emission affects the detectors on these two types of XRF instruments.

  7. Search for the Standard Model Higgs boson in the H to tau+ tau- decay mode in sqrt(s) = 7 TeV pp collisions with ATLAS

    DOE PAGES

    Aad, Georges

    2014-11-12

A search for the neutral Higgs bosons predicted by the Minimal Supersymmetric Standard Model (MSSM) is reported. The analysis is performed on data from proton-proton collisions at a centre-of-mass energy of 8 TeV collected with the ATLAS detector at the Large Hadron Collider. The samples used for this search were collected in 2012 and correspond to integrated luminosities in the range 19.5–20.3 fb⁻¹. The MSSM Higgs bosons are searched for in the ττ final state. No significant excess over the expected background is observed, and exclusion limits are derived for the production cross section times branching fraction of a scalar particle as a function of its mass. The results are also interpreted in the MSSM parameter space for various benchmark scenarios.

  8. Annual report for Hanford Site: Epidemiologic surveillance - 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-01-01

Epidemiologic surveillance at U.S. Department of Energy (DOE) facilities consists of regular and systematic collection, analysis, and interpretation of data on absences due to illness and injury in the work force. Its purpose is to provide an early warning system for health problems occurring among employees at participating sites. Data are collected by coordinators at each site and submitted to the Epidemiologic Surveillance Data Center, located at the Oak Ridge Institute for Science and Education, where quality control procedures and analyses are carried out. Rates of absences and rates of diagnoses associated with absences are analyzed by occupation and other relevant variables. They may be compared with the disease experience of different groups within the DOE work force and with populations that do not work for DOE to identify disease patterns or clusters that may be associated with work activities. This report provides the final summary for the Hanford Reservation.

  9. Epidemiologic surveillance. Annual report for EG&G Rocky Flats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-12-31

Epidemiologic surveillance at U.S. Department of Energy (DOE) facilities consists of regular and systematic collection, analysis, and interpretation of data on absences resulting from illness and injury in the work force. Its purpose is to provide an early warning system for health problems occurring among employees at participating sites. Data are collected by coordinators at each site and submitted to the Epidemiologic Surveillance Data Center, located at the Oak Ridge Institute for Science and Education, where quality control procedures and analyses are carried out. Rates of absences and rates of diagnoses associated with absences are analyzed by occupation and other relevant variables. They may be compared with the disease experience of different groups within the DOE work force and with populations that do not work for DOE to identify disease patterns or clusters that may be associated with work activities. This report presents the 1994 morbidity data for the Rocky Flats plant.

  10. Identification, Collection, and Preservation of Veterinary Forensic Evidence: On Scene and During the Postmortem Examination.

    PubMed

    Touroo, R; Fitch, A

    2016-09-01

    Although it is the obligation of the veterinary forensic pathologist to be competent in identifying, collecting, and preserving evidence from the body, it is also necessary for them to understand the relevance of conditions on the crime scene. The body is just one piece of the puzzle that needs to be considered when determining the cause of death. The information required for a complete postmortem analysis should also include details of the animal's environment and items of evidence present on the crime scene. These factors will assist the veterinary forensic pathologist in the interpretation of necropsy findings. Therefore, the veterinary forensic pathologist needs to have a basic understanding of how the crime scene is processed, as well as the role of the forensic veterinarian on scene. In addition, the veterinary forensic pathologist must remain unbiased, necessitating an understanding of evidence maintenance and authentication. © The Author(s) 2016.

  11. Teaching Real Data Interpretation with Models (TRIM): Analysis of Student Dialogue in a Large-Enrollment Cell and Developmental Biology Course

    PubMed Central

    Zagallo, Patricia; Meddleton, Shanice; Bolger, Molly S.

    2016-01-01

    We present our design for a cell biology course to integrate content with scientific practices, specifically data interpretation and model-based reasoning. A 2-yr research project within this course allowed us to understand how students interpret authentic biological data in this setting. Through analysis of written work, we measured the extent to which students’ data interpretations were valid and/or generative. By analyzing small-group audio recordings during in-class activities, we demonstrated how students used instructor-provided models to build and refine data interpretations. Often, students used models to broaden the scope of data interpretations, tying conclusions to a biological significance. Coding analysis revealed several strategies and challenges that were common among students in this collaborative setting. Spontaneous argumentation was present in 82% of transcripts, suggesting that data interpretation using models may be a way to elicit this important disciplinary practice. Argumentation dialogue included frequent co-construction of claims backed by evidence from data. Other common strategies included collaborative decoding of data representations and noticing data patterns before making interpretive claims. Focusing on irrelevant data patterns was the most common challenge. Our findings provide evidence to support the feasibility of supporting students’ data-interpretation skills within a large lecture course. PMID:27193288

  12. Perception of Health Problems Among Competitive Runners

    PubMed Central

    Jelvegård, Sara; Timpka, Toomas; Bargoria, Victor; Gauffin, Håkan; Jacobsson, Jenny

    2016-01-01

Background: Approximately 2 of every 3 competitive runners sustain at least 1 health problem each season. Most of these problems are nontraumatic injuries with gradual onset. The main known risk indicator for sustaining a new running-related injury episode is a history of a previous injury, suggesting that behavioral habits are part of the causal mechanisms. Purpose: Identification of elements associated with purposeful interpretations of body perceptions and balanced behavioral responses may supply vital information for prevention of health problems in runners. This study set out to explore competitive runners’ cognitive appraisals of perceived symptoms of injury and illness and how these appraisals are transformed into behavior. Study Design: Cross-sectional study; Level of evidence, 3. Methods: The study population consisted of Swedish middle- and long-distance runners from the national top-15 list. Qualitative research methods were used to categorize interview data and perform a thematic analysis. The categories resulting from the analysis were used to construct an explanatory model. Results: Saturation of the thematic classification required that data from 8 male and 6 female runners (age range, 20-36 years) were collected. Symptoms interpreted to be caused by illness or by injury with a sudden onset were found to lead to immediate action and changes to training and competition programs (activity pacing). In contrast, perceptions interpreted to be due to injuries with gradual onset led to varied behavioral reactions. These behavioral responses were planned with regard to short-term consequences and were characterized by indifference to, and neglect of, long-term implications, consistent with an overactivity behavioral pattern. The latter pattern was consistent with a psychological adaptation to stimuli that are presented progressively to the athlete. Conclusion: Competitive runners appraise whether a health problem requires immediate withdrawal from training based on whether the problem is interpreted as an illness and/or has a sudden onset. The ensuing behaviors follow 2 distinct patterns that can be termed “activity pacing” and “overactivity.” PMID:28210643

  13. US forests are showing increased rates of decline in response to a changing climate

    Treesearch

    Warren B. Cohen; Zhiqiang Yang; David M. Bell; Stephen V. Stehman

    2015-01-01

    How vulnerable are US forests to a changing climate? We answer this question using Landsat time series data and a unique interpretation approach, TimeSync, a plot-based Landsat visualization and data collection tool. Original analyses were based on a stratified two-stage cluster sample design that included interpretation of 3858 forested plots. From these data, we...

  14. Preventing School Shootings: A Public Health Approach to Gun Violence

    DTIC Science & Technology

    2013-03-01

    Neither did the 17th Century English jurist and judge, William Blackstone, nor philosopher and physician, John Locke, known as the Father of Classical Liberalism. The writings of Blackstone and Locke, both of whom wrote extensively about constitutional traditions and natural rights, respectively, provide considerable evidence for the collective interpretation as opposed to the individual interpretation of the Second Amendment.

  15. [Artistic creativity in the light of Jungian analytical psychology].

    PubMed

    Trixler, Mátyás; Gáti, Agnes; Tényi, Tamás

    2010-01-01

    C.G. Jung's analytical psychology points at important issues in the psychological understanding of creativity. The theories of the Collective Unconscious and the Archetypes contributed to important discoveries in the interpretation of artistic creativity. Jung was concerned to show the relevance of Analytical Psychology to the understanding of European Modernism. Our paper deals with a short Jungian interpretation of Csontvary's art, too.

  16. Communism and the meaning of social memory: towards a critical-interpretive approach.

    PubMed

    Tileagă, Cristian

    2012-12-01

    Using a case study of representations of communism in Romania, the paper offers a sketch of a critical-interpretive approach for exploring and engaging with the social memory of communism. When one considers the various contemporary appraisals of, responses to, and positions towards the communist period, one identifies, and is obliged to deal with, a series of personal and collective moral/political quandaries. In their attempt to bring about historical justice, political elites create a world that conforms more to their needs and desires than to the diversity of meanings of communism, experiences and dilemmas of lay people. This paper argues that one needs to study formal aspects of social memory as well as "lived", often conflicting, attitudinal and mnemonic stances and interpretive frameworks. One needs to strive to find the meaning of the social memory of communism in the sometimes contradictory, paradoxical attitudes and meanings that members of society communicate, endorse and debate. Many of the ethical quandaries and dilemmas of collective memory and recent history can be better understood by describing the discursive and sociocultural processes of meaning-making and meaning-interpretation carried out by members of a polity.

  17. Insights into crustal structure of the Eastern North American Margin from community multichannel seismic and potential field data

    NASA Astrophysics Data System (ADS)

    Davis, J. K.; Becel, A.; Shillington, D. J.; Buck, W. R.

    2017-12-01

    In the fall of 2014, the R/V Marcus Langseth collected gravity, magnetic, and reflection seismic data as part of the Eastern North American Margin Community Seismic Experiment. The dataset covers a 500 km wide section of the Mid-Atlantic passive margin offshore North Carolina, which formed after the Mesozoic breakup of the supercontinent Pangaea. Using these seismic and potential field data, we present observations and interpretations along two cross margin and one along-margin profiles. Analyses and interpretations are conducted using pre-stack depth migrated reflection seismic profiles in conjunction with forward modeling of shipboard gravity and magnetic anomalies. Preliminary interpretations of the data reveal variations in basement character and structure across the entire transition between continental and oceanic domains. These interpretations help provide insight into the origin and nature of the prominent East Coast and Blake Spur magnetic anomalies, as well as the Inner Magnetic Quiet Zone which occupies the domain between the anomalies. Collectively, these observations can aid in deciphering the rift-to-drift transition during the breakup of North America and West Africa and formation of the Central Atlantic.

  18. Assessing the Validity of Discourse Analysis: Transdisciplinary Convergence

    ERIC Educational Resources Information Center

    Jaipal-Jamani, Kamini

    2014-01-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to…

  19. 48 CFR 952.237-70 - Collective bargaining agreements-protective services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... collective bargaining agreements applicable to the work force under this contract, the Contractor shall use its best efforts to ensure such agreements contain provisions designed to assure continuity of... grievances and disputes involving the interpretation or application of the agreement will be settled without...

  20. An Ethnographic Diary Study

    ERIC Educational Resources Information Center

    Hall, Graham

    2008-01-01

    This article examines a small-scale ethnographic survey of a single classroom. Drawing on the collected data, the discussion focuses on some of the problems encountered whilst collecting and interpreting data through self-report diaries. Amongst the issues considered are the perceptions of teachers and learners and their ability to articulate…

  1. Applying Data Mining Principles to Library Data Collection.

    ERIC Educational Resources Information Center

    Guenther, Kim

    2000-01-01

    Explains how libraries can use data mining techniques for more effective data collection. Highlights include three phases: data selection and acquisition; data preparation and processing, including a discussion of the use of XML (extensible markup language); and data interpretation and integration, including database management systems. (LRW)

  2. Individual and collective bodies: using measures of variance and association in contextual epidemiology.

    PubMed

    Merlo, J; Ohlsson, H; Lynch, K F; Chaix, B; Subramanian, S V

    2009-12-01

    Social epidemiology investigates both individuals and their collectives. Although the limits that define the individual bodies are very apparent, the collective body's geographical or cultural limits (eg "neighbourhood") are more difficult to discern. Also, epidemiologists normally investigate causation as changes in group means. However, many variables of interest in epidemiology may cause a change in the variance of the distribution of the dependent variable. In spite of that, variance is normally considered a measure of uncertainty or a nuisance rather than a source of substantive information. This reasoning is also true in many multilevel investigations, whereas understanding the distribution of variance across levels should be fundamental. This means-centric reductionism is mostly concerned with risk factors and creates a paradoxical situation, as social medicine is not only interested in increasing the (mean) health of the population, but also in understanding and decreasing inappropriate health and health care inequalities (variance). This paper is a critical essay and literature review. The present study promotes (a) the application of measures of variance and clustering to evaluate the boundaries one uses in defining collective levels of analysis (eg neighbourhoods), (b) the combined use of measures of variance and means-centric measures of association, and (c) the investigation of causes of health variation (variance-altering causation). Both measures of variance and means-centric measures of association need to be included when performing contextual analyses. The variance approach, a new aspect of contextual analysis that cannot be interpreted in means-centric terms, allows perspectives to be expanded.
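    The variance-partition idea the abstract advocates can be illustrated with a toy calculation: how much of the total variation in an outcome sits between collective units (e.g., neighbourhoods) versus within them. This is a minimal sketch with invented numbers, not an analysis from the paper.

```python
# Sketch: partitioning outcome variance across levels, in the spirit of
# the abstract. All values are illustrative, not from the study.
from statistics import mean, pvariance

# Hypothetical health scores for individuals nested in neighbourhoods.
neighbourhoods = {
    "A": [6.1, 5.8, 6.3, 6.0],
    "B": [4.2, 4.5, 4.1, 4.4],
    "C": [5.0, 5.2, 4.9, 5.1],
}

group_means = {g: mean(v) for g, v in neighbourhoods.items()}
grand = mean(x for v in neighbourhoods.values() for x in v)

# Between-group variance: spread of group means around the grand mean.
between = mean((m - grand) ** 2 for m in group_means.values())
# Within-group variance: average of per-group variances.
within = mean(pvariance(v) for v in neighbourhoods.values())

# Variance partition coefficient (akin to an intraclass correlation):
# the share of total variance attributable to the neighbourhood level.
vpc = between / (between + within)
print(f"between={between:.3f} within={within:.3f} VPC={vpc:.2f}")
```

A VPC near 1 would suggest the chosen boundaries capture a meaningful collective level; a VPC near 0 would suggest they do not.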

  3. Tectonic Tremor and the Collective Behavior of Low-Frequency Earthquakes

    NASA Astrophysics Data System (ADS)

    Frank, W.; Shapiro, N.; Husker, A. L.; Kostoglodov, V.; Campillo, M.; Gusev, A. A.

    2015-12-01

    Tectonic tremor, a long duration, emergent seismic signal observed along the deep roots of plate interfaces, is thought to be the superposition of repetitive shear events called low-frequency earthquakes (LFE) [e.g. Shelly et al., Nature, 2007]. We use a catalog of more than 1.8 million LFEs regrouped into more than 1000 families observed over 2 years in the Guerrero subduction zone in Mexico, considering each family as an individual repetitive source or asperity. We develop a statistical analysis to determine whether the subcatalogs corresponding to different sources represent random Poisson processes or if they exhibit scale-invariant clustering in time, which we interpret as a manifestation of collective behavior. For each individual LFE source, we compare their level of collective behavior during two time periods: during the six-month-long 2006 Mw 7.5 slow-slip event and during a calm period with no observed slow slip. We find that the collective behavior of LFEs depends on distance from the trench and increases when the subduction interface is slowly slipping. Our results suggest that the occurrence of strong episodes of tectonic tremors cannot be simply explained by increased rates of low frequency earthquakes at every individual LFE source but correspond to an enhanced collective behavior of the ensemble of LFE asperities.
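    The statistical question in this abstract, whether each LFE source's event times look Poissonian or temporally clustered, can be illustrated with a toy test. The coefficient of variation of inter-event times used below is a common diagnostic but is our assumption; it is not necessarily the authors' statistic.

```python
# Sketch: distinguishing a Poisson event catalog from a clustered one
# via the coefficient of variation (CV) of inter-event times.
# CV ~ 1 for a Poisson process; CV >> 1 indicates temporal clustering.
import random
from statistics import mean, pstdev

random.seed(42)

def cv_of_interevent_times(times):
    """Coefficient of variation of the gaps between sorted event times."""
    dts = [b - a for a, b in zip(times, times[1:])]
    return pstdev(dts) / mean(dts)

# Synthetic Poisson catalog: exponential inter-event times.
t, poisson_times = 0.0, []
for _ in range(5000):
    t += random.expovariate(1.0)
    poisson_times.append(t)

# Synthetic clustered catalog: bursts of events around random centers.
centers = [random.uniform(0, 5000) for _ in range(50)]
clustered_times = sorted(
    c + random.expovariate(5.0) for c in centers for _ in range(100)
)

print("Poisson CV:  ", round(cv_of_interevent_times(poisson_times), 2))
print("Clustered CV:", round(cv_of_interevent_times(clustered_times), 2))
```

Applied per LFE family and per time period, a statistic of this kind could quantify the change in collective behavior during slow slip that the abstract describes.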

  4. An Interpretative Phenomenological Analysis of the Common Core Standards Program in the State of South Dakota

    ERIC Educational Resources Information Center

    Alase, Abayomi

    2017-01-01

    This interpretative phenomenological analysis (IPA) study investigated and interpreted the Common Core State Standards program (the phenomenon) that has been the dominating topic of discussions amongst educators all across the country since the inauguration of the program in 2014/2015 school session. Common Core State Standards (CCSS) was a…

  5. What Is SEER?

    Cancer.gov

    An infographic describing the functions of NCI’s Surveillance, Epidemiology, and End Results (SEER) program: collecting, analyzing, interpreting, and disseminating reliable population-based statistics.

  6. HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauerdick, Lothar

    At the heart of experimental high energy physics (HEP) is the development of facilities and instrumentation that provide sensitivity to new phenomena. Our understanding of nature at its most fundamental level is advanced through the analysis and interpretation of data from sophisticated detectors in HEP experiments. The goal of data analysis systems is to realize the maximum possible scientific potential of the data within the constraints of computing and human resources in the least time. To achieve this goal, future analysis systems should empower physicists to access the data with a high level of interactivity, reproducibility and throughput capability. As part of the HEP Software Foundation Community White Paper process, a working group on Data Analysis and Interpretation was formed to assess the challenges and opportunities in HEP data analysis and develop a roadmap for activities in this area over the next decade. In this report, the key findings and recommendations of the Data Analysis and Interpretation Working Group are presented.

  7. Interim results of quality-control sampling of surface water for the Upper Colorado River National Water-Quality Assessment Study Unit, water years 1995-96

    USGS Publications Warehouse

    Spahr, N.E.; Boulger, R.W.

    1997-01-01

    Quality-control samples provide part of the information needed to estimate the bias and variability that result from sample collection, processing, and analysis. Quality-control samples of surface water collected for the Upper Colorado River National Water-Quality Assessment study unit for water years 1995–96 are presented and analyzed in this report. The types of quality-control samples collected include pre-processing split replicates, concurrent replicates, sequential replicates, post-processing split replicates, and field blanks. Analysis of the pre-processing split replicates, concurrent replicates, sequential replicates, and post-processing split replicates is based on differences between analytical results of the environmental samples and analytical results of the quality-control samples. Results of these comparisons indicate that variability introduced by sample collection, processing, and handling is low and will not affect interpretation of the environmental data. The differences for most water-quality constituents are on the order of plus or minus 1 or 2 lowest rounding units. A lowest rounding unit is equivalent to the magnitude of the least significant figure reported for analytical results. The use of lowest rounding units avoids some of the difficulty in comparing differences between pairs of samples when concentrations span orders of magnitude and provides a measure of the practical significance of the effect of variability. Analysis of field-blank quality-control samples indicates that with the exception of chloride and silica, no systematic contamination of samples is apparent. Chloride contamination probably was the result of incomplete rinsing of the dilute cleaning solution from the outlet ports of the decaport sample splitter. Silica contamination seems to have been introduced by the blank water. Sampling and processing procedures for water year 1997 have been modified as a result of these analyses.
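    The "lowest rounding unit" comparison described above can be sketched as a small helper: express the difference between an environmental result and its quality-control replicate in multiples of the least significant reported figure. The values and reporting precisions below are invented for illustration.

```python
# Sketch: replicate-pair differences in "lowest rounding units" (LRUs),
# i.e. multiples of the least significant figure of the reported values.
from decimal import Decimal

def difference_in_lrus(env_value: str, qc_value: str) -> int:
    """Difference between an environmental and a QC replicate result,
    expressed in units of the least significant reported figure."""
    env, qc = Decimal(env_value), Decimal(qc_value)
    # One LRU = 10 ** (smallest reported decimal exponent of the pair).
    lru = Decimal(1).scaleb(min(env.as_tuple().exponent,
                                qc.as_tuple().exponent))
    return int((env - qc) / lru)

print(difference_in_lrus("12.4", "12.3"))    # 1 LRU
print(difference_in_lrus("0.052", "0.050"))  # 2 LRUs
```

Keeping results as strings preserves the reported precision, which is what defines the LRU; parsing to float first would lose that information.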

  8. Structural disorder in the decagonal Al-Co-Ni. I. Patterson analysis of diffuse x-ray scattering data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kobas, Miroslav; Weber, Thomas; Steurer, Walter

    The three-dimensional (3D) difference Patterson (autocorrelation) function of a disordered quasicrystal (Edagawa phase) has been analyzed. 3D diffuse x-ray diffraction data were collected in situ at 300, 1070, and 1120 K. A method, the punch-and-fill technique, has been developed for separating diffuse scattering and Bragg reflections. Its potential and limits are discussed in detail. The different Patterson maps are interpreted in terms of intercluster correlations as a function of temperature. Both at high and low temperatures, the clusters decorate the vertices of the same quasiperiodic covering. At low temperatures, for the disordered part of the structure, short-range intercluster correlations are present, whereas at higher temperatures, medium-range intercluster correlations are formed. This indicates disorder mainly inside clusters at low temperatures, whereas at higher temperatures disorder takes place inside larger superclusters. Qualitatively, the Patterson maps may be interpreted by intercluster correlations mainly inside pentagonal superclusters below 1120 K, and inside the larger decagonal superclusters at 1120 K. The results of our diffraction study are published in two parts. Part I focuses on the 3D Patterson analysis based on experimental data, Part II reports modeling of structural disorder in decagonal Al-Co-Ni.

  9. Interpreting Meta-Analyses of Genome-Wide Association Studies

    PubMed Central

    Han, Buhm; Eskin, Eleazar

    2012-01-01

    Meta-analysis is an increasingly popular tool for combining multiple genome-wide association studies in a single analysis to identify associations with small effect sizes. The effect sizes between studies in a meta-analysis may differ and these differences, or heterogeneity, can be caused by many factors. If heterogeneity is observed in the results of a meta-analysis, interpreting the cause of heterogeneity is important because the correct interpretation can lead to a better understanding of the disease and a more effective design of a replication study. However, interpreting heterogeneous results is difficult. The standard approach of examining the association p-values of the studies does not effectively predict if the effect exists in each study. In this paper, we propose a framework facilitating the interpretation of the results of a meta-analysis. Our framework is based on a new statistic representing the posterior probability that the effect exists in each study, which is estimated utilizing cross-study information. Simulations and application to the real data show that our framework can effectively segregate the studies predicted to have an effect, the studies predicted to not have an effect, and the ambiguous studies that are underpowered. In addition to helping interpretation, the new framework also allows us to develop a new association testing procedure taking into account the existence of effect. PMID:22396665
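    The heterogeneity problem the paper addresses can be made concrete with a standard inverse-variance fixed-effects meta-analysis and the conventional Cochran's Q / I² check. This is textbook background, not the authors' posterior-probability statistic; the effect sizes and standard errors are invented.

```python
# Sketch: fixed-effects meta-analysis with Cochran's Q and I^2 as the
# conventional heterogeneity diagnostic. Inputs are invented.
from math import sqrt

betas = [0.30, 0.28, 0.35, 0.02]  # per-study effect sizes (one outlier)
ses   = [0.05, 0.06, 0.07, 0.05]  # per-study standard errors

# Inverse-variance weights and the pooled fixed-effects estimate.
weights = [1 / se**2 for se in ses]
beta_fe = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
se_fe = sqrt(1 / sum(weights))

# Cochran's Q: weighted squared deviations from the pooled estimate.
q = sum(w * (b - beta_fe) ** 2 for w, b in zip(weights, betas))
df = len(betas) - 1
# I^2: share of variability due to heterogeneity rather than chance.
i2 = max(0.0, (q - df) / q)

print(f"pooled beta={beta_fe:.3f} (SE {se_fe:.3f}), Q={q:.1f}, I^2={i2:.0%}")
```

A large Q relative to its degrees of freedom flags heterogeneity, but, as the abstract notes, it does not say which studies carry the effect; that is the gap the proposed per-study posterior probability is meant to fill.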

  10. Clinical methods for the assessment of the effects of environmental stress on fish health

    USGS Publications Warehouse

    Wedemeyer, Gary A.; Yasutake, William T.

    1977-01-01

    Clinical methods are presented for biological monitoring of hatchery and native fish populations to assess the effects of environmental stress on fish health. The choice of methods is based on the experience of the authors and the judgment of colleagues at fishery laboratories of the U.S. Fish and Wildlife Service. Detailed analysis methods, together with guidelines for sample collection and for the interpretation of results, are given for tests on blood (cell counts, chloride, cholesterol, clotting time, cortisol, glucose, hematocrit, hemoglobin, lactic acid, methemoglobin, osmolality, and total protein); water (ammonia and nitrite content); and liver and muscle (glycogen content).

  11. Critical social theory as a grounded process.

    PubMed

    Fleming, V E; Moloney, J A

    1996-09-01

    This article reflects upon the research process that uses critical social science as its basis. Some of the common criticisms of critical social science research are rebutted by following the research trail taken by the authors when undertaking their own projects. The similarities and differences of critical social science and the interpretative methodologies are outlined in the selection of study area, the relationship of the researcher and participants, ethical issues, and data collection and analysis. The writers conclude that critical social science research reports that are correctly carried out should be firmly grounded in each of these stages, thereby providing a foundation for nursing and midwifery practice.

  12. A bird's eye view: the cognitive strategies of experts interpreting seismic profiles

    NASA Astrophysics Data System (ADS)

    Bond, C. E.; Butler, R.

    2012-12-01

    Geoscience is perhaps unique in its reliance on incomplete datasets and building knowledge from their interpretation. This interpretation basis for the science is fundamental at all levels; from creation of a geological map to interpretation of remotely sensed data. To teach and understand better the uncertainties in dealing with incomplete data we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution in their final output that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies focused on large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that techniques and strategies are more important than expert knowledge per se in developing successful interpretations. Experts are successful because of their application of these techniques. In a new set of experiments we have focused on a small number of experts to determine how they use their cognitive and reasoning skills, in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight on their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with associated further interpretation and analysis of the techniques and strategies employed.
This resource will be of use to undergraduate, post-graduate, industry and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and for assessing the uncertainty in these interpretations. Bond, C.E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al., (2007). 'What do you think this is?: "Conceptual uncertainty" In geoscience interpretation'. GSA Today, 17, 4-10.

  13. Women's Careers and Transitions: A Collective Case Study of Leaders Who Took a Career Break to Be Stay-at-Home Mothers

    ERIC Educational Resources Information Center

    Webster, Janet Swanson

    2010-01-01

    This interpretive collective case study sought to fully understand and describe the experience of women leaders who took a career break to be stay-at-home mothers. Five women leaders representing a variety of industries and leadership roles participated in the study. Multiple methods were used to collect data with participant interviews being the…

  14. Seismic reflection response from cross-correlations of ambient vibrations on a non-conventional hydrocarbon reservoir

    NASA Astrophysics Data System (ADS)

    Huerta, F. V.; Granados, I.; Aguirre, J.; Carrera, R. Á.

    2017-12-01

    Nowadays, in the hydrocarbon industry, there is a need to optimize and reduce exploration costs across the different types of reservoirs, motivating the community specialized in the search for and development of alternative geophysical exploration methods. This study shows the reflection response obtained from a shale gas/oil deposit through the method of seismic interferometry of ambient vibrations in combination with Wavelet analysis and conventional seismic reflection techniques (CMP & NMO). The method generates seismic responses from virtual sources through cross-correlation of records of Ambient Seismic Vibrations (ASV) collected at different receivers. The seismic response obtained is interpreted as the response that would be measured at one of the receivers if a virtual source were placed at the other. The ASV records were acquired in northern Mexico through semi-rectangular arrays of multi-component geophones with an instrumental response of 10 Hz. The in-line distance between geophones was 40 m and the cross-line distance 280 m; the sampling interval during data collection was 2 ms and the total duration of the records was 6 hours. The results show the reflection response of two lines in the in-line direction and two in the cross-line direction, for which the continuity of coherent events has been identified and interpreted as reflectors. There is certainty that the identified events correspond to reflections because time-frequency analysis performed with the Wavelet Transform identified the frequency band in which body waves are present. In addition, the CMP and NMO techniques emphasized and corrected the reflection response obtained during the correlation process in the frequency band of interest.
    The processing and analysis of ASV records through the seismic interferometry method have yielded interesting results from the cross-correlation process in combination with Wavelet analysis and conventional seismic reflection techniques. It was therefore possible to recover the seismic response for each analyzed source-receiver pair, giving the reflection response of each analyzed seismic line.
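    The core of the interferometric method described above, turning one receiver into a virtual source by cross-correlating noise records, can be sketched on synthetic data. The record length, noise model, and 25-sample delay below are illustrative assumptions, not parameters from the survey.

```python
# Sketch: ambient-noise cross-correlation recovers the travel-time
# offset between two receivers recording the same noise field.
import random

random.seed(1)
N, delay = 2048, 25
noise = [random.gauss(0.0, 1.0) for _ in range(N + delay)]

rec_a = noise[:N]                # receiver A records the noise field
rec_b = noise[delay:delay + N]   # receiver B sees it `delay` samples later

def cross_correlate(a, b, max_lag):
    """Return (lag, correlation) pairs for lags 0..max_lag of a against b."""
    out = []
    for lag in range(max_lag + 1):
        n = len(a) - lag
        out.append((lag, sum(a[i + lag] * b[i] for i in range(n)) / n))
    return out

cc = cross_correlate(rec_a, rec_b, 60)
best_lag = max(cc, key=lambda p: p[1])[0]
print("recovered inter-receiver delay (samples):", best_lag)
```

The correlation peak sits at the inter-receiver travel time, which is the sense in which the correlation of A and B approximates the response at B to a virtual source at A.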

  15. Introduction: Reengaging with instruments.

    PubMed

    Taub, Liba

    2011-12-01

    Over the past twenty years or so, historians of science have become increasingly sensitized to issues involved in studying and interpreting scientific and medical instruments. The contributors to this Focus section are historians of science who have worked closely with museum objects and collections, specifically instruments used in scientific and medical contexts. Such close engagement by historians of science is somewhat rare, provoking distinctive questions as to how we define and understand instruments, opening up issues regarding the value of broken or incomplete objects, and raising concerns about which scientific and medical artifacts are displayed and interpreted in museums and in what manner. It is hoped that these essays point historians of science in new directions for reengaging with scientific objects and collections.

  16. Theoretical interpretation of the nuclear structure of 88Se within the ACM and the QPM models.

    NASA Astrophysics Data System (ADS)

    Gratchev, I. N.; Thiamova, G.; Alexa, P.; Simpson, G. S.; Ramdhane, M.

    2018-02-01

    The four-parameter algebraic collective model (ACM) Hamiltonian is used to describe the nuclear structure of 88Se. It is shown that the ACM is capable of providing a reasonable description of the excitation energies and relative positions of the ground-state band and γ band. The most probable interpretation of the nuclear structure of 88Se is that of a transitional nucleus. The Quasiparticle-plus-Phonon Model (QPM) was also applied to describe the nuclear motion in 88Se. Preliminary calculations show that the collectivity of the second excited state 2₂⁺ is weak and that this state contains a strong two-quasiparticle component.

  17. Kansas Water Science Center bookmark

    USGS Publications Warehouse

    ,

    2017-03-27

    The U.S. Geological Survey Kansas Water Science Center has collected and interpreted hydrologic information in Kansas since 1895. Data collected include streamflow and gage height, reservoir content, water quality and water quantity, suspended sediment, and groundwater levels. Interpretative hydrologic studies are completed on national, regional, statewide, and local levels and cooperatively funded through more than 40 agency partnerships. The U.S. Geological Survey provides impartial scientific information to describe and understand the health of our ecosystems and environment; minimize loss of life and property from natural disasters; manage water, biological, energy, and mineral resources; and enhance and protect our quality of life. These collected data are in the National Water Information System https://waterdata.usgs.gov/ks/nwis/rt, and all results are documented in reports that also are online at https://ks.water.usgs.gov/. Follow the USGS Kansas Water Science Center on Twitter for the most recent updates and other information: https://twitter.com/USGS_KS.

  18. 36 CFR 1290.1 - Scope of assassination record.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... JFK ASSASSINATION RECORDS GUIDANCE FOR INTERPRETATION AND IMPLEMENTATION OF THE PRESIDENT JOHN F. KENNEDY ASSASSINATION RECORDS COLLECTION ACT OF 1992 (JFK ACT) § 1290.1 Scope of assassination record. (a... limitation: (1) All records as defined in Section 3(2) of the JFK Act; (2) All records collected by or...

  19. 36 CFR 1290.1 - Scope of assassination record.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... JFK ASSASSINATION RECORDS GUIDANCE FOR INTERPRETATION AND IMPLEMENTATION OF THE PRESIDENT JOHN F. KENNEDY ASSASSINATION RECORDS COLLECTION ACT OF 1992 (JFK ACT) § 1290.1 Scope of assassination record. (a... limitation: (1) All records as defined in Section 3(2) of the JFK Act; (2) All records collected by or...

  20. 36 CFR 1290.1 - Scope of assassination record.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... JFK ASSASSINATION RECORDS GUIDANCE FOR INTERPRETATION AND IMPLEMENTATION OF THE PRESIDENT JOHN F. KENNEDY ASSASSINATION RECORDS COLLECTION ACT OF 1992 (JFK ACT) § 1290.1 Scope of assassination record. (a... limitation: (1) All records as defined in Section 3(2) of the JFK Act; (2) All records collected by or...
