Science.gov

Sample records for meaningful assessment method

  1. The Role of Leadership and Culture in Creating Meaningful Assessment: A Mixed Methods Case Study

    ERIC Educational Resources Information Center

    Guetterman, Timothy C.; Mitchell, Nancy

    2016-01-01

    With increased demands for institutional accountability and improved student learning, involvement in assessment has become a fundamental role of higher education faculty (Rhodes, 2010). However, faculty members and administrators often question whether assessment efforts do indeed improve student learning (Hutchings, 2010). This mixed methods…

  2. Assessment and Accountability to Support Meaningful Learning

    ERIC Educational Resources Information Center

    Marion, Scott; Leather, Paul

    2015-01-01

    This paper presents an overview of New Hampshire's efforts to implement a pilot accountability system designed to support deeper learning for students and powerful organizational change for schools and districts. The accountability pilot, referred to as Performance Assessment of Competency Education or PACE, is grounded in a competency-based…

  3. Rising to the Challenge: Meaningful Assessment of Student Learning

    ERIC Educational Resources Information Center

    Association of Public and Land-grant Universities, 2010

    2010-01-01

    "Rising to the Challenge: Meaningful Assessment of Student Learning" was envisioned in response to a 2007 request for proposals from the U.S. Department of Education's Fund for Improvement of Post Secondary Education (FIPSE). FIPSE called for national, consortial contributions to improving the knowledge and abilities to assess student learning for…

  4. Implementing meaningful, educative curricula, and assessments in complex school environments

    PubMed Central

    Ennis, Catherine D.

    2015-01-01

    This commentary uses the lens of curricular implementation to consider issues and opportunities afforded by the papers in this special edition. While it is interesting to envision innovative approaches to physical education, actually implementing changes in the complex institutional school environment is exceptionally challenging. These authors have done an excellent job presenting viable solutions and foregrounding challenges. Yet, without a concerted effort to invite teachers to engage with us in this process, our implementation initiatives may not enhance the meaningful and educative process that these scholars envision for physical education. PMID:25960685

  5. The Meaningful Activity Participation Assessment: A Measure of Engagement in Personally Valued Activities

    ERIC Educational Resources Information Center

    Eakman, Aaron M.; Carlson, Mike E.; Clark, Florence A.

    2010-01-01

    The Meaningful Activity Participation Assessment (MAPA), a recently developed 28-item tool designed to measure the meaningfulness of activity, was tested in a sample of 154 older adults. The MAPA evidenced a sufficient level of internal consistency and test-retest reliability and correlated as theoretically predicted with the Life Satisfaction…

  6. Making Alternate Assessment Score Reports a Meaningful Tool for Parents

    ERIC Educational Resources Information Center

    Blackwell, William Hollis, III

    2012-01-01

    While No Child Left Behind assessment policies require student performance on alternate assessments to be reported to parents, there have been no research studies and limited guidance on how this information is best reported. There are two issues resulting from the lack of research and guidance. First, there is no established standard for what…

  7. Rigorous, Meaningful and Robust: Practical Ways Forward for Assessment

    ERIC Educational Resources Information Center

    Harrison, Simon

    2004-01-01

    How do we know how good our students are at history? For that matter, how precisely do we really know what "good" at history even means? Even harder, how does our assessment of our students' attainment fit in with the National Curriculum Levels for Key Stage 3? Simon Harrison has led a project to help history teachers in Hampshire to add…

  8. Strategies for the Meaningful Evaluation of Multiple-Choice Assessments

    ERIC Educational Resources Information Center

    Chesbro, Robert

    2010-01-01

    Too many multiple-choice tests are administered without an evaluative component. Teachers often return student assessments or Scantron cards--computerized bubble forms--without review, assuming that the printing of the correct answer will suffice. However, a more constructivist approach to following up on multiple-choice tests can make for more…

  9. THE MEANINGFUL ACTIVITY PARTICIPATION ASSESSMENT: A MEASURE OF ENGAGEMENT IN PERSONALLY VALUED ACTIVITIES*

    PubMed Central

    EAKMAN, AARON M.; CARLSON, MIKE E.; CLARK, FLORENCE A.

    2011-01-01

    The Meaningful Activity Participation Assessment (MAPA), a recently developed 28-item tool designed to measure the meaningfulness of activity, was tested in a sample of 154 older adults. The MAPA evidenced a sufficient level of internal consistency and test-retest reliability and correlated as theoretically predicted with the Life Satisfaction Index-Z, the Satisfaction with Life Scale, the Engagement in Meaningful Activities Survey, the Purpose in Life Test, the Center for Epidemiologic Studies Depression Inventory and the Rand SF-36v2 Health Survey subscales. Zero-order correlations consistently demonstrated meaningful relationships between the MAPA and scales of psychosocial well-being and health-related quality of life. Results from multiple regression analyses further substantiated these findings, as greater meaningful activity participation was associated with better psychological well-being and health-related quality of life. The MAPA appears to be a reliable and valid measure of meaningful activity, incorporating both subjective and objective indicators of activity engagement. PMID:20649161
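
    To make the psychometric checks above concrete, here is a minimal sketch of an internal-consistency (Cronbach's alpha) and zero-order correlation calculation. The Python/pandas code, the fabricated item scores, and the column names are illustrative assumptions, not the authors' analysis.

    # Minimal sketch: internal consistency and a zero-order correlation,
    # computed on fabricated MAPA-like item data (not the study's data).
    import numpy as np
    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Internal consistency of a set of item scores."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    rng = np.random.default_rng(0)
    mapa_items = pd.DataFrame(rng.integers(0, 5, size=(154, 28)),
                              columns=[f"mapa_{i + 1}" for i in range(28)])
    wellbeing = mapa_items.sum(axis=1) * 0.5 + rng.normal(0, 5, 154)

    alpha = cronbach_alpha(mapa_items)
    r = np.corrcoef(mapa_items.sum(axis=1), wellbeing)[0, 1]
    print(f"Cronbach's alpha = {alpha:.2f}, zero-order r = {r:.2f}")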

  10. Assessing risk for sexual recidivism: some proposals on the nature of psychologically meaningful risk factors.

    PubMed

    Mann, Ruth E; Hanson, R Karl; Thornton, David

    2010-06-01

    Risk assessment and treatment for sexual offenders should focus on individual characteristics associated with recidivism risk. Although it is possible to conduct risk assessments based purely on empirical correlates, the most useful evaluations also explain the source of the risk. In this review, the authors propose that the basic requirements for a psychologically meaningful risk factor are (a) a plausible rationale that the factor is a cause of sexual offending and (b) strong evidence that it predicts sexual recidivism. Based on the second of these criteria, the authors categorize potential risk factors according to the strength of the evidence for their relationship with offending. The most strongly supported variables should be emphasized in both assessment and treatment of sexual offenders. Further research is required, however, to establish causal connections between these variables and recidivism and to examine the extent to which changes in these factors lead to reductions in recidivism potential. PMID:20363981

  11. Towards a meaningful assessment of marine ecological impacts in life cycle assessment (LCA).

    PubMed

    Woods, John S; Veltman, Karin; Huijbregts, Mark A J; Verones, Francesca; Hertwich, Edgar G

    2016-01-01

    Human demands on marine resources and space are currently unprecedented and concerns are rising over observed declines in marine biodiversity. A quantitative understanding of the impact of industrial activities on the marine environment is thus essential. Life cycle assessment (LCA) is a widely applied method for quantifying the environmental impact of products and processes. LCA was originally developed to assess the impacts of land-based industries on mainly terrestrial and freshwater ecosystems. As such, impact indicators for major drivers of marine biodiversity loss are currently lacking. We review quantitative approaches for cause-effect assessment of seven major drivers of marine biodiversity loss: climate change, ocean acidification, eutrophication-induced hypoxia, seabed damage, overexploitation of biotic resources, invasive species and marine plastic debris. Our review shows that impact indicators can be developed for all identified drivers, albeit at different levels of coverage of cause-effect pathways and variable levels of uncertainty and spatial coverage. Modeling approaches to predict the spatial distribution and intensity of human-driven interventions in the marine environment are relatively well-established and can be employed to develop spatially-explicit LCA fate factors. Modeling approaches to quantify the effects of these interventions on marine biodiversity are less well-developed. We highlight specific research challenges to facilitate a coherent incorporation of marine biodiversity loss in LCA, thereby making LCA a more comprehensive and robust environmental impact assessment tool. Research challenges of particular importance include i) incorporation of the non-linear behavior of global circulation models (GCMs) within an LCA framework and ii) improving spatial differentiation, especially the representation of coastal regions in GCMs and ocean-carbon cycle models. PMID:26826362
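
    To illustrate how such indicators would enter an LCA calculation, the sketch below combines an inventory of marine interventions with spatially-explicit characterization factors built as fate factor × effect factor. All flow names, regions, and numbers are hypothetical placeholders, not values from the review.

    # Minimal sketch: impact score = sum over (flow, region) of
    # inventory amount x fate factor x effect factor.
    inventory = {                                # interventions per functional unit
        ("nitrogen", "coastal_region_A"): 2.0,   # kg N emitted
        ("nitrogen", "open_ocean_B"): 0.5,       # kg N emitted
    }
    fate = {                                     # exposure increase per kg emitted
        ("nitrogen", "coastal_region_A"): 0.8,
        ("nitrogen", "open_ocean_B"): 0.1,
    }
    effect = {                                   # affected fraction per unit exposure
        ("nitrogen", "coastal_region_A"): 0.05,
        ("nitrogen", "open_ocean_B"): 0.02,
    }
    impact = sum(amount * fate[key] * effect[key]
                 for key, amount in inventory.items())
    print(f"Aggregated marine impact score: {impact:.3f}")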

  13. A new method for ecoacoustics? Toward the extraction and evaluation of ecologically-meaningful soundscape components using sparse coding methods.

    PubMed

    Eldridge, Alice; Casey, Michael; Moscoso, Paola; Peck, Mika

    2016-01-01

    Passive acoustic monitoring is emerging as a promising non-invasive proxy for ecological complexity with potential as a tool for remote assessment and monitoring (Sueur & Farina, 2015). Rather than attempting to recognise species-specific calls, either manually or automatically, there is a growing interest in evaluating the global acoustic environment. Positioned within the conceptual framework of ecoacoustics, a growing number of indices have been proposed which aim to capture community-level dynamics (e.g., Pieretti, Farina & Morri, 2011; Farina, 2014; Sueur et al., 2008b) by providing statistical summaries of the frequency or time domain signal. Although promising, the ecological relevance of these indices, and their efficacy as a monitoring tool, are still unclear. In this paper we suggest that by virtue of operating in the time or frequency domain, existing indices are limited in their ability to access key structural information in the spectro-temporal domain. Alternative methods in which time-frequency dynamics are preserved are considered. Sparse-coding and source separation algorithms (specifically, shift-invariant probabilistic latent component analysis in 2D) are proposed as a means to access and summarise time-frequency dynamics which may be more ecologically-meaningful. PMID:27413632
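
    As a rough illustration of extracting spectro-temporal components, the sketch below factorizes a spectrogram of a synthetic two-"call" signal with plain non-negative matrix factorization. NMF is used only as a simpler stand-in for the shift-invariant PLCA (2D) the authors propose, and the synthetic signal stands in for field recordings.

    # Minimal sketch: decompose a spectrogram into a few components.
    import numpy as np
    from scipy.signal import spectrogram
    from sklearn.decomposition import NMF

    fs = 22050
    t = np.arange(0, 5.0, 1 / fs)
    # Synthetic "soundscape": a pulsed 2 kHz tone plus a rising sweep.
    sig = (np.sin(2 * np.pi * 2000 * t) * (np.sin(2 * np.pi * 3 * t) > 0)
           + 0.5 * np.sin(2 * np.pi * (4000 + 800 * t) * t))

    f, frames, S = spectrogram(sig, fs=fs, nperseg=1024)
    model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(S)     # spectral profiles (frequency x component)
    H = model.components_          # activations over time (component x frame)

    for k in range(W.shape[1]):
        peak = f[np.argmax(W[:, k])]
        print(f"component {k}: dominant frequency ~{peak:.0f} Hz, "
              f"total activation {H[k].sum():.1f}")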

  14. A new method for ecoacoustics? Toward the extraction and evaluation of ecologically-meaningful soundscape components using sparse coding methods

    PubMed Central

    Casey, Michael; Moscoso, Paola; Peck, Mika

    2016-01-01

    Passive acoustic monitoring is emerging as a promising non-invasive proxy for ecological complexity with potential as a tool for remote assessment and monitoring (Sueur & Farina, 2015). Rather than attempting to recognise species-specific calls, either manually or automatically, there is a growing interest in evaluating the global acoustic environment. Positioned within the conceptual framework of ecoacoustics, a growing number of indices have been proposed which aim to capture community-level dynamics (e.g., Pieretti, Farina & Morri, 2011; Farina, 2014; Sueur et al., 2008b) by providing statistical summaries of the frequency or time domain signal. Although promising, the ecological relevance of these indices, and their efficacy as a monitoring tool, are still unclear. In this paper we suggest that by virtue of operating in the time or frequency domain, existing indices are limited in their ability to access key structural information in the spectro-temporal domain. Alternative methods in which time-frequency dynamics are preserved are considered. Sparse-coding and source separation algorithms (specifically, shift-invariant probabilistic latent component analysis in 2D) are proposed as a means to access and summarise time-frequency dynamics which may be more ecologically-meaningful. PMID:27413632

  15. Development of an Assessment Tool to Measure Students' Meaningful Learning in the Undergraduate Chemistry Laboratory

    ERIC Educational Resources Information Center

    Galloway, Kelli R.; Bretz, Stacey Lowery

    2015-01-01

    Research on learning in the undergraduate chemistry laboratory necessitates an understanding of students' perspectives of learning. Novak's Theory of Meaningful Learning states that the cognitive (thinking), affective (feeling), and psychomotor (doing) domains must be integrated for meaningful learning to occur. The psychomotor domain is the…

  16. Cancer Bioinformatic Methods to Infer Meaningful Data From Small-Size Cohorts.

    PubMed

    Bennani-Baiti, Nabila; Bennani-Baiti, Idriss M

    2015-01-01

    Whole-genome analyses have uncovered that most cancer-relevant genes cluster into 12 signaling pathways. Knowledge of the signaling pathways and associated gene signatures not only allows us to understand the mechanisms of oncogenesis inherent to specific cancers but also provides us with drug targets, molecular diagnostic and prognosis factors, as well as biomarkers for patient risk stratification and treatment. Publicly available genomic data sets constitute a wealth of gene mining opportunities for hypothesis generation and testing. However, the increasingly recognized genetic and epigenetic inter- and intratumor heterogeneity, combined with the preponderance of small-size cohorts, hamper reliable analysis and discovery. Here, we review two methods that are used to infer meaningful biological events from small-size data sets and discuss some of their applications and limitations.

  17. Novel methods to collect meaningful data from adolescents for the development of health interventions.

    PubMed

    Hieftje, Kimberly; Duncan, Lindsay R; Fiellin, Lynn E

    2014-09-01

    Health interventions are increasingly focused on young adolescents, and as a result, discussions with this population have become a popular method in qualitative research. Traditional methods used to engage adults in discussions do not translate well to this population, who may have difficulty conceptualizing abstract thoughts and opinions and communicating them to others. As part of a larger project to develop and evaluate a video game for risk reduction and HIV prevention in young adolescents, we were seeking information and ideas from the priority audience that would help us create authentic story lines and character development in the video game. To accomplish this authenticity, we conducted in-depth interviews and focus groups with young adolescents aged 10 to 15 years and employed three novel methods: Storytelling Using Graphic Illustration, My Life, and Photo Feedback Project. These methods helped provide a thorough understanding of the adolescents' experiences and perspectives regarding their environment and future aspirations, which we translated into active components of the video game intervention. This article describes the processes we used and the valuable data we generated using these three engaging methods. These three activities are effective tools for eliciting meaningful data from young adolescents for the development of health interventions.

  18. Novel methods to collect meaningful data from adolescents for the development of health interventions.

    PubMed

    Hieftje, Kimberly; Duncan, Lindsay R; Fiellin, Lynn E

    2014-09-01

    Health interventions are increasingly focused on young adolescents, and as a result, discussions with this population have become a popular method in qualitative research. Traditional methods used to engage adults in discussions do not translate well to this population, who may have difficulty conceptualizing abstract thoughts and opinions and communicating them to others. As part of a larger project to develop and evaluate a video game for risk reduction and HIV prevention in young adolescents, we were seeking information and ideas from the priority audience that would help us create authentic story lines and character development in the video game. To accomplish this authenticity, we conducted in-depth interviews and focus groups with young adolescents aged 10 to 15 years and employed three novel methods: Storytelling Using Graphic Illustration, My Life, and Photo Feedback Project. These methods helped provide a thorough understanding of the adolescents' experiences and perspectives regarding their environment and future aspirations, which we translated into active components of the video game intervention. This article describes the processes we used and the valuable data we generated using these three engaging methods. These three activities are effective tools for eliciting meaningful data from young adolescents for the development of health interventions. PMID:24519998

  19. Novel Methods to Collect Meaningful Data From Adolescents for the Development of Health Interventions

    PubMed Central

    Hieftje, Kimberly; Duncan, Lindsay R.; Fiellin, Lynn E.

    2014-01-01

    Health interventions are increasingly focused on young adolescents, and as a result, discussions with this population have become a popular method in qualitative research. Traditional methods used to engage adults in discussions do not translate well to this population, who may have difficulty conceptualizing abstract thoughts and opinions and communicating them to others. As part of a larger project to develop and evaluate a video game for risk reduction and HIV prevention in young adolescents, we were seeking information and ideas from the priority audience that would help us create authentic story lines and character development in the video game. To accomplish this authenticity, we conducted in-depth interviews and focus groups with young adolescents aged 10 to 15 years and employed three novel methods: Storytelling Using Graphic Illustration, My Life, and Photo Feedback Project. These methods helped provide a thorough understanding of the adolescents’ experiences and perspectives regarding their environment and future aspirations, which we translated into active components of the video game intervention. This article describes the processes we used and the valuable data we generated using these three engaging methods. These three activities are effective tools for eliciting meaningful data from young adolescents for the development of health interventions. PMID:24519998

  20. Does kinematics add meaningful information to clinical assessment in post-stroke upper limb rehabilitation? A case report

    PubMed Central

    Bigoni, Matteo; Baudo, Silvia; Cimolin, Veronica; Cau, Nicola; Galli, Manuela; Pianta, Lucia; Tacchini, Elena; Capodaglio, Paolo; Mauro, Alessandro

    2016-01-01

    [Purpose] The aims of this case study were to: (a) quantify the impairment and activity restriction of the upper limb in a hemiparetic patient; (b) quantitatively evaluate rehabilitation program effectiveness; and (c) discuss whether more clinically meaningful information can be gained with the use of kinematic analysis in addition to clinical assessment. The rehabilitation program consisted of the combined use of different traditional physiotherapy techniques, occupational therapy sessions, and the so-called task-oriented approach. [Subject and Methods] The subject was one hemiplegic patient. The patient was assessed at the beginning and after 1 month of daily rehabilitation using the Medical Research Council scale, Nine Hole Peg Test, Motor Evaluation Scale for Upper Extremity in Stroke Patients, and Hand Grip Dynamometer test as well as a kinematic analysis using an optoelectronic system. [Results] After treatment, significant improvements were evident in terms of total movement duration, movement completion velocity, and some smoothness parameters. [Conclusion] Our case report showed that the integration of clinical assessment with kinematic evaluation appears to be useful for quantitatively assessing performance changes.
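
    For readers unfamiliar with the kinematic measures mentioned (duration, completion velocity, smoothness), the sketch below computes them from a synthetic reach trajectory. The minimum-jerk-like trajectory and the normalized-jerk smoothness index are illustrative assumptions, not the study's processing pipeline.

    # Minimal sketch: duration, peak velocity and a smoothness index
    # from a sampled 1-D endpoint trajectory (fabricated data).
    import numpy as np

    fs = 100.0                                   # sampling rate (Hz)
    t = np.arange(0, 1.5, 1 / fs)
    tau = t / t[-1]
    x = 0.30 * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)       # metres
    x += np.random.default_rng(0).normal(0, 0.0005, x.size)   # measurement noise

    v = np.gradient(x, 1 / fs)                   # velocity
    a = np.gradient(v, 1 / fs)                   # acceleration
    j = np.gradient(a, 1 / fs)                   # jerk

    duration = t[-1] - t[0]
    peak_velocity = np.max(np.abs(v))
    amplitude = abs(x[-1] - x[0])
    # Dimensionless normalized jerk: lower values mean smoother movement.
    normalized_jerk = np.sqrt(0.5 * np.trapz(j**2, t) * duration**5 / amplitude**2)

    print(f"duration {duration:.2f} s, peak velocity {peak_velocity:.2f} m/s, "
          f"normalized jerk {normalized_jerk:.0f}")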

  1. Does kinematics add meaningful information to clinical assessment in post-stroke upper limb rehabilitation? A case report.

    PubMed

    Bigoni, Matteo; Baudo, Silvia; Cimolin, Veronica; Cau, Nicola; Galli, Manuela; Pianta, Lucia; Tacchini, Elena; Capodaglio, Paolo; Mauro, Alessandro

    2016-08-01

    [Purpose] The aims of this case study were to: (a) quantify the impairment and activity restriction of the upper limb in a hemiparetic patient; (b) quantitatively evaluate rehabilitation program effectiveness; and (c) discuss whether more clinically meaningful information can be gained with the use of kinematic analysis in addition to clinical assessment. The rehabilitation program consisted of the combined use of different traditional physiotherapy techniques, occupational therapy sessions, and the so-called task-oriented approach. [Subject and Methods] The subject was one hemiplegic patient. The patient was assessed at the beginning and after 1 month of daily rehabilitation using the Medical Research Council scale, Nine Hole Peg Test, Motor Evaluation Scale for Upper Extremity in Stroke Patients, and Hand Grip Dynamometer test as well as a kinematic analysis using an optoelectronic system. [Results] After treatment, significant improvements were evident in terms of total movement duration, movement completion velocity, and some smoothness parameters. [Conclusion] Our case report showed that the integration of clinical assessment with kinematic evaluation appears to be useful for quantitatively assessing performance changes. PMID:27630445

  3. A Rubric for Assessing Teachers' Lesson Activities with Respect to TPACK for Meaningful Learning with ICT

    ERIC Educational Resources Information Center

    Koh, Joyce Hwee Ling

    2013-01-01

    Teachers' technological pedagogical content knowledge (TPACK) for meaningful learning with ICT describes their knowledge for designing ICT lesson activities with respect to five dimensions: active, constructive, authentic, intentional, and cooperative. The ICT lesson activities designed by teachers can be assessed to determine the strengths…

  4. Choosing the appropriate matrix to perform a scientifically meaningful lipemic plasma test in bioanalytical method validation.

    PubMed

    Mayrand-Provencher, Laurence; Furtado, Milton; Mess, Jean-Nicholas; Dumont, Isabelle; Garofolo, Fabio

    2014-01-01

    Laurence Mayrand-Provencher obtained a Master of Science in Chemistry from Université de Montréal. With over 3 years of experience as a scientist in the bioanalysis industry, he is now a scientist in method development at Algorithme Pharma. His experience has led him to conduct robust and effective method development of bioanalytical assays, specifically in the LC-MS/MS field. Many regulatory agencies include in their guidelines the need to investigate the effect of lipemic plasma on the reliability of the data as part of a bioanalytical assay validation. Lipids can cause matrix effect, specificity and recovery issues, which can potentially lead to inaccurate data if left unaccounted for. However, finding the appropriate matrix type to be used to perform a lipemic plasma test is a major challenge, as the differences between those commercially available are not well known. The work reported herein describes the differences in lipid content between normal plasma, synthetic lipemic plasma mixes, and two types of natural lipemic plasma. The results obtained show that natural plasma with high triglyceride content should be used to perform a scientifically meaningful lipemic plasma test.

  5. A Concurrent Mixed Methods Approach to Examining the Quantitative and Qualitative Meaningfulness of Absolute Magnitude Estimation Scales in Survey Research

    ERIC Educational Resources Information Center

    Koskey, Kristin L. K.; Stewart, Victoria C.

    2014-01-01

    This small "n" observational study used a concurrent mixed methods approach to address a void in the literature with regard to the qualitative meaningfulness of the data yielded by absolute magnitude estimation scaling (MES) used to rate subjective stimuli. We investigated whether respondents' scales progressed from less to more and…

  6. The Design and Implementation of a Meaningful Learning-Based Evaluation Method for Ubiquitous Learning

    ERIC Educational Resources Information Center

    Huang, Yueh-Min; Chiu, Po-Sheng; Liu, Tzu-Chien; Chen, Tzung-Shi

    2011-01-01

    If ubiquitous learning (u-learning) is to be effectively developed and feasibly applied to education, it is necessary to evaluate its effectiveness. Yet to achieve a sound evaluation, a particular paradigm must be employed to fit the problem domain. Toward this end, the authors of this study have adopted a meaningful learning paradigm. Meaningful…

  7. Electronic Health Records and Meaningful Use in Local Health Departments: Updates From the 2015 NACCHO Informatics Assessment Survey

    PubMed Central

    Shah, Gulzar H.

    2016-01-01

    Background: Electronic health records (EHRs) are reshaping the scope of operations, practices, and outcomes of population health in the United States. Local health departments (LHDs) need adequate health informatics capacities to handle the quantity and quality of population health data. Purpose: The purpose of this study was to gain an updated view using the most recent data to identify the primary storage of clinical data, the status of data for meaningful use, and the characteristics associated with the implementation of EHRs in LHDs. Methods: Data were drawn from the 2015 Informatics Capacity and Needs Assessment Survey, which used a stratified random sampling design of LHD populations. Oversampling of larger LHDs was conducted and sampling weights were applied. Data were analyzed using descriptive statistics and logistic regression in SPSS. Results: Forty-two percent of LHDs indicated the use of an EHR system compared with 58% that use a non-EHR system for the storage of primary health data. Seventy-one percent of LHDs had reviewed some or all of their current systems to determine whether they needed to be improved or replaced, whereas only 6% had formally conducted a readiness assessment for health information exchange. Twenty-seven percent of the LHDs had conducted informatics training within the past 12 months. LHD characteristics statistically associated with having an EHR system were state or centralized governance, not having created an LHD-wide strategic plan related to informatics within the past 2 years, having provided informatics training in the past 12 months, and various levels of control over decisions regarding hardware allocation or acquisition, software selection, software support, and information technology budget allocation. Conclusion: A focus on EHR implementation in public health is pertinent to examining the impact of public health programming and interventions for positive change in population health. PMID:27684614
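
    The survey analysis described (logistic regression with sampling weights) can be sketched as follows. The study used SPSS; the Python/statsmodels code, the toy data, the variable names, and the use of frequency-style weights as a crude stand-in for the survey weights are all assumptions for illustration.

    # Minimal sketch: weighted logistic regression of EHR adoption on
    # two hypothetical LHD characteristics (fabricated data).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 300
    df = pd.DataFrame({
        "centralized_governance": rng.integers(0, 2, n),
        "informatics_training": rng.integers(0, 2, n),
        "weight": rng.integers(1, 4, n),         # stand-in for survey weights
    })
    logit = -0.5 + 0.8 * df.centralized_governance + 0.6 * df.informatics_training
    df["has_ehr"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

    X = sm.add_constant(df[["centralized_governance", "informatics_training"]])
    result = sm.GLM(df["has_ehr"], X, family=sm.families.Binomial(),
                    freq_weights=df["weight"]).fit()
    print(result.summary())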

  8. Scientific Caricatures in the Earth Science Classroom: An Alternative Assessment for Meaningful Science Learning

    NASA Astrophysics Data System (ADS)

    Clary, Renee M.; Wandersee, James H.

    2010-01-01

    Archive-based, historical research of materials produced during the Golden Age of Geology (1788-1840) uncovered scientific caricatures (SCs), which may serve as a unique form of knowledge representation for students today. SCs played important roles in the past, stimulating critical inquiry among early geologists and fueling debates that addressed key theoretical issues. When historical SCs were utilized in a large-enrollment college Earth History course, student response was positive. Therefore, we offered SCs as an optional assessment tool. Paired t-tests that compared individual students’ performances with the SC option, as well as without the SC option, showed a significant positive difference favoring scientific caricatures (α = 0.05). Content analysis of anonymous student survey responses revealed three consistent findings: (a) students enjoyed expressing science content correctly but creatively through SCs, (b) development of SCs required deeper knowledge integration and understanding of the content than conventional test items, and (c) students appreciated having SC item options on their examinations, whether or not they took advantage of them. We think that incorporation of SCs during assessment may effectively expand the variety of methods for probing understanding, thereby increasing the mode validity of current geoscience tests.
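
    The core statistical step, a paired t-test comparing each student's performance with and without the scientific-caricature option at α = 0.05, can be sketched as below; the scores are fabricated placeholders, not the study's data.

    # Minimal sketch: paired t-test on fabricated with/without-SC scores.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    without_sc = rng.normal(74, 8, 40)             # percent scores without SC items
    with_sc = without_sc + rng.normal(3, 4, 40)    # same students with the SC option

    t_stat, p_value = stats.ttest_rel(with_sc, without_sc)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}, "
          f"significant at 0.05: {p_value < 0.05}")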

  9. Inter-Observer Reliability Assessments in Time Motion Studies: The Foundation for Meaningful Clinical Workflow Analysis

    PubMed Central

    Lopetegui, Marcelo A.; Bai, Shasha; Yen, Po-Yin; Lai, Albert; Embi, Peter; Payne, Philip R.O.

    2013-01-01

    Understanding clinical workflow is critical for researchers and healthcare decision makers. Current workflow studies tend to oversimplify and underrepresent the complexity of clinical workflow. Continuous observation time motion studies (TMS) could enhance clinical workflow studies by providing rich quantitative data required for in-depth workflow analyses. However, methodological inconsistencies have been reported in continuous observation TMS, potentially reducing the validity of TMS’ data and limiting their contribution to the general state of knowledge. We believe that a cornerstone in standardizing TMS is to ensure the reliability of the human observers. In this manuscript we review the approaches for inter-observer reliability assessment (IORA) in a representative sample of TMS focusing on clinical workflow. We found that IORA is an uncommon practice, inconsistently reported, and often uses methods that provide partial and overestimated measures of agreement. Since a comprehensive approach to IORA is yet to be proposed and validated, we provide initial recommendations for IORA reporting in continuous observation TMS. PMID:24551381
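
    As an illustration of why raw percent agreement can overstate inter-observer reliability, the sketch below contrasts it with a chance-corrected statistic (Cohen's kappa) for two observers coding the same workflow intervals. The task codes are hypothetical examples; the manuscript itself does not prescribe a single statistic.

    # Minimal sketch: percent agreement vs. Cohen's kappa for two observers.
    from sklearn.metrics import cohen_kappa_score

    observer_a = ["charting", "charting", "med_admin", "talking", "charting",
                  "med_admin", "talking", "charting", "med_admin", "charting"]
    observer_b = ["charting", "med_admin", "med_admin", "talking", "charting",
                  "med_admin", "charting", "charting", "med_admin", "charting"]

    agreement = sum(a == b for a, b in zip(observer_a, observer_b)) / len(observer_a)
    kappa = cohen_kappa_score(observer_a, observer_b)
    print(f"percent agreement = {agreement:.2f}, Cohen's kappa = {kappa:.2f}")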

  10. Toward Meaningful Assessment: Lessons from Five First-Grade Classrooms. Occasional Paper Series 26

    ERIC Educational Resources Information Center

    Kates, Laura R.

    2011-01-01

    Are teachers who are faced with mandated assessments more likely or less likely to explore their students' performance in depth and use their discoveries to enrich learning? This is the story of how six first-grade teachers in New York City responded to a mandated performance assessment--and how that response compared to a set of informal,…

  11. Rote versus Meaningful Learning.

    ERIC Educational Resources Information Center

    Mayer, Richard E.

    2002-01-01

    Examines the six categories that make up the cognitive process dimension of Bloom's Taxonomy Table, as well as the 19 specific cognitive processes that fit within them. After describing three learning outcomes, the paper focuses on retention versus transfer of learning and rote versus meaningful learning, discussing how teaching and assessment can…

  12. Qualitative methods for assessing risk

    SciTech Connect

    Mahn, J.A.; Hannaman, G.W.; Kryska, P.

    1995-04-01

    The Department of Energy's (DOE) non-nuclear facilities generally require only a qualitative accident analysis to assess facility risks in accordance with DOE Order 5481.1B, Safety Analysis and Review System. Achieving a meaningful qualitative assessment of risk necessarily requires the use of suitable non-numerical assessment criteria. Typically, the methods and criteria for assigning facility-specific accident scenarios to the qualitative severity and likelihood classification system in the DOE order require significant judgment in many applications. Systematic methods for more consistently assigning the total accident scenario frequency and associated consequences are required to substantiate and enhance future risk ranking between various activities at Sandia National Laboratories (SNL). SNL's Risk Management and National Environmental Policy Act (NEPA) Department has developed an improved methodology for performing qualitative risk assessments in accordance with the DOE order requirements. Products of this effort are an improved set of qualitative descriptions that permit (1) definition of the severity for both technical and programmatic consequences that may result from a variety of accident scenarios, and (2) qualitative representation of the likelihood of occurrence. These sets of descriptions are intended to facilitate proper application of DOE criteria for assessing facility risks.
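
    A qualitative severity/likelihood classification of the kind described can be sketched as a simple look-up that maps each accident scenario's categories to a risk class. The category names, scenarios, and mapping rule below are illustrative only, not the SNL or DOE criteria.

    # Minimal sketch: qualitative risk ranking from severity and likelihood bins.
    SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]
    LIKELIHOOD = ["extremely unlikely", "unlikely", "possible", "likely"]

    def risk_class(severity: str, likelihood: str) -> str:
        score = SEVERITY.index(severity) + LIKELIHOOD.index(likelihood)
        if score >= 5:
            return "high"
        if score >= 3:
            return "medium"
        return "low"

    scenarios = {
        "solvent spill in laboratory": ("marginal", "possible"),
        "pressure vessel rupture": ("critical", "unlikely"),
    }
    for name, (sev, lik) in scenarios.items():
        print(f"{name}: severity={sev}, likelihood={lik} -> {risk_class(sev, lik)}")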

  13. Revisiting Individual Creativity Assessment: Triangulation in Subjective and Objective Assessment Methods

    ERIC Educational Resources Information Center

    Park, Namgyoo K.; Chun, Monica Youngshin; Lee, Jinju

    2016-01-01

    Compared to the significant development of creativity studies, individual creativity research has not reached a meaningful consensus regarding the most valid and reliable method for assessing individual creativity. This study revisited 2 of the most popular methods for assessing individual creativity: subjective and objective methods. This study…

  14. Validity Argument for Assessing L2 Pragmatics in Interaction Using Mixed Methods

    ERIC Educational Resources Information Center

    Youn, Soo Jung

    2015-01-01

    This study investigates the validity of assessing L2 pragmatics in interaction using mixed methods, focusing on the evaluation inference. Open role-plays that are meaningful and relevant to the stakeholders in an English for Academic Purposes context were developed for classroom assessment. For meaningful score interpretations and accurate…

  15. A Mixed-Methods Study of the Recovery Concept, "A Meaningful Day," in Community Mental Health Services for Individuals with Serious Mental Illnesses.

    PubMed

    Myers, Neely A L; Smith, Kelly; Pope, Alicia; Alolayan, Yazeed; Broussard, Beth; Haynes, Nora; Compton, Michael T

    2016-10-01

    The recovery concept encompasses overcoming or managing one's illness, being physically and emotionally healthy, and finding meaningful purpose through work, school, or volunteering, which connects one to others in mutually fulfilling ways. Using a mixed-methods approach, we studied the emphasis on "a meaningful day" in the new Opening Doors to Recovery (ODR) program in southeast Georgia. Among 100 participants, we measured the meaningful day construct using three quantitative items at baseline (hospital discharge) and at 4-, 8-, and 12-month follow-up, finding statistically significant linear trends over time for all three measures. Complementary qualitative interviews with 30 individuals (ODR participants, family members, and ODR's Community Navigation Specialists and program leaders) revealed themes pertaining to companionship, productivity, achieving stability, and autonomy, as well as the concern about insufficient resources. The concept of "a meaningful day" can be a focus of clinical attention and measured as a person-centered outcome for clients served by recovery-oriented community mental health services.

  16. Making Fractions Meaningful

    ERIC Educational Resources Information Center

    McCormick, Kelly K.

    2015-01-01

    To be able to support meaningful mathematical experiences, preservice elementary school teachers (PSTs) must learn mathematics in deep and meaningful ways (Ma 1999). They need to experience investigating and making sense of the mathematics they will be called on to teach. To expand their own--often limited--views of what it means to teach and…

  17. Meaningful Measurement: The Role of Assessments in Improving High School Education in the Twenty-First Century

    ERIC Educational Resources Information Center

    Pinkus, Lyndsay M., Ed.

    2009-01-01

    In the chapters presented in this volume, leading experts describe some of the assessment challenges in greater detail and provide federal recommendations on how to address them. In "College and Work Readiness as a Goal of High Schools: The Role of Standards, Assessments, and Accountability," John Tanner of the Center for Innovative Measures at…

  18. Use of a pre-assessment tool to start a meaningful dialogue: new paradigms in library instruction.

    PubMed

    Getselman, Anna; White, Mia S

    2011-01-01

    In 2009, the Woodruff Health Sciences Center Library started a library instruction dialogue with the medical students and faculty from the Emory School of Medicine. These discussions exposed a gap among faculty, students, and librarians in their perceptions of information processing. Follow-ups with the Associate Deans for Student Affairs and Medical Education led to the decision to administer an online assessment of the incoming student body and a complete redesign of the library orientation program. The aim of using self-assessment methodology in the framework of an orientation program was to set the students' foundation for self-discovery and introduce them to self-learning.

  19. From Mindless to Meaningful

    ERIC Educational Resources Information Center

    Billings, Laura; Roberts, Terry

    2014-01-01

    Despite teachers' best intentions, traditional whole-class discussions sometimes end up sounding like the monotonous drone of Charlie Brown's teacher. But with careful planning, teachers can structure discussions that encourage meaningful student interaction and collaborative thinking, write Laura Billings and Terry Roberts of the…

  20. Meaningful and Purposeful Practice

    ERIC Educational Resources Information Center

    Clementi, Donna

    2014-01-01

    This article describes a graphic, designed by Clementi and Terrill, the authors of "Keys to Planning for Learning" (2013), visually representing the components that contribute to meaningful and purposeful practice in learning a world language, practice that leads to greater proficiency. The entire graphic is centered around the letter…

  1. Miscues: Meaningful Assessment Aids Instruction

    ERIC Educational Resources Information Center

    Luft, Pamela

    2009-01-01

    LeRoy was a deaf sixth grader who used signs and his voice to communicate. Yanetta was a deaf eighth grader who had deaf parents and preferred American Sign Language (ASL). Michael was a deaf fifth grader in a suburban school who attended an oral program and used his voice exclusively to communicate. All three students struggled with reading. They…

  2. Students' Meaningful Learning Orientation and Their Meaningful Understandings of Meiosis and Genetics.

    ERIC Educational Resources Information Center

    Cavallo, Ann Liberatore

    This 1-week study explored the extent to which high school students (n=140) acquired meaningful understanding of selected biological topics (meiosis and the Punnett square method) and the relationship between these topics. This study: (1) examined "mental modeling" as a technique for measuring students' meaningful understanding of the topics; (2)…

  3. Toward meaningful noise research.

    PubMed

    Holding, D H; Baker, M A

    1987-10-01

    The present review considers a series of studies of noise conducted in collaboration with Dr. Michel Loeb. This review attempts to provide a theoretical perspective as well as to summarize the most important findings of those studies. The work reviewed shows that noise effects interact with other variables, such that a noise effect on one sex is reversed for the other, and is also reversed at different times of the day. A second experiment confirmed this finding with a different arithmetic task. Further work indicated parallels between noise and fatigue, with aftereffects depending upon both work and noise. The final experiment repeated some of these findings with a different task battery of information processing tasks while showing that noise effects further depend on the meaningfulness of the noise background.

  4. The Retention of Meaningful Understanding of Meiosis and Genetics.

    ERIC Educational Resources Information Center

    Cavallo, Ann Liberatore

    This study investigated the retention of meaningful understanding of the biological topics of meiosis, the Punnett square method and the relations between these two topics. This study also explored the predictive influence of students' general tendency to learn meaningfully or by rote (meaningful learning orientation), prior knowledge of meiosis,…

  5. Making reaccreditation meaningful.

    PubMed Central

    Nicol, F

    1995-01-01

    Reaccreditation is a well-accepted fact for many doctors outside the United Kingdom and is likely to become a reality for British general practitioners. The author's sabbatical year in the United States of America studying reaccreditation and its relationship to continuing medical education has enabled a critical analysis of recent proposals in the UK to be carried out. The aim of reaccreditation must be understood by the profession and must be clearly stated. To be credible it will have to be mandatory and linked to continuing medical education. Current types of continuing medical education must be developed so that they are meaningful, influence doctors' behaviour and include research, audit, training, reading and medical writing. The profession must confront the need to penalize the small number of doctors who have an unacceptable standard of practice. The potential benefits of an appropriate form of reaccreditation may include improved quality of care and patient outcome, enhanced job satisfaction and reduced rates of burnout. PMID:7619590

  6. Apical transportation: two assessment methods.

    PubMed

    López, Fernanda Ullmann; Travessas, Juliana Andréa Corrêa; Fachin, Elaine; Fontanella, Vania; Grecca, Fabiana

    2009-08-01

    Root canal transportation can lead to treatment failure. A large number of methodologies for assessing root canal preparation have been tried in the past. This study compared two methods for apical transportation measurement: digitised images of longitudinal root sections and radiographs. Sixty upper molar mesiobuccal root canals prepared for endodontic treatment were assessed. The results did not demonstrate statistically significant differences between the two imaging methods used to evaluate root canal transportation. The two methods were proven to be equally reliable. PMID:19703081

  7. Assessment Methods in Medical Education

    ERIC Educational Resources Information Center

    Norcini, John J.; McKinley, Danette W.

    2007-01-01

    Since the 1950s, there has been rapid and extensive change in the way assessment is conducted in medical education. Several new methods of assessment have been developed and implemented over this time and they have focused on clinical skills (taking a history from a patient and performing a physical examination), communication skills, procedural…

  8. Methods for Aquatic Resource Assessment

    EPA Science Inventory

    The Methods for Aquatic Resource Assessment (MARA) project consists of three main activities in support of assessing the conditions of the nation’s aquatic resources: 1) scientific support for EPA Office of Water’s national aquatic resource surveys; 2) spatial predictions of riv...

  9. Teaching Absolute Value Meaningfully

    ERIC Educational Resources Information Center

    Wade, Angela

    2012-01-01

    What is the meaning of absolute value? And why do teachers teach students how to solve absolute value equations? Absolute value is a concept introduced in first-year algebra and then reinforced in later courses. Various authors have suggested instructional methods for teaching absolute value to high school students (Wei 2005; Stallings-Roberts…

  10. Qualitative methods for assessing risk

    SciTech Connect

    Mahn, J.A.; Hannaman, G.W.; Kryska, P.

    1995-03-01

    The purpose of this document is to describe a qualitative risk assessment process that supplements the requirements of DOE/AL 5481.1B. Although facility managers have a choice of assessing risk either quantitatively or qualitatively, trade-offs are involved in making the most appropriate choice for a given application. The results that can be obtained from a quantitative risk assessment are significantly more robust than those derived from a qualitative approach. However, the advantages derived from quantitative risk assessment are achieved at a greater expenditure of money, time and convenience. This document provides the elements of a framework for performing a much less costly qualitative risk assessment, while retaining the best attributes of quantitative methods. The approach discussed herein will: (1) provide facility managers with the tools to prepare consistent, site-wide assessments, and (2) aid the reviewers who may be tasked to evaluate the assessments. Added cost/benefit measures of the qualitative methodology include the identification of mechanisms for optimally allocating resources for minimizing risk in an expeditious and fiscally responsible manner.

  11. Improving Personal Characterization of Meaningful Activity in Adults with Chronic Conditions Living in a Low-Income Housing Community

    PubMed Central

    Ciro, Carrie A.; Smith, Patsy

    2015-01-01

    Purpose: To understand how adults living in a low-income, public housing community characterize meaningful activity (activity that gives life purpose) and whether, through short-term intervention, they could overcome identified individual and environmental barriers to activity engagement. Methods: We used a mixed methods design where Phase 1 (qualitative) informed the development of Phase 2 (quantitative). Focus groups were conducted with residents of two low-income, public housing communities to understand their characterization of meaningful activity and health. From these results, we developed a theory-based group intervention for overcoming barriers to engagement in meaningful activity. Finally, we examined change in self-report scores from the Meaningful Activity Participation Assessment (MAPA) and the Engagement in Meaningful Activities Survey (EMAS). Results: Health literacy appeared to impact understanding of the questions in Phase 1. Activity availability, transportation, income and functional limitations were reported as barriers to meaningful activity. Phase 2 within-group analysis revealed a significant difference in MAPA pre-post scores (p = 0.007), but not EMAS (p = 0.33). Discussion: Health literacy should be assessed and addressed in this population prior to intervention. After a group intervention, participants had a change in characterization of what is considered healthy, meaningful activity but reported fewer changes to how their activities aligned with their values. PMID:26378559

  12. Dietary assessment methods: dietary records.

    PubMed

    Ortega, Rosa M; Pérez-Rodrigo, Carmen; López-Sobaler, Ana M

    2015-02-26

    Dietary records or food diaries can be highlighted among dietary assessment methods of the current diet for their interest and validity. It is a prospective, open-ended survey method collecting data about the foods and beverages consumed over a previously specified period of time. Dietary records can be used to estimate the current diet of individuals and population groups, as well as to identify groups at risk of inadequacy. It is a dietary assessment method of interest for use in epidemiological or clinical studies. High validity and precision have been reported for the method when used following adequate procedures and considering a sufficient number of days. Thus, dietary records are often considered a reference method in validation studies. Nevertheless, the method is affected by error and has limitations, due mainly to the tendency of subjects to report food consumption close to what is socially desirable. Additional problems are related to the high burden posed on respondents. The method can also influence food behavior, as respondents may simplify what they eat to ease its registration, and some subjects can experience difficulties in writing down the foods and beverages consumed or in describing the portion sizes. Increasing the number of days observed reduces the quality of completed diet records. The high cost of coding and processing the information collected in diet records should also be considered. One of the main advantages of the method is the registration of foods and beverages as they are consumed, thus reducing the problem of food omissions due to memory failure. Weighed food records provide more precise estimates of consumed portions. New technologies can be helpful to improve and ease the collaboration of respondents, as well as the precision of the estimates, although it would be desirable to evaluate their advantages and limitations in order to optimize implementation.
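
    The basic post-processing step of a dietary record, converting the foods and portions written down by the respondent into nutrient intakes via a food-composition table, can be sketched as below. The foods, portions, and nutrient values are hypothetical placeholders.

    # Minimal sketch: nutrient intake from one day of a food diary.
    composition = {          # per 100 g: (energy kcal, protein g)
        "bread": (265, 9.0),
        "milk": (64, 3.3),
        "apple": (52, 0.3),
    }
    diary_day = [            # (food, grams consumed) as recorded by the respondent
        ("bread", 80),
        ("milk", 250),
        ("apple", 150),
    ]
    energy = sum(g / 100 * composition[food][0] for food, g in diary_day)
    protein = sum(g / 100 * composition[food][1] for food, g in diary_day)
    print(f"Estimated daily intake: {energy:.0f} kcal, {protein:.1f} g protein")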

  13. State Capacity for Leadership: Ensuring Meaningful Higher Education Involvement in State Implementation of New Assessments Aligned with the Common Core State Standards

    ERIC Educational Resources Information Center

    National Center for Higher Education Management Systems (NJ1), 2011

    2011-01-01

    The Common Core State Standards (CCSS) and assessments aligned to them represent a significant milestone in public education reform in the U.S. Developed with consultation from higher education, the rigorous new standards and the assessments now being drafted by two consortia promise to help students reach higher levels of academic achievement and…

  14. Quality Assessment of Qualitative Evidence for Systematic Review and Synthesis: Is It Meaningful, and if So, How Should It Be Performed?

    ERIC Educational Resources Information Center

    Carroll, Christopher; Booth, Andrew

    2015-01-01

    The critical appraisal and quality assessment of primary research are key stages in systematic review and evidence synthesis. These processes are driven by the need to determine how far the primary research evidence, singly and collectively, should inform findings and, potentially, practice recommendations. Quality assessment of primary…

  15. Meaningful Use of Health Information Technology by Rural Hospitals

    ERIC Educational Resources Information Center

    McCullough, Jeffrey; Casey, Michelle; Moscovice, Ira; Burlew, Michele

    2011-01-01

    Purpose: This study examines the current status of meaningful use of health information technology (IT) in Critical Access Hospitals (CAHs), other rural, and urban US hospitals, and it discusses the potential role of Medicare payment incentives and disincentives in encouraging CAHs and other rural hospitals to achieve meaningful use. Methods: Data…

  16. LNG Safety Assessment Evaluation Methods

    SciTech Connect

    Muna, Alice Baca; LaFleur, Angela Christine

    2015-05-01

    Sandia National Laboratories evaluated published safety assessment methods across a variety of industries including Liquefied Natural Gas (LNG), hydrogen, land and marine transportation, as well as the US Department of Defense (DOD). All the methods were evaluated for their potential applicability for use in the LNG railroad application. After reviewing the documents included in this report, as well as others not included because of repetition, the Department of Energy (DOE) Hydrogen Safety Plan Checklist is most suitable to be adapted to the LNG railroad application. This report was developed to survey industries related to rail transportation for methodologies and tools that can be used by the FRA to review and evaluate safety assessments submitted by the railroad industry as a part of their implementation plans for liquefied or compressed natural gas storage (on-board or tender) and engine fueling delivery systems. The main sections of this report provide an overview of various methods found during this survey. In most cases, the reference document is quoted directly. The final section provides discussion and a recommendation for the most appropriate methodology that will allow efficient and consistent evaluations to be made. The DOE Hydrogen Safety Plan Checklist was then revised to adapt it as a methodology for the Federal Railroad Administration’s use in evaluating safety plans submitted by the railroad industry.

  17. Quality assessment of qualitative evidence for systematic review and synthesis: Is it meaningful, and if so, how should it be performed?

    PubMed

    Carroll, Christopher; Booth, Andrew

    2015-06-01

    The critical appraisal and quality assessment of primary research are key stages in systematic review and evidence synthesis. These processes are driven by the need to determine how far the primary research evidence, singly and collectively, should inform findings and, potentially, practice recommendations. Quality assessment of primary qualitative research remains a contested area. This article reviews recent developments in the field charting a perceptible shift from whether such quality assessment should be conducted to how it might be performed. It discusses the criteria that are used in the assessment of quality and how the findings of the process are used in synthesis. It argues that recent research indicates that sensitivity analysis offers one potentially useful means for advancing this controversial issue.

  18. Eight Steps to Meaningful Grading

    ERIC Educational Resources Information Center

    Deddeh, Heather; Main, Erin; Fulkerson, Sharon Ratzlaff

    2010-01-01

    A group of teachers at Clifford Smart Middle School in Michigan's Walled Lake Consolidated School District have broken free from traditional grading in order to embrace a more meaningful grading practice. Using standards-based grading practices, they believe their grading now accurately communicates to students and parents the student's mastery…

  19. Relationships between students' meaningful learning orientation and their understanding of genetics topics

    NASA Astrophysics Data System (ADS)

    Cavallo, Ann M. Liberatore; Schafer, Larry E.

    This study explored factors predicting the extent to which high school students (N = 140) acquired meaningful understanding of the biological topics of meiosis, the Punnett-square method, and the relationships between these topics. This study (a) examined mental modeling as a technique for measuring students' meaningful understanding of the topics, (b) measured students' predisposed, generalized tendency to learn meaningfully (meaningful learning orientation), (c) determined the extent to which students' meaningful learning orientation predicted meaningful understanding beyond that predicted by aptitude and achievement motivation, (d) experimentally tested two instructional treatments (relationships presented to students, relationships generated by students), (e) explored the relationships of meaningful learning orientation, prior knowledge, instructional treatment, and all interactions of these variables in predicting meaningful understanding. The results of correlations and multiple regressions indicated that meaningful learning orientation contributed to students' attainment of meaningful understanding independent of aptitude and achievement motivation. Meaningful learning orientation and prior knowledge interacted in unique ways for each topic to predict students' attainment of meaningful understanding. Instructional treatment had relatively little relationship to students' acquisition of meaningful understanding, except for learners midrange between meaningful and rote. These findings imply that a meaningful learning approach among students may be important, perhaps as much or more than aptitude and achievement motivation, for their acquisition of interrelated, meaningful understandings of science.
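
    As an illustration of the analysis described above, the sketch below fits a regression with interaction terms of the kind the study reports (meaningful learning orientation interacting with prior knowledge and treatment, over and above aptitude and achievement motivation). The variable names and simulated data are hypothetical, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 140  # same sample size as the study; the data themselves are simulated
df = pd.DataFrame({
    "mlo": rng.normal(size=n),                # meaningful learning orientation
    "prior_knowledge": rng.normal(size=n),
    "aptitude": rng.normal(size=n),
    "motivation": rng.normal(size=n),
    "treatment": rng.integers(0, 2, size=n),  # 0/1 instructional treatment
})
df["understanding"] = (0.5 * df.mlo + 0.3 * df.prior_knowledge
                       + 0.2 * df.mlo * df.prior_knowledge
                       + rng.normal(scale=0.5, size=n))

# Does mlo predict understanding beyond aptitude and motivation, and does it
# interact with prior knowledge and instructional treatment?
model = smf.ols(
    "understanding ~ aptitude + motivation + mlo * prior_knowledge * treatment",
    data=df,
).fit()
print(model.summary())
```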

  20. Birth, meaningful viability and abortion.

    PubMed

    Jensen, David

    2015-06-01

    What role does birth play in the debate about elective abortion? Does the wrongness of infanticide imply the wrongness of late-term abortion? In this paper, I argue that the same or similar factors that make birth morally significant with regard to abortion make meaningful viability morally significant due to the relatively arbitrary time of birth. I do this by considering the positions of Mary Anne Warren and José Luis Bermúdez who argue that birth is significant enough that the wrongness of infanticide does not imply the wrongness of late-term abortion. On the basis of the relatively arbitrary timing of birth, I argue that meaningful viability is the point at which elective abortion is prima facie morally wrong.

  1. Allergic rhinitis: meaningful and less meaningful combination treatments including reminiscences.

    PubMed

    Szelenyi, I

    2014-06-01

    Allergic rhinitis (AR) results from a complex allergen-driven mucosal inflammation in the nasal cavity. Current guideline-based therapy for allergic rhinitis includes oral and nasal antihistamines, topical and systemic glucocorticoids, decongestants, antimuscarinic agents, mast cell stabilizing drugs, leukotriene-receptor antagonists, and others. In spite of guideline recommendations, most patients use multiple therapies in an attempt to achieve symptom control. Therefore, more effective therapies for the management of AR are clearly required. Recently, a novel fixed-dose combination containing azelastine and fluticasone propionate has been successfully introduced. At present, it represents the only meaningful topical drug combination; perhaps it will be followed by others. PMID:24974572

  2. The Use of Qualitative Methods in Large-Scale Evaluation: Improving the Quality of the Evaluation and the Meaningfulness of the Findings

    ERIC Educational Resources Information Center

    Slayton, Julie; Llosa, Lorena

    2005-01-01

    In light of the current debate over the meaning of "scientifically based research", we argue that qualitative methods should be an essential part of large-scale program evaluations if program effectiveness is to be determined and understood. This article chronicles the challenges involved in incorporating qualitative methods into the large-scale…

  3. Accuracy of a semiquantitative method for Dermal Exposure Assessment (DREAM)

    PubMed Central

    van Wendel de Joode, B; Vermeulen, R; van Hemmen, J J; Fransman, W; Kromhout, H

    2005-01-01

    Background: The authors recently developed a Dermal Exposure Assessment Method (DREAM), an observational semiquantitative method to assess dermal exposures by systematically evaluating exposure determinants using pre-assigned default values. Aim: To explore the accuracy of the DREAM method by comparing its estimates with quantitative dermal exposure measurements in several occupational settings. Methods: Occupational hygienists observed workers performing a certain task, whose exposure to chemical agents on skin or clothing was simultaneously measured quantitatively, and filled in the DREAM questionnaire. DREAM estimates were compared with measurement data by estimating Spearman correlation coefficients for each task and for individual observations. In addition, mixed linear regression models were used to study the effect of DREAM estimates on the variability in measured exposures between tasks, between workers, and from day to day. Results: For skin exposures, Spearman correlation coefficients for individual observations ranged from 0.19 to 0.82. DREAM estimates for exposure levels on hands and forearms showed a fixed effect between and within surveys, explaining mainly between-task variance. In general, exposure levels on the clothing layer were predicted in a meaningful way only by detailed DREAM estimates, which comprised detailed information on the concentration of the agent in the formulation to which exposure occurred. Conclusions: The authors expect that the DREAM method can be successfully applied for semiquantitative dermal exposure assessment in epidemiological and occupational hygiene surveys of groups of workers with considerable contrast in dermal exposure levels (variability between groups >1.0). For surveys with less contrasting exposure levels, quantitative dermal exposure measurements are preferable. PMID:16109819
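
    A minimal sketch of the kind of comparison the abstract describes: Spearman correlations between DREAM estimates and quantitative dermal exposure measurements, overall and per task. The data frame below is simulated; the column and task names are assumptions, not the authors' data.

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 60
df = pd.DataFrame({
    "task": rng.choice(["mixing", "spraying", "cleaning"], size=n),
    "dream_estimate": rng.uniform(0, 100, size=n),          # semiquantitative score
})
# Simulated measured exposure loosely tracking the DREAM estimate
df["measured_exposure"] = df["dream_estimate"] * rng.lognormal(0, 0.5, size=n)

rho_all, p_all = spearmanr(df["dream_estimate"], df["measured_exposure"])
print(f"All observations: rho = {rho_all:.2f} (p = {p_all:.3f})")

# Spearman correlation within each task, as in the per-task comparisons above
for task, grp in df.groupby("task"):
    rho, _ = spearmanr(grp["dream_estimate"], grp["measured_exposure"])
    print(f"{task:>9}: rho = {rho:.2f}")
```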

  4. Methods of airway resistance assessment.

    PubMed

    Urbankowski, Tomasz; Przybyłowski, Tadeusz

    2016-01-01

    Airway resistance is the ratio of driving pressure to the rate of airflow in the airways. The methods most frequently used to measure airway resistance are whole-body plethysmography, the interrupter technique and the forced oscillation technique. All of these methods allow resistance to be measured during respiration at volumes close to tidal volume, and they do not require forced breathing manoeuvres or deep breathing during measurement. The most popular method for measuring airway resistance is whole-body plethysmography. The results of plethysmography include, among others, the following parameters: airway resistance (Raw), airway conductance (Gaw), specific airway resistance (sRaw) and specific airway conductance (sGaw). The interrupter technique is based on the assumption that, at the moment of airway occlusion, air pressure in the mouth is equal to the alveolar pressure. In the forced oscillation technique (FOT), airway resistance is calculated from the changes in pressure and flow caused by air vibration. The methods for measurement of airway resistance described in the present paper seem to be a useful alternative to the most common lung function test, spirometry. The target group in which these methods may be widely used consists particularly of patients who are unable to perform spirometry.
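
    The definitions in this abstract translate directly into a few lines of code. The sketch below computes Raw as the ratio of driving pressure to flow and derives Gaw and sRaw from it, assuming the common convention that sRaw is Raw multiplied by thoracic gas volume; the numerical values are illustrative only.

```python
def airway_resistance(driving_pressure_kpa: float, flow_l_per_s: float) -> float:
    """Raw in kPa*s/L: ratio of driving pressure to the rate of airflow."""
    return driving_pressure_kpa / flow_l_per_s

def specific_airway_resistance(raw: float, thoracic_gas_volume_l: float) -> float:
    """sRaw in kPa*s: Raw scaled by thoracic gas volume (common convention)."""
    return raw * thoracic_gas_volume_l

raw = airway_resistance(driving_pressure_kpa=0.15, flow_l_per_s=0.5)
gaw = 1.0 / raw                                        # airway conductance
sraw = specific_airway_resistance(raw, thoracic_gas_volume_l=3.0)
sgaw = 1.0 / sraw                                      # specific airway conductance
print(f"Raw={raw:.2f} kPa*s/L, Gaw={gaw:.2f}, sRaw={sraw:.2f}, sGaw={sgaw:.2f}")
```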

  5. Psychology and death. Meaningful rediscovery.

    PubMed

    Feifel, H

    1990-04-01

    The place of death in psychology is reviewed historically. Leading causes for its being slighted as an area of investigation during psychology's early years are presented. Reasons for its rediscovery in the mid-1950s as a legitimate sector for scientific inquiry are then discussed, along with some vicissitudes encountered in carrying out research in the field. This is followed by a description of principal empirical findings, clinical perceptions, and perspectives emerging from work in the thanatological realm. The probability that such urgent social issues as abortion, acquired immunodeficiency syndrome (AIDS), and euthanasia, and such destructive behaviors as drug abuse, alcoholism, and certain acts of violence are associated with attitudes toward death offers a challenge to psychology to enhance the vitality of human response to maladaptive conduct and loss. Recognition of personal mortality is a major entryway to self-knowledge. Although death is manifestly too complex to be the special sphere of any one discipline, psychology's position as an arena in which humanist and physicist-engineer cultures intersect provides us with a meaningful opportunity to advance our comprehension of how death can serve life. PMID:2186680

  6. Enhancing Institutional Assessment Efforts through Qualitative Methods

    ERIC Educational Resources Information Center

    Van Note Chism, Nancy; Banta, Trudy W.

    2007-01-01

    Qualitative methods can do much to describe context and illuminate the why behind patterns encountered in institutional assessment. Alone, or in combination with quantitative methods, they should be the approach of choice for many of the most important assessment questions. (Contains 1 table.)

  7. Screeners and brief assessment methods.

    PubMed

    Pérez Rodrigo, Carmen; Morán Fagúndez, Luis Juan; Riobó Serván, Pilar; Aranceta Bartrina, Javier

    2015-02-26

    In the last two decades, easy-to-use, simple instruments have been developed and validated to assess specific aspects of the diet or a general profile that can be compared with a reference dietary pattern, such as the Mediterranean Diet, or with the recommendations of the Dietary Guidelines. Brief instruments are rapid, simple and easy-to-use tools that can be implemented by unskilled personnel without specific training. These tools are useful in clinical settings, in Primary Health Care and in the community: as a tool for triage, as a screening tool to identify individuals or groups of people at risk who require further care, and even in studies investigating associations between specific aspects of the diet and health outcomes. They are also used in interventions focused on changing eating behaviours as a diagnostic tool, for self-evaluation purposes, or to provide tailored advice in web-based interventions or mobile apps. There are some specific instruments for use in children, adults, the elderly or specific population groups.

  8. Assessment methods for the evaluation of vitiligo.

    PubMed

    Alghamdi, K M; Kumar, A; Taïeb, A; Ezzedine, K

    2012-12-01

    There is no standardized method for assessing vitiligo. In this article, we review the literature from 1981 to 2011 on different vitiligo assessment methods. We aim to classify the techniques available for vitiligo assessment as subjective, semi-objective or objective; microscopic or macroscopic; and as based on morphometry or colorimetry. Macroscopic morphological measurements include visual assessment, photography in natural or ultraviolet light, photography with computerized image analysis and tristimulus colorimetry or spectrophotometry. Non-invasive micromorphological methods include confocal laser microscopy (CLM). Subjective methods include clinical evaluation by a dermatologist and a vitiligo disease activity score. Semi-objective methods include the Vitiligo Area Scoring Index (VASI) and point-counting methods. Objective methods include software-based image analysis, tristimulus colorimetry, spectrophotometry and CLM. Morphometry is the measurement of the vitiliginous surface area, whereas colorimetry quantitatively analyses skin colour changes caused by erythema or pigment. Most methods involve morphometry, except for the chromameter method, which assesses colorimetry. Some image analysis software programs can assess both morphometry and colorimetry. The details of these programs (Corel Draw, Image Pro Plus, AutoCad and Photoshop) are discussed in the review. Reflectance confocal microscopy provides real-time images and has great potential for the non-invasive assessment of pigmentary lesions. In conclusion, there is no single best method for assessing vitiligo. This review revealed that VASI, the rule of nine and Wood's lamp are likely to be the best techniques available for assessing the degree of pigmentary lesions and measuring the extent and progression of vitiligo in the clinic and in clinical trials. PMID:22416879
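
    As a rough illustration of a semi-objective scoring method such as VASI, the sketch below sums, over body regions, the affected area in hand units multiplied by the residual depigmentation fraction, which is how the index is commonly described in the literature; the regions and values are invented, and this is not code from the review.

```python
def vasi(regions):
    """VASI-style score: sum over regions of hand units x depigmentation fraction.

    regions: list of (hand_units_affected, depigmentation_fraction) tuples.
    """
    return sum(hand_units * depig for hand_units, depig in regions)

score = vasi([
    (2.0, 0.75),   # e.g. upper extremities
    (1.5, 0.50),   # e.g. trunk
    (0.5, 1.00),   # e.g. hands and feet
])
print(f"VASI = {score:.2f}")
```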

  9. Assessment methods for the evaluation of vitiligo.

    PubMed

    Alghamdi, K M; Kumar, A; Taïeb, A; Ezzedine, K

    2012-12-01

    There is no standardized method for assessing vitiligo. In this article, we review the literature from 1981 to 2011 on different vitiligo assessment methods. We aim to classify the techniques available for vitiligo assessment as subjective, semi-objective or objective; microscopic or macroscopic; and as based on morphometry or colorimetry. Macroscopic morphological measurements include visual assessment, photography in natural or ultraviolet light, photography with computerized image analysis and tristimulus colorimetry or spectrophotometry. Non-invasive micromorphological methods include confocal laser microscopy (CLM). Subjective methods include clinical evaluation by a dermatologist and a vitiligo disease activity score. Semi-objective methods include the Vitiligo Area Scoring Index (VASI) and point-counting methods. Objective methods include software-based image analysis, tristimulus colorimetry, spectrophotometry and CLM. Morphometry is the measurement of the vitiliginous surface area, whereas colorimetry quantitatively analyses skin colour changes caused by erythema or pigment. Most methods involve morphometry, except for the chromameter method, which assesses colorimetry. Some image analysis software programs can assess both morphometry and colorimetry. The details of these programs (Corel Draw, Image Pro Plus, AutoCad and Photoshop) are discussed in the review. Reflectance confocal microscopy provides real-time images and has great potential for the non-invasive assessment of pigmentary lesions. In conclusion, there is no single best method for assessing vitiligo. This review revealed that VASI, the rule of nine and Wood's lamp are likely to be the best techniques available for assessing the degree of pigmentary lesions and measuring the extent and progression of vitiligo in the clinic and in clinical trials.

  10. Expanding Assessment Methods and Moments in History

    ERIC Educational Resources Information Center

    Frost, Jennifer; de Pont, Genevieve; Brailsford, Ian

    2012-01-01

    History courses at The University of Auckland are typically assessed at two or three moments during a semester. The methods used normally employ two essays and a written examination answering questions set by the lecturer. This study describes an assessment innovation in 2008 that expanded both the frequency and variety of activities completed by…

  11. Personality, Assessment Methods and Academic Performance

    ERIC Educational Resources Information Center

    Furnham, Adrian; Nuygards, Sarah; Chamorro-Premuzic, Tomas

    2013-01-01

    This study examines the relationship between personality and two different academic performance (AP) assessment methods, namely exams and coursework. It aimed to examine whether the relationship between traits and AP was consistent across self-reported versus documented exam results, two different assessment techniques and across different…

  12. Scientific method, adversarial system, and technology assessment

    NASA Technical Reports Server (NTRS)

    Mayo, L. H.

    1975-01-01

    A basic framework is provided for the consideration of the purposes and techniques of scientific method and adversarial systems. Similarities and differences in these two techniques of inquiry are considered with reference to their relevance in the performance of assessments.

  13. EMERGY METHODS: VALUABLE INTEGRATED ASSESSMENT TOOLS

    EPA Science Inventory

    NHEERL's Atlantic Ecology Division is investigating emergy methods as tools for integrated assessment in several projects evaluating environmental impacts, policies, and alternatives for remediation and intervention. Emergy accounting is a methodology that provides a quantitative...

  14. Energy efficiency assessment methods and tools evaluation

    SciTech Connect

    McMordie, K.L.; Richman, E.E.; Keller, J.M.; Dixon, D.R.

    1994-08-01

    Many different methods of assessing the energy savings potential at federal installations and identifying attractive projects for capital investment have been used by the different federal agencies. These methods range from high-level estimating tools to detailed design tools, both manual and software-assisted. They have different purposes and provide results that are used for different parts of the project identification and implementation process. Seven different assessment methods are evaluated in this study. These methods were selected by the program managers at the DoD Energy Policy Office and the DOE Federal Energy Management Program (FEMP). Each of the methods was applied to similar buildings at Bolling Air Force Base (AFB), unless it was inappropriate or the method was designed to make an installation-wide analysis rather than focusing on particular buildings. Staff at Bolling AFB controlled the collection of data.

  15. Assessment and Evaluation Methods for Access Services

    ERIC Educational Resources Information Center

    Long, Dallas

    2014-01-01

    This article serves as a primer for assessment and evaluation design by describing the range of methods commonly employed in library settings. Quantitative methods, such as counting and benchmarking measures, are useful for investigating the internal operations of an access services department in order to identify workflow inefficiencies or…

  16. How Do Novice Art Teachers Define and Implement Meaningful Curriculum?

    ERIC Educational Resources Information Center

    Bain, Christina; Newton, Connie; Kuster, Deborah; Milbrandt, Melody

    2010-01-01

    Four researchers collaborated on this qualitative case study that examined 11 first-year novice art teachers' understanding and implementation of meaningful curriculum. Participants were selected through a criterion method sampling strategy; the subjects were employed in rural, urban, and suburban public school districts. In order to conduct a…

  17. Student Engagement and Teacher Guidance in Meaningful Mathematics: Enduring Principles

    ERIC Educational Resources Information Center

    Freeman, Gregory D.; Lucius, Lisa B.

    2008-01-01

    In mathematics, developing a conceptual understanding and observing properly modeled methods rarely lead to successful student performance. The student must participate. As with bike riding, participation with monitoring and guidance makes initial efforts meaningful and beneficial. In this article, the authors share a bike riding experience and…

  18. Russian risk assessment methods and approaches

    SciTech Connect

    Dvorack, M.A.; Carlson, D.D.; Smith, R.E.

    1996-07-01

    One of the benefits resulting from the collapse of the Soviet Union is the increased dialogue currently taking place between American and Russian nuclear weapons scientists in various technical arenas. One of the arenas currently being investigated involves collaborative studies that illustrate how risk assessment is perceived and utilized in the Former Soviet Union (FSU). The collaborative studies indicate that, while similarities exist with respect to some methodologies, the assumptions and approaches in performing risk assessments were, and still are, somewhat different in the FSU than in the US. The purpose of this paper is to highlight the present knowledge of risk assessment methodologies and philosophies within the two largest nuclear weapons laboratories of the Former Soviet Union, Arzamas-16 and Chelyabinsk-70. Furthermore, this paper will address the relative progress of new risk assessment methodologies, such as Fuzzy Logic, within the framework of current risk assessment methods at these two institutes.

  19. Assessing the Assessment Methods: Climate Change and Hydrologic Impacts

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.

    2014-12-01

    The Bureau of Reclamation, the U.S. Army Corps of Engineers, and other water management agencies have an interest in developing reliable, science-based methods for incorporating climate change information into longer-term water resources planning. Such assessments must quantify projections of future climate and hydrology, typically relying on some form of spatial downscaling and bias correction to produce watershed-scale weather information that subsequently drives hydrology and other water resource management analyses (e.g., water demands, water quality, and environmental habitat). Water agencies continue to face challenging method decisions in these endeavors: (1) which downscaling method should be applied and at what resolution; (2) what observational dataset should be used to drive downscaling and hydrologic analysis; (3) what hydrologic model(s) should be used and how should these models be configured and calibrated? There is a critical need to understand the ramifications of these method decisions, as they affect the signal and uncertainties produced by climate change assessments and, thus, adaptation planning. This presentation summarizes results from a three-year effort to identify strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic conditions. Methods were evaluated from two perspectives: historical fidelity, and tendency to modulate a global climate model's climate change signal. On downscaling, four methods were applied at multiple resolutions: statistically using Bias Correction Spatial Disaggregation, Bias Correction Constructed Analogs, and Asynchronous Regression; dynamically using the Weather Research and Forecasting model. Downscaling results were then used to drive hydrologic analyses over the contiguous U.S. using multiple models (VIC, CLM, PRMS), with added focus placed on case study basins within the Colorado Headwaters. The presentation will identify which types of climate changes are

  20. A method for assessing reflective journal writing.

    PubMed

    Plack, Margaret M; Driscoll, Maryanne; Blissett, Sylvene; McKenna, Raymond; Plack, Thomas P

    2005-01-01

    Reflection is widely accepted as a learning tool and is considered integral to professional practice. Journal writing is advocated in facilitating reflection, yet little is written about how to assess reflection in journals. The purpose of this study was to develop and test a method of assessing the elements of reflection in journals and to determine whether, and to what level, reflection occurs in journals. Twenty-seven physical therapy students maintained written reflective journals throughout three of their four eight-week clinical affiliations. The students were introduced to concepts of reflective practice with definitions of terms and reflective questions before their second affiliation. A coding schema was developed to assess the journals. Three raters assessed forty-three journals. The text of each journal was analyzed for evidence of nine elements of reflection, and each journal was categorized as showing no evidence of reflection, evidence of reflection, or evidence of critical reflection. Descriptive statistics were used to demonstrate evidence of reflection. Reliability between each pair of raters was assessed using percent agreement, phi coefficients, and gamma statistics. Interrater reliability of all raters was assessed using intraclass correlation coefficients (ICC[2,1]). Results showed that the raters assessed 95.3%-100% of the journals as showing at least one element of reflection. The percent agreement between rater pairs for the nine elements of reflection ranged from 65.1% to 93.0%, the phi coefficient ranged from 0.08 to 0.81, and the ICC(2,1) values used to assess reliability among the three raters on each element ranged from 0.03 to 0.72. Averaging the assessment of the three raters for the overall journal, 14.7% of the journals were assessed as showing no evidence of reflection, 43.4% as showing evidence of reflection, and 41.9% as showing evidence of critical reflection. The percent agreement between rater pairs for the overall assessment
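
    The reliability statistics named above can be reproduced on a small subjects-by-raters matrix. The sketch below computes percent agreement between two raters and ICC(2,1) (two-way random effects, absolute agreement, single rater, following Shrout and Fleiss); the ratings are invented, not the study's data.

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """ICC(2,1) for an n_subjects x k_raters matrix of ratings."""
    n, k = scores.shape
    grand = scores.mean()
    ms_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between subjects
    ms_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between raters
    ss_total = ((scores - grand) ** 2).sum()
    ms_err = (ss_total - ms_rows * (n - 1) - ms_cols * (k - 1)) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Each row is one journal, each column one rater:
# 0 = no evidence of reflection, 1 = reflection, 2 = critical reflection
ratings = np.array([[2, 2, 1],
                    [1, 1, 1],
                    [2, 1, 2],
                    [0, 0, 1],
                    [1, 1, 1]], dtype=float)

pct_agreement = np.mean(ratings[:, 0] == ratings[:, 1])   # raters 1 vs 2
print(f"Percent agreement (raters 1-2): {pct_agreement:.0%}")
print(f"ICC(2,1): {icc_2_1(ratings):.2f}")
```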

  1. A method for assessing reflective journal writing.

    PubMed

    Plack, Margaret M; Driscoll, Maryanne; Blissett, Sylvene; McKenna, Raymond; Plack, Thomas P

    2005-01-01

    Reflection is widely accepted as a learning tool and is considered integral to professional practice. Journal writing is advocated in facilitating reflection, yet little is written about how to assess reflection in journals. The purpose of this study was to develop and test a method of assessing the elements of reflection in journals and to determine whether, and to what level, reflection occurs in journals. Twenty-seven physical therapy students maintained written reflective journals throughout three of their four eight-week clinical affiliations. The students were introduced to concepts of reflective practice with definitions of terms and reflective questions before their second affiliation. A coding schema was developed to assess the journals. Three raters assessed forty-three journals. The text of each journal was analyzed for evidence of nine elements of reflection, and each journal was categorized as showing no evidence of reflection, evidence of reflection, or evidence of critical reflection. Descriptive statistics were used to demonstrate evidence of reflection. Reliability between each pair of raters was assessed using percent agreement, phi coefficients, and gamma statistics. Interrater reliability of all raters was assessed using intraclass correlation coefficients (ICC[2,1]). Results showed that the raters assessed 95.3%-100% of the journals as showing at least one element of reflection. The percent agreement between rater pairs for the nine elements of reflection ranged from 65.1% to 93.0%, the phi coefficient ranged from 0.08 to 0.81, and the ICC(2,1) values used to assess reliability among the three raters on each element ranged from 0.03 to 0.72. Averaging the assessment of the three raters for the overall journal, 14.7% of the journals were assessed as showing no evidence of reflection, 43.4% as showing evidence of reflection, and 41.9% as showing evidence of critical reflection. The percent agreement between rater pairs for the overall assessment

  2. Meaningful Improvement in Gait Speed in Hip Fracture Recovery

    PubMed Central

    Alley, Dawn E.; Hicks, Gregory E.; Shardell, Michelle; Hawkes, William; Miller, Ram; Craik, Rebecca L.; Mangione, Kathleen K.; Orwig, Denise; Hochberg, Marc; Resnick, Barbara; Magaziner, Jay

    2011-01-01

    OBJECTIVES To estimate meaningful improvements in gait speed observed during recovery from hip fracture and to evaluate the sensitivity and specificity of gait speed changes in detecting change in self-reported mobility. DESIGN Secondary longitudinal data analysis from two randomized controlled trials. SETTING Twelve hospitals in the Baltimore, Maryland, area. PARTICIPANTS Two hundred seventeen women admitted with hip fracture. MEASUREMENTS Usual gait speed and self-reported mobility (ability to walk 1 block and climb 1 flight of stairs) measured 2 and 12 months after fracture. RESULTS Effect size–based estimates of meaningful differences were 0.03 for small differences and 0.09 for substantial differences. Depending on the anchor (stairs vs walking) and method (mean difference vs regression), anchor-based estimates ranged from 0.10 to 0.17 m/s for small meaningful improvements and 0.17 to 0.26 m/s for substantial meaningful improvement. Optimal gait speed cut-points yielded low sensitivity (0.39–0.62) and specificity (0.57–0.76) for improvements in self-reported mobility. CONCLUSION Results from this sample of women recovering from hip fracture provide only limited support for the 0.10-m/s cut point for substantial meaningful change previously identified in community-dwelling older adults experiencing declines in walking abilities. Anchor-based estimates and cut points derived from receiver operating characteristic curve analysis suggest that greater improvements in gait speed may be required for substantial perceived mobility improvement in female hip fracture patients. Furthermore, gait speed change performed poorly in discriminating change in self-reported mobility. Estimates of meaningful change in gait speed may differ based on the direction of change (improvement vs decline) or between patient populations. PMID:21883109
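
    A minimal sketch of the cut-point evaluation described above: classify each patient by whether her gait-speed change exceeds a candidate cut point (here 0.10 m/s) and compare with the self-reported mobility anchor to obtain sensitivity and specificity. The data are invented for illustration.

```python
import numpy as np

# Change in gait speed (m/s) between 2 and 12 months, and the anchor
# (1 = self-reported mobility improved, 0 = not improved); invented values
gait_speed_change = np.array([0.05, 0.22, 0.12, -0.03, 0.18, 0.08, 0.30, 0.01])
mobility_improved = np.array([0,    1,    1,    0,     0,    1,    1,    0])

cut_point = 0.10
predicted = gait_speed_change >= cut_point

tp = np.sum(predicted & (mobility_improved == 1))   # true positives
fn = np.sum(~predicted & (mobility_improved == 1))  # false negatives
tn = np.sum(~predicted & (mobility_improved == 0))  # true negatives
fp = np.sum(predicted & (mobility_improved == 0))   # false positives

print(f"Sensitivity: {tp / (tp + fn):.2f}")
print(f"Specificity: {tn / (tn + fp):.2f}")
```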

  3. Method and apparatus for assessing cardiovascular risk

    NASA Technical Reports Server (NTRS)

    Albrecht, Paul (Inventor); Bigger, J. Thomas (Inventor); Cohen, Richard J. (Inventor)

    1998-01-01

    The method for assessing risk of an adverse clinical event includes detecting a physiologic signal in the subject and determining from the physiologic signal a sequence of intervals corresponding to time intervals between heart beats. The long-time structure of fluctuations in the intervals over a time period of more than fifteen minutes is analyzed to assess risk of an adverse clinical event. In a preferred embodiment, the physiologic signal is an electrocardiogram and the time period is at least fifteen minutes. A preferred method for analyzing the long-time structure variability in the intervals includes computing the power spectrum and fitting the power spectrum to a power law dependence on frequency over a selected frequency range such as 10^-4 to 10^-2 Hz. Characteristics of the long-time structure of fluctuations in the intervals are used to assess risk of an adverse clinical event.
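
    A minimal sketch of the analysis the patent describes: build an interbeat-interval series, compute its power spectrum, and fit a power law over roughly 10^-4 to 10^-2 Hz. The signal generation and resampling choices below are assumptions for illustration, not the patented procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Fake RR-interval series (seconds): a slow random walk around 0.8 s
rr = 0.8 + 0.05 * np.cumsum(rng.standard_normal(20000)) * 1e-2
beat_times = np.cumsum(rr)

# Resample the interval series onto a uniform time grid (e.g., 2 Hz)
fs = 2.0
t_uniform = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
rr_uniform = np.interp(t_uniform, beat_times, rr)
rr_uniform -= rr_uniform.mean()

# Periodogram and power-law fit over the selected frequency band
spectrum = np.abs(np.fft.rfft(rr_uniform)) ** 2
freqs = np.fft.rfftfreq(rr_uniform.size, d=1.0 / fs)
band = (freqs >= 1e-4) & (freqs <= 1e-2)
slope, intercept = np.polyfit(np.log10(freqs[band]), np.log10(spectrum[band]), 1)
print(f"Power-law exponent (slope of the log-log fit): {slope:.2f}")
```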

  4. A comparison of preference-assessment methods.

    PubMed

    Verriden, Amanda L; Roscoe, Eileen M

    2016-06-01

    In Study 1, we evaluated preference stability across 4 preference-assessment methods for 6 individuals, 5 of whom had autism spectrum disorder and 1 of whom had traumatic brain injury. We also measured participants' problem behavior as a corollary measure during all assessment methods. The highest mean correlation coefficients and Kendall rank coefficients of concordance across administrations were observed for the paired-stimulus and multiple-stimulus-without-replacement methods. Lower correspondence across administrations was observed for the free-operant and response-restriction methods. Although differentially higher levels of problem behavior did not occur with a single method, lower levels were consistently observed with the free-operant method. During Study 2, we evaluated the implications of lower coefficients on reinforcer efficacy by comparing an initially identified and an immediately identified high-preference stimulus in a reinforcer assessment. Initially identified and immediately identified high-preference stimuli were equally effective reinforcers, suggesting that fluctuations in preference do not necessarily affect reinforcer efficacy in practice.
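
    The stability statistic named above, Kendall's coefficient of concordance (W), can be computed from the ranks assigned to stimuli across repeated administrations, as in the sketch below; the preference data are invented, not the study's.

```python
import numpy as np
from scipy.stats import rankdata

def kendalls_w(ranks: np.ndarray) -> float:
    """Kendall's W for an m_administrations x n_stimuli array of ranks (no ties)."""
    m, n = ranks.shape
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Preference values from three administrations (higher = more preferred)
prefs = np.array([[5, 3, 4, 1, 2],
                  [5, 4, 3, 2, 1],
                  [4, 5, 3, 1, 2]], dtype=float)
ranks = np.vstack([rankdata(-row) for row in prefs])   # rank 1 = most preferred
print(f"Kendall's W = {kendalls_w(ranks):.2f}")
```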

  5. Ecological Theory and Method for Behavioral Assessment.

    ERIC Educational Resources Information Center

    Carlson, Cindy I.; And Others

    1980-01-01

    The theoretical perspective and naturalistic methods of ecological psychology developed by Roger Barker are described. Ecological assessment implies examining (1) naturally occurring behavior; (2) environment immediately surrounding behavior; and (3) the individual-environment link. The specimen record, chronolog, and behavior-setting survey are…

  6. Methods of Assessment for Affected Family Members

    ERIC Educational Resources Information Center

    Orford, Jim; Templeton, Lorna; Velleman, Richard; Copello, Alex

    2010-01-01

    The article begins by making the point that a good assessment of the needs and circumstances of family members is important if previous neglect of affected family members is to be reversed. The methods we have used in research studies are then described. They include a lengthy semi-structured interview covering seven topic areas and standard…

  7. A New Method to Assess Eye Dominance

    ERIC Educational Resources Information Center

    Valle-Inclan, Fernando; Blanco, Manuel J.; Soto, David; Leiros, Luz

    2008-01-01

    People usually show a stable preference for one of their eyes when monocular viewing is required ("sighting dominance") or under dichoptic stimulation conditions ("sensory eye-dominance"). Current procedures to assess this "eye dominance" are prone to error. Here we present a new method that provides a continuous measure of eye dominance and…

  8. A meaningful MESS (Medical Education Scholarship Support)

    PubMed Central

    Whicker, Shari A.; Engle, Deborah L.; Chudgar, Saumil; DeMeo, Stephen; Bean, Sarah M.; Narayan, Aditee P.; Grochowski, Colleen O'Connor; Nagler, Alisa

    2016-01-01

    Background Graduate medical education faculty bear the responsibility of demonstrating active research and scholarship; however, faculty who choose education-focused careers may face unique obstacles related to the lack of promotion tracks, funding, career options, and research opportunities. Our objective was to address education research and scholarship barriers by providing a collaborative peer-mentoring environment and improve the production of research and scholarly outputs. Methods We describe a Medical Education Scholarship Support (MESS) group created in 2013. MESS is an interprofessional, multidisciplinary peer-mentoring education research community that now spans multiple institutions. This group meets monthly to address education research and scholarship challenges. Through this process, we develop new knowledge, research, and scholarly products, in addition to meaningful collaborations. Results MESS originated with eight founding members, all of whom still actively participate. MESS has proven to be a sustainable unfunded local community of practice, encouraging faculty to pursue health professions education (HPE) careers and fostering scholarship. We have met our original objectives that involved maintaining 100% participant retention; developing increased knowledge in at least seven content areas; and contributing to the development of 13 peer-reviewed publications, eight professional presentations, one Masters of Education project, and one educational curriculum. Discussion The number of individuals engaged in HPE research continues to rise. The MESS model could be adapted for use at other institutions, thereby reducing barriers HPE researchers face, providing an effective framework for trainees interested in education-focused careers, and having a broader impact on the education research landscape. PMID:27476538

  9. A classification scheme for risk assessment methods.

    SciTech Connect

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

    This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects: level of detail and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. Each cell in the table represents a different arrangement of strengths and weaknesses; those arrangements shift gradually as one moves through the table, each cell being optimal for a particular situation. The intention of this report is to enable informed use of the methods so that the method chosen is optimal for the situation at hand. The report imposes structure on the set of risk assessment methods in order to reveal their relationships and thus optimize their usage. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. The matrix, with type names in the cells, is introduced in Table 2 on page 13 of the report. Unless otherwise stated, we use the word 'method' in this report to refer to a 'risk assessment method', though at times we use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows. In Section 2 we provide context for this report

  10. New method for assessing risks of email

    NASA Astrophysics Data System (ADS)

    Raja, Seyyed H.; Afrooz, Farzad

    2013-03-01

    E-mail technology has become one of the basic requirements of daily life for correspondence between individuals. Given this, the important point is that the messages, the e-mail server and client, and the correspondence exchanged between different people must have acceptable security, so that people can be confident in using this technology. In the information age, many financial and non-financial transactions are carried out electronically and data exchange takes place via the internet, where theft and manipulation of data can impose exorbitant costs in terms of integrity as well as financial, political, economic and cultural terms. E-mail correspondence is no exception, and its security is very important. In the review we carried out, no method focusing on risk assessment of e-mail systems was found. We therefore examine assessment methods developed for other systems, together with their strengths and weaknesses, and then apply Convery's method, originally developed for assessing network risks, to assess e-mail risks. At the end of the paper we offer a dedicated table for e-mail risk assessment.

  11. Methods of geodiversity assessment and their application

    NASA Astrophysics Data System (ADS)

    Zwoliński, Zbigniew; Najwer, Alicja; Giardino, Marco

    2016-04-01

    The concept of geodiversity has rapidly gained the approval of scientists around the world (Wiedenbein 1993, Sharples 1993, Kiernan 1995, 1996, Dixon 1996, Eberhard 1997, Kostrzewski 1998, 2011, Gray 2004, 2008, 2013, Zwoliński 2004, Serrano, Ruiz-Flano 2007, Gordon et al. 2012). However, recognition of the problem is still at an early stage, and the concept is in effect not explicitly understood and defined (Najwer, Zwoliński 2014). Despite widespread use of the concept, little progress has been made in its assessment and mapping. Only over roughly the last decade can an investigation of methods for geodiversity assessment and its visualisation be observed, although many have acknowledged the importance of geodiversity evaluation (Kozłowski 2004, Gray 2004, Reynard, Panizza 2005, Zouros 2007, Pereira et al. 2007, Hjort et al. 2015). Hitherto, only a few authors have taken up this kind of methodological issue. Geodiversity maps are created for a variety of purposes and their methods are therefore quite manifold. The literature contains examples of geodiversity maps applied for geotourism purposes, based mainly on geological diversity, in order to indicate the scale of an area's tourist attractiveness (Zwoliński 2010, Serrano and Gonzalez Trueba 2011, Zwoliński and Stachowiak 2012). In some studies, geodiversity maps were created and applied to investigate the spatial or genetic relationships with the richness of particular natural environmental components (Burnett et al. 1998, Silva 2004, Jačková, Romportl 2008, Hjort et al. 2012, 2015, Mazurek et al. 2015, Najwer et al. 2014). There are also a few examples of geodiversity assessment for geoconservation and for efficient management and planning of natural protected areas (Serrano and Gonzalez Trueba 2011, Pellitero et al. 2011, 2014, Jaskulska et al. 2013, Melelli 2014, Martinez-Grana et al. 2015). The most popular method of assessing the diversity of abiotic components of the natural

  12. Methods of geodiversity assessment and their application

    NASA Astrophysics Data System (ADS)

    Zwoliński, Zbigniew; Najwer, Alicja; Giardino, Marco

    2016-04-01

    The concept of geodiversity has rapidly gained the approval of scientists around the world (Wiedenbein 1993, Sharples 1993, Kiernan 1995, 1996, Dixon 1996, Eberhard 1997, Kostrzewski 1998, 2011, Gray 2004, 2008, 2013, Zwoliński 2004, Serrano, Ruiz-Flano 2007, Gordon et al. 2012). However, recognition of the problem is still at an early stage, and the concept is in effect not explicitly understood and defined (Najwer, Zwoliński 2014). Despite widespread use of the concept, little progress has been made in its assessment and mapping. Only over roughly the last decade can an investigation of methods for geodiversity assessment and its visualisation be observed, although many have acknowledged the importance of geodiversity evaluation (Kozłowski 2004, Gray 2004, Reynard, Panizza 2005, Zouros 2007, Pereira et al. 2007, Hjort et al. 2015). Hitherto, only a few authors have taken up this kind of methodological issue. Geodiversity maps are created for a variety of purposes and their methods are therefore quite manifold. The literature contains examples of geodiversity maps applied for geotourism purposes, based mainly on geological diversity, in order to indicate the scale of an area's tourist attractiveness (Zwoliński 2010, Serrano and Gonzalez Trueba 2011, Zwoliński and Stachowiak 2012). In some studies, geodiversity maps were created and applied to investigate the spatial or genetic relationships with the richness of particular natural environmental components (Burnett et al. 1998, Silva 2004, Jačková, Romportl 2008, Hjort et al. 2012, 2015, Mazurek et al. 2015, Najwer et al. 2014). There are also a few examples of geodiversity assessment for geoconservation and for efficient management and planning of natural protected areas (Serrano and Gonzalez Trueba 2011, Pellitero et al. 2011, 2014, Jaskulska et al. 2013, Melelli 2014, Martinez-Grana et al. 2015). The most popular method of assessing the diversity of abiotic components of the natural

  13. Method and apparatus to assess compartment syndrome

    NASA Technical Reports Server (NTRS)

    Ueno, Toshiaki (Inventor); Hargens, Alan R. (Inventor); Yost, William T. (Inventor)

    2008-01-01

    A method and apparatus for measuring pressure buildup in a body compartment that encases muscular tissue. The method includes assessing the body compartment configuration and identifying the effect of pulsatile components on at least one compartment dimension. This process is used in preventing tissue necrosis and in deciding whether to perform surgery on the body compartment for prevention of Compartment Syndrome. The apparatus for measuring excess pressure in the body compartment includes components for imparting ultrasonic waves, such as a transducer; means for placing the transducer to impart the ultrasonic waves, capturing the reflected waves and converting them to electrical signals; a pulsed phase-locked loop device for assessing the body compartment configuration and producing an output signal; and means for mathematically manipulating the output signal to categorize pressure build-up in the body compartment.

  14. Nurse informaticians critical to proving meaningful use.

    PubMed

    Simpson, Roy L

    2011-01-01

    Nurses at the bedside serve on "the front lines" as hospitals strive to prove their "meaningful use" of technology to the federal government in hopes of securing significant funds from the American Recovery and Reinvestment Act. Nurse informaticians, working in concert with chief nursing officers, guide the nursing organization toward the most effective and efficient ways to demonstrate "meaningful use." Armed with data points from the point of care, nurse informaticians and chief nursing officers will be able to quantify, for the very first time, the value of nursing's contribution to the quality of patient care in America.

  15. Method of assessing heterogeneity in images

    DOEpatents

    Jacob, Richard E.; Carson, James P.

    2016-08-23

    A method of assessing heterogeneity in images is disclosed. 3D images of an object are acquired. The acquired images may be filtered and masked. Iterative decomposition is performed on the masked images to obtain image subdivisions that are relatively homogeneous. Comparative analysis, such as variogram analysis or correlogram analysis, is performed of the decomposed images to determine spatial relationships between regions of the images that are relatively homogeneous.
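
    A minimal sketch of the comparative analysis mentioned in the patent: an empirical, axis-aligned variogram for a 2D image slice, estimating semivariance as a function of lag. The lag handling and the test image below are illustrative assumptions, not the patented procedure.

```python
import numpy as np

def empirical_variogram(img: np.ndarray, max_lag: int = 10):
    """Semivariance gamma(h) for integer lags along rows and columns."""
    lags, gammas = [], []
    for h in range(1, max_lag + 1):
        diffs = np.concatenate([
            (img[h:, :] - img[:-h, :]).ravel(),   # vertical pairs at lag h
            (img[:, h:] - img[:, :-h]).ravel(),   # horizontal pairs at lag h
        ])
        lags.append(h)
        gammas.append(0.5 * np.mean(diffs ** 2))
    return np.array(lags), np.array(gammas)

rng = np.random.default_rng(1)
image = rng.normal(size=(64, 64)) + np.linspace(0, 2, 64)   # noise plus a gradient
lags, gammas = empirical_variogram(image)
for h, g in zip(lags, gammas):
    print(f"lag {h:2d}: gamma = {g:.3f}")
```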

  16. [Methods of risk assessment and their validation].

    PubMed

    Baracco, Alessandro

    2014-01-01

    A review of the literature shows several methods for assessing the risk of biomechanical overload of the musculoskeletal system in activities involving repetitive strain of the upper limbs and manual material handling. The application of these methods should allow the quantification of risk for the working population, the identification of preventive measures to reduce the risk and of their effectiveness, and the design of a specific health surveillance scheme. In this paper we analyze the factors which must be taken into account in Occupational Medicine to implement a process of validation of these methods. In conclusion, we believe that new methods able to analyze and reduce risk already in the design phase of the production process will be necessary in the future. PMID:25558718

  17. Using Meaningful Interpretation and Chunking to Enhance Memory: The Case of Chinese Character Learning

    ERIC Educational Resources Information Center

    Xu, Xiaoqiu; Padilla, Amado M.

    2013-01-01

    Learning and retaining Chinese characters are often considered to be the most challenging elements in learning Chinese as a foreign language. Applying the theory of meaningful interpretation, the chunking mnemonic technique, and the linguistic features of Chinese characters, this study examines whether the method of meaningful interpretation and…

  18. Spatial Heterogeneity of Rana boylii Habitat: Quantification and Ecological Meaningfulness

    NASA Astrophysics Data System (ADS)

    Yarnell, S. M.

    2005-05-01

    Analysis of the heterogeneity of stream habitat and how biological communities respond to that complexity are fundamental components of ecosystem analysis that are often inadequately addressed in watershed assessments and restoration practices. Many aquatic species, such as the Foothill Yellow-legged Frog (Rana boylii), known to associate with certain physical habitats at various times throughout their lifecycle may require some degree of habitat complexity at a larger reach scale for a population to persist. Recent research in the field of landscape ecology has expanded the use of spatial heterogeneity indices to other fields of ecology as an objective method to quantify variability in habitat. Provided that indices are used in an appropriate context and are shown to be ecologically meaningful, they provide a potentially useful tool for quantifying the variability in riverine habitat for aquatic species such as R. boylii. This study evaluated whether stream reaches with a high heterogeneity of geomorphic features, as measured by several key spatial heterogeneity indices, correlated with a greater relative abundance of R. boylii. R. boylii habitat associations were quantified throughout a single season to obtain further insight into the local hydraulic and geomorphic conditions preferred by each lifestage. The two best predictors of habitat associations by lifestage were velocity and substrate size, two key characteristics of geomorphic units such as riffles and pools. The heterogeneity of geomorphic units was then quantified and measured at the reach scale using a variety of spatial indices. Indices of spatial composition, such as Shannon's Diversity Index, were found to correlate well with frog abundance, while indices of spatial configuration, such as Contagion, were not significant. These findings indicate R. boylii may select stream reaches with increased geomorphic complexity that potentially provide habitats suitable to each lifestage with multiple functions
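
    The composition index used here, Shannon's Diversity Index, is straightforward to compute from the proportions of geomorphic unit types in a reach, as in the sketch below; the unit proportions are invented for illustration.

```python
import math

def shannon_diversity(proportions):
    """H' = -sum(p_i * ln p_i) over unit types with nonzero proportion."""
    return -sum(p * math.log(p) for p in proportions if p > 0)

# Hypothetical composition of geomorphic units within one reach
reach_units = {"riffle": 0.35, "run": 0.25, "pool": 0.30, "glide": 0.10}
h = shannon_diversity(reach_units.values())
h_max = math.log(len(reach_units))          # maximum possible diversity (perfect evenness)
print(f"H' = {h:.2f}, evenness = {h / h_max:.2f}")
```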

  19. An assessment of vapour pressure estimation methods.

    PubMed

    O'Meara, Simon; Booth, Alastair Murray; Barley, Mark Howard; Topping, David; McFiggans, Gordon

    2014-09-28

    Laboratory measurements of vapour pressures for atmospherically relevant compounds were collated and used to assess the accuracy of vapour pressure estimates generated by seven estimation methods and impacts on predicted secondary organic aerosol. Of the vapour pressure estimation methods that were applicable to all the test set compounds, the Lee-Kesler [Reid et al., The Properties of Gases and Liquids, 1987] method showed the lowest mean absolute error and the Nannoolal et al. [Nannoolal et al., Fluid Phase Equilib., 2008, 269, 117-133] method showed the lowest mean bias error (when both used normal boiling points estimated using the Nannoolal et al. [Nannoolal et al., Fluid Phase Equilib., 2004, 226, 45-63] method). The effect of varying vapour pressure estimation methods on secondary organic aerosol (SOA) mass loading and composition was investigated using an absorptive partitioning equilibrium model. The Myrdal and Yalkowsky [Myrdal and Yalkowsky, Ind. Eng. Chem. Res., 1997, 36, 2494-2499] vapour pressure estimation method using the Nannoolal et al. [Nannoolal et al., Fluid Phase Equilib., 2004, 226, 45-63] normal boiling point gave the most accurate estimation of SOA loading despite not being the most accurate for vapour pressures alone. PMID:25105180
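
    A minimal sketch of an absorptive partitioning equilibrium calculation of the kind referred to above: estimated vapour pressures are converted to saturation concentrations C*, and the condensed organic aerosol mass is found by fixed-point iteration. The compound properties are invented and the conversion follows the usual ideal-gas convention; this is not the authors' model code.

```python
import numpy as np

R = 8.205736e-5   # m3 atm mol-1 K-1
T = 298.15        # K

# Each row: (total concentration ug/m3, vapour pressure atm, molar mass g/mol)
compounds = np.array([
    [5.0, 1e-9,  200.0],
    [3.0, 1e-11, 250.0],
    [8.0, 1e-7,  150.0],
])
c_total, p0, mw = compounds.T
c_star = 1e6 * mw * p0 / (R * T)            # saturation concentration, ug/m3

c_oa = 1.0                                   # initial guess for OA mass, ug/m3
for _ in range(200):                         # fixed-point iteration to equilibrium
    partition_fraction = 1.0 / (1.0 + c_star / c_oa)
    c_oa_new = np.sum(c_total * partition_fraction)
    if abs(c_oa_new - c_oa) < 1e-9:
        break
    c_oa = c_oa_new

print(f"Predicted SOA loading: {c_oa:.2f} ug/m3")
print("Condensed fractions:", np.round(partition_fraction, 3))
```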

  20. Normalization in sustainability assessment: Methods and implications

    DOE PAGES

    Pollesch, N. L.; Dale, Virginia H.

    2016-08-08

    One approach to assessing progress towards sustainability makes use of diverse indicators spanning the environmental, social, and economic dimensions of the system being studied. Given the use of multiple indicators and the inherent complexity entailed in interpreting several metrics, aggregation of sustainability indicators is a common step after indicator measures are quantified. Diverse indicators have different units of measurement, and normalization is the procedure employed to transform differing indicator measures onto similar scales or to unit-free measures. It is often difficult for stakeholders to make clear connections between specific indicator measurements and resulting aggregate scores of sustainability. Normalization can also create implicit weightings of indicator measures that are independent of actual stakeholder preference or explicit weighting. This paper explores normalization methods utilized in sustainability assessment including ratio normalization, target normalization, Z-score normalization, and unit equivalence normalization. A mathematical analysis of the impact of changes in raw indicator data measurements on an aggregate sustainability score is developed. Theoretical results are clarified through a case study of data used in assessment of progress towards bioenergy sustainability. Advantages and drawbacks associated with different normalization schemes are discussed within the context of sustainability assessment.
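
    The first three normalization schemes named in the abstract are easy to illustrate on a vector of raw indicator values, as in the sketch below; the data and the target value are invented, and unit equivalence normalization is omitted because it requires indicator-specific conversion factors.

```python
import numpy as np

x = np.array([12.0, 7.5, 30.0, 18.2, 22.1])    # raw indicator measurements (invented)

ratio_norm  = x / x.max()                       # ratio: divide by a reference value (here the max)
target_norm = x / 25.0                          # target: divide by a hypothetical policy target
z_norm      = (x - x.mean()) / x.std(ddof=1)    # Z-score: centre and scale to unit variance

for name, vals in [("ratio", ratio_norm), ("target", target_norm), ("z-score", z_norm)]:
    print(f"{name:>8}: {np.round(vals, 2)}")
```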

  1. Meaningful Learning in the Cooperative Classroom

    ERIC Educational Resources Information Center

    Sharan, Yael

    2015-01-01

    Meaningful learning is based on more than what teachers transmit; it promotes the construction of knowledge out of learners' experience, feelings and exchanges with other learners. This educational view is based on the constructivist approach to learning and the co-operative learning approach. Researchers and practitioners in various…

  2. Making Biodiversity Meaningful through Environmental Education.

    ERIC Educational Resources Information Center

    van Weelie, Daan; Wals, Arjen E. J.

    2002-01-01

    Explores the crossroads between science education and environmental education and presents a framework for tapping environmental education's potential of biodiversity. Outlines a number of stepping stones for making biodiversity meaningful to learners. From the perspective of environmental education, the ill-defined nature of biodiversity is a…

  3. Meaningful Experiences in the Counseling Process

    ERIC Educational Resources Information Center

    Sackett, Corrine; Lawson, Gerard; Burge, Penny L.

    2012-01-01

    Researchers examined the experiences of a counseling session from the perspectives of counselors-intraining (CITs) and clients. Post-session phenomenological interviews were conducted to elicit participants' meaningful experiences, and the analysis revealed both similarities and differences. Researchers found the following themes most meaningful…

  4. Values: The Natural Result of Meaningful Relationships.

    ERIC Educational Resources Information Center

    Beedy, Jeff; Gordon, John

    1997-01-01

    The New Hampton School (New Hampshire) uses the holistic Total Human Development Model with both students and faculty to instill principles focused on relationships as central to teaching and learning; respect and responsibility; sense of community; whole person development within the community; compassion and service; and the meaningful,…

  5. Meaningful Use of School Health Data

    ERIC Educational Resources Information Center

    Johnson, Kathleen Hoy; Bergren, Martha Dewey

    2011-01-01

    Meaningful use (MU) of Electronic Health Records (EHRs) is an important development in the safety and security of health care delivery in the United States. Advancement in the use of EHRs occurred with the passage of the American Recovery and Reinvestment Act of 2009, which provides incentives for providers to support adoption and use of EHRs.…

  6. On Meaningful Measurement: Concepts, Technology and Examples.

    ERIC Educational Resources Information Center

    Cheung, K. C.

    This paper discusses how concepts and procedural skills in problem-solving tasks, as well as affects and emotions, can be subjected to meaningful measurement (MM), based on a multisource model of learning and a constructivist information-processing theory of knowing. MM refers to the quantitative measurement of conceptual and procedural knowledge…

  7. Autonomic pain: features and methods of assessment

    SciTech Connect

    Gandhavadi, B.; Rosen, J.S.; Addison, R.G.

    1982-01-01

    The distribution of pain originating in the sympathetic nervous system does not match the somatic segmental sensory distribution at the postganglionic level. The two types of distribution are separate and different. At the preganglionic level, fibers show typical segmental sensory distribution, which resembles but is not identical to somatic segmental sensory distribution. Instead, sympathetic pain has its own distribution along the vascular supply and some peripheral nerves. It cannot be called atypical in terms of somatic segmental sensory distribution. Several techniques are available to assess autonomic function in cases of chronic pain. Infrared thermography is superior to any other physiologic or pharmacologic method to assess sympathetic function. Overactivity of sympathetic function in the area of pain is the probable cause of temperature reduction in that area. Accordingly it would appear that in cases in which thermography demonstrates decreased temperature, sympathetic block or sympathectomy would provide relief from the pain.

  8. Rangeland assessment and monitoring methods guide - an interactive tool for selecting methods for assessment and monitoring

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A common concern expressed by land managers and biologists is that they do not know enough about the strengths and weaknesses of different field and remote-sensing methods for rangeland assessment and monitoring. The Methods Guide is a web-based tool and resource that provides researchers and manage...

  9. Cardiovascular complications in acromegaly: methods of assessment.

    PubMed

    Vitale, G; Pivonello, R; Galderisi, M; D'Errico, A; Spinelli, L; Lupoli, G; Lombardi, G; Colao, A

    2001-09-01

    Cardiac involvement is common in acromegaly. Evidence for cardiac hypertrophy, dilation and diastolic filling abnormalities has been widely reported in the literature. Ventricular hypertrophy is generally revealed by echocardiography, although early reports described increased cardiac size on standard X-ray. Echocardiography also investigates cardiac function and valve disease. New technologic advances in ultrasonic imaging are available. Pulsed Tissue Doppler is a non-invasive ultrasound tool that extends Doppler applications beyond the analysis of intra-cardiac flow velocities to the quantitative assessment of regional myocardial left ventricular wall motion, directly measuring velocities and time intervals of the myocardium. Radionuclide techniques permit better study of cardiac performance: diastolic as well as systolic function can be assessed at rest and at peak exercise by equilibrium radionuclide angiography, which has the main advantage of providing a direct, operator-independent evaluation of ventricular function. Coronary artery disease has been poorly studied, mainly because invasive procedures are required; only a few cases of heart failure studied by coronarography have been reported, with perfusion alterations that improved after somatostatin analog treatment. More recently, a few data have been presented using perfusion scintigraphy in acromegaly, even though coronary artery disease does not seem very frequent in this condition. Doppler analysis of the carotid arteries can also be performed to investigate atherosclerosis; however, patients with active acromegaly show endothelial dysfunction more often than clear-cut atherosclerotic plaques. In conclusion, careful assessment of cardiac function, morphology and activity is needed in patients with acromegaly.

  10. An empirical method for dynamic camouflage assessment

    NASA Astrophysics Data System (ADS)

    Blitch, John G.

    2011-06-01

    As camouflage systems become increasingly sophisticated in their potential to conceal military personnel and precious cargo, evaluation methods need to evolve as well. This paper presents an overview of one such attempt to explore alternative methods for empirical evaluation of dynamic camouflage systems which aspire to keep pace with a soldier's movement through rapidly changing environments that are typical of urban terrain. Motivating factors are covered first, followed by a description of the Blitz Camouflage Assessment (BCA) process and results from an initial proof of concept experiment conducted in November 2006. The conclusion drawn from these results, related literature and the author's personal experience suggest that operational evaluation of personal camouflage needs to be expanded beyond its foundation in signal detection theory and embrace the challenges posed by high levels of cognitive processing.

  11. Toward More Substantively Meaningful Automated Essay Scoring

    ERIC Educational Resources Information Center

    Ben-Simon, Anat; Bennett, Randy Elliott

    2007-01-01

    This study evaluated a "substantively driven" method for scoring NAEP writing assessments automatically. The study used variations of an existing commercial program, e-rater[R], to compare the performance of three approaches to automated essay scoring: a "brute-empirical" approach in which variables are selected and weighted solely according to…

  12. Electrophysiological methods for hearing assessment in pinnipeds

    NASA Astrophysics Data System (ADS)

    Reichmuth Kastak, Colleen; Kastak, David; Finneran, James J.; Houser, Dorian S.; Supin, Alexander

    2005-04-01

    Studies of auditory sensitivity in marine mammals generally rely on behavioral psychophysical methodologies. While these studies are the standard for hearing assessment in marine mammals, data are limited to only a few individuals representing a small proportion of species. Accumulating research on dolphin auditory physiology has resulted in the refinement of electrophysiological methods appropriate for odontocete cetaceans and an increase in available audiometric information. Electrophysiological methods have also been used with pinnipeds, but there are significant gaps in our understanding of pinniped auditory physiology that must be addressed before such approaches can be broadly applied to investigations of pinniped hearing. We are taking a bottom-up approach to developing suitable methods for evoked potential audiometry in pinnipeds, including technology transfer from studies of cetaceans and other mammals, mapping of response amplitude with respect to recording positions on the skull, characterization of responses in relationship to various stimulus types and presentation parameters, and determination of whether useful frequency-specific data can be reliably obtained using electrophysiological methods. This approach is being taken with representative pinniped species including California sea lions (Zalophus californianus), harbor seals (Phoca vitulina), and northern elephant seals (Mirounga angustirostris) using both training and chemical immobilization techniques. [Work supported by NOPP.]

  13. Screening for Meares-Irlen sensitivity in adults: can assessment methods predict changes in reading speed?

    PubMed

    Hollis, Jarrod; Allen, Peter M

    2006-11-01

    Two methods of assessing candidates for coloured overlays were compared with the aim of determining which method had the most practical utility. A total of 58 adults were assessed as potential candidates for coloured overlays, using two methods: a questionnaire, which identified self-reported previous symptoms, and a measure of perceptual distortions immediately prior to testing. Participants were classified as normal, Meares-Irlen sensitive, and borderline sensitive. Reading speed was measured with and without coloured overlays, using the Wilkins Rate of Reading Test, and the change in speed was calculated. Participants classified as normal did not show any significant benefit from reading with an overlay. In contrast, a significant reading advantage was found for the borderline and Meares-Irlen participants. Current symptom rating was found to be a significant predictor of the change in reading speed; however, the previous symptom rating was not found to be a reliable predictor. These data indicate that the assessment of perceptual distortions immediately prior to measuring colour preference and reading speed is the most meaningful method of assessing pattern glare and determining the utility of coloured overlays.

  14. Methods for probabilistic assessments of geologic hazards

    SciTech Connect

    Mann, C.J.

    1987-01-01

    Although risk analysis today is considered to include three separate aspects: (1) identifying sources of risk, (2) estimating probabilities quantitatively, and (3) evaluating consequences of risk, here, only estimation of probabilities for natural geologic events, processes, and phenomena is addressed. Ideally, evaluation of potential future hazards includes an objective determination of probabilities that has been derived from past occurrences of identical events or components contributing to complex processes or phenomena. In practice, however, data which would permit objective estimation of those probabilities of interest may not be adequate, or may not even exist. Another problem that arises normally, regardless of the extent of data, is that risk assessments involve estimating extreme values. Rarely are extreme values accurately predictable even when an empirical frequency distribution is established well by data. In the absence of objective methods for estimating probabilities of natural events or processes, subjective probabilities for the hazard must be established through Bayesian methods, expert opinion, or Delphi methods. Uncertainty of every probability determination must be stated for each component of an event, process, or phenomenon. These uncertainties also must be propagated through the quantitative analysis so that a realistic estimate of total uncertainty can be associated with each final probability estimate for a geologic hazard.
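
    A generic illustration of the uncertainty-propagation step described above (a Monte Carlo sketch using assumed numbers, not a procedure taken from the paper): a subjectively estimated annual event rate with an expert-judged spread is propagated into a 50-year exceedance probability and reported as a range rather than a single value.

      import numpy as np

      rng = np.random.default_rng(42)
      # Subjective annual rate of the hazardous event: median 1e-3 per year with a
      # lognormal spread standing in for expert uncertainty (assumed values).
      annual_rate = rng.lognormal(mean=np.log(1e-3), sigma=0.7, size=100_000)
      # Poisson model: probability of at least one occurrence in 50 years.
      p_50yr = 1.0 - np.exp(-annual_rate * 50.0)
      # Report the propagated uncertainty, not a single point estimate.
      print(np.percentile(p_50yr, [5, 50, 95]))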

  15. An interpolation method for stream habitat assessments

    USGS Publications Warehouse

    Sheehan, Kenneth R.; Welsh, Stuart A.

    2015-01-01

    Interpolation of stream habitat can be very useful for habitat assessment. Using a small number of habitat samples to predict the habitat of larger areas can reduce time and labor costs as long as it provides accurate estimates of habitat. The spatial correlation of stream habitat variables such as substrate and depth improves the accuracy of interpolated data. Several geographical information system interpolation methods (natural neighbor, inverse distance weighted, ordinary kriging, spline, and universal kriging) were used to predict substrate and depth within a 210.7-m2 section of a second-order stream based on 2.5% and 5.0% sampling of the total area. Depth and substrate were recorded for the entire study site and compared with the interpolated values to determine the accuracy of the predictions. In all instances, the 5% interpolations were more accurate for both depth and substrate than the 2.5% interpolations, which achieved accuracies up to 95% and 92%, respectively. Interpolations of depth based on 2.5% sampling attained accuracies of 49–92%, whereas those based on 5% sampling attained accuracies of 57–95%. Natural neighbor interpolation was more accurate than that using the inverse distance weighted, ordinary kriging, spline, and universal kriging approaches. Our findings demonstrate the effective use of minimal amounts of small-scale data for the interpolation of habitat over large areas of a stream channel. Use of this method will provide time and cost savings in the assessment of large sections of rivers as well as functional maps to aid the habitat-based management of aquatic species.
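
    As a rough illustration of one of the interpolation approaches named above, the sketch below implements plain inverse-distance weighting on synthetic depth data and scores prediction accuracy against the known values; it is not the authors' GIS workflow, and the sample fraction and error measure are assumptions.

      import numpy as np

      def idw_interpolate(sample_xy, sample_z, query_xy, power=2.0):
          """Predict values at query points from sparse samples by inverse-distance weighting."""
          preds = []
          for q in query_xy:
              d = np.linalg.norm(sample_xy - q, axis=1)
              if np.any(d == 0):                      # query coincides with a sample point
                  preds.append(sample_z[np.argmin(d)])
                  continue
              w = 1.0 / d ** power
              preds.append(np.sum(w * sample_z) / np.sum(w))
          return np.array(preds)

      rng = np.random.default_rng(0)
      grid = np.array([(x, y) for x in range(60) for y in range(7)], dtype=float)
      true_depth = 0.3 + 0.01 * grid[:, 0] + 0.05 * np.sin(grid[:, 1])    # synthetic depths (m)
      sample_idx = rng.choice(len(grid), size=int(0.05 * len(grid)), replace=False)  # 5% sample
      predicted = idw_interpolate(grid[sample_idx], true_depth[sample_idx], grid)
      accuracy = 100.0 * (1.0 - np.mean(np.abs(predicted - true_depth) / true_depth))
      print(f"mean accuracy of 5% IDW interpolation: {accuracy:.1f}%")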

  16. Methods for regional assessment of geothermal resources

    USGS Publications Warehouse

    Muffler, P.; Cataldi, R.

    1978-01-01

    A consistent, agreed-upon terminology is prerequisite for geothermal resource assessment. Accordingly, we propose a logical, sequential subdivision of the "geothermal resource base", accepting its definition as all the thermal energy in the earth's crust under a given area, measured from mean annual temperature. That part of the resource base which is shallow enough to be tapped by production drilling is termed the "accessible resource base", and it in turn is divided into "useful" and "residual" components. The useful component (i.e. the thermal energy that could reasonably be extracted at costs competitive with other forms of energy at some specified future time) is termed the "geothermal resource". This in turn is divided into "economic" and "subeconomic" components, based on conditions existing at the time of assessment. In the format of a McKelvey diagram, this logic defines the vertical axis (degree of economic feasibility). The horizontal axis (degree of geologic assurance) contains "identified" and "undiscovered" components. "Reserve" is then designated as the identified economic resource. All categories should be expressed in units of thermal energy, with resource and reserve figures calculated at wellhead, prior to the inevitable large losses inherent in any practical thermal use or in conversion to electricity. Methods for assessing geothermal resources can be grouped into 4 classes: (a) surface thermal flux, (b) volume, (c) planar fracture and (d) magmatic heat budget. The volume method appears to be most useful because (1) it is applicable to virtually any geologic environment, (2) the required parameters can in principle be measured or estimated, (3) the inevitable errors are in part compensated and (4) the major uncertainties (recoverability and resupply) are amenable to resolution in the foreseeable future. The major weakness in all the methods rests in the estimation of how much of the accessible resource base can be extracted at some time in the
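
    The volume method lends itself to a back-of-the-envelope calculation. The sketch below uses entirely hypothetical numbers (area, depth interval, temperatures, volumetric heat capacity, recovery factor) to show how an accessible resource base and a recoverable resource figure follow from the definitions given above; it is not a worked example from the paper.

      # Volume method: thermal energy = rock volume x volumetric heat capacity x (T - T_ref)
      area_km2 = 10.0              # assumed area of the prospect
      thickness_km = 2.0           # assumed depth interval reachable by production drilling
      volume_m3 = area_km2 * 1e6 * thickness_km * 1e3
      vol_heat_capacity = 2.5e6    # J/(m^3 K), a typical value for rock plus pore fluid
      t_reservoir_c = 150.0        # assumed mean reservoir temperature
      t_reference_c = 15.0         # mean annual surface temperature
      accessible_resource_base_j = volume_m3 * vol_heat_capacity * (t_reservoir_c - t_reference_c)
      recovery_factor = 0.25       # assumed fraction extractable at the wellhead
      geothermal_resource_j = accessible_resource_base_j * recovery_factor
      print(f"accessible resource base: {accessible_resource_base_j:.2e} J")
      print(f"recoverable resource:     {geothermal_resource_j:.2e} J")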

  17. Evaluation of methods to assess physical activity

    NASA Astrophysics Data System (ADS)

    Leenders, Nicole Y. J. M.

    Epidemiological evidence has accumulated that demonstrates that the amount of physical activity-related energy expenditure during a week reduces the incidence of cardiovascular disease, diabetes, obesity, and all-cause mortality. To further understand the amount of daily physical activity and related energy expenditure that are necessary to maintain or improve functional health status and quality of life, instruments that estimate total (TDEE) and physical activity-related energy expenditure (PAEE) under free-living conditions should be determined to be valid and reliable. Without evaluation of the various methods that estimate TDEE and PAEE against the doubly labeled water (DLW) method in females, there will be significant limitations on assessing the efficacy of physical activity interventions on health status in this population. A triaxial accelerometer (Tritrac-R3D, TT), a uniaxial activity monitor (Computer Science and Applications Inc., CSA), a Yamax-Digiwalker-500 (YX-stepcounter), the measurement of heart rate responses (HR method) and a 7-d Physical Activity Recall questionnaire (7-d PAR) were compared with the "criterion method" of DLW during a 7-d period in female adults. The DLW-TDEE was underestimated on average by 9, 11 and 15% using the 7-d PAR, HR method and TT. The underestimation of DLW-PAEE by the 7-d PAR was 21%, compared to 47% and 67% for the TT and YX-stepcounter. Approximately 56% of the variance in DLW-PAEE per kg is explained by the registration of body movement with accelerometry. A larger proportion of the variance in DLW-PAEE per kg was explained by jointly incorporating information from the vertical and horizontal movement measured with the CSA and Tritrac-R3D (r² = 0.87). Although only a small amount of variance in DLW-PAEE per kg is explained by the number of steps taken per day, because of its low cost and ease of use, the Yamax stepcounter is useful in studies promoting daily walking. Thus, studies involving the

  18. Direct toxicity assessment - Methods, evaluation, interpretation.

    PubMed

    Gruiz, Katalin; Fekete-Kertész, Ildikó; Kunglné-Nagy, Zsuzsanna; Hajdu, Csilla; Feigl, Viktória; Vaszita, Emese; Molnár, Mónika

    2016-09-01

    Direct toxicity assessment (DTA) results provide the scale of the actual adverse effect of contaminated environmental samples. DTA results are used in environmental risk management of contaminated water, soil and waste, without explicitly translating the results into chemical concentration. The end points are the same as in environmental toxicology in general, i.e. inhibition rate, decrease in the growth rate or in yield and the 'no effect' or the 'lowest effect' measurement points of the sample dilution-response curve. The measurement unit cannot be a concentration, since the contaminants and their content in the sample are unknown. Thus toxicity is expressed as the sample proportion causing a certain scale of inhibition or no inhibition. Another option for characterizing the scale of toxicity of an environmental sample is equivalencing. Toxicity equivalencing represents an interpretation tool which enables the toxicity of unknown mixtures of chemicals to be converted into the concentration of an equivalently toxic reference substance. Toxicity equivalencing, (i.e. expressing the toxicity of unknown contaminants as the concentration of the reference) makes DTA results better understandable for non-ecotoxicologists and other professionals educated and thinking based on the chemical model. This paper describes and discusses the role, the principles, the methodology and the interpretation of direct toxicity assessment (DTA) with the aim to contribute to the understanding of the necessity to integrate DTA results into environmental management of contaminated soil and water. The paper also introduces the benefits of the toxicity equivalency method. The use of DTA is illustrated through two case studies. The first case study focuses on DTA of treated wastewater with the aim to characterize the treatment efficacy of a biological wastewater treatment plant by frequent bioassaying. The second case study applied DTA to investigate the cover layers of two bauxite residue (red mud

  20. Dioxin surrogates - are there any meaningful ones?

    SciTech Connect

    Rigo, H.G.

    1997-12-01

    A number of easily measured pollutants have been offered up as dioxin surrogates - easily measured pollutants or process variables that consistently parallel dioxin emissions. Surrogates are a very attractive concept because the expense of direct dioxin measurement could be avoided. Also, the public can be assured that as long as easily and potentially continuously measured surrogates are within acceptable limits, people are not being exposed to excessive amounts of pollution. The question remains, however, is there a meaningful surrogate at conventionally regulated levels? Or, perhaps more importantly, at elevated levels that provide a comfortable margin of safety between normal operations and the conditions likely to represent elevated dioxin emissions?

  1. Meaningful Understanding and Systems Thinking in Organic Chemistry: Validating Measurement and Exploring Relationships

    NASA Astrophysics Data System (ADS)

    Vachliotis, Theodoros; Salta, Katerina; Tzougraki, Chryssa

    2014-04-01

    The purpose of this study was dual: First, to develop and validate assessment schemes for assessing 11th grade students' meaningful understanding of organic chemistry concepts, as well as their systems thinking skills in the domain. Second, to explore the relationship between the two constructs of interest based on students' performance on the applied assessment framework. For this purpose, (a) various types of objective assessment questions were developed and evaluated for assessing meaningful understanding, (b) a specific type of systemic assessment questions (SAQs) was developed and evaluated for assessing systems thinking skills, and (c) the association between students' responses on the applied assessment schemes was explored. The results indicated that properly designed objective questions can effectively capture aspects of students' meaningful understanding. It was also found that the SAQs can elicit systems thinking skills in the context of a formalistic systems thinking theoretical approach. Moreover, a significant relationship was observed between students' responses on the two assessment strategies. This research provides evidence that students' systems thinking level within a science domain is significantly related to their meaningful understanding of relative science concepts.

  2. A Screening Method for Assessing Cumulative Impacts

    PubMed Central

    Alexeeff, George V.; Faust, John B.; August, Laura Meehan; Milanes, Carmen; Randles, Karen; Zeise, Lauren; Denton, Joan

    2012-01-01

    The California Environmental Protection Agency (Cal/EPA) Environmental Justice Action Plan calls for guidelines for evaluating “cumulative impacts.” As a first step toward such guidelines, a screening methodology for assessing cumulative impacts in communities was developed. The method, presented here, is based on the working definition of cumulative impacts adopted by Cal/EPA [1]: “Cumulative impacts means exposures, public health or environmental effects from the combined emissions and discharges in a geographic area, including environmental pollution from all sources, whether single or multi-media, routinely, accidentally, or otherwise released. Impacts will take into account sensitive populations and socio-economic factors, where applicable and to the extent data are available.” The screening methodology is built on this definition as well as current scientific understanding of environmental pollution and its adverse impacts on health, including the influence of both intrinsic, biological factors and non-intrinsic socioeconomic factors in mediating the effects of pollutant exposures. It addresses disparities in the distribution of pollution and health outcomes. The methodology provides a science-based tool to screen places for relative cumulative impacts, incorporating both the pollution burden on a community (including exposures to pollutants, their public health and environmental effects) and community characteristics, specifically sensitivity and socioeconomic factors. The screening methodology provides relative rankings to distinguish more highly impacted communities from less impacted ones. It may also help identify which factors are the greatest contributors to a community’s cumulative impact. It is not designed to provide quantitative estimates of community-level health impacts. A pilot screening analysis is presented here to illustrate the application of this methodology. Once guidelines are adopted, the methodology can serve as a screening
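
    A hypothetical sketch of how such a relative ranking might be computed: a pollution-burden component (exposures and effects) is combined with a population-characteristics component (sensitivity and socioeconomic factors), here multiplicatively. The component structure, scores and weighting are illustrative assumptions, not the published Cal/EPA formula.

      from statistics import mean

      # Per-community indicator scores on a common 0-100 scale (made-up numbers).
      communities = {
          "A": {"exposures": [70, 55, 60], "effects": [40, 50],
                "sensitivity": [65, 70], "socioeconomic": [80]},
          "B": {"exposures": [30, 20, 25], "effects": [35, 15],
                "sensitivity": [40, 30], "socioeconomic": [25]},
      }

      def cumulative_impact_score(c):
          pollution_burden = mean([mean(c["exposures"]), mean(c["effects"])])
          population_chars = mean([mean(c["sensitivity"]), mean(c["socioeconomic"])])
          return pollution_burden * population_chars / 100.0   # relative score, not a health estimate

      ranked = sorted(communities, key=lambda name: cumulative_impact_score(communities[name]),
                      reverse=True)
      print(ranked)   # more highly impacted communities listed first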

  3. Assessment Methods and Tools for Architectural Curricula

    ERIC Educational Resources Information Center

    Marriott, Christine A.

    2012-01-01

    This research explores the process of assessment within the arena of architectural education by questioning traditional assessment practices and probing into the conditions that necessitate change. As architectural educators we have opened our studios to digital technologies for the purposes of design and representation, but how do we measure and…

  4. The Assessment of Oracy: Feasibility and Methods.

    ERIC Educational Resources Information Center

    Bourke, Sid

    A feasibility study has been conducted in Australia to investigate school and teacher objectives and practices in the development of oracy, to determine oracy skills agreed to be important, and to assess the feasibility and desirability of testing competence in oracy. The need for oracy assessment arises from a need for schools to account for…

  5. Meaningful Understanding and Systems Thinking in Organic Chemistry: Validating Measurement and Exploring Relationships

    ERIC Educational Resources Information Center

    Vachliotis, Theodoros; Salta, Katerina; Tzougraki, Chryssa

    2014-01-01

    The purpose of this study was dual: First, to develop and validate assessment schemes for assessing 11th grade students' meaningful understanding of organic chemistry concepts, as well as their systems thinking skills in the domain. Second, to explore the relationship between the two constructs of interest based on students' performance…

  6. Using Corporate-Based Methods To Assess Technical Communication Programs.

    ERIC Educational Resources Information Center

    Faber, Brenton; Bekins, Linn; Karis, Bill

    2002-01-01

    Investigates methods of program assessment used by corporate learning sites and profiles value added methods as a way to both construct and evaluate academic programs in technical communication. Examines and critiques assessment methods from corporate training environments including methods employed by corporate universities and value added…

  7. Towards a mathematical theory of meaningful communication.

    PubMed

    Corominas-Murtra, Bernat; Fortuny, Jordi; Solé, Ricard V

    2014-04-04

    Meaning has been left outside most theoretical approaches to information in biology. Functional responses based on an appropriate interpretation of signals have been replaced by a probabilistic description of correlations between emitted and received symbols. This assumption leads to potential paradoxes, such as the presence of a maximum information associated with a channel that creates completely wrong interpretations of the signals. Game-theoretic models of language evolution and other studies considering embodied communicating agents show that the correct (meaningful) match resulting from agent-agent exchanges is always achieved and natural systems obviously solve the problem correctly. Inspired by the concept of duality of the communicative sign stated by the Swiss linguist Ferdinand de Saussure, here we present a complete description of the minimal system necessary to measure the amount of information that is consistently decoded. Several consequences of our developments are investigated, such as the uselessness of a certain amount of information properly transmitted for communication among autonomous agents.
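
    A toy numerical illustration of the paradox mentioned above (an assumption-laden sketch, not code from the paper): a noiseless binary channel whose receiver systematically swaps the two referents carries one full bit of Shannon information, yet none of it is decoded onto the intended meaning.

      import numpy as np

      def mutual_information_bits(joint):
          """Shannon mutual information of a joint distribution over (sent, decoded) symbols."""
          px = joint.sum(axis=1, keepdims=True)
          py = joint.sum(axis=0, keepdims=True)
          nz = joint > 0
          return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

      # Two equiprobable signals, each deterministically decoded as the other one.
      joint_swapped = np.array([[0.0, 0.5],
                                [0.5, 0.0]])
      print(mutual_information_bits(joint_swapped))   # 1.0 bit transmitted
      print(np.trace(joint_swapped))                  # 0.0 -> no signal decoded onto its intended referent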

  9. Facilitating critical discourse through "meaningful disagreement" online.

    PubMed

    Dalley-Hewer, Jayne; Clouder, Deanne Lynn; Jackson, Ann; Goodman, Simon; Bluteau, Patricia; Davies, Bernadette

    2012-11-01

    This paper is concerned with identifying ways of facilitating "meaningful disagreement" amongst students in interprofessional online discussion forums. It builds on previous research that identified a trend toward polite agreement and only limited evidence of disagreement in this setting. Given the suggestion that disagreement indicates a deeper level of engagement in group discussion and therefore leads to deeper learning, our aim was to critique the pedagogical approach adopted by analyzing whether we were promoting a particular interprofessional discourse amongst students that favored agreement and therefore limited potential learning. Agreement in this context has been conceptualized as a form of online interprofessional "netiquette" existing amongst participants. Findings suggest that creating an online context for critical discourse is challenging; however, the careful construction of learning outcomes, trigger material/resources and learning activities, as well as attention to students' stage of study and life experience, can provoke the desired effects. PMID:22897367

  12. Simplified method for wetland habitat assessment

    NASA Astrophysics Data System (ADS)

    Cable, Ted T.; Brack, Virgil; Holmes, Virgil R.

    1989-03-01

    This article presents a wetland habitat assessment technique (HAT) using birds as indicators of habitat quality. The technique is quick, simple, inexpensive, and lends itself to screening large numbers of wetlands. HAT can provide input to more extensive evaluation techniques. Measures of species diversity and rarity are used to assess the quality of the wetland. By applying the notion of ecologically optimum size, the technique addresses the issue of economic efficiency. Results of field testing HAT on 11 tidally influenced wetlands are presented to illustrate HAT's utility. Application of HAT in a variety of situations is discussed.

  13. Assessment of seismic margin calculation methods

    SciTech Connect

    Kennedy, R.P.; Murray, R.C.; Ravindra, M.K.; Reed, J.W.; Stevenson, J.D.

    1989-03-01

    Seismic margin review of nuclear power plants requires that the High Confidence of Low Probability of Failure (HCLPF) capacity be calculated for certain components. The candidate methods for calculating the HCLPF capacity as recommended by the Expert Panel on Quantification of Seismic Margins are the Conservative Deterministic Failure Margin (CDFM) method and the Fragility Analysis (FA) method. The present study evaluated these two methods using some representative components in order to provide further guidance in conducting seismic margin reviews. It is concluded that either of the two methods could be used for calculating HCLPF capacities. 21 refs., 9 figs., 6 tabs.
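
    For orientation, the HCLPF capacity under the fragility-analysis approach is commonly computed from a lognormal fragility model as HCLPF = Am x exp[-1.65(beta_R + beta_U)]; the sketch below applies that published convention to assumed numbers and is not a worked case from the report.

      import math

      def hclpf_capacity(median_capacity_g, beta_r, beta_u):
          """HCLPF: 95% confidence of less than 5% probability of failure (lognormal fragility model)."""
          return median_capacity_g * math.exp(-1.65 * (beta_r + beta_u))

      # Assumed fragility parameters for a representative component.
      print(f"{hclpf_capacity(median_capacity_g=1.2, beta_r=0.25, beta_u=0.35):.2f} g")  # ~0.45 g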

  14. Spiritual Assessment in Counseling: Methods and Practice

    ERIC Educational Resources Information Center

    Oakes, K. Elizabeth; Raphel, Mary M.

    2008-01-01

    Given the widely expanding professional and empirical support for integrating spirituality into counseling, the authors present a practical discussion for raising counselors' general awareness and skill in the critical area of spiritual assessment. A discussion of rationale, measurement, and clinical practice is provided along with case examples.…

  15. REVIEW OF RAPID METHODS FOR ASSESSING WETLAND CONDITION

    EPA Science Inventory

    We evaluated over 40 wetland rapid assessment methods developed for a variety of purposes for their use in the assessment of ecological integrity or ecosystem condition. Four criteria were used to screen methods: 1) the method can be used to measure condition, 2) it is truly rap...

  16. Critical assessment of accelerating trypsination methods.

    PubMed

    Hustoft, Hanne Kolsrud; Reubsaet, Leon; Greibrokk, Tyge; Lundanes, Elsa; Malerod, Helle

    2011-12-15

    In LC-MS based proteomics, several accelerating trypsination methods have been introduced in order to speed up the protein digestion, which is often considered a bottleneck. Traditionally and most commonly, due to sample heterogeneity, overnight digestion at 37 °C is performed in order to digest both easily and more resistant proteins. High-efficiency protein identification is important in proteomics; hours of LC-MS/MS analysis are wasted if the majority of the proteins are not digested. Based on preliminary experiments utilizing some of the suggested accelerating methods, we asked whether accelerating digestion methods really provide the same protein identification efficiency as overnight digestion. In the present study we have evaluated four different accelerating trypsination methods (infrared (IR) and microwave assisted, solvent aided and immobilized trypsination). The methods were compared with conventional digestion at 37 °C in the same time range using a four protein mixture. Sequence coverage and peak area of intact proteins were used for the comparison. The accelerating methods were able to digest the proteins, but none of the methods appeared to be more efficient than the conventional digestion method at 37 °C. The conventional method at 37 °C is easy to perform using commercially available instrumentation and appears to be the digestion method to use. The digestion time in targeted proteomics can be optimized for each protein, while in comprehensive proteomics the digestion time should be extended due to sample heterogeneity and influence of other proteins present. Recommendations regarding optimizing and evaluating the tryptic digestion for both targeted and comprehensive proteomics are given, and a digestion method suitable as the first method for newcomers in comprehensive proteomics is suggested.

  17. Assessment of plaque assay methods for alphaviruses.

    PubMed

    Juarez, Diana; Long, Kanya C; Aguilar, Patricia; Kochel, Tadeusz J; Halsey, Eric S

    2013-01-01

    Viruses from the Alphavirus genus are responsible for numerous arboviral diseases impacting human health throughout the world. Confirmation of acute alphavirus infection is based on viral isolation, identification of viral RNA, or a fourfold or greater increase in antibody titers between acute and convalescent samples. In convalescence, the specificity of antibodies to an alphavirus may be confirmed by plaque reduction neutralization test. To identify the best method for alphavirus and neutralizing antibody recognition, the standard solid method using a cell monolayer overlay with 0.4% agarose and the semisolid method using a cell suspension overlay with 0.6% carboxymethyl cellulose (CMC) overlay were evaluated. Mayaro virus, Una virus, Venezuelan equine encephalitis virus (VEEV), and Western equine encephalitis virus (WEEV) were selected to be tested by both methods. The results indicate that the solid method showed consistently greater sensitivity than the semisolid method. Also, a "semisolid-variant method" using a 0.6% CMC overlay on a cell monolayer was assayed for virus titration. This method provided the same sensitivity as the solid method for VEEV and also had greater sensitivity for WEEV titration. Modifications in plaque assay conditions significantly affect results, and therefore evaluation of the performance of each new assay is needed.
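
    For context, plaque assays of either type yield a titer through the same arithmetic; the numbers below are made up purely to show the calculation and are not data from the study.

      def titer_pfu_per_ml(plaque_count, dilution, volume_ml):
          """Titer in PFU/mL; dilution is the fraction plated, e.g. 1e-6 for a 10^-6 dilution."""
          return plaque_count / (dilution * volume_ml)

      # 42 plaques counted after plating 0.1 mL of a 10^-6 dilution.
      print(f"{titer_pfu_per_ml(42, 1e-6, 0.1):.1e} PFU/mL")   # 4.2e+08 PFU/mL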

  18. Mixing Methods in Assessing Coaches' Decision Making

    ERIC Educational Resources Information Center

    Vergeer, Ineke; Lyle, John

    2007-01-01

    Mixing methods has recently achieved respectability as an appropriate approach to research design, offering a variety of advantages (Tashakkori & Teddlie, 2003). The purpose of this paper is to outline and evaluate a mixed methods approach within the domain of coaches' decision making. Illustrated with data from a policy-capturing study on…

  19. Cyber Assessment Methods For SCADA Security

    SciTech Connect

    May Robin Permann; Kenneth Rohde

    2005-06-01

    The terrorist attacks of September 11, 2001 brought to light threats and vulnerabilities that face the United States. In response, the U.S. Government is directing the effort to secure the nation's critical infrastructure by creating programs to implement the National Strategy to Secure Cyberspace (1). One part of this effort involves assessing Supervisory Control and Data Acquisition (SCADA) systems. These systems are essential to the control of critical elements of our national infrastructure, such as electric power, oil, and gas production and distribution. Since their incapacitation or destruction would have a debilitating impact on the defense or economic security of the United States, one of the main objectives of this program is to identify vulnerabilities and encourage the public and private sectors to work together to design secure control systems that resolve these weaknesses. This paper describes vulnerability assessment methodologies used in ongoing research and assessment activities designed to identify and resolve vulnerabilities so as to improve the security of the nation's critical infrastructure.

  1. Making biodiversity meaningful through environmental education

    NASA Astrophysics Data System (ADS)

    van Weelie, Daan

    2002-11-01

    Biodiversity is an emerging theme in science, society and, more recently, education. There is no one single definition of biodiversity that is adequate in all situations. Both the knowledge base and the value base of biodiversity are variable and questionable. Because of these characteristics, biodiversity makes for an interesting vehicle for linking science and society, and the investigation of the normative underpinnings of 'science-in-the-making'. Based on a 3-year study, this paper explores the crossroads between science education and environmental education and presents a framework for tapping the environmental education potential of biodiversity. Outlined are a number of stepping stones for making biodiversity meaningful to learners. It is argued that, from the perspective of environmental education, the ill-defined nature of biodiversity is a useful feature. Biodiversity is renewing the discourse on nature conservation issues by bringing together different groups in society that are searching for a common language to discuss nature conservation issues in relation to sustainability issues. The resulting debate allows the socio-scientific dispute character of 'science-in-the-making' to surface. Participation in such a dispute is an excellent opportunity to learn about a highly relevant, controversial, emotionally charged and debatable topic at the crossroads of science, technology and society.

  2. Information and perception of meaningful patterns.

    PubMed

    Del Viva, Maria M; Punzi, Giovanni; Benedetti, Daniele

    2013-01-01

    The visual system needs to extract the most important elements of the external world from a large flux of information in a short time for survival purposes. It is widely believed that in performing this task, it operates a strong data reduction at an early stage, by creating a compact summary of relevant information that can be handled by further levels of processing. In this work we formulate a model of early vision based on a pattern-filtering architecture, partly inspired by high-speed digital data reduction in experimental high-energy physics (HEP). This allows a much stronger data reduction than models based just on redundancy reduction. We show that optimizing this model for best information preservation under tight constraints on computational resources yields surprisingly specific a-priori predictions for the shape of biologically plausible features, and for experimental observations on fast extraction of salient visual features by human observers. Interestingly, applying the same optimized model to HEP data acquisition systems based on pattern-filtering architectures leads to specific a-priori predictions for the relevant data patterns that these devices extract from their inputs. These results suggest that the limitedness of computing resources can play an important role in shaping the nature of perception, by determining what is perceived as "meaningful features" in the input data.

  3. Postponement of death until symbolically meaningful occasions.

    PubMed

    Phillips, D P; Smith, D G

    1990-04-11

    This study shows that mortality dips before a symbolically meaningful occasion and peaks just afterward. Mortality among Chinese (n = 1288) dips by 35.1% in the week before the Harvest Moon Festival and peaks by the same amount (34.6%) in the week after. We chose to study mortality among Chinese and a Chinese holiday for two reasons. First, the holiday moves around the calendar, thus allowing separation of the effects of the holiday from fixed, monthly mortality effects. Second, the holiday appeals strongly to one (experimental) group and not to others (which can be used as control groups). In terms of percentage, cerebrovascular diseases displayed the largest dip/peak pattern, followed by diseases of the heart, and then malignant neoplasms. The dip/peak mortality pattern does not appear in various non-Chinese control groups. The statistical significance of the findings was demonstrated with linear and curvilinear regression analysis and with two nonparametric tests. After testing alternative explanations for the findings, we concluded that the dip/peak pattern occurs because death can be briefly postponed until after the occurrence of a significant occasion. PMID:2313872
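
    One simple way to check such a dip/peak split for significance (an illustrative sketch with invented counts, not the regression and nonparametric analyses actually used in the study) is a binomial test of whether deaths are equally likely to fall in the week before versus the week after the occasion.

      from scipy.stats import binomtest

      deaths_week_before = 33     # hypothetical counts
      deaths_week_after = 70
      n = deaths_week_before + deaths_week_after
      result = binomtest(deaths_week_before, n, p=0.5)
      print(result.pvalue)        # a small p-value argues against a chance dip/peak pattern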

  4. Assessing Social Isolation: Pilot Testing Different Methods.

    PubMed

    Taylor, Harry Owen; Herbers, Stephanie; Talisman, Samuel; Morrow-Howell, Nancy

    2016-04-01

    Social isolation is a significant public health problem among many older adults; however, most of the empirical knowledge about isolation derives from community-based samples. There has been less attention given to isolation in senior housing communities. The objectives of this pilot study were to test two methods to identify socially isolated residents in low-income senior housing and compare findings about the extent of isolation from these two methods. The first method, self-report by residents, included 47 out of 135 residents who completed in-person interviews. To determine self-report isolation, residents completed the Lubben Social Network Scale 6 (LSNS-6). The second method involved a staff member who reported the extent of isolation on all 135 residents via an online survey. Results indicated that 26% of residents who were interviewed were deemed socially isolated by the LSNS-6. Staff members rated 12% of residents as having some or a lot of isolation. In comparing the two methods, staff members rated 2% of interviewed residents as having a lot of isolation. The combination of self-report and staff report could be more informative than just self-report alone, particularly when participation rates are low. However, researchers should be aware of the potential discrepancy between these two methods. PMID:27276687
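
    For reference, LSNS-6 scoring as usually described in the literature sums six items rated 0-5 into a 0-30 total, with totals below 12 commonly treated as indicating social isolation; the cut-point and example scores below are assumptions for illustration, not values quoted from this study.

      def lsns6_is_isolated(item_scores, cutoff=12):
          """Six LSNS-6 items, each scored 0-5; totals below the cutoff flag social isolation."""
          assert len(item_scores) == 6 and all(0 <= s <= 5 for s in item_scores)
          return sum(item_scores) < cutoff

      print(lsns6_is_isolated([2, 1, 3, 0, 2, 1]))   # total 9 -> True (isolated)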

  6. Assessment of User Home Location Geoinference Methods

    SciTech Connect

    Harrison, Joshua J.; Bell, Eric B.; Corley, Courtney D.; Dowling, Chase P.; Cowell, Andrew J.

    2015-05-29

    This study presents an assessment of multiple approaches to determine the home and/or other important locations to a Twitter user. In this study, we present a unique approach to the problem of geotagged data sparsity in social media when performing geoinferencing tasks. Given the sparsity of explicitly geotagged Twitter data, the ability to perform accurate and reliable user geolocation from a limited number of geotagged posts has proven to be quite useful. In our survey, we have achieved accuracy rates of over 86% in matching Twitter user profile locations with their inferred home locations derived from geotagged posts.
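
    A naive baseline for this kind of geoinference (a generic heuristic shown for illustration, not the approach assessed in the report) is to take the most frequent coarse grid cell among a user's geotagged posts as the inferred home location.

      from collections import Counter

      def infer_home(geotags, precision=2):
          """geotags: iterable of (lat, lon); precision=2 bins points into roughly 1 km cells."""
          cells = Counter((round(lat, precision), round(lon, precision)) for lat, lon in geotags)
          return cells.most_common(1)[0][0]

      posts = [(47.606, -122.332), (47.608, -122.334), (40.713, -74.006), (47.607, -122.331)]
      print(infer_home(posts))    # (47.61, -122.33)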

  7. Influence of Assessment Center Methods on Assessor Ratings.

    ERIC Educational Resources Information Center

    Silverman, William H.; And Others

    One potential source of variation in assessment center ratings may be the way the assessor organizes and processes information. To examine how assessment center methods affect the way assessors organize and process assessment center information and the ratings they make, independent groups of assessors (N=24) used one of two models for integrating…

  8. Evaluation of Dynamic Methods for Earthwork Assessment

    NASA Astrophysics Data System (ADS)

    Vlček, Jozef; Ďureková, Dominika; Zgútová, Katarína

    2015-05-01

    The rapid development of road construction creates demand for fast, high-quality methods of earthwork evaluation. Dynamic methods are now adopted in numerous civil engineering sectors; in particular, evaluation of earthwork quality can be sped up using dynamic equipment. This paper presents the results of parallel measurements with selected devices for determining the level of compaction of soils. The measurements were used to develop correlations between the values obtained from the various apparatuses. The correlations show that the examined apparatuses are suitable for assessing the compaction level of fine-grained soils, with consideration of the boundary conditions of the equipment used. The presented methods are quick, results can be obtained immediately after measurement, and they are thus suitable when construction works have to be performed in a short period of time.

  9. A review of regional mineral resource assessment methods.

    USGS Publications Warehouse

    Singer, D.A.; Mosier, D.L.

    1981-01-01

    Over 100 papers on regional mineral resource assessment of nonfuels are classified according to method(s) and form(s) of product in order to help identify possible methods for future assessments. Methods considered are: time-rate, crustal abundance, cumulative tonnage versus grade, simple subjective, complex subjective, Bayesian, frequency, trend, geometric probability, multiple regression, discriminant analysis, modified component, multivariate logistic, cluster analysis or pattern recognition, and simulation. Selection of an assessment method should be based on: 1) appropriateness of the product to the problem; 2) limitations in resources, such as information or time available for the assessment; 3) the level of uncertainty and acceptability of biases in the assessment; and 4) the need for verification of results and acceptance of the method. -Authors

  10. Surface water quality assessment by environmetric methods.

    PubMed

    Boyacioglu, Hülya; Boyacioglu, Hayal

    2007-08-01

    This environmetric study deals with the interpretation of river water monitoring data from the basin of the Buyuk Menderes River and its tributaries in Turkey. Eleven variables were measured to estimate water quality at 17 sampling sites. Factor analysis was applied to explain the correlations between the observations in terms of underlying factors. Results revealed that water quality was strongly affected by agricultural use. Cluster analysis was used to classify stations with similar properties, and the results distinguished three groups of stations. Water quality downstream of the river was quite different from that in the rest of the basin. Environmetric data treatment is recommended as a substantial procedure in the assessment of water quality data.
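
    A generic sketch of the environmetric workflow described above (factor analysis to extract latent pollution factors, then hierarchical clustering of stations), run on synthetic data rather than the Buyuk Menderes measurements; the 3-factor and 3-cluster choices are assumptions.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis
      from sklearn.preprocessing import StandardScaler
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(1)
      X = rng.normal(size=(17, 11))                 # 17 stations x 11 water-quality variables
      Xs = StandardScaler().fit_transform(X)

      fa = FactorAnalysis(n_components=3, random_state=0).fit(Xs)
      loadings = fa.components_                     # how each variable loads on each latent factor

      labels = fcluster(linkage(Xs, method="ward"), t=3, criterion="maxclust")
      print(loadings.shape)                         # (3, 11)
      print(labels)                                 # one cluster label per station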

  11. [Radiographic assessment of pulmonary hypertension: Methodical aspects].

    PubMed

    Korobkova, I Z; Lazutkina, V K; Nizovtsova, L A; Riden, T V

    2015-01-01

    Pulmonary hypertension is a menacing complication of a number of diseases and is responsible for high mortality rates and a considerably poorer quality of life. Timely detection of pulmonary hypertension allows early initiation of treatment and thus an improved prognosis. Chest X-ray is the radiographic technique most commonly performed for a variety of indications. Physicians' awareness of the radiographic manifestations of pulmonary hypertension may contribute to earlier detection of this severe disease. Owing to the natural contrast of the imaged structures, a chest X-ray film gives a unique opportunity to assess the pulmonary circulation vessels, to reveal the signs of pulmonary hypertension, and to estimate trends in the course of the disease. The paper details a procedure for analysis and the normal radiographic anatomy of the pulmonary circulation vessels, gives the present classification of pulmonary hypertension, and sets forth its X-ray semiotics. PMID:26552229

  13. Assessing Institutional Effectiveness: Issues, Methods, and Management.

    ERIC Educational Resources Information Center

    Fincher, Cameron, Ed.

    This collection of 12 papers was presented at a 1987 conference at which speakers presented personal perspectives on institutional effectiveness. Papers are organized under three major headings: "Managing Quality: Methods and Outcomes," "Institutional Response," and "Special Issues." Titles include: (1) "Managing the Meaning of Institutional…

  14. Meaningful metrics for observatory publication statistics

    NASA Astrophysics Data System (ADS)

    Rots, Arnold H.; Winkelman, Sherry L.; Becker, Glenn E.

    2012-09-01

    Observatories have wrestled for decades with the questions of how to measure their importance to the astronomical community, what their scientific impact is, and how their performance in that respect compares to that of other observatories. There is a general sense that the answer is to be found in the publication record - specifically, in the refereed journal articles. However, simple parameters (such as the number of papers) are not helpful, because in isolation (applied to a single observatory) they are meaningless, while in comparison between observatories they are subject to external influences that all but invalidate the comparisons. We were fortunate in having the Chandra X-ray Observatory's bibliographic database with its rich variety of metadata available as a resource for experimenting with more sophisticated metrics. Out of this project we propose a modest set that contains meaningful information when viewed in the isolation of a single observatory as well as in comparison with other observatories. Even so, we urge users not to draw conclusions on the basis of the face value of the comparisons, but only after a serious analysis of potential causes for any differences or similarities. We have designed our metrics to provide useful information in three main areas of interest: speed of publication; fraction of observing time published; and archival usage. The basic measured parameters are the percentage of available observing time published as a function of the data's age, at a few specific age values; the median time it takes to publish observations; and similar parameters for multiple publications of the same observations. Citation of results is a fourth category, but it does not lend itself well to comparisons and defies the search for definite statements.
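
    A minimal sketch of two of these metrics (median time to publication, and fraction of observing time published within a given data age), computed over a hypothetical table of observations; the column layout and numbers are invented for illustration.

      from statistics import median

      # (exposure in kiloseconds, months from observation to first refereed paper, or None if unpublished)
      observations = [(20, 8), (35, 14), (10, None), (50, 30), (15, 22)]

      published = [(ks, m) for ks, m in observations if m is not None]
      median_months_to_publish = median(m for _, m in published)

      age_limit_months = 24
      total_time = sum(ks for ks, _ in observations)
      published_time = sum(ks for ks, m in published if m <= age_limit_months)
      fraction_published = 100.0 * published_time / total_time

      print(median_months_to_publish,
            f"{fraction_published:.0f}% of observing time published within {age_limit_months} months")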

  15. A Novel Method for Learner Assessment Based on Learner Annotations

    ERIC Educational Resources Information Center

    Noorbehbahani, Fakhroddin; Samani, Elaheh Biglar Beigi; Jazi, Hossein Hadian

    2013-01-01

    Assessment is one of the most essential parts of any instructive learning process which aims to evaluate a learner's knowledge about learning concepts. In this work, a new method for learner assessment based on learner annotations is presented. The proposed method exploits the M-BLEU algorithm to find the most similar reference annotations…
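
    The abstract names the M-BLEU algorithm; the sketch below only illustrates the underlying BLEU-style n-gram similarity between a learner annotation and reference annotations, using NLTK. The example annotations, tokenization, and weights are assumptions, and the M-BLEU modifications themselves are not reproduced here.

        from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

        # Hypothetical reference annotations (e.g., instructor-provided) and a learner annotation.
        references = [
            "photosynthesis converts light energy into chemical energy".split(),
            "light energy is converted to chemical energy in photosynthesis".split(),
        ]
        learner = "photosynthesis turns light energy into chemical energy".split()

        # BLEU score of the learner annotation against the reference annotations;
        # smoothing avoids zero scores for short texts that miss higher-order n-grams.
        score = sentence_bleu(references, learner, weights=(0.5, 0.5),
                              smoothing_function=SmoothingFunction().method1)
        print(f"BLEU similarity: {score:.3f}")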

  16. Aerodynamic drag in cycling: methods of assessment.

    PubMed

    Debraux, Pierre; Grappe, Frederic; Manolova, Aneliya V; Bertucci, William

    2011-09-01

    When cycling on level ground at a speed greater than 14 m/s, aerodynamic drag is the most important resistive force. About 90% of the total mechanical power output is necessary to overcome it. Aerodynamic drag is mainly affected by the effective frontal area which is the product of the projected frontal area and the coefficient of drag. The effective frontal area represents the position of the cyclist on the bicycle and the aerodynamics of the cyclist-bicycle system in this position. In order to optimise performance, estimation of these parameters is necessary. The aim of this study is to describe and comment on the methods used during the last 30 years for the evaluation of the effective frontal area and the projected frontal area in cycling, in both laboratory and actual conditions. Most of the field methods are not expensive and can be realised with few materials, providing valid results in comparison with the reference method in aerodynamics, the wind tunnel. Finally, knowledge of these parameters can be useful in practice or to create theoretical models of cycling performance.
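
    The relation underlying these methods is the standard drag equation: the drag force is F = 0.5 ρ (A·Cd) v², where A·Cd is the effective frontal area, and the mechanical power needed to overcome it in still air is P = F·v. A short worked sketch with illustrative (not source-derived) values:

        def aero_drag_force(v, ACd, rho=1.225):
            """Aerodynamic drag force F = 0.5 * rho * ACd * v**2, in newtons."""
            return 0.5 * rho * ACd * v ** 2

        def aero_power(v, ACd, rho=1.225):
            """Mechanical power needed to overcome drag at ground speed v (no wind): P = F * v."""
            return aero_drag_force(v, ACd, rho) * v

        # Illustrative values: projected frontal area 0.35 m^2, drag coefficient 0.88 in the drops.
        A, Cd = 0.35, 0.88
        ACd = A * Cd                      # effective frontal area (m^2)
        for v in (10.0, 14.0, 18.0):      # speeds in m/s
            print(f"v = {v:4.1f} m/s  ->  drag power = {aero_power(v, ACd):6.1f} W")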

  17. Assessment of dental plaque by optoelectronic methods

    NASA Astrophysics Data System (ADS)

    Negrutiu, Meda-Lavinia; Sinescu, Cosmin; Bortun, Cristina Maria; Levai, Mihaela-Codrina; Topala, Florin Ionel; Crǎciunescu, Emanuela Lidia; Cojocariu, Andreea Codruta; Duma, Virgil Florin; Podoleanu, Adrian Gh.

    2016-03-01

    The formation of dental biofilm follows specific mechanisms of initial colonization on the surface, microcolony formation, development of organized three dimensional community structures, and detachment from the surface. The structure of the plaque biofilm might restrict the penetration of antimicrobial agents, while bacteria on a surface grow slowly and display a novel phenotype; the consequence of the latter is a reduced sensitivity to inhibitors. The aim of this study was to evaluate with different optoelectronic methods the morphological characteristics of the dental biofilm. The study was performed on samples from 25 patients aged between 18 and 35 years. The methods used in this study were Spectral Domain Optical Coherence Tomography (SD-OCT) working at 870 nm for in vivo evaluations and Scanning Electron Microscopy (SEM) for validation. For each patient a sample of dental biofilm was obtained directly from the vestibular surface of the teeth. SD-OCT produced C- and B-scans that were used to generate three dimensional (3D) reconstructions of the sample. The results were compared with the SEM evaluations. The biofilm network was dramatically destroyed after professional dental cleaning. Noninvasive OCT methods can thus act as a valuable tool for the 3D characterization of dental biofilms.

  18. [Thiel's method of embalming and its usefulness in surgical assessments].

    PubMed

    Okada, Ryuhei; Tsunoda, Atsunobu; Momiyama, Naoko; Kishine, Naomi; Kitamura, Ken; Kishimoto, Seiji; Akita, Keiichi

    2012-08-01

    When we assess anatomical problems and the safety and effectiveness of performing a difficult surgical procedure, or plan novel surgical approaches, preoperative human dissections are very helpful. However, embalming with the conventional formaldehyde method makes the soft tissue of the cadaver harder than that of a living body. Therefore, a cadaver embalmed with conventional formaldehyde is not appropriate for dissections performed to assess surgical approaches. Thiel's method is a novel embalming technique, first reported by W. Thiel in 1992. This method preserves the color and softness of the cadaver without risk of infection. We have used cadavers embalmed with Thiel's method for preoperative assessments and have confirmed the usefulness of this method, especially for preventing complications or assessing surgical approaches. A cadaver embalmed with this method has several advantages over those embalmed with other methods, and it might also be useful for the development of new surgical devices or the evaluation of a surgeon's skill.

  19. Assessment methods in surgical training in the United Kingdom

    PubMed Central

    Evgeniou, Evgenios; Peter, Loizou; Tsironi, Maria; Iyer, Srinivasan

    2013-01-01

    A career in surgery in the United Kingdom demands a commitment to a long journey of assessment. The assessment methods used must ensure that the appropriate candidates are selected into a programme of study or a job and must guarantee public safety by regulating the progression of surgical trainees and the certification of trained surgeons. This review attempts to analyse the psychometric properties of various assessment methods used in the selection of candidates to medical school, job selection, progression in training, and certification. Validity is an indicator of how well an assessment measures what it is designed to measure. Reliability informs us whether a test is consistent in its outcome by measuring the reproducibility and discriminating ability of the test. In the long journey of assessment in surgical training, the same assessment formats are frequently being used for selection into a programme of study, job selection, progression, and certification. Although similar assessment methods are being used for different purposes in surgical training, the psychometric properties of these assessment methods have not been examined separately for each purpose. Because of the significance of these assessments for trainees and patients, their reliability and validity should be examined thoroughly in every context where the assessment method is being used. PMID:23441076

  20. Methods for Assessing Mitochondrial Function in Diabetes

    PubMed Central

    Kane, Daniel A.; Lanza, Ian R.; Neufer, P. Darrell

    2013-01-01

    A growing body of research is investigating the potential contribution of mitochondrial function to the etiology of type 2 diabetes. Numerous in vitro, in situ, and in vivo methodologies are available to examine various aspects of mitochondrial function, each requiring an understanding of their principles, advantages, and limitations. This review provides investigators with a critical overview of the strengths, limitations and critical experimental parameters to consider when selecting and conducting studies on mitochondrial function. In vitro (isolated mitochondria) and in situ (permeabilized cells/tissue) approaches provide direct access to the mitochondria, allowing for study of mitochondrial bioenergetics and redox function under defined substrate conditions. Several experimental parameters must be tightly controlled, including assay media, temperature, oxygen concentration, and in the case of permeabilized skeletal muscle, the contractile state of the fibers. Recently developed technology now offers the opportunity to measure oxygen consumption in intact cultured cells. Magnetic resonance spectroscopy provides the most direct way of assessing mitochondrial function in vivo with interpretations based on specific modeling approaches. The continuing rapid evolution of these technologies offers new and exciting opportunities for deciphering the potential role of mitochondrial function in the etiology and treatment of diabetes. PMID:23520284

  1. Regional method to assess offshore slope stability.

    USGS Publications Warehouse

    Lee, H.J.; Edwards, B.D.

    1986-01-01

    The slope stability of some offshore environments can be evaluated by using only conventional acoustic profiling and short-core sampling, followed by laboratory consolidation and strength testing. The test results are synthesized by using normalized-parameter techniques. The normalized data are then used to calculate the critical earthquake acceleration factors or the wave heights needed to initiate failure. These process-related parameters provide a quantitative measure of the relative stability for locations from which short cores were obtained. The method is most applicable to offshore environments of gentle relief and simple subsurface structure and is not considered a substitute for subsequent site-specific analysis. -from ASCE Publications Information

  2. MIMIC Methods for Assessing Differential Item Functioning in Polytomous Items

    ERIC Educational Resources Information Center

    Wang, Wen-Chung; Shih, Ching-Lin

    2010-01-01

    Three multiple indicators-multiple causes (MIMIC) methods, namely, the standard MIMIC method (M-ST), the MIMIC method with scale purification (M-SP), and the MIMIC method with a pure anchor (M-PA), were developed to assess differential item functioning (DIF) in polytomous items. In a series of simulations, it appeared that all three methods…

  3. A new assessment method for image fusion quality

    NASA Astrophysics Data System (ADS)

    Li, Liu; Jiang, Wanying; Li, Jing; Yuchi, Ming; Ding, Mingyue; Zhang, Xuming

    2013-03-01

    Image fusion quality assessment plays a critically important role in the field of medical imaging. Many assessment methods have been proposed to evaluate image fusion quality, including mutual information (MI), root mean square error (RMSE), and the universal image quality index (UIQI), but these methods do not reflect human visual inspection effectively. To address this problem, this paper proposes a novel image fusion assessment method that combines the nonsubsampled contourlet transform (NSCT) with regional mutual information. In the proposed method, the source medical images are first decomposed into different levels by the NSCT. The maximum NSCT coefficients of the decomposed directional images at each level are then used to compute the regional mutual information (RMI). Finally, a multi-channel RMI is computed as the weighted sum of the RMI values obtained at the various NSCT levels. The advantage of the proposed method is that the NSCT represents image information over multiple directions and scales and therefore conforms to the multi-channel characteristic of the human visual system, which accounts for its strong assessment performance. Experimental results using CT and MRI images demonstrate that the proposed method outperforms MI- and UIQI-based measures in evaluating image fusion quality and provides results consistent with human visual assessment.
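
    A rough sketch of the "weighted mutual information across scales" idea is given below. Because the NSCT is not available in common Python libraries, the multi-scale decomposition is replaced here by simple block averaging, so this is only a structural illustration rather than the authors' method; the images, levels, and weights are synthetic assumptions.

        import numpy as np

        def mutual_information(a, b, bins=32):
            """Mutual information between two equally sized images, from a joint histogram."""
            joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
            pxy = joint / joint.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

        def downsample(img, factor):
            """Crude multi-scale stand-in: block-average the image by `factor` (not the NSCT)."""
            h, w = (img.shape[0] // factor) * factor, (img.shape[1] // factor) * factor
            return img[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

        def multiscale_mi(fused, source, factors=(1, 2, 4), weights=(0.5, 0.3, 0.2)):
            """Weighted sum of mutual information between fused and source images across scales."""
            return sum(w * mutual_information(downsample(fused, f), downsample(source, f))
                       for f, w in zip(factors, weights))

        rng = np.random.default_rng(0)
        ct = rng.random((128, 128))
        mri = rng.random((128, 128))
        fused = 0.5 * (ct + mri)          # toy "fusion" of two toy source images
        print("score vs CT :", round(multiscale_mi(fused, ct), 3))
        print("score vs MRI:", round(multiscale_mi(fused, mri), 3))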

  4. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1313 and Method 1316

    EPA Science Inventory

    This document summarizes the results of an interlaboratory study conducted to generate precision estimates for two parallel batch leaching methods which are part of the Leaching Environmental Assessment Framework (LEAF). These methods are: (1) Method 1313: Liquid-Solid Partition...

  5. Stimulus set meaningfulness and neurophysiological differentiation: a functional magnetic resonance imaging study.

    PubMed

    Boly, Melanie; Sasai, Shuntaro; Gosseries, Olivia; Oizumi, Masafumi; Casali, Adenauer; Massimini, Marcello; Tononi, Giulio

    2015-01-01

    A meaningful set of stimuli, such as a sequence of frames from a movie, triggers a set of different experiences. By contrast, a meaningless set of stimuli, such as a sequence of 'TV noise' frames, always triggers the same experience--of seeing 'TV noise'--even though the stimuli themselves are as different from each other as the movie frames. We reasoned that the differentiation of cortical responses underlying the subject's experiences, as measured by Lempel-Ziv complexity (incompressibility) of functional MRI images, should reflect the overall meaningfulness of a set of stimuli for the subject, rather than differences among the stimuli. We tested this hypothesis by quantifying the differentiation of brain activity patterns in response to a movie sequence, to the same movie scrambled in time, and to 'TV noise', where the pixels from each movie frame were scrambled in space. While overall cortical activation was strong and widespread in all conditions, the differentiation (Lempel-Ziv complexity) of brain activation patterns was correlated with the meaningfulness of the stimulus set, being highest in the movie condition, intermediate in the scrambled movie condition, and minimal for 'TV noise'. Stimulus set meaningfulness was also associated with higher information integration among cortical regions. These results suggest that the differentiation of neural responses can be used to assess the meaningfulness of a given set of stimuli for a given subject, without the need to identify the features and categories that are relevant to the subject, nor the precise location of selective neural responses.
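
    A minimal sketch of a Lempel-Ziv (1976) complexity measure is shown below on toy one-dimensional binary sequences; the study applied such a measure to binarized spatial fMRI activation patterns, so the data and normalization here are illustrative only.

        import numpy as np

        def lz_complexity(binary_seq):
            """Number of distinct phrases in the Lempel-Ziv (1976) parsing of a binary sequence."""
            s = "".join("1" if b else "0" for b in binary_seq)
            i, c, n = 0, 0, len(s)
            while i < n:
                k = 1
                # Grow the candidate phrase until it is no longer found in the preceding history.
                while i + k <= n and s[i:i + k] in s[:i + k - 1]:
                    k += 1
                c += 1
                i += k
            return c

        def normalized_lz(binary_seq):
            """LZ complexity normalized by n / log2(n), so random binary sequences approach 1."""
            n = len(binary_seq)
            return lz_complexity(binary_seq) * np.log2(n) / n

        rng = np.random.default_rng(1)
        structured = np.sin(np.linspace(0, 8 * np.pi, 4096)) > 0   # repetitive, highly compressible pattern
        noise = rng.random(4096) > 0.5                             # unstructured, incompressible pattern
        print("structured:", round(normalized_lz(structured), 3))
        print("noise     :", round(normalized_lz(noise), 3))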

  6. Life is pretty meaningful and/or purposeful?: On conflations, contexts, and consequences.

    PubMed

    Hill, Patrick L; Burrow, Anthony L; Sumner, Rachel; Young, Robin K

    2015-09-01

    Comments on the original article "Life is pretty meaningful," by S. J. Heintzelman and L. A. King (see record 2014-03265-001). Heintzelman and King condense descriptive data from numerous studies to conclude that individuals tend to see life as meaningful, because average scores on the meaning and purpose in life assessments fall above the midpoint. However, in so doing, they make two contentious assumptions. The first is the expectation that scale midpoints actually reflect an average score on that construct. However, one should not interpret this metric to suggest that people generally live meaningful lives without great caution and consideration of the second assumption: the conflation of purpose and meaning in life. In response, the current authors address this second assumption and the need to develop better questions and measures for both meaning and purpose. PMID:26348347

  7. Influence of expertise on rockfall hazard assessment using empirical methods

    NASA Astrophysics Data System (ADS)

    Delonca, Adeline; Verdel, Thierry; Gunzburger, Yann

    2016-07-01

    To date, many rockfall hazard assessment methods still consider qualitative observations within their analysis. Based on this observation, knowledge and expertise are assumed to be major parameters of rockfall assessment. To test this hypothesis, an experiment was carried out in order to evaluate the influence of knowledge and expertise on rockfall hazard assessment. Three populations were selected, having different levels of expertise: (1) students in geosciences, (2) researchers in geosciences and (3) confirmed experts. These three populations evaluated the rockfall hazard level on the same site, considering two different methods: the Laboratoire des Ponts et Chaussées (LPC) method and a method partly based on the "slope mass rating" (SMR) method. To complement the analysis, an "a priori" assessment of the rockfall hazard, made without using any method, was also requested of each population. The LPC method is the most widely used method in France for official hazard mapping. It combines two main indicators: the predisposition to instability and the expected magnitude. Conversely, the SMR-based method was used as an ad hoc quantitative method to investigate the effect of quantification within a method. These procedures were applied to a test site divided into three different sectors. A statistical treatment of the results (descriptive statistical analysis, chi-square independence test and ANOVA) shows that there is a significant influence of the method used on the rockfall hazard assessment, whatever the sector. However, there is no significant influence of the level of expertise of the population for sectors 2 and 3. For sector 1, there is a significant influence of the level of expertise, explained by the importance of the temporal probability assessment in the rockfall hazard assessment process. The SMR-based method seems highly sensitive to the "site activity" indicator and exhibits considerable dispersion in its results. However, the results are more similar…
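
    As an illustration of one part of the statistical treatment described (the chi-square independence test only), the sketch below uses SciPy on a purely hypothetical contingency table of hazard ratings by population; the counts are not the study's data.

        import numpy as np
        from scipy.stats import chi2_contingency

        # Hypothetical counts of hazard ratings (low / medium / high) given by each population
        # (students, researchers, experts) for one sector; values are illustrative only.
        table = np.array([
            [6, 10, 4],   # students
            [4, 12, 4],   # researchers
            [2,  8, 10],  # experts
        ])

        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
        print("expected counts under independence:\n", np.round(expected, 1))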

  8. Compounding conservatisms: EPA's health risk assessment methods

    SciTech Connect

    Stackelberg, K. von; Burmaster, D.E. )

    1993-03-01

    Superfund conjures up images of hazardous waste sites, which EPA is spending billions of dollars to remediate. One of the law's most worrisome effects is that it drains enormous economic resources without returning commensurate benefits. In a Sept. 1, 1991, front page article in The New York Times, experts argued that most health dangers at Superfund sites could be eliminated for a fraction of the billions that will be spent cleaning up the 1,200 high-priority sites across the country. Even EPA has suggested that the Superfund program may receive disproportionate resources, compared with other public health programs, such as radon in houses, the diminishing ozone layer and occupational diseases. Public opinion polls over the last decade consistently have mirrored the public's vast fear of hazardous waste sites, a fear as great as that held for nuclear power plants. Fear notwithstanding, the high cost of chosen remedies at given sites may have less to do with public health goals than with the method EPA uses to translate them into acceptable contaminant concentrations in soil, groundwater and other environmental media.

  9. 42 CFR 495.210 - Meaningful EHR user attestation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 5 2013-10-01 2013-10-01 false Meaningful EHR user attestation. 495.210 Section... INCENTIVE PROGRAM Requirements Specific to Medicare Advantage (MA) Organizations § 495.210 Meaningful EHR... EHR user. (b) Qualifying MA organizations are required to attest within 2 months after the close of...

  10. 42 CFR 495.210 - Meaningful EHR user attestation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 5 2014-10-01 2014-10-01 false Meaningful EHR user attestation. 495.210 Section... INCENTIVE PROGRAM Requirements Specific to Medicare Advantage (MA) Organizations § 495.210 Meaningful EHR... EHR user. (b) Qualifying MA organizations are required to attest within 2 months after the close of...

  11. 42 CFR 495.210 - Meaningful EHR user attestation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 5 2012-10-01 2012-10-01 false Meaningful EHR user attestation. 495.210 Section... INCENTIVE PROGRAM Requirements Specific to Medicare Advantage (MA) Organizations § 495.210 Meaningful EHR... EHR user. (b) Qualifying MA organizations are required to attest within 60 days after the close of...

  12. Meaningful Literacy: Writing Poetry in the Language Classroom

    ERIC Educational Resources Information Center

    Hanauer, David I.

    2012-01-01

    This paper develops the concept of meaningful literacy and offers a classroom methodology--poetry writing--that manifests this approach to ESL/EFL literacy instruction. The paper is divided into three sections. The first deals with the concept of meaningful literacy learning in second and foreign language pedagogy; the second summarizes empirical…

  13. Meaningfulness of service and marital satisfaction in Army couples.

    PubMed

    Bergmann, Jeffrey S; Renshaw, Keith D; Allen, Elizabeth S; Markman, Howard J; Stanley, Scott M

    2014-10-01

    The vast numbers of military service members who have been deployed since 2001 highlights the need to better understand relationships of military couples. A unique consideration in military couples is the concept of meaningfulness of service, or the value service members and their partners place on military service in spite of the sacrifices it requires. In a sample of 606 Army couples, the authors used path analysis to examine how male service members' and female spouses' perceived meaningfulness of service added to the prediction of marital satisfaction in both members of the couple, when accounting for service members' PTSD symptoms. Spouses' perceived meaningfulness of service was linked with higher marital satisfaction in spouses, regardless of service member's perceived meaningfulness of service. Service members' perceived meaningfulness of service was also associated with increased marital satisfaction in service members, but only when their spouses also perceived higher meaningfulness. There were no significant interactions between service members' PTSD and either partner's perceived meaningfulness. Implications for enhanced attention to spousal perceptions of meaningfulness of service are discussed. PMID:25046347

  14. Self-Determination and Meaningful Work: Exploring Socioeconomic Constraints

    PubMed Central

    Allan, Blake A.

    2016-01-01

    This study examined a model of meaningful work among a diverse sample of working adults. From the perspectives of Self-Determination Theory and the Psychology of Working Framework, we tested a structural model with social class and work volition predicting SDT motivation variables, which in turn predicted meaningful work. Partially supporting hypotheses, work volition was positively related to internal regulation and negatively related to amotivation, whereas social class was positively related to external regulation and amotivation. In turn, internal regulation was positively related to meaningful work, whereas external regulation and amotivation were negatively related to meaningful work. Indirect effects from work volition to meaningful work via internal regulation and amotivation were significant, and indirect effects from social class to meaningful work via external regulation and amotivation were significant. This study highlights the important relations between SDT motivation variables and meaningful work, especially the large positive relation between internal regulation and meaningful work. However, results also reveal that work volition and social class may play critical roles in predicting internal regulation, external regulation, and amotivation. PMID:26869970

  15. Self-Determination and Meaningful Work: Exploring Socioeconomic Constraints.

    PubMed

    Allan, Blake A; Autin, Kelsey L; Duffy, Ryan D

    2016-01-01

    This study examined a model of meaningful work among a diverse sample of working adults. From the perspectives of Self-Determination Theory and the Psychology of Working Framework, we tested a structural model with social class and work volition predicting SDT motivation variables, which in turn predicted meaningful work. Partially supporting hypotheses, work volition was positively related to internal regulation and negatively related to amotivation, whereas social class was positively related to external regulation and amotivation. In turn, internal regulation was positively related to meaningful work, whereas external regulation and amotivation were negatively related to meaningful work. Indirect effects from work volition to meaningful work via internal regulation and amotivation were significant, and indirect effects from social class to meaningful work via external regulation and amotivation were significant. This study highlights the important relations between SDT motivation variables and meaningful work, especially the large positive relation between internal regulation and meaningful work. However, results also reveal that work volition and social class may play critical roles in predicting internal regulation, external regulation, and amotivation.

  16. Exploring the Meaningful Learning of Students in Second Life

    ERIC Educational Resources Information Center

    Keskitalo, Tuulikki; Pyykko, Elli; Ruokamo, Heli

    2011-01-01

    This study reports a case study in which a pedagogical model, namely the Global Virtual Education (GloVEd) model, which is based on the teaching-studying-learning process (TSL process) and the characteristics of meaningful learning, is developed and used to evaluate students' meaningful learning experiences during the Global Virtual Collaboration…

  17. Developing Meaningfulness at Work through Emotional Intelligence Training

    ERIC Educational Resources Information Center

    Thory, Kathryn

    2016-01-01

    To date, there remains a significant gap in the human resource development (HRD) literature in understanding how training and development contributes to meaningful work. In addition, little is known about how individuals proactively make their work more meaningful. This article shows how emotional intelligence (EI) training promotes learning about…

  18. Meaningful region extraction based on three-stage unsupervised segmentation algorithm

    NASA Astrophysics Data System (ADS)

    Ben, Zhiwei; Zhao, Xunjie; Li, Chengjin

    2009-11-01

    From a theoretical standpoint, meaningful region segmentation based only on gray level or color usually produces over-segmentation or non-continuous regions. In view of this, we combine a number of classical, powerful algorithms (mean shift clustering, edge detection and region growing) to extract meaningful regions while adding spatial information. The algorithms are connected in sequence and influence one another's results. The experiments indicate that the proposed method can avoid the over-segmentation phenomenon and that its results are readily accepted by human eyes. The experimental results are superior to those of the k-means clustering method in both real-time performance and segmentation quality. Finally, we developed a new procedure to extract meaningful regions by clicking a location in a color image. It has good application prospects and effective real-time performance.
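
    A hedged sketch of such a three-stage chain (mean-shift filtering, edge detection, click-seeded region growing) using OpenCV is shown below; the toy image, parameters, and seed point are assumptions and do not reproduce the authors' exact pipeline.

        import numpy as np
        import cv2

        # Toy color image: a bright disc on a dark background stands in for a "meaningful region".
        img = np.zeros((200, 200, 3), np.uint8)
        cv2.circle(img, (100, 100), 55, (60, 170, 230), -1)
        noise = np.random.default_rng(0).integers(0, 20, img.shape, dtype=np.uint8)
        img = cv2.add(img, noise)

        # Stage 1: mean-shift filtering smooths colors while preserving region boundaries.
        smoothed = cv2.pyrMeanShiftFiltering(img, sp=12, sr=24)

        # Stage 2: edge detection on the smoothed image; edges will block region growth.
        edges = cv2.Canny(cv2.cvtColor(smoothed, cv2.COLOR_BGR2GRAY), 40, 120)

        # Stage 3: region growing from a user "click" (seed), implemented with flood fill.
        mask = np.zeros((img.shape[0] + 2, img.shape[1] + 2), np.uint8)
        mask[1:-1, 1:-1] = (edges > 0).astype(np.uint8)   # non-zero mask pixels are never filled
        seed = (100, 100)                                 # hypothetical click location (x, y)
        flags = 4 | cv2.FLOODFILL_MASK_ONLY | (255 << 8)  # write 255 into the mask, leave image alone
        cv2.floodFill(smoothed, mask, seed, (255, 255, 255),
                      loDiff=(10, 10, 10), upDiff=(10, 10, 10), flags=flags)
        region = mask[1:-1, 1:-1] == 255
        print("extracted region covers", int(region.sum()), "pixels")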

  19. Research iris serial images quality assessment method based on HVS

    NASA Astrophysics Data System (ADS)

    Li, Zhi-hui; Zhang, Chang-hai; Ming, Xing; Zhao, Yong-hua

    2006-01-01

    Iris recognition can be widely used in security and customs, and it provides better security than recognition based on other human features such as fingerprints or faces. Iris image quality is crucial to recognition performance, so reliable image quality assessment is necessary for evaluating iris images. However, there is no uniform criterion for image quality assessment. Image quality can be assessed by objective or subjective evaluation methods; in practice, however, subjective evaluation is cumbersome and not effective for iris recognition, so objective evaluation methods should be used. Based on the multi-scale and selectivity characteristics of the human visual system (HVS) model, this paper presents a new iris image quality assessment method. In this method, the region of interest (ROI) is located, wavelet transform zero-crossings are used to find multi-scale edges, and a multi-scale fusion measure is used to assess iris image quality. In the experiments, both objective and subjective evaluation methods were used to assess iris images. The results show that the method is effective for iris image quality assessment.
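
    The sketch below illustrates the general idea of a multi-scale, wavelet-based sharpness measure for an iris ROI using PyWavelets; detail-coefficient energy stands in for the zero-crossing edge detection, and the fusion weights and toy images are assumptions rather than the paper's values.

        import numpy as np
        import pywt

        def multiscale_edge_energy(roi, wavelet="db2", levels=3):
            """Per-level edge energy from wavelet detail coefficients of an iris ROI."""
            coeffs = pywt.wavedec2(roi, wavelet, level=levels)
            # coeffs[0] is the approximation; each following entry is (cH, cV, cD) for one scale.
            return [float(np.mean(cH**2 + cV**2 + cD**2)) for (cH, cV, cD) in coeffs[1:]]

        def quality_score(roi, weights=(0.5, 0.3, 0.2)):
            """Weighted multi-scale fusion of edge energies; higher suggests a sharper, better-focused ROI."""
            energies = multiscale_edge_energy(roi)
            return sum(w * e for w, e in zip(weights, energies))

        rng = np.random.default_rng(0)
        sharp = rng.random((128, 128))          # toy stand-in for a well-focused iris ROI
        blurred = np.array(sharp)
        for _ in range(4):                      # crude blur: repeated neighbor averaging
            blurred = 0.5 * (blurred + np.roll(blurred, 1, axis=1))
        print("sharp  :", round(quality_score(sharp), 4))
        print("blurred:", round(quality_score(blurred), 4))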

  20. Language Assessment Methods for Three Age Groups of Children.

    ERIC Educational Resources Information Center

    Beck, Ann R.

    1995-01-01

    This article describes results of a survey of licensed Midwestern school-based speech-language pathologists (N=326) regarding methods used to assess the language of children ages 3 to 5 years, 6 to 11 years, and 12 to 18 years. Striking similarities were found in methods used for each age group. The relationship of these methods to recommended…

  1. REGIONAL VULNERABILITY ASSESSMENT OF THE MID-ATLANTIC REGION: EVALUATION OF INTEGRATION METHODS AND ASSESSMENTS RESULTS

    EPA Science Inventory

    This report describes methods for quantitative regional assessment developed by the Regional Vulnerability Assessment (ReVA) program. The goal of ReVA is to develop regional-scale assessments of the magnitude, extent, distribution, and uncertainty of current and anticipated envir...

  2. Adapting Chemical Mixture Risk Assessment Methods to Assess Chemical and Non-Chemical Stressor Combinations

    EPA Science Inventory

    Presentation based on the following abstract: Chemical mixtures risk assessment methods are routinely used. To address combined chemical and nonchemical stressors, component-based approaches may be applicable, depending on the toxic action among diverse stressors. Such methods a...

  3. Improved reliability analysis method based on the failure assessment diagram

    NASA Astrophysics Data System (ADS)

    Zhou, Yu; Zhang, Zheng; Zhong, Qunpeng

    2012-07-01

    With the uncertainties related to operating conditions, in-service non-destructive testing (NDT) measurements and material properties considered in the structural integrity assessment, probabilistic analysis based on the failure assessment diagram (FAD) approach has recently become an important concern. However, the point density revealing the probabilistic distribution characteristics of the assessment points is usually ignored. To obtain more detailed and direct knowledge from the reliability analysis, an improved probabilistic fracture mechanics (PFM) assessment method is proposed. By integrating 2D kernel density estimation (KDE) technology into the traditional probabilistic assessment, the probabilistic density of the randomly distributed assessment points is visualized in the assessment diagram. Moreover, a modified interval sensitivity analysis is implemented and compared with probabilistic sensitivity analysis. The improved reliability analysis method is applied to the assessment of a high pressure pipe containing an axial internal semi-elliptical surface crack. The results indicate that these two methods can give consistent sensitivities of input parameters, but the interval sensitivity analysis is computationally more efficient. Meanwhile, the point density distribution and its contour are plotted in the FAD, thereby better revealing the characteristics of PFM assessment. This study provides a powerful tool for the reliability analysis of critical structures.
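
    A minimal sketch of the kernel density step is given below: Monte Carlo assessment points (Lr, Kr) are drawn from illustrative distributions, compared against a simplified Option-1-style FAD boundary, and their point density is estimated with scipy.stats.gaussian_kde for contour plotting. None of the numerical inputs come from the paper.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(42)
        n = 5000

        # Monte Carlo sample of FAD assessment points (illustrative distributions):
        # Lr = load ratio (applied load / limit load), Kr = fracture ratio (K_I / K_mat).
        Lr = rng.normal(0.6, 0.08, n)
        Kr = rng.lognormal(mean=np.log(0.5), sigma=0.15, size=n)

        def fad_boundary(lr):
            """Simplified Option-1-style FAD curve (valid below the plastic-collapse cut-off)."""
            return (1 - 0.14 * lr**2) * (0.3 + 0.7 * np.exp(-0.65 * lr**6))

        failures = Kr > fad_boundary(Lr)
        print("failure probability estimate:", failures.mean())

        # 2D kernel density estimate of the assessment-point cloud on a grid; contours of
        # `density` would be drawn on the FAD to visualize where the points concentrate.
        kde = gaussian_kde(np.vstack([Lr, Kr]))
        lr_grid, kr_grid = np.meshgrid(np.linspace(0, 1.2, 60), np.linspace(0, 1.2, 60))
        density = kde(np.vstack([lr_grid.ravel(), kr_grid.ravel()])).reshape(lr_grid.shape)
        print("peak point density on grid:", round(float(density.max()), 3))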

  4. Using a Multilevel Assessment Scheme in Reforming Science Methods Courses

    ERIC Educational Resources Information Center

    Baxter, Bonnie K.; Jenkins, Carolyn C.; Southerland, Sherry A.; Wilson, Paula

    2004-01-01

    The development of new courses is strengthened by assessment and a response to the assessment. Two new science methods courses for elementary and secondary preservice teachers were developed, fostered by the Great Salt Lake Project. The preservice teachers designed and performed research projects that they then converted into inquiry-based…

  5. Models and Methods for Assessing Refugee Mental Health Needs.

    ERIC Educational Resources Information Center

    Deinard, Amos S.; And Others

    This background paper on refugee needs assessment discusses the assumptions, goals, objectives, strategies, models, and methods that the state refugee programs can consider in designing their strategies for assessing the mental health needs of refugees. It begins with a set of background assumptions about the ethnic profile of recent refugee…

  6. Comparison of Cognitive Assessment Methods With Heterosocially Anxious College Women.

    ERIC Educational Resources Information Center

    Myszka, Michael T.; And Others

    1986-01-01

    Investigated comparability of self-statements generated by different cognitive assessment methods; effect of an assessment delay on cognitive phenomena; and interrelationships among different cognitive variables. Subjects were heterosocially anxious women (N=64) who engaged in a conversation with a male confederate. Self-statements generated by…

  7. 3rd International Workshop on Designing Empirical Studies: Assessing the Effectiveness of Agile Methods (IWDES 2009)

    NASA Astrophysics Data System (ADS)

    di Penta, Massimiliano; Morasca, Sandro; Sillitti, Alberto

    Assessing the effectiveness of a development methodology is difficult and requires an extensive empirical investigation. Moreover, the design of such investigations is complex since they involve several stakeholders, and their validity can be questioned if they are not replicated in similar and different contexts. Agilists are aware that data collection is important, and the problem of designing and executing meaningful experiments is a common one. This workshop aims at creating a critical mass for the development of new and extensive investigations in the Agile world.

  8. Teaching Physics in a Physiologically Meaningful Manner

    ERIC Educational Resources Information Center

    Plomer, Michael; Jessen, Karsten; Rangelov, Georgi; Meyer, Michael

    2010-01-01

    The learning outcome of a physics laboratory course for medical students was examined in an interdisciplinary field study and is discussed for electrical physiology ("Propagation of Excitation and Nerve Cells"). At the Ludwig-Maximilians-University of Munich (LMU), about 300 medical students at a time were assessed in two successive years.…

  9. A method for assessing the risks of pipeline operations

    SciTech Connect

    Gloven, M.P.

    1996-09-01

    This paper presents a method for assessing the risks of hazardous liquid and natural gas pipeline systems. The method assesses risk by measuring historical and projected performance data against selected benchmarks, which if exceeded, may indicate that the pipeline may have a greater potential for failure or adverse consequence at certain points. Once these areas are determined, plans are developed and implemented to minimize risk.

  10. Comparison of selected multi-criteria assessment methods

    NASA Astrophysics Data System (ADS)

    Krzemiński, Michał

    2016-06-01

    The article presents the results of earlier work, carried out with the author's involvement, which focused on assessing how the choice of multi-criteria evaluation method and of the method used to normalize the input matrix affects the final ranking of the possible variants. The variants were also assessed using fuzzy logic. The aim of this article is to compare the results obtained.
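
    The sketch below illustrates the sensitivity in question: the same weighted-sum (SAW) ranking is computed under three common normalizations of the decision matrix. The matrix, weights, and choice of normalizations are illustrative assumptions, not the methods compared in the article.

        import numpy as np

        # Decision matrix: rows = variants, columns = criteria (all benefit-type for simplicity).
        X = np.array([
            [250.0, 7.0, 0.82],
            [180.0, 9.0, 0.75],
            [300.0, 5.0, 0.90],
        ])
        weights = np.array([0.5, 0.3, 0.2])

        def norm_minmax(x):
            return (x - x.min(0)) / (x.max(0) - x.min(0))

        def norm_vector(x):
            return x / np.sqrt((x ** 2).sum(0))

        def norm_sum(x):
            return x / x.sum(0)

        for name, norm in [("min-max", norm_minmax), ("vector", norm_vector), ("sum", norm_sum)]:
            scores = (norm(X) * weights).sum(axis=1)   # simple additive weighting (SAW)
            ranking = np.argsort(-scores) + 1          # variant numbers, best first
            print(f"{name:8s} normalization -> scores {np.round(scores, 3)}, ranking {ranking}")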

  11. Integrating rangeland and pastureland assessment methods into a national grazingland assessment approach

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Grazingland resource allocation and decision making at the national scale need to be based on comparable metrics. However, in the USA, rangelands and pasturelands have traditionally been assessed using different methods and indicators. These differences in assessment methods limit the ability to con...

  12. Methods of Postural Assessment Used for Sports Persons

    PubMed Central

    Singla, Deepika

    2014-01-01

    The occurrence of postural defects has become very common nowadays, not only in the general population but also in sports persons. Various methods can be used to assess these postural defects, and they have evolved over a period of many years. This paper is the first of its kind to summarize the methods of postural assessment which have been used, and which can be used, for the evaluation of postural abnormalities in sports persons, such as visual observation, plumbline, goniometry, photographic, radiographic, photogrammetric, flexiruler and electromagnetic tracking device methods. We recommend that more postural evaluation studies based on the photogrammetric method be carried out in the future. PMID:24959470

  13. Robust methods for assessing the accuracy of linear interpolated DEM

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Shi, Wenzhong; Liu, Eryong

    2015-02-01

    Methods for assessing the accuracy of a digital elevation model (DEM) with emphasis on robust methods have been studied in this paper. Based on the squared DEM residual population generated by the bi-linear interpolation method, three average-error statistics including (a) mean, (b) median, and (c) M-estimator are thoroughly investigated for measuring the interpolated DEM accuracy. Correspondingly, their confidence intervals are also constructed for each average error statistic to further evaluate the DEM quality. The first method mainly utilizes the student distribution while the second and third are derived from the robust theories. These innovative robust methods possess the capability of counteracting the outlier effects or even the skew distributed residuals in DEM accuracy assessment. Experimental studies using Monte Carlo simulation have commendably investigated the asymptotic convergence behavior of confidence intervals constructed by these three methods with the increase of sample size. It is demonstrated that the robust methods can produce more reliable DEM accuracy assessment results compared with those by the classical t-distribution-based method. Consequently, these proposed robust methods are strongly recommended for assessing DEM accuracy, particularly for those cases where the DEM residual population is evidently non-normal or heavily contaminated with outliers.
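
    A minimal sketch of the three average-error statistics on a toy squared-residual population is given below; the Huber M-estimator is implemented by simple iteratively reweighted averaging with a conventional tuning constant, which is an assumption rather than the paper's exact estimator.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # Toy DEM residuals: mostly small interpolation errors plus a few gross outliers (m).
        residuals = np.concatenate([rng.normal(0.0, 0.5, 480), rng.normal(0.0, 6.0, 20)])
        sq = residuals ** 2

        # (a) Mean of squared residuals with a Student-t confidence interval.
        n = sq.size
        mean = sq.mean()
        half = stats.t.ppf(0.975, n - 1) * sq.std(ddof=1) / np.sqrt(n)
        print(f"mean   : {mean:.3f}  (95% CI {mean - half:.3f} .. {mean + half:.3f})")

        # (b) Median of squared residuals (robust to the outliers).
        print(f"median : {np.median(sq):.3f}")

        # (c) Huber M-estimator of location via iteratively reweighted averaging.
        def huber_location(x, c=1.345, iters=50):
            mu = np.median(x)
            scale = 1.4826 * np.median(np.abs(x - mu))        # MAD-based robust scale
            if scale == 0:
                scale = 1.0
            for _ in range(iters):
                r = (x - mu) / scale
                w = np.where(np.abs(r) <= c, 1.0, c / np.maximum(np.abs(r), 1e-12))  # Huber weights
                mu = np.sum(w * x) / np.sum(w)
            return mu

        print(f"Huber  : {huber_location(sq):.3f}")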

  14. Ideas in Practice - Making Motion More Meaningful

    ERIC Educational Resources Information Center

    Cutchins, Malcolm A.

    1971-01-01

    Three methods of studying motion are described. A wind tunnel is utilized in demonstrating flutter. Computer graphics with an oscilloscope are used to investigate the natural modes of vibration and to track the simulated motion of missiles. (TS)

  15. Deriving meaningful climate-effects data from social media

    NASA Astrophysics Data System (ADS)

    Fuka, M. Z.; Fuka, D. R.

    2011-12-01

    This paper presents our research on extracting meaningful climate indicator data from unsolicited observations ("tweets") made by Twitter users regarding their physical surroundings and events occurring around them. Our goal is to establish whether the existing understanding of climate indicator data collected by more traditional means could be usefully supplemented by information derived from the potentially rich but also statistically diffuse data resource represented by social media. To this end, we've initiated an ongoing effort to collect and analyze Twitter observations made on a wide variety of climate-related phenological, biological, epidemiological and meteorological phenomena. We report on our acquisition methodology and discuss in particular our rationale for selecting keywords, phrases and filters for our searches. The iterative process of assembling an inventory of hundreds of climate-related search terms has in and of itself yielded interesting and sometimes surprising insights on what is and isn't noticed and commented on via social media with respect to climate indicator phenomenology. We report some of the highlights of those analyses along with significant findings from the data acquisition to date. In conclusion, we discuss our preliminary assessment of the approach, how it can be generalized and extended for social media other than Twitter, and how the resulting data could be used to serve climate science objectives.

  16. Write Another Poem about Marigold: Meaningful Writing as a Process of Change.

    ERIC Educational Resources Information Center

    Teichmann, Sandra Gail

    1995-01-01

    Considers a process approach toward the goal of meaningful writing which may aid in positive personal change. Outlines recent criticism of contemporary poetry; argues against tradition and practice of craft in writing poetry. Proposes a means of writing centered on a method of inquiry involving elements of self-involvement, curiosity, and risk to…

  17. Unit Costs Provide Basis for Meaningful Evaluation of Efficiency of TV Courses.

    ERIC Educational Resources Information Center

    Jones, Gardner; And Others

    1969-01-01

    Efficient use of television for teaching cannot be achieved without meaningful cost comparisons with conventional classroom methods. Considerable effort has been spent at the University of Michigan in developing a unit cost basis for televised filmed lectures to include not only salaries, but administrative costs, supplies, amortization of…

  18. Project 6: Cumulative Risk Assessment (CRA) Methods and Applications

    EPA Science Inventory

    Project 6: CRA Methods and Applications addresses the need to move beyond traditional risk assessment practices by developing CRA methods to integrate and evaluate impacts of chemical and nonchemical stressors on the environment and human health. Project 6 has three specific obje...

  19. A Comparison of Treatment Integrity Assessment Methods for Behavioral Intervention

    ERIC Educational Resources Information Center

    Koh, Seong A.

    2010-01-01

    The purpose of this study was to examine the similarity of outcomes from three different treatment integrity (TI) methods, and to identify the method which best corresponded to the assessment of a child's behavior. Six raters were recruited through individual contact via snowball sampling. A modified intervention component list and 19 video clips…

  20. Methods for Assessing Honeycomb Sandwich Panel Wrinkling Failures

    NASA Technical Reports Server (NTRS)

    Zalewski, Bart F.; Dial, William B.; Bednarcyk, Brett A.

    2012-01-01

    Efficient closed-form methods for predicting the facesheet wrinkling failure mode in sandwich panels are assessed. Comparisons were made with finite element model predictions for facesheet wrinkling, and a validated closed-form method was implemented in the HyperSizer structure sizing software.
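
    One classical closed-form estimate of the facesheet wrinkling stress is the Hoff-type expression σ_wr ≈ q (E_f E_c G_c)^(1/3) with q ≈ 0.5; whether this particular expression is among the methods assessed in the paper is not stated here, and the property values in the sketch below are purely illustrative.

        def hoff_wrinkling_stress(E_face, E_core, G_core, q=0.5):
            """Hoff-type facesheet wrinkling estimate: sigma_wr = q * (E_f * E_c * G_c) ** (1/3)."""
            return q * (E_face * E_core * G_core) ** (1.0 / 3.0)

        # Illustrative properties (Pa): aluminum facesheet on a honeycomb core.
        E_face = 70e9      # facesheet Young's modulus
        E_core = 300e6     # core flatwise (through-thickness) modulus
        G_core = 120e6     # core transverse shear modulus

        sigma_wr = hoff_wrinkling_stress(E_face, E_core, G_core)
        applied = 90e6     # hypothetical facesheet compressive stress from the panel analysis
        print(f"wrinkling stress estimate: {sigma_wr / 1e6:.0f} MPa")
        print(f"margin of safety: {sigma_wr / applied - 1:.2f}")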

  1. AN APPROACH TO METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Human exposure assessment studies require methods that are rapid, cost-effective and have a high sample through-put. The development of analytical methods for exposure studies should be based on specific information for individual studies. Human exposure studies suggest that di...

  2. A clinically meaningful theory of outcome measures in rehabilitation medicine.

    PubMed

    Massof, Robert W

    2010-01-01

    Comparative effectiveness research in rehabilitation medicine requires the development and validation of clinically meaningful and scientifically rigorous measurements of patient states and theories that explain and predict outcomes of intervention. Patient traits are latent (unobservable) variables that can be measured only by inference from observations of surrogate manifest (observable) variables. In the behavioral sciences, latent variables are analogous to intensive physical variables such as temperature and manifest variables are analogous to extensive physical variables such as distance. Although only one variable at a time can be measured, the variable can have a multidimensional structure that must be understood in order to explain disagreements among different measures of the same variable. The use of Rasch theory to measure latent trait variables can be illustrated with a balance scale metaphor that has randomly added variability in the weights of the objects being measured. Knowledge of the distribution of the randomly added variability provides the theoretical structure for estimating measures from ordinal observation scores (e.g., performance measures or rating scales) using statistical inference. In rehabilitation medicine, the latent variable of primary interest is the patient's functional ability. Functional ability can be estimated from observations of surrogate performance measures (e.g., speed and accuracy) or self-report of the difficulty the patient experiences performing specific activities. A theoretical framework borrowed from project management, called the Activity Breakdown Structure (ABS), guides the choice of activities for assessment, based on the patient's value judgments, to make the observations clinically meaningful. In the case of low vision, the functional ability measure estimated from Rasch analysis of activity difficulty ratings was discovered to be a two-dimensional variable. The two visual function dimensions are independent
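
    As a concrete illustration of estimating a latent ability measure from ordinal observations, the sketch below implements the dichotomous Rasch item response function and a grid-search maximum-likelihood person estimate; the items, difficulties, and responses are hypothetical, and this is not the paper's actual analysis.

        import numpy as np

        def rasch_prob(theta, b):
            """Dichotomous Rasch model: P(success) given person ability theta and item difficulty b."""
            return 1.0 / (1.0 + np.exp(-(theta - b)))

        def estimate_ability(responses, difficulties, grid=np.linspace(-4, 4, 801)):
            """Maximum-likelihood person measure by grid search over the ability continuum."""
            responses = np.asarray(responses, float)
            log_lik = [np.sum(responses * np.log(rasch_prob(t, difficulties)) +
                              (1 - responses) * np.log(1 - rasch_prob(t, difficulties)))
                       for t in grid]
            return grid[int(np.argmax(log_lik))]

        # Hypothetical item difficulties (logits) for activities a low-vision patient rates,
        # ordered from easy (e.g., reading headlines) to hard (e.g., reading medicine labels).
        difficulties = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
        responses = [1, 1, 1, 0, 0]   # 1 = "can do without difficulty"
        print("estimated functional ability (logits):", estimate_ability(responses, difficulties))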

  3. Exploring valid and reliable assessment methods for care management education.

    PubMed

    Gennissen, Lokke; Stammen, Lorette; Bueno-de-Mesquita, Jolien; Wieringa, Sietse; Busari, Jamiu

    2016-07-01

    Purpose: It is assumed that the use of valid and reliable assessment methods can facilitate the development of medical residents' management and leadership competencies. To justify this assertion, the perceptions of an expert panel of health care leaders were explored on assessment methods used for evaluating care management (CM) development in Dutch residency programs. This paper aims to investigate how assessors and trainees value these methods and examine for any inherent benefits or shortcomings when they are applied in practice. Design/methodology/approach: A Delphi survey was conducted among members of the platform for medical leadership in The Netherlands. This panel of experts was made up of clinical educators, practitioners and residents interested in CM education. Findings: Of the respondents, 40 (55.6 per cent) and 31 (43 per cent) participated in the first and second rounds of the Delphi survey, respectively. The respondents agreed that assessment methods currently being used to measure residents' CM competencies were weak, though feasible for use in many residency programs. Multi-source feedback (MSF, 92.1 per cent), portfolio/e-portfolio (86.8 per cent) and knowledge testing (76.3 per cent) were identified as the most commonly known assessment methods with familiarity rates exceeding 75 per cent. Practical implications: The findings suggested that an "assessment framework" comprising MSF, portfolios, individual process improvement projects or self-reflections and observations in clinical practice should be used to measure CM competencies in residents. Originality/value: This study reaffirms the need for objective methods to assess CM skills in post-graduate medical education, as there was not a single assessment method that stood out as the best instrument. PMID:27397747

  4. From Magic Show to Meaningful Science

    ERIC Educational Resources Information Center

    Crawford, Teresa

    2003-01-01

    Science teachers understand the importance of gaining student interest to promote learning and know how challenging it is to spark the curiosity needed to truly engage students in the processes of "doing science." One often-used method of motivation is the demonstration of science in action, such as "discrepant events." They are…

  5. Educating toward Meaningful Tefillah. Notes from ATID.

    ERIC Educational Resources Information Center

    Finkelman, Yoel, Ed.

    This collection of articles serves as a record of some of the deliberations of members of the Academy for Torah Initiatives and Directions (ATID) (Jerusalem, Israel). The collection captures the collective thinking that the ATID fellows and faculty members underwent as they explored methods of transforming prayer in Jewish schools into a more…

  6. Pronunciation and the Frequency Meaningfulness Effect in Children's Frequency Discrimination.

    ERIC Educational Resources Information Center

    Ghatala, Elizabeth S.; And Others

    In an absolute frequency judgment task, 130 sixth graders received either high-frequency (Hi-F), low-frequency, high-meaningfulness (Lo-F/Hi-M), or low-frequency, low-meaningfulness (Lo-F/Lo-M) words selected from the 1944 Thorndike-Lorge list. Subjects were asked to either pronounce the words aloud, listen to the examiner prounounce the written…

  7. A method of computerized assessment in introductory physics

    NASA Astrophysics Data System (ADS)

    Tikhomirov, Yu V.

    2000-05-01

    A method of computerized assessment of students' ability to replicate basic physical facts and procedural sequences is presented. The method is based on the construction of physical definitions, laws, formulae, phenomena, etc, from logical (symbolic or graphical) elements. Such testing is characterized by high reliability and releases both teachers' and students' time for the creative educational process. Experimental data on the developed method based on introductory physics teaching at a technical university are presented.

  8. Physician Beliefs about the Impact of Meaningful use of the EHR

    PubMed Central

    Ting, D.Y.; Healey, M.; Lipsitz, S.R.; Karson, A.S.; Einbinder, J. S.; Leinen, L.; Suric, V.; Bates, D.W.

    2014-01-01

    Background: As adoption and use of electronic health records (EHRs) grows in the United States, there is a growing need in the field of applied clinical informatics to evaluate physician perceptions and beliefs about the impact of EHRs. The meaningful use of EHR incentive program provides a suitable context to examine physician beliefs about the impact of EHRs. Objective: Contribute to the sparse literature on physician beliefs about the impact of EHRs in areas such as quality of care, effectiveness of care, and delivery of care. Methods: A cross-sectional online survey of physicians at two academic medical centers (AMCs) in the northeast who were preparing to qualify for the meaningful use of EHR incentive program. Results: Of the 1,797 physicians at both AMCs who were preparing to qualify for the incentive program, 967 completed the survey for an overall response rate of 54%. Only 23% and 27% of physicians agreed or strongly agreed that meaningful use of the EHR will help them improve the care they personally deliver and improve quality of care respectively. Physician specialty was significantly associated with beliefs; e.g., 35% of primary care physicians agreed or strongly agreed that meaningful use will improve quality of care compared to 26% of medical specialists and 21% of surgical specialists (p=0.009). Satisfaction with outpatient EHR was also significantly related to all belief items. Conclusions: Only about a quarter of physicians in our study responded positively that meaningful use of the EHR will improve quality of care and the care they personally provide. These findings are similar to and extend findings from qualitative studies about negative perceptions that physicians hold about the impact of EHRs. Factors outside of the regulatory context, such as physician beliefs, need to be considered in the implementation of the meaningful use of the EHR incentive program. PMID:25298817

  9. Herbal hepatotoxicity: Challenges and pitfalls of causality assessment methods

    PubMed Central

    Teschke, Rolf; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2013-01-01

    The diagnosis of herbal hepatotoxicity or herb induced liver injury (HILI) represents a particular clinical and regulatory challenge with major pitfalls for the causality evaluation. At the day HILI is suspected in a patient, physicians should start assessing the quality of the used herbal product, optimizing the clinical data for completeness, and applying the Council for International Organizations of Medical Sciences (CIOMS) scale for initial causality assessment. This scale is structured, quantitative, liver specific, and validated for hepatotoxicity cases. Its items provide individual scores, which together yield causality levels of highly probable, probable, possible, unlikely, and excluded. After completion by additional information including raw data, this scale with all items should be reported to regulatory agencies and manufacturers for further evaluation. The CIOMS scale is preferred as tool for assessing causality in hepatotoxicity cases, compared to numerous other causality assessment methods, which are inferior on various grounds. Among these disputed methods are the Maria and Victorino scale, an insufficiently qualified, shortened version of the CIOMS scale, as well as various liver unspecific methods such as the ad hoc causality approach, the Naranjo scale, the World Health Organization (WHO) method, and the Karch and Lasagna method. An expert panel is required for the Drug Induced Liver Injury Network method, the WHO method, and other approaches based on expert opinion, which provide retrospective analyses with a long delay and thereby prevent a timely assessment of the illness in question by the physician. In conclusion, HILI causality assessment is challenging and is best achieved by the liver specific CIOMS scale, avoiding pitfalls commonly observed with other approaches. PMID:23704820

  10. Herbal hepatotoxicity: challenges and pitfalls of causality assessment methods.

    PubMed

    Teschke, Rolf; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2013-05-21

    The diagnosis of herbal hepatotoxicity or herb induced liver injury (HILI) represents a particular clinical and regulatory challenge with major pitfalls for the causality evaluation. At the day HILI is suspected in a patient, physicians should start assessing the quality of the used herbal product, optimizing the clinical data for completeness, and applying the Council for International Organizations of Medical Sciences (CIOMS) scale for initial causality assessment. This scale is structured, quantitative, liver specific, and validated for hepatotoxicity cases. Its items provide individual scores, which together yield causality levels of highly probable, probable, possible, unlikely, and excluded. After completion by additional information including raw data, this scale with all items should be reported to regulatory agencies and manufacturers for further evaluation. The CIOMS scale is preferred as tool for assessing causality in hepatotoxicity cases, compared to numerous other causality assessment methods, which are inferior on various grounds. Among these disputed methods are the Maria and Victorino scale, an insufficiently qualified, shortened version of the CIOMS scale, as well as various liver unspecific methods such as the ad hoc causality approach, the Naranjo scale, the World Health Organization (WHO) method, and the Karch and Lasagna method. An expert panel is required for the Drug Induced Liver Injury Network method, the WHO method, and other approaches based on expert opinion, which provide retrospective analyses with a long delay and thereby prevent a timely assessment of the illness in question by the physician. In conclusion, HILI causality assessment is challenging and is best achieved by the liver specific CIOMS scale, avoiding pitfalls commonly observed with other approaches.

  11. Global Considerations in Hierarchical Clustering Reveal Meaningful Patterns in Data

    PubMed Central

    Varshavsky, Roy; Horn, David; Linial, Michal

    2008-01-01

    Background: A hierarchy, characterized by tree-like relationships, is a natural method of organizing data in various domains. When considering an unsupervised machine learning routine, such as clustering, a bottom-up hierarchical (BU, agglomerative) algorithm is used as a default and is often the only method applied. Methodology/Principal Findings: We show that hierarchical clustering algorithms that involve global considerations, such as top-down (TD, divisive) or glocal (global-local) algorithms, are better suited to reveal meaningful patterns in the data. This is demonstrated by testing the correspondence between the results of several algorithms (TD, glocal and BU) and the correct annotations provided by experts. The correspondence was tested in multiple domains including gene expression experiments, stock trade records and functional protein families. The performance of each of the algorithms is evaluated by statistical criteria that are assigned to clusters (nodes of the hierarchy tree) based on expert-labeled data. Whereas TD algorithms perform better on global patterns, BU algorithms perform well and are advantageous when finer granularity of the data is sought. In addition, a novel TD algorithm that is based on the genuine density of the data points is presented and is shown to outperform other divisive and agglomerative methods. Application of the algorithm to more than 500 protein sequences belonging to ion channels illustrates the potential of the method for inferring overlooked functional annotations. ClustTree, a graphical Matlab toolbox for applying various hierarchical clustering algorithms and testing their quality, is made available. Conclusions: Although currently rarely used, global approaches, in particular TD or glocal algorithms, should be considered in the exploratory process of clustering. In general, applying unsupervised clustering methods can leverage the quality of manually created mappings of protein families. As demonstrated, it can also provide
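
    A hedged sketch contrasting a bottom-up (agglomerative) clustering with a simple top-down alternative (bisecting k-means) is given below, using SciPy and scikit-learn on synthetic data; the glocal algorithm and the density-based TD algorithm proposed in the paper are not reproduced here.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        # Toy data: three well-separated groups standing in for expert-annotated families.
        X = np.vstack([rng.normal(c, 0.4, (40, 2)) for c in ((0, 0), (4, 0), (2, 4))])

        # Bottom-up (agglomerative): merge the closest clusters until the full tree is built,
        # then cut the tree to obtain k flat clusters.
        bu_labels = fcluster(linkage(X, method="average"), t=3, criterion="maxclust")

        # Top-down (divisive, here via bisecting k-means): recursively split the largest cluster.
        def bisecting_kmeans(X, k):
            labels = np.zeros(len(X), dtype=int)
            while labels.max() + 1 < k:
                target = np.bincount(labels).argmax()           # split the largest cluster
                idx = np.where(labels == target)[0]
                sub = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X[idx])
                labels[idx[sub == 1]] = labels.max() + 1
            return labels

        td_labels = bisecting_kmeans(X, 3)
        print("bottom-up cluster sizes:", np.bincount(bu_labels)[1:])
        print("top-down  cluster sizes:", np.bincount(td_labels))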

  12. Safety assessment and detection methods of genetically modified organisms.

    PubMed

    Xu, Rong; Zheng, Zhe; Jiao, Guanglian

    2014-01-01

    Genetically modified organisms (GMOs) are gaining importance in agriculture as well as in the production of food and feed. Along with the development of GMOs, health and food safety concerns have been raised. These concerns make it necessary to set up a strict system for the food safety assessment of GMOs. The food safety assessment of GMOs, the current development status of safe and precise transgenic technologies, and GMO detection are discussed in this review. The recent patents on GMOs and their detection methods are also reviewed. This review can provide an elementary introduction to how GMOs are assessed and detected. PMID:25342147

  13. A mixed methods assessment of coping with pediatric cancer

    PubMed Central

    Alderfer, Melissa A.; Deatrick, Janet A.; Marsac, Meghan L.

    2014-01-01

    The purpose of this study was to describe child coping and parent coping assistance with cancer-related stressors during treatment. Fifteen children (aged 6-12) with cancer and their parents (N = 17) completed semi-structured interviews and self-report measures to assess coping and coping assistance. Results suggest families utilized a broad array of approach and avoidance strategies to manage cancer and its treatment. Quantitative and qualitative assessments provided complementary and unique contributions to understanding coping among children with cancer and their parents. Using a mixed methods approach to assess coping provides a richer understanding of families’ experiences, which can better inform clinical practice. PMID:24428250

  14. Safety assessment and detection methods of genetically modified organisms.

    PubMed

    Xu, Rong; Zheng, Zhe; Jiao, Guanglian

    2014-01-01

    Genetically modified organisms (GMOs) are gaining importance in agriculture as well as in the production of food and feed. Along with the development of GMOs, health and food safety concerns have been raised. These concerns make it necessary to set up a strict system for the food safety assessment of GMOs. The food safety assessment of GMOs, the current development status of safe and precise transgenic technologies, and GMO detection are discussed in this review. The recent patents on GMOs and their detection methods are also reviewed. This review can provide an elementary introduction to how GMOs are assessed and detected.

  15. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.

    PubMed

    Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R

    2014-03-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.

  16. Building Passion Develops Meaningful Mentoring Relationships among Canadian Physiotherapists

    PubMed Central

    Ezzat, Allison M.

    2012-01-01

    ABSTRACT Purpose: To describe the meaning of mentorship among Canadian orthopaedic physiotherapists. Methods: As part of a phenomenological qualitative study, 14 registered physiotherapists (13 women, 1 man) each participated in a single 60-minute, semi-structured face-to-face interview. Participants reflected on their experiences in receiving and providing mentorship and described the impact of mentorship on their careers. Interviews were transcribed verbatim and analyzed using a phenomenological approach. Results: Participants described mentorship as any nurturing process in which they used their skills and experience to guide, teach, and encourage a less skilled or less experienced colleague for the purpose of promoting professional and personal development. Participants experienced mentorship as a positive, reflective phenomenon. According to participants, the true essence of mentorship in physiotherapy consists of building passion, keeping fresh, making us stronger, and promoting deeper learning. Conclusions: Building a shared passion for learning, as well as a mentor's commitment to the mentee's success, forms the foundation of meaningful mentorship in physiotherapy. These mentoring relationships enable physiotherapists to adapt to the changing health care system, advance patient care, and develop the profession. PMID:23277688

  17. The meaningful encounter: patient and next-of-kin stories about their experience of meaningful encounters in health-care.

    PubMed

    Gustafsson, Lena-Karin; Snellman, Ingrid; Gustafsson, Christine

    2013-12-01

    This study focuses on the meaningful encounters of patients and next of kin, as seen from their perspective. Identifying the attributes within meaningful encounters is important for increased understanding of caring and to expand and develop earlier formulated knowledge about caring relationships. Caring theory about the caring relationship provided a point of departure to illuminate the meaningful encounter in healthcare contexts. A qualitative explorative design with a hermeneutic narrative approach was used to analyze and interpret written narratives. The phases of the analysis were naïve interpretation, structure analysis on two different levels (narrative structure, and deep structure through metaphors) and finally a dialectic interpretation. The narratives revealed the meaning of the meaningful encounter as sharing, a nourishing fellowship, common responsibility and coming together, experienced as safety and warmth, that gives, by extension, life-changing moments, a healing force and dissipated insight. The meaningful encounter can be seen as a complex phenomenon with various attributes. Understanding the meaningful encounter will enable nurses to plan and provide professional care, based on caring science, focusing on patient and next-of-kin experiences. PMID:23181930

  18. Analysis of CASP8 targets, predictions and assessment methods

    PubMed Central

    Shi, ShuoYong; Pei, Jimin; Sadreyev, Ruslan I.; Kinch, Lisa N.; Majumdar, Indraneel; Tong, Jing; Cheng, Hua; Kim, Bong-Hyun; Grishin, Nick V.

    2009-01-01

    Results of the recent Critical Assessment of Techniques for Protein Structure Prediction, CASP8, present several valuable sources of information. First, CASP targets comprise a realistic sample of currently solved protein structures and exemplify the corresponding challenges for predictors. Second, the plethora of predictions by all possible methods provides an unusually rich material for evolutionary analysis of target proteins. Third, CASP results show the current state of the field and highlight specific problems in both predicting and assessing. Finally, these data can serve as grounds to develop and analyze methods for assessing prediction quality. Here we present results of our analysis in these areas. Our objective is not to duplicate CASP assessment, but to use our unique experience as former CASP5 assessors and CASP8 predictors to (i) offer more insights into CASP targets and predictions based on expert analysis, including invaluable analysis prior to target structure release; and (ii) develop an assessment methodology tailored towards current challenges in the field. Specifically, we discuss preparing target structures for assessment, parsing protein domains, balancing evaluations based on domains and on whole chains, dividing targets into categories and developing new evaluation scores. We also present evolutionary analysis of the most interesting and challenging targets. Database URL: Our results are available as a comprehensive database of targets and predictions at http://prodata.swmed.edu/CASP8. PMID:20157476

  19. Analysis of CASP8 targets, predictions and assessment methods.

    PubMed

    Shi, Shuoyong; Pei, Jimin; Sadreyev, Ruslan I; Kinch, Lisa N; Majumdar, Indraneel; Tong, Jing; Cheng, Hua; Kim, Bong-Hyun; Grishin, Nick V

    2009-01-01

    Results of the recent Critical Assessment of Techniques for Protein Structure Prediction, CASP8, present several valuable sources of information. First, CASP targets comprise a realistic sample of currently solved protein structures and exemplify the corresponding challenges for predictors. Second, the plethora of predictions by all possible methods provides an unusually rich material for evolutionary analysis of target proteins. Third, CASP results show the current state of the field and highlight specific problems in both predicting and assessing. Finally, these data can serve as grounds to develop and analyze methods for assessing prediction quality. Here we present results of our analysis in these areas. Our objective is not to duplicate CASP assessment, but to use our unique experience as former CASP5 assessors and CASP8 predictors to (i) offer more insights into CASP targets and predictions based on expert analysis, including invaluable analysis prior to target structure release; and (ii) develop an assessment methodology tailored towards current challenges in the field. Specifically, we discuss preparing target structures for assessment, parsing protein domains, balancing evaluations based on domains and on whole chains, dividing targets into categories and developing new evaluation scores. We also present evolutionary analysis of the most interesting and challenging targets. Database URL: Our results are available as a comprehensive database of targets and predictions at http://prodata.swmed.edu/CASP8. PMID:20157476

  20. [Study on the risk assessment method of regional groundwater pollution].

    PubMed

    Yang, Yan; Yu, Yun-Jiang; Wang, Zong-Qing; Li, Ding-Long; Sun, Hong-Wei

    2013-02-01

    Based on the boundary elements of system risk assessment, a regional groundwater pollution risk assessment index system was preliminarily established, comprising regional groundwater specific vulnerability assessment, regional pollution source characteristics assessment, and health risk assessment of regionally featured pollutants. The three sub-evaluation systems were coupled with a multi-index comprehensive method, the risk was characterized with the spatial analysis tools of ArcMap, and a new method for evaluating regional groundwater pollution risk, suitable for different natural conditions and different types of pollution, was established. Taking Changzhou as an example, the risk of shallow groundwater pollution was studied with the new method. The results showed that the groundwater vulnerability index in Changzhou is high and unevenly distributed; that the distribution of pollution sources is concentrated and has a great impact on groundwater pollution risk; and that, influenced by the pollutants and pollution sources, health risk values are high in the urban area of Changzhou. The pollution risk of shallow groundwater is high and unevenly distributed, concentrated north of the Anjia-Xuejia-Zhenglu line, in the city center, and in the southeast, where human activities are more intense and pollution sources are dense.
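
    The multi-index coupling of the three sub-evaluations is, in its simplest form, a weighted combination of normalized sub-indices computed per grid cell; the Python sketch below illustrates that idea with placeholder weights (the weights, normalization choice and ArcMap spatial handling are assumptions, not the study's parameters).

      import numpy as np

      def normalize(x):
          """Rescale an index to [0, 1] by min-max scaling (an assumed choice)."""
          x = np.asarray(x, dtype=float)
          return (x - x.min()) / (x.max() - x.min())

      def pollution_risk(vulnerability, source_load, health_risk, weights=(0.4, 0.3, 0.3)):
          """Weighted multi-index combination of the three sub-evaluations.

          The weights are placeholders for illustration; in practice they would
          be derived for the study area (e.g., by expert judgement or AHP).
          """
          parts = [normalize(vulnerability), normalize(source_load), normalize(health_risk)]
          return sum(w * p for w, p in zip(weights, parts))

      # One value per grid cell of the study area (toy numbers).
      print(pollution_risk([3, 7, 9], [10, 80, 40], [0.1, 0.6, 0.9]))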

  1. [Assessment of ecosystem and its services conservation: indicators and methods].

    PubMed

    Lü, Yi-He; Zhang, Li-Wei; Wang, Jiang-Lei

    2013-05-01

    Conserving ecosystems and their services is a frontier and hot topic in conservation ecology research. This paper reviewed the newest concepts and methods in the assessment of ecosystem and ecosystem services conservation, with a focus on the indicators and criteria for assessing the conservation status and endangerment level of ecosystems, as well as the main methods of ecosystem services assessment and conservation (including benefit transfer, systematic modeling, and quantitative indicator-based estimation). In view of the research progress and the demands of ecological conservation in China, several issues to be urgently solved were put forward: 1) formulating the indicators, criteria, and methods suitable for the assessment of ecosystem conservation in China; 2) developing methodologies for the quantitative assessment of ecosystem services; 3) determining the demands and optimal spatial arrangement of ecosystem and ecosystem services conservation in China; and 4) establishing policies and incentive mechanisms for ecosystem and ecosystem services conservation. The resolution of these issues would provide an important guarantee for the development of ecological civilization in China.

  2. [Assessment of ecosystem and its services conservation: indicators and methods].

    PubMed

    Lü, Yi-He; Zhang, Li-Wei; Wang, Jiang-Lei

    2013-05-01

    Conserving ecosystems and their services is a frontier and hot topic in conservation ecology research. This paper reviewed the newest concepts and methods in the assessment of ecosystem and ecosystem services conservation, with a focus on the indicators and criteria for assessing the conservation status and endangerment level of ecosystems, as well as the main methods of ecosystem services assessment and conservation (including benefit transfer, systematic modeling, and quantitative indicator-based estimation). In view of the research progress and the demands of ecological conservation in China, several issues to be urgently solved were put forward: 1) formulating the indicators, criteria, and methods suitable for the assessment of ecosystem conservation in China; 2) developing methodologies for the quantitative assessment of ecosystem services; 3) determining the demands and optimal spatial arrangement of ecosystem and ecosystem services conservation in China; and 4) establishing policies and incentive mechanisms for ecosystem and ecosystem services conservation. The resolution of these issues would provide an important guarantee for the development of ecological civilization in China. PMID:24015539

  3. Method inventory for assessment of physical activity at VDU workplaces.

    PubMed

    Ellegast, Rolf; Weber, Britta; Mahlberg, Rena

    2012-01-01

    Physical inactivity and prolonged static work tasks may seriously affect health. There are numerous indications that promoting physical activity (PA) at sedentary workplaces can reduce these health risks. However, PA interventions have so far rarely been documented on the basis of medical parameters, and effects on PA behavior are often studied only through subjective self-assessment. For this reason, an extensive method inventory was developed, consisting of objective PA assessment methods and various methods for documenting PA-related health outcomes. The method inventory has been tested in a pilot intervention study at office workplaces. The current paper presents and discusses part of the applied inventory. The methods considered here demonstrated several positive intervention effects: intervention subjects were more active, felt better, increased muscle strength, and showed improvements in resting heart rate and BMI. Not all data have been analyzed to date, but the preliminary results suggest that most of the investigated methods are suitable for documenting intervention effects. For the methods for which no effects were found, the question remains whether this is due to a lack of sensitivity of the method or to aspects of the study design.

  4. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    To address the problem that stress vulnerability assessment within groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process (AHP). The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards of the representative contaminants depended greatly on the chosen research emphasis, and the ranking of the three representative contaminants' hazards differed from the ranking of their corresponding properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculation results. In addition, normalizing the three properties by rank order and unifying the quantified property results can magnify or shrink the relative property characteristics of the different representative contaminants.
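
    For the AHP step mentioned above, property weights are conventionally taken as the normalized principal eigenvector of a pairwise comparison matrix; a minimal Python sketch is shown below with an invented comparison matrix (the matrix values and the property names are assumptions for illustration only).

      import numpy as np

      def ahp_weights(pairwise):
          """Return AHP priority weights as the normalized principal eigenvector."""
          values, vectors = np.linalg.eig(np.asarray(pairwise, dtype=float))
          principal = np.real(vectors[:, np.argmax(np.real(values))])
          weights = np.abs(principal)
          return weights / weights.sum()

      # Illustrative pairwise comparisons of three contaminant properties
      # (e.g., toxicity vs mobility vs degradability) on Saaty's 1-9 scale.
      A = [[1,   3,   5],
           [1/3, 1,   2],
           [1/5, 1/2, 1]]
      print(ahp_weights(A))  # approximately [0.65, 0.23, 0.12]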

  5. Method and apparatus of assessing down-hole drilling conditions

    DOEpatents

    Hall, David R.; Pixton, David S.; Johnson, Monte L.; Bartholomew, David B.; Fox, Joe

    2007-04-24

    A method and apparatus for use in assessing down-hole drilling conditions are disclosed. The apparatus includes a drill string, a plurality of sensors, a computing device, and a down-hole network. The sensors are distributed along the length of the drill string and are capable of sensing localized down-hole conditions while drilling. The computing device is coupled to at least one sensor of the plurality of sensors. The data is transmitted from the sensors to the computing device over the down-hole network. The computing device analyzes data output by the sensors and representative of the sensed localized conditions to assess the down-hole drilling conditions. The method includes sensing localized drilling conditions at a plurality of points distributed along the length of a drill string during drilling operations; transmitting data representative of the sensed localized conditions to a predetermined location; and analyzing the transmitted data to assess the down-hole drilling conditions.

  6. Stimulus Set Meaningfulness and Neurophysiological Differentiation: A Functional Magnetic Resonance Imaging Study

    PubMed Central

    Boly, Melanie; Sasai, Shuntaro; Gosseries, Olivia; Oizumi, Masafumi; Casali, Adenauer; Massimini, Marcello; Tononi, Giulio

    2015-01-01

    A meaningful set of stimuli, such as a sequence of frames from a movie, triggers a set of different experiences. By contrast, a meaningless set of stimuli, such as a sequence of ‘TV noise’ frames, triggers always the same experience—of seeing ‘TV noise’—even though the stimuli themselves are as different from each other as the movie frames. We reasoned that the differentiation of cortical responses underlying the subject’s experiences, as measured by Lempel-Ziv complexity (incompressibility) of functional MRI images, should reflect the overall meaningfulness of a set of stimuli for the subject, rather than differences among the stimuli. We tested this hypothesis by quantifying the differentiation of brain activity patterns in response to a movie sequence, to the same movie scrambled in time, and to ‘TV noise’, where the pixels from each movie frame were scrambled in space. While overall cortical activation was strong and widespread in all conditions, the differentiation (Lempel-Ziv complexity) of brain activation patterns was correlated with the meaningfulness of the stimulus set, being highest in the movie condition, intermediate in the scrambled movie condition, and minimal for ‘TV noise’. Stimulus set meaningfulness was also associated with higher information integration among cortical regions. These results suggest that the differentiation of neural responses can be used to assess the meaningfulness of a given set of stimuli for a given subject, without the need to identify the features and categories that are relevant to the subject, nor the precise location of selective neural responses. PMID:25970444
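
    Lempel-Ziv complexity counts how many new "phrases" are needed to parse a symbol sequence, so a more differentiated set of activation patterns yields a higher count. The Python sketch below applies a simple dictionary-based (LZ78-style) parse to a binarized toy signal; the median binarization and the particular LZ variant are assumptions for illustration and may differ from the study's pipeline.

      import numpy as np

      def lempel_ziv_complexity(sequence: str) -> int:
          """Count phrases in a dictionary-based (LZ78-style) parse of the sequence."""
          phrases, phrase, count = set(), "", 0
          for symbol in sequence:
              phrase += symbol
              if phrase not in phrases:   # a new phrase ends here
                  phrases.add(phrase)
                  count += 1
                  phrase = ""
          return count + (1 if phrase else 0)   # count any unfinished final phrase

      # Toy regional time course: binarize at the median before parsing (assumed step).
      rng = np.random.default_rng(1)
      signal = np.sin(np.linspace(0.0, 20.0, 200)) + rng.normal(0.0, 0.3, 200)
      binary = "".join("1" if v > np.median(signal) else "0" for v in signal)
      print(lempel_ziv_complexity(binary))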

  7. Using Empirical Article Analysis to Assess Research Methods Courses

    ERIC Educational Resources Information Center

    Bachiochi, Peter; Everton, Wendi; Evans, Melanie; Fugere, Madeleine; Escoto, Carlos; Letterman, Margaret; Leszczynski, Jennifer

    2011-01-01

    Developing students who can apply their knowledge of empirical research is a key outcome of the undergraduate psychology major. This learning outcome was assessed in two research methods courses by having students read and analyze a condensed empirical journal article. At the start and end of the semester, students in multiple sections of an…

  8. A Comparison of Assessment Methods and Raters in Product Creativity

    ERIC Educational Resources Information Center

    Lu, Chia-Chen; Luh, Ding-Bang

    2012-01-01

    Although previous studies have attempted to use different experiences of raters to rate product creativity by adopting the Consensus Assessment Method (CAT) approach, the validity of replacing CAT with another measurement tool has not been adequately tested. This study aimed to compare raters with different levels of experience (expert ves.…

  9. 50 CFR 270.18 - Method of imposing assessments.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 50 Wildlife and Fisheries 11 2013-10-01 2013-10-01 false Method of imposing assessments. 270.18 Section 270.18 Wildlife and Fisheries NATIONAL MARINE FISHERIES SERVICE, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE FISH AND SEAFOOD PROMOTION SPECIES-SPECIFIC SEAFOOD MARKETING...

  10. ANALYZING SHORT CUT METHODS FOR LIFE CYCLE ASSESSMENT INVENTORIES

    EPA Science Inventory

    Work in progress at the U.S. EPA's National Risk Management Research Laboratory is developing methods for quickly, easily, and inexpensively developing Life Cycle Assessment (LCA) inventories. An LCA inventory represents the inputs and outputs from processes, including fuel and ...

  11. River Pollution: Part II. Biological Methods for Assessing Water Quality.

    ERIC Educational Resources Information Center

    Openshaw, Peter

    1984-01-01

    Discusses methods used in the biological assessment of river quality and such indicators of clean and polluted waters as the Trent Biotic Index, Chandler Score System, and species diversity indexes. Includes a summary of a river classification scheme based on quality criteria related to water use. (JN)

  12. Myths and Misconceptions about Using Qualitative Methods in Assessment

    ERIC Educational Resources Information Center

    Harper, Shaun R.; Kuh, George D.

    2007-01-01

    The value of qualitative assessment approaches has been underestimated primarily because they are often juxtaposed against long-standing quantitative traditions and the widely accepted premise that the best research produces generalizable and statistically significant findings. Institutional researchers avoid qualitative methods for at least three…

  13. [A method for assessing lipid peroxidation in a biological substrate].

    PubMed

    Shvetsova, M M; Zatolokin, V D; Kaznacheev, N N; Luk'ianchikov, G F

    1990-01-01

    A new method for the assessment of lipid peroxidation permits simultaneous assays of total lipids and formed malonic dialdehyde in the examined chloroform substrate; this will help define the criteria of cellular membrane destruction in various biological media of the body.

  14. A combined scoring method to assess behavioral recovery after mouse spinal cord injury.

    PubMed

    Pajoohesh-Ganji, Ahdeah; Byrnes, Kimberly R; Fatemi, Gita; Faden, Alan I

    2010-06-01

    Although the rat has been the predominant rodent used to investigate the pathophysiology and treatment of experimental spinal cord injury (SCI), the increasing availability of transgenic animals has led to greater use of mouse models. However, behavioral assessment after SCI in mice has been less extensively investigated than in rats and few studies have critically examined the correlation between behavioral tests and injury severity or tissue damage. The present study characterized hindlimb functional performance in C57Bl/6 mice after contusion SCI at T9 using the weight drop method. A number of behavioral tests were examined with regard to variability, inter-rater reliability, and correlation to injury severity and white matter sparing. Mice were subjected to sham, mild-moderate or moderate-severe SCI and evaluated at day 1 and weekly up to 42 days using the Basso mouse scale (BMS), ladder climb, grid walk, inclined plane, plantar test and tail flick tests. The ladder climb and grid walk tests proved sub-optimal for use in mice, but modifications enhanced their predictive value with regard to injury severity. The inclined plane, plantar test and tail flick test showed far too much variability to have meaningful predictive value. The BMS score proved reliable, as previously reported, but a combined score (BLG) using BMS, Ladder climb (modified), and Grip walk (modified grid walk) provided better separation across injury levels and less variability than the individual tests. These data provide support for use of a combined scoring method to follow motor recovery in mice after contusion SCI. PMID:20188770
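
    The abstract does not specify how the BMS, ladder climb and grip walk sub-scores are merged into the BLG score; one generic possibility, sketched below in Python, rescales each sub-score to a common range and sums them with equal weight (the sub-score ranges and the equal weighting are assumptions, not the published scoring rule).

      def rescale(value, lo, hi):
          """Map a raw sub-score onto 0-1 given its possible range (assumed ranges)."""
          return (value - lo) / (hi - lo)

      def combined_blg(bms, ladder, grip,
                       bms_range=(0, 9), ladder_range=(0, 20), grip_range=(0, 20)):
          """Illustrative combined score from BMS, ladder climb and grip walk sub-tests.

          The ranges and equal weighting are placeholders; the published BLG
          score should be computed as described by the original authors.
          """
          return (rescale(bms, *bms_range)
                  + rescale(ladder, *ladder_range)
                  + rescale(grip, *grip_range))

      print(combined_blg(bms=4, ladder=12, grip=9))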

  15. Improving Educational Assessment: A Computer-Adaptive Multiple Choice Assessment Using NRET as the Scoring Method

    ERIC Educational Resources Information Center

    Sie Hoe, Lau; Ngee Kiong, Lau; Kian Sam, Hong; Bin Usop, Hasbee

    2009-01-01

    Assessment is central to any educational process. Number Right (NR) scoring method is a conventional scoring method for multiple choice items, where students need to pick one option as the correct answer. One point is awarded for the correct response and zero for any other responses. However, it has been heavily criticized for guessing and failure…
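
    As stated, Number Right scoring simply awards one point for each keyed response and zero otherwise; a minimal Python sketch is below (the answer key and responses are invented for illustration).

      def number_right_score(responses, answer_key):
          """Number Right (NR) scoring: 1 point per correct response, 0 otherwise."""
          return sum(1 for given, correct in zip(responses, answer_key) if given == correct)

      # Illustrative 5-item multiple choice test.
      answer_key = ["B", "D", "A", "C", "B"]
      responses  = ["B", "A", "A", "C", "D"]
      print(number_right_score(responses, answer_key))  # -> 3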

  16. A comparative study of ship hull structures fatigue assessment methods

    NASA Astrophysics Data System (ADS)

    Petinov, Sergei V.; Polezhayeva, Helena A.; Yermolayeva, Natalya S.

    1992-07-01

    Several methods of fatigue assessment for ship hull structures are compared. The analysis focuses on fatigue problems of hull structures concerning the evaluation and formulation of the design state of fatigue damage of a structure, and the adequacy of the methods and databases for the purposes of the analyses. To illustrate the discussion, examples of calculating the allowable nominal stress for a given fatigue life are presented for a bottom frame web slot and for a bottom longitudinal-to-transverse-bulkhead bracket connection in the case of a container ship. The low-cycle (local strain) method is regarded as the most advantageous at present in almost all practical problems connected to fatigue.

  17. Total System Performance Assessment - License Application Methods and Approach

    SciTech Connect

    J. McNeish

    2003-12-08

    ''Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach'' provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach is responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the ''Yucca Mountain Review Plan'' (YMRP), ''Final Report'' (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document.

  18. Total System Performance Assessment-License Application Methods and Approach

    SciTech Connect

    J. McNeish

    2002-09-13

    ''Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach'' provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach is responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issue (KTI) agreements, the ''Yucca Mountain Review Plan'' (CNWRA 2002 [158449]), and 10 CFR Part 63. This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are utilized in this document.

  19. Meaningful learning: theoretical support for concept-based teaching.

    PubMed

    Getha-Eby, Teresa J; Beery, Theresa; Xu, Yin; O'Brien, Beth A

    2014-09-01

    Novice nurses’ inability to transfer classroom knowledge to the bedside has been implicated in adverse patient outcomes, including death. Concept-based teaching is a pedagogy found to improve knowledge transfer. Concept-based teaching emanates from a constructivist paradigm of teaching and learning and can be implemented most effectively when the underlying theory and principles are applied. Ausubel’s theory of meaningful learning and its construct of substantive knowledge integration provides a model to help educators to understand, implement, and evaluate concept-based teaching. Contemporary findings from the fields of cognitive psychology, human development, and neurobiology provide empirical evidence of the relationship between concept-based teaching, meaningful learning, and knowledge transfer. This article describes constructivist principles and meaningful learning as they apply to nursing pedagogy.

  20. In Vivo Methods for the Assessment of Topical Drug Bioavailability

    PubMed Central

    Herkenne, Christophe; Alberti, Ingo; Naik, Aarti; Kalia, Yogeshvar N.; Mathy, François-Xavier; Préat, Véronique

    2007-01-01

    This paper reviews some current methods for the in vivo assessment of local cutaneous bioavailability in humans after topical drug application. After an introduction discussing the importance of local drug bioavailability assessment and the limitations of model-based predictions, the focus turns to the relevance of experimental studies. The available techniques are then reviewed in detail, with particular emphasis on the tape stripping and microdialysis methodologies. Other less developed techniques, including the skin biopsy, suction blister, follicle removal and confocal Raman spectroscopy techniques are also described. PMID:17985216

  1. Errors associated with three methods of assessing respirator fit.

    PubMed

    Coffey, Christopher C; Lawrence, Robert B; Zhuang, Ziqing; Duling, Matthew G; Campbell, Donald L

    2006-01-01

    Three fit test methods (Bitrex, saccharin, and TSI PortaCount Plus with the N95-Companion) were evaluated for their ability to identify wearers of respirators that do not provide adequate protection during a simulated workplace test. Thirty models of NIOSH-certified N95 half-facepiece respirators (15 filtering-facepiece models and 15 elastomeric models) were tested by a panel of 25 subjects using each of the three fit testing methods. Fit testing results were compared to 5th percentiles of simulated workplace protection factors. Alpha errors (the chance of failing a fit test in error) for all 30 respirators were 71% for the Bitrex method, 68% for the saccharin method, and 40% for the Companion method. Beta errors (the chance of passing a fit test in error) for all 30 respirator models combined were 8% for the Bitrex method, 8% for the saccharin method, and 9% for the Companion method. The three fit test methods had different error rates when assessed with filtering facepieces and when assessed with elastomeric respirators. For example, beta errors for the three fit test methods assessed with the 15 filtering facepiece respirators were < or = 5% but ranged from 14% to 21% when assessed with the 15 elastomeric respirators. To predict what happens in a realistic fit testing program, the data were also used to estimate the alpha and beta errors for a simulated respiratory protection program in which a wearer is given up to three trials with one respirator model to pass a fit test before moving onto another model. A subject passing with any of the three methods was considered to have passed the fit test program. The alpha and beta errors for the fit testing in this simulated respiratory protection program were 29% and 19%, respectively. Thus, it is estimated, under the conditions of the simulation, that roughly one in three respirator wearers receiving the expected reduction in exposure (with a particular model) will fail to pass (with that particular model), and that
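
    The alpha and beta error rates quoted above are conditional proportions of fit-test outcomes against the simulated-workplace benchmark; a small Python sketch of that tabulation is shown below with invented records (the data and pass/fail criteria are assumptions for illustration).

      def fit_test_error_rates(records):
          """Compute alpha and beta errors from (passed_fit_test, adequate_protection) pairs.

          alpha: fraction of adequately protecting respirator/wearer pairs that failed the fit test.
          beta:  fraction of inadequately protecting pairs that nevertheless passed the fit test.
          """
          adequate = [passed for passed, ok in records if ok]
          inadequate = [passed for passed, ok in records if not ok]
          alpha = sum(1 for passed in adequate if not passed) / len(adequate)
          beta = sum(1 for passed in inadequate if passed) / len(inadequate)
          return alpha, beta

      # Invented records: (passed fit test?, adequate protection in simulated workplace?)
      records = [(True, True), (False, True), (False, True), (True, False), (False, False)]
      print(fit_test_error_rates(records))  # -> (0.666..., 0.5)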

  2. Analytical resource assessment method for continuous (unconventional) oil and gas accumulations - The "ACCESS" Method

    USGS Publications Warehouse

    Crovelli, Robert A.; revised by Charpentier, Ronald R.

    2012-01-01

    The U.S. Geological Survey (USGS) periodically assesses petroleum resources of areas within the United States and the world. The purpose of this report is to explain the development of an analytic probabilistic method and spreadsheet software system called Analytic Cell-Based Continuous Energy Spreadsheet System (ACCESS). The ACCESS method is based upon mathematical equations derived from probability theory. The ACCESS spreadsheet can be used to calculate estimates of the undeveloped oil, gas, and NGL (natural gas liquids) resources in a continuous-type assessment unit. An assessment unit is a mappable volume of rock in a total petroleum system. In this report, the geologic assessment model is defined first, the analytic probabilistic method is described second, and the spreadsheet ACCESS is described third. In this revised version of Open-File Report 00-044, the text has been updated to reflect modifications that were made to the ACCESS program. Two versions of the program are added as appendixes.

  3. Assessing the Impact of Tutorial Services

    ERIC Educational Resources Information Center

    Ticknor, Cindy S.; Shaw, Kimberly A.; Howard, Timothy

    2014-01-01

    Many institutions struggle to develop a meaningful way to assess the effectiveness of drop-in tutorial services provided to students. This article discusses the development of a data collection system based on a visitor sign-in system that proved to be an efficient method of gathering assessment data, including frequency of visits, end-of-course…

  4. Application of geosites assessment method in geopark context

    NASA Astrophysics Data System (ADS)

    Martin, Simon; Perret, Amandine; Renau, Pierre; Cartier-Moulin, Olivier; Regolini-Bissig, Géraldine

    2014-05-01

    The regional natural park of the Monts d'Ardèche (Ardèche and Haute-Loire departments, France) is a candidate to the European Geopark Network (EGN) in 2014. The area has a wide geodiversity - with rocks from the Cambrian to the Pleistocene (basalt flows) - and interesting features like phonolitic protrusions, maars and granite boulder fields. Around 115 sites were selected and documented through a geosites inventory carried out in the territory. This pre-selection was supervised by the Ardèche Geological Society and is therefore based on expert advice. In the context of the EGN candidature, these potential geosites were assessed with a simplified method. It follows the spirit of the method from the University of Lausanne (Reynard et al., 2007) and its recent developments: assessment of the scientific (central) value and of a set of additional values (ecological and cultural). As this assessment aimed to offer a management tool to the future geopark's authorities, a special focus was given to management aspects. In particular, the opportunities to use each site for education (from schools to universities) and for tourism, as well as the existence of protection measures and of interpretive facilities, were documented and assessed. Several interesting conclusions may be drawn from this case study: (1) expert assessment is effective when it is based on a pre-existing inventory that is well structured and documented; (2) even simplified, an assessment method is a very useful framework for expert assessment, as it focuses the discussions on the most important points and helps to balance the assessment; (3) whereas the inventory can be extensively detailed and partly academic, the assessment in the geopark context is objective-driven in order to answer management needs. The place of the geosites assessment among the three key players of a geopark construction process (i.e. the territory's managers, local geoscientists and the EGN) is also discussed. This place can be defined as the point of consensus of needs

  5. An Observational Assessment Method for Aging Laboratory Rats

    PubMed Central

    Phillips, Pamela M; Jarema, Kimberly A; Kurtz, David M; MacPhail, Robert C

    2010-01-01

    The rapid growth of the aging human population highlights the need for laboratory animal models to study the basic biologic processes of aging and susceptibility to disease, drugs, and environmental pollutants. Methods are needed to evaluate the health of aging animals over time, particularly methods for efficiently monitoring large research colonies. Here we describe an observational assessment method that scores appearance, posture, mobility, and muscle tone on a 5-point scale that can be completed in about 1 min. A score of 1 indicates no deterioration, whereas a score of 5 indicates severe deterioration. Tests were applied to male Brown Norway rats between 12 and 36 mo of age (n = 32). The rats were participating concurrently in experiments on the behavioral effects of intermittent exposure (approximately every 4 mo) to short-acting environmental chemicals. Results demonstrated that aging-related signs of deterioration did not appear before 18 mo of age. Assessment scores and variability then increased with age. Body weights increased until approximately 24 mo, then remained stable, but decreased after 31 mo for the few remaining rats. The incidence of death increased slightly from 20 to 28 mo of age and then rose sharply; median survival age was approximately 30 mo, with a maximum of 36 mo. The results indicate that our observational assessment method supports efficient monitoring of the health of aging rats and may be useful in studies on susceptibility to diseases, drugs, and toxicants during old age. PMID:21205442

  6. Climate change and occupational heat stress: methods for assessment

    PubMed Central

    Holmér, Ingvar

    2010-01-01

    Background Presumed effects of global warming on occupational heat stress aggravate conditions in many parts of the world, in particular in developing countries. In order to assess and evaluate conditions, heat stress must be described and measured correctly. Objective Assessment of heat stress using internationally recognized methods. Design Two such methods are wet bulb globe temperature (WBGT; ISO 7243) and predicted heat strain (PHS; ISO 7933). Both methods measure relevant climatic factors and provide recommendations for limit values in terms of time when heat stress becomes imminent. The WBGT as a heat stress index is empirical and widely recognized. It requires, however, special sensors for the climatic factors that can introduce significant measurement errors if prescriptions in ISO 7243 are not followed. The PHS (ISO 7933) is based on climatic factors that can easily be measured with traditional instruments. It evaluates the conditions for heat balance in a more rational way and it applies equally to all combinations of climates. Results Analyzing similar climatic conditions with WBGT and PHS indicates that WBGT provides a more conservative assessment philosophy that allows much shorter working time than predicted with PHS. Conclusions PHS prediction of physiological strain appears to fit better with published data from warm countries. Both methods should be used and validated more extensively worldwide in order to give reliable and accurate information about the actual heat stress. PMID:21139697
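
    For reference, the WBGT index of ISO 7243 is a fixed linear weighting of the natural wet-bulb, globe and air temperatures; the Python sketch below uses the commonly published outdoor (solar load) and indoor forms of that weighting, which should be checked against the standard before use.

      def wbgt_outdoor(t_nwb, t_globe, t_air):
          """WBGT with solar load: 0.7*Tnwb + 0.2*Tg + 0.1*Ta (commonly published form)."""
          return 0.7 * t_nwb + 0.2 * t_globe + 0.1 * t_air

      def wbgt_indoor(t_nwb, t_globe):
          """WBGT without solar load: 0.7*Tnwb + 0.3*Tg."""
          return 0.7 * t_nwb + 0.3 * t_globe

      # Example: a warm outdoor workplace (temperatures in degrees Celsius).
      print(round(wbgt_outdoor(t_nwb=25.0, t_globe=40.0, t_air=32.0), 1))  # -> 28.7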

  7. How to assess the quality of your analytical method?

    PubMed

    Topic, Elizabeta; Nikolac, Nora; Panteghini, Mauro; Theodorsson, Elvar; Salvagno, Gian Luca; Miler, Marijana; Simundic, Ana-Maria; Infusino, Ilenia; Nordin, Gunnar; Westgard, Sten

    2015-10-01

    Laboratory medicine is amongst the fastest growing fields in medicine, crucial in diagnosis, support of prevention and in the monitoring of disease for individual patients and for the evaluation of treatment for populations of patients. Therefore, high quality and safety in laboratory testing has a prominent role in high-quality healthcare. Applied knowledge and competencies of professionals in laboratory medicine increases the clinical value of laboratory results by decreasing laboratory errors, increasing appropriate utilization of tests, and increasing cost effectiveness. This collective paper provides insights into how to validate the laboratory assays and assess the quality of methods. It is a synopsis of the lectures at the 15th European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) Continuing Postgraduate Course in Clinical Chemistry and Laboratory Medicine entitled "How to assess the quality of your method?" (Zagreb, Croatia, 24-25 October 2015). The leading topics to be discussed include who, what and when to do in validation/verification of methods, verification of imprecision and bias, verification of reference intervals, verification of qualitative test procedures, verification of blood collection systems, comparability of results among methods and analytical systems, limit of detection, limit of quantification and limit of decision, how to assess the measurement uncertainty, the optimal use of Internal Quality Control and External Quality Assessment data, Six Sigma metrics, performance specifications, as well as biological variation. This article, which continues the annual tradition of collective papers from the EFLM continuing postgraduate courses in clinical chemistry and laboratory medicine, aims to provide further contributions by discussing the quality of laboratory methods and measurements and, at the same time, to offer continuing professional development to the attendees.

  8. Meaningful Use Attestations among US Hospitals: The Growing Rural-Urban Divide.

    PubMed

    Sandefer, Ryan H; Marc, David T; Kleeberg, Paul

    2015-01-01

    The purpose of this study was to assess EHR Incentive Program attestations of eligible US hospitals across geography and hospital type. The proportions of attestations were compared between metropolitan, micropolitan, and rural hospitals and by whether a hospital was critical access or prospective payment system. From 2011 until December 2013, rural and critical access hospitals were attesting to meaningful use and receiving federal incentive payments at a significantly lower proportion than their urban counterparts. The data suggest that the digital divide between urban and rural hospitals that are adopting electronic health records and using the technology effectively is widening. These findings illustrate that the needs of rural hospitals currently and into the future are different than urban hospitals, and the meaningful use program does not appear to provide the resources needed to propel these rural hospitals forward. PMID:26755900

  9. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, amongst others, resulting in a ranking of tasks and subsequently jobs. DREAM consists of an inventory and evaluation part. Two examples of dermal exposure of workers of a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outside clothing layer as well as on skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative, dermal exposure assessment. PMID:12505908

  10. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    SciTech Connect

    Prinn, Ronald; Webster, Mort

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.

  11. Assessing Security of Supply: Three Methods Used in Finland

    NASA Astrophysics Data System (ADS)

    Sivonen, Hannu

    Public Private Partnership (PPP) has an important role in securing supply in Finland. Three methods are used in assessing the level of security of supply. First, in national expert groups, a linear mathematical model has been used. The model is based on interdependency estimates. It ranks societal functions or its more detailed components, such as items in the food supply chain, according to the effect and risk pertinent to the interdependencies. Second, the security of supply is assessed in industrial branch committees (clusters and pools) in the form of indicators. The level of security of supply is assessed against five generic factors (dimension 1) and tens of business branch specific functions (dimension 2). Third, in two thousand individual critical companies, the maturity of operational continuity management is assessed using Capability Maturity Model (CMM) in an extranet application. The pool committees and authorities obtain an anonymous summary. The assessments are used in allocating efforts for securing supply. The efforts may be new instructions, training, exercising, and in some cases, investment and regulation.

  12. Using the statistical analysis method to assess the landslide susceptibility

    NASA Astrophysics Data System (ADS)

    Chan, Hsun-Chuan; Chen, Bo-An; Wen, Yo-Ting

    2015-04-01

    This study assessed the landslide susceptibility of the Jing-Shan River upstream watershed in central Taiwan. The landslide inventories from typhoons Toraji in 2001, Mindulle in 2004, Kalmaegi and Sinlaku in 2008, Morakot in 2009, and the 0719 rainfall event in 2011, established by the Taiwan Central Geological Survey, were used as landslide data. The study assessed landslide susceptibility using different statistical methods, including logistic regression, the instability index method and the support vector machine (SVM). After evaluation, elevation, slope, slope aspect, lithology, terrain roughness, slope roughness, plan curvature, profile curvature, total curvature, and average rainfall were chosen as the landslide factors. The validity of the three established models was further examined using the receiver operating characteristic curve. The logistic regression results showed that terrain roughness and slope roughness had the strongest impact on the susceptibility value, whereas the instability index method showed that terrain roughness and lithology had the strongest impact. The instability index method may lead to underestimation near the river side, and it raises a potential issue concerning the number of factor classes: increasing the number of classes may cause an excessive coefficient of variation of the factor, while decreasing it may place a large range of nearby cells into the same susceptibility level. Finally, the receiver operating characteristic curve was used to discriminate among the three models. SVM is the preferred method for assessing landslide susceptibility, and its performance in recognizing the medium-high and high susceptibility classes was close to that of logistic regression.
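
    A minimal Python sketch of the statistical comparison described above - fitting logistic regression and an SVM to landslide factors and comparing them by the area under the receiver operating characteristic curve - is shown below using scikit-learn; the synthetic factor data and model settings are illustrative assumptions, not the study's data.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      # Synthetic stand-in for landslide factors (slope, roughness, rainfall, ...).
      rng = np.random.default_rng(42)
      X = rng.normal(size=(500, 6))
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)
      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

      for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                          ("SVM", SVC(probability=True))]:
          model.fit(X_train, y_train)
          scores = model.predict_proba(X_test)[:, 1]
          print(name, "ROC AUC:", round(roc_auc_score(y_test, scores), 3))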

  13. Methods for assessing the effects of dehydration on cognitive function.

    PubMed

    Lieberman, Harris R

    2012-11-01

    Studying the effects of dehydration on cognitive function presents a variety of unique and difficult challenges to investigators. These challenges, which are addressed in this article, can be divided into three general categories: 1) choosing an appropriate method of generating a consistent level of dehydration; 2) determining and effectively employing appropriate and sensitive measures of cognitive state; and 3) adequately controlling the many confounding factors that interfere with assessment of cognitive function. The design and conduct of studies on the effects of dehydration on cognitive function should carefully consider various methodological issues, and investigators should carefully weigh the benefits and disadvantages of particular methods and procedures.

  14. Methods for assessing nutritional status of patients with renal failure.

    PubMed

    Blumenkrantz, M J; Kopple, J D; Gutman, R A; Chan, Y K; Barbour, G L; Roberts, C; Shen, F H; Gandhi, V C; Tucker, C T; Curtis, F K; Coburn, J W

    1980-07-01

    Since wasting and malnutrition are common problems in patients with renal failure, it is important to develop techniques for the longitudinal assessment of nutritional status. This paper reviews available methods for assessing nutritional status; their possible limitations when applied to uremic patients are discussed. If carefully done, dietary intake can be estimated by recall interviews augmented with dietary diaries. Also, in a stable patient with chronic renal failure, the serum urea nitrogen (N)/creatinine ratio and the rate of urea N appearance reflect dietary protein intake. A comparison of N intake and urea N appearance will give an estimate of N balance. Anthropometric parameters such as the relationship between height and weight, thickness of subcutaneous skinfolds, and midarm muscle circumference are simple methods for evaluating body composition. Other methods for assessing body composition, such as densitometry and total body potassium, may not be readily applicable in patients with renal failure. More traditional biochemical estimates of nutritional status such as serum protein, albumin, transferrin, and selected serum complement determinations show that abnormalities are common among uremic patients. Certain anthropometric and biochemical measurements of nutritional status are abnormal in chronically uremic patients who appear to be particularly robust; thus, factors other than altered nutritional intake may lead to abnormal parameters in such patients. Serial monitoring of selected nutritional parameters in the same individual may improve the sensitivity of these measurements to detect changes. Standards for measuring nutritional status are needed for patients with renal failure so that realistic goals for optimal body nutriture can be established.
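
    As noted above, nitrogen balance can be roughly estimated by comparing nitrogen intake with urea nitrogen appearance; the Python sketch below does exactly that with an assumed constant for non-urea nitrogen losses, which in clinical use must be replaced by a validated estimate for the patient population.

      def nitrogen_balance(n_intake_g, urea_n_appearance_g, non_urea_n_losses_g=2.0):
          """Rough nitrogen balance (g/day) = intake - (urea N appearance + other losses).

          The non-urea loss term is a placeholder constant for illustration only;
          a validated estimate should be used clinically.
          """
          return n_intake_g - (urea_n_appearance_g + non_urea_n_losses_g)

      # Example: protein intake of 60 g/day corresponds to roughly 60 / 6.25 = 9.6 g N/day.
      print(round(nitrogen_balance(n_intake_g=9.6, urea_n_appearance_g=6.5), 1))  # -> 1.1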

  15. Assessment of mesoscopic particle-based methods in microfluidic geometries

    NASA Astrophysics Data System (ADS)

    Zhao, Tongyang; Wang, Xiaogong; Jiang, Lei; Larson, Ronald G.

    2013-08-01

    We assess the accuracy and efficiency of two particle-based mesoscopic simulation methods, namely, Dissipative Particle Dynamics (DPD) and Stochastic Rotation Dynamics (SRD) for predicting a complex flow in a microfluidic geometry. Since both DPD and SRD use soft or weakly interacting particles to carry momentum, both methods contain unavoidable inertial effects and unphysically high fluid compressibility. To assess these effects, we compare the predictions of DPD and SRD for both an exact Stokes-flow solution and nearly exact solutions at finite Reynolds numbers from the finite element method for flow in a straight channel with periodic slip boundary conditions. This flow represents a periodic electro-osmotic flow, which is a complex flow with an analytical solution for zero Reynolds number. We find that SRD is roughly ten-fold faster than DPD in predicting the flow field, with better accuracy at low Reynolds numbers. However, SRD has more severe problems with compressibility effects than does DPD, which limits the Reynolds numbers attainable in SRD to around 25-50, while DPD can achieve Re higher than this before compressibility effects become too large. However, since the SRD method runs much faster than DPD does, we can afford to enlarge the number of grid cells in SRD to reduce the fluid compressibility at high Reynolds number. Our simulations provide a method to estimate the range of conditions for which SRD or DPD is preferable for mesoscopic simulations.

  16. Subjective video quality assessment methods for recognition tasks

    NASA Astrophysics Data System (ADS)

    Ford, Carolyn G.; McFarland, Mark A.; Stange, Irena W.

    2009-02-01

    To develop accurate objective measurements (models) for video quality assessment, subjective data is traditionally collected via human subject testing. The ITU has a series of Recommendations that address methodology for performing subjective tests in a rigorous manner. These methods are targeted at the entertainment application of video. However, video is often used for many applications outside of the entertainment sector, and generally this class of video is used to perform a specific task. Examples of these applications include security, public safety, remote command and control, and sign language. For these applications, video is used to recognize objects, people or events. The existing methods, developed to assess a person's perceptual opinion of quality, are not appropriate for task-based video. The Institute for Telecommunication Sciences, under a program from the Department of Homeland Security and the National Institute for Standards and Technology's Office of Law Enforcement, has developed a subjective test method to determine a person's ability to perform recognition tasks using video, thereby rating the quality according to the usefulness of the video quality within its application. This new method is presented, along with a discussion of two examples of subjective tests using this method.

  17. Numerical methods for assessment of the ship's pollutant emissions

    NASA Astrophysics Data System (ADS)

    Jenaru, A.; Acomi, N.

    2016-08-01

    The maritime transportation sector constitutes a source of atmospheric pollution. To avoid or minimize ships' pollutant emissions, the first step is to assess them. Two methods for estimating ships' emissions are proposed in this paper. These methods prove their utility, from a practical perspective, for shipboard and shore-based management personnel. The methods were demonstrated for a product tanker on which a permanent monitoring system for pollutant emissions had previously been fitted. The values of the pollutants in the exhaust gas were determined for the ship at shipyard delivery and were used as a starting point. Based on these values, the paper aims at a numerical assessment of the ship's emissions in order to determine ways of avoiding environmental pollution: an analytical method for determining the concentrations of the exhaust gas components, using the computation program MathCAD, and a graphical method for determining the concentrations of the exhaust gas components, using variation diagrams of the parameters into which the results of the onboard measurements were introduced after applying pertinent correction factors. The results should be regarded as a supporting tool during the decision-making process linked to the reduction of ships' pollutant emissions.

  18. Connecting the Dots: The Decline in Meaningful Learning

    ERIC Educational Resources Information Center

    Stewart, Kenneth; Kilmartin, Christopher

    2014-01-01

    The authors describe cross-decades changes in the achievement attitudes and behaviors of average U. S. undergraduates that parallel the declines in meaningful learning reported by Arum and colleagues. Comparisons of pre-1987 and 2004-8 students on seven achievement-predictive measures revealed that (a) average 2004-8 undergraduates scored…

  19. Seeking Meaningful School Reform: Characteristics of Inspired Schools

    ERIC Educational Resources Information Center

    Michael, Christine N.; Young, Nicholas D.

    2005-01-01

    The purpose of this study was two-fold: (1) to gain an understanding of how senior school administrators define inspired public schools; and (2) to discern the characteristics of inspired schools to guide meaningful school improvement efforts. Twenty-nine senior leaders--school superintendents and assistant superintendents--from across New England…

  20. The Role of Meaningful Dialogue in Early Childhood Education Leadership

    ERIC Educational Resources Information Center

    Deakins, Eric

    2007-01-01

    Action research was used to study the effectiveness of Learning Organisation and Adaptive Enterprise theories for promoting organisation-wide learning and creating a more effective early childhood education organisation. This article describes the leadership steps taken to achieve shared vision via meaningful dialogue between board, management and…

  1. Parent Meetings: Creative Ways to Make Them Meaningful

    ERIC Educational Resources Information Center

    Stephens, Karen

    2007-01-01

    This article shares a bevy of ways to make parent meetings successful--in a meaningful way. It presents some guidelines to keep in mind when planning parent meetings: (1) survey parents on what they want to learn about or discuss regarding childrearing or family life; (2) include parents in planning meetings; (3) find motivating ways to get…

  2. Cache-Cache Comparison for Supporting Meaningful Learning

    ERIC Educational Resources Information Center

    Wang, Jingyun; Fujino, Seiji

    2015-01-01

    The paper presents a meaningful discovery learning environment called "cache-cache comparison" for a personalized learning support system. The processing of seeking hidden relations or concepts in "cache-cache comparison" is intended to encourage learners to actively locate new knowledge in their knowledge framework and check…

  3. Water Habitat Study: Prediction Makes It More Meaningful.

    ERIC Educational Resources Information Center

    Glasgow, Dennis R.

    1982-01-01

    Suggests a teaching strategy for water habitat studies to help students make a meaningful connection between physiochemical data (dissolved oxygen content, pH, and water temperature) and biological specimens they collect. Involves constructing a poster and using it to make predictions. Provides sample poster. (DC)

  4. 42 CFR 495.210 - Meaningful EHR user attestation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Meaningful EHR user attestation. 495.210 Section 495.210 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION STANDARDS FOR THE ELECTRONIC HEALTH RECORD...

  5. 42 CFR 495.210 - Meaningful EHR user attestation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... INCENTIVE PROGRAM Requirements Specific to Medicare Advantage (MA) Organizations § 495.210 Meaningful EHR user attestation. (a) Qualifying MA organizations are required to attest, in a form and manner specified by CMS, that each qualifying MA EP and qualifying MA-affiliated eligible hospitals is a...

  6. Kilimanjaro: A Case of Meaningful Adventure and Service Learning Abroad

    ERIC Educational Resources Information Center

    Cavanaugh, Cathy; Gajer, Ewa; Mayberry, John; O'Connor, Brendan; Hargis, Jace

    2015-01-01

    This qualitative evaluation explored how female undergraduate students developed an understanding of themselves and the broader world as a result of an adventure and service learning experience in Tanzania, Africa. The project built upon theoretical frameworks regarding meaningful learning--active, constructive, intentional, and authentic--and…

  7. 'Meaningful use' hoists hospital IT to next level.

    PubMed

    Weinstock, Matthew; Hoppszallern, Suzanna

    2010-07-01

    The winners of H & HN's annual survey of information technology use among hospitals and health systems. Plus: the Most Wired-Small and Rural, Most Improved and Most Wireless. And we consider how leaders of top IT hospitals plan to make "meaningful use" a guide for improvement.

  8. A Heuristic Study of Religious Spirituality and Meaningful Work

    ERIC Educational Resources Information Center

    Kennedy, Keight Tucker

    2016-01-01

    Spirituality in the workplace has received increased focus over the past two decades. This heuristic study examined how religious spirituality informs and/or influences individual perceptions of meaningful work experiences. A literature review on the subject found a dearth of research. The primary research question was the following: What is the…

  9. Comprehension for What? Preparing Students for Their Meaningful Future

    ERIC Educational Resources Information Center

    Conley, Mark W.; Wise, Antoinette

    2011-01-01

    Researchers, policymakers, and educators face a daunting task these days concerning literacy education for the here and now and literacy for the future. Even though one clings to the romantic notion that education provides the building blocks in a straight line to a meaningful future, the reality is that mixed goals and instructional messages…

  10. Attributes of Meaningful Learning Experiences in an Outdoor Education Program

    ERIC Educational Resources Information Center

    Taniguchi, Stacy T.; Freeman, Patti A.; Richards, A. LeGrand

    2005-01-01

    This phenomenological study sought to identify the attributes of meaningful learning experiences as found in an outdoor education program. Thirteen students in the Wilderness Writing Program at Brigham Young University were the sample of this study. Their participation in outdoor recreational activities and their reflections about their…

  11. Operating Room Delays: Meaningful Use in Electronic Health Record.

    PubMed

    Van Winkle, Rachelle A; Champagne, Mary T; Gilman-Mays, Meri; Aucoin, Julia

    2016-06-01

    Perioperative areas are the most costly to operate and account for more than 40% of expenses. The high costs prompted one organization to analyze surgical delays through a retrospective review of their new electronic health record. Electronic health records have made it easier to access and aggregate clinical data; 2123 operating room cases were analyzed. Implementing a new electronic health record system is complex; inaccurate data and poor implementation can introduce new problems. Validating the electronic health record development processes determines the ease of use and the user interface, specifically related to user compliance with the intent of the electronic health record development. The revalidation process after implementation determines if the intent of the design was fulfilled and data can be meaningfully used. In this organization, the data fields completed through automation provided quantifiable, meaningful data. However, data fields completed by staff that required subjective decision making resulted in incomplete data nearly 24% of the time. The ease of use was further complicated by 490 permutations (combinations of delay types and reasons) that were built into the electronic health record. Operating room delay themes emerged notwithstanding the significant complexity of the electronic health record build; however, improved data accuracy would enhance meaningful data collection and enable a more accurate root cause analysis of operating room delays. Accurate and meaningful use of data affords a more reliable approach to quality, safety, and cost-effectiveness initiatives. PMID:27046388

  12. Facilitating Meaningful Discussion Groups in the Primary Grades

    ERIC Educational Resources Information Center

    Moses, Lindsey; Ogden, Meridith; Kelly, Laura Beth

    2015-01-01

    This Teaching Tips describes a yearlong process of facilitating meaningful discussion groups about literature with first-grade students in an urban Title I school. At the beginning of the year, the teacher provided explicit instruction in speaking and listening skills to support students with the social skills needed for thoughtful discussion. She…

  13. Increasing Meaningful Assistive Technology Use in the Classrooms

    ERIC Educational Resources Information Center

    Connor, Cynthia; Beard, Lawrence A.

    2015-01-01

    Although personal technology is consistently used by students and teachers, meaningful use of technology for instruction may not be feasible without providing teachers specific training and support. One university is providing workshops, feedback through coursework, and hands-on training to teacher candidates and local area teachers. In addition,…

  14. Creating Meaningful Inquiry in Inclusive Classrooms: Practitioners' Stories of Research

    ERIC Educational Resources Information Center

    Jones, Phyllis, Ed.; Whitehurst, Teresa, Ed.; Egerton, Jo, Ed.

    2012-01-01

    In recent years, the concept of teachers as researchers in both special and mainstream school settings has become part of our everyday language. Whilst many educational practitioners will see the need for research within their setting, many may not be familiar with the technical elements they believe are required. "Creating Meaningful Inquiry in…

  15. Categorizing Drugs and Drug-Taking: A More Meaningful Approach.

    ERIC Educational Resources Information Center

    Gold, Robert S.; Duncan, David F.

    This document reviews various definitions of the nature and classification of drugs. Difficulties with existing categorizations which use such bases as clinical utility, molecular structure, effects on the central nervous system, legality, and hazard potential are discussed. A more meaningful categorization based on the availability and sources of…

  16. Preparing Meaningful and Communicative Exercises for the Language Lab.

    ERIC Educational Resources Information Center

    Strei, Gerry

    1980-01-01

    A workshop was given to identify and point out the limitations of mechanical language laboratory drills, and to compare them to drills which have been classified as being meaningful or communicative. Mechanical drills do not require an understanding of the meaning of the sentence; there is not consideration of context; and there is no connection…

  17. Using Meaningful Contexts to Promote Understanding of Pronumerals

    ERIC Educational Resources Information Center

    Linsell, Chris; Cavanagh, Michael; Tahir, Salma

    2013-01-01

    Developing a conceptual understanding of elementary algebra has been the focus of a number of recent articles in this journal. Baroudi (2006) advocated problem solving to assist students' transition from arithmetic to algebra, and Shield (2008) described the use of meaningful contexts for developing the concept of function. Samson (2011, 2012)…

  18. Comparing potential early caries assessment methods for teledentistry

    PubMed Central

    2013-01-01

    Background Optical caries detection has the potential to be incorporated in telehealth medicine for preventive dental screening. The objective of this study was to evaluate and compare visible and near infrared detection methods for identifying early non-cavitated ex vivo occlusal demineralization. Methods Six blinded examiners were used to compare the accuracy of the following three examinations in detecting occlusal demineralization: Midwest Caries ID™ (MID), visual photographic examination (CAM) and Cross Polarization Optical Coherence Tomography (CP-OCT). For each diagnostic method, two examiners assessed the extracted tooth samples 1–2 weeks apart. Teeth were then sectioned and lesion depth was confirmed (n = 42) by a blinded histological examination using a glycol based caries indicator dye. The sensitivity (Sen), specificity (Sp), Intraclass Correlation Coefficient (ICC), and Area under the Receiver Operator Curve (AUC) were calculated. Results For detecting any demineralization versus sound pit and fissure enamel, the mean Sen/Sp found was 46.9/85.0 for MID, 80.5/52.5 for CAM, and 83.4/45.0 for CP-OCT. For detecting non-cavitated demineralization that progressed into the dentin, the mean Sen/Sp found was 17.3/88.0 for MID, 48.0/57.8 for CAM, and 44.2/72.7 for CP-OCT. AUC values were statistically significant (P < 0.05) in three out of four examiner assessments when MID and CP-OCT were used to detect any demineralization. AUC values were significant for a single CAM examination. When assessing deeper non-cavitated lesions, none of the assessment methods were able to yield AUC values that were significantly different than a random ‘coin flip’ test. When examining reliability, MID demonstrated the highest ICC score (0.83) and CP-OCT had the lowest (0.49). Conclusion Although MID and CP-OCT were useful in detecting the presence of demineralization, examiners were not able to utilize these devices to adequately assess the depth of the
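
    For readers who wish to reproduce statistics of this kind, the sketch below computes sensitivity, specificity, and AUC for a single hypothetical examiner from binary device calls, continuous device scores, and a histological ground truth. The arrays are invented placeholders, not the study data.

      import numpy as np
      from sklearn.metrics import roc_auc_score

      # Hypothetical data: 1 = demineralization present on histology, 0 = sound enamel.
      truth = np.array([1, 1, 0, 1, 0, 0, 1, 0, 1, 0])
      calls = np.array([1, 0, 0, 1, 1, 0, 1, 0, 1, 0])      # examiner's binary decision
      scores = np.array([0.9, 0.4, 0.2, 0.8, 0.6, 0.1, 0.7, 0.3, 0.95, 0.25])  # device reading

      tp = np.sum((calls == 1) & (truth == 1))
      tn = np.sum((calls == 0) & (truth == 0))
      fp = np.sum((calls == 1) & (truth == 0))
      fn = np.sum((calls == 0) & (truth == 1))

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      auc = roc_auc_score(truth, scores)
      print(f"Sen={sensitivity:.2f}  Sp={specificity:.2f}  AUC={auc:.2f}")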

  19. A proposed impact assessment method for genetically modified plants (AS-GMP Method)

    SciTech Connect

    Jesus-Hitzschky, Katia Regina Evaristo de; Silveira, Jose Maria F.J. da

    2009-11-15

    An essential step in the development of products based on biotechnology is an assessment of their potential economic impacts and safety, including an evaluation of the potential impact of transgenic crops and practices related to their cultivation on the environment and human or animal health. The purpose of this paper is to provide an assessment method to evaluate the impact of biotechnologies that uses quantifiable parameters and allows a comparative analysis between conventional technology and technologies using GMOs. This paper introduces a method to perform an impact analysis associated with the commercial release and use of genetically modified plants, the Assessment System GMP Method. The assessment is performed through indicators that are arranged according to their dimension: environmental, economic, social, capability and institutional. To perform an accurate evaluation of the GMP, specific indicators related to genetic modification are grouped into common fields: genetic insert features, GM plant features, gene flow, food/feed field, introduction of the GMP, unexpected occurrences and specific indicators. The novelty is the possibility of including parameters specific to the biotechnology under assessment. In this case-by-case analysis, the moderation factors and indexes are parameterized to produce a workable assessment.
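
    A generic illustration of the indicator aggregation described above follows: indicators are grouped by dimension, each dimension receives a weight, and a composite index is computed as a weighted mean. The indicator names, scores, and weights are hypothetical and do not reproduce the AS-GMP parameterization itself.

      # Hypothetical weighted aggregation of impact indicators grouped by dimension.
      indicators = {
          "environmental": {"gene flow": 6, "non-target effects": 7},
          "economic":      {"yield gain": 8, "seed cost": 5},
          "social":        {"labour demand": 6},
          "capability":    {"local R&D capacity": 4},
          "institutional": {"biosafety framework": 7},
      }
      weights = {"environmental": 0.3, "economic": 0.25, "social": 0.2,
                 "capability": 0.1, "institutional": 0.15}   # assumed weights

      dimension_index = {d: sum(v.values()) / len(v) for d, v in indicators.items()}
      overall = sum(weights[d] * dimension_index[d] for d in dimension_index)
      print(dimension_index, round(overall, 2))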

  20. Rapid-estimation method for assessing scour at highway bridges

    USGS Publications Warehouse

    Holnbeck, Stephen R.

    1998-01-01

    A method was developed by the U.S. Geological Survey for rapid estimation of scour at highway bridges using limited site data and analytical procedures to estimate pier, abutment, and contraction scour depths. The basis for the method was a procedure recommended by the Federal Highway Administration for conducting detailed scour investigations, commonly referred to as the Level 2 method. Using pier, abutment, and contraction scour results obtained from Level 2 investigations at 122 sites in 10 States, envelope curves and graphical relations were developed that enable determination of scour-depth estimates at most bridge sites in a matter of a few hours. Rather than using complex hydraulic variables, surrogate variables more easily obtained in the field were related to calculated scour-depth data from Level 2 studies. The method was tested by having several experienced individuals apply the method in the field, and results were compared among the individuals and with previous detailed analyses performed for the sites. Results indicated that the variability in predicted scour depth among individuals applying the method generally was within an acceptable range, and that conservatively greater scour depths generally were obtained by the rapid-estimation method compared to the Level 2 method. The rapid-estimation method is considered most applicable for conducting limited-detail scour assessments and as a screening tool to determine those bridge sites that may require more detailed analysis. The method is designed to be applied only by a qualified professional possessing knowledge and experience in the fields of bridge scour, hydraulics, and flood hydrology, and having specific expertise with the Level 2 method.

  1. Developing the RIAM method (rapid impact assessment matrix) in the context of impact significance assessment

    SciTech Connect

    Ijaes, Asko; Kuitunen, Markku T.; Jalava, Kimmo

    2010-02-15

    In this paper the applicability of the RIAM method (rapid impact assessment matrix) is evaluated in the context of impact significance assessment. The methodological issues considered in the study are: 1) to test the possibilities of enlarging the scoring system used in the method, and 2) to compare the significance classifications of RIAM and unaided decision-making to estimate the consistency between these methods. The data used consisted of projects for which funding had been applied for via the European Union's Regional Development Trust in the area of Central Finland. Cases were evaluated with respect to their environmental, social and economic impacts using an assessment panel. The results showed the scoring framework used in RIAM could be modified according to the problem situation at hand, which enhances its application potential. However the changes made in criteria B did not significantly affect the final ratings of the method, which indicates the high importance of criteria A1 (importance) and A2 (magnitude) to the overall results. The significance classes obtained by the two methods diverged notably. In general the ratings given by RIAM tended to be smaller compared to intuitive judgement implying that the RIAM method may be somewhat conservative in character.
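
    The standard RIAM arithmetic combines the group A criteria multiplicatively and the group B criteria additively; a minimal sketch of that scoring, with invented criterion values and without the enlarged scoring ranges tested in the paper, is given below.

      # RIAM environmental score: ES = (A1 * A2) * (B1 + B2 + B3)
      # A1 importance, A2 magnitude; B1 permanence, B2 reversibility, B3 cumulativeness.
      def riam_score(a1, a2, b1, b2, b3):
          return (a1 * a2) * (b1 + b2 + b3)

      # Hypothetical component scores for one project impact.
      print(riam_score(a1=2, a2=-2, b1=2, b2=2, b3=3))   # -> -28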

  2. Computerized Communication Assessment Management: A Multi-Method Approach to Skills and Field Assessment.

    ERIC Educational Resources Information Center

    Aitken, Joan E.

    The Department of Communication Studies at the University of Missouri-Kansas City is developing a computer package designed to teach and assess aural, visual, and oral communication skills through a multi-media approach with classroom tests, portfolios, and performance measures. Before developing the multi-method approach, the department tried six…

  3. A new method for assessing PET-MRI coregistration

    NASA Astrophysics Data System (ADS)

    DeLorenzo, Christine; Klein, Arno; Mikhno, Arthur; Gray, Neil; Zanderigo, Francesca; Mann, J. John; Parsey, Ramin V.

    2009-02-01

    Positron emission tomography (PET) images are acquired for many purposes, from diagnostic assessment to aiding in the development of novel therapies. Whatever the intended use, it is often necessary to distinguish between different anatomical regions within these images. Because of this, magnetic resonance images (MRIs) are generally acquired to provide an anatomical reference. This reference will only be accurate if the PET image is properly coregistered to the MRI; yet currently, a method to evaluate PET-MRI coregistration accuracy does not exist. This problem is compounded by the fact that two visually indistinguishable coregistration results can produce estimates of ligand binding that vary significantly. Therefore, the focus of this work was to develop a method that can evaluate coregistration performance based on measured ligand binding within certain regions of the coregistered PET image. The evaluation method is based on the premise that a more accurate coregistration will result in higher ligand binding in certain anatomical regions defined by the MRI. This fully automated method was able to assess coregistration results within the variance of an expert manual rater and shows promise as a possible coregistration cost function.
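
    The evaluation idea, selecting the coregistration that maximizes measured binding inside MRI-defined regions, can be sketched as follows. The binding maps and the region-of-interest mask are random placeholders, and the real method's regions, weighting, and comparison against a manual rater are not reproduced.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical binding maps produced by three candidate coregistrations,
      # resampled onto the MRI grid, plus an MRI-defined region-of-interest mask.
      candidates = [rng.random((64, 64, 64)) for _ in range(3)]
      roi_mask = np.zeros((64, 64, 64), dtype=bool)
      roi_mask[20:40, 20:40, 20:40] = True

      # Score each candidate by mean binding within the ROI; higher is taken as better aligned.
      scores = [float(bp[roi_mask].mean()) for bp in candidates]
      best = int(np.argmax(scores))
      print(f"candidate {best} selected, ROI mean binding {scores[best]:.3f}")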

  4. A new method for assessing the gloss of human skin.

    PubMed

    Lentner, A; Wienert, V

    1996-01-01

    A new method for an objective assessment of the gloss of human skin is presented. The reflectometric measuring set-up complies with DIN 67530. The principle of this new method is based on a contactless determination of the skin's reflection of light from a tungsten filament lamp, recorded at an angle of 60 degrees by a silicon photocell. In a comparative study with 30 test persons it was discovered that the forehead, with 2.70 standardised reflectometer units (RU; SD +/- 0.59 RU), displayed a significantly higher gloss than the lower arm (1.99 RU, SD 0.28 RU, p < 0.0001). In an investigation into the influence of four different cream bases on the skin gloss it could be determined that the value depends on the percentage of grease, the water concentration and the consistency of the respective base. The method presented permits a fast, contactless, randomly repeatable objective assessment of skin gloss. Since the acceptance of cosmetics and pharmaceutical products depends not least on their skin gloss effect, this method can provide valuable information when estimating the success of old and new products. PMID:8737915

  5. Geomorphometry-based method of landform assessment for geodiversity

    NASA Astrophysics Data System (ADS)

    Najwer, Alicja; Zwoliński, Zbigniew

    2015-04-01

    Climate variability primarily induces the variations in the intensity and frequency of surface processes and consequently, principal changes in the landscape. As a result, abiotic heterogeneity may be threatened and the key elements of the natural diversity even decay. The concept of geodiversity was created recently and has rapidly gained the approval of scientists around the world. However, the problem recognition is still at an early stage. Moreover, little progress has been made concerning its assessment and geovisualisation. Geographical Information System (GIS) tools currently provide wide possibilities for the Earth's surface studies. Very often, the main limitation in that analysis is acquisition of geodata in appropriate resolution. The main objective of this study was to develop a proceeding algorithm for the landform geodiversity assessment using geomorphometric parameters. Furthermore, final maps were compared to those resulting from thematic layers method. The study area consists of two peculiar valleys, characterized by diverse landscape units and complex geological setting: Sucha Woda in Polish part of Tatra Mts. and Wrzosowka in Sudetes Mts. Both valleys are located in the National Park areas. The basis for the assessment is a proper selection of geomorphometric parameters with reference to the definition of geodiversity. Seven factor maps were prepared for each valley: General Curvature, Topographic Openness, Potential Incoming Solar Radiation, Topographic Position Index, Topographic Wetness Index, Convergence Index and Relative Heights. After the data integration and performing the necessary geoinformation analysis, the next step with a certain degree of subjectivity is score classification of the input maps using an expert system and geostatistical analysis. The crucial point to generate the final maps of geodiversity by multi-criteria evaluation (MCE) with GIS-based Weighted Sum technique is to assign appropriate weights for each factor map by
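
    The closing GIS step described above, a weighted-sum (MCE) overlay of score-classified factor maps, can be sketched as follows; the factor rasters, score classes, and weights are hypothetical stand-ins for the seven geomorphometric layers used in the study.

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical factor maps already reclassified to scores 1-5 on a common grid.
      factors = {name: rng.integers(1, 6, size=(200, 200))
                 for name in ["curvature", "openness", "solar", "tpi", "twi",
                              "convergence", "relative_height"]}
      weights = {"curvature": 0.15, "openness": 0.15, "solar": 0.10, "tpi": 0.15,
                 "twi": 0.15, "convergence": 0.15, "relative_height": 0.15}  # assumed

      geodiv = sum(weights[k] * factors[k].astype(float) for k in factors)

      # Reclassify the weighted sum into five geodiversity classes by quantiles.
      bins = np.quantile(geodiv, [0.2, 0.4, 0.6, 0.8])
      classes = np.digitize(geodiv, bins) + 1   # 1 = very low ... 5 = very high
      print(np.bincount(classes.ravel())[1:])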

  6. Methods to Assess Measurement Error in Questionnaires of Sedentary Behavior

    PubMed Central

    Sampson, Joshua N; Matthews, Charles E; Freedman, Laurence; Carroll, Raymond J.; Kipnis, Victor

    2015-01-01

    Sedentary behavior has already been associated with mortality, cardiovascular disease, and cancer. Questionnaires are an affordable tool for measuring sedentary behavior in large epidemiological studies. Here, we introduce and evaluate two statistical methods for quantifying measurement error in questionnaires. Accurate estimates are needed for assessing questionnaire quality. The two methods would be applied to validation studies that measure a sedentary behavior by both questionnaire and accelerometer on multiple days. The first method fits a reduced model by assuming the accelerometer is without error, while the second method fits a more complete model that allows both measures to have error. Because accelerometers tend to be highly accurate, we show that ignoring the accelerometer's measurement error can result in more accurate estimates of measurement error in some scenarios. In this manuscript, we derive asymptotic approximations for the Mean-Squared Error of the estimated parameters from both methods, evaluate their dependence on study design and behavior characteristics, and offer an R package so investigators can make an informed choice between the two methods. We demonstrate the difference between the two methods in a recent validation study comparing Previous Day Recalls (PDR) to an accelerometer-based ActivPal. PMID:27340315
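
    A toy simulation of the validation-study setting (a questionnaire plus several accelerometer days per person) is sketched below. It contrasts treating the accelerometer mean as error-free with a simple correction that subtracts the averaged within-person accelerometer variance, which conveys the flavour of the two approaches but is not the authors' estimators or their R package.

      import numpy as np

      rng = np.random.default_rng(3)
      n, days = 500, 4

      truth = rng.normal(6.0, 1.5, n)                          # true daily sedentary hours (assumed)
      accel = truth[:, None] + rng.normal(0, 0.8, (n, days))   # accelerometer with day-to-day error
      quest = 0.5 + 0.9 * truth + rng.normal(0, 1.2, n)        # questionnaire with its own error

      abar = accel.mean(axis=1)

      # "Reduced" view: treat the accelerometer mean as the truth.
      corr_reduced = np.corrcoef(quest, abar)[0, 1]

      # "Complete" view: remove the accelerometer's averaged error variance first.
      var_u = accel.var(axis=1, ddof=1).mean()                 # within-person error variance
      var_truth = abar.var(ddof=1) - var_u / days
      corr_complete = np.cov(quest, abar)[0, 1] / np.sqrt(quest.var(ddof=1) * var_truth)

      print(f"validity correlation: reduced={corr_reduced:.3f}, corrected={corr_complete:.3f}, "
            f"true={np.corrcoef(quest, truth)[0, 1]:.3f}")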

  7. A GIS-based method for flood risk assessment

    NASA Astrophysics Data System (ADS)

    Kalogeropoulos, Kleomenis; Stathopoulos, Nikos; Psarogiannis, Athanasios; Penteris, Dimitris; Tsiakos, Chrisovalantis; Karagiannopoulou, Aikaterini; Krikigianni, Eleni; Karymbalis, Efthimios; Chalkias, Christos

    2016-04-01

    Floods are physical global hazards with negative environmental and socio-economic impacts on local and regional scale. The technological evolution during the last decades, especially in the field of geoinformatics, has offered new advantages in hydrological modelling. This study seeks to use this technology in order to quantify flood risk assessment. The study area which was used is an ungauged catchment and by using mostly GIS hydrological and geomorphological analysis together with a GIS-based distributed Unit Hydrograph model, a series of outcomes have risen. More specifically, this paper examined the behaviour of the Kladeos basin (Peloponnese, Greece) using real rainfall data, as well hypothetical storms. The hydrological analysis held using a Digital Elevation Model of 5x5m pixel size, while the quantitative drainage basin characteristics were calculated and were studied in terms of stream order and its contribution to the flood. Unit Hydrographs are, as it known, useful when there is lack of data and in this work, based on time-area method, a sequences of flood risk assessments have been made using the GIS technology. Essentially, the proposed methodology estimates parameters such as discharge, flow velocity equations etc. in order to quantify flood risk assessment. Keywords Flood Risk Assessment Quantification; GIS; hydrological analysis; geomorphological analysis.
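
    The time-area unit-hydrograph idea mentioned above can be illustrated with a short convolution: the basin is divided into isochrone zones and the effective rainfall is convolved with the zone areas to produce a discharge hydrograph. The areas, rainfall, and time step below are invented, not the Kladeos data.

      import numpy as np

      dt_h = 0.5                                      # time step (hours), assumed
      area_km2 = np.array([2.0, 5.0, 8.0, 6.0, 3.0])  # area between isochrones (km2), assumed
      rain_mm = np.array([4.0, 10.0, 6.0, 2.0])       # effective rainfall per step (mm), assumed

      # 1 mm over 1 km2 in 1 h equals 1000 m3 / 3600 s, about 0.2778 m3/s.
      q_m3s = np.convolve(rain_mm, area_km2) * 0.2778 / dt_h
      for t, q in enumerate(q_m3s):
          print(f"t = {t * dt_h:4.1f} h  Q = {q:7.1f} m3/s")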

  8. Methods for assessment of keel bone damage in poultry.

    PubMed

    Casey-Trott, T; Heerkens, J L T; Petrik, M; Regmi, P; Schrader, L; Toscano, M J; Widowski, T

    2015-10-01

    Keel bone damage (KBD) is a critical issue facing the laying hen industry today as a result of the likely pain leading to compromised welfare and the potential for reduced productivity. Recent reports suggest that damage, while highly variable and likely dependent on a host of factors, extends to all systems (including battery cages, furnished cages, and non-cage systems), genetic lines, and management styles. Despite the extent of the problem, the research community remains uncertain as to the causes and influencing factors of KBD. Although progress has been made investigating these factors, the overall effort is hindered by several issues related to the assessment of KBD, including quality and variation in the methods used between research groups. These issues prevent effective comparison of studies, as well as difficulties in identifying the presence of damage leading to poor accuracy and reliability. The current manuscript seeks to resolve these issues by offering precise definitions for types of KBD, reviewing methods for assessment, and providing recommendations that can improve the accuracy and reliability of those assessments. PMID:26287001

  9. Assessments of lung digestion methods for recovery of fibers.

    PubMed

    Warheit, D B; Hwang, H C; Achinko, L

    1991-04-01

    Evaluation of the pulmonary hazards associated with exposure to fibrous materials tends to be more complicated than assessments required for particulate materials. Fibers are defined by aspect ratios, and it is generally considered that physical dimensions play an important role in the pathogenesis of fiber-related lung diseases. Several digestion techniques have been used to recover fibers from exposed lung tissue for clearance studies. Because many of the digestion fluids are corrosive (e.g., bleach, KOH), it is conceivable that the dimensions of recovered fibers are modified during tissue digestion. This study therefore assessed whether the physical dimensions of bulk samples of fibers were altered following simulated digestion processing. Aliquots of crocidolite and chrysotile asbestos, Kevlar aramid, wollastonite, polyacrylonitrile (PAN)-based carbon, and glass fibers were incubated with either saline, bleach, or KOH and then filtered. Scanning electron microscopy techniques were utilized to measure the physical dimensions (i.e., lengths and diameters) of at least 160 fibers per treatment group of each fiber type. Our results showed that the lengths and diameters of glass fibers and wollastonite were altered after treatment with KOH. In addition, treatment with bleach produced a small reduction in both asbestos fiber-type diameters, and greater changes in Kevlar and wollastonite diameters and carbon fiber lengths (P less than 0.05). These results indicate that lung digestion methods should be carefully assessed for each fiber type before initiating fiber clearance studies.

  10. Survey methods for assessing land cover map accuracy

    USGS Publications Warehouse

    Nusser, S.M.; Klaas, E.E.

    2003-01-01

    The increasing availability of digital photographic materials has fueled efforts by agencies and organizations to generate land cover maps for states, regions, and the United States as a whole. Regardless of the information sources and classification methods used, land cover maps are subject to numerous sources of error. In order to understand the quality of the information contained in these maps, it is desirable to generate statistically valid estimates of accuracy rates describing misclassification errors. We explored a full sample survey framework for creating accuracy assessment study designs that balance statistical and operational considerations in relation to study objectives for a regional assessment of GAP land cover maps. We focused not only on appropriate sample designs and estimation approaches, but on aspects of the data collection process, such as gaining cooperation of land owners and using pixel clusters as an observation unit. The approach was tested in a pilot study to assess the accuracy of Iowa GAP land cover maps. A stratified two-stage cluster sampling design addressed sample size requirements for land covers and the need for geographic spread while minimizing operational effort. Recruitment methods used for private land owners yielded high response rates, minimizing a source of nonresponse error. Collecting data for a 9-pixel cluster centered on the sampled pixel was simple to implement, and provided better information on rarer vegetation classes as well as substantial gains in precision relative to observing data at a single-pixel.
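
    A stripped-down version of the estimation step, combining per-stratum accuracy rates into an overall map accuracy using stratum weights, is sketched below. The strata, area shares, and agreement counts are invented for illustration, and the two-stage cluster structure and standard errors are not modelled.

      # Hypothetical stratified accuracy estimate: strata weighted by mapped area share.
      strata = {
          #            (area share, reference pixels sampled, pixels correctly classified)
          "forest":    (0.45, 120, 105),
          "grassland": (0.30,  90,  70),
          "wetland":   (0.15,  60,  41),
          "urban":     (0.10,  50,  44),
      }

      overall = 0.0
      for name, (w, n, correct) in strata.items():
          p = correct / n
          overall += w * p
          print(f"{name:10s} accuracy = {p:.2f}")
      print(f"overall (area-weighted) accuracy = {overall:.2f}")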

  11. A qualitative method proposal to improve environmental impact assessment

    SciTech Connect

    Toro, Javier; Requena, Ignacio; Duarte, Oscar; Zamorano, Montserrat

    2013-11-15

    In environmental impact assessment, qualitative methods are used because they are versatile and easy to apply. This methodology is based on the evaluation of the strength of the impact by grading a series of qualitative attributes that can be manipulated by the evaluator. The results thus obtained are not objective, and all too often impacts are eliminated that should be mitigated with corrective measures. However, qualitative methodology can be improved if the calculation of Impact Importance is based on the characteristics of environmental factors and project activities instead on indicators assessed by evaluators. In this sense, this paper proposes the inclusion of the vulnerability of environmental factors and the potential environmental impact of project activities. For this purpose, the study described in this paper defined Total Impact Importance and specified a quantification procedure. The results obtained in the case study of oil drilling in Colombia reflect greater objectivity in the evaluation of impacts as well as a positive correlation between impact values, the environmental characteristics at and near the project location, and the technical characteristics of project activities. -- Highlights: • Concept of vulnerability has been used to calculate the importance impact assessment. • This paper defined Total Impact Importance and specified a quantification procedure. • The method includes the characteristics of environmental and project activities. • The application has shown greater objectivity in the evaluation of impacts. • Better correlation between impact values, environment and the project has been shown.

  12. Assessment of a novel method for teaching veterinary parasitology.

    PubMed

    Pereira, Mary Mauldin; Yvorchuk-St Jean, Kathleen E; Wallace, Charles E; Krecek, Rosina C

    2014-01-01

    A student-centered innovative method of teaching veterinary parasitology was launched and evaluated at the Ross University School of Veterinary Medicine (RUSVM) in St. Kitts, where Parasitology is a required course for second-semester veterinary students. A novel method, named Iron Parasitology, compared lecturer-centered teaching with student-centered teaching and assessed the retention of parasitology knowledge of students in their second semester and again when they reached their seventh semester. Members of five consecutive classes chose to participate in Iron Parasitology with the opportunity to earn an additional 10 points toward their final grade by demonstrating their knowledge, communication skills, clarity of message, and creativity in the Iron Parasitology exercise. The participants and nonparticipants were assessed using seven parameters. The initial short-term study parameters used to evaluate lecturer- versus student-centered teaching were age, gender, final Parasitology course grade without Iron Parasitology, RUSVM overall grade point average (GPA), RUSVM second-semester GPA, overall GPA before RUSVM, and prerequisite GPA before RUSVM. The long-term reassessment study assessed retention of parasitology knowledge in members of the seventh-semester class who had Iron Parasitology as a tool in their second semester. These students were invited to complete a parasitology final examination during their seventh semester. There were no statistically significant differences for the parameters measured in the initial study. In addition, Iron Parasitology did not have an effect on the retention scores in the reassessment study.

  13. A new assessment method of outdoor tobacco smoke (OTS) exposure

    NASA Astrophysics Data System (ADS)

    Cho, Hyeri; Lee, Kiyoung

    2014-04-01

    Outdoor tobacco smoke (OTS) is a concern because of its potential health effects. An OTS exposure assessment method is needed to determine the effects of OTS and to validate outdoor smoking policies. The objective of this study was to develop a new method to assess OTS exposure. The study was conducted at 100 bus stops in Seoul, Korea, including 50 centerline and 50 roadside bus stops. Using a real-time aerosol monitor, PM2.5 was measured for 30 min at each bus stop in two seasons. A 'peak analysis' method was developed to assess short-term PM2.5 exposure from OTS. The 30-min average PM2.5 exposure at each bus stop was associated with season and bus stop location but not with smoking activity. The PM2.5 peak occurrence rate obtained by the peak analysis method was significantly associated with season, bus stop location, observed smoking occurrence, and the number of buses servicing a route. The PM2.5 peak concentration was significantly associated with season, smoking occurrence, and the number of buses servicing a route. When a smoker was standing still at the bus stop, the magnitude of peak concentrations was significantly higher than when the smoker was walking through the bus stop. People were exposed to high short-term PM2.5 peak levels at bus stops, and the magnitude of peak concentrations was highest when a smoker was located close to the monitor. The magnitude of peak concentration was a good indicator that helped distinguish nearby OTS exposure. Further research using peak analysis is needed to measure smoking-related exposure to PM2.5 in other outdoor locations.
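
    One way to implement a peak analysis of this kind is sketched below: a 30-minute PM2.5 trace is screened for excursions above its own baseline, and the peak occurrence rate and mean peak magnitude are reported. The baseline definition and thresholds are assumptions for illustration, not the study's exact criteria.

      import numpy as np
      from scipy.signal import find_peaks

      rng = np.random.default_rng(4)

      # Hypothetical 30-min PM2.5 trace at 1 s resolution with a few smoking-like excursions.
      t = np.arange(1800)
      pm25 = 25 + rng.normal(0, 2, t.size)
      for start in (300, 900, 1400):
          pm25[start:start + 60] += 40 * np.exp(-np.arange(60) / 20.0)

      baseline = np.median(pm25)
      peaks, props = find_peaks(pm25, height=baseline + 15, distance=60)  # thresholds assumed

      magnitude = props["peak_heights"] - baseline
      print(f"peaks per 30 min: {len(peaks)}, mean magnitude above baseline: {magnitude.mean():.1f} ug/m3")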

  14. A new method for assessing surface solar irradiance: Heliosat-4

    NASA Astrophysics Data System (ADS)

    Qu, Z.; Oumbe, A.; Blanc, P.; Lefèvre, M.; Wald, L.; Schroedter-Homscheidt, M.; Gesell, G.

    2012-04-01

    Downwelling shortwave irradiance at the surface (SSI) is increasingly assessed by means of satellite-derived estimates of optical properties of the atmosphere. Performance is judged satisfactory for the time being but there is an increasing need for the assessment of the direct and diffuse components of the SSI. MINES ParisTech and the German Aerospace Center (DLR) are currently developing the Heliosat-4 method to assess the SSI and its components in a more accurate way than current practices. The method is composed of two parts: a clear sky module based on the radiative transfer model libRadtran, and a cloud-ground module using two-stream and delta-Eddington approximations for clouds and a database of ground albedo. Advanced products derived from geostationary satellites and recent Earth Observation missions are the inputs of the Heliosat-4 method. Such products are: cloud optical depth, cloud phase, cloud type and cloud coverage from APOLLO of DLR, aerosol optical depth, aerosol type, water vapor in clear-sky, ozone from MACC products (FP7), and ground albedo from MODIS of NASA. In this communication, we briefly present Heliosat-4 and focus on its performances. The results of Heliosat-4 for the period 2004-2010 will be compared to the measurements made at five stations within the Baseline Surface Radiation Network. Extensive statistical analysis as well as case studies are performed in order to gain an in-depth view of the performance of Heliosat-4, to understand its advantages compared to existing methods, and to identify its shortcomings for future improvements. The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement no. 218793 (MACC project) and no. 283576 (MACC-II project).

  15. The Effects of Cognitive Process and Decision Making Training in Reading Experience on Meaningful Learning with Underachieving College Students

    ERIC Educational Resources Information Center

    Dean, Rebecca J.

    2010-01-01

    The ability of underprepared college students to read and learn from their reading is essential to their academic success and to their ability to persist towards completing their degree. The purposes of this study were to (a) assess the relationship between the cognitive processes of reading-based decision making and meaningful learning and (b)…

  16. Methods for assessing biochemical oxygen demand (BOD): a review.

    PubMed

    Jouanneau, S; Recoules, L; Durand, M J; Boukabache, A; Picot, V; Primault, Y; Lakel, A; Sengelin, M; Barillon, B; Thouand, G

    2014-02-01

    The Biochemical Oxygen Demand (BOD) is one of the most widely used criteria for water quality assessment. It provides information about the ready biodegradable fraction of the organic load in water. However, this analytical method is time-consuming (generally 5 days, BOD5), and the results may vary according to the laboratory (20%), primarily due to fluctuations in the microbial diversity of the inoculum used. Work performed during the two last decades has resulted in several technologies that are less time-consuming and more reliable. This review is devoted to the analysis of the technical features of the principal methods described in the literature in order to compare their performances (measuring window, reliability, robustness) and to identify the pros and the cons of each method.

  17. Current methods in use for assessing clinical competencies: what works?

    PubMed

    Hardie, Elizabeth M

    2008-01-01

    An online survey was used to capture qualitative descriptions of methods used by a veterinary college to assess clinical competencies in its students. Each college was specifically asked about use of the methods detailed in the Toolbox of Assessment Methods developed by the Accreditation Council for Graduate Medical Education and the American Board of Medical Specialties. Additionally, each college was asked to detail the methods used to ensure competency in each of the nine areas specified by the American Veterinary Medical Association Council on Education. Associate deans of academic affairs or their equivalents at veterinary colleges in the United States, the United Kingdom, Canada, and the Caribbean were contacted by e-mail and asked to complete the survey. Responses were obtained from 24 of 32 colleges. The methods most often used were review of students' medical records (16), checklist evaluation of must-learn skills (16), procedural logs (11), multiple-choice skill examinations (11), case simulations using role-playing (7), short-answer skill examinations (7), global rating of live or recorded performance (7), case simulations using computerized case simulations (7), 360-degree evaluation of clinical performance (4), and standardized patient or client examination (3). Additional methods used included medical record portfolio review, paper-and-pencil branching problems, chart-stimulated oral exams, externship mentor evaluation, performance rubrics for clinical rotations, direct observation and query on cases, video evaluation, case correlation tasks, and an employer survey. Non-realistic models were used more often for skill evaluation than realistic models. One college used virtual-reality models for testing.

  18. Falcon: automated optimization method for arbitrary assessment criteria

    DOEpatents

    Yang, Tser-Yuan; Moses, Edward I.; Hartmann-Siantar, Christine

    2001-01-01

    FALCON is a method for automatic multivariable optimization for arbitrary assessment criteria that can be applied to numerous fields where outcome simulation is combined with optimization and assessment criteria. A specific implementation of FALCON is for automatic radiation therapy treatment planning. In this application, FALCON implements dose calculations into the planning process and optimizes available beam delivery modifier parameters to determine the treatment plan that best meets clinical decision-making criteria. FALCON is described in the context of the optimization of external-beam radiation therapy and intensity modulated radiation therapy (IMRT), but the concepts could also be applied to internal (brachytherapy) radiotherapy. The radiation beams could consist of photons or any charged or uncharged particles. The concept of optimizing source distributions can be applied to complex radiography (e.g. flash x-ray or proton) to improve the imaging capabilities of facilities proposed for science-based stockpile stewardship.

  19. Comparison of Body Composition Assessment Methods in Pediatric Intestinal Failure

    PubMed Central

    Mehta, Nilesh M.; Raphael, Bram; Guteirrez, Ivan; Quinn, Nicolle; Mitchell, Paul D.; Litman, Heather J.; Jaksic, Tom; Duggan, Christopher P.

    2015-01-01

    Objectives To examine the agreement of multifrequency bioelectric impedance analysis (BIA) and anthropometry with reference methods for body composition assessment in children with intestinal failure (IF). Methods We conducted a prospective pilot study in children 14 years of age or younger with IF resulting from either short bowel syndrome (SBS) or motility disorders. Bland Altman analysis was used to examine the agreement between BIA and deuterium dilution in measuring total body water (TBW) and lean body mass (LBM); and between BIA and dual X-ray absorptiometry (DXA) techniques in measuring LBM and FM. Fat mass (FM) and percent body fat (%BF) measurements by BIA and anthropometry, were also compared in relation to those measured by deuterium dilution. Results Fifteen children with IF, median (IQR) age 7.2 (5.0, 10.0) years, 10 (67%) male, were studied. BIA and deuterium dilution were in good agreement with a mean bias (limits of agreement) of 0.9 (-3.2, 5.0) for TBW (L) and 0.1 (-5.4 to 5.6) for LBM (kg) measurements. The mean bias (limits) for FM (kg) and %BF measurements were 0.4 (-3.8, 4.6) kg and 1.7 (-16.9, 20.3)% respectively. The limits of agreement were within 1 SD of the mean bias in 12/14 (86%) subjects for TBW and LBM, and in 11/14 (79%) for FM and %BF measurements. Mean bias (limits) for LBM (kg) and FM (kg) between BIA and DXA were 1.6 (-3.0 to 6.3) kg and -0.1 (-3.2 to 3.1) kg, respectively. Mean bias (limits) for FM (kg) and %BF between anthropometry and deuterium dilution were 0.2 (-4.2, 4.6) and -0.2 (-19.5 to 19.1), respectively. The limits of agreement were within 1 SD of the mean bias in 10/14 (71%) subjects. Conclusions In children with intestinal failure, TBW and LBM measurements by multifrequency BIA method were in agreement with isotope dilution and DXA methods, with small mean bias. In comparison to deuterium dilution, BIA was comparable to anthropometry for FM and %BF assessments with small mean bias. However, the limits of agreement
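
    The Bland-Altman quantities reported above (mean bias and limits of agreement) are straightforward to compute; a sketch with invented paired lean-body-mass measurements follows.

      import numpy as np

      # Hypothetical paired lean-body-mass measurements (kg) in the same children.
      bia       = np.array([18.2, 21.5, 25.0, 15.8, 30.1, 22.4, 19.9, 27.3])
      deuterium = np.array([17.9, 22.8, 24.1, 16.5, 29.0, 23.9, 19.2, 28.1])

      diff = bia - deuterium
      bias = diff.mean()
      sd = diff.std(ddof=1)
      lower, upper = bias - 1.96 * sd, bias + 1.96 * sd
      print(f"mean bias = {bias:.2f} kg, limits of agreement = ({lower:.2f}, {upper:.2f}) kg")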

  20. A Study of the Relative Effectiveness of a Meaningful Concrete and a Meaningful Symbolic Model in Learning a Selected Mathematical Principle.

    ERIC Educational Resources Information Center

    Fennema, Elizabeth

    Reported is a study to determine the relative effectiveness of a meaningful concrete and a meaningful symbolic model in learning a selected mathematics principle. Subjects were from a second grade population and they were assigned to three treatments. Students assigned to Treatment 1 received instruction in the principle with a meaningful symbolic…

  1. Non-destructive assessment of parchment deterioration by optical methods.

    PubMed

    Dolgin, Bella; Bulatov, Valery; Schechter, Israel

    2007-08-01

    A non-destructive and non-invasive method for quantitative characterization of parchment deterioration, based on spectral measurements, is proposed. Deterioration due to both natural aging (ancient parchments) and artificial aging (achieved by means of controlled UV irradiation and temperature treatment) was investigated. The effect of aging on parchment native fluorescence was correlated with its deterioration condition. Aging causes a drop in fluorescence intensity, a spectral shift of the main peak, and an overall change in the fluorescence spectral features. Digital color imaging analysis based on visible reflectance from the parchment surface was also applied, and the corresponding color components (RGB) were successively correlated with the state of parchment deterioration/aging. The fluorescence and color imaging data were validated by analysis of historical parchments, aged between 50 and 2000 years and covering a large variety of states of deterioration. The samples were independently assessed by traditional microscopy methods. We conclude that the proposed optical method qualifies well as a non-destructive tool for rapid assessment of the stage of parchment deterioration.

  2. New actigraphic assessment method for periodic leg movements (PLM).

    PubMed

    Kazenwadel, J; Pollmächer, T; Trenkwalder, C; Oertel, W H; Kohnen, R; Künzel, M; Krüger, H P

    1995-10-01

    A new actigraphic method by which periodic leg movements (PLM) can be measured is presented. Data acquisition and analysis were brought into line to distinguish short-lasting repetitive leg movements from random motor restlessness. The definition of PLM follows the generally accepted criteria for PLM scoring. Thirty restless legs patients, all also suffering from PLM, were investigated three times by polysomnography, including tibialis anterior surface electromyography and actigraphy. A high correlation (reliability) was found for the number of PLM per hour spent in bed between the two methods. Furthermore, the actigraph records PLM specifically. An index of random motor restlessness is not sufficient for reliable PLM scoring. In addition, periodic movements in sleep (PMS) and PLM show comparable variability in general. The actigraphic assessment of PLM, however, gives a better measure because PMS recordings may result in a substantial underestimation of PLM when sleep efficiency is reduced. This method is an ambulatory assessment tool that can also be used for screening purposes.
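
    A simplified version of the scoring logic, counting runs of at least four leg movements whose inter-movement intervals fall within a fixed window, is sketched below. The 5-90 s interval window and the minimum run length of four reflect commonly cited PLM criteria and are used here as assumptions, not as the exact algorithm of the actigraph described above.

      # Hypothetical movement onset times (seconds from lights-off) from one actigraph channel.
      onsets = [120, 150, 182, 215, 250, 700, 705, 1400, 1445, 1490, 1538, 1585, 1630]

      MIN_IVI, MAX_IVI, MIN_RUN = 5, 90, 4   # assumed periodicity criteria

      plm_count, run = 0, 1
      for prev, cur in zip(onsets, onsets[1:]):
          if MIN_IVI <= cur - prev <= MAX_IVI:
              run += 1
          else:
              if run >= MIN_RUN:
                  plm_count += run
              run = 1
      if run >= MIN_RUN:
          plm_count += run

      hours_in_bed = 8.0                      # assumed time in bed
      print(f"PLM = {plm_count}, PLM index = {plm_count / hours_in_bed:.1f} per hour")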

  3. Epidemiological designs for vaccine safety assessment: methods and pitfalls.

    PubMed

    Andrews, Nick

    2012-09-01

    Three commonly used designs for vaccine safety assessment post licensure are cohort, case-control and self-controlled case series. These methods are often used with routine health databases and immunisation registries. This paper considers the issues that may arise when designing an epidemiological study, such as understanding the vaccine safety question, case definition and finding, limitations of data sources, uncontrolled confounding, and pitfalls that apply to the individual designs. The example of MMR and autism, where all three designs have been used, is presented to help consider these issues. PMID:21985898

  5. Research Spotlight: New method to assess coral reef health

    NASA Astrophysics Data System (ADS)

    Tretkoff, Ernie

    2011-03-01

    Coral reefs around the world are becoming stressed due to rising temperatures, ocean acidification, overfishing, and other factors. Measuring community level rates of photosynthesis, respiration, and biogenic calcification is essential to assessing the health of coral reef ecosystems because the balance between these processes determines the potential for reef growth and the export of carbon. Measurements of biological productivity have typically been made by tracing changes in dissolved oxygen in seawater as it passes over a reef. However, this is a labor-intensive and difficult method, requiring repeated measurements. (Geophysical Research Letters, doi:10.1029/2010GL046179, 2011)

  6. Testing, Testing: Good Teaching Is Difficult; so is Meaningful Testing

    ERIC Educational Resources Information Center

    Toby, Sidney; Plano, Richard J.

    2004-01-01

    The limitations of teaching and assessment of knowledge are emphasized, and an improved method of testing students is suggested. This superior examination technique would replace multiple-choice questions with free-response questions for numerical problems, wherein the numerical inputs fed into the computer would be optically scanned and graded…

  7. Comparative assessment of the methods for exchangeable acidity measuring

    NASA Astrophysics Data System (ADS)

    Vanchikova, E. V.; Shamrikova, E. V.; Bespyatykh, N. V.; Zaboeva, G. A.; Bobrova, Yu. I.; Kyz"yurova, E. V.; Grishchenko, N. V.

    2016-05-01

    A comparative assessment of the results of measuring the exchangeable acidity and its components by different methods was performed for the main mineral genetic horizons of texturally-differentiated gleyed and nongleyed soddy-podzolic and gley-podzolic soils of the Komi Republic. It was shown that the contents of all the components of exchangeable soil acidity determined by the Russian method (with potassium chloride solution as extractant, c(KCl) = 1 mol/dm3) were significantly higher than those obtained by the international method (with barium chloride solution as extractant, c(BaCl2) = 0.1 mol/dm3). The error of the estimate of the concentration of H+ ions extracted with barium chloride solution equaled 100%, and this allowed only qualitative description of this component of the soil acidity. In the case of the extraction with potassium chloride, the error of measurements was 50%. It was also shown that the use of potentiometric titration suggested by the Russian method overestimates the results of soil acidity measurement caused by the exchangeable metal ions (Al(III), Fe(III), and Mn(II)) in comparison with the atomic emission method.

  8. Assessment of changing interdependencies between human electroencephalograms using nonlinear methods

    NASA Astrophysics Data System (ADS)

    Pereda, E.; Rial, R.; Gamundi, A.; González, J.

    2001-01-01

    We investigate the problems that might arise when two recently developed methods for detecting interdependencies between time series using state space embedding are applied to signals of different complexity. With this aim, these methods were used to assess the interdependencies between two electroencephalographic channels from 10 adult human subjects during different vigilance states. The significance and nature of the measured interdependencies were checked by comparing the results of the original data with those of different types of surrogates. We found that even with proper reconstructions of the dynamics of the time series, both methods may give wrong statistical evidence of decreasing interdependencies during deep sleep due to changes in the complexity of each individual channel. The main factor responsible for this result was the use of an insufficient number of neighbors in the calculations. Once this problem was surmounted, both methods showed the existence of a significant relationship between the channels which was mostly of linear type and increased from awake to slow wave sleep. We conclude that the significance of the qualitative results provided for both methods must be carefully tested before drawing any conclusion about the implications of such results.

  9. A solution quality assessment method for swarm intelligence optimization algorithms.

    PubMed

    Zhang, Zhaojun; Wang, Gai-Ge; Zou, Kuansheng; Zhang, Jianhua

    2014-01-01

    Nowadays, swarm intelligence optimization has become an important optimization tool and is widely used in many fields of application. In contrast to many successful applications, the theoretical foundation is rather weak. Therefore, there are still many problems to be solved. One problem is how to quantify the performance of an algorithm in finite time, that is, how to evaluate the quality of the solution obtained by the algorithm for practical problems. This greatly limits its application to practical problems. A solution quality assessment method for intelligent optimization is proposed in this paper. It is an experimental analysis method based on the analysis of the search space and the characteristics of the algorithm itself. Instead of "value performance," the "ordinal performance" is used as the evaluation criterion in this method. The feasible solutions were clustered according to distance to divide solution samples into several parts. Then, the solution space and the "good enough" set can be decomposed based on the clustering results. Finally, using relevant statistical knowledge, the evaluation result can be obtained. To validate the proposed method, some intelligent algorithms such as ant colony optimization (ACO), particle swarm optimization (PSO), and artificial fish swarm algorithm (AFS) were used to solve the traveling salesman problem. Computational results indicate the feasibility of the proposed method.

  10. A method for assessing strand breaks in DNA.

    PubMed

    Peng, L; Brisco, M J; Morley, A A

    1998-08-15

    A simple method has been developed to assess strand breaks in extracted DNA. The method uses the enzyme terminal deoxynucleotidyl transferase (TDT) to incorporate labeled deoxycytidine triphosphate (dCTP) in the presence of dideoxy-CTP (ddCTP) which is added to ensure that the reaction goes to completion. Following development of the method, the extent of DNA degradation in 21 blood or bone marrow samples, which had varying degrees of DNA degradation, was measured by the TDT assay, by gel electrophoresis, or by a laborious PCR-based method which quantifies the number of amplifiable N-ras targets in a sample. The TDT assay was more sensitive at detecting strand breaks than electrophoresis and there was good correlation between the results of the TDT assay and the N-ras assay. The TDT assay was also used to demonstrate the development of strand breaks during induced apoptosis. The TDT assay is thus a simple and semiquantitative method to study strand breaks produced by DNA damage.

  11. Compatibility assessment of methods used for soil hydrophobicity determination

    NASA Astrophysics Data System (ADS)

    Papierowska, Ewa; Szatyłowicz, Jan; Kalisz, Barbara; Łachacz, Andrzej; Matysiak, Wojciech; Debaene, Guillaume

    2016-04-01

    Soil hydrophobicity is a global problem. The effect of hydrophobicity on the soil environment is very important, because it can cause irreversible changes in ecosystems, leading to their complete degradation. The choice of method used to determine soil hydrophobicity is not simple because there are no obvious criteria for selection. The results obtained by various methods may not be coherent and may indicate different degrees of hydrophobicity within the same soil sample. The objective of the study was to assess the compatibility between methods used to determine the hydrophobicity of selected organic and mineral-organic soils. Two groups of soil materials were examined: hydrogenic (87 soil samples) and autogenic soils (19 soil samples) collected from 41 soil profiles located in north-eastern Poland. Air-dry soil samples were used. Hydrophobicity was determined using two different methods, i.e., on the basis of wetting contact angle measurements between water and the solid phase of the soils, and with water drop penetration time tests. The value of the wetting contact angle was measured using the sessile drop method with an optical goniometer CAM 100 (KSV Instruments). The wetting contact angles were determined at room temperature (20 °C) within 10 min after sample preparation using a standard procedure. In addition, water drop penetration time was measured. In order to compare the methods used for the assessment of soil hydrophobicity, an inter-observer agreement model was applied. In this model, five categories of soil hydrophobicity were proposed, corresponding to the classes used in the soil hydrophobicity classification based on the water drop penetration time test. Based on this classification, weighted kappa coefficients were calculated using SAS 9.4 (SAS Institute, 2013, Cary NC) to evaluate the relationships between the different investigated methods. The results of agreement were presented in the form of agreement charts. Research results indicated good
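
    The weighted kappa computation mentioned above can be reproduced in outline with standard tooling; the sketch below (Python, assuming scikit-learn) compares two hypothetical five-class hydrophobicity ratings of the same samples. The class labels and values are placeholders, not the study's data.

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical class assignments (0 = wettable ... 4 = extremely hydrophobic)
    classes_contact_angle = [0, 1, 1, 2, 3, 4, 2, 0, 3, 4]
    classes_wdpt          = [0, 1, 2, 2, 3, 4, 1, 0, 3, 3]

    print(cohen_kappa_score(classes_contact_angle, classes_wdpt, weights="linear"))
    print(cohen_kappa_score(classes_contact_angle, classes_wdpt, weights="quadratic"))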

  12. Concept Mapping Using Cmap Tools to Enhance Meaningful Learning

    NASA Astrophysics Data System (ADS)

    Cañas, Alberto J.; Novak, Joseph D.

    Concept maps are graphical tools that have been used in all facets of education and training for organizing and representing knowledge. When learners build concept maps, meaningful learning is facilitated. Computer-based concept mapping software such as CmapTools have further extended the use of concept mapping and greatly enhanced the potential of the tool, facilitating the implementation of a concept map-centered learning environment. In this chapter, we briefly present concept mapping and its theoretical foundation, and illustrate how it can lead to an improved learning environment when it is combined with CmapTools and the Internet. We present the nationwide “Proyecto Conéctate al Conocimiento” in Panama as an example of how concept mapping, together with technology, can be adopted by hundreds of schools as a means to enhance meaningful learning.

  13. Patient experience should be part of meaningful-use criteria.

    PubMed

    Ralston, James D; Coleman, Katie; Reid, Robert J; Handley, Matthew R; Larson, Eric B

    2010-04-01

    The proposed federal "meaningful use" criteria for electronic health records include the direct engagement of patients in their care. In this study, we sought to describe the adoption and use of online services linked to the electronic health record at Group Health Cooperative. By August 2009, six years after the introduction of these services, 30 percent of outpatient "encounters" were actually conducted through secure electronic messaging. Meanwhile, 10 percent of enrollees reviewed medical test results online, while 10 percent went online to request medication refills. These results highlight the need to measure the patient experience as part of meaningful use and to enact policies supporting online and phone communication by patients and providers. PMID:20368589

  14. Using different methods to assess the discomfort during car driving.

    PubMed

    Ravnik, David; Otáhal, Stanislav; Dodic Fikfak, Metoda

    2008-03-01

    This study investigated the discomfort caused by car driving. Discomfort estimates were obtained by a self-administered questionnaire, measured by different testing methods, and through the goniometry of principal angles. Data from a total of 200 non-professional drivers who completed the questionnaire were analysed. 118 subjects were analysed by goniometry and 30 drivers were assessed using the OWAS (Ovako Working Posture Analysis), RULA (Rapid Upper Limb Assessment), and CORLETT tests. The aim of this paper was to assess the occurrence of discomfort and to find correlations with drivers' postures. Results suggest that different levels of discomfort are perceived in different body regions when driving cars. Differences in discomfort appear mostly between the genders. With the questionnaire and the different estimation techniques, it is possible to identify 'at risk' drivers and ensure urgent attention when necessary. It can be concluded that the questionnaire and the CORLETT test are good at predicting the location of discomfort. The Borg CR10 scale is a good indicator of the level of discomfort, while OWAS and RULA can appraise body posture to predict the appearance of discomfort. According to the goniometry data, the driver's posture could be one of the contributing factors in the appearance of discomfort.

  15. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    PubMed Central

    MacDonell, Margaret M.; Haroun, Lynne A.; Teuschler, Linda K.; Rice, Glenn E.; Hertzberg, Richard C.; Butler, James P.; Chang, Young-Soo; Clark, Shanna L.; Johns, Alan P.; Perry, Camarie S.; Garcia, Shannon S.; Jacobi, John H.; Scofield, Marcienne A.

    2013-01-01

    The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities. PMID:23762048

  16. Medical Imaging Image Quality Assessment with Monte Carlo Methods

    NASA Astrophysics Data System (ADS)

    Michail, C. M.; Karpetas, G. E.; Fountos, G. P.; Kalyvas, N. I.; Martini, Niki; Koukou, Vaia; Valais, I. G.; Kandarakis, I. S.

    2015-09-01

    The aim of the present study was to assess the image quality of PET scanners through a thin layer chromatography (TLC) plane source. The source was simulated using a previously validated Monte Carlo model. The model was developed using the GATE MC package, and reconstructed images were obtained with the STIR software for tomographic image reconstruction, with cluster computing. The PET scanner simulated in this study was the GE DiscoveryST. A plane source consisting of a TLC plate was simulated as a layer of silica gel on an aluminum (Al) foil substrate immersed in an 18F-FDG bath solution (1 MBq). Image quality was assessed in terms of the Modulation Transfer Function (MTF). MTF curves were estimated from transverse reconstructed images of the plane source. Images were reconstructed by the maximum likelihood estimation (MLE)-OSMAPOSL algorithm. OSMAPOSL reconstruction was assessed by using various subsets (3 to 21) and iterations (1 to 20), as well as various beta (hyper)parameter values. MTF values were found to increase up to the 12th iteration and to remain almost constant thereafter. MTF improves by using lower beta values. The simulated PET evaluation method based on the TLC plane source can also be useful in research for the further development of PET and SPECT scanners through GATE simulations.
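
    For readers unfamiliar with MTF estimation, the following minimal sketch (Python, assuming NumPy) derives an MTF from a one-dimensional spread-function profile by Fourier transform and normalization; the profile, sampling step, and background handling are assumptions for illustration, not the study's GATE/STIR pipeline.

    import numpy as np

    def mtf_from_lsf(lsf, pixel_size_mm):
        """Return spatial frequencies (cycles/mm) and the normalized MTF of a line spread function."""
        lsf = np.asarray(lsf, dtype=float)
        lsf = lsf - lsf.min()        # crude background removal
        lsf = lsf / lsf.sum()        # normalize area to 1
        mtf = np.abs(np.fft.rfft(lsf))
        mtf = mtf / mtf[0]           # MTF(0) = 1 by convention
        freqs = np.fft.rfftfreq(lsf.size, d=pixel_size_mm)
        return freqs, mtf

    # Toy usage: a Gaussian-shaped line spread function sampled on a 2 mm grid
    x = np.arange(-20, 21) * 2.0
    freqs, mtf = mtf_from_lsf(np.exp(-0.5 * (x / 4.0) ** 2), pixel_size_mm=2.0)
    print(freqs[:4], mtf[:4])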

  17. Methods to assess the amenability of petroleum hydrocarbons to bioremediation.

    PubMed

    Dobson, Richard; Schroth, Martin H; Schuermann, Andreas; Zeyer, Josef

    2004-04-01

    Bioremediation has achieved acceptance as a cost-effective technique for the remediation of soils and groundwater contaminated with petroleum hydrocarbons (PHC). A range of laboratory techniques to assess the biodegradability and bioavailability of PHCs are presented. Biodegradability and bioavailability are important determinants of the bioremediation performance of PHCs. Novel methods for the assessment of the bioavailability of PHC components are described. The techniques are demonstrated for a hydraulic fluid and a spindle oil from a contaminated site. Biodegradation is measured by oxygen consumption and carbon dioxide production. Bioavailability of the PHCs is estimated based on the PHC-water partitioning of tracer compounds and a novel analysis of gas chromatograms based on Raoult's law. The PHCs tested were only partially biodegradable (< 25% in 78 d) due to the low solubility and likely recalcitrance of some of their components. The combination of techniques outlined is expected to be of use in assessing the likely bioremediation performance of PHCs for which published data are scarce or inadequate.

  18. Meaningful QQ adjustment of TRMM/GPM daily rainfall estimates.

    NASA Astrophysics Data System (ADS)

    Pegram, Geoff; Bardossy, Andras; Sinclair, Scott

    2016-04-01

    In many parts of the world, particularly in Africa, daily raingauge networks are sparse. It is therefore sensible to use remote sensing estimates of precipitation to fill the gaps, but readily available products like TRMM and its successor GPM are frequently found to be biased. This presentation describes a method of bias adjustment of TRMM using quantile-quantile (QQ) transforms of the probability distributions of TRMM daily rainfall accumulations over its grid of 0.25 degree pixels/blocks. There are 4 main steps in the procedure. The first is to collect the daily gauge readings in those TRMM pixels containing gauges to obtain useful estimates of spatial rainfall for ground referencing. These estimates need to be adjusted from gauge to areal estimates, taking the number of gauges in each pixel into account. We found that the distributions of the areal rainfall estimates are influenced by the number of gauges in each block, so we devised a means of transforming point to areal rainfall meaningfully. The second step is to determine the parameters of the probability distributions of the gauge-based block areal rainfall; we found that the two-parameter Weibull distribution is a suitable and useful choice. The pairs of Weibull parameters of rainfall on many blocks are correlated. To enable their interpolation, as an intermediate step, they have to be decorrelated using canonical decomposition. These transformed parameter pairs are then separately interpolated to empty blocks over the region of choice. They are then back-transformed at each TRMM pixel to Weibull parameters to provide gauge-referenced daily rainfall distributions. The third step is to determine the Weibull distributions of the TRMM daily rainfall estimates in each block, based on their brief 11-year history. The fourth and last step is to QQ transform the individual daily TRMM rainfall estimates via the interpolated gauge-block rainfall distributions. This procedure achieves the desired corrected
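
    The final QQ step can be written compactly; the sketch below (Python, assuming SciPy) maps a satellite wet-day rainfall amount through the satellite Weibull CDF and the inverse of the gauge-referenced Weibull CDF. The shape and scale values are invented for illustration; in the procedure above these parameters are fitted and interpolated per pixel.

    from scipy.stats import weibull_min

    def qq_adjust(rain_sat_mm, sat_shape, sat_scale, gauge_shape, gauge_scale):
        """Quantile-map a satellite daily rainfall amount onto the gauge-referenced distribution."""
        p = weibull_min.cdf(rain_sat_mm, c=sat_shape, scale=sat_scale)
        return weibull_min.ppf(p, c=gauge_shape, scale=gauge_scale)

    # e.g. a 12 mm satellite estimate mapped through hypothetical pixel parameters
    print(qq_adjust(12.0, sat_shape=0.8, sat_scale=6.0, gauge_shape=0.9, gauge_scale=8.0))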

  19. On Detectable and Meaningful Speech-Intelligibility Benefits.

    PubMed

    Whitmer, William M; McShefferty, David; Akeroyd, Michael A

    2016-01-01

    The most important parameter that affects the ability to hear and understand speech in the presence of background noise is the signal-to-noise ratio (SNR). Despite decades of research in speech intelligibility, it is not currently known how much improvement in SNR is needed to provide a meaningful benefit to someone. We propose that the underlying psychophysical basis to a meaningful benefit should be the just noticeable difference (JND) for SNR. The SNR JND was measured in a series of experiments using both adaptive and fixed-level procedures across participants of varying hearing ability. The results showed an average SNR JND of approximately 3 dB for sentences in same-spectrum noise. The role of the stimulus and link to intelligibility was examined by measuring speech-intelligibility psychometric functions and comparing the intelligibility JND estimated from those functions with measured SNR JNDs. Several experiments were then conducted to establish a just meaningful difference (JMD) for SNR. SNR changes that could induce intervention-seeking behaviour for an individual were measured with subjective scaling and report, using the same stimuli as the SNR JND experiment as pre- and post-benefit examples. The results across different rating and willingness-to-change tasks showed that the mean ratings increased near linearly with a change in SNR, but a change of at least 6 dB was necessary to reliably motivate participants to seek intervention. The magnitude of the JNDs and JMDs for speech-intelligibility benefits measured here suggest a gap between what is achievable and what is meaningful.

  20. Providing Professionally Meaningful Recognition to Enhance Frontline Engagement.

    PubMed

    Zwickel, Karen; Koppel, Jenna; Katz, Marie; Virkstis, Katherine; Rothenberger, Sarah; Boston-Fleischhauer, Carol

    2016-01-01

    To achieve transformation in care delivery, frontline nursing staff must be committed to their organization's mission, engaged in their work, and capable of delivering high-quality care. However, data from Advisory Board Survey Solutions show that, when compared with other frontline staff, nurses are the least engaged and most disengaged. In this article, the authors describe strategies for addressing a top opportunity for improving nurse engagement: ensuring nurses feel meaningfully recognized for their professional impact. PMID:27442898

  1. Integrating patients into meaningful real-world research.

    PubMed

    Bartlett, Susan J; Barnes, Teresa; McIvor, R Andrew

    2014-02-01

    Research in respiratory, sleep, and critical care medicine has historically been the domain of scientists and clinicians attempting to understand pathophysiological mechanisms and consequences of disease in an effort to develop effective treatments. This traditional approach of placing scientific rigor before the patient's reality is changing. There is growing recognition of the importance of integrating patient perspectives (e.g., preferences, expectations, and expanded definitions of what constitutes "successful" outcomes) into clinical research to achieve meaningful results for a broader group of stakeholders. This evolution is reflected in the growth of patient-centered organizations and patient advocacy groups that seek to meaningfully integrate patients into the process of prioritizing research needs and creating alliances wherein patients and researchers can partner together to accomplish research goals. In tandem, a growing number of real-world trials (i.e., those with broader, more representative patient populations and routine care pathways) now complement findings from traditional randomized controlled trials and offer new opportunities to design studies that better reflect patients' healthcare preferences and experiences. Patients' perspectives are key determinants of treatment adherence and outcomes, as well as the feasibility and likely value of implementing care pathways. The advent of smartphone and push technologies offer new opportunities for the collection of more patient-centered and ecologically valid patient data, thereby adding new dimensions to meaningfully integrate patients into real-world research. PMID:24559023

  2. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed.

  3. Biomonitoring as a method for assessing exposure to perchlorate.

    PubMed

    Blount, Benjamin C; Valentín-Blasini, Liza

    2007-09-01

    Biomonitoring provides direct and quantitative information regarding human exposure to environmental toxicants, such as perchlorate (ClO(4)(-)). Because of concerns surrounding widespread exposure to ClO(4)(-), we are using biomonitoring methods to assess exposure to ClO(4)(-) and other physiologically relevant anions that can impact iodide uptake by the thyroid. These methods quantify ClO(4)(-), thiocyanate, nitrate, and iodide in human urine, milk, serum, blood spots, amniotic fluid, and infant formula using ion chromatography coupled with electrospray ionization tandem mass spectrometry. In this paper we summarize recent ClO(4)(-) biomonitoring research and provide three additional examples of the utility of biomonitoring for characterizing ClO(4)(-) exposure. Specifically, we examine variability in ClO(4)(-) excretion, compare the relative importance of different exposure sources in adults, and estimate ClO(4)(-) exposure in formula-fed infants. These applications provide examples of how biomonitoring can improve individual exposure assessment. Individual biomarker data can subsequently be compared with individual thyroid function data to better evaluate potential linkage between ClO(4)(-) exposure and health. PMID:17822374

  4. Terrestrial Method for Airborne Lidar Quality Control and Assessment

    NASA Astrophysics Data System (ADS)

    Alsubaie, N. M.; Badawy, H. M.; Elhabiby, M. M.; El-Sheimy, N.

    2014-11-01

    Most LiDAR systems do not provide the end user with calibration and acquisition procedures that can be used to validate the quality of the data acquired by the airborne system. Such systems therefore need data Quality Control (QC) and assessment procedures to verify the accuracy of the laser footprints, particularly at building edges. This research paper introduces an efficient method for validating the quality of airborne LiDAR point cloud data using terrestrial laser scanning data integrated with edge detection techniques. The method is based on detecting building edges from these two independent systems. The building edges are extracted from the airborne data using an algorithm based on the standard deviation of the heights of neighbouring points around each centre point within a given search radius, compared against a threshold. The algorithm is adaptive to different point densities. The approach is combined with another innovative edge detection technique for terrestrial laser scanning point clouds that is based on height and point density constraints. Finally, statistical analysis and assessment are applied to compare the two systems in terms of edge detection precision, as a preliminary step for 3D city modelling generated from heterogeneous LiDAR systems.
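
    The airborne edge-flagging idea described above can be sketched as follows (Python, assuming NumPy and SciPy): for every point, the standard deviation of neighbouring heights within a search radius is compared against a threshold. The radius, threshold, and toy point-cloud layout are illustrative assumptions, not the paper's settings.

    import numpy as np
    from scipy.spatial import cKDTree

    def flag_edge_points(xyz, radius=1.5, std_threshold=0.5, min_neighbours=3):
        """Return a boolean mask of candidate edge points in an (N, 3) point cloud."""
        xyz = np.asarray(xyz, dtype=float)
        tree = cKDTree(xyz[:, :2])  # neighbour search in the horizontal plane
        edge_mask = np.zeros(len(xyz), dtype=bool)
        for i, neighbours in enumerate(tree.query_ball_point(xyz[:, :2], r=radius)):
            if len(neighbours) >= min_neighbours:
                edge_mask[i] = np.std(xyz[neighbours, 2]) > std_threshold
        return edge_mask

    # Toy usage: a flat patch with a 3 m height step (a roof edge) in the middle
    rng = np.random.default_rng(0)
    pts = rng.uniform(0.0, 10.0, size=(500, 2))
    z = np.where(pts[:, 0] > 5.0, 3.0, 0.0) + rng.normal(0.0, 0.02, 500)
    print(int(flag_edge_points(np.column_stack([pts, z])).sum()), "candidate edge points")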

  5. Novel Method for Border Irregularity Assessment in Dermoscopic Color Images

    PubMed Central

    Jaworek-Korjakowska, Joanna

    2015-01-01

    Background. One of the most important lesion features predicting malignancy is border irregularity. Accurate assessment of irregular borders is clinically important due to their significantly different occurrence in benign and malignant skin lesions. Method. In this research, we present a new approach for the detection of border irregularities, one of the major parameters in a widely used diagnostic algorithm, the ABCD rule of dermoscopy. The proposed work is focused on designing an efficient automatic algorithm containing the following steps: image enhancement, lesion segmentation, borderline calculation, and irregularities detection. The challenge lies in determining the exact borderline. For solving this problem we have implemented a new method based on lesion rotation and borderline division. Results. The algorithm has been tested on 350 dermoscopic images and achieved an accuracy of 92%, indicating that the proposed computational approach captured most of the irregularities and provides reliable information for effective skin mole examination. Compared to the state of the art, we obtained improved classification results. Conclusions. The current study suggests that a computer-aided system is a practical tool for dermoscopic image assessment and could be recommended for both research and clinical applications. The proposed algorithm can be applied in different fields of medical image analysis including, for example, CT and MRI images. PMID:26604980

  6. Interactive Rapid Dose Assessment Model (IRDAM): reactor-accident assessment methods. Vol. 2

    SciTech Connect

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness, the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM), is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This document describes the technical bases for IRDAM including methods, models and assumptions used in calculations. IRDAM calculates whole body (5-cm depth) and infant thyroid doses at six fixed downwind distances between 500 and 20,000 meters. Radionuclides considered primarily consist of noble gases and radioiodines. In order to provide a rapid assessment capability consistent with the capacity of the Osborne-1 computer, certain simplifying approximations and assumptions are made. These are described, along with default values (assumptions used in the absence of specific input), in the text of this document. Two companion volumes to this one provide additional information on IRDAM. The User's Guide (NUREG/CR-3012, Volume 1) describes the setup and operation of equipment necessary to run IRDAM. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios.

  7. Assessment study of lichenometric methods for dating surfaces

    NASA Astrophysics Data System (ADS)

    Jomelli, Vincent; Grancher, Delphine; Naveau, Philippe; Cooley, Daniel; Brunstein, Daniel

    2007-04-01

    In this paper, we discuss the advantages and drawbacks of the most classical approaches used in lichenometry. In particular, we perform a detailed comparison among methods based on the statistical analysis of either the largest lichen diameters recorded on geomorphic features or the frequency of all lichens. To assess the performance of each method, a careful comparison design with well-defined criteria is proposed and applied to two distinct data sets. First, we study 350 tombstones. This represents an ideal test bed because tombstone dates are known and, therefore, the quality of the estimated lichen growth curve can be easily tested for the different techniques. Secondly, 37 moraines from two tropical glaciers are investigated. This analysis corresponds to our real case study. For both data sets, we apply our list of criteria, which reflects precision, error measurements and their theoretical foundations when proposing estimated ages and their associated confidence intervals. From this comparison, it clearly appears that two methods, the mean of the n largest lichen diameters and the recent Bayesian method based on extreme value theory, offer the most reliable estimates of moraine and tombstone dates. Concerning the spread of the error, the latter approach provides the smallest uncertainty and it is the only one that takes advantage of the statistical nature of the observations by fitting an extreme value distribution to the largest diameters.
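
    As a small companion to the extreme-value approach mentioned above, the sketch below (Python, assuming SciPy) fits a generalized extreme value (GEV) distribution to a set of largest lichen diameters. It only illustrates the distribution fit; the diameters are invented, and the Bayesian dating machinery of the cited method is not reproduced here.

    import numpy as np
    from scipy.stats import genextreme

    # Hypothetical largest-lichen diameters (mm) measured on one surface
    largest_diameters_mm = np.array([42.0, 47.5, 39.8, 51.2, 44.6, 46.1, 40.9, 49.3])

    shape, loc, scale = genextreme.fit(largest_diameters_mm)  # maximum-likelihood GEV fit
    print("GEV parameters:", shape, loc, scale)
    print("90th percentile diameter:", genextreme.ppf(0.9, shape, loc=loc, scale=scale))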

  8. Improving Academic Program Assessment: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Rodgers, Megan; Grays, Makayla P.; Fulcher, Keston H.; Jurich, Daniel P.

    2013-01-01

    Starting with the premise that better assessment leads to more informed decisions about student learning, we investigated the factors that lead to assessment improvement. We used "meta-assessment" (i.e., evaluating the assessment process) to identify academic programs in which the assessment process had improved over a two-year period.…

  9. Expanding the Aperture of Psychological Assessment: Introduction to the Special Section on Innovative Clinical Assessment Technologies and Methods

    ERIC Educational Resources Information Center

    Trull, Timothy J.

    2007-01-01

    Contemporary psychological assessment is dominated by tried-and-true methods like clinical interviewing, self-report questionnaires, intellectual assessment, and behavioral observation. These approaches have served as the mainstays of psychological assessment for decades. To be sure, these methods have survived over the years because clinicians…

  10. Hydrogeological Methods for Assessing Feasibility of Artificial Recharge

    NASA Astrophysics Data System (ADS)

    Kim, Y.; Koo, M.; Lee, K.; Moon, D.; Barry, J. M.

    2009-12-01

    This study presents hydrogeological methods to assess the feasibility of artificial recharge on Jeju Island, Korea, for securing sustainable groundwater resources and managing severe floods. Jeju-friendly Aquifer Recharge Technology (J-ART) is being developed in this study by capturing ephemeral stream water without interfering with the environment, such as natural recharge or the ecosystem, storing the flood water in reservoirs, recharging it through designed boreholes after appropriate water treatment, and then making it available for use at down-gradient production wells. Many hydrogeological methods, including physico-chemical surface water and groundwater monitoring, geophysical surveys, stable isotope analysis, and groundwater modeling, have been employed to predict and assess the flow and circulation of the artificially recharged surface water between the recharge area and the discharge area. In the physico-chemical water monitoring survey, analyses of surface water level and velocity, of water quality including turbidity, and of suspended soil settling velocity were performed. To characterize subsurface hydrogeologic properties, injection tests were executed; the results indicate transmissivities of 118-336 m2/day and maximum intake water capacities of 4,367-11,032 m3/day. Characterizing groundwater flow from the recharge area to the discharge area is needed to assess the efficiency of J-ART. Resistivity logging was carried out to predict water flow in the unsaturated zone during artificial recharge, based on inverse modeling and resistivity change patterns. Stable isotopes of deuterium and oxygen-18 of surface waters and groundwaters have been determined to interpret mixing and flow in groundwaters impacted by artificial recharge. A numerical model simulating groundwater flow and heat transport to assess the feasibility of artificial recharge has been developed using the hydraulic properties of aquifers, groundwater levels, borehole temperatures, and meteorological

  11. Assessment of Proper Bonding Methods and Mechanical Characterization FPGA CQFPs

    NASA Technical Reports Server (NTRS)

    Davis, Milton C.

    2008-01-01

    This presentation discusses fractured leads on a field-programmable gate array (FPGA) during flight vibration. Actions taken to determine the root cause and resolution of the failure include finite element analysis (FEA), vibration testing, scanning electron microscopy (with X-ray microanalysis), and energy dispersive spectrometry (SEM/EDS) failure assessment. Bonding methods for surface mount parts are assessed, including critical analysis and assessment of random fatigue damage. Regarding ceramic quad flat pack (CQFP) lead fracture, after disassembling the attitude control electronics (ACE) configuration, photographs showed six cracked leads on the FPGA RTSX72SU-1 CQ208B package located on the RWIC card. An identical package (FPGA RTSX32SU-1 CQ208B) mounted on the RWIC did not exhibit cracked pins due to vibration. FPGA lead failure theories include workmanship issues in the lead-forming, a material defect in the leads of the FPGA packages, and insecure mounting of the board in the card guides, among other theories. Studies were conducted using simple calculations to determine the response and fatigue life of the package. Shorter packages exhibited more response when loaded by out-of-plane displacement of the PCB, while taller packages exhibited more response when loaded by in-plane acceleration of the PCB. Additionally, under-fill did not contribute as much as corner bonding to reducing stress in the leads due to out-of-plane PCB loading or component twisting. The combination of corner bond and under-fill is best to address the mechanical and thermal spacecraft environment. Test results of bonded parts showed reduced (dampened) amplitude and slightly shifted peaks at the un-bonded natural frequency, and an additional response at the bonded frequency. Stress due to PCB out-of-plane loading was decreased in the corners when only a corner bond was used. Future work may address CQFP fatigue assessment, including the investigation of discrepancy in predicted fatigue damage, as well as

  12. Alternative Assessment Methods Based on Categorizations, Supporting Technologies, and a Model for Betterment

    ERIC Educational Resources Information Center

    Ben-Jacob, Marion G.; Ben-Jacob, Tyler E.

    2014-01-01

    This paper explores alternative assessment methods from the perspective of categorizations. It addresses the technologies that support assessment. It discusses initial, formative, and summative assessment, as well as objective and subjective assessment, and formal and informal assessment. It approaches each category of assessment from the…

  13. CoverCAM - a land cover composition change assessment method

    NASA Astrophysics Data System (ADS)

    Ali, A.; Bie, C. D.; Skidmore, A. K.

    2013-12-01

    The cover composition on a specific piece of land can change over time due to natural and anthropogenic factors. Accurate detection of where and when changes occur requires a method that uses remotely sensed imagery representing a continuous and consistent record of the state of the green land cover. Such data are offered through hyper-temporal NDVI imagery. Until recently, NDVI images were mainly used for anomaly mapping to monitor the influence of weather on vegetation; this monitoring basically assumes that, over time, the land cover composition of a studied area remains static. This study presents a novel cover change assessment method, labelled 'CoverCAM', that extracts from hyper-temporal NDVI imagery the probability that the original land cover composition changed. CoverCAM, unlike existing change-detection methods, makes adjustments based on seasonal NDVI anomalies experienced at landscape level. We tested the method by processing SPOT-VGT NDVI imagery (10-day Maximum Value Composites; 1 km pixels) for Andalucía, Spain. CoverCAM requires that two time periods be specified: a reference period (we used 2000-04) and a change detection period (we used 2005-10). All images of the reference period were classified using the ISODATA algorithm and by evaluating divergence statistics. The generated map depicts strata (groups of polygons) characterized by temporal NDVI and standard deviation (SD) profiles. For the change assessment period, mean NDVI values were first calculated by decade and polygon (NDVId,p); then, for each pixel of the polygon, a pixel change value was specified as the remaining difference between the pixel NDVI and [NDVId,p × the SD value of the stratum for that decade]. The above process was repeated to produce decadal land cover change probability maps, each with its own undefined scale. The decadal change maps were then aggregated to annual change probability maps. This validation was only carried out for

  14. Comparison of methods for assessing thyroid function in nonthyroidal illness

    SciTech Connect

    Melmed, S.; Geola, F.L.; Reed, A.W.; Pekary, A.E.; Park, J.; Hershman, J.M.

    1982-02-01

    Various tests of thyroid function were studied in sick patients with nonthyroidal illness (NTI) in order to determine the utility of each test for differentiating these patients from a group with hypothyroidism. We evaluated each test in 22 healthy volunteers who served as controls, 20 patients with hypothyroidism, 14 patients admitted to a medical intensive care unit whose serum T4 was less than 5 µg/dl, 13 patients with chronic liver disease, 32 patients on chronic hemodialysis for renal failure, 13 ambulatory oncology patients receiving chemotherapy, 16 pregnant women, 7 women on estrogens, and 20 hyperthyroid patients. On all samples, we measured serum T4, the free T4 index by several methods, free T4 by equilibrium dialysis, free T4 calculated from thyronine-binding globulin (TBG) RIA, free T4 by three commercial kits (Gammacoat, Immophase, and Liquisol), T3, rT3, and TSH (by 3 different RIAs). Although all of the methods used for measuring free T4 (including the free T4 index, free T4 by dialysis, free T4 assessed by TBG, and free T4 assessed by the 3 commercial kits) were excellent for the diagnosis of hypothyroidism, hyperthyroidism, and euthyroidism in the presence of high TBG, none of these methods showed that free T4 was consistently normal in patients with NTI; with each method, a number of NTI patients had subnormal values. In the NTI groups, free T4 measured by dialysis and the free T4 index generally correlated significantly with the commercial free T4 methods. Serum rT3 was elevated or normal in NTI patients and low in hypothyroid subjects. Serum TSH provided the most reliable differentiation between patients with primary hypothyroidism and those with NTI and low serum T4 levels.

  15. Dental age assessment among Tunisian children using the Demirjian method

    PubMed Central

    Aissaoui, Abir; Salem, Nidhal Haj; Mougou, Meryam; Maatouk, Fethi; Chadly, Ali

    2016-01-01

    Context: Since the Demirjian system of estimating dental maturity was first described, many researchers from different countries have tested its accuracy among diverse populations. Some of these studies have pointed out a need to determine population-specific standards. Aim: The aim of this study is to evaluate the suitability of the Demirjian method for dental age assessment in Tunisian children. Materials and Methods: This is a prospective study previously approved by the Research Ethics Local Committee of the University Hospital Fattouma Bourguiba of Monastir (Tunisia). Panoramic radiographs of 280 healthy Tunisian children aged 2.8–16.5 years were examined with the Demirjian method and scored by three trained observers. Statistical Analysis Used: Dental age was compared to chronological age by using the analysis of variance (ANOVA) test. Cohen's Kappa test was performed to calculate the intra- and inter-examiner agreements. Results: Underestimation was seen in children aged between 9 and 16 years, and the range of accuracy varied from −0.02 to 3 years. The advancement in dental age as determined by the Demirjian system when compared to chronological age ranged from 0.3 to 1.32 years for young males and from 0.26 to 1.37 years for young females (ages ranged from 3 to 8 years). Conclusions: The standards provided by Demirjian for French-Canadian children may not be suitable for Tunisian children. Each population of children may need its own specific standard for an accurate estimation of chronological age. PMID:27051223

  16. Assessment of substitution model adequacy using frequentist and Bayesian methods.

    PubMed

    Ripplinger, Jennifer; Sullivan, Jack

    2010-12-01

    In order to have confidence in model-based phylogenetic methods, such as maximum likelihood (ML) and Bayesian analyses, one must use an appropriate model of molecular evolution identified using statistically rigorous criteria. Although model selection methods such as the likelihood ratio test and Akaike information criterion are widely used in the phylogenetic literature, model selection methods lack the ability to reject all models if they provide an inadequate fit to the data. There are two methods, however, that assess absolute model adequacy, the frequentist Goldman-Cox (GC) test and Bayesian posterior predictive simulations (PPSs), which are commonly used in conjunction with the multinomial log likelihood test statistic. In this study, we use empirical and simulated data to evaluate the adequacy of common substitution models using both frequentist and Bayesian methods and compare the results with those obtained with model selection methods. In addition, we investigate the relationship between model adequacy and performance in ML and Bayesian analyses in terms of topology, branch lengths, and bipartition support. We show that tests of model adequacy based on the multinomial likelihood often fail to reject simple substitution models, especially when the models incorporate among-site rate variation (ASRV), and normally fail to reject less complex models than those chosen by model selection methods. In addition, we find that PPSs often fail to reject simpler models than the GC test. Use of the simplest substitution models not rejected based on fit normally results in similar but divergent estimates of tree topology and branch lengths. In addition, use of the simplest adequate substitution models can affect estimates of bipartition support, although these differences are often small with the largest differences confined to poorly supported nodes. We also find that alternative assumptions about ASRV can affect tree topology, tree length, and bipartition support. Our

  17. Assessments of lung digestion methods for recovery of fibers

    SciTech Connect

    Warheit, D.B.; Hwang, H.C.; Achinko, L. )

    1991-04-01

    Evaluation of the pulmonary hazards associated with exposure to fibrous materials tends to be more complicated than assessments required for particulate materials. Fibers are defined by aspect ratios and it is generally considered that physical dimensions play an important role in the pathogenesis of fiber-related lung diseases. Several digestion techniques have been used to recover fibers from exposed lung tissue for clearance studies. Because many of the digestion fluids are corrosive (e.g., bleach, KOH), it is conceivable that the dimensions of recovered fibers are modified during the tissue digestion process, thus creating erroneous data. Accordingly, the authors evaluated two lung digestion methods to assess whether the physical dimensions of bulk samples of fibers were altered following simulated digestion processing. Aliquots of crocidolite and chrysotile asbestos, Kevlar aramid, wollastonite, polyacrylonitrile (pan)-based carbon, and glass fibers were incubated with either saline, bleach, or KOH and then filtered. Scanning electron microscopy techniques were utilized to measure the physical dimensions (i.e., lengths and diameters) of at least 160 fibers per treatment group of each fiber type. Their results showed that the lengths and diameters of glass fibers and wollastonite were altered after treatment with KOH. In addition, treatment with bleach produced a small reduction in both asbestos fiber-type diameters, and greater changes in Kevlar and wollastonite diameters and carbon fiber lengths.

  18. Rapid assessment methods of resilience for natural and agricultural systems.

    PubMed

    Torrico, Juan C; Janssens, Marc J J

    2010-12-01

    The resilience, ecological function, and quality of both agricultural and natural systems were evaluated in the mountainous region of the Atlantic Rain Forest of Rio de Janeiro through Rapid Assessment Methods. For this goal, new indicators were proposed, such as eco-volume, eco-height, bio-volume, volume efficiency, and a resilience index. The following agricultural and natural systems were compared: (i) vegetables (leaf, fruit and mixed); (ii) citrus; (iii) ecological system; (iv) cattle; (v) silvo-pastoral system; (vi) forest fragment; and (vii) forest in regeneration stage (1, 2 and 3 years old). An alternative measure (index) of resilience was proposed by considering the actual bio-volume as a function of the potential eco-volume. The objectives and hypotheses were fulfilled; it is shown that there exists a high positive correlation between the resilience index, biomass, energy efficiency, and biodiversity. Cattle and vegetable systems have the lowest resilience, whilst ecological and silvo-pastoral systems have the greatest resilience. This new approach offers a rapid, though valuable, assessment tool for ecological studies, agricultural development, and landscape planning, particularly in tropical countries.
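
    Purely as an illustration of the proposed index, the snippet below (Python) assumes the resilience index can be expressed as the ratio of actual bio-volume to potential eco-volume; the abstract does not give the exact formula, so both the functional form and the numbers are placeholders.

    def resilience_index(bio_volume_m3: float, eco_volume_m3: float) -> float:
        """Assumed illustrative form: realized bio-volume as a fraction of potential eco-volume."""
        return bio_volume_m3 / eco_volume_m3

    print(resilience_index(bio_volume_m3=1800.0, eco_volume_m3=2500.0))  # 0.72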

  19. Viscoelastic Methods of Blood Clotting Assessment – A Multidisciplinary Review

    PubMed Central

    Benes, Jan; Zatloukal, Jan; Kletecka, Jakub

    2015-01-01

    Viscoelastic methods (VEM) have made bedside assessment of blood clotting available. Unlike standard laboratory tests, the results are based on whole blood coagulation and are available in real time with a much faster turnaround time. In combination with new knowledge about the pathophysiology of trauma-induced coagulopathy, goal-oriented treatment protocols have recently been proposed for the initial management of bleeding in trauma victims. Additionally, the utility of viscoelastic monitoring devices has been proven even outside this setting, in cardiosurgical patients and those undergoing liver transplantation. Many other situations described in the literature show the potential use of bedside analysis of coagulation for the management of bleeding or critically ill patients. In the near future, we may expect further improvement in current bedside diagnostic tools, enabling not only the assessment of secondary hemostasis but also of platelet aggregation. More sensitive assays for new anticoagulants are underway. The aim of this review is to offer the reader a multidisciplinary overview of VEM and their potential use in anesthesiology and critical care. PMID:26442265

  20. Non contact method for in vivo assessment of skin mechanical properties for assessing effect of ageing.

    PubMed

    Boyer, G; Pailler Mattei, C; Molimard, J; Pericoi, M; Laquieze, S; Zahouani, H

    2012-03-01

    The assessment of human tissue properties by objective and quantitative devices is very important for improving the understanding of its mechanical behaviour. The aim of this paper is to present a non-contact method to measure the mechanical properties of human skin in vivo. A complete non-contact device using an air flow system has been developed. Validation and assessment of the method were performed on an inert visco-elastic material. An in vivo study on the forearms of two groups of healthy women, aged 23.2±1.6 and 60.4±2.4 years, was performed. The main parameters assessed are presented and a first interpretation to evaluate the reduced Young's modulus is proposed. Significant differences in the main parameters of the curve are shown with ageing. As tests were performed with different loads, the influence of the stress is also observed. With an air flow force of 10 mN, we found a reduced Young's modulus of 14.38±3.61 kPa for the youngest group and 6.20±1.45 kPa for the oldest group. These values agree with other studies using classical or dynamic indentation. Non-contact testing using the developed device gives convincing results.

  1. Methods for assessing the quality of runoff from Minnesota peatlands

    SciTech Connect

    Clausen, J.C.

    1981-01-01

    The quality of runoff from large, undisturbed peatlands in Minnesota is characterized, and sampling results from a number of bogs (referred to as a multiple watershed approach) were used to assess the effects of peat mining on the quality of bog runoff. Runoff from 45 natural peatlands and one mined bog was sampled five times in 1979-80 and analyzed for 34 water quality characteristics. Peatland watersheds were classified as bog, transition, or fen, based upon both water quality and watershed characteristics. Alternative classification methods were based on frequency distributions, cluster analysis, discriminant analysis, and principal component analysis results. A multiple watershed approach was used as a basis for drawing inferences regarding the quality of runoff from a representative sample of natural bogs and a mined bog. The multiple watershed technique applied provides an alternative to long-term paired watershed experiments in evaluating the effects of land use activities on the quality of runoff from peatlands in Minnesota.

  2. Performance Assessment Method for a Forged Fingerprint Detection Algorithm

    NASA Astrophysics Data System (ADS)

    Shin, Yong Nyuo; Jun, In-Kyung; Kim, Hyun; Shin, Woochang

    The threat of invasion of privacy and of the illegal appropriation of information both increase with the expansion of the biometrics service environment to open systems. However, while certificates or smart cards can easily be cancelled and reissued if found to be missing, there is no way to recover the unique biometric information of an individual following a security breach. With the recognition that this threat factor may disrupt the large-scale civil service operations approaching implementation, such as electronic ID cards and e-Government systems, many agencies and vendors around the world continue to develop forged fingerprint detection technology, but no objective performance assessment method has, to date, been reported. Therefore, in this paper, we propose a methodology designed to evaluate the objective performance of the forged fingerprint detection technology that is currently attracting a great deal of attention.

  3. Comparative Assessment of Advanced Gas Hydrate Production Methods

    SciTech Connect

    M. D. White; B. P. McGrail; S. K. Wurstner

    2009-06-30

    Displacing natural gas and petroleum with carbon dioxide is a proven technology for producing conventional geologic hydrocarbon reservoirs, and producing additional yields from abandoned or partially produced petroleum reservoirs. Extending this concept to natural gas hydrate production offers the potential to enhance gas hydrate recovery with concomitant permanent geologic sequestration. Numerical simulation was used to assess a suite of carbon dioxide injection techniques for producing gas hydrates from a variety of geologic deposit types. Secondary hydrate formation was found to inhibit contact of the injected CO2 regardless of injectate phase state, thus diminishing the exchange rate due to pore clogging and hydrate zone bypass of the injected fluids. Additional work is needed to develop methods of artificially introducing high-permeability pathways in gas hydrate zones if injection of CO2 in either gas, liquid, or micro-emulsion form is to be more effective in enhancing gas hydrate production rates.

  4. Development of exposure assessment method with the chamber

    NASA Astrophysics Data System (ADS)

    Kato, N.; Koyama, Y.; Yokoyama, H.; Matsui, Y.; Yoneda, M.

    2015-05-01

    This study aims at developing a measurement method for nanoparticle concentration and at obtaining a representative value of the uniform nanoparticle concentration achieved by chamber ventilation. We constructed a chamber equipped with a HEPA filter and controlled the background nanoparticle concentration by using adequate ventilation. Then, we used a particle generator to evaluate the uniformity of concentration in the chamber. We measured background values and source counts over the particle size distribution by SMPS. In addition, we performed numerical analysis with the CFD model OpenFOAM. As a result, we found no aggregation under the experimental conditions of this study. Although we confirmed that it is difficult to make the nanoparticle concentration uniform, we also found that the simulation results showed high reproducibility. Therefore, we could assess nanoparticle size distribution and concentration in our chamber at this stage.

  5. Reduced-reference image quality assessment using moment method

    NASA Astrophysics Data System (ADS)

    Yang, Diwei; Shen, Yuantong; Shen, Yongluo; Li, Hongwei

    2016-10-01

    Reduced-reference image quality assessment (RR IQA) aims to evaluate the perceptual quality of a distorted image through partial information about the corresponding reference image. In this paper, a novel RR IQA metric is proposed using the moment method. We observe that the first and second moments of the wavelet coefficients of natural images exhibit an approximately regular behaviour that is disturbed by different types of distortions, and that this disturbance is relevant to human perception of quality. We measure the difference in these statistical parameters between the reference and distorted images to predict the visual quality degradation. The introduced IQA metric is suitable for implementation and has relatively low computational complexity. The experimental results on the Laboratory for Image and Video Engineering (LIVE) and Tampere Image Database (TID) image databases indicate that the proposed metric has good predictive performance.
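
    A minimal sketch of the moment idea (Python, assuming NumPy and PyWavelets) is shown below: each wavelet detail subband is summarized by its mean and standard deviation, and the reference and distorted summaries are compared by Euclidean distance. The wavelet, decomposition level, and distance are assumptions, not the paper's exact metric.

    import numpy as np
    import pywt

    def wavelet_moments(image, wavelet="db2", level=3):
        """Compact feature vector of (mean, std) for every detail subband."""
        coeffs = pywt.wavedec2(np.asarray(image, dtype=float), wavelet, level=level)
        features = []
        for detail_level in coeffs[1:]:      # skip the approximation band
            for band in detail_level:        # horizontal, vertical, diagonal details
                features.extend([band.mean(), band.std()])
        return np.array(features)

    def rr_quality_distance(reference, distorted):
        """Larger distance between moment vectors suggests stronger degradation."""
        return float(np.linalg.norm(wavelet_moments(reference) - wavelet_moments(distorted)))

    # Toy usage: compare an image with a noisy copy of itself
    rng = np.random.default_rng(0)
    ref = rng.random((128, 128))
    print(rr_quality_distance(ref, ref + 0.1 * rng.normal(size=ref.shape)))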

  6. Ultrasonic Apparatus and Method to Assess Compartment Syndrome

    NASA Technical Reports Server (NTRS)

    Yost, William T. (Inventor); Ueno, Toshiaki (Inventor); Hargens, Alan R. (Inventor)

    2009-01-01

    A process and apparatus for measuring pressure buildup in a body compartment that encases muscular tissue. The method includes assessing the body compartment configuration and identifying the effect of pulsatible components on compartment dimensions and muscle tissue characteristics. This process is used in preventing tissue necrosis, and in decisions of whether to perform surgery on the body compartment for prevention of Compartment Syndrome. An apparatus is used for measuring pressure build-up in the body compartment having components for imparting ultrasonic waves such as a transducer, placing the transducer to impart the ultrasonic waves, capturing the imparted ultrasonic waves, mathematically manipulating the captured ultrasonic waves and categorizing pressure build-up in the body compartment from the mathematical manipulations.

  7. Methods for assessing uncertainty in fundamental assumptions and associated models for cancer risk assessment.

    PubMed

    Small, Mitchell J

    2008-10-01

    The distributional approach for uncertainty analysis in cancer risk assessment is reviewed and extended. The method considers a combination of bioassay study results, targeted experiments, and expert judgment regarding biological mechanisms to predict a probability distribution for uncertain cancer risks. Probabilities are assigned to alternative model components, including the determination of human carcinogenicity, mode of action, the dosimetry measure for exposure, the mathematical form of the dose-response relationship, the experimental data set(s) used to fit the relationship, and the formula used for interspecies extrapolation. Alternative software platforms for implementing the method are considered, including Bayesian belief networks (BBNs) that facilitate assignment of prior probabilities, specification of relationships among model components, and identification of all output nodes on the probability tree. The method is demonstrated using the application of Evans, Sielken, and co-workers for predicting cancer risk from formaldehyde inhalation exposure. Uncertainty distributions are derived for maximum likelihood estimate (MLE) and 95th percentile upper confidence limit (UCL) unit cancer risk estimates, and the effects of resolving selected model uncertainties on these distributions are demonstrated, considering both perfect and partial information for these model components. A method for synthesizing the results of multiple mechanistic studies is introduced, considering the assessed sensitivities and selectivities of the studies for their targeted effects. A highly simplified example is presented illustrating assessment of genotoxicity based on studies of DNA damage response caused by naphthalene and its metabolites. The approach can provide a formal mechanism for synthesizing multiple sources of information using a transparent and replicable weight-of-evidence procedure. PMID:18844862
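
    The distributional idea can be illustrated by enumerating alternative model components with assigned probabilities and propagating them into a discrete distribution over unit-risk estimates. In the sketch below (Python), the components, probabilities, baseline risk, and multiplicative factors are hypothetical, not values from the formaldehyde application discussed in the abstract.

    from itertools import product

    # Each component: list of (alternative, probability, multiplicative risk factor)
    components = {
        "carcinogenic_in_humans": [("yes", 0.7, 1.0), ("no", 0.3, 0.0)],
        "dose_response_form":     [("linear", 0.5, 1.0), ("sublinear", 0.5, 0.2)],
        "interspecies_scaling":   [("body_weight", 0.4, 1.0), ("surface_area", 0.6, 1.7)],
    }
    baseline_unit_risk = 1e-5  # hypothetical unit risk under the default model

    distribution = {}
    for combo in product(*components.values()):
        probability, risk = 1.0, baseline_unit_risk
        for _, p, factor in combo:
            probability *= p
            risk *= factor
        distribution[risk] = distribution.get(risk, 0.0) + probability

    for risk, probability in sorted(distribution.items()):
        print(f"unit risk {risk:.2e} with probability {probability:.2f}")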

  8. Authentic Assessment in Vocational Education. Trends and Issues Alerts.

    ERIC Educational Resources Information Center

    Kerka, Sandra

    The authentic assessment method of student evaluation is particularly well suited to vocational education. It connects the way schoolwork is assessed with the way knowledge and competence are judged in the workplace by focusing on tasks that are simultaneously meaningful to learners and linked to school and nonschool demands. Portfolios are one…

  9. Visual attentional capture predicts belief in a meaningful world.

    PubMed

    Bressan, Paola; Kramer, Peter; Germani, Mara

    2008-01-01

    Here we show that the automatic, involuntary process of attentional capture is predictive of beliefs that are typically considered as much more complex and higher-level. Whereas some beliefs are well supported by evidence, others, such as the belief that coincidences occur for a reason, are not. We argue that the tendency to assign meaning to coincidences is a byproduct of an adaptive system that creates and maintains cognitive schemata, and automatically directs attention to violations of a currently active schema. Earlier studies have shown that, within subjects, attentional capture increases with schema strength. Yet, between-subjects effects could exist too: whereas each of us has schemata of various strengths, most likely different individuals are differently inclined to maintain strong or weak ones. Since schemata can be interpreted as beliefs, we predict more attentional capture for subjects with stronger beliefs than for subjects with weaker ones. We measured visual attentional capture in a reaction time experiment, and correlated it with scores on questionnaires about religious and other beliefs and about meaningfulness and surprisingness of coincidences. We found that visual attentional capture predicts a belief in meaningfulness of coincidences, and that this belief mediates a relationship between visual attentional capture and religiosity. Remarkably, strong believers were more disturbed by schema violations than weak believers, and yet appeared less aware of the disrupting events. We conclude that (a) religious people have a stronger belief in meaningfulness of coincidences, indicative of a more general tendency to maintain strong schemata, and that (b) this belief leads them to suppress, ignore, or forget information that has demonstrably captured their attention, but happens to be inconsistent with their schemata.

  10. Measuring meaningful learning in the undergraduate chemistry laboratory

    NASA Astrophysics Data System (ADS)

    Galloway, Kelli R.

    The undergraduate chemistry laboratory has been an essential component of chemistry education for over a century. The literature includes reports on investigations of singular aspects of laboratory learning, attempts to measure the efficacy of reformed laboratory curricula, and studies of faculty goals for laboratory learning, which found that instructors share common goals for students: to learn laboratory skills, techniques, and experimental design, and to develop critical thinking skills. These findings are important for improving teaching and learning in the undergraduate chemistry laboratory, but research is needed to connect the faculty goals to student perceptions. This study was designed to explore students' ideas about learning in the undergraduate chemistry laboratory. Novak's Theory of Meaningful Learning was used as a guide for the data collection and analysis choices for this research. Novak's theory states that in order for meaningful learning to occur, the cognitive, affective, and psychomotor domains must be integrated. The psychomotor domain is inherent in the chemistry laboratory, but the extent to which the cognitive and affective domains are integrated is unknown. For meaningful learning to occur in the laboratory, students must actively integrate both the cognitive and affective domains into the "doing" of their laboratory work. The Meaningful Learning in the Laboratory Instrument (MLLI) was designed to measure students' cognitive and affective expectations and experiences within the context of conducting experiments in the undergraduate chemistry laboratory. Evidence for the validity and reliability of the data generated by the MLLI was collected from multiple quantitative studies: a one-semester study at one university, a one-semester study at 15 colleges and universities across the United States, and a longitudinal study in which the MLLI was administered six times during two years of general and organic chemistry laboratory courses. Results from

  11. Organisational impact: Definition and assessment methods for medical devices.

    PubMed

    Roussel, Christophe; Carbonneil, Cédric; Audry, Antoine

    2016-02-01

    Health technology assessment (HTA) is a rapidly developing area and the value of taking non-clinical fields into consideration is growing. Although the health-economic aspect is commonly recognised, evaluating organisational impact has not been studied nearly as much. The goal of this work was to provide a definition of organisational impact in the sector of medical devices by defining its contours and exploring the evaluation methods specific to this field. Following an analysis of the literature concerning the impact of technologies on organisations as well as the medical literature, and also after reviewing the regulatory texts in this respect, the group of experts identified 12 types of organisational impact. A number of medical devices were carefully screened using the criteria grid, which proved to be operational and to differentiate properly. From the analysis of the practice and of the methods described, the group was then able to derive a few guidelines for successfully evaluating organisational impact. This work shows that taking organisational impact into consideration may be critical alongside the other criteria currently favoured (clinical and economic). What remains is to give this factor a role in the decision-making process, one that also satisfies the economic efficiency principle. PMID:27080633

  13. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To chart progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  14. New method for the assessment of molluscum contagiosum virus infectivity.

    PubMed

    Sherwani, Subuhi; Blythe, Niamh; Farleigh, Laura; Bugert, Joachim J

    2012-01-01

    Molluscum contagiosum virus (MCV), a poxvirus pathogenic for humans, replicates well in human skin in vivo, but not in vitro in standard monolayer cell cultures. In order to determine the nature of the replication deficiency in vitro, the MCV infection process in standard culture has to be studied step by step. The method described in this chapter uses luciferase and GFP reporter constructs to measure poxviral mRNA transcription activity in cells in standard culture infected with known quantities of MCV or vaccinia virus. Briefly, MCV isolated from human tissue specimen is quantitated by PCR and used to infect human HEK293 cells, selected for ease of transfection. The cells are subsequently transfected with a reporter plasmid encoding firefly luciferase gene under the control of a synthetic early/late poxviral promoter and a control plasmid encoding a renilla luciferase reporter under the control of a eukaryotic promoter. After 16 h, cells are harvested and tested for expression of luciferase. MCV genome units are quantitated by PCR targeting a genome area conserved between MCV and vaccinia virus. Using a GFP reporter plasmid, this method can be further used to infect a series of epithelial and fibroblast-type cell lines of human and animal origin to microscopically visualize MCV-infected cells, to assess late promoter activation, and, using these parameters, to optimize MCV infectivity and gene expression in more complex eukaryotic cell culture models. PMID:22688765

  15. Assessment of Methods for the Intracellular Blockade of GABAA Receptors.

    PubMed

    Atherton, Laura A; Burnell, Erica S; Mellor, Jack R

    2016-01-01

    Selective blockade of inhibitory synaptic transmission onto specific neurons is a useful tool for dissecting the excitatory and inhibitory synaptic components of ongoing network activity. To achieve this, intracellular recording with a patch solution capable of blocking GABAA receptors has advantages over other manipulations, such as pharmacological application of GABAergic antagonists or optogenetic inhibition of populations of interneurones, in that the majority of inhibitory transmission is unaffected and hence the remaining network activity preserved. Here, we assess three previously described methods to block inhibition: intracellular application of the molecules picrotoxin, 4,4'-dinitro-stilbene-2,2'-disulphonic acid (DNDS) and 4,4'-diisothiocyanostilbene-2,2'-disulphonic acid (DIDS). DNDS and picrotoxin were both found to be ineffective at blocking evoked, monosynaptic inhibitory postsynaptic currents (IPSCs) onto mouse CA1 pyramidal cells. An intracellular solution containing DIDS and caesium fluoride, but lacking nucleotides ATP and GTP, was effective at decreasing the amplitude of IPSCs. However, this effect was found to be independent of DIDS, and the absence of intracellular nucleotides, and was instead due to the presence of fluoride ions in this intracellular solution, which also blocked spontaneously occurring IPSCs during hippocampal sharp waves. Critically, intracellular fluoride ions also caused a decrease in both spontaneous and evoked excitatory synaptic currents and precluded the inclusion of nucleotides in the intracellular solution. Therefore, of the methods tested, only fluoride ions were effective for intracellular blockade of IPSCs but this approach has additional cellular effects reducing its selectivity and utility. PMID:27501143

  16. Exploring non-invasive methods to assess pain in sheep.

    PubMed

    Stubsjøen, Solveig M; Flø, Andreas S; Moe, Randi O; Janczak, Andrew M; Skjerve, Eystein; Valle, Paul S; Zanella, Adroaldo J

    2009-12-01

    The aim of this study was to determine whether changes in eye temperature, measured using infrared thermography (IRT), and heart rate variability (HRV) can detect moderate levels of pain in sheep. Six ewes received the following treatments: 1) a noxious ischaemic stimulus by application of a forelimb tourniquet (S), 2) the noxious ischaemic stimulus plus flunixin meglumine (S+F), and 3) flunixin meglumine alone (F). Maximum eye temperature, HRV, mechanical nociceptive threshold, blood pressure and behaviour were recorded for up to 60 min, including 15 min of baseline, 30 min during intervention and 15 min post-intervention. There was a tendency towards a decrease in the heart rate variability parameters RMSSD (the root mean square of successive R-R intervals) and SDNN (the standard deviation of all interbeat intervals) in treatment S compared to treatment F, and a significant increase in the same parameters between test days 1 and 3. A reduction in eye temperature was detected for all treatments during intervention, but no difference was found between S and F or between S+F and F during intervention. Eye temperature decreased more on test days 2 and 3 than on test day 1 during intervention. A significant reduction in both lip licking and vocalisation was observed between test days 1 and 3, and forward-facing ears was the ear posture most frequently recorded on test day 1. We suggest that HRV is a sensitive, non-invasive method to assess mild to moderate pain in sheep, whereas IRT is a less sensitive method.

  17. Indirect methods of dried sewage sludge contamination assessments.

    PubMed

    Werle, Sebastian; Dudziak, Mariusz; Grübel, Klaudiusz

    2016-07-28

    Thermal conversion (combustion, co-combustion, gasification and pyrolysis) appears to be the most promising alternative for sewage sludge management in the future. Nevertheless, safe and ecological use of sewage sludge as a fuel requires information about its contamination. The aim of this paper is to present photoacoustic spectroscopy (PAS) as a suitable method for contamination assessment of dried sewage sludge. Two types of granular sewage sludge were analysed: sewage sludge 1 (SS1), taken from a Polish wastewater treatment plant operating in a mechanical-biological system, and sewage sludge 2 (SS2), taken from a mechanical-biological-chemical wastewater treatment plant with phosphorus precipitation. The spectrophotometer FTIR Nicolet 6700 equipped with a photoacoustic cell (Model 300, MTEC, USA) was used. A comparison with a widely used analytical method (GC-MS) was also made. The PAS results confirm the difference between SS1 and SS2, in agreement with the GC-MS analysis. Higher absorbance was observed for SS1 than for SS2 at each wavelength characteristic of the oscillators of the chemical moieties. PMID:27149560

  18. Assessing Mitochondrial Movement Within Neurons: Manual Versus Automated Tracking Methods.

    PubMed

    Bros, Helena; Hauser, Anja; Paul, Friedemann; Niesner, Raluca; Infante-Duarte, Carmen

    2015-08-01

    Owing to the small size of mitochondria and the complexity of their motility patterns, mitochondrial tracking is technically challenging. Mitochondria are often tracked manually; however, this is time-consuming and prone to measurement error. Here, we examined the suitability of four commercial and open-source software alternatives for automated mitochondrial tracking in neurons compared with manual measurements. We show that all the automated tracking tools dramatically underestimated track length, mitochondrial displacement and movement duration, with reductions ranging from 45 to 77% of the values obtained manually. In contrast, mitochondrial velocity was generally overestimated. Only the number of motile mitochondria and their directionality were similar between strategies. Despite these discrepancies, we show that automated tools successfully detected transport alterations after applying an oxidant agent. Thus, automated methods appear to be suitable for assessing relative transport differences between experimental groups, but not for absolute quantification of mitochondrial dynamics. Although useful for objective and time-efficient measurements of mitochondrial movements, results provided by automated methods should be interpreted with caution.

  19. Chronic intraoral pain--assessment of diagnostic methods and prognosis.

    PubMed

    Pigg, Maria

    2011-01-01

    The overall goal of this thesis was to broaden our knowledge of chronic intraoral pain. The research questions were: What methods can be used to differentiate inflammatory, odontogenic tooth pain from pain that presents as toothache but is non-odontogenic in origin? What is the prognosis of chronic tooth pain of non-odontogenic origin, and which factors affect the prognosis? Atypical odontalgia (AO) is a relatively rare but severe and chronic pain condition affecting the dentoalveolar region. Recent research indicates that the origin is peripheral nerve damage: neuropathic pain. The condition presents as tooth pain and is challenging to dentists because it is difficult to distinguish from ordinary toothache due to inflammation or infection. AO is of interest to the pain community because it shares many characteristics with other chronic pain conditions, and pain perpetuation mechanisms are likely to be similar. An AO diagnosis is made after a comprehensive examination and assessment of patients' self-reported characteristics: the pain history. Traditional dental diagnostic methods do not appear to suffice, since many patients report repeated care-seeking and numerous treatment efforts with little or no pain relief. Developing methods that are useful in the clinical setting is a prerequisite for a correct diagnosis and adequate treatment decisions. Quantitative sensory testing (QST) is used to assess sensory function on skin when nerve damage or disease is suspected. A variety of stimuli has been used to examine the perception of, for example, touch, temperature (painful and non-painful), vibration, pinprick pain, and pressure pain. To detect sensory abnormalities and nerve damage in the oral cavity, the same methods may be possible to use. Study I examined properties of thermal thresholds in and around the mouth in 30 pain-free subjects: the influence of measurement location and stimulation area size on threshold levels, and time variability of thresholds

  20. Parental Functioning in Families of Children with ADHD: Evidence for Behavioral Parent Training and Importance of Clinically Meaningful Change

    ERIC Educational Resources Information Center

    Gerdes, Alyson C.; Haack, Lauren M.; Schneider, Brian W.

    2012-01-01

    Objective/Method: Statistically significant and clinically meaningful effects of behavioral parent training on parental functioning were examined for 20 children with ADHD and their parents who had successfully completed a psychosocial treatment for ADHD. Results/Conclusion: Findings suggest that behavioral parent training resulted in…

  1. Qualitative Insights from a Canadian Multi-Institutional Research Study: In Search of Meaningful E-Learning

    ERIC Educational Resources Information Center

    Carter, Lorraine M.; Salyers, Vince; Myers, Sue; Hipfner, Carol; Hoffart, Caroline; MacLean, Christa; White, Kathy; Matus, Theresa; Forssman, Vivian; Barrett, Penelope

    2014-01-01

    This paper reports the qualitative findings of a mixed methods research study conducted at three Canadian post-secondary institutions. Called the Meaningful E-learning or MEL project, the study was an exploration of the teaching and learning experiences of faculty and students as well as their perceptions of the benefits and challenges of…

  2. Overall survival in non-small cell lung cancer—what is clinically meaningful?

    PubMed Central

    Fenchel, Klaus; Sellmann, Ludger

    2016-01-01

    The development of molecularly targeted therapies [tyrosine kinase inhibitors (TKIs) and monoclonal antibodies] has significantly improved outcomes for patients with advanced or metastatic non-small cell lung cancer (NSCLC), resulting in improved progression-free survival (PFS), overall survival (OS) and quality of life (QoL). In addition, targeting the immune axis (CTLA-4, PD-1/PD-L1) has also shown promising results. Major goals of almost all clinical trials based on histology and molecular markers for NSCLC patients are improvements in OS and QoL. However, in the majority of these trials only small incremental improvements in OS were seen. The Food and Drug Administration (FDA) and other health authorities have recommended that OS be considered the standard clinical benefit endpoint used to establish the efficacy of a treatment for NSCLC patients; however, the question remains what is clinically meaningful and how this outcome can be measured. According to suggestions of the American Society of Clinical Oncology (ASCO) Cancer Research Committee, a relative improvement in median OS of at least 20% (3–4 months) is regarded as defining a clinically meaningful improvement in outcome for NSCLC patients. However, this should not diminish PFS as a valid endpoint, since a PFS improvement can also result in meaningful palliation (e.g., of painful bone metastases). Other factors (e.g., QoL) may also be involved in measuring and defining the clinical importance of a given trial result. Using the “Quality-adjusted Time Without Symptoms of Toxicity” (Q-TWiST) analysis method, it has been demonstrated that a clinically important and meaningful difference for Q-TWiST is 10–15% of OS in a study. Trials that are designed with less ambitious goals, however, may still be of benefit to individual NSCLC patients if the trial endpoints are met. Since there is no single factor which will make a trial clinically meaningful, these recommendations, however, are not intended to
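
    For context on the Q-TWiST statistic mentioned above, a commonly used formulation (stated here as background, since the abstract does not spell it out) partitions mean survival time into time with treatment toxicity (TOX), time without symptoms or toxicity (TWiST), and time after relapse or progression (REL), and combines them with utility weights between 0 and 1:

    \[
      \text{Q-TWiST} \;=\; u_{\mathrm{TOX}} \cdot \mathrm{TOX} \;+\; \mathrm{TWiST} \;+\; u_{\mathrm{REL}} \cdot \mathrm{REL},
      \qquad 0 \le u_{\mathrm{TOX}},\, u_{\mathrm{REL}} \le 1 .
    \]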

  3. Basic theory and methods of dosimetry for use in risk assessment of genotoxic chemicals. Final report

    SciTech Connect

    Ehrenberg, L.; Granath, F.

    1992-12-31

    This project is designed to be theoretical, with limited experimental input. The work then would have to be directed towards an identification of problems, with an emphasis on the potential ability of molecular/biochemical methods to reach a solution, rather than aiming at solutions of the problems. In addition, the work is dependent on experimental work within parallel projects. Initially, projects running at this laboratory were strongly tied up with practical matters, such as the development of monitoring methods for specific exposures, with limited resources for basic research. As sketched in the scientific report below (section 4), the meaningfulness of molecular/biochemical methods and their potential contribution to the problem of risk estimation has to be seen against a broad overview of this problem and current efforts to solve it. This overview, given as a brief summary in section 3, shows the necessity of combining different fields of research, holding them together by strictly quantitative aspects.

  4. Critical Assessment of Correction Methods for Fisheye Lens Distortion

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Tian, C.; Huang, Y.

    2016-06-01

    A fisheye lens is widely used to create a wide panoramic or hemispherical image. It is an ultra wide-angle lens that produces strong visual distortion. Distortion modelling and estimation for the fisheye lens are the crucial steps for fisheye lens calibration and image rectification in computer vision and close-range photography. There are two kinds of distortion: radial and tangential. Radial distortion is large for fisheye imaging and critical for subsequent image processing. Although many researchers have developed calibration algorithms for radial distortion of fisheye lenses, quantitative evaluation of the correction performance has remained a challenge. This is the first paper that intuitively and objectively evaluates the performance of five different calibration algorithms. Up-to-date research on fisheye lens calibration is comprehensively reviewed to identify the research need. To differentiate their performance in terms of precision and ease of use, five methods are then tested using a diverse set of actual images of a checkerboard taken at Wuhan University, China, under varying lighting conditions, shadows, and shooting angles. The rational function model, which has generally been used for wide-angle lens correction, outperforms the other methods. However, the one-parameter division model is easy to use in practice without compromising precision too much, because it exploits the linear structure in the image and requires no preceding calibration; it represents a trade-off between correction precision and ease of use. By critically assessing the strengths and limitations of the existing algorithms, the paper provides valuable insight and guidelines for future practice and algorithm development that are important for fisheye lens calibration. It is promising for the optimal design of lens correction models that are suitable for the millions of portable imaging devices.
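
    As a hedged sketch of the one-parameter division model mentioned above, the snippet below maps a distorted image point at radius r_d from the distortion centre to an undistorted radius r_u = r_d / (1 + k*r_d^2); the distortion centre and the value of k are illustrative assumptions, not calibrated values from the paper.

    import numpy as np

    def undistort_points(points, centre, k):
        """Map distorted pixel coordinates of shape (N, 2) to undistorted coordinates."""
        pts = np.asarray(points, dtype=float) - centre
        r2 = np.sum(pts ** 2, axis=1, keepdims=True)   # squared distorted radius r_d^2
        return pts / (1.0 + k * r2) + centre           # r_u = r_d / (1 + k * r_d^2)

    # Example: two image corners, an assumed centre, and an assumed (barrel) coefficient k < 0.
    corners = [[100.0, 80.0], [620.0, 470.0]]
    print(undistort_points(corners, centre=np.array([320.0, 240.0]), k=-1e-6))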

  5. A comparison of data-driven groundwater vulnerability assessment methods.

    PubMed

    Sorichetta, Alessandro; Ballabio, Cristiano; Masetti, Marco; Robinson, Gilpin R; Sterlacchini, Simone

    2013-01-01

    Increasing availability of geo-environmental data has promoted the use of statistical methods to assess groundwater vulnerability. Nitrate is a widespread anthropogenic contaminant in groundwater and its occurrence can be used to identify aquifer settings vulnerable to contamination. In this study, multivariate Weights of Evidence (WofE) and Logistic Regression (LR) methods, where the response variable is binary, were used to evaluate the role and importance of a number of explanatory variables associated with nitrate sources and occurrence in groundwater in the Milan District (central part of the Po Plain, Italy). The results of these models have been used to map the spatial variation of groundwater vulnerability to nitrate in the region, and we compare the similarities and differences of their spatial patterns and associated explanatory variables. We modify the standard WofE method used in previous groundwater vulnerability studies to a form analogous to that used in LR; this provides a framework to compare the results of both models and reduces the effect of sampling bias on the results of the standard WofE model. In addition, a nonlinear Generalized Additive Model has been used to extend the LR analysis. Both approaches improved discrimination of the standard WofE and LR models, as measured by the c-statistic. Groundwater vulnerability probability outputs, based on rank-order classification of the respective model results, were similar in spatial patterns and identified similar strong explanatory variables associated with nitrate source (population density as a proxy for sewage systems and septic sources) and nitrate occurrence (groundwater depth).
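
    A hedged sketch of the logistic-regression side of such an analysis is given below: the response is binary (nitrate concentration above or below a threshold) and the explanatory variables are illustrative stand-ins (population density and depth to groundwater), not the actual predictors or data of the Milan District study.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 500
    pop_density = rng.lognormal(mean=5.0, sigma=1.0, size=n)   # proxy for sewage/septic sources
    gw_depth = rng.uniform(2.0, 60.0, size=n)                  # depth to groundwater (m)
    X = np.column_stack([pop_density, gw_depth])

    # Synthetic "true" vulnerability: more likely where density is high and the water table shallow.
    logit = 0.002 * pop_density - 0.08 * gw_depth
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))           # binary nitrate exceedance

    model = LogisticRegression(max_iter=1000).fit(X, y)
    vulnerability = model.predict_proba(X)[:, 1]               # probabilities used to rank and map cells
    print("coefficients:", model.coef_, "| highest mapped probability:", vulnerability.max())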

  6. The Myth of Objectivity in Mathematics Assessment.

    ERIC Educational Resources Information Center

    Romagnano, Lew

    2001-01-01

    Investigates meaningful assessment to give teachers information on students' understanding of mathematical ideas and how their understanding changes over time. Presents examples collected from a teacher-made quiz, the Advanced Placement calculus test, and the SAT-I Mathematics test. Illustrates both the inherent subjectivity of these methods and…

  7. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree ), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal ) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree ) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. PMID

  8. Development of Probabilistic Methods to Assess Meteotsunami Hazards

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Ten Brink, U. S.

    2014-12-01

    A probabilistic method to assess the hazard from meteotsunamis is developed from both probabilistic tsunami hazard analysis (PTHA) and probabilistic storm-surge forecasting. Meteotsunamis are unusual sea level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation, similar to that used in PTHA, incorporates different meteotsunami sources. A historical record of 116 pressure disturbances recorded between 2000 and 2013 by the U.S. Automated Surface Observing Stations (ASOS) along the U.S. East Coast is used to establish a continuous analytic distribution of each source parameter as well as the overall Poisson rate of occurrence. Initially, atmospheric parameters are considered independently such that the joint probability distribution is given by the product of each marginal distribution. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of pressure disturbances is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a finite-difference hydrodynamic model that solves for the linearized long-wave equations. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using 20 synthetic catalogs of 116 events each, resampled from the parent parameter distributions, yield mean and quantile hazard curves. An example is presented for four Mid-Atlantic sites using ASOS data in which only atmospheric pressure disturbances from squall lines and derechos are considered. Results indicate that site-to-site variations among meteotsunami hazard curves are related to the geometry and width of the adjacent continental shelf. The new hazard analysis of meteotsunamis is important for
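
    A hedged sketch of the Monte Carlo aggregation step is shown below: a synthetic catalogue of atmospheric disturbances is sampled, each event is converted to a maximum coastal wave amplitude by a placeholder response function (standing in for the finite-difference hydrodynamic model), and an annualized rate-of-exceedance curve is tabulated. The parameter distributions, the Poisson rate, and the amplitude function are invented for illustration only.

    import numpy as np

    rng = np.random.default_rng(42)
    events_per_year = 116 / 14.0          # Poisson rate implied by 116 events over 2000-2013
    n_years = 10_000
    amplitudes = []

    def max_amplitude(pressure_hpa, speed_ms):
        """Placeholder for the hydrodynamic model: amplification near the shelf long-wave speed."""
        resonance = 1.0 / (abs(1.0 - (speed_ms / 30.0) ** 2) + 1e-6)
        return 0.01 * pressure_hpa * min(resonance, 20.0)

    for _ in range(rng.poisson(events_per_year * n_years)):
        dp = rng.lognormal(mean=0.3, sigma=0.5)   # pressure-jump amplitude (hPa)
        speed = rng.normal(25.0, 8.0)             # disturbance translation speed (m/s)
        amplitudes.append(max_amplitude(dp, speed))

    amplitudes = np.array(amplitudes)
    for a in (0.1, 0.3, 0.5, 1.0):
        print(f"amplitude >= {a:.1f} m exceeded {np.sum(amplitudes >= a) / n_years:.3f} times/year")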

  10. Use of the Attribute Hierarchy Method for Development of Student Cognitive Models and Diagnostic Assessments in Geoscience Education

    NASA Astrophysics Data System (ADS)

    Corrigan, S.; Brodsky, L. M.; Loper, S.; Brown, N.; Curley, J.; Baker, J.; Goss, M.; Castek, J.; Barber, J.

    2010-12-01

    There is a recognized need to better understand student learning in the geosciences (Stofflet, 1994; Zalles, Quallmalz, Gobert and Pallant, 2007). Educators, cognitive psychologists and practicing scientists have also called for instructional approaches that support deep conceptual development (Manduca, Mogk and Stillings, 2004, Libarkin and Kurdziel, 2006). In both cases there is an important role for educational measures that can generate descriptions of how student understanding develops over time and inform instruction. The presenters will suggest one way of responding to these needs by describing the Attribute Hierarchy Method (AHM) of assessment (Leighton, Gierl and Hunka, 2004; Gierl, Cui, Wang and Zhou, 2008) as enacted in a large-scale earth science curriculum development project funded by the Bill and Melinda Gates Foundation. The AHM is one approach to criterion referenced, diagnostic assessment that ties measure design to cognitive models of student learning in order to support justified inferences about students’ understanding and the knowledge required for continued development. The Attribute Hierarchy Method bears potential for researchers and practitioners interested in learning progressions and solves many problems associated with making meaningful, justified inferences about students’ understanding based on their assessment performances. The process followed to design and develop the project’s cognitive models as well as a description of how they are used in subsequent assessment task design will be emphasized in order to demonstrate how the AHM may be applied in the context of geoscience education. Results from over twenty student cognitive interviews, and two hypothesized cognitive models -- one describing a student pathway for understanding rock formation and a second describing a student pathway for increasingly sophisticated use of maps and models in the geosciences - are also described. Sample assessment items will be provided as

  11. Evaluation of Current Assessment Methods in Engineering Entrepreneurship Education

    ERIC Educational Resources Information Center

    Purzer, Senay; Fila, Nicholas; Nataraja, Kavin

    2016-01-01

    Quality assessment is an essential component of education that allows educators to support student learning and improve educational programs. The purpose of this study is to evaluate the current state of assessment in engineering entrepreneurship education. We identified 52 assessment instruments covered in 29 journal articles and conference…

  12. Age differences in the neuroelectric adaptation to meaningful sounds.

    PubMed

    Leung, Ada W S; He, Yu; Grady, Cheryl L; Alain, Claude

    2013-01-01

    Much of what we know regarding the effect of stimulus repetition on neuroelectric adaptation comes from studies using artificially produced pure tones or harmonic complex sounds. Little is known about the neural processes associated with the representation of everyday sounds and how these may be affected by aging. In this study, we used real life, meaningful sounds presented at various azimuth positions and found that auditory evoked responses peaking at about 100 and 180 ms after sound onset decreased in amplitude with stimulus repetition. This neural adaptation was greater in young than in older adults and was more pronounced when the same sound was repeated at the same location. Moreover, the P2 waves showed differential patterns of domain-specific adaptation when location and identity was repeated among young adults. Background noise decreased ERP amplitudes and modulated the magnitude of repetition effects on both the N1 and P2 amplitude, and the effects were comparable in young and older adults. These findings reveal an age-related difference in the neural processes associated with adaptation to meaningful sounds, which may relate to older adults' difficulty in ignoring task-irrelevant stimuli.

  13. An evaluation of meaningful learning in a high school chemistry course

    NASA Astrophysics Data System (ADS)

    Bross, April J.

    This study utilized an action research methodology to examine students' understandings of science knowledge, and meaningful learning using the SLD (Science Lecture Demonstration) and laboratory instructional method in a high school chemistry classroom. This method was a modification of the Science Lecture Demonstration Method as developed by Majerich and Schmuckler (2004, in press), the modification due to the addition of a laboratory component. The participants in this study represented a convenience sample which included one class of twenty-two, middle to high socio-economic status students (Mean family income over $75,000/year in 2005 U.S. dollars) in an honors chemistry course at a public high school in the state of New Jersey. These participants included nine girls and thirteen boys. The results of this study indicated what the students' understandings of science knowledge were, how the understandings differed among students, and to what extent those understandings were indicative of meaningful learning. These results were obtained by careful analysis of student generated concept maps, narratives from demonstration quizzes, laboratory reports, and test questions, as well as a teacher/researcher reflection upon the classroom experience. A simple taxonomy for analyzing students' understandings of science knowledge was developed, based upon the work of Majerich (2004). Findings indicated that the students' understanding of science knowledge, as well as the extent of meaningful learning that occurs in the chemistry classroom may be influenced by the roles of: explicit directions, pre-existing knowledge from elementary and middle school science classes, using examples vs. non-examples, macroscopic vs. microscopic views of nature, time for reflection, and everyday vs. scientific language. Results obtained from high school student responses confirmed Novak's observation of elementary students' lack of differentiation between the terms vapor and gas (Novak, 1998).

  15. Making Each Other’s Daily Life: Nurse Assistants’ Experiences and Knowledge on Developing a Meaningful Daily Life in Nursing Homes

    PubMed Central

    James, Inger; Fredriksson, Carin; Wahlström, Catrin; Kihlgren, Annica; Blomberg, Karin

    2014-01-01

    Background: In a larger action research project, guidelines were generated for how a meaningful daily life could be developed for older persons. In this study, we focused on the nurse assistants’ (NAs) perspectives, as their knowledge is essential for a well-functioning team and quality of care. The aim was to learn from NAs’ experiences and knowledge about how to develop a meaningful daily life for older persons in nursing homes and the meaning NAs ascribe to their work. Methods: The project is based on Participatory and Appreciative Action and Reflection. Data were generated through interviews, participating observations and informal conversations with 27 NAs working in nursing homes in Sweden, and a thematic analysis was used. Result: NAs developed a meaningful daily life by sensing and finding the “right” way of being (Theme 1). They sense and read the older person in order to judge how the person was feeling (Theme 2). They adapt to the older person (Theme 3) and share their daily life (Theme 4). NAs use emotional involvement to develop a meaningful daily life for the older person and meaning in their own work (Theme 5), ultimately making each other’s daily lives meaningful. Conclusion: It was obvious that NAs based the development of a meaningful daily life on different forms of knowledge: theoretical and practical knowledge, and practical wisdom, all of which are intertwined. These results could be used within the team to constitute a meaningful daily life for older persons in nursing homes. PMID:25246997

  16. Bayesian methods for assessing system reliability: models and computation.

    SciTech Connect

    Graves, T. L.; Hamada, Michael

    2004-01-01

    There are many challenges with assessing the reliability of a system today. These challenges arise because a system may be aging and full system tests may be too expensive or can no longer be performed. Without full system testing, one must integrate (1) all science and engineering knowledge, models and simulations, (2) information and data at various levels of the system, e.g., subsystems and components and (3) information and data from similar systems, subsystems and components. The analyst must work with various data types and how the data are collected, account for measurement bias and uncertainty, deal with model and simulation uncertainty and incorporate expert knowledge. Bayesian hierarchical modeling provides a rigorous way to combine information from multiple sources and different types of information. However, an obstacle to applying Bayesian methods is the need to develop new software to analyze novel statistical models. We discuss a new statistical modeling environment, YADAS, that facilitates the development of Bayesian statistical analyses. It includes classes that help analysts specify new models, as well as classes that support the creation of new analysis algorithms. We illustrate these concepts using several examples.
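
    The YADAS environment itself is not sketched here; instead, the snippet below illustrates, with plain NumPy and invented numbers, the basic Bayesian idea the abstract describes: combining prior engineering knowledge with component test data (a conjugate Beta-binomial update) and propagating the component posteriors to a simple series-system reliability by Monte Carlo.

    import numpy as np

    rng = np.random.default_rng(7)

    # One (prior_alpha, prior_beta, successes, trials) tuple per component; numbers are illustrative only.
    components = [(9.0, 1.0, 48, 50), (4.0, 1.0, 19, 20), (2.0, 2.0, 9, 10)]

    posterior_draws = []
    for a, b, s, n in components:
        # Beta(a, b) prior + binomial data -> Beta(a + s, b + n - s) posterior (conjugacy).
        posterior_draws.append(rng.beta(a + s, b + n - s, size=20_000))

    system = np.prod(posterior_draws, axis=0)      # series system: every component must work
    print("posterior mean system reliability:", system.mean())
    print("90% credible interval:", np.percentile(system, [5, 95]))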

  17. Full scale assessment of pansharpening methods and data products

    NASA Astrophysics Data System (ADS)

    Aiazzi, B.; Alparone, L.; Baronti, S.; Carlà, R.; Garzelli, A.; Santurri, L.

    2014-10-01

    Quality assessment of pansharpened images is traditionally carried out either at degraded spatial scale, by checking the synthesis property of Wald's protocol, or at the full spatial scale, by separately checking the spectral and spatial consistencies. The spatial distortion of the QNR protocol and the spectral distortion of Khan's protocol may be combined into a unique quality index, referred to as hybrid QNR (HQNR), which is calculated at full scale. Alternatively, multiscale measurements of indices requiring a reference, like SAM, ERGAS and Q4, may be extrapolated to yield a quality measurement at the full scale of the fusion product, where a reference does not exist. Experiments on simulated Pléiades data, for which reference originals at full scale are available, highlight that quadratic polynomials with three-point support, i.e. fitting three measurements at as many progressively doubled scales, are adequate. Q4 is more suitable for extrapolation than ERGAS and SAM. The Q4 value predicted from multiscale measurements and the Q4 value measured at full scale, thanks to the reference original, differ by only a few percent for the six different state-of-the-art methods that were compared. HQNR is substantially comparable to the extrapolated Q4.
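
    A hedged sketch of the multiscale extrapolation idea follows: a quality index measured at three progressively coarser scales (Q4 here) is fitted with a quadratic polynomial and extrapolated to the full scale of the fusion product, where no reference exists. The scale convention and the example index values are assumptions made for illustration.

    import numpy as np

    # Scale ratio with respect to full resolution: 8, 4 and 2 are measured, 1 is to be predicted.
    measured_scales = np.array([8.0, 4.0, 2.0])
    measured_q4 = np.array([0.934, 0.911, 0.883])             # illustrative Q4 values

    coeffs = np.polyfit(measured_scales, measured_q4, deg=2)  # three points yield an exact quadratic
    q4_full_scale = np.polyval(coeffs, 1.0)
    print("extrapolated Q4 at full scale:", round(float(q4_full_scale), 3))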

  18. Proactive interference does not meaningfully distort visual working memory capacity estimates in the canonical change detection task.

    PubMed

    Lin, Po-Han; Luck, Steven J

    2012-01-01

    The change detection task has become a standard method for estimating the storage capacity of visual working memory. Most researchers assume that this task isolates the properties of an active short-term storage system that can be dissociated from long-term memory systems. However, long-term memory storage may influence performance on this task. In particular, memory traces from previous trials may create proactive interference that sometimes leads to errors, thereby reducing estimated capacity. Consequently, the capacity of visual working memory may be higher than is usually thought, and correlations between capacity and other measures of cognition may reflect individual differences in proactive interference rather than individual differences in the capacity of the short-term storage system. Indeed, previous research has shown that change detection performance can be influenced by proactive interference under some conditions. The purpose of the present study was to determine whether the canonical version of the change detection task - in which the to-be-remembered information consists of simple, briefly presented features - is influenced by proactive interference. Two experiments were conducted using methods that ordinarily produce substantial evidence of proactive interference, but no proactive interference was observed. Thus, the canonical version of the change detection task can be used to assess visual working memory capacity with no meaningful influence of proactive interference.
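
    The abstract does not spell out how capacity is estimated, but results from the canonical single-probe change detection task are commonly converted to a capacity estimate with Cowan's K; a minimal sketch under that assumption:

    def cowans_k(set_size, hits, change_trials, false_alarms, no_change_trials):
        """Cowan's K: set size times (hit rate minus false-alarm rate)."""
        hit_rate = hits / change_trials
        false_alarm_rate = false_alarms / no_change_trials
        return set_size * (hit_rate - false_alarm_rate)

    # Example: set size 6, 75% hits, 15% false alarms -> K = 6 * (0.75 - 0.15) = 3.6 items.
    print(cowans_k(6, 75, 100, 15, 100))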

  19. 76 FR 4345 - A Method To Assess Climate-Relevant Decisions: Application in the Chesapeake Bay

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... AGENCY A Method To Assess Climate-Relevant Decisions: Application in the Chesapeake Bay AGENCY... review draft document titled, ``A Method to Assess Climate-Relevant Decisions: Application in the.../conferences/peerreview/register-chesapeake.htm . The draft ``A Method to Assess Climate-Relevant...

  20. Processing Instruction and Meaningful Output-Based Instruction: Effects on Second Language Development

    ERIC Educational Resources Information Center

    Morgan-Short, Kara; Bowden, Harriet Wood

    2006-01-01

    This study investigates the effects of meaningful input- and output-based practice on SLA. First-semester Spanish students (n = 45) were assigned to processing instruction, meaningful output-based instruction, or control groups. Experimental groups received the same input in instruction but received meaningful practice that was input or output…

  1. HDMR methods to assess reliability in slope stability analyses

    NASA Astrophysics Data System (ADS)

    Kozubal, Janusz; Pula, Wojciech; Vessia, Giovanna

    2014-05-01

    -soil masses) resulting in sliding mechanisms have been investigated in this study. The reliability index values drawn from the HDMR method have been compared with those from conventional approaches such as neural networks: the efficiency of HDMR is shown in the case studied. References: Chowdhury R., Rao B.N. and Prasad A.M. 2009. High-dimensional model representation for structural reliability analysis. Commun. Numer. Meth. Engng, 25: 301-337. Chowdhury R. and Rao B. 2010. Probabilistic Stability Assessment of Slopes Using High Dimensional Model Representation. Computers and Geotechnics, 37: 876-884.
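
    For readers unfamiliar with HDMR, the snippet below is a hedged sketch of a first-order cut-HDMR surrogate of the kind developed in the references cited above: f(x) is approximated by f0 plus a sum of univariate component functions, each built from model evaluations along one variable while the others are held at a reference (cut) point. The limit-state function g() is an invented placeholder, not the slope-stability model of the study.

    import numpy as np

    def g(x):
        """Placeholder performance function (e.g. factor of safety minus 1)."""
        return 1.2 - 0.015 * x[0] + 0.04 * x[1]    # x[0]: slope-angle proxy, x[1]: strength proxy

    cut_point = np.array([30.0, 10.0])
    f0 = g(cut_point)

    def first_order_hdmr(x):
        """f0 plus the sum of univariate components f_i(x_i) = g(cut point with x_i substituted) - f0."""
        total = f0
        for i, xi in enumerate(x):
            point = cut_point.copy()
            point[i] = xi
            total += g(point) - f0
        return total

    # For this linear placeholder the first-order expansion reproduces g exactly.
    print(first_order_hdmr(np.array([35.0, 12.0])), g(np.array([35.0, 12.0])))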

  2. Current Development in Elderly Comprehensive Assessment and Research Methods

    PubMed Central

    Jiang, Shantong; Li, Pingping

    2016-01-01

    Comprehensive geriatric assessment (CGA) is a core and an essential part of the comprehensive care of the aging population. CGA uses specific tools to summarize elderly status in several domains that may influence the general health and outcomes of diseases of elderly patients, including assessment of medical, physical, psychological, mental, nutritional, cognitive, social, economic, and environmental status. Here, in this paper, we review different assessment tools used in elderly patients with chronic diseases. The development of comprehensive assessment tools and single assessment tools specially used in a dimension of CGA was discussed. CGA provides substantial insight into the comprehensive management of elderly patients. Developing concise and effective assessment instruments is helpful to carry out CGA widely to create a higher clinical value. PMID:27042661

  3. The higher vocational colleges’ exploration of “Tour Guide Service Skills” curriculum assessment methods

    NASA Astrophysics Data System (ADS)

    Tang, Yin; Gao, Jin Yue

    There are many problems with the assessment methods used in higher vocational education curricula. In exploring assessment methods for Tour Guide Service Skills, our school proposes that assessment should be oriented towards competency and guided by the principle of improving students' multi-faceted capacities. In addition, assessment methods should be diversified and developed gradually, combining traditional standardized tests with hierarchical tests, assessing and grading at different stages, and breaking the pattern in which a single test determines the result.

  4. Historical overview of diet assessment and food consumption surveys in Spain: assessment methods and applications.

    PubMed

    Morán Fagúndez, Luis Juan; Rivera Torres, Alejandra; González Sánchez, María Eugenia; de Torres Aured, Mari Lourdes; López-Pardo Martínez, Mercedes; Irles Rocamora, José Antonio

    2015-02-26

    Food consumption assessment methods are used in population nutrition and health surveys and are the basis for the development of guidelines, nutritional recommendations and health plans. The study of these issues is one of the major tasks of research and health policy in developed countries. Major advances in this area have been made nationally since 1940, both in the reliability of the data and in the standardization of studies, which is a necessary condition for comparing changes over time. In this article the history and application of different dietary surveys, dietary histories and food frequency records are analyzed. Besides information from surveys conducted at a national level, the main data currently available for public health planning in nutrition come from nutritional analysis of household budget surveys and food balance sheets, based on data provided by the Ministry of Agriculture.

  5. Meaningful rehabilitation of the end-stage renal disease patient.

    PubMed

    Thornton, T A; Hakim, R M

    1997-05-01

    In this highly technological age, health care providers are called to attend to the patient as a whole person, with dreams and goals and a desire for purpose and meaning in life. In this article, we propose a broadened definition of rehabilitation and a rehabilitation program designed to effect an improvement in the quality of life of each renal patient by aiming to restore meaningful existence in each of their lives. An individualized plan for rehabilitation can be constructed and implemented with far-reaching success when the focus is on the life goals of the patient, whether physical, social, psychological, or intellectual. These programs not only enhance the quality of life of the patient with end-stage renal disease, but are cost-effective, both at the societal level and at the level of the dialysis clinic. PMID:9165654

  6. School nurse evaluations: making the process meaningful and motivational.

    PubMed

    McDaniel, Kathryn H; Overman, Muriel; Guttu, Martha; Engelke, Martha Keehner

    2013-02-01

    The professional standards of school nursing practice provide a framework to help school nurses focus on their unique mission of promoting health and academic achievement for all students. Without the standards, the nurse's role can become task oriented and limited in scope. By using an evaluation tool that reflects the standards, nurses not only become aware and begin to understand the standards; they also become directly accountable for meeting them. In addition, developing an evaluation process based on the standards of school nurse practice increases the visibility of school nurses and helps school administrators understand the role of the school nurse. This article describes how one school district integrated the scope and standards of school nursing into the job description and performance evaluation of the nurse. The process which is used to complete the evaluation in a manner that is meaningful and motivational to the school nurse is described. PMID:23263263

  7. Meaningful use: participating in the federal incentive program.

    PubMed

    Krishnaraj, Arun; Siddiqui, Adeel; Goldszal, Alberto

    2014-12-01

    Meaningful use legislation was first introduced in the American Recovery and Reinvestment Act of 2009 as a multistaged program to incentivize adoption of electronic health record technology. Since that time, numerous eligible providers and eligible hospitals have captured incentive payments by installing certified electronic health record technology and capturing and reporting on key elements for patients whose health records are stored in an electronic format. Although the question of whether radiologists should participate in the program was initially debated, the evidence is now clear that lack of participation leaves a significant amount of money at risk. This article provides an overview of how the program is structured, what technology needs to be installed, the necessary data elements to capture in an electronic format, and how radiologists can effectively participate in the program to capture their maximum incentive payment. PMID:25467896

  8. Ocular bubble formation as a method of assessing decompression stress.

    PubMed

    Mekjavić, I B; Campbell, D G; Jaki, P; Dovsak, P A

    1998-01-01

    Tear film bubble formation and ultrasound reflectivity of the lens-vitreous humor compartments were monitored following simulated dives in a hyperbaric chamber. The sensitivity of these methods in determining decompression stress was compared with the results of precordial Doppler ultrasound. In addition, the utility of these diagnostic techniques in testing decompression dive profiles was evaluated. Eleven divers completed two series of chamber dives according to the decompression schedule of the Professional Association of Diving Instructors. The first dive series comprised dives to 70 feet of seawater (fsw) for 15, 29, and 40 min. The second series comprised maximum duration no-stop decompression dives to 40 fsw for 140 min, 70 fsw for 40 min, 90 fsw for 25 min, and 120 fsw for 13 min. Before and immediately after each dive, the following measurements were obtained from each subject: eye surface tear film bubble counts with a slit-lamp microscope, lens and vitreous humor reflectivity using A- and B-mode ophthalmic ultrasonic scan, and precordial Doppler ultrasonic detection of venous gas bubbles. Tear film bubble assessment and ocular scanning ultrasound were observed to be more sensitive in detecting decompression stress than the conventional Doppler ultrasonic surveillance of the precordial region. In contrast to precordial Doppler ultrasonic surveillance, which failed to detect any significant changes in circulating bubbles, tear film bubble formation displayed a dose-response relationship with increasing duration of the 70-fsw dives. Reflectivity changes of the lens-vitreous humor interface were not significant until the no-stop decompression limit was reached. In addition, for each of the no-stop decompression limit dives, increases in the average tear film bubble formation and lens-vitreous humor interface reflectivity were similar. Ocular bubble observations may provide a practical and objective ocular bubble index for analyzing existing decompression

  9. New Methods for Assessing the Fascinating Nature of Nature Experiences

    PubMed Central

    Joye, Yannick; Pals, Roos; Steg, Linda; Evans, Ben Lewis

    2013-01-01

    In recent years, numerous environmental psychology studies have demonstrated that contact with nature as opposed to urban settings can improve an individual’s mood, can lead to increased levels of vitality, and can offer an opportunity to recover from stress. According to Attention Restoration Theory (ART) the restorative potential of natural environments is situated in the fact that nature can replenish depleted attentional resources. This replenishment takes place, in part, because nature is deemed to be a source of fascination, with fascination being described as having an “attentional”, an “affective” and an “effort” dimension. However, the claim that fascination with nature involves these three dimensions is to a large extent based on intuition or derived from introspection-based measurement methods, such as self-reports. In three studies, we aimed to more objectively assess whether these three dimensions indeed applied to experiences related to natural environments, before any (attentional) depletion has taken place. The instruments that were used were: (a) the affect misattribution procedure (Study 1), (b) the dot probe paradigm (Study 2) and (c) a cognitively effortful task (Study 3). These instruments were respectively aimed at verifying the affective, attentional and effort dimensions of fascination. Overall, the results provide objective evidence for the claims made within the ART framework, that natural as opposed to urban settings are affectively positive (cfr., affective dimension) and that people have an attentional bias to natural (rather than urban) environments (cfr., attentional dimension). The results regarding the effort dimension are less straightforward, and suggest that this dimension only becomes important in sufficiently difficult cognitive tasks. PMID:23922645

  10. Assessing groundwater quality for irrigation using indicator kriging method

    NASA Astrophysics Data System (ADS)

    Delbari, Masoomeh; Amiri, Meysam; Motlagh, Masoud Bahraini

    2014-09-01

    One of the key parameters influencing sprinkler irrigation performance is water quality. In this study, the spatial variability of groundwater quality parameters (EC, SAR, Na+, Cl-, HCO3 - and pH) was investigated by geostatistical methods, and the most suitable areas for implementation of sprinkler irrigation systems in terms of water quality were determined. The study was performed in Fasa county of Fars province using 91 water samples. Results indicated that all parameters are moderately to strongly spatially correlated over the study area. The spatial distribution of pH and HCO3 - was mapped using ordinary kriging. The probability of concentrations of EC, SAR, Na+ and Cl- exceeding a threshold limit in groundwater was obtained using indicator kriging (IK). The experimental indicator semivariograms were often fitted well by a spherical model for SAR, EC, Na+ and Cl-. For HCO3 - and pH, an exponential model was fitted to the experimental semivariograms. Probability maps showed that the risk of EC, SAR, Na+ and Cl- exceeding the given critical threshold is higher in the lower half of the study area. The most suitable agricultural lands for sprinkler irrigation were identified by evaluating all probability maps. The area suitable for sprinkler irrigation design was determined to be 25,240 hectares, about 34 percent of the total agricultural land, located in the northern and eastern parts. Overall, the results of this study showed that IK is an appropriate approach for risk assessment of groundwater pollution and is useful for proper groundwater resources management.
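
    As a rough illustration of the indicator approach described above, the sketch below (Python, synthetic data) shows the two core preprocessing steps of indicator kriging: transforming measurements into indicators against a threshold and estimating an experimental indicator semivariogram. The coordinates, EC values and the 2.0 dS/m threshold are hypothetical stand-ins, not values from the study.

        import numpy as np

        def indicator_transform(values, threshold):
            # 1 where the measured value exceeds the threshold, else 0
            return (values > threshold).astype(float)

        def empirical_semivariogram(coords, indicators, lags, tol):
            # classical estimator: gamma(h) = sum((I_i - I_j)^2) / (2 * N(h)) over pairs near lag h
            dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            iu = np.triu_indices(len(indicators), k=1)
            pair_d = dists[iu]
            pair_sq = (indicators[:, None] - indicators[None, :])[iu] ** 2
            gamma = []
            for h in lags:
                sel = np.abs(pair_d - h) <= tol
                gamma.append(pair_sq[sel].sum() / (2.0 * sel.sum()) if sel.any() else np.nan)
            return np.array(gamma)

        # synthetic example: 91 wells, EC in dS/m, hypothetical irrigation threshold of 2.0
        rng = np.random.default_rng(0)
        coords = rng.uniform(0.0, 50_000.0, size=(91, 2))   # metres
        ec = rng.lognormal(mean=0.5, sigma=0.6, size=91)
        indicators = indicator_transform(ec, threshold=2.0)
        lags = np.arange(2_000, 20_000, 2_000)
        print(empirical_semivariogram(coords, indicators, lags, tol=1_000))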

  11. New methods for assessing the fascinating nature of nature experiences.

    PubMed

    Joye, Yannick; Pals, Roos; Steg, Linda; Evans, Ben Lewis

    2013-01-01

    In recent years, numerous environmental psychology studies have demonstrated that contact with nature as opposed to urban settings can improve an individual's mood, can lead to increased levels of vitality, and can offer an opportunity to recover from stress. According to Attention Restoration Theory (ART) the restorative potential of natural environments is situated in the fact that nature can replenish depleted attentional resources. This replenishment takes place, in part, because nature is deemed to be a source of fascination, with fascination being described as having an "attentional", an "affective" and an "effort" dimension. However, the claim that fascination with nature involves these three dimensions is to a large extent based on intuition or derived from introspection-based measurement methods, such as self-reports. In three studies, we aimed to more objectively assess whether these three dimensions indeed applied to experiences related to natural environments, before any (attentional) depletion has taken place. The instruments that were used were: (a) the affect misattribution procedure (Study 1), (b) the dot probe paradigm (Study 2) and (c) a cognitively effortful task (Study 3). These instruments were respectively aimed at verifying the affective, attentional and effort dimensions of fascination. Overall, the results provide objective evidence for the claims made within the ART framework, that natural as opposed to urban settings are affectively positive (cfr., affective dimension) and that people have an attentional bias to natural (rather than urban) environments (cfr., attentional dimension). The results regarding the effort dimension are less straightforward, and suggest that this dimension only becomes important in sufficiently difficult cognitive tasks.

  12. Are there meaningful individual differences in temporal inconsistency in self-reported personality?

    PubMed Central

    Soubelet, Andrea; Salthouse, Timothy A.; Oishi, Shigehiro

    2014-01-01

    The current project had three goals. The first was to examine whether it is meaningful to refer to across-time variability in self-reported personality as an individual differences characteristic. The second was to investigate whether negative affect was associated with variability in self-reported personality, while controlling for mean levels, and correcting for measurement errors. The third goal was to examine whether variability in self-reported personality would be larger among young adults than among older adults, and whether the relation of variability with negative affect would be stronger at older ages than at younger ages. Two moderately large samples of participants completed the International Personality Item Pool questionnaire assessing the Big Five personality dimensions either twice or thrice, in addition to several measures of negative affect. Results were consistent with the hypothesis that within-person variability in self-reported personality is a meaningful individual difference characteristic. Some people exhibited greater across-time variability than others after removing measurement error, and people who showed temporal instability in one trait also exhibited temporal instability across the other four traits. However, temporal variability was not related to negative affect, and there was no evidence that either temporal variability or its association with negative affect varied with age. PMID:25132698

  13. Clinically meaningful improvement on the quality of erection questionnaire in men with erectile dysfunction.

    PubMed

    Hvidsten, K; Carlsson, M; Stecher, V J; Symonds, T; Levinson, I

    2010-01-01

    Defining the minimal clinically meaningful improvement (MCMI) is crucial to understanding the treatment effects on health-status measures. We estimated the MCMI on the quality of erection questionnaire (QEQ), a validated measure specific to assess erectile quality during sexual intercourse. Data were from two controlled trials of an investigational phosphodiesterase type 5 inhibitor. Improvement on the Erectile Function domain of the International Index of Erectile Function was used as the anchor. For men who improved by exactly 1 erectile dysfunction severity category (anchor group (n=95)), clinically meaningful improvement (CMI, estimated with mean QEQ total change score from baseline to end of treatment) and MCMI (estimated with the lower limit of the 95% confidence interval of the mean) were 22.4 (s.d., 2.2) and 18.0 points, respectively. For the difference between the anchor group and men with no change in severity category (n=116), CMI and MCMI were 17.7 (s.d., 2.9) and 12 points, respectively. Distribution-based analyses (baseline s.e. of measurement (s.e.m.)=7.99, end-of-treatment s.e.m.=8.22 and s.e. of difference=11.46) supported a proposed MCMI of 12 points. Convergence of anchor-based and distribution-based criteria supports at least a 12-point difference in QEQ scores between treatments as clinically important.
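
    The distribution-based figures quoted above follow directly from the standard error of measurement. The short Python sketch below shows the usual arithmetic, assuming the conventional formula SEM = SD * sqrt(1 - reliability) and combining baseline and end-of-treatment SEMs into a standard error of the difference; the 7.99 and 8.22 values are the ones reported in the abstract, while the SD and reliability passed to sem() are hypothetical.

        import math

        def sem(sd, reliability):
            # standard error of measurement: SEM = SD * sqrt(1 - reliability)
            return sd * math.sqrt(1.0 - reliability)

        def se_difference(sem_baseline, sem_followup):
            # standard error of a change score combining two measurement occasions
            return math.sqrt(sem_baseline ** 2 + sem_followup ** 2)

        # reproduces the s.e. of difference quoted in the record (about 11.46)
        print(round(se_difference(7.99, 8.22), 2))

        # hypothetical use of the SEM formula itself (SD and reliability are illustrative)
        print(round(sem(sd=18.0, reliability=0.80), 2))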

  14. Effects of Rater Characteristics and Scoring Methods on Speaking Assessment

    ERIC Educational Resources Information Center

    Matsugu, Sawako

    2013-01-01

    Understanding the sources of variance in speaking assessment is important in Japan where society's high demand for English speaking skills is growing. Three challenges threaten fair assessment of speaking. First, in Japanese university speaking courses, teachers are typically the only raters, but teachers' knowledge of their students may…

  15. School Violence Assessment: A Conceptual Framework, Instruments, and Methods

    ERIC Educational Resources Information Center

    Benbenishty, Rami; Astor, Ron Avi; Estrada, Joey Nunez

    2008-01-01

    This article outlines a philosophical and theoretical framework for conducting school violence assessments at the local level. The authors advocate that assessments employ a strong conceptual foundation based on social work values. These values include the active measurement of ecological factors inside and outside the school that reflect the…

  16. AN OVERVIEW OF DATA INTEGRATION METHODS FOR REGIONAL ASSESSMENT

    EPA Science Inventory

    One of the goals of the EPA's Regional Vulnerability Assessment (ReVA) project is to take diverse environmental data and develop objective criteria to evaluate environmental risk assessments at the regional scale. The data include (but are not limited to) variables for forests, ...

  17. Peer Assessment as a Method of Improving Student Engagement

    ERIC Educational Resources Information Center

    Weaver, Debbi; Esposto, Alexis

    2012-01-01

    To encourage increased student attendance and engagement in a third-year economics unit, the curriculum was redesigned to incorporate continuous assessment throughout the semester. A component of group project marks were allocated to peer assessment, in an attempt to address concerns about free-riding colleagues sharing a common mark. This study…

  18. Assessing Mantle Models with the Spectral-Element Method

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Komatitsch, D.; Ritsema, J.; Allen, R.

    2001-12-01

    We have developed and implemented a spectral-element method (SEM) to simulate seismic wave propagation throughout the entire globe. Our SEM incorporates the effects of fluid-solid boundaries, attenuation, anisotropy, the oceans, rotation, self-gravitation and 3-D mantle and crustal heterogeneity. The method is implemented on a massively parallel PC cluster computer using message-passing software (MPI). The effects of crustal thickness, anisotropy, and attenuation on surface waves are quite dramatic. Self-gravitation and, in particular, the presence of a water layer slow the Rayleigh wave down. For spherically symmetric Earth models the SEM is in excellent agreement with normal-mode synthetics at periods greater than 20 seconds. We use the SEM to assess the quality of mantle model S20RTS, developed by Ritsema and colleagues, and Iceland model ICEMAN, developed by Allen and colleagues. The effects of 3-D heterogeneity can be spectacular. For example, along oceanic paths from Fiji-Tonga to Western North America or Japan the Rayleigh wave arrives more than a minute earlier than in PREM, and the Love wave exhibits very little dispersion, unlike in PREM. These effects are largely due to the fact that the oceanic crust is much thinner than in PREM. For a set of well-recorded earthquakes we use the SEM to determine how well model S20RTS fits the travel-time data. Because the SEM synthetics are essentially exact at periods greater than 20 seconds, this facilitates a difficult test for a 3-D model. For Iceland we are investigating whether or not a narrow plume can explain the differential travel-time data used to constrain the model. The width of the plume is so small that standard ray theory may be inadequate for waves with periods greater than 20 seconds. Due to finite-frequency effects, a ray that 'misses' the plume can still be significantly affected by its presence. The question is whether a thin plume, which is preferred in geodynamic models, can explain the data as

  19. Dynamics of Boolean networks controlled by biologically meaningful functions.

    PubMed

    Raeymaekers, L

    2002-10-01

    The remarkably stable dynamics displayed by randomly constructed Boolean networks is one of the most striking examples of the spontaneous emergence of self-organization in model systems composed of many interacting elements (Kauffman, S., J. theor. Biol. 22, 437-467, 1969; The Origins of Order, Oxford University Press, Oxford, 1993). The dynamics of such networks is most stable for a connectivity of two inputs per element, and decreases dramatically with increasing number of connections. Whereas the simplicity of this model system allows the tracing of the dynamical trajectories, it leaves out many features of real biological connections. For instance, the dynamics has been studied in detail only for networks constructed by allowing all theoretically possible Boolean rules, whereas only a subset of them make sense in the material world. This paper analyses the effect on the dynamics of using only Boolean functions which are meaningful in a biological sense. This analysis is particularly relevant for nets with more than two inputs per element because biological networks generally appear to be more extensively interconnected. Sets of the meaningful functions were assembled for up to four inputs per element. The use of these rules results in a smaller number of distinct attractors which have a shorter length, with relatively little sensitivity to the size of the network and to the number of inputs per element. Forcing away the activator/inhibitor ratio from the expected value of 50% further enhances the stability. This effect is more pronounced for networks consisting of a majority of activators than for networks with a corresponding majority of inhibitors, indicating that the former allow the evolution of larger genetic networks. The data further support the idea of the usefulness of logical networks as a conceptual framework for the understanding of real-world phenomena.
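
    To make the setup concrete, the Python sketch below simulates a synchronous random Boolean network and measures the length of the attractor reached from a random initial state. The restriction to "meaningful" rules is approximated here by letting a node be ON only when at least one activator input is ON and no inhibitor input is ON; this is one plausible reading for illustration, not the paper's exact rule catalogue, and the network size and connectivity are arbitrary.

        import random

        def random_meaningful_rule(k, p_activator=0.5):
            # one plausible "biologically meaningful" rule (an assumption, not the paper's catalogue):
            # the node is ON iff at least one activator input is ON and no inhibitor input is ON
            signs = [1 if random.random() < p_activator else -1 for _ in range(k)]
            def rule(inputs):
                activated = any(s == 1 and x == 1 for s, x in zip(signs, inputs))
                inhibited = any(s == -1 and x == 1 for s, x in zip(signs, inputs))
                return 1 if activated and not inhibited else 0
            return rule

        def build_network(n, k):
            # each node reads k randomly chosen nodes and applies its own rule
            wiring = [random.sample(range(n), k) for _ in range(n)]
            rules = [random_meaningful_rule(k) for _ in range(n)]
            return wiring, rules

        def step(state, wiring, rules):
            return tuple(rules[i]([state[j] for j in wiring[i]]) for i in range(len(state)))

        def attractor_length(n=20, k=3, max_steps=10_000, seed=0):
            # iterate synchronously until a state repeats; the gap between visits is the cycle length
            random.seed(seed)
            wiring, rules = build_network(n, k)
            state = tuple(random.randint(0, 1) for _ in range(n))
            seen = {}
            for t in range(max_steps):
                if state in seen:
                    return t - seen[state]
                seen[state] = t
                state = step(state, wiring, rules)
            return None

        print(attractor_length())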

  20. Meaningful Engagement to Enhance Diversity: Broadened Impact Actualized

    NASA Astrophysics Data System (ADS)

    Whitney, V. W.; Pyrtle, A. J.

    2008-12-01

    The MS PHD'S Professional Development Program was established by and for UR/US populations to facilitate increased and sustained participation within the Earth system science community. MS PHD'S is jointly funded by NSF and NASA. Fourteen (14) minority Earth system scientists served as Program mentors and one-hundred fifteen (115) minority and non-minority scientists served as Meeting Mentors to student participants. Representatives from fifty-six (56) agencies and institutions provided support and exposure to MS PHD'S student participants. Two hundred fifty-eight (258) highly qualified UR/US students completed on-line applications to participate in the MS PHD'S Professional Development Program. Because of funding limitations, slightly fewer than 50% of the applicants were selected to participate. One-hundred twenty-six (126) undergraduate and graduate students from 26 states and Puerto Rico participated in the MS PHD'S program. Sixty-eight (68) MS PHD'S student participants self-identified as African American; thirty-four (34) as Puerto Rican; nine (9) as Hispanic/Mexican American, ten (10) as Native American and one (1) each as African, Asian, Pacific Islander, Hispanic and Multi-Ethnic. During the five year span of MS PHD'S programming, sixteen (16) student participants completed BS degrees, twelve (12) completed MS degrees and ten (10) completed doctoral degrees. How did MS PHD'S establish meaningful engagement to enhance diversity within the Earth system science community? This case study reveals replicable processes and constructs to enhance the quality of meaningful collaboration and engagement. In addition, the study addresses frequently asked questions (FAQ's) on outreach, recruitment, engagement, retention and success of students from underrepresented populations within diversity-focused programs.

  1. Friends in the Classroom: A Comparison between Two Methods for the Assessment of Students' Friendship Networks

    ERIC Educational Resources Information Center

    Pijl, Sip Jan; Koster, Marloes; Hannink, Anne; Stratingh, Anna

    2011-01-01

    One of the methods used most often to assess students' friendships and friendship networks is the reciprocal nomination method. However, an often heard complaint is that this technique produces rather negative outcomes. This study compares the reciprocal nomination method with another method to assess students' friendships and friendship networks:…

  2. Comparing Yes/No Angoff and Bookmark Standard Setting Methods in the Context of English Assessment

    ERIC Educational Resources Information Center

    Hsieh, Mingchuan

    2013-01-01

    The Yes/No Angoff and Bookmark method for setting standards on educational assessment are currently two of the most popular standard-setting methods. However, there is no research into the comparability of these two methods in the context of language assessment. This study compared results from the Yes/No Angoff and Bookmark methods as applied to…

  3. Review of Methods Related to Assessing Human Performance in Nuclear Power Plant Control Room Simulations

    SciTech Connect

    Katya L Le Blanc; Ronald L Boring; David I Gertman

    2001-11-01

    With the increased use of digital systems in Nuclear Power Plant (NPP) control rooms comes a need to thoroughly understand the human performance issues associated with digital systems. A common way to evaluate human performance is to test operators and crews in NPP control room simulators. However, it is often challenging to characterize human performance in meaningful ways when measuring performance in NPP control room simulations. A review of the literature in NPP simulator studies reveals a variety of ways to measure human performance in NPP control room simulations including direct observation, automated computer logging, recordings from physiological equipment, self-report techniques, protocol analysis and structured debriefs, and application of model-based evaluation. These methods and the particular measures used are summarized and evaluated.

  4. Application of Watershed Ecological Risk Assessment Methods to Watershed Management

    EPA Science Inventory

    Watersheds are frequently used to study and manage environmental resources because hydrologic boundaries define the flow of contaminants and other stressors. Ecological assessments of watersheds are complex because watersheds typically overlap multiple jurisdictional boundaries,...

  5. Assessment of Automated Measurement and Verification (M&V) Methods

    SciTech Connect

    Granderson, Jessica; Touzani, Samir; Custodio, Claudine; Sohn, Michael; Fernandes, Samuel; Jump, David

    2015-07-01

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.

  6. Relative Contributions of Three Descriptive Methods: Implications for Behavioral Assessment

    ERIC Educational Resources Information Center

    Pence, Sacha T.; Roscoe, Eileen M.; Bourret, Jason C.; Ahearn, William H.

    2009-01-01

    This study compared the outcomes of three descriptive analysis methods--the ABC method, the conditional probability method, and the conditional and background probability method--to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior…

  7. Alignathon: a competitive assessment of whole-genome alignment methods

    PubMed Central

    Earl, Dent; Nguyen, Ngan; Hickey, Glenn; Harris, Robert S.; Fitzgerald, Stephen; Beal, Kathryn; Seledtsov, Igor; Molodtsov, Vladimir; Raney, Brian J.; Clawson, Hiram; Kim, Jaebum; Kemena, Carsten; Chang, Jia-Ming; Erb, Ionas; Poliakov, Alexander; Hou, Minmei; Herrero, Javier; Kent, William James; Solovyev, Victor; Darling, Aaron E.; Ma, Jian; Notredame, Cedric; Brudno, Michael; Dubchak, Inna; Haussler, David; Paten, Benedict

    2014-01-01

    Multiple sequence alignments (MSAs) are a prerequisite for a wide variety of evolutionary analyses. Published assessments and benchmark data sets for protein and, to a lesser extent, global nucleotide MSAs are available, but less effort has been made to establish benchmarks in the more general problem of whole-genome alignment (WGA). Using the same model as the successful Assemblathon competitions, we organized a competitive evaluation in which teams submitted their alignments and then assessments were performed collectively after all the submissions were received. Three data sets were used: Two were simulated and based on primate and mammalian phylogenies, and one was comprised of 20 real fly genomes. In total, 35 submissions were assessed, submitted by 10 teams using 12 different alignment pipelines. We found agreement between independent simulation-based and statistical assessments, indicating that there are substantial accuracy differences between contemporary alignment tools. We saw considerable differences in the alignment quality of differently annotated regions and found that few tools aligned the duplications analyzed. We found that many tools worked well at shorter evolutionary distances, but fewer performed competitively at longer distances. We provide all data sets, submissions, and assessment programs for further study and provide, as a resource for future benchmarking, a convenient repository of code and data for reproducing the simulation assessments. PMID:25273068

  8. A Faculty Team Works to Create Content Linkages among Various Courses to Increase Meaningful Learning of Targeted Concepts of Microbiology

    PubMed Central

    Marbach-Ad, Gili; Briken, Volker; Frauwirth, Kenneth; Gao, Lian-Yong; Hutcheson, Steven W.; Joseph, Sam W.; Mosser, David; Parent, Beth; Shields, Patricia; Song, Wenxia; Stein, Daniel C.; Swanson, Karen; Thompson, Katerina V.; Yuan, Robert

    2007-01-01

    As research faculty with expertise in the area of host–pathogen interactions (HPI), we used a research group model to effect our professional development as scientific educators. We have established a working hypothesis: The implementation of a curriculum that forms bridges between our seven HPI courses allows our students to achieve deep and meaningful learning of HPI concepts. Working collaboratively, we identified common learning goals, and we chose two microorganisms to serve as anchors for student learning. We instituted variations of published active-learning methods to engage students in research-oriented learning. In parallel, we are developing an assessment tool. The value of this work is in the development of a teaching model that successfully allowed faculty who already work collaboratively in the research area of HPI to apply a “research group approach” to further scientific teaching initiatives at a research university. We achieved results that could not be accomplished by even the most dedicated instructor working in isolation. PMID:17548877

  9. Methods of failure and reliability assessment for mechanical heart pumps.

    PubMed

    Patel, Sonna M; Allaire, Paul E; Wood, Houston G; Throckmorton, Amy L; Tribble, Curt G; Olsen, Don B

    2005-01-01

    Artificial blood pumps are today's most promising bridge-to-recovery (BTR), bridge-to-transplant (BTT), and destination therapy solutions for patients suffering from intractable congestive heart failure (CHF). Due to an increased need for effective, reliable, and safe long-term artificial blood pumps, each new design must undergo failure and reliability testing, an important step prior to approval from the United States Food and Drug Administration (FDA), for clinical testing and commercial use. The FDA has established no specific standards or protocols for these testing procedures and there are only limited recommendations provided by the scientific community when testing an overall blood pump system and individual system components. Product development of any medical device must follow a systematic and logical approach. As the most critical aspects of the design phase, failure and reliability assessments aid in the successful evaluation and preparation of medical devices prior to clinical application. The extent of testing, associated costs, and lengthy time durations to execute these experiments justify the need for an early evaluation of failure and reliability. During the design stages of blood pump development, a failure modes and effects analysis (FMEA) should be completed to provide a concise evaluation of the occurrence and frequency of failures and their effects on the overall support system. Following this analysis, testing of any pump typically involves four sequential processes: performance and reliability testing in simple hydraulic or mock circulatory loops, acute and chronic animal experiments, human error analysis, and ultimately, clinical testing. This article presents recommendations for failure and reliability testing based on the National Institutes of Health (NIH), Society for Thoracic Surgeons (STS) and American Society for Artificial Internal Organs (ASAIO), American National Standards Institute (ANSI), the Association for Advancement of

  10. Searching Remotely Sensed Images for Meaningful Nested Gestalten

    NASA Astrophysics Data System (ADS)

    Michaelsen, E.; Muench, D.; Arens, M.

    2016-06-01

    Even non-expert human observers sometimes still outperform automatic extraction of man-made objects from remotely sensed data. We conjecture that some of this remarkable capability can be explained by Gestalt mechanisms. Gestalt algebra gives a mathematical structure capturing such part-aggregate relations and the laws to form an aggregate called a Gestalt. Primitive Gestalten are obtained from an input image and the space of all possible Gestalt algebra terms is searched for well-assessed instances. This can be a very challenging combinatorial effort. The contribution at hand provides some tools and structures that unfold a finite and comparatively small subset of the possible combinations. Yet the intended Gestalten are still contained and found with high probability and moderate effort. Experiments are made with images obtained from a virtual globe system, and use the SIFT method for extraction of the primitive Gestalten. Comparison is made with manually extracted ground-truth Gestalten salient to human observers.

  11. Silent Reading Fluency Using Underlining: Evidence for an Alternative Method of Assessment

    ERIC Educational Resources Information Center

    Price, Katherine W.; Meisinger, Elizabeth B.; Louwerse, Max M.; D'Mello, Sidney K.

    2012-01-01

    Assessing silent reading fluency in classroom environments is challenging. This article reports on a method of assessing silent reading using underlining, an approach that solves many problems other silent reading fluency assessment measures face. This method computationally monitors readers' silent reading fluency by the speed they underline…

  12. A STANDARDIZED ASSESSMENT METHOD (SAM) FOR RIVERINE MACROINVERTEBRATES

    EPA Science Inventory

    A macroinvertebrate sampling method for large rivers based on desirable characteristics of existing nonwadeable methods was developed and tested. Six sites each were sampled on the Great Miami and Kentucky Rivers, reflecting a human disturbance gradient. Samples were collected ...

  13. DO TIE LABORATORY BASED ASSESSMENT METHODS REALLY PREDICT FIELD EFFECTS?

    EPA Science Inventory

    Sediment Toxicity Identification and Evaluation (TIE) methods have been developed for both porewaters and whole sediments. These relatively simple laboratory methods are designed to identify specific toxicants or classes of toxicants in sediments; however, the question of whethe...

  14. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from the more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by the improvement in computer hardware and software capability and novel computational approaches being slowly recognized by regulatory agencies. These events have helped reduce the reliance on experimental animals as well as better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches as described in the guidance documents from several of the regulatory agencies as it pertains to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.

  15. Cervical spinal cord injury: tailoring clinical trial endpoints to reflect meaningful functional improvements

    PubMed Central

    Bond, Lisa M.; McKerracher, Lisa

    2014-01-01

    Cervical spinal cord injury (SCI) results in partial to full paralysis of the upper and lower extremities. Traditional primary endpoints for acute SCI clinical trials are too broad to assess functional recovery in cervical subjects, raising the possibility of false positive outcomes in trials for cervical SCI. Endpoints focused on the recovery of hand and arm control (e.g., upper extremity motor score, motor level change) show the most potential for use as primary outcomes in upcoming trials of cervical SCI. As the field moves forward, the most reliable way to ensure meaningful clinical testing in cervical subjects may be the development of a composite primary endpoint that measures both neurological recovery and functional improvement. PMID:25317162

  16. Fostering Self-Reflection and Meaningful Learning: Earth Science Professional Development for Middle School Science Teachers

    NASA Astrophysics Data System (ADS)

    Monet, Julie A.; Etkina, Eugenia

    2008-10-01

    This paper describes the analysis of teachers’ journal reflections during an inquiry-based professional development program. As a part of their learning experience, participants reflected on what they learned and how they learned. Progress of subject matter and pedagogical content knowledge was assessed through surveys and pre- and posttests. We found that teachers have difficulties reflecting on their learning and posing meaningful questions. The teachers who could describe how they reasoned from evidence to understand a concept had the highest learning gains. In contrast, those teachers who seldom or never described learning a concept by reasoning from evidence showed the smallest learning gains. This analysis suggests that learning to reflect on one’s learning should be an integral part of teachers’ professional development experiences.

  17. Informativeness Improvement of Hardness Test Methods for Metal Product Assessment

    NASA Astrophysics Data System (ADS)

    Osipov, S.; Podshivalov, I.; Osipov, O.; Zhantybaev, A.

    2016-06-01

    The paper presents a combination of theoretical suggestions, results, and observations that allow improvement of the informativeness of the hardness testing process when assessing metal products in operation. The hardness value of a metal surface obtained by a single measurement is considered to be random. Various measures of location and scattering of the random variable were experimentally estimated for a number of test samples using correlation analysis, and their close interaction was studied. It was found that in metal assessment, the main informative characteristics of the hardness testing process are its average value and mean-square deviation as measures of location and scattering, respectively.
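
    A minimal Python illustration of the two informative characteristics named above, using hypothetical repeated hardness readings:

        import statistics

        # hypothetical repeated Brinell hardness readings from one surface region (HB units)
        readings = [182, 176, 190, 185, 179, 188, 174, 186]

        location = statistics.mean(readings)   # measure of location: average hardness
        scatter = statistics.stdev(readings)   # measure of scattering: sample standard deviation
        print(round(location, 1), round(scatter, 1))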

  18. INTEGRATION OF SPATIAL DATA: METHODS EVALUATION WITH REGARD TO DATA ISSUES AND ASSESSMENT QUESTIONS

    EPA Science Inventory

    EPA's Regional Vulnerability Assessment (REVA) Program is developing and demonstrating approaches to assess current and future environmental vulnerabilities at a regional scale. An initial effort within this research program has been to develop and evaluate methods to synthesize ...

  19. RECOVERY ACT - Methods for Decision under Technological Change Uncertainty and Risk Assessment for Integrated Assessment of Climate Change

    SciTech Connect

    Webster, Mort D.

    2015-11-30

    This report presents the final outcomes and products of the project as performed both at the Massachusetts Institute of Technology and subsequently at Pennsylvania State University. The research project can be divided into three main components: methodology development for decision-making under uncertainty, improving the resolution of the electricity sector to improve integrated assessment, and application of these methods to integrated assessment.

  20. RECOVERY ACT - Methods for Decision under Technological Change Uncertainty and Risk Assessment for Integrated Assessment of Climate Change

    SciTech Connect

    Webster, Mort David

    2015-03-10

    This report presents the final outcomes and products of the project as performed at the Massachusetts Institute of Technology. The research project consists of three main components: methodology development for decision-making under uncertainty, improving the resolution of the electricity sector to improve integrated assessment, and application of these methods to integrated assessment. Results in each area are described in the report.

  1. Assessment of in silico methods to estimate aquatic species sensitivity

    EPA Science Inventory

    Determining the sensitivity of a diversity of species to environmental contaminants continues to be a significant challenge in ecological risk assessment because toxicity data are generally limited to a few standard species. In many cases, QSAR models are used to estimate toxici...

  2. Data screening methods for baseline ecological risk assessments

    SciTech Connect

    Schmeising, L.M.

    1994-12-31

    In conducting ecological risk assessments (ERAs), it is commonplace to take a phased approach in assessing potential impacts to ecological receptors. The first phase, the baseline ecological risk assessment (BERA), often includes a component which involves the systematic screening of the analytical data for abiotic media (i.e., surface water, sediment, surface soil) versus available ecology-based criteria, standards, guidelines, and benchmark values. Examples of ecological benchmark values include applicable toxicity data, such as no observed effects levels (NOELs), lowest observed effects levels (LOELs), or lethal doses (LC50, LD50) for selected indicator species or surrogates. An additional step often included in the screening process is the calculation of ecological quotients (EQs), or environmental concentration/benchmark ratios. The intent of the data screening process in performing BERAs is to determine which contaminants at a site are potentially posing a threat to ecological receptors. These contaminants, known as the ecological contaminants of concern (COCs), are retained for further, detailed evaluations in later phases of the risk assessment. Application of these screening methodologies is presented, along with examples of ecology-based criteria, standards, guidelines, and benchmark values.
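
    The quotient step described above reduces to a simple ratio per contaminant. The Python sketch below computes ecological quotients and flags contaminants of concern; the chemical names, concentrations and benchmark values are invented for illustration, and the EQ >= 1 screening rule is one common convention rather than a value taken from the record.

        # hypothetical site data: measured sediment concentrations vs. ecological benchmarks (mg/kg)
        measured = {"cadmium": 3.2, "zinc": 410.0, "benzo(a)pyrene": 0.08}
        benchmarks = {"cadmium": 0.6, "zinc": 120.0, "benzo(a)pyrene": 0.15}   # e.g. NOEL-based values (illustrative)

        def ecological_quotients(measured, benchmarks):
            # EQ = environmental concentration / benchmark; EQ >= 1 flags a potential contaminant of concern
            return {c: measured[c] / benchmarks[c] for c in measured if c in benchmarks}

        eqs = ecological_quotients(measured, benchmarks)
        cocs = sorted(c for c, eq in eqs.items() if eq >= 1.0)
        print({c: round(eq, 2) for c, eq in eqs.items()})
        print("contaminants of concern:", cocs)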

  3. [The method of quantitative assessment of dentition aesthetic parameters].

    PubMed

    Ryakhovsky, A N; Kalacheva, Ya A

    2016-01-01

    This article describes a formula for calculating an aesthetic index of treatment outcome. The formula was derived from regression equations showing the dependence of visual assessment on the magnitude of aesthetic violations. The formula can be used for objective quantitative evaluation of the aesthetics of the teeth when smiling, before and after dental treatment.

  4. "Portfolios" as a method of assessment in medical education.

    PubMed

    Haldane, Thea

    2014-01-01

    Portfolios are increasingly used in postgraduate medical education and in gastroenterology training as an assessment tool, as documentation of competence, a database of procedure experience (for example endoscopy experience) and for revalidation purposes. In this paper the educational theory behind their use is described and the evidence for their use is discussed.

  5. The Adult Asperger Assessment (AAA): A Diagnostic Method

    ERIC Educational Resources Information Center

    Baron-Cohen, Simon; Wheelwright, Sally; Robinson, Janine; Woodbury-Smith, Marc

    2005-01-01

    At the present time there are a large number of adults who have "suspected" Asperger syndrome (AS). In this paper we describe a new instrument, the Adult Asperger Assessment (AAA), developed in our clinic for adults with AS. The need for a new instrument relevant to the diagnosis of AS in adulthood arises because existing instruments are designed…

  6. Using Qualitative Methods to Assess Diverse Institutional Cultures

    ERIC Educational Resources Information Center

    Museus, Samuel D.

    2007-01-01

    This article focuses on describing how institutional researchers can use qualitative cultural assessments to better understand the role that their campus cultures play in shaping individual and group behaviors and experiences. A special emphasis is given to the implications of institutional diversity in the processes of designing and conducting…

  7. Engine non-containment: UK risk assessment methods

    NASA Technical Reports Server (NTRS)

    Wallin, J. C.

    1977-01-01

    More realistic guideline data must be developed for use in aircraft design in order to comply with recent changes in British civil airworthiness requirements. Unrealistically pessimistic results were obtained when the methodology developed during the Concorde SST certification program was extended to assess catastrophic risks resulting from uncontained engine rotors.

  8. Developing New Tools and Methods for Risk Assessment

    EPA Science Inventory

    Traditionally, risk assessment for environmental chemicals is based upon epidemiological and/or animal toxicity data. Since the release of the National Academy of Sciences Toxicity in the 21st Century: A Vision and a Strategy (2007) and Science and Decisions: Advancing Risk Asses...

  9. A Brief Method for Conducting a Negative-Reinforcement Assessment.

    ERIC Educational Resources Information Center

    Zarcone, Jennifer R.; Crosland, Kimberly; Fisher, Wayne W.; Worsdell, April S.; Herman, Kelly

    1999-01-01

    A brief negative-reinforcement assessment was conducted with five children (ages 4 to 14) with developmental disabilities with severe destructive behavior. Children were trained to engage in an escape response and were presented with a variety of stimuli. For each child, several stimuli were identified that may serve as effective negative…

  10. [The method of quantitative assessment of dentition aesthetic parameters].

    PubMed

    Ryakhovsky, A N; Kalacheva, Ya A

    2016-01-01

    This article describes a formula for calculating an aesthetic index of treatment outcome. The formula was derived from regression equations showing the dependence of visual assessment on the magnitude of aesthetic violations. The formula can be used for objective quantitative evaluation of the aesthetics of the teeth when smiling, before and after dental treatment. PMID:27367198

  11. Paper Trail: One Method of Information Literacy Assessment

    ERIC Educational Resources Information Center

    Nutefall, Jennifer

    2004-01-01

    Assessing students' information literacy skills can be difficult depending on the involvement of the librarian in a course. To overcome this, librarians created an assignment called the Paper Trail, where students wrote a short essay about their research process and reflected on what they would do differently. Through reviewing and grading these…

  12. Out of This World Genetics: A Fun, Simple Assessment Method.

    ERIC Educational Resources Information Center

    Nelson, Julie M.

    2002-01-01

    Presents a science activity in genetics that explains concepts such as dominant and recessive traits, monohybrid and dihybrid crosses, Punnett squares, and Mendel's Laws of Segregation and Independent Assortment. Uses the activity as an assessment tool to measure students' fundamental understanding. (YDS)

  13. [Aesthetics theory and method of landscape resource assessment].

    PubMed

    Wang, Baozhong; Wang, Baoming; He, Ping

    2006-09-01

    With the destruction of the natural environment by human beings, scenic resources are no longer inexhaustible in supply and use. Human beings have begun to place scenic resources on the same important strategic status as other natural resources, and landscape resource assessment is the prerequisite of their sustainable exploitation and conservation. This paper illustrated the psychological mechanisms of aesthetics and its approaches, compared the methodologies of traditional and modern landscape aesthetic research, discussed the characteristics of important aesthetic theories (Platonism, Kant paradigm, Empathizing theory, Gestalt paradigm, Marxism aesthetics theory, and Appleton theory) and the landscape assessment theories of 4 paradigms (expert, psychological, cognitive, and empirical) and 2 groups (landscape environment science and landscape architecture culture), and summarized the important practices and successful examples at home and abroad. It was demonstrated that the historical development of landscape assessment had the feature of a contest between expert- and perception-based approaches, with the expert approach dominating in landscape management and the perception-based approach dominating in landscape research. Both of these approaches generally accepted that landscape quality was derived from the interaction between the biophysical features of the landscape and the perceptual (judgmental) processes of the human viewer. In the future, landscape quality assessment will evolve toward a shaky marriage: both expert and perceptual approaches will be applied in parallel and merged in the final landscape management decision-making process in some as yet unspecified way, landscape information and complex geo-temporal dynamics representation central to scenic ecosystem management will present major challenges to traditional landscape aesthetic assessment, and modern science and technology will continue to help meet these challenges. The main trends of landscape

  14. Neural dissociations between meaningful and mere inconsistency in impression updating.

    PubMed

    Mende-Siedlecki, Peter; Todorov, Alexander

    2016-09-01

    Recent neuroimaging work has identified a network of regions that work in concert to update impressions of other people, particularly in response to inconsistent behavior. However, the specific functional contributions of these regions to the updating process remain unclear. Using fMRI, we tested whether increases in activity triggered by inconsistent behavior reflect changes in the stored representations of other people in response to behavioral inconsistency, or merely a response to the inconsistency itself. Participants encountered a series of individuals whose behavior either changed in an attributionally meaningful fashion or was merely inconsistent with the immediately preceding behavior. We observed that left ventrolateral prefrontal cortex (vlPFC) and left inferior frontal gyrus (IFG) were preferentially recruited in response to unexpected, immoral behavior, whereas a separate set of regions (including dorsal anterior cingulate cortex, posterior cingulate cortex and temporoparietal junction/inferior parietal lobule) was preferentially recruited in response to more mundane inconsistencies in behavior. These results shed light on the distributed systems supporting impression updating. Specifically, while many regions supporting updating may primarily respond to moment-to-moment changes in behavior, a subset of regions (e.g. vlPFC and IFG) may contribute to updating person representations in response to trait-relevant changes in behavior.

  15. Individual olfactory perception reveals meaningful nonolfactory genetic information.

    PubMed

    Secundo, Lavi; Snitz, Kobi; Weissler, Kineret; Pinchover, Liron; Shoenfeld, Yehuda; Loewenthal, Ron; Agmon-Levin, Nancy; Frumin, Idan; Bar-Zvi, Dana; Shushan, Sagit; Sobel, Noam

    2015-07-14

    Each person expresses a potentially unique subset of ∼400 different olfactory receptor subtypes. Given that the receptors we express partially determine the odors we smell, it follows that each person may have a unique nose; to capture this, we devised a sensitive test of olfactory perception we termed the "olfactory fingerprint." Olfactory fingerprints relied on matrices of perceived odorant similarity derived from descriptors applied to the odorants. We initially fingerprinted 89 individuals using 28 odors and 54 descriptors. We found that each person had a unique olfactory fingerprint (P < 10^(-10)), which was odor specific but descriptor independent. We could identify individuals from this pool using randomly selected sets of 7 odors and 11 descriptors alone. Extrapolating from these data, we determined that using 34 odors and 35 descriptors we could individually identify each of the 7 billion people on Earth. Olfactory perception, however, fluctuates over time, calling into question our proposed perceptual readout of presumably stable genetic makeup. To test whether fingerprints remain informative despite this temporal fluctuation, building on the linkage between olfactory receptors and HLA, we hypothesized that olfactory perception may relate to HLA. We obtained olfactory fingerprints and HLA typing for 130 individuals, and found that olfactory fingerprint matching using only four odorants was significantly related to HLA matching (P < 10^(-4)), such that olfactory fingerprints can save 32% of HLA tests in a population screen (P < 10^(-6)). In conclusion, a precise measure of olfactory perception reveals meaningful nonolfactory genetic information.
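
    As a rough sketch of how a perceptual fingerprint of this kind can be built, the Python code below turns an odorants-by-descriptors rating table into a pairwise odorant-similarity matrix and compares two such matrices. Cosine similarity and the Euclidean comparison of off-diagonal entries are assumptions chosen for illustration, not the similarity measure used in the study; the 28x54 shape mirrors the record's odor and descriptor counts but the ratings are random.

        import numpy as np

        def fingerprint(ratings):
            # pairwise odorant-similarity matrix from an (odorants x descriptors) rating table;
            # cosine similarity is one plausible choice for illustration, not the study's measure
            unit = ratings / np.linalg.norm(ratings, axis=1, keepdims=True)
            return unit @ unit.T

        def fingerprint_distance(fp_a, fp_b):
            # compare two fingerprints via the off-diagonal entries of their similarity matrices
            iu = np.triu_indices_from(fp_a, k=1)
            return float(np.linalg.norm(fp_a[iu] - fp_b[iu]))

        # 28 odors rated on 54 descriptors, mirroring the record's counts; ratings here are random
        rng = np.random.default_rng(2)
        person_a = rng.uniform(0.0, 1.0, size=(28, 54))
        person_b = rng.uniform(0.0, 1.0, size=(28, 54))
        print(fingerprint_distance(fingerprint(person_a), fingerprint(person_b)))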

  16. The Pediatric Epilepsy Side Effects Questionnaire: Establishing clinically meaningful change.

    PubMed

    Junger, Katherine W; Morita, Diego; Modi, Avani C

    2015-04-01

    The present study extends the utility of the Pediatric Epilepsy Side Effects Questionnaire (PESQ) by determining distribution-based minimally clinically important difference (MCID) scores. Participants (N=682) were youth (ages 2-25) with newly diagnosed and chronic epilepsy pooled from research and clinical data in the Comprehensive Epilepsy Center. Caregivers completed the PESQ. Demographic and medical data were extracted from medical chart reviews or via a questionnaire. The MCIDs, which are the standard errors of measurement for each scale, for the entire sample were as follows: Cognitive=4.66, Motor=4.67, Behavior=8.05, General Neurological=7.41, Weight=9.58, and Total Side Effects=3.25. Additionally, MCIDs for patients with new-onset (<12months) epilepsy on monotherapy, new-onset epilepsy on polytherapy, chronic epilepsy on monotherapy (>12months), and chronic epilepsy on polytherapy were calculated. Results from the present study extend the utility of the PESQ by providing clinicians and researchers an enhanced understanding about clinically meaningful changes in side effect profiles across the pediatric epilepsy spectrum. These data can inform clinical decision-making for clinicians and researchers. PMID:25842203

  17. The Pediatric Epilepsy Side Effects Questionnaire: Establishing clinically meaningful change

    PubMed Central

    Junger, Katherine W.; Morita, Diego; Modi, Avani C.

    2015-01-01

    The present study extends the utility of the Pediatric Epilepsy Side Effects Questionnaire (PESQ) by determining distribution-based minimally clinically important difference (MCID) scores. Participants (N=682) were youth (ages 2–25) with newly diagnosed and chronic epilepsy pooled from research and clinical data in the Comprehensive Epilepsy Center. Caregivers completed the PESQ. Demographic and medical data were extracted from medical chart reviews or via a questionnaire. The MCIDs, which are the standard errors of measurement for each scale, for the entire sample were: Cognitive = 4.66; Motor = 4.67; Behavior = 8.05; General Neurological = 7.41; Weight = 9.58; Total PESQ = 3.25. Additionally, MCIDs for patients with new-onset (<12 months) epilepsy on monotherapy, new-onset epilepsy on polytherapy, chronic epilepsy on monotherapy (>12 months), and chronic epilepsy on polytherapy were calculated. Results from the present study extend the utility of the PESQ by providing clinicians and researchers an enhanced understanding about clinically meaningful changes in side effect profiles across the pediatric epilepsy spectrum. These data can inform clinical decision making for clinicians and researchers. PMID:25842203

  18. Neural dissociations between meaningful and mere inconsistency in impression updating.

    PubMed

    Mende-Siedlecki, Peter; Todorov, Alexander

    2016-09-01

    Recent neuroimaging work has identified a network of regions that work in concert to update impressions of other people, particularly in response to inconsistent behavior. However, the specific functional contributions of these regions to the updating process remain unclear. Using fMRI, we tested whether increases in activity triggered by inconsistent behavior reflect changes in the stored representations of other people in response to behavioral inconsistency, or merely a response to the inconsistency itself. Participants encountered a series of individuals whose behavior either changed in an attributionally meaningful fashion or was merely inconsistent with the immediately preceding behavior. We observed that left ventrolateral prefrontal cortex (vlPFC) and left inferior frontal gyrus (IFG) were preferentially recruited in response to unexpected, immoral behavior, whereas a separate set of regions (including dorsal anterior cingulate cortex, posterior cingulate cortex and temporoparietal junction/inferior parietal lobule) was preferentially recruited in response to more mundane inconsistencies in behavior. These results shed light on the distributed systems supporting impression updating. Specifically, while many regions supporting updating may primarily respond to moment-to-moment changes in behavior, a subset of regions (e.g. vlPFC and IFG) may contribute to updating person representations in response to trait-relevant changes in behavior. PMID:27217118

  19. The potential for new methods to assess human reproductive genotoxicity

    SciTech Connect

    Mendelsohn, M.L.

    1987-09-01

    The immediate prospects are not good for practical methods for measuring the human heritable mutation rate. The methods discussed here range from speculative to impractical, and at best are sensitive enough only for large numbers of subjects. Given the rapid development of DNA methods and the current status of two-dimensional gel electrophoresis, there is some hope that the intermediate prospects may be better. In contrast, the prospects for useful cellular-based male germinal methods seem more promising and immediate. Effective specific locus methods for sperm are already conceivable and may be practical in a few years. Obviously such methods will not predict heritable effects definitively, but they will provide direct information on reproductive genotoxicity and should contribute significantly to many current medical and environmental situations where genetic damage is suspected. 22 refs.

  20. Peer Assessment in Group Projects Accounting for Assessor Reliability by an Iterative Method

    ERIC Educational Resources Information Center

    Ko, Sung-Seok

    2014-01-01

    This study proposes an advanced method to factor in the contributions of individual group members engaged in an integrated group project using peer assessment procedures. Conway et al. proposed the Individual Weight Factor (IWF) method for peer assessment which has been extensively developed over the years. However, most methods associated with…
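
    For context, the basic (non-iterative) Individual Weight Factor calculation that such methods build on can be sketched as follows in Python; the member names, scores and group mark are hypothetical, and the assessor-reliability iteration proposed in the record is not reproduced here.

        def individual_weight_factors(received_scores):
            # basic IWF: each member's mean received peer score divided by the group mean of those means
            # (a minimal, non-iterative sketch; the reliability-weighted iteration is not reproduced)
            means = {m: sum(s.values()) / len(s) for m, s in received_scores.items()}
            group_mean = sum(means.values()) / len(means)
            return {m: v / group_mean for m, v in means.items()}

        # hypothetical peer ratings: the scores each member received from the other members
        received = {
            "ana": {"ben": 8, "cem": 9, "dee": 7},
            "ben": {"ana": 6, "cem": 7, "dee": 6},
            "cem": {"ana": 9, "ben": 8, "dee": 9},
            "dee": {"ana": 7, "ben": 7, "cem": 8},
        }
        group_mark = 75.0
        weights = individual_weight_factors(received)
        print({m: round(group_mark * w, 1) for m, w in weights.items()})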

  1. Assessing Grammar Teaching Methods Using a Metacognitive Framework.

    ERIC Educational Resources Information Center

    Burkhalter, Nancy

    A study examined 3 grammar teaching methods to understand why some methods may carry over into writing better than others. E. Bialystok and E. B. Ryan's (1985) metacognitive model of language skills was adapted to plot traditional grammar, sentence combining, and the functional/inductive approach according to the amount of analyzed knowledge and…

  2. Assessment of gene order computing methods for Alzheimer's disease

    PubMed Central

    2013-01-01

    Background Computational genomics of Alzheimer disease (AD), the most common form of senile dementia, is a nascent field in AD research. The field includes AD gene clustering by computing gene order, which generates higher quality gene clustering patterns than most other clustering methods. However, there are few available gene order computing methods, such as the Genetic Algorithm (GA) and Ant Colony Optimization (ACO). Further, their performance in gene order computation using AD microarray data is not known. We thus set forth to evaluate the performances of current gene order computing methods with different distance formulas, and to identify additional features associated with gene order computation. Methods Using different distance formulas (Pearson distance, Euclidean distance, and the squared Euclidean distance) and other conditions, gene orders were calculated by the ACO and GA (including standard GA and improved GA) methods, respectively. The qualities of the gene orders were compared, and new features from the calculated gene orders were identified. Results Compared to the GA methods tested in this study, ACO fits the AD microarray data the best when calculating gene order. In addition, the following features were revealed: different distance formulas generated gene orders of different quality, and the commonly used Pearson distance was not the best distance formula when used with both GA and ACO methods for AD microarray data. Conclusion Compared with Pearson distance and Euclidean distance, the squared Euclidean distance generated the best quality gene order computed by GA and ACO methods. PMID:23369541
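
    The distance formulas compared in the study are simple to compute from expression profiles; a common convention, assumed here, is to define Pearson distance as one minus the correlation coefficient. The toy expression vectors below are placeholders for illustration only.

        import numpy as np

        def pearson_distance(x, y):
            return 1.0 - np.corrcoef(x, y)[0, 1]

        def euclidean_distance(x, y):
            return float(np.linalg.norm(x - y))

        def squared_euclidean_distance(x, y):
            return float(np.sum((x - y) ** 2))

        # Hypothetical expression profiles for two genes across four samples.
        gene_a = np.array([1.2, 0.8, 2.1, 1.7])
        gene_b = np.array([1.0, 0.9, 1.8, 1.9])
        for f in (pearson_distance, euclidean_distance, squared_euclidean_distance):
            print(f.__name__, round(f(gene_a, gene_b), 3))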

  3. Assessing the Classroom Potential of the Keyword Method.

    ERIC Educational Resources Information Center

    Levin, Joel R.; And Others

    1979-01-01

    Six investigations of the keyword method of foreign language vocabulary learning were conducted to evaluate differences found in highly structured laboratory-like settings. Two experiments produced keyword effects when the method was administered to small or classroom-sized groups of elementary school children. (Author/RD)

  4. An observational assessment method for aging laboratory rats

    EPA Science Inventory

    The growth of the aging population highlights the need for laboratory animal models to study the basic biological processes of aging and susceptibility to toxic chemicals and disease. Methods to evaluate health of aging animals over time are needed, especially efficient methods for...

  5. Assessing Faculty Salary Compression: An Application of Two Methods.

    ERIC Educational Resources Information Center

    Fraas, John W.

    2002-01-01

    Two methods utilizing multiple regression models and correlation values were used to determine whether faculty salaries at Ashland University (Ohio) reflected salary compression, an unusually small salary differential between junior and senior faculty. Salary compression was not found by either the suppressor effect method, which looked for a…

  6. Method for assessing motor insulation on operating motors

    DOEpatents

    Kueck, John D.; Otaduy, Pedro J.

    1997-01-01

    A method for monitoring the condition of electrical-motor-driven devices. The method is achieved by monitoring electrical variables associated with the functioning of an operating motor, applying these electrical variables to a three phase equivalent circuit and determining non-symmetrical faults in the operating motor based upon symmetrical components analysis techniques.

  7. Method for assessing motor insulation on operating motors

    DOEpatents

    Kueck, J.D.; Otaduy, P.J.

    1997-03-18

    A method for monitoring the condition of electrical-motor-driven devices is disclosed. The method is achieved by monitoring electrical variables associated with the functioning of an operating motor, applying these electrical variables to a three phase equivalent circuit and determining non-symmetrical faults in the operating motor based upon symmetrical components analysis techniques. 15 figs.
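
    The symmetrical components analysis referenced in both records resolves the three phase currents into zero-, positive- and negative-sequence components; a non-negligible negative-sequence component is one indicator of a non-symmetrical fault. The sketch below shows the standard transform with illustrative phasor values; nothing in it is taken from the patents themselves.

        import numpy as np

        a = np.exp(2j * np.pi / 3)                        # 120-degree rotation operator
        A_inv = np.array([[1, 1, 1],
                          [1, a, a**2],
                          [1, a**2, a]]) / 3.0

        # Illustrative phase-current phasors (amps); a healthy motor is nearly balanced.
        I_abc = np.array([10.0,
                          10.0 * np.exp(-2j * np.pi / 3),
                          9.2 * np.exp(2j * np.pi / 3)])  # slight imbalance on phase C

        I0, I1, I2 = A_inv @ I_abc
        print(f"negative/positive sequence ratio: {abs(I2) / abs(I1):.3f}")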

  8. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems, by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  9. When Collaborative Learning Meets Nature: Collaborative Learning as a Meaningful Learning Tool in the Ecology Inquiry Based Project

    NASA Astrophysics Data System (ADS)

    Rozenszayn, Ronit; Ben-Zvi Assaraf, Orit

    2011-01-01

    This research suggests utilizing collaborative learning among high school students for better performance on ecology inquiry-based projects. A case study of nine 12th grade students who participated in collaborative learning sessions in the open field and in class is examined. The results show that the students concentrated on discussing the methods of measurement and observation in the open field, rather than the known methods from class or from the laboratory. Another major part of their discussions concentrated on knowledge construction. Knowledge construction occurred between students with same or similar learning abilities. The role of the teacher in these discussions was crucial: she had to deal with and dispel misconceptions; and she had to bridge the gap between low-ability and high-ability students, for enabling meaningful learning to occur. The article ends with a number of recommendations for using collaborative learning as a tool for achieving meaningful learning in high school ecology inquiry-based projects.

  10. Comparative assessment of bioanalytical method validation guidelines for pharmaceutical industry.

    PubMed

    Kadian, Naveen; Raju, Kanumuri Siva Rama; Rashid, Mamunur; Malik, Mohd Yaseen; Taneja, Isha; Wahajuddin, Muhammad

    2016-07-15

    The concepts, importance, and application of bioanalytical method validation have been discussed for a long time, and validation of bioanalytical methods is widely accepted as pivotal before they are taken into routine use. The United States Food and Drug Administration (USFDA) guidelines issued in 2001 have been the reference for every guideline released since, be it from the European Medicines Agency (EMA) in Europe, the National Health Surveillance Agency (ANVISA) in Brazil, the Ministry of Health, Labour and Welfare (MHLW) in Japan, or any other authority issuing guidance on bioanalytical method validation. After 12 years, the USFDA released its new draft guideline for comments in 2013, which covers the latest parameters and topics encountered in bioanalytical method validation and moves towards harmonization of bioanalytical method validation across the globe. Even though the regulatory agencies are in general agreement, significant variations exist in acceptance criteria and methodology. The present review highlights the variations, similarities and comparison between bioanalytical method validation guidelines issued by major regulatory authorities worldwide. Additionally, other evaluation parameters such as matrix effect and incurred sample reanalysis, including other stability aspects, are discussed to make it easier to design a bioanalytical method and its validation in compliance with the majority of drug authority guidelines. PMID:27179186

  12. Error assessment in recombinant baculovirus titration: evaluation of different methods.

    PubMed

    Roldão, António; Oliveira, Rui; Carrondo, Manuel J T; Alves, Paula M

    2009-07-01

    The success of the baculovirus/insect cell system in heterologous protein expression depends on the robustness and efficiency of the production workflow. It is essential that process parameters are controlled and include as little variability as possible. The multiplicity of infection (MOI) is the most critical factor, since irreproducible MOIs caused by inaccurate estimation of viral titers hinder batch consistency and process optimization. This lack of accuracy is related to intrinsic characteristics of the method, such as the inability to distinguish between infectious and non-infectious baculovirus. In this study, several methods for baculovirus titration were compared. The most critical issues identified were the incubation time and the cell concentration at the time of infection. These variables strongly influence the accuracy of titers and must be defined for optimal performance of the titration method. Although the standard errors of the methods varied significantly (7-36%), titers were within the same order of magnitude; thus, viral titers can be considered independent of the method of titration. A cost analysis of the baculovirus titration methods used in this study showed that the alamarBlue, real-time Q-PCR and plaque assays were the most expensive techniques. The remaining methods cost on average 75% less than the former methods. Based on the cost, time and error analysis undertaken in this study, the end-point dilution assay, microculture tetrazolium assay and flow cytometric assay were found to be the techniques that best balance these three factors. Nevertheless, it is always recommended to confirm the accuracy of the titration either by comparison with a well characterized baculovirus reference stock or by titration using two different methods and verification of the variability of results.

  13. A method to assess the bacterial content of refrigerated meat.

    PubMed Central

    Perez de Castro, B; Asensio, M A; Sanz, B; Ordoñez, J A

    1988-01-01

    A new method has been developed to estimate the levels of gram-negative bacteria on refrigerated meat. The method is based on the aminopeptidase activity of these bacteria, which cleaves L-alanine-p-nitroanilide to yield p-nitroaniline, which is easily determined spectrophotometrically. This method allows the determination of levels around 10^6 to 10^7 CFU cm^-2 in about 3 h. Because of the yellow color of p-nitroaniline, bacterial loads around 10^7 CFU cm^-2 develop a color intense enough to be detected with the naked eye. PMID:3415222

  14. Assessment of left ventricular function by noninvasive methods.

    PubMed

    Luisada, A A; Singhal, A; Portaluppi, F

    1985-01-01

    The possibility of evaluating left ventricular function by noninvasive methods is discussed in detail. The methods that are considered are electrocardiography, phonocardiography, apex cardiography, sphygmography, impedance cardiography, electrokymography, and echocardiography. Following a brief section of 'definitions', each method is described in detail, including technical problems, difficulties, and results. The systolic time intervals and the stress tests are briefly discussed. Based on modern experimental studies, the stress test should include both an electro- and a phonocardiogram. In the latter, one would measure the amplitude of the first heart sound as an index of contractility. The conclusion is that combined methods give the best results. They are electrocardiography, phonocardiography, impedance cardiography, and echocardiography. An alternative, dictated by technical problems, is to use first phonocardiography and impedance cardiography plus electrocardiography; then echocardiography plus electrocardiography; and then, if indicated, a stress test might complete the study; the latter should include both an electrocardiogram and a phonocardiogram. PMID:4003144

  15. Geostatistical methods for hazard assessment and site characterization in mining

    SciTech Connect

    Riefenberg, J.

    1996-12-01

    Ground control hazards, coal quality, ore reserve estimation, and pollution modeling seem unrelated topics from most mining perspectives. However, geostatistical methods can be used to characterize each of these, and more topics. Exploratory drill core data, and continued drilling and field measurements, can provide a wealth of information related to each of the above areas and are often severely underutilized. Recent studies have led to the development of the Multiple Parameter Mapping (MPM) technology, which utilizes geostatistics and other numerical modeling methods, to generate a "hazard index" map, often from exploratory drill core data. This mapping has been presented for ground control hazards relating roof quality, floor quality, numerically modelled stresses due to mining geometry, and geologic features. A review of the MPM method, future directions with the MPM, and a discussion of using these and other geostatistical methods to quantify coal quality, ore reserve estimation, and pollutant modeling are presented in this paper.

  16. Anti-aging cosmetics and its efficacy assessment methods

    NASA Astrophysics Data System (ADS)

    Li, Xiang

    2015-07-01

    The mechanisms of skin aging, the active ingredients used in anti-aging cosmetics, and the evaluation methods for anti-aging cosmetics are summarized in this paper. The mechanisms of skin aging are described in terms of intrinsic and extrinsic factors, and the anti-aging active ingredients are classified according to their mechanism of action. Various evaluation methods, such as human evaluation and in vitro evaluation, are also summarized.

  17. Quantitative assessment of susceptibility weighted imaging processing methods

    PubMed Central

    Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.

    2013-01-01

    Purpose To evaluate different susceptibility weighted imaging (SWI) phase processing methods and parameter selection, thereby improving understanding of potential artifacts, as well as facilitating choice of methodology in clinical settings. Materials and Methods Two major phase processing methods, Homodyne-filtering and phase unwrapping-high pass (HP) filtering, were investigated with various phase unwrapping approaches, filter sizes, and filter types. Magnitude and phase images were acquired from a healthy subject and brain injury patients on a 3T clinical Siemens MRI system. Results were evaluated based on image contrast to noise ratio and presence of processing artifacts. Results When using a relatively small filter size (32 pixels for the matrix size 512 × 512 pixels), all Homodyne-filtering methods were subject to phase errors leading to 2% to 3% masked brain area in lower and middle axial slices. All phase unwrapping-filtering/smoothing approaches demonstrated fewer phase errors and artifacts compared to the Homodyne-filtering approaches. For performing phase unwrapping, Fourier-based methods, although less accurate, were 2–4 orders of magnitude faster than the PRELUDE, Goldstein and Quality-guide methods. Conclusion Although Homodyne-filtering approaches are faster and more straightforward, phase unwrapping followed by HP filtering approaches perform more accurately in a wider variety of acquisition scenarios. PMID:24923594
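
    The core of the homodyne approach is to divide the complex image by a low-pass-filtered copy of itself and keep the phase of the quotient, which removes slowly varying background phase. The sketch below illustrates that step with a Gaussian k-space filter; the image is synthetic and the filter width is an arbitrary assumption, not a parameter recommendation from the paper.

        import numpy as np

        def homodyne_highpass_phase(complex_img, kernel_frac=0.0625):
            """High-pass phase image via complex division by a low-pass-filtered copy."""
            ny, nx = complex_img.shape
            ky, kx = np.meshgrid(np.fft.fftfreq(ny), np.fft.fftfreq(nx), indexing="ij")
            lowpass = np.exp(-(kx**2 + ky**2) / (2 * kernel_frac**2))   # Gaussian k-space window
            smooth = np.fft.ifft2(np.fft.fft2(complex_img) * lowpass)
            return np.angle(complex_img / (smooth + 1e-12))

        # Synthetic 64x64 complex image with a slowly varying background phase.
        y, x = np.mgrid[0:64, 0:64]
        img = np.exp(1j * (0.05 * x + 0.03 * y))
        print(homodyne_highpass_phase(img).std())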

  18. The Assessment of Experimental Methods of Serial Number Restoration

    NASA Astrophysics Data System (ADS)

    Argo, Mackenzie

    Serial number restoration is a common and successful process for revealing obliterated serial numbers on firearms. In a crime laboratory setting, obliterated serial numbers are routinely processed in order to tie a person to a crime scene or provide an investigative lead for officers. Currently, serial numbers are restored using a chemical etchant method that can continue to eat away at the metal of the firearm even after the examination is complete; it can also take several hours and may yield only a partial number. Other nondestructive options exist, but little to no literature on them is available. The purpose of this study is to explore new methods for nondestructive serial number restoration and to compare them to the traditional chemical method. Metal bars of premeasured obliteration depths and different compositions were examined using three proposed experimental methods: near infrared imaging, cold frost, and scanning acoustic microscopy. Results did not indicate a significant difference in the median number of visible digits recovered by each of the three proposed methods compared to the traditional chemical method. There were, however, significant differences in the median number of visible digits by bar composition and depth of obliteration, indicating that firearm composition and obliteration depth have an effect on serial number restoration.

  19. A model for selecting assessment methods for evaluating medical students in African medical schools.

    PubMed

    Walubo, Andrew; Burch, Vanessa; Parmar, Paresh; Raidoo, Deshandra; Cassimjee, Mariam; Onia, Rudy; Ofei, Francis

    2003-09-01

    Introduction of more effective and standardized assessment methods for testing students' performance in Africa's medical institutions has been hampered by severe financial and personnel shortages. Nevertheless, some African institutions have recognized the problem and are now revising their medical curricula, and, therefore, their assessment methods. These institutions, and those yet to come, need guidance on selecting assessment methods so as to adopt models that can be sustained locally. The authors provide a model for selecting assessment methods for testing medical students' performance in African medical institutions. The model systematically evaluates factors that influence implementation of an assessment method. Six commonly used methods (the essay examinations, short-answer questions, multiple-choice questions, patient-based clinical examination, problem-based oral examination [POE], and objective structured clinical examination) are evaluated by scoring and weighting against performance, cost, suitability, and safety factors. In the model, the highest score identifies the most appropriate method. Selection of an assessment method is illustrated using two institutional models, one depicting an ideal situation in which the objective structured clinical examination was preferred, and a second depicting the typical African scenario in which the essay and short-answer-question examinations were best. The POE method received the highest score and could be recommended as the most appropriate for Africa's medical institutions, but POE assessments require changing the medical curricula to a problem-based learning approach. The authors' model is easy to understand and promotes change in the medical curriculum and method of student assessment.

  20. Non-invasive methods of assessing the tear film.

    PubMed

    Yokoi, Norihiko; Komuro, Aoi

    2004-03-01

    The interaction between the tear film and the ocular surface epithelium is crucial for the maintenance of ocular surface health; interference with this relationship may cause dry eye. Several diagnostic techniques have been developed to assess the tear film and diagnose dry eye but many of these tests are invasive and modify the parameter which they are designed to measure. Non-invasive or minimally invasive tests may overcome this problem and provide more reproducible and objective data. One test of this kind is meniscometry, which is particularly useful in assessing tear volume indirectly by measuring tear meniscus radius. The newly developed video-meniscometer, which enables calculation of the meniscus radius digitally, is useful for the diagnosis of tear-deficient dry eye. Video-meniscometry also has other applications, to the study of tear and eye drop turnover, determining the indication for punctal plugs and in demonstrating dysfunction of the tear meniscus. Interferometry of the tear film lipid layer is useful in screening and evaluating dry eye severity and in selecting dry eye candidates for punctal occlusion. It is also useful for analysing tear lipid layer pathophysiology more clearly, especially in combination with meniscometry. Meibometry is a minimally invasive technique to quantify the amount of meibomian lipid on the lid margin. Lipid is blotted onto a plastic tape and the change in optical density is used to calculate lipid uptake. Laser meibometry has increased the scope of this technique for the assessment of meibomian gland dysfunction; also, the delivery of lipids from the lid reservoir to the preocular tear film can be analysed using interferometry and laser meibometry. The present report reviews the application of these techniques to the study of tear film physiology and dry eye.

  1. Caries assessment: establishing mathematical link of clinical and benchtop method

    NASA Astrophysics Data System (ADS)

    Amaechi, Bennett T.

    2009-02-01

    It is well established that the development of new technologies for early detection and quantitative monitoring of dental caries at its early stage could provide health and economic benefits ranging from timely preventive interventions to reduction of the time required for clinical trials of anti-caries agents. However, the new technologies currently used in the clinical setting cannot assess and monitor caries using the actual mineral concentration within the lesion, while laboratory-based microcomputed tomography (MCT) has been shown to possess this capability. Thus we envision that establishing mathematical equations relating the measurements of each of the clinical technologies to those of MCT will enable the mineral concentration of lesions detected and assessed in clinical practice to be extrapolated from the equation, which will facilitate preventive care in dentistry and lower treatment costs. We used MCT and the two prominent clinical caries assessment devices (Quantitative Light-induced Fluorescence [QLF] and Diagnodent) to longitudinally monitor the development of caries in a continuous flow mixed-organisms biofilm model (artificial mouth), and then used the collected data to establish mathematical equations relating the measurements of each of the clinical technologies to those of MCT. A linear correlation was observed between the measurements of MCT and those of QLF and Diagnodent. Thus the mineral density in a carious lesion detected and measured using QLF or Diagnodent can be extrapolated using the developed equation. This highlights the usefulness of MCT for monitoring the progress of an early caries lesion being treated with therapeutic agents in clinical practice or trials.
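
    Relating a clinical device reading to a reference measurement in this way amounts to fitting a calibration line and then evaluating it for new readings. The snippet below shows the generic procedure; the paired readings are fabricated placeholders, not data from the study.

        import numpy as np

        # Hypothetical paired measurements: QLF fluorescence loss (%) vs. microCT mineral density.
        qlf = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
        mct = np.array([0.92, 0.84, 0.77, 0.69, 0.61])

        slope, intercept = np.polyfit(qlf, mct, 1)          # least-squares calibration line
        print(f"mineral density ~= {slope:.4f} * QLF + {intercept:.4f}")
        print("predicted density at QLF=12:", round(slope * 12 + intercept, 3))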

  2. Assessing Internet energy intensity: A review of methods and results

    SciTech Connect

    Coroama, Vlad C.; Hilty, Lorenz M.

    2014-02-15

    Assessing the average energy intensity of Internet transmissions is a complex task that has been a controversial subject of discussion. Estimates published over the last decade diverge by up to four orders of magnitude — from 0.0064 kilowatt-hours per gigabyte (kWh/GB) to 136 kWh/GB. This article presents a review of the methodological approaches used so far in such assessments: i) top–down analyses based on estimates of the overall Internet energy consumption and the overall Internet traffic, whereby average energy intensity is calculated by dividing energy by traffic for a given period of time, ii) model-based approaches that model all components needed to sustain an amount of Internet traffic, and iii) bottom–up approaches based on case studies and generalization of the results. Our analysis of the existing studies shows that the large spread of results is mainly caused by two factors: a) the year of reference of the analysis, which has significant influence due to efficiency gains in electronic equipment, and b) whether end devices such as personal computers or servers are included within the system boundary or not. For an overall assessment of the energy needed to perform a specific task involving the Internet, it is necessary to account for the types of end devices needed for the task, while the energy needed for data transmission can be added based on a generic estimate of Internet energy intensity for a given year. Separating the Internet as a data transmission system from the end devices leads to more accurate models and to results that are more informative for decision makers, because end devices and the networking equipment of the Internet usually belong to different spheres of control. -- Highlights: • Assessments of the energy intensity of the Internet differ by a factor of 20,000. • We review top–down, model-based, and bottom–up estimates from literature. • Main divergence factors are the year studied and the inclusion of end devices

  3. Purpose in Life Test assessment using latent variable methods.

    PubMed

    Harlow, L L; Newcomb, M D; Bentler, P M

    1987-09-01

    A psychometric assessment was conducted on a slightly revised version of the Purpose in Life Test (PIL-R). Factor analyses revealed a large general factor plus four primary factors comprising lack of purpose in life, positive sense of purpose, motivation for meaning, and existential confusion. Validity models showed that the PIL-R was positively related to a construct of happiness and was negatively related to suicidality and meaninglessness. Reliability estimates ranged from 0.78 to 0.86. The revised version can be presented compactly and may be less confusing to subjects than the original PIL. PMID:3664045

  4. Apparatus and Method for Assessing Vestibulo-Ocular Function

    NASA Technical Reports Server (NTRS)

    Shelhamer, Mark J. (Inventor)

    2015-01-01

    A system for assessing vestibulo-ocular function includes a motion sensor system adapted to be coupled to a user's head; a data processing system configured to communicate with the motion sensor system to receive the head-motion signals; a visual display system configured to communicate with the data processing system to receive image signals from the data processing system; and a gain control device arranged to be operated by the user and to communicate gain adjustment signals to the data processing system.

  5. Signal Processing Methods for Liquid Rocket Engine Combustion Stability Assessments

    NASA Technical Reports Server (NTRS)

    Kenny, R. Jeremy; Lee, Erik; Hulka, James R.; Casiano, Matthew

    2011-01-01

    The J2X Gas Generator engine design specifications include dynamic, spontaneous, and broadband combustion stability requirements. These requirements are verified empirically based on high-frequency chamber pressure measurements and analyses. Dynamic stability is determined from the dynamic pressure response due to an artificial perturbation of the combustion chamber pressure (bomb testing), and spontaneous and broadband stability are determined from the dynamic pressure responses during steady operation starting at specified power levels. J2X Workhorse Gas Generator testing included bomb tests with multiple hardware configurations and operating conditions, including a configuration used explicitly for the engine verification test series. This work covers signal processing techniques developed at Marshall Space Flight Center (MSFC) to help assess engine design stability requirements. Dynamic stability assessments were performed following both the CPIA 655 guidelines and an MSFC in-house developed statistics-based approach. The statistical approach was developed to better verify when the dynamic pressure amplitudes corresponding to a particular frequency returned to pre-bomb characteristics. This was accomplished by first determining the statistical characteristics of the pre-bomb dynamic levels. The pre-bomb statistical characterization provided 95% coverage bounds; these bounds were used as a quantitative measure to determine when the post-bomb signal returned to pre-bomb conditions. The time for post-bomb levels to acceptably return to pre-bomb levels was compared to the dominant frequency-dependent time recommended by CPIA 655. Results for multiple test configurations, including stable and unstable configurations, were reviewed. Spontaneous stability was assessed using two processes: 1) characterization of the ratio of the peak response amplitudes to the excited chamber acoustic mode amplitudes and 2) characterization of the variability of the peak response
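
    A simple way to implement the statistical recovery check described here is to form percentile bounds from the pre-bomb amplitude samples and then find the first post-bomb time at which the amplitude falls back inside those bounds. The sketch below illustrates that logic on synthetic data; the 95% percentile bounds and the decaying post-bomb signal are assumptions, not the MSFC implementation.

        import numpy as np

        rng = np.random.default_rng(0)
        pre_bomb = rng.normal(1.0, 0.1, 500)                  # steady-state amplitudes (arbitrary units)
        lo, hi = np.percentile(pre_bomb, [2.5, 97.5])         # 95% coverage bounds

        t = np.arange(500)
        post_bomb = 1.0 + 4.0 * np.exp(-t / 60.0) + rng.normal(0, 0.1, 500)

        inside = (post_bomb >= lo) & (post_bomb <= hi)
        recovered_at = int(np.argmax(inside)) if inside.any() else None
        print("bounds:", round(lo, 3), round(hi, 3), "recovery index:", recovered_at)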

  6. Alternative methods of nutritional status assessment in adolescents.

    PubMed

    Jorga, Jagoda; Marinković, Jelena; Kentrić, Brana; Hetherington, Marion

    2007-06-01

    The main objective of this cross-sectional study was to determine the validity of the silhouette rating scale and reported values of height and weight in assessing weight status in a group of adolescents. 245 adolescents, students of the Belgrade elementary school, aged 11-14 (12.33 +/- 0.50), were involved. Weight status was assessed by anthropometry, self-reported height and weight and by figure rating scale. From the results obtained significant differences emerged as a function of weight status. The majority of normal weight adolescents were accurate in reporting their body size. The percentage of under-reporters was significantly higher in the overweight/obese group than in the normal weight group (chi2 = 9.741, p = 0.003). The correlation between BMI, both measured and self-reported, and perceived body size was positive and highly significant (p < 0.001). Self-reported weight and height appears acceptable for estimating weight status in normal weight adolescents, but not in those who are overweight or obese. This study also demonstrated that adolescents can estimate with some accuracy their body size using figure ratings scales.

  7. Participation in health impact assessment: objectives, methods and core values.

    PubMed Central

    Wright, John; Parry, Jayne; Mathers, Jonathan

    2005-01-01

    Health impact assessment (HIA) is a multidisciplinary aid to decision-making that assesses the impact of policy on public health and on health inequalities. Its purpose is to assist decision-makers to maximize health gains and to reduce inequalities. The 1999 Gothenburg Consensus Paper (GCP) provides researchers with a rationale for establishing community participation as a core value of HIA. According to the GCP, participation in HIA empowers people within the decision-making process and redresses the democratic deficit between government and society. Participation in HIA generates a sense that health and decision-making is community-owned, and the personal experiences of citizens become integral to the formulation of policy. However, the participatory and empowering dimensions of HIA may prove difficult to operationalize. In this review of the participation strategies adopted in key applications of HIA in the United Kingdom, we found that HIA's aim of influencing decision-making creates tension between its participatory and knowledge-gathering dimensions. Accordingly, researchers have decreased the participatory dimension of HIA by reducing the importance attached to the community's experience of empowerment, ownership and democracy, while enlarging its knowledge-gathering dimension by giving pre-eminence to "expert" and "research-generated" evidence. Recent applications of HIA offer a serviceable rationale for participation as a means of information gathering and it is no longer tenable to uphold HIA as a means of empowering communities and advancing the aims of participatory democracy. PMID:15682250

  8. Developing suitable methods of nutritional status assessment: a continuous challenge.

    PubMed

    Elmadfa, Ibrahim; Meyer, Alexa L

    2014-09-01

    Reliable information about the nutritional status is essential to identify potential critical nutrients and the population groups at risk of deficiency, as well as to develop effective public health policies to counteract unfavorable nutrition patterns that contribute to morbidity and mortality. In this review, the important role of biomarkers in the assessment of nutritional status is outlined, major strengths and limitations of established and new biomarkers are described, and important criteria for biomarker selection and development are discussed. Indeed, biomarkers offer a more objective assessment tool than pure dietary approaches that suffer from inadequate data reporting in particular, although biomarkers are often only measured in subsamples because of the higher costs and proband burden they entail. However, biomarkers are subject to individual variability and influences from other factors besides the nutrient of interest. Rapid turnover or tight control of nutrient concentrations in blood (homeostasis) limits their sensitivity as biomarkers, as in the case of many trace elements. The existence of different forms of a micronutrient in the body adds additional complexity. Functional biomarkers, such as enzyme activities, mirror long-term status better but are subject to confounding factors, and some are influenced by several micronutrients, not specific for only 1, so using a combination of biomarkers is advisable. Additionally, the applicability of a biomarker also depends on the existence of adequate reference values and cutoff points for the target population. Therefore, a careful selection is warranted, especially when biomarkers are to be used in larger samples.

  9. Assessment of Geostatistical Methods in Drought Monitoring Systems

    NASA Astrophysics Data System (ADS)

    Shahabfar, A.; Eitzinger, J.

    2009-09-01

    Drought monitoring is one of the essential components of drought risk management, and drought has become a recurrent phenomenon in Iran in the last few decades. With the aim of constructing a drought monitoring system for Iran, and building on the authors' previous results, three drought indices that perform well in detecting and measuring drought intensity, namely the China-Z Index (CZI), the modified CZI (MCZI) and the Z-Score, were calculated for 180 weather stations located in 10 separate agro-climatic zones in Iran. To find, evaluate and refine an appropriate interpolation method, several geostatistical methods, including ordinary kriging (with spherical, circular, exponential, Gaussian and linear models), Inverse Distance Weighting (IDW) and spline, were applied, and all calculated drought indices were interpolated over the 10 agro-climatic zones. The performance of the seven methods was evaluated and compared using monthly data and cross-validation. The comparison criteria were Mean Absolute Error (MAE) and Mean Bias Error (MBE). The results indicate that although ordinary kriging is the most accurate method overall, Inverse Distance Weighting and spline give reasonable, and in several agro-climatic zones more accurate, results and can be used as high-performance geostatistical tools for interpolating different drought indices in Iran. Key words: drought monitoring, drought indices, geostatistical methods, interpolation.
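
    Leave-one-out cross-validation of an interpolator such as IDW, with MAE and MBE as comparison criteria, can be expressed compactly. The sketch below is a generic illustration under assumed station coordinates and index values; it is not the authors' code.

        import numpy as np

        def idw(xy_known, values, xy_target, power=2.0):
            """Inverse distance weighted estimate at a single target location."""
            d = np.linalg.norm(xy_known - xy_target, axis=1)
            if np.any(d == 0):
                return float(values[np.argmin(d)])
            w = 1.0 / d**power
            return float(np.sum(w * values) / np.sum(w))

        # Hypothetical station coordinates and drought-index values.
        rng = np.random.default_rng(1)
        xy = rng.uniform(0, 100, size=(30, 2))
        z = np.sin(xy[:, 0] / 20) + 0.1 * rng.normal(size=30)

        # Leave-one-out cross-validation errors.
        errors = np.array([idw(np.delete(xy, i, 0), np.delete(z, i), xy[i]) - z[i]
                           for i in range(len(z))])
        print("MAE:", np.abs(errors).mean(), "MBE:", errors.mean())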

  10. Assessing Autonomous Learning in Research Methods Courses: Implementing the Student-Driven Research Project

    ERIC Educational Resources Information Center

    Vandiver, Donna M.; Walsh, Jeffrey A.

    2010-01-01

    As empirical assessments of teaching strategies increase in many disciplines and across many different courses, a paucity of such assessment seems to exist in courses devoted to social science research methods. This lack of assessment and evaluation impedes progress in developing successful teaching pedagogy. The teaching-learning issue addressed…

  11. Responding to the Crisis of Accountability: A Review of Program Assessment Methods.

    ERIC Educational Resources Information Center

    Haley, Eric; Jackson, DeForrest

    The advertising program at the University of Tennessee, Knoxville (UTK) has at least 12 measures of program assessment, which serve as a basis for discussion rather than as a prescription for an effective assessment program. The program assessment methods are accrediting, internal program review, teaching evaluations, a university survey of…

  12. A Method for Evaluating Competency in Assessment and Management of Suicide Risk

    ERIC Educational Resources Information Center

    Hung, Erick K.; Binder, Renee L.; Fordwood, Samantha R.; Hall, Stephen E.; Cramer, Robert J.; McNiel, Dale E.

    2012-01-01

    Objective: Although health professionals increasingly are expected to be able to assess and manage patients' risk for suicide, few methods are available to evaluate this competency. This report describes development of a competency-assessment instrument for suicide risk-assessment (CAI-S), and evaluates its use in an objective structured clinical…

  13. PWSCC Assessment by Using Extended Finite Element Method

    NASA Astrophysics Data System (ADS)

    Lee, Sung-Jun; Lee, Sang-Hwan; Chang, Yoon-Suk

    2015-12-01

    The head penetration nozzle of the control rod driving mechanism (CRDM) is known to be susceptible to primary water stress corrosion cracking (PWSCC) due to welding-induced residual stress. In particular, the J-groove dissimilar metal weld regions have received much attention in previous studies. However, even though several advanced techniques such as the weight function and finite element alternating methods have been introduced to predict the occurrence of PWSCC, difficulties remain with respect to applicability and efficiency. In this study, the extended finite element method (XFEM), which allows convenient crack modeling by enriching the degrees of freedom (DOF) with special displacement functions, was employed to evaluate the structural integrity of the CRDM head penetration nozzle. To verify the reliability of the proposed method, the resulting stress intensity factors of surface cracks were compared with those given in the American Society of Mechanical Engineers (ASME) code. The detailed results from the FE analyses are fully discussed in the manuscript.

  14. Automated manometric method to assess anaerobic toxicity of chemicals.

    PubMed

    Fdz-Polanco, F; Nieto, P; Pérez-Elvira, S I; Fdz-Polanco, M

    2006-01-01

    Industrial additives used for various purposes (antifoaming, cleaning, bactericides, antiscale, etc.) are discharged to the wastewater treatment plant. The anaerobic toxicity of these commercial products is not provided by suppliers. A new manometric method was developed and tested to evaluate anaerobic toxicity or inhibition using four different commercial products: the antifoaming agent Cleron 6 (50-200 ppm), the bactericide Divosan-forte (0.05-1.0% v/v), bleach (0.1-1.0% v/v) and the cleaning agent Topax 66 (0.10-1.0% v/v). As in the different methods proposed in the literature, from the methane production rate it is possible to calculate both the evolution of methanogenic activity and the final substrate removal, and to quantify the potential inhibitory effect of commercial additives. The experimental method is simple and reliable.

  15. Effects of Node-Link Mapping on Non-Science Majors' Meaningful Learning and Conceptual Change in a Life-Science Survey Lecture Course

    ERIC Educational Resources Information Center

    Park-Martinez, Jayne Irene

    2011-01-01

    The purpose of this study was to assess the effects of node-link mapping on students' meaningful learning and conceptual change in a 1-semester introductory life-science course. This study used node-link mapping to integrate and apply the National Research Council's (NRC, 2005) three principles of human learning: engaging students' prior…

  16. An optical method to assess water clarity in coastal waters.

    PubMed

    Kulshreshtha, Anuj; Shanmugam, Palanisamy

    2015-12-01

    Accurate estimation of water clarity in coastal regions is highly desired by various activities such as search and recovery operations, dredging and water quality monitoring. This study intends to develop a practical method for estimating water clarity based on a larger in situ dataset, which includes Secchi depth (Zsd), turbidity, chlorophyll and optical properties from several field campaigns in turbid coastal waters. The Secchi depth parameter is found to vary closely with the concentration of suspended sediments, the vertical diffuse attenuation coefficient Kd (m^-1) and the beam attenuation coefficient c (m^-1). The optical relationships obtained for the selected wavelengths (i.e. 520, 530 and 540 nm) exhibit an inverse relationship between Secchi depth and the length attenuation coefficient (1/(c + Kd)). The variation in Secchi depth is expressed in terms of an undetermined coupling coefficient, which is composed of a light penetration factor (expressed by z(1%)Kd(λ)) and a correction factor (ξ) essentially governed by the turbidity of the water column. This method of estimating water clarity was validated using independent in situ data from turbid coastal waters, and its results were compared with those obtained from the existing methods. The statistical analysis of the measured and the estimated Zsd showed that the present method yields lower error when compared to the existing methods. The spatial structures of the measured and predicted Zsd are also highly consistent with in situ data, which indicates the potential of the present method for estimating water clarity in turbid coastal and associated lagoon waters.
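
    The inverse relationship reported here can be written as Zsd ≈ Γ / (c + Kd), where Γ is the coupling coefficient the authors describe. The snippet below evaluates that form for a few illustrative attenuation values; the numerical value of Γ is an assumption, since the fitted coefficients are not reproduced in the abstract.

        def secchi_depth(c, kd, gamma=6.0):
            """Zsd (m) from beam attenuation c and diffuse attenuation Kd (both m^-1).

            gamma is an assumed coupling coefficient, not the fitted value from the paper.
            """
            return gamma / (c + kd)

        for c, kd in [(0.5, 0.2), (1.5, 0.6), (3.0, 1.2)]:
            print(f"c={c}, Kd={kd} -> Zsd ~ {secchi_depth(c, kd):.2f} m")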

  17. Methods to assess Drosophila heart development, function and aging

    PubMed Central

    Ocorr, Karen; Vogler, Georg; Bodmer, Rolf

    2014-01-01

    In recent years the Drosophila heart has become an established model of many different aspects of human cardiac disease. This model has allowed identification of disease-causing mechanisms underlying congenital heart disease and cardiomyopathies and has permitted the study of underlying genetic, metabolic and age-related contributions to heart function. In this review we discuss methods currently employed in the analysis of Drosophila heart structure and function, such as optical methods to infer heart function and performance and electrophysiological and mechanical approaches to characterize cardiac tissue properties, and we conclude with histological techniques used in the study of heart development and adult structure. PMID:24727147

  18. Sequential sampling: a novel method in farm animal welfare assessment.

    PubMed

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
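
    The 'basic' two-stage scheme can be simulated directly: score half the fixed sample, stop if the observed prevalence is decisively above or below the threshold, otherwise score the second half and classify on the full sample. The simulation below is a hedged illustration; the sample size, prevalence threshold and stopping margin are assumptions rather than the published scheme.

        import random

        def two_stage_classify(herd, n_total=60, threshold=0.15, margin=0.05):
            """Classify a herd as failing (True) if estimated lameness prevalence > threshold."""
            animals = random.sample(herd, n_total)     # scores: 1 = lame, 0 = not lame
            first = animals[: n_total // 2]
            p1 = sum(first) / len(first)
            if p1 <= threshold - margin:               # clearly passing: stop after stage one
                return False, len(first)
            if p1 >= threshold + margin:               # clearly failing: stop after stage one
                return True, len(first)
            p = sum(animals) / n_total                 # borderline: score the second half too
            return p > threshold, n_total

        random.seed(0)
        herd = [1] * 20 + [0] * 180                    # hypothetical herd with 10% lameness
        print(two_stage_classify(herd))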

  19. Distributed electrical generation technologies and methods for their economic assessment

    SciTech Connect

    Kreider, J.F.; Curtiss, P.S.

    2000-07-01

    A confluence of events in the electrical generation and transmission industry has produced a new paradigm for distributed electrical generation and distribution in the US. Electrical deregulation, reluctance of traditional utilities to commit capital to large central plants and transmission lines, and a suite of new, efficient generation hardware have all combined to bring this about. Persistent environmental concerns have further stimulated several new approaches. In this paper the authors describe the near-term distributed generation technologies and their differentiating characteristics along with their readiness for the US market. In order to decide which approaches are well suited to a specific project, an assessment methodology is needed. A technically sound approach is therefore described and example results are given.

  20. Methods to assess airborne concentrations of cotton dust.

    PubMed

    Corn, M

    1987-01-01

    Assessment of concentrations of airborne cotton dust in the factory is necessary to determine adherence to applicable Permissible Exposure Limits (PELs) on a day-to-day basis, as well as for investigatory studies of an epidemiological nature. The latter are required on an ongoing basis to determine the adequacy of PELs to prevent disease in the exposed population. A strategy of sampling includes considerations of the numbers of samples to be obtained for statistical validity and the locations of samples. Current practice is to obtain more "personal samples" of exposure wherever possible, but with regard to cotton dust, instrumentation is not available for such sampling. In the U.S., the vertical elutriator is the instrument of choice for determining the concentrations of cotton dust in air. Results are expressed as milligrams of airborne particulate (cotton dust) per cubic meter. PMID:3434562

  1. Development of partial failure analysis method in probability risk assessments

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    This paper presents a new approach to evaluate the partial failure effect on current Probability Risk Assessments (PRAs). An integrated methodology of the thermal-hydraulic analysis and fuzzy logic simulation using the Dynamic Master Logic Diagram (DMLD) was developed. The thermal-hydraulic analysis used in this approach is to identify partial operation effect of any PRA system function in a plant model. The DMLD is used to simulate the system performance of the partial failure effect and inspect all minimal cut sets of system functions. This methodology can be applied in the context of a full scope PRA to reduce core damage frequency. An example of this application of the approach is presented. The partial failure data used in the example is from a survey study of partial failure effects from the Nuclear Plant Reliability Data System (NPRDS).

  2. Methods of assessing structural integrity for space shuttle vehicles

    NASA Technical Reports Server (NTRS)

    Anderson, R. E.; Stuckenberg, F. H.

    1971-01-01

    A detailed description and evaluation of nondestructive evaluation (NDE) methods are given which have application to space shuttle vehicles. Appropriate NDE design data is presented in twelve specifications in an appendix. Recommendations for NDE development work for the space shuttle program are presented.

  3. Assessing Affective Constructs in Reading: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Conradi, Kristin

    2011-01-01

    Research investigating affective dimensions in reading has long been plagued by vaguely defined constructs and, consequently, by an array of potentially problematic instruments designed to measure them. This mixed-methods study investigated the relationship among three popular group-administered instruments intended to tap affective constructs in…

  4. EXPOSURE ASSESSMENT METHODS DEVELOPMENT PILOTS FOR THE NATIONAL CHILDREN'S STUDY

    EPA Science Inventory

    Accurate exposure classification tools are needed to link exposure with health effects. EPA began methods development pilot studies in 2000 to address general questions about exposures and outcome measures. Selected pilot studies are highlighted in this poster. The “Literature Re...

  5. Assessment of Entrepreneurial Territorial Attractiveness by the Ranking Method

    ERIC Educational Resources Information Center

    Gavrilova, Marina A.; Shepelev, Victor M.; Kosyakova, Inessa V.; Belikova, Lyudmila F.; Chistik, Olga F.

    2016-01-01

    The relevance of the researched problem is caused by existence of differentiation in development of separate regional units (urban districts and municipalities) within the region. The aim of this article is to offer a method, which determines the level of differentiation in development of various components of the region, and also in producing a…

  6. Assessing Clinical Significance: Does it Matter which Method we Use?

    ERIC Educational Resources Information Center

    Atkins, David C.; Bedics, Jamie D.; Mcglinchey, Joseph B.; Beauchaine, Theodore P.

    2005-01-01

    Measures of clinical significance are frequently used to evaluate client change during therapy. Several alternatives to the original method devised by N. S. Jacobson, W. C. Follette, & D. Revenstorf (1984) have been proposed, each purporting to increase accuracy. However, researchers have had little systematic guidance in choosing among…
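
    The Jacobson-style approach typically combines a reliable change index (RCI) with a cutoff for return to the functional range. The sketch below implements the commonly cited formulas (SEM = SD*sqrt(1-r), SE_diff = sqrt(2)*SEM, RCI = change / SE_diff); the input numbers are illustrative assumptions, and the later refinements the abstract alludes to are not shown.

        import math

        def reliable_change(pre, post, sd_pre, reliability):
            sem = sd_pre * math.sqrt(1.0 - reliability)
            se_diff = math.sqrt(2.0) * sem
            rci = (post - pre) / se_diff
            return rci, abs(rci) > 1.96        # |RCI| > 1.96 ~ reliable change at the .05 level

        # Hypothetical symptom scores (lower = better), scale SD and test-retest reliability.
        rci, reliable = reliable_change(pre=30.0, post=18.0, sd_pre=7.5, reliability=0.85)
        print(f"RCI = {rci:.2f}, reliable change: {reliable}")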

  7. Assessment of mild steel damage characteristics by physical methods

    NASA Astrophysics Data System (ADS)

    Botvina, L. R.; Soldatenkov, A. P.; Levin, V. P.; Tyutin, M. R.; Demina, Yu. A.; Petersen, T. B.; Dubov, A. A.; Semashko, N. A.

    2016-01-01

    The deformation and fracture localization characteristics are estimated by the methods of replicas, acoustic emission, metal magnetic memory, ultrasonic attenuation, microhardness, and electrical resistance. The relation between the estimated physical parameters on the one hand and the plastic zone size and the microcrack concentration in this zone, on the other, is considered.

  8. Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference

    USGS Publications Warehouse

    Olea, R.A.; Pardo-Iguzquiza, E.

    2011-01-01

    The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among several methods available to generate spatially correlated resamples, we selected a method based on the LU decomposition and used several examples to illustrate the approach. The first one is a synthetic, isotropic, exhaustive sample following a normal distribution, the second example is also a synthetic but following a non-Gaussian random field, and a third empirical sample consists of actual raingauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, distributions for estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for its construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless if the standard error is estimated from a parametric equation or from bootstrap. © 2010 International Association for Mathematical Geosciences.
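
    The LU-decomposition simulation used to build such resamples amounts to multiplying the lower-triangular factor of the covariance matrix by independent standard normal deviates. A minimal sketch follows, using a simple exponential covariance model on a transect; the sill and range parameters are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(42)
        x = np.linspace(0, 10, 50)                           # 1-D sample locations
        h = np.abs(x[:, None] - x[None, :])                  # lag distances
        C = 1.0 * np.exp(-h / 2.0)                           # exponential covariance (sill 1, range ~2)

        L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))   # lower-triangular factor
        resamples = [L @ rng.standard_normal(len(x)) for _ in range(100)]

        # Each resample reproduces the assumed spatial correlation and could be fed to a
        # semivariogram estimator to build bootstrap percentile confidence intervals.
        print(np.std([r.mean() for r in resamples]))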

  9. The Methods Behind 2015 Informatics Capacity and Needs Assessment Study

    PubMed Central

    2016-01-01

    The 2015 Informatics Needs and Capacity of Local Health Departments (LHDs) survey is the most recent comprehensive source of quantitative data on LHD informatics. Conducted by the National Association of County & City Health Officials (NACCHO), this is the third nationally representative quantitative study of LHD informatics since 2009. The previous 2 comprehensive quantitative assessments were conducted by NACCHO in 2009-2010 and 2011. Given that public health informatics is rapidly evolving, the 2015 Informatics survey is a much-needed country-wide assessment of the current informatics needs and capacities of LHDs. This article outlines detailed methodology used in the 2015 Informatics survey, including instrument development, pretesting, sampling design and sample size, survey administration, and sampling weights. A 9-member advisory committee representing federal, state, and local health agency representatives guided the design and implementation of this study. The survey instrument was organized into 6 topic areas: demographics, physical infrastructure, skills and capacity available, public health workforce development needs, electronic health records, and health information exchange. The instrument was pretested with a sample of 20 LHDs and subsequently pilot-tested with 30 LHDs. The survey was administered via the Qualtrics survey software to the sample of 650 LHDs, selected using stratified random sampling. The survey was fielded for approximately 8 weeks and 324 usable responses were received, constituting a response rate of 50%. Statistical weights were developed to account for 3 factors: (a) disproportionate response rate by population size (using 7 population strata), (b) oversampling of LHDs with larger population sizes, and (c) sampling rather than a census approach. PMID:27684627

  10. Stratovolcano stability assessment methods and results from Citlaltepetl, Mexico

    USGS Publications Warehouse

    Zimbelman, D.R.; Watters, R.J.; Firth, I.R.; Breit, G.N.; Carrasco-Nunez, Gerardo

    2004-01-01

    Citlaltépetl volcano is the easternmost stratovolcano in the Trans-Mexican Volcanic Belt. Situated within 110 km of Veracruz, it has experienced two major collapse events and, subsequent to its last collapse, rebuilt a massive, symmetrical summit cone. To enhance hazard mitigation efforts, we assess the stability of Citlaltépetl's summit cone, the area thought most likely to fail during a potential massive collapse event. Through geologic mapping, alteration mineralogy, geotechnical studies, and stability modeling, we provide important constraints on the likelihood, location, and size of a potential collapse event. The volcano's summit cone is young, highly fractured, and hydrothermally altered. Fractures are most abundant within 5–20-m-wide zones defined by multiple parallel to subparallel fractures. Alteration is most pervasive within the fracture systems and includes acid sulfate, advanced argillic, argillic, and silicification ranks. Fractured and altered rocks both have significantly reduced rock strengths, representing likely bounding surfaces for future collapse events. The fracture systems and altered rock masses occur non-uniformly, as an orthogonal set with N–S and E–W trends. Because these surfaces occur non-uniformly, hazards associated with collapse are unevenly distributed about the volcano. Depending on uncertainties in bounding surfaces, but constrained by detailed field studies, potential failure volumes are estimated to range from 0.04 to 0.5 km³. Stability modeling was used to assess potential edifice failure events. Modeled failure of the outer portion of the cone initially occurs as an "intact block" bounded by steeply dipping joints and outwardly dipping flow contacts. As collapse progresses, more of the inner cone fails and the outer "intact" block transforms into a collection of smaller blocks. Eventually, a steep face develops in the uppermost and central portion of the cone. This modeled failure morphology mimics collapse…
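
    The abstract does not specify the stability model used, but as a generic illustration of limit-equilibrium stability modeling, the sketch below computes an infinite-slope factor of safety and contrasts hypothetical altered (weakened) and fresh rock strengths; all parameter values are assumptions for illustration only, not data from this study.

```python
import math

def factor_of_safety(cohesion_kpa, friction_deg, unit_weight_knm3,
                     thickness_m, plane_dip_deg, pore_pressure_kpa=0.0):
    """Infinite-slope factor of safety: resisting shear strength on the
    failure plane divided by the driving shear stress from the rock column."""
    alpha = math.radians(plane_dip_deg)
    phi = math.radians(friction_deg)
    sigma_n = unit_weight_knm3 * thickness_m * math.cos(alpha) ** 2   # normal stress, kPa
    tau = unit_weight_knm3 * thickness_m * math.sin(alpha) * math.cos(alpha)  # shear stress, kPa
    resisting = cohesion_kpa + (sigma_n - pore_pressure_kpa) * math.tan(phi)
    return resisting / tau

# Hypothetical altered (weak) vs. fresh (strong) rock on a 35-degree surface.
print(factor_of_safety(50, 25, 25, 100, 35))    # altered rock: FS < 1, potentially unstable
print(factor_of_safety(500, 40, 25, 100, 35))   # fresh rock:   FS > 1, stable
```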

  11. A Dynamic System To Keep Teacher Education Programs Meaningful.

    ERIC Educational Resources Information Center

    Zelazek, John R.; And Others

    This report represents the work of the Teacher Education Assessment Committee (TEAC) at Central Missouri State University (CMSU), established in April of 1988. The TEAC is a multifaceted system that conducts and publishes results of periodic assessments and an evaluation of CMSU's Teacher Education Programs by soliciting input from: (1) CMSU…

  12. Evaluation of a clinical simulation-based assessment method for EHR-platforms.

    PubMed

    Jensen, Sanne; Rasmussen, Stine Loft; Lyng, Karen Marie

    2014-01-01

    In a procurement process, assessment of issues such as human factors and the interaction between technology and end users can be challenging. In a large public procurement of an electronic health record platform (EHR-platform) in Denmark, a clinical simulation-based method for assessing and comparing human factor issues was developed and evaluated. This paper describes the evaluation of the method and its advantages and disadvantages. Our findings showed that clinical simulation is beneficial for assessing user satisfaction, usefulness, and patient safety, although it is resource-demanding. The method made it possible to assess qualitative topics during the procurement, and it provided an excellent basis for user involvement. PMID:25160323

  14. 75 FR 53298 - A Method to Assess Climate-Relevant Decisions: Application in the Chesapeake Bay

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-31

    ... AGENCY A Method to Assess Climate-Relevant Decisions: Application in the Chesapeake Bay AGENCY... 60-day public comment period for the draft document titled, "A Method to Assess Climate-Relevant... climate change." The EO Strategy also commits EPA to ensuring that "TMDL allocations account for...

  15. A Method for the Systematic Observation of Examiner Behavior during Psychoeducational Assessments.

    ERIC Educational Resources Information Center

    Strein, William

    1984-01-01

    Describes a specific empirical method for the systematic observation of examiner behavior during psychoeducational assessments, the Systematic Observation Scale for Assessments (SOS-A). Discusses development of the instrument and presents data on intra- and interobserver agreement. The instrument is regarded as an initially adequate method for…

  16. 40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... site selection method. Method 1 or 1A of 40 CFR part 60, appendix A, as appropriate, shall be used for... values, and engineering assessment control applicability assessment requirements are to be...

  17. 40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... site selection method. Method 1 or 1A of 40 CFR part 60, appendix A, as appropriate, shall be used for... values, and engineering assessment control applicability assessment requirements are to be...

  18. 40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... site selection method. Method 1 or 1A of 40 CFR part 60, appendix A, as appropriate, shall be used for... values, and engineering assessment control applicability assessment requirements are to be...

  19. 40 CFR 63.1412 - Continuous process vent applicability assessment procedures and methods.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... engineering principles, measurable process parameters, or physical or chemical laws or properties. Examples of... site selection method. Method 1 or 1A of 40 CFR part 60, appendix A, as appropriate, shall be used for... values, and engineering assessment control applicability assessment requirements are to be...

  20. Deciding which chemical mixtures risk assessment methods work best for what mixtures

    SciTech Connect

    Teuschler, Linda K.

    2007-09-01

    The most commonly used chemical mixtures risk assessment methods involve simple notions of additivity and toxicological similarity. Newer methods are emerging in response to the complexities of chemical mixture exposures and effects. Factors based on both science and policy drive decisions regarding whether to conduct a chemical mixtures risk assessment and, if so, which methods to employ. Scientific considerations are based on positive evidence of joint toxic action, elevated human exposure conditions or the potential for significant impacts on human health. Policy issues include legislative drivers that may mandate action even though adequate toxicity data on a specific mixture may not be available and risk assessment goals that impact the choice of risk assessment method to obtain the amount of health protection desired. This paper discusses three important concepts used to choose among available approaches for conducting a chemical mixtures risk assessment: (1) additive joint toxic action of mixture components; (2) toxicological interactions of mixture components; and (3) chemical composition of complex mixtures. It is proposed that scientific support for basic assumptions used in chemical mixtures risk assessment should be developed by expert panels, risk assessment methods experts, and laboratory toxicologists. This is imperative to further develop and refine quantitative methods and provide guidance on their appropriate applications. Risk assessors need scientific support for chemical mixtures risk assessment methods in the form of toxicological data on joint toxic action for high priority mixtures, statistical methods for analyzing dose-response for mixtures, and toxicological and statistical criteria for determining sufficient similarity of complex mixtures.
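
    As a concrete example of the component-additivity concept discussed above, the sketch below computes a dose-additive Hazard Index, one widely used implementation of additive joint toxic action; the chemical names, exposures, and reference doses are illustrative assumptions, not data from this paper.

```python
# Dose-additive Hazard Index: HI = sum(exposure_i / reference_dose_i).
# An HI greater than 1 flags potential concern for the mixture as a whole.
mixture = {
    # chemical: (estimated exposure, reference dose), both in mg/kg-day (hypothetical values)
    "chemical_A": (0.002, 0.01),
    "chemical_B": (0.010, 0.05),
    "chemical_C": (0.001, 0.003),
}

hazard_quotients = {name: dose / rfd for name, (dose, rfd) in mixture.items()}
hazard_index = sum(hazard_quotients.values())

print(hazard_quotients)                        # per-component hazard quotients
print(f"Hazard Index = {hazard_index:.2f}")    # 0.2 + 0.2 + 0.33 ≈ 0.73 here
```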