Sample records for work sample methodology

  1. Methodological integrative review of the work sampling technique used in nursing workload research.

    PubMed

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, the work sampling methods used are diverse, making comparison of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in English between 2002 and 2012 reporting on research that used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may affect the sharing and comparison of results. Specific areas needing a shared understanding included the training of observers and of subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods, and the reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. The authors' suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.
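The random-moment observation at the heart of the work sampling technique reviewed above can be sketched as follows. This is a minimal illustration, not the study's method: the activity names, shift length, and observation count are invented assumptions.

```python
import random
from collections import Counter

def work_sample(schedule, n_obs, shift_minutes, rng):
    """Estimate the share of a shift spent on each activity by
    'observing' the worker at n_obs random moments.

    schedule: list of (activity, start_minute, end_minute) tuples
    that together cover [0, shift_minutes).
    """
    counts = Counter()
    for _ in range(n_obs):
        t = rng.random() * shift_minutes  # random moment in [0, shift)
        for activity, start, end in schedule:
            if start <= t < end:
                counts[activity] += 1
                break
    # Observed proportion of moments per activity estimates time share.
    return {activity: counts[activity] / n_obs for activity, _, _ in schedule}

# Hypothetical 8-hour (480-minute) nursing shift.
schedule = [
    ("direct care", 0, 240),      # truly 50% of the shift
    ("documentation", 240, 360),  # truly 25%
    ("other", 360, 480),          # truly 25%
]

estimates = work_sample(schedule, n_obs=2000, shift_minutes=480,
                        rng=random.Random(42))
```

Choices such as the number of observation moments and the activity taxonomy are exactly the parameters the review finds unstandardized across studies.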

  2. Teaching Research Methodology through Active Learning

    ERIC Educational Resources Information Center

    Lundahl, Brad W.

    2008-01-01

    To complement traditional learning activities in a master's-level research methodology course, social work students worked on a formal research project that involved designing the study, constructing measures, selecting a sampling strategy, collecting data, reducing and analyzing data, and finally interpreting and communicating the results. The…

  3. Exploring School Counselors' Motivations to Work with LGBTQQI Students in Schools: A Q Methodology Study

    ERIC Educational Resources Information Center

    Goodrich, Kristopher M.

    2017-01-01

    This study surveyed a national sample of school counselors who were members of ASCA to understand what motivated their work, or not, with lesbian, gay, bisexual, transgender, queer, questioning, and intersex (LGBTQQI) students in school. The author implemented Q methodology to collect and analyze the data, and results provide scholars and…

  4. Design strategies from sexual exploitation and sex work studies among women and girls: Methodological considerations in a hidden and vulnerable population.

    PubMed

    Gerassi, Lara; Edmond, Tonya; Nichols, Andrea

    2017-06-01

    The study of sex trafficking, prostitution, sex work, and sexual exploitation is associated with many methodological issues and challenges. Researchers' study designs must consider the many safety issues related to this vulnerable and hidden population. Community advisory boards and key stakeholder involvement are essential to study design to increase safety of participants, usefulness of study aims, and meaningfulness of conclusions. Nonrandomized sampling strategies are most often utilized when studying exploited women and girls, which have the capacity to provide rich data and require complex sampling and recruitment methods. This article reviews the current methodological issues when studying this marginalized population as well as strategies to address challenges while working with the community in order to bring about social change. The authors also discuss their own experiences in collaborating with community organizations to conduct research in this field.

  5. Design strategies from sexual exploitation and sex work studies among women and girls: Methodological considerations in a hidden and vulnerable population

    PubMed Central

    Gerassi, Lara; Edmond, Tonya; Nichols, Andrea

    2016-01-01

    The study of sex trafficking, prostitution, sex work, and sexual exploitation is associated with many methodological issues and challenges. Researchers’ study designs must consider the many safety issues related to this vulnerable and hidden population. Community advisory boards and key stakeholder involvement are essential to study design to increase safety of participants, usefulness of study aims, and meaningfulness of conclusions. Nonrandomized sampling strategies are most often utilized when studying exploited women and girls, which have the capacity to provide rich data and require complex sampling and recruitment methods. This article reviews the current methodological issues when studying this marginalized population as well as strategies to address challenges while working with the community in order to bring about social change. The authors also discuss their own experiences in collaborating with community organizations to conduct research in this field. PMID:28824337

  6. Documentation of indigenous Pacific agroforestry systems: a review of methodologies

    Treesearch

    Bill Raynor

    1993-01-01

    Recent interest in indigenous agroforestry has led to a need for documentation of these systems. However, previous work is very limited, and few methodologies are well-known or widely accepted. This paper outlines various methodologies (including sampling methods, data to be collected, and considerations in analysis) for documenting structure and productivity of...

  7. Development of a Methodology for Assessing Aircrew Workloads.

    DTIC Science & Technology

    1981-11-01

    Descriptors: analysis; simulation; standard time systems; switching; synthetic time systems; task activities; task interference; time study; tracking; workload; work sampling. …standard data systems, information content analysis, work sampling and job evaluation. Conventional methods were found to be deficient in accounting…

  8. Returning to Work after Cancer: Quantitative Studies and Prototypical Narratives

    PubMed Central

    Steiner, John F.; Nowels, Carolyn T.; Main, Deborah S.

    2009-01-01

    Objective A combination of quantitative data and illustrative narratives may allow cancer survivorship researchers to disseminate their research findings more broadly. We identified recent, methodologically rigorous quantitative studies on return to work after cancer, summarized the themes from these studies, and illustrated those themes with narratives of individual cancer survivors. Methods We reviewed English-language studies of return to work for adult cancer survivors through June, 2008, and identified 13 general themes from papers that met methodological criteria (population-based sampling, prospective and longitudinal assessment, detailed assessment of work, evaluation of economic impact, assessment of moderators of work return, and large sample size). We drew survivorship narratives from a prior qualitative research study to illustrate these themes. Results Nine quantitative studies met 4 or more of our 6 methodological criteria. These studies suggested that most cancer survivors could return to work without residual disabilities. Cancer site, clinical prognosis, treatment modalities, socioeconomic status, and attributes of the job itself influenced the likelihood of work return. Three narratives - a typical survivor who returned to work after treatment, an individual unable to return to work, and an inspiring survivor who returned to work despite substantial barriers - illustrated many of the themes from the quantitative literature while providing additional contextual details. Conclusion Illustrative narratives can complement the findings of cancer survivorship research if researchers are rigorous and transparent in the selection, analysis, and retelling of those stories. PMID:19507264

  9. Connecting Teaching and Learning: History, Evolution, and Case Studies of Teacher Work Sample Methodology

    ERIC Educational Resources Information Center

    Rosselli, Hilda, Ed.; Girod, Mark, Ed.; Brodsky, Meredith, Ed.

    2011-01-01

    As accountability in education has become an increasingly prominent topic, teacher preparation programs are being asked to provide credible evidence that their teacher candidates can impact student learning. Teacher Work Samples, first developed 30 years ago, have emerged as an effective method of quantifying the complex set of tasks that comprise…

  10. How to do a grounded theory study: a worked example of a study of dental practices

    PubMed Central

    2011-01-01

    Background Qualitative methodologies are increasingly popular in medical research. Grounded theory is the methodology most-often cited by authors of qualitative studies in medicine, but it has been suggested that many 'grounded theory' studies are not concordant with the methodology. In this paper we provide a worked example of a grounded theory project. Our aim is to provide a model for practice, to connect medical researchers with a useful methodology, and to increase the quality of 'grounded theory' research published in the medical literature. Methods We documented a worked example of using grounded theory methodology in practice. Results We describe our sampling, data collection, data analysis and interpretation. We explain how these steps were consistent with grounded theory methodology, and show how they related to one another. Grounded theory methodology assisted us to develop a detailed model of the process of adapting preventive protocols into dental practice, and to analyse variation in this process in different dental practices. Conclusions By employing grounded theory methodology rigorously, medical researchers can better design and justify their methods, and produce high-quality findings that will be more useful to patients, professionals and the research community. PMID:21902844

  11. How to do a grounded theory study: a worked example of a study of dental practices.

    PubMed

    Sbaraini, Alexandra; Carter, Stacy M; Evans, R Wendell; Blinkhorn, Anthony

    2011-09-09

    Qualitative methodologies are increasingly popular in medical research. Grounded theory is the methodology most-often cited by authors of qualitative studies in medicine, but it has been suggested that many 'grounded theory' studies are not concordant with the methodology. In this paper we provide a worked example of a grounded theory project. Our aim is to provide a model for practice, to connect medical researchers with a useful methodology, and to increase the quality of 'grounded theory' research published in the medical literature. We documented a worked example of using grounded theory methodology in practice. We describe our sampling, data collection, data analysis and interpretation. We explain how these steps were consistent with grounded theory methodology, and show how they related to one another. Grounded theory methodology assisted us to develop a detailed model of the process of adapting preventive protocols into dental practice, and to analyse variation in this process in different dental practices. By employing grounded theory methodology rigorously, medical researchers can better design and justify their methods, and produce high-quality findings that will be more useful to patients, professionals and the research community.

  12. A Chinese Longitudinal Study on Work/Family Enrichment

    ERIC Educational Resources Information Center

    Lu, Luo

    2011-01-01

    Purpose: The purpose of this paper is to explore reciprocal relationships between work/family resources, work/family enrichment (WFE), and work/family satisfaction in a Chinese society. Design/methodology/approach: A longitudinal design was adopted using a three-wave panel sample. Data were obtained from 310 Taiwanese employees on three occasions,…

  13. Faster the better: a reliable technique to sample anopluran lice in large hosts.

    PubMed

    Leonardi, María Soledad

    2014-06-01

    Among Anoplura, the family Echinophthiriidae includes those species that mainly infest pinnipeds. Working with large hosts implies methodological considerations such as the time spent sampling and the way in which the animal is restrained. Previous works on echinophthiriids combined a diverse array of analyses, including field counts of lice and in vitro observations. To collect lice, the authors used forceps, and each louse was collected individually. This implied a long manipulation time, i.e., ≈60 min, and the need to physically and/or chemically immobilize the animal. The present work describes and discusses for the first time a sampling technique that minimizes manipulation time and also avoids the use of anesthesia. This methodology involves combing the host's pelage with a fine-tooth plastic comb, as used in the treatment of human pediculosis, and keeping the comb with the retained lice in a Ziploc® bag with ethanol. This technique has been used successfully in studies of population dynamics, habitat selection, and transmission patterns, proving to be a reliable methodology. Lice are collected entire and in good condition to be mounted for study under light or scanning electron microscopy. Moreover, the plastic comb protects taxonomically important structures such as spines from damage, so the technique is also recommended for taxonomic or morphological work.

  14. Newcomer adjustment during organizational socialization: a meta-analytic review of antecedents, outcomes, and methods.

    PubMed

    Bauer, Talya N; Bodner, Todd; Erdogan, Berrin; Truxillo, Donald M; Tucker, Jennifer S

    2007-05-01

    The authors tested a model of antecedents and outcomes of newcomer adjustment using 70 unique samples of newcomers with meta-analytic and path modeling techniques. Specifically, they proposed and tested a model in which adjustment (role clarity, self-efficacy, and social acceptance) mediated the effects of organizational socialization tactics and information seeking on socialization outcomes (job satisfaction, organizational commitment, job performance, intentions to remain, and turnover). The results generally supported this model. In addition, the authors examined the moderating effects of methodology on these relationships by coding for 3 methodological issues: data collection type (longitudinal vs. cross-sectional), sample characteristics (school-to-work vs. work-to-work transitions), and measurement of the antecedents (facet vs. composite measurement). Discussion focuses on the implications of the findings and suggestions for future research. 2007 APA, all rights reserved
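The aggregation step in a meta-analysis like the one above can be sketched as a sample-size-weighted mean correlation across independent samples. This is a bare-bones illustration only; the correlations and sample sizes below are invented, not taken from the study.

```python
def weighted_mean_correlation(studies):
    """Sample-size-weighted mean correlation across studies.

    studies: list of (n, r) pairs, one per independent sample.
    Returns (mean_r, total_n).
    """
    total_n = sum(n for n, _ in studies)
    mean_r = sum(n * r for n, r in studies) / total_n
    return mean_r, total_n

# Invented example: correlation between role clarity and job
# satisfaction in three hypothetical newcomer samples.
studies = [(120, 0.35), (80, 0.28), (200, 0.41)]
mean_r, total_n = weighted_mean_correlation(studies)
# mean_r = (120*0.35 + 80*0.28 + 200*0.41) / 400
```

Weighting by sample size gives larger samples proportionally more influence on the pooled estimate, the basic logic behind pooling 70 newcomer samples before fitting a path model.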

  15. From Study to Work: Methodological Challenges of a Graduate Destination Survey in the Western Cape, South Africa

    ERIC Educational Resources Information Center

    du Toit, Jacques; Kraak, Andre; Favish, Judy; Fletcher, Lizelle

    2014-01-01

    Current literature proposes several strategies for improving response rates to student evaluation surveys. Graduate destination surveys pose the difficulty of tracing graduates years later, when their contact details may have changed. This article discusses the methodology of one such survey to maximise response rates. Compiling a sample frame…

  16. In situ study of live specimens in an environmental scanning electron microscope.

    PubMed

    Tihlaříková, Eva; Neděla, Vilém; Shiojiri, Makoto

    2013-08-01

    In this paper we introduce a new methodology for the observation of living biological samples in an environmental scanning electron microscope (ESEM). The methodology is based on an unconventional initiation procedure for ESEM chamber pumping, free from purge-flood cycles, and on the ability to control thermodynamic processes close to the sample. The gradual and gentle change of the working environment from air to water vapor enables the study not only of living samples in dynamic in situ experiments and their manifestations of life (sample walking) but also of their experimentally stimulated physiological reactions. Moreover, Monte Carlo simulations of primary electron beam energy losses in a water layer on the sample surface were performed; consequently, the influence of the water thickness on radiation, temperature, or chemical damage to the sample was considered.

  17. Australasian emergency physicians: a learning and educational needs analysis. Part one: background and methodology. Profile of FACEM.

    PubMed

    Dent, Andrew W; Asadpour, Ali; Weiland, Tracey J; Paltridge, Debbie

    2008-02-01

    Fellows of the Australasian College for Emergency Medicine (FACEM) have opportunities to participate in a range of continuing professional development activities. To inform FACEM and assist those involved in planning continuing professional development interventions for FACEM, we undertook a learning needs analysis of emergency physicians. Exploratory study using survey methodology. Following questionnaire development by iterative feedback with emergency physicians and researchers, a mailed survey was distributed to all FACEM. The survey comprised eight items on work and demographic characteristics of FACEM, and 194 items on attitudes to existing learning opportunities, barriers to learning, and perceived learning needs and preferences. Fifty-eight percent (503/854) of all FACEM surveyed responded to the questionnaire, almost half of whom attained their FACEM after the year 2000. The sample comprised mostly males (72.8%), with a mean age of 41.6 years, similar to the ACEM database. Most respondents reported working in ACEM-accredited hospitals (89%) and major referral hospitals (54%), and practiced on both children and adults (78%). FACEM reported working on average 26.7 clinical hours per week, with those at private hospitals working a greater proportion of clinical hours than those at other hospital types. As the first of six related reports, this paper documents the methodology used, including questionnaire development, and provides the demographics of responding FACEM, including the clinical and non-clinical hours worked and the type of hospital of principal employment.

  18. Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.

    PubMed

    Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun

    2018-01-01

    Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma, and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) was used to identify and solve problems. The laboratory turnaround time for individual tests, the total delay time in the sample reception area, and the percentage of steps involving risks of medical errors and biological hazards in the overall process were measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples, from 68 to 59 min, after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.

  19. National working conditions surveys in Latin America: comparison of methodological characteristics

    PubMed Central

    Merino-Salazar, Pamela; Artazcoz, Lucía; Campos-Serna, Javier; Gimeno, David; Benavides, Fernando G.

    2015-01-01

    Background: High-quality and comparable data to monitor working conditions and health in Latin America are not currently available. In 2007, multiple Latin American countries started implementing national working conditions surveys. However, little is known about their methodological characteristics. Objective: To identify commonalities and differences in the methodologies of working conditions surveys (WCSs) conducted in Latin America through 2013. Methods: The study critically examined WCSs in Latin America between 2007 and 2013. Sampling design, data collection, and questionnaire content were compared. Results: Two types of surveys were identified: (1) surveys covering the entire working population and administered at the respondent's home and (2) surveys administered at the workplace. There was considerable overlap in the topics covered by the dimensions of employment and working conditions measured, but less overlap in terms of health outcomes, prevention resources, and activities. Conclusions: Although WCSs from Latin America are similar, there was heterogeneity across surveyed populations and location of the interview. Reducing differences in surveys between countries will increase comparability and allow for a more comprehensive understanding of occupational health in the region. PMID:26079314

  20. National working conditions surveys in Latin America: comparison of methodological characteristics.

    PubMed

    Merino-Salazar, Pamela; Artazcoz, Lucía; Campos-Serna, Javier; Gimeno, David; Benavides, Fernando G

    2015-01-01

    High-quality and comparable data to monitor working conditions and health in Latin America are not currently available. In 2007, multiple Latin American countries started implementing national working conditions surveys. However, little is known about their methodological characteristics. To identify commonalities and differences in the methodologies of working conditions surveys (WCSs) conducted in Latin America through 2013. The study critically examined WCSs in Latin America between 2007 and 2013. Sampling design, data collection, and questionnaire content were compared. Two types of surveys were identified: (1) surveys covering the entire working population and administered at the respondent's home and (2) surveys administered at the workplace. There was considerable overlap in the topics covered by the dimensions of employment and working conditions measured, but less overlap in terms of health outcomes, prevention resources, and activities. Although WCSs from Latin America are similar, there was heterogeneity across surveyed populations and location of the interview. Reducing differences in surveys between countries will increase comparability and allow for a more comprehensive understanding of occupational health in the region.

  1. Qualitative carbonyl profile in coffee beans through GDME-HPLC-DAD-MS/MS for coffee preliminary characterization.

    PubMed

    Cordeiro, Liliana; Valente, Inês M; Santos, João Rodrigo; Rodrigues, José A

    2018-05-01

    In this work, an analytical methodology for the characterization of volatile carbonyl compounds in green and roasted coffee beans was developed. The methodology relied on a recent and simple sample preparation technique, gas diffusion microextraction, for extraction of the samples' volatiles, followed by HPLC-DAD-MS/MS analysis. The experimental conditions in terms of extraction temperature and extraction time were studied. A profile of carbonyl compounds was obtained for both arabica and robusta coffee species (green and roasted samples). Twenty-seven carbonyl compounds were identified and further discussed, in light of the reported literature, in relation to different coffee characteristics: coffee ageing, organoleptic impact, presence of defective beans, authenticity, human health implications, post-harvest coffee processing, and roasting. The applied methodology proved to be a powerful analytical tool for coffee characterization, as it measures marker compounds of different coffee characteristics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. The Value of LIS Schools' Research Topics to Library Authors' Professional Work

    ERIC Educational Resources Information Center

    Perkins, Gay Helen; Helbig, Tuesdi L.

    2008-01-01

    Stoan's distinction between library skills and research skills based on different philosophies of information seeking suggests the value of training in research methodology for the librarian. Such training could lead to more effective patron consultations, committee/administrative work, and personal research. Thus, a convenience sample of…

  3. Methodological Issues in the Study of Teachers' Careers: Critical Features of a Truly Longitudinal Study. Working Paper Series.

    ERIC Educational Resources Information Center

    Singer, Judith D.; Willett, John B.

    The National Center for Education Statistics (NCES) is exploring the possibility of conducting a large-scale multi-year study of teachers' careers. The proposed new study is intended to follow a national probability sample of teachers over an extended period of time. A number of methodological issues need to be addressed before the study can be…

  4. Factors Associated with Job Content Plateauing among Older Workers

    ERIC Educational Resources Information Center

    Armstrong-Stassen, Marjorie

    2008-01-01

    Purpose: The purpose of this paper is to identify personal and work environment factors associated with the experience of job content plateauing among older workers. Design/methodology/approach: Two cross-sectional studies, each including two samples, were conducted. In each study, one sample consisted of a diverse group of older workers and the…

  5. A "Career" Work Ethic versus Just a Job

    ERIC Educational Resources Information Center

    Porter, Gayle

    2005-01-01

    Purpose: To provide current information on managers' expectations of their employees, toward structuring future research on amount of time and energy devoted to work. Design/methodology/approach: Qualitative data, acquired through focus groups and interviews, provide a sample of the perceptions of 57 managers in the mid-Atlantic region of the USA…

  6. Examining Foundations of Qualitative Research: A Review of Social Work Dissertations, 2008-2010

    ERIC Educational Resources Information Center

    Gringeri, Christina; Barusch, Amanda; Cambron, Christopher

    2013-01-01

    This study examined the treatment of epistemology and methodological rigor in qualitative social work dissertations. Template-based review was conducted on a random sample of 75 dissertations completed between 2008 and 2010. For each dissertation, we noted the presence or absence of four markers of epistemology: theory, paradigm, reflexivity, and…

  7. Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.

    PubMed

    Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María

    2017-01-01

    This study aims at providing recommendations concerning the validation of analytical protocols using routine samples. It is intended as a case study of how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work performed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols through the analysis of more than 30 samples of water and sediments collected over nine months. The present work also assesses the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water samples was estimated through a conventional approach; for the sediment matrices, however, an estimate of proportional/constant bias is also included owing to their inhomogeneity. Results for the sediment matrix are reliable, showing a range of 25-35% of analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Analysis of routine samples is rarely applied to assess the trueness of novel analytical methods, and up to now this methodology had not been focused on organochlorine compounds in environmental matrices.

  8. The Relationship between Gender and Aspirations to Senior Management

    ERIC Educational Resources Information Center

    Litzky, Barrie; Greenhaus, Jeffrey

    2007-01-01

    Purpose: The purpose of this paper is to examine the relationship of gender, work factors, and non-work factors with aspirations to positions in senior management. A process model of senior management aspirations was developed and tested. Design/methodology/approach: Data were collected via an online survey that resulted in a sample of 368 working…

  9. Research MethodologyOverview of Qualitative Research

    PubMed Central

    GROSSOEHME, DANIEL H.

    2015-01-01

    Qualitative research methods are a robust tool for chaplaincy research questions. Similar to much of chaplaincy clinical care, qualitative research generally works with written texts, often transcriptions of individual interviews or focus group conversations, and seeks to understand the meaning of experience in a study sample. This article describes three common methodologies: ethnography, grounded theory, and phenomenology. Issues to consider relating to the study sample, design, and analysis are discussed. Ways of enhancing the validity of the data, as well as reliability and ethical issues in qualitative research, are described. Qualitative research is an accessible way for chaplains to contribute new knowledge about the sacred dimension of people's lived experience. PMID:24926897

  10. A History of the Development of the Navy Medical Department’s Workload Management System for Nursing.

    DTIC Science & Technology

    1987-08-01

    …efficient and rational methodology. Young, Giovannetti, Lewison, and Thomas (1981) offer the following comprehensive definition of a staffing methodology… Johns Hopkins Hospital. Connor used industrial engineering techniques, such as work sampling, time and motion studies, and continuous observation, to time… (ADA 109883-ADA 109886). Young, J.P., Giovannetti, P., Lewison, D., & Thomas, M.L. Factors Affecting Nurse Staffing in Acute Care Hospitals: A Review

  11. Systematic review and consensus guidelines for environmental sampling of Burkholderia pseudomallei.

    PubMed

    Limmathurotsakul, Direk; Dance, David A B; Wuthiekanun, Vanaporn; Kaestli, Mirjam; Mayo, Mark; Warner, Jeffrey; Wagner, David M; Tuanyok, Apichai; Wertheim, Heiman; Yoke Cheng, Tan; Mukhopadhyay, Chiranjay; Puthucheary, Savithiri; Day, Nicholas P J; Steinmetz, Ivo; Currie, Bart J; Peacock, Sharon J

    2013-01-01

    Burkholderia pseudomallei, a Tier 1 Select Agent and the cause of melioidosis, is a Gram-negative bacillus present in the environment in many tropical countries. Defining the global pattern of B. pseudomallei distribution underpins efforts to prevent infection, and is dependent upon robust environmental sampling methodology. Our objective was to review the literature on the detection of environmental B. pseudomallei, update the risk map for melioidosis, and propose international consensus guidelines for soil sampling. An international working party (Detection of Environmental Burkholderia pseudomallei Working Party (DEBWorP)) was formed during the VIth World Melioidosis Congress in 2010. PubMed (January 1912 to December 2011) was searched using the following MeSH terms: pseudomallei or melioidosis. Bibliographies were hand-searched for secondary references. The reported geographical distribution of B. pseudomallei in the environment was mapped and categorized as definite, probable, or possible. The methodology used for detecting environmental B. pseudomallei was extracted and collated. We found that global coverage was patchy, with a lack of studies in many areas where melioidosis is suspected to occur. The sampling strategies and bacterial identification methods used were highly variable, and not all were robust. We developed consensus guidelines with the goals of reducing the probability of false-negative results, and the provision of affordable and 'low-tech' methodology that is applicable in both developed and developing countries. The proposed consensus guidelines provide the basis for the development of an accurate and comprehensive global map of environmental B. pseudomallei.

  12. Do Italian Companies Manage Work-Related Stress Effectively? A Process Evaluation in Implementing the INAIL Methodology

    PubMed Central

    Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta

    2015-01-01

    Studies on Intervention Process Evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may play a decisive role in its implementation by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation of interventions to assess and manage risks related to work-related stress using a methodological path offered by INAIL. The final sample comprised 124 companies that participated in an interview on each phase of the INAIL methodological path put in place to implement the intervention. The INAIL methodology was reported to be useful in the process of assessing and managing risks related to work-related stress. Some process factors (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) were associated with significant differences in risk levels, particularly in relation to findings from the preliminary assessment. The main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress. PMID:26504788

  13. Do Italian Companies Manage Work-Related Stress Effectively? A Process Evaluation in Implementing the INAIL Methodology.

    PubMed

    Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta; Iavicoli, Sergio

    2015-01-01

    Studies on Intervention Process Evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may play a decisive role in its implementation by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation of interventions to assess and manage risks related to work-related stress using a methodological path offered by INAIL. The final sample comprised 124 companies that participated in an interview on each phase of the INAIL methodological path put in place to implement the intervention. The INAIL methodology was reported to be useful in the process of assessing and managing risks related to work-related stress. Some process factors (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) were associated with significant differences in risk levels, particularly in relation to findings from the preliminary assessment. The main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress.

  14. Contentious issues in research on trafficked women working in the sex industry: study design, ethics, and methodology.

    PubMed

    Cwikel, Julie; Hoban, Elizabeth

    2005-11-01

    The trafficking of women and children for work in the globalized sex industry is a global social problem. Quality data is needed to provide a basis for legislation, policy, and programs, but first, numerous research design, ethical, and methodological problems must be addressed. Research design issues in studying women trafficked for sex work (WTSW) include how to (a) develop coalitions to fund and support research, (b) maintain a critical stance on prostitution, and therefore WTSW (c) use multiple paradigms and methods to accurately reflect WTSW's reality, (d) present the purpose of the study, and (e) protect respondents' identities. Ethical issues include (a) complications with informed consent procedures, (b) problematic access to WTSW (c) loss of WTSW to follow-up, (d) inability to intervene in illegal acts or human rights violations, and (e) the need to maintain trustworthiness as researchers. Methodological issues include (a) constructing representative samples, (b) managing media interest, and (c) handling incriminating materials about law enforcement and immigration.

  15. Anthropogenic microfibres pollution in marine biota. A new and simple methodology to minimize airborne contamination.

    PubMed

    Torre, Michele; Digka, Nikoletta; Anastasopoulou, Aikaterini; Tsangaris, Catherine; Mytilineou, Chryssi

    2016-12-15

    Research studies on the effects of microlitter on marine biota have become increasingly frequent over the last few years. However, there is strong evidence that scientific results based on microlitter analyses can be biased by contamination from airborne fibres. This study demonstrates a low-cost, easy-to-apply methodology to minimize background contamination and thus increase the validity of results. Contamination during the gastrointestinal content analysis of 400 fishes was tested for several sample processing steps at high risk of airborne contamination (e.g. dissection, stereomicroscopic analysis, and chemical digestion treatment for microlitter extraction). Using our methodology, based on hermetic enclosure devices that isolate the working areas during the various processing steps, airborne contamination was reduced by 95.3%. The simplicity and low cost of this methodology mean that it could be applied not only in the laboratory but also in field or on-board work. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Influence of extraction methodology on grape composition values

    USDA-ARS?s Scientific Manuscript database

    This work demonstrated similarities and differences in quantifying many grape quality components (> 45 compounds) that were extracted from berries by three distinct preparations, and analyzed by eight spectrophotometric and HPLC methods. All sample extraction methods were appropriate for qualitative...

  17. An ultrasonic methodology for muscle cross section measurement to support space flight

    NASA Astrophysics Data System (ADS)

    Hatfield, Thomas R.; Klaus, David M.; Simske, Steven J.

    2004-09-01

    The number one priority for any manned space mission is the health and safety of its crew. The study of the short and long term physiological effects on humans is paramount to ensuring crew health and mission success. One of the challenges associated with studying the physiological effects of space flight on humans, such as loss of bone and muscle mass, has been that of readily attaining the data needed to characterize the changes. The small sample size of astronauts, together with the fact that most physiological data collection tends to be rather tedious, continues to hinder elucidation of the underlying mechanisms responsible for the observed changes that occur in space. Better characterization of the muscle loss experienced by astronauts requires that new technologies be implemented. To this end, we have begun to validate a 360° ultrasonic scanning methodology for muscle measurements and have performed empirical sampling of a limb surrogate for comparison. Ultrasonic wave propagation was simulated using 144 stations of rotated arm and calf MRI images. These simulations were intended to provide a preliminary check of the scanning methodology and data analysis before its implementation with hardware. Pulse-echo waveforms were processed for each rotation station to characterize fat, muscle, bone, and limb boundary interfaces. The percentage error between MRI reference values and calculated muscle areas, as determined from reflection points for calf and arm cross sections, was -2.179% and +2.129%, respectively. These successful simulations suggest that ultrasound pulse scanning can be used to effectively determine limb cross-sectional areas. Cross-sectional images of a limb surrogate were then used to simulate signal measurements at several rotation angles, with ultrasonic pulse-echo sampling performed experimentally at the same stations on the actual limb surrogate to corroborate the results. The objective of the surrogate sampling was to compare the signal output of the simulation tool, used as a methodology validation, against actual tissue signals. The disturbance patterns of the simulated and sampled waveforms were consistent. Although only discussed as a small part of the work presented, the sampling portion also helped identify important considerations such as tissue compression and transducer positioning for future work involving tissue scanning with this methodology.
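
    The area computation behind a rotational boundary scan can be illustrated with a small polar-planimetry sketch. This is a hypothetical simplification, not the authors' processing pipeline: it assumes that one boundary radius per rotation station has already been extracted from the pulse-echo reflections, and approximates the cross-sectional area as A = ½ Σ r²·Δθ.

```python
import math

def cross_section_area(radii):
    """Approximate a limb cross-sectional area from boundary radii sampled
    at equally spaced rotation angles (discrete polar integral:
    A = 1/2 * sum(r_i^2) * dtheta)."""
    n = len(radii)
    dtheta = 2 * math.pi / n
    return 0.5 * sum(r * r for r in radii) * dtheta

# Sanity check on a circular "limb" of radius 10 mm sampled at 144 stations,
# matching the 144 rotation stations mentioned in the abstract:
area = cross_section_area([10.0] * 144)
```

For a perfect circle the discrete sum reproduces the analytic area pi*r², which is a convenient first validation before applying the formula to irregular fat/muscle/bone boundaries.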

  18. Single-particle mineralogy of Chinese soil particles by the combined use of low-Z particle electron probe X-ray microanalysis and attenuated total reflectance-FT-IR imaging techniques.

    PubMed

    Malek, Md Abdul; Kim, Bowha; Jung, Hae-Jin; Song, Young-Chul; Ro, Chul-Un

    2011-10-15

    Our previous work on the speciation of individual mineral particles of micrometer size by the combined use of attenuated total reflectance FT-IR (ATR-FT-IR) imaging and a quantitative energy-dispersive electron probe X-ray microanalysis technique (EPMA), low-Z particle EPMA, demonstrated that the combined use of these two techniques is a powerful approach for looking at the single-particle mineralogy of externally heterogeneous minerals. In this work, this analytical methodology was applied to characterize six soil samples collected at arid areas in China, in order to identify mineral types present in the samples. The six soil samples were collected from two types of soil, i.e., loess and desert soils, for which overall 665 particles were analyzed on a single particle basis. The six soil samples have different mineralogical characteristics, which were clearly differentiated in this work. As this analytical methodology provides complementary information, the ATR-FT-IR imaging on mineral types, and low-Z particle EPMA on the morphology and elemental concentrations, on the same individual particles, more detailed information can be obtained using this approach than when either low-Z particle EPMA or ATR-FT-IR imaging techniques are used alone, which has a great potential for the characterization of Asian dust and mineral dust particles. © 2011 American Chemical Society

  19. Advancements in mass spectrometry for biological samples: Protein chemical cross-linking and metabolite analysis of plant tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, Adam

    2015-01-01

    This thesis presents work on advancements and applications of methodology for the analysis of biological samples using mass spectrometry. Included in this work are improvements to chemical cross-linking mass spectrometry (CXMS) for the study of protein structures and mass spectrometry imaging and quantitative analysis to study plant metabolites. Applications include using matrix-assisted laser desorption/ionization-mass spectrometry imaging (MALDI-MSI) to further explore metabolic heterogeneity in plant tissues and chemical interactions at the interface between plants and pests. Additional work was focused on developing liquid chromatography-mass spectrometry (LC-MS) methods to investigate metabolites associated with plant-pest interactions.

  20. Human Caring in the Social Work Context: Continued Development and Validation of a Complex Measure

    ERIC Educational Resources Information Center

    Ellis, Jacquelyn I.; Ellett, Alberta J.; DeWeaver, Kevin

    2007-01-01

    Objectives: (a) to continue the development of a measure of human caring in the context of social work practice and (b) to expand a line of inquiry exploring the relationship between human caring characteristics and the retention of public child welfare workers. Methodology: Surveys were received from a sample (n = 786) of child welfare workers in…

  1. The Forgotten Half of Program Evaluation: A Focus on the Translation of Rating Scales for Use with Hispanic Populations

    ERIC Educational Resources Information Center

    Dogan, Shannon J.; Sitnick, Stephanie L.; Onati, Lenna L.

    2012-01-01

    Extension professionals often work with diverse clientele; however, most assessment tools have been developed and validated with English-speaking samples. There is little research and practical guidance on the cultural adaptation and translation of rating scales. The purpose of this article is to summarize the methodological work in this area as…

  2. Systematic Review and Consensus Guidelines for Environmental Sampling of Burkholderia pseudomallei

    PubMed Central

    Limmathurotsakul, Direk; Dance, David A. B.; Wuthiekanun, Vanaporn; Kaestli, Mirjam; Mayo, Mark; Warner, Jeffrey; Wagner, David M.; Tuanyok, Apichai; Wertheim, Heiman; Yoke Cheng, Tan; Mukhopadhyay, Chiranjay; Puthucheary, Savithiri; Day, Nicholas P. J.; Steinmetz, Ivo; Currie, Bart J.; Peacock, Sharon J.

    2013-01-01

    Background Burkholderia pseudomallei, a Tier 1 Select Agent and the cause of melioidosis, is a Gram-negative bacillus present in the environment in many tropical countries. Defining the global pattern of B. pseudomallei distribution underpins efforts to prevent infection, and is dependent upon robust environmental sampling methodology. Our objective was to review the literature on the detection of environmental B. pseudomallei, update the risk map for melioidosis, and propose international consensus guidelines for soil sampling. Methods/Principal Findings An international working party (Detection of Environmental Burkholderia pseudomallei Working Party (DEBWorP)) was formed during the VIth World Melioidosis Congress in 2010. PubMed (January 1912 to December 2011) was searched using the following MeSH terms: pseudomallei or melioidosis. Bibliographies were hand-searched for secondary references. The reported geographical distribution of B. pseudomallei in the environment was mapped and categorized as definite, probable, or possible. The methodology used for detecting environmental B. pseudomallei was extracted and collated. We found that global coverage was patchy, with a lack of studies in many areas where melioidosis is suspected to occur. The sampling strategies and bacterial identification methods used were highly variable, and not all were robust. We developed consensus guidelines with the goals of reducing the probability of false-negative results, and the provision of affordable and ‘low-tech’ methodology that is applicable in both developed and developing countries. Conclusions/Significance The proposed consensus guidelines provide the basis for the development of an accurate and comprehensive global map of environmental B. pseudomallei. PMID:23556010

  3. NEW SAMPLING THEORY FOR MEASURING ECOSYSTEM STRUCTURE

    EPA Science Inventory

    This research considered the application of systems analysis to the study of laboratory ecosystems. The work concerned the development of a methodology which was shown to be useful in the design of laboratory experiments, the processing and interpretation of the results of these ...

  4. A Randomized Clinical Trial of Cogmed Working Memory Training in School-Age Children with ADHD: A Replication in a Diverse Sample Using a Control Condition

    ERIC Educational Resources Information Center

    Chacko, A.; Bedard, A. C.; Marks, D. J.; Feirsen, N.; Uderman, J. Z.; Chimiklis, A.; Rajwan, E.; Cornwell, M.; Anderson, L.; Zwilling, A.; Ramon, M.

    2014-01-01

    Background: Cogmed Working Memory Training (CWMT) has received considerable attention as a promising intervention for the treatment of Attention-Deficit/Hyperactivity Disorder (ADHD) in children. At the same time, methodological weaknesses in previous clinical trials call into question reported efficacy of CWMT. In particular, lack of equivalence…

  5. In situ photoacoustic characterization for porous silicon growing: Detection principles

    NASA Astrophysics Data System (ADS)

    Ramirez-Gutierrez, C. F.; Castaño-Yepes, J. D.; Rodriguez-García, M. E.

    2016-05-01

    Few methodologies exist for monitoring the in situ formation of porous silicon (PS); photoacoustics is one of them. Previous works that reported the use of photoacoustics to study PS formation did not provide a physical explanation of the origin of the signal. In this paper, a physical explanation of the origin of the photoacoustic signal during PS etching is provided. The incident modulated radiation and changes in the reflectance are taken as thermal sources. A useful methodology is also proposed to determine the etching rate, porosity, and refractive index of a PS film through determination of the sample thickness, using scanning electron microscopy images. This method was developed by carrying out two different experiments under the same anodization conditions. The first experiment consisted of growing samples with different etching times to prove the periodicity of the photoacoustic signal, while the second considered samples grown using three different laser wavelengths, which are correlated with the period of the photoacoustic signal. The latter experiment showed that the period of the photoacoustic signal is proportional to the laser wavelength.
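
    The reported proportionality between the photoacoustic period and the laser wavelength is consistent with standard thin-film interference, where the reflectance oscillates once per λ/(2n) of thickness change. A minimal sketch of that relation (illustrative only; not the authors' code, and the numeric values below are hypothetical):

```python
def etch_rate(wavelength_nm, refractive_index, period_s):
    """Thin-film interference during etching: reflectance (and hence the
    photoacoustic signal) completes one oscillation for every lambda/(2n)
    of thickness change, so rate = lambda / (2 * n * T)."""
    return wavelength_nm / (2.0 * refractive_index * period_s)

# Hypothetical example: 650 nm laser, effective refractive index 1.3,
# observed photoacoustic period of 10 s -> 25 nm/s etch rate.
rate = etch_rate(650.0, 1.3, 10.0)
```

Because the period T scales with λ at fixed rate and n, measuring T at several wavelengths (as in the second experiment) is a direct consistency check of this model.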

  6. Improved intracellular PHA determinations with novel spectrophotometric quantification methodologies based on Sudan black dye.

    PubMed

    Porras, Mauricio A; Villar, Marcelo A; Cubitto, María A

    2018-05-01

    The presence of intracellular polyhydroxyalkanoates (PHAs) is usually studied using Sudan black dye solution (SB). In a previous work it was shown that PHA can be directly quantified using the absorbance of SB fixed by PHA granules in wet cell samples. In the present paper, the optimum SB amount and the optimum conditions for SB assays were determined following an experimental design based on hybrid response surface methodology and a desirability function. In addition, a new methodology was developed in which it is shown that the amount of SB fixed by PHA granules can also be determined indirectly through the absorbance of the supernatant obtained from the stained cell samples. This alternative methodology allows faster determination of the PHA content (23 min for the indirect versus 42 min for the direct determination) and can be undertaken with basic laboratory equipment and reagents. The correlation between PHA content in wet cell samples and the spectra of the SB-stained supernatant was determined by means of multivariate and linear regression analysis. The best calibration adjustment (R² = 0.91, RSE = 1.56%) and the good PHA prediction obtained (RSE = 1.81%) show that the proposed methodology constitutes a reasonably precise way to determine PHA content. Thus, this methodology could anticipate the probable results of the above-mentioned direct PHA determination. Compared with the most widely used techniques described in the scientific literature, the combined implementation of these two methodologies appears to be among the most economical and environmentally friendly, suitable for rapid monitoring of the intracellular PHA content. Copyright © 2018 Elsevier B.V. All rights reserved.
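
    The linear-regression calibration step described above can be sketched as ordinary least squares relating SB absorbance to PHA content. This is a generic illustration with made-up data points, not the paper's actual calibration:

```python
def fit_line(x, y):
    """Ordinary least-squares calibration line (e.g. SB absorbance -> PHA
    content): returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def predict(slope, intercept, absorbance):
    """Predict analyte content from a measured absorbance."""
    return slope * absorbance + intercept

# Hypothetical calibration points (absorbance, PHA content):
slope, intercept = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

In practice one would also report R² and a residual standard error, as the authors do, to judge whether the linear model is adequate.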

  7. Multiplex cytokine profiling with highly pathogenic material: use of formalin solution in luminex analysis.

    PubMed

    Dowall, Stuart D; Graham, Victoria A; Tipton, Thomas R W; Hewson, Roger

    2009-08-31

    Work with highly pathogenic material mandates the use of biological containment facilities, involving microbiological safety cabinets and specialist laboratory engineering structures typified by containment level 3 (CL3) and CL4 laboratories. Consequences of working in high containment are the practical difficulties associated with containing specialist assays and equipment often essential for experimental analyses. In an era of increased interest in biodefence pathogens and emerging diseases, immunological analysis has developed rapidly alongside traditional techniques in virology and molecular biology. For example, in order to maximise the use of small sample volumes, multiplexing has become a more popular and widespread approach to quantify multiple analytes simultaneously, such as cytokines and chemokines. The luminex microsphere system allows for the detection of many cytokines and chemokines in a single sample, but the detection method of using aligned lasers and fluidics means that samples often have to be analysed in low containment facilities. In order to perform cytokine analysis on materials from high containment (CL3 and CL4 laboratories), we have developed an appropriate inactivation methodology applied after the staining steps, which, although it results in a reduction of median fluorescent intensity, produces statistically comparable outcomes when judged against non-inactivated samples. This methodology thus extends the use of luminex technology to material that contains highly pathogenic biological agents.

  8. An Empirical Study of Re-sampling Techniques as a Method for Improving Error Estimates in Split-plot Designs

    DTIC Science & Technology

    2010-03-01

    …sufficient replications often lead to models that lack precision in error estimation and thus imprecision in corresponding conclusions. This work develops, examines, and tests methodologies for analyzing test results from split-plot designs. In particular, this work determines the applicability…
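
    One common re-sampling technique for improving error estimates, which the report's topic broadly suggests but which is not necessarily its exact procedure, is the bootstrap: resample the data with replacement, recompute the statistic, and take the spread of the replicates as a standard-error estimate. A minimal sketch:

```python
import random
import statistics

def bootstrap_se(data, stat=statistics.mean, n_boot=2000, seed=1):
    """Bootstrap standard error of a statistic: draw n_boot resamples with
    replacement, recompute the statistic on each, and return the standard
    deviation of the replicates."""
    rng = random.Random(seed)
    reps = [stat([rng.choice(data) for _ in data]) for _ in range(n_boot)]
    return statistics.stdev(reps)

# Toy data: the bootstrap SE of the mean should be close to s / sqrt(n).
se = bootstrap_se(list(range(10)))
```

For split-plot designs the resampling scheme must respect the whole-plot/sub-plot structure (resampling within strata), which a naive flat bootstrap like this one does not capture.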

  9. Experience-Sampling Methodology with a Mobile Device in Fibromyalgia

    PubMed Central

    Diana, Castilla; Cristina, Botella; Azucena, García-Palacios; Luis, Farfallini; Ignacio, Miralles

    2012-01-01

    This work describes the usability studies conducted in the development of an experience-sampling methodology (ESM) system running on a mobile device. The goal of the system is to improve accuracy and ecological validity in gathering daily self-report data from individuals suffering from a chronic pain condition, fibromyalgia. The usability studies showed that the developed software for conducting ESM with mobile devices (smartphones, cell phones) can be successfully used by individuals with fibromyalgia of different ages and with little expertise in the use of information and communication technologies. All users completed the tasks successfully, even though some were completely illiterate. There also seemed to be a clear difference in the mode of interaction observed between the two studies carried out. PMID:23304132
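
    A core implementation detail of most ESM systems, sketched here as a general assumption rather than a description of this particular system, is signal-contingent scheduling: the waking day is split into equal windows and one random prompt time is drawn inside each, so prompts are unpredictable but roughly evenly spread:

```python
import random
from datetime import datetime, timedelta

def daily_prompts(day_start, n_prompts, waking_hours=14, seed=None):
    """Signal-contingent ESM schedule: divide the waking day into n equal
    windows and draw one uniformly random prompt time inside each window."""
    rng = random.Random(seed)
    window = timedelta(hours=waking_hours) / n_prompts
    return [day_start + i * window + rng.random() * window
            for i in range(n_prompts)]

# Hypothetical example: six prompts between 08:00 and 22:00.
start = datetime(2024, 1, 1, 8, 0)
times = daily_prompts(start, 6, seed=42)
```

Stratifying by window guarantees coverage of morning, afternoon, and evening while keeping individual prompt times unpredictable to the participant.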

  10. Application of gamma-ray spectrometry in a NORM industry for its radiometrical characterization

    NASA Astrophysics Data System (ADS)

    Mantero, J.; Gázquez, M. J.; Hurtado, S.; Bolívar, J. P.; García-Tenorio, R.

    2015-11-01

    Industrial activities involving Naturally Occurring Radioactive Materials (NORM), such as oil/gas facilities, metal production, the phosphate industry, and zircon treatment, are among the most important industrial sectors worldwide, so the radioactive characterization of the materials involved in their production processes is essential to assess the potential radiological risk for workers and the natural environment. High-resolution gamma spectrometry is a versatile, non-destructive radiometric technique that makes the simultaneous determination of several radionuclides possible with little sample preparation. However, NORM samples cover a wide variety of densities and compositions, as opposed to the standards used in gamma efficiency calibration, which are either water-based solutions or standard/reference sources of similar composition. For that reason, self-absorption correction effects (especially in the low-energy range) must be considered individually for every sample. In this work, an experimental and a semi-empirical methodology of self-absorption correction were applied to NORM samples, and the results obtained were compared critically in order to establish best practice in relation to the circumstances of an individual laboratory. This methodology was applied to samples coming from a TiO2 factory (a NORM industry) located in the south-west of Spain, where the activity concentrations of several radionuclides from the uranium and thorium series were measured through the production process. These results are presented in this work.
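
    One widely used transmission-based (Cutshall-type) self-absorption correction, given here as an illustration of the general approach rather than the authors' exact formula, rescales the efficiency by f = -ln(T)/(1 - T), where T is the measured transmission of the gamma line through the sample relative to an identical empty geometry:

```python
import math

def self_absorption_factor(transmission):
    """Transmission-based self-absorption correction for gamma spectrometry:
    f = -ln(T) / (1 - T). T -> 1 (thin or weakly absorbing sample) gives
    f -> 1, i.e. no correction; smaller T gives a larger correction."""
    return -math.log(transmission) / (1.0 - transmission)

# A sample transmitting 50% of the line intensity needs a ~1.39x correction:
f_half = self_absorption_factor(0.5)
```

Because T depends on both the photon energy and the sample's density/composition, the factor must be measured (or modelled) per sample and per gamma line, which is exactly why low-energy lines in dense NORM matrices are the critical case.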

  11. Geostatistical approach for assessing soil volumes requiring remediation: validation using lead-polluted soils underlying a former smelting works.

    PubMed

    Demougeot-Renard, Helene; De Fouquet, Chantal

    2004-10-01

    Assessing the volume of soil requiring remediation and the accuracy of this assessment constitutes an essential step in polluted site management. If this remediation volume is not properly assessed, misclassification may lead both to environmental risks (polluted soils may not be remediated) and financial risks (unexpected discovery of polluted soils may generate additional remediation costs). To minimize such risks, this paper proposes a geostatistical methodology based on stochastic simulations that allows the remediation volume and the uncertainty to be assessed using investigation data. The methodology thoroughly reproduces the conditions in which the soils are classified and extracted at the remediation stage. The validity of the approach is tested by applying it on the data collected during the investigation phase of a former lead smelting works and by comparing the results with the volume that has actually been remediated. This real remediated volume was composed of all the remediation units that were classified as polluted after systematic sampling and analysis during clean-up stage. The volume estimated from the 75 samples collected during site investigation slightly overestimates (5.3% relative error) the remediated volume deduced from 212 remediation units. Furthermore, the real volume falls within the range of uncertainty predicted using the proposed methodology.
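
    The volume-with-uncertainty idea can be sketched by post-processing a set of equally probable simulated concentration fields: in each realization, count the remediation units above the clean-up threshold, convert to a volume, and use the spread across realizations as the uncertainty. This is a toy illustration of the principle, not the authors' geostatistical code:

```python
import statistics

def remediation_volume(realizations, threshold, unit_volume):
    """realizations: list of simulated fields, each a list with one pollutant
    concentration per remediation unit. Returns (mean volume, std dev) of
    the remediation volume over all realizations."""
    vols = [unit_volume * sum(1 for c in field if c > threshold)
            for field in realizations]
    return statistics.mean(vols), statistics.pstdev(vols)

# Two hypothetical lead-concentration realizations over three units (mg/kg),
# clean-up threshold 300, each unit representing 10 m3 of soil:
mean_vol, sd_vol = remediation_volume(
    [[100.0, 400.0, 600.0], [100.0, 200.0, 700.0]], 300.0, 10.0)
```

Conditioning the simulations on the 75 investigation samples, as in the study, is what keeps the resulting volume distribution anchored to the observed data.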

  12. Factors Influencing Self-Directed Career Management: An Integrative Investigation

    ERIC Educational Resources Information Center

    Park, Yongho

    2009-01-01

    Purpose: This paper aims to investigate the relationship between the protean career and other variables, including organizational learning climate, individual calling work orientation, and demographic variables. Design/methodology/approach: The research data were obtained from a sample consisting of 292 employees of two South Korean manufacturing…

  13. Sex Work Research: Methodological and Ethical Challenges

    ERIC Educational Resources Information Center

    Shaver, Frances M.

    2005-01-01

    The challenges involved in the design of ethical, nonexploitative research projects with sex workers or any other marginalized population are significant. First, the size and boundaries of the population are unknown, making it extremely difficult to get a representative sample. Second, because membership in hidden populations often involves…

  14. Critical evaluation of distillation procedure for the determination of methylmercury in soil samples.

    PubMed

    Perez, Pablo A; Hintelman, Holger; Quiroz, Waldo; Bravo, Manuel A

    2017-11-01

    In the present work, the efficiency of the distillation process for extracting monomethylmercury (MMHg) from soil samples was studied and optimized using an experimental design methodology. The influence of soil composition on MMHg extraction was evaluated by testing four soil samples with different geochemical characteristics. Optimization suggested that the acid concentration and the duration of the distillation process were most significant, and the most favorable conditions, established as a compromise for the studied soils, were determined to be a 70 min distillation using 0.2 M acid. The corresponding limits of detection (LOD) and quantification (LOQ) were 0.21 and 0.7 pg absolute, respectively. The optimized methodology was applied with satisfactory results to soil samples and was compared to a reference methodology based on isotope dilution analysis followed by gas chromatography-inductively coupled plasma mass spectrometry (IDA-GC-ICP-MS). Using the optimized conditions, recoveries ranged from 82 to 98%, an increase of 9-34% relative to the previously used standard operating procedure. Finally, the validated methodology was applied to quantify MMHg in soils collected from different sites impacted by coal-fired power plants in the north-central zone of Chile, measuring MMHg concentrations ranging from 0.091 to 2.8 ng g⁻¹. These data are, to the best of our knowledge, the first MMHg measurements reported for Chile. Copyright © 2017 Elsevier Ltd. All rights reserved.
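
    The factor-screening step implied by "experimental design methodology" can be sketched as main-effect estimation over a two-level factorial in coded units (here, acid concentration and distillation time). The runs and responses below are hypothetical, not the paper's data:

```python
def main_effects(runs):
    """Main effects from a two-level factorial with coded levels -1/+1.
    runs: list of (acid_level, time_level, response).
    effect = (2 / n) * sum(level_i * response_i) for each factor."""
    n = len(runs)
    acid_effect = 2.0 / n * sum(a * y for a, t, y in runs)
    time_effect = 2.0 / n * sum(t * y for a, t, y in runs)
    return acid_effect, time_effect

# Hypothetical 2^2 design: recovery responses at the four factor combinations.
runs = [(-1, -1, 0.0), (1, -1, 4.0), (-1, 1, 6.0), (1, 1, 10.0)]
acid_eff, time_eff = main_effects(runs)
```

Large main effects flag the significant factors (as the authors found for acid concentration and time), after which a follow-up design can locate the compromise optimum.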

  15. Episodic work-family conflict, cardiovascular indicators, and social support: an experience sampling approach.

    PubMed

    Shockley, Kristen M; Allen, Tammy D

    2013-07-01

    Work-family conflict, a prevalent stressor in today's workforce, has been linked to several detrimental consequences for the individual, including physical health. The present study extends this area of research by examining episodic work-family conflict in relation to objectively measured cardiovascular health indicators (systolic and diastolic blood pressure and heart rate) using an experience sampling methodology. The results suggested that the occurrence of an episode of work interference with family conflict is linked to a subsequent increase in heart rate but not blood pressure; however, the relationship between episodes of family interference with work conflict and both systolic and diastolic blood pressure is moderated by perceptions of family-supportive supervision. No evidence was found for the moderating role of work-supportive family. Further theoretical and practical implications are discussed. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  16. Simulations of the National Ignition Facility Opacity Sample

    NASA Astrophysics Data System (ADS)

    Martin, M. E.; London, R. A.; Heeter, R. F.; Dodd, E. S.; Devolder, B. G.; Opachich, Y. P.; Liedahl, D. A.; Perry, T. S.

    2017-10-01

    A platform to study the opacity of high temperature materials at the National Ignition Facility has been developed. Experiments to study the opacity of materials relevant to inertial confinement fusion and stellar astrophysics are being conducted. The initial NIF experiments are focused on reaching, for iron, the same plasma conditions (T > 150 eV and Ne ≥ 7×10²¹ cm⁻³) as those achieved in previous experiments at Sandia National Laboratories' (SNL) Z facility, which have shown discrepancies between opacity theory and experiment. We developed a methodology, using 1D HYDRA simulations, to study the effects of tamper thickness on the conditions of iron-magnesium samples. We heat the sample using an x-ray drive from 2D LASNEX hohlraum simulations. We also use this methodology to predict sample uniformity and expansion for comparison with experimental data. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Lawrence Livermore National Security, LLC.

  17. Oracle or Monacle: Research Concerning Attitudes Toward Feminism.

    ERIC Educational Resources Information Center

    Prescott, Suzanne; Schmid, Margaret

    Both popular studies and more serious empirical studies of attitudes toward feminism are reviewed beginning with Clifford Kirkpatrick's early empirical work and including the more recent empirical studies completed since 1970. The review examines the contents of items used to measure feminism, and the methodology and sampling used in studies, as…

  18. Exploring Career Success of Late Bloomers from the TVET Background

    ERIC Educational Resources Information Center

    Omar, Zoharah; Krauss, Steven Eric; Sail, Rahim M.; Ismail, Ismi Arif

    2011-01-01

    Purpose: The purpose of this paper is to explore objective and subjective career success and to identify factors contributing to career success among a sample of technical and vocational education and training (TVET) "late bloomers" working in Malaysia. Design/methodology/approach: Incorporating a mixed method design, the authors…

  19. Gas-diffusion microextraction coupled with spectrophotometry for the determination of formaldehyde in cork agglomerates.

    PubMed

    Brandão, Pedro F; Ramos, Rui M; Valente, Inês M; Almeida, Paulo J; Carro, Antonia M; Lorenzo, Rosa A; Rodrigues, José A

    2017-04-01

    In this work, a simple methodology was developed for the extraction and determination of free formaldehyde content in cork agglomerate samples. For the first time, gas-diffusion microextraction was used for the extraction of volatile formaldehyde directly from samples, with simultaneous derivatization with acetylacetone (Hantzsch reaction). The absorbance of the coloured solution was read in a spectrophotometer at 412 nm. Different extraction parameters were studied and optimized (extraction temperature, sample mass, volume of acceptor solution, extraction time and concentration of derivatization reagent) by means of an asymmetric screening. The developed methodology proved to be a reliable tool for the determination of formaldehyde in cork agglomerates, with suitable method features: low LOD (0.14 mg kg^-1) and LOQ (0.47 mg kg^-1), r^2 = 0.9994, and intraday and interday precision of 3.5 and 4.9%, respectively. The developed methodology was applied to the determination of formaldehyde in different cork agglomerate samples, and contents between 1.9 and 9.4 mg kg^-1 were found. Furthermore, formaldehyde was also determined by the standard method EN 717-3 for comparison purposes; no significant differences between the results of the two methods were observed. Graphical abstract: Representation of the GDME system and its main components.

  20. A rapid and sensitive analytical method for the determination of 14 pyrethroids in water samples.

    PubMed

    Feo, M L; Eljarrat, E; Barceló, D

    2010-04-09

    A simple, efficient and environmentally friendly analytical methodology is proposed for extracting and preconcentrating pyrethroids from water samples prior to gas chromatography-negative ion chemical ionization mass spectrometry (GC-NCI-MS) analysis. Fourteen pyrethroids were selected for this work: bifenthrin, cyfluthrin, lambda-cyhalothrin, cypermethrin, deltamethrin, esfenvalerate, fenvalerate, fenpropathrin, tau-fluvalinate, permethrin, phenothrin, resmethrin, tetramethrin and tralomethrin. The method is based on ultrasound-assisted emulsification-extraction (UAEE) of a water-immiscible solvent in an aqueous medium. Chloroform was used as the extraction solvent in the UAEE technique. Target analytes were quantitatively extracted, achieving an enrichment factor of 200 when a 20 mL aliquot of pure water spiked with pyrethroid standards was extracted. The method was also evaluated with tap water and river water samples. Method detection limits (MDLs) ranged from 0.03 to 35.8 ng L^-1, with RSD values <= 3-25% (n = 5). The coefficients of estimation of the calibration curves obtained following the proposed methodology were >= 0.998. Recovery values were in the range of 45-106%, showing satisfactory robustness of the method for analyzing pyrethroids in water samples. The proposed methodology was applied to the analysis of river water samples. Cypermethrin was detected at concentration levels ranging from 4.94 to 30.5 ng L^-1. Copyright 2010 Elsevier B.V. All rights reserved.

  1. Accelerated solvent extraction followed by on-line solid-phase extraction coupled to ion trap LC/MS/MS for analysis of benzalkonium chlorides in sediment samples

    USGS Publications Warehouse

    Ferrer, I.; Furlong, E.T.

    2002-01-01

    Benzalkonium chlorides (BACs) were successfully extracted from sediment samples using a new methodology based on accelerated solvent extraction (ASE) followed by an on-line cleanup step. The BACs were detected by liquid chromatography/ion trap mass spectrometry (LC/MS) or tandem mass spectrometry (MS/MS) using an electrospray interface operated in the positive ion mode. This methodology combines the high efficiency of extraction provided by a pressurized fluid and the high sensitivity offered by the ion trap MS/MS. The effects of solvent type and ASE operational variables, such as temperature and pressure, were evaluated. After optimization, a mixture of acetonitrile/water (6:4 or 7:3) was found to be most efficient for extracting BACs from the sediment samples. Extraction recoveries ranged from 95 to 105% for the C12 and C14 homologues, respectively. Total method recoveries from fortified sediment samples, using ASE followed by the cleanup step, were 85% for C12-BAC and 79% for C14-BAC. The methodology developed in this work provides detection limits in the subnanogram per gram range. Concentrations of BAC homologues ranged from 22 to 206 µg/kg in sediment samples from different river sites downstream from wastewater treatment plants. The high affinity of BACs for soil suggests that BACs preferentially concentrate in sediment rather than in water.

  2. Fast methodology for the reliable determination of nonylphenol in water samples by minimal labeling isotope dilution mass spectrometry.

    PubMed

    Fabregat-Cabello, Neus; Castillo, Ángel; Sancho, Juan V; González, Florenci V; Roig-Navarro, Antoni Francesc

    2013-08-02

    In this work we have developed and validated an accurate and fast methodology for the determination of 4-nonylphenol (technical mixture) in complex-matrix water samples by UHPLC-ESI-MS/MS. The procedure is based on isotope dilution mass spectrometry (IDMS) in combination with isotope pattern deconvolution (IPD), which provides the concentration of the analyte directly from the spiked sample without requiring any methodological calibration graph. To avoid any possible isotopic effect during the analytical procedure, the in-house synthesized (13)C1-4-(3,6-dimethyl-3-heptyl)phenol was used as the labeled compound. This proposed surrogate was able to compensate for the matrix effect even in wastewater samples. A SPE pre-concentration step, together with exhaustive efforts to avoid contamination, was included to reach the signal-to-noise ratio necessary to detect the endogenous concentrations present in environmental samples. Calculations were performed acquiring only three transitions, achieving limits of detection lower than 100 ng/g for all water matrices assayed. Recoveries within 83-108% and coefficients of variation ranging from 1.5% to 9% were obtained. In contrast, a considerable overestimation was obtained with the most usual classical calibration procedure using 4-n-nonylphenol as internal standard, demonstrating the suitability of the minimal labeling approach. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Methods of sampling airborne fungi in working environments of waste treatment facilities.

    PubMed

    Černá, Kristýna; Wittlingerová, Zdeňka; Zimová, Magdaléna; Janovský, Zdeněk

    2016-01-01

    The objective of the present study was to evaluate and compare the efficiency of a filter-based sampling method and a high-volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. The membrane filters (MF) method was compared with the surface air system (SAS) method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. The total number of colony-forming units (CFU)/m^3 of airborne fungi was dependent on the type of sampling device, on the time of sampling, which was carried out every hour from the beginning of the work shift, and on the type of cultivation medium (p < 0.001). Detected concentrations of airborne fungi ranged from 2×10^2 to 1.7×10^6 CFU/m^3 when using the MF method, and from 3×10^2 to 6.4×10^4 CFU/m^3 when using the SAS method. Both methods showed comparable sensitivity to the fluctuations of the concentrations of airborne fungi during the work shifts. The SAS method is adequate for a fast, indicative determination of the concentration of airborne fungi. The MF method is suitable for thorough assessment of working environment contamination by airborne fungi. We therefore recommend the MF method for the implementation of a uniform standard methodology of airborne fungi sampling in working environments of waste treatment facilities. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.

  4. The temporal structure of pollution levels in developed cities.

    PubMed

    Barrigón Morillas, Juan Miguel; Ortiz-Caraballo, Carmen; Prieto Gajardo, Carlos

    2015-06-01

    Currently, the need for mobility can cause significant pollution levels in cities, with important effects on health and quality of life. Any approach to the study of urban pollution and its effects requires an analysis of spatial distribution and temporal variability. A crucial challenge is to obtain proven methodologies that improve prediction quality while saving resources in spatial and temporal sampling. This work proposes a new analytical methodology for the study of temporal structure. As a result, a model for estimating annual levels of urban traffic noise was proposed. The average errors are less than one decibel for all acoustic indicators. This opens a new working methodology for urban noise studies. Additionally, a general application can be found in the study of the impacts of pollution associated with traffic, with implications for urban design and possibly for economic and sociological aspects. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Geoscience Education Research Methods: Thinking About Sample Size

    NASA Astrophysics Data System (ADS)

    Slater, S. J.; Slater, T. F.; CenterAstronomy; Physics Education Research

    2011-12-01

    Geoscience education research is at a critical point, in which conditions are sufficient to propel our field forward toward meaningful improvements in geosciences education practices. Our field has now reached a point where the outcomes of our research are deemed important to end users and funding agencies, and where we have a large number of scientists who are either formally trained in geosciences education research or who have dedicated themselves to excellence in this domain. We must now collectively work through our epistemology: the rules for which methodologies will be considered sufficiently rigorous, and which data and analysis techniques will be acceptable for constructing evidence. In particular, we have to work out our answer to that most difficult of research questions: "How big should my 'N' be?" This paper presents a very brief answer to that question, addressing both quantitative and qualitative methodologies. Research question/methodology alignment, effect size and statistical power will be discussed, in addition to a defense of the notion that bigger is not always better.
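    The effect-size and statistical-power considerations mentioned above can be made concrete. The following is a hedged illustration (a standard normal-approximation formula, not taken from the paper itself) of the per-group sample size needed for a two-sample comparison of means, given a target effect size (Cohen's d), significance level, and power:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sample comparison of
    means (normal approximation to the t-test), given Cohen's d."""
    z = NormalDist().inv_cdf  # standard normal quantile function
    n = 2 * ((z(1 - alpha / 2) + z(power)) / effect_size) ** 2
    return math.ceil(n)

# A "medium" effect (d = 0.5) at alpha = 0.05 and 80% power needs about
# 63 participants per group; a "large" effect (d = 0.8) needs about 25.
print(n_per_group(0.5), n_per_group(0.8))
```

This is why "how big should N be" has no single answer: halving the detectable effect size roughly quadruples the required sample.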

  6. In situ photoacoustic characterization for porous silicon growing: Detection principles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramirez-Gutierrez, C. F. (Universidad Autónoma de Querétaro, Querétaro, Qro.); Castaño-Yepes, J. D.

    There are few methodologies for monitoring the in-situ formation of porous silicon (PS); photoacoustics is one of them. Previous works reporting the use of photoacoustics to study PS formation do not provide a physical explanation of the origin of the signal. In this paper, a physical explanation of the origin of the photoacoustic signal during PS etching is provided: the incident modulated radiation and changes in reflectance are taken as thermal sources. A useful methodology is also proposed to determine the etching rate, porosity, and refractive index of a PS film through determination of the sample thickness, using scanning electron microscopy images. This method was developed by carrying out two different experiments under the same anodization conditions. The first experiment consisted of growing samples with different etching times to prove the periodicity of the photoacoustic signal, while the second grew samples using three different wavelengths, correlated with the period of the photoacoustic signal. The latter experiment showed that the period of the photoacoustic signal is proportional to the laser wavelength.

  7. Selective separation of iron from uranium in quantitative determination of traces of uranium by alpha spectrometry in soil/sediment sample.

    PubMed

    Singhal, R K; Narayanan, Usha; Karpe, Rupali; Kumar, Ajay; Ranade, A; Ramachandran, V

    2009-04-01

    During this work, a controlled redox potential methodology was adopted for the complete separation of traces of uranium from the host matrix of mixed iron hydroxides. Precipitates of Fe(II) and Fe(III), along with other transuranic elements, were obtained from the acid-leached soil solution by raising the pH to 9 with 14N ammonia solution. The uranium concentration observed in the soil samples was 200-600 ppb, whereas in sediment samples the concentration range was 61-400 ppb.

  8. Epistemological Issues in Astronomy Education Research: How Big of a Sample is "Big Enough"?

    NASA Astrophysics Data System (ADS)

    Slater, Stephanie; Slater, T. F.; Souri, Z.

    2012-01-01

    As astronomy education research (AER) continues to evolve into a sophisticated enterprise, we must begin to grapple with defining our epistemological parameters. Moreover, as we attempt to make pragmatic use of our findings, we must make a concerted effort to communicate those parameters in a sensible way to the larger astronomical community. One area of much current discussion is which methodologies, and which sample sizes, should be considered appropriate for generating knowledge in the field. To address this question, we completed a meta-analysis of nearly 1,000 peer-reviewed studies published in top-tier professional journals. Data related to methodologies and sample sizes were collected from "hard science" and "human science" journals to compare the epistemological systems of these two bodies of knowledge. Working back in time from August 2011, the 100 most recent studies reported in each journal were used as a data source: Icarus, ApJ and AJ, NARST, IJSE and SciEd. In addition, data were collected from the 10 most recent AER dissertations, a set of articles determined by the science education community to be the most influential in the field, and the nearly 400 articles used as reference materials for the NRC's Taking Science to School. Analysis indicates these bodies of knowledge have a great deal in common: each relies on a large variety of methodologies, and each builds its knowledge through studies that proceed from surprisingly low sample sizes. While both fields publish a small percentage of studies with large sample sizes, the vast majority of top-tier publications consist of rich studies of a small number of objects. We conclude that rigor in each field is determined not by a circumscription of methodologies and sample sizes, but by peer judgments that the methods and sample sizes are appropriate to the research question.

  9. The Maintenance of Whiteness in Urban Education: Explorations of Rhetoric and Reality

    ERIC Educational Resources Information Center

    Miller, Erin; Starker-Glass, Tehia

    2018-01-01

    Told from the perspective of two early career professors teaching courses in elementary education diversity, this study uses purposive sampling and qualitative methodologies to examine how white students with impervious dispositions that would likely not qualify them to work with diverse children at this point in their lives present us with…

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korte, Andrew R

    This thesis presents efforts to improve matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI-MSI) methodology for the analysis of metabolites from plant tissue samples. The first chapter consists of a general introduction to the technique of MALDI-MSI, and the sixth and final chapter provides a brief summary and an outlook on future work.

  11. Relationships between Teacher Organizational Commitment, Psychological Hardiness and Some Demographic Variables in Turkish Primary Schools

    ERIC Educational Resources Information Center

    Sezgin, Ferudun

    2009-01-01

    Purpose: The purpose of this paper is to examine the relationships between teachers' organizational commitment perceptions and both their psychological hardiness and some demographic variables in a sample of Turkish primary schools. Design/methodology/approach: A total of 405 randomly selected teachers working at primary schools in Ankara…

  12. On the development of a methodology for extensive in-situ and continuous atmospheric CO2 monitoring

    NASA Astrophysics Data System (ADS)

    Wang, K.; Chang, S.; Jhang, T.

    2010-12-01

    Carbon dioxide is recognized as the dominant greenhouse gas contributing to anthropogenic global warming, and stringent controls on carbon dioxide emissions are viewed as necessary steps in controlling atmospheric concentrations. From the viewpoint of policy making, regulation of carbon dioxide emissions and its monitoring are keys to the success of such controls; in particular, extensive atmospheric CO2 monitoring is crucial to ensure that CO2 emission control strategies are closely followed. In this work we develop a methodology that enables reliable and accurate in-situ and continuous atmospheric CO2 monitoring for policy making. The methodology comprises the use of a gas filter correlation (GFC) instrument for in-situ CO2 monitoring, CO2 working standards accompanying the continuous measurements, and NOAA WMO CO2 standard gases for calibrating the working standards. The GFC instrument enables a 1-second data sampling frequency, with water vapor interference removed by an added dryer. The CO2 measurements are conducted in a timed, cycled manner: a zero CO2 measurement, measurements of two standard CO2 gases, and ambient air measurements. The standard CO2 gases are calibrated against NOAA WMO CO2 standards. The methodology has been used for indoor CO2 measurements in a commercial office (about 120 people working inside), for ambient CO2 measurements, and aboard a fleet of in-service commercial cargo ships for monitoring CO2 over the global marine boundary layer. These measurements demonstrate that our method is reliable, accurate, and traceable to NOAA WMO CO2 standards. The portability of the instrument and the working standards makes the method readily applicable to large-scale and extensive CO2 measurements.
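    The cycled measurement scheme above (zero gas, two working standards, ambient air) implies a simple two-point linear calibration of the analyzer against the standards. A minimal sketch of that gain/offset model follows; the model is standard practice, but all of the numbers here are hypothetical and are not from the paper:

```python
def two_point_cal(raw1, true1, raw2, true2):
    """Fit a linear (gain/offset) response from readings of two
    calibrated standard gases; return a function that maps raw
    analyzer readings to CO2 mole fractions (ppm)."""
    gain = (true2 - true1) / (raw2 - raw1)
    offset = true1 - gain * raw1
    return lambda raw: gain * raw + offset

# Hypothetical cycle: standards certified at 380 and 420 ppm read as
# 378.0 and 419.5 on the analyzer. The fitted function then corrects
# subsequent ambient readings.
cal = two_point_cal(378.0, 380.0, 419.5, 420.0)
print(round(cal(400.0), 2))
```

A separate zero-gas reading in each cycle lets the operator detect offset drift between recalibrations.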

  13. PCB congener analysis with Hall electrolytic conductivity detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edstrom, R.D.

    1989-01-01

    This work reports the development of an analytical methodology for the analysis of PCB congeners based on integrating relative retention data provided by other researchers. The retention data were transposed into a multiple retention marker system which provided good precision in the calculation of relative retention indices for PCB congener analysis. Analytical run times for the developed methodology were approximately one hour using a commercially available GC capillary column. A Tracor Model 700A Hall Electrolytic Conductivity Detector (HECD) was employed in the GC detection of Aroclor standards and environmental samples. Responses by the HECD provided good sensitivity and were reasonably predictable. Ten response factors were calculated based on the molar chlorine content of each homolog group. Homolog distributions were determined for Aroclors 1016, 1221, 1232, 1242, 1248, 1254, 1260 and 1262, along with binary and ternary mixtures of the same. These distributions were compared with distributions reported by other researchers using electron capture detection as well as chemical ionization mass spectrometric methodologies; homolog distributions acquired by the HECD methodology showed good correlation with both. The developed analytical methodology was used in the analysis of bluefish (Pomatomus saltatrix) and weakfish (Cynoscion regalis) collected from the York River, lower James River and lower Chesapeake Bay in Virginia. Total PCB concentrations were calculated and homolog distributions were constructed from the acquired data. Increases in total PCB concentrations were found in the fish samples collected from the lower James River and lower Chesapeake Bay during the fall of 1985.

  14. Status of Activities to Implement a Sustainable System of MC&A Equipment and Methodological Support at Rosatom Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.D. Sanders

    Under the U.S.-Russian Material Protection, Control and Accounting (MPC&A) Program, the Material Control and Accounting Measurements (MCAM) Project has supported a joint U.S.-Russian effort to coordinate improvements of the Russian MC&A measurement system. These efforts have resulted in the development of an MC&A Equipment and Methodological Support (MEMS) Strategic Plan (SP), developed by the Russian MEMS Working Group. The MEMS SP covers implementation of MC&A measurement equipment, as well as the development, attestation and implementation of measurement methodologies and reference materials at the facility and industry levels. This paper provides an overview of the activities conducted under the MEMS SP, as well as a status report on current efforts to develop reference materials, implement destructive and nondestructive assay measurement methodologies, and implement sample exchange, scrap and holdup measurement programs across Russian nuclear facilities.

  15. A methodology for finding the optimal iteration number of the SIRT algorithm for quantitative Electron Tomography.

    PubMed

    Okariz, Ana; Guraya, Teresa; Iturrondobeitia, Maider; Ibarretxe, Julen

    2017-02-01

    The SIRT (Simultaneous Iterative Reconstruction Technique) algorithm is commonly used in Electron Tomography to calculate the original volume of the sample from noisy images, but the results provided by this iterative procedure are strongly dependent on the specific implementation of the algorithm, as well as on the number of iterations employed for the reconstruction. In this work, a methodology is proposed for selecting the iteration number of the SIRT reconstruction that provides the most accurate segmentation. The methodology is based on the statistical analysis of the intensity profiles at the edges of the objects in the reconstructed volume. A phantom which resembles a carbon black aggregate has been created to validate the methodology, and the SIRT implementations of two free software packages (TOMOJ and TOMO3D) have been used. Copyright © 2016 Elsevier B.V. All rights reserved.
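    The idea of scoring each candidate iteration by its edge-intensity profiles can be sketched with a deliberately simplified proxy: pick the iteration whose profile is steepest. This is an illustration only; the paper's actual criterion is a statistical analysis of the profiles, not a single-slope maximum, and the function and data below are hypothetical:

```python
def best_iteration(profiles):
    """Return the index of the edge-intensity profile that is steepest.
    Maximum point-to-point slope is used here as a crude proxy for edge
    sharpness in the reconstructed volume."""
    def max_slope(profile):
        return max(abs(b - a) for a, b in zip(profile, profile[1:]))
    return max(range(len(profiles)), key=lambda i: max_slope(profiles[i]))

# Three hypothetical edge profiles, one per candidate iteration count;
# the second profile has the sharpest transition and is selected.
print(best_iteration([[0, 0.3, 0.6, 1.0], [0, 0.1, 0.9, 1.0], [0, 0.5, 0.5, 1.0]]))
```

In practice the score must also penalize noise amplification, which is why running SIRT for ever more iterations does not keep improving segmentation.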

  16. Sample survey methods as a quality assurance tool in a general practice immunisation audit.

    PubMed

    Cullen, R

    1994-04-27

    In a multidoctor family practice there are often simply too many sets of patient records to make it practical to repeat an audit by census of even an age band of the practice on a regular basis. This paper attempts to demonstrate how sample survey methodology can be incorporated into the quality assurance cycle. A simple random sample (with replacement) of 120 of the 580 children with permanent records who were aged between 6 weeks and 2 years old was drawn from an Auckland general practice, with the sample size selected to give a predetermined precision. The survey was then repeated after 4 weeks. Both surveys were completed within the course of a normal working day. An unexpectedly low proportion of under-2-year-olds recorded as not overdue for any immunisations was found (22.5%), with only a modest improvement after a standard telephone/letter catch-up campaign. Seventy-two percent of the sample held a group one community services card. The advantages of properly conducted sample surveys in producing useful estimates of known precision, without disrupting office routines excessively, were demonstrated. With some attention to methodology, the trauma of a practice census can be avoided.
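    The "sample size selected to give a predetermined precision" step can be sketched with the standard formula for estimating a proportion from a finite population (a hedged illustration, not the paper's own computation). Interestingly, a ±8% margin at 95% confidence over a population of 580, with the conservative p = 0.5, is consistent with the 120-of-580 draw described above:

```python
import math
from statistics import NormalDist

def srs_sample_size(N, margin, p=0.5, conf=0.95):
    """Sample size needed to estimate a proportion to +/- `margin` at the
    given confidence level, with the finite population correction for a
    population of size N."""
    z = NormalDist().inv_cdf(1 - (1 - conf) / 2)
    n0 = z ** 2 * p * (1 - p) / margin ** 2       # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))     # finite population correction

# A practice register of 580 children, targeting +/-8% at 95% confidence.
print(srs_sample_size(580, 0.08))
```

The finite population correction matters here: without it, the same precision would require about 151 records instead of 120.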

  17. Determination of As, Se, and Hg in fuel samples by in-chamber chemical vapor generation ICP OES using a Flow Blurring® multinebulizer.

    PubMed

    García, Miriam; Aguirre, Miguel Ángel; Canals, Antonio

    2017-09-01

    In this work, a new and simple analytical methodology based on in-chamber chemical vapor generation has been developed for the spectrochemical analysis of commercial fuel samples. A multiple nebulizer with three nebulization units has been employed for this purpose: one unit was used for sample introduction, while the other two were used for the necessary reagent introduction. In this way, the aerosols were mixed inside the spray chamber. Through this method, analyte transport and, therefore, sensitivity are improved in inductively coupled plasma-optical emission spectrometry. The factors (i.e., variables) influencing chemical vapor generation have been optimized using a multivariate approach. Under optimum chemical vapor generation conditions ([NaBH4] = 1.39%, [HCl] = 2.97 M, total liquid flow = 936 μL min^-1), the proposed sample introduction system allowed the determination of arsenic, selenium, and mercury up to 5 μg g^-1, with limits of detection of 25, 140, and 13 μg kg^-1, respectively. Analyzing spiked commercial fuel samples, recovery values obtained were between 96 and 113%, and expanded uncertainty values ranged from 4 to 16%. The most striking practical conclusion of this investigation is that no carbon deposit appears on the plasma torch after extended working periods. Graphical abstract: A new and simple analytical methodology based on in-chamber chemical vapor generation has been developed for the spectrochemical analysis of commercial fuel samples in ICP OES.

  18. Choosing the Most Effective Pattern Classification Model under Learning-Time Constraint.

    PubMed

    Saito, Priscila T M; Nakamura, Rodrigo Y M; Amorim, Willian P; Papa, João P; de Rezende, Pedro J; Falcão, Alexandre X

    2015-01-01

    Nowadays, large datasets are common and demand faster and more effective pattern analysis techniques. However, methodologies to compare classifiers usually do not take into account the learning-time constraints required by applications. This work presents a methodology to compare classifiers with respect to their ability to learn from classification errors on a large learning set, within a given time limit. Faster techniques may acquire more training samples, but only when they are more effective will they achieve higher performance on unseen testing sets. We demonstrate this result using several techniques, multiple datasets, and typical learning-time limits required by applications.

  19. Modeling and Simulation of Upset-Inducing Disturbances for Digital Systems in an Electromagnetic Reverberation Chamber

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report describes a modeling and simulation approach for disturbance patterns representative of the environment experienced by a digital system in an electromagnetic reverberation chamber. The disturbance is modeled by a multi-variate statistical distribution based on empirical observations. Extended versions of the Rejection Sampling and Inverse Transform Sampling techniques are developed to generate multi-variate random samples of the disturbance. The results show that Inverse Transform Sampling returns samples with higher fidelity relative to the empirical distribution. This work is part of an ongoing effort to develop a resilience assessment methodology for complex safety-critical distributed systems.
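    The core Inverse Transform Sampling idea that the report extends can be sketched for a one-dimensional discrete empirical distribution (a minimal stdlib illustration, not the report's multi-variate implementation): build the cumulative distribution, draw a uniform variate, and invert the CDF by binary search.

```python
import bisect
import random
from itertools import accumulate

def inverse_transform_sampler(values, weights, seed=42):
    """Sample from an empirical discrete distribution by inverting its
    cumulative distribution function (CDF): draw u ~ U(0, total weight)
    and locate the first CDF entry >= u."""
    rng = random.Random(seed)
    cdf = list(accumulate(weights))
    total = cdf[-1]
    def draw():
        u = rng.random() * total
        return values[bisect.bisect_left(cdf, u)]
    return draw

# Empirical distribution with weights 1:2:7 -- about 70% of a long run
# of draws should return "c".
draw = inverse_transform_sampler(["a", "b", "c"], [1, 2, 7])
counts = {v: 0 for v in "abc"}
for _ in range(10000):
    counts[draw()] += 1
print(counts)
```

Unlike rejection sampling, every uniform draw yields a sample, which is one reason inverse transform methods track an empirical distribution so closely.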

  20. Monte Carlo and Molecular Dynamics in the Multicanonical Ensemble: Connections between Wang-Landau Sampling and Metadynamics

    NASA Astrophysics Data System (ADS)

    Vogel, Thomas; Perez, Danny; Junghans, Christoph

    2014-03-01

    We show direct formal relationships between the Wang-Landau iteration [PRL 86, 2050 (2001)], metadynamics [PNAS 99, 12562 (2002)] and statistical temperature molecular dynamics [PRL 97, 050601 (2006)], the major Monte Carlo and molecular dynamics workhorses for sampling from a generalized, multicanonical ensemble. We aim at helping to consolidate the developments in the different areas by indicating how methodological advancements can be transferred in a straightforward way, avoiding the parallel, largely independent development tracks observed in the past.

  1. Determination of tocopherols and sitosterols in seeds and nuts by QuEChERS-liquid chromatography.

    PubMed

    Delgado-Zamarreño, M Milagros; Fernández-Prieto, Cristina; Bustamante-Rangel, Myriam; Pérez-Martín, Lara

    2016-02-01

    In the present work a simple, reliable and affordable sample treatment method for the simultaneous analysis of tocopherols and free phytosterols in nuts was developed. Analyte extraction was carried out using the QuEChERS methodology, and analyte separation and detection were accomplished using HPLC-DAD. The use of this methodology for the extraction of naturally occurring substances provides advantages such as speed, simplicity and ease of use. The parameters evaluated for the validation of the developed method included the linearity of the calibration plots, the detection and quantification limits, repeatability, reproducibility and recovery. The proposed method was successfully applied to the analysis of tocopherols and free phytosterols in samples of almonds, cashew nuts, hazelnuts, peanuts, tiger nuts, sunflower seeds and pistachios. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Correcting sample drift using Fourier harmonics.

    PubMed

    Bárcena-González, G; Guerrero-Lebrero, M P; Guerrero, E; Reyes, D F; Braza, V; Yañez, A; Nuñez-Moraleda, B; González, D; Galindo, P L

    2018-07-01

    During image acquisition of crystalline materials by high-resolution scanning transmission electron microscopy, sample drift can lead to distortions and shears that hinder quantitative analysis and characterization. In order to measure and correct this effect, several authors have proposed methodologies making use of series of images. In this work, we introduce a methodology to determine the drift angle via Fourier analysis from a single image, based on measurements of the angles of the second Fourier harmonics in different quadrants. Two different approaches, both independent of the image acquisition angle, are evaluated. In addition, our results demonstrate that the determination of the drift angle is more accurate when using measurements from non-consecutive quadrants, particularly when the acquisition angle is an odd multiple of 45°. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Recent archaeomagnetic studies in Slovakia: Comparison of methodological approaches

    NASA Astrophysics Data System (ADS)

    Kubišová, Lenka

    2016-03-01

    We review recent archaeomagnetic studies carried out on the territory of Slovakia, comparing methodological approaches and discussing the pros and cons of the individual methods from the perspective of our experience. The methods most widely used to determine the intensity and direction of the archaeomagnetic field by demagnetisation of the sample material are alternating field (AF) demagnetisation and the Thellier double-heating method. These methods are used not only for archaeomagnetic studies but also help to solve some geological problems. The two methods were applied to samples collected recently at several sites in Slovakia, where archaeological prospection prompted by earthwork or reconstruction work on development projects demanded archaeomagnetic dating. We then discuss the advantages and weaknesses of the investigated methods from different perspectives, based on several examples and our recent experience.

  4. Cross-Sectional And Longitudinal Uncertainty Propagation In Drinking Water Risk Assessment

    NASA Astrophysics Data System (ADS)

    Tesfamichael, A. A.; Jagath, K. J.

    2004-12-01

    Pesticide residues in drinking water can vary significantly from day to day. However, drinking water quality monitoring performed under the Safe Drinking Water Act (SDWA) at most community water systems (CWSs) is typically limited to four data points per year over a few years. Because of this limited sampling, likely maximum residues may be underestimated in risk assessment. In this work, a statistical methodology is proposed to study the cross-sectional and longitudinal uncertainties in observed samples and their propagated effect on risk estimates. The methodology is demonstrated using data from 16 CWSs across the US that have three independent databases of atrazine residue, estimating the uncertainty of risk in infants and children. The results showed that in 85% of the CWSs, chronic risks predicted with the proposed approach may be two- to four-fold higher than those predicted with the current approach, while intermediate risks may be two- to three-fold higher in 50% of the CWSs. In 12% of the CWSs, however, the proposed methodology showed a lower intermediate risk. A closed-form solution of propagated uncertainty will be developed to calculate the number of years (seasons) of water quality data and the sampling frequency needed to reduce the uncertainty in risk estimates. In general, this methodology provides good insight into the importance of addressing the uncertainty of observed water quality data and the need to predict likely maximum residues in risk assessment by considering the propagation of uncertainties.
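    The core problem, that four samples per year understate likely maximum residues, can be shown with a small Monte Carlo sketch. This is a hedged illustration with invented numbers, not the authors' statistical model: the "true" daily residue series and the sampling scheme are hypothetical.

```python
import random
import statistics

random.seed(1)

def mean_observed_max(daily, n_per_year=4, n_trials=2000):
    # Repeatedly draw a sparse monitoring schedule (n_per_year samples
    # without replacement from the daily series) and average the maximum
    # residue each schedule would have observed.
    maxima = [max(random.sample(daily, n_per_year)) for _ in range(n_trials)]
    return statistics.mean(maxima)

# Hypothetical "true" daily residues (ug/L): low baseline plus a
# 30-day seasonal spike, as after a spring pesticide application.
daily = [0.1] * 335 + [3.0] * 30
true_max = max(daily)
sparse_max = mean_observed_max(daily)
# Four samples per year miss the spike most of the time, so the average
# observed maximum falls well below the true maximum of 3.0.
```

    Running the sketch shows the observed maximum averaging far below the true peak, which is the underestimation the proposed uncertainty propagation is meant to correct.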

  5. Cognitive Difficulties in Struggling Comprehenders and Their Relation to Reading Comprehension: A Comparison of Group Selection and Regression-Based Models

    ERIC Educational Resources Information Center

    Barnes, Marcia A.; Stuebing, Karla K.; Fletcher, Jack M.; Barth, Amy E.; Francis, David J.

    2016-01-01

    Difficulties suppressing previously encountered but currently irrelevant information from working memory characterize less skilled comprehenders in studies in which they are matched to skilled comprehenders on word decoding and nonverbal IQ. These "extreme" group designs are associated with several methodological issues. When sample size…

  6. Waging a Living: Career Development and Long-Term Employment Outcomes for Young Adults with Disabilities

    ERIC Educational Resources Information Center

    Lindstrom, Lauren; Doren, Bonnie; Miesch, Jennifer

    2011-01-01

    Youth with disabilities face many barriers in making the transition from high school to stable long-term employment. Researchers used case study methodology to examine the career development process and postschool employment outcomes for a sample of individuals with disabilities who were working in living wage occupations 7 to 10 years after…

  7. School Principals at Their Lonely Work: Recording Workday Practices through ESM Logs

    ERIC Educational Resources Information Center

    Lopez, Veronica; Ahumada, Luis; Galdames, Sergio; Madrid, Romina

    2012-01-01

    This study used portable technology based on Experience Sampling Methodology (ESM log) to register workday practices of school principals and heads from Chilean schools who were implementing school improvement plans aimed at developing a culture of organizational learning. For a week, Smartphone devices which beeped seven times a day were given to…

  8. Relationship among School Size, School Culture and Students' Achievement at Secondary Level in Pakistan

    ERIC Educational Resources Information Center

    Ahmad Salfi, Naseer; Saeed, Muhammad

    2007-01-01

    Purpose: This paper seeks to determine the relationship among school size, school culture and students' achievement at secondary level in Pakistan. Design/methodology/approach: The study was descriptive (survey type). It was conducted on a sample of 90 secondary school head teachers and 540 primary, elementary and high school teachers working in…

  9. Counseling Chinese Patients about Cigarette Smoking: The Role of Nurses

    ERIC Educational Resources Information Center

    Li, Han Zao; Zhang, Yu; MacDonell, Karen; Li, Xiao Ping; Chen, Xinguang

    2012-01-01

    Purpose: The main purpose of this study is to determine the cigarette smoking rate and smoking cessation counseling frequency in a sample of Chinese nurses. Design/methodology/approach: At the time of data collection, the hospital had 260 nurses, 255 females and five males. The 200 nurses working on the two daytime shifts were given the…

  10. Teachers' Perceptions of Employment-Related Problems: A Survey of Teachers in Two States.

    ERIC Educational Resources Information Center

    Cutrer, Susan S.; Daniel, Larry G.

    This study was conducted to determine the degree to which a randomly selected sample of teachers in Mississippi and Louisiana (N=291) experience various types of work-related problems. It provides an opportunity to either confirm or deny the findings of previous studies, many of them limited by various methodological problems. Data were collected…

  11. Climate Profile and OCBs of Teachers in Public and Private Schools of India

    ERIC Educational Resources Information Center

    Garg, Pooja; Rastogi, Renu

    2006-01-01

    Purpose: This research aims to assess the significant differences in the climate profile and organizational citizenship behaviors (OCBs) of teachers working in public and private schools of India. Design/methodology/approach: The sample comprised 100 teachers, of which 50 were from a public school and 50 were from private…

  12. INTERPRETING PHYSICAL AND BEHAVIORAL HEALTH SCORES FROM NEW WORK DISABILITY INSTRUMENTS

    PubMed Central

    Marfeo, Elizabeth E.; Ni, Pengsheng; Chan, Leighton; Rasch, Elizabeth K.; McDonough, Christine M.; Brandt, Diane E.; Bogusz, Kara; Jette, Alan M.

    2015-01-01

    Objective To develop a system to guide interpretation of scores generated from 2 new instruments measuring work-related physical and behavioral health functioning (Work Disability – Physical Function (WD-PF) and WD – Behavioral Function (WD-BH)). Design Cross-sectional, secondary data from 3 independent samples were used to develop and validate the functional levels for physical and behavioral health functioning. Subjects Physical group: 999 general adult subjects, 1,017 disability applicants and 497 work-disabled subjects. Behavioral health group: 1,000 general adult subjects, 1,015 disability applicants and 476 work-disabled subjects. Methods A three-phase analytic approach, including item mapping, a modified Delphi technique, and known-groups validation analysis, was used to develop and validate cut-points for functional levels within each of the WD-PF and WD-BH instruments' scales. Results Four and five functional levels were developed for the scales of the WD-PF and WD-BH instruments, respectively. The distribution of the comparative samples was in the expected direction: the general adult samples consistently demonstrated scores at higher functional levels compared with the claimant and work-disabled samples. Conclusion Using an item response theory-based methodology paired with a qualitative process appears to be a feasible and valid approach for translating WD-BH and WD-PF scores into meaningful levels useful for interpreting a person's work-related physical and behavioral health functioning. PMID:25729901

  13. Biomarkers of exposure to metal dust in exhaled breath condensate: methodology optimization.

    PubMed

    Félix, P M; Franco, C; Barreiros, M A; Batista, B; Bernardes, S; Garcia, S M; Almeida, A B; Almeida, S M; Wolterbeek, H Th; Pinheiro, T

    2013-01-01

    In occupational assessments where workers are exposed to metal dust, the liquid condensate of exhaled breath (EBC) may provide a unique indication of pulmonary exposure. The main goal of this study was to demonstrate the suitability of EBC for the biological monitoring of human exposure. A pilot study was performed in a group of metal dust-exposed workers and a group of non-exposed individuals working in offices. Only the metal dust-exposed workers were followed over the working week to determine the best time of collection. Metal analyses were performed with inductively coupled plasma mass spectrometry (ICP-MS). The analytical methodology was tested using an EBC sample pool for several metals of occupational relevance: potassium, chromium, manganese, copper, zinc, strontium, cadmium, antimony, and lead. Metal contents in the EBC of exposed workers were higher than in controls at the beginning of the shift and remained elevated throughout the working week. The results obtained support the establishment of EBC as an indicator of pulmonary exposure to metals.

  14. Ergonomic initiatives at Inmetro: measuring occupational health and safety.

    PubMed

    Drucker, L; Amaral, M; Carvalheira, C

    2012-01-01

    This work studies the biomechanical hazards to which the workforce of Instituto Nacional de Metrologia, Qualidade e Tecnologia Industrial (Inmetro) is exposed. It suggests a model for the ergonomic evaluation of work based on the concepts of resilience engineering, which take into consideration the institute's ability to manage risk and deal with its consequences. The methodology includes the stages of identification, inventory, analysis, and risk management. Diagnosis of the workplace uses as parameters the minimal criteria stated in Brazilian legislation. The approach encompasses several perspectives, including the points of view of public management, safety engineering, physical therapy and ergonomics-oriented design. The suggested solution integrates all aspects of the problem: biological, psychological, sociological and organizational. Results obtained from a pilot project made it possible to build a significant sample of Inmetro's workforce, identifying problems and validating the methodology employed as a tool to be applied to the whole institution. Finally, this work intends to draw risk maps and to support goals and methods based on resilience engineering for assessing environmental and ergonomic risk management.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bekar, Kursat B; Miller, Thomas Martin; Patton, Bruce W

    The characteristic X-rays produced by the interactions of the electron beam with the sample in a scanning electron microscope (SEM) are usually captured with a variable-energy detector, a process termed energy dispersive spectrometry (EDS). The purpose of this work is to exploit inverse simulations of SEM-EDS spectra to enable rapid determination of sample properties, particularly elemental composition. This is accomplished using penORNL, a modified version of PENELOPE, and a modified version of the traditional Levenberg-Marquardt nonlinear optimization algorithm, which together are referred to as MOZAIK-SEM. The overall conclusion of this work is that MOZAIK-SEM is a promising method for performing inverse analysis of X-ray spectra generated within a SEM. As this methodology exists now, MOZAIK-SEM has been shown to calculate the elemental composition of an unknown sample within a few percent of the actual composition.

  16. 21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...

  17. 21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...

  18. 21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...

  19. 21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...

  20. 21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...

  1. Rushed, unhappy, and drained: an experience sampling study of relations between time pressure, perceived control, mood, and emotional exhaustion in a group of accountants.

    PubMed

    Teuchmann, K; Totterdell, P; Parker, S K

    1999-01-01

    Experience sampling methodology was used to examine how work demands translate into acute changes in affective response and thence into chronic response. Seven accountants reported their reactions 3 times a day for 4 weeks on pocket computers. Aggregated analysis showed that mood and emotional exhaustion fluctuated in parallel with time pressure over time. Disaggregated time-series analysis confirmed the direct impact of high-demand periods on the perception of control, time pressure, and mood and the indirect impact on emotional exhaustion. A curvilinear relationship between time pressure and emotional exhaustion was shown. The relationships between work demands and emotional exhaustion changed between high-demand periods and normal working periods. The results suggest that enhancing perceived control may alleviate the negative effects of time pressure.

  2. [Methodological quality of an article on the treatment of gastric cancer adopted as protocol by some Chilean hospitals].

    PubMed

    Manterola, Carlos; Torres, Rodrigo; Burgos, Luis; Vial, Manuel; Pineda, Viviana

    2006-07-01

    Surgery is a curative treatment for gastric cancer (GC). As relapse is frequent, adjuvant therapies such as postoperative chemoradiotherapy have been tried. In Chile, some hospitals adopted Macdonald's study as a protocol for the treatment of GC. To determine the methodological quality and the internal and external validity of the Macdonald study, three instruments that assess methodological quality were applied. A critical appraisal was done, and the internal and external validity of the methodological quality were analyzed with two scales: MINCIR (Methodology and Research in Surgery), valid for therapy studies, and CONSORT (Consolidated Standards of Reporting Trials), valid for randomized controlled trials (RCT). Guides and scales were applied by 5 researchers with training in clinical epidemiology. The reader's guide verified that the Macdonald study was not directed to answer a clearly defined question. There was random assignment, but the method used is not described and the patients were not followed until the end of the study (36% of the group with surgery plus chemoradiotherapy did not complete treatment). The MINCIR scale identified a multicenter RCT, not blinded, with an unclear randomization sequence, erroneous sample size estimation, vague objectives and no exclusion criteria. The CONSORT system showed the lack of a working hypothesis and of specific objectives, the absence of exclusion criteria and of identification of the primary variable, an imprecise estimation of sample size, ambiguities in the randomization process, no blinding, an absence of statistical adjustment and the omission of a subgroup analysis. The instruments applied demonstrated methodological shortcomings that compromise the internal and external validity of the Macdonald study.

  3. [An approach to a methodology of scientific research for assistant-students].

    PubMed

    Novak, Ivón T C; Bejarano, Paola Antón; Rodríguez, Fernando Marcos

    2007-01-01

    This work is presented from a problem-based perspective, in an attempt to establish a dialogic relationship between the educator and the student, mediated by the object of knowledge. It is oriented to the comprehensive education of student assistants through a closer approach to scientific research. The work was carried out by a teacher and two student assistants. The project was developed in relation to the profile required for the medical career at the Faculty of Medicine of the National University of Cordoba, which, among other aspects, addresses the importance of "adopting a positive attitude towards research based on knowledge and the application of the scientific methodology" and towards "the development of responsible self-learning and continuous improvement" (sic). Thus, this work tries to be aligned with these perspectives. I. Characterization of the scientific methodology: search for bibliography and discussion of scientific works. II. Optimization of the methodology for the observation of leucocytes: blood samples donated by healthy people, anticoagulated with citrate or with EDTA (blood reservoir of the UNC (National University of Cordoba); n = 20). (a) Blood smear of whole blood. (b) Centrifugation at 200 g of plasma and aspirated leucocytes after erythrosedimentation, with resuspension of the cell pellet and cyto-dispersion; cytological and cytochemical techniques were applied. I. Deeper knowledge of the blood field was achieved; an atmosphere appropriate for scientific questioning was generated, and the activities involved in the process were carried out responsibly. II. Better results were achieved using EDTA for the observation and analysis of leucocytes. It was possible to attain the objectives both for an approach to scientific research and for a contribution to responsible development in the continuous learning process.

  4. Q-Sample Construction: A Critical Step for a Q-Methodological Study.

    PubMed

    Paige, Jane B; Morin, Karen H

    2016-01-01

    Q-sample construction is a critical step in Q-methodological studies. Prior to conducting a Q-study, researchers start with a population of opinion statements (the concourse) on a particular topic of interest, from which a sample is drawn. These sampled statements are known as the Q-sample. Although literature exists on the methodological processes for conducting Q-methodological studies, limited guidance exists on the practical steps for reducing the population of statements to a Q-sample. A case exemplar illustrates the steps to construct a Q-sample in preparation for a study that explored the perspectives nurse educators and nursing students hold about simulation design. Experts in simulation and Q-methodology evaluated the Q-sample for readability, clarity, and representativeness of the opinions contained within the concourse. The Q-sample was piloted, and feedback resulted in statement refinement. Researchers, especially those undertaking Q-method studies for the first time, may benefit from the practical considerations for constructing a Q-sample offered in this article. © The Author(s) 2014.
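    One common way to reduce a concourse to a Q-sample is structured sampling: group the statements by theme and draw equally from each theme so the sample stays representative. The sketch below is a hedged illustration only; the themes, statement texts and counts are invented (loosely echoing simulation-design topics), not taken from the article.

```python
import random

random.seed(7)

# Hypothetical concourse: opinion statements grouped into theme "cells".
concourse = {
    "objectives": [f"objectives statement {i}" for i in range(12)],
    "fidelity":   [f"fidelity statement {i}"   for i in range(9)],
    "debriefing": [f"debriefing statement {i}" for i in range(15)],
}

def draw_q_sample(cells, per_cell=4):
    # Structured sampling: draw the same number of statements from each
    # theme so every part of the concourse is represented in the Q-sample.
    return [s for theme in sorted(cells)
            for s in random.sample(cells[theme], per_cell)]

q_sample = draw_q_sample(concourse)   # 3 themes x 4 statements = 12
```

    In practice the drawn statements would then be reviewed by experts and piloted for readability and clarity, as the abstract describes.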

  5. Evaluation of ridesharing programs in Michigan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulp, G.; Tsao, H.J.; Webber, R.E.

    1982-10-01

    The design, implementation, and results of a carpool and vanpool evaluation are described. The objectives of the evaluation were: to develop credible estimates of the energy savings attributable to the ridesharing program, to provide information for improving the performance of the ridesharing program, and to add to a general understanding of the ridesharing process. Previous evaluation work is critiqued and the research methodology adopted for this study is discussed. The ridesharing program in Michigan is described and the basis for selecting Michigan as the evaluation site is discussed. The evaluation methodology is presented, including research design, sampling procedure, data collection, and data validation. Evaluation results are analyzed.

  6. Identification and Discrimination of Brands of Fuels by Gas Chromatography and Neural Networks Algorithm in Forensic Research

    PubMed Central

    Ugena, L.; Moncayo, S.; Manzoor, S.; Rosales, D.

    2016-01-01

    The detection of adulteration of fuels and its use in criminal scenes like arson has a high interest in forensic investigations. In this work, a method based on gas chromatography (GC) and neural networks (NN) has been developed and applied to the identification and discrimination of brands of fuels such as gasoline and diesel without the necessity to determine the composition of the samples. The study included five main brands of fuels from Spain, collected from fifteen different local petrol stations. The methodology allowed the identification of the gasoline and diesel brands with a high accuracy close to 100%, without any false positives or false negatives. A success rate of three blind samples was obtained as 73.3%, 80%, and 100%, respectively. The results obtained demonstrate the potential of this methodology to help in resolving criminal situations. PMID:27375919
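    The classification step works on chromatographic fingerprints (e.g. peak areas in retention-time windows) rather than full compositional analysis. As a hedged stand-in for the paper's neural network, the sketch below uses a much simpler nearest-centroid classifier on invented two-brand fingerprint data, just to show the fingerprint-matching idea; all names and numbers are hypothetical.

```python
import math

# Hypothetical GC fingerprints: peak areas in 3 retention-time windows,
# two training chromatograms per brand.
training = {
    "brand_A": [[0.90, 0.10, 0.30], [0.80, 0.20, 0.35]],
    "brand_B": [[0.20, 0.70, 0.50], [0.25, 0.65, 0.55]],
}

def centroid(vectors):
    # Mean fingerprint of a brand's training chromatograms.
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def classify(sample, training):
    # Assign the sample to the brand with the nearest mean fingerprint
    # (Euclidean distance in peak-area space).
    cents = {brand: centroid(vs) for brand, vs in training.items()}
    return min(cents, key=lambda b: math.dist(sample, cents[b]))

label = classify([0.85, 0.15, 0.30], training)
```

    A trained neural network, as used in the paper, learns a more flexible decision boundary over the same kind of fingerprint vectors.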

  7. Accurate EPR radiosensitivity calibration using small sample masses

    NASA Astrophysics Data System (ADS)

    Hayes, R. B.; Haskell, E. H.; Barrus, J. K.; Kenner, G. H.; Romanyukha, A. A.

    2000-03-01

    We demonstrate a procedure in retrospective EPR dosimetry which allows virtually nondestructive sample evaluation with respect to sample irradiation. For this procedure to work, it is shown that corrections must be made for cavity response characteristics when using variable-mass samples. Likewise, methods are employed to correct for empty-tube signals, sample anisotropy and frequency drift, while considering the effects of dose distribution optimization. The method's utility is demonstrated by comparing sample portions evaluated using both the described methodology and standard full-sample additive dose techniques. The samples used in this study are tooth enamel from teeth removed during routine dental care. We show that by making all the recommended corrections, very small masses can be both accurately measured and correlated with measurements of other samples. Some issues relating to dose distribution optimization are also addressed.

  8. Trust, Isolation, and Presence: The Virtual Work Environment and Acceptance of Deep Organizational Change

    ERIC Educational Resources Information Center

    Rose, Laurence Michael

    2013-01-01

    The primary focus of this research was to explore, through the use of a grounded theory methodology, whether the human perceptions of trust, isolation, and presence affected virtual workers' ability to accept deep organizational change. The study found that the virtual workers in the sample defined their acceptance of deep organizational change by…

  9. All Work and No Pay: Violations of Employment and Labor Laws in Chicago, Los Angeles and New York City

    ERIC Educational Resources Information Center

    Bernhardt, Annette; Spiller, Michael W.; Polson, Diana

    2013-01-01

    Despite three decades of scholarship on economic restructuring in the United States, employers' violations of minimum wage, overtime and other workplace laws remain understudied. This article begins to fill the gap by presenting evidence from a large-scale, original worker survey that draws on recent advances in sampling methodology to reach…

  10. The next Generation at Work--Business Students' Views, Values and Job Search Strategy: Implications for Universities and Employers

    ERIC Educational Resources Information Center

    Ng, Eddy S. W.; Burke, Ronald J.

    2006-01-01

    Purpose: The purpose of the paper is to explore the views, career expectations, and job search behaviours among a sample of business students. It also aims to examine the role of campus career services in shaping students' careers and how cooperative education influences their expectations and aspirations. Design/methodology/approach: A field…

  11. A methodology to address mixed AGN and starlight contributions in emission line galaxies found in the RESOLVE survey and ECO catalog

    NASA Astrophysics Data System (ADS)

    Richardson, Chris T.; Kannappan, Sheila; Bittner, Ashley; Isaac, Rohan; RESOLVE

    2017-01-01

    We present a novel methodology for modeling emission line galaxy samples that span the entire BPT diagram. Our methodology has several advantages over current modeling schemes: the free variables in the model are identical for both AGN and SF galaxies; these free variables are more closely linked to observable galaxy properties; and the ionizing spectra including an AGN and starlight are handled self-consistently rather than empirically. We show that our methodology is capable of fitting the vast majority of SDSS galaxies that fall within the traditional regions of galaxy classification on the BPT diagram. We also present current results for relaxing classification boundaries and extending our galaxies into the dwarf regime, using the REsolved Spectroscopy of a Local VolumE (RESOLVE) survey and the Environmental COntext (ECO) catalog, with special attention to compact blue E/S0s. We compare this methodology to PCA decomposition of the spectra. This work is supported by National Science Foundation awards AST-0955368 and CISE/ACI-1156614.

  12. [Qualitative research methodology in health care].

    PubMed

    Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara

    2017-03-01

    Health care research requires different methodological approaches, such as qualitative and quantitative analyses, to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are that the object of study is constituted by perceptions, emotions and beliefs; purposive (non-random) sampling; a circular process of knowledge construction; and methodological rigor throughout the research process, from the quality of the design to the consistency of the results. The objective of this work is to contribute to methodological knowledge about qualitative research in health services, based on the implementation of the study "The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals". The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspective of users and the health team. This study allowed the design of a model for the transition from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.

  13. To P or Not to P: Backing Bayesian Statistics.

    PubMed

    Buchinsky, Farrel J; Chadha, Neil K

    2017-12-01

    In biomedical research, it is imperative to differentiate chance variation from truth before we generalize what we see in a sample of subjects to the wider population. For decades, we have relied on null hypothesis significance testing, where we calculate P values for our data to decide whether to reject a null hypothesis. This methodology is subject to substantial misinterpretation and errant conclusions. Instead of working backward by calculating the probability of our data if the null hypothesis were true, Bayesian statistics allow us instead to work forward, calculating the probability of our hypothesis given the available data. This methodology gives us a mathematical means of incorporating our "prior probabilities" from previous study data (if any) to produce new "posterior probabilities." Bayesian statistics tell us how confidently we should believe what we believe. It is time to embrace and encourage their use in our otolaryngology research.
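    The prior-to-posterior update described above has a simple closed form in the conjugate beta-binomial case. The sketch below is a minimal illustration with invented numbers, not data from the article: a Beta(5, 5) prior (belief that a treatment works about half the time) updated by a hypothetical study with 18 successes in 20 trials.

```python
def beta_binomial_posterior(a, b, successes, failures):
    # Conjugate update: a Beta(a, b) prior combined with binomial data
    # yields a Beta(a + successes, b + failures) posterior.
    return a + successes, b + failures

def beta_mean(a, b):
    # Mean of a Beta(a, b) distribution.
    return a / (a + b)

a0, b0 = 5, 5                                   # prior: roughly 50% success
a1, b1 = beta_binomial_posterior(a0, b0, 18, 2)  # data: 18/20 successes
posterior_mean = beta_mean(a1, b1)               # (5+18)/(5+18+5+2) = 23/30
```

    The posterior mean (about 0.77) sits between the prior belief (0.5) and the raw data rate (0.9), which is exactly the "how confidently we should believe what we believe" calculus the abstract advocates.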

  14. A sustainable on-line CapLC method for quantifying antifouling agents like irgarol-1051 and diuron in water samples: Estimation of the carbon footprint.

    PubMed

    Pla-Tolós, J; Serra-Mora, P; Hakobyan, L; Molins-Legua, C; Moliner-Martinez, Y; Campins-Falcó, P

    2016-11-01

    In this work, in-tube solid phase microextraction (in-tube SPME) coupled to capillary LC (CapLC) with diode array detection is reported for the on-line extraction and enrichment of the booster biocides irgarol-1051 and diuron, included in the Water Framework Directive 2013/39/EU (WFD). The analytical performance has been successfully demonstrated. Furthermore, the environmental friendliness of the procedure has been quantified by calculating the carbon footprint of the analytical procedure and comparing it with those of other previously reported methodologies. Under the optimum conditions, the method presents good linearity over the assayed ranges, 0.05-10 μg/L for irgarol-1051 and 0.7-10 μg/L for diuron. The LODs were 0.015 μg/L and 0.2 μg/L for irgarol-1051 and diuron, respectively. Precision was also satisfactory (relative standard deviation, RSD < 3.5%). The proposed methodology was applied to monitor water samples, taking into account the EQS standards for these compounds. The carbon footprint values for the proposed procedure consolidate the operational efficiency (analytical and environmental performance) of in-tube SPME-CapLC-DAD in general, and in particular for determining irgarol-1051 and diuron in water samples. Copyright © 2016 Elsevier B.V. All rights reserved.
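    Detection limits like those quoted above are typically estimated from the calibration plot. The sketch below shows one standard approach (LOD = 3.3 times the residual standard deviation divided by the slope, in the ICH style); it is a hedged illustration with invented calibration numbers, and the paper does not state which LOD formula it actually used.

```python
def linfit(x, y):
    # Ordinary least squares for y = slope*x + intercept.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    return slope, my - slope * mx

def lod(x, y, k=3.3):
    # LOD estimate: k * (standard deviation of calibration residuals) / slope.
    slope, intercept = linfit(x, y)
    resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    sd = (sum(r * r for r in resid) / (len(x) - 2)) ** 0.5
    return k * sd / slope

# Hypothetical calibration: analyte concentration (ug/L) vs. peak area.
conc = [0.0, 1.0, 2.0, 4.0, 6.0, 8.0]
area = [1.02, 2.98, 5.01, 9.00, 13.02, 16.99]
slope, intercept = linfit(conc, area)
lod_value = lod(conc, area)
```

    With a nearly linear calibration the residual scatter, and hence the LOD, is small; poorer calibration linearity inflates the estimate, which is why linearity is validated alongside the detection limits.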

  15. Design and development of molecularly imprinted polymers for the selective extraction of deltamethrin in olive oil: An integrated computational-assisted approach.

    PubMed

    Martins, Nuno; Carreiro, Elisabete P; Locati, Abel; Ramalho, João P Prates; Cabrita, Maria João; Burke, Anthony J; Garcia, Raquel

    2015-08-28

    This work first addresses the design and development of molecularly imprinted systems selective for deltamethrin, aiming to provide a suitable sorbent for solid phase extraction (SPE) that is then used to implement an analytical methodology for the trace analysis of the target pesticide in spiked olive oil samples. To achieve this goal, a preliminary evaluation of the molecular recognition and selectivity of the molecularly imprinted polymers was performed. To investigate the mechanistic basis for selective template recognition in these polymeric matrices, a quantum chemical approach was used, providing new insights into the mechanisms underlying template recognition, and in particular the crucial roles of the crosslinking agent and the solvent used. DFT calculations corroborate the results obtained in experimental molecular recognition assays, enabling selection of the most suitable imprinting system for the MISPE extraction technique, which comprises acrylamide as functional monomer and ethylene glycol dimethacrylate as crosslinker. Furthermore, an analytical methodology comprising a sample preparation step based on solid phase extraction was implemented using this "tailor-made" imprinting system as sorbent for the selective isolation/pre-concentration of deltamethrin from olive oil samples. The molecularly imprinted solid phase extraction (MISPE) methodology was successfully applied to the clean-up of spiked olive oil samples, with recovery rates of up to 94%. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Analytical methodology for sampling and analysing eight siloxanes and trimethylsilanol in biogas from different wastewater treatment plants in Europe.

    PubMed

    Raich-Montiu, J; Ribas-Font, C; de Arespacochaga, N; Roig-Torres, E; Broto-Puig, F; Crest, M; Bouchy, L; Cortina, J L

    2014-02-17

    Siloxanes and trimethylsilanol belong to a family of organic silicone compounds that are currently used extensively in industry. Those that are prone to volatilisation become minor compounds in biogas, adversely affecting energetic applications. However, no standard analytical methodologies are available for analysing biogas-based gaseous matrices. To this end, different sampling techniques (adsorbent tubes, impingers and tedlar bags) were compared using two different configurations: sampling directly from the biogas source or from a 200 L tedlar bag filled with biogas and homogenised. No significant differences were apparent between the two sampling configurations. The adsorbent tubes performed better than the tedlar bags and impingers, particularly for quantifying low concentrations. A method for the speciation of silicon compounds in biogas was developed using gas chromatography coupled with mass spectrometry working in dual scan/single ion monitoring mode. The optimised conditions could separate and quantify eight siloxane compounds (L2, L3, L4, L5, D3, D4, D5 and D6) and trimethylsilanol within fourteen minutes. Biogas from five wastewater treatment plants located in Spain, France and England was sampled and analysed using the developed methodology. The siloxane concentrations in the biogas samples were influenced by the anaerobic digestion temperature, as well as the nature and composition of the sewage inlet. Siloxanes D4 and D5 were the most abundant, ranging in concentration from 1.5 to 10.1 and 10.8 to 124.0 mg Nm(-3), respectively, and exceeding the tolerance limit of most energy conversion systems. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Quantitative analysis of N-glycans from human alfa-acid-glycoprotein using stable isotope labeling and zwitterionic hydrophilic interaction capillary liquid chromatography electrospray mass spectrometry as tool for pancreatic disease diagnosis.

    PubMed

    Giménez, Estela; Balmaña, Meritxell; Figueras, Joan; Fort, Esther; de Bolós, Carme; Sanz-Nebot, Victòria; Peracaula, Rosa; Rizzi, Andreas

    2015-03-25

    In this work we demonstrate the potential of glycan reductive isotope labeling (GRIL) using [(12)C]- and [(13)C]-coded aniline and zwitterionic hydrophilic interaction capillary liquid chromatography electrospray mass spectrometry (μZIC-HILIC-ESI-MS) for the relative quantitation of glycosylation variants in selected glycoproteins present in samples from cancer patients. Human α1-acid-glycoprotein (hAGP) is an acute phase serum glycoprotein whose glycosylation has been described as altered in cancer and chronic inflammation. However, it is not yet clear whether particular glycans in hAGP can be used as biomarkers to differentiate between these two pathologies. In this work, hAGP was isolated by immunoaffinity chromatography (IAC) from serum samples of healthy individuals and of patients suffering from chronic pancreatitis and different stages of pancreatic cancer. After de-N-glycosylation, relative quantitation of the hAGP glycans was carried out using stable isotope labeling and μZIC-HILIC-ESI-MS analysis. First, protein denaturing conditions prior to PNGase F digestion were optimized to achieve quantitative digestion yields, and the reproducibility of the established methodology was evaluated with standard hAGP. Then, the proposed method was applied to the analysis of the clinical samples (control vs. pathological). Pancreatic cancer samples clearly showed an increase in the abundance of fucosylated glycans as the stage of the disease increased, unlike samples from chronic pancreatitis. The results indicate that these glycans in hAGP are candidate structures worth corroborating in an extended study including more clinical cases, especially those with chronic pancreatitis and initial stages of pancreatic cancer. Importantly, the results demonstrate that the presented methodology, combining enrichment of a target protein by IAC with isotope-coded relative quantitation of N-glycans, can be successfully used for targeted glycomics studies. The methodology is assumed to be suitable as well for other studies aimed at finding novel cancer-associated glycoprotein biomarkers. Copyright © 2015 Elsevier B.V. All rights reserved.
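    At the data level, the isotope-coded quantitation described above reduces to computing light/heavy peak-area ratios per glycan. A minimal sketch, not from the cited work; the glycan names and peak areas below are hypothetical:

```python
# Each glycan is measured as a [12C]-aniline (light, control) and a
# [13C]-aniline (heavy, patient) labeled pair; the peak-area ratio
# gives that glycan's fold change between the two samples.
def relative_abundance(light_area, heavy_area):
    """Heavy/light peak-area ratio for one isotope-coded glycan pair."""
    return heavy_area / light_area

# Hypothetical extracted-ion peak areas (light, heavy):
pairs = {"A2G2S2": (1.00e6, 1.05e6), "A2G2S2F": (4.0e5, 9.2e5)}
ratios = {g: relative_abundance(l, h) for g, (l, h) in pairs.items()}
print(ratios)
```

A ratio well above 1 for the fucosylated species (here the hypothetical "A2G2S2F") is the kind of signal the study associates with advancing pancreatic cancer.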

  18. [Methodological and operational notes for the assessment and management of the risk of work-related stress].

    PubMed

    De Ambrogi, Francesco; Ratti, Elisabetta Ceppi

    2011-01-01

    Today the Italian national debate over the Work-Related Stress Risk Assessment methodology is rather heated. Several methodological proposals and guidelines have been published in recent months, not least those by the "Commissione Consultiva". But despite this wide range of proposals, it appears that there is still a lack of attention to some of the basic methodological issues that must be taken into account in order to correctly implement the above-mentioned guidelines. The aim of this paper is to outline these methodological issues. In order to achieve this, the most authoritative methodological proposals and guidelines have been reviewed. The study focuses in particular on the methodological issues that could lead to important biases if not considered properly. The study leads to some considerations about the methodological validity of a Work-Related Stress Risk Assessment based exclusively on the literal interpretation of the considered proposals. Furthermore, the study provides some hints and working hypotheses on how to overcome these methodological limits. This study should be considered as a starting point for further investigations and debate on the Work-Related Stress Risk Assessment methodology on a national level.

  19. Widespread Trypanosoma cruzi infection in government working dogs along the Texas-Mexico border: Discordant serology, parasite genotyping and associated vectors

    PubMed Central

    Meyers, Alyssa C.; Meinders, Marvin

    2017-01-01

    Background Chagas disease, caused by the vector-borne protozoan Trypanosoma cruzi, is increasingly recognized in the southern U.S. Government-owned working dogs along the Texas-Mexico border could be at heightened risk due to prolonged exposure outdoors in habitats with high densities of vectors. We quantified working dog exposure to T. cruzi, characterized parasite strains, and analyzed associated triatomine vectors along the Texas-Mexico border. Methodology/Principal findings In 2015–2016, we sampled government working dogs in five management areas plus a training center in Texas and collected triatomine vectors from canine environments. Canine serum was tested for anti-T. cruzi antibodies with up to three serological tests, including two immunochromatographic assays (Stat-Pak and Trypanosoma Detect) and an indirect fluorescent antibody (IFA) test. The buffy coat fraction of blood and vector hindguts were tested for T. cruzi DNA, and the parasite discrete typing unit was determined. Overall seroprevalence was 7.4 and 18.9% (n = 528) in a conservative versus inclusive analysis, respectively, based on classifying weakly reactive samples as negative versus positive. Canines in two western management areas had 2.6–2.8 (95% CI: 1.0–6.8, p = 0.02–0.04) times greater odds of seropositivity compared to the training center. Parasite DNA was detected in three dogs (0.6%), including TcI and a TcI/TcIV mix. Nine of 20 (45%) T. gerstaeckeri and T. rubida were infected with TcI and TcIV; the insects analyzed for bloodmeals (n = 11) fed primarily on canines (54.5%). Conclusions/Significance Government working dogs have widespread exposure to T. cruzi across the Texas-Mexico border. Interpretation of sample serostatus was challenged by discordant results across testing platforms and very faint serological bands. 
In the absence of gold standard methodologies, epidemiological studies will benefit from presenting a range of results based on different tests/interpretation criteria to encompass uncertainty. Working dogs are highly trained in security functions and potential loss of duty from the clinical outcomes of infection could affect the work force and have broad consequences. PMID:28787451

  20. Work-family conflict and job burnout among correctional staff: a comment on Lambert and Hogan (2010).

    PubMed

    Smith, Kenneth J

    2011-02-01

    Lambert and Hogan (2010) examined the relations of work-family conflict, role stress, and other noted predictors with reported emotional exhaustion among a sample of 272 correctional staff at a maximum security prison. Using an ordinary least squares (OLS) regression model, the authors found work-on-family conflict, perceived dangerousness of the job, and role strain to have positive relations with emotional exhaustion. However, contrary to expectations, they found that custody officers reported lower exhaustion than did their noncustody staff counterparts. Suggestions are provided for follow-up efforts designed to extend this line of research and correct methodological issues.

  1. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow several operations to be integrated, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument and the isolation of analytes from the sample matrix. This work presents information on novel methodological and instrumental solutions relating to different variants of solid phase extraction techniques, namely solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis, as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Non-invasive determination of glucose directly in raw fruits using a continuous flow system based on microdialysis sampling and amperometric detection at an integrated enzymatic biosensor.

    PubMed

    Vargas, E; Ruiz, M A; Campuzano, S; Reviejo, A J; Pingarrón, J M

    2016-03-31

    A non-destructive, rapid and simple to use sensing method for direct determination of glucose in non-processed fruits is described. The strategy involved on-line microdialysis sampling coupled with a continuous flow system with amperometric detection at an enzymatic biosensor. Apart from direct determination of glucose in fruit juices and blended fruits, this work describes for the first time the successful application of an enzymatic biosensor-based electrochemical approach to the non-invasive determination of glucose in raw fruits. The methodology correlates, through previous calibration set-up, the amperometric signal generated from glucose in non-processed fruits with its content in % (w/w). The comparison of the obtained results using the proposed approach in different fruits with those provided by other method involving the same commercial biosensor as amperometric detector in stirred solutions pointed out that there were no significant differences. Moreover, in comparison with other available methodologies, this microdialysis-coupled continuous flow system amperometric biosensor-based procedure features straightforward sample preparation, low cost, reduced assay time (sampling rate of 7 h(-1)) and ease of automation. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Method for Determining the Coalbed Methane Content with Determination the Uncertainty of Measurements

    NASA Astrophysics Data System (ADS)

    Szlązak, Nikodem; Korzec, Marek

    2016-06-01

    Methane adversely affects safety in underground mines, as it is emitted into the air during mining works. Appropriate identification of the methane hazard is essential to determining methane hazard prevention methods, ventilation systems and methane drainage systems. The methane hazard is identified while roadways are driven and boreholes are drilled. Coalbed methane content is one of the parameters used to assess this threat, as required by the Decree of the Minister of Economy dated 28 June 2002 on work safety and hygiene, operation and special firefighting protection in underground mines. For this purpose a new method for determining coalbed methane content in underground coal mines has been developed. This method consists of two stages: collecting samples in a mine and testing the samples in the laboratory. The stage of determining the methane content of a coal sample in a laboratory is essential. This article presents the estimation of the measurement uncertainty of determining the methane content of a coal sample according to this methodology.
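    Measurement-uncertainty estimates of the kind described above typically end in a combined-uncertainty budget. An illustrative sketch of the GUM-style root-sum-of-squares combination of independent contributions; the component values and names below are hypothetical, not taken from the cited method:

```python
import math

def combined_uncertainty(components):
    """Combine independent standard-uncertainty contributions (each
    already scaled by its sensitivity coefficient) by root-sum-of-squares,
    as in the GUM uncertainty framework."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical contributions to a methane-content result (m3 CH4 / Mg coal):
u_desorbed = 0.12   # desorbed gas volume measurement
u_residual = 0.08   # residual gas released on crushing
u_mass = 0.05       # coal sample mass

u_c = combined_uncertainty([u_desorbed, u_residual, u_mass])
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2 (approx. 95 %)
print(f"combined: {u_c:.3f}, expanded: {U:.3f}")
```

The dominant term (here the hypothetical desorbed-gas contribution) identifies where the laboratory stage most repays refinement.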

  4. When Is the Story in the Subgroups? Strategies for Interpreting and Reporting Intervention Effects on Subgroups. MDRC Working Papers on Research Methodology

    ERIC Educational Resources Information Center

    Bloom, Howard S.; Michalopoulos, Charles

    2010-01-01

    This paper examines strategies for interpreting and reporting estimates of intervention effects for subgroups of a study sample. Specifically, the paper considers: why and how subgroup findings are important for applied research, the importance of pre-specifying sub- groups before analyses are conducted, the importance of using existing theory and…

  5. The Impact of Bundled High Performance Human Resource Practices on Intention to Leave: Mediating Role of Emotional Exhaustion

    ERIC Educational Resources Information Center

    Jyoti, Jeevan; Rani, Roomi; Gandotra, Rupali

    2015-01-01

    Purpose: The purpose of this paper is to examine the mediating effect of emotional exhaustion (EE) in between bundled high-performance human resource practices (HPHRPs) and intention to leave (ITL) in the education sector. Design/methodology/approach: A survey questionnaire method was used to collect data from a sample of 514 teachers working in…

  6. Disposable Screen Printed Electrochemical Sensors: Tools for Environmental Monitoring

    PubMed Central

    Hayat, Akhtar; Marty, Jean Louis

    2014-01-01

    Screen printing technology is a widely used technique for the fabrication of electrochemical sensors. This methodology is likely to underpin the progressive drive towards miniaturized, sensitive and portable devices, and has already established its route from “lab-to-market” for a plethora of sensors. The application of these sensors for analysis of environmental samples has been the major focus of research in this field. As a consequence, this work will focus on recent important advances in the design and fabrication of disposable screen printed sensors for the electrochemical detection of environmental contaminants. Special emphasis is given on sensor fabrication methodology, operating details and performance characteristics for environmental applications. PMID:24932865

  7. Unraveling wall conditioning effects on plasma facing components in NSTX-U with the Materials Analysis Particle Probe (MAPP)

    DOE PAGES

    Bedoya, F.; Allain, J. P.; Kaita, R.; ...

    2016-07-14

    A novel PFC diagnostic, the Materials Analysis Particle Probe (MAPP), has recently been commissioned in the National Spherical Torus Experiment Upgrade (NSTX-U). MAPP is currently monitoring the chemical evolution of the PFCs in the NSTX-U lower divertor at 107 cm from the tokamak axis on a day-to-day basis. In this work, we summarize the methodology that was adopted to obtain qualitative and quantitative descriptions of the samples' chemistry. Using this methodology, we were able to describe all the features in all our spectra to within a standard deviation of ±0.22 eV in position and ±248 s(-1) eV in area. Additionally, we provide an example of this methodology with data from boronized ATJ graphite exposed to NSTX-U plasmas.

  8. Can weekly noise levels of urban road traffic, as predominant noise source, estimate annual ones?

    PubMed

    Prieto Gajardo, Carlos; Barrigón Morillas, Juan Miguel; Rey Gozalo, Guillermo; Vílchez-Gómez, Rosendo

    2016-11-01

    The effects of noise pollution on human quality of life and health were recognised by the World Health Organisation a long time ago. A crucial dilemma in the study of urban noise is finding proven methodologies that allow, on the one hand, an increase in the quality of predictions and, on the other, savings of resources in spatial and temporal sampling. The temporal structure of urban noise is studied in this work from a different point of view. This methodology, based on Fourier analysis, is applied to several week-long measurements of urban noise, mainly from road traffic, carried out in two cities located on different continents and with different sociological life styles (Cáceres, Spain and Talca, Chile). Its capacity to predict annual noise levels from weekly measurements is studied. The relation between this methodology and the categorisation method is also analysed.
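    The Fourier-analysis idea above can be illustrated on synthetic data: the dominant spectral component of one week of hourly levels recovers the 24 h traffic cycle. A minimal sketch using synthetic data, not the cited measurements:

```python
import numpy as np

# One week of hourly noise levels (dB): a daily cycle plus random noise.
rng = np.random.default_rng(0)
hours = np.arange(7 * 24)
levels = 65 + 8 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)

# Magnitude spectrum of the mean-removed series; frequencies in cycles/hour.
spectrum = np.abs(np.fft.rfft(levels - levels.mean()))
freqs = np.fft.rfftfreq(hours.size, d=1.0)

# The dominant component should sit at the 24 h period.
dominant = 1.0 / freqs[spectrum.argmax()]
print(f"dominant period: {dominant:.1f} h")
```

A stable weekly spectral signature is what would let annual levels be extrapolated from a single week of measurement.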

  9. The National Criminal Justice Treatment Practices survey: Multilevel survey methods and procedures

    PubMed Central

    Taxman, Faye S.; Young, Douglas W.; Wiersema, Brian; Rhodes, Anne; Mitchell, Suzanne

    2007-01-01

    The National Criminal Justice Treatment Practices (NCJTP) survey provides a comprehensive inquiry into the nature of programs and services provided to adult and juvenile offenders involved in the justice system in the United States. The multilevel survey design covers topics such as the mission and goals of correctional and treatment programs; organizational climate and culture for providing services; organizational capacity and needs; opinions of administrators and staff regarding rehabilitation, punishment, and services provided to offenders; treatment policies and procedures; and working relationships between correctional and other agencies. The methodology generates national estimates of the availability of programs and services for offenders. This article details the methodology and sampling frame for the NCJTP survey, response rates, and survey procedures. Prevalence estimates of juvenile and adult offenders under correctional control are provided with externally validated comparisons to illustrate the veracity of the methodology. Limitations of the survey methods are also discussed. PMID:17383548

  10. Immediate drop on demand technology (I-DOT) coupled with mass spectrometry via an open port sampling interface.

    PubMed

    Van Berkel, Gary J; Kertesz, Vilmos; Boeltz, Harry

    2017-11-01

    The aim of this work was to demonstrate and evaluate the analytical performance of coupling immediate drop on demand technology to a mass spectrometer via the recently introduced open port sampling interface and ESI. Methodology & results: A maximum analysis throughput of one sample per 5 s was demonstrated. Signal reproducibility was 10% or better, as demonstrated by the quantitative analysis of propranolol and its stable isotope-labeled internal standard propranolol-d7. The ability of the system to multiply charge and analyze macromolecules was demonstrated using the protein cytochrome c. This immediate drop on demand technology/open port sampling interface/ESI-MS combination allowed for the quantitative analysis of relatively small mass analytes and was used for the identification of macromolecules like proteins.

  11. The affective shift model of work engagement.

    PubMed

    Bledow, Ronald; Schmitt, Antje; Frese, Michael; Kühnel, Jana

    2011-11-01

    On the basis of self-regulation theories, the authors develop an affective shift model of work engagement according to which work engagement emerges from the dynamic interplay of positive and negative affect. The affective shift model posits that negative affect is positively related to work engagement if negative affect is followed by positive affect. The authors applied experience sampling methodology to test the model. Data on affective events, mood, and work engagement was collected twice a day over 9 working days among 55 software developers. In support of the affective shift model, negative mood and negative events experienced in the morning of a working day were positively related to work engagement in the afternoon if positive mood in the time interval between morning and afternoon was high. Individual differences in positive affectivity moderated within-person relationships. The authors discuss how work engagement can be fostered through affect regulation. (c) 2011 APA, all rights reserved.

  12. Methodology discourses as boundary work in the construction of engineering education.

    PubMed

    Beddoes, Kacey

    2014-04-01

    Engineering education research is a new field that emerged in the social sciences over the past 10 years. This analysis of engineering education research demonstrates that methodology discourses have played a central role in the construction and development of the field of engineering education, and that they have done so primarily through boundary work. This article thus contributes to science and technology studies literature by examining the role of methodology discourses in an emerging social science field. I begin with an overview of engineering education research before situating the case within relevant bodies of literature on methodology discourses and boundary work. I then identify two methodology discourses--rigor and methodological diversity--and discuss how they contribute to the construction and development of engineering education research. The article concludes with a discussion of how the findings relate to prior research on methodology discourses and boundary work and implications for future research.

  13. Maybe Small Is Too Small a Term: Introduction to Advancing Small Sample Prevention Science.

    PubMed

    Fok, Carlotta Ching Ting; Henry, David; Allen, James

    2015-10-01

    Prevention research addressing health disparities often involves work with small population groups experiencing such disparities. The goals of this special section are to (1) address the question of what constitutes a small sample; (2) identify some of the key research design and analytic issues that arise in prevention research with small samples; (3) develop applied, problem-oriented, and methodologically innovative solutions to these design and analytic issues; and (4) evaluate the potential role of these innovative solutions in describing phenomena, testing theory, and evaluating interventions in prevention research. Through these efforts, we hope to promote broader application of these methodological innovations. We also seek whenever possible, to explore their implications in more general problems that appear in research with small samples but concern all areas of prevention research. This special section includes two sections. The first section aims to provide input for researchers at the design phase, while the second focuses on analysis. Each article describes an innovative solution to one or more challenges posed by the analysis of small samples, with special emphasis on testing for intervention effects in prevention research. A concluding article summarizes some of their broader implications, along with conclusions regarding future directions in research with small samples in prevention science. Finally, a commentary provides the perspective of the federal agencies that sponsored the conference that gave rise to this special section.

  14. Mood spillover and crossover among dual-earner couples: a cell phone event sampling study.

    PubMed

    Song, Zhaoli; Foo, Maw-Der; Uy, Marilyn A

    2008-03-01

    In this study, the authors examined affective experiences of dual-earner couples. More specifically, the authors explored how momentary moods can spill over between work and family and cross over from one spouse to another. Fifty couples used their cell phones to provide reports of their momentary moods over 8 consecutive days. Results show significant spillover and crossover effects for both positive and negative moods. Work orientation moderated negative mood spillover from work to home, and the presence of children in the family decreased negative mood crossover between spouses. Crossover was observed when spouses were physically together and when the time interval between the spouses' reports was short. With this study, the authors contribute to the work and family research by examining the nature of mood transfers among dual-earner couples, including the direction, valence, and moderators of these transfers across work and family domains. The authors also contribute to the event sampling methodology by introducing a new method of using cell phones to collect momentary data. Copyright 2008 APA

  15. Physical parameter determinations of young Ms. Taking advantage of the Virtual Observatory to compare methodologies

    NASA Astrophysics Data System (ADS)

    Bayo, A.; Rodrigo, C.; Barrado, D.; Allard, F.

    One of the very first steps astronomers working in stellar physics perform to advance in their studies is to determine the most common/relevant physical parameters of the objects of study (effective temperature, bolometric luminosity, surface gravity, etc.). Different methodologies exist depending on the nature of the data, the intrinsic properties of the objects, etc. One common approach is to compare the observational data with theoretical models passed through a simulator that leaves in the synthetic data the same imprint that the observational data carry, and to see which set of parameters reproduces the observations best. Even in this case, the methodology changes slightly depending on the kind of data the astronomer has. After parameters are published, the community tends to quote, praise and criticize them, sometimes paying little attention to whether possible discrepancies come from the theoretical models, the data themselves or just the methodology used in the analysis. In this work we perform the simple, yet interesting, exercise of comparing the effective temperatures obtained via SED and more detailed spectral fittings (to the same grid of models) for a sample of well known and characterized young M-type objects, members of different star forming regions, and show how differences in temperature of up to 350 K can be expected just from the difference in methodology/data used. On the other hand, we show how these differences are smaller for colder objects, even when the complexity of the fit increases, for example when introducing differential extinction. To perform this exercise we benefit greatly from the framework offered by the Virtual Observatory.

  16. Women's work and symptoms during midlife: Korean immigrant women.

    PubMed

    Im, E O; Meleis, A I

    2001-01-01

    To describe how Korean immigrant women characterize their work experiences within their daily lives and how they relate their work to the symptoms experienced during midlife. Cross-sectional study using methodological triangulation. Using a convenience sampling method, 119 Korean immigrant women were recruited for the quantitative phase, and 21 of the 119 women were recruited for the qualitative phase. Data were collected using both questionnaires and in-depth interviews. The data were analyzed using descriptive and inferential statistics and thematic analysis. FINDINGS AND DISCUSSION: The symptoms that the women experienced during midlife were influenced by their work experience, which was complicated by their cultural heritage, gender issues embedded in their daily lives, and the immigration transition. The complexities and diversities of women's work need to be incorporated in menopausal studies.

  17. Sequential determination of lead and cobalt in tap water and food samples by fluorescence.

    PubMed

    Talio, María Carolina; Alesso, Magdalena; Acosta, María Gimena; Acosta, Mariano; Fernández, Liliana P

    2014-09-01

    In this work, a new procedure was developed for the separation and preconcentration of lead(II) and cobalt(II) in several water and food samples. Complexes of the metal ions with 8-hydroxyquinoline (8-HQ) were formed in aqueous solution. The proposed methodology is based on the preconcentration/separation of Pb(II) by solid-phase extraction using paper filter, followed by spectrofluorimetric determination of both metals, on the solid support and in the filtered aqueous solution, respectively. The solid surface fluorescence determination was carried out at λem=455 nm (λex=385 nm) for the Pb(II)-8-HQ complex, and the fluorescence of Co(II)-8-HQ was determined in aqueous solution using λem=355 nm (λex=225 nm). The calibration graphs are linear in the ranges 0.14-8.03×10(4) μg L(-1) and 7.3×10(-2)-4.12×10(3) μg L(-1) for Pb(II) and Co(II), respectively, with detection limits of 4.3×10(-2) and 2.19×10(-2) μg L(-1) (S/N=3). The developed methodology showed good sensitivity and adequate selectivity, and it was successfully applied to the determination of trace amounts of lead and cobalt in tap water from different regions of Argentina and in food samples (milk powder, express coffee, cocoa powder) with satisfactory results. The new methodology was validated against electrothermal atomic absorption spectroscopy with adequate agreement. The proposed methodology represents a novel application of fluorescence to Pb(II) and Co(II) quantification with sensitivity and accuracy similar to atomic spectroscopies. Copyright © 2014 Elsevier B.V. All rights reserved.
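    The S/N = 3 convention quoted above for detection limits is commonly evaluated as LOD = 3·σ(blank)/slope of the calibration graph. An illustrative example; all calibration and blank values below are hypothetical, not the cited data:

```python
import numpy as np

# Hypothetical calibration: concentration (ug/L) vs. fluorescence signal.
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
signal = np.array([2.1, 12.3, 22.0, 41.8, 81.5, 161.2])

# Least-squares fit of the linear calibration graph.
slope, intercept = np.polyfit(conc, signal, 1)

# Standard deviation of replicate blank readings (hypothetical value).
blank_sd = 0.13

# Detection limit at S/N = 3; quantitation limit commonly at S/N = 10.
lod = 3 * blank_sd / slope
loq = 10 * blank_sd / slope
print(f"slope = {slope:.2f}, LOD = {lod:.4f} ug/L, LOQ = {loq:.4f} ug/L")
```

The same calculation applies to each analyte separately, which is why the record reports distinct LODs for Pb(II) and Co(II).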

  18. Identification of proteins in renaissance paintings by proteomics.

    PubMed

    Tokarski, Caroline; Martin, Elisabeth; Rolando, Christian; Cren-Olivé, Cécile

    2006-03-01

    The presented work proposes a new methodology based on proteomics techniques to identify proteins in old art paintings. The main challenging tasks of this work were (i) to find appropriate conditions for extracting proteins from the binding media without hydrolysis of the proteins into amino acids and (ii) to develop analytical methods adapted to the small sample quantity available. Starting from microsamples of painting models (based on ovalbumin, the major egg white protein, and on whole egg), multiple extraction solutions (HCl, HCOOH, NH3, NaOH) and conditions (ultrasonic bath, mortar and pestle, grinding resin) were evaluated. The best results were obtained using a commercial kit including a synthetic resin, mortar and pestle to grind the sample in an aqueous solution acidified with 1% trifluoroacetic acid, with additional multiple steps of ultrasonic baths. The resulting supernatant was analyzed by MALDI-TOF in linear mode to verify the efficiency of the extraction solution. An enzymatic hydrolysis step was also performed for protein identification; the peptide mixture was analyzed by nanoLC/nanoESI/Q-q-TOF MS/MS with a chromatographic run adapted to the low sample quantity. Finally, the developed methodology was successfully applied to Renaissance art painting microsamples of approximately 10 microg from Benedetto Bonfigli's triptych, The Virgin and Child, St. John the Baptist, St. Sebastian (XVth century), and Niccolo di Pietro Gerini's painting, The Virgin and Child (XIVth century), identifying, for the first time and without ambiguity, the presence of whole egg proteins (egg yolk and egg white) in a painting binder.

  19. Overview of qualitative research.

    PubMed

    Grossoehme, Daniel H

    2014-01-01

    Qualitative research methods are a robust tool for chaplaincy research questions. Like much of chaplaincy clinical care, qualitative research generally works with written texts, often transcriptions of individual interviews or focus group conversations, and seeks to understand the meaning of experience in a study sample. This article describes three common methodologies: ethnography, grounded theory, and phenomenology. Issues to consider relating to the study sample, design, and analysis are discussed. Enhancing the validity of the data, as well as reliability and ethical issues in qualitative research, are described. Qualitative research is an accessible way for chaplains to contribute new knowledge about the sacred dimension of people's lived experience.

  20. Life Prediction Methodologies for Aerospace Materials Annual Report, 2003

    DTIC Science & Technology

    2003-06-01

    peening parameters are obtained using a simplified model [Cao, et al.]. The solutions ultimately will need to be fine-tuned by simulating the...clamping stress and applied axial stress, identified from prior work [Hutson, et al.]. Accumulated damage on some samples was characterized using...defined using single fiber creep data [Wilson, et al.]. A two-level Mori-Tanaka model [Mori and Tanaka] has been used to define the effective

  1. Genetic Influences on Peer and Family Relationships Across Adolescent Development: Introduction to the Special Issue.

    PubMed

    Mullineaux, Paula Y; DiLalla, Lisabeth Fisher

    2015-07-01

    Nearly all aspects of human development are influenced by genetic and environmental factors, which conjointly shape development through several gene-environment interplay mechanisms. More recently, researchers have begun to examine the influence of genetic factors on peer and family relationships across the pre-adolescent and adolescent time periods. This article introduces the special issue by providing a critical overview of behavior genetic methodology and existing research demonstrating gene-environment processes operating on the link between peer and family relationships and adolescent adjustment. The overview is followed by a summary of new research studies, which use genetically informed samples to examine how peer and family environment work together with genetic factors to influence behavioral outcomes across adolescence. The studies in this special issue provide further evidence of gene-environment interplay through innovative behavior genetic methodological approaches across international samples. Results from the quantitative models indicate environmental moderation of genetic risk for coercive adolescent-parent relationships and deviant peer affiliation. The molecular genetics studies provide support for a gene-environment interaction differential susceptibility model for dopamine regulation genes across positive and negative peer and family environments. Overall, the findings from the studies in this special issue demonstrate the importance of considering how genes and environments work in concert to shape developmental outcomes during adolescence.

  2. Identifying promising accessions of cherry tomato: a sensory strategy using consumers and chefs.

    PubMed

    Rocha, Mariella C; Deliza, Rosires; Ares, Gastón; Freitas, Daniela De G C; Silva, Aline L S; Carmo, Margarida G F; Abboud, Antonio C S

    2013-06-01

    An increased production of cherry and gourmet tomato cultivars that are harvested at advanced colour stages and sold at a higher price has been observed in the last 10 years. In this context, producers need information on the sensory characteristics of new cultivars and their perception by potential consumers. The aim of the present work was to obtain a sensory characterisation of nine cherry tomato cultivars produced under Brazilian organic cultivation conditions from a chef and consumer perspective. Nine organic cherry tomato genotypes were evaluated by ten chefs using an open-ended question and by 110 consumers using a check-all-that-apply question. Both methodologies provided similar information on the sensory characteristics of the cherry tomato accessions. The superimposed representation of the samples in a multiple factor analysis was similar for consumers' and chefs' descriptions (RV coefficient 0.728), although they used different methodologies. According to both panels, cherry tomatoes were sorted into five groups of samples with similar sensory characteristics. Results from the present work may provide information to help organic producers in the selection of the most promising cultivars for cultivation, taking into account consumers' and chefs' perceptions, as well as in the design of communication and marketing strategies. © 2012 Society of Chemical Industry.
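    The agreement between the chefs' and consumers' configurations above is summarized by an RV coefficient (0.728). As a minimal sketch of how such a coefficient is computed between two configuration matrices describing the same samples, the following uses randomly generated, purely hypothetical panel data rather than the study's:

```python
import numpy as np

def rv_coefficient(X, Y):
    """RV coefficient between two column-centered configuration
    matrices X and Y whose rows describe the same samples."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Sx = X @ X.T  # n x n sample cross-product matrices
    Sy = Y @ Y.T
    num = np.trace(Sx @ Sy)
    den = np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))
    return num / den

# Hypothetical example: 9 tomato samples described by two panels
rng = np.random.default_rng(0)
chefs = rng.random((9, 4))  # e.g. 4 descriptive dimensions
consumers = chefs @ rng.random((4, 6)) + 0.05 * rng.random((9, 6))
print(round(rv_coefficient(chefs, consumers), 3))
```

    The coefficient ranges from 0 (unrelated configurations) to 1 (identical up to rotation and scaling), so 0.728 indicates substantial but not perfect agreement between the two panels.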

  3. [The methodology and sample description of the National Survey on Addiction Problems in Hungary 2015 (NSAPH 2015)].

    PubMed

    Paksi, Borbala; Demetrovics, Zsolt; Magi, Anna; Felvinczi, Katalin

    2017-06-01

    This paper introduces the methods and methodological findings of the National Survey on Addiction Problems in Hungary (NSAPH 2015). Use patterns of smoking, alcohol use and other psychoactive substances were measured, as well as those of certain behavioural addictions (problematic gambling - PGSI, DSM-5; eating disorders - SCOFF; problematic internet use - PIUQ; problematic online gaming - POGQ; problematic social media use - FAS; exercise addiction - EAI-HU; work addiction - BWAS; compulsive buying - CBS). The paper describes the applied measurement techniques, sample selection, recruitment of respondents and the data collection strategy. Methodological results of the survey, including reliability and validity of the measures, are reported. The NSAPH 2015 research was carried out on a nationally representative sample of the Hungarian adult population aged 16-64 years (gross sample 2477, net sample 2274 persons), with the age group of 18-34 being overrepresented. Statistical analysis of the weight distribution suggests that weighting did not create any artificial distortion in the database, leaving the representativeness of the sample unaffected. The size of the weighted sample of the 18-64-year-old adult population is 1490 persons. The theoretical margin of error in the weighted sample is ±2.5% at a 95% confidence level, which is in line with the original data collection plans. Based on the analysis of reliability and of errors beyond sampling, we conclude that inconsistencies create relatively minor distortions in cumulative prevalence rates; consequently, the database allows reliable estimation of risk factors related to different substance use behaviours. The reliability indexes of the measurements used for prevalence estimates of behavioural addictions proved to be appropriate, though the psychometric features in some cases suggest the presence of redundant items. The comparison of parameters of errors beyond sample selection in the current and previous data collections indicates that trend estimates and their interpretation require particular attention, and in some cases correction procedures may be necessary.

  4. The methodology of preparing the end faces of cylindrical waveguide of polydimethylsiloxane

    NASA Astrophysics Data System (ADS)

    Novak, M.; Nedoma, J.; Jargus, J.; Bednarek, L.; Cvejn, D.; Vasinek, V.

    2017-10-01

    Polydimethylsiloxane (PDMS) is of interest for its optical properties, and its composition allows use in hazardous environments. The authors of this article therefore focused on working with this material in more detail. The article describes a methodology for preparing the end faces of a cylindrical waveguide made of the polymer polydimethylsiloxane (PDMS) so as to minimize losses at joints. The first method of preparing the end faces is based on polishing the surface with sandpaper of three different grain sizes. The second method uses so-called heat smoothing, and the third levels the end faces with a new layer of polydimethylsiloxane. The outcome of the study is an evaluation of the quality of the end faces of the cylindrical waveguide based on the measured attenuation. A total of 140 samples was created for this experiment. The attenuation was determined from both sides of the created samples at three different wavelengths of the visible spectrum.

  5. Improving the implementation of marine monitoring in the northeast Atlantic.

    PubMed

    Turrell, W R

    2018-03-01

    Marine monitoring in the northeast Atlantic is delivered within identifiable monitoring themes, established over time and defined by the geographical area and policy drivers they serve, the sampling and assessment methodologies they use, their funding and governance structures, and the people or organisations involved in their implementation. Within a monitoring theme, the essential components for effective monitoring are governance, strategy and work plan, sampling protocols, quality assurance, and data and assessment structures. This simple framework is used to analyse two monitoring theme case studies: national ecosystem health monitoring and regional fish stock monitoring. Such essential-component analyses within marine monitoring themes can help improve monitoring implementation by identifying gaps and overlaps. Once monitoring themes are recognised, explicitly defined and streamlined, progress towards integrated monitoring may become easier, as the current lack of clarity in thematic marine monitoring implementation is one barrier to integration at both national and regional scales. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.

  6. GEOTHERMAL EFFLUENT SAMPLING WORKSHOP

    EPA Science Inventory

    This report outlines the major recommendations resulting from a workshop to identify gaps in existing geothermal effluent sampling methodologies, define needed research to fill those gaps, and recommend strategies to lead to a standardized sampling methodology.

  7. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    PubMed

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for determination of organic persistent pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also discussed in this work, even though these are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent-assisted extraction techniques are preferred for leaching of PBDEs, and liquid-phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies.

    PubMed

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-10-24

    In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have reached detection limits within the pM and nM ranges. Most of these developments proved their suitability for detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies still lags behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is highly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of ionic strength effects and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Inulin blend as prebiotic and fat replacer in dairy desserts: optimization by response surface methodology.

    PubMed

    Arcia, P L; Costell, E; Tárrega, A

    2011-05-01

    The purpose of this work was to optimize the formulation of a prebiotic dairy dessert with low fat content (<0.1 g/100g) using a mixture of short- and long-chain inulin. Response surface methodology was applied for the experimental design and data analysis. Nineteen formulations of dairy dessert were prepared, varying inulin concentration (3 to 9 g/100g), sucrose concentration (4 to 16 g/100g), and lemon flavor concentration (25 to 225 mg/kg). Sample acceptability, evaluated by 100 consumers, varied mainly with inulin and sucrose concentrations and, to a lesser extent, with lemon flavor content. An interaction effect between inulin and sucrose concentration was also found. According to the model obtained, the formulation with 5.5 g/100g inulin, 10 g/100g sucrose and 60 mg/kg lemon flavor was selected. Finally, this sample was compared sensorially with the regular-fat-content (2.8 g/100g) sample previously optimized in terms of lemon flavor (146 mg/kg) and sucrose (11.4 g/100g). No significant difference in acceptability was found between them, but the low-fat sample with inulin had a stronger lemon flavor and greater thickness and creaminess. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
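    The core of the response surface step described above is fitting a second-order polynomial (with interaction terms) to acceptability scores and locating its optimum. The following is a minimal sketch of that idea; all design points and scores are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical design points (inulin g/100g, sucrose g/100g) and
# mean acceptability scores; values are invented for illustration.
X_raw = np.array([[3, 4], [3, 16], [9, 4], [9, 16], [6, 10],
                  [3, 10], [9, 10], [6, 4], [6, 16]], dtype=float)
y = np.array([5.1, 6.0, 5.4, 5.8, 7.2, 6.1, 6.3, 5.9, 6.4])

def design_matrix(X):
    i, s = X[:, 0], X[:, 1]
    # full second-order model: 1, i, s, i*s, i^2, s^2
    return np.column_stack([np.ones_like(i), i, s, i * s, i**2, s**2])

# Least-squares fit of the response surface coefficients
beta, *_ = np.linalg.lstsq(design_matrix(X_raw), y, rcond=None)

# Evaluate the fitted surface on a grid and pick the maximum
ii, ss = np.meshgrid(np.linspace(3, 9, 61), np.linspace(4, 16, 121))
grid = np.column_stack([ii.ravel(), ss.ravel()])
pred = design_matrix(grid) @ beta
best = grid[np.argmax(pred)]
print(f"predicted optimum: inulin={best[0]:.1f}, sucrose={best[1]:.1f}")
```

    A third factor (lemon flavor) would simply add more columns to the design matrix; the study selected its final formulation from such a fitted model rather than from the raw scores alone.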

  10. Non-invasive identification of organic materials in historical stringed musical instruments by reflection infrared spectroscopy: a methodological approach.

    PubMed

    Invernizzi, Claudia; Daveri, Alessia; Vagnini, Manuela; Malagodi, Marco

    2017-05-01

    The analysis of historical musical instruments is becoming more relevant, and interest is increasingly moving toward non-invasive reflection FTIR spectroscopy, especially for the analysis of varnishes. In this work, a specific infrared reflectance spectral library of organic compounds was created with the aim of identifying musical instrument materials in a totally non-invasive way. The analyses were carried out on pure organic compounds, as bulk samples and on laboratory wooden models, to evaluate the diagnostic reflection mid-infrared (MIR) bands of proteins, polysaccharides, lipids, and resins by comparing reflection spectra before and after the Kramers-Kronig (KK) correction. This methodological approach was applied to real case studies: four Stradivari violins and a Neapolitan mandolin.

  11. Statistical analysis of radiation dose derived from ingestion of foods

    NASA Astrophysics Data System (ADS)

    Dougherty, Ward L.

    2001-09-01

    This analysis designed and implemented a methodology to determine an individual's probabilistic radiation dose from ingestion of foods, using the Crystal Ball software. A dietary intake model was chosen by comparing previously existing models. Two principal radionuclides were considered: lead-210 (Pb-210) and radium-226 (Ra-226). Samples from three different local grocery stores (Publix, Winn Dixie, and Albertsons) were counted on a gamma spectroscopy system with a GeLi detector. The same food samples were considered as those in the original FIPR database. A statistical analysis, utilizing the Crystal Ball program, was performed on the data to determine the most accurate distribution for these data. This allowed a radiation dose to an individual to be determined from the information collected. Based on the analyses performed, the radiation dose from radium-226 was lower for grocery store samples than for the FIPR debris analyses, 2.7 vs. 5.91 mrem/yr. Lead-210 gave a higher dose in the grocery store samples than in the FIPR debris analyses, 21.4 vs. 518 mrem/yr. The output radiation dose was higher in all evaluations when accurate estimates of the input distributions were used. The radium-226 radiation dose for FIPR and grocery data rose to 9.56 and 4.38 mrem/yr, respectively. The radiation dose from ingestion of Pb-210 rose to 34.7 and 854 mrem/yr for FIPR and grocery data, respectively. The lead-210 doses were higher than the initial doses for several reasons: a different peak was examined, the lower edge of the detection limit was used, and the minimum detectable concentration was considered. FIPR did not use grocery samples as a control because the calculated radiation doses appeared unreasonably high. Treating the initial values as distributions allowed re-evaluation of the radiation dose and showed a significant difference from the original deterministic values. This work shows the value and importance of considering distributions to ensure that a person's radiation dose is accurately calculated. The probabilistic dose methodology proved to be a more accurate and realistic method of radiation dose determination, and it provides a visual presentation of the dose distribution that can be a vital aid in risk methodology.
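    The probabilistic approach contrasted above with deterministic point estimates can be sketched with plain Monte Carlo sampling (here with numpy in place of Crystal Ball). The input distributions and the dose conversion factor below are illustrative assumptions only, not FIPR or grocery-survey values:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo trials

# Hypothetical input distributions (illustrative, not study values):
conc = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=N)  # activity conc., pCi/kg
intake = rng.normal(200.0, 30.0, size=N).clip(min=0)       # annual intake, kg/yr
dcf = 7.3e-3                                               # assumed dose factor, mrem/pCi

# Dose per trial: concentration x intake x dose conversion factor
dose = conc * intake * dcf  # mrem/yr

print(f"mean dose: {dose.mean():.2f} mrem/yr")
print(f"95th percentile: {np.percentile(dose, 95):.2f} mrem/yr")
```

    The point of the distributional treatment is visible in the output: the 95th percentile exceeds the mean because the dose distribution is right-skewed, information a single deterministic value cannot convey.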

  12. A portable cryo-plunger for on-site intact cryogenic microscopy sample preparation in natural environments.

    PubMed

    Comolli, Luis R; Duarte, Robert; Baum, Dennis; Luef, Birgit; Downing, Kenneth H; Larson, David M; Csencsits, Roseann; Banfield, Jillian F

    2012-06-01

    We present a modern, lightweight, portable device specifically designed for preparing environmental samples for cryogenic transmission electron microscopy (cryo-TEM) by on-site cryo-plunging. The power of cryo-TEM comes from the preparation of artifact-free samples. However, in many studies the samples must be collected at remote field locations, and the time involved in transporting them back to the laboratory for cryogenic preservation can lead to severe degradation artifacts. Thus, going back to basics, we developed a simple mechanical device that is light and easy to transport on foot, yet effective. With the system design presented here we are able to obtain, in real time, cryo-samples of microbes and microbial communities that cannot be cultured, in their near-intact environmental conditions, as well as in routine laboratory work. This methodology thus enables us to bring the power of cryo-TEM to microbial ecology. Copyright © 2011 Wiley Periodicals, Inc.

  13. Application of multiwalled carbon nanotubes as sorbents for the extraction of mycotoxins in water samples and infant milk formula prior to high performance liquid chromatography mass spectrometry analysis.

    PubMed

    Socas-Rodríguez, Bárbara; González-Sálamo, Javier; Hernández-Borges, Javier; Rodríguez Delgado, Miguel Ángel

    2016-05-01

    In this work, a simple and environmentally friendly methodology has been developed for the analysis of a group of six mycotoxins with estrogenic activity produced by Fusarium species (i.e. zearalanone, zearalenone, α-zearalanol, β-zearalanol, α-zearalenol, and β-zearalenol), using micro-dispersive solid-phase extraction (µ-dSPE) with multiwalled carbon nanotubes as sorbent. Separation, determination, and quantification were achieved by HPLC coupled to ion trap MS with an ESI interface. Parameters affecting the extraction efficiency of µ-dSPE, such as pH of the sample, amount of multiwalled carbon nanotubes, and type and volume of elution solvent, were studied and optimized. The methodology was validated for mineral, pond, and wastewater as well as for powdered infant milk using 17β-estradiol-2,4,16,16,17-d5 (17β-E2-d5) as internal standard, obtaining recoveries ranging from 85 to 120% for the three types of water samples and from 77 to 115% for powdered infant milk. RSD values were lower than 10%. The LOQs achieved were in the range 0.05-2.90 μg/L for water samples and 2.02-31.9 μg/L for powdered infant milk samples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A framework for measurement and harmonization of pediatric multiple sclerosis etiologic research studies: The Pediatric MS Tool-Kit.

    PubMed

    Magalhaes, Sandra; Banwell, Brenda; Bar-Or, Amit; Fortier, Isabel; Hanwell, Heather E; Lim, Ming; Matt, Georg E; Neuteboom, Rinze F; O'Riordan, David L; Schneider, Paul K; Pugliatti, Maura; Shatenstein, Bryna; Tansey, Catherine M; Wassmer, Evangeline; Wolfson, Christina

    2018-06-01

    While studying the etiology of multiple sclerosis (MS) in children has several methodological advantages over studying etiology in adults, studies are limited by small sample sizes. Using a rigorous methodological process, we developed the Pediatric MS Tool-Kit, a measurement framework that includes a minimal set of core variables to assess etiological risk factors. We solicited input from the International Pediatric MS Study Group to select three risk factors: environmental tobacco smoke (ETS) exposure, sun exposure, and vitamin D intake. To develop the Tool-Kit, we used a Delphi study involving a working group of epidemiologists, neurologists, and content experts from North America and Europe. The Tool-Kit includes six core variables to measure ETS, six to measure sun exposure, and six to measure vitamin D intake. The Tool-Kit can be accessed online ( www.maelstrom-research.org/mica/network/tool-kit ). The goals of the Tool-Kit are to enhance exposure measurement in newly designed pediatric MS studies and comparability of results across studies, and in the longer term to facilitate harmonization of studies, a methodological approach that can be used to circumvent issues of small sample sizes. We believe the Tool-Kit will prove to be a valuable resource to guide pediatric MS researchers in developing study-specific questionnaires.

  15. Working memory training in older adults: Bayesian evidence supporting the absence of transfer.

    PubMed

    Guye, Sabrina; von Bastian, Claudia C

    2017-12-01

    The question of whether working memory training leads to generalized improvements in untrained cognitive abilities is a longstanding and heatedly debated one. Previous research provides mostly ambiguous evidence regarding the presence or absence of transfer effects in older adults. Thus, to draw decisive conclusions regarding the effectiveness of working memory training interventions, methodologically sound studies with larger sample sizes are needed. In this study, we investigated whether or not a computer-based working memory training intervention induced near and far transfer in a large sample of 142 healthy older adults (65 to 80 years). To this end, we randomly assigned participants to either the experimental group, which completed 25 sessions of adaptive, process-based working memory training, or to the active, adaptive visual search control group. Bayesian linear mixed-effects models were used to estimate performance improvements at the level of abilities, using multiple indicator tasks for near transfer (working memory) and far transfer (fluid intelligence, shifting, and inhibition). Our data provided consistent evidence supporting the absence of near transfer to untrained working memory tasks and the absence of far transfer effects to all of the assessed abilities. Our results suggest that working memory training is not an effective way to improve general cognitive functioning in old age. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  16. Assessment of Microphysical Models in the National Combustion Code (NCC) for Aircraft Particulate Emissions: Particle Loss in Sampling Lines

    NASA Technical Reports Server (NTRS)

    Wey, Thomas; Liu, Nan-Suey

    2008-01-01

    This paper first describes the fluid network approach recently implemented in the National Combustion Code (NCC) for simulating the transport of aerosols (volatile particles and soot) in particulate sampling systems. This network-based approach complements the two approaches already in the NCC, namely, the lower-order temporal approach and the CFD-based approach. The accuracy and computational costs of these three approaches are then investigated in terms of their application to the prediction of particle losses through sample transmission and distribution lines. Their predictive capabilities are assessed by comparing the computed results with experimental data. The present work will help establish standard methodologies for measuring the size and concentration of particles in high-temperature, high-velocity jet engine exhaust. It also represents the first step of a long-term effort to validate physics-based tools for the prediction of aircraft particulate emissions.

  17. Assessing cross-cultural validity of scales: a methodological review and illustrative example.

    PubMed

    Beckstead, Jason W; Yang, Chiu-Yueh; Lengacher, Cecile A

    2008-01-01

    In this article, we assessed the cross-cultural validity of the Women's Role Strain Inventory (WRSI), a multi-item instrument that assesses the degree of strain experienced by women who juggle the roles of working professional, student, wife and mother. Cross-cultural validity is evinced by demonstrating the measurement invariance of the WRSI. Measurement invariance is the extent to which items of multi-item scales function in the same way across different samples of respondents. We assessed measurement invariance by comparing a sample of working women in Taiwan with a similar sample from the United States. Structural equation models (SEMs) were employed to determine the invariance of the WRSI and to estimate the unique validity variance of its items. This article also provides nurse-researchers with the necessary underlying measurement theory and illustrates how SEMs may be applied to assess cross-cultural validity of instruments used in nursing research. Overall performance of the WRSI was acceptable but our analysis showed that some items did not display invariance properties across samples. Item analysis is presented and recommendations for improving the instrument are discussed.

  18. Measuring Substance Use and Misuse via Survey Research: Unfinished Business.

    PubMed

    Johnson, Timothy P

    2015-01-01

    This article reviews unfinished business regarding the assessment of substance use behaviors using survey research methodologies, a practice that dates back to the earliest years of this journal's publication. Six classes of unfinished business are considered, covering errors of sampling, coverage, non-response, measurement, and processing, as well as ethics. It may be that there is more now that we do not know than when this work began some 50 years ago.

  19. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    NASA Astrophysics Data System (ADS)

    Guerra, J. G.; Rubiano, J. G.; Winter, G.; Guerra, A. G.; Alonso, H.; Arnedo, M. A.; Tejera, A.; Martel, P.; Bolivar, J. P.

    2017-06-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing, in parallel, a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm searches the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology generalizes a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective rather than mono-objective optimization, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli or Petri beakers, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs.
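    The fitness evaluation at the core of such an approach, one objective per reference sample geometry, might be sketched as follows. The `fake_simulate` function is a hypothetical stand-in for the actual Monte Carlo transport code, and all efficiency values are invented for illustration:

```python
import numpy as np

def fepe_discrepancy(simulated, reference):
    """Sum of squared relative differences between simulated and
    reference full-energy-peak efficiencies over the energy grid."""
    simulated = np.asarray(simulated, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.sum(((simulated - reference) / reference) ** 2))

def objectives(params, simulate, refs):
    """Two objectives, one per reference sample geometry.
    `simulate(params, geometry)` stands in for the Monte Carlo code."""
    return [fepe_discrepancy(simulate(params, g), ref)
            for g, ref in refs.items()]

# Hypothetical stand-in for the MC code: efficiency scales linearly
# with an 'active volume' parameter (purely illustrative physics).
def fake_simulate(params, geometry):
    base = {"small_beaker": 0.05, "marinelli": 0.02}[geometry]
    return base * params["active_volume"] * np.ones(5)

refs = {"small_beaker": 0.05 * np.ones(5),
        "marinelli": 0.02 * np.ones(5)}
print(objectives({"active_volume": 1.0}, fake_simulate, refs))  # → [0.0, 0.0]
```

    A multi-objective evolutionary algorithm would evolve `params` to drive both discrepancies toward zero simultaneously, rather than collapsing them into a single weighted sum.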

  20. Winter Crop Mapping for Improving Crop Production Estimates in Argentina Using Moderate Resolution Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Humber, M. L.; Copati, E.; Sanchez, A.; Sahajpal, R.; Puricelli, E.; Becker-Reshef, I.

    2017-12-01

    Accurate crop production data are fundamental for reducing uncertainty and volatility in the domestic and international agricultural markets. The Agricultural Estimates Department of the Buenos Aires Grain Exchange has worked since 2000 on the estimation of crop production data. With this information, the Grain Exchange supports the day-to-day decision making of different actors in the agricultural chain, such as producers, traders, seed companies, market analysts and policy makers. Since the 2015/16 season, the Grain Exchange has worked on the development of a new earth-observation-based method to identify winter crop planted area at a regional scale, with the aim of improving crop production estimates. The objective of this new methodology is to create a reliable winter crop mask at moderate spatial resolution using Landsat-8 imagery, by exploiting bi-temporal differences in the phenological stages of winter crops as compared to other land cover types. In collaboration with the University of Maryland, the map has been validated by photointerpretation of a stratified, statistically random sample of independent ground truth data in the four largest producing provinces of Argentina: Buenos Aires, Cordoba, La Pampa, and Santa Fe. In situ measurements were also used to further investigate conditions in the Buenos Aires province. Preliminary results indicate that while there are some avenues for improvement, overall the classification accuracy of the cropland and non-cropland classes is sufficient to improve downstream production estimates. Continuing research will focus on improving the methodology for yearly winter crop mapping exercises, as well as on improving the sampling methodology to optimize collection of validation data in the future.

  1. Rat sperm motility analysis: methodologic considerations

    EPA Science Inventory

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

  2. Optimal designs for population pharmacokinetic studies of the partner drugs co-administered with artemisinin derivatives in patients with uncomplicated falciparum malaria.

    PubMed

    Jamsen, Kris M; Duffull, Stephen B; Tarning, Joel; Lindegardh, Niklas; White, Nicholas J; Simpson, Julie A

    2012-07-11

    Artemisinin-based combination therapy (ACT) is currently recommended as first-line treatment for uncomplicated malaria but, of concern, the effectiveness of the main artemisinin derivative, artesunate, has been diminished due to parasite resistance. This reduction in effect highlights the importance of the partner drugs in ACT and provides motivation to gain more knowledge of their pharmacokinetic (PK) properties via population PK studies. Optimal design methodology has been developed for population PK studies; it analytically determines a sampling schedule that is clinically feasible and yields precise estimation of model parameters. In this work, optimal design methodology was used to determine sampling designs for typical future population PK studies of the partner drugs (mefloquine, lumefantrine, piperaquine and amodiaquine) co-administered with artemisinin derivatives. The optimal designs were determined using freely available software and were based on structural PK models from the literature and the key specifications of 100 patients with five samples per patient, one of which was taken on the seventh day of treatment. The derived optimal designs were then evaluated via a simulation-estimation procedure. For all partner drugs, designs consisting of two sampling schedules (50 patients per schedule) with five samples per patient resulted in acceptable precision of the model parameter estimates. The sampling schedules proposed in this paper should be considered in future population pharmacokinetic studies where intensive sampling over many days or weeks of follow-up is not possible for ethical, logistical or economic reasons.
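    The core of optimal design is ranking candidate sampling schedules by an information criterion such as the determinant of the Fisher information matrix (D-optimality). The sketch below uses a generic one-compartment model with first-order absorption and invented parameter values and schedules; it is not the model or software of the study above:

    ```python
    import numpy as np

    def conc(t, ka, ke, v, dose=100.0):
        """One-compartment model with first-order absorption (illustrative only)."""
        return dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

    def fim_det(times, theta=(1.0, 0.1, 50.0)):
        """D-criterion det(J^T J) for a schedule, with J the finite-difference
        sensitivity of the model to the parameters (ka, ke, V)."""
        t = np.asarray(times, dtype=float)
        theta = np.asarray(theta, dtype=float)
        J = np.empty((t.size, theta.size))
        for j in range(theta.size):
            h = 1e-6 * theta[j]
            up, dn = theta.copy(), theta.copy()
            up[j] += h
            dn[j] -= h
            J[:, j] = (conc(t, *up) - conc(t, *dn)) / (2 * h)
        return np.linalg.det(J.T @ J)

    # A spread-out schedule observes both the absorption and elimination phases,
    # so it carries more information than a clustered one.
    spread = [0.5, 2.0, 6.0, 12.0, 24.0]
    clustered = [1.0, 1.1, 1.2, 1.3, 1.4]
    ```

    Dedicated tools search over feasible schedules to maximize this determinant; the sketch only shows why clustered samples are poorly informative.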

  3. Assessing importance and satisfaction judgments of intermodal work commuters with electronic survey methodology.

    DOT National Transportation Integrated Search

    2013-09-01

    Recent advances in multivariate methodology provide an opportunity to further the assessment of service offerings in public transportation for work commuting. We offer methodologies that are alternative to direct rating scale and have advantages in t...

  4. Weld defect identification in friction stir welding using power spectral density

    NASA Astrophysics Data System (ADS)

    Das, Bipul; Pal, Sukhomay; Bag, Swarup

    2018-04-01

    Power spectral density estimates are a powerful means of extracting the useful information retained in a signal. In the current research work, the classical periodogram and Welch periodogram algorithms are used to estimate the power spectral density of the vertical and transverse force signals acquired during the friction stir welding process. The estimated spectral densities reveal notable insight into the identification of defects in friction stir welded samples. It was observed that a higher spectral density in each process signal is a key indication of the presence of possible internal defects in the welded samples. The developed methodology can offer preliminary information regarding the presence of internal defects in friction stir welded samples and can serve as a first level of safeguard in monitoring the friction stir welding process.
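    The Welch method mentioned above averages periodograms of overlapping windowed segments to reduce variance. A minimal NumPy sketch, applied to a synthetic force-like signal (the welding signals and sampling rate here are invented stand-ins):

    ```python
    import numpy as np

    def welch_psd(x, fs, nperseg=256, noverlap=128):
        """Welch PSD estimate: average modified periodograms of overlapping,
        Hann-windowed segments."""
        window = np.hanning(nperseg)
        scale = 1.0 / (fs * (window ** 2).sum())
        step = nperseg - noverlap
        n_segments = (len(x) - noverlap) // step
        psd = np.zeros(nperseg // 2 + 1)
        for i in range(n_segments):
            seg = x[i * step : i * step + nperseg]
            seg = (seg - seg.mean()) * window        # detrend and window
            spec = np.abs(np.fft.rfft(seg)) ** 2 * scale
            spec[1:-1] *= 2                          # one-sided spectrum
            psd += spec
        psd /= n_segments
        freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
        return freqs, psd

    # Synthetic "force signal": a 50 Hz component buried in noise.
    fs = 1000.0
    t = np.arange(0, 4.0, 1.0 / fs)
    rng = np.random.default_rng(0)
    force = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(t.size)
    freqs, psd = welch_psd(force, fs)
    peak_freq = freqs[np.argmax(psd)]
    ```

    In practice one would compare the spectral density level of welds against a defect-free baseline, which is the comparison the abstract describes.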

  5. Determination of insoluble soap in agricultural soil and sewage sludge samples by liquid chromatography with ultraviolet detection.

    PubMed

    Cantarero, Samuel; Zafra-Gómez, Alberto; Ballesteros, Oscar; Navalón, Alberto; Vílchez, José L; Crovetto, Guillermo; Verge, Coral; de Ferrer, Juan A

    2010-11-01

    We have developed a new analytical procedure for determining insoluble Ca and Mg fatty acid salts (soaps) in agricultural soil and sewage sludge samples. The number of analytical methodologies that focus on the determination of insoluble soap salts in different environmental compartments is very limited. In this work, we propose a methodology that involves a sample clean-up step with petroleum ether to remove soluble salts and a conversion of Ca and Mg insoluble salts into soluble potassium salts using tripotassium ethylenediaminetetraacetate salt and potassium carbonate, followed by the extraction of the analytes from the samples using microwave-assisted extraction with methanol. An improved esterification procedure using 2,4-dibromoacetophenone before the liquid chromatography with ultraviolet detection analysis has also been developed. The absence of a matrix effect was demonstrated with two fatty acid Ca salts that are not commercial and are never detected in natural samples (C₁₃:₀ and C₁₇:₀); it was possible to evaluate the matrix effect with these standards because both have similar environmental behavior (adsorption and precipitation) to commercial soaps (C₁₀:₀ to C₁₈:₀). We also studied the effect of the different variables on the clean-up, the conversion of Ca soap, and the extraction and derivatization procedures. The quantification limits found ranged from 0.4 to 0.8 mg/kg. The proposed method was satisfactorily applied in a study of soap behavior in agricultural soil and sewage sludge samples. © 2010 SETAC.

  6. 77 FR 15092 - U.S. Energy Information Administration; Proposed Agency Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-14

    ... conducted under this clearance will generally be methodological studies of 500 cases or less. The samples... conducted under this clearance will generally be methodological studies of 500 cases or less, but will... the methodological design, sampling procedures (where possible) and questionnaires of the full scale...

  7. Investigation of relationship between quality of working life and organizational commitment of nurses in teaching hospitals in Tabriz in 2014

    PubMed Central

    Ghoddoosi-Nejad, D; Baghban Baghestan, E; Janati, A; Imani, A; Mansoorizadeh, Z

    2015-01-01

    The current research aimed to investigate the link between the quality of working life and the organizational commitment of nurses in the teaching hospitals in Tabriz. The study was applied in purpose and used stratified sampling with proportional allocation. The study population consisted of all the nurses in Tabriz. The instrument used in this study was a standard questionnaire whose reliability had been approved in national and international studies. Data were collected and entered into SPSS 20 software, and a statistical analysis was performed. The results showed that the individuals’ quality of working life had a direct effect on their performance in the organization. PMID:28316742

  8. An optimized methodology for whole genome sequencing of RNA respiratory viruses from nasopharyngeal aspirates.

    PubMed

    Goya, Stephanie; Valinotto, Laura E; Tittarelli, Estefania; Rojo, Gabriel L; Nabaes Jodar, Mercedes S; Greninger, Alexander L; Zaiat, Jonathan J; Marti, Marcelo A; Mistchenko, Alicia S; Viegas, Mariana

    2018-01-01

    Over the last decade, the number of viral genome sequences deposited in available databases has grown exponentially. However, sequencing methodologies vary widely, and many published works have relied on viral enrichment by viral culture or nucleic acid amplification with specific primers rather than on unbiased techniques such as metagenomics. The genome of RNA viruses is highly variable, and these enrichment methodologies may be difficult to apply or may bias the results. In order to obtain genomic sequences of human respiratory syncytial virus (HRSV) from positive nasopharyngeal aspirates, diverse methodologies were evaluated and compared. A total of 29 nearly complete and complete viral genomes were obtained. The best performance was achieved with a DNase I treatment of the RNA directly extracted from the nasopharyngeal aspirate (NPA), sequence-independent single-primer amplification (SISPA), and library preparation with the Nextera XT DNA Library Prep Kit with manual normalization. An average of 633,789 and 1,674,845 filtered reads per library were obtained with the MiSeq and NextSeq 500 platforms, respectively. The higher output of the NextSeq 500 was accompanied by an increase in the percentage of duplicated reads generated during SISPA (from an average of 1.5% duplicated viral reads on the MiSeq to an average of 74% on the NextSeq 500). HRSV genome recovery was not affected by the presence or absence of duplicated reads, but the computational demand during the analysis was increased. Considering that only samples with viral load ≥ E+06 copies/ml NPA were tested, no correlation was observed between sample viral loads and the number of total filtered reads, nor with the mapped viral reads. The HRSV genomes showed a mean coverage of 98.46% with the best methodology. In addition, genomes of human metapneumovirus (HMPV), human rhinovirus (HRV) and human parainfluenza virus types 1-3 (HPIV1-3) were also obtained with the selected optimal methodology.

  9. Guidelines for reporting methodological challenges and evaluating potential bias in dementia research

    PubMed Central

    Weuve, Jennifer; Proust-Lima, Cécile; Power, Melinda C.; Gross, Alden L.; Hofer, Scott M.; Thiébaut, Rodolphe; Chêne, Geneviève; Glymour, M. Maria; Dufouil, Carole

    2015-01-01

    Clinical and population research on dementia and related neurologic conditions, including Alzheimer’s disease, faces several unique methodological challenges. Progress to identify preventive and therapeutic strategies rests on valid and rigorous analytic approaches, but the research literature reflects little consensus on “best practices.” We present findings from a large scientific working group on research methods for clinical and population studies of dementia, which identified five categories of methodological challenges as follows: (1) attrition/sample selection, including selective survival; (2) measurement, including uncertainty in diagnostic criteria, measurement error in neuropsychological assessments, and practice or retest effects; (3) specification of longitudinal models when participants are followed for months, years, or even decades; (4) time-varying measurements; and (5) high-dimensional data. We explain why each challenge is important in dementia research and how it could compromise the translation of research findings into effective prevention or care strategies. We advance a checklist of potential sources of bias that should be routinely addressed when reporting dementia research. PMID:26397878

  10. Fast methodology of analysing major steviol glycosides from Stevia rebaudiana leaves.

    PubMed

    Lorenzo, Cándida; Serrano-Díaz, Jéssica; Plaza, Miguel; Quintanilla, Carmen; Alonso, Gonzalo L

    2014-08-15

    The aim of this work is to propose an HPLC method for analysing major steviol glycosides as well as to optimise the extraction and clarification conditions for obtaining these compounds. Toward this aim, standards of stevioside and rebaudioside A with purities ⩾99.0%, commercial samples from different companies and Stevia rebaudiana Bertoni leaves from Paraguay supplied by Insobol, S.L., were used. The analytical method proposed is adequate in terms of selectivity, sensitivity and accuracy. Optimum extraction conditions and adequate clarification conditions have been set. Moreover, this methodology is safe and eco-friendly, as we use only water for extraction and do not use solid-phase extraction, which requires solvents that are banned in the food industry to condition the cartridge and elute the steviol glycosides. In addition, this methodology consumes little time as leaves are not ground and the filtration is faster, and the peak resolution is better as we used an HPLC method with gradient elution. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the asbestos content of rock matrices is a complex operation that is susceptible to important errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although PCOM resolution is inferior to that of SEM, PCOM analysis has several advantages, including greater representativity of the analyzed sample, more effective recognition of chrysotile and a lower cost. The DIATI LAA internal methodology for PCOM analysis is based on a mild grinding of a rock sample, its subdivision into 5-6 grain size classes smaller than 2 mm and a subsequent microscopic analysis of a portion of each class. PCOM is based on the optical properties of asbestos and of the liquids with known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, contrary to the analysis of airborne filters, cannot be based on a statistical distribution. For airborne filters, a Poisson distribution, which theoretically defines the variation in the count of fibers resulting from the observation of analysis fields chosen randomly on the filter, can be applied. The analysis of rock matrices, instead, cannot rely on any statistical distribution, because the most important object of the analysis is the size of the asbestiform fibers and bundles of fibers observed, and the resulting ratio between the weight of the fibrous component and that of the granular one. The error estimate generally provided by public and private institutions varies between 50 and 150 percent, but there are no specific studies that discuss the origin of the error or that link it to the asbestos content. Our work aims to provide a reliable estimation of the error in relation to the applied methodologies and to the total content of asbestos, especially for values close to the legal limits.
The error assessments must be made through repetition of the same analysis on the same sample, to estimate both the error on the representativeness of the sample and the error related to the sensitivity of the operator, in order to provide a sufficiently reliable uncertainty for the method. We used about 30 natural rock samples with different asbestos contents, performing 3 analyses on each sample to obtain a sufficiently representative trend of the percentage. Furthermore, on one chosen sample we performed 10 repetitions of the analysis to define the error of the methodology more specifically.
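    The contrast the authors draw with airborne filters, where fiber counts follow a Poisson distribution, can be made concrete: for a counted number of fibers n, a normal-approximation 95% interval is n ± 1.96·√n, so the relative error shrinks as 1/√n. This is an illustration of the filter-counting statistics, not the authors' rock-matrix error model:

    ```python
    import math

    def poisson_ci_95(n):
        """Approximate 95% confidence interval for a Poisson count n
        (normal approximation, adequate for n of roughly 20 or more)."""
        half = 1.96 * math.sqrt(n)
        return n - half, n + half

    def relative_error_pct(n):
        """Half-width of the 95% interval as a percentage of the count."""
        lo, hi = poisson_ci_95(n)
        return 100.0 * (hi - n) / n

    # Counting more fibers tightens the relative error as 1.96 / sqrt(n):
    # 100 fibers -> ~19.6%, 400 fibers -> ~9.8%.
    ```

    No such distribution applies to rock matrices, which is why the authors resort to repeated analyses instead.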

  12. Generalization of the van der Pauw relationship derived from electrostatics

    NASA Astrophysics Data System (ADS)

    Weiss, Jonathan D.

    2011-08-01

    In an earlier paper, this author, along with two others (Weiss et al. (2008) [1]), demonstrated that the original van der Pauw relationship could be derived from three-dimensional electrostatics, as opposed to van der Pauw's use of conformal mapping. The earlier derivation was done for a conducting material of rectangular cross section with contacts placed at the corners. Presented here is a generalization of the previous work involving a square sample and a square array of electrodes that are not confined to the corners, since this measurement configuration can be more convenient. As in the previous work, the effects of non-zero sample thickness and contact size have been investigated. Buehler and Thurber derived a similar relationship using an infinite series of current images on a large and thin conducting sheet to satisfy the conditions at the boundary of the sample. The results presented here agree with theirs numerically, but analytic agreement could not be shown using any of the mathematical literature consulted. By simply equating the two solutions, it appears that, as a byproduct of this work, a new mathematical relationship has been uncovered. Finally, the application of this methodology to the Hall effect is discussed.
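    For readers unfamiliar with the original van der Pauw relationship: given two four-point resistances R₁ and R₂ measured on a thin sample, the sheet resistance Rs solves exp(−πR₁/Rs) + exp(−πR₂/Rs) = 1, which must be found numerically. A minimal bisection sketch (the resistance values are invented for illustration):

    ```python
    import math

    def sheet_resistance(r1, r2, tol=1e-10):
        """Solve the van der Pauw relation
            exp(-pi*r1/Rs) + exp(-pi*r2/Rs) = 1
        for the sheet resistance Rs by bisection."""
        def f(rs):
            return math.exp(-math.pi * r1 / rs) + math.exp(-math.pi * r2 / rs) - 1.0
        # f increases monotonically with Rs: f -> -1 as Rs -> 0, f -> +1 as Rs -> inf.
        lo, hi = 1e-9, 1e9
        while hi - lo > tol * hi:
            mid = 0.5 * (lo + hi)
            if f(mid) < 0.0:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Symmetric case: the relation reduces to Rs = pi * R / ln 2.
    rs = sheet_resistance(10.0, 10.0)
    ```

    The generalization discussed in the record modifies this relation for off-corner electrode placements.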

  13. One-shot calculation of temperature-dependent optical spectra and phonon-induced band-gap renormalization

    NASA Astrophysics Data System (ADS)

    Zacharias, Marios; Giustino, Feliciano

    2016-08-01

    Recently, Zacharias et al. [Phys. Rev. Lett. 115, 177401 (2015), 10.1103/PhysRevLett.115.177401] developed an ab initio theory of temperature-dependent optical absorption spectra and band gaps in semiconductors and insulators. In that work, the zero-point renormalization and the temperature dependence were obtained by sampling the nuclear wave functions using a stochastic approach. In the present work, we show that the stochastic sampling of Zacharias et al. can be replaced by fully deterministic supercell calculations based on a single optimal configuration of the atomic positions. We demonstrate that a single calculation is able to capture the temperature-dependent band-gap renormalization including quantum nuclear effects in direct-gap and indirect-gap semiconductors, as well as phonon-assisted optical absorption in indirect-gap semiconductors. In order to demonstrate this methodology, we calculate from first principles the temperature-dependent optical absorption spectra and the renormalization of direct and indirect band gaps in silicon, diamond, and gallium arsenide, and we obtain good agreement with experiment and with previous calculations. In this work we also establish the formal connection between the Williams-Lax theory of optical transitions and the related theories of indirect absorption by Hall, Bardeen, and Blatt, and of temperature-dependent band structures by Allen and Heine. The present methodology enables systematic ab initio calculations of optical absorption spectra at finite temperature, including both direct and indirect transitions. This feature will be useful for high-throughput calculations of optical properties at finite temperature and for calculating temperature-dependent optical properties using high-level theories such as GW and Bethe-Salpeter approaches.

  14. Stimulated Recall Methodology for Assessing Work System Barriers and Facilitators in Family-Centered Rounds in a Pediatric Hospital

    PubMed Central

    Carayon, Pascale; Li, Yaqiong; Kelly, Michelle M.; DuBenske, Lori L.; Xie, Anping; McCabe, Brenna; Orne, Jason; Cox, Elizabeth D.

    2014-01-01

    Human factors and ergonomics methods are needed to redesign healthcare processes and support patient-centered care, in particular for vulnerable patients such as hospitalized children. We implemented and evaluated a stimulated recall methodology for collective confrontation in the context of family-centered rounds. Five parents and five healthcare team members reviewed video records of their bedside rounds, and were then interviewed using the stimulated recall methodology to identify work system barriers and facilitators in family-centered rounds. The evaluation of the methodology was based on a survey of the participants, and a qualitative analysis of interview data in light of the work system model of Smith and Carayon (1989; 2000). Positive survey feedback from the participants was received. The stimulated recall methodology identified barriers and facilitators in all work system elements. Participatory ergonomics methods such as the stimulated recall methodology allow a range of participants, including parents and children, to participate in healthcare process improvement. PMID:24894378

  15. Tunneling of spoof surface plasmon polaritons through magnetoinductive metamaterial channels

    NASA Astrophysics Data System (ADS)

    Xu, Zhixia; Liu, Siyuan; Li, Shunli; Zhao, Hongxin; Liu, Leilei; Yin, Xiaoxing

    2018-04-01

    In this work, we realize tunneling propagation through spoof surface plasmon polariton transmission lines loaded with magnetoinductive metamaterial channels above a high cutoff frequency. The magnetoinductive metamaterial channels consist of split-ring resonators, and two different structures are proposed. Samples were fabricated, and both measurements and simulations indicate near-perfect tunneling propagation around 17 GHz. The proposed methodology could be exploited as a powerful platform for investigating tunneling surface plasmons from radio frequencies to optical frequencies.

  16. Frontal crashworthiness characterisation of a vehicle segment using curve comparison metrics.

    PubMed

    Abellán-López, D; Sánchez-Lozano, M; Martínez-Sáez, L

    2018-08-01

    The objective of this work is to propose a methodology for characterizing the collision behaviour and crashworthiness of a segment of vehicles by selecting the vehicle that best represents that group. This would be useful in the development of deformable barriers to be used in crash tests intended to study vehicle compatibility, as well as for the definition of the representative standard pulses used in numerical simulations or component testing. The characterisation and selection of representative vehicles is based on the objective comparison of the occupant compartment acceleration and barrier force pulses obtained during crash tests, using appropriate comparison metrics. This method is complemented with another one based exclusively on the comparison of a few characteristic parameters of crash behaviour obtained from the previous curves. The method has been applied to different vehicle groups using test data from a sample of vehicles. During this application, the performance of several metrics usually employed in the validation of simulation models was analysed, and the most efficient ones were selected for the task. The methodology finally defined is useful for vehicle segment characterization, taking into account aspects of crash behaviour related to the shape of the curves that are difficult to represent by simple numerical parameters, and it may be tuned in future works when applied to larger and different samples. Copyright © 2018 Elsevier Ltd. All rights reserved.
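    One family of metrics commonly used for such curve comparisons in the validation of simulation models is the Sprague & Geers magnitude/phase error. The paper does not say which metrics it retained, so the sketch below (with a toy pulse) is only an example of the kind of metric involved:

    ```python
    import numpy as np

    def sprague_geers(ref, test):
        """Sprague & Geers magnitude (M) and phase (P) errors between two
        equally sampled pulses, plus the combined score C = sqrt(M^2 + P^2)."""
        srr = np.dot(ref, ref)
        stt = np.dot(test, test)
        srt = np.dot(ref, test)
        m = np.sqrt(stt / srr) - 1.0
        p = np.arccos(np.clip(srt / np.sqrt(srr * stt), -1.0, 1.0)) / np.pi
        return m, p, np.hypot(m, p)

    # Toy "crash pulse": a damped oscillation, compared against a copy that is
    # 10% larger in amplitude but perfectly in phase.
    t = np.linspace(0.0, 0.1, 500)
    pulse = np.sin(2 * np.pi * 10 * t) * np.exp(-20 * t)
    m, p, c = sprague_geers(pulse, 1.1 * pulse)
    ```

    A pure amplitude scaling shows up entirely in M (here 0.1) while P stays near zero, which is why the two components are reported separately.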

  17. Cochrane Rehabilitation Methodology Committee: an international survey of priorities for future work.

    PubMed

    Levack, William M; Meyer, Thorsten; Negrini, Stefano; Malmivaara, Antti

    2017-10-01

    Cochrane Rehabilitation aims to improve the application of evidence-based practice in rehabilitation. It also aims to support Cochrane in the production of reliable, clinically meaningful syntheses of evidence related to the practice of rehabilitation, while accommodating the many methodological challenges facing the field. To this end, Cochrane Rehabilitation established a Methodology Committee to examine, explore and find solutions for the methodological challenges related to evidence synthesis and knowledge translation in rehabilitation. We conducted an international online survey via Cochrane Rehabilitation networks to canvass opinions regarding the future work priorities for this committee and to seek information on people's current capabilities to assist with this work. The survey findings indicated strongest interest in work on how reviewers have interpreted and applied Cochrane methods in reviews on rehabilitation topics in the past, and on gathering a collection of existing publications on review methods for undertaking systematic reviews relevant to rehabilitation. Many people are already interested in contributing to the work of the Methodology Committee and there is a large amount of expertise for this work in the extended Cochrane Rehabilitation network already.

  18. Plasmonic and Magnetically Responsive Gold Shell-Magnetic Nanorod Hybrids

    DTIC Science & Technology

    2017-10-10

    In this work, methodologies are developed to create asymmetric magnetic nanorods: morphologies composed of an iron (II, III) oxide (Fe3O4) magnetic core with a gold shell, whose size and shape are preserved throughout the process.

  19. Spatial distribution, sampling precision and survey design optimisation with non-normal variables: The case of anchovy (Engraulis encrasicolus) recruitment in Spanish Mediterranean waters

    NASA Astrophysics Data System (ADS)

    Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan

    2016-02-01

    In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in ecological and economic terms. Ensuring stock sustainability requires the provision of crucial information, such as species spatial distribution and unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provides a priori unbiased estimations of the spatial structure, global abundance and precision for autocorrelated data. However, its application to non-Gaussian data introduces difficulties into the analysis, along with reduced robustness or unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust, and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.
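    The spatial-structure step in such geostatistical studies starts from the empirical semivariogram. A minimal sketch of the classical (Matheron) estimator on synthetic data (the coordinates, trend and lag choices below are illustrative, not the survey's):

    ```python
    import numpy as np

    def empirical_semivariogram(coords, values, lags, tol):
        """Classical (Matheron) estimator:
        gamma(h) = 1/(2 N(h)) * sum (z_i - z_j)^2 over pairs at distance ~ h."""
        d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
        sq = (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)   # count each pair once
        d, sq = d[iu], sq[iu]
        gamma = []
        for h in lags:
            mask = np.abs(d - h) <= tol
            gamma.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
        return np.array(gamma)

    # Synthetic field: values rise along x, so the semivariogram should
    # increase with lag distance (spatial autocorrelation decays).
    rng = np.random.default_rng(1)
    coords = rng.uniform(0, 100, size=(200, 2))
    values = 0.05 * coords[:, 0] + rng.normal(0, 0.2, 200)
    lags = np.array([5.0, 20.0, 60.0])
    gamma = empirical_semivariogram(coords, values, lags, tol=2.5)
    ```

    A variogram model fitted to these points then drives kriging maps and the global-abundance precision estimates the abstract discusses.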

  20. Controlled Trials in Children: Quantity, Methodological Quality and Descriptive Characteristics of Pediatric Controlled Trials Published 1948-2006

    PubMed Central

    Thomson, Denise; Hartling, Lisa; Cohen, Eyal; Vandermeer, Ben; Tjosvold, Lisa; Klassen, Terry P.

    2010-01-01

    Background The objective of this study was to describe randomized controlled trials (RCTs) and controlled clinical trials (CCTs) in child health published between 1948 and 2006, in terms of quantity, methodological quality, and publication and trial characteristics. We used the Trials Register of the Cochrane Child Health Field for overall trends and a sample from this to explore trial characteristics in more detail. Methodology/Principal Findings We extracted descriptive data on a random sample of 578 trials. Ninety-six percent of the trials were published in English; the percentage of child-only trials was 90.5%. The most frequent diagnostic categories were infectious diseases (13.2%), behavioural and psychiatric disorders (11.6%), neonatal critical care (11.4%), respiratory disorders (8.9%), non-critical neonatology (7.9%), and anaesthesia (6.5%). There were significantly fewer child-only studies (i.e., more mixed child and adult studies) over time (P = 0.0460). The proportion of RCTs to CCTs increased significantly over time (P<0.0001), as did the proportion of multicentre trials (P = 0.002). Significant increases over time were found in methodological quality (Jadad score) (P<0.0001), the proportion of double-blind studies (P<0.0001), and studies with adequate allocation concealment (P<0.0001). Additionally, we found an improvement in reporting over time: adequate description of withdrawals and losses to follow-up (P<0.0001), sample size calculations (P<0.0001), and intention-to-treat analysis (P<0.0001). However, many trials still do not describe their level of blinding, and allocation concealment was inadequately reported in the majority of studies across the entire time period. The proportion of studies with industry funding decreased slightly over time (P = 0.003), and these studies were more likely to report positive conclusions (P = 0.028). 
Conclusions/Significance The quantity and quality of pediatric controlled trials has increased over time; however, much work remains to be done, particularly in improving methodological issues around conduct and reporting of trials. PMID:20927344

  1. A simple and highly selective molecular imprinting polymer-based methodology for propylparaben monitoring in personal care products and industrial waste waters.

    PubMed

    Vicario, Ana; Aragón, Leslie; Wang, Chien C; Bertolino, Franco; Gomez, María R

    2018-02-05

    In this work, a novel molecularly imprinted polymer (MIP), proposed as a solid-phase extraction sorbent, was developed for the determination of propylparaben (PP) in diverse cosmetic samples. The use of parabens (PAs) as microbiological preservatives is authorized by regulatory agencies; however, several recent studies claim that the large-scale use of these preservatives can be a potential health risk and harmful to the environment. Diverse factors that influence polymer synthesis were studied, including the template, functional monomer, porogen and crosslinker used. Morphological characterization of the MIP was performed using SEM and BET analysis. Parameters affecting the molecularly imprinted solid-phase extraction (MISPE) and the elution efficiency of PP were evaluated. After sample clean-up, the analyte was analyzed by high performance liquid chromatography (HPLC). The whole procedure was validated, showing satisfactory analytical parameters. After applying the MISPE methodology, the extraction recoveries were always better than 86.15%; the precision, expressed as RSD%, was always lower than 2.19 for the corrected peak areas. A good linear relationship was obtained within the range 8-500 ng mL⁻¹ of PP (r² = 0.99985). Limits of detection and quantification after the MISPE procedure of 2.4 and 8 ng mL⁻¹, respectively, were reached, lower than those of previously reported methodologies. The developed MISPE-HPLC methodology provided a simple and economical way of accomplishing a clean-up/preconcentration step and the subsequent determination of PP in a complex matrix. The performance of the proposed method was compared against C-18 and silica solid-phase extraction (SPE) cartridges. The recovery factors obtained after applying the extraction methods were 96.6, 64.8 and 0.79 for the MISPE, C18-SPE and silica-SPE procedures, respectively.
The proposed methodology improves the retention capability of the SPE material and adds robustness and the possibility of reuse, enabling it to be used for routine PP monitoring in diverse personal care products (PCP) and environmental samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Rapid ultra-trace analysis of sucralose in multiple-origin aqueous samples by online solid-phase extraction coupled to high-resolution mass spectrometry.

    PubMed

    Batchu, Sudha Rani; Ramirez, Cesar E; Gardinali, Piero R

    2015-05-01

    Because of its widespread consumption and its persistence during wastewater treatment, the artificial sweetener sucralose has gained considerable interest as a proxy to detect wastewater intrusion into usable water resources. The molecular resilience of this compound dictates that coastal and oceanic waters are the final recipient of this compound with unknown effects on ecosystems. Furthermore, no suitable methodologies have been reported for routine, ultra-trace detection of sucralose in seawater as the sensitivity of traditional liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis is limited by a low yield of product ions upon collision-induced dissociation (CID). In this work, we report the development and field test of an alternative analysis tool for sucralose in environmental waters, with enough sensitivity for the proper quantitation and confirmation of this analyte in seawater. The methodology is based on automated online solid-phase extraction (SPE) and high-resolving-power orbitrap MS detection. Operating in full scan (no CID), detection of the unique isotopic pattern (100:96:31 for [M-H](-), [M-H+2](-), and [M-H+4](-), respectively) was used for ultra-trace quantitation and analyte identification. The method offers fast analysis (14 min per run) and low sample consumption (10 mL per sample) with method detection and confirmation limits (MDLs and MCLs) of 1.4 and 5.7 ng/L in seawater, respectively. The methodology involves low operating costs due to virtually no sample preparation steps or consumables. As an application example, samples were collected from 17 oceanic and estuarine sites in Broward County, FL, with varying salinity (6-40 PSU). Samples included the ocean outfall of the Southern Regional Wastewater Treatment Plant (WWTP) that serves Hollywood, FL. 
Sucralose was detected above MCL in 78% of the samples at concentrations ranging from 8 to 148 ng/L, with the exception of the WWTP ocean outfall (at pipe end, 28 m below the surface) where the measured concentration was 8418 ± 3813 ng/L. These results demonstrate the applicability of this monitoring tool for the trace-level detection of this wastewater marker in very dilute environmental waters.
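    The isotopic-pattern confirmation described above reduces to a simple ratio check. The sketch below is illustrative only, not the authors' code: the 100:96:31 theoretical pattern comes from the abstract, while the tolerance and the example intensities are assumed values.

```python
def isotope_pattern_match(intensities, theoretical=(100.0, 96.0, 31.0), tol=0.15):
    """Return True when the measured [M-H]-, [M-H+2]-, [M-H+4]- intensities
    match the theoretical chlorine isotopic pattern within a fractional
    tolerance (tol is an assumed value, not taken from the paper)."""
    base = intensities[0]
    if base <= 0:
        return False
    # Normalize so the monoisotopic peak is 100, then compare peak by peak.
    normalized = [100.0 * i / base for i in intensities]
    return all(abs(m - t) <= tol * t for m, t in zip(normalized, theoretical))

# A hypothetical measured cluster close to the sucralose pattern:
print(isotope_pattern_match([2.1e6, 2.0e6, 0.66e6]))  # True
# A cluster whose second isotopologue is far too weak:
print(isotope_pattern_match([2.1e6, 0.5e6, 0.66e6]))  # False
```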

  3. MAP: an iterative experimental design methodology for the optimization of catalytic search space structure modeling.

    PubMed

    Baumes, Laurent A

    2006-01-01

    One of the main problems in high-throughput research for materials is still the design of experiments. At early stages of discovery programs, purely exploratory methodologies coupled with fast screening tools should be employed. This should lead to opportunities to find unexpected catalytic results and to identify the "groups" of catalyst outputs, providing well-defined boundaries for future optimizations. However, very few recent papers deal with strategies that guide exploratory studies. Mostly, traditional designs, homogeneous coverings, or simple random samplings are exploited. Typical catalytic output distributions exhibit unbalanced datasets on which efficient learning is hard to carry out, and interesting but rare classes usually go unrecognized. A new iterative algorithm is suggested here for characterizing the structure of the search space, working independently of any learning process. It enhances recognition rates by transferring catalysts to be screened from "performance-stable" zones of the space to "unsteady" ones, which require more experiments to be well modeled. Benchmark evaluation of new algorithms is essential, since no prior proof of their efficiency exists. The method is detailed and thoroughly tested on mathematical functions exhibiting different levels of complexity. The strategy is not only evaluated empirically; the effect of the sampling on subsequent machine-learning performance is also quantified, and the minimum sample size the algorithm requires in order to be statistically discriminated from simple random sampling is investigated.
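    The allocation idea, moving screening effort away from performance-stable zones toward unsteady ones, can be illustrated with a toy version. This is not the paper's MAP algorithm: the fixed zone partition, the binary response, and the p(1-p) "unsteadiness" measure are all simplifying assumptions made for the sketch.

```python
def response(x):
    # Stand-in screening result: a rare "interesting" class of catalysts
    # occupying a small interval of a 1-D composition space.
    return 1 if 0.35 < x < 0.55 else 0

def zone_points(z, n, n_zones=5):
    # n evenly spaced candidate compositions inside zone z of [0, 1).
    return [(z + (i + 0.5) / n) / n_zones for i in range(n)]

def iterative_sampling(n_zones=5, batches=3, batch_size=10):
    # Seed every zone with a small uniform design.
    outputs = [[response(x) for x in zone_points(z, 4, n_zones)]
               for z in range(n_zones)]
    for _ in range(batches):
        # For binary outputs, zone "unsteadiness" is p*(1-p);
        # a tiny floor keeps every zone eligible for samples.
        var = [sum(o) / len(o) * (1 - sum(o) / len(o)) + 1e-6
               for o in outputs]
        total = sum(var)
        for z in range(n_zones):
            n_new = round(batch_size * var[z] / total)
            if n_new:
                outputs[z] += [response(x) for x in zone_points(z, n_new, n_zones)]
    return [len(o) for o in outputs]

counts = iterative_sampling()
print(counts)  # the two zones straddling the rare class receive most samples
```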

  4. Sample preparation prior to the LC-MS-based metabolomics/metabonomics of blood-derived samples.

    PubMed

    Gika, Helen; Theodoridis, Georgios

    2011-07-01

    Blood represents a very important biological fluid and has been the target of continuous and extensive research for diagnostic purposes and for health and drug monitoring. Recently, metabonomics/metabolomics have emerged as a new and promising 'omics' platform that shows potential in biomarker discovery, especially in areas such as disease diagnosis and the assessment of drug efficacy or toxicity. Blood is collected in various establishments under conditions that are not standardized. Next, the samples are prepared and analyzed using different methodologies or tools. When targeted analysis of key molecules (e.g., a drug or its metabolite[s]) is the aim, enforcement of certain measures or additional analyses may correct and harmonize these discrepancies. For holistic 'omics' analytical approaches, however, no such rules or tools are available. As a result, comparison or correlation of results, or data fusion, becomes impractical. It is evident that such obstacles should be overcome in the near future to allow large-scale studies involving the assay of samples from hundreds of individuals, where the effect of sample handling and preparation becomes critical if months of expert work and expensive instrument time are not to be wasted. The present review aims to cover the different methodologies applied to the pretreatment of blood prior to LC-MS metabolomic/metabonomic studies. The article critically compares the methods and highlights issues that need to be addressed.

  5. Versatile fluid-mixing device for cell and tissue microgravity research applications.

    PubMed

    Wilfinger, W W; Baker, C S; Kunze, E L; Phillips, A T; Hammerstedt, R H

    1996-01-01

    Microgravity life-science research requires hardware that can be easily adapted to a variety of experimental designs and working environments. The Biomodule is a patented, computer-controlled fluid-mixing device that can accommodate these diverse requirements. A typical shuttle payload contains eight Biomodules with a total of 64 samples, a sealed containment vessel, and a NASA refrigeration-incubation module. Each Biomodule contains eight gas-permeable Silastic T tubes that are partitioned into three fluid-filled compartments. The fluids can be mixed at any user-specified time. Multiple investigators and complex experimental designs can be easily accommodated with the hardware. During flight, the Biomodules are sealed in a vessel that provides two levels of containment (liquids and gas) and a stable, investigator-controlled experimental environment that includes regulated temperature, internal pressure, humidity, and gas composition. A cell microencapsulation methodology has also been developed to streamline launch-site sample manipulation and accelerate postflight analysis through the use of fluorescence-activated cell sorting. The Biomodule flight hardware and analytical cell encapsulation methodology are ideally suited for temporal, qualitative, or quantitative life-science investigations.

  6. Three-dimensional imaging of flat natural and cultural heritage objects by a Compton scattering modality

    NASA Astrophysics Data System (ADS)

    Guerrero Prado, Patricio; Nguyen, Mai K.; Dumas, Laurent; Cohen, Serge X.

    2017-01-01

    Characterization and interpretation of flat ancient material objects, such as those found in archaeology, paleoenvironments, paleontology, and cultural heritage, have remained a challenging task to perform by means of conventional x-ray tomography methods due to their anisotropic morphology and flattened geometry. To overcome the limitations of the mentioned methodologies for such samples, an imaging modality based on Compton scattering is proposed in this work. Classical x-ray tomography treats Compton scattering data as noise in the image formation process, while in Compton scattering tomography the conditions are set such that Compton data become the principal image contrasting agent. Under these conditions, we are able, first, to avoid relative rotations between the sample and the imaging setup, and second, to obtain three-dimensional data even when the object is supported by a dense material by exploiting backscattered photons. Mathematically this problem is addressed by means of a conical Radon transform and its inversion. The image formation process and object reconstruction model are presented. The feasibility of this methodology is supported by numerical simulations.

  7. Effective Pb2+ removal from water using nanozerovalent iron stored 10 months

    NASA Astrophysics Data System (ADS)

    Ahmed, M. A.; Bishay, Samiha T.; Ahmed, Fatma M.; El-Dek, S. I.

    2017-10-01

    Heavy metal removal from water requires reliable, cost-effective methods combining fast separation with easy methodology. In this piece of research, nanozerovalent iron (NZVI) was prepared as an ideal sorbent for Pb2+ removal. The sample was characterized using X-ray diffraction (XRD), high-resolution transmission electron microscopy (HRTEM), and atomic force microscopy (AFM-SPM). Batch experiments examined the effect of pH value and contact time on the adsorption process. The same NZVI was stored for a shelf time of 10 months and the batch experiments were repeated. The outcomes of the investigation showed that NZVI exhibited an extraordinarily large metal uptake (98%) after a short contact time (10 h). The stored sample revealed the same effectiveness for Pb2+ removal under the same conditions. The results of the physical properties, magnetic susceptibility, and conductance were correlated with the adsorption efficiency. This work offers evidence that these NZVI particles, prepared by a simple, green, and cost-effective methodology, could be potential candidates for large-scale Pb2+ removal even after long storage, and represent a practical advance in wastewater treatment.

  8. Speciation analysis of organotin compounds in human urine by headspace solid-phase micro-extraction and gas chromatography with pulsed flame photometric detection.

    PubMed

    Valenzuela, Aníbal; Lespes, Gaëtane; Quiroz, Waldo; Aguilar, Luis F; Bravo, Manuel A

    2014-07-01

    A new headspace solid-phase micro-extraction (HS-SPME) method followed by gas chromatography with pulsed flame photometric detection (GC-PFPD) analysis has been developed for the simultaneous determination of 11 organotin compounds, including methyl-, butyl-, phenyl- and octyltin derivates, in human urine. The methodology has been validated by the analysis of urine samples fortified with all analytes at different concentration levels, and recovery rates above 87% and relative precisions between 2% and 7% were obtained. Additionally, an experimental-design approach has been used to model the storage stability of organotin compounds in human urine, demonstrating that organotins are highly degraded in this medium, although their stability is satisfactory during the first 4 days of storage at 4 °C and pH=4. Finally, this methodology was applied to urine samples collected from harbor workers exposed to antifouling paints; methyl- and butyltins were detected, confirming human exposure in this type of work environment. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Adapting total quality management for general practice: evaluation of a programme.

    PubMed Central

    Lawrence, M; Packwood, T

    1996-01-01

    OBJECTIVE: Assessment of the benefits and limitations of a quality improvement programme based on total quality management principles in general practice over a period of one year (October 1993-4). DESIGN: Questionnaires to practice team members before any intervention and after one year. Three progress reports completed by facilitators at four month intervals. Semistructured interviews with a sample of staff from each practice towards the end of the year. SETTING: 18 self selected practices from across the former Oxford Region. Three members of each practice received an initial residential course and three one day seminars during the year. Each practice was supported by a facilitator from their Medical Audit Advisory Group. MEASURES: Extent of understanding and implementation of quality improvement methodology. Number, completeness, and evaluation of quality improvement projects. Practice team members' attitudes to and involvement in team working and quality improvement. RESULTS: 16 of the 18 practices succeeded in implementing the quality improvement methods. 48 initiatives were considered and staff involvement was broad. Practice members showed increased involvement in, and appreciation of, strategic planning and team working, and satisfaction from improved patients services. 11 of the practices intend to continue with the methodology. The commonest barrier expressed was time. CONCLUSION: Quality improvement programmes based on total quality management principles produce beneficial changes in service delivery and team working in most general practices. It is incompatible with traditional doctor centred practice. The methodology needs to be adapted for primary care to avoid quality improvement being seen as separate from routine activity, and to save time. PMID:10161529

  10. Job stress and cardiovascular disease: a theoretic critical review.

    PubMed

    Kristensen, T S

    1996-07-01

    During the last 15 years, the research on job stress and cardiovascular diseases has been dominated by the job strain model developed by R. Karasek (1979) and colleagues (R. Karasek & T. Theorell, 1990). In this article the results of this research are briefly summarized, and the theoretical and methodological basis is discussed and criticized. A sociological interpretation of the model emphasizing theories of technological change, qualifications of the workers, and the organization of work is proposed. Furthermore, improvements with regard to measuring the job strain dimensions and to sampling the study base are suggested. Substantial improvements of the job strain research could be achieved if the principle of triangulation were used in the measurements of stressors, stress, and sickness and if occupation-based samples were used instead of large representative samples.

  11. Controlled trials in children: quantity, methodological quality and descriptive characteristics of pediatric controlled trials published 1948-2006.

    PubMed

    Thomson, Denise; Hartling, Lisa; Cohen, Eyal; Vandermeer, Ben; Tjosvold, Lisa; Klassen, Terry P

    2010-09-30

    The objective of this study was to describe randomized controlled trials (RCTs) and controlled clinical trials (CCTs) in child health published between 1948 and 2006, in terms of quantity, methodological quality, and publication and trial characteristics. We used the Trials Register of the Cochrane Child Health Field for overall trends and a sample from this to explore trial characteristics in more detail. We extracted descriptive data on a random sample of 578 trials. Ninety-six percent of the trials were published in English; the percentage of child-only trials was 90.5%. The most frequent diagnostic categories were infectious diseases (13.2%), behavioural and psychiatric disorders (11.6%), neonatal critical care (11.4%), respiratory disorders (8.9%), non-critical neonatology (7.9%), and anaesthesia (6.5%). There were significantly fewer child-only studies (i.e., more mixed child and adult studies) over time (P = 0.0460). The proportion of RCTs to CCTs increased significantly over time (P<0.0001), as did the proportion of multicentre trials (P = 0.002). Significant increases over time were found in methodological quality (Jadad score) (P<0.0001), the proportion of double-blind studies (P<0.0001), and studies with adequate allocation concealment (P<0.0001). Additionally, we found an improvement in reporting over time: adequate description of withdrawals and losses to follow-up (P<0.0001), sample size calculations (P<0.0001), and intention-to-treat analysis (P<0.0001). However, many trials still do not describe their level of blinding, and allocation concealment was inadequately reported in the majority of studies across the entire time period. The proportion of studies with industry funding decreased slightly over time (P = 0.003), and these studies were more likely to report positive conclusions (P = 0.028). 
The quantity and quality of pediatric controlled trials has increased over time; however, much work remains to be done, particularly in improving methodological issues around conduct and reporting of trials.

  12. Single point aerosol sampling: evaluation of mixing and probe performance in a nuclear stack.

    PubMed

    Rodgers, J C; Fairchild, C I; Wood, G O; Ortiz, C A; Muyshondt, A; McFarland, A R

    1996-01-01

    Alternative reference methodologies have been developed for sampling of radionuclides from stacks and ducts, which differ from the methods previously required by the United States Environmental Protection Agency. These alternative reference methodologies have recently been approved by the U.S. EPA for use in lieu of the current standard techniques. The standard EPA methods are prescriptive in the selection of sampling locations and in the design of sampling probes, whereas the alternative reference methodologies are performance driven. Tests were conducted in a stack at Los Alamos National Laboratory to demonstrate the efficacy of some aspects of the alternative reference methodologies. Coefficients of variation of velocity, tracer gas, and aerosol particle profiles were determined at three sampling locations. Results showed that the numerical criteria placed upon the coefficients of variation by the alternative reference methodologies were met at sampling stations located 9 and 14 stack diameters from the flow entrance, but not at a location 1.5 diameters downstream from the inlet. Experiments were conducted to characterize the transmission of liquid aerosol particles of 10 microns aerodynamic diameter through three types of sampling probes. The transmission ratio (ratio of aerosol concentration at the probe exit plane to the concentration in the free stream) was 107% for a 113 L min-1 (4-cfm) anisokinetic shrouded probe, but only 20% for an isokinetic probe that follows the existing EPA standard requirements. A specially designed isokinetic probe showed a transmission ratio of 63%. The shrouded probe performance would conform to the alternative reference methodologies criteria; however, the isokinetic probes would not.
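    The profile-uniformity check behind such coefficient-of-variation criteria can be sketched as below. The 20% limit is an assumed illustrative threshold (the alternative reference methodologies specify their own numerical criteria), and the profile values are invented.

```python
import statistics

def coefficient_of_variation(profile):
    """CV (%) of a measured velocity, tracer-gas, or particle profile."""
    mean = statistics.mean(profile)
    return 100.0 * statistics.pstdev(profile) / mean

def location_qualifies(profiles, limit=20.0):
    # A sampling location qualifies only if every measured profile
    # (velocity, tracer gas, aerosol) is sufficiently uniform.
    return all(coefficient_of_variation(p) <= limit for p in profiles)

# e.g. a well-mixed station far downstream vs. one close to the inlet
well_mixed = [[10.1, 9.8, 10.3, 10.0], [1.02, 0.98, 1.00, 1.01]]
near_inlet = [[14.0, 6.0, 11.0, 4.0]]
print(location_qualifies(well_mixed), location_qualifies(near_inlet))  # True False
```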

  13. Work, malaise, and well-being in Spanish and Latin-American doctors

    PubMed Central

    Ochoa, Paola; Blanch, Josep M

    2016-01-01

    ABSTRACT OBJECTIVE To analyze the relations between the meanings of working and the levels of doctors' work well-being in the context of their working conditions. METHOD The research combined the qualitative methodology of textual analysis and the quantitative methodology of correspondence factor analysis. A convenience, intentional, and stratified sample composed of 305 Spanish and Latin American doctors completed an extensive questionnaire on the topics of the research. RESULTS The general meaning of working for the group located in the malaise quartile included perceptions of discomfort, frustration, and exhaustion. However, those showing higher levels of well-being, located in the opposite quartile, associated their working experience with good conditions and the development of their professional and personal competences. CONCLUSIONS The study provides empirical evidence of the relationship between contextual factors and the meanings of working for participants with higher levels of malaise, and of the importance granted to both intrinsic and extrinsic factors by those who scored highest on well-being. PMID:27191157

  14. Electron work function-a promising guiding parameter for material design.

    PubMed

    Lu, Hao; Liu, Ziran; Yan, Xianguo; Li, Dongyang; Parent, Leo; Tian, Harry

    2016-04-14

    Using nickel added X70 steel as a sample material, we demonstrate that electron work function (EWF), which largely reflects the electron behavior of materials, could be used as a guide parameter for material modification or design. Adding Ni having a higher electron work function to X70 steel brings more "free" electrons to the steel, leading to increased overall work function, accompanied with enhanced e(-)-nuclei interactions or higher atomic bond strength. Young's modulus and hardness increase correspondingly. However, the free electron density and work function decrease as the Ni content is continuously increased, accompanied with the formation of a second phase, FeNi3, which is softer with a lower work function. The decrease in the overall work function corresponds to deterioration of the mechanical strength of the steel. It is expected that EWF, a simple but fundamental parameter, may lead to new methodologies or supplementary approaches for metallic materials design or tailoring on a feasible electronic base.

  15. Electron work function–a promising guiding parameter for material design

    PubMed Central

    Lu, Hao; Liu, Ziran; Yan, Xianguo; Li, Dongyang; Parent, Leo; Tian, Harry

    2016-01-01

    Using nickel added X70 steel as a sample material, we demonstrate that electron work function (EWF), which largely reflects the electron behavior of materials, could be used as a guide parameter for material modification or design. Adding Ni having a higher electron work function to X70 steel brings more “free” electrons to the steel, leading to increased overall work function, accompanied with enhanced e−–nuclei interactions or higher atomic bond strength. Young’s modulus and hardness increase correspondingly. However, the free electron density and work function decrease as the Ni content is continuously increased, accompanied with the formation of a second phase, FeNi3, which is softer with a lower work function. The decrease in the overall work function corresponds to deterioration of the mechanical strength of the steel. It is expected that EWF, a simple but fundamental parameter, may lead to new methodologies or supplementary approaches for metallic materials design or tailoring on a feasible electronic base. PMID:27074974

  16. Validation and evaluation of an HPLC methodology for the quantification of the potent antimitotic compound (+)-discodermolide in the Caribbean marine sponge Discodermia dissoluta.

    PubMed

    Valderrama, Katherine; Castellanos, Leonardo; Zea, Sven

    2010-08-01

    The sponge Discodermia dissoluta is the source of the potent antimitotic compound (+)-discodermolide. The relatively abundant and shallow populations of this sponge in Santa Marta, Colombia, allow for studies to evaluate the natural and biotechnological supply options of (+)-discodermolide. In this work, an RP-HPLC-UV methodology for the quantification of (+)-discodermolide from sponge samples was tested and validated. Our protocol for extracting this compound from the sponge included lyophilization, exhaustive methanol extraction, partitioning using water and dichloromethane, purification of the organic fraction in RP-18 cartridges and finally retrieving the (+)-discodermolide in the methanol-water (80:20 v/v) fraction. This fraction was injected into an HPLC system with an Xterra RP-18 column and a detection wavelength of 235 nm. The calibration curve was linear, making it possible to calculate the limits of detection and quantification for these experiments. The intra-day and inter-day precision showed relative standard deviations lower than 5%. The accuracy, determined as the percentage recovery, was 99.4%. Nine samples of the sponge from the Bahamas, Bonaire, Curaçao and Santa Marta had concentrations of (+)-discodermolide ranging from 5.3 to 29.3 microg/g of wet sponge. This methodology is quick and simple, allowing quantification in sponges from natural environments, in situ cultures or dissociated cells.
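    The validation arithmetic behind such a calibration curve can be sketched as follows. This is a generic illustration, not the authors' data: the standard concentrations, peak areas, and blank standard deviation are invented, and the 3.3·σ/slope detection-limit convention is the usual ICH-style estimate assumed here.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept of a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

def recovery_percent(measured, spiked):
    # Accuracy expressed as percentage recovery of a spiked amount.
    return 100.0 * measured / spiked

conc = [1, 2, 5, 10, 20]                 # hypothetical standards, ug/mL
area = [10.2, 20.1, 49.8, 100.5, 199.9]  # hypothetical peak areas
slope, intercept = linear_fit(conc, area)
sigma_blank = 0.12                       # assumed SD of blank responses
lod = 3.3 * sigma_blank / slope          # ICH-style detection limit
print(round(slope, 2), round(lod, 3), round(recovery_percent(9.94, 10.0), 1))
```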

  17. Virtual standardized patients: an interactive method to examine variation in depression care among primary care physicians

    PubMed Central

    Hooper, Lisa M.; Weinfurt, Kevin P.; Cooper, Lisa A.; Mensh, Julie; Harless, William; Kuhajda, Melissa C.; Epstein, Steven A.

    2009-01-01

    Background Some primary care physicians provide less than optimal care for depression (Kessler et al., Journal of the American Medical Association 291, 2581–90, 2004). However, the literature is not unanimous on the best method to use in order to investigate this variation in care. To capture variations in physician behaviour and decision making in primary care settings, 32 interactive CD-ROM vignettes were constructed and tested. Aim and method The primary aim of this methods-focused paper was to review the extent to which our study method – an interactive CD-ROM patient vignette methodology – was effective in capturing variation in physician behaviour. Specifically, we examined the following questions: (a) Did the interactive CD-ROM technology work? (b) Did we create believable virtual patients? (c) Did the research protocol enable interviews (data collection) to be completed as planned? (d) To what extent was the targeted study sample size achieved? and (e) Did the study interview protocol generate valid and reliable quantitative data and rich, credible qualitative data? Findings Among a sample of 404 randomly selected primary care physicians, our voice-activated interactive methodology appeared to be effective. Specifically, our methodology – combining interactive virtual patient vignette technology, experimental design, and expansive open-ended interview protocol – generated valid explanations for variations in primary care physician practice patterns related to depression care. PMID:20463864

  18. Effect of modulation of the particle size distributions in the direct solid analysis by total-reflection X-ray fluorescence

    NASA Astrophysics Data System (ADS)

    Fernández-Ruiz, Ramón; Friedrich K., E. Josue; Redrejo, M. J.

    2018-02-01

    The main goal of this work was to investigate, in a systematic way, the influence of the controlled modulation of the particle size distribution of a representative solid sample on the most relevant analytical parameters of the Direct Solid Analysis (DSA) by Total-reflection X-Ray Fluorescence (TXRF) quantitative method. In particular, accuracy, uncertainty, linearity and detection limits were correlated with the main parameters of the size distributions for the following elements: Al, Si, P, S, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, As, Se, Rb, Sr, Ba and Pb. In all cases strong correlations were found. The main conclusion of this work can be summarized as follows: modulating the particles toward smaller average sizes while minimizing the width of the particle size distribution produces a strong improvement in accuracy and a reduction of uncertainties and detection limits for the DSA-TXRF methodology. These achievements open the way to the future use of DSA-TXRF for the development of ISO norms and standardized protocols for the direct analysis of solids by means of TXRF.
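    The kind of correlation reported above (size-distribution parameters versus analytical figures of merit) reduces to a Pearson coefficient per element; a minimal sketch with invented numbers:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: mean particle size of each preparation vs. the
# detection limit obtained for one element (all numbers invented).
mean_size_um = [0.5, 1.0, 2.0, 4.0, 8.0]
detection_limit = [0.8, 1.1, 1.9, 3.5, 7.2]
print(round(pearson_r(mean_size_um, detection_limit), 3))  # 0.999
```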

  19. Methodologies for Removing/Desorbing and Transporting Particles from Surfaces to Instrumentation

    NASA Astrophysics Data System (ADS)

    Miller, Carla J.; Cespedes, Ernesto R.

    2012-12-01

    Explosive trace detection (ETD) continues to be a key technology supporting the fight against terrorist bombing threats. Very selective and sensitive ETD instruments have been developed to detect explosive threats concealed on personnel, in vehicles, in luggage, and in cargo containers, as well as for forensic analysis (e.g. post blast inspection, bomb-maker identification, etc.) in a broad range of homeland security, law enforcement, and military applications. A number of recent studies have highlighted the fact that significant improvements in ETD systems' capabilities will be achieved, not by increasing the selectivity/sensitivity of the sensors, but by improved techniques for particle/vapor sampling, pre-concentration, and transport to the sensors. This review article represents a compilation of studies focused on characterizing the adhesive properties of explosive particles, the methodologies for removing/desorbing these particles from a range of surfaces, and approaches for transporting them to the instrument. The objectives of this review are to summarize fundamental work in explosive particle characterization, to describe experimental work performed in harvesting and transport of these particles, and to highlight those approaches that indicate high potential for improving ETD capabilities.

  20. Organizational Change Efforts: Methodologies for Assessing Organizational Effectiveness and Program Costs versus Benefits.

    ERIC Educational Resources Information Center

    Macy, Barry A.; Mirvis, Philip H.

    1982-01-01

    A standardized methodology for identifying, defining, and measuring work behavior and performance rather than production, and a methodology that estimates the costs and benefits of work innovation are presented for assessing organizational effectiveness and program costs versus benefits in organizational change programs. Factors in a cost-benefit…

  1. DEVELOPMENT OF A SUB-SLAB AIR SAMPLING PROTOCOL TO SUPPORT ASSESSMENT OF VAPOR INTRUSION

    EPA Science Inventory

    The primary purpose of this research effort is to develop a methodology for sub-slab sampling to support the EPA guidance and vapor intrusion investigations after vapor intrusion has been established at a site. Methodologies for sub-slab air sampling are currently lacking in ref...

  2. Methodological Choices in Rating Speech Samples

    ERIC Educational Resources Information Center

    O'Brien, Mary Grantham

    2016-01-01

    Much pronunciation research critically relies upon listeners' judgments of speech samples, but researchers have rarely examined the impact of methodological choices. In the current study, 30 German native listeners and 42 German L2 learners (L1 English) rated speech samples produced by English-German L2 learners along three continua: accentedness,…

  3. Influence of Environmental Factors on Vibrio spp. in Coastal Ecosystems.

    PubMed

    Johnson, Crystal N

    2015-06-01

    Various studies have examined the relationships between vibrios and the environmental conditions surrounding them. However, very few reviews have compiled these studies into cohesive points. This may be due to the fact that these studies examine different environmental parameters, use different sampling, detection, and enumeration methodologies, and occur in diverse geographic locations. The current article is one approach to compile these studies into a cohesive work that assesses the importance of environmental determinants on the abundance of vibrios in coastal ecosystems.

  4. Molecular detection of native and invasive marine invertebrate larvae present in ballast and open water environmental samples collected in Puget Sound

    USGS Publications Warehouse

    Harvey, J.B.J.; Hoy, M.S.; Rodriguez, R.J.

    2009-01-01

    Non-native marine species have been and continue to be introduced into Puget Sound via several vectors including ship's ballast water. Some non-native species become invasive and negatively impact native species or near shore habitats. We present a new methodology for the development and testing of taxon specific PCR primers designed to assess environmental samples of ocean water for the presence of native and non-native bivalves, crustaceans and algae. The intergenic spacer regions (IGS; ITS1, ITS2 and 5.8S) of the ribosomal DNA were sequenced for adult samples of each taxon studied. We used these data along with those available in Genbank to design taxon and group specific primers and tested their stringency against artificial populations of plasmid constructs containing the entire IGS region for each of the 25 taxa in our study, respectively. Taxon and group specific primer sets were then used to detect the presence or absence of native and non-native planktonic life-history stages (propagules) from environmental samples of ballast water and plankton tow net samples collected in Puget Sound. This methodology provides an inexpensive and efficient way to test the discriminatory ability of taxon specific oligonucleotides (PCR primers) before creating molecular probes or beacons for use in molecular ecological applications such as probe hybridizations or microarray analyses. This work addresses the current need to develop molecular tools capable of diagnosing the presence of planktonic life-history stages from non-native marine species (potential invaders) in ballast water and other environmental samples. ?? 2008 Elsevier B.V.
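    The presence/absence call at the core of such a primer-based screen can be sketched with exact string matching. This is an illustration only: real primer design must handle degeneracy, mismatches and melting temperature, and the primers and reads below are invented short stand-ins.

```python
def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def taxon_present(forward, reverse, reads):
    """Score a taxon 'present' when a read contains both the forward
    primer and the reverse complement of the reverse primer."""
    target = revcomp(reverse)
    return any(forward in r and target in r for r in reads)

# Hypothetical taxon-specific primers screened against two short reads:
reads = ["AACGTTACGGATCCTTGAC", "TTGGCCAATT"]
print(taxon_present("AACGTT", "GTCAAG", reads))  # True
```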

  5. Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics.

    PubMed

    Patrizi, Alfredo; Pennestrì, Ettore; Valentini, Pier Paolo

    2016-01-01

    The paper deals with the comparison between a high-end marker-based acquisition system and a low-cost marker-less methodology for the assessment of the human posture during working tasks. The low-cost methodology is based on the use of a single Microsoft Kinect V1 device. The high-end acquisition system is the BTS SMART that requires the use of reflective markers to be placed on the subject's body. Three practical working activities involving object lifting and displacement have been investigated. The operational risk has been evaluated according to the lifting equation proposed by the American National Institute for Occupational Safety and Health. The results of the study show that the risk multipliers computed from the two acquisition methodologies are very close for all the analysed activities. In agreement to this outcome, the marker-less methodology based on the Microsoft Kinect V1 device seems very promising to promote the dissemination of computer-aided assessment of ergonomics while maintaining good accuracy and affordable costs. PRACTITIONER’S SUMMARY: The study is motivated by the increasing interest for on-site working ergonomics assessment. We compared a low-cost marker-less methodology with a high-end marker-based system. We tested them on three different working tasks, assessing the working risk of lifting loads. The two methodologies showed comparable precision in all the investigations.
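    The NIOSH risk multipliers mentioned above combine into the revised lifting equation, RWL = LC·HM·VM·DM·AM·FM·CM, with the lifting index LI = load/RWL. A sketch in metric units follows; the task geometry is invented, and the frequency (FM) and coupling (CM) multipliers are assumed to have already been looked up in the NIOSH tables.

```python
def recommended_weight_limit(H, V, D, A, FM=1.0, CM=1.0):
    """Revised NIOSH lifting equation, metric form."""
    LC = 23.0                          # load constant, kg
    HM = min(1.0, 25.0 / H)            # horizontal multiplier (H in cm)
    VM = 1.0 - 0.003 * abs(V - 75.0)   # vertical multiplier (V in cm)
    DM = min(1.0, 0.82 + 4.5 / D)      # distance multiplier (D in cm)
    AM = 1.0 - 0.0032 * A              # asymmetry multiplier (A in degrees)
    return LC * HM * VM * DM * AM * FM * CM

def lifting_index(load_kg, rwl):
    # LI > 1 indicates elevated risk for the task.
    return load_kg / rwl

# Hypothetical lifting task: hands 40 cm out, 50 cm up, 60 cm travel, 30 deg twist
rwl = recommended_weight_limit(H=40, V=50, D=60, A=30, FM=0.88, CM=0.95)
print(round(rwl, 2), round(lifting_index(10.0, rwl), 2))  # 8.99 1.11
```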

  6. Analysis and differentiation of paper samples by capillary electrophoresis and multivariate analysis.

    PubMed

    Fernández de la Ossa, Ma Ángeles; Ortega-Ojeda, Fernando; García-Ruiz, Carmen

    2014-11-01

    This work reports an investigation of different paper samples using CE with laser-induced fluorescence (LIF) detection. Papers from four different manufacturers (white-copy paper) and four different paper sources (white and recycled-copy papers, adhesive yellow paper notes and restaurant serviettes) were pulverized by scratching with a surgical scalpel prior to their derivatization with a fluorescent labeling agent, 8-aminopyrene-1,3,6-trisulfonic acid. Methodological conditions were evaluated: the derivatization conditions, with the aim of achieving the best S/N, and the separation conditions, in order to obtain optimum sensitivity and reproducibility. The best conditions, in terms of the fastest and easiest sample preparation procedure, minimal sample consumption, and the simplest and fastest CE procedure yielding the best analytical parameters, were applied to the analysis of the different paper samples. The registered electropherograms were pretreated (normalized and aligned) and subjected to multivariate analysis (principal component analysis). Successful discrimination among the paper samples was achieved. To the best of our knowledge, this work presents the first approach to achieve a successful differentiation among visually similar white-copy paper samples produced by different manufacturers, and papers from different sources, through their direct analysis by CE-LIF and subsequent comparative study of the complete cellulose electropherogram by chemometric tools. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
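The chemometric step described above (normalize the electropherograms, then apply principal component analysis) can be sketched as follows. This is an illustrative pipeline, not the authors' code: peak alignment is assumed to have been done beforehand, and total-area normalization stands in for whatever pretreatment the paper used.

```python
# Illustrative PCA on aligned electropherogram traces via SVD:
# area-normalize each trace, mean-center each abscissa point, and
# project onto the leading principal components so that samples with
# similar profiles group together in the score plot.
import numpy as np

def pca_scores(traces, n_components=2):
    """traces: (samples, points) matrix of aligned electropherograms."""
    x = np.asarray(traces, dtype=float)
    x = x / x.sum(axis=1, keepdims=True)   # total-area normalization
    x = x - x.mean(axis=0)                 # mean-center each point
    u, s, _ = np.linalg.svd(x, full_matrices=False)
    return u[:, :n_components] * s[:n_components]
```

Samples sharing a dominant peak end up with like-signed PC1 scores, which is the behavior a discrimination study relies on.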

  7. Capillary electrophoresis: Imaging of electroosmotic and pressure driven flow profiles in fused silica capillaries

    NASA Technical Reports Server (NTRS)

    Williams, George O., Jr.

    1996-01-01

    This study is a continuation of the summer 1994 NASA/ASEE Summer Faculty Fellowship Program. This effort is a portion of the ongoing work by the Biophysics Branch of the Marshall Space Flight Center. The work has focused recently on the separation of macromolecules using capillary electrophoresis (CE). Two primary goals were established for the effort this summer. First, we wanted to use capillary electrophoresis to study the electrohydrodynamics of a sample stream. Second, there was a need to develop a methodology for using CE for the separation of DNA molecules of various sizes. In order to achieve these goals we needed to establish a procedure for detection of a sample plug under the influence of an electric field. Detection of the sample with the microscope and image analysis system would be helpful in studying the electrohydrodynamics of this stream under load. Videotaping this process under the influence of an electric field in real time would also be useful. Imaging and photography of the sample/background electrolyte interface would be vital to this study. Finally, detection and imaging of electroosmotic flow and pressure-driven flow must be accomplished.

  8. Eye-gaze determination of user intent at the computer interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, J.H.; Schryver, J.C.

    1993-12-31

    Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming in or out. This methodology first collects samples of eye-gaze location, at 30 Hz, while the user views controlled stimuli just prior to a decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered, according to user-defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no-zoom conditions. The statistical procedure effectively generates heuristics for future assignments, based upon these variables. Future work will enhance the accuracy and precision of the modeling technique, and will empirically test users in controlled experiments.
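The per-frame clustering step the abstract describes, connecting gaze samples into a minimum spanning tree and then clustering according to user-defined parameters, can be sketched as follows. This is a minimal reconstruction, assuming the user-defined parameter is a maximum edge length; it is not the authors' implementation.

```python
# Build the MST over the gaze samples of one data frame (Prim's
# algorithm), cut edges longer than a threshold, and label the
# surviving connected components as clusters (union-find).
import math

def mst_gaze_clusters(points, max_edge):
    """points: list of (x, y) gaze samples; max_edge: cluster cut-off."""
    n = len(points)
    dist = lambda i, j: math.dist(points[i], points[j])
    in_tree = {0}
    kept_edges = []
    while len(in_tree) < n:                      # grow the MST
        d, i, j = min((dist(i, j), i, j)
                      for i in in_tree for j in range(n) if j not in in_tree)
        in_tree.add(j)
        if d <= max_edge:                        # cut long edges
            kept_edges.append((i, j))
    parent = list(range(n))                      # union-find on short edges
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i, j in kept_edges:
        parent[find(i)] = find(j)
    roots = {}
    labels = [roots.setdefault(find(i), len(roots)) for i in range(n)]
    return len(roots), labels
```

Two tight fixation groups separated by a saccade-sized gap then come out as two clusters, from which size, centroid position and similar statistics can be computed per frame.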

  9. Sharply rising prevalence of HIV infection in Bali: a critical assessment of the surveillance data.

    PubMed

    Januraga, P P; Wulandari, L P L; Muliawan, P; Sawitri, S; Causer, L; Wirawan, D N; Kaldor, J M

    2013-08-01

    This study critically examines serological survey data for HIV infection in selected populations in Bali, Indonesia. Sero-survey data reported by the Bali Health Office between 2000 and 2010 were collated, and provincial health staff were interviewed to gain a detailed understanding of survey methods. Analysis of time series restricted to districts that have used the same sampling methods and sites each year indicates that there has been a steady decline in HIV prevalence among prisoners, from 18.7% in 2000 to 4.3% in 2010. In contrast, HIV prevalence among women engaged in sex work increased sharply: from 0.62% in 2000 to 20.2% in 2010 (brothel based), and from 0% in 2000 to 7.2% in 2010 (non-brothel based). The highest prevalence was recorded among people who injected drugs. Recent surveys of gay men and transvestites also found high prevalences, at 18.7% and 40.9%, respectively. Review of the methodology used in the surveys identified inconsistencies in the sampling technique, sample numbers and sites over time, and incomplete recording of individual information about survey participants. Attention to methodological aspects and incorporation of additional information on behavioural factors will ensure that the surveillance system is in the best position to support prevention activities.

  10. Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis

    PubMed Central

    Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej

    2016-01-01

    Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world. PMID:28060297

  11. Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis.

    PubMed

    Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej

    2016-12-16

    Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world.

  12. Recent approaches for enhancing sensitivity in enantioseparations by CE.

    PubMed

    Sánchez-Hernández, Laura; García-Ruiz, Carmen; Luisa Marina, María; Luis Crego, Antonio

    2010-01-01

    This article reviews the latest methodological and instrumental improvements for enhancing sensitivity in chiral analysis by CE. The review covers literature from March 2007 until May 2009, that is, the works published after the appearance of the latest review article on the same topic by Sánchez-Hernández et al. [Electrophoresis 2008, 29, 237-251]. Off-line and on-line sample treatment techniques, on-line sample preconcentration strategies based on electrophoretic and chromatographic principles, and alternative detection systems to the widely employed UV/Vis detection in CE are the most relevant approaches discussed for improving sensitivity. Microchip technologies are also included since they can open up great possibilities to achieve sensitive and fast enantiomeric separations.

  13. QESA: Quarantine Extraterrestrial Sample Analysis Methodology

    NASA Astrophysics Data System (ADS)

    Simionovici, A.; Lemelle, L.; Beck, P.; Fihman, F.; Tucoulou, R.; Kiryukhina, K.; Courtade, F.; Viso, M.

    2018-04-01

    Our nondestructive, nm-scale, hyperspectral analysis methodology of combined X-ray/Raman/IR probes in BSL4 quarantine renders our patented mini-sample holder ideal for detecting extraterrestrial life. Our Stardust and Archean results validate it.

  14. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  15. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  16. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  17. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  18. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  19. Optimization of a Viability PCR Method for the Detection of Listeria monocytogenes in Food Samples.

    PubMed

    Agustí, Gemma; Fittipaldi, Mariana; Codony, Francesc

    2018-06-01

    Rapid detection of Listeria and other microbial pathogens in food is an essential part of quality control and is critical for ensuring the safety of consumers. Culture-based methods for detecting foodborne pathogens are time-consuming, laborious and cannot detect viable but non-culturable microorganisms, whereas viability PCR methodology provides quick results, is able to detect viable but non-culturable cells, and allows easier handling of large numbers of samples. The most critical requirement for the viability PCR technique is the complete exclusion of amplification signals from dead cells, and many improvements are being introduced to achieve this. In the present work, the yield of dead-cell DNA neutralization was enhanced by incorporating two new sample treatment strategies: a tube change combined with a double light treatment. This procedure was successfully tested using artificially contaminated food samples, showing improved neutralization of dead-cell DNA.

  20. Understanding the physiology of mindfulness: aortic hemodynamics and heart rate variability.

    PubMed

    May, Ross W; Bamber, Mandy; Seibert, Gregory S; Sanchez-Gonzalez, Marcos A; Leonard, Joseph T; Salsbury, Rebecca A; Fincham, Frank D

    2016-01-01

    Data were collected to examine autonomic and hemodynamic cardiovascular modulation underlying mindfulness from two independent samples. An initial sample (N = 185) underwent laboratory assessments of central aortic blood pressure and myocardial functioning to investigate the association between mindfulness and cardiac functioning. Controlling for religiosity, mindfulness demonstrated a strong negative relationship with myocardial oxygen consumption and left ventricular work, but not heart rate or blood pressure. A second sample (N = 124) underwent a brief (15 min) mindfulness-inducing intervention to examine the influence of mindfulness on cardiovascular autonomic modulation via blood pressure variability and heart rate variability. The intervention had a strong positive effect on cardiovascular modulation by decreasing cardiac sympathovagal tone, vasomotor tone, vascular resistance and ventricular workload. This research establishes a link between mindfulness and cardiovascular functioning via correlational and experimental methodologies in samples of mostly female undergraduates. Future directions for research are outlined.

  1. THE RHETORICAL USE OF RANDOM SAMPLING: CRAFTING AND COMMUNICATING THE PUBLIC IMAGE OF POLLS AS A SCIENCE (1935-1948).

    PubMed

    Lusinchi, Dominic

    2017-03-01

    The scientific pollsters (Archibald Crossley, George H. Gallup, and Elmo Roper) emerged onto the American news media scene in 1935. Much of what they did in the following years (1935-1948) was to promote both the political and scientific legitimacy of their enterprise. They sought to be recognized as the sole legitimate producers of public opinion. In this essay I examine the mostly overlooked rhetorical work deployed by the pollsters to publicize the scientific credentials of their polling activities, and the central role the concept of sampling has had in that pursuit. First, they distanced themselves from the failed straw poll by claiming that their sampling methodology based on quotas was informed by science. Second, although in practice they did not use random sampling, they relied on it rhetorically to derive the symbolic benefits of being associated with the "laws of probability." © 2017 Wiley Periodicals, Inc.

  2. Challenges and progress in making DNA-based AIS early ...

    EPA Pesticide Factsheets

    The ability of DNA barcoding to find additional species in hard-to-sample locations or hard-to-identify samples is well established. Nevertheless, adoption of DNA barcoding into regular monitoring programs has been slow, in part due to issues of standardization and interpretation that need resolving. In this presentation, we describe our progress towards incorporating DNA-based identification into broad-spectrum aquatic invasive species early-detection monitoring in the Laurentian Great Lakes. Our work uses community biodiversity information as the basis for evaluating survey performance for various taxonomic groups. Issues we are tackling in bringing DNA-based data to bear on AIS monitoring design include: 1) Standardizing methodology and work flow from field collection and sample handling through bioinformatics post-processing; 2) Determining detection sensitivity and accounting for inter-species differences in DNA amplification and primer affinity; 3) Differentiating sequencing and barcoding errors from legitimate new finds when range and natural history information is limited; and 4) Accounting for the different nature of morphology- vs. DNA-based biodiversity information in subsequent analysis (e.g., via species accumulation curves, multi-metric indices).

  3. Electrochemical sensing of total antioxidant capacity and polyphenol content in wine samples using amperometry online-coupled with microdialysis.

    PubMed

    Jakubec, Petr; Bancirova, Martina; Halouzka, Vladimir; Lojek, Antonin; Ciz, Milan; Denev, Petko; Cibicek, Norbert; Vacek, Jan; Vostalova, Jitka; Ulrichova, Jitka; Hrbac, Jan

    2012-08-15

    This work describes the method for total antioxidant capacity (TAC) and/or total content of phenolics (TCP) analysis in wines using microdialysis online-coupled with amperometric detection using a carbon microfiber working electrode. The system was tested on 10 selected wine samples, and the results were compared with total reactive antioxidant potential (TRAP), oxygen radical absorbance capacity (ORAC), and chemiluminescent determination of total antioxidant capacity (CL-TAC) methods using Trolox and catechin as standards. Microdialysis online-coupled with amperometric detection gives similar results to the widely used cyclic voltammetry methodology and closely correlates with ORAC and TRAP. The problem of electrode fouling is overcome by the introduction of an electrochemical cleaning step (1-2 min at the potential of 0 V vs Ag/AgCl). Such a procedure is sufficient to fully regenerate the electrode response for both red and white wine samples as well as catechin/Trolox standards. The appropriate size of microdialysis probes enables easy automation of the electrochemical TAC/TCP measurement using 96-well microtitration plates.

  4. Mass selective separation applied to radioisotopes of cesium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dion, Michael; Eiden, Greg; Farmer, Orville

    2016-07-22

    A technique that uses the intrinsic mass-based separation capability of a quadrupole mass spectrometer was developed to resolve spectral radiometric interference between two isotopes of the same element. In this work the starting sample was a combination of 137Cs and 134Cs whose activity was dominated by 137Cs; the methodology separated and "implanted" 134Cs, which was later quantified for spectral features and activity with traditional radiometric techniques. This work demonstrated a 134Cs/137Cs activity ratio enhancement of >4 orders of magnitude and complete removal of 137Cs spectral features from the implanted target mass (i.e., 134).

  5. Predicting the onset and persistence of episodes of depression in primary health care. The predictD-Spain study: Methodology

    PubMed Central

    Bellón, Juan Ángel; Moreno-Küstner, Berta; Torres-González, Francisco; Montón-Franco, Carmen; GildeGómez-Barragán, María Josefa; Sánchez-Celaya, Marta; Díaz-Barreiros, Miguel Ángel; Vicens, Catalina; de Dios Luna, Juan; Cervilla, Jorge A; Gutierrez, Blanca; Martínez-Cañavate, María Teresa; Oliván-Blázquez, Bárbara; Vázquez-Medrano, Ana; Sánchez-Artiaga, María Soledad; March, Sebastia; Motrico, Emma; Ruiz-García, Victor Manuel; Brangier-Wainberg, Paulette Renée; del Mar Muñoz-García, María; Nazareth, Irwin; King, Michael

    2008-01-01

    Background The effects of putative risk factors on the onset and/or persistence of depression remain unclear. We aim to develop comprehensive models to predict the onset and persistence of episodes of depression in primary care. Here we explain the general methodology of the predictD-Spain study and evaluate the reliability of the questionnaires used. Methods This is a prospective cohort study. A systematic random sample of general practice attendees aged 18 to 75 has been recruited in seven Spanish provinces. Depression is being measured with the CIDI at baseline, and at 6, 12, 24 and 36 months. A set of individual, environmental, genetic, professional and organizational risk factors are to be assessed at each follow-up point. In a separate reliability study, a proportional random sample of 401 participants completed the test-retest (251 researcher-administered and 150 self-administered) between October 2005 and February 2006. We have also checked 118,398 items for data entry from a random sample of 480 patients stratified by province. Results All items and questionnaires had good test-retest reliability for both methods of administration, except for the use of recreational drugs over the previous six months. Cronbach's alphas were good and their factorial analyses coherent for the three scales evaluated (social support from family and friends, dissatisfaction with paid work, and dissatisfaction with unpaid work). There were 191 (0.16%) data entry errors. Conclusion The items and questionnaires were reliable and data quality control was excellent. When we eventually obtain our risk index for the onset and persistence of depression, we will be able to determine the individual risk of each patient evaluated in primary health care. PMID:18657275
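The internal-consistency statistic the abstract reports, Cronbach's alpha, has a standard closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch follows; the data layout (respondents as rows, scale items as columns) is illustrative and not the study's code.

```python
# Cronbach's alpha for a respondents-by-items score matrix.
def cronbach_alpha(scores):
    """scores: list of rows, one row of item scores per respondent."""
    k = len(scores[0])                      # number of scale items
    def var(xs):                            # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

Perfectly parallel items yield alpha = 1; values around 0.7-0.9 are conventionally read as "good," which is the sense in which the abstract uses the term.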

  6. Examining the Self-Assembly of Rod-Coil Block Copolymers via Physics Based Polymer Models and Polarized X-Ray Scattering

    NASA Astrophysics Data System (ADS)

    Hannon, Adam; Sunday, Daniel; Windover, Donald; Liman, Christopher; Bowen, Alec; Khaira, Gurdaman; de Pablo, Juan; Delongchamp, Dean; Kline, R. Joseph

    Photovoltaics, flexible electronics, and stimuli-responsive materials all require enhanced methodology to examine their nanoscale molecular orientation. The mechanical, electronic, optical, and transport properties of devices made from these materials are all a function of this orientation. The polymer chains in these materials are best modeled as semi-flexible to rigid rods. Characterizing the rigidity and molecular orientation of these polymers non-invasively is currently being pursued by using polarized resonant soft X-ray scattering (P-RSoXS). In this presentation, we show recent work on implementing such a characterization process using a rod-coil block copolymer system in the rigid-rod limit. We first demonstrate how we have used physics-based models such as self-consistent field theory (SCFT) in non-polarized RSoXS work to fit scattering profiles for thin-film coil-coil PS-b-PMMA block copolymer systems. We then show that, by using a wormlike chain partition function in the SCFT formalism to model the rigid-rod block, the methodology can be used there as well to extract the molecular orientation of the rod block from a simulated P-RSoXS experiment. The results from the work show the potential of the technique to extract thermodynamic and morphological sample information.

  7. Improving core outcome set development: qualitative interviews with developers provided pointers to inform guidance.

    PubMed

    Gargon, Elizabeth; Williamson, Paula R; Young, Bridget

    2017-06-01

    The objective of the study was to explore core outcome set (COS) developers' experiences of their work to inform methodological guidance on COS development and identify areas for future methodological research. Semistructured, audio-recorded interviews were conducted with a purposive sample of 32 COS developers. Analysis of transcribed interviews was informed by the constant comparative method and framework analysis. Developers found COS development to be challenging, particularly in relation to patient participation and accessing funding. Their accounts raised fundamental questions about the status of COS development and whether it is consultation or research. Developers emphasized how the absence of guidance had affected their work and identified areas where guidance or evidence about COS development would be useful, including patient participation, ethics, international development, and implementation. They particularly wanted guidance on systematic reviews, Delphi, and consensus meetings. The findings raise important questions about the funding, status, and process of COS development and indicate ways that it could be strengthened. Guidance could help developers to strengthen their work, but over-specification could threaten quality in COS development. Guidance should therefore highlight common issues to consider and encourage tailoring of COS development to the context and circumstances of particular COS. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  8. 45 CFR 1356.71 - Federal review of the eligibility of children in foster care and the eligibility of foster care...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... by ACF statistical staff from the Adoption and Foster Care Analysis and Reporting System (AFCARS... primary review utilizing probability sampling methodologies. Usually, the chosen methodology will be simple random sampling, but other probability samples may be utilized, when necessary and appropriate. (3...

  9. Understanding palliative care on the heart failure care team: an innovative research methodology.

    PubMed

    Lingard, Lorelei A; McDougall, Allan; Schulz, Valerie; Shadd, Joshua; Marshall, Denise; Strachan, Patricia H; Tait, Glendon R; Arnold, J Malcolm; Kimel, Gil

    2013-05-01

    There is a growing call to integrate palliative care for patients with advanced heart failure (HF). However, the knowledge to inform integration efforts comes largely from interview and survey research with individual patients and providers. This work has been critically important in raising awareness of the need for integration, but it is insufficient to inform solutions that must be enacted not by isolated individuals but by complex care teams. Research methods are urgently required to support systematic exploration of the experiences of patients with HF, family caregivers, and health care providers as they interact as a care team. To design a research methodology that can support systematic exploration of the experiences of patients with HF, caregivers, and health care providers as they interact as a care team. This article describes in detail a methodology that we have piloted and are currently using in a multisite study of HF care teams. We describe three aspects of the methodology: the theoretical framework, an innovative sampling strategy, and an iterative system of data collection and analysis that incorporates four data sources and four analytical steps. We anticipate that this innovative methodology will support groundbreaking research in both HF care and other team settings in which palliative integration efforts are emerging for patients with advanced nonmalignant disease. Copyright © 2013 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.

  10. A critical analysis of the implementation of service user involvement in primary care research and health service development using normalization process theory.

    PubMed

    Tierney, Edel; McEvoy, Rachel; O'Reilly-de Brún, Mary; de Brún, Tomas; Okonkwo, Ekaterina; Rooney, Michelle; Dowrick, Chris; Rogers, Anne; MacFarlane, Anne

    2016-06-01

    There have been recent important advances in conceptualizing and operationalizing involvement in health research and health-care service development. However, problems persist in the field that impact on the scope for meaningful involvement to become a routine - normalized - way of working in primary care. In this review, we focus on current practice to critically interrogate factors known to be relevant for normalization - definition, enrolment, enactment and appraisal. Ours was a multidisciplinary, interagency team, with community representation. We searched EBSCO host for papers from 2007 to 2011 and engaged in an iterative, reflexive approach to sampling, appraising and analysing the literature following the principles of a critical interpretive synthesis approach and using Normalization Process Theory. Twenty-six papers were chosen from 289 papers, as a purposeful sample of work that is reported as service user involvement in the field. Few papers provided a clear working definition of service user involvement. The dominant identified rationale for enrolling service users in primary care projects was linked with policy imperatives for co-governance and emancipatory ideals. The majority of methodologies employed were standard health services research methods that do not qualify as research with service users. This indicates a lack of congruence between the stated aims and methods. Most studies only reported positive outcomes, raising questions about the balance or completeness of the published appraisals. To improve normalization of meaningful involvement in primary care, it is necessary to encourage explicit reporting of definitions, methodological innovation to enhance co-governance and dissemination of research processes and findings. © 2014 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  11. Quadratic partial eigenvalue assignment in large-scale stochastic dynamic systems for resilient and economic design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Das, Sonjoy; Goswami, Kundan; Datta, Biswa N.

    2014-12-10

    Failure of structural systems under dynamic loading can be prevented via active vibration control, which shifts the damped natural frequencies of the systems away from the dominant range of the loading spectrum. The damped natural frequencies and the dynamic load typically show significant variations in practice. A computationally efficient methodology based on the quadratic partial eigenvalue assignment technique and optimization under uncertainty has been formulated in the present work that rigorously accounts for these variations and results in an economic and resilient design of structures. A novel scheme based on hierarchical clustering and importance sampling is also developed in this work for accurate and efficient estimation of the probability of failure, to guarantee the desired resilience level of the designed system. Numerical examples are presented to illustrate the proposed methodology.
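Importance sampling for small failure probabilities, one ingredient of the scheme above, can be illustrated on a generic toy problem: estimating a Gaussian tail probability by sampling from a density shifted toward the failure region and reweighting by the likelihood ratio. This sketch is illustrative only and omits the paper's hierarchical-clustering component.

```python
# Estimate P(Z > threshold) for Z ~ N(0, 1) by sampling from N(shift, 1)
# and weighting each failure sample by phi(x) / phi(x - shift), which for
# unit-variance Gaussians simplifies to exp(-shift*x + shift^2/2).
import math
import random

def failure_probability_is(threshold, shift, n, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)
        if x > threshold:                       # "failure" event
            total += math.exp(-shift * x + shift * shift / 2.0)
    return total / n
```

Shifting the proposal mean to the threshold makes failures common instead of rare, so far fewer samples are needed than with crude Monte Carlo for the same variance; the same idea carries over to structural reliability, where the "failure region" is defined by a limit-state function.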

  12. Monitoring the fracture behavior of metal matrix composites by combined NDE methodologies

    NASA Astrophysics Data System (ADS)

    Kordatos, E. Z.; Exarchos, D. A.; Mpalaskas, A. C.; Matikas, T. E.

    2015-03-01

    The current work deals with the non-destructive evaluation (NDE) of the fatigue behavior of metal matrix composite (MMC) materials using infrared thermography (IRT) and acoustic emission (AE). AE monitoring was employed to record a wide spectrum of cracking events, enabling the characterization of the severity of fracture in relation to the applied load. IR thermography, as a non-destructive, real-time and non-contact technique, allows the detection of heat waves generated by the thermo-mechanical coupling during mechanical loading of the sample. In this study an IR methodology, based on the monitoring of the intrinsically dissipated energy, was applied for the determination of the fatigue limit of A359/SiCp composites. The thermographic monitoring is in agreement with the AE results, enabling reliable monitoring of the MMCs' fatigue behavior.

  13. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitfield, R.G; Biller, W.F.; Jusko, M.J.

    1996-06-01

    The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.
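    The core of the integration methodology described here, stripped of its probabilistic machinery, is an expected-risk sum: people in each exposure bin multiplied by the response probability at that exposure. A toy discretized sketch (all bin boundaries, headcounts, and response probabilities below are hypothetical, not the report's estimates):

```python
# Hypothetical discretization: expected number of people experiencing an
# acute effect = sum over exposure bins of (people exposed) x (response prob.)
exposure_bins_ppm = [0.06, 0.08, 0.10, 0.12]                # bin midpoints
people_exposed = [2_000_000, 800_000, 150_000, 20_000]      # invented counts
response_prob = [0.001, 0.01, 0.05, 0.12]                   # invented E-R curve

expected_cases = sum(n * p for n, p in zip(people_exposed, response_prob))
```

In the full methodology both the exposure counts and the exposure-response curve carry probability distributions, so the sum is evaluated many times to produce a distribution of risk rather than a single number.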

  14. Evaluation of Heavy Metals in Solid Waste Disposal Sites in Campinas City, Brazil Using Synchrotron Radiation Total Reflection X-Ray Fluorescence

    NASA Astrophysics Data System (ADS)

    de Faria, Bruna Fernanda; Moreira, Silvana

    2011-12-01

    The problem of solid waste in most countries is on the rise as a result of rapid population growth, urbanization, industrial development and changes in consumption habits. Amongst the various forms of waste disposal, landfills are today the most viable option for the Brazilian reality, both technically and economically. Proper landfill construction practices allow minimizing the effects of the two main sources of pollution from solid waste: landfill gas and slurry. However, minimizing is not synonymous with eliminating; consequently, landfills alone cannot resolve all the problems of solid waste disposal. The main goal of this work is to evaluate the content of trace elements in samples of groundwater, surface water and slurry arising from solid waste disposal sites in the city of Campinas, SP, Brazil. Samples were collected at the Delta, Santa Barbara and Pirelli landfills. At the Delta and Santa Barbara sites, values above the maximum permitted level established by CETESB for Cr, Mn, Fe, Ni and Pb were observed in samples of groundwater, while at the Pirelli site, the elements with concentrations above the permitted levels were Mn, Fe, Ba and Pb. At Delta, values above the levels permitted by CONAMA 357 legislation were also observed in surface water samples for Cr, Mn, Fe and Cu, whereas in slurry samples, values above the permitted levels were observed for Cr, Mn, Fe, Ni, Cu, Zn and Pb. Slurry samples were prepared in accordance with two extraction methodologies, EPA 3050B and EPA 200.8. Concentrations of Cr, Ni, Cu and Pb were higher than the limit established by CONAMA 357 for most samples collected at the different periods (dry and rainy) and for both extraction methodologies employed.

  16. What roles do team climate, roster control, and work life conflict play in shiftworkers' fatigue longitudinally?

    PubMed

    Pisarski, Anne; Barbour, Jennifer P

    2014-05-01

    The study aimed to examine shiftworkers' fatigue and the longitudinal relationships that impact fatigue, such as team climate, work life conflict, control of shifts and shift type, in shift-working nurses. We used a quantitative survey methodology and analysed the data with moderated hierarchical multiple regression. After matching across two time periods 18 months apart, the sample consisted of 166 nurses from one Australian hospital. Of these nurses, 61 worked two rotating day shifts (morning and afternoon/evening) and 105 were rotating shiftworkers who worked three shifts (morning, afternoon/evening and night). The findings suggest that control over shift scheduling can have significant effects on fatigue for both two-shift and three-shift workers. A significant negative relationship between positive team climate and fatigue was moderated by shift type. At both Time 1 and Time 2, work life conflict was the strongest predictor of concurrent fatigue, but it did not predict fatigue over time. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
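    The moderation analysis reported here (shift type moderating the climate-fatigue relationship) amounts to a regression with a product term. A minimal sketch on synthetic data, where the variable names and effect sizes are invented for illustration rather than taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
climate = rng.normal(size=n)             # standardized team-climate score
shift3 = rng.integers(0, 2, size=n)      # 1 = three-shift roster, 0 = two-shift
# Synthetic outcome: climate lowers fatigue, more strongly for 3-shift workers.
fatigue = (2.0 - 0.3 * climate - 0.4 * climate * shift3
           + rng.normal(scale=0.1, size=n))

# Moderated regression: intercept, main effects, and the interaction term.
X = np.column_stack([np.ones(n), climate, shift3, climate * shift3])
beta, *_ = np.linalg.lstsq(X, fatigue, rcond=None)
# beta[3] estimates the moderation (interaction) effect, about -0.4 here.
```

A significant interaction coefficient is what licenses the claim that the climate-fatigue slope differs between shift types; in the hierarchical version the product term is entered after the main effects.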

  17. Parents' nonstandard work schedules and child well-being: a critical review of the literature.

    PubMed

    Li, Jianghong; Johnson, Sarah E; Han, Wen-Jui; Andrews, Sonia; Kendall, Garth; Strazdins, Lyndall; Dockery, Alfred

    2014-02-01

    This paper provides a comprehensive review of empirical evidence linking parental nonstandard work schedules to four main child developmental outcomes: internalizing and externalizing problems, cognitive development, and body mass index. We evaluated the studies based on theory and methodological rigor (longitudinal data, representative samples, consideration of selection and information bias, confounders, moderators, and mediators). Of 23 studies published between 1980 and 2012 that met the selection criteria, 21 reported significant associations between nonstandard work schedules and an adverse child developmental outcome. The associations were partially mediated through parental depressive symptoms, low quality parenting, reduced parent-child interaction and closeness, and a less supportive home environment. These associations were more pronounced in disadvantaged families and when parents worked such schedules full time. We discuss the nuances, strengths, and limitations of the existing studies, and propose recommendations for future research.

  18. Sequential determination of nickel and cadmium in tobacco, molasses and refill solutions for e-cigarettes samples by molecular fluorescence.

    PubMed

    Talio, María Carolina; Alesso, Magdalena; Acosta, Mariano; Wills, Verónica S; Fernández, Liliana P

    2017-11-01

    In this work, a new procedure was developed for the separation and preconcentration of nickel(II) and cadmium(II) in several varied tobacco samples. Tobacco samples were selected considering the main products consumed by segments of the population, in particular the age (youth) and lifestyle of the consumer. To guarantee representative samples, a randomized sampling strategy was used. In the first step, chemofiltration on a nylon membrane is carried out employing eosin (Eo) and carbon nanotubes dispersed in sodium dodecylsulfate (SDS) solution (phosphate buffer, pH 7). Under these conditions, Ni(II) was selectively retained on the solid support. After that, the liquid filtrate containing Cd(II) was re-conditioned with acetic acid/acetate buffer solution (pH 5) before detection. Spectrofluorimetric determination of both metals was carried out on the solid support and in the filtered aqueous solution for Ni(II) and Cd(II), respectively. The solid surface fluorescence (SSF) determination was performed at λem = 545 nm (λex = 515 nm) for the Ni(II)-Eo complex, and the fluorescence of Cd(II)-Eo was quantified in aqueous solution using λem = 565 nm (λex = 540 nm). The calibration graphs were linear over the ranges 0.058-29.35 μg L(-1) for Ni(II) and 0.124-56.20 μg L(-1) for Cd(II), with detection limits of 0.019 and 0.041 μg L(-1) (S/N = 3). The developed methodology shows good sensitivity and adequate selectivity, and it was successfully applied to the determination of trace amounts of nickel and cadmium present in tobacco samples (refill solutions for e-cigarettes, snuff used in narghile (molasses) and traditional tobacco) with satisfactory results. The new methodology was validated against ICP-MS with adequate agreement. The proposed methodology represents a novel fluorescence application for Ni(II) and Cd(II) quantification with sensitivity and accuracy similar to atomic spectroscopies, introducing for the first time the quenching effect on SSF. Copyright © 2017 Elsevier B.V. All rights reserved.
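    Detection limits quoted at S/N = 3, as in this abstract, are conventionally computed from the blank scatter and the calibration slope. A generic sketch, with invented blank readings and an invented slope rather than the paper's calibration data:

```python
import statistics

# Hypothetical blank fluorescence readings (arbitrary units) and the
# slope of a linear calibration graph (signal units per ug/L).
blanks = [10.2, 10.5, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 10.1, 10.2]
slope = 52.0  # assumed calibration sensitivity

sd_blank = statistics.stdev(blanks)
lod = 3 * sd_blank / slope    # detection limit at S/N = 3
loq = 10 * sd_blank / slope   # quantification limit at the common S/N = 10
```

The same arithmetic applied to the calibration of each analyte yields the kind of sub-μg/L limits reported for Ni(II) and Cd(II) above.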

  19. Representation of scientific methodology in secondary science textbooks

    NASA Astrophysics Data System (ADS)

    Binns, Ian C.

    The purpose of this investigation was to assess the representation of scientific methodology in secondary science textbooks. More specifically, this study looked at how textbooks introduced scientific methodology and to what degree the examples from the rest of the textbook, the investigations, and the images were consistent with the text's description of scientific methodology, if at all. The sample included eight secondary science textbooks from two publishers, McGraw-Hill/Glencoe and Harcourt/Holt, Rinehart & Winston. Data consisted, first, of all student text and teacher text that referred to scientific methodology. Second, all investigations in the textbooks were analyzed. Finally, any images that depicted scientists working were also collected and analyzed. The text analysis and activity analysis used the ethnographic content analysis approach developed by Altheide (1996). The rubrics used for the text analysis and activity analysis were initially guided by the Benchmarks (AAAS, 1993), the NSES (NRC, 1996), and the nature of science literature. Preliminary analyses helped to refine each of the rubrics and grounded them in the data. Image analysis used stereotypes identified in the DAST literature. Findings indicated that all eight textbooks presented mixed views of scientific methodology in their initial descriptions. Five textbooks placed more emphasis on the traditional view and three placed more emphasis on the broad view. Results also revealed that the initial descriptions, examples, investigations, and images all emphasized the broad view for Glencoe Biology and the traditional view for Chemistry: Matter and Change. The initial descriptions, examples, investigations, and images in the other six textbooks were not consistent. Overall, the textbook with the most appropriate depiction of scientific methodology was Glencoe Biology and the textbook with the least appropriate depiction of scientific methodology was Physics: Principles and Problems. These findings suggest that compared to earlier investigations, textbooks have begun to improve in how they represent scientific methodology. However, there is still much room for improvement. Future research needs to consider how textbooks impact teachers' and students' understandings of scientific methodology.

  20. A Comparison of Temporal Dominance of Sensation (TDS) and Quantitative Descriptive Analysis (QDA™) to Identify Flavors in Strawberries.

    PubMed

    Oliver, Penelope; Cicerale, Sara; Pang, Edwin; Keast, Russell

    2018-04-01

    Temporal dominance of sensations (TDS) is a rapid descriptive method that offers a different magnitude of information to traditional descriptive analysis methodologies. This methodology considers the dynamic nature of eating, assessing sensory perception of foods as it changes throughout the eating event. Limited research has applied the TDS methodology to strawberries and subsequently validated the results against Quantitative Descriptive Analysis (QDA™). The aim of this research is to compare the TDS methodology using an untrained consumer panel with the results obtained via QDA™ with a trained sensory panel. The trained panelists (n = 12, minimum 60 hr each panelist) were provided with six strawberry samples (three cultivars at two maturation levels) and applied QDA™ techniques to profile each strawberry sample. Untrained consumers (n = 103) were provided with six strawberry samples (three cultivars at two maturation levels) and required to use the TDS methodology to assess the dominant sensations for each sample as they changed over time. Results revealed moderately comparable product configurations produced via TDS in comparison to QDA™ (RV coefficient = 0.559), as well as similar application of the sweet attribute (correlation coefficient of 0.895 at first bite). The TDS methodology, however, was not in agreement with the QDA™ methodology regarding more complex flavor terms. These findings support the notion that the lack of training on the definition of terms, together with the methodology's requirement to ignore all attributes other than the dominant one, provides a different magnitude of information than the QDA™ methodology. A comparison of TDS to traditional descriptive analysis indicates that TDS provides additional information to QDA™ regarding the lingering component of eating; the QDA™ results, however, provide more precise detail regarding singular attributes. Therefore, the TDS methodology has an application in industry when it is important to understand the lingering profile of products. However, it should not be employed as a replacement for traditional descriptive analysis methods. © 2018 Institute of Food Technologists®.
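    The RV coefficient used above to compare the TDS and QDA™ product maps measures similarity between two multivariate configurations of the same products. A minimal numpy sketch of the standard formula (the six-product coordinates are invented, not the strawberry data):

```python
import numpy as np

def rv_coefficient(X, Y):
    """RV similarity between two product configurations (rows = products)."""
    X = X - X.mean(axis=0)          # column-center each configuration
    Y = Y - Y.mean(axis=0)
    Sx, Sy = X @ X.T, Y @ Y.T       # product-by-product cross-product matrices
    return np.trace(Sx @ Sy) / np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))

# Six hypothetical products positioned in two 2-D sensory maps.
A = np.array([[0, 0], [1, 0], [2, 1], [0, 2], [1, 2], [2, 2]], float)
```

The coefficient ranges from 0 (unrelated configurations) to 1 (identical up to rotation and scaling), which is why a value of 0.559 is read as only moderate agreement between the two panels.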

  1. Methodological issues concerning the application of reliable laser particle sizing in soils

    NASA Astrophysics Data System (ADS)

    de Mascellis, R.; Impagliazzo, A.; Basile, A.; Minieri, L.; Orefice, N.; Terribile, F.

    2009-04-01

    During the past decade, the evolution of technologies has enabled laser diffraction (LD) to become a widespread means of measuring particle size distribution (PSD), replacing sedimentation and sieve analysis in many scientific fields, mainly due to its advantages of versatility, fast measurement and high reproducibility. Despite these developments, the soil science community has been quite reluctant to replace the good old sedimentation techniques (ST), possibly because of (i) the large complexity of the soil matrix, which induces different types of artefacts (aggregates, deflocculation dynamics, etc.), (ii) the difficulties in relating LD results to results obtained through sedimentation techniques and (iii) the limited size range of most LD equipment. More recently, LD granulometry has slowly been gaining appreciation in soil science, also because of some innovations, including an enlarged dynamic size range (0.01-2000 μm) and the ability to implement more powerful algorithms (e.g. Mie theory). Furthermore, LD PSD can be successfully used in the application of physically based pedo-transfer functions (e.g., the Arya and Paris model) for investigations of soil hydraulic properties, due to the direct determination of PSD in terms of volume percentage rather than mass percentage, thus eliminating the need to adopt the rough approximation of a single value for soil particle density in the prediction process. Most of the recent LD work performed in soil science deals with the comparison with sedimentation techniques and shows a general overestimation of the silt fraction accompanied by a general underestimation of the clay fraction; these well-known results must be related to the different physical principles behind the two techniques. Despite these efforts, it is indeed surprising that little if any work is devoted to more basic methodological issues related to the high sensitivity of LD to the quantity and quality of the soil samples. Our work aims both to analyse and to suggest technical solutions to address the following key methodological problems: (i) sample representativeness, given the very small amount of soil sample required by LD (e.g. 0.2 g) as compared to ST (e.g. 40 g for densimetry); (ii) PSD reading variability caused by the large number of instantaneous readings on a very small volume of the solution; (iii) the varying soil mineralogy, which in turn produces varying refractive indexes affecting PSD results; (iv) the determination of the mass density of the soil samples to compare results with those obtained from ST. Our results, referring to many different soil types (Vertisols, Regosols, Andosols, Calcisols, Luvisols), show that the listed major technical problems can be successfully addressed by the following set of solutions: (i) adequate subsampling in both solid and liquid phases (including the setup of a dilution system); (ii) a preliminary study of the PSD variability to reasonably increase the number of readings per sample; (iii, iv) a preliminary sensitivity analysis of both refractive indexes and mass density in accordance with the specific soil mineralogy.
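    Point (iv) above, comparing LD volume percentages with ST mass percentages, hinges on particle density: with a single density for all fractions the two scales coincide, and fraction-specific densities shift the result. A toy sketch of the conversion (the fraction densities and percentages are assumed values, not measurements from this study):

```python
# Hypothetical LD volume percentages and assumed per-fraction particle
# densities (g/cm3); converting to mass percentages weights each fraction
# by its density before renormalizing.
volume_pct = {"clay": 10.0, "silt": 45.0, "sand": 45.0}
density = {"clay": 2.75, "silt": 2.65, "sand": 2.65}

mass = {k: v * density[k] for k, v in volume_pct.items()}
total = sum(mass.values())
mass_pct = {k: 100 * m / total for k, m in mass.items()}
# With one shared density, mass_pct would equal volume_pct exactly.
```

This is why the abstract highlights determining sample mass density: the denser a fraction, the more its mass percentage exceeds its volume percentage.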

  2. Cosmological Constraints from the Redshift Dependence of the Volume Effect Using the Galaxy 2-point Correlation Function across the Line of Sight

    NASA Astrophysics Data System (ADS)

    Li, Xiao-Dong; Park, Changbom; Sabiu, Cristiano G.; Park, Hyunbae; Cheng, Cheng; Kim, Juhan; Hong, Sungwook E.

    2017-08-01

    We develop a methodology to use the redshift dependence of the galaxy 2-point correlation function (2pCF) across the line of sight, ξ(r⊥), as a probe of cosmological parameters. The positions of galaxies in comoving Cartesian space vary under different cosmological parameter choices, inducing a redshift-dependent scaling in the galaxy distribution. This geometrical distortion can be observed as a redshift-dependent rescaling in the measured ξ(r⊥). We test this methodology using a sample of 1.75 billion mock galaxies at redshifts 0, 0.5, 1, 1.5, and 2, drawn from the Horizon Run 4 N-body simulation. The shape of ξ(r⊥) can exhibit a significant redshift evolution when the galaxy sample is analyzed under a cosmology differing from the true, simulated one. Other contributions, including the gravitational growth of structure, galaxy bias, and redshift-space distortions, do not produce large redshift evolution in the shape. We show that one can make use of this geometrical distortion to constrain the values of cosmological parameters governing the expansion history of the universe. This method could be applicable to future large-scale structure surveys, especially photometric surveys such as DES and LSST, to derive tight cosmological constraints. This work is a continuation of our previous works on strategies to constrain cosmological parameters using redshift-invariant physical quantities.
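    The geometrical distortion exploited above comes from the cosmology dependence of comoving distance: the same galaxy pair maps to different comoving separations under different assumed parameters. A minimal sketch under flat ΛCDM with a simple trapezoidal integral (the two Ωm values are illustrative, not the paper's fitted parameters):

```python
import math

def comoving_distance(z, omega_m, h=0.7, steps=10_000):
    """Flat-LCDM line-of-sight comoving distance in Mpc (trapezoid rule)."""
    c = 299792.458          # speed of light, km/s
    H0 = 100.0 * h          # Hubble constant, km/s/Mpc

    def inv_E(zz):          # 1/E(z) for flat LCDM
        return 1.0 / math.sqrt(omega_m * (1 + zz) ** 3 + (1 - omega_m))

    dz = z / steps
    integral = 0.5 * (inv_E(0.0) + inv_E(z))
    for i in range(1, steps):
        integral += inv_E(i * dz)
    return (c / H0) * integral * dz

# The same galaxy pair at z = 1 subtends different comoving separations
# under different assumed omega_m, rescaling the measured xi(r_perp).
scale_ratio = comoving_distance(1.0, 0.31) / comoving_distance(1.0, 0.26)
```

A wrongly assumed Ωm rescales r⊥ by a redshift-dependent factor like `scale_ratio`, which is the signature the method searches for in the shape of ξ(r⊥).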

  3. Use of a machine learning framework to predict substance use disorder treatment success

    PubMed Central

    Acion, Laura; Kelmansky, Diana; van der Laan, Mark; Sahker, Ethan; Jones, DeShauna; Arndt, Stephan

    2017-01-01

    There are several methods for building prediction models. The wealth of currently available modeling techniques usually forces the researcher to judge, a priori, what will likely be the best method. Super learning (SL) is a methodology that facilitates this decision by combining all identified prediction algorithms pertinent for a particular prediction problem. SL generates a final model that is at least as good as any of the other models considered for predicting the outcome. The overarching aim of this work is to introduce SL to analysts and practitioners. This work compares the performance of logistic regression, penalized regression, random forests, deep learning neural networks, and SL to predict successful substance use disorders (SUD) treatment. A nationwide database including 99,013 SUD treatment patients was used. All algorithms were evaluated using the area under the receiver operating characteristic curve (AUC) in a test sample that was not included in the training sample used to fit the prediction models. AUC for the models ranged between 0.793 and 0.820. SL was superior to all but one of the algorithms compared. An explanation of SL steps is provided. SL is the first step in targeted learning, an analytic framework that yields double robust effect estimation and inference with fewer assumptions than the usual parametric methods. Different aspects of SL depending on the context, its function within the targeted learning framework, and the benefits of this methodology in the addiction field are discussed. PMID:28394905

  4. Use of a machine learning framework to predict substance use disorder treatment success.

    PubMed

    Acion, Laura; Kelmansky, Diana; van der Laan, Mark; Sahker, Ethan; Jones, DeShauna; Arndt, Stephan

    2017-01-01

    There are several methods for building prediction models. The wealth of currently available modeling techniques usually forces the researcher to judge, a priori, what will likely be the best method. Super learning (SL) is a methodology that facilitates this decision by combining all identified prediction algorithms pertinent for a particular prediction problem. SL generates a final model that is at least as good as any of the other models considered for predicting the outcome. The overarching aim of this work is to introduce SL to analysts and practitioners. This work compares the performance of logistic regression, penalized regression, random forests, deep learning neural networks, and SL to predict successful substance use disorders (SUD) treatment. A nationwide database including 99,013 SUD treatment patients was used. All algorithms were evaluated using the area under the receiver operating characteristic curve (AUC) in a test sample that was not included in the training sample used to fit the prediction models. AUC for the models ranged between 0.793 and 0.820. SL was superior to all but one of the algorithms compared. An explanation of SL steps is provided. SL is the first step in targeted learning, an analytic framework that yields double robust effect estimation and inference with fewer assumptions than the usual parametric methods. Different aspects of SL depending on the context, its function within the targeted learning framework, and the benefits of this methodology in the addiction field are discussed.
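    The model comparison in this abstract rests on the AUC in a held-out test sample, which has a simple rank interpretation: the probability that a randomly chosen positive case is scored above a randomly chosen negative one. A minimal sketch of that computation (the labels and scores below are toy values, not the SUD data):

```python
def auc(labels, scores):
    """AUC via the rank (Mann-Whitney) formulation; ties count as 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Computing this on a test sample never seen during training, as the study does, is what makes AUC values such as 0.793-0.820 comparable across the candidate algorithms and the super learner.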

  5. The Combined Effect of Mere Exposure, Counterattitudinal Advocacy, and Art Criticism Methodology on Upper Elementary and Junior High Students' Affect Toward Art Works.

    ERIC Educational Resources Information Center

    Hollingsworth, Patricia

    1983-01-01

    Results indicated that, for elementary students, art criticism was more effective than a combination of methodologies for developing positive affect toward art works. For junior high students, the combination methodology was more effective than art criticism, the exposure method, or the counterattitudinal advocacy method. (Author/SR)

  6. Generating or developing grounded theory: methods to understand health and illness.

    PubMed

    Woods, Phillip; Gapp, Rod; King, Michelle A

    2016-06-01

    Grounded theory is a qualitative research methodology that aims to explain social phenomena, e.g. why particular motivations or patterns of behaviour occur, at a conceptual level. Developed in the 1960s by Glaser and Strauss, the methodology has been reinterpreted by Strauss and Corbin in more recent times, resulting in different schools of thought. Differences arise from different philosophical perspectives concerning knowledge (epistemology) and the nature of reality (ontology), demanding that researchers make clear theoretical choices at the commencement of their research when choosing this methodology. Compared to other qualitative methods, it has the ability to achieve an understanding of, rather than simply describe, a social phenomenon. Achieving understanding, however, requires theoretical sampling to choose interviewees that can contribute most to the research and understanding of the phenomenon, and constant comparison of interviews to evaluate the same event or process in different settings or situations. Sampling continues until conceptual saturation is reached, i.e. when no new concepts emerge from the data. Data analysis focusses on categorising data (finding the main elements of what is occurring and why), and describing those categories in terms of properties (conceptual characteristics that define the category and give meaning) and dimensions (the variations within properties which produce specificity and range). Ultimately a core category which theoretically explains how all other categories are linked together is developed from the data. While achieving theoretical abstraction in the core category, it should be logical and capture all of the variation within the data. Theory development requires understanding of the methodology, not just working through a set of procedures. This article provides a basic overview, set in the literature surrounding grounded theory, for those wanting to increase their understanding and the quality of their research output.

  7. Development of Total Reflection X-ray fluorescence spectrometry quantitative methodologies for elemental characterization of building materials and their degradation products

    NASA Astrophysics Data System (ADS)

    García-Florentino, Cristina; Maguregui, Maite; Marguí, Eva; Torrent, Laura; Queralt, Ignasi; Madariaga, Juan Manuel

    2018-05-01

    In this work, a Total Reflection X-ray fluorescence (TXRF) spectrometry-based quantitative methodology is proposed for the elemental characterization of liquid extracts and solids belonging to old building materials and their degradation products from an early-20th-century building of high historic-cultural value in Getxo (Basque Country, north of Spain). This quantification strategy can be considered a faster methodology compared with traditional Energy or Wavelength Dispersive X-ray fluorescence (ED-XRF and WD-XRF) spectrometry based methodologies or other techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS). In particular, two kinds of liquid extracts were analysed: (i) water-soluble extracts from different mortars and (ii) acid extracts from mortars, black crusts, and calcium carbonate formations. In order to try to avoid the acid extraction step of the materials and their degradation products, direct TXRF measurement of suspensions of the powdered solids in water was also studied. With this aim, different parameters such as the deposition volume and the measuring time were studied for each kind of sample. Depending on the quantified element, the limits of detection achieved with the TXRF quantitative methodologies for liquid extracts and solids were around 0.01-1.2 and 2-200 mg/L, respectively. The quantification of K, Ca, Ti, Mn, Fe, Zn, Rb, Sr, Sn and Pb in the liquid extracts proved to be a faster alternative to other more classic quantification techniques (i.e. ICP-MS), accurate enough to obtain information about the composition of the acid-soluble part of the materials and their degradation products. Regarding the solid samples measured as suspensions, it was quite difficult to obtain stable and repeatable suspensions, which affected the accuracy of the results. To cope with this problem, correction factors based on the quantitative results obtained using ED-XRF were calculated to improve the accuracy of the TXRF results.
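    TXRF quantification of liquid samples is commonly done against a spiked internal standard: an element's concentration follows from its peak area relative to the standard's, corrected by relative sensitivities. A generic sketch of that arithmetic (the element, counts, sensitivities, and the Ga standard concentration are all assumed, not taken from this study):

```python
# Generic TXRF internal-standard quantification:
#   C_element = (N_element / N_IS) * (S_IS / S_element) * C_IS
def txrf_concentration(net_counts, sens, is_counts, is_sens, is_conc_mg_l):
    """Analyte concentration (mg/L) from net peak areas and sensitivities."""
    return (net_counts / is_counts) * (is_sens / sens) * is_conc_mg_l

# Assumed example: Ga internal standard spiked at 5 mg/L, Pb peak measured.
pb_mg_l = txrf_concentration(net_counts=12_000, sens=1.2,
                             is_counts=30_000, is_sens=1.0, is_conc_mg_l=5.0)
```

Because the standard and analyte sit in the same thin dried residue, matrix effects largely cancel, which is what makes TXRF a fast alternative to ICP-MS for extracts like these.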

  8. Identification of Optimum Magnetic Behavior of Nanocrystalline Co2FeAl Type Heusler Alloy Powders Using Response Surface Methodology

    NASA Astrophysics Data System (ADS)

    Srivastava, Y.; Srivastava, S.; Boriwal, L.

    2016-09-01

    Mechanical alloying is a novel solid-state process that has received considerable attention due to its many advantages over other conventional processes. In the present work, Co2FeAl Heusler alloy powder was prepared successfully from premixed elemental powders of cobalt (Co), iron (Fe) and aluminum (Al) in the stoichiometric ratio 60Co-26Fe-14Al (weight %) by a novel mechano-chemical route. The magnetic properties of the mechanically alloyed powders were characterized by vibrating sample magnetometer (VSM). A two-factor, five-level design matrix was applied to the experimental process, and the experimental results were used for response surface methodology. The interaction between the input process parameters and the response was established with the help of regression analysis, and the analysis of variance technique was applied to check the adequacy of the developed model and the significance of the process parameters. A test case study was performed with parameters that were not selected for the main experimentation but lay within the same range. Using response surface methodology, the process parameters were optimized to obtain improved magnetic properties, and the optimum process parameters were identified using numerical and graphical optimization techniques.
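    Response surface methodology of the kind used here typically fits a full second-order polynomial to the designed experiments and locates the stationary point of the fitted surface. A minimal two-factor sketch on synthetic data (the factor ranges, response, and optimum location are invented, not the milling parameters of this study):

```python
import numpy as np

# Synthetic response with a known optimum at (x1, x2) = (2, -1).
rng = np.random.default_rng(1)
x1 = rng.uniform(-3, 5, 40)
x2 = rng.uniform(-4, 2, 40)
y = 10 - (x1 - 2) ** 2 - 2 * (x2 + 1) ** 2 + rng.normal(scale=0.05, size=40)

# Fit the full second-order response-surface model by least squares.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point: solve grad(fitted quadratic) = 0.
H = np.array([[2 * b[3], b[5]],
              [b[5], 2 * b[4]]])
opt = np.linalg.solve(H, [-b[1], -b[2]])
```

ANOVA on the fitted coefficients (as in the abstract) then checks model adequacy before the stationary point is accepted as the optimum processing condition.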

  9. Guidelines for reporting methodological challenges and evaluating potential bias in dementia research.

    PubMed

    Weuve, Jennifer; Proust-Lima, Cécile; Power, Melinda C; Gross, Alden L; Hofer, Scott M; Thiébaut, Rodolphe; Chêne, Geneviève; Glymour, M Maria; Dufouil, Carole

    2015-09-01

    Clinical and population research on dementia and related neurologic conditions, including Alzheimer's disease, faces several unique methodological challenges. Progress to identify preventive and therapeutic strategies rests on valid and rigorous analytic approaches, but the research literature reflects little consensus on "best practices." We present findings from a large scientific working group on research methods for clinical and population studies of dementia, which identified five categories of methodological challenges as follows: (1) attrition/sample selection, including selective survival; (2) measurement, including uncertainty in diagnostic criteria, measurement error in neuropsychological assessments, and practice or retest effects; (3) specification of longitudinal models when participants are followed for months, years, or even decades; (4) time-varying measurements; and (5) high-dimensional data. We explain why each challenge is important in dementia research and how it could compromise the translation of research findings into effective prevention or care strategies. We advance a checklist of potential sources of bias that should be routinely addressed when reporting dementia research. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  10. The secret lives of experiments: methods reporting in the fMRI literature.

    PubMed

    Carp, Joshua

    2012-10-15

    Replication of research findings is critical to the progress of scientific understanding. Accordingly, most scientific journals require authors to report experimental procedures in sufficient detail for independent researchers to replicate their work. To what extent do research reports in the functional neuroimaging literature live up to this standard? The present study evaluated methods reporting and methodological choices across 241 recent fMRI articles. Many studies did not report critical methodological details with regard to experimental design, data acquisition, and analysis. Further, many studies were underpowered to detect any but the largest statistical effects. Finally, data collection and analysis methods were highly flexible across studies, with nearly as many unique analysis pipelines as there were studies in the sample. Because the rate of false positive results is thought to increase with the flexibility of experimental designs, the field of functional neuroimaging may be particularly vulnerable to false positives. In sum, the present study documented significant gaps in methods reporting among fMRI studies. Improved methodological descriptions in research reports would yield significant benefits for the field. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Analytical methodologies based on LC-MS/MS for monitoring selected emerging compounds in liquid and solid phases of the sewage sludge.

    PubMed

    Boix, C; Ibáñez, M; Fabregat-Safont, D; Morales, E; Pastor, L; Sancho, J V; Sánchez-Ramírez, J E; Hernández, F

    2016-01-01

    In this work, two analytical methodologies based on liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) were developed for quantification of emerging pollutants identified in sewage sludge after a previous wide-scope screening. The target list included 13 emerging contaminants (EC): thiabendazole, acesulfame, fenofibric acid, valsartan, irbesartan, salicylic acid, diclofenac, carbamazepine, 4-aminoantipyrine (4-AA), 4-acetyl aminoantipyrine (4-AAA), 4-formyl aminoantipyrine (4-FAA), venlafaxine and benzoylecgonine. The aqueous and solid phases of the sewage sludge were analyzed using Solid-Phase Extraction (SPE) and UltraSonic Extraction (USE) for sample treatment, respectively. The methods were validated at three concentration levels: 0.2, 2 and 20 μg L⁻¹ for the aqueous phase, and 50, 500 and 2000 μg kg⁻¹ for the solid phase of the sludge. In general, the methods were satisfactorily validated, showing good recoveries (70-120%) and precision (RSD < 20%). The limit of quantification (LOQ) was below 0.1 μg L⁻¹ in the aqueous phase and below 50 μg kg⁻¹ in the solid phase for the majority of the analytes. The applicability of the methods was tested by analyzing samples from a wider study on the degradation of emerging pollutants in sewage sludge under anaerobic digestion. The key benefits of these methodologies are: • SPE and USE are appropriate sample treatment procedures to extract the selected emerging contaminants from the aqueous phase of the sewage sludge and the solid residue. • LC-MS/MS is highly suitable for determining emerging contaminants in both sludge phases. • To our knowledge, the main metabolites of dipyrone had not been studied before in sewage sludge.
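The validation criteria reported above (recoveries of 70-120%, RSD < 20%) amount to a simple per-analyte acceptance test; a minimal sketch, with hypothetical analyte results:

```python
# Illustrative check of the validation criteria reported above (recovery
# 70-120%, RSD < 20%); analyte names and numbers are hypothetical.

def passes_validation(mean_recovery_pct, rsd_pct,
                      rec_range=(70.0, 120.0), max_rsd=20.0):
    """Return True if a spiked-sample result meets both acceptance criteria."""
    lo, hi = rec_range
    return lo <= mean_recovery_pct <= hi and rsd_pct < max_rsd

results = {
    "carbamazepine": (95.2, 8.1),   # (mean recovery %, RSD %) - hypothetical
    "diclofenac":    (68.4, 12.3),  # fails: recovery below 70%
}

for analyte, (rec, rsd) in results.items():
    print(analyte, "PASS" if passes_validation(rec, rsd) else "FAIL")
```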

  12. New semi-pilot-scale reactor to study the photocatalytic inactivation of phages contained in aerosol.

    PubMed

    Briggiler Marcó, Mariángeles; Negro, Antonio Carlos; Alfano, Orlando Mario; Quiberoni, Andrea Del Luján

    2017-04-12

    The aim of this work was to design and build a photocatalytic reactor (UV-A/TiO2) to study the inactivation of phages contained in bioaerosols, which constitute the main route of phage dissemination in industrial environments. The reactor is a closed system with recirculation that consists of a stainless steel chamber (cubic, with 60 cm sides) in which air containing the phage particles circulates, and an acrylic compartment with six borosilicate plates covered with TiO2. The reactor is externally illuminated by 20 UV-A lamps. Both compartments are connected by a fan to facilitate sample circulation. Samples are injected into the chamber using two piston nebulizers working in series, and several sampling methodologies (impinger/syringe, sampling on photocatalytic plates, and impact of air on a slide) were assayed. The reactor setup was carried out using phage B1 (Lactobacillus plantarum), and assays demonstrated a decrease in phage counts of 2.7 log orders after 1 h of photocatalytic treatment. Photonic efficiencies of inactivation were assessed by phage sampling on the photocatalytic plates or by impact of air on a glass slide at the photocatalytic reactor exit. Efficiencies of the same order of magnitude were observed with both sampling methods. This study demonstrated that the designed photocatalytic reactor effectively inactivates phage B1 (Lb. plantarum) contained in bioaerosols.

  13. Recent developments in atomic/nuclear methodologies used for the study of cultural heritage objects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Appoloni, Carlos Roberto

    2013-05-06

    Archaeometry has been an established area in the international community since the 1960s, with extensive use of atomic-nuclear methods in the characterization of art, archaeological and cultural heritage objects in general. In Brazil, however, until the early 1990s only archaeological dating employed methods from physics. It was only after this period that Brazilian groups became involved in the characterization of archaeological and art objects with these methodologies. The Laboratory of Applied Nuclear Physics, State University of Londrina (LFNA/UEL), pioneered this work in 1994 by introducing Archaeometry and related topics among its priority lines of research, after a member of LFNA became involved in 1992 with the possibilities of tomography in archaeometry, as well as the analysis of ancient bronzes by EDXRF. Since then, LFNA has been working with PXRF and portable Raman in several museums in Brazil, in field studies of cave paintings, and in the laboratory with material sent by archaeologists, as well as carrying out collaborative work with new groups that followed in this area. From 2003/2004, LAMFI/DFN/IFUSP and LIN/COPPE/UFRJ began to engage in the area, with ion-beam methodologies and PXRF respectively, incorporating other techniques over time; other groups followed later. Due to the growing number of laboratories and institutions/archaeologists/conservators interested in these applications, in May 2012 a network of available laboratories was created, based at http://www.dfn.if.usp.br/lapac. A panel of recent developments and applications of these methodologies by national groups will be presented, as well as a sampling of what has been done by leading groups abroad.

  14. Potential reuse of small household waste electrical and electronic equipment: Methodology and case study.

    PubMed

    Bovea, María D; Ibáñez-Forés, Valeria; Pérez-Belis, Victoria; Quemades-Beltrán, Pilar

    2016-07-01

    This study proposes a general methodology for assessing and estimating the potential reuse of small waste electrical and electronic equipment (sWEEE), focusing on devices classified as domestic appliances. Specific tests for visual inspection, function and safety have been defined for ten different types of household appliances (vacuum cleaner, iron, microwave, toaster, sandwich maker, hand blender, juicer, boiler, heater and hair dryer). After applying the tests, reuse protocols have been defined in the form of easy-to-apply checklists for each of the ten types of appliance evaluated. This methodology could be useful for reuse enterprises, since there is a lack of specific protocols, adapted to each type of appliance, for testing its reuse potential. After applying the methodology, electrical and electronic appliances (used or waste) can be segregated into three categories: the appliance works properly and can be classified as direct reuse (items can be used by a second consumer without prior repair operations); the appliance requires a later evaluation of its potential refurbishment and repair (restoration of products to working order, although with possible loss of quality); or the appliance needs to be discarded from the reuse process and goes directly to recycling. Results after applying the methodology to a sample of 87.7 kg (96 units) show that 30.2% of the appliances have no potential for reuse and should be diverted for recycling, while 67.7% require a subsequent evaluation of their potential refurbishment and repair, and only 2.1% could be directly reused with minor cleaning operations. This study represents a first approach to the "preparation for reuse" strategy encouraged by the European Directive on Waste Electrical and Electronic Equipment. However, more research is needed as an extension of this study, mainly on the feasibility of repair and refurbishment operations. Copyright © 2016 Elsevier Ltd. All rights reserved.
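The reported category shares can be reproduced from unit counts; a small sketch (the per-category counts of 29/65/2 are inferred from the stated percentages of the 96-unit sample):

```python
from collections import Counter

# Reproducing the category shares reported above (96 appliances: 30.2%
# recycle, 67.7% evaluate for repair/refurbishment, 2.1% direct reuse).
# The per-category unit counts (29/65/2) are inferred from those percentages.

outcomes = ["recycle"] * 29 + ["evaluate_repair"] * 65 + ["direct_reuse"] * 2
counts = Counter(outcomes)
n = len(outcomes)  # 96 units

for category, count in counts.items():
    print(f"{category}: {count}/{n} = {100 * count / n:.1f}%")
```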

  15. High Resolution Melting (HRM) applied to wine authenticity.

    PubMed

    Pereira, Leonor; Gomes, Sónia; Castro, Cláudia; Eiras-Dias, José Eduardo; Brazão, João; Graça, António; Fernandes, José R; Martins-Lopes, Paula

    2017-02-01

    Wine authenticity methods are in increasing demand mainly in Denomination of Origin designations. The DNA-based methodologies are a reliable means of tracking food/wine varietal composition. The main aim of this work was the study of High Resolution Melting (HRM) application as a screening method for must and wine authenticity. Three sample types (leaf, must and wine) were used to validate the three developed HRM assays (Vv1-705bp; Vv2-375bp; and Vv3-119bp). The Vv1 HRM assay was only successful when applied to leaf and must samples. The Vv2 HRM assay successfully amplified all sample types, allowing genotype discrimination based on melting temperature values. The smallest amplicon, Vv3, produced a coincident melting curve shape in all sample types (leaf and wine) with corresponding genotypes. This study presents sensitive, rapid and efficient HRM assays applied for the first time to wine samples suitable for wine authenticity purposes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    PubMed

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. 
Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries.
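The response, cooperation, refusal and contact rates cited above follow AAPOR-style definitions; a simplified sketch with hypothetical refusal/noncontact counts (the full AAPOR formulas distinguish several rate variants and estimate the eligibility of unknown cases, which is omitted here):

```python
# Simplified, AAPOR-style outcome rates. Completed (9,469) and partial
# (3,547) interview counts come from the study; refusal, noncontact and
# other dispositions below are invented for illustration.

def survey_rates(complete, partial, refusal, noncontact, other):
    eligible = complete + partial + refusal + noncontact + other
    return {
        "response":    (complete + partial) / eligible,
        "cooperation": (complete + partial) / (complete + partial + refusal),
        "refusal":     refusal / eligible,
        "contact":     (complete + partial + refusal + other) / eligible,
    }

rates = survey_rates(complete=9469, partial=3547, refusal=1100,
                     noncontact=25000, other=500)
for name, value in rates.items():
    print(f"{name}: {value:.0%}")
```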

  17. Sensitive trace enrichment of environmental antiandrogen vinclozolin from natural waters and sediment samples using hollow-fiber liquid-phase microextraction.

    PubMed

    Lambropoulou, Dimitra A; Albanis, Triantafyllos A

    2004-12-17

    Concern over the endocrine-disrupting effects of vinclozolin on biota has raised interest in the environmental fate of this compound. In this respect, the present study investigates the feasibility of applying a novel quantitative method, liquid-phase microextraction (LPME), to determine this environmental antiandrogen in environmental samples such as water and sediment. The technique involved the use of a small amount (3 microL) of organic solvent impregnated in a hollow fiber membrane, which was attached to the needle of a conventional GC syringe. The extracted samples were analyzed by gas chromatography coupled with electron-capture detection. Experimental LPME conditions such as extraction solvent, stirring rate, NaCl content and pH were tested. Once LPME was optimized, the performance of the proposed technique was evaluated for the determination of vinclozolin in different types of natural water samples. The recovery from spiked water samples ranged from 80 to 99%. The procedure was adequate for quantification of vinclozolin in waters at levels of 0.010 to 50 microg/L (r > 0.994) with a detection limit of 0.001 microg/L (S/N = 3). Natural sediment samples from the Aliakmonas River area (Macedonia, Greece) spiked with the target antiandrogen were liquid-liquid extracted and analyzed by the methodology developed in this work. No significant interferences from the sample matrix were noticed, indicating that the reported methodology is an innovative approach to sample preparation in sediment analysis, with a considerable improvement in the achieved detection limits. The results demonstrated that apart from analyte enrichment, the proposed LPME procedure also serves as a clean-up method and could be successfully applied to determine trace amounts of vinclozolin in water and sediment samples.
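A detection limit quoted at S/N = 3, as above, is commonly estimated by scaling a measured low-level standard by its observed signal-to-noise ratio; a minimal sketch with hypothetical signal values:

```python
# Estimating the limit of detection (LOD) at S/N = 3 by scaling a measured
# low-level standard, assuming signal is proportional to concentration near
# the LOD. The signal and noise values below are hypothetical.

def lod_from_snr(conc, signal, noise, target_snr=3.0):
    """Concentration expected to give a signal-to-noise of target_snr."""
    snr = signal / noise
    return conc * target_snr / snr

# e.g. a 0.010 ug/L standard measured at S/N = 30 implies LOD = 0.001 ug/L,
# consistent with the value reported above
print(lod_from_snr(conc=0.010, signal=300.0, noise=10.0))
```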

  18. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    PubMed Central

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. 
Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries. PMID:29351349

  19. Shift work and sleep disorder among textile mill workers in Bahir Dar, northwest Ethiopia.

    PubMed

    Abebe, Y; Fantahun, M

    1999-07-01

    To assess the length and quality of sleep among shift workers at Bahir Dar textile mill. A cross-sectional study using a structured questionnaire that contained sociodemographic variables, duration of work, work schedule, number of sleeping hours, sleep disorders, and associated reasons for such disorders. A textile mill in Bahir Dar, northwest Ethiopia. A random sample of 394 production workers of the mill. Sleep disorders, and the impact of the external and home environment on sleep. The mean duration of work in the factory was 25.4 +/- 7.1 years. Ninety-seven per cent of the study population work in a rotating eight-hour shift system. The mean number of hours a worker sleeps after a worked shift was 5.1 +/- 2.3. Two hundred thirty (58.4%) claimed to experience a sleep disorder. Sleep disturbance was significantly associated with rotating shift work, external environmental noise, and working in the spinning department. The majority of the workers in Bahir Dar textile mill experienced sleep disturbances as detailed in the study methodology.
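The reported prevalence (230 of 394 workers, 58.4%) can be accompanied by a normal-approximation confidence interval; the interval below is illustrative only and is not reported in the abstract:

```python
from math import sqrt

# Prevalence of reported sleep disorder (230 of 394 workers = 58.4%) with a
# normal-approximation (Wald) 95% confidence interval, shown only to
# illustrate the calculation; the abstract reports the point estimate alone.

def proportion_ci(successes, n, z=1.96):
    p = successes / n
    half_width = z * sqrt(p * (1 - p) / n)
    return p, p - half_width, p + half_width

p, lo, hi = proportion_ci(230, 394)
print(f"{p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```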

  20. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    PubMed

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. 
To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.

  1. Introducing a new methodology for the calculation of local philicity and multiphilic descriptor: an alternative to the finite difference approximation

    NASA Astrophysics Data System (ADS)

    Sánchez-Márquez, Jesús; Zorrilla, David; García, Víctor; Fernández, Manuel

    2018-07-01

    This work presents a new development based on the condensation scheme proposed by Chamorro and Pérez, in which new terms to correct the frozen molecular orbital approximation have been introduced (improved frontier molecular orbital approximation). The changes performed on the original development allow taking into account the orbital relaxation effects, providing results equivalent to those achieved by the finite difference approximation and leading to a methodology with great advantages. Local reactivity indices based on this new development have been obtained for a sample set of molecules, and they have been compared with indices based on the frontier molecular orbital and finite difference approximations. A new definition based on the improved frontier molecular orbital methodology for the dual descriptor index is also shown. In addition, taking advantage of the characteristics of the definitions obtained with the new condensation scheme, the local philicity descriptor is analysed by separating the components corresponding to the frontier molecular orbital approximation and orbital relaxation effects; the local multiphilic descriptor is analysed in the same way. Finally, the effect of the basis set is studied, and calculations using DFT, CI and Møller-Plesset methodologies are performed to analyse the consequence of different electronic-correlation levels.
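The finite difference approximation that the improved frontier-orbital method is benchmarked against condenses Fukui functions from atomic charges of the (N-1)-, N- and (N+1)-electron systems; a minimal sketch with hypothetical charges:

```python
# Finite-difference condensed Fukui functions and the dual descriptor,
# i.e. the reference scheme the improved frontier molecular orbital method
# is compared against. The atomic charges below are hypothetical.

def fukui_indices(q_minus, q_0, q_plus):
    """Per-atom f+, f- and dual descriptor from atomic charges of the
    (N-1)-, N- and (N+1)-electron systems (finite difference approximation)."""
    f_plus = [qn - qp for qn, qp in zip(q_0, q_plus)]    # f+ = q(N) - q(N+1)
    f_minus = [qm - qn for qm, qn in zip(q_minus, q_0)]  # f- = q(N-1) - q(N)
    dual = [fp - fm for fp, fm in zip(f_plus, f_minus)]  # df = f+ - f-
    return f_plus, f_minus, dual

q_minus = [0.45, -0.15, 0.70]    # charges, cation  (hypothetical)
q_0     = [0.20, -0.40, 0.20]    # charges, neutral
q_plus  = [-0.10, -0.55, -0.35]  # charges, anion

f_plus, f_minus, dual = fukui_indices(q_minus, q_0, q_plus)
print(dual)  # positive -> electrophilic site, negative -> nucleophilic site
```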

  2. HIV Risks, Testing, and Treatment in the Former Soviet Union: Challenges and Future Directions in Research and Methodology.

    PubMed

    Saadat, Victoria M

    2015-01-01

    The dissolution of the USSR resulted in independence for the constituent republics but left them battling an unstable economic environment and healthcare system. Increases in injection drug use, prostitution, and migration were widespread responses to this transition and have contributed to the emergence of an HIV epidemic in the countries of the former Soviet Union. Researchers have begun to identify the risks of HIV infection as well as the barriers to HIV testing and treatment in the former Soviet Union. Significant methodological challenges have arisen and need to be addressed. The objective of this review is to determine common threads in HIV research in the former Soviet Union and provide useful recommendations for future research studies. In this systematic review of the literature, PubMed was searched for English-language studies using the key search terms "HIV", "AIDS", "human immunodeficiency virus", "acquired immune deficiency syndrome", "Central Asia", "Kazakhstan", "Kyrgyzstan", "Uzbekistan", "Tajikistan", "Turkmenistan", "Russia", "Ukraine", "Armenia", "Azerbaijan", and "Georgia". Studies were evaluated against eligibility criteria for inclusion. Thirty-nine studies were identified across the two main topic areas of HIV risk and barriers to testing and treatment, themes subsequently referred to as "risk" and "barriers". Study design was predominantly cross-sectional. The most frequently used sampling methods were peer-to-peer and non-probabilistic sampling. The most frequently reported risks were condom misuse, risky intercourse, and unsafe practices among injection drug users. Common barriers to testing included that testing was inconvenient and that results would not remain confidential. Frequent barriers to treatment were based on distrust in the treatment system. The findings of this review reveal methodological limitations that span the existing studies. 
Small sample size, cross-sectional design, and non-probabilistic sampling methods were frequently reported limitations. Future work is needed to examine barriers to testing and treatment as well as longitudinal studies on HIV risk over time in most-at-risk populations.

  3. Application of the experimental design of experiments (DoE) for the determination of organotin compounds in water samples using HS-SPME and GC-MS/MS.

    PubMed

    Coscollà, Clara; Navarro-Olivares, Santiago; Martí, Pedro; Yusà, Vicent

    2014-02-01

    When attempting to discover the important factors and then optimise a response by tuning them, experimental design (design of experiments, DoE) provides a powerful suite of statistical methods for identifying significant factors and optimising a response with respect to them during method development. In this work, a headspace solid-phase micro-extraction (HS-SPME) method combined with gas chromatography tandem mass spectrometry (GC-MS/MS) for the simultaneous determination of six important organotin compounds, namely monobutyltin (MBT), dibutyltin (DBT), tributyltin (TBT), monophenyltin (MPhT), diphenyltin (DPhT) and triphenyltin (TPhT), was optimized using a statistical design of experiments (DoE). The analytical method is based on ethylation with NaBEt4 and simultaneous headspace solid-phase micro-extraction of the derivative compounds followed by GC-MS/MS analysis. The main experimental parameters influencing the extraction efficiency selected for optimization were pre-incubation time, incubation temperature, agitator speed, extraction time, desorption temperature, buffer (pH, concentration and volume), headspace volume, sample salinity, preparation of standards, ultrasonic time and desorption time in the injector. The main factors affecting the GC-IT-MS/MS response (excitation voltage, excitation time, ion source temperature, isolation time and electron energy) were also optimized using the same statistical design of experiments. The proposed method presented good linearity (coefficient of determination R² > 0.99) and repeatability (1-25%) for all the compounds under study. The accuracy of the method, measured as the average percentage recovery of the compounds in spiked surface and marine waters, was higher than 70% for all compounds studied. Finally, the optimized methodology was applied to real samples, enabling the simultaneous determination of all the compounds under study in surface and marine waters from the Valencia region (Spain). © 2013 Elsevier B.V. All rights reserved.
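A screening design of the kind described above enumerates combinations of factor levels; a minimal two-level full factorial sketch over three of the HS-SPME factors (the levels chosen are hypothetical):

```python
from itertools import product

# A minimal two-level full factorial design over three of the HS-SPME
# factors listed above; factor levels are hypothetical. Real DoE software
# would add center points, fractional designs, and response modelling.

factors = {
    "extraction_time_min": [10, 30],
    "incubation_temp_C":   [40, 60],
    "nacl_percent":        [0, 20],
}

names = list(factors)
runs = [dict(zip(names, levels)) for levels in product(*factors.values())]

print(len(runs))  # 2^3 = 8 experimental runs
for run in runs:
    print(run)
```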

  4. Medical education... meet Michel Foucault.

    PubMed

    Hodges, Brian D; Martimianakis, Maria A; McNaughton, Nancy; Whitehead, Cynthia

    2014-06-01

    There have been repeated calls for the greater use of conceptual frameworks and of theory in medical education. Although it is familiar to few medical educators, Michel Foucault's work is a helpful theoretical and methodological source. This article explores what it means to use a 'Foucauldian approach', presents a sample of Foucault's historical-genealogical studies that are relevant to medical education, and introduces the work of four researchers currently undertaking Foucauldian-inspired medical education research. Although they are not without controversy, Foucauldian approaches are employed by an increasing number of scholars and are helpful in shedding light on what it is possible to think, say and be in medical education. Our hope in sharing this Foucauldian work and perspective is that we might stimulate a dialogue that is forward-looking and optimistic about the possibilities for change in medical education. © 2014 John Wiley & Sons Ltd.

  5. Methodological framework for projecting the potential loss of intraspecific genetic diversity due to global climate change

    PubMed Central

    2012-01-01

    Background While research on the impact of global climate change (GCC) on ecosystems and species is flourishing, a fundamental component of biodiversity – molecular variation – has not yet received its due attention in such studies. Here we present a methodological framework for projecting the loss of intraspecific genetic diversity due to GCC. Methods The framework consists of multiple steps that combine 1) hierarchical genetic clustering methods to define comparable units of inference, 2) species accumulation curves (SAC) to infer sampling completeness, and 3) species distribution modelling (SDM) to project the genetic diversity loss under GCC. We suggest procedures for existing data sets as well as specifically designed studies. We illustrate the approach with two worked examples from a land snail (Trochulus villosus) and a caddisfly (Smicridea (S.) mucronata). Results Sampling completeness was diagnosed on the third coarsest haplotype clade level for T. villosus and the second coarsest for S. mucronata. For both species, a substantial species range loss was projected under the chosen climate scenario. However, despite substantial differences in data set quality concerning spatial sampling and sampling depth, no loss of haplotype clades due to GCC was predicted for either species. Conclusions The suggested approach presents a feasible method to tap the rich resources of existing phylogeographic data sets and guide the design and analysis of studies explicitly designed to estimate the impact of GCC on a currently still neglected level of biodiversity. PMID:23176586
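Step 2 of the framework uses species accumulation curves to diagnose sampling completeness; a minimal sketch of the underlying computation, with hypothetical per-site haplotype-clade sets:

```python
# A minimal species-accumulation-curve (SAC) computation of the kind used
# in step 2 of the framework above: cumulative number of distinct haplotype
# clades as sampling sites are added. Site data are hypothetical.

def accumulation_curve(samples):
    """samples: list of sets of haplotype clades observed per sampling site."""
    seen, curve = set(), []
    for site in samples:
        seen |= site
        curve.append(len(seen))
    return curve

sites = [{"A", "B"}, {"B", "C"}, {"C"}, {"A", "D"}]
print(accumulation_curve(sites))  # a flattening curve suggests completeness
```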

  6. Scalability of a Methodology for Generating Technical Trading Rules with GAPs Based on Risk-Return Adjustment and Incremental Training

    NASA Astrophysics Data System (ADS)

    de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.

    In previous works a methodology was defined based on the design of a GAP genetic algorithm and an incremental training technique adapted to learning series of stock market values. The GAP technique consists of a fusion of GP and GA. The GAP algorithm implements the automatic search for crisp trading rules, taking as training objectives both the optimization of the return obtained and the minimization of the assumed risk. Applying the proposed methodology, rules have been obtained for a period of eight years of the S&P500 index. The achieved adjustment of the return-risk relation has generated rules whose returns in the testing period are far superior to those obtained by applying habitual methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a market different from that of the previous work.

  7. Methodological considerations in the use of audio diaries in work psychology: Adding to the qualitative toolkit.

    PubMed

    Crozier, Sarah E; Cassell, Catherine M

    2016-06-01

    The use of longitudinal methodology as a means of capturing the intricacies in complex organizational phenomena is well documented, and many different research strategies for longitudinal designs have been put forward from both a qualitative and quantitative stance. This study explores a specific emergent qualitative methodology, audio diaries, and assesses their utility for work psychology research drawing on the findings from a four-stage study addressing transient working patterns and stress in UK temporary workers. Specifically, we explore some important methodological, analytical and technical issues for practitioners and researchers who seek to use these methods and explain how this type of methodology has much to offer when studying stress and affective experiences at work. We provide support for the need to implement pluralistic and complementary methodological approaches in unearthing the depth in sense-making and assert their capacity to further illuminate the process orientation of stress. This study illustrates the importance of verbalization in documenting stress and affective experience as a mechanism for accessing cognitive processes in making sense of such experience. This study compares audio diaries with more traditional qualitative methods to assess applicability to different research contexts. This study provides practical guidance and a methodological framework for the design of audio diary research, taking into account challenges and solutions for researchers and practitioners.

  8. Novel methodology to isolate microplastics from vegetal-rich samples.

    PubMed

    Herrera, Alicia; Garrido-Amador, Paloma; Martínez, Ico; Samper, María Dolores; López-Martínez, Juan; Gómez, May; Packard, Theodore T

    2018-04-01

    Microplastics are small plastic particles, globally distributed throughout the oceans. To properly study them, all the methodologies for their sampling, extraction, and measurement should be standardized. For heterogeneous samples containing sediments, animal tissues and zooplankton, several procedures have been described. However, definitive methodologies for samples rich in algae and plant material have not yet been developed. The aim of this study was to find the best extraction protocol for vegetal-rich samples by comparing the efficacies of five previously described digestion methods and a novel density separation method. A protocol using 96% ethanol for density separation performed better than the five digestion methods tested, even better than H2O2 digestion. As it was the most efficient, simple, safe and inexpensive method for isolating microplastics from vegetal-rich samples, we recommend it as a standard separation method. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Publishing Linked Open Data for Physical Samples - Lessons Learned

    NASA Astrophysics Data System (ADS)

    Ji, P.; Arko, R. A.; Lehnert, K.; Bristol, S.

    2016-12-01

    Most data and information about physical samples and associated sampling features currently reside in relational databases. Integrating common concepts from various databases has motivated us to publish Linked Open Data for collections of physical samples, using Semantic Web technologies including the Resource Description Framework (RDF), RDF Query Language (SPARQL), and Web Ontology Language (OWL). The goal of our work is threefold: to evaluate and select ontologies at different granularities for common concepts; to establish best practices and develop a generic methodology for publishing physical sample data stored in relational databases as Linked Open Data; and to reuse standard community vocabularies from the International Commission on Stratigraphy (ICS), Global Volcanism Program (GVP), General Bathymetric Chart of the Oceans (GEBCO), and others. Our work leverages developments in the EarthCube GeoLink project and the Interdisciplinary Earth Data Alliance (IEDA) facility for modeling and extracting physical sample data stored in relational databases. Reusing ontologies developed by GeoLink and IEDA has facilitated discovery and integration of data and information across multiple collections including the USGS National Geochemical Database (NGDB), System for Earth Sample Registration (SESAR), and Index to Marine & Lacustrine Geological Samples (IMLGS). We have evaluated, tested, and deployed Linked Open Data tools including Morph, Virtuoso Server, LodView, LodLive, and YASGUI for converting, storing, representing, and querying data in a knowledge base (RDF triplestore). Using persistent identifiers such as Open Researcher & Contributor IDs (ORCIDs) and International Geo Sample Numbers (IGSNs) at the record level makes it possible for other repositories to link related resources such as persons, datasets, documents, expeditions, awards, etc. to samples, features, and collections.
This work is supported by the EarthCube "GeoLink" project (NSF# ICER14-40221 and others) and the "USGS-IEDA Partnership to Support a Data Lifecycle Framework and Tools" project (USGS# G13AC00381).
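    To make the record-level linking described above concrete, the sketch below emits a physical-sample record as RDF statements in N-Triples syntax, connecting a hypothetical IGSN-identified sample to a hypothetical ORCID-identified researcher. The sample URI, ORCID, and the predicate URIs under example.org are placeholders, not the GeoLink/IEDA ontology terms.

```python
# Minimal sketch of publishing a physical-sample record as RDF triples
# (N-Triples syntax). Subject, object, and predicate URIs are illustrative
# placeholders, not actual registered identifiers or ontology terms.

SAMPLE = "http://igsn.org/IEXXX0001"               # hypothetical IGSN URI
PERSON = "https://orcid.org/0000-0000-0000-0000"   # hypothetical ORCID

def triple(s, p, o):
    """Format one N-Triples statement; o may be a URI or a plain literal."""
    obj = f"<{o}>" if o.startswith("http") else f'"{o}"'
    return f"<{s}> <{p}> {obj} ."

triples = [
    triple(SAMPLE, "http://www.w3.org/1999/02/22-rdf-syntax-ns#type",
           "http://example.org/schema/PhysicalSample"),
    triple(SAMPLE, "http://example.org/schema/collector", PERSON),
    triple(SAMPLE, "http://www.w3.org/2000/01/rdf-schema#label", "basalt core"),
]
print("\n".join(triples))
```

    A file of such statements can be loaded into any RDF triplestore (e.g. the Virtuoso Server mentioned above) and queried with SPARQL; in practice one would use an RDF library rather than string formatting.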

  10. Blow Collection as a Non-Invasive Method for Measuring Cortisol in the Beluga (Delphinapterus leucas)

    PubMed Central

    Thompson, Laura A.; Spoon, Tracey R.; Goertz, Caroline E. C.; Hobbs, Roderick C.; Romano, Tracy A.

    2014-01-01

    Non-invasive sampling techniques are increasingly being used to monitor glucocorticoids, such as cortisol, as indicators of stressor load and fitness in zoo and wildlife conservation, research and medicine. For cetaceans, exhaled breath condensate (blow) provides a unique sampling matrix for such purposes. The purpose of this work was to develop an appropriate collection methodology and validate the use of a commercially available EIA for measuring cortisol in blow samples collected from belugas (Delphinapterus leucas). Nitex membrane stretched over a petri dish provided the optimal method for collecting blow. A commercially available cortisol EIA for measuring human cortisol (detection limit 35 pg ml−1) was adapted and validated for beluga cortisol using tests of parallelism, accuracy and recovery. Blow samples were collected from aquarium belugas during monthly health checks and during out of water examination, as well as from wild belugas. Two aquarium belugas showed increased blow cortisol between baseline samples and 30 minutes out of water (Baseline, 0.21 and 0.04 µg dl−1; 30 minutes, 0.95 and 0.14 µg dl−1). Six wild belugas also showed increases in blow cortisol between pre and post 1.5 hour examination (Pre 0.03, 0.23, 0.13, 0.19, 0.13, 0.04 µg dl−1, Post 0.60, 0.31, 0.36, 0.24, 0.14, 0.16 µg dl−1). Though this methodology needs further investigation, this study suggests that blow sampling is a good candidate for non-invasive monitoring of cortisol in belugas. It can be collected from both wild and aquarium animals efficiently for the purposes of health monitoring and research, and may ultimately be useful in obtaining data on wild populations, including endangered species, which are difficult to handle directly. PMID:25464121
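    Two of the assay-validation checks named above (recovery and parallelism) reduce to simple arithmetic. The sketch below shows each check on invented numbers; the function names, example values, and tolerance are illustrative, not the study's actual acceptance criteria.

```python
# Sketch of two standard immunoassay validation checks: spike recovery and
# parallelism of serial dilutions. All numbers are invented for illustration.

def percent_recovery(measured, endogenous, spiked):
    """Recovery (%) of a known spike added to a sample."""
    return 100.0 * (measured - endogenous) / spiked

def parallelism_ok(dilution_factors, readings, tolerance=0.15):
    """Dilution-corrected concentrations should agree within `tolerance`
    (fractional deviation from their mean) if the curves are parallel."""
    corrected = [r * f for f, r in zip(dilution_factors, readings)]
    mean = sum(corrected) / len(corrected)
    return all(abs(c - mean) / mean <= tolerance for c in corrected)

print(percent_recovery(measured=0.95, endogenous=0.21, spiked=0.80))  # 92.5
print(parallelism_ok([1, 2, 4, 8], [0.80, 0.41, 0.21, 0.10]))         # True
```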

  11. Blow collection as a non-invasive method for measuring cortisol in the beluga (Delphinapterus leucas).

    PubMed

    Thompson, Laura A; Spoon, Tracey R; Goertz, Caroline E C; Hobbs, Roderick C; Romano, Tracy A

    2014-01-01

    Non-invasive sampling techniques are increasingly being used to monitor glucocorticoids, such as cortisol, as indicators of stressor load and fitness in zoo and wildlife conservation, research and medicine. For cetaceans, exhaled breath condensate (blow) provides a unique sampling matrix for such purposes. The purpose of this work was to develop an appropriate collection methodology and validate the use of a commercially available EIA for measuring cortisol in blow samples collected from belugas (Delphinapterus leucas). Nitex membrane stretched over a petri dish provided the optimal method for collecting blow. A commercially available cortisol EIA for measuring human cortisol (detection limit 35 pg ml-1) was adapted and validated for beluga cortisol using tests of parallelism, accuracy and recovery. Blow samples were collected from aquarium belugas during monthly health checks and during out of water examination, as well as from wild belugas. Two aquarium belugas showed increased blow cortisol between baseline samples and 30 minutes out of water (Baseline, 0.21 and 0.04 µg dl-1; 30 minutes, 0.95 and 0.14 µg dl-1). Six wild belugas also showed increases in blow cortisol between pre and post 1.5 hour examination (Pre 0.03, 0.23, 0.13, 0.19, 0.13, 0.04 µg dl-1, Post 0.60, 0.31, 0.36, 0.24, 0.14, 0.16 µg dl-1). Though this methodology needs further investigation, this study suggests that blow sampling is a good candidate for non-invasive monitoring of cortisol in belugas. It can be collected from both wild and aquarium animals efficiently for the purposes of health monitoring and research, and may ultimately be useful in obtaining data on wild populations, including endangered species, which are difficult to handle directly.

  12. Associations between psychosocial work factors and provider mental well-being in emergency departments: A systematic review.

    PubMed

    Schneider, Anna; Weigl, Matthias

    2018-01-01

    Emergency departments (ED) are complex and dynamic work environments with various psychosocial work stressors that increase risks for providers' well-being. Yet, no systematic review is available which synthesizes the current research base and quantitatively aggregates data on associations between ED work factors and provider well-being outcomes. We aimed at synthesizing the current research base on quantitative associations between psychosocial work factors (classified into patient-/task-related, organizational, and social factors) and mental well-being of ED providers (classified into positive well-being outcomes, affective symptoms and negative psychological functioning, cognitive-behavioural outcomes, and psychosomatic health complaints). A systematic literature search in eight databases was conducted in December 2017. Original studies were extracted following a stepwise procedure and predefined inclusion criteria. A standardized assessment of methodological quality and risk of bias was conducted for each study with the Quality Assessment Tool for Quantitative Studies from the Effective Public Health Practice Project. In addition to a systematic compilation of included studies, frequency and strength of quantitative associations were synthesized by means of harvest plots. Subgroup analyses for ED physicians and nurses were conducted. N = 1956 records were retrieved. After removal of duplicates, 1473 records were screened by title and abstract, and 199 studies were eligible for full-text review. Finally, 39 original studies were included, of which 37 reported cross-sectional surveys. Concerning methodological quality, the majority of included studies was evaluated as weak to moderate with considerable risk of bias. The most frequently surveyed provider outcomes were affective symptoms (e.g., burnout) and positive well-being outcomes (e.g., job satisfaction). A total of 367 univariate associations and 370 multivariate associations were extracted, with the majority being weak to moderate. Strong associations were mostly reported for social and organizational work factors. To the best of our knowledge, this review is the first to provide a quantitative summary of the research base on associations between psychosocial ED work factors and provider well-being. Results reveal that peer support, well-designed organizational structures, and employee reward systems balance the negative impact of adverse work factors on ED providers' well-being. This review identifies avenues for future research in this field, including methodological advances through quasi-experimental and prospective designs, representative samples, and adequate confounder control. Protocol registration number: PROSPERO 2016 CRD42016037220.

  13. Forensic Investigation of Formaldehyde in Illicit Products for Hair Treatment by DAD-HPLC: A Case Study.

    PubMed

    Oiye, Erica N; Ribeiro, Maria Fernanda M; Okumura, Leonardo L; Saczk, Adelir A; Ciancaglini, Pietro; de Oliveira, Marcelo F

    2016-07-01

    The illegal use of formalin (commercial formaldehyde) in cosmetic products harms the health of individuals exposed to this substance. In recent years, the commercial availability of these products, especially those containing irregular dosages of formaldehyde, has increased in Brazil. This work analyzes some products for hair treatment available in the Brazilian market and verifies their safety. The adopted analytical methodology involved sample derivatization with 2,4-dinitrophenylhydrazine, followed by high-performance liquid chromatography with ultraviolet detection (UV-VIS) at λ = 365 nm. The limit of quantification is 2.5 × 10⁻³% w/w, and recoveries were around 93%. Some of the samples contained high and illegal formaldehyde levels ranging from 9% to 19% (w/w), while others presented suitable concentrations of the analyte. On the basis of these results, this work discusses the efficiency and practicality of this analytical method for forensic purposes. © 2016 American Academy of Forensic Sciences.

  14. Risk practices for HIV infection and other STDs amongst female prostitutes working in legalized brothels.

    PubMed

    Pyett, P M; Haste, B R; Snow, J

    1996-02-01

    Most research investigating risk practices for HIV infection and other STDs amongst sex workers has focused on street prostitutes to the exclusion of those prostitutes who work in different sections of the industry. This is largely a consequence of methodological difficulties in accessing prostitutes other than those who work on the streets. HIV prevention research and interventions must address the fact that risk practices may vary according to the type of prostitution engaged in. This paper reports on risk practices for HIV infection and other STDs amongst prostitutes working in legalized brothels in Victoria, Australia. A self-administered questionnaire was distributed by representatives of a sex worker organization whose collaboration was an important factor in obtaining a large sample of prostitutes. The study found low levels of risk practices for prostitutes working in legal brothels in Victoria. The major risk practices identified were injecting drug use and condom non-use with non-paying partners.

  15. Optimizing Clinical Trial Enrollment Methods Through "Goal Programming"

    PubMed Central

    Davis, J.M.; Sandgren, A.J.; Manley, A.R.; Daleo, M.A.; Smith, S.S.

    2014-01-01

    Introduction Clinical trials often fail to reach desired goals due to poor recruitment outcomes, including low participant turnout, high recruitment cost, or poor representation of minorities. At present, there is limited literature available to guide recruitment methodology. This study, conducted by researchers at the University of Wisconsin Center for Tobacco Research and Intervention (UW-CTRI), provides an example of how iterative analysis of recruitment data may be used to optimize recruitment outcomes during ongoing recruitment. Study methodology UW-CTRI’s research team provided a description of methods used to recruit smokers in two randomized trials (n = 196 and n = 175). The trials targeted low socioeconomic status (SES) smokers and involved time-intensive smoking cessation interventions. Primary recruitment goals were to meet the required sample size and provide representative diversity while working with limited funds and limited time. Recruitment data were analyzed repeatedly throughout each study to optimize recruitment outcomes. Results Estimates of recruitment outcomes based on prior studies on smoking cessation suggested that researchers would be able to recruit 240 low SES smokers within 30 months at a cost of $72,000. With employment of the methods described herein, researchers were able to recruit 374 low SES smokers over 30 months at a cost of $36,260. Discussion Each human subjects study presents unique recruitment challenges, with time and cost of recruitment dependent on the sample population and study methodology. Nonetheless, researchers may be able to improve recruitment outcomes through iterative analysis of recruitment data and optimization of recruitment methods throughout the recruitment period. PMID:25642125
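    The recruitment figures quoted above can be compared directly as cost per enrolled participant; the short sketch below does that arithmetic on the numbers given in the abstract.

```python
# Cost per enrolled participant, computed from the figures in the abstract:
# projected 240 participants for $72,000 vs. achieved 374 for $36,260.

projected = {"participants": 240, "cost": 72_000}
achieved  = {"participants": 374, "cost": 36_260}

def cost_per_participant(r):
    return r["cost"] / r["participants"]

print(round(cost_per_participant(projected), 2))  # 300.0
print(round(cost_per_participant(achieved), 2))   # 96.95
```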

  16. Airborne microorganisms associated with waste management and recovery: biomonitoring methodologies.

    PubMed

    Coccia, Anna Maria; Gucci, Paola Margherita Bianca; Lacchetti, Ines; Paradiso, Rosa; Scaini, Federica

    2010-01-01

    This paper presents preliminary results from a year-long indoor bioaerosol monitoring performed in three working environments of a municipal composting facility treating green and organic waste. Composting, whereby organic matter is stabilized through aerobic decomposition, requires aeration, causing the dispersion of microbial particles (microorganisms and associated toxins). Waste can, therefore, become a potential source of biological hazard. Bioaerosol samples were collected on a monthly basis. Through a comparison of results obtained using two samplers - the Surface Air System DUO SAS 360 and the BioSampler - the study aimed at assessing the presence of biological pollutants, and at contributing to the definition of standard sampling methods for bioaerosols leading, eventually, to the establishment of exposure limits for these occupational pollutants.

  17. Economic consequences of improved temperature forecasts: An experiment with the Florida citrus growers (an update of control group results)

    NASA Technical Reports Server (NTRS)

    Braen, C.

    1978-01-01

    The economic experiment, the results obtained to date, and the work which still remains to be done are summarized. Specifically, the experimental design is described in detail, as are the data collection methodology and procedures, sampling plan, data reduction techniques, cost and loss models, establishment of frost severity measures, and data obtained from citrus growers, the National Weather Service, and the Federal Crop Insurance Corp. Also discussed are the resulting protection costs and crop losses for the control group sample, the extrapolation of control group results to the Florida citrus industry, and the method for normalizing these results to a normal or average frost season so that they may be compared with anticipated similar results from test group measurements.

  18. Synthesis of discipline-based education research in physics

    NASA Astrophysics Data System (ADS)

    Docktor, Jennifer L.; Mestre, José P.

    2014-12-01

    This paper presents a comprehensive synthesis of physics education research at the undergraduate level. It is based on work originally commissioned by the National Academies. Six topical areas are covered: (1) conceptual understanding, (2) problem solving, (3) curriculum and instruction, (4) assessment, (5) cognitive psychology, and (6) attitudes and beliefs about teaching and learning. Each topical section includes sample research questions, theoretical frameworks, common research methodologies, a summary of key findings, strengths and limitations of the research, and areas for future study. Supplemental material proposes promising future directions in physics education research.

  19. Diffusion orientation transform revisited.

    PubMed

    Canales-Rodríguez, Erick Jorge; Lin, Ching-Po; Iturria-Medina, Yasser; Yeh, Chun-Hung; Cho, Kuan-Hung; Melie-García, Lester

    2010-01-15

    Diffusion orientation transform (DOT) is a powerful imaging technique that allows the reconstruction of the microgeometry of fibrous tissues based on diffusion MRI data. The three main error sources in this methodology are the finite sampling of q-space, the practical truncation of the series of spherical harmonics, and the use of a mono-exponential model for the attenuation of the measured signal. In this work, a detailed mathematical description that provides an extension to the DOT methodology is presented. In particular, the limitations implied by the use of measurements with finite support in q-space are investigated and clarified, as is the impact of the harmonic series truncation. Near- and far-field analytical patterns for the diffusion propagator are examined. The near-field pattern enables direct computation of the probability of return to the origin. The far-field pattern allows probing the limitations of the mono-exponential model, which suggests the existence of a limit of validity for DOT. In the regime from moderate to large displacement lengths, the isosurfaces of the diffusion propagator reveal aberrations in the form of artifactual peaks. Finally, the major contribution of this work is the derivation of analytical equations that facilitate the accurate reconstruction of some orientational distribution functions (ODFs) and skewness ODFs that are relatively immune to these artifacts. The new formalism was tested using synthetic and real data from a phantom of intersecting capillaries. The results support the hypothesis that the revisited DOT methodology could enhance the estimation of the microgeometry of fiber tissues.
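    The mono-exponential model whose limits the article probes is commonly written as follows (notation here follows standard diffusion-MRI usage and is supplied for orientation, not taken from the article):

$$
E(\mathbf{q}) = \frac{S(\mathbf{q})}{S_0} = \exp\!\left(-4\pi^2 q^2 \tau\, D(\mathbf{u})\right),
$$

    where $S_0$ is the signal without diffusion weighting, $\tau$ is the effective diffusion time, and $D(\mathbf{u})$ is the apparent diffusion coefficient along the unit direction $\mathbf{u} = \mathbf{q}/q$. The far-field analysis in the article examines where this single-exponential assumption breaks down.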

  20. Smartphone serious games for vision and hearing assessment.

    PubMed

    Dias, Pedro; Aguiar, Bruno; Sousa, Filipe; Sousa, Augusto

    2015-01-01

    Falls are the second leading cause of accidental injury deaths worldwide. In this paper, we define methodologies that permit the evaluation of two potential factors which might have an impact on fall risk: visual and hearing loss. The aim of the work developed is not to replace clinic visits, but to offer users the means to continue tracking their vision and hearing at home, during the long intervals between clinical tests. Tests conducted on a sample of our target users indicate a good ability to measure vision and hearing using an Android smartphone and the proposed methodologies. While some tests require further validation, promising results were achieved in the most common tests for vision and hearing, with a good correlation between the system's results and the traditional tests (for distance visual acuity) and the data gathered from the users (for hearing tests).

  1. Development of CNC prototype for the characterization of the nanoparticle release during physical manipulation of nanocomposites.

    PubMed

    Gendre, Laura; Marchante, Veronica; Abhyankar, Hrushikesh A; Blackburn, Kim; Temple, Clive; Brighton, James L

    2016-01-01

    This work focuses on the release of nanoparticles from commercially used nanocomposites during machining operations. A reliable and repeatable method was developed to assess the intentional exposure to nanoparticles, in particular during drilling. This article presents the description and validation of results obtained from a new prototype used for the measurement and monitoring of nanoparticles in a controlled environment. This methodology was compared with the methodologies applied in other studies. Some preliminary experiments on drilling nanocomposites are also included. The size, shape and chemical composition of the released nanoparticles were investigated in order to understand their hazard potential. No significant differences were found in the amount of nanoparticles released between samples with and without nanoadditives. Likewise, no chemical alteration was observed between the dust generated and the bulk material. Finally, further developments of the prototype are proposed.

  2. The effect of erosion on the fatigue limit of metallic materials for aerospace applications

    NASA Astrophysics Data System (ADS)

    Kordatos, E. Z.; Exarchos, D. A.; Matikas, T. E.

    2018-03-01

    This work deals with the fatigue behavior of metallic materials for aerospace applications which have undergone erosion. In particular, an innovative non-destructive methodology based on infrared lock-in thermography was applied to aluminum samples for the rapid determination of their fatigue limit. The effect of erosion on the structural integrity of materials can lead to catastrophic failure, and therefore an efficient assessment of fatigue behavior is of high importance. Infrared thermography (IRT), as a non-destructive, non-contact, real-time and full-field method, can be employed to rapidly determine the fatigue limit. The basic principle of this method is the detection and monitoring of the energy intrinsically dissipated due to cyclic fatigue loading. This methodology was successfully applied to both eroded and non-eroded aluminum specimens in order to evaluate the severity of erosion.

  3. Estimation of the number of fluorescent end-members for quantitative analysis of multispectral FLIM data.

    PubMed

    Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A

    2014-05-19

    Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identifying the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa distinguished healthy tissue from precancerous lesions, based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in terms of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide quantitative description of tissue biochemical composition.
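    The linear-mixture model underlying the unmixing step can be sketched in a few lines: a measured decay is modeled as a nonnegative combination of end-member decays, and the abundances are recovered by least squares. Everything below is synthetic, and, unlike the paper's blind algorithm, the end-members and their number are taken as given.

```python
# Sketch of the linear-mixture model behind m-FLIM unmixing. The decays and
# mixing weights are synthetic; the paper's blind algorithm also estimates
# the end-members and their number, which this sketch assumes known.

import numpy as np

t = np.linspace(0, 10, 50)
endmembers = np.column_stack([np.exp(-t / 1.0),   # fast-decaying fluorophore
                              np.exp(-t / 4.0)])  # slow-decaying fluorophore

true_abundances = np.array([0.3, 0.7])
measurement = endmembers @ true_abundances        # noiseless mixed decay

# Recover abundances from the measurement (nonnegativity enforced by clipping).
estimate, *_ = np.linalg.lstsq(endmembers, measurement, rcond=None)
estimate = np.clip(estimate, 0, None)
estimate /= estimate.sum()                        # abundances sum to one
print(np.round(estimate, 3))                      # ≈ [0.3 0.7]
```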

  4. Work Group on American Indian Research and Program Evaluation Methodology, Symposium on Research and Evaluation Methodology: Lifespan Issues Related to American Indians/Alaska Natives with Disabilities (Washington, DC, April 26-27, 2002).

    ERIC Educational Resources Information Center

    Davis, Jamie D., Ed.; Erickson, Jill Shepard, Ed.; Johnson, Sharon R., Ed.; Marshall, Catherine A., Ed.; Running Wolf, Paulette, Ed.; Santiago, Rolando L., Ed.

    This first symposium of the Work Group on American Indian Research and Program Evaluation Methodology (AIRPEM) explored American Indian and Alaska Native cultural considerations in relation to "best practices" in research and program evaluation. These cultural considerations include the importance of tribal consultation on research…

  5. Applying the Methodology of the Community College Classification Scheme to the Public Master's Colleges and Universities Sector

    ERIC Educational Resources Information Center

    Kinkead, J. Clint; Katsinas, Stephen G.

    2011-01-01

    This work brings forward the geographically-based classification scheme for the public Master's Colleges and Universities sector. Using the same methodology developed by Katsinas and Hardy (2005) to classify community colleges, this work classifies Master's Colleges and Universities. This work has four major findings and conclusions. First, a…

  6. The Meaning of Work among Chinese University Students: Findings from Prototype Research Methodology

    ERIC Educational Resources Information Center

    Zhou, Sili; Leung, S. Alvin; Li, Xu

    2012-01-01

    This study examined Chinese university students' conceptualization of the meaning of work. One hundred and ninety students (93 male, 97 female) from Beijing, China, participated in the study. Prototype research methodology (J. Li, 2001) was used to explore the meaning of work and the associations among the identified meanings. Cluster analysis was…

  7. Resource selection for an interdisciplinary field: a methodology.

    PubMed

    Jacoby, Beth E; Murray, Jane; Alterman, Ina; Welbourne, Penny

    2002-10-01

    The Health Sciences and Human Services Library of the University of Maryland developed and implemented a methodology to evaluate print and digital resources for social work. Although this methodology was devised for the interdisciplinary field of social work, the authors believe it may lend itself to resource selection in other interdisciplinary fields. The methodology was developed in response to the results of two separate surveys conducted in late 1999, which indicated improvement was needed in the library's graduate-level social work collections. Library liaisons evaluated the print collection by identifying forty-five locally relevant Library of Congress subject headings and then using these subjects or synonymous terms to compare the library's titles to collections of peer institutions, publisher catalogs, and Amazon.com. The collection also was compared to social work association bibliographies, ISI Journal Citation Reports, and major social work citation databases. An approval plan for social work books was set up to assist in identifying newly published titles. The library acquired new print and digital social work resources as a result of the evaluation, thus improving both print and digital collections for its social work constituents. Visibility of digital resources was increased by cataloging individual titles in aggregated electronic journal packages and listing each title on the library Web page.

  8. Resource selection for an interdisciplinary field: a methodology*

    PubMed Central

    Jacoby, Beth E.; Murray, Jane; Alterman, Ina; Welbourne, Penny

    2002-01-01

    The Health Sciences and Human Services Library of the University of Maryland developed and implemented a methodology to evaluate print and digital resources for social work. Although this methodology was devised for the interdisciplinary field of social work, the authors believe it may lend itself to resource selection in other interdisciplinary fields. The methodology was developed in response to the results of two separate surveys conducted in late 1999, which indicated improvement was needed in the library's graduate-level social work collections. Library liaisons evaluated the print collection by identifying forty-five locally relevant Library of Congress subject headings and then using these subjects or synonymous terms to compare the library's titles to collections of peer institutions, publisher catalogs, and Amazon.com. The collection also was compared to social work association bibliographies, ISI Journal Citation Reports, and major social work citation databases. An approval plan for social work books was set up to assist in identifying newly published titles. The library acquired new print and digital social work resources as a result of the evaluation, thus improving both print and digital collections for its social work constituents. Visibility of digital resources was increased by cataloging individual titles in aggregated electronic journal packages and listing each title on the library Web page. PMID:12398245

  9. Classification of red wine based on its protected designation of origin (PDO) using Laser-induced Breakdown Spectroscopy (LIBS).

    PubMed

    Moncayo, S; Rosales, J D; Izquierdo-Hornillos, R; Anzano, J; Caceres, J O

    2016-09-01

    This work reports on a simple and fast classification procedure for the quality control of red wines with protected designation of origin (PDO) by means of the Laser-Induced Breakdown Spectroscopy (LIBS) technique combined with neural networks (NN), in order to address quality assurance and authenticity issues. A total of thirty-eight red wine samples from different PDOs were analyzed to detect fake wines and to avoid unfair competition in the market. LIBS is well known for not requiring sample preparation; however, to increase its analytical performance, a new sample preparation treatment based on prior liquid-to-solid transformation of the wine using a dry collagen gel was developed. The use of collagen pellets allowed successful classification results to be achieved, avoiding the limitations and difficulties of working with aqueous samples. The performance of the NN model was assessed by three validation procedures taking into account its sensitivity (internal validation), generalization ability and robustness (independent external validation). The results of coupling a spectroscopic technique with chemometric analysis (LIBS-NN) are discussed in terms of its potential use in the food industry, providing a methodology able to perform the quality control of alcoholic beverages. Copyright © 2016 Elsevier B.V. All rights reserved.
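    The classification pipeline above (labeled spectra in, PDO class out) can be illustrated with a minimal stand-in: the sketch below uses a nearest-centroid rule on synthetic four-channel "spectra" instead of the paper's neural network, purely to show the train/predict structure. All data and class names are invented.

```python
# Minimal stand-in for spectra-based classification: nearest class centroid
# on synthetic "spectra". Illustrates the pipeline (train on labeled spectra,
# predict a PDO label for a new one), not the paper's NN model.

import numpy as np

rng = np.random.default_rng(0)

# Two synthetic PDO classes, each with a characteristic emission-line pattern.
pdo_a = np.array([1.0, 0.2, 0.8, 0.1])
pdo_b = np.array([0.3, 0.9, 0.2, 0.7])
train = np.array([pdo_a + rng.normal(0, 0.05, 4) for _ in range(10)]
                 + [pdo_b + rng.normal(0, 0.05, 4) for _ in range(10)])
labels = np.array(["A"] * 10 + ["B"] * 10)

# "Training" = averaging each class's spectra into a centroid.
centroids = {c: train[labels == c].mean(axis=0) for c in ("A", "B")}

def classify(spectrum):
    """Assign the label of the nearest class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(spectrum - centroids[c]))

print(classify(pdo_a + rng.normal(0, 0.05, 4)))  # expected "A"
```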

  10. Multiresidue analytical method for pharmaceuticals and personal care products in sewage and sewage sludge by online direct immersion SPME on-fiber derivatization - GCMS.

    PubMed

    López-Serna, Rebeca; Marín-de-Jesús, David; Irusta-Mata, Rubén; García-Encina, Pedro Antonio; Lebrero, Raquel; Fdez-Polanco, María; Muñoz, Raúl

    2018-08-15

    The work presented here aimed at developing an analytical method for the simultaneous determination of 22 pharmaceuticals and personal care products, including 3 transformation products, in sewage and sludge. A meticulous method optimization, involving an experimental design, was carried out. The developed method was fully automated and consisted of the online extraction of 17 mL of water sample by Direct Immersion Solid Phase MicroExtraction followed by On-fiber Derivatization coupled to Gas Chromatography - Mass Spectrometry (DI-SPME - On-fiber Derivatization - GC - MS). This methodology was validated for 12 of the initial compounds as a reliable (relative recoveries above 90% for sewage and 70% for sludge; repeatability as %RSD below 10% in all cases), sensitive (LODs below 20 ng/L in sewage and 10 ng/g in sludge), versatile (sewage and sewage-sludge samples up to 15,000 ng/L and 900 ng/g, respectively) and green analytical alternative for many medium-tech routine laboratories around the world to keep up with both current and forecast environmental regulatory requirements. The remaining 10 analytes initially considered showed insufficient suitability to be included in the final method. The methodology was successfully applied to real samples generated in a pilot-scale sewage treatment reactor. Copyright © 2018 Elsevier B.V. All rights reserved.
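
Validation criteria like those quoted above (relative recovery above 90%, repeatability below 10% RSD) reduce to simple arithmetic on replicate spiked samples. The spiking level and replicate values below are hypothetical, not taken from the paper:

```python
import statistics

# Hypothetical replicate measurements (ng/L) of one analyte spiked into sewage
# at a known level; numbers are illustrative only.
spiked_level = 100.0
replicates = [95.2, 97.8, 94.1, 96.5, 98.0, 93.7]

mean = statistics.mean(replicates)
recovery_pct = 100.0 * mean / spiked_level          # relative recovery
rsd_pct = 100.0 * statistics.stdev(replicates) / mean  # repeatability as %RSD

print(f"relative recovery: {recovery_pct:.1f}%  (criterion: >90% for sewage)")
print(f"repeatability:     {rsd_pct:.1f}% RSD (criterion: <10%)")
```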

  11. Structural, morphological and magnetic properties of Sr0.3La0.48Ca0.25n[Fe(2-0.4/n)O3]Co0.4 (n = 5.5, 5.6, 5.7, 5.8, 5.9, 6.0) hexaferrites prepared by facile ceramic route methodology

    NASA Astrophysics Data System (ADS)

    Rehman, Khalid Mehmood Ur; Liu, Xiansong; Yang, Yujie; Feng, Shuangjiu; Tang, Jin; Ali, Zulfiqar; Wazir, Z.; Khan, Muhammad Wasim; Shezad, Mudssir; Iqbal, Muhammad Shahid; Zhang, Cong; Liu, Chaocheng

    2018-03-01

    In the present work, M-type strontium hexaferrite magnetic powders with chemical composition Sr0.3La0.48Ca0.25n[Fe(2-0.4/n)O3]Co0.4 (n = 5.5, 5.6, 5.7, 5.8, 5.9, 6.0) were synthesized by a facile ceramic route methodology. The structural, morphological and magnetic properties of the products were investigated by X-ray diffraction (XRD), scanning electron microscopy (SEM) and vibrating sample magnetometry (VSM), respectively. The powders with 5.5 ≤ n ≤ 5.8 show a single magnetoplumbite phase, while for n ≥ 5.9 some impurities begin to appear in the structure. The grains have a hexagonal shape. We report our investigation of the effect of the iron content (n) on the crystallite size and magnetic properties of the specimens. It is found that a suitable amount of additional iron content markedly increases the saturation magnetization (Ms) and intrinsic coercivity (Hc): with iron addition at the same sintering temperature of 1260 °C, Ms and Hc first increase and then decrease gradually.

  12. Supercritical fluid chromatography with photodiode array detection for pesticide analysis in papaya and avocado samples.

    PubMed

    Pano-Farias, Norma S; Ceballos-Magaña, Silvia G; Gonzalez, Jorge; Jurado, José M; Muñiz-Valencia, Roberto

    2015-04-01

    To improve the analysis of pesticides in complex food matrices of economic importance, alternative chromatographic techniques, such as supercritical fluid chromatography, can be used. Supercritical fluid chromatography has barely been applied to pesticide analysis in food matrices. In this paper, an analytical method using supercritical fluid chromatography coupled to photodiode array detection has been established for the first time for the quantification of pesticides in papaya and avocado. The extraction of methyl parathion, atrazine, ametryn, carbofuran, and carbaryl was performed through the quick, easy, cheap, effective, rugged, and safe (QuEChERS) methodology. The method was validated using papaya and avocado samples. For papaya, the correlation coefficient values were higher than 0.99; limits of detection and quantification ranged from 130 to 380 and 220 to 640 μg/kg, respectively; recovery values ranged from 72.8 to 94.6%; precision was lower than 3%. For avocado, limit of detection values were <450 μg/kg; precision was lower than 11%; recoveries ranged from 50.0 to 94.2%. Method feasibility was tested for lime, banana, mango, and melon samples. Our results demonstrate that the proposed method is applicable to methyl parathion, atrazine, ametryn, and carbaryl, toxic pesticides used worldwide. The methodology presented in this work could be applicable to other fruits. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
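
Limits of detection and quantification such as those reported above are commonly estimated from the calibration line as 3.3·σ/S and 10·σ/S, where σ is the residual standard deviation and S the slope. A hedged sketch with invented calibration data (the paper's own figures are not reproduced):

```python
import statistics

# Illustrative calibration data: concentration (μg/kg) vs. detector response.
conc = [0, 100, 200, 400, 800]
resp = [2.1, 55.3, 109.8, 218.0, 441.2]

n = len(conc)
mean_x, mean_y = statistics.mean(conc), statistics.mean(resp)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, resp))
         / sum((x - mean_x) ** 2 for x in conc))
intercept = mean_y - slope * mean_x

# Residual standard deviation of the fit (n - 2 degrees of freedom).
residuals = [y - (slope * x + intercept) for x, y in zip(conc, resp)]
s_res = (sum(r * r for r in residuals) / (n - 2)) ** 0.5

lod = 3.3 * s_res / slope   # limit of detection
loq = 10.0 * s_res / slope  # limit of quantification
print(f"slope = {slope:.3f}; LOD ≈ {lod:.1f} μg/kg; LOQ ≈ {loq:.1f} μg/kg")
```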

  13. Applicability of a novel immunoassay based on surface plasmon resonance for the diagnosis of Chagas disease.

    PubMed

    Luz, João G G; Souto, Dênio E P; Machado-Assis, Girley F; de Lana, Marta; Luz, Rita C S; Martins-Filho, Olindo A; Damos, Flávio S; Martins, Helen R

    2016-02-15

    We defined the methodological criteria for interpreting the results provided by a novel immunoassay based on surface plasmon resonance (SPR) to detect anti-Trypanosoma cruzi antibodies in human sera (SPRCruzi), and then evaluated its applicability as a diagnostic tool for Chagas disease. To define the cut-off point and serum dilution factor, 57 samples were analyzed with SPRCruzi and the obtained values of SPR angle displacement (ΔθSPR) were submitted to statistical analysis. Adopting the indicated criteria, its performance was evaluated on a wide panel of samples: 99 Chagas disease patients, 30 non-infected subjects and 42 subjects with other parasitic/infectious diseases. In parallel, these samples were also analyzed by ELISA. Our data demonstrated that a 1:320 dilution and a cut-off point at ∆θSPR=17.2 m° provided the best results. Global performance analysis demonstrated satisfactory sensitivity (100%), specificity (97.2%), positive predictive value (98%), negative predictive value (100%) and global accuracy (99.6%). ELISA and SPRCruzi showed almost perfect agreement, mainly between chagasic and non-infected individuals. However, the new immunoassay was better at discriminating Chagas disease from other diseases. This work demonstrated the applicability of SPRCruzi as a feasible, real-time, label-free, sensitive and specific methodology for the diagnosis of Chagas disease. Copyright © 2015 Elsevier B.V. All rights reserved.
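
The performance figures above follow from the standard 2×2 confusion-table definitions. In the sketch below the counts are hypothetical, chosen only so that the derived specificity and PPV match the reported 97.2% and 98%:

```python
# Diagnostic performance from a 2x2 confusion table. Counts are hypothetical:
# 99 Chagas patients all detected, and 2 false positives among the 72
# non-Chagas sera (30 non-infected + 42 other diseases).
tp, fn = 99, 0   # Chagas patients correctly / incorrectly classified
tn, fp = 70, 2   # non-Chagas sera correctly / incorrectly classified

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)   # positive predictive value
npv = tn / (tn + fn)   # negative predictive value

print(f"Se={sensitivity:.1%}  Sp={specificity:.1%}  PPV={ppv:.1%}  NPV={npv:.1%}")
```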

  14. RAS testing in metastatic colorectal cancer: advances in Europe.

    PubMed

    Van Krieken, J Han J M; Rouleau, Etienne; Ligtenberg, Marjolijn J L; Normanno, Nicola; Patterson, Scott D; Jung, Andreas

    2016-04-01

    Personalized medicine shows promise for maximizing efficacy and minimizing toxicity of anti-cancer treatment. KRAS exon 2 mutations are predictive of resistance to epidermal growth factor receptor-directed monoclonal antibodies in patients with metastatic colorectal cancer. Recent studies have shown that broader RAS testing (KRAS and NRAS) is needed to select patients for treatment. While Sanger sequencing is still used, approaches based on various methodologies are available. Few CE-approved kits, however, detect the full spectrum of RAS mutations. More recently, "next-generation" sequencing has been developed for research use, including parallel semiconductor sequencing and reversible termination. These techniques have high technical sensitivities for detecting mutations, although the ideal threshold is currently unknown. Finally, liquid biopsy has the potential to become an additional tool to assess tumor-derived DNA. For accurate and timely RAS testing, appropriate sampling and prompt delivery of material is critical. Processes to ensure efficient turnaround from sample request to RAS evaluation must be implemented so that patients receive the most appropriate treatment. Given the variety of methodologies, external quality assurance programs are important to ensure a high standard of RAS testing. Here, we review technical and practical aspects of RAS testing for pathologists working with metastatic colorectal cancer tumor samples. The extension of markers from KRAS to RAS testing is the new paradigm for biomarker testing in colorectal cancer.

  15. Determination of demineralization depth in tooth enamel exposed to abusive use of whitening gel using micro-Energy Dispersive X-ray Fluorescence

    NASA Astrophysics Data System (ADS)

    Pessanha, Sofia; Coutinho, Sara; Carvalho, Maria Luisa; Silveira, João Miguel; Mata, António

    2017-12-01

    In this work, we present a methodology for determining the depth of demineralization in dental enamel caused by extended use of an Over-The-Counter (OTC) whitening product. Teeth whitening is a very common practice in Dentistry, but concerns have been raised regarding the invasiveness of the treatment, especially for OTC products, which can be used without medical supervision and sometimes with concentrations of active agent that exceed regulatory limits. In this work, we studied tooth enamel samples treated with a whitening product over an extended period of time, both directly on the enamel surface and in cross-section. Specimens were analyzed using microbeam X-ray fluorescence (micro-XRF) with polycapillary optics to obtain a spot size down to 25 μm. Because of the relatively large spot size of our setup, point analysis of the cross-sections would have been inadequate; line scans were therefore performed instead, before and after whitening, and the depth of demineralization was inferred using appropriate data treatment. The methodology indicated an average demineralization depth of 25 μm, the same order of magnitude as the aprismatic enamel layer.

  16. A pilot study to test psychophonetics methodology for self-care and empathy in compassion fatigue, burnout and secondary traumatic stress

    PubMed Central

    Butler, Nadine

    2013-01-01

    Abstract Background Home-based care is recognised as a stressful occupation. Practitioners working with patients experiencing high levels of trauma may be susceptible to compassion fatigue, with the sustained need to remain empathic being a contributing factor. Objectives The aim of this research was to evaluate psychophonetics methodology for self-care and empathy skills as an intervention for compassion fatigue. Objectives were to measure levels of compassion fatigue pre-intervention, then to apply the intervention and retest levels one month and six months post-intervention. Method The research applied a pilot test of a developed intervention as a quasi-experiment. The study sample comprised home-based carers working with HIV-positive patients at a hospice in Grabouw, a settlement in the Western Cape facing socioeconomic challenges. Results The pilot study showed a statistically significant post-intervention improvement in secondary traumatic stress, a component of compassion fatigue, measured with the ProQOL v5 instrument. Conclusion The results gave adequate indication for the implementation of a larger study in order to apply and test the intervention. The study highlights a dire need for further research in this field.

  17. Methodology for Calculating Latency of GPS Probe Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhongxiang; Hamedi, Masoud; Young, Stanley

    Crowdsourced GPS probe data, used for purposes such as travel time on changeable-message signs and incident detection, have been gaining popularity in recent years as a source of real-time traffic information for driver operations and transportation systems management and operations. Efforts have been made to evaluate the quality of such data from different perspectives. Although such crowdsourced data are already in widespread use in many states, particularly the high-traffic areas on the Eastern seaboard, concerns about latency - the time between traffic being perturbed as a result of an incident and reflection of the disturbance in the outsourced data feed - have escalated in importance. Latency is critical for the accuracy of real-time operations, emergency response, and traveler information systems. This paper offers a methodology for measuring probe data latency relative to a selected reference source. Although Bluetooth reidentification data are used as the reference source, the methodology can be applied to any other ground-truth data source of choice. The core of the methodology is an algorithm for maximum pattern matching that works with three fitness objectives. To test the methodology, sample field reference data were collected on multiple freeway segments for a 2-week period by using portable Bluetooth sensors as ground truth. Equivalent GPS probe data were obtained from a private vendor, and their latency was evaluated. Latency at different times of the day, the impact of the road segmentation scheme on latency, and the sensitivity of the latency to both speed-slowdown and recovery-from-slowdown episodes are also discussed.
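
The core idea, sliding the probe series against a reference series and scoring the match, can be shown on toy data. The real paper uses a maximum-pattern-matching algorithm with three fitness objectives; this minimal stand-in uses a single RMSE objective and a synthetic slowdown episode:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(600)                                  # one reading per minute
truth = 55 - 25 * np.exp(-((t - 300) / 30.0) ** 2)  # ground-truth slowdown (mph)
true_latency = 7                                    # minutes, to be recovered
probe = np.roll(truth, true_latency) + rng.normal(0, 0.2, t.size)

def rmse_at_lag(lag):
    """Fitness of the match when the probe series is shifted back by `lag`."""
    return np.sqrt(np.mean((probe[lag:] - truth[:t.size - lag]) ** 2))

estimated = min(range(30), key=rmse_at_lag)
print(f"estimated latency: {estimated} min")
```

With the seed fixed, the RMSE objective is minimized at (or within a minute of) the programmed delay, illustrating why a distinct slowdown/recovery episode makes latency estimation well-posed.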

  18. Evaluation of glucose controllers in virtual environment: methodology and sample application.

    PubMed

    Chassin, Ludovic J; Wilinska, Malgorzata E; Hovorka, Roman

    2004-11-01

    Adaptive systems to deliver medical treatment in humans are safety-critical and require particular care in both the testing and the evaluation phases, which are time-consuming, costly, and confounded by ethical issues. The objective of the present work is to develop a methodology to test glucose controllers of an artificial pancreas in a simulated (virtual) environment. A virtual environment comprising a model of carbohydrate metabolism and models of the insulin pump and the glucose sensor is employed to simulate individual glucose excursions in subjects with type 1 diabetes. The performance of the control algorithm within the virtual environment is evaluated by considering treatment and operational scenarios. The developed methodology includes two dimensions: testing in relation to specific lifestyle conditions, i.e. fasting, post-prandial, and lifestyle (metabolic) disturbances; and testing in relation to various operating conditions, i.e. expected operating conditions, adverse operating conditions, and system failure. We define safety and efficacy criteria and describe the measures to be taken prior to clinical testing. The use of the methodology is exemplified by tuning and evaluating a model predictive glucose controller, focused on fasting conditions, that is being developed for a wearable artificial pancreas. Our methodology for testing glucose controllers in a virtual environment is instrumental in anticipating the results of real clinical tests under different physiological and operating conditions. The thorough testing in the virtual environment reduces costs and speeds up the development process.
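
The closed-loop idea (virtual patient + sensor model + controller, with safety criteria checked on the simulated excursion) can be illustrated with a deliberately crude stand-in. Nothing below reproduces the authors' metabolic model or their model predictive controller; it is a toy one-compartment model with a proportional controller, showing only the shape of such a test harness.

```python
import random

random.seed(0)
target, glucose = 6.0, 9.0       # mmol/L; fasting start above target
basal_production = 0.5           # mmol/L per step, endogenous appearance
insulin_sensitivity = 0.4        # glucose drop per unit insulin delivered

trace = []
for step in range(48):           # e.g. 12 h of fasting at 15-min steps
    sensed = glucose + random.gauss(0, 0.2)             # noisy sensor model
    insulin = max(0.0, 0.3 * (sensed - target) + 1.0)   # proportional + basal
    glucose += basal_production - insulin_sensitivity * insulin
    glucose = max(glucose, 2.0)
    trace.append(glucose)

# Safety criterion checked offline, as one would before clinical testing.
hypo_steps = sum(g < 3.5 for g in trace)
print(f"final glucose {trace[-1]:.1f} mmol/L; hypoglycaemic steps: {hypo_steps}")
```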

  19. A reliability evaluation methodology for memory chips for space applications when sample size is small

    NASA Technical Reports Server (NTRS)

    Chen, Y.; Nguyen, D.; Guertin, S.; Berstein, J.; White, M.; Menke, R.; Kayali, S.

    2003-01-01

    This paper presents a reliability evaluation methodology to obtain statistical reliability information on memory chips for space applications when the test sample size must be kept small because of the high cost of radiation-hardened memories.
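
One statistical device often used in such small-sample settings is the exact one-sided binomial (Clopper-Pearson) bound, which has a closed form when no failures are observed. This is a hedged illustration of the general small-sample problem, not a reproduction of the paper's methodology:

```python
def failure_rate_upper_bound(n, failures=0, confidence=0.95):
    """Exact one-sided upper bound on the per-device failure probability
    when `failures` == 0 in n trials: solves (1 - p)**n = 1 - confidence."""
    assert failures == 0, "closed form shown only for the zero-failure case"
    return 1.0 - (1.0 - confidence) ** (1.0 / n)

# The fewer devices tested, the weaker the statistical claim.
for n in (5, 10, 22):
    p_u = failure_rate_upper_bound(n)
    print(f"n={n:2d}, 0 failures -> failure rate <= {p_u:.1%} at 95% confidence")
```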

  20. THE IMPACT OF PASSIVE SAMPLING METHODOLOGIES USED IN THE DEARS

    EPA Science Inventory

    This abstract details the use of passive sampling methodologies in the Detroit Exposure and Aerosol Research Study (DEARS). A discussion about the utility of various gas-phase passive samplers used in the study will be described along with examples of field data measurements empl...

  1. The combined positive impact of Lean methodology and Ventana Symphony autostainer on histology lab workflow

    PubMed Central

    2010-01-01

    Background Histologic samples all funnel through the H&E microtomy staining area, where manual processes intersect with semi-automated processes, creating a bottleneck. We compare alternate work processes in anatomic pathology, primarily in the H&E staining work cell. Methods We established a baseline measure of the H&E process's impact on personnel, information management and sample flow from historical workload and production data and direct observation. We compared this to performance after implementing initial Lean process modifications, including workstation reorganization, equipment relocation and workflow levelling, and the Ventana Symphony stainer, to assess the impact on productivity in the H&E staining work cell. Results Average time from gross station to assembled case decreased by 2.9 hours (12%). Total process turnaround time (TAT), exclusive of processor schedule changes, decreased 48 minutes/case (4%). Mean quarterly productivity increased 8.5% with the new methods. Process redesign reduced the number of manual steps from 219 to 182, a 17% reduction. Specimen travel distance was reduced from 773 ft/case to 395 ft/case (49%) overall, and from 92 to 53 ft/case in the H&E cell (42% improvement). Conclusions Implementation of Lean methods in the H&E work cell of histology can result in improved productivity, improved throughput and better case availability parameters including TAT. PMID:20181123

  2. Exhaustive thin-layer cyclic voltammetry for absolute multianalyte halide detection.

    PubMed

    Cuartero, Maria; Crespo, Gastón A; Ghahraman Afshar, Majid; Bakker, Eric

    2014-11-18

    Water analysis is one of the greatest challenges in the field of environmental analysis. In particular, seawater analysis is often difficult because a large amount of NaCl may mask the determination of other ions, i.e., nutrients, halides, and carbonate species. We demonstrate here the use of thin-layer samples controlled by cyclic voltammetry to analyze water samples for chloride, bromide, and iodide. The fabrication of a microfluidic electrochemical cell based on a Ag/AgX wire (working electrode) inserted into a tubular Nafion membrane is described, which confines the sample solution layer to less than 15 μm. By increasing the applied potential, halide ions present in the thin-layer sample (X⁻) are electrodeposited on the working electrode as AgX, while their respective counterions are transported across the perm-selective membrane to an outer solution. Thin-layer cyclic voltammetry yields separated peaks in mixed samples of these three halides, with a linear relationship between the halide concentration and the corresponding peak area from about 10⁻⁵ to 0.1 M for bromide and iodide and from 10⁻⁴ to 0.6 M for chloride. This technique was successfully applied to halide analysis in tap, mineral, and river water as well as seawater. The proposed methodology is absolute and potentially calibration-free, as evidenced by an observed cell-to-cell reproducibility of 2.5% RSD and independence from the operating temperature.
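
The "absolute, potentially calibration-free" claim rests on exhaustive electrolysis: once the whole thin-layer volume is depleted, Faraday's law converts the integrated peak charge directly into concentration, with no calibration curve. The cell volume and peak charge below are invented for illustration, not measurements from the paper:

```python
# Faraday's law for exhaustive thin-layer electrolysis: C = Q / (z * F * V).
F = 96485.0            # C/mol, Faraday constant
z = 1                  # electrons per halide ion deposited as AgX
volume_L = 0.5e-6      # hypothetical thin-layer cell volume: 0.5 μL

peak_charge_C = 4.8e-3  # hypothetical integrated peak area (charge), in C
conc_M = peak_charge_C / (z * F * volume_L)
print(f"halide concentration ≈ {conc_M:.3f} M")
```

The derived value falls inside the chloride linear range (10⁻⁴ to 0.6 M) quoted above, which is what a consistency check on such an absolute method would verify.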

  3. Rapid Analysis of Trace Drugs and Metabolites Using a Thermal Desorption DART-MS Configuration.

    PubMed

    Sisco, Edward; Forbes, Thomas P; Staymates, Matthew E; Gillen, Greg

    2016-01-01

    The need to analyze trace narcotic samples rapidly for screening or confirmatory purposes is of increasing interest to the forensic, homeland security, and criminal justice sectors. This work presents a novel method for the detection and quantification of trace drugs and metabolites from a swipe material using a thermal desorption direct analysis in real time mass spectrometry (TD-DART-MS) configuration. A variation on traditional DART, this configuration allows desorption of the sample into a confined tube, completely independent of the DART source, allowing for more efficient and thermally precise analysis of material present on a swipe. Over thirty trace samples of narcotics, metabolites, and cutting agents deposited onto swipes were rapidly differentiated using this methodology. The non-optimized method led to sensitivities ranging from single nanograms to hundreds of picograms. Direct comparison to traditional DART with a subset of the samples highlighted an improvement in sensitivity by a factor of twenty to thirty and an improvement in sample-to-sample reproducibility from approximately 45% RSD to less than 15% RSD. Rapid extraction-less quantification was also possible.

  4. Using Modern Methodologies with Maintenance Software

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

    Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). Scheduling tasks is difficult because mission needs must be addressed before any other tasks, and those needs often spring up unexpectedly. Keeping track of what everyone is working on is also difficult because each person works on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks to be completed within a sprint based on priority. The team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. The Scrum methodology defines a scrum master, a facilitator who makes sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for Scrum: it has many software applications in maintenance, team members working on disparate applications, and many users, and its work is interruptible based on mission needs, issues, and requirements. In order to use Scrum, the methodology needed adaptation to MPS. Scrum was chosen because it is adaptable. This paper describes the development of a process for using Scrum, a new development methodology, with a team that works on disparate, interruptible tasks across multiple software applications.

  5. [Theoretical and methodological uses of research in Social and Human Sciences in Health].

    PubMed

    Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein

    2012-12-01

    The current article aims to map and critically reflect on the current theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from the social and human sciences in health. The sample was classified thematically as "theoretical/methodological reference", "study type/methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in the exposition of the study methodology and in the instrumental use of qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.

  6. Contributions of Various Radiological Sources to Background in a Suburban Environment

    DOE PAGES

    Milvenan, Richard D.; Hayes, Robert B.

    2016-11-01

    This work is a brief overview and comparison of dose rates stemming from both indoor and outdoor natural background radiation and household objects within a suburban environment in North Carolina. Combined gamma and beta dose rates were taken from indoor objects that ranged from the potassium in fruit to the americium in smoke detectors. For outdoor measurements, data samples at various heights and times were collected to show fluctuations in dose rate due to temperature inversion and geometric attenuation. Although each sample tested proved to have a statistically significant increase over background using Student's t-test, no sample proved to be more than a minor increase in natural radiation dose. Furthermore, the relative contributions from natural radioactivity such as potassium in foods and common household items are shown to be easily distinguished from background using standard handheld instrumentation when applied in a systematic, methodological manner.
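
The Student's t comparison mentioned above amounts to a pooled two-sample test on repeated readings. The dose-rate values below are invented, not the paper's data; the critical value for 10 degrees of freedom is the standard two-sided 95% figure:

```python
import statistics

# Illustrative repeated dose-rate readings (arbitrary units) near a household
# item versus background.
background = [9.8, 10.1, 9.9, 10.0, 10.2, 9.7]
near_item  = [10.6, 10.9, 10.5, 10.8, 10.7, 10.6]

n1, n2 = len(background), len(near_item)
m1, m2 = statistics.mean(background), statistics.mean(near_item)

# Pooled variance, then the two-sample t statistic.
sp2 = ((n1 - 1) * statistics.variance(background) +
       (n2 - 1) * statistics.variance(near_item)) / (n1 + n2 - 2)
t = (m2 - m1) / (sp2 * (1 / n1 + 1 / n2)) ** 0.5

T_CRIT = 2.228  # two-sided 95% critical value, 10 degrees of freedom
print(f"t = {t:.2f}; statistically significant at 95%: {t > T_CRIT}")
```

A large t with a small absolute difference in means is exactly the situation the abstract describes: statistically detectable, but only a minor increase in dose.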

  8. On the use of the optothermal window technique for the determination of iron (II) content in fortified commercial milk

    NASA Astrophysics Data System (ADS)

    Cardoso, S. L.; Dias, C. M. F.; Lima, J. A. P.; Massunaga, M. S. O.; da Silva, M. G.; Vargas, H.

    2003-01-01

    This work reports on the use of the optothermal window and a well-proven phenanthroline colorimetry method for determination of iron (II) content in a commercial fortified milk. Initially, iron (II) in distilled water was determined using a series of calibration samples with ferrous sulfate acting as the source of iron (II). In the following phase, this calibration methodology was applied to commercial milk as the sample matrix. Phenanthroline colorimetry [American Public Health Association, Washington, DC (1998)] was chosen to achieve proper selectivity, i.e., an absorption band centered near the excitation wavelength available for our experiments: the 514-nm line of a 20-mW tunable Ar-ion laser. Finally, samples of commercially available fortified milk were analyzed to assess their Fe (II) content.

  9. Simultaneous determination of V, Ni and Fe in fuel fly ash using solid sampling high resolution continuum source graphite furnace atomic absorption spectrometry.

    PubMed

    Cárdenas Valdivia, A; Vereda Alonso, E; López Guerrero, M M; Gonzalez-Rodriguez, J; Cano Pavón, J M; García de Torres, A

    2018-03-01

    A green and simple method is proposed in this work for the simultaneous determination of V, Ni and Fe in fuel ash samples by solid sampling high resolution continuum source graphite furnace atomic absorption spectrometry (SS HR CS GFAAS). The application of fast programs in combination with direct solid sampling eliminates pretreatment steps and involves minimal manipulation of the sample. Iridium-treated platforms were used throughout the present study, enabling calibration with aqueous standards. Correlation coefficients for the calibration curves were typically better than 0.9931. The concentrations found in the fuel ash samples analysed ranged from 0.66% to 4.2% for V, 0.23% to 0.7% for Ni and 0.10% to 0.60% for Fe. Precisions (%RSD) were 5.2%, 10.0% and 9.8% for V, Ni and Fe, respectively, obtained as the average of the %RSD of six replicates of each fuel ash sample. The optimum conditions established were applied to the determination of the target analytes in fuel ash samples. In order to test the accuracy and applicability of the proposed method, five ash samples from the combustion of fuel in power stations were analysed. The method accuracy was evaluated by comparing the results obtained using the proposed method with those obtained by ICP OES after acid digestion; the results showed good agreement. The goal of this work has been to develop a fast and simple methodology that permits the use of aqueous standards for straightforward calibration and the simultaneous determination of V, Ni and Fe in fuel ash samples by direct SS HR CS GFAAS. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Educational intervention assessment aiming the hearing preservation of workers at a hospital laundry.

    PubMed

    Fontoura, Francisca Pinheiro; Gonçalves, Cláudia Giglio de Oliveira; Willig, Mariluci Hautsch; Lüders, Debora

    2018-02-19

    Evaluate the effectiveness of educational interventions on hearing health developed at a hospital laundry. Quantitative assessment conducted at a hospital laundry. The study sample comprised 80 workers of both genders divided into two groups: Study Group (SG) and Control Group (CG). The educational interventions in hearing preservation were evaluated based on a theoretical approach using the Participatory Problem-based Methodology in five workshops. To assess the results of the workshops, an instrument containing 36 questions on knowledge, attitudes, and practices in hearing preservation at work was used. Questionnaires A and B were applied prior to and one month after the intervention, respectively. The answers to both questionnaires were analyzed by group according to gender and schooling. Results of the pre-intervention phase showed low scores regarding knowledge about hearing health in the work setting for both groups, but significant improvement in knowledge was observed after the intervention in the SG, with 77.7% of the answers presenting a significant difference between the groups. There was also an improvement in the mean scores, with 35 responses (95.22%) presenting scores >4 (considered adequate). The women presented lower knowledge scores than the men; however, these differences were not observed in the SG after the workshops. Schooling was not a relevant factor in the assessment. The educational proposal grounded in the Participatory Problem-based Methodology expanded knowledge about hearing health at work among the participants.

  11. Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín

    2010-01-01

    Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506
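
Propagation of errors for a derived fuel property typically follows the first-order (quadrature) rule, in which relative variances of independent inputs add. A minimal sketch with hypothetical measured values, not data from the paper:

```python
# Derived quantity: heating value per unit mass, q = Q / m.
Q, sQ = 18.50, 0.25    # measured heat (MJ) with standard uncertainty
m, sm = 1.000, 0.010   # sample mass (kg) with standard uncertainty

q = Q / m
# First-order propagation for a quotient of independent inputs:
# relative variances add in quadrature.
rel = ((sQ / Q) ** 2 + (sm / m) ** 2) ** 0.5
sq = q * rel
print(f"q = {q:.2f} ± {sq:.2f} MJ/kg")
```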

  12. 77 FR 4002 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-26

    ... the methodological research previously included in the original System of Record Notice (SORN). This... methodological research on improving various aspects of surveys authorized by Title 13, U.S.C. 8(b), 182, and 196, such as: survey sampling frame design; sample selection algorithms; questionnaire development, design...

  13. 77 FR 26292 - Risk Evaluation and Mitigation Strategy Assessments: Social Science Methodologies to Assess Goals...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-03

    ... determine endpoints; questionnaire design and analyses; and presentation of survey results. To date, FDA has..., the workshop will invest considerable time in identifying best methodological practices for conducting... sample, sample size, question design, process, and endpoints. Panel 2 will focus on alternatives to...

  14. Improvements in the analytical methodology for the residue determination of the herbicide glyphosate in soils by liquid chromatography coupled to mass spectrometry.

    PubMed

    Botero-Coy, A M; Ibáñez, M; Sancho, J V; Hernández, F

    2013-05-31

    The determination of glyphosate (GLY) in soils is of great interest due to the widespread use of this herbicide and the need to assess its impact on the soil/water environment. However, its residue determination is very problematic, especially in soils with high organic matter content, where strong interferences are normally observed, and because of the particular physico-chemical characteristics of this polar/ionic herbicide. In the present work, we improved previously reported LC-MS/MS analytical methodology for GLY and its main metabolite AMPA so that it could be applied to "difficult" soils, like those commonly found in South America, where this herbicide is extensively used in large areas devoted to soya or maize, among other crops. The method is based on derivatization with FMOC followed by LC-MS/MS analysis using a triple quadrupole. After extraction with potassium hydroxide, a combination of extract dilution, adjustment to an appropriate pH, and solid-phase extraction (SPE) clean-up was applied to minimize the strong interferences observed. Despite the clean-up performed, the use of isotope-labelled glyphosate as internal standard (ILIS) was necessary to correct for matrix effects and to compensate for any error occurring during sample processing. The analytical methodology was satisfactorily validated in four soils from Colombia and Argentina fortified at 0.5 and 5 mg/kg. In contrast to most LC-MS/MS methods, where the acquisition of two transitions is recommended, monitoring all available transitions was required for confirmation of positive samples, as some were affected by interference from unknown soil components. This was observed not only for GLY and AMPA but also for the ILIS. Analysis by QTOF MS was useful to confirm the presence of interfering compounds that shared the same nominal mass as the analytes, as well as some of their main product ions. Therefore, the selection of specific transitions was crucial to avoid interferences.
The methodology developed was applied to the analysis of 26 soils from different areas of Colombia and Argentina, and the method's robustness was demonstrated by the analysis of quality control samples over 4 months. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Quantitative Determination of NOA (Naturally Occurring Asbestos) in Rocks: Comparison Between PCOM and SEM Analysis

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Amodeo, Francesco; Giorgis, Ilaria; Vitaliti, Martina

    2017-04-01

    The quantification of NOA (Naturally Occurring Asbestos) in a rock or soil matrix is complex and subject to numerous errors. The purpose of this study is to compare two fundamental methodologies used for the analysis: the first uses Phase Contrast Optical Microscopy (PCOM), while the second uses Scanning Electron Microscopy (SEM). The two methods, although they provide the same result (the ratio of asbestos mass to total mass), have completely different characteristics, and both present pros and cons. Current legislation in Italy allows the use of SEM, XRD, FTIR and PCOM (DM 6/9/94) for the quantification of asbestos in bulk materials and soils; the threshold beyond which the material is considered hazardous waste is an asbestos fibre concentration of 1000 mg/kg (DM 161/2012). The most widely used technique is SEM, which offers the best analytical sensitivity among these (120 mg/kg, DM 6/9/94). The fundamental differences among the analyses concern: the amount of sample portion analyzed; the representativeness of the sample; resolution; analytical precision; the uncertainty of the methodology; and operator errors. Because of the quantification limits of XRD and FTIR (1%, DM 6/9/94), our asbestos laboratory (DIATI POLITO) has applied the PCOM methodology for more than twenty years and, in recent years, the SEM methodology for the quantification of asbestos content. The aim of our research is to compare the results obtained from PCOM analysis with those provided by SEM analysis on the basis of more than 100 natural samples, both from cores (tunnel-boring or exploratory drilling) and from tunnelling excavation. The results obtained show, in most cases, a good correlation between the two techniques. Of particular relevance is the fact that both techniques are reliable for very low quantities of asbestos, even below the analytical sensitivity.
This work highlights the comparison between the two techniques, emphasizing the strengths and weaknesses of each procedure, and suggests that an integrated approach, together with the skills and experience of the operator, may be the best way forward to obtain a constructive improvement of analysis techniques.

  16. Chair Report Consultancy Meeting on Nuclear Security Assessment Methodologies (NUSAM) Transport Case Study Working Group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shull, Doug

    The purpose of the consultancy assignment was to (i) apply the NUSAM assessment methods to hypothetical transport security table-top exercise (TTX) analyses and (ii) document the results in the working materials of the NUSAM case study on transport. A number of working group observations, drawn from the TTX results, are noted in the report.

  17. Advanced Cyberinfrastructure for Geochronology as a Collaborative Endeavor: A Decade of Progress, A Decade of Plans

    NASA Astrophysics Data System (ADS)

    Bowring, J. F.; McLean, N. M.; Walker, J. D.; Gehrels, G. E.; Rubin, K. H.; Dutton, A.; Bowring, S. A.; Rioux, M. E.

    2015-12-01

    The Cyber Infrastructure Research and Development Lab for the Earth Sciences (CIRDLES.org) has worked collaboratively for the last decade with geochronologists from EARTHTIME and EarthChem to build cyberinfrastructure geared to ensuring transparency and reproducibility in geoscience workflows and is engaged in refining and extending that work to serve additional geochronology domains during the next decade. ET_Redux (formerly U-Pb_Redux) is a free open-source software system that provides end-to-end support for the analysis of U-Pb geochronological data. The system reduces raw mass spectrometer (TIMS and LA-ICPMS) data to U-Pb dates, allows users to interpret ages from these data, and then facilitates the seamless federation of the results from one or more labs into a community web-accessible database using standard and open techniques. This EarthChem database - GeoChron.org - depends on keyed references to the System for Earth Sample Registration (SESAR) database that stores metadata about registered samples. These keys are each a unique International Geo Sample Number (IGSN) assigned to a sample and to its derivatives. ET_Redux provides for interaction with this archive, allowing analysts to store, maintain, retrieve, and share their data and analytical results electronically with whomever they choose. This initiative has created an open standard for the data elements of a complete reduction and analysis of U-Pb data, and is currently working to complete the same for U-series geochronology. We have demonstrated the utility of interdisciplinary collaboration between computer scientists and geoscientists in achieving a working and useful system that provides transparency and supports reproducibility, allowing geochemists to focus on their specialties. The software engineering community also benefits by acquiring research opportunities to improve development process methodologies used in the design, implementation, and sustainability of domain-specific software.

  18. A novel HS-SBSE system coupled with gas chromatography and mass spectrometry for the analysis of organochlorine pesticides in water samples.

    PubMed

    Grossi, Paula; Olivares, Igor R B; de Freitas, Diego R; Lancas, Fernando M

    2008-10-01

    A methodology to analyze organochlorine pesticides (OCPs) in water samples was developed using headspace stir bar sorptive extraction (HS-SBSE). The bars were coated in-house with a thick film of PDMS so that they would work properly in the headspace mode. Sampling was done with a novel HS-SBSE system, while the analysis was performed by capillary GC coupled to mass spectrometric detection (HS-SBSE-GC-MS). Extraction optimization over different experimental parameters established a standard equilibrium time of 120 min at 85°C. A mixture of ACN/toluene as the back-extraction solvent performed well in removing the OCPs sorbed on the bar. During the bar evaluation, reproducibility between 2.1 and 14.8% and linearity between 0.96 and 1.0 were obtained for pesticides spiked into water samples over a linear range of 5-17 ng/g.

  19. Classification of Brazilian and foreign gasolines adulterated with alcohol using infrared spectroscopy.

    PubMed

    da Silva, Neirivaldo C; Pimentel, Maria Fernanda; Honorato, Ricardo S; Talhavini, Marcio; Maldaner, Adriano O; Honorato, Fernanda A

    2015-08-01

    The smuggling of products across the border regions of many countries is a practice to be fought. Brazilian authorities are increasingly worried about the illicit trade of fuels along the frontiers of the country. In order to confirm this crime, the Federal Police must have a means of identifying the origin of the fuel. This work describes the development of a rapid, nondestructive methodology to classify gasoline by origin (Brazil, Venezuela and Peru) using infrared spectroscopy and multivariate classification. Partial Least Squares Discriminant Analysis (PLS-DA) and Soft Independent Modeling of Class Analogy (SIMCA) models were built. Direct standardization (DS) was employed to standardize the spectra obtained in different laboratories of the border units of the Federal Police. Two approaches were considered in this work: (1) local and (2) global classification models. Under Approach 1, PLS-DA achieved 100% correct classification, and the deviation of the predicted values for the secondary instrument decreased considerably after performing DS; SIMCA models were not efficient in this classification, even after standardization. Under the global model (Approach 2), both PLS-DA and SIMCA were effective after performing DS. Considering that real situations may involve questioned samples from other nations (such as Peru), the SIMCA method developed under Approach 2 is more adequate, since such a sample will be classified as neither Brazilian nor Venezuelan. This methodology could be applied to other forensic problems involving the chemical classification of a product, provided that specific modeling is performed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. Participatory ergonomics among female cashiers from a department store.

    PubMed

    Cristancho, María Yanire León

    2012-01-01

    The objective of this paper was to control ergonomic risks among female cashiers working in a department store in the retail market. The study was conducted between May and November 2010. Participatory ergonomics was applied by getting to know and understand how the company works, establishing the work team (Ergo group), training the team in ergonomics-related topics, and making decisions and interventions. The sample comprised 71 participants, mostly female cashiers, all of whom had musculoskeletal complaints, reporting pain or discomfort mainly in the neck, lower back, right wrist and shoulders. Among the problems found were postural overload, repetitive work, manual load handling, mental fatigue, environmental discomfort, variable work schedules, extended working days, and an absence of breaks. The main changes implemented in the intervention were the redesign of workstations, the complete replacement of chairs and keyboards, and the introduction of a rotation system, as well as breaks for compensatory exercises. An evident improvement in the identified problems was subsequently observed; it can therefore be concluded that participatory ergonomics is an attractive, appropriate and efficient methodology for solving and controlling ergonomic risks and problems.

  1. Education requirements for nurses working with people with complex neurological conditions: nurses' perceptions.

    PubMed

    Baker, Mark

    2012-01-01

    Following a service evaluation methodology, this paper reports on registered nurses' (RNs) and healthcare assistants' (HCAs) perceptions of the education and training required to work with people with complex neurological disabilities. A service evaluation was undertaken using a non-probability, convenience sample of 368 nurses (n=110 RNs, n=258 HCAs) employed between October and November 2008 at one specialist hospital in south-west London in the U.K. The main results show that respondents were clear about the need to develop an education and training programme for RNs and HCAs working in this speciality area (91% of RNs and 94% of HCAs). A variety of topics were identified for inclusion in a work-based education and training programme, such as positively managing challenging behaviour, moving and handling, and working with families. Adults with complex neurological conditions have diverse needs, and thus nurses working with this patient group require diverse education and training in order to deliver quality patient-focused nursing care. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Fitting distributions to microbial contamination data collected with an unequal probability sampling design.

    PubMed

    Williams, M S; Ebel, E D; Cao, Y

    2013-01-01

    The fitting of statistical distributions to microbial sampling data is a common application in quantitative microbiology and risk assessment. An underlying assumption of most fitting techniques is that data are collected with simple random sampling, which is often not the case. This study develops a weighted maximum likelihood estimation framework appropriate for microbiological samples collected with unequal probabilities of selection. Two examples, based on the collection of food samples during processing, demonstrate the method and highlight the magnitude of the bias in the maximum likelihood estimator when data are inappropriately treated as a simple random sample. Failure to weight samples properly to account for how the data were collected can introduce substantial bias into inferences drawn from them. The proposed methodology will reduce or eliminate an important source of bias in inferences drawn from the analysis of microbial data. This will also make comparisons between studies and the combination of results from different studies more reliable, which is important for risk assessment applications. © 2012 No claim to US Government works.
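    For a normal model, the inverse-probability-weighted likelihood has a closed-form maximum, which makes the core idea easy to sketch. This is an illustrative reconstruction, not the authors' code; the data and selection probabilities below are invented:

```python
import math

def weighted_mle_normal(x, pi):
    """Maximize the inverse-probability-weighted normal log-likelihood.

    x  : observed values
    pi : relative probability that each unit was selected
    Each observation is weighted by 1/pi, so units that were
    over-sampled count for less, undoing the selection bias.
    """
    w = [1.0 / p for p in pi]
    total = sum(w)
    w = [wi / total for wi in w]
    mu = sum(wi * xi for wi, xi in zip(w, x))          # weighted MLE of the mean
    var = sum(wi * (xi - mu) ** 2 for wi, xi in zip(w, x))
    return mu, math.sqrt(var)

# Size-biased sampling: larger contamination values were more
# likely to be selected (pi proportional to x).
x = [1.0, 2.0, 3.0, 4.0]
pi = [0.1, 0.2, 0.3, 0.4]

mu_weighted, _ = weighted_mle_normal(x, pi)
mu_naive = sum(x) / len(x)
print(mu_weighted, mu_naive)  # 1.92 vs 2.5: the naive mean overestimates
```

    The gap between the two estimates is the bias the paper warns about when unequal-probability data are treated as a simple random sample.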

  3. Validation of a sampling plan to generate food composition data.

    PubMed

    Sammán, N C; Gimenez, M A; Bassett, N; Lobo, M O; Marcoleri, M E

    2016-02-15

    A methodology to develop systematic plans for food sampling is proposed. Long-life whole and skimmed milk and sunflower oil were selected to validate the methodology in Argentina. The fatty acid profile of all foods, proximate composition, and the calcium content of milk were determined with AOAC methods. The number of samples (n) was calculated by applying Cochran's formula with coefficients of variation ⩽12% and a maximum permissible estimation error (r) ⩽5% for the calcium content in milks and the unsaturated fatty acids in oil. The resulting n was 9, 11 and 21 for long-life whole milk, skimmed milk, and sunflower oil, respectively. Sample units were randomly collected from production sites and sent to laboratories. The r calculated from the experimental data was ⩽10%, indicating high accuracy in the determination of the analytes of greatest variability and the reliability of the proposed sampling plan. The methodology is an adequate and useful tool for developing sampling plans for food composition analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
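    For a relative (coefficient-of-variation) specification, Cochran's sample-size formula reduces to n = (z·CV/r)². A minimal sketch under that reading (the z, CV and r values are illustrative assumptions, not the paper's exact inputs for each food):

```python
import math

def cochran_n(cv, r, z=1.96):
    """Cochran-style sample size for estimating a mean.

    cv : coefficient of variation of the analyte (as a fraction)
    r  : maximum permissible relative error (as a fraction)
    z  : normal quantile for the confidence level (1.96 ~ 95%)
    """
    return math.ceil((z * cv / r) ** 2)

# At the paper's upper bounds (CV <= 12%, r <= 5%) the formula gives
# an n of the same order as the largest reported value (21):
print(cochran_n(0.12, 0.05))  # 23
print(cochran_n(0.05, 0.05))  # 4 (a low-variability analyte needs few samples)
```

    The quadratic dependence on CV/r explains why the more variable analyte (unsaturated fatty acids in oil) drove the largest n in the study.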

  4. Published methodological quality of randomized controlled trials does not reflect the actual quality assessed in protocols

    PubMed Central

    Mhaskar, Rahul; Djulbegovic, Benjamin; Magazin, Anja; Soares, Heloisa P.; Kumar, Ambuj

    2011-01-01

    Objectives: To assess whether the reported methodological quality of randomized controlled trials (RCTs) reflects their actual methodological quality, and to evaluate the association of effect size (ES) and sample size with methodological quality. Study design: Systematic review. Setting: Retrospective analysis of all consecutive phase III RCTs published by 8 National Cancer Institute Cooperative Groups through 2006. Data were extracted from protocols (actual quality) and publications (reported quality) for each study. Results: 429 RCTs met the inclusion criteria. Overall reporting of methodological quality was poor and did not reflect the actual high methodological quality of the RCTs. The results showed no association between sample size and the actual methodological quality of a trial. Poor reporting of allocation concealment and blinding exaggerated the ES by 6% (ratio of hazard ratios [RHR]: 0.94, 95%CI: 0.88, 0.99) and 24% (RHR: 1.24, 95%CI: 1.05, 1.43), respectively. However, assessment of actual quality showed no association between ES and methodological quality. Conclusion: This largest study to date shows that poor reporting quality does not reflect actual high methodological quality. Assessing the impact of quality on the ES on the basis of reported quality can produce misleading results. PMID:22424985

  5. Abundance, distribution and diversity of gelatinous predators along the northern Mid-Atlantic Ridge: A comparison of different sampling methodologies

    PubMed Central

    Falkenhaug, Tone; Baxter, Emily J.

    2017-01-01

    The diversity and distribution of gelatinous zooplankton were investigated along the northern Mid-Atlantic Ridge (MAR) from June to August 2004. Here, we present results from macrozooplankton trawl sampling, as well as comparisons made between five different methodologies that were employed during the MAR-ECO survey. In total, 16 species of hydromedusae, 31 species of siphonophores and four species of scyphozoans were identified to species level from macrozooplankton trawl samples. Additional taxa were identified to higher taxonomic levels and a single ctenophore genus was observed. Samples were collected at 17 stations along the MAR between the Azores and Iceland. A divergence in the species assemblages was observed at the southern limit of the Subpolar Frontal Zone. The catch composition of gelatinous zooplankton is compared between different sampling methodologies, including a macrozooplankton trawl, a Multinet, a ringnet attached to a bottom trawl, and optical platforms (Underwater Video Profiler (UVP) and Remotely Operated Vehicle (ROV)). Different sampling methodologies are shown to exhibit selectivity towards different groups of gelatinous zooplankton. Only ~21% of taxa caught during the survey were caught by both the macrozooplankton trawl and the Multinet when deployed at the same station. The estimates of gelatinous zooplankton abundance calculated using these two gear types also varied widely (1.4 ± 0.9 individuals 1000 m-3 estimated by the macrozooplankton trawl vs. 468.3 ± 315.4 individuals 1000 m-3 estimated by the Multinet; mean ± s.d.) when used at the same stations (n = 6). While it appears that traditional net sampling can generate useful data on pelagic cnidarians, comparisons with results from the optical platforms suggest that ctenophore diversity and abundance are consistently underestimated, particularly when net sampling is conducted in combination with formalin fixation.
The results emphasise the importance of considering sampling methodology both when planning surveys and when interpreting existing data. PMID:29095891
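    One natural reading of the ~21% overlap figure is the share of all taxa recorded by either gear that appear in both catches (intersection over union). A hedged sketch of that computation; the taxon lists below are invented placeholders, not survey data:

```python
def gear_overlap(catch_a, catch_b):
    """Fraction of the taxa recorded by either gear that
    were caught by both (intersection over union)."""
    a, b = set(catch_a), set(catch_b)
    return len(a & b) / len(a | b)

# Hypothetical station catches (genus names for illustration only):
trawl = {"Aglantha", "Periphylla", "Nanomia", "Atolla"}
multinet = {"Aglantha", "Lensia", "Dimophyes", "Nanomia", "Eudoxoides"}

print(round(gear_overlap(trawl, multinet), 2))  # 0.29
```

    A low overlap under this measure is exactly the gear-selectivity signal the survey reports: each net samples a partly disjoint slice of the gelatinous community.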

  6. Abundance, distribution and diversity of gelatinous predators along the northern Mid-Atlantic Ridge: A comparison of different sampling methodologies.

    PubMed

    Hosia, Aino; Falkenhaug, Tone; Baxter, Emily J; Pagès, Francesc

    2017-01-01

    The diversity and distribution of gelatinous zooplankton were investigated along the northern Mid-Atlantic Ridge (MAR) from June to August 2004. Here, we present results from macrozooplankton trawl sampling, as well as comparisons made between five different methodologies that were employed during the MAR-ECO survey. In total, 16 species of hydromedusae, 31 species of siphonophores and four species of scyphozoans were identified to species level from macrozooplankton trawl samples. Additional taxa were identified to higher taxonomic levels and a single ctenophore genus was observed. Samples were collected at 17 stations along the MAR between the Azores and Iceland. A divergence in the species assemblages was observed at the southern limit of the Subpolar Frontal Zone. The catch composition of gelatinous zooplankton is compared between different sampling methodologies, including a macrozooplankton trawl, a Multinet, a ringnet attached to a bottom trawl, and optical platforms (Underwater Video Profiler (UVP) and Remotely Operated Vehicle (ROV)). Different sampling methodologies are shown to exhibit selectivity towards different groups of gelatinous zooplankton. Only ~21% of taxa caught during the survey were caught by both the macrozooplankton trawl and the Multinet when deployed at the same station. The estimates of gelatinous zooplankton abundance calculated using these two gear types also varied widely (1.4 ± 0.9 individuals 1000 m-3 estimated by the macrozooplankton trawl vs. 468.3 ± 315.4 individuals 1000 m-3 estimated by the Multinet; mean ± s.d.) when used at the same stations (n = 6). While it appears that traditional net sampling can generate useful data on pelagic cnidarians, comparisons with results from the optical platforms suggest that ctenophore diversity and abundance are consistently underestimated, particularly when net sampling is conducted in combination with formalin fixation.
The results emphasise the importance of considering sampling methodology both when planning surveys and when interpreting existing data.

  7. New classification of natural breeding habitats for Neotropical anophelines in the Yanomami Indian Reserve, Amazon Region, Brazil and a new larval sampling methodology.

    PubMed

    Sánchez-Ribas, Jordi; Oliveira-Ferreira, Joseli; Rosa-Freitas, Maria Goreti; Trilla, Lluís; Silva-do-Nascimento, Teresa Fernandes

    2015-09-01

    Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs.

  8. New classification of natural breeding habitats for Neotropical anophelines in the Yanomami Indian Reserve, Amazon Region, Brazil and a new larval sampling methodology

    PubMed Central

    Sánchez-Ribas, Jordi; Oliveira-Ferreira, Joseli; Rosa-Freitas, Maria Goreti; Trilla, Lluís; Silva-do-Nascimento, Teresa Fernandes

    2015-01-01

    Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs. PMID:26517655

  9. Determination of alcohol sulfates in wastewater treatment plant influents and effluents by gas chromatography-mass spectrometry.

    PubMed

    Fernández-Ramos, C; Ballesteros, O; Blanc, R; Zafra-Gómez, A; Jiménez-Díaz, I; Navalón, A; Vílchez, J L

    2012-08-30

    In the present paper, we developed an accurate method for the analysis of alcohol sulfates (AS) in samples from wastewater treatment plant (WWTP) influents and effluents. Although many methodologies concerning the study of anionic surfactants in environmental samples have been published, the number of analytical methodologies that focus on the determination of AS by gas chromatography in the different environmental compartments is at present limited. The reason is that the gas chromatography-mass spectrometry (GC-MS) technique requires a prior hydrolysis reaction followed by derivatization reactions. In the present work, we propose a new procedure in which the hydrolysis and derivatization reactions take place in a single step and the AS are directly converted to trimethylsilyl derivatives. The main factors affecting the solid-phase extraction (SPE), hydrolysis/derivatization and GC-MS procedures were carefully optimised. Quantification of the target compounds was performed by GC-MS in selected ion monitoring (SIM) mode. The limits of detection (LOD) obtained ranged from 0.2 to 0.3 μg L(-1), and the limits of quantification (LOQ) from 0.5 to 1.0 μg L(-1), while inter- and intra-day variability was under 5%. A recovery assay was also carried out; recovery rates for the homologues in spiked samples ranged from 96 to 103%. The proposed method was successfully applied to the determination of anionic surfactants in wastewater samples from a WWTP located in Granada (Spain). Concentration levels for the homologues reached up to 39.4 μg L(-1) in influent and up to 8.1 μg L(-1) in effluent wastewater samples. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Prediction of Cortisol and Progesterone Concentrations in Cow Hair Using Near-Infrared Reflectance Spectroscopy (NIRS).

    PubMed

    Tallo-Parra, Oriol; Albanell, Elena; Carbajal, Annais; Monclús, Laura; Manteca, Xavier; Lopez-Bejar, Manel

    2017-08-01

    Concentrations of different steroid hormones have been used in cows as a measure of adrenal or gonadal activity and, thus, as indicators of stress or reproductive state. Detecting cortisol and progesterone in cow hair provides a long-term, integrative, retrospective measure of adrenal or gonadal/placental activity, respectively. Current techniques for steroid detection require a hormone-extraction procedure that involves time, several types of equipment, management of reagents, and assay procedures (which can also be time-consuming and can destroy the samples). In contrast, near-infrared reflectance spectroscopy (NIRS) is a multi-component prediction technique that is rapid, nondestructive to the sample, and reagent-free. However, as a prediction technique, NIRS needs to be calibrated and validated for each matrix, hormone, and species. The main objective of this study was to evaluate the predictive value of the NIRS technique for hair cortisol and progesterone quantification in cows, using a specific enzyme immunoassay as the reference method. Hair samples from 52 adult Friesian lactating cows from a commercial dairy farm were used. Reflectance spectra of the hair samples were determined with a NIR reflectance spectrophotometer before and after trimming. Although similar results were obtained, a slightly better relationship between the reference data and the NIRS-predicted values was found using trimmed samples. Near-infrared reflectance spectroscopy demonstrated its ability to predict cortisol and progesterone concentrations with reasonable accuracy (R² = 0.90 for cortisol and R² = 0.87 for progesterone). Although NIRS is far from being a complete alternative to current methodologies, the proposed equations can offer screening capability. Considering the advantages of both techniques, our results open the possibility of future work combining hair steroid measurement and NIRS methodology.
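    The R² values reported above are the standard coefficient of determination between NIRS-predicted and immunoassay reference concentrations; a minimal sketch of that statistic (the paired values below are invented for illustration, not the study's data):

```python
def r_squared(y_ref, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_ref = sum(y_ref) / len(y_ref)
    ss_tot = sum((y - mean_ref) ** 2 for y in y_ref)
    ss_res = sum((y - p) ** 2 for y, p in zip(y_ref, y_pred))
    return 1.0 - ss_res / ss_tot

ref  = [1.0, 2.0, 3.0, 4.0]   # e.g. immunoassay reference concentrations
pred = [1.1, 1.9, 3.2, 3.8]   # e.g. NIRS-predicted values

print(round(r_squared(ref, pred), 2))  # 0.98
```

    An R² near 0.9, as in the study, means roughly 90% of the variance in the reference measurements is captured by the spectral prediction, which is what justifies the authors' "screening" framing rather than full replacement of the assay.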

  11. Evaluation of thermal optical analysis method of elemental carbon for marine fuel exhaust.

    PubMed

    Lappi, Maija K; Ristimäki, Jyrki M

    2017-12-01

    The awareness of black carbon (BC) as the second-largest anthropogenic contributor to global warming and an enhancer of ice melting has increased. With the prospective increase in shipping, especially in the Arctic, the reliability of BC emissions data and their inventoried amounts from ships is gaining more attention. The International Maritime Organization (IMO) is actively working toward estimating the quantities and effects of BC, especially in the Arctic, and has launched work toward a definition of BC and agreement on appropriate methods for its determination from shipping emission sources. In our study we evaluated the suitability of elemental carbon (EC) analysis by a thermal-optical transmittance (TOT) method for marine exhausts, and possible measures to overcome the analysis interferences related to these chemically complex emissions. The measures included drying with CaSO4, evaporation at 40-180°C, H2O treatment, and variation of the sampling method (in-stack and diluted) and its parameters (e.g., dilution ratio, Dr). A re-evaluation of the nominal organic carbon (OC)/EC split point was made. Measurement of residual carbon after solvent extraction (TC-CSOF) was used as a reference, and later also filter smoke number (FSN) measurement, which is dealt with in a forthcoming paper by the authors. The exhaust sources used for collecting the particle samples were mainly four-stroke marine engines operated at variable loads with marine fuels ranging from light to heavy fuel oils (LFO and HFO) with a sulfur content range of <0.1-2.4% S. The results were found to depend on many factors, namely sampling, the preparation and analysis method, and fuel quality. The condensed H2SO4 + H2O on the particulate matter (PM) filter affected the measured EC content and also promoted the formation of pyrolytic carbon (PyC) from OC, reducing the accuracy of the EC determination. Thus, uncertainty remained regarding the EC results from HFO fuels.
The work supports one part of the decision making on black carbon (BC) determination methodology. If regulations on BC emissions from marine engines are implemented in the future, a well-defined and at best unequivocal method of BC determination will be required for coherent and comparable emission inventories and for estimating BC effects. As the aerosol from marine emission sources may be very heterogeneous and low in BC, special attention was paid to the effects of sampling conditions and sample pretreatments on the validity of the results when developing the thermal-optical analysis (TOT) methodology.

  12. Seeking Connectivity in Nurses' Work Environments: Advancing Nurse Empowerment Theory.

    PubMed

    Udod, Sonia

    2014-09-01

    The purpose of this study was to investigate how staff nurses and their managers exercise power in a hospital setting in order to better understand what fosters or constrains staff nurses' empowerment and to extend nurse empowerment theory. Power is integral to empowerment, and attention to the challenges in nurses' work environment and nurse outcomes by administrators, researchers, and policy-makers has created an imperative to advance a theoretical understanding of power in the nurse-manager relationship. A sample of 26 staff nurses on 3 units of a tertiary hospital in western Canada was observed and interviewed about how the manager affected their ability to do their work. Grounded theory methodology was used. Seeking connectivity was the basic social process, indicating that the manager plays a critical role in the work environment and that nurses need the manager to share power with them in the provision of safe, quality patient care. Copyright © by Ingram School of Nursing, McGill University.

  13. Development and prevention of work related disorders in a sample of Brazilian violinists.

    PubMed

    Lima, Ronise Costa; Pinheiro, Tarcísio Márcio Magalhães; Dias, Elizabeth Costa; de Andrade, Edson Queiroz

    2015-06-05

    The present study is part of a project designed to investigate the development of disorders related to the work of orchestra violinists. To describe and analyze the functional disorders of the musculoskeletal systems of violinists from the four orchestras in Belo Horizonte, Brazil. Analyses of musculoskeletal system disorders found in violinists from orchestras in Belo Horizonte, Brazil, were completed using a variety of approaches, including Occupational Therapy, Epidemiology and Social Sciences methodologies. Participants sustained musculoskeletal disorders despite the common-sense belief that musicians are generally healthier than other professional groups. The struggle for a better financial situation forced study participants to work harder, in a variety of work environments, increasing and diversifying their exposure to risk factors. Protective and preventive measures were scarce and in most cases were only employed after the onset of musculoskeletal disorders. The use of inadequate strategies and the lack of appropriate options to deal with risk factors contributed to the maintenance of symptoms or the onset of health disorders.

  14. Quantifying chemical reactions by using mixing analysis.

    PubMed

    Jurado, Anna; Vázquez-Suñé, Enric; Carrera, Jesús; Tubau, Isabel; Pujades, Estanislao

    2015-01-01

    This work is motivated by the need for a sound understanding of the chemical processes that affect organic pollutants in an urban aquifer. We propose an approach to quantify such processes using mixing calculations. The methodology consists of the following steps: (1) identification of the recharge sources (end-members) and selection of the species (conservative and non-conservative) to be used, (2) identification of the chemical processes and (3) evaluation of mixing ratios including the chemical processes. This methodology has been applied in the Besòs River Delta (NE Barcelona, Spain), where the River Besòs is the main aquifer recharge source. A total of 51 groundwater samples were collected from July 2007 to May 2010 during four field campaigns. Three river end-members were necessary to explain the temporal variability of the River Besòs: one river end-member from the wet periods (W1) and two from dry periods (D1 and D2). This methodology has proved to be useful not only for computing the mixing ratios but also for quantifying processes such as calcite and magnesite dissolution, aerobic respiration and denitrification occurring at each observation point. Copyright © 2014 Elsevier B.V. All rights reserved.
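The mixing-ratio evaluation (step 3) reduces to a small linear system in the end-member contributions. A minimal sketch follows; the end-member and sample concentrations are invented for illustration, not data from the Besòs study.

```python
import numpy as np

# Columns = hypothetical recharge end-members (e.g. W1, D1, D2); rows =
# conservative species plus the constraint that ratios sum to one.
E = np.array([
    [450.0, 900.0, 1200.0],  # chloride (mg/L) in each end-member
    [ 50.0, 120.0,  180.0],  # sulfate (mg/L) in each end-member
    [  1.0,   1.0,    1.0],  # mass balance: mixing ratios sum to 1
])
sample = np.array([780.0, 105.0, 1.0])  # observed mixed groundwater

# Least-squares mixing ratios (non-negativity would normally be
# enforced as well, e.g. with scipy.optimize.nnls).
ratios, *_ = np.linalg.lstsq(E, sample, rcond=None)
```

With more conservative species than end-members the system becomes overdetermined, and the departures of the non-conservative species from their mixed values are what quantify processes such as denitrification at each point.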

  15. Clinical results from a noninvasive blood glucose monitor

    NASA Astrophysics Data System (ADS)

    Blank, Thomas B.; Ruchti, Timothy L.; Lorenz, Alex D.; Monfre, Stephen L.; Makarewicz, M. R.; Mattu, Mutua; Hazen, Kevin

    2002-05-01

    Non-invasive blood glucose monitoring has long been proposed as a means of advancing the management of diabetes through increased measurement and control. The use of a near-infrared (NIR) spectroscopy-based methodology for noninvasive monitoring has been pursued by a number of groups. The accuracy of the NIR measurement technology is limited by challenges related to the instrumentation, the heterogeneity and time-variant nature of skin tissue, and the complexity of the calibration methodology. In this work, we discuss results from a clinical study that targeted the evaluation of individual calibrations for each subject based on a series of controlled calibration visits. While the customization of the calibrations to individuals was intended to reduce model complexity, the extensive requirements for each individual set of calibration data were difficult to achieve and required several days of measurement. Through the careful selection of a small subset of data from all samples collected on the 138 study participants in a previous study, we have developed a methodology for applying a single standard calibration to multiple persons. The standard calibrations have been applied to multiple individuals and shown to be persistent over periods greater than 24 weeks.

  16. FDAAA legislation is working, but methodological flaws undermine the reliability of clinical trials: a cross-sectional study.

    PubMed

    Marin Dos Santos, Douglas H; Atallah, Álvaro N

    2015-01-01

    The relationship between clinical research and the pharmaceutical industry has placed clinical trials in jeopardy. According to the medical literature, more than 70% of clinical trials are industry-funded. Many of these trials remain unpublished or have methodological flaws that distort their results. In 2007, the Food and Drug Administration Amendments Act (FDAAA) was signed into law, aiming to provide public access to a broad range of biomedical information through the ClinicalTrials.gov platform (https://www.clinicaltrials.gov). We accessed ClinicalTrials.gov and evaluated the compliance of researchers and sponsors with the FDAAA. Our sample comprised 243 protocols of clinical trials of the biological monoclonal antibodies (mAb) adalimumab, bevacizumab, infliximab, rituximab, and trastuzumab. We demonstrate that the new legislation has positively affected transparency in clinical research, through a significant increase in publication and online reporting rates after the enactment of the law. Poorly designed trials, however, remain a challenge to be overcome, due to a high prevalence of methodological flaws. These flaws affect the quality of the clinical information available, breaching the ethical duties of sponsors and researchers, as well as the human right to health.

  17. FDAAA legislation is working, but methodological flaws undermine the reliability of clinical trials: a cross-sectional study

    PubMed Central

    Atallah, Álvaro N.

    2015-01-01

    The relationship between clinical research and the pharmaceutical industry has placed clinical trials in jeopardy. According to the medical literature, more than 70% of clinical trials are industry-funded. Many of these trials remain unpublished or have methodological flaws that distort their results. In 2007, the Food and Drug Administration Amendments Act (FDAAA) was signed into law, aiming to provide public access to a broad range of biomedical information through the ClinicalTrials.gov platform (https://www.clinicaltrials.gov). We accessed ClinicalTrials.gov and evaluated the compliance of researchers and sponsors with the FDAAA. Our sample comprised 243 protocols of clinical trials of the biological monoclonal antibodies (mAb) adalimumab, bevacizumab, infliximab, rituximab, and trastuzumab. We demonstrate that the new legislation has positively affected transparency in clinical research, through a significant increase in publication and online reporting rates after the enactment of the law. Poorly designed trials, however, remain a challenge to be overcome, due to a high prevalence of methodological flaws. These flaws affect the quality of the clinical information available, breaching the ethical duties of sponsors and researchers, as well as the human right to health. PMID:26131374

  18. 77 FR 7109 - Establishment of User Fees for Filovirus Testing of Nonhuman Primate Liver Samples

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-10

    ... assay (ELISA) or other appropriate methodology. Each specimen will be held for six months. After six... loss of the only commercially available antigen-detection ELISA filovirus testing facility. Currently... current methodology (ELISA) used to test NHP liver samples. This cost determines the amount of the user...

  19. Learning and Innovation in Agriculture and Rural Development: The Use of the Concepts of Boundary Work and Boundary Objects

    ERIC Educational Resources Information Center

    Tisenkopfs, Talis; Kunda, Ilona; šumane, Sandra; Brunori, Gianluca; Klerkx, Laurens; Moschitz, Heidrun

    2015-01-01

    Purpose: The paper explores the role of boundary work and boundary objects in enhancing learning and innovation processes in hybrid multi-actor networks for sustainable agriculture (LINSA). Design/Methodology/Approach: Boundary work in LINSA is analysed on the basis of six case studies carried out in the SOLINSA project under a common methodology. In…

  20. Tsunami hazard assessments with consideration of uncertain earthquake characteristics

    NASA Astrophysics Data System (ADS)

    Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.

    2017-12-01

    The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, an uncertainty propagation method must be adopted to determine tsunami uncertainties at a feasible computational cost. In this study we propose a new methodology which improves on existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics: the slip distribution and the location. First, the methodology generates consistent earthquake slip samples by means of a Karhunen-Loeve (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by LeVeque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike that reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a stochastic reduced-order model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case: we study tsunamis generated at the site of the 2014 Chilean earthquake, generating earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates earthquake samples consistent with the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations.
We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for the 2014 Chilean earthquake. Results show that leading wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of the studied earthquake characteristics.
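The K-L sampling step can be sketched in one dimension: draw correlated slip fields by eigendecomposing an assumed covariance and combining the leading modes with standard normal coefficients. The grid, correlation model, mode count and mean slip below are illustrative assumptions, not the study's parameters.

```python
import numpy as np

# Illustrative 1-D Karhunen-Loeve sampling of earthquake slip.
n = 50
x = np.linspace(0.0, 100.0, n)         # subfault positions along strike (km)
corr_len = 20.0                        # assumed correlation length (km)

# Assumed exponential covariance between subfault slips
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# K-L expansion: eigendecomposition of the covariance, truncated to
# the leading modes that capture most of the variance
vals, vecs = np.linalg.eigh(C)
order = np.argsort(vals)[::-1]
m = 10                                 # retained K-L modes
lam = np.clip(vals[order[:m]], 0.0, None)
phi = vecs[:, order[:m]]

rng = np.random.default_rng(0)
z = rng.standard_normal(m)             # independent standard normals
field = phi @ (np.sqrt(lam) * z)       # zero-mean correlated field
slip = 5.0 + field                     # assumed mean slip of 5 m
```

A translation process, as cited in the abstract, would then map this Gaussian field to the desired marginal distribution, for example to keep slip non-negative.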

  1. Effects of disease severity distribution on the performance of quantitative diagnostic methods and proposal of a novel 'V-plot' methodology to display accuracy values.

    PubMed

    Petraco, Ricardo; Dehbi, Hakim-Moulay; Howard, James P; Shun-Shin, Matthew J; Sen, Sayan; Nijjer, Sukhjinder S; Mayet, Jamil; Davies, Justin E; Francis, Darrel P

    2018-01-01

    Diagnostic accuracy is widely accepted by researchers and clinicians as an optimal expression of a test's performance. The aim of this study was to evaluate the effects of disease severity distribution on values of diagnostic accuracy and to propose a sample-independent methodology to calculate and display accuracy of diagnostic tests. We evaluated the diagnostic relationship between two hypothetical methods to measure serum cholesterol (Chol-rapid and Chol-gold) by generating samples with statistical software and (1) keeping the numerical relationship between methods unchanged and (2) changing the distribution of cholesterol values. Metrics of categorical agreement were calculated (accuracy, sensitivity and specificity). Finally, a novel methodology to display and calculate accuracy values was presented (the V-plot of accuracies). No single value of diagnostic accuracy can be used to describe the relationship between tests, as accuracy is a metric heavily affected by the underlying sample distribution. Our novel proposed methodology, the V-plot of accuracies, can be used as a sample-independent measure of a test's performance against a reference gold standard.
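The central point, that accuracy changes with the sample distribution even when the test itself does not, can be reproduced with a small simulation. The cutoff, error model and cohort distributions below are invented for illustration, not the study's design.

```python
import numpy as np

# Two cohorts measured by the same noisy "rapid" test: one clustered
# near the diagnostic cutoff, one spread away from it. The measurement
# relationship is identical; only the sample distribution differs.
rng = np.random.default_rng(1)
cutoff = 200.0                                      # hypothetical threshold

def accuracy(gold):
    rapid = gold + rng.normal(0.0, 10.0, gold.size)  # same error model
    return np.mean((rapid > cutoff) == (gold > cutoff))

near = rng.normal(200.0, 15.0, 100_000)  # values clustered near cutoff
far = rng.normal(200.0, 60.0, 100_000)   # values spread away from it
a_near = accuracy(near)
a_far = accuracy(far)                    # a_far exceeds a_near
```

Samples concentrated near the cutoff contain more borderline cases, so the same test appears less accurate, which is exactly why a single accuracy value is sample-dependent.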

  2. Quantification of phototrophic biomass on rocks: optimization of chlorophyll-a extraction by response surface methodology.

    PubMed

    Fernández-Silva, I; Sanmartín, P; Silva, B; Moldes, A; Prieto, B

    2011-01-01

    Biological colonization of rock surfaces constitutes an important problem for maintenance of buildings and monuments. In this work, we aim to establish an efficient extraction protocol for chlorophyll-a specific for rock materials, as this is one of the most commonly used biomarkers for quantifying phototrophic biomass. For this purpose, rock samples were cut into blocks, and three different mechanical treatments were tested, prior to extraction in dimethyl sulfoxide (DMSO). To evaluate the influence of the experimental factors (1) extractant-to-sample ratio, (2) temperature, and (3) time of incubation, on chlorophyll-a recovery (response variable), incomplete factorial designs of experiments were followed. Temperature of incubation was the most relevant variable for chlorophyll-a extraction. The experimental data obtained were analyzed following a response surface methodology, which allowed the development of empirical models describing the interrelationship between the considered response and experimental variables. The optimal extraction conditions for chlorophyll-a were estimated, and the expected yields were calculated. Based on these results, we propose a method involving application of ultrasound directly to intact sample, followed by incubation in 0.43 ml DMSO/cm(2) sample at 63°C for 40 min. Confirmation experiments were performed at the predicted optimal conditions, allowing chlorophyll-a recovery of 84.4 ± 11.6% (90% was expected), which implies a substantial improvement with respect to the expected recovery using previous methods (68%). This method will enable detection of small amounts of photosynthetic microorganisms and quantification of the extent of biocolonization of stone surfaces.
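The core of a response-surface analysis can be sketched with a one-factor quadratic fit: model the response, then locate the stationary point. The temperatures and recoveries below are invented for illustration (not the study's data), though the fitted optimum happens to land near the reported 63°C.

```python
import numpy as np

# Invented incubation temperatures (degrees C) and chlorophyll-a
# recoveries (%); a real design would also vary time and extractant ratio.
T = np.array([40.0, 50.0, 60.0, 70.0, 80.0])
y = np.array([60.0, 74.0, 84.0, 82.0, 70.0])

a, b, c = np.polyfit(T, y, 2)     # fit y = a*T^2 + b*T + c
T_opt = -b / (2.0 * a)            # stationary point of the parabola
y_opt = np.polyval([a, b, c], T_opt)
```

With several factors, the same idea extends to a full quadratic surface with interaction terms, and the optimum is the solution of the zero-gradient system.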

  3. 76 FR 54216 - Pacific Fishery Management Council (Council); Work Session To Review Proposed Salmon Methodology...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-31

    ... Fishery Management Council (Council); Work Session To Review Proposed Salmon Methodology Changes AGENCY.... ACTION: Notice of a public meeting. SUMMARY: The Pacific Fishery Management Council's Salmon Technical Team (STT), Scientific and Statistical Committee (SSC) Salmon Subcommittee, and Model Evaluation...

  4. Microextraction by Packed Sorbent (MEPS) and Solid-Phase Microextraction (SPME) as Sample Preparation Procedures for the Metabolomic Profiling of Urine

    PubMed Central

    Silva, Catarina; Cavaco, Carina; Perestrelo, Rosa; Pereira, Jorge; Câmara, José S.

    2014-01-01

    For a long time, sample preparation was unrecognized as a critical issue in the analytical methodology, thus limiting the performance that could be achieved. However, the improvement of microextraction techniques, particularly microextraction by packed sorbent (MEPS) and solid-phase microextraction (SPME), completely modified this scenario by introducing unprecedented control over this process. Urine is a biological fluid that is very interesting for metabolomics studies, allowing human health and disease characterization in a minimally invasive form. In this manuscript, we will critically review the most relevant and promising works in this field, highlighting how the metabolomic profiling of urine can be an extremely valuable tool for the early diagnosis of highly prevalent diseases, such as cardiovascular, oncologic and neurodegenerative ones. PMID:24958388

  5. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    PubMed

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent regulations on quality control of its products because quality is critical for both the production process and consumer safety. Within the framework of "process analytical technology" (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near-infrared spectroscopy (NIRS) combined with chemometrics has lately proved efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology, the "process spectrum", is proposed for selecting the calibration set, into which physical changes in the samples at each stage are algebraically incorporated. We also established a "model space", defined by Hotelling's T(2) and Q-residuals statistics, for outlier identification (inside/outside the defined space) in order to select objectively the factors used in calibration set construction. The results confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for implementing this easy and fast methodology in the pharmaceutical industry. Copyright © 2015 Elsevier B.V. All rights reserved.
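A minimal sketch of the "model space" idea, assuming a plain PCA on mean-centered spectra (random numbers stand in for real spectra): Hotelling's T(2) measures a sample's distance inside the model space, while the Q statistic measures its residual distance from the model plane; samples outside control limits on either are flagged as outliers.

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(30, 50))            # 30 "spectra" x 50 wavelengths
Xc = X - X.mean(axis=0)                  # mean-center the data

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                    # retained PCA factors
scores = U[:, :k] * s[:k]                # sample scores in the model space

var = s[:k] ** 2 / (len(X) - 1)          # score variance per factor
T2 = np.sum(scores**2 / var, axis=1)     # Hotelling's T^2 per sample
Q = np.sum((Xc - scores @ Vt[:k]) ** 2, axis=1)   # Q residuals per sample
```

In practice, control limits for T(2) and Q are set from the calibration data, and candidate samples falling inside both limits are eligible for the calibration set.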

  6. Sampling methods to the statistical control of the production of blood components.

    PubMed

    Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo

    2017-12-01

    The control of blood component specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of the control depends on the sampling; however, a correct sampling methodology does not seem to be systematically applied. Commonly, the sampling is intended to comply solely with the 1% specification for the produced blood components. From a purely statistical viewpoint, however, this model is arguably not grounded in a consistent sampling technique. This could be a severe limitation in detecting abnormal patterns and in assuring that the production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and of the related statistical process control decisions for the purpose they are suggested for. Copyright © 2017 Elsevier Ltd. All rights reserved.
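As a sketch of the second proposed scheme, the sample size for estimating a nonconforming proportion in a finite lot can be computed with the standard finite-population correction. The lot size, expected rate, confidence coefficient and margin below are illustrative choices, not the article's worked values.

```python
import math

def sample_size(N, p=0.01, z=1.96, E=0.01):
    """N: lot size; p: expected nonconforming rate;
    z: confidence coefficient; E: tolerated margin of error."""
    n0 = (z**2 * p * (1 - p)) / E**2     # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / N)          # finite-population correction
    return math.ceil(n)

n = sample_size(N=5000)                  # units to sample from a 5000-unit lot
```

The correction matters here: for small lots the required sample shrinks well below the infinite-population figure, which is part of why a flat percentage rule is statistically inconsistent.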

  7. A rapid sample screening method for authenticity control of whiskey using capillary electrophoresis with online preconcentration.

    PubMed

    Heller, Melina; Vitali, Luciano; Oliveira, Marcone Augusto Leal; Costa, Ana Carolina O; Micke, Gustavo Amadeu

    2011-07-13

    The present study aimed to develop a methodology using capillary electrophoresis for the determination of sinapaldehyde, syringaldehyde, coniferaldehyde, and vanillin in whiskey samples. The main objective was to obtain a screening method to differentiate authentic samples from seized samples suspected of being counterfeit, using the phenolic aldehydes as chemical markers. The optimized background electrolyte was composed of 20 mmol L(-1) sodium tetraborate with 10% MeOH at pH 9.3. The study examined two kinds of sample stacking using a long-end injection mode: normal sample stacking (NSM) and sample stacking with matrix removal (SWMR). In SWMR, the optimized injection time of the samples was 42 s (SWMR42); at this time, no matrix effects were observed. Values of r were >0.99 for both methods. The LOD and LOQ were better than 100 and 330 mg mL(-1) for NSM and better than 22 and 73 mg L(-1) for SWMR. The reliability of CE-UV in the aldehyde analysis of the real samples was compared statistically with the LC-MS/MS methodology, and no significant differences were found at a 95% confidence level between the methodologies.

  8. Interrogating discourse: the application of Foucault's methodological discussion to specific inquiry.

    PubMed

    Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M

    2013-09-01

    Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.

  9. Respondent-Driven Sampling: An Assessment of Current Methodology.

    PubMed

    Gile, Krista J; Handcock, Mark S

    2010-08-01

    Respondent-Driven Sampling (RDS) employs a variant of a link-tracing network sampling strategy to collect data from hard-to-reach populations. By tracing the links in the underlying social network, the process exploits the social structure to expand the sample and reduce its dependence on the initial (convenience) sample. The current estimators of population averages make strong assumptions in order to treat the data as a probability sample. We evaluate three critical sensitivities of the estimators: to bias induced by the initial sample, to uncontrollable features of respondent behavior, and to the without-replacement structure of sampling. Our analysis indicates: (1) that the convenience sample of seeds can induce bias, and the number of sample waves typically used in RDS is likely insufficient for the type of nodal mixing required to obtain the reputed asymptotic unbiasedness; (2) that preferential referral behavior by respondents leads to bias; (3) that when a substantial fraction of the target population is sampled the current estimators can have substantial bias. This paper sounds a cautionary note for the users of RDS. While current RDS methodology is powerful and clever, the favorable statistical properties claimed for the current estimates are shown to be heavily dependent on often unrealistic assumptions. We recommend ways to improve the methodology.
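For context on the estimators being critiqued, a widely used RDS estimator (RDS-II, the Volz-Heckathorn inverse-degree-weighted mean) can be sketched as follows; the degrees and trait values are invented for illustration.

```python
import numpy as np

# Each respondent is weighted by the inverse of their reported network
# degree, to offset link-tracing's tendency to oversample well-connected
# people. Data below are hypothetical.
degree = np.array([2, 5, 10, 4, 8, 3])    # reported network sizes
trait = np.array([1, 0, 1, 0, 1, 0])      # trait of interest (0/1)

w = 1.0 / degree
p_hat = np.sum(w * trait) / np.sum(w)     # RDS-II weighted prevalence
naive = trait.mean()                      # unweighted sample mean
```

Here the weighted estimate falls below the naive mean because the trait is concentrated among high-degree respondents; the paper's point is that even this correction rests on assumptions (random referral, with-replacement sampling) that often fail.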

  10. Stigma-related experiences in non-communicable respiratory diseases: A systematic review.

    PubMed

    Rose, Shiho; Paul, Christine; Boyes, Allison; Kelly, Brian; Roach, Della

    2017-08-01

    The stigma of non-communicable respiratory diseases (NCRDs), whether perceived or otherwise, can be an important element of a patient's experience of his/her illness and a contributing factor to poor psychosocial, treatment and clinical outcomes. This systematic review examines the evidence regarding the associations between stigma-related experiences and patient outcomes, comparing findings across a range of common NCRDs. Electronic databases and manual searches were conducted to identify original quantitative research published to December 2015. Articles focussing on adult patient samples diagnosed with asthma, chronic obstructive pulmonary disease (COPD), cystic fibrosis, lung cancer or mesothelioma, and including a measurement of stigma-related experience (i.e. perceived stigma, shame, blame or guilt), were eligible for inclusion. Included articles were described for study characteristics, outcome scores, correlations between stigma-related experiences and patient outcomes, and methodological rigor. Twenty-five articles were eligible for this review, with most (n = 20) related to lung cancer. No articles for cystic fibrosis were identified. Twenty unique scales were used, with low to moderate stigma-related experiences reported overall. The stigma-related experiences significantly correlated with all six patient-related domains explored (psychosocial, quality of life, behavioral, physical, treatment and work), which were investigated more widely in COPD and lung cancer samples. No studies adequately met all criteria for methodological rigor. The inter-connectedness of stigma-related experiences with other aspects of patient experience highlights that an integrated approach is needed to address this important issue. Future studies should adopt more rigorous methodology, including streamlining measures, to provide robust evidence.

  11. Willingness to Donate Human Samples for Establishing a Dermatology Research Biobank: Results of a Survey.

    PubMed

    Gross, Durdana; Schmitz, Arndt A; Vonk, Richardus; Igney, Frederik H; Döcke, Wolf-Dietrich; Schoepe, Stefanie; Sterry, Wolfram; Asadullah, Khusru

    2011-09-01

    There is a rising need for biomaterial in dermatological research with regard to both quality and quantity. Research biobanks, as organized collections of biological material with associated personal and clinical data, are of increasing importance. Besides technological/methodological and legal aspects, the willingness of patients and healthy volunteers to donate samples is a key success factor. To analyze the theoretical willingness to donate blood and skin samples, we developed and distributed a questionnaire. Six hundred nineteen questionnaires were returned and analyzed. The willingness to donate samples of blood (82.5%) and skin (58.7%) is high among the population analyzed and seems to be largely independent of any expense allowance. People working in the healthcare system, dermatological patients, and more highly qualified individuals seem to be particularly willing to donate material. Adequate patient insurance as well as extensive education about risks and benefits is requested. In summary, there is a high willingness to donate biological samples for dermatological research. This theoretical awareness fits well with our own experiences in establishing such a biobank.

  12. Culturally Competent Social Work Research: Methodological Considerations for Research with Language Minorities

    ERIC Educational Resources Information Center

    Casado, Banghwa Lee; Negi, Nalini Junko; Hong, Michin

    2012-01-01

    Despite the growing number of language minorities, foreign-born individuals with limited English proficiency, this population has been largely left out of social work research, often due to methodological challenges involved in conducting research with this population. Whereas the professional standard calls for cultural competence, a discussion…

  13. Exploring Children's Perceptions of Play Using Visual Methodologies

    ERIC Educational Resources Information Center

    Anthamatten, Peter; Wee, Bryan Shao-Chang; Korris, Erin

    2013-01-01

    Objective: A great deal of scholarly work has examined the way that physical, social and cultural environments relate to children's health behaviour, particularly with respect to diet and exercise. While this work is critical, little research attempts to incorporate the views and perspectives of children themselves using visual methodologies.…

  14. Contextualising Learning at the Education-Training-Work Interface

    ERIC Educational Resources Information Center

    Harreveld, Bobby; Singh, Michael

    2009-01-01

    Purpose: The purpose of this paper is to investigate the ways in which learning is contextualised among the intersecting worlds of education, training and work. Design/methodology/approach: A case study methodology is used. Findings: It was found that contextualised learning is integral to industry-school transition strategies in senior secondary…

  15. Cultural variation in the motivational standards of self-enhancement and self-criticism among bicultural Asian American and Anglo American students.

    PubMed

    Zusho, Akane

    2008-10-01

    Recent work on biculturalism has made theoretical and methodological inroads into our understanding of the relation of cultural processes with psychological functioning. Through the use of cultural priming methodologies, investigators have demonstrated that biculturals, or individuals who have experienced and identify with more than one culture, can switch between various "cultural frames of reference" in response to corresponding social cues (Hong, Morris, Chiu, & Benet-Martinez, 2000). Drawing on this work on the cognitive implications of biculturalism, the purpose of the present study was to examine the assumption that independent and interdependent self-construals are associated with the motivational standards of self-enhancement and self-criticism, respectively. More specifically, the effects of differential primes of self on ratings of self-enhancement were investigated in a sample of bicultural Asian American (N = 42) and Anglo American (N = 60) college students; overall, more similarities than differences were noted between the two groups. It was hypothesized that Anglo American students would display marked tendencies toward self-enhancement. However, this hypothesis was not supported. Nevertheless, consistent prime effects were observed for a selected number of ratings related to academic virtues, with those who received an independent-self prime often exhibiting greater self-enhancing tendencies than those who received an interdependent-self prime. For example, participants in the independent-self condition reported on average significantly higher ratings for self-discipline and initiative, as well as the degree to which they perceived themselves to be hard-working. Implications for the work on self-representations, motivation, and acculturation are discussed.

  16. Microcantilever sensor platform for UGV-based detection

    NASA Astrophysics Data System (ADS)

    Lawrence, Tyson T.; Halleck, A. E.; Schuler, Peter S.; Mahmud, K. K.; Hicks, David R.

    2010-04-01

    The increased use of Unmanned Ground Vehicles (UGVs) drives the need for new lightweight, low cost sensors. Microelectromechanical System (MEMS) based microcantilever sensors are a promising technology to meet this need, because they can be manufactured at low cost on a mass scale, and are easily integrated into a UGV platform for detection of explosives and other threat agents. While the technology is extremely sensitive, selectivity is a major challenge and the response modes are not well understood. This work summarizes advances in characterizing ultrasensitive microcantilever responses, sampling considerations, and sensor design and cantilever coating methodologies consistent with UGV point detector needs.

  17. A self-learning algorithm for biased molecular dynamics

    PubMed Central

    Tribello, Gareth A.; Ceriotti, Michele; Parrinello, Michele

    2010-01-01

    A new self-learning algorithm for accelerated dynamics, reconnaissance metadynamics, is proposed that is able to work with a very large number of collective coordinates. Acceleration of the dynamics is achieved by constructing a bias potential in terms of a patchwork of one-dimensional, locally valid collective coordinates. These collective coordinates are obtained from trajectory analyses so that they adapt to any new features encountered during the simulation. We show how this methodology can be used to enhance sampling in real chemical systems citing examples both from the physics of clusters and from the biological sciences. PMID:20876135

  18. Development of Methodology and Technology for Identifying and Quantifying Emission Products from Open Burning and Open Detonation Thermal Treatment Methods. Field Test Series A, B, and C. Volume 2, Part B. Quality Assurance and Quality Control. Appendices

    DTIC Science & Technology

    1992-01-01

    the uncertainty. The above method can give an estimate of the precision of the analysis. However, determining the accuracy cannot be done as... speciation has been determined from analyzing model samples as well as comparison with other methods and combinations of other methods with this method... laboratory. The output of the sensor is characterized over its working range and an appropriate response factor determined by linear regression of the

  19. An Approach for Transmission Loss and Cost Allocation by Loss Allocation Index and Co-operative Game Theory

    NASA Astrophysics Data System (ADS)

    Khan, Baseem; Agnihotri, Ganga; Mishra, Anuprita S.

    2016-03-01

    In the present work the authors propose a novel method for allocating transmission losses and costs to users (generators and loads). In the developed methodology, transmission losses are allocated to users based on their usage of the transmission lines. After usage allocation, particular loss allocation indices (PLAI) are evaluated for loads and generators. A cooperative game theory approach is also applied for comparison of results. The proposed method is simple and easy to implement on a practical power system. Sample 6-bus and IEEE 14-bus systems are used to show the effectiveness of the proposed method.
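    The cooperative-game baseline used for comparison in this kind of allocation work is typically the Shapley value, which charges each user its marginal loss cost averaged over all orders in which users could join. A minimal Python sketch; the three users and their coalition loss costs are hypothetical illustrations, not values from the study, and the paper's PLAI indices are not reproduced here:

    ```python
    from itertools import permutations
    from math import factorial

    def shapley(players, cost):
        """Shapley value: each player's marginal cost averaged over all join orders."""
        phi = {p: 0.0 for p in players}
        for order in permutations(players):
            members, prev = [], 0.0
            for p in order:
                members.append(p)
                c = cost(frozenset(members))
                phi[p] += c - prev
                prev = c
        n_orders = factorial(len(players))
        return {p: v / n_orders for p, v in phi.items()}

    # Hypothetical loss costs (arbitrary units) for every coalition of two
    # generators and one load; these numbers are NOT from the study.
    costs = {
        frozenset(): 0.0,
        frozenset({"G1"}): 4.0, frozenset({"G2"}): 3.0, frozenset({"L1"}): 5.0,
        frozenset({"G1", "G2"}): 6.0, frozenset({"G1", "L1"}): 8.0,
        frozenset({"G2", "L1"}): 7.0, frozenset({"G1", "G2", "L1"}): 10.0,
    }
    alloc = shapley(["G1", "G2", "L1"], costs.__getitem__)
    print({p: round(v, 3) for p, v in alloc.items()})
    ```

    The allocation is efficient by construction: the individual shares always sum to the loss cost of the grand coalition.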

  20. Construcción de un catálogo de cúmulos de galaxias en proceso de colisión

    NASA Astrophysics Data System (ADS)

    de los Ríos, M.; Domínguez, M. J.; Paz, D.

    2015-08-01

    In this work we present first results of the identification of colliding galaxy clusters in galaxy catalogs with redshift measurements (SDSS, 2DF), and introduce the methodology. We calibrated the method by studying the merger trees of clusters in a mock catalog based on a full-blown semi-analytic model of galaxy formation on top of the Millennium cosmological simulation. We also discuss future work on studying our sample of colliding galaxy clusters, including X-ray observations and mass reconstruction obtained using weak gravitational lensing.

  1. [Reference citation].

    PubMed

    Brkić, Silvija

    2013-01-01

    Scientific and professional papers represent the information basis for scientific research and professional work. References important for the paper should be cited within the text, and listed at the end of the paper. This paper deals with different styles of reference citation. Special emphasis was placed on the Vancouver Style for reference citation in biomedical journals established by the International Committee of Medical Journal Editors. It includes original samples for citing various types of articles, both printed and electronic, as well as recommendations related to reference citation in accordance with the methodology and ethics of scientific research and guidelines for preparing manuscripts for publication.

  2. Analytical Chemistry Developmental Work Using a 243Am Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Khalil J.; Stanley, Floyd E.; Porterfield, Donivan R.

    2015-02-24

    This project seeks to reestablish our analytical capability to characterize Am bulk material and develop a reference material suitable for characterizing the purity and assay of 241Am oxide for industrial use. The tasks associated with this phase of the project included conducting initial separations experiments, developing thermal ionization mass spectrometry capability using the 243Am isotope as an isotope dilution spike, optimizing the spike for the determination of 241Pu-241Am radiochemistry, and, additionally, developing and testing a methodology which can detect trace to ultra-trace levels of Pu (both assay and isotopics) in bulk Am samples.

  3. The work of Galileo and conformation of the experiment in physics

    NASA Astrophysics Data System (ADS)

    Alvarez, J. L.; Posadas, Y.

    2003-02-01

    It is very common to find comments and references to Galileo's work suggesting that he based his claims on logical reasoning rather than on observations. In this paper we present an analysis of some experiments that he carried out which were unknown in the XVI and XVII centuries; in them we find a clear description of the methodology that Galileo followed in order to reach the results that he presents in his formal work, particularly in the Discorsi. In contrast with Aristotelian philosophy, in these manuscripts Galileo adopted a methodology with which he made great contributions to the modern conception of the experimental method, thereby founding a methodology for the study of motion. We use this analysis as an example of the difficulties present in the development of modern experimentation, and we point out the need to stress the importance of scientific methodology in the teaching of physics.

  4. The impact of temporal sampling resolution on parameter inference for biological transport models.

    PubMed

    Harrison, Jonathan U; Baker, Ruth E

    2018-06-25

    Imaging data has become an essential tool to explore key biological questions at various scales, for example the motile behaviour of bacteria or the transport of mRNA, and it has the potential to transform our understanding of important transport mechanisms. Often these imaging studies require us to compare biological species or mutants, and to do this we need to quantitatively characterise their behaviour. Mathematical models offer a quantitative description of a system that enables us to perform this comparison, but to relate mechanistic mathematical models to imaging data, we need to estimate their parameters. In this work we study how collecting data at different temporal resolutions impacts our ability to infer parameters of biological transport models, performing exact inference for simple velocity jump process models in a Bayesian framework. The question of how best to choose the frequency with which data are collected is prominent in a host of studies because the majority of imaging technologies place constraints on the frequency with which images can be taken, and the discrete nature of observations can introduce errors into parameter estimates. In this work, we mitigate such errors by formulating the velocity jump process model within a hidden states framework. This allows us to obtain estimates of the reorientation rate and noise amplitude for noisy observations of a simple velocity jump process. We demonstrate the sensitivity of these estimates to temporal variations in the sampling resolution and extent of measurement noise. We use our methodology to provide experimental guidelines for researchers aiming to characterise motile behaviour that can be described by a velocity jump process. In particular, we consider how experimental constraints resulting in a trade-off between temporal sampling resolution and observation noise may affect parameter estimates. Finally, we demonstrate the robustness of our methodology to model misspecification, and then apply our inference framework to a dataset that was generated with the aim of understanding the localization of RNA-protein complexes.
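    The discrete-sampling error described above can be illustrated with a one-dimensional velocity jump (telegraph) process: naively counting observed direction changes misses pairs of flips that occur within one sampling interval, but inverting the per-interval flip probability corrects for this. A minimal sketch, not the paper's hidden-state Bayesian framework; the reorientation rate and sampling interval below are illustrative choices:

    ```python
    import math
    import random

    def observe_directions(rate, dt, n_obs, seed=0):
        """Directions of a 1D velocity jump (telegraph) process sampled every dt;
        reorientation events occur as a Poisson process with the given rate."""
        rng = random.Random(seed)
        d, dirs = 1, []
        for _ in range(n_obs):
            t = rng.expovariate(rate)
            while t <= dt:            # each event inside the interval flips direction
                d = -d
                t += rng.expovariate(rate)
            dirs.append(d)
        return dirs

    def estimate_rate(dirs, dt):
        """Invert P(observed flip per interval) = (1 - exp(-2*rate*dt)) / 2,
        which accounts for even numbers of unobserved flips between frames."""
        p = sum(a != b for a, b in zip(dirs, dirs[1:])) / (len(dirs) - 1)
        return -math.log(1.0 - 2.0 * p) / (2.0 * dt)

    dirs = observe_directions(rate=1.0, dt=0.1, n_obs=20000)
    rate_hat = estimate_rate(dirs, dt=0.1)
    print(round(rate_hat, 2))
    ```

    Coarsening dt makes 1 - 2p approach zero, so the estimator's variance blows up: one concrete face of the resolution trade-off the study quantifies.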

  5. Efficacy of exercise therapy in workers with rotator cuff tendinopathy: a systematic review

    PubMed Central

    Desmeules, François; Boudreault, Jennifer; Dionne, Clermont E.; Frémont, Pierre; Lowry, Véronique; MacDermid, Joy C.; Roy, Jean-Sébastien

    2016-01-01

    Objective: To perform a systematic review of randomized controlled trials (RCTs) on the efficacy of therapeutic exercises for workers suffering from rotator cuff (RC) tendinopathy. Methods: A literature search in four bibliographical databases (Pubmed, CINAHL, EMBASE, and PEDro) was conducted from inception up to February 2015. RCTs were included if participants were workers suffering from RC tendinopathy, the outcome measures included work-related outcomes, and at least one of the interventions under study included exercises. The methodological quality of the studies was evaluated with the Cochrane Risk of Bias Assessment tool. Results: The mean methodological score of the ten included studies was 54.4%±17.2%. Types of workers included were often not defined, and work-related outcome measures were heterogeneous and often not validated. Three RCTs of moderate methodological quality concluded that exercises were superior to a placebo or no intervention in terms of function and return-to-work outcomes. No significant difference was found between surgery and exercises based on the results of two studies of low to moderate methodological quality. One study of low methodological quality, comparing a workplace-based exercise program focusing on the participants' work demands to an exercise program delivered in a clinical setting, concluded that the work-based intervention was superior in terms of function and return-to-work outcomes. Conclusion: There is low to moderate-grade evidence that therapeutic exercises provided in a clinical setting are an effective modality to treat workers suffering from RC tendinopathy and to promote return-to-work. Further high quality studies comparing different rehabilitation programs including exercises in different settings with defined workers populations are needed to draw firm conclusions on the optimal program to treat workers. PMID:27488037

  6. Efficacy of exercise therapy in workers with rotator cuff tendinopathy: a systematic review.

    PubMed

    Desmeules, François; Boudreault, Jennifer; Dionne, Clermont E; Frémont, Pierre; Lowry, Véronique; MacDermid, Joy C; Roy, Jean-Sébastien

    2016-09-30

    To perform a systematic review of randomized controlled trials (RCTs) on the efficacy of therapeutic exercises for workers suffering from rotator cuff (RC) tendinopathy. A literature search in four bibliographical databases (Pubmed, CINAHL, EMBASE, and PEDro) was conducted from inception up to February 2015. RCTs were included if participants were workers suffering from RC tendinopathy, the outcome measures included work-related outcomes, and at least one of the interventions under study included exercises. The methodological quality of the studies was evaluated with the Cochrane Risk of Bias Assessment tool. The mean methodological score of the ten included studies was 54.4%±17.2%. Types of workers included were often not defined, and work-related outcome measures were heterogeneous and often not validated. Three RCTs of moderate methodological quality concluded that exercises were superior to a placebo or no intervention in terms of function and return-to-work outcomes. No significant difference was found between surgery and exercises based on the results of two studies of low to moderate methodological quality. One study of low methodological quality, comparing a workplace-based exercise program focusing on the participants' work demands to an exercise program delivered in a clinical setting, concluded that the work-based intervention was superior in terms of function and return-to-work outcomes. There is low to moderate-grade evidence that therapeutic exercises provided in a clinical setting are an effective modality to treat workers suffering from RC tendinopathy and to promote return-to-work. Further high quality studies comparing different rehabilitation programs including exercises in different settings with defined workers populations are needed to draw firm conclusions on the optimal program to treat workers.

  7. The ALHAMBRA survey: accurate merger fractions derived by PDF analysis of photometrically close pairs

    NASA Astrophysics Data System (ADS)

    López-Sanjuan, C.; Cenarro, A. J.; Varela, J.; Viironen, K.; Molino, A.; Benítez, N.; Arnalte-Mur, P.; Ascaso, B.; Díaz-García, L. A.; Fernández-Soto, A.; Jiménez-Teja, Y.; Márquez, I.; Masegosa, J.; Moles, M.; Pović, M.; Aguerri, J. A. L.; Alfaro, E.; Aparicio-Villegas, T.; Broadhurst, T.; Cabrera-Caño, J.; Castander, F. J.; Cepa, J.; Cerviño, M.; Cristóbal-Hornillos, D.; Del Olmo, A.; González Delgado, R. M.; Husillos, C.; Infante, L.; Martínez, V. J.; Perea, J.; Prada, F.; Quintana, J. M.

    2015-04-01

    Aims: Our goal is to develop and test a novel methodology to compute accurate close-pair fractions with photometric redshifts. Methods: We improved the currently used methodologies to estimate the merger fraction fm from photometric redshifts by (i) using the full probability distribution functions (PDFs) of the sources in redshift space; (ii) including the variation in the luminosity of the sources with z in both the sample selection and the luminosity ratio constraint; and (iii) splitting individual PDFs into red and blue spectral templates to work reliably with colour selections. We tested the performance of our new methodology with the PDFs provided by the ALHAMBRA photometric survey. Results: The merger fractions and rates from the ALHAMBRA survey agree excellently with those from spectroscopic work, for both the general population and red and blue galaxies. With the merger rate of bright (MB ≤ -20-1.1z) galaxies evolving as (1 + z)n, the power-law index n is higher for blue galaxies (n = 2.7 ± 0.5) than for red galaxies (n = 1.3 ± 0.4), confirming previous results. Integrating the merger rate over cosmic time, we find that the average number of mergers per galaxy since z = 1 is Nmred = 0.57 ± 0.05 for red galaxies and Nmblue = 0.26 ± 0.02 for blue galaxies. Conclusions: Our new methodology statistically exploits all the available information provided by photometric redshift codes and yields accurate measurements of the merger fraction by close pairs from photometric redshifts alone. Current and future photometric surveys will benefit from this new methodology. Based on observations collected at the German-Spanish Astronomical Center, Calar Alto, jointly operated by the Max-Planck-Institut für Astronomie (MPIA) at Heidelberg and the Instituto de Astrofísica de Andalucía (CSIC). The catalogues, probabilities, and figures of the ALHAMBRA close pairs detected in Sect. 5.1 are available at https://cloud.iaa.csic.es/alhambra/catalogues/ClosePairs
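    The core idea of weighting candidate pairs by photo-z PDF overlap can be sketched on a discrete redshift grid. This toy version uses Gaussian PDFs and a normalized overlap score; it is only a simplified stand-in for the full ALHAMBRA pair-probability formalism, and the redshifts and widths are invented for illustration:

    ```python
    import math

    def photoz_pdf(zgrid, mu, sigma):
        """Gaussian photo-z PDF normalized on a discrete redshift grid."""
        w = [math.exp(-0.5 * ((z - mu) / sigma) ** 2) for z in zgrid]
        s = sum(w)
        return [v / s for v in w]

    def pair_probability(pdf1, pdf2):
        """Normalized overlap of two photo-z PDFs: 1 for identical PDFs,
        near 0 when the PDFs share no redshift support."""
        overlap = sum(a * b for a, b in zip(pdf1, pdf2))
        norm = math.sqrt(sum(a * a for a in pdf1) * sum(b * b for b in pdf2))
        return overlap / norm

    zgrid = [i * 0.001 for i in range(1500)]       # z = 0.000 .. 1.499
    p1 = photoz_pdf(zgrid, mu=0.50, sigma=0.03)
    p2 = photoz_pdf(zgrid, mu=0.52, sigma=0.03)    # close in z: likely physical pair
    p3 = photoz_pdf(zgrid, mu=0.90, sigma=0.03)    # disjoint in z: chance projection
    print(round(pair_probability(p1, p2), 3), round(pair_probability(p1, p3), 3))
    ```

    Summing such per-pair weights, instead of making a hard cut on best-fit redshifts, is what lets the PDF approach retain pairs that a point-estimate selection would discard.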

  8. Published methodological quality of randomized controlled trials does not reflect the actual quality assessed in protocols.

    PubMed

    Mhaskar, Rahul; Djulbegovic, Benjamin; Magazin, Anja; Soares, Heloisa P; Kumar, Ambuj

    2012-06-01

    To assess whether the reported methodological quality of randomized controlled trials (RCTs) reflects the actual methodological quality and to evaluate the association of effect size (ES) and sample size with methodological quality. Systematic review. This is a retrospective analysis of all consecutive phase III RCTs published by eight National Cancer Institute Cooperative Groups up to 2006. Data were extracted from protocols (actual quality) and publications (reported quality) for each study. Four hundred twenty-nine RCTs met the inclusion criteria. Overall reporting of methodological quality was poor and did not reflect the actual high methodological quality of RCTs. The results showed no association between sample size and actual methodological quality of a trial. Poor reporting of allocation concealment and blinding exaggerated the ES by 6% (ratio of hazard ratio [RHR]: 0.94; 95% confidence interval [CI]: 0.88, 0.99) and 24% (RHR: 1.24; 95% CI: 1.05, 1.43), respectively. However, actual quality assessment showed no association between ES and methodological quality. The largest study to date shows that poor quality of reporting does not reflect the actual high methodological quality. Assessment of the impact of quality on the ES based on reported quality can produce misleading results. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Methodological quality of behavioural weight loss studies: a systematic review

    PubMed Central

    Lemon, S. C.; Wang, M. L.; Haughton, C. F.; Estabrook, D. P.; Frisard, C. F.; Pagoto, S. L.

    2018-01-01

    Summary This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults and the associations between quality and statistically significant weight loss outcomes, strength of intervention effectiveness and sample size. Searches for trials published between January 2009 and December 2014 were conducted using PUBMED, MEDLINE and PSYCINFO and identified ninety studies. Methodological quality indicators included study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity (mean = 6.3). Indicators most commonly utilized included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate >75% and reported a loss to follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature of behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include the use of more rigorous statistical approaches to loss to follow-up and better fidelity reporting. PMID:27071775

  10. A solid phase extraction-ion chromatography with conductivity detection procedure for determining cationic surfactants in surface water samples.

    PubMed

    Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek

    2013-11-15

    A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves the application of solid phase extraction (SPE, for sample preparation) coupled with ion chromatography-conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL for surface water and CRM samples, were achieved. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such waters to be confirmed. In addition, it is simpler, less time-consuming and labour-intensive, avoids the use of toxic chloroform, and is significantly less expensive than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry). Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Aeroelastic optimization methodology for viscous and turbulent flows

    NASA Astrophysics Data System (ADS)

    Barcelos Junior, Manuel Nascimento Dias

    2007-12-01

    In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows using high-fidelity analysis and sensitivity analysis techniques. Most research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the linearized systems of equations for the aeroelastic and sensitivity analyses. The methodologies developed in this work are tested and verified on realistic aeroelastic systems.

  12. Which Methodology Works Better? English Language Teachers' Awareness of the Innovative Language Learning Methodologies

    ERIC Educational Resources Information Center

    Kurt, Mustafa

    2015-01-01

    The present study investigated whether English language teachers were aware of the innovative language learning methodologies in language learning, how they made use of these methodologies and the learners' reactions to them. The descriptive survey method was employed to disclose the frequencies and percentages of 175 English language teachers'…

  13. [Development of an optimized formulation of damask marmalade with low energy level using Taguchi methodology].

    PubMed

    Villarroel, Mario; Castro, Ruth; Junod, Julio

    2003-06-01

    The goal of the present study was the development of an optimized formulation of damask marmalade low in calories, applying Taguchi methodology to improve the quality of this product. The selection of this methodology rests on the fact that in real-life conditions the result of an experiment frequently depends on the influence of several variables; therefore, one expedient way to address this problem is to use factorial designs. The influence of acid, thickener, sweetener and aroma additives, as well as cooking time, and possible interactions among some of them, were studied in order to find the best combination of these factors to optimize the sensory quality of an experimental formulation of dietetic damask marmalade. An L8 (2(7)) orthogonal array was applied in this experiment, and level average analysis was carried out according to Taguchi methodology to determine suitable working levels of the previously chosen design factors to achieve the desired product quality. A trained sensory panel analyzed the marmalade samples using a composite scoring test with a descriptive quantitative scale ranging from 1 = bad to 5 = good. It was demonstrated that the design factors sugar/aspartame, pectin and damask aroma had a significant effect (p < 0.05) on the sensory quality of the marmalade, with an 82% contribution to the response. The optimal combination was: citric acid 0.2%; pectin 1%; 30 g sugar/16 mg aspartame/100 g; damask aroma 0.5 ml/100 g; cooking time 5 minutes. Regarding chemical composition, the most important results were the decrease in carbohydrate content compared with traditional marmalade, with a reduction of 56% in caloric value, and an amount of dietary fiber greater than that of similar commercial products. Storage stability assays were carried out on marmalade samples held at different temperatures in plastic bags of different density. No perceptible sensory, microbiological or chemical changes were detected after 90 days of storage under controlled conditions.
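    Level average analysis on an L8 (2(7)) orthogonal array, as used in the study above, can be sketched directly: each factor's effect is read off as the mean response over the runs at each of its two levels. The panel scores below are illustrative placeholders, not the study's data:

    ```python
    # Standard L8(2^7) orthogonal array: 8 runs, 7 two-level factor columns.
    L8 = [
        [1, 1, 1, 1, 1, 1, 1],
        [1, 1, 1, 2, 2, 2, 2],
        [1, 2, 2, 1, 1, 2, 2],
        [1, 2, 2, 2, 2, 1, 1],
        [2, 1, 2, 1, 2, 1, 2],
        [2, 1, 2, 2, 1, 2, 1],
        [2, 2, 1, 1, 2, 2, 1],
        [2, 2, 1, 2, 1, 1, 2],
    ]

    def level_averages(scores, column):
        """Level average analysis: mean response at each level of one factor column."""
        by_level = {1: [], 2: []}
        for run, y in zip(L8, scores):
            by_level[run[column]].append(y)
        return {lvl: sum(ys) / len(ys) for lvl, ys in by_level.items()}

    # Illustrative sensory panel scores (scale 1 = bad .. 5 = good); NOT the study's data.
    scores = [3.1, 3.4, 4.2, 4.0, 2.8, 3.0, 3.9, 4.1]
    effects = {c: level_averages(scores, c) for c in range(7)}
    best = [max(effects[c], key=effects[c].get) for c in range(7)]
    print(best)
    ```

    Because every pair of columns is balanced (each level combination appears equally often), the seven main effects can be estimated from just eight runs instead of the 2^7 = 128 runs of a full factorial.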

  14. T-Pattern Analysis and Cognitive Load Manipulation to Detect Low-Stake Lies: An Exploratory Study.

    PubMed

    Diana, Barbara; Zurloni, Valentino; Elia, Massimiliano; Cavalera, Cesare; Realdon, Olivia; Jonsson, Gudberg K; Anguera, M Teresa

    2018-01-01

    Deception has evolved to become a fundamental aspect of human interaction. Despite the prolonged efforts in many disciplines, there has been no definite finding of a univocally "deceptive" signal. This work proposes an approach to deception detection combining cognitive load manipulation and T-pattern methodology with the objective of: (a) testing the efficacy of dual task-procedure in enhancing differences between truth tellers and liars in a low-stakes situation; (b) exploring the efficacy of T-pattern methodology in discriminating truthful reports from deceitful ones in a low-stakes situation; (c) setting the experimental design and procedure for following research. We manipulated cognitive load to enhance differences between truth tellers and liars, because of the low-stakes lies involved in our experiment. We conducted an experimental study with a convenience sample of 40 students. We carried out a first analysis on the behaviors' frequencies coded through the observation software, using SPSS (22). The aim was to describe shape and characteristics of behavior's distributions and explore differences between groups. Datasets were then analyzed with Theme 6.0 software which detects repeated patterns (T-patterns) of coded events (non-verbal behaviors) that regularly or irregularly occur within a period of observation. A descriptive analysis on T-pattern frequencies was carried out to explore differences between groups. An in-depth analysis on more complex patterns was performed to get qualitative information on the behavior structure expressed by the participants. Results show that the dual-task procedure enhances differences observed between liars and truth tellers with T-pattern methodology; moreover, T-pattern detection reveals a higher variety and complexity of behavior in truth tellers than in liars. 
These findings support the combination of cognitive load manipulation and T-pattern methodology for deception detection in low-stakes situations, and suggest testing directional hypotheses on a larger probability sample of the population.

  15. T-Pattern Analysis and Cognitive Load Manipulation to Detect Low-Stake Lies: An Exploratory Study

    PubMed Central

    Diana, Barbara; Zurloni, Valentino; Elia, Massimiliano; Cavalera, Cesare; Realdon, Olivia; Jonsson, Gudberg K.; Anguera, M. Teresa

    2018-01-01

    Deception has evolved to become a fundamental aspect of human interaction. Despite the prolonged efforts in many disciplines, there has been no definite finding of a univocally “deceptive” signal. This work proposes an approach to deception detection combining cognitive load manipulation and T-pattern methodology with the objective of: (a) testing the efficacy of dual task-procedure in enhancing differences between truth tellers and liars in a low-stakes situation; (b) exploring the efficacy of T-pattern methodology in discriminating truthful reports from deceitful ones in a low-stakes situation; (c) setting the experimental design and procedure for following research. We manipulated cognitive load to enhance differences between truth tellers and liars, because of the low-stakes lies involved in our experiment. We conducted an experimental study with a convenience sample of 40 students. We carried out a first analysis on the behaviors’ frequencies coded through the observation software, using SPSS (22). The aim was to describe shape and characteristics of behavior’s distributions and explore differences between groups. Datasets were then analyzed with Theme 6.0 software which detects repeated patterns (T-patterns) of coded events (non-verbal behaviors) that regularly or irregularly occur within a period of observation. A descriptive analysis on T-pattern frequencies was carried out to explore differences between groups. An in-depth analysis on more complex patterns was performed to get qualitative information on the behavior structure expressed by the participants. Results show that the dual-task procedure enhances differences observed between liars and truth tellers with T-pattern methodology; moreover, T-pattern detection reveals a higher variety and complexity of behavior in truth tellers than in liars. 
These findings support the combination of cognitive load manipulation and T-pattern methodology for deception detection in low-stakes situations, and suggest testing directional hypotheses on a larger probability sample of the population. PMID:29551986

  16. Rapid and sensitive ultrasonic-assisted derivatisation microextraction (UDME) technique for bitter taste-free amino acids (FAA) study by HPLC-FLD.

    PubMed

    Chen, Guang; Li, Jun; Sun, Zhiwei; Zhang, Shijuan; Li, Guoliang; Song, Cuihua; Suo, Yourui; You, Jinmao

    2014-01-15

    Amino acids, as the main contributors to taste, are usually found in relatively high levels in bitter foods. In this work, we focused on seeking a rapid, sensitive and simple method to determine FAA for large batches of micro-samples and to explore the relationship between FAA and bitterness. Overall condition optimisation indicated that the new UDME technique offered higher derivatisation yields and extraction efficiencies than traditional methods. Only 35 min was needed for the whole operation. Very low LLOQs (lower limit of quantification: 0.21-5.43 nmol/L) for FAA in twelve bitter foods were obtained, with which BTT (bitter taste thresholds) and CABT (content of FAA at BTT level) were newly determined. The ratio of CABT to BTT increased with decreasing BTT. This work offers strong potential for the high-throughput trace analysis of micro-samples and also a methodology to study the relationship between chemical constituents and taste. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Nanorobotic System iTRo for Controllable 1D Micro/nano Material Twisting Test.

    PubMed

    Lu, Haojian; Shang, Wanfeng; Wei, Xueyong; Yang, Zhan; Fukuda, Toshio; Shen, Yajing

    2017-06-08

    In-situ micro/nano characterization is an indispensable methodology for material research. However, precise in-situ SEM twisting of 1D materials over a large range remains a challenge for current techniques, mainly due to the large size of testing devices and the misalignment between the specimen and the rotation axis. Herein, we propose an in-situ twist test robot (iTRo) to address these challenges and realize a precise in-situ SEM twisting test for the first time. First, we developed the iTRo and designed a series of control strategies, including assembly error initialization, a triple-image alignment (TIA) method for rotation axis alignment, a deformation-based contact detection (DCD) method for sample assembly, and switch control for robot cooperation. We then chose three typical 1D materials, i.e., a magnetic microwire (Fe74B13Si11C2), a glass fiber, and a human hair, for twisting tests and characterized their properties. The results showed that our approach is able to align the sample to the twisting axis accurately, and that it provides a large twisting range, heavy load capacity and high controllability. This work fills a gap in current in-situ mechanical characterization methodologies and is expected to have a significant impact on fundamental nanomaterial research and practical micro/nano characterization.

  18. MERCURY MEASUREMENTS USING DIRECT-ANALYZER ...

    EPA Pesticide Factsheets

    Under EPA's Water Quality Research Program, exposure studies are needed to determine how well control strategies and guidance are working. Consequently, reliable and convenient techniques that minimize waste production are of special interest. While traditional methods for determining mercury in solid samples involve the use of aggressive chemicals to dissolve the matrix and of other chemicals to reduce the mercury to the volatile elemental form, pyrolysis-based analyzers can be used by directly weighing the solid in a sampling boat and initiating the instrumental analysis for total mercury. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, the Office of Water, and ORD in the area of water quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at con

  19. Fixation filter, device for the rapid in situ preservation of particulate samples

    NASA Astrophysics Data System (ADS)

    Taylor, C. D.; Edgcomb, V. P.; Doherty, K. W.; Engstrom, I.; Shanahan, T.; Pachiadaki, M. G.; Molyneaux, S. J.; Honjo, S.

    2015-02-01

    Niskin bottle rosettes have for years been the workhorse technology for the collection of water samples used in biological and chemical oceanography. Studies of marine microbiology and biogeochemical cycling that aim to analyze labile organic molecules, including messenger RNA, must take into account factors associated with sampling methodology that obscure an accurate picture of in situ activities and processes. With Niskin sampling, the large and often variable times between sample collection and preservation on the deck of a ship, and the sometimes significant physico-chemical changes (e.g., changes in pressure, light, temperature, redox state, etc.) that water samples and organisms are exposed to, are likely to introduce artifacts. These concerns are likely more significant when working with phototrophs, deep-sea microbes, and/or organisms inhabiting low-oxygen or anoxic environments. We report here the development of a new technology for the in situ collection and chemical preservation of particulate microbial samples for a variety of downstream analyses, depending on the preservative chosen by the user. The Fixation Filter Unit, version 3 (FF3) permits filtration of a water sample through 47 mm diameter filters of the user's choice and, upon completion of filtration, chemically preserves the retained sample within tens of seconds. The stand-alone devices can be adapted to hydrocasting or mooring-based platforms.

  20. Semi-Quantitative Method for Streptococci Magnetic Detection in Raw Milk.

    PubMed

    Duarte, Carla; Costa, Tiago; Carneiro, Carla; Soares, Rita; Jitariu, Andrei; Cardoso, Susana; Piedade, Moisés; Bexiga, Ricardo; Freitas, Paulo

    2016-04-27

    Bovine mastitis is the most costly disease for dairy farmers and the most frequent reason for the use of antibiotics in dairy cattle; thus, control measures to detect and prevent mastitis are crucial for dairy farm sustainability. The aim of this study was to develop and validate a sensitive method to magnetically detect Streptococcus agalactiae (a Group B streptococcus) and Streptococcus uberis in raw milk samples. Mastitic milk samples were collected aseptically from 44 cows with subclinical mastitis from 11 Portuguese dairy farms. Forty-six quarter milk samples were selected based on bacterial identification by conventional microbiology, and all samples were submitted to PCR analysis. In parallel, these milk samples were mixed with a solution combining specific antibodies and magnetic nanoparticles and analyzed using a lab-on-a-chip magnetoresistive cytometer with microfluidic sample handling. This paper describes a point-of-care methodology for the detection of bacteria, including analysis of false positive/negative results. This immunological recognition was able to detect bacterial presence in samples spiked above 100 cfu/mL, independently of the antibody and targeted bacteria used in this work. Using PCR as a reference, this method correctly identified 73% of positive samples for streptococci species with an anti-S. agalactiae antibody, and 41% of positive samples with an anti-GB streptococci antibody.

  2. 75 FR 29744 - Federal Pell Grant, Federal Perkins Loan, Federal Work-Study, Federal Supplemental Educational...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-27

    ... DEPARTMENT OF EDUCATION Federal Pell Grant, Federal Perkins Loan, Federal Work-Study, Federal... Analysis Methodology for the 2011-2012 award year. SUMMARY: The Secretary announces the annual updates to the tables that will be used in the statutory ``Federal Need Analysis Methodology'' to determine a...

  3. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    ERIC Educational Resources Information Center

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  4. Ancient DNA studies: new perspectives on old samples

    PubMed Central

    2012-01-01

    In spite of past controversies, the field of ancient DNA is now a reliable research area thanks to recent methodological improvements. A series of recent large-scale studies have revealed the true potential of ancient DNA samples for studying the processes of evolution and for testing models and assumptions commonly used to reconstruct patterns of evolution and to analyze population genetics and palaeoecological changes. Recent advances in DNA technologies, such as next-generation sequencing, make it possible to recover DNA information from archaeological and paleontological remains, allowing us to go back in time and study the genetic relationships between extinct organisms and their contemporary relatives. With next-generation sequencing methodologies, DNA sequences can be retrieved even from samples (for example, human remains) for which the technical pitfalls of classical methodologies required stringent criteria to guarantee the reliability of the results. In this paper, we review the methodologies applied to ancient DNA analysis and the perspectives that next-generation sequencing applications provide in this field. PMID:22697611

  5. [Risk assessment of work-related stress: pilot study on perceived stress, quality of health and work problems in a sample of workers of judicial offices in Rome].

    PubMed

    Berivi, Sandra; Grassi, Antonio; Russello, Carla; Palummieri, Antonio

    2017-11-01

    In 2008, the Italian Legislature introduced Article 28, paragraph 1 of Legislative Decree No. 81/2008, which obliges businesses and public authorities to assess, among the variety of risks that could threaten the safety and health of workers (chemical, biological, etc.), work-related stress. The decree thus makes "work-related stress" one of the subjects of mandatory risk assessment. It also entrusted the Permanent Consultative Commission for health and safety at work with the task of "preparing the necessary information for the risk assessment of work-related stress", subsequently issued on 17/11/2010 in the form of a "methodological path representing the minimum level of implementation of the obligation". In light of this regulatory framework, we set up our pilot study with the objective of analyzing a growing occupational discomfort, widespread and palpable but very difficult to define, in a sample of employees of the judicial offices of Lazio. The study was commissioned by the Committee for the Guarantee of Equal Opportunity, the Enhancement of Workplace Well-being and against Discrimination (CUG) of the Roman judicial offices of the Court of Appeal of Rome, which also contributed to its realization. The data collected from the administration of two standardized questionnaires (the INAIL indicator questionnaire and the SF-12 v1) were analyzed. The pilot study revealed a serious problem in the organizational dimension, specifically in managerial support. The study sample also perceived itself as "less healthy", both physically and mentally, than the Italian normative sample. 
Although the sample covers only part of the study population (26% of the workers of the Roman judicial offices), the data obtained show, from both a quantitative and a qualitative viewpoint, significant occupational stress and suggest the need to broaden the investigation in order to identify possible solutions to improve the condition of the workers and, consequently, the degree of satisfaction of the citizens served by this delicate area of expertise. Copyright © by Aracne Editrice, Roma, Italy.

  6. Isolation of phenolic compounds from hop extracts using polyvinylpolypyrrolidone: characterization by high-performance liquid chromatography-diode array detection-electrospray tandem mass spectrometry.

    PubMed

    Magalhães, Paulo J; Vieira, Joana S; Gonçalves, Luís M; Pacheco, João G; Guido, Luís F; Barros, Aquiles A

    2010-05-07

    The aim of the present work was the development of a suitable methodology for the separation and determination of phenolic compounds in the hop plant. The developed methodology was based on sample purification by adsorption of phenolic compounds from the matrix onto polyvinylpolypyrrolidone (PVPP) and subsequent desorption of the adsorbed polyphenols with acetone/water (70:30, v/v). Finally, the extract was analyzed by HPLC-DAD and HPLC-ESI-MS/MS. The first phase of this work consisted of studying the adsorption behavior of several classes of phenolic compounds (e.g. phenolic acids, flavonols, and flavanols) on PVPP in model solutions. It was observed that the adsorption of the different phenolic compounds to PVPP (at low concentrations) is differentiated, depending on the structure of the compound (number of OH groups, aromatic rings, and steric hindrance). For example, within the phenolic acids class (benzoic, p-hydroxybenzoic, protocatechuic and gallic acids), PVPP adsorption increases with the number of OH groups of the phenolic compound. On the other hand, derivatization of OH groups (methylation and glycosylation) greatly diminished binding. PVPP proved very efficient for the adsorption of several phenolic compounds such as catechin, epicatechin, xanthohumol and quercetin, since high adsorption and recovery values were obtained. The methodology was further applied to the extraction and isolation of phenolic compounds from hops. With this methodology, it was possible to obtain high adsorption values (≥80%) and recovery yields (≥70%) for the most important phenolic compounds from hops, such as xanthohumol, catechin, epicatechin, quercetin and kaempferol glycosides; in addition, it allowed the identification of about 30 phenolic compounds by HPLC-DAD and HPLC-ESI-MS/MS. Copyright (c) 2009 Elsevier B.V. All rights reserved.

  7. Documentation of probabilistic fracture mechanics codes used for reactor pressure vessels subjected to pressurized thermal shock loading: Parts 1 and 2. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balkey, K.; Witt, F.J.; Bishop, B.A.

    1995-06-01

    Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine the conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC - VISA and Westinghouse (W) - PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.

  8. Reduction in alert fatigue in an assisted electronic prescribing system, through the Lean Six Sigma methodology.

    PubMed

    Cuéllar Monreal, Mª Jesús; Reig Aguado, Jorge; Font Noguera, Isabel; Poveda Andrés, José Luis

    2017-01-01

    To reduce alert fatigue in our Assisted Electronic Prescribing System (AEPS) through the Lean Six Sigma (LSS) methodology. An observational, cross-sectional and retrospective study in a general hospital with 850 beds and an AEPS. The LSS methodology was followed in order to evaluate the alert fatigue situation in the AEPS, to implement improvements, and to assess outcomes. The alerts generated during the two quarters studied (before and after the intervention) were analyzed. In order to measure the qualitative indicators, the most frequent alert types were analyzed, as well as the molecules responsible for over 50% of each type of alert. The action taken by the prescriber was analyzed in a sample of 496 prescriptions that generated such alerts. For each type of alert and molecule, the improvements to be implemented were prioritized according to the alert generated and its quality. A second survey evaluated the pharmacist action for the alerts most highly valued by physicians. The problem, the objective, the work team and the project schedule were defined. A survey was designed in order to understand the opinion of the client about the alert system in the program. Based on the surveys collected (n = 136), the critical characteristics and the quantitative/qualitative indicators were defined. Sixty (60) fields in the alert system were modified, corresponding to 32 molecules, and this led to a 28% reduction in the total number of alerts. Regarding quality indicators, false positives were reduced by 25% (p < 0.05), 100% of the alerts ignored with justification were sustained, and there were no significant differences in user adherence to the system. The project improvements and outcomes were reviewed by the work team. The LSS methodology proved to be a valid tool for the quantitative and qualitative improvement of the alert system in an assisted electronic prescription program, thus reducing alert fatigue. Copyright AULA MEDICA EDICIONES 2014. 
Published by AULA MEDICA. All rights reserved.

  9. Applying Incremental Sampling Methodology to Soils Containing Heterogeneously Distributed Metallic Residues to Improve Risk Analysis.

    PubMed

    Clausen, J L; Georgian, T; Gardner, K H; Douglas, T A

    2018-01-01

    This study compares conventional grab sampling to incremental sampling methodology (ISM) for characterizing metal contamination at a military small-arms range. Grab sample results had large variances, positively skewed non-normal distributions, extreme outliers, and poor agreement between duplicate samples, even when samples were co-located within tens of centimeters of each other. The extreme outliers strongly influenced the grab sample means for the primary contaminants lead (Pb) and antimony (Sb). In contrast, median and mean metal concentrations were similar for the ISM samples. ISM significantly reduced the measurement uncertainty of estimates of the mean, increasing data quality (e.g., for environmental risk assessments) with fewer samples (e.g., decreasing total project costs). Based on Monte Carlo resampling simulations, grab sampling resulted in highly variable means and upper confidence limits of the mean relative to ISM.
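
    The Monte Carlo contrast the authors report can be reproduced qualitatively with a small simulation. The lognormal site distribution, the 5-grab versus 50-increment design, and all parameter values below are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Positively skewed concentrations with extreme hot spots, mimicking
# heterogeneously distributed metallic residues (parameters assumed).
site = rng.lognormal(mean=3.0, sigma=1.5, size=100_000)

def sample_means(n_sims, n_per_sample):
    # Mean concentration reported per simulated sampling event; physically
    # compositing many increments (ISM) acts as averaging before analysis.
    return np.array([rng.choice(site, n_per_sample).mean()
                     for _ in range(n_sims)])

grab = sample_means(2000, 5)   # a few discrete grab samples per event
ism = sample_means(2000, 50)   # many increments composited per ISM sample
# Grab-based means are erratic; ISM means cluster around the site mean.
print(round(grab.std(), 1), round(ism.std(), 1))
```

    Because the standard error of a mean shrinks with the number of increments averaged, the ISM means vary far less from simulation to simulation, which is the mechanism behind the tighter upper confidence limits the study reports.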

  10. Long-Term Bioeffects of 435-MHz Radiofrequency Radiation on Selected Blood-Borne Endpoints in Cannulated Rats. Volume 4. Plasma Catecholamines.

    DTIC Science & Technology

    1987-08-01

    out. To use each animal as its own control, arterial blood was sampled by means of chronically implanted aortic cannulas [12,13,14]. This simple...APPENDIX B STATISTICAL METHODOLOGY The balanced design of this experiment (requiring that 25 animals from each...protocol in that, in numerous cases, samples were collected at odd intervals (invalidating the orthogonality of the design) and the number of samples taken

  11. MULTIRESIDUE DETERMINATION OF ACIDIC PESTICIDES ...

    EPA Pesticide Factsheets

    A multiresidue pesticide methodology has been studied and results for acidics are reported here, with base/neutrals to follow. This work studies a literature procedure as a possible general approach to many pesticides and potentially other analytes that are considered liquid chromatographic candidates rather than gas chromatographic ones. The analysis of the sewage effluent of a major southwestern US city serves as an example of the application of the methodology to a real sample. Recovery studies were also conducted to validate the proposed extraction step. A gradient elution program was followed for the high performance liquid chromatography, leading to a general approach for acidics. Confirmation of identity was by EI GC/MS after conversion of the acids to the methyl esters (or other appropriate methylation) by means of trimethylsilyldiazomethane. 3,4-Dichlorophenoxyacetic acid was used as an internal standard to monitor the reaction, and PCB #19 was used as the quantitation internal standard. Although others have reported similar analyses of acids, conversion to the methyl ester was by means of diazomethane itself rather than the more convenient and safer trimethylsilyldiazomethane. Thus, the present paper supports the use of trimethylsilyldiazomethane with all of these acids (trimethylsilyldiazomethane has been used in environmental work with some phenoxyacetic acid herbicides) and further supports the usefulness of this reagent as a potential re

  12. Canadian-led capacity-building in biostatistics and methodology in cardiovascular and diabetes trials: the CANNeCTIN Biostatistics and Methodological Innovation Working Group

    PubMed Central

    2011-01-01

    The Biostatistics and Methodological Innovation Working (BMIW) Group is one of several working groups within the CANadian Network and Centre for Trials INternationally (CANNeCTIN). This programme received funding from the Canadian Institutes of Health Research and the Canada Foundation for Innovation beginning in 2008, to enhance the infrastructure and build capacity for large Canadian-led clinical trials in cardiovascular diseases (CVD) and diabetes mellitus (DM). The overall aims of the BMIW Group's programme within CANNeCTIN, are to advance biostatistical and methodological research, and to build biostatistical capacity in CVD and DM. Our program of research and training includes: monthly videoconferences on topical biostatistical and methodological issues in CVD/DM clinical studies; providing presentations on methods issues at the annual CANNeCTIN meetings; collaborating with clinician investigators on their studies; training young statisticians in biostatistics and methods in CVD/DM trials and organizing annual symposiums on topical methodological issues. We are focused on the development of new biostatistical methods and the recruitment and training of highly qualified personnel - who will become leaders in the design and analysis of CVD/DM trials. The ultimate goal is to enhance global health by contributing to efforts to reduce the burden of CVD and DM. PMID:21332987

  13. Weighting issues in recreation research and in identifying support for resource conservation management alternatives

    Treesearch

    Amy L. Sheaffer; Jay Beaman; Joseph T. O' Leary; Rebecca L. Williams; Doran M. Mason

    2001-01-01

    Sampling for research in recreation settings is an ongoing challenge. Often certain groups of users are more likely to be sampled. It is important, in measuring public support for resource conservation and in understanding the use of natural resources for recreation, to evaluate issues of bias in survey methodologies. Important methodological issues emerged from a statewide...

  14. Effects of disease severity distribution on the performance of quantitative diagnostic methods and proposal of a novel ‘V-plot’ methodology to display accuracy values

    PubMed Central

    Dehbi, Hakim-Moulay; Howard, James P; Shun-Shin, Matthew J; Sen, Sayan; Nijjer, Sukhjinder S; Mayet, Jamil; Davies, Justin E; Francis, Darrel P

    2018-01-01

    Background: Diagnostic accuracy is widely accepted by researchers and clinicians as an optimal expression of a test's performance. The aim of this study was to evaluate the effects of disease severity distribution on values of diagnostic accuracy, as well as to propose a sample-independent methodology to calculate and display the accuracy of diagnostic tests. Methods and findings: We evaluated the diagnostic relationship between two hypothetical methods to measure serum cholesterol (Cholrapid and Cholgold) by generating samples with statistical software and (1) keeping the numerical relationship between methods unchanged and (2) changing the distribution of cholesterol values. Metrics of categorical agreement were calculated (accuracy, sensitivity and specificity). Finally, a novel methodology to display and calculate accuracy values was presented (the V-plot of accuracies). Conclusion: No single value of diagnostic accuracy can be used to describe the relationship between tests, as accuracy is a metric heavily affected by the underlying sample distribution. Our novel proposed methodology, the V-plot of accuracies, can be used as a sample-independent measure of a test's performance against a reference gold standard. PMID:29387424
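
    The sample-dependence of accuracy can be demonstrated numerically. The cutoff, noise level and distributions below are illustrative assumptions; only the qualitative setup (two tests with a fixed numerical relationship, applied to differently distributed samples) follows the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
CUTOFF = 200.0  # assumed diagnostic threshold

def accuracy(gold):
    # The "rapid" test equals the gold standard plus fixed measurement
    # noise, so the relationship between the two methods never changes.
    rapid = gold + rng.normal(0.0, 10.0, size=gold.size)
    return np.mean((rapid > CUTOFF) == (gold > CUTOFF))

near = rng.normal(200.0, 15.0, 10_000)  # values clustered around the cutoff
far = rng.normal(200.0, 60.0, 10_000)   # values spread far from the cutoff
acc_near, acc_far = accuracy(near), accuracy(far)
# Same pair of tests, markedly different accuracy across the two samples.
print(round(acc_near, 2), round(acc_far, 2))
```

    The sample whose values cluster near the cutoff yields many borderline cases that the noise can flip across the threshold, so its accuracy is much lower even though the two measurement methods agree equally well in both cases.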

  15. Electro-thermal vaporization direct analysis in real time-mass spectrometry for water contaminant analysis during space missions.

    PubMed

    Dwivedi, Prabha; Gazda, Daniel B; Keelor, Joel D; Limero, Thomas F; Wallace, William T; Macatangay, Ariel V; Fernández, Facundo M

    2013-10-15

    The development of a direct analysis in real time-mass spectrometry (DART-MS) method and first prototype vaporizer for the detection of low molecular weight (∼30-100 Da) contaminants representative of those detected in water samples from the International Space Station is reported. A temperature-programmable, electro-thermal vaporizer (ETV) was designed, constructed, and evaluated as a sampling interface for DART-MS. The ETV facilitates analysis of water samples with minimum user intervention while maximizing analytical sensitivity and sample throughput. The integrated DART-ETV-MS methodology was evaluated in both positive and negative ion modes to (1) determine experimental conditions suitable for coupling DART with ETV as a sample inlet and ionization platform for time-of-flight MS, (2) to identify analyte response ions, (3) to determine the detection limit and dynamic range for target analyte measurement, and (4) to determine the reproducibility of measurements made with the method when using manual sample introduction into the vaporizer. Nitrogen was used as the DART working gas, and the target analytes chosen for the study were ethyl acetate, acetone, acetaldehyde, ethanol, ethylene glycol, dimethylsilanediol, formaldehyde, isopropanol, methanol, methylethyl ketone, methylsulfone, propylene glycol, and trimethylsilanol.

  16. 77 FR 24221 - Agency Information Collection Activities: Proposed Collection; Comments Requested; Research To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-23

    ... collection: Extension of the time frame required to complete approved and ongoing methodological research on... methodological research on the National Crime Victimization Survey. (2) Title of the Form/Collection: National.... This generic clearance will cover methodological research that will use existing or new sampled...

  17. Critical Thinking: Comparing Instructional Methodologies in a Senior-Year Learning Community

    ERIC Educational Resources Information Center

    Zelizer, Deborah A.

    2013-01-01

    This quasi-experimental, nonequivalent control group study compared the impact of Ennis's (1989) mixed instructional methodology to the immersion methodology on the development of critical thinking in a multicultural, undergraduate senior-year learning community. A convenience sample of students (n = 171) was selected from four sections of a…

  18. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    PubMed

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Nowadays, studies of the distribution of metallic elements in biological samples are among the most important issues. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging of metallic elements in various kinds of biological samples. However, this literature lacks articles reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize the analytical calibration in the (bio)imaging/mapping of metallic elements in biological samples, including (1) nomenclature, (2) definitions, and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, this work aims to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from those hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Micro X-ray Fluorescence Study of Late Pre-Hispanic Ceramics from the Western Slopes of the South Central Andes Region in the Arica y Parinacota Region, Chile: A New Methodological Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flewett, S.; Saintenoy, T.; Sepulveda, M.

    Archeological ceramic paste material typically consists of a mix of a clay matrix and various millimeter- and sub-millimeter-sized mineral inclusions. Micro X-ray fluorescence (μXRF) is a standard compositional classification tool, and in this work we propose and demonstrate an improved fluorescence map processing protocol in which the mineral inclusions are automatically separated from the clay matrix to allow independent statistical analysis of the two parts. Application of this protocol allowed us to enhance the discrimination between different ceramic shards compared with the standard procedure of working with only the spatially averaged elemental concentrations. Using the new protocol, we performed an initial compositional classification of a set of 83 ceramic shards from the western slopes of the south central Andean region in the Arica y Parinacota region of present-day far northern Chile. Comparing the classifications obtained using the new versus the old (average concentrations only) protocols, we found that some samples were erroneously classified with the old protocol. From an archaeological perspective, a very broad and heterogeneous sample set was used in this study because this was the first such study to be performed on ceramics from this region. This allowed a general overview to be obtained; however, further work on more specific sample sets will be necessary to extract concrete archaeological conclusions.
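The inclusion/matrix separation step described above can be illustrated with a minimal sketch: threshold an elemental intensity map so that bright inclusion pixels and the dimmer clay matrix are analysed separately. The threshold value, the toy map, and the function name are hypothetical illustrations, not the authors' actual protocol.

```python
import numpy as np

def split_map(elemental_map, threshold):
    """Split a fluorescence intensity map into inclusion and matrix pixels.

    Pixels above `threshold` are treated as mineral inclusions; the rest
    as clay matrix. Returns the mean intensity of each part, so the two
    populations can be compared independently instead of spatially averaged.
    """
    mask = elemental_map > threshold
    inclusions = elemental_map[mask]
    matrix = elemental_map[~mask]
    return inclusions.mean(), matrix.mean()

# Toy 4x4 "map": three bright inclusion pixels embedded in a dim matrix
m = np.array([[1.0, 1.2, 9.0, 1.1],
              [1.0, 8.5, 1.3, 1.0],
              [1.1, 1.0, 1.2, 9.5],
              [1.0, 1.1, 1.0, 1.2]])
incl_mean, matrix_mean = split_map(m, threshold=5.0)
```

A spatial average over the whole map would mix the two populations; splitting them first is what allows the independent statistics the abstract describes.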

  20. FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES

    DTIC Science & Technology

    2017-06-01

    FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES, by Amanda Donnelly. This thesis develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies, including net assessment, scenarios and…

  1. Interventions to address parenting and parental substance abuse: conceptual and methodological considerations.

    PubMed

    Neger, Emily N; Prinz, Ronald J

    2015-07-01

    Parental substance abuse is a serious problem affecting the well-being of children and families. The co-occurrence of parental substance abuse and problematic parenting is recognized as a major public health concern. This review focuses on 21 outcome studies that tested dual treatment of substance abuse and parenting. A summary of theoretical conceptualizations of the connections between substance abuse and parenting provides a backdrop for the review. Outcomes of the dual treatment studies were generally positive with respect to reduction of parental substance use and improvement of parenting. Research in this area varied in methodological rigor and needs to overcome challenges regarding design issues, sampling frame, and complexities inherent in such a high-risk population. This area of work can be strengthened by randomized controlled trials, use of mixed-methods outcome measures, consideration of parent involvement with child protective services, involvement of significant others in treatment, provision of concrete supports for treatment attendance and facilitative public policies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. A methodological approach to study the stability of selected watercolours for painting reintegration, through reflectance spectrophotometry, Fourier transform infrared spectroscopy and hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Pelosi, Claudia; Capobianco, Giuseppe; Agresti, Giorgia; Bonifazi, Giuseppe; Morresi, Fabio; Rossi, Sara; Santamaria, Ulderico; Serranti, Silvia

    2018-06-01

    The aim of this work is to investigate the stability to simulated solar radiation of some painting samples through a new methodological approach adopting non-invasive spectroscopic techniques. In particular, commercial watercolours and iron-oxide-based pigments were used, the latter prepared with gum Arabic for the experiment in order to propose a possible substitute for traditional reintegration materials. Reflectance spectrophotometry in the visible range and hyperspectral imaging in the short-wave infrared were chosen as non-invasive techniques for evaluating the stability of the chosen pigments to irradiation. These were studied before and after an artificial ageing procedure performed in a Solar Box chamber under controlled conditions. Data were processed in order to evaluate the sensitivity of the chosen techniques in identifying variations in the paint layers, induced by photo-degradation, before they could be observed by eye. Furthermore, a supervised classification method for monitoring changes in the painted surface using a multivariate approach was successfully applied.

  3. A Fatigue Life Prediction Model of Welded Joints under Combined Cyclic Loading

    NASA Astrophysics Data System (ADS)

    Goes, Keurrie C.; Camarao, Arnaldo F.; Pereira, Marcos Venicius S.; Ferreira Batalha, Gilmar

    2011-01-01

    A practical and robust methodology is developed to evaluate the fatigue life of seam-welded joints subjected to combined cyclic loading. The fatigue analysis was conducted in a virtual environment. The FE stress results from each loading were imported into the fatigue code FE-Fatigue and combined to perform the fatigue life prediction using the S-N (stress vs. life) method. The measurement or modelling of the residual stresses resulting from the welding process is not part of this work. However, the thermal and metallurgical effects, such as distortions and residual stresses, were considered indirectly through fatigue curve corrections in the samples investigated. A tube-plate specimen was submitted to combined cyclic loading (bending and torsion) with constant amplitude. The virtual durability analysis result was calibrated against these laboratory tests and design codes such as BS7608 and Eurocode 3. The feasibility and application of the proposed numerical-experimental methodology and its contributions to technical development are discussed. Major challenges associated with this modelling and proposals for improvement are finally presented.

  4. A method based on infrared detection for determining the moisture content of ceramic plaster materials.

    PubMed

    Macias-Melo, E V; Aguilar-Castro, K M; Alvarez-Lemus, M A; Flores-Prieto, J J

    2015-09-01

    In this work, we describe a methodology for developing a mathematical model based on infrared (IR) detection to determine the moisture content (M) in solid samples. For this purpose, an experimental setup was designed, developed and calibrated against the gravimetric method. The experimental arrangement allowed for the simultaneous measurement of M and the electromotive force (EMF), matching the experimental variables as closely as possible. These variables were correlated by a mathematical model, and the obtained correlation was M = 1.12 × exp(3.47 × EMF), ±2.54%. This finding suggests that it is feasible to measure moisture contents greater than 2.54%. The proposed methodology could be used under different conditions of temperature, relative humidity and drying rate to evaluate the influence of these variables on the amount of energy received by the IR detector. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
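The reported correlation is a one-line exponential model, which can be written directly in code. This is a sketch of the published fit only; the units of the EMF input are assumed to be those used when the correlation was calibrated.

```python
import math

def moisture_content(emf):
    """Moisture content M (%) of a solid sample from the IR detector's
    electromotive force, using the correlation reported above:
    M = 1.12 * exp(3.47 * EMF), valid within the stated +/-2.54% bound.
    (EMF is assumed to be in the units used when fitting the model.)
    """
    return 1.12 * math.exp(3.47 * emf)
```

Note the model is strictly increasing in EMF and returns M = 1.12% at EMF = 0, so it can only resolve moisture contents above the reported 2.54% floor.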

  5. Relationship between perceived politeness and spectral characteristics of voice

    NASA Astrophysics Data System (ADS)

    Ito, Mika

    2005-04-01

    This study investigates the role of voice quality in perceiving politeness under conditions of varying relative social status among Japanese male speakers. The work focuses on four important methodological issues: experimental control of sociolinguistic aspects, eliciting natural spontaneous speech, obtaining recording quality suitable for voice quality analysis, and assessment of glottal characteristics through the use of non-invasive direct measurements of the speech spectrum. To obtain natural, unscripted utterances, the speech data were collected with a Map Task. This methodology allowed us to study the effect of manipulating relative social status among participants in the same community. We then computed the relative amplitudes of harmonics and formant peaks in spectra obtained from the Map Task recordings. Finally, an experiment was conducted to observe the alignment between acoustic measures and the perceived politeness of the voice samples. The results suggest that listeners' perceptions of politeness are determined by spectral characteristics of speakers, in particular, spectral tilts obtained by computing the difference in amplitude between the first harmonic and the third formant.

  6. Interventions to Address Parenting and Parental Substance Abuse: Conceptual and Methodological Considerations

    PubMed Central

    Neger, Emily N.; Prinz, Ronald J.

    2015-01-01

    Parental substance abuse is a serious problem affecting the well-being of children and families. The co-occurrence of parental substance abuse and problematic parenting is recognized as a major public health concern. This review focuses on 21 outcome studies that tested dual treatment of substance abuse and parenting. A summary of theoretical conceptualizations of the connections between substance abuse and parenting provides a backdrop for the review. Outcomes of the dual treatment studies were generally positive with respect to reduction of parental substance use and improvement of parenting. Research in this area varied in methodological rigor and needs to overcome challenges regarding design issues, sampling frame, and complexities inherent in such a high-risk population. This area of work can be strengthened by randomized controlled trials, use of mixed-methods outcome measures, consideration of parent involvement with child protective services, involvement of significant others in treatment, provision of concrete supports for treatment attendance and facilitative public policies. PMID:25939033

  7. Identification of vegetable oil botanical speciation in refined vegetable oil blends using an innovative combination of chromatographic and spectroscopic techniques.

    PubMed

    Osorio, Maria Teresa; Haughey, Simon A; Elliott, Christopher T; Koidis, Anastasios

    2015-12-15

    European Regulation 1169/2011 requires producers of foods that contain refined vegetable oils to label the oil types. A novel, rapid and staged methodology has been developed for the first time to identify common oil species in oil blends. The qualitative method consists of a combination of Fourier transform infrared (FTIR) spectroscopy to profile the oils and fatty acid chromatographic analysis to confirm the composition of the oils when required. Calibration models and specific classification criteria were developed and all data were fused into a simple decision-making system. Single-laboratory validation of the method demonstrated very good performance (96% correct classification, 100% specificity, 4% false positive rate). Only a small fraction of the samples needed to be confirmed, with the majority of oils identified rapidly using only the spectroscopic procedure. The results demonstrate the considerable potential of the methodology for a wide range of oil authenticity work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Surface laser marking optimization using an experimental design approach

    NASA Astrophysics Data System (ADS)

    Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.

    2017-04-01

    Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τpulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (outputs), using Design of Experiments (DOE) methods: the Taguchi methodology and response surface methodology (RSM). A design is first created using the MINITAB program, and the laser marking process is then performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed, and the RSM model is developed and verified for predicting the process output for a given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
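The response-surface step above amounts to fitting a quadratic model of the responses in the process parameters by least squares. The sketch below shows that generic fit with invented speed/intensity/roughness values; it is not the authors' MINITAB workflow or their data.

```python
import numpy as np

# Hypothetical marking runs: scanning speed (mm/s), pumping intensity (a.u.)
# and measured surface roughness (a.u.) -- illustrative values only.
speed = np.array([100, 100, 200, 200, 150, 150, 100, 200, 150], dtype=float)
intensity = np.array([10, 20, 10, 20, 15, 15, 15, 15, 10], dtype=float)
roughness = np.array([1.8, 2.4, 1.2, 1.9, 1.6, 1.6, 2.1, 1.5, 1.5])

# Full quadratic RSM model:
# y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
X = np.column_stack([np.ones_like(speed), speed, intensity,
                     speed**2, intensity**2, speed * intensity])
coef, *_ = np.linalg.lstsq(X, roughness, rcond=None)

def predict(s, i):
    """Predicted response at a new (speed, intensity) operating point."""
    return coef @ np.array([1.0, s, i, s**2, i**2, s * i])
```

Once fitted, the quadratic surface can be interrogated for the most influential factor (e.g., by comparing scaled coefficients) or optimized over the parameter range, which is the usual use of an RSM model.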

  9. Improved population estimates through the use of auxiliary information

    USGS Publications Warehouse

    Johnson, D.H.; Ralph, C.J.; Scott, J.M.

    1981-01-01

    When estimating the size of a population of birds, the investigator may have, in addition to an estimator based on a statistical sample, information on one of several auxiliary variables, such as: (1) estimates of the population made on previous occasions, (2) measures of habitat variables associated with the size of the population, and (3) estimates of the population sizes of other species that correlate with the species of interest. Although many studies have described the relationships between each of these kinds of data and the population size to be estimated, very little work has been done to improve the estimator by incorporating such auxiliary information. A statistical methodology termed 'empirical Bayes' seems to be appropriate to these situations. The potential that empirical Bayes methodology has for improved estimation of the population size of the Mallard (Anas platyrhynchos) is explored. In the example considered, three empirical Bayes estimators were found to reduce the error by one-fourth to one-half of that of the usual estimator.
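The empirical Bayes idea above can be shown with a minimal normal-normal shrinkage sketch: the survey-based estimate is pulled toward an auxiliary prediction (a prior estimate, a habitat-based prediction, or a correlated species), with weights set by the two variances. The abstract does not specify the three Mallard estimators, so this is a generic illustration, not the paper's method.

```python
def shrinkage_estimate(direct, direct_var, prior_mean, prior_var):
    """Combine a direct survey estimate with an auxiliary prediction.

    The weights are inversely proportional to the variances -- the
    normal-normal empirical Bayes posterior mean. A noisy direct estimate
    (large direct_var) is pulled strongly toward the auxiliary value.
    """
    w = prior_var / (prior_var + direct_var)
    return w * direct + (1 - w) * prior_mean
```

For example, a direct count of 100 with variance 25, combined with a habitat-based prediction of 80 with variance 75, yields a combined estimate of 95: the direct estimate dominates because it is three times more precise.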

  10. Combination of LC/TOF-MS and LC/Ion Trap MS/MS for the Identification of Diphenhydramine in Sediment Samples

    USGS Publications Warehouse

    Ferrer, I.; Heine, C.E.; Thurman, E.M.

    2004-01-01

    Diphenhydramine (Benadryl) is a popular over-the-counter antihistaminic medication used for the treatment of allergies. After consumption, excretion, and subsequent discharge from wastewater treatment plants, it is possible that diphenhydramine will be found in environmental sediments due to its hydrophobicity (log P = 3.27). This work describes a methodology for the first unequivocal determination of diphenhydramine bound to environmental sediments. The drug is removed from the sediments by accelerated solvent extraction and then analyzed by liquid chromatography with a time-of-flight mass spectrometer and an ion trap mass spectrometer. This combination of techniques provided unequivocal identification and confirmation of diphenhydramine in two sediment samples. The accurate mass measurements of the protonated molecules were m/z 256.1703 and 256.1696 compared to the calculated mass of m/z 256.1701, resulting in errors of 0.8 and 2.3 ppm. This mass accuracy was sufficient to verify the elemental composition of diphenhydramine in each sample. Furthermore, accurate mass measurements of the primary fragment ion were obtained. This work is the first application of time-of-flight mass spectrometry for the identification of diphenhydramine and shows the accumulation of an over-the-counter medication in aquatic sediments at five different locations.
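The mass accuracies quoted above follow from the standard parts-per-million error formula, shown here for the first sediment sample:

```python
def ppm_error(measured_mz, calculated_mz):
    """Mass accuracy in parts per million (ppm): the relative deviation
    of a measured m/z from the calculated exact mass, scaled by 1e6."""
    return (measured_mz - calculated_mz) / calculated_mz * 1e6

# First sediment sample: measured m/z 256.1703 vs calculated 256.1701
err = ppm_error(256.1703, 256.1701)
```

This gives approximately 0.8 ppm, matching the first error reported above; sub-ppm accuracy of this kind is what allows the elemental composition of diphenhydramine to be verified.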

  11. Design and analysis of group-randomized trials in cancer: A review of current practices.

    PubMed

    Murray, David M; Pals, Sherri L; George, Stephanie M; Kuzmichev, Andrey; Lai, Gabriel Y; Lee, Jocelyn A; Myles, Ranell L; Nelson, Shakira M

    2018-06-01

    The purpose of this paper is to summarize current practices for the design and analysis of group-randomized trials involving cancer-related risk factors or outcomes and to offer recommendations to improve future trials. We searched for group-randomized trials involving cancer-related risk factors or outcomes that were published or online in peer-reviewed journals in 2011-15. During 2016-17, in Bethesda MD, we reviewed 123 articles from 76 journals to characterize their design and their methods for sample size estimation and data analysis. Only 66 (53.7%) of the articles reported appropriate methods for sample size estimation. Only 63 (51.2%) reported exclusively appropriate methods for analysis. These findings suggest that many investigators do not adequately attend to the methodological challenges inherent in group-randomized trials. These practices can lead to underpowered studies, to an inflated type 1 error rate, and to inferences that mislead readers. Investigators should work with biostatisticians or other methodologists familiar with these issues. Funders and editors should ensure careful methodological review of applications and manuscripts. Reviewers should ensure that studies are properly planned and analyzed. These steps are needed to improve the rigor and reproducibility of group-randomized trials. The Office of Disease Prevention (ODP) at the National Institutes of Health (NIH) has taken several steps to address these issues. ODP offers an online course on the design and analysis of group-randomized trials. ODP is working to increase the number of methodologists who serve on grant review panels. ODP has developed standard language for the Application Guide and the Review Criteria to draw investigators' attention to these issues. Finally, ODP has created a new Research Methods Resources website to help investigators, reviewers, and NIH staff better understand these issues. Published by Elsevier Inc.
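The underpowering problem the review describes arises when investigators ignore the intraclass correlation among members of the same group. The standard correction inflates an individually-randomized sample size by the design effect DEFF = 1 + (m − 1) × ICC; the sketch below is that textbook formula, not a calculation from the paper.

```python
import math

def grt_sample_size(n_individual, cluster_size, icc):
    """Sample size for a group-randomized trial.

    Inflates the sample size required under individual randomization
    (n_individual) by the design effect DEFF = 1 + (m - 1) * ICC,
    where m is the number of members per group and ICC is the
    intraclass correlation coefficient.
    """
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_individual * deff)
```

Even a small ICC matters: with 21 members per group and ICC = 0.05, the design effect is 2.0, doubling the required sample size. Ignoring it halves the effective sample size and inflates the type 1 error rate, as the review warns.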

  12. A synthesis of convenience survey and other data to estimate undiagnosed HIV infection among men who have sex with men in England and Wales.

    PubMed

    Walker, Kate; Seaman, Shaun R; De Angelis, Daniela; Presanis, Anne M; Dodds, Julie P; Johnson, Anne M; Mercey, Danielle; Gill, O Noel; Copas, Andrew J

    2011-10-01

    Hard-to-reach population subgroups are typically investigated using convenience sampling, which may give biased estimates. Combining information from such surveys, a probability survey and clinic surveillance can potentially minimize the bias. We developed a methodology to estimate the prevalence of undiagnosed HIV infection among men who have sex with men (MSM) in England and Wales aged 16-44 years in 2003, making fuller use of the available data than earlier work. We performed a synthesis of three data sources: genitourinary medicine clinic surveillance (11 380 tests), a venue-based convenience survey including anonymous HIV testing (3702 MSM) and a general population sexual behaviour survey (134 MSM). A logistic regression model to predict undiagnosed infection was fitted to the convenience survey data and then applied to the MSM in the population survey to estimate the prevalence of undiagnosed infection in the general MSM population. This estimate was corrected for selection biases in the convenience survey using clinic surveillance data. A sensitivity analysis addressed uncertainty in our assumptions. The estimated prevalence of undiagnosed HIV in MSM was 2.4% [95% confidence interval (CI) 1.7-3.0%], and between 1.6% (95% CI 1.1-2.0%) and 3.3% (95% CI 2.4-4.1%) depending on assumptions; corresponding to 5500 (3390-7180), 3610 (2180-4740) and 7570 (4790-9840) men, and undiagnosed fractions of 33, 24 and 40%, respectively. Our estimates are consistent with earlier work that did not make full use of the data sources. Reconciling data from multiple sources, including probability-, clinic- and venue-based convenience samples, can reduce bias in estimates. This methodology could be applied in other settings to take full advantage of multiple imperfect data sources.

  13. Quantitative detection of crystalline lysine supplementation in poultry feeds using a rapid bacterial bioluminescence assay.

    PubMed

    Zabala Díaz, I B; Ricke, S C

    2003-08-01

    Lysine is an essential amino acid for both humans and animals, and it is usually the first or second limiting amino acid in most formulated diets. In order to estimate the lysine content in feeds and feed sources, rapid amino acid bioassays have been developed. The objective of this work was to assess a rapid assay for lysine supplementation in chicken feeds, using a luminescent Escherichia coli lysine-auxotrophic strain, to avoid prior thermal sterilization. An E. coli lysine auxotroph carrying a plasmid with lux genes was used as the test organism. The lysine assay was conducted using depleted auxotrophic cells in lysine samples. Luminescence was measured with a Dynex MLX luminometer after addition of the aldehyde substrate. Growth response (monitored as optical density at 600 nm) and light emission response of the assay E. coli strain were monitored to generate standard curves. Bioluminescent analysis of feed samples indicated that the method works well in the presence of a complex feed matrix. Comparison of the optical density and luminescence-based methods indicated that, when the assay takes place under optimal conditions, the two methodologies correlate well (r² = 0.99). Except for the 0.64% lysine-supplemented feed, estimates for lysine based on the bacterial assay were over 80% (82-97%) of the theoretical values. Animal data showed that the bacterial bioluminescent method correlated well with the chick bioassay when diets with different levels of lysine supplementation were assayed for lysine bioavailability (r² = 0.97). Luminescent methodology coupled with a bacterial growth assay is a promising technique to assess lysine availability in supplemented animal feeds.
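The standard-curve step common to bioassays of this kind can be sketched as a linear fit of the measured signal against known lysine levels, inverted to estimate an unknown sample. The concentrations and light-unit values below are invented for illustration; they are not the study's data.

```python
import numpy as np

# Hypothetical standard curve: lysine level (%) vs relative light units (RLU)
lysine = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
rlu = np.array([120.0, 260.0, 390.0, 530.0, 660.0])

# Linear standard-curve fit; a strong correlation (the study reports
# r^2 = 0.99 between its methods) justifies the linear form.
slope, intercept = np.polyfit(lysine, rlu, 1)
r = np.corrcoef(lysine, rlu)[0, 1]

def estimate_lysine(sample_rlu):
    """Invert the standard curve to estimate lysine in an unknown sample."""
    return (sample_rlu - intercept) / slope
```

In practice each plate run would carry its own standards, since luminescence scales with cell density and substrate addition; the inversion step is the same either way.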

  14. Pulsed field gradient magic angle spinning NMR self-diffusion measurements in liquids

    NASA Astrophysics Data System (ADS)

    Viel, Stéphane; Ziarelli, Fabio; Pagès, Guilhem; Carrara, Caroline; Caldarelli, Stefano

    2008-01-01

    Several investigations have recently reported the combined use of pulsed field gradient (PFG) with magic angle spinning (MAS) for the analysis of molecular mobility in heterogeneous materials. In contrast, little attention has been devoted so far to delimiting the role of the extra force field induced by sample rotation on the significance and reliability of self-diffusivity measurements. The main purpose of this work is to examine this phenomenon by focusing on pure liquids for which its impact is expected to be largest. Specifically, we show that self-diffusion coefficients can be accurately determined by PFG MAS NMR diffusion measurements in liquids, provided that specific experimental conditions are met. First, the methodology to estimate the gradient uniformity and to properly calibrate its absolute strength is briefly reviewed and applied on a MAS probe equipped with a gradient coil aligned along the rotor spinning axis, the so-called 'magic angle gradient' coil. Second, the influence of MAS on the outcome of PFG MAS diffusion measurements in liquids is investigated for two distinct typical rotors of different active volumes, 12 and 50 μL. While the latter rotor led to totally unreliable results, especially for low viscosity compounds, the former allowed for the determination of accurate self-diffusion coefficients both for fast and slowly diffusing species. Potential implications of this work are the possibility to measure accurate self-diffusion coefficients of sample-limited mixtures or to avoid radiation damping interferences in NMR diffusion measurements. Overall, the outlined methodology should be of interest to anyone who strives to improve the reliability of MAS diffusion studies, both in homogeneous and heterogeneous media.

  15. Diverse Ways to Fore-Ground Methodological Insights about Qualitative Research

    ERIC Educational Resources Information Center

    Koro-Ljungberg, Mirka; Mazzei, Lisa A.; Ceglowski, Deborah

    2013-01-01

    Texts and articles that put epistemological theories and methodologies to work in the context of qualitative research can stimulate scholarship in various ways such as through methodological innovations, transferability of theories and methods, interdisciplinarity, and transformative reflections across traditions and frameworks. Such…

  16. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  17. Designing prospective cohort studies for assessing reproductive and developmental toxicity during sensitive windows of human reproduction and development--the LIFE Study.

    PubMed

    Buck Louis, Germaine M; Schisterman, Enrique F; Sweeney, Anne M; Wilcosky, Timothy C; Gore-Langton, Robert E; Lynch, Courtney D; Boyd Barr, Dana; Schrader, Steven M; Kim, Sungduk; Chen, Zhen; Sundaram, Rajeshwari

    2011-09-01

    The relationship between the environment and human fecundity and fertility remains virtually unstudied from a couple-based perspective in which longitudinal exposure data and biospecimens are captured across sensitive windows. In response, we completed the LIFE Study with methodology that intended to empirically evaluate a priori purported methodological challenges: implementation of population-based sampling frameworks suitable for recruiting couples planning pregnancy; obtaining environmental data across sensitive windows of reproduction and development; home-based biospecimen collection; and development of a data management system for hierarchical exposome data. We used two sampling frameworks (i.e., fish/wildlife licence registry and a direct marketing database) for 16 targeted counties with presumed environmental exposures to persistent organochlorine chemicals to recruit 501 couples planning pregnancies for prospective longitudinal follow-up while trying to conceive and throughout pregnancy. Enrolment rates varied from <1% of the targeted population (n = 424,423) to 42% of eligible couples who were successfully screened; 84% of the targeted population could not be reached, while 36% refused screening. Among enrolled couples, ∼ 85% completed daily journals while trying; 82% of pregnant women completed daily early pregnancy journals, and 80% completed monthly pregnancy journals. All couples provided baseline blood/urine samples; 94% of men provided one or more semen samples and 98% of women provided one or more saliva samples. Women successfully used urinary fertility monitors for identifying ovulation and home pregnancy test kits. Couples can be recruited for preconception cohorts and will comply with intensive data collection across sensitive windows. However, appropriately sized sampling frameworks are critical, given the small percentage of couples contacted found eligible and reportedly planning pregnancy at any point in time. © Published 2011. 
This article is a US Government work and is in the public domain in the USA.

  18. Archetype modeling methodology.

    PubMed

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. [Occupational exposure to airborne chemical substances in paintings conservators].

    PubMed

    Jezewska, Anna; Szewczyńska, Małgorzata; Woźnica, Agnieszka

    2014-01-01

    This paper presents the results of a quantitative study of airborne chemical substances detected in the conservator's work environment. The quantitative tests were carried out in 6 museum easel painting conservation studios. The air samples were taken at various stages of restoration work, such as cleaning, doubling, impregnation, varnishing and retouching, to name a few. The chemical substances in the sampled air were measured by GC-FID (gas chromatography with flame-ionization detection). The study results demonstrated that concentrations of airborne substances, e.g., toluene, 1,4-dioxane, turpentine and white spirit, in the work environment of paintings conservators exceeded the values allowed by hygiene standards. It was found that exposure levels to the same chemical agents, released during similar activities, varied between painting conservation studios. This discrepancy likely resulted from the indoor air exchange system of a given studio (e.g. type of ventilation and its efficiency), the size of the object under conservation, and also from the methodology and protection used by individual employees. The levels of organic solvent vapors present in the workplace air in the course of painting conservation were found to be well above the occupational exposure limits, thus posing a threat to workers' health.

  20. Network Structure and Biased Variance Estimation in Respondent Driven Sampling

    PubMed Central

    Verdery, Ashton M.; Mouw, Ted; Bauldry, Shawn; Mucha, Peter J.

    2015-01-01

    This paper explores bias in the estimation of sampling variance in Respondent Driven Sampling (RDS). Prior methodological work on RDS has focused on its problematic assumptions and the biases and inefficiencies of its estimators of the population mean. Nonetheless, researchers have given only slight attention to the topic of estimating sampling variance in RDS, despite the importance of variance estimation for the construction of confidence intervals and hypothesis tests. In this paper, we show that the estimators of RDS sampling variance rely on a critical assumption that the network is First Order Markov (FOM) with respect to the dependent variable of interest. We demonstrate, through intuitive examples, mathematical generalizations, and computational experiments that current RDS variance estimators will always underestimate the population sampling variance of RDS in empirical networks that do not conform to the FOM assumption. Analysis of 215 observed university and school networks from Facebook and Add Health indicates that the FOM assumption is violated in every empirical network we analyze, and that these violations lead to substantially biased RDS estimators of sampling variance. We propose and test two alternative variance estimators that show some promise for reducing biases, but which also illustrate the limits of estimating sampling variance with only partial information on the underlying population social network. PMID:26679927
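The First Order Markov assumption above can be probed directly from recruitment chains: under FOM, the probability of a recruit's trait should depend only on the recruiter's trait, not on the grand-recruiter's. The sketch below estimates both conditional distributions from hypothetical coded chains (not the Facebook/Add Health networks) so they can be compared.

```python
from collections import Counter

# Hypothetical recruitment chains: each tuple is the binary trait of
# (grand-recruiter, recruiter, recruit) -- illustrative data only.
chains = [(0, 0, 0), (0, 0, 1), (1, 0, 0), (0, 1, 1), (1, 1, 1),
          (1, 1, 0), (0, 0, 0), (1, 1, 1), (0, 1, 1), (1, 0, 0)]

def first_order(chains):
    """P(recruit trait | recruiter trait): the FOM transition estimates."""
    num, den = Counter(), Counter()
    for _, r, c in chains:
        den[r] += 1
        num[(r, c)] += 1
    return {rc: num[rc] / den[rc[0]] for rc in num}

def second_order(chains):
    """P(recruit trait | grand-recruiter trait, recruiter trait)."""
    num, den = Counter(), Counter()
    for g, r, c in chains:
        den[(g, r)] += 1
        num[(g, r, c)] += 1
    return {grc: num[grc] / den[grc[:2]] for grc in num}
```

If the second-order probabilities differ materially across grand-recruiter traits for a fixed recruiter trait, the FOM assumption is violated, and (per the paper) the standard RDS variance estimators will understate the true sampling variance.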

  1. Critical Inquiry for the Social Good: Methodological Work as a Means for Truth-Telling in Education

    ERIC Educational Resources Information Center

    Kuntz, Aaron M.; Pickup, Austin

    2016-01-01

    This article questions the ubiquity of the term "critical" in methodological scholarship, calling for a renewed association of the term with projects concerned with social justice, truth-telling, and overt articulations of the social good. Drawing on Michel Foucault's work with parrhesia (or truth-telling) and Aristotle's articulation of…

  2. Joint Autoethnography of Teacher Experience in the Academy: Exploring Methods for Collaborative Inquiry

    ERIC Educational Resources Information Center

    Adamson, John; Muller, Theron

    2018-01-01

    This manuscript uses a joint autoethnographic methodology to explore the experiences of two language teacher scholars working in the academy outside the global centre in Japan. Emphasis is given to how the methodology used, cycles of reflective writing, reveals commonalities and differences in our respective experiences of working in the Japanese…

  3. Collaborative Action Research in the Context of Developmental Work Research: A Methodological Approach for Science Teachers' Professional Development

    ERIC Educational Resources Information Center

    Piliouras, Panagiotis; Lathouris, Dimitris; Plakitsi, Katerina; Stylianou, Liana

    2015-01-01

    The paper refers to the theoretical establishment and brief presentation of collaborative action research with the characteristics of "developmental work research" as an effective methodological approach so that science teachers develop themselves professionally. A specific case study is presented, in which we aimed to transform the…

  4. The Effect of Soft Skills and Training Methodology on Employee Performance

    ERIC Educational Resources Information Center

    Ibrahim, Rosli; Boerhannoeddin, Ali; Bakare, Kazeem Kayode

    2017-01-01

    Purpose: The purpose of this paper is to investigate the effect of soft skill acquisition and the training methodology adopted on employee work performance. In this study, the authors study the trends of research in training and work performance in organisations that focus on the acquisition of technical or "hard skills" for employee…

  5. Determination of trace levels of parabens in real matrices by bar adsorptive microextraction using selective sorbent phases.

    PubMed

    Almeida, C; Nogueira, J M F

    2014-06-27

    In the present work, the development of an analytical methodology combining bar adsorptive microextraction with micro-liquid desorption followed by high-performance liquid chromatography-diode array detection (BAμE-μLD/HPLC-DAD) is proposed for the determination of trace levels of four parabens (methyl, ethyl, propyl and butyl paraben) in real matrices. By comparing six polymer (P1, P2, P3, P4, P5 and P6) and five activated carbon (AC1, AC2, AC3, AC4 and AC5) coatings through BAμE, AC2 exhibited much higher selectivity and efficiency than all the other sorbent phases tested, even when compared with commercial stir bar sorptive extraction with polydimethylsiloxane. Assays performed through BAμE(AC2, 1.7 mg) on 25 mL of ultrapure water samples spiked at the 8.0 μg/L level yielded recoveries ranging from 85.6±6.3% to 100.6±11.8% under optimized experimental conditions. The analytical performance also showed convenient limits of detection (0.1 μg/L) and quantification (0.3 μg/L), as well as good linear dynamic ranges (0.5-28.0 μg/L) with remarkable determination coefficients (r(2)>0.9982). Excellent repeatability was also achieved in intraday (RSD<10.2%) and interday (RSD<10.0%) assays. By downsizing the analytical device to half-length (BAμE(AC2, 0.9 mg)), similar analytical data were achieved for the four parabens under optimized experimental conditions, showing that this analytical technology can be designed to operate with lower volumes of sample and desorption solvent, thus increasing sensitivity and effectiveness. The application of the proposed analytical approach using the standard addition methodology to tap, underground, estuarine, swimming pool and waste water samples, as well as to commercial cosmetic products and urine samples, revealed good sensitivity, absence of matrix effects and the occurrence of some parabens. Moreover, the present methodology is easy to implement, reliable and sensitive, requires small sample and desorption solvent volumes, and offers the possibility of tuning the most selective sorbent coating according to the target compounds involved. Copyright © 2014 Elsevier B.V. All rights reserved.
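    The standard addition methodology used above can be sketched numerically (idealized, hypothetical numbers, not data from the study): equal aliquots of the sample are spiked with increasing amounts of standard, the response is regressed on the added concentration, and the unknown concentration is read off the x-intercept:

    ```python
    import numpy as np

    # Idealized standard-addition data for one analyte in a water sample
    # (hypothetical numbers, chosen noise-free for clarity).
    added = np.array([0.0, 2.0, 4.0, 6.0, 8.0])      # spiked conc., ug/L
    signal = np.array([1.0, 1.5, 2.0, 2.5, 3.0])     # detector response

    # Linear fit of response vs. added standard; the unspiked concentration
    # is the magnitude of the x-intercept: c0 = intercept / slope.
    slope, intercept = np.polyfit(added, signal, 1)
    c0 = intercept / slope
    print(c0)   # about 4.0 ug/L in this idealized example
    ```

    Because the calibration is built inside the sample itself, this approach compensates for the matrix effects that the abstract reports as absent.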

  6. Perceptions of health professionals towards the management of back pain in the context of work: a qualitative study

    PubMed Central

    2014-01-01

    Background Musculoskeletal complaints have a significant impact on work in terms of reduced productivity, sickness absence and long term incapacity for work. This study sought to explore GPs’ and physiotherapists’ perceptions of sickness certification in patients with musculoskeletal problems. Methods Eleven (11) GPs were sampled from an existing general practice survey, and six (6) physiotherapists were selected randomly using ‘snowball’ sampling techniques, through established contacts in local physiotherapy departments. Semi-structured qualitative interviews were conducted with respondents lasting up to 30 minutes. The interviews were audio recorded and transcribed verbatim, following which they were coded using N-Vivo qualitative software and analysed thematically using the constant comparative methodology, where themes were identified and contrasted between and within both groups of respondents. Results Three themes were identified from the analysis: 1) Approaches to evaluating patients’ work problems 2) Perceived ability to manage ‘work and pain’, and 3) Policies and penalties in the work-place. First, physiotherapists routinely asked patients about their job and work difficulties using a structured (protocol-driven) approach, whilst GPs rarely used such structured measures and were less likely to enquire about patients’ work situation. Second, return to work assessments revealed a tension between GPs’ gatekeeper and patient advocacy roles, often resolved in favour of patients’ concerns and needs. Some physiotherapists perceived that GPs’ decisions could be influenced by patients’ demand for a sick certificate and their close relationship with patients made them vulnerable to manipulation. Third, the workplace was considered to be a specific source of strain for patients acting as a barrier to work resumption, and over which GPs and physiotherapists could exercise only limited control. 
Conclusion We conclude that healthcare professionals need to take account of patients’ work difficulties, their own perceived ability to offer effective guidance, and consider the ‘receptivity’ of employment contexts to patients’ work problems, in order to ensure a smooth transition back to work. PMID:24941952

  7. Perceptions of health professionals towards the management of back pain in the context of work: a qualitative study.

    PubMed

    Wynne-Jones, Gwenllian; van der Windt, Danielle; Ong, Bie Nio; Bishop, Annette; Cowen, Jemma; Artus, Majid; Sanders, Tom

    2014-06-18

    Musculoskeletal complaints have a significant impact on work in terms of reduced productivity, sickness absence and long term incapacity for work. This study sought to explore GPs' and physiotherapists' perceptions of sickness certification in patients with musculoskeletal problems. Eleven (11) GPs were sampled from an existing general practice survey, and six (6) physiotherapists were selected randomly using 'snowball' sampling techniques, through established contacts in local physiotherapy departments. Semi-structured qualitative interviews were conducted with respondents lasting up to 30 minutes. The interviews were audio recorded and transcribed verbatim, following which they were coded using N-Vivo qualitative software and analysed thematically using the constant comparative methodology, where themes were identified and contrasted between and within both groups of respondents. Three themes were identified from the analysis: 1) Approaches to evaluating patients' work problems 2) Perceived ability to manage 'work and pain', and 3) Policies and penalties in the work-place. First, physiotherapists routinely asked patients about their job and work difficulties using a structured (protocol-driven) approach, whilst GPs rarely used such structured measures and were less likely to enquire about patients' work situation. Second, return to work assessments revealed a tension between GPs' gatekeeper and patient advocacy roles, often resolved in favour of patients' concerns and needs. Some physiotherapists perceived that GPs' decisions could be influenced by patients' demand for a sick certificate and their close relationship with patients made them vulnerable to manipulation. Third, the workplace was considered to be a specific source of strain for patients acting as a barrier to work resumption, and over which GPs and physiotherapists could exercise only limited control. 
We conclude that healthcare professionals need to take account of patients' work difficulties, their own perceived ability to offer effective guidance, and consider the 'receptivity' of employment contexts to patients' work problems, in order to ensure a smooth transition back to work.

  8. How much is enough? An analysis of CD measurement amount for mask characterization

    NASA Astrophysics Data System (ADS)

    Ullrich, Albrecht; Richter, Jan

    2009-10-01

    The demands on CD (critical dimension) metrology, in terms of both reproducibility and measurement uncertainty, increase steadily from node to node. Different mask characterization requirements have to be addressed, such as very small features, unevenly distributed features, contacts and semi-dense structures, to name only a few. Usually this enhanced need is met by an increasing number of CD measurements, where the new CD requirements are added to the well-established CD characterization recipe. This leads straightforwardly to prolonged cycle times and highly complex evaluation routines. At the same time, mask processes are continuously improved to become more stable. The enhanced stability offers potential to actually reduce the number of measurements. Thus, in this work we start to address the fundamental question of how many CD measurements are needed for mask characterization at a given confidence level. We used analysis of variance (ANOVA) to distinguish various contributors such as the mask making process, measurement tool stability and measurement methodology. These contributions have been investigated for classical photomask CD specifications, e.g., mean to target, CD uniformity, target offset tolerance and x-y bias. We found that, depending on the specification, the importance of the contributors interchanges. Interestingly, not only short- and long-term metrology contributions are dominant. The number of measurements and their spatial distribution on the mask layout (sampling methodology) can also be the most important part of the variance. Knowledge of the contributions can be used to optimize the sampling plan. As a major finding, we conclude that there is potential to eliminate a significant number of measurements without losing confidence at all. Here, full sampling in x and y as well as full sampling for different features can be shortened by almost 50%.
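    The ANOVA-style separation of contributors can be sketched as a one-way random-effects variance-component estimate (a generic illustration with simulated data and assumed sigmas, not the authors' mask measurements): the between-mask mean square captures the process contribution, the within-mask mean square the metrology/spatial contribution:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Simulated CD data (assumed sigmas, not the paper's): 8 masks with a
    # mask-to-mask (process) spread, 25 measured sites per mask adding a
    # within-mask (metrology + spatial) spread.
    sigma_process, sigma_within = 1.5, 0.5            # nm, assumed
    n_masks, n_sites = 8, 25
    mask_means = rng.normal(0.0, sigma_process, n_masks)
    data = mask_means[:, None] + rng.normal(0.0, sigma_within, (n_masks, n_sites))

    # One-way ANOVA mean squares
    grand = data.mean()
    ms_between = n_sites * ((data.mean(axis=1) - grand) ** 2).sum() / (n_masks - 1)
    ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n_masks * (n_sites - 1))

    # Method-of-moments variance components: the within-mask variance comes
    # straight from ms_within; the process component is the excess of
    # ms_between over ms_within, per site.
    var_within = ms_within
    var_process = max((ms_between - ms_within) / n_sites, 0.0)
    print(var_process, var_within)
    ```

    Once the components are known, the standard error of a mean-to-target estimate from k sites on one mask scales as sqrt(var_process + var_within / k), which is the lever for trimming the sampling plan at a stated confidence level.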

  9. Monitoring and Surveillance of Marine Invasive Species in Californian Waters by DNA Barcoding: Methodological and Analytical Solutions

    NASA Astrophysics Data System (ADS)

    Campbell, T. L.; Geller, J. B.; Heller, P.; Ruiz, G.; Chang, A.; McCann, L.; Ceballos, L.; Marraffini, M.; Ashton, G.; Larson, K.; Havard, S.; Meagher, K.; Wheelock, M.; Drake, C.; Rhett, G.

    2016-02-01

    The Ballast Water Management Act, the Marine Invasive Species Act, and the Coastal Ecosystem Protection Act require the California Department of Fish and Wildlife to monitor and evaluate the extent of biological invasions in the state's marine and estuarine waters. This has been performed statewide using a variety of methodologies. Conventional sample collection and processing is laborious, slow and costly, and may require considerable taxonomic expertise, involving detailed, time-consuming microscopic study of multiple specimens. These factors limit the volume of biomass that can be searched for introduced species. New technologies continue to reduce the cost and increase the throughput of genetic analyses, which have become efficient alternatives to traditional morphological analysis for the identification, monitoring and surveillance of marine invasive species. Using next-generation sequencing of mitochondrial cytochrome c oxidase subunit I (COI) and nuclear large subunit ribosomal RNA (LSU), we analyzed over 15,000 individual marine invertebrates collected in Californian waters. We have created sequence databases of California native and non-native species to assist in molecular identification and surveillance in North American waters. Metagenetics, the next-generation sequencing of environmental samples with comparison to DNA sequence databases, is a faster and more cost-effective alternative to individual sample analysis. We have sequenced biomass collected from whole settlement plates and plankton in California harbors, and used our introduced-species database to create species lists. We can combine these species lists for individual marinas with collected environmental data, such as temperature, salinity and dissolved oxygen, to understand the ecology of marine invasions. Here we discuss high-throughput sampling and sequencing, and COASTLINE, our data-analysis answer to the challenges of working with hundreds of millions of sequencing reads from tens of thousands of specimens.

  10. Identification of alkyl dimethylbenzylammonium surfactants in water samples by solid-phase extraction followed by ion trap LC/MS and LC/MS/MS

    USGS Publications Warehouse

    Ferrer, I.; Furlong, E.T.

    2001-01-01

    A novel methodology was developed for the determination of alkyl (C12, C14, and C16) dimethylbenzylammonium chloride (benzalkonium chloride or BAC, Chemical Abstracts Service number: 8001-54-5) in water samples. This method is based on solid-phase extraction (SPE) using polymeric cartridges, followed by high-performance liquid chromatography/ion trap mass spectrometry (LC/MS) and tandem mass spectrometry (MS/MS) detection with an electrospray interface in positive ion mode. Chromatographic separation of the three BAC homologues was achieved using a C18 column and a gradient of acetonitrile/10 millimolar aqueous ammonium formate. Total method recoveries were higher than 71% in different water matrices. The main ions observed by LC/MS were at mass-to-charge ratios (m/z) of 304, 332, and 360, which correspond to the molecular ions of the C12, C14, and C16 alkyl BAC, respectively. The unequivocal structural identification of these compounds in water samples was performed by LC/MS/MS after isolation and subsequent fragmentation of each molecular ion. The main fragmentation observed for the three homologues corresponded to the loss of the toluyl group in the chemical structure, which leads to fragment ions at m/z 212, 240, and 268 and a tropylium ion, characteristic of all homologues, at m/z 91. Detection limits for the methodology developed in this work were in the low nanogram-per-liter range. Concentration levels of BAC, ranging from 1.2 to 36.6 micrograms per liter, were found in surface-water samples collected downstream from different wastewater-treatment discharges, indicating its input and persistence through the wastewater-treatment process.

  11. The Deskilling Controversy.

    ERIC Educational Resources Information Center

    Attewell, Paul

    1987-01-01

    Braverman and others argue that capitalism continues to degrade and deskill work. The author presents theoretical, empirical, and methodological criticisms that highlight methodological weaknesses in the deskilling approach. (SK)

  12. Meaning and Problems of Planning

    ERIC Educational Resources Information Center

    Brieve, Fred J.; Johnston, A. P.

    1973-01-01

    Examines the educational planning process. Discusses what planning is, how methodological planning can work in education, misunderstandings about planning, and difficulties in applying the planning methodology. (DN)

  13. Do I Just Look Stressed or am I Stressed? Work-related Stress in a Sample of Italian Employees

    PubMed Central

    GIORGI, Gabriele; LEON-PEREZ, Jose M.; CUPELLI, Vincenzo; MUCCI, Nicola; ARCANGELI, Giulio

    2013-01-01

    Work-related stress is becoming a significant problem in Italy and it is therefore essential to advance the theory and methodology required to detect this phenomenon at work. Thus, the aim of this paper is to propose a new method for evaluating stress at work by measuring the discrepancies between employees’ perceptions of stress and their leaders’ evaluation of the stress of their subordinates. In addition, a positive impression scale was added to determine whether workers might give socially desirable responses in organizational diagnosis. Over 1,100 employees and 200 leaders within several Italian organizations were involved in this study. Structural equation modeling was used to test such new method for evaluating stress in a model of stress at work that incorporates relationships among individual (positive impression), interpersonal (workplace bullying) and organizational factors (working conditions, welfare culture, training). Results showed that the leaders’ capacity to understand subordinates’ stress is associated with subordinates’ psychological well-being since higher disagreement between self and leaders’ ratings was related to lower well-being. We discuss the implications of healthy leadership for the development of healthy organizations. PMID:24292877

  14. Do I just look stressed or am I stressed? Work-related stress in a sample of Italian employees.

    PubMed

    Giorgi, Gabriele; Leon-Perez, Jose M; Cupelli, Vincenzo; Mucci, Nicola; Arcangeli, Giulio

    2014-01-01

    Work-related stress is becoming a significant problem in Italy and it is therefore essential to advance the theory and methodology required to detect this phenomenon at work. Thus, the aim of this paper is to propose a new method for evaluating stress at work by measuring the discrepancies between employees' perceptions of stress and their leaders' evaluation of the stress of their subordinates. In addition, a positive impression scale was added to determine whether workers might give socially desirable responses in organizational diagnosis. Over 1,100 employees and 200 leaders within several Italian organizations were involved in this study. Structural equation modeling was used to test such new method for evaluating stress in a model of stress at work that incorporates relationships among individual (positive impression), interpersonal (workplace bullying) and organizational factors (working conditions, welfare culture, training). Results showed that the leaders' capacity to understand subordinates' stress is associated with subordinates' psychological well-being since higher disagreement between self and leaders' ratings was related to lower well-being. We discuss the implications of healthy leadership for the development of healthy organizations.

  15. A Narrative in Search of a Methodology.

    PubMed

    Treloar, Anna; Stone, Teresa Elizabeth; McMillan, Margaret; Flakus, Kirstin

    2015-07-01

    Research papers present us with the summaries of scholars' work; what we readers do not see are the struggles behind the decision to choose one methodology over another. A student's mental health portfolio contained a narrative that led to an exploration of the most appropriate methodology for a projected study of clinical anecdotes about mental health nursing, told by nurses who work in mental health settings to undergraduates and new recruits. This paper describes that process of struggle, beginning with the student's account and then posing a number of questions that needed answers before the most appropriate methodology could be chosen. We argue, after discussing the case for the use of literary analysis, discourse analysis, symbolic interactionism, hermeneutics, and narrative research, that case study research is the methodology of choice. Case study is frequently used in educational research and is sufficiently flexible to allow for an exploration of the phenomenon. © 2014 Wiley Periodicals, Inc.

  16. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site-Working towards a toolbox for better assessment.

    PubMed

    Wroble, Julie; Frederick, Timothy; Frame, Alicia; Vallero, Daniel

    2017-01-01

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. Environmental Protection Agency (EPA)'s Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than other methods. Samples were collected using both traditional discrete ("grab") samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations that were not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than discrete samples.

  17. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site—Working towards a toolbox for better assessment

    PubMed Central

    2017-01-01

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. Environmental Protection Agency (EPA)’s Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than other methods. Samples were collected using both traditional discrete (“grab”) samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations that were not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than discrete samples. PMID:28759607

  18. A low-rank matrix recovery approach for energy efficient EEG acquisition for a wireless body area network.

    PubMed

    Majumdar, Angshul; Gogna, Anupriya; Ward, Rabab

    2014-08-25

    We address the problem of acquiring and transmitting EEG signals in Wireless Body Area Networks (WBAN) in an energy efficient fashion. In WBANs, the energy is consumed by three operations: sensing (sampling), processing and transmission. Previous studies only addressed the problem of reducing the transmission energy. For the first time, in this work, we propose a technique to reduce sensing and processing energy as well: this is achieved by randomly under-sampling the EEG signal. We depart from previous Compressed Sensing based approaches and formulate signal recovery (from under-sampled measurements) as a matrix completion problem. A new algorithm to solve the matrix completion problem is derived here. We test our proposed method and find that the reconstruction accuracy of our method is significantly better than state-of-the-art techniques; and we achieve this while saving sensing, processing and transmission energy. Simple power analysis shows that our proposed methodology consumes considerably less power compared to previous CS based techniques.
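    The matrix-completion idea behind this scheme can be sketched generically (a rank-truncated alternating projection on invented data; not the authors' algorithm): a low-rank matrix is recovered from roughly half its entries, which is the role the randomly under-sampled EEG matrix plays above:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical low-rank "signal" matrix standing in for a multi-channel
    # recording (20 channels x 100 samples, rank 3; invented sizes).
    m, n, rank = 20, 100, 3
    truth = rng.standard_normal((m, rank)) @ rng.standard_normal((rank, n))
    mask = rng.random((m, n)) < 0.5            # keep ~50% of the entries
    observed = np.where(mask, truth, 0.0)

    # Generic matrix completion by alternating projections (a sketch of the
    # idea, not the paper's method): project onto rank-r matrices via a
    # truncated SVD, then re-impose the observed entries.
    X = observed.copy()
    for _ in range(200):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        X[mask] = truth[mask]

    rel_err = np.linalg.norm(X - truth) / np.linalg.norm(truth)
    print(rel_err)   # small: the unobserved entries are recovered
    ```

    The energy saving in the abstract comes from the sampling side: only the masked entries would ever be sensed and transmitted, with the completion step run at the receiver.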

  19. WAIS-III index score profiles in the Canadian standardization sample.

    PubMed

    Lange, Rael T

    2007-01-01

    Representative index score profiles were examined in the Canadian standardization sample of the Wechsler Adult Intelligence Scale-Third Edition (WAIS-III). The identification of profile patterns was based on the methodology proposed by Lange, Iverson, Senior, and Chelune (2002) that aims to maximize the influence of profile shape and minimize the influence of profile magnitude on the cluster solution. A two-step cluster analysis procedure was used (i.e., hierarchical and k-means analyses). Cluster analysis of the four index scores (i.e., Verbal Comprehension [VCI], Perceptual Organization [POI], Working Memory [WMI], Processing Speed [PSI]) identified six profiles in this sample. Profiles were differentiated by pattern of performance and were primarily characterized as (a) high VCI/POI, low WMI/PSI, (b) low VCI/POI, high WMI/PSI, (c) high PSI, (d) low PSI, (e) high VCI/WMI, low POI/PSI, and (f) low VCI, high POI. These profiles are potentially useful for determining whether a patient's WAIS-III performance is unusual in a normal population.
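    The shape-over-magnitude clustering idea credited to Lange et al. can be sketched as follows (toy data and a hand-rolled k-means; the prototypes, sample size and k = 2 are invented for illustration): centering each profile at its own mean removes profile magnitude, so the clusters form on profile shape:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    # Toy index-score profiles (VCI, POI, WMI, PSI) for 300 examinees,
    # generated from two invented shape prototypes plus an overall-ability
    # offset that controls profile magnitude.
    protos = np.array([[8.0, 8.0, -8.0, -8.0],     # high VCI/POI, low WMI/PSI
                       [-8.0, -8.0, 8.0, 8.0]])    # the reverse pattern
    labels_true = rng.integers(0, 2, 300)
    ability = rng.normal(100.0, 15.0, 300)[:, None]
    profiles = ability + protos[labels_true] + rng.normal(0.0, 2.0, (300, 4))

    # Emphasize shape over magnitude: center each profile at its own mean.
    shape = profiles - profiles.mean(axis=1, keepdims=True)

    # Minimal k-means (k = 2) on the shape vectors.
    centers = shape[rng.choice(len(shape), 2, replace=False)]
    for _ in range(50):
        d = ((shape[:, None, :] - centers[None]) ** 2).sum(axis=2)
        assign = d.argmin(axis=1)
        centers = np.array([shape[assign == k].mean(axis=0) if (assign == k).any()
                            else centers[k] for k in range(2)])

    # Clusters should line up with the generating shapes (up to label swap).
    agreement = max((assign == labels_true).mean(), (assign != labels_true).mean())
    print(agreement)
    ```

    Without the row-centering step, the large ability offsets would dominate the distances and the clusters would sort examinees by overall magnitude rather than by profile pattern.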

  20. Multi-modal image registration: matching MRI with histology

    NASA Astrophysics Data System (ADS)

    Alic, Lejla; Haeck, Joost C.; Klein, Stefan; Bol, Karin; van Tiel, Sandra T.; Wielopolski, Piotr A.; Bijster, Magda; Niessen, Wiro J.; Bernsen, Monique; Veenland, Jifke F.; de Jong, Marion

    2010-03-01

    Spatial correspondence between histology and multi-sequence MRI can provide information about the capabilities of non-invasive imaging to characterize cancerous tissue. However, the shrinkage and deformation occurring during the excision of the tumor and the histological processing complicate the co-registration of MR images with histological sections. This work proposes a methodology to establish a detailed 3D relation between histology sections and in vivo MRI tumor data. The key features of the methodology are a very dense histological sampling (up to 100 histology slices per tumor), mutual-information-based non-rigid B-spline registration, the utilization of the whole 3D data sets, and the exploitation of an intermediate ex vivo MRI. In this proof-of-concept paper, the methodology was applied to one tumor. We found that, after registration, the visual alignment of tumor borders and internal structures was fairly accurate. Utilizing the intermediate ex vivo MRI, it was possible to account for changes caused by the excision of the tumor: we observed a tumor expansion of 20%. The effects of fixation, dehydration and histological sectioning could also be determined: 26% shrinkage of the tumor was found. The annotation of viable tissue, performed in histology and transformed to the in vivo MRI, matched clearly with high-intensity regions in MRI. With this methodology, histological annotation can be directly related to the corresponding in vivo MRI. This is a vital step in evaluating the feasibility of multi-spectral MRI to depict histological ground truth.

  1. Diagnosing Conceptions about the Epistemology of Science: Contributions of a Quantitative Assessment Methodology

    ERIC Educational Resources Information Center

    Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa

    2016-01-01

    This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…

  2. Teachers' Attitude towards Implementation of Learner-Centered Methodology in Science Education in Kenya

    ERIC Educational Resources Information Center

    Ndirangu, Caroline

    2017-01-01

    This study aims to evaluate teachers' attitude towards implementation of learner-centered methodology in science education in Kenya. The study used a survey design methodology, adopting the purposive, stratified random and simple random sampling procedures and hypothesised that there was no significant relationship between the head teachers'…

  3. Interference modelling, experimental design and pre-concentration steps in validation of the Fenton's reagent for pesticides determination.

    PubMed

    Ostra, Miren; Ubide, Carlos; Zuriarrain, Juan

    2007-02-12

    The determination of atrazine in real samples (commercial pesticide preparations and water matrices) shows how the Fenton's reagent can be used for analytical purposes when kinetic methodology and multivariate calibration methods are applied. Binary mixtures of atrazine-alachlor and atrazine-bentazone in pesticide preparations have also been resolved. The work shows how interferences and the matrix effect can be modelled. Experimental design has been used to optimize experimental conditions, including the effect of the solvent (methanol) used for extraction of atrazine from the sample. The determination of pesticides in commercial preparations was accomplished without any pre-treatment of the sample apart from evaporation of the solvent; the calibration model was developed for concentration ranges between 0.46 and 11.6 x 10(-5) mol L(-1) with mean relative errors under 4%. Solid-phase extraction through C(18) disks is used for pre-concentration of atrazine in water samples, and the concentration range for determination was established at approximately 4 to 115 microg L(-1). Satisfactory results for the recovery of atrazine were always obtained.

  4. Associations between characteristics of the nurse work environment and five nurse-sensitive patient outcomes in hospitals: a systematic review of literature.

    PubMed

    Stalpers, Dewi; de Brouwer, Brigitte J M; Kaljouw, Marian J; Schuurmans, Marieke J

    2015-04-01

    To systematically review the literature on relationships between characteristics of the nurse work environment and five nurse-sensitive patient outcomes in hospitals. The search was performed in Medline (PubMed), Cochrane, Embase, and CINAHL. Included were quantitative studies published from 2004 to 2012 that examined associations between work environment and the following patient outcomes: delirium, malnutrition, pain, patient falls and pressure ulcers. The Dutch version of Cochrane's critical appraisal instrument was used to assess the methodological quality of the included studies. Of the initial 1120 studies, 29 were included in the review. Nurse staffing was inversely related to patient falls; more favorable staffing hours were associated with fewer fall incidents. Mixed results were shown for nurse staffing in relation to pressure ulcers. Characteristics of work environment other than nurse staffing that showed significant effects were: (i) collaborative relationships; positively perceived communication between nurses and physicians was associated with fewer patient falls and lower rates of pressure ulcers, (ii) nurse education; higher levels of education were related to fewer patient falls and (iii) nursing experience; lower levels of experience were related to more patient falls and higher rates of pressure ulcers. No eligible studies were found regarding delirium and malnutrition, and only one study found that favorable staffing was related to better pain management. Our findings show that there is evidence on associations between work environment and nurse-sensitive patient outcomes. However, the results are equivocal and studies often do not provide clear conclusions. A quantitative meta-analysis was not feasible due to methodological issues in the primary studies (for example, poorly described samples). The diversity in outcome measures and the majority of cross-sectional designs make quantitative analysis even more difficult. 
In the future, well-described longitudinal research designs will be needed in this field of work environment and nursing quality. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. A novel functionalisation process for glucose oxidase immobilisation in poly(methyl methacrylate) microchannels in a flow system for amperometric determinations.

    PubMed

    Cerqueira, Marcos Rodrigues Facchini; Grasseschi, Daniel; Matos, Renato Camargo; Angnes, Lucio

    2014-08-01

Different materials like glass, silicon and poly(methyl methacrylate) (PMMA) are being used to immobilise enzymes in microchannels. PMMA shows advantages such as its low price, biocompatibility and attractive mechanical and chemical properties. Despite this, the introduction of reactive functional groups on PMMA is still problematic, either because of the complex chemistry or the extended reaction time involved. In this paper, a new methodology was developed to immobilise glucose oxidase (GOx) in PMMA microchannels, with the benefit of a rapid and very simple immobilisation process. The new procedure involves only two steps, based on the reaction of 5.0% (w/w) polyethyleneimine (PEI) with PMMA in a dimethyl sulphoxide medium, followed by the immobilisation of glucose oxidase using a solution containing 100 U of enzyme and 1.0% (v/v) glutaraldehyde. The reactors prepared in this way were evaluated in a flow system with amperometric detection (+0.60 V), based on the oxidation of the H2O2 produced by the reactor. The microreactor proposed here was able to work with high bioconversion at a frequency of 60 samples h(-1), with detection and quantification limits of 0.50 and 1.66 µmol L(-1), respectively. The Michaelis-Menten parameters (Vmax and KM) were calculated as 449 ± 47.7 nmol min(-1) and 7.79 ± 0.98 mmol L(-1), respectively. Statistical evaluations were done to validate the proposed methodology. The content of glucose in natural and commercial coconut water samples was evaluated using the developed method. Comparison with spectrophotometric measurements showed a very good correlation between the two methodologies (t-calculated (0.05, 4) = 1.35).
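As an aside on the reported kinetics, Michaelis-Menten parameters like those quoted above can be recovered from calibration data by a simple linearized fit. The sketch below (plain Python; the substrate points are hypothetical, and only the Vmax and KM values come from the abstract) uses the classic Lineweaver-Burk transformation, not necessarily the fitting procedure the authors employed.

```python
def lineweaver_burk_fit(s_vals, v_vals):
    """Estimate Vmax and Km from 1/v = (Km/Vmax)*(1/s) + 1/Vmax by OLS."""
    xs = [1.0 / s for s in s_vals]
    ys = [1.0 / v for v in v_vals]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    vmax = 1.0 / intercept
    return vmax, slope * vmax  # (Vmax, Km)

VMAX, KM = 449.0, 7.79  # nmol/min and mmol/L, as reported in the abstract
substrate = [0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 40.0]  # mmol/L (hypothetical)
rate = [VMAX * s / (KM + s) for s in substrate]     # noiseless synthetic rates

vmax_est, km_est = lineweaver_burk_fit(substrate, rate)
print(f"Vmax = {vmax_est:.1f} nmol/min, Km = {km_est:.2f} mmol/L")
```

With noiseless synthetic data the linearized fit recovers the generating parameters exactly; with real, noisy rates a nonlinear least-squares fit is usually preferred because the reciprocal transform amplifies error at low rates.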

  6. Use of qualitative methods alongside randomised controlled trials of complex healthcare interventions: methodological study

    PubMed Central

    Glenton, Claire; Oxman, Andrew D

    2009-01-01

    Objective To examine the use of qualitative approaches alongside randomised trials of complex healthcare interventions. Design Review of randomised controlled trials of interventions to change professional practice or the organisation of care. Data sources Systematic sample of 100 trials published in English from the register of the Cochrane Effective Practice and Organisation of Care Review Group. Methods Published and unpublished qualitative studies linked to the randomised controlled trials were identified through database searches and contact with authors. Data were extracted from each study by two reviewers using a standard form. We extracted data describing the randomised controlled trials and qualitative studies, the quality of these studies, and how, if at all, the qualitative and quantitative findings were combined. A narrative synthesis of the findings was done. Results 30 of the 100 trials had associated qualitative work and 19 of these were published studies. 14 qualitative studies were done before the trial, nine during the trial, and four after the trial. 13 studies reported an explicit theoretical basis and 11 specified their methodological approach. Approaches to sampling and data analysis were poorly described. For most cases (n=20) we found no indication of integration of qualitative and quantitative findings at the level of either analysis or interpretation. The quality of the qualitative studies was highly variable. Conclusions Qualitative studies alongside randomised controlled trials remain uncommon, even where relatively complex interventions are being evaluated. Most of the qualitative studies were carried out before or during the trials with few studies used to explain trial results. The findings of the qualitative studies seemed to be poorly integrated with those of the trials and often had major methodological shortcomings. PMID:19744976

  7. Multi-scale characterization of the energy landscape of proteins with application to the C3D/Efb-C complex.

    PubMed

    Haspel, Nurit; Geisbrecht, Brian V; Lambris, John; Kavraki, Lydia

    2010-03-01

We present a novel multi-level methodology to explore and characterize the low energy landscape and the thermodynamics of proteins. Traditional conformational search methods typically explore only a small portion of the conformational space of proteins and are hard to apply to large proteins because of the large number of calculations required. In our multi-scale approach, we first provide an initial characterization of the equilibrium state ensemble of a protein using an efficient computational conformational sampling method. We then enrich the obtained ensemble by performing short Molecular Dynamics (MD) simulations that use selected conformations from the ensemble as starting points. To facilitate the analysis of the results, we project the resulting conformations onto a low-dimensional landscape, allowing us to focus efficiently on important interactions and examine low energy regions. This methodology provides a more extensive sampling of the low energy landscape than an MD simulation starting from a single crystal structure, as it explores multiple trajectories of the protein. It gives a broader view of protein dynamics and can help in understanding complex binding and in improving docking results. In this work, we apply the methodology to provide an extensive characterization of the bound complexes of the C3d fragment of human Complement component C3 with one of its powerful bacterial inhibitors, the inhibitory domain of the Staphylococcus aureus extracellular fibrinogen-binding protein (Efb-C), and with two of its mutants. We characterize several important interactions along the binding interface and define low free energy regions in the three complexes. Proteins 2010. (c) 2009 Wiley-Liss, Inc.
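The projection step described above can be illustrated with a small sketch. The abstract does not name the dimensionality-reduction method used, so the code below assumes a standard principal component analysis, implemented in plain Python via power iteration on a toy "ensemble" whose coordinates stand in for real conformational degrees of freedom.

```python
import math

def pca_project(confs, n_components=2, iters=500):
    """Project conformations (rows of coordinates) onto their top principal
    components, found by power iteration with deflation on the covariance."""
    n, d = len(confs), len(confs[0])
    means = [sum(row[j] for row in confs) / n for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in confs]
    cov = [[sum(x[k][i] * x[k][j] for k in range(n)) / (n - 1)
            for j in range(d)] for i in range(d)]
    comps = []
    for _ in range(n_components):
        v = [1.0] + [0.0] * (d - 1)          # deterministic start vector
        for _ in range(iters):
            w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
            norm = sum(c * c for c in w) ** 0.5
            v = [c / norm for c in w]
        lam = sum(v[i] * cov[i][j] * v[j] for i in range(d) for j in range(d))
        comps.append((lam, v))
        for i in range(d):                   # deflate the found component
            for j in range(d):
                cov[i][j] -= lam * v[i] * v[j]
    proj = [[sum(row[j] * v[j] for j in range(d)) for _, v in comps]
            for row in x]
    return proj, [lam for lam, _ in comps]

# Toy "ensemble": 100 conformations, 4 coordinates, with most variance in the
# first two directions (stand-ins for real Cartesian or internal coordinates).
ensemble = [[3 * math.sin(0.7 * k), 2 * math.cos(1.3 * k),
             0.1 * math.sin(2.1 * k), 0.1 * math.cos(3.3 * k)]
            for k in range(100)]

projection, variances = pca_project(ensemble)
print(len(projection), len(projection[0]))  # 100 2
print(variances[0] > variances[1])          # True: ordered by captured variance
```

Plotting the two projected coordinates against the energy of each conformation gives the kind of low-dimensional landscape the abstract refers to.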

  8. Medical Libraries of the Soviet Union *

    PubMed Central

    Morozov, A.

    1964-01-01

Medical libraries are a part of the Soviet library system. The total number of medical libraries in the country is more than 4,000, with a collection of over 42,000,000 volumes that are used by over a million readers. The State Central Medical Library occupies a special place among these libraries. It carries out the functions of a methodological, bibliographic, and coordinating center. It has a collection of over 1,000,000 units of books and periodicals. All the bibliographic and other work of medical libraries is carried out in support of medical institutions. Libraries prepare special bibliographies of medical literature for publication according to a plan. The libraries also conduct reference and information work. The methodological work helps to solve the most important problems that arise in libraries with reference to the specific character of their work and tasks. The chief means of rendering methodological guidance are to analyze the work of various libraries, to hold conferences, to exchange visits with libraries, to give both field and correspondence consultations, and to organize qualification courses for librarians and bibliographers. PMID:14119299

  9. Mikhail Geraskov (1874-1957): Methodological Concepts of Learning Physics

    ERIC Educational Resources Information Center

    Ilieva, Mariyana

    2014-01-01

Mikhail Geraskov is a distinguished Bulgarian educator from the first half of the twentieth century, who developed the scientific foundations of didactics and the methodology of training. His work contributed greatly to the development of Bulgarian pedagogy. The subject of scientific research is his didactical conceptions and methodological conceptions…

  10. Discourse Analysis and the Study of Educational Leadership

    ERIC Educational Resources Information Center

    Anderson, Gary; Mungal, Angus Shiva

    2015-01-01

    Purpose: The purpose of this paper is to provide an overview of the current and past work using discourse analysis in the field of educational administration and of discourse analysis as a methodology. Design/Methodology/Approach: Authors reviewed research in educational leadership that uses discourse analysis as a methodology. Findings: While…

  11. How have researchers studied multiracial populations? A content and methodological review of 20 years of research.

    PubMed

    Charmaraman, Linda; Woo, Meghan; Quach, Ashley; Erkut, Sumru

    2014-07-01

    The U.S. Census shows that the racial-ethnic makeup of over 9 million people (2.9% of the total population) who self-identified as multiracial is extremely diverse. Each multiracial subgroup has unique social and political histories that may lead to distinct societal perceptions, economic situations, and health outcomes. Despite the increasing academic and media interest in multiracial individuals, there are methodological and definitional challenges in studying the population, resulting in conflicting representations in the literature. This content and methods review of articles on multiracial populations provides a comprehensive understanding of which multiracial populations have been included in research and how they have been studied, both to recognize emerging research and to identify gaps for guiding future research on this complex but increasingly visible population. We examine 125 U.S.-based peer-reviewed journal articles published over the past 20 years (1990 to 2009) containing 133 separate studies focused on multiracial individuals, primarily from the fields of psychology, sociology, social work, education, and public health. Findings include (a) descriptive data regarding the sampling strategies, methodologies, and demographic characteristics of studies, including which multiracial subgroups are most studied, gender, age range, region of country, and socioeconomic status; (b) major thematic trends in research topics concerning multiracial populations; and (c) implications and recommendations for future studies.

  12. [Role of an educational-and-methodological complex in the optimization of teaching at the stage of additional professional education of physicians in the specialty "anesthesiology and reanimatology"].

    PubMed

    Buniatian, A A; Sizova, Zh M; Vyzhigina, M A; Shikh, E V

    2010-01-01

An educational-and-methodological complex (EMC) in the specialty "Anesthesiology and Reanimatology", which promotes manageability, flexibility, and dynamism of the educational process, is of great importance in solving the problems of systematizing knowledge and ensuring its effective assimilation by physicians at the stage of additional professional education (APE). The EMC is a set of educational-and-methodological materials required to organize and conduct an educational process for the advanced training of anesthesiologists and resuscitation specialists at the stage of APE. The EMC includes a syllabus for training in the area "Anesthesiology and Reanimatology" under the appropriate training pattern (certification cycles, topical advanced training cycles); a work program for training in the specialty "Anesthesiology and Reanimatology"; work curricula for training in allied specialties (surgery, traumatology and orthopedics, obstetrics and gynecology, and pediatrics); work programs on basic disciplines (pharmacology, normal and pathological physiology, normal anatomy, chemistry and biology); work programs on the area "Public health care and health care service"; guidelines for the teacher; educational-and-methodological materials for the student; and quiz programs. The central element of the EMC in the specialty "Anesthesiology and Reanimatology" is the work program. Thus, the educational-and-methodological and teaching materials included in the EMC in the specialty "Anesthesiology and Reanimatology" should envisage the logically successive exposition of the teaching material and the use of currently available methods and educational facilities, which facilitates the optimization of the training of anesthesiologists and resuscitation specialists at the stage of APE.

  13. Working conditions and health in Central America: a survey of 12,024 workers in six countries.

    PubMed

    Benavides, Fernando G; Wesseling, Catharina; Delclos, George L; Felknor, Sarah; Pinilla, Javier; Rodrigo, Fernando

    2014-07-01

To describe the survey methodology and initial general findings of the first Central American Survey of Working Conditions and Health. A representative sample of 12,024 workers was interviewed at home in Costa Rica, El Salvador, Guatemala, Honduras, Nicaragua and Panama. Questionnaire items addressed worker demographics, employment conditions, occupational risk factors and self-perceived health. Overall, self-employment (37%) is the most frequent type of employment, 8% of employees lack a work contract and 74% of the workforce is not covered by social security. These percentages are higher in Guatemala, Honduras and El Salvador, and lower in Costa Rica, Panama and Nicaragua. A third of the workforce works more than 48 h per week, regardless of gender; this is similar across countries. Women and men report frequent or usual exposure to high ambient temperature (16% and 25%, respectively), dangerous tools and machinery (10%, 24%), work on slippery surfaces (10%, 23%), breathing chemicals (12.1%, 18%), handling toxic substances (5%, 12.1%), heavy loads (6%, 20%) and repetitive movements (43%, 49%). Two-thirds of the workforce perceive their health as good or very good, and slightly more than half report good mental health. The survey offers, for the first time, comparable data on the work and health status of workers in the formal and informal economy in the six Spanish-speaking Central American countries, based on representative national samples. This provides a benchmark for future monitoring of employment and working conditions across countries. Published by the BMJ Publishing Group Limited.

  14. Revolution in Field Science: Apollo Approach to Inaccessible Surface Exploration

    NASA Astrophysics Data System (ADS)

    Clark, P. E.

    2010-07-01

The extraordinary challenge that mission designers, scientists, and engineers faced in planning the first human expeditions to the surface of another solar system body led to the development of a distinctive, even revolutionary, approach to field work. Not only did those involved have to deal effectively with extremely limited resources and access to a target as remote as the lunar surface; they also had to develop a rigorous approach to science activities ranging from geological field work to deploying field instruments. Principal aspects and keys to the success of the field work are discussed here, including the highly integrated, intensive, and lengthy science planning, simulation, and astronaut training; the development of a systematic scheme for description and documentation of geological sites and samples; and a flexible yet disciplined methodology for site documentation and sample collection. The capability for constant communication with a 'backroom' of geological experts, who made requests and weighed in on surface operations, was innovative and very useful in encouraging rapid dissemination of information to the greater community. An extensive archive of Apollo-era science activity documents provides evidence of these principal aspects and keys to success. The Apollo Surface Journal allows analysis of the astronauts' performance in terms of capability for traveling on foot, documentation and sampling of field stations, and manual operation of tools and instruments, all as a function of time. The application of these analyses as 'lessons learned' for planning the next generation of human or robotic field science activities on the Moon and elsewhere is also considered here.

  15. Organotin speciation in environmental matrices by automated on-line hydride generation-programmed temperature vaporization-capillary gas chromatography-mass spectrometry detection.

    PubMed

    Serra, H; Nogueira, J M F

    2005-11-11

In the present contribution, a new automated on-line hydride generation methodology was developed for dibutyltin and tributyltin speciation at the trace level, using a programmable temperature-vaporizing inlet followed by capillary gas chromatography coupled to mass spectrometry in selected-ion-monitoring acquisition mode (PTV-GC/MS(SIM)). The methodology involves a sequence defined by two running methods, the first configured for hydride generation with sodium tetrahydroborate as the derivatising agent and the second configured for speciation purposes, using a conventional autosampler and data acquisition controlled by the instrument's software. The method-development experiments established that the injector configuration has a great effect on speciation in the present methodology, in particular the initial inlet temperature (-20 degrees C; He: 150 ml/min), the injection volume (2 microl) and the solvent characteristics in solvent venting mode. Under optimized conditions, a remarkable instrumental performance was obtained, including very good precision (RSD < 4%), an excellent linear dynamic range (up to 50 microg/ml) and limits of detection of 0.12 microg/ml and 9 ng/ml for dibutyltin and tributyltin, respectively. The feasibility of the present methodology was validated through assays on in-house spiked water (2 ng/ml) and a certified reference sediment matrix (Community Bureau of Reference, CRM 462, Nr. 330; dibutyltin: 68+/-12 ng/g; tributyltin: 54+/-15 ng/g on a dry mass basis), using liquid-liquid extraction (LLE) and solid-phase extraction (SPE) sample enrichment and multiple injections (2 x 5 microl) for sensitivity enhancement. The methodology showed high reproducibility, a simple work-up and good sensitivity, and proved to be a suitable alternative to the currently dedicated analytical systems for organotin speciation in environmental matrices at the trace level.
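For readers unfamiliar with how detection and quantification limits such as those above are typically derived from a calibration curve, the sketch below applies the common 3·sd/slope and 10·sd/slope criteria to a hypothetical calibration line; the paper's own criterion and data are not reproduced here.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def detection_limits(xs, ys):
    """LOD and LOQ from the calibration residual standard deviation,
    using the common 3*sd/slope and 10*sd/slope criteria."""
    slope, intercept = linear_fit(xs, ys)
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    sd = (sum(r * r for r in resid) / (len(xs) - 2)) ** 0.5
    return 3 * sd / slope, 10 * sd / slope

# Hypothetical calibration: concentration (microg/ml) vs detector response
conc = [1.0, 2.0, 5.0, 10.0, 20.0, 50.0]
resp = [10.4, 19.8, 50.6, 99.5, 201.0, 499.8]

lod, loq = detection_limits(conc, resp)
print(f"LOD = {lod:.3f} microg/ml, LOQ = {loq:.3f} microg/ml")
```

Other conventions (e.g. blank-based 3·sigma of repeated blanks) give different numbers; reporting which criterion was used is part of what makes such figures comparable across papers.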

  16. Considering the Role of Self-Study of Teaching and Teacher Education Practices Research in Transforming Urban Classrooms

    ERIC Educational Resources Information Center

    Hamilton, Mary Lynn; Pinnegar, Stefinee

    2015-01-01

    We explore the first four articles in this Special Issue of "Studying Teacher Education" to identify challenges to the self-study of teaching and teacher education practices (S-STEP) methodology, and how this methodology supports the work of teachers and teacher educators working in urban settings. We respond to these articles by…

  17. Teaching Note--An Exploration of Team-Based Learning and Social Work Education: A Natural Fit

    ERIC Educational Resources Information Center

    Robinson, Michael A.; Robinson, Michelle Bachelor; McCaskill, Gina M.

    2013-01-01

    The literature on team-based learning (TBL) as a pedagogical methodology in social work education is limited; however, TBL, which was developed as a model for business, has been successfully used as a teaching methodology in nursing, business, engineering, medical school, and many other disciplines in academia. This project examines the use of TBL…

  18. Comparison of Methodologies of Activation Barrier Measurements for Reactions with Deactivation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Zhenhua; Yan, Binhang; Zhang, Li

    In this work, methodologies of activation barrier measurements for reactions with deactivation were theoretically analyzed. Reforming of ethane with CO 2 was introduced as an example for reactions with deactivation to experimentally evaluate these methodologies. Both the theoretical and experimental results showed that due to catalyst deactivation, the conventional method would inevitably lead to a much lower activation barrier, compared to the intrinsic value, even though heat and mass transport limitations were excluded. In this work, an optimal method was identified in order to provide a reliable and efficient activation barrier measurement for reactions with deactivation.

  19. Comparison of Methodologies of Activation Barrier Measurements for Reactions with Deactivation

    DOE PAGES

    Xie, Zhenhua; Yan, Binhang; Zhang, Li; ...

    2017-01-25

    In this work, methodologies of activation barrier measurements for reactions with deactivation were theoretically analyzed. Reforming of ethane with CO 2 was introduced as an example for reactions with deactivation to experimentally evaluate these methodologies. Both the theoretical and experimental results showed that due to catalyst deactivation, the conventional method would inevitably lead to a much lower activation barrier, compared to the intrinsic value, even though heat and mass transport limitations were excluded. In this work, an optimal method was identified in order to provide a reliable and efficient activation barrier measurement for reactions with deactivation.
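The deactivation bias described in this abstract is easy to reproduce numerically. The toy simulation below (all values hypothetical, not taken from the paper) measures rates sequentially at increasing temperature while the catalyst loses 20% of its activity between points; the conventional Arrhenius fit then returns an apparent barrier well below the intrinsic one.

```python
import math

R = 8.314          # gas constant, J/(mol K)
EA_TRUE = 100e3    # intrinsic activation barrier, J/mol (hypothetical)
A = 1.0e8          # pre-exponential factor, arbitrary units (hypothetical)

def fit_arrhenius(temps, rates):
    """Least-squares slope of ln(k) vs 1/T; returns Ea = -slope * R."""
    xs = [1.0 / t for t in temps]
    ys = [math.log(k) for k in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope * R

temps = [600.0, 625.0, 650.0, 675.0]          # measured in ascending order
fresh = [A * math.exp(-EA_TRUE / (R * t)) for t in temps]

# Conventional sequential measurement: the catalyst keeps only 80% of its
# activity from one temperature point to the next.
activity, deactivated = 1.0, []
for k in fresh:
    deactivated.append(activity * k)
    activity *= 0.8

ea_ideal = fit_arrhenius(temps, fresh)
ea_apparent = fit_arrhenius(temps, deactivated)
print(f"intrinsic fit: {ea_ideal / 1e3:.1f} kJ/mol")    # recovers ~100 kJ/mol
print(f"apparent fit:  {ea_apparent / 1e3:.1f} kJ/mol")  # substantially lower
```

Because activity loss suppresses the rates measured at the higher temperatures, the ln(k) vs 1/T line flattens and the fitted barrier drops, exactly the direction of bias the authors report for the conventional method.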

  20. An efficient and cost-effective method for preparing transmission electron microscopy samples from powders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wen, Haiming; Lin, Yaojun; Seidman, David N.

The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for the preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well-established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.

  1. Measuring solids concentration in stormwater runoff: comparison of analytical methods.

    PubMed

    Clark, Shirley E; Siu, Christina Y S

    2008-01-15

Stormwater suspended solids typically are quantified using one of two methods: aliquot/subsample analysis (total suspended solids [TSS]) or whole-sample analysis (suspended solids concentration [SSC]). Interproject comparisons are difficult because of inconsistencies in the methods and in their application. To address this concern, the suspended solids content has been measured using both methodologies in many current projects, but the question remains of how to compare these values with historical water-quality data for which the analytical methodology is unknown. This research was undertaken to determine the effect of analytical methodology on the relationship between these two measures of suspended solids concentration, including the effects of the aliquot selection/collection method and of the particle size distribution (PSD). The results showed that SSC best represented the known sample concentration and that its results were independent of the sample's PSD. Correlations between the results and the known sample concentration could be established for TSS samples, but they were highly dependent on the sample's PSD and on the aliquot collection technique. These results emphasize the need to report not only the analytical method but also particle size information for the solids in stormwater runoff.

  2. An efficient and cost-effective method for preparing transmission electron microscopy samples from powders

    DOE PAGES

    Wen, Haiming; Lin, Yaojun; Seidman, David N.; ...

    2015-09-09

The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for the preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well-established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.

  3. CBT Specific Process in Exposure-Based Treatments: Initial Examination in a Pediatric OCD Sample

    PubMed Central

    Benito, Kristen Grabill; Conelea, Christine; Garcia, Abbe M.; Freeman, Jennifer B.

    2012-01-01

Cognitive-behavioral theory and empirical support suggest that optimal activation of fear is a critical component of successful exposure treatment. Using this theory, we developed a coding methodology for measuring CBT-specific process during exposure. We piloted this methodology in a sample of young children (N = 18) who had previously received CBT as part of a randomized controlled trial. Results supported the preliminary reliability and predictive validity of the coding variables with 12-week and 3-month treatment outcome data, generally showing results consistent with CBT theory. However, given our limited and restricted sample, additional testing is warranted. Measurement of CBT-specific process using this methodology may have implications for understanding mechanisms of change in exposure-based treatments and for improving dissemination efforts through identification of therapist behaviors associated with improved outcome. PMID:22523609

  4. Polyphenols excreted in urine as biomarkers of total polyphenol intake.

    PubMed

    Medina-Remón, Alexander; Tresserra-Rimbau, Anna; Arranz, Sara; Estruch, Ramón; Lamuela-Raventos, Rosa M

    2012-11-01

    Nutritional biomarkers have several advantages in acquiring data for epidemiological and clinical studies over traditional dietary assessment tools, such as food frequency questionnaires. While food frequency questionnaires constitute a subjective methodology, biomarkers can provide a less biased and more accurate measure of specific nutritional intake. A precise estimation of polyphenol consumption requires blood or urine sample biomarkers, although their association is usually highly complex. This article reviews recent research on urinary polyphenols as potential biomarkers of polyphenol intake, focusing on clinical and epidemiological studies. We also report a potentially useful methodology to assess total polyphenols in urine samples, which allows a rapid, simultaneous determination of total phenols in a large number of samples. This methodology can be applied in studies evaluating the utility of urinary polyphenols as markers of polyphenol intake, bioavailability and accumulation in the body.

  5. Quantification by SEM-EDS in uncoated non-conducting samples

    NASA Astrophysics Data System (ADS)

    Galván Josa, V.; Castellano, G.; Bertolino, S. R.

    2013-07-01

An approach to performing elemental quantitative analysis in a conventional scanning electron microscope with an energy-dispersive spectrometer has been developed for non-conductive samples in which a conductive coating should be avoided. Charge accumulation effects, which effectively decrease the energy of the primary beam, were taken into account by means of the Duane-Hunt limit. This value represents the maximum energy of the continuum X-ray spectrum and is related to the effective energy of the incident electron beam. To validate the results obtained by this procedure, a non-conductive sample of known composition was quantified without a conductive coating. Complementarily, changes in the X-ray spectrum due to charge accumulation effects were studied by Monte Carlo simulations, comparing relative characteristic intensities as a function of the incident energy. The methodology is exemplified here by obtaining the chemical composition of white and reddish archaeological pigments belonging to the Ambato style of the "Aguada" culture (Catamarca, Argentina, 500-1100 AD). The results obtained in this work show that the quantification procedure taking the Duane-Hunt limit into account is suitable for this kind of sample. The approach may be recommended for the quantification of samples for which coating is not desirable, such as ancient artwork, forensic or archaeological samples, or when the coating element is also present in the sample.
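To make the Duane-Hunt idea concrete: the limit is simply the highest photon energy present in the continuum, which tracks the effective (charge-reduced) beam energy. The sketch below builds a synthetic Kramers-type bremsstrahlung spectrum and reads the limit off it; a real measured spectrum would additionally require noise and background handling when locating the cutoff.

```python
def kramers_spectrum(e0, channels):
    """Synthetic bremsstrahlung continuum: intensity ~ (E0 - E)/E below E0."""
    return [max((e0 - e) / e, 0.0) for e in channels]

def duane_hunt_limit(channels, counts, threshold=0.0):
    """Highest channel energy with counts above threshold."""
    return max(e for e, c in zip(channels, counts) if c > threshold)

# Channel energies in keV (10 eV steps). The nominal beam energy is 20 keV,
# but charging has reduced the effective incident energy to 17.5 keV.
channels = [0.5 + 0.01 * i for i in range(2500)]
counts = kramers_spectrum(17.5, channels)

e_eff = duane_hunt_limit(channels, counts)
print(f"Duane-Hunt limit ~ {e_eff:.2f} keV")  # tracks the effective beam energy
```

Using this recovered effective energy, rather than the nominal accelerating voltage, as the input to the quantification model is the essence of the correction described in the abstract.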

  6. Prediction of physical workload in reduced gravity environments

    NASA Technical Reports Server (NTRS)

    Goldberg, Joseph H.

    1987-01-01

    The background, development, and application of a methodology to predict human energy expenditure and physical workload in low gravity environments, such as a Lunar or Martian base, is described. Based on a validated model to predict energy expenditures in Earth-based industrial jobs, the model relies on an elemental analysis of the proposed job. Because the job itself need not physically exist, many alternative job designs may be compared in their physical workload. The feasibility of using the model for prediction of low gravity work was evaluated by lowering body and load weights, while maintaining basal energy expenditure. Comparison of model results was made both with simulated low gravity energy expenditure studies and with reported Apollo 14 Lunar EVA expenditure. Prediction accuracy was very good for walking and for cart pulling on slopes less than 15 deg, but the model underpredicted the most difficult work conditions. This model was applied to example core sampling and facility construction jobs, as presently conceptualized for a Lunar or Martian base. Resultant energy expenditures and suggested work-rest cycles were well within the range of moderate work difficulty. Future model development requirements were also discussed.

  7. Novel measurement techniques (development and analysis of silicon solar cells near 20% efficiency)

    NASA Technical Reports Server (NTRS)

    Wolf, M.; Newhouse, M.

    1986-01-01

Work in identifying, developing, and analyzing techniques for measuring bulk recombination rates, and surface recombination velocities and rates, in all regions of high-efficiency silicon solar cells is presented. The accuracy of the previously developed DC measurement system was improved by adding blocked interference filters. The system was further automated by writing software that completely samples the unknown solar cell regions with data for numerous recombination velocity and lifetime pairs. The results can be displayed in three dimensions, and the best fit can be found numerically using the simplex minimization algorithm. A theoretical methodology to analyze and compare existing dynamic measurement techniques is also described.

  8. Analyzing speckle contrast for HiLo microscopy optimization.

    PubMed

    Mazzaferri, J; Kunik, D; Belisle, J M; Singh, K; Lefrançois, S; Costantino, S

    2011-07-18

    HiLo microscopy is a recently developed technique that provides both optical sectioning and fast imaging with a simple implementation and at very low cost. The methodology combines widefield and speckled illumination images to obtain one optically sectioned image. Hence, the characteristics of the speckle illumination ultimately determine the quality of HiLo images and the overall performance of the method. In this work, we study how speckle contrast influences local variations of fluorescence intensity and brightness profiles of thick samples. We present this article as a guide to adjusting the parameters of the system for optimizing the capabilities of this novel technology.
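    Speckle contrast is conventionally defined as C = σ_I/⟨I⟩, the ratio of the standard deviation to the mean of the intensity. Fully developed speckle has exponentially distributed intensity and C ≈ 1, while any blurring (e.g., out-of-focus light) lowers C, which is the property HiLo exploits to reject background. A minimal sketch with synthetic intensities standing in for camera data:

```python
import random
from statistics import mean, pstdev

def speckle_contrast(intensities):
    """C = sigma_I / <I>; fully developed speckle gives C ~ 1."""
    return pstdev(intensities) / mean(intensities)

random.seed(42)

# Fully developed speckle: intensity is exponentially distributed.
speckle = [random.expovariate(1.0) for _ in range(20000)]

# Defocus stand-in: a moving average washes out the speckle grain.
window = 10
blurred = [mean(speckle[i:i + window]) for i in range(len(speckle) - window)]

c_focus = speckle_contrast(speckle)    # close to 1
c_defocus = speckle_contrast(blurred)  # substantially lower (~1/sqrt(window))
```

    The in-focus/out-of-focus contrast gap is what the optimization described in the abstract tries to maximize.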

  9. Analyzing speckle contrast for HiLo microscopy optimization

    NASA Astrophysics Data System (ADS)

    Mazzaferri, J.; Kunik, D.; Belisle, J. M.; Singh, K.; Lefrançois, S.; Costantino, S.

    2011-07-01

    HiLo microscopy is a recently developed technique that provides both optical sectioning and fast imaging with a simple implementation and at very low cost. The methodology combines widefield and speckled illumination images to obtain one optically sectioned image. Hence, the characteristics of the speckle illumination ultimately determine the quality of HiLo images and the overall performance of the method. In this work, we study how speckle contrast influences local variations of fluorescence intensity and brightness profiles of thick samples. We present this article as a guide to adjusting the parameters of the system for optimizing the capabilities of this novel technology.

  10. Novel measurement techniques (development and analysis of silicon solar cells near 20% efficiency)

    NASA Astrophysics Data System (ADS)

    Wolf, M.; Newhouse, M.

    Work in identifying, developing, and analyzing techniques for measuring bulk recombination rates, and surface recombination velocities and rates in all regions of high-efficiency silicon solar cells is presented. The accuracy of the previously developed DC measurement system was improved by adding blocked interference filters. The system was further automated with software that exhaustively samples the unknown solar cell regions over numerous recombination velocity and lifetime pairs. The results can be displayed in three dimensions, and the best fit can be found numerically using the simplex minimization algorithm. Also described is a theoretical methodology for analyzing and comparing existing dynamic measurement techniques.

  11. Multiplex mass spectrometry imaging for latent fingerprints.

    PubMed

    Yagnik, Gargey B; Korte, Andrew R; Lee, Young Jin

    2013-01-01

    We have previously developed in-parallel data acquisition of orbitrap mass spectrometry (MS) and ion trap MS and/or MS/MS scans for matrix-assisted laser desorption/ionization MS imaging (MSI) to obtain rich chemical information in less data acquisition time. In the present study, we demonstrate a novel application of this multiplex MSI methodology for latent fingerprints. In a single imaging experiment, we could obtain chemical images of various endogenous and exogenous compounds, along with simultaneous MS/MS images of a few selected compounds. This work confirms the usefulness of multiplex MSI to explore chemical markers when the sample specimen is very limited. Copyright © 2013 John Wiley & Sons, Ltd.

  12. Dietary exposure to trace elements and radionuclides: the methodology of the Italian Total Diet Study 2012-2014.

    PubMed

    D'Amato, Marilena; Turrini, Aida; Aureli, Federica; Moracci, Gabriele; Raggi, Andrea; Chiaravalle, Eugenio; Mangiacotti, Michele; Cenci, Telemaco; Orletti, Roberta; Candela, Loredana; di Sandro, Alessandra; Cubadda, Francesco

    2013-01-01

    This article presents the methodology of the Italian Total Diet Study 2012-2014 aimed at assessing the dietary exposure of the general Italian population to selected nonessential trace elements (Al, inorganic As, Cd, Pb, methyl-Hg, inorganic Hg, U) and radionuclides (40K, 134Cs, 137Cs, 90Sr). The establishment of the TDS food list, the design of the sampling plan, and details about the collection of food samples, their standardized culinary treatment, pooling into analytical samples and subsequent sample treatment are described. Analytical techniques and quality assurance are discussed, with emphasis on the need for speciation data and for minimizing the percentage of left-censored data so as to reduce uncertainties in exposure assessment. Finally the methodology for estimating the exposure of the general population and of population subgroups according to age (children, teenagers, adults, and the elderly) and gender, both at the national level and for each of the four main geographical areas of Italy, is presented.
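    The exposure arithmetic behind a total diet study is a concentration × consumption sum over the food list, divided by body weight; left-censored results (below the limit of detection, LOD) are bounded by substituting 0 (lower bound) or the LOD (upper bound), which is why the abstract stresses minimizing censored data. A sketch with hypothetical foods and values:

```python
def dietary_exposure(records, body_weight_kg, censored="lower"):
    """Daily dietary exposure in ug per kg body weight per day.

    records: tuples (concentration ug/kg or None if < LOD, LOD ug/kg,
             consumption kg/day).
    censored: 'lower' substitutes 0 for < LOD results, 'upper' the LOD.
    """
    total = 0.0
    for conc, lod, consumption in records:
        if conc is None:  # left-censored measurement
            conc = 0.0 if censored == "lower" else lod
        total += conc * consumption
    return total / body_weight_kg

# Hypothetical cadmium results for three pooled analytical samples.
foods = [
    (12.0, 1.0, 0.180),   # bread
    (None, 0.5, 0.120),   # milk, below the LOD
    (35.0, 1.0, 0.025),   # shellfish
]
lb = dietary_exposure(foods, 70.0, censored="lower")
ub = dietary_exposure(foods, 70.0, censored="upper")
```

    The spread between `lb` and `ub` is exactly the uncertainty contribution of left-censored data that the study design tries to reduce.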

  13. Viscoelastic characterization of soft biological materials

    NASA Astrophysics Data System (ADS)

    Nayar, Vinod Timothy

    Progressive and irreversible retinal diseases are among the primary causes of blindness in the United States, attacking the cells in the eye that transform environmental light into neural signals for the optic pathway. Medical implants designed to restore visual function to afflicted patients can cause mechanical stress and ultimately damage to the host tissues. Research shows that an accurate understanding of the mechanical properties of biological tissues can reduce damage and lead to designs with improved safety and efficacy. Prior studies on the mechanical properties of biological tissues show that characterization of these materials can be affected by environmental, length-scale, time, mounting, stiffness, size, viscoelastic, and methodological conditions. Using porcine sclera tissue, the effects of environmental, time, and mounting conditions are evaluated when using nanoindentation. Quasi-static tests are used to measure reduced modulus during extended exposure to phosphate-buffered saline (PBS), along with chemical and mechanical analysis of mounting the sample to a solid substrate using cyanoacrylate. The less destructive nature of nanoindentation tests allows the variance of tests within a single sample to be compared to the variance between samples. The results indicate that the environmental, time, and mounting conditions can be controlled for with modified nanoindentation procedures for biological samples, and that the measured moduli are in line with average modulus values from previous studies but with increased precision. By using the quasi-static and dynamic characterization capabilities of the nanoindentation setup, the additional stiffness and viscoelastic variables are measured. Different quasi-static control methods were evaluated along with maximum load parameters and produced no significant difference in reported reduced modulus values. 
Dynamic characterization tests varied frequency and quasi-static load, showing that the agar could be modeled as a linearly elastic material. The effects of sample stiffness were evaluated by testing both the quasi-static and dynamic mechanical properties of different concentration agar samples, ranging from 0.5% to 5.0%. The dynamic nanoindentation protocol showed some sensitivity to sample stiffness, but characterization remained consistently applicable to soft biological materials. Comparative experiments were performed on both 0.5% and 5.0% agar as well as porcine eye tissue samples using published dynamic macrocompression standards. By comparing these new tests to those obtained with nanoindentation, the effects due to length-scale, stiffness, size, viscoelastic, and methodological conditions are evaluated. Both testing methodologies can be adapted for the environmental and mounting conditions, but the limitations of standardized macro-scale tests are explored. The factors affecting mechanical characterization of soft and thin viscoelastic biological materials are researched and a comprehensive protocol is presented. This work produces material mechanical properties for use in improving future medical implant designs on a wide variety of biological tissue and materials.
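    The reduced modulus reported by nanoindentation follows the standard Oliver–Pharr analysis: E_r = (√π/2β)·S/√A_c, where S is the unloading contact stiffness and A_c the projected contact area; the sample modulus is then recovered from 1/E_r = (1-ν_s²)/E_s + (1-ν_i²)/E_i. The sketch below uses a diamond indenter (E_i ≈ 1141 GPa, ν_i ≈ 0.07) and hypothetical soft-tissue numbers, not values from this work:

```python
import math

def reduced_modulus(stiffness, contact_area, beta=1.0):
    """Oliver-Pharr: E_r = sqrt(pi) / (2 * beta) * S / sqrt(A_c), SI units."""
    return math.sqrt(math.pi) / (2.0 * beta) * stiffness / math.sqrt(contact_area)

def sample_modulus(e_r, nu_sample, e_indenter=1141e9, nu_indenter=0.07):
    """Invert 1/E_r = (1 - nu_s^2)/E_s + (1 - nu_i^2)/E_i (diamond defaults)."""
    inv_sample_term = 1.0 / e_r - (1.0 - nu_indenter ** 2) / e_indenter
    return (1.0 - nu_sample ** 2) / inv_sample_term

# Hypothetical soft-tissue scale: ~1 MPa reduced modulus, nearly incompressible.
e_r = 1.0e6
e_s = sample_modulus(e_r, nu_sample=0.5)  # ~0.75 MPa; the indenter term is negligible
```

    For soft biological materials E_r is many orders below E_i, so E_s ≈ (1-ν_s²)·E_r, which is why indenter compliance barely matters at this scale.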

  14. Behavioral networks as a model for intelligent agents

    NASA Technical Reports Server (NTRS)

    Sliwa, Nancy E.

    1990-01-01

    Ongoing work at NASA Langley Research Center in the development and demonstration of a paradigm called behavioral networks as an architecture for intelligent agents is described. This work focuses on the need to identify a methodology for smoothly integrating the characteristics of low-level robotic behavior, including actuation and sensing, with intelligent activities such as planning, scheduling, and learning. This work assumes that all these needs can be met within a single methodology, and attempts to formalize this methodology in a connectionist architecture called behavioral networks. Behavioral networks are networks of task processes arranged in a task decomposition hierarchy. These processes are connected both by command/feedback data flow and by the forward and reverse propagation of weights which measure the dynamic utility of actions and beliefs.

  15. A Microfluidic Interface for the Culture and Sampling of Adiponectin from Primary Adipocytes

    PubMed Central

    Godwin, Leah A.; Brooks, Jessica C.; Hoepfner, Lauren D.; Wanders, Desiree; Judd, Robert L.; Easley, Christopher J.

    2014-01-01

    Secreted from adipose tissue, adiponectin is a vital endocrine hormone that acts in glucose metabolism, thereby establishing its crucial role in diabetes, obesity, and other metabolic disease states. Insulin exposure to primary adipocytes cultured in static conditions has been shown to stimulate adiponectin secretion. However, conventional, static methodology for culturing and stimulating adipocytes falls short of truly mimicking physiological environments. Along with decreases in experimental costs and sample volume, and increased temporal resolution, microfluidic platforms permit small-volume flowing cell culture systems, which more accurately represent the constant flow conditions through vasculature in vivo. Here, we have integrated a customized primary tissue culture reservoir into a passively operated microfluidic device made of polydimethylsiloxane (PDMS). Fabrication of the reservoir was accomplished through unique PDMS “landscaping” above sampling channels, with a design strategy targeted to primary adipocytes to overcome issues of positive cell buoyancy. This reservoir allowed three-dimensional culture of primary murine adipocytes, accurate control over stimulants via constant perfusion, and sampling of adipokine secretion during various treatments. As the first report of primary adipocyte culture and sampling within microfluidic systems, this work sets the stage for future studies in adipokine secretion dynamics. PMID:25423362

  16. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions. II. A simplified implementation.

    PubMed

    Tao, Guohua; Miller, William H

    2012-09-28

    An efficient time-dependent (TD) Monte Carlo (MC) importance sampling method has recently been developed [G. Tao and W. H. Miller, J. Chem. Phys. 135, 024104 (2011)] for the evaluation of time correlation functions using the semiclassical (SC) initial value representation (IVR) methodology. In this TD-SC-IVR method, the MC sampling uses information from both time-evolved phase points as well as their initial values, and only the "important" trajectories are sampled frequently. Even though the TD-SC-IVR was shown in some benchmark examples to be much more efficient than the traditional time-independent sampling method (which uses only initial conditions), the calculation of the SC prefactor, which is computationally expensive, especially for large systems, is still required for accepted trajectories. In the present work, we present an approximate implementation of the TD-SC-IVR method that is completely prefactor-free; it gives the time correlation function as a classical-like magnitude function multiplied by a phase function. Application of this approach to flux-flux correlation functions (which yield reaction rate constants) for the benchmark H + H2 system shows very good agreement with exact quantum results. Limitations of the approximate approach are also discussed.
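    The variance payoff of importance sampling itself, independent of the semiclassical machinery, can be seen in a small sketch: estimate a sharply peaked one-dimensional integral by drawing points where the integrand is large and reweighting by the sampling density. The integrand and the Gaussian proposal below are illustrative, not the paper's:

```python
import math
import random
from statistics import pstdev

random.seed(7)

def g(x):
    """Sharply peaked integrand; exact integral over the line is sqrt(pi/100)."""
    return math.exp(-100.0 * (x - 0.5) ** 2)

EXACT = math.sqrt(math.pi / 100.0)  # ~0.17725
N = 50000

# Plain Monte Carlo: uniform samples on [0, 1] mostly miss the peak.
uniform_vals = [g(random.random()) for _ in range(N)]

# Importance sampling: draw from N(0.5, 0.07), reweight by the Gaussian pdf.
MU, SIGMA = 0.5, 0.07
def q_pdf(x):
    return math.exp(-(x - MU) ** 2 / (2 * SIGMA ** 2)) / (SIGMA * math.sqrt(2 * math.pi))

is_vals = [g(x) / q_pdf(x) for x in (random.gauss(MU, SIGMA) for _ in range(N))]

mc_estimate = sum(uniform_vals) / N
is_estimate = sum(is_vals) / N
mc_spread = pstdev(uniform_vals)   # large: most samples contribute ~0
is_spread = pstdev(is_vals)        # small: weights are nearly constant
```

    Concentrating samples on the "important" region shrinks the estimator spread by orders of magnitude at the same cost, which is the motivation for sampling only important trajectories in the TD-SC-IVR scheme.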

  17. Optimal spatio-temporal design of water quality monitoring networks for reservoirs: Application of the concept of value of information

    NASA Astrophysics Data System (ADS)

    Maymandi, Nahal; Kerachian, Reza; Nikoo, Mohammad Reza

    2018-03-01

    This paper presents a new methodology for optimizing Water Quality Monitoring (WQM) networks of reservoirs and lakes using the concept of the value of information (VOI) and utilizing results of a calibrated numerical water quality simulation model. With reference to value of information theory, the water quality at each checkpoint, with a specific prior probability, varies in time. After analyzing water quality samples taken from potential monitoring points, the posterior probabilities are updated using Bayes' theorem, and the VOI of the samples is calculated. In the next step, the stations with maximum VOI are selected as optimal stations. This process is repeated for each sampling interval to obtain optimal monitoring network locations for each interval. The results of the proposed VOI-based methodology are compared with those obtained using an entropy-theoretic approach. As the results of the two methodologies can be partially different, they are then combined using a weighting method. Finally, the optimal sampling interval and locations of WQM stations are chosen using the Evidential Reasoning (ER) decision-making method. The efficiency and applicability of the methodology are evaluated using available water quantity and quality data of the Karkheh Reservoir in southwestern Iran.
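    The preposterior logic described above can be sketched for a single checkpoint with two states ("polluted"/"clean"), a binary sample outcome, and a management payoff table; all numbers and names are illustrative, not the paper's calibration. VOI is the expected value of acting on the posterior minus the value of acting on the prior alone, so an uninformative station has VOI 0:

```python
def posterior(prior, likelihood, outcome):
    """Bayes' theorem: P(state | outcome) is proportional to P(outcome | state) * P(state)."""
    unnorm = {s: prior[s] * likelihood[s][outcome] for s in prior}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

def value_of_information(prior, likelihood, payoff):
    states = list(prior)
    actions = list(payoff)
    outcomes = list(likelihood[states[0]])
    # Best expected payoff acting on the prior alone.
    ev_prior = max(sum(prior[s] * payoff[a][s] for s in states) for a in actions)
    # Expected best payoff after observing the sample outcome.
    ev_sample = 0.0
    for o in outcomes:
        p_o = sum(prior[s] * likelihood[s][o] for s in states)
        if p_o == 0.0:
            continue
        post = posterior(prior, likelihood, o)
        ev_sample += p_o * max(sum(post[s] * payoff[a][s] for s in states)
                               for a in actions)
    return ev_sample - ev_prior

prior = {"polluted": 0.3, "clean": 0.7}
payoff = {  # management action -> payoff under each true state
    "treat":  {"polluted": 80.0, "clean": 40.0},
    "ignore": {"polluted": 0.0,  "clean": 100.0},
}
good_station = {"polluted": {"pos": 0.9, "neg": 0.1},
                "clean":    {"pos": 0.1, "neg": 0.9}}
blind_station = {"polluted": {"pos": 0.5, "neg": 0.5},
                 "clean":    {"pos": 0.5, "neg": 0.5}}

voi_good = value_of_information(prior, good_station, payoff)
voi_blind = value_of_information(prior, blind_station, payoff)
```

    Ranking candidate stations by this quantity, interval by interval, is the selection rule the methodology applies.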

  18. Quantification and micron-scale imaging of spatial distribution of trace beryllium in shrapnel fragments and metallurgic samples with correlative fluorescence detection method and secondary ion mass spectrometry (SIMS)

    PubMed Central

    Abraham, Jerrold L.; Chandra, Subhash; Agrawal, Anoop

    2014-01-01

    Recently, a report raised the possibility of shrapnel-induced chronic beryllium disease (CBD) from long-term exposure to the surface of retained aluminum shrapnel fragments in the body. Since the shrapnel fragments contained trace beryllium, methodological developments were needed for beryllium quantification and to study its spatial distribution in relation to other matrix elements, such as aluminum and iron, in metallurgic samples. In this work, we developed methodology for quantification of trace beryllium in samples of shrapnel fragments and other metallurgic sample-types with main matrix of aluminum (aluminum cans from soda, beer, carbonated water, and aluminum foil). Sample preparation procedures were developed for dissolving beryllium for its quantification with the fluorescence detection method for homogenized measurements. The spatial distribution of trace beryllium on the sample surface and in 3D was imaged with a dynamic secondary ion mass spectrometry (SIMS) instrument, CAMECA IMS 3f SIMS ion microscope. The beryllium content of shrapnel (~100 ppb) was the same as the trace quantities of beryllium found in aluminum cans. The beryllium content of aluminum foil (~25 ppb) was significantly lower than cans. SIMS imaging analysis revealed beryllium to be distributed in the form of low micron-sized particles and clusters distributed randomly in X-Y-and Z dimensions, and often in association with iron, in the main aluminum matrix of cans. These observations indicate a plausible formation of Be-Fe or Al-Be alloy in the matrix of cans. Further observations were made on fluids (carbonated water) for understanding if trace beryllium in cans leached out and contaminated the food product. A direct comparison of carbonated water in aluminum cans and plastic bottles revealed that beryllium was below the detection limits of the fluorescence detection method (~0.01 ppb). 
These observations indicate that beryllium present in aluminum matrix was either present in an immobile form or its mobilization into the food product was prevented by a polymer coating on the inside of cans, a practice used in food industry to prevent contamination of food products. The lack of such coating in retained shrapnel fragments renders their surface a possible source of contamination for long-term exposure of tissues and fluids and induction of disease, as characterized in a recent study. Methodological developments reported here can be extended to studies of beryllium in electronics devices and components. PMID:25146877

  19. Quantification and micron-scale imaging of spatial distribution of trace beryllium in shrapnel fragments and metallurgic samples with correlative fluorescence detection method and secondary ion mass spectrometry (SIMS).

    PubMed

    Abraham, J L; Chandra, S; Agrawal, A

    2014-11-01

    Recently, a report raised the possibility of shrapnel-induced chronic beryllium disease from long-term exposure to the surface of retained aluminum shrapnel fragments in the body. Since the shrapnel fragments contained trace beryllium, methodological developments were needed for beryllium quantification and to study its spatial distribution in relation to other matrix elements, such as aluminum and iron, in metallurgic samples. In this work, we developed methodology for quantification of trace beryllium in samples of shrapnel fragments and other metallurgic sample-types with main matrix of aluminum (aluminum cans from soda, beer, carbonated water and aluminum foil). Sample preparation procedures were developed for dissolving beryllium for its quantification with the fluorescence detection method for homogenized measurements. The spatial distribution of trace beryllium on the sample surface and in 3D was imaged with a dynamic secondary ion mass spectrometry instrument, CAMECA IMS 3f secondary ion mass spectrometry ion microscope. The beryllium content of shrapnel (∼100 ppb) was the same as the trace quantities of beryllium found in aluminum cans. The beryllium content of aluminum foil (∼25 ppb) was significantly lower than cans. SIMS imaging analysis revealed beryllium to be distributed in the form of low micron-sized particles and clusters distributed randomly in X-Y- and Z dimensions, and often in association with iron, in the main aluminum matrix of cans. These observations indicate a plausible formation of Be-Fe or Al-Be alloy in the matrix of cans. Further observations were made on fluids (carbonated water) for understanding if trace beryllium in cans leached out and contaminated the food product. A direct comparison of carbonated water in aluminum cans and plastic bottles revealed that beryllium was below the detection limits of the fluorescence detection method (∼0.01 ppb). 
These observations indicate that beryllium present in aluminum matrix was either present in an immobile form or its mobilization into the food product was prevented by a polymer coating on the inside of cans, a practice used in food industry to prevent contamination of food products. The lack of such coating in retained shrapnel fragments renders their surface a possible source of contamination for long-term exposure of tissues and fluids and induction of disease, as characterized in a recent study. Methodological developments reported here can be extended to studies of beryllium in electronics devices and components. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.

  20. ARAMIS project: a comprehensive methodology for the identification of reference accident scenarios in process industries.

    PubMed

    Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno

    2006-03-31

    In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the construction of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called the methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called the "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage facility.
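    The risk-matrix selection step can be sketched as a lookup crossing a frequency class with a consequence class; the class boundaries, labels, and scenario names below are illustrative, not the ARAMIS calibration:

```python
# Rows: frequency class 1 (rare) .. 4 (frequent); columns: consequence class 1..4.
RISK_MATRIX = [
    ["low",    "low",    "medium", "high"],
    ["low",    "medium", "medium", "high"],
    ["medium", "medium", "high",   "high"],
    ["medium", "high",   "high",   "high"],
]

def risk_level(frequency_class, consequence_class):
    """Classify one accident scenario by crossing its two classes."""
    return RISK_MATRIX[frequency_class - 1][consequence_class - 1]

def reference_scenarios(scenarios):
    """Keep scenarios whose (frequency, consequence) pair lands in a 'high' cell."""
    return [name for name, f, c in scenarios if risk_level(f, c) == "high"]

# Hypothetical MIRAS output: (scenario, frequency class, consequence class).
scenarios = [("tank rupture", 1, 4), ("small leak", 4, 1), ("flash fire", 3, 3)]
refs = reference_scenarios(scenarios)
```

    Low-frequency/high-consequence and mid/mid scenarios survive as reference scenarios, while frequent-but-benign events are screened out, which is the intended filtering effect of the matrix.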

  1. Fusion And Inference From Multiple And Massive Disparate Distributed Dynamic Data Sets

    DTIC Science & Technology

    2017-07-01

    principled methodology for two-sample graph testing; designed a provably almost-surely perfect vertex clustering algorithm for block model graphs; proved... Semi-Supervised Clustering Methodology... Robust Hypothesis Testing... dimensional Euclidean space allows the full arsenal of statistical and machine learning methodology for multivariate Euclidean data to be deployed for

  2. Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments

    DTIC Science & Technology

    2016-03-24

    NUWC-NPT Technical Report 12,186, 24 March 2016. Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments. Author: Anthony B... Report sections include Measurements and Experimental Apparatus and Sample Preparation.

  3. 77 FR 6971 - Establishment of User Fees for Filovirus Testing of Nonhuman Primate Liver Samples

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-10

    ... (ELISA) or other appropriate methodology. Each specimen will be held for six months. After six months.../CDC's analysis of costs to the Government is based on the current methodology (ELISA) used to test NHP... different methodology or changes in the availability of ELISA reagents will affect the amount of the user...

  4. Conventionalism and Methodological Standards in Contending with Skepticism about Uncertainty

    NASA Astrophysics Data System (ADS)

    Brumble, K. C.

    2012-12-01

    What it means to measure and interpret confidence and uncertainty in a result is often particular to a specific scientific community and its methodology of verification. Additionally, methodology in the sciences varies greatly across disciplines and scientific communities. Understanding the accuracy of predictions of a particular science thus depends largely upon having an intimate working knowledge of the methods, standards, and conventions utilized and underpinning discoveries in that scientific field. Thus, valid criticism of scientific predictions and discoveries must be conducted by those who are literate in the field in question: they must have intimate working knowledge of the methods of the particular community and of the particular research under question. The interpretation and acceptance of uncertainty is one such shared, community-based convention. In the philosophy of science, this methodological and community-based way of understanding scientific work is referred to as conventionalism. By applying the conventionalism of historian and philosopher of science Thomas Kuhn to recent attacks upon methods of multi-proxy mean temperature reconstructions, I hope to illuminate how climate skeptics and their adherents fail to appreciate the need for community-based fluency in the methodological standards for understanding uncertainty shared by the wider climate science community. Further, I will flesh out a picture of climate science community standards of evidence and statistical argument following the work of philosopher of science Helen Longino. I will describe how failure to appreciate the conventions of professionalism and standards of evidence accepted in the climate science community results in the application of naïve falsification criteria. 
Appeal to naïve falsification has in turn allowed scientists outside the standards and conventions of the mainstream climate science community to consider themselves, and to be judged by climate skeptics, as valid critics of particular statistical reconstructions, armed only with naïve and misapplied methodological criticism. Examples will include the skeptical responses to multi-proxy mean temperature reconstructions and the congressional hearings criticizing Michael Mann et al.'s "hockey stick" reconstruction.

  5. Ultrasensitive Genotypic Detection of Antiviral Resistance in Hepatitis B Virus Clinical Isolates▿ †

    PubMed Central

    Fang, Jie; Wichroski, Michael J.; Levine, Steven M.; Baldick, Carl J.; Mazzucco, Charles E.; Walsh, Ann W.; Kienzle, Bernadette K.; Rose, Ronald E.; Pokornowski, Kevin A.; Colonno, Richard J.; Tenney, Daniel J.

    2009-01-01

    Amino acid substitutions that confer reduced susceptibility to antivirals arise spontaneously through error-prone viral polymerases and are selected as a result of antiviral therapy. Resistance substitutions first emerge in a fraction of the circulating virus population, below the limit of detection by nucleotide sequencing of either the population or limited sets of cloned isolates. These variants can expand under drug pressure to dominate the circulating virus population. To enhance detection of these viruses in clinical samples, we established a highly sensitive quantitative, real-time allele-specific PCR assay for hepatitis B virus (HBV) DNA. Sensitivity was accomplished using a high-fidelity DNA polymerase and oligonucleotide primers containing locked nucleic acid bases. Quantitative measurement of resistant and wild-type variants was accomplished using sequence-matched standards. Detection methodology that was not reliant on hybridization probes, and assay modifications, minimized the effect of patient-specific sequence polymorphisms. The method was validated using samples from patients chronically infected with HBV through parallel sequencing of large numbers of cloned isolates. Viruses with resistance to lamivudine and other l-nucleoside analogs and entecavir, involving 17 different nucleotide substitutions, were reliably detected at levels at or below 0.1% of the total population. The method worked across HBV genotypes. Longitudinal analysis of patient samples showed earlier emergence of resistance on therapy than was seen with sequencing methodologies, including some cases of resistance that existed prior to treatment. In summary, we established and validated an ultrasensitive method for measuring resistant HBV variants in clinical specimens, which enabled earlier, quantitative measurement of resistance to therapy. PMID:19433559

  6. HepaMeta - prevalence of hepatitis B/C and metabolic syndrome in a population living in separated and segregated Roma settlements: a methodology for a cross-sectional population-based study using a community-based approach.

    PubMed

    Gecková, Andrea Madarasová; Jarcuska, Peter; Mareková, Mária; Pella, Daniel; Siegfried, Leonard; Jarcuska, Pavol; Halánová, Monika

    2014-03-01

    Roma represent one of the largest and oldest minorities in Europe. The health of many of them, particularly those living in settlements, is heavily compromised by poor dwellings, low educational level, unemployment, and poverty rooted in generational poverty, segregation, and discrimination. This cross-sectional population-based study using a community-based approach aimed to map the prevalence of viral hepatitis B/C and metabolic syndrome in the population living in separated and segregated Roma settlements and to compare it with the occurrence of the same health indicators in the majority population, considering selected risk and protective factors. The sample consisted of 452 Roma (mean age = 34.7; 35.2% men) and 403 non-Roma (mean age = 33.5; 45.9% men) respondents. Data were collected in 2011 via questionnaire, anthropometric measures, and analysis of blood and urine samples. The methodology used in the study, as well as in the following scientific papers, is described in the Methods section (i.e. study design, procedures, samples, and methods, including the questionnaire and anthropometric, physical, blood and urine measurements). There are regions of declining prosperity due to high unemployment, long-term problems with poverty, and depleted resources. Populations living in these areas, i.e. in Roma settlements in Central and Eastern Europe, are at risk of poverty, social exclusion, and other factors affecting health. Therefore, we should look for successful long-term strategies and tools (e.g. Roma mediators, terrain work) in order to improve the future prospects of these minorities.

  7. Understanding the role of saliva in aroma release from wine by using static and dynamic headspace conditions.

    PubMed

    Muñoz-González, Carolina; Feron, Gilles; Guichard, Elisabeth; Rodríguez-Bencomo, J José; Martín-Álvarez, Pedro J; Moreno-Arribas, M Victoria; Pozo-Bayón, M Ángeles

    2014-08-20

    The aim of this work was to determine the role of saliva in wine aroma release by using static and dynamic headspace conditions. In the latter conditions, two different sampling points (t = 0 and t = 10 min) corresponding with oral (25.5 °C) and postoral phases (36 °C) were monitored. Both methodologies were applied to reconstituted dearomatized white and red wines with different nonvolatile wine matrix compositions and a synthetic wine (without matrix effect). All of the wines had the same ethanol concentration and were spiked with a mixture of 45 aroma compounds covering a wide range of physicochemical characteristics at typical wine concentrations. Two types of saliva (human and artificial) or control samples (water) were added to the wines. The adequacy of the two headspace methodologies for the purposes of the study (repeatability, linear ranges, determination coefficients, etc.) was previously determined. After application of different chemometric analysis (ANOVA, LSD, PCA), results showed a significant effect of saliva on aroma release dependent on saliva type (differences between artificial and human) and on wine matrix using static headspace conditions. Red wines were more affected than white and synthetic wines by saliva, specifically human saliva, which provoked a reduction in aroma release for most of the assayed aroma compounds independent of their chemical structure. The application of dynamic headspace conditions using a saliva bioreactor at the two different sampling points (t = 0 and t = 10 min) showed a lesser but significant effect of saliva than matrix composition and a high influence of temperature (oral and postoral phases) on aroma release.

  8. An improved method to quantitate mature plant microRNA in biological matrices using modified periodate treatment and inclusion of internal controls.

    PubMed

    Huang, Haiqiu; Roh, Jamin; Davis, Cindy D; Wang, Thomas T Y

    2017-01-01

    MicroRNAs (miRNAs) exist ubiquitously in microorganisms, plants, and animals, and appear to modulate a wide range of critical biological processes. However, no definitive conclusion has been reached regarding the uptake of exogenous dietary small RNAs into mammalian circulation and organs, or regarding cross-kingdom regulation. One of the critical issues is our ability to assess and distinguish the origin of miRNAs. Although periodate oxidation has been used to differentiate mammalian and plant miRNAs, previous studies lacked validation of treatment efficiency and the inclusion of proper controls for this method. This study aimed to address: 1) the efficiency of periodate treatment in a plant or mammalian RNA matrix, and 2) the necessity of including internal controls. We designed and tested spike-in synthetic miRNAs in various plant and mammalian matrices and showed that they can be used as a control for the completion of periodate oxidation. We found that overloading the reaction system with a high concentration of RNA resulted in incomplete oxidation of unmethylated miRNA. The abundant miRNAs from soy and corn were analyzed in the plasma, liver, and fecal samples of C57BL/6 mice fed a corn- and soy-based chow diet using our improved methodology. The improvement eliminated false positive detection in the liver, and we did not detect plant miRNAs in the mouse plasma or liver samples. In summary, an improved methodology was developed for plant miRNA detection that appears to work well in different sample matrices.

  9. AERIS: An Integrated Domain Information System for Aerospace Science and Technology

    ERIC Educational Resources Information Center

    Hatua, Sudip Ranjan; Madalli, Devika P.

    2011-01-01

    Purpose: The purpose of this paper is to discuss the methodology in building an integrated domain information system with illustrations that provide proof of concept. Design/methodology/approach: The present work studies the usual search engine approach to information and its pitfalls. A methodology was adopted for construction of a domain-based…

  10. Diffraction or Reflection? Sketching the Contours of Two Methodologies in Educational Research

    ERIC Educational Resources Information Center

    Bozalek, Vivienne; Zembylas, Michalinos

    2017-01-01

    Internationally, an interest is emerging in a growing body of work on what has become known as "diffractive methodologies" drawing attention to ontological aspects of research. Diffractive methodologies have largely been developed in response to a dissatisfaction with practices of "reflexivity", which are seen to be grounded in…

  11. A Call for a New National Norming Methodology.

    ERIC Educational Resources Information Center

    Ligon, Glynn; Mangino, Evangelina

    Issues related to achieving adequate national norms are reviewed, and a new methodology is proposed that would work to provide a true measure of national achievement levels on an annual basis and would enable reporting results in current-year norms. Statistical methodology and technology could combine to create a national norming process that…

  12. Imaging Girls: Visual Methodologies and Messages for Girls' Education

    ERIC Educational Resources Information Center

    Magno, Cathryn; Kirk, Jackie

    2008-01-01

    This article describes the use of visual methodologies to examine images of girls used by development agencies to portray and promote their work in girls' education, and provides a detailed discussion of three report cover images. It details the processes of methodology and tool development for the visual analysis and presents initial 'readings'…

  13. 78 FR 29353 - Federal Need Analysis Methodology for the 2014-15 Award Year-Federal Pell Grant, Federal Perkins...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ... DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2014-15 Award Year-- Federal Pell Grant, Federal Perkins Loan, Federal Work-Study, Federal Supplemental Educational Opportunity... announces the annual updates to the tables used in the statutory Federal Need Analysis Methodology that...

  14. Prediction and standard error estimation for a finite universe total when a stratum is not sampled

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, T.

    1994-01-01

    In the context of a universe of trucks operating in the United States in 1990, this paper presents statistical methodology for estimating a finite universe total on a second occasion when part of the universe is sampled and the remainder is not. Prediction is used to compensate for the lack of data from the unsampled portion of the universe. The sample is assumed to be a subsample of an earlier sample, with stratification used on both occasions before sample selection. Accounting for births and deaths in the universe between the two points in time, the detailed sampling plan, estimator, standard error, and optimal sample allocation are presented with a focus on the second occasion. If prior auxiliary information is available, the methodology is also applicable to a first occasion.
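
    The prediction idea in this record, using earlier-occasion data to fill in a stratum that is not sampled on the second occasion, can be illustrated with a simple ratio predictor. The strata, counts, and values below are invented for illustration and are not taken from the paper, which develops a more detailed estimator and its standard error.

    ```python
    # Illustrative ratio prediction for an unsampled stratum (hypothetical numbers).
    # Occasion-1 totals are known for both strata; on occasion 2 only stratum A
    # is sampled, and stratum B's total is predicted from the observed A-growth.

    # Known occasion-1 stratum totals (e.g., truck counts).
    t1 = {"A": 12000.0, "B": 8000.0}

    # Occasion-2 sample from stratum A: estimated total via expansion estimator.
    n_A, N_A = 200, 4000             # sample size and stratum size
    sample_mean_A = 3.3              # mean value per sampled unit
    t2_A_hat = N_A * sample_mean_A   # simple expansion estimate for stratum A

    # Ratio predictor: assume stratum B grew at the same rate as stratum A.
    growth = t2_A_hat / t1["A"]
    t2_B_pred = growth * t1["B"]

    t2_total = t2_A_hat + t2_B_pred
    print(t2_A_hat, t2_B_pred, t2_total)
    ```

    The paper's methodology additionally accounts for births and deaths between occasions and derives the standard error of such a predicted total; this sketch shows only the basic prediction step.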

  15. Microbialites vs detrital micrites: Degree of biogenicity, parameter suitable for Mars analogues

    NASA Astrophysics Data System (ADS)

    Blanco, Armando; D'Elia, Marcella; Orofino, Vincenzo; Mancarella, Francesca; Fonti, Sergio; Mastandrea, Adelaide; Guido, Adriano; Tosti, Fabio; Russo, Franco

    2014-07-01

    In upcoming years, several space missions will investigate the habitability of Mars and the possibility of extinct or extant life on the planet. In previous laboratory work we investigated the infrared spectral modifications induced by thermal processing on different carbonate samples, in the form of recent shells and fossils of different ages, whose biogenic origin is indisputable. The goal was to develop a method able to discriminate biogenic carbonate samples from their abiogenic counterparts. The method has been successfully applied to microbialites, i.e. bio-induced microcrystalline carbonate deposits, and particularly to stromatolites, the laminated fabric of microbialites, some of which can be ascribed among the oldest traces of biological activity known on Earth. In this work we show that, by applying our method to different parts of the same carbonate rock, we are able to discriminate the presence, nature and biogenicity of various micrite types (i.e. detrital vs autochthonous) and to distinguish them from the skeletal grains. To test our methodology, we first used the epifluorescence technique to select, on polished samples, skeletal grains and autochthonous and allochthonous micrites, each characterized by a different organic matter content. The results on the various components show that, from the infrared spectral modifications induced by thermal processing, it is possible to determine the degree of biogenicity of the different carbonate samples. These results are of considerable importance, since such carbonates are linked to primitive living organisms that can be considered good analogues for putative Martian life forms.

  16. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    ERIC Educational Resources Information Center

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  17. Manufacturing Technology for Shipbuilding

    DTIC Science & Technology

    1983-01-01

    during which time there was an interface of Japanese and American shipbuilding concepts and methodology. Those two years of work resulted in many...lanes is of paramount importance in maintaining a smooth, orderly flow of pre-fabricated steel. Occasionally, the process lanes may fall behind schedule...design methodology. The earlier the start that Engineering has, the better the chance that all required engineering work will be completed at start of

  18. A methodology to assess performance of human-robotic systems in achievement of collective tasks

    NASA Technical Reports Server (NTRS)

    Howard, Ayanna M.

    2005-01-01

    In this paper, we present a methodology to assess system performance of human-robotic systems in achievement of collective tasks such as habitat construction, geological sampling, and space exploration.

  19. [Development of New Mathematical Methodology in Air Traffic Control for the Analysis of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Hermann, Robert

    1997-01-01

    The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; in joint work with the PI and M. Oberguggenberger, it has been extended to apply to ordinary differential systems of the type encountered in control. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form dx/dt = f(x, u); and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves discretization of a partial differential equation system of the type encountered in physics problems (e.g. fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as 'discretizations' of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques familiar to topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.
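
    The 'mixture' of continuous and discrete dynamics described in this record, an ODE dx/dt = f(x, u) coupled to lattice (cellular-automaton) updates, can be sketched as a minimal hybrid simulation loop. The dynamics, CA rule, and coupling below are invented placeholders for illustration, not the project's actual models.

    ```python
    def f(x, u):
        # Placeholder continuous dynamics dx/dt = f(x, u): damped state with drive u.
        return -0.5 * x + u

    def ca_step(cells):
        # Placeholder 1-D cellular-automaton rule: an interior cell switches on
        # if either neighbour is on; boundary cells are held fixed.
        new = cells[:]
        for i in range(1, len(cells) - 1):
            new[i] = 1 if (cells[i - 1] or cells[i + 1]) else cells[i]
        return new

    def simulate(x0, cells, steps=10, dt=0.1):
        # Hybrid loop: an Euler step for the ODE, then a discrete lattice update.
        x = x0
        for _ in range(steps):
            u = 1.0 if cells[len(cells) // 2] else 0.0  # control read off the lattice
            x = x + dt * f(x, u)                        # continuous part
            cells = ca_step(cells)                      # discrete part
        return x, cells

    x_final, cells_final = simulate(0.0, [0, 0, 1, 0, 0])
    print(round(x_final, 4), cells_final)
    ```

    The point of the sketch is only the alternation: each step advances the continuous state with the discrete lattice held fixed, then updates the lattice, which is the kind of coupling a hybrid-systems formalism has to make rigorous.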

  20. Solving a methodological challenge in work stress evaluation with the Stress Assessment and Research Toolkit (StART): a study protocol.

    PubMed

    Guglielmi, Dina; Simbula, Silvia; Vignoli, Michela; Bruni, Ilaria; Depolo, Marco; Bonfiglioli, Roberta; Tabanelli, Maria Carla; Violante, Francesco Saverio

    2013-06-22

    Stress evaluation is a field of strong interest, and a challenging one owing to several methodological aspects of the evaluation process. The aim of this study is to propose a study protocol to test a new method (i.e., the Stress Assessment and Research Toolkit) to assess psychosocial risk factors at work. This method addresses several methodological issues (e.g., subjective vs. objective, qualitative vs. quantitative data) by assessing work-related stressors using different kinds of data: i) organisational archival data (organisational indicators sheet); ii) qualitative data (focus group); iii) worker perception (questionnaire); and iv) observational data (observational checklist), using mixed methods research. In addition, it allows positive and negative aspects of work to be considered conjointly, using an approach that takes account of job demands and job resources at the same time. The integration of these sources of data can reduce the theoretical and methodological bias related to stress research in the work setting, allows researchers and professionals to obtain a reliable description of workers' stress, providing a more articulate vision of psychosocial risks, and allows a large amount of data to be collected. Finally, the implementation of the method supports, in the long term, primary prevention for psychosocial risk management in that it aims to reduce or modify the intensity, frequency or duration of organisational demands.
